Speaker 1: Welcome to Tech Stuff, a production from iHeartRadio. This season on Smart Talks with IBM, Malcolm Gladwell and team are diving into the transformative world of artificial intelligence with a fresh perspective on the concept of open. What does open really mean in the context of AI? It can mean open-source code or open data, but it also encompasses fostering an ecosystem of ideas, ensuring diverse perspectives are heard, and enabling new levels of transparency. Join hosts from your favorite Pushkin podcasts as they explore how openness in AI is reshaping industries, driving innovation, and redefining what's possible. You'll hear from industry experts and leaders about the implications and possibilities of open AI, and of course, Malcolm Gladwell will be there to guide you through the season with his unique insights. Look out for new episodes of Smart Talks every other week on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts, and learn more at IBM.com/smarttalks.

Speaker 2: Hello, hello! Welcome to Smart Talks with IBM, a podcast from Pushkin Industries, iHeartRadio, and IBM. I'm Malcolm Gladwell. This season, we're diving back into the world of artificial intelligence, but with a focus on the powerful concept of open: its possibilities, implications, and misconceptions. We'll look at openness from a variety of angles and explore how the concept is already reshaping industries, ways of doing business, and our very notion of what's possible.

Speaker 2: On today's episode, I'm joined by Jason Kelly, the Global Managing Partner for IBM Strategic Partners and Ecosystems, and by Christy Fredericks, the Senior Vice President and Chief Partnership Officer at Palo Alto Networks. We discussed how their partnership in the cybersecurity space helps strengthen enterprises by focusing on seamless cybersecurity solutions tailored to meet the evolving threat landscape. By leveraging AI and automation, this collaboration aims to modernize security programs, improve response times, and reduce risk.
Speaker 2: Jason and Christy both bring a tremendous amount of experience and expertise to the subject. I think you're really going to enjoy this one.

Speaker 2: Jason, Christy, welcome to Smart Talks with IBM. Thank you for joining me.

Speaker 3: Thank you.

Speaker 4: It's great to be here.

Speaker 2: We are here to discuss cybersecurity and the partnership between IBM and Palo Alto Networks. But before we get there, I wanted you guys to tell me a little bit about yourselves. Jason, let's start with you. I see on your resume West Point, which makes me think there's some interesting things going on there. How did you get to West Point?

Speaker 3: West Point? West Point was the decision. First, it was affordable back in the day. But I had a sense of service. My father was a World War Two vet, so I grew up on the weekends watching World War Two video. You know, he was Army as well. And so I thought, oh, that'd be exciting, and I thought I'd do some type of service. Went there, and now I have the biggest family, extended family, I could ever have. So it was very exciting. Played football, lucked out, meaning I wasn't recruited. I walked on, and that kept me there, because it gave me something, an outlet, with all the other pressures. Defensive back, I was. I was great at knocking the ball down, not the best at catching it.

Speaker 2: Yeah. And then you were a Ranger?

Speaker 3: I was. I was privileged to be a US Army Airborne Ranger, but did most of my time in northern Italy. We were part of the 82nd Airborne's sister post. Oh yeah, that's what people say, seriously: like, you know, you were in northern Italy, you were drinking wine and having bread, you know. But it was part of a NATO force there at the time. Yeah, so exciting.

Speaker 2: How did you get from there to IBM?

Speaker 3: A long path. As I came out of the military, I started in manufacturing, retail housing, and did a quick stint. I took a leave of absence from industry and did a stint of, yet again, public service, in the state of Tennessee, with economic development, and got a whiff of how fun it could be to do things around data and media. Started a small media firm, what we would now call a digital firm, sold it, and said I wanted to go do it again somewhere, but I want to go to a big company. And the family at IBM brought me in and has yet to let me go.

Speaker 2: That was how many years ago?

Speaker 3: Two decades.

Speaker 2: Oh wow.

Speaker 3: So I know I look amazingly young, but...

Speaker 2: Yes, because you must have...

Speaker 3: And IBM was my fifth career, and I've enjoyed it since. And that's what I do there: build teams, grow new parts of the company, and get to work with some of the most brilliant people on the face of the planet, as well as partners like Christy that just keep it exciting.

Speaker 2: Christy, I was delighted to learn that you are Canadian.

Speaker 4: Yes.

Speaker 2: But so you were a consultant for a long time, at Bain?

Speaker 4: Yes. Yeah, I joined Bain Consulting intending to spend a couple of years there, learn the ropes, and then go get my first real job. But between the value personally to my growth and development, and then what we were able to bring our clients, I ended up there for sixteen years. And then post-Bain I went on to my first product company, New Relic, and then it's come full circle at Palo Alto Networks. But at Bain it was all about bringing expertise across different industries to help our clients improve whatever they needed to improve, and bringing that expertise to bear. And then you have the product lens, and you think, okay, we're going to build the absolute best product to help our customers do what they need to get done. And then I joined Palo Alto about six, seven months ago, in a partnerships role, and I'm delighted to be able to work with amazing consulting companies like IBM, where we bring both to bear.

Speaker 2: How long have IBM and Palo Alto Networks been partners?

Speaker 3: So we've been working together for quite a long time, but we made it official, meaning we got married as strategic partners, last year.

Speaker 2: Oh, I see. So what is it that each of you bring to the table? What's each side's specialty?

Speaker 3: So it's great that you asked that, because about a decade ago, our now CEO, Arvind Krishna, said, you know, wouldn't it be great if we had this one focus? What does IBM do? And you have this whole list, and he said, let's make it simple: we are a multicloud, hybrid cloud, AI company. And so when you say that, it sounds very simple, but then people go, what the hell is that? What's hybrid cloud? Well, both of those two things have a lot of data involved, and a lot of that means that that data is going to sit in multiple places and distributed environments. Well, if you're able to tie those things together with multiple partners, you also have to make sure that it's secure. Because in the direction that we're going, where data is now being consumed in many different places, and it is the fuel behind AI, as we know, then you say, ah, well, who does that well? And who does it in a way that's getting rid of seams, the seams that could be across multiple products, multiple product sets even? And that's where Palo Alto comes in.

Speaker 4: I think the conventional wisdom in cybersecurity was always: you need all the new tools, right? It's like whack-a-mole. Every threat that pops up, you get the tool that's purpose-built for that specific thing. Well, fast-forward to, you know, the RSA Conference this year: there were four thousand vendors on the floor. You look at an average company: there's hundreds of cybersecurity tools. It introduces a level of complexity that is really hard to manage. You, as a user, query an application, right? That query can go through a bunch of different pings, from one cloud to the next. It goes into and out of a SaaS application. It may be running along a network. You may be accessing it from your phone, which is an unmanaged device. It's got to go in and out. And if you say, okay, I've got to secure that phone, I've got to secure the network, I've got to... then all of a sudden you've got sort of firewalls, software and hardware firewalls, popping up everywhere. You've got cloud security. And you've probably heard of this concept of zero trust, which is, every time, you have to check and say: are you allowed in here? Are you allowed in here? The number of places that can fall down just becomes overwhelming. So you end up with either alerts firing, you know, every two seconds, that you have to then go investigate, most of which are false positives, or you miss something, right? And so that was the conventional wisdom: we've got to buy all these tools. And now you've got overwhelmed CIOs and CISOs with hundreds of tools. And Palo Alto's strategy has been: look, we're going to create a platform where everything can be stitched together, everything can speak the same language, and we can sort of manage throughout the architecture and watch, you know, this call as it's passing through all these different checkpoints. And we can do it in a way that you still have the confidence that it's best of breed, right, so you're not making any trade-offs. But it's not so simple just to get from the spaghetti to the seamless architecture. You need, oftentimes, to re-engineer your business processes. You have to re-architect your digital environment. And so that's where we partner with a company like IBM, to bring that expertise and say, we're going to help you not just deploy the best cybersecurity architecture, but really get your environment ready to have this zero trust, as well as...

Speaker 3: As well as all of those players that cross that spaghetti. Because when you start thinking about all the other partners that you work with, if you think of it from an industry perspective: you're going to have an ERP; it could be an Oracle, it could be an SAP. You're not going to have one cloud; as I mentioned, it's going to be possibly multiple clouds. You'll have some AWS, maybe Microsoft Azure, and then even some Google in there, and then your own that you've built, your private cloud, or an IBM Cloud. You'll have those multiple clouds. And then you also will have, you know, fit-for-purpose tools: oh, I need a Salesforce in there for my customer focus; I'm doing some graphics, so I have Adobe. I can just name, name, name. All of those then have to be re-engineered. Seriously. I mean, come on, Malcolm, you sit there and you think how long that would take. So if you haven't done that before, you're going to have to go to each one of those individually, or you can work with a company that can tie those things together, because we are also strategic partners with them. So that's where you start to say, okay, I see how this comes together. You have to make sure that your ecosystem is going to be stronger than your competitor's ecosystem, and you have to be secure in what you're doing. Because as you add more players or products, you create seams, and you want to make sure there's fewer seams, and that there's zero trust across that capability you're building. And that's the complement between the two companies.
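Christy's sketch of zero trust, the "are you allowed in here?" check at every hop, comes down to evaluating each request on identity, device posture, and context, and denying by default. Below is a minimal illustrative sketch of that per-request check in Python; the roles, resources, and rules are invented for this example and are not IBM or Palo Alto APIs.

    # Minimal zero-trust policy check: every request is evaluated on
    # identity, device posture, and resource sensitivity; nothing is
    # trusted just because it comes from "inside" the network.
    # All names and rules here are illustrative, not a vendor API.
    from dataclasses import dataclass

    @dataclass
    class Request:
        user: str
        role: str
        device_managed: bool  # a hospital-issued laptop vs. a personal phone
        mfa_passed: bool
        resource: str         # "ehr", "billing", "public_site", ...

    SENSITIVE = {"ehr", "billing"}  # resources holding PII/PHI

    def allow(req: Request) -> bool:
        """Deny by default; allow only when every condition holds."""
        if not req.mfa_passed:
            return False
        if req.resource in SENSITIVE and not req.device_managed:
            return False  # unmanaged devices never reach patient data
        if req.resource == "ehr" and req.role not in {"clinician", "nurse"}:
            return False
        return True

    # The same check runs at every seam (network edge, SaaS app, cloud API)
    # rather than once at a perimeter firewall.
    print(allow(Request("dr_lee", "clinician", True, True, "ehr")))   # True
    print(allow(Request("dr_lee", "clinician", False, True, "ehr")))  # False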
Speaker 2: Let's take a step back for a moment, before we get into the specifics of what you guys are doing. I'm curious, at this moment in twenty twenty-four, how nervous should we be about cybersecurity? Compare it to five years ago or ten years ago. Are we less nervous than we were five years ago? More nervous? Are all the changes going on right now increasing vulnerability or decreasing it?

Speaker 3: I would say, and Christy, I think we share the point of view, that it's not necessarily about being more nervous. I think you should be more prepared, because the amount of threat is increasing, based on our dependence upon data. And that's where I think the attention should be placed: more and more, especially with the importance of AI, you say, okay, then what's under all that? And it's the data, as I said. So knowing that, you should be more concerned.

Speaker 2: Does the advent of AI and its rapid evolution help defense more, or offense more?

Speaker 4: I think it's like any megatrend that we've witnessed: both, right? So you think about AI. It's great, right, in terms of what it's going to unlock for productivity, for humanity. But it also makes it a whole lot easier to build ransomware. It's a whole lot easier to test different ways into a system, right? But I think that's true of, say, the rise of the internet, right? All of a sudden, everyone was putting their data online, and you had to think of new ways to stay ahead and keep that secure. And I don't think AI is any different. You've got companies like Palo Alto, partnerships like Palo Alto and IBM, that are constantly scanning the landscape for not only the current threats, but what's next, what's coming around the corner, what's after AI? And so I think taking it seriously and being prepared is probably the right way of looking at it. Because if you think about it too hard, you'll just want to crawl into a corner and stuff everything under the mattress.

Speaker 2: Say I'm the CEO of a regional hospital chain, a big distributed healthcare system, so a ton of data. The consequences of being hacked and held for ransom are life and death. Life and death, right? Literally. So you come down, you sit down with me, and you chat with me. Walk me through the kinds of things you would tell me about what I need to get safer. For example, let's start with one: is it likely that I'm spending too little, or am I spending money in the wrong place?

Speaker 4: Great question. It depends how you've broken it out. If you are distributing all of your dollars across a whole bunch of different tools, it's likely you're just spending the wrong money. And in fact, you know, putting it all in one place is a way of potentially saving money while keeping your security actually higher. And I'd love to hear, Jason, how you would approach it. How we would approach it, of course, is by saying: what does your environment look like? You know, do you have the connected medical devices into your EMR? Are your respirators and ventilators all online? Right? And so we would talk about, okay, here's how you get coverage, and how the coverage of both the firewalls as well as the detectors all feeds back into your security operations center, and you can manage it, and do your learning with AI, and keep yourself secure.

Speaker 3: Yeah, and I would say Christy and I would go to the same point. Because if you get under what she was just asking: is your data on prem? And when it's on prem, how active is it across the enterprise? And so that begins the basis for the start. And then often you're going to say, well, we actually take in data from outside, and then we also have the circumstance that there's a lot of PII, personal information, right? And so now you're saying, okay, how are we securing that, and where are we securing it? And so you have to start really thinking about the different areas within that hospital chain. Are you sharing that amongst your hospitals? And now you start to think: if I'm saying no to a lot of that, it's like, well, then are you as efficient as you want to be? So there is that trade-off of, you know, am I so tightly walled that I'm not productive? And so that's where we would start to say: what's the outcome that you're trying to get to? All right, maybe you're good. Maybe you're good with your five locations and you don't need to go any further. But maybe you want to expand to fifty, and by the way, you're going to go cross-border, or you're going to be in Toronto and in New York. Okay, well then how do you do that? And so I think that it's very easy to start jumping into any of the typical situations. But the first question that you have to ask, as the hospital CEO, is: what's your objective? What are you trying to do? Because too often what we see is that there's some bright, new, shiny thing that everybody wants to put in play. You know, it's a sandwich looking for lunch. And you go, but what is it that you want to do? Are you doing research? Are you a research hospital? Are you more consumer-oriented? Those are the questions you start to ask, because they start to then tell a story, in line with Christy's questions. And I think that that's where, again, the complement is: instead of just saying, oh, well, thanks for telling me all this, Malcolm, here's your ten-page strategy, go find somebody. We have the benefit in IBM, and it's probably why I'm still there, is, you know, we're very unique. We're the only company on the planet that has a consulting business at scale inside of a technology company. And so we have, you know, the left brain and right brain. We're able to do that, and then we're able to say, okay, now which partners are going to be most valuable for our clients? What's going to work for you isn't going to work for the manufacturer down the road, isn't going to work for the consumer or CPG company across the river. Those things are very specific. The threats and the seams that I was talking about are very specific. So that's where it becomes very valuable to make sure that I'm not just giving you some strategy that's generic.

Speaker 2: But as a healthcare CEO, everything I have done, almost everything I've done over the last ten years, has had the effect of increasing my vulnerability. I wanted to digitize data within the hospital that used to be on pieces of paper. I want doctors to go home and be able to seamlessly hook into stuff at work, because they've got to do all their paperwork. I want to make sure the diabetes people are speaking to the organ transplant people. And so isn't everything I have done to kind of keep up with the revolution in healthcare, isn't that also making me more and more vulnerable to a bad actor?

Speaker 4: It's such a great question. Because think about the quality of healthcare delivery, right? So now doctors aren't filling out forms; they're spending time with patients. So the quality of care is improving, and the vulnerability is growing. And so I think that's where having a strong cybersecurity strategy actually enables all of that. One of our products is our SASE product, and we tested it with some business applications. And oftentimes the rap is: oh, security is going to slow you down, right? Like, you have to add a firewall, you have to add checkpoints. Our product actually increases the velocity of your ability to use that application, because of the way that it is queried through our system, as opposed to just through the regular network. So it doesn't slow it down, and in fact, it makes it run more efficiently. That's just one minor example. But back to the healthcare question: I, as a patient, want my doctors accessing all the technology and talking to each other and connecting the dots behind the scenes. I also want my data to stay private. And so having both a consulting partner, who understands how to ask questions of the environment and of the applications you're using, and who understands the industry inside and out, and a technology partner, who builds and stays ahead of all of the different threats, come together to advise you, I think is super important. When you bring in a partner like IBM, with a platform like Palo Alto's that covers, you know, all the different parts of your environment, you're able to say: look, where are the vulnerabilities in the system? Where are the different endpoints that we need to have covered? And then you just make sure you get that breadth of coverage. So, yes, you've increased the risk, but then you've mitigated it.

Speaker 2: So before I retire my healthcare analogy, because I was thinking about just trying to understand the importance of this idea of having a single platform: if this muddled healthcare network is typical, I've acquired a whole series of things over the last ten years. I bought a hospital over here; there's some physicians' practices that I snapped up over there; I bought a diagnostics company. And so I have all of these legacy systems, and, like you said, maybe I've got some stuff in the cloud with one company, some stuff in the cloud with another. And what you're saying is, the first step is to kind of rationalize that, put it on a single platform, so you understand where your points of weakness are, as opposed to being blind to your points of weakness.

Speaker 4: Yes, although anyone who's done any kind of M&A knows that that's a long journey, right? So I think the first step is just understanding where everything is, and then you get on a path and you say: where's the biggest risk? Let's neutralize or mitigate that risk, one at a time. The thing about open and secure, you know, at Palo Alto, we keep touting the benefits of the platform: everything on Palo Alto, your risk is going to be mitigated, and you're going to have the full visibility. But you can't get there overnight. And so we've got, you know, thousands of integrations with other technology companies, including our partners, to make sure that we can capture and have visibility into those endpoints and those systems as well. And so I think step one is just: figure out where everything is, get the scan. So Palo Alto has a couple of products where you can kind of deploy and get a view of your attack surface. I love the analogy that a digital environment is like a house, right? So you have your front door locked, of course, because they're probably going to try the front door first. But that's not all you're going to do, right? You're going to make sure the windows are locked, and there's an alarm system, and all of that. And I think that's how you have to think about it: how do we cover the whole surface?

Speaker 2: So lay people like me have been bombarded, it seems like over the last year, with one thing after another about how quickly AI is moving forward and how big of a deal it suddenly is going to be in the economy. What is the impact of that dramatic change in AI's capabilities on this cybersecurity question? What does it mean, if you're defending somebody, that you now have these sophisticated AI tools at your disposal?

Speaker 3: I think that AI becomes the force multiplier for cyber. Think about cyber before: it was just locking your doors, locking the windows, and if you were really good, you had an alarm system. Now, with AI, you can say: well, I can predict what's going to happen. I can see around the corner. I know I can leave my windows open upstairs, and it's fine, and it's okay.

Speaker 2: Because why? Because the AI is running a million simulations?

Speaker 3: It can, and that's exactly it. It becomes the intelligence part of that AI. It's not artificial; it's augmented. So you now have this new capability to see around corners, and so you're able to do the jobs of yesterday more effectively. And the queries that you were doing, and that's all you're really doing, now you're doing them, you know, faster. You're able to access even more data, and you're able to then make it more secure. So that's why AI becomes a force multiplier.

Speaker 2: Yeah. And just talk about the faster part. What does faster mean in practical terms? If you're trying to defend an enterprise against a cyberattack, why does speed matter in that environment?

Speaker 3: You're always trying to find a place through. I go back to, we brought up the Army: you always ask, how do you break the line? How do you find a penetration point? And when you think about, you know, pen testing, penetration testing: where are those points? So if you're able to do that faster than the bad guys, and not only faster, but you're picking more probable points. This is back to the intelligence. I could waste time doing penetration testing someplace where, that's why I mentioned it, if they can't get in the second-story windows, why are you spending time trying them? So that becomes more effective. So when I think of speed, that's what I think of. Because it's not just speed; I think it's also what's effective.

Speaker 4: Just to put a fine point on it. So, I found a way in. Okay. Now, I don't know where the jewelry is, so I have to look around and see if there's any hidden gems and try to find my way. That used to take a week, two weeks, sort of seven to fourteen days. Now it's hours, right? So they're in, and they can actually exfiltrate data within less than a day. The metrics we use in the security operations center are mean time to detect, so, to see anyone's there, and mean time to respond and remediate, to get them out, right? That also used to be, you know, seven, eight, nine, ten days. Now it needs to be less than an hour. And with our AI-based security operations platform, it is. Now you've got one tool, and whether it's all Palo Alto Networks or whether it's, you know, pulling in data from other places, you're able to see it all together. So you actually get fewer alerts: you get from thousands of alerts down to one hundred alerts, right? And you can investigate them, and you investigate them using AI, too. And AI is today's threat, but, you know, you think about threat and opportunity, you think about what's next. You always have to be kind of evolving.

Speaker 3: And you have to. I think when we talk about threat and risk, you know, we didn't tell you what the cost of some type of cyber penetration is. The typical cost is about four and a half million dollars, and that's just in labor and remediation. If you think about reputational risk as well: our Institute for Business Value did a study, in twenty twenty-three, and there were thirty-nine banks that we watched that suffered a reputational-risk loss in market value of one hundred and thirty billion dollars. And so you start to think: wow, that's just reputational risk. So that's what's at stake here, and it is only going to get bigger.
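Christy's two metrics, mean time to detect (MTTD) and mean time to respond (MTTR), are simple averages over per-incident timestamps: when the intrusion began, when it was detected, and when it was remediated. A small sketch in Python; the incident data here is invented purely to show the arithmetic.

    # MTTD and MTTR computed from per-incident timestamps (invented data).
    from datetime import datetime
    from statistics import mean

    # (intrusion began, detected, remediated)
    incidents = [
        (datetime(2024, 5, 1, 2, 0), datetime(2024, 5, 1, 2, 40), datetime(2024, 5, 1, 3, 30)),
        (datetime(2024, 5, 3, 9, 0), datetime(2024, 5, 3, 9, 15), datetime(2024, 5, 3, 10, 0)),
    ]

    mttd = mean((det - start).total_seconds() / 60 for start, det, _ in incidents)
    mttr = mean((rem - det).total_seconds() / 60 for _, det, rem in incidents)

    print(f"MTTD: {mttd:.0f} min")  # how long intruders dwell before being seen
    print(f"MTTR: {mttr:.0f} min")  # how long until they are evicted
    # The "less than an hour" target means driving both numbers down, since
    # dwell time plus response time is the attacker's whole window.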
Speaker 4: So one of the pieces we haven't talked about with AI, that I find super interesting, because we've been talking essentially about, like, the Terminator, robots fighting robots, right? Like, whose robots are quicker? I'm designing attacks and I'm defending against attacks. And I think that's super important. But we recently launched, and are working with IBM on, our AI security product, to actually secure the use of AI, because AI also opens up another set of threat vectors. I'll give you an example. I'm a marketing executive now for your hospital. So I work for you, and you want to announce the launch of a new center, and so I upload all the information about all the patients, and, you know, how we do things, into ChatGPT, to write the PR for me. Well, I've also just uploaded to ChatGPT a whole bunch of secrets, right? So it's how employees are using AI. Because, you know, some companies are sort of building their own language models and their own AI applications that they want to keep secure. Others are just curious about how their employees are using AI applications off the shelf. And so we announced in May a product where you can actually scan and see how AI is being used in your enterprise. We made the announcement in May (the GA was last month), and we immediately had thousands of CIOs signing up, because just understanding, you know, who's using what is another open question. Because we talk about AI enhancing productivity and all the benefits it's going to bring, but it brings risks, not just in how it's being used by the threat actors, but also, you know, in what other vulnerabilities it exposes.

Speaker 2: Does that system tell you what's a problematic use?

Speaker 4: It does. So what it does, and you've got to train it, right, but what it does is say: this is outside of your policy. So CIOs will set policies on here's what is acceptable and not acceptable use. So we'll be able to scan and say: these following uses are outside of policy. And then it'll punt and say, I think this is too restrictive, I think this is too permissive, and then you can sort of update your policies from there. That's just sort of the visibility piece. And then there's the runtime piece, which will actually stop you from using it. So you go and say: okay, here's all my patients' Social Security numbers; I'm going to upload them to ChatGPT to, you know, get an understanding of, like, where they all live. I don't know why you would possibly do that, but let's say you are. And then, you know, it'll note: that looks like a Social Security number; you can't upload that into your prompt.

Speaker 2: So it will stop you before you... Yeah, a thoughtful voice over your shoulder, just to remind you not to do something silly.

Speaker 4: Exactly.

Speaker 2: Let's just talk a little bit more about adding AI into this mix. You say it's a force multiplier. That's really interesting; dig into that. What are other instances of what that means? How does the balance between AI and human expertise work in the kind of next generation of cybersecurity?

Speaker 3: I think the common way to look at it is, back to the force multiplier: it's not going to be, is your AI better, but can you use it better? Can you ask your AI the right questions? Are you well trained? So the competition really becomes your use of AI, and are you pointing it in the right direction? You have fifty people; can they do the work of two hundred and fifty? And can they do it in a safe and secure manner, so you're not opening up more risk, or too much risk, it's your risk tolerance, in order to get the outcome? So that's why I think there's the opportunity. And so you see this truly as a force multiplier. Because the first thing people go is: oh, you're going to get rid of people. No, the people portion is still going to be just as important, because they're doing that other piece of work.

Speaker 4: One of my favorite statistics is that there are now more bank tellers in the US than there were in nineteen sixty, before the ATM was invented. Right? It used to be you would go to your bank because you had to. I remember doing this: you go, you fill out your deposit slip, you hand it to the teller, and they give you your cash. And then ATMs are invented, and it's like, oh no, what's going to happen to all these jobs? And now there's more. Right? But you're not withdrawing money from a bank teller; you're now doing more sophisticated transactions. And so I think it's similar with AI, right? You want people doing things that only people can do.

Speaker 2: The human element remains absolutely central in all of this. How do you make sure that your cybersecurity folks are equipped to handle high-value tasks, are ready for this increasing responsibility?

Speaker 4: There's a couple of ways to answer this, but I think it's the more you're able to automate the routine and the mundane tasks. For example, the bulk of cybersecurity happens in the security operations center. There's analysts who are sitting in that center. If they're spending all day either configuring alerts or responding to alerts, they're not able to do the advanced sort of threat hunting and analysis work. And so I think a big chunk of it is just freeing up their time to be able to do the more advanced, strategic work. And a lot of the automation tools based on AI, like our Cortex XSIAM product: it's designed to free up their time in order to be able to do that.
588 00:31:52,640 --> 00:31:56,120 Speaker 3: And from our perspective, it's making sure that it's a 589 00:31:56,160 --> 00:32:00,280 Speaker 3: requirement that you have the qualifications, because 590 00:32:00,280 --> 00:32:02,960 Speaker 3: people can easily get used to doing what they've always done. 591 00:32:03,400 --> 00:32:06,360 Speaker 3: I know this, and that's what I do. You say, well, no, 592 00:32:07,880 --> 00:32:10,760 Speaker 3: all the threat actors are learning on the fly. They're 593 00:32:10,800 --> 00:32:14,320 Speaker 3: always trying to outsmart you. So it's in your best interest, 594 00:32:14,400 --> 00:32:17,000 Speaker 3: our best interest, our clients' and partners' best interest, 595 00:32:17,320 --> 00:32:20,200 Speaker 3: that you are on the leading edge of that 596 00:32:20,560 --> 00:32:21,680 Speaker 3: learning capability. 597 00:32:22,040 --> 00:32:24,000 Speaker 2: If you're talking to a client who wants to develop 598 00:32:24,080 --> 00:32:29,480 Speaker 2: a kind of unified cybersecurity strategy, what's the best single 599 00:32:29,520 --> 00:32:33,280 Speaker 2: piece of advice you can give them? 600 00:32:33,320 --> 00:32:37,400 Speaker 4: You should have a single platform. It's hard not to 601 00:32:37,440 --> 00:32:40,000 Speaker 4: answer that, but it is true. I mean, all joking 602 00:32:40,040 --> 00:32:43,720 Speaker 4: aside, having, you know, the best-of-breed solutions that 603 00:32:43,760 --> 00:32:45,680 Speaker 4: are all talking to each other and able to stitch 604 00:32:45,720 --> 00:32:49,000 Speaker 4: together and identify threats before a human might be able to, 605 00:32:49,920 --> 00:32:52,360 Speaker 4: that's number one. And number two is making sure you 606 00:32:52,360 --> 00:32:55,920 Speaker 4: have visibility on all elements so you're able to cover 607 00:32:55,960 --> 00:32:58,600 Speaker 4: your whole environment and understand how people are accessing it. 608 00:32:58,880 --> 00:33:03,360 Speaker 3: I'd say think like a threat actor. Yeah, always think outside 609 00:33:03,440 --> 00:33:06,280 Speaker 3: in, because you get comfortable the other way around. 610 00:33:07,920 --> 00:33:12,160 Speaker 2: You guys work together with a Fortune five hundred industrial company, and 611 00:33:12,480 --> 00:33:14,120 Speaker 2: I'd love for you to talk a little bit about it, 612 00:33:14,280 --> 00:33:16,360 Speaker 2: use that as a kind of case study for what 613 00:33:16,440 --> 00:33:20,920 Speaker 2: this collaboration between your two companies looks like 614 00:33:21,280 --> 00:33:22,840 Speaker 2: when you work with the cloud. 615 00:33:23,040 --> 00:33:26,000 Speaker 4: It really was, you know, IBM leading on a digital 616 00:33:26,000 --> 00:33:29,320 Speaker 4: transformation for this client that wanted to move their applications 617 00:33:29,360 --> 00:33:31,320 Speaker 4: into the cloud, and so you're asking a lot of 618 00:33:31,400 --> 00:33:34,080 Speaker 4: questions about how AI increases the risk and the 619 00:33:34,120 --> 00:33:36,480 Speaker 4: surface area. Those same questions ten years ago were asked 620 00:33:36,480 --> 00:33:38,880 Speaker 4: about the cloud. And we're still on the journey 621 00:33:38,960 --> 00:33:42,040 Speaker 4: where companies are migrating to the cloud. We're not anywhere 622 00:33:42,080 --> 00:33:44,240 Speaker 4: near finished with that yet. And so there's two pieces to 623 00:33:44,280 --> 00:33:46,440 Speaker 4: a cloud migration.
One is just refactoring for the cloud 624 00:33:46,480 --> 00:33:48,760 Speaker 4: to make sure the application works effectively in the cloud. 625 00:33:49,080 --> 00:33:51,040 Speaker 4: And the second is security. And then you build in 626 00:33:51,040 --> 00:33:54,640 Speaker 4: security by design using Palo Alto's Prisma Cloud products 627 00:33:54,640 --> 00:33:56,600 Speaker 4: to make sure that not only do you have the 628 00:33:56,680 --> 00:33:59,200 Speaker 4: visibility, so with our cloud product you can scan and see 629 00:33:59,200 --> 00:34:02,640 Speaker 4: where the vulnerabilities are, but there are also, you know, cloud 630 00:34:03,120 --> 00:34:07,080 Speaker 4: firewalls, essentially, that will keep bad actors out and keep 631 00:34:07,120 --> 00:34:08,600 Speaker 4: the cloud instance secure. 632 00:34:09,640 --> 00:34:12,520 Speaker 2: If we sit down and have this conversation five years 633 00:34:12,560 --> 00:34:15,120 Speaker 2: from now, which I actually hope we do, it'd be fun, 634 00:34:16,320 --> 00:34:20,840 Speaker 2: let's pretend it's twenty twenty nine. Tell me, what are 635 00:34:20,880 --> 00:34:22,480 Speaker 2: you happy about in twenty twenty nine? 636 00:34:22,719 --> 00:34:28,360 Speaker 3: I think in twenty twenty nine, quantum computing is mainstream. I 637 00:34:28,360 --> 00:34:34,080 Speaker 3: think quantum computing is now quantum safe, where we're using 638 00:34:34,880 --> 00:34:39,720 Speaker 3: quantum computing to make sure that those bad actors aren't 639 00:34:39,719 --> 00:34:42,040 Speaker 3: as bad as they used to be back in twenty 640 00:34:42,040 --> 00:34:46,160 Speaker 3: twenty four, and that we're seeing around the corners, and 641 00:34:46,280 --> 00:34:51,400 Speaker 3: that we're empowering our Palo Alto relationship, so that in twenty 642 00:34:51,480 --> 00:34:56,279 Speaker 3: twenty nine it is the premier type of capability that people 643 00:34:56,280 --> 00:34:59,319 Speaker 3: are looking at when they think of what used to 644 00:34:59,360 --> 00:35:02,279 Speaker 3: be AI; now it's quantum capability. 645 00:35:02,520 --> 00:35:07,319 Speaker 4: Yeah, yeah, I think for AI, everyone's just using it 646 00:35:07,320 --> 00:35:10,160 Speaker 4: as part of their job. The way email was an 647 00:35:10,200 --> 00:35:12,840 Speaker 4: innovation in the nineties, the way, you know, cloud was 648 00:35:12,840 --> 00:35:15,719 Speaker 4: an innovation in the twenty tens, and we thought, how 649 00:35:15,719 --> 00:35:17,279 Speaker 4: are we going to use this? What impact is it 650 00:35:17,320 --> 00:35:19,759 Speaker 4: going to have on productivity? All these people who are 651 00:35:19,760 --> 00:35:22,000 Speaker 4: spending their days typing up memos, like, what are they 652 00:35:22,000 --> 00:35:24,520 Speaker 4: going to do? We're going to be past that fear, 653 00:35:24,640 --> 00:35:27,239 Speaker 4: and we're all going to understand that it is this 654 00:35:27,520 --> 00:35:31,319 Speaker 4: like truly positive force multiplier, where, you know, every employee 655 00:35:31,440 --> 00:35:35,200 Speaker 4: is able to do their best work and spend their 656 00:35:35,239 --> 00:35:37,279 Speaker 4: time on the things that only they can do, and 657 00:35:37,320 --> 00:35:39,480 Speaker 4: then the AI is doing the rest of that for them. 658 00:35:39,560 --> 00:35:44,160 Speaker 3: Right. AI is going to enable many things to work together. 659 00:35:44,400 --> 00:35:47,799 Speaker 3: It won't be just one language model.
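[Editor's note: the "visibility" half of that answer, scanning a cloud environment to see where the vulnerabilities are, can be pictured with a toy check like the one below, which flags firewall rules that expose a sensitive port to the whole internet. The rule schema and port list are hypothetical illustrations, not the Prisma Cloud data model.]

```python
# Toy cloud visibility scan: flag rules that open sensitive ports to
# the entire internet (0.0.0.0/0). The rule schema here is hypothetical.
SENSITIVE_PORTS = {22: "SSH", 3389: "RDP", 5432: "PostgreSQL"}

rules = [
    {"name": "web-ingress", "port": 443,  "cidr": "0.0.0.0/0"},
    {"name": "db-ingress",  "port": 5432, "cidr": "0.0.0.0/0"},
    {"name": "admin-ssh",   "port": 22,   "cidr": "10.0.0.0/8"},
]

for rule in rules:
    # A finding: a sensitive service reachable from anywhere.
    if rule["port"] in SENSITIVE_PORTS and rule["cidr"] == "0.0.0.0/0":
        print(f"Finding: '{rule['name']}' exposes "
              f"{SENSITIVE_PORTS[rule['port']]} (port {rule['port']}) "
              f"to the internet")
```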
We won't even 660 00:35:47,840 --> 00:35:51,040 Speaker 3: think about it. It will be the difference between, you know, 661 00:35:51,920 --> 00:35:56,360 Speaker 3: Malcolm having a fax machine, a stereo, and a telephone and 662 00:35:56,840 --> 00:35:59,600 Speaker 3: a memo board. Now it's in your pocket and it's 663 00:35:59,640 --> 00:36:01,880 Speaker 3: all one thing, and you don't even call it that, you know. 664 00:36:01,920 --> 00:36:03,680 Speaker 3: I said Walkman to my kids the other day 665 00:36:03,680 --> 00:36:07,319 Speaker 3: and they're like, what's a Walkman? So I do 666 00:36:07,400 --> 00:36:09,319 Speaker 3: think it'll be part of the past, and 667 00:36:09,360 --> 00:36:12,520 Speaker 3: it will be the thought of the seamless connection, 668 00:36:12,640 --> 00:36:19,680 Speaker 3: a secure, seamless connection, of HR, of finance, of distribution, logistics, 669 00:36:19,680 --> 00:36:23,360 Speaker 3: of billing; all of those will have a capability to 670 00:36:23,400 --> 00:36:24,360 Speaker 3: work together. 671 00:36:24,719 --> 00:36:28,160 Speaker 2: Yeah. Now I have to do some quick-fire questions. 672 00:36:28,640 --> 00:36:29,640 Speaker 2: You guys ready? 673 00:36:29,400 --> 00:36:29,879 Speaker 3: All right. 674 00:36:31,320 --> 00:36:34,000 Speaker 2: What's the number one thing that people misunderstand about AI? 675 00:36:35,440 --> 00:36:38,680 Speaker 3: The reliance on data. What do you mean by that? 676 00:36:39,480 --> 00:36:42,000 Speaker 3: I think that it's just assumed that it's happening and 677 00:36:42,200 --> 00:36:44,680 Speaker 3: it can just go out and grab data anywhere. 678 00:36:45,680 --> 00:36:47,319 Speaker 2: Yeah, you have. 679 00:36:47,280 --> 00:36:50,520 Speaker 3: To have good data, reliable data, and access to the data. 680 00:36:51,239 --> 00:36:53,000 Speaker 4: I mean, people are too afraid of it. 681 00:36:53,400 --> 00:36:56,480 Speaker 2: Chatbots and image generators are the biggest things in consumer 682 00:36:56,520 --> 00:36:58,440 Speaker 2: AI right now. What do you think is the next 683 00:36:58,480 --> 00:36:59,759 Speaker 2: big business application? 684 00:37:00,760 --> 00:37:04,000 Speaker 3: I think it's the tying together of multiple capabilities. I 685 00:37:04,080 --> 00:37:07,360 Speaker 3: hinted towards this earlier: I think it's tying together the 686 00:37:07,480 --> 00:37:10,160 Speaker 3: disparate systems that sit in different parts of the organization, 687 00:37:10,200 --> 00:37:12,839 Speaker 3: front office, back office, making it one office and tying 688 00:37:12,840 --> 00:37:14,560 Speaker 3: together those different functions. That's it. 689 00:37:15,640 --> 00:37:18,080 Speaker 4: I mean, it's workflow automation. And back to your 690 00:37:18,080 --> 00:37:21,319 Speaker 4: point on the reliance on data: it seems easy, but it's a 691 00:37:21,320 --> 00:37:22,759 Speaker 4: lot harder than you think, because you have to have 692 00:37:22,800 --> 00:37:24,760 Speaker 4: everything set up in exactly the right way to get 693 00:37:25,000 --> 00:37:27,680 Speaker 4: all of your systems automated and the sort of 694 00:37:27,680 --> 00:37:29,839 Speaker 4: more boring jobs taken care of, so that humans can 695 00:37:29,880 --> 00:37:30,880 Speaker 4: do the strategic ones. 696 00:37:32,600 --> 00:37:35,080 Speaker 2: How are you already using AI in your day to 697 00:37:35,120 --> 00:37:35,600 Speaker 2: day life?
698 00:37:37,800 --> 00:37:40,720 Speaker 4: I mean, I use it at work all the time, 699 00:37:41,360 --> 00:37:43,319 Speaker 4: and then I've found right now I go to 700 00:37:43,400 --> 00:37:46,520 Speaker 4: ChatGPT instead of Google to look things up. I like 701 00:37:46,560 --> 00:37:47,400 Speaker 4: having a conversation. 702 00:37:48,840 --> 00:37:52,920 Speaker 3: We have a wonderful capability in our consulting business called 703 00:37:53,520 --> 00:37:57,719 Speaker 3: our consulting assistant, and Consulting Advantage is the proper name 704 00:37:57,760 --> 00:37:59,760 Speaker 3: for it, but I look at it as that assistant. 705 00:38:00,400 --> 00:38:02,680 Speaker 3: It's a force multiplier for me. So if I need 706 00:38:02,719 --> 00:38:07,520 Speaker 3: to pull together content proposals with the teams, we go 707 00:38:07,600 --> 00:38:08,120 Speaker 3: straight to that. 708 00:38:08,960 --> 00:38:11,440 Speaker 2: All right, one more. We hear so many definitions of 709 00:38:11,520 --> 00:38:14,799 Speaker 2: open related to technology. How do you define it, and 710 00:38:14,840 --> 00:38:16,960 Speaker 2: how does the concept help you innovate? 711 00:38:18,120 --> 00:38:21,600 Speaker 4: By definition, in cybersecurity, you don't want to be too open, right? 712 00:38:22,040 --> 00:38:26,040 Speaker 4: So I think we enable openness with this concept of 713 00:38:26,120 --> 00:38:28,400 Speaker 4: zero trust, saying, like, everyone's invited in as long 714 00:38:28,400 --> 00:38:31,080 Speaker 4: as you have the right credentials, right? So that's 715 00:38:31,160 --> 00:38:33,000 Speaker 4: one way, and then the other way is just making 716 00:38:33,040 --> 00:38:36,560 Speaker 4: sure you're connected to all the different systems in order 717 00:38:36,560 --> 00:38:38,880 Speaker 4: to be able to have that visibility and see what's happening. 718 00:38:38,880 --> 00:38:42,719 Speaker 4: Because if you are blind, that's the minute you have 719 00:38:42,719 --> 00:38:43,560 Speaker 4: that vulnerability. 720 00:38:43,719 --> 00:38:49,680 Speaker 3: Yeah, and I'd say it's moving quickly with security. It 721 00:38:49,760 --> 00:38:53,439 Speaker 3: sounds contradictory: open? Oh, then that means you're not safe. No, 722 00:38:53,520 --> 00:38:54,960 Speaker 3: you are safe and you can move faster. 723 00:38:55,280 --> 00:38:56,960 Speaker 2: Yeah. Thank you so much. This was fun. 724 00:38:57,080 --> 00:38:58,680 Speaker 4: Thanks a lot. Thank you. Great. We'll see you in 725 00:38:58,719 --> 00:38:59,200 Speaker 4: five years. 726 00:38:59,280 --> 00:39:03,720 Speaker 2: Yeah, five years, man. I'll be old in five years. 727 00:39:05,960 --> 00:39:09,719 Speaker 2: Thank you to Jason Kelly at IBM and Christy Fredericks at 728 00:39:09,760 --> 00:39:14,440 Speaker 2: Palo Alto Networks for that fascinating conversation about the threats 729 00:39:14,480 --> 00:39:20,280 Speaker 2: and opportunities in cybersecurity today. As Jason and Christy stressed, 730 00:39:20,640 --> 00:39:25,120 Speaker 2: AI can be a force multiplier for enterprises across industries.
731 00:39:26,120 --> 00:39:28,600 Speaker 2: When you're working with multiple products and have your data 732 00:39:28,640 --> 00:39:32,920 Speaker 2: in distributed environments, you need technology that will work across 733 00:39:33,000 --> 00:39:37,560 Speaker 2: your organization, and with Palo Alto Networks' platform, you can 734 00:39:37,680 --> 00:39:43,600 Speaker 2: enhance cyber resiliency and simplify your operations. Through their collaboration, 735 00:39:44,000 --> 00:39:47,520 Speaker 2: IBM and Palo Alto Networks are charting the future of 736 00:39:47,640 --> 00:39:54,880 Speaker 2: fully integrated, open, end-to-end security solutions. Smart Talks 737 00:39:54,880 --> 00:39:58,200 Speaker 2: with IBM is produced by Matt Romano, Joey Fishground, 738 00:39:58,320 --> 00:40:03,080 Speaker 2: Amy Gaines McQuaid, and Jacob Goldstein. We're edited by Lydia 739 00:40:03,120 --> 00:40:07,000 Speaker 2: Jane Kott. Our engineers are Sarah Bruguerer and Ben Tolliday. 740 00:40:07,320 --> 00:40:10,200 Speaker 2: Theme song by Gramoscope. Special thanks to the eight Bar 741 00:40:10,520 --> 00:40:14,239 Speaker 2: and IBM teams, as well as the Pushkin Marketing team. 742 00:40:14,640 --> 00:40:17,400 Speaker 2: Smart Talks with IBM is a production of Pushkin Industries 743 00:40:17,680 --> 00:40:21,960 Speaker 2: and Ruby Studio at iHeartMedia. To find more Pushkin podcasts, 744 00:40:22,280 --> 00:40:26,080 Speaker 2: listen on the iHeartRadio app, Apple Podcasts, or wherever you 745 00:40:26,200 --> 00:40:31,440 Speaker 2: listen to podcasts. I'm Malcolm Gladwell. This is a paid 746 00:40:31,520 --> 00:40:36,400 Speaker 2: advertisement from IBM. The conversations on this podcast don't necessarily 747 00:40:36,440 --> 00:40:51,160 Speaker 2: represent IBM's positions, strategies, or opinions.