1 00:00:00,120 --> 00:00:02,880 Speaker 1: Hey everyone, it's Robert and Joe here. Today we've got 2 00:00:02,880 --> 00:00:05,120 Speaker 1: something a little different to share with you. It's a 3 00:00:05,160 --> 00:00:08,760 Speaker 1: new season of the Smart Talks with IBM podcast series. 4 00:00:09,240 --> 00:00:12,040 Speaker 2: This season on Smart Talks, Malcolm Gladwell and team are 5 00:00:12,080 --> 00:00:15,280 Speaker 2: diving into the transformative world of artificial intelligence with a 6 00:00:15,320 --> 00:00:18,640 Speaker 2: fresh perspective on the concept of open. What does open 7 00:00:18,720 --> 00:00:21,919 Speaker 2: really mean in the context of AI? It can mean 8 00:00:22,040 --> 00:00:25,639 Speaker 2: open source code or open data, but it also encompasses 9 00:00:25,720 --> 00:00:30,800 Speaker 2: fostering an ecosystem of ideas, ensuring diverse perspectives are heard, 10 00:00:31,160 --> 00:00:33,559 Speaker 2: and enabling new levels of transparency. 11 00:00:33,880 --> 00:00:37,120 Speaker 1: Join hosts from your favorite Pushkin podcasts as they explore 12 00:00:37,120 --> 00:00:40,960 Speaker 1: how openness in AI is reshaping industries, driving innovation, and 13 00:00:41,000 --> 00:00:44,880 Speaker 1: redefining what's possible. You'll hear from industry experts and leaders 14 00:00:44,880 --> 00:00:48,360 Speaker 1: about the implications and possibilities of open AI, and of course, 15 00:00:48,720 --> 00:00:50,920 Speaker 1: Malcolm Gladwell will be there to guide you through the 16 00:00:50,960 --> 00:00:52,720 Speaker 1: season with his unique insights. 17 00:00:53,000 --> 00:00:55,560 Speaker 2: Look out for new episodes of Smart Talks every other 18 00:00:55,640 --> 00:00:59,200 Speaker 2: week on the iHeartRadio app, Apple Podcasts, or wherever you 19 00:00:59,240 --> 00:01:02,520 Speaker 2: get your podcasts, and learn more at IBM dot com 20 00:01:02,560 --> 00:01:04,280 Speaker 2: slash Smart Talks.
21 00:01:07,120 --> 00:01:10,479 Speaker 3: Hello, hello. Welcome to Smart Talks with IBM, a podcast 22 00:01:10,480 --> 00:01:16,920 Speaker 3: from Pushkin Industries, iHeartRadio and IBM. I'm Malcolm Gladwell. This season, 23 00:01:17,120 --> 00:01:20,039 Speaker 3: we're diving back into the world of artificial intelligence, but 24 00:01:20,120 --> 00:01:26,560 Speaker 3: with a focus on the powerful concept of open: its possibilities, implications, 25 00:01:26,840 --> 00:01:30,320 Speaker 3: and misconceptions. We'll look at openness from a variety of 26 00:01:30,400 --> 00:01:34,559 Speaker 3: angles and explore how the concept is already reshaping industries, 27 00:01:35,000 --> 00:01:38,759 Speaker 3: ways of doing business, and our very notion of what's possible. 28 00:01:39,880 --> 00:01:43,000 Speaker 3: On today's episode, I'm joined by Jason Kelly, the Global 29 00:01:43,040 --> 00:01:47,240 Speaker 3: Managing Partner for IBM Strategic Partners and Ecosystems, and by 30 00:01:47,360 --> 00:01:51,720 Speaker 3: Christy Fredericks, the Senior Vice President and Chief Partnership Officer 31 00:01:52,040 --> 00:01:56,560 Speaker 3: at Palo Alto Networks. We discussed how their partnership in 32 00:01:56,600 --> 00:02:01,760 Speaker 3: the cybersecurity space helps strengthen enterprises by focusing on seamless 33 00:02:01,880 --> 00:02:07,440 Speaker 3: cybersecurity solutions tailored to meet the evolving threat landscape. By 34 00:02:07,520 --> 00:02:12,560 Speaker 3: leveraging AI and automation, this collaboration aims to modernize security programs, 35 00:02:12,960 --> 00:02:17,959 Speaker 3: improve response times, and reduce risks. Jason and Christy both 36 00:02:18,000 --> 00:02:22,280 Speaker 3: bring a tremendous amount of experience and expertise to the subject. 37 00:02:23,400 --> 00:02:38,040 Speaker 3: I think you're really going to enjoy this one. Jason, Christy, 38 00:02:38,160 --> 00:02:41,240 Speaker 3: welcome to Smart Talks with IBM.
Thank you for joining me. 39 00:02:41,520 --> 00:02:42,919 Speaker 4: Thank you, it's great to be here. 40 00:02:43,240 --> 00:02:46,720 Speaker 3: We are here to discuss cybersecurity and the partnership between 41 00:02:46,760 --> 00:02:50,000 Speaker 3: IBM and Palo Alto Networks. But before we get there, 42 00:02:50,240 --> 00:02:51,919 Speaker 3: I wanted you guys to tell me a little bit 43 00:02:52,480 --> 00:02:57,200 Speaker 3: about yourselves. Jason, let's start with you. I see on your 44 00:02:57,840 --> 00:03:01,760 Speaker 3: resume West Point, which makes me think there's some interesting 45 00:03:01,800 --> 00:03:04,799 Speaker 3: things going on there. How did you get to West Point? 46 00:03:05,400 --> 00:03:07,880 Speaker 4: West Point? West Point was the decision. First, it was 47 00:03:07,919 --> 00:03:11,000 Speaker 4: affordable back in the day. But I had 48 00:03:11,040 --> 00:03:12,880 Speaker 4: a sense of service. My father was a World War 49 00:03:12,919 --> 00:03:15,920 Speaker 4: Two vet, so I grew up on the weekends watching 50 00:03:16,000 --> 00:03:19,920 Speaker 4: World War Two videos. You know, he's Army as well. Yeah, 51 00:03:20,000 --> 00:03:24,520 Speaker 4: and so I thought, oh, that'd be exciting, and I 52 00:03:24,520 --> 00:03:27,800 Speaker 4: thought I'd do some type of service. Went there, and 53 00:03:27,840 --> 00:03:30,560 Speaker 4: now I have the biggest family, extended family, I could 54 00:03:30,560 --> 00:03:33,920 Speaker 4: ever have. So it was very exciting. Played football, lucked out, 55 00:03:34,760 --> 00:03:40,240 Speaker 4: meaning I wasn't recruited. I walked on, and that kept 56 00:03:40,280 --> 00:03:42,160 Speaker 4: me there because it gave me something, an outlet, 57 00:03:42,240 --> 00:03:45,440 Speaker 4: with all the other pressures. As a defensive back, I was.
58 00:03:45,480 --> 00:03:47,560 Speaker 4: I was great at knocking the ball down, not the 59 00:03:47,560 --> 00:03:48,360 Speaker 4: best at catching it. 60 00:03:48,480 --> 00:03:51,720 Speaker 3: Yeah, and then you were a ranger. 61 00:03:52,640 --> 00:03:55,040 Speaker 4: I was. I was privileged to be a US Army 62 00:03:55,040 --> 00:03:58,240 Speaker 4: Airborne Ranger, stationed, but did most of my time in 63 00:03:58,280 --> 00:04:01,320 Speaker 4: northern Italy. We were part of the, it's the 82nd. 64 00:04:01,200 --> 00:04:02,240 Speaker 3: A hardship post. 65 00:04:02,320 --> 00:04:05,080 Speaker 4: Oh yeah, that's what people say, seriously, like, you know, 66 00:04:05,160 --> 00:04:07,840 Speaker 4: you were, you're drinking wine and having 67 00:04:07,880 --> 00:04:10,880 Speaker 4: bread, you know. But it was a part of a 68 00:04:11,080 --> 00:04:14,680 Speaker 4: NATO force there at the time. Yeah, so exciting. 69 00:04:14,880 --> 00:04:18,560 Speaker 3: How did you get from there to IBM? 70 00:04:19,040 --> 00:04:21,560 Speaker 4: A long path. As I came out of the military, 71 00:04:21,800 --> 00:04:29,440 Speaker 4: I started in manufacturing, retail housing, and did a quick stint, 72 00:04:29,480 --> 00:04:33,520 Speaker 4: took a leave of absence from industry, and did a 73 00:04:33,720 --> 00:04:36,839 Speaker 4: stint of, yet again, public service in the state of 74 00:04:36,880 --> 00:04:41,479 Speaker 4: Tennessee with economic development, and got a whiff of how 75 00:04:41,520 --> 00:04:45,000 Speaker 4: fun it could be to do things around data and media. 76 00:04:45,320 --> 00:04:48,280 Speaker 4: Started a small media firm, what we would now call 77 00:04:48,279 --> 00:04:52,640 Speaker 4: a digital firm, sold it, and said I wanted to 78 00:04:52,640 --> 00:04:55,039 Speaker 4: go do it again somewhere, but I wanted to go 79 00:04:55,080 --> 00:04:57,719 Speaker 4: to a big company.
And the family at IBM brought 80 00:04:57,760 --> 00:04:59,880 Speaker 4: me in and has yet to let me go. 81 00:05:00,120 --> 00:05:01,280 Speaker 3: That was how many years ago? 82 00:05:03,120 --> 00:05:03,799 Speaker 4: Two decades. 83 00:05:03,960 --> 00:05:04,360 Speaker 3: Oh wow. 84 00:05:04,760 --> 00:05:07,200 Speaker 4: Yeah, so, I know I look amazingly young, but yes, 85 00:05:10,240 --> 00:05:13,719 Speaker 4: IBM was my fifth career, and I've enjoyed 86 00:05:13,720 --> 00:05:15,440 Speaker 4: it since. And that's what I do: 87 00:05:15,680 --> 00:05:18,560 Speaker 4: build teams, grow new parts of the company, and get 88 00:05:18,600 --> 00:05:21,200 Speaker 4: to work with some of the most brilliant people on 89 00:05:21,200 --> 00:05:23,640 Speaker 4: the face of the planet, as well as partners like 90 00:05:23,920 --> 00:05:26,080 Speaker 4: Christy that just keep it exciting. 91 00:05:26,440 --> 00:05:29,000 Speaker 3: Christy, I was delighted to learn that you are Canadian. 92 00:05:29,080 --> 00:05:34,800 Speaker 3: Yes? Yeah. So you were a consultant 93 00:05:34,839 --> 00:05:36,240 Speaker 3: for a long time at Bain. 94 00:05:36,560 --> 00:05:39,800 Speaker 5: Yes. Yeah. I joined Bain Consulting intending to spend a 95 00:05:39,839 --> 00:05:41,920 Speaker 5: couple of years there, learn the ropes, and then go 96 00:05:41,960 --> 00:05:45,479 Speaker 5: get my first real job. But the value, personally, to 97 00:05:45,520 --> 00:05:47,560 Speaker 5: my growth and development, and then that we were able 98 00:05:47,560 --> 00:05:50,119 Speaker 5: to bring our clients, I ended up there for sixteen years, 99 00:05:50,680 --> 00:05:54,239 Speaker 5: and then post-Bain went on to my first 100 00:05:54,240 --> 00:05:57,039 Speaker 5: product company, New Relic, and then it's come full 101 00:05:57,040 --> 00:06:00,440 Speaker 5: circle at Palo Alto Networks.
But at Bain it was all 102 00:06:00,480 --> 00:06:03,920 Speaker 5: about bringing expertise across different industries to help our clients 103 00:06:04,160 --> 00:06:07,719 Speaker 5: improve whatever they needed to improve and bringing that expertise 104 00:06:07,800 --> 00:06:10,200 Speaker 5: to bear. And then you have the product lens, and 105 00:06:10,200 --> 00:06:11,960 Speaker 5: you think, okay, we're going to build the absolute best 106 00:06:12,000 --> 00:06:14,560 Speaker 5: product to help our customers do what they need to 107 00:06:14,600 --> 00:06:17,360 Speaker 5: get done. And then I joined Palo Alto about six 108 00:06:17,400 --> 00:06:21,480 Speaker 5: or seven months ago in a partnerships role, and I'm delighted 109 00:06:21,520 --> 00:06:23,840 Speaker 5: to be able to work with amazing consulting companies like 110 00:06:23,880 --> 00:06:25,800 Speaker 5: IBM where we bring both to bear. 111 00:06:26,320 --> 00:06:29,480 Speaker 3: How long have IBM and Palo Alto Networks been partners? 112 00:06:30,080 --> 00:06:33,000 Speaker 4: So we've been working together for quite 113 00:06:33,040 --> 00:06:36,520 Speaker 4: a long time. But we made it official, meaning we 114 00:06:36,839 --> 00:06:39,000 Speaker 4: got married as strategic partners last year. 115 00:06:39,160 --> 00:06:41,159 Speaker 3: Oh, I see. So what is it that each of 116 00:06:41,240 --> 00:06:44,320 Speaker 3: you brings to the table? What's special about each side? 117 00:06:44,480 --> 00:06:47,359 Speaker 4: So it's great that you asked that, because about a 118 00:06:47,400 --> 00:06:51,320 Speaker 4: decade ago, our now CEO, Arvind Krishna, said, you know, 119 00:06:51,320 --> 00:06:53,920 Speaker 4: wouldn't it be great if we just had this one 120 00:06:53,960 --> 00:06:56,479 Speaker 4: focus? With this, what does IBM do? You have 121 00:06:56,560 --> 00:06:58,880 Speaker 4: this whole list. And he said, let's make it simple.
122 00:06:59,440 --> 00:07:04,200 Speaker 4: We are a multicloud, hybrid cloud, AI company. And 123 00:07:04,240 --> 00:07:06,599 Speaker 4: so when you say that, it sounds very simple, but 124 00:07:06,640 --> 00:07:10,600 Speaker 4: then people ask, what the hell is that? What's your hybrid cloud? Well, 125 00:07:11,280 --> 00:07:13,720 Speaker 4: both of those two things have a lot of data involved, 126 00:07:13,960 --> 00:07:16,520 Speaker 4: and a lot of that means that that data is 127 00:07:16,520 --> 00:07:20,480 Speaker 4: going to sit in multiple places and distributed environments. Well, 128 00:07:20,480 --> 00:07:23,840 Speaker 4: if you're able to tie those things together with multiple partners, 129 00:07:24,200 --> 00:07:28,280 Speaker 4: you also have to make sure that it's secure, because 130 00:07:28,720 --> 00:07:31,880 Speaker 4: in the direction that we're going, where data is now 131 00:07:32,160 --> 00:07:34,640 Speaker 4: being consumed in many different places and is the 132 00:07:34,640 --> 00:07:39,120 Speaker 4: fuel behind AI as we know, then you say, ah, well, 133 00:07:39,640 --> 00:07:42,120 Speaker 4: who does that well, and who does it in a 134 00:07:42,120 --> 00:07:44,960 Speaker 4: way that's getting rid of seams, the seams that could 135 00:07:44,960 --> 00:07:49,000 Speaker 4: be across multiple products, multiple product sets even? And that's 136 00:07:49,000 --> 00:07:49,960 Speaker 4: where Palo Alto comes in. 137 00:07:50,840 --> 00:07:54,560 Speaker 5: I think the conventional wisdom in cybersecurity was always you 138 00:07:54,640 --> 00:07:58,040 Speaker 5: need all the new tools, right, you need one for every threat. 139 00:07:58,040 --> 00:07:59,760 Speaker 5: It's like whack-a-mole. Every threat that pops up, 140 00:07:59,800 --> 00:08:03,360 Speaker 5: you get the tool that's purpose-built for that specific thing. Well, 141 00:08:03,360 --> 00:08:06,320 Speaker 5: fast forward to the RSA conference this year.
There were 142 00:08:06,600 --> 00:08:09,760 Speaker 5: four thousand vendors on the floor. You look at an 143 00:08:09,760 --> 00:08:13,800 Speaker 5: average company, there's hundreds of cybersecurity tools. It introduces a 144 00:08:13,880 --> 00:08:17,760 Speaker 5: level of complexity that is really hard to manage. You, 145 00:08:18,000 --> 00:08:23,240 Speaker 5: as a user, query an application, right? That query can 146 00:08:23,280 --> 00:08:26,880 Speaker 5: go through a bunch of different pings from one cloud 147 00:08:26,880 --> 00:08:29,160 Speaker 5: to the next. It goes into and out of a SaaS application, 148 00:08:29,240 --> 00:08:31,679 Speaker 5: it may be running along a network. You may be accessing 149 00:08:31,720 --> 00:08:34,560 Speaker 5: it from your phone, which is an unmanaged device. It's 150 00:08:34,559 --> 00:08:36,679 Speaker 5: got to go in and out. And if you say, okay, 151 00:08:36,880 --> 00:08:39,640 Speaker 5: I've got to secure that phone, I've got to secure 152 00:08:39,679 --> 00:08:41,560 Speaker 5: the network, I've got it, then all of a sudden 153 00:08:41,600 --> 00:08:45,240 Speaker 5: you've got sort of firewalls, software and hardware firewalls, popping up everywhere. 154 00:08:45,480 --> 00:08:48,400 Speaker 5: You've got cloud security, and you've probably heard of 155 00:08:48,400 --> 00:08:50,400 Speaker 5: this concept of zero trust, which is every time you 156 00:08:50,400 --> 00:08:51,920 Speaker 5: have to check and say, are you allowed in here? 157 00:08:51,960 --> 00:08:54,120 Speaker 5: Are you allowed in here? The number of places that 158 00:08:54,160 --> 00:08:56,840 Speaker 5: can fall down, it just becomes overwhelming.
So you end 159 00:08:56,960 --> 00:09:02,360 Speaker 5: up with either alerts firing every two seconds that you 160 00:09:02,360 --> 00:09:04,120 Speaker 5: have to then go investigate, most of which are 161 00:09:04,559 --> 00:09:08,679 Speaker 5: false positives, or you miss something, right? And so that 162 00:09:08,760 --> 00:09:11,000 Speaker 5: was the conventional wisdom: we've got to buy all these tools. 163 00:09:11,040 --> 00:09:13,960 Speaker 5: And now you've got overwhelmed CIOs and CSOs with hundreds 164 00:09:13,960 --> 00:09:17,439 Speaker 5: of tools. And Palo Alto's strategy has been, look, we're 165 00:09:17,480 --> 00:09:20,280 Speaker 5: going to create a platform where everything can be stitched together, 166 00:09:20,400 --> 00:09:24,240 Speaker 5: everything can speak the same language, and we can sort 167 00:09:24,240 --> 00:09:28,560 Speaker 5: of manage throughout the architecture and watch this call 168 00:09:28,640 --> 00:09:32,760 Speaker 5: as it's passing through all these different checkpoints, and we 169 00:09:32,760 --> 00:09:34,280 Speaker 5: can do it in a way that you still have 170 00:09:34,320 --> 00:09:36,280 Speaker 5: the confidence that it's best of breed, right, so you're 171 00:09:36,280 --> 00:09:39,160 Speaker 5: not making any trade-offs. But it's not so simple 172 00:09:39,559 --> 00:09:43,040 Speaker 5: just to get from the spaghetti to the seamless architecture. 173 00:09:43,080 --> 00:09:46,600 Speaker 5: You often need to re-engineer your business processes. You 174 00:09:46,679 --> 00:09:49,839 Speaker 5: have to re-architect your digital environment.
And so that's 175 00:09:49,880 --> 00:09:52,720 Speaker 5: where we partner with a company like IBM to bring 176 00:09:52,760 --> 00:09:54,920 Speaker 5: that expertise and say, we're going to help you not 177 00:09:55,040 --> 00:09:57,920 Speaker 5: just deploy the best cybersecurity architecture, but really get your 178 00:09:58,040 --> 00:10:00,760 Speaker 5: environment ready to have this architecture. 179 00:10:00,600 --> 00:10:04,160 Speaker 4: As well as all of those players that cross that spaghetti. 180 00:10:04,840 --> 00:10:07,560 Speaker 4: Because when you start thinking about all the other partners 181 00:10:07,559 --> 00:10:09,480 Speaker 4: that you work with, if you think of it from an 182 00:10:09,520 --> 00:10:12,480 Speaker 4: industry perspective, you're gonna have an ERP. It could 183 00:10:12,520 --> 00:10:15,200 Speaker 4: be an Oracle, it could be an SAP. You're not 184 00:10:15,240 --> 00:10:16,840 Speaker 4: going to have one cloud, as I mentioned, it's gonna 185 00:10:16,840 --> 00:10:20,960 Speaker 4: be possibly multiple clouds. You'll have some AWS, maybe Microsoft 186 00:10:21,040 --> 00:10:23,360 Speaker 4: Azure, and then even some Google in there, and 187 00:10:23,360 --> 00:10:26,440 Speaker 4: then your own that you've built in your private cloud over there, 188 00:10:27,360 --> 00:10:31,360 Speaker 4: or an IBM Cloud. You'll have those multiple clouds, and 189 00:10:31,400 --> 00:10:34,520 Speaker 4: then you also will have, you know, fit for purpose: oh, well, 190 00:10:34,559 --> 00:10:37,120 Speaker 4: I need a Salesforce in there for 191 00:10:37,520 --> 00:10:40,000 Speaker 4: my customer focus. I'm doing some graphics, so 192 00:10:40,040 --> 00:10:42,280 Speaker 4: I have Adobe. And just as I can name, 193 00:10:42,400 --> 00:10:47,480 Speaker 4: name, name, all of those then have to be re-engineered. Seriously. 194 00:10:48,080 --> 00:10:49,840 Speaker 4: I mean, come on, Malcolm. You're gonna sit there.
You 195 00:10:49,880 --> 00:10:52,920 Speaker 4: think how long that would take. So if you haven't 196 00:10:52,960 --> 00:10:56,240 Speaker 4: done that before, you're gonna have to go to each 197 00:10:56,280 --> 00:10:58,600 Speaker 4: one of those individually, or you can work with a 198 00:10:58,640 --> 00:11:01,280 Speaker 4: company that can tie those things together, because we are 199 00:11:01,320 --> 00:11:05,040 Speaker 4: also strategic partners with them. So that's where you start 200 00:11:05,040 --> 00:11:09,280 Speaker 4: to say, okay, I see how this comes together. You 201 00:11:09,480 --> 00:11:12,800 Speaker 4: have to make sure that your ecosystem is going to 202 00:11:12,800 --> 00:11:15,640 Speaker 4: be stronger than your competitor's ecosystem, and you have to 203 00:11:15,679 --> 00:11:18,280 Speaker 4: be secure in what you're doing, because as you add 204 00:11:18,320 --> 00:11:21,800 Speaker 4: more players or products, you create seams, and you want 205 00:11:21,840 --> 00:11:25,160 Speaker 4: to make sure there's fewer seams and that there's zero 206 00:11:25,240 --> 00:11:29,880 Speaker 4: trust across that capability you're building. And that's why the 207 00:11:29,920 --> 00:11:31,880 Speaker 4: complement between the two companies matters. 208 00:11:31,559 --> 00:11:33,600 Speaker 3: Well, let's take a step back for a moment before we 209 00:11:33,720 --> 00:11:36,440 Speaker 3: launch into the specifics of what 210 00:11:36,559 --> 00:11:40,400 Speaker 3: you guys are doing. I'm curious, at this moment in 211 00:11:40,520 --> 00:11:46,880 Speaker 3: twenty twenty four, how nervous should we be about cybersecurity?
212 00:11:47,040 --> 00:11:51,240 Speaker 3: So compare it to five years ago or ten years ago. 213 00:11:51,760 --> 00:11:53,760 Speaker 3: Are you less nervous than you were five 214 00:11:53,840 --> 00:11:56,480 Speaker 3: years ago, or more nervous? Are all of the changes going 215 00:11:56,520 --> 00:12:00,920 Speaker 3: on right now increasing vulnerability or decreasing it? 216 00:12:01,360 --> 00:12:04,840 Speaker 4: I would say, and Christy also, I think, shares this 217 00:12:04,880 --> 00:12:09,000 Speaker 4: point of view, that it's not necessarily about being more nervous. 218 00:12:09,040 --> 00:12:13,559 Speaker 4: I think you should be more prepared, because the amount 219 00:12:14,040 --> 00:12:18,400 Speaker 4: of threat is increasing based on our dependence upon data. 220 00:12:19,480 --> 00:12:23,680 Speaker 4: And that's where I think the attention should be placed: 221 00:12:23,800 --> 00:12:27,240 Speaker 4: more and more, especially with the importance of AI, 222 00:12:28,360 --> 00:12:31,080 Speaker 4: you say, okay, then what's under all that? And 223 00:12:31,120 --> 00:12:35,840 Speaker 4: it's the data, as I said. So knowing that, you 224 00:12:35,880 --> 00:12:37,480 Speaker 4: should be more concerned. 225 00:12:38,400 --> 00:12:42,400 Speaker 3: Does the advent of AI and its rapid evolution help 226 00:12:42,520 --> 00:12:44,320 Speaker 3: defense more or offense more? 227 00:12:44,840 --> 00:12:47,760 Speaker 5: I think it's like any mega trend 228 00:12:47,760 --> 00:12:51,760 Speaker 5: that we've witnessed: both.
Right. So you think about AI: 229 00:12:51,920 --> 00:12:55,800 Speaker 5: it's ninety nine percent great, right, in terms of 230 00:12:55,840 --> 00:12:58,840 Speaker 5: what it's going to unlock for productivity, for humanity. But 231 00:12:59,000 --> 00:13:01,080 Speaker 5: it also makes it a whole lot easier to build 232 00:13:01,160 --> 00:13:04,160 Speaker 5: ransomware, a whole lot easier to test different ways 233 00:13:04,640 --> 00:13:07,240 Speaker 5: into a system, right. But I think that's true if 234 00:13:07,280 --> 00:13:09,080 Speaker 5: you think about, like, the rise of the Internet, right: 235 00:13:09,080 --> 00:13:11,800 Speaker 5: all of a sudden, everyone was putting their data online 236 00:13:12,559 --> 00:13:14,800 Speaker 5: and you had to think of new ways to stay 237 00:13:14,840 --> 00:13:16,640 Speaker 5: ahead and keep that secure. And I don't think AI 238 00:13:16,720 --> 00:13:20,240 Speaker 5: is any different. You've got companies like Palo Alto, partnerships like 239 00:13:20,240 --> 00:13:25,600 Speaker 5: Palo Alto and IBM, that are constantly scanning the landscape for 240 00:13:25,640 --> 00:13:28,000 Speaker 5: not only the current threats, but what's next, what's coming 241 00:13:28,040 --> 00:13:30,640 Speaker 5: around the corner, what's after AI. And so I think 242 00:13:30,880 --> 00:13:33,200 Speaker 5: taking it seriously and being prepared is probably the right 243 00:13:33,280 --> 00:13:35,800 Speaker 5: way of looking at it, because if 244 00:13:35,800 --> 00:13:38,280 Speaker 5: you think about it too hard, you'll just want to 245 00:13:38,320 --> 00:13:42,520 Speaker 5: crawl into a corner and stuff everything under the mattress. 246 00:13:42,880 --> 00:13:50,080 Speaker 3: Say I am the CEO of a regional hospital chain, a big 247 00:13:50,280 --> 00:13:55,319 Speaker 3: distributed healthcare system, so a ton of data.
The consequences 248 00:13:55,320 --> 00:13:58,079 Speaker 3: of being hacked and held for ransom 249 00:13:57,720 --> 00:13:59,240 Speaker 2: are life and death. 250 00:13:59,400 --> 00:14:03,000 Speaker 3: Life and death, right. So you 251 00:14:03,440 --> 00:14:05,360 Speaker 3: come down, you sit down with me, and you chat 252 00:14:05,400 --> 00:14:08,880 Speaker 3: with me. Walk me through the kinds of things you 253 00:14:08,880 --> 00:14:11,480 Speaker 3: would tell me about what I need to get safer. 254 00:14:11,800 --> 00:14:15,000 Speaker 3: For example, let's start with one: is it likely that 255 00:14:15,040 --> 00:14:16,920 Speaker 3: I'm spending too little, or am I spending money in 256 00:14:16,960 --> 00:14:17,640 Speaker 3: the wrong place? 257 00:14:18,520 --> 00:14:22,600 Speaker 5: Great question. It depends how you've broken it out. If 258 00:14:22,640 --> 00:14:25,760 Speaker 5: you are distributing all of your dollars across a whole 259 00:14:25,760 --> 00:14:28,360 Speaker 5: bunch of different tools, it's likely you're just spending the 260 00:14:28,360 --> 00:14:30,640 Speaker 5: wrong money. And in fact, you know, putting it all 261 00:14:30,640 --> 00:14:33,440 Speaker 5: in one place is a way of potentially saving money 262 00:14:33,760 --> 00:14:37,640 Speaker 5: but keeping your security actually higher. And I'd love to 263 00:14:37,640 --> 00:14:39,480 Speaker 5: hear, Jason, how you would approach it. How we would 264 00:14:39,480 --> 00:14:41,680 Speaker 5: approach it, of course, is by saying, you know, 265 00:14:41,680 --> 00:14:44,800 Speaker 5: what does your environment look like? You know, do you 266 00:14:44,840 --> 00:14:49,320 Speaker 5: have connected medical devices feeding into your EMR? Are your 267 00:14:49,440 --> 00:14:52,960 Speaker 5: respirators and ventilators all online? Right?
And so we would 268 00:14:52,960 --> 00:14:55,800 Speaker 5: talk about, okay, here's how you get coverage, and how 269 00:14:55,840 --> 00:14:58,360 Speaker 5: the coverage of both the firewalls as well as the 270 00:14:58,440 --> 00:15:01,720 Speaker 5: detectors all feeds back into your security operations center, and 271 00:15:01,760 --> 00:15:04,480 Speaker 5: you can manage it and do your learning with AI 272 00:15:05,520 --> 00:15:06,560 Speaker 5: and keep yourself secure. 273 00:15:06,560 --> 00:15:08,680 Speaker 4: And so, yeah, I would say Christy and I 274 00:15:08,720 --> 00:15:11,080 Speaker 4: would go to the same point, because if you get 275 00:15:11,160 --> 00:15:14,480 Speaker 4: under what she was just asking, it is: is your data 276 00:15:15,080 --> 00:15:19,360 Speaker 4: on prem, and when it's on prem, how active is 277 00:15:19,400 --> 00:15:23,600 Speaker 4: it across the enterprise? And so that becomes the basis 278 00:15:23,600 --> 00:15:25,640 Speaker 4: for the start. And then often you're going to say, well, 279 00:15:25,680 --> 00:15:28,840 Speaker 4: we actually take in data from outside, and then we 280 00:15:28,880 --> 00:15:32,400 Speaker 4: also have the circumstances where there's a lot of PII, 281 00:15:32,440 --> 00:15:37,680 Speaker 4: and so that's the personal information, right. Yeah. And 282 00:15:37,760 --> 00:15:41,000 Speaker 4: so now you're saying, okay, now how are we securing 283 00:15:41,080 --> 00:15:43,920 Speaker 4: that and where are we securing it? And so you 284 00:15:43,960 --> 00:15:47,800 Speaker 4: have to start really thinking about the different areas within 285 00:15:48,120 --> 00:15:52,200 Speaker 4: that hospital chain. Are you sharing that amongst your hospitals?
286 00:15:52,760 --> 00:15:55,880 Speaker 4: And now you start to think, if I'm saying 287 00:15:55,960 --> 00:15:57,400 Speaker 4: no to a lot of that, it's like, well, then 288 00:15:57,440 --> 00:16:00,120 Speaker 4: are you as efficient as you want to be? So 289 00:16:00,160 --> 00:16:03,640 Speaker 4: there is that trade-off of, you know, am I 290 00:16:03,680 --> 00:16:07,240 Speaker 4: so tightly walled that I'm not productive? And so that's 291 00:16:07,280 --> 00:16:10,400 Speaker 4: where we would start to say, what's the outcome that 292 00:16:10,400 --> 00:16:12,560 Speaker 4: you're trying to get to? All right, maybe you're good. 293 00:16:12,600 --> 00:16:14,960 Speaker 4: Maybe you're good with your five locations and you 294 00:16:14,960 --> 00:16:17,480 Speaker 4: don't need to go any further. But maybe you want 295 00:16:17,520 --> 00:16:19,520 Speaker 4: to expand to fifty, and by the way, you're going 296 00:16:19,600 --> 00:16:21,680 Speaker 4: to go cross-border, you're going to be in Toronto and 297 00:16:21,800 --> 00:16:25,080 Speaker 4: in New York. Okay, well, then how do you do that? 298 00:16:25,960 --> 00:16:28,640 Speaker 4: And so I think that it's very easy to start 299 00:16:28,720 --> 00:16:34,160 Speaker 4: jumping into any of the typical situations. But the first 300 00:16:34,280 --> 00:16:38,560 Speaker 4: question that you have to ask, as the hospital CEO, 301 00:16:39,040 --> 00:16:42,360 Speaker 4: is, what's your objective? What are you trying to do? 302 00:16:42,400 --> 00:16:45,960 Speaker 4: Because too often what we see is that there's some bright, new, 303 00:16:45,960 --> 00:16:49,840 Speaker 4: shiny thing that everybody wants to put in play. You know, 304 00:16:49,880 --> 00:16:53,280 Speaker 4: it's a sandwich looking for lunch, and you go, but 305 00:16:53,360 --> 00:16:54,960 Speaker 4: what is it that you want to do with this? 306 00:16:55,560 --> 00:16:58,120 Speaker 4: Are you doing research?
Are you a research hospital? Are you 307 00:16:58,200 --> 00:17:01,400 Speaker 4: more consumer oriented? So those are the questions you start 308 00:17:01,440 --> 00:17:03,880 Speaker 4: to ask, because they start to then tell a story 309 00:17:04,400 --> 00:17:07,640 Speaker 4: in line with Christy's questions. And I think that 310 00:17:07,640 --> 00:17:11,080 Speaker 4: that's where, again, the complement is: instead of 311 00:17:11,160 --> 00:17:14,000 Speaker 4: just saying, oh, well, thanks for telling me all this, Malcolm, 312 00:17:14,040 --> 00:17:18,680 Speaker 4: here's your ten-page strategy, now go find somebody, we 313 00:17:18,760 --> 00:17:22,080 Speaker 4: have the benefit in IBM, and it's probably why I'm 314 00:17:22,080 --> 00:17:24,440 Speaker 4: still there, is, you know, we're very unique. We're the 315 00:17:24,480 --> 00:17:28,640 Speaker 4: only company on the planet that has a consulting business 316 00:17:28,920 --> 00:17:32,920 Speaker 4: at scale inside of a technology company, and so we have, 317 00:17:33,760 --> 00:17:36,000 Speaker 4: you know, the left brain, right brain. We're able to 318 00:17:36,040 --> 00:17:38,240 Speaker 4: do that. And then we're able to say, okay, now 319 00:17:38,240 --> 00:17:42,640 Speaker 4: which partners are going to be most valuable for our clients? 320 00:17:43,080 --> 00:17:44,680 Speaker 4: What's going to work for you isn't going to work 321 00:17:44,720 --> 00:17:47,040 Speaker 4: for the manufacturer down the road, isn't going to work 322 00:17:47,040 --> 00:17:51,879 Speaker 4: for the consumer or CPG company across the river. Those 323 00:17:51,920 --> 00:17:55,720 Speaker 4: things are very specific. The threats and the seams that 324 00:17:55,760 --> 00:17:59,040 Speaker 4: I was talking about are very specific.
So that's where 325 00:17:59,080 --> 00:18:02,359 Speaker 4: it becomes very valuable to make sure that I'm not 326 00:18:02,440 --> 00:18:05,480 Speaker 4: just giving you some strategy that's generic. 327 00:18:05,560 --> 00:18:10,280 Speaker 3: But as a healthcare CEO, everything I have done, 328 00:18:10,800 --> 00:18:14,120 Speaker 3: almost everything I've done over the last ten years, has 329 00:18:14,240 --> 00:18:16,800 Speaker 3: had the effect of increasing my vulnerability. I wanted 330 00:18:16,840 --> 00:18:19,600 Speaker 3: to digitize data within the hospital that used to be on 331 00:18:19,640 --> 00:18:23,200 Speaker 3: pieces of paper. I want doctors to go home and 332 00:18:23,320 --> 00:18:25,280 Speaker 3: to be able to seamlessly hook into stuff at work, 333 00:18:25,320 --> 00:18:27,840 Speaker 3: because they've got to do all their paperwork. I want 334 00:18:27,840 --> 00:18:30,520 Speaker 3: to make sure the diabetes people are speaking to the 335 00:18:30,640 --> 00:18:34,240 Speaker 3: organ transplant people. And so isn't everything I 336 00:18:34,280 --> 00:18:36,920 Speaker 3: have done to kind of keep up with the revolution 337 00:18:37,040 --> 00:18:39,760 Speaker 3: in healthcare, isn't that also making me more and more 338 00:18:39,840 --> 00:18:41,639 Speaker 3: vulnerable to a bad actor? 339 00:18:41,760 --> 00:18:44,120 Speaker 5: It's such a great question, because think about the quality 340 00:18:44,119 --> 00:18:47,560 Speaker 5: of healthcare delivery, right. So now doctors aren't filling out forms, 341 00:18:47,560 --> 00:18:50,320 Speaker 5: they're spending time with patients, and so the quality of 342 00:18:50,320 --> 00:18:52,920 Speaker 5: care is improving, and the vulnerability is increasing, right. And 343 00:18:52,960 --> 00:18:56,840 Speaker 5: so I think that's where having a strong cybersecurity strategy 344 00:18:57,480 --> 00:18:59,920 Speaker 5: actually enables all of that.
One of our products is 345 00:19:00,119 --> 00:19:03,240 Speaker 5: our SASE product, and we tested it with some business applications, 346 00:19:03,280 --> 00:19:05,840 Speaker 5: and oftentimes the rap is, oh, security is going to 347 00:19:05,840 --> 00:19:07,880 Speaker 5: slow you down, right? Like, you have to add a firewall, 348 00:19:07,920 --> 00:19:11,720 Speaker 5: you have to add checkpoints. Our product actually increases the velocity 349 00:19:11,880 --> 00:19:14,359 Speaker 5: of your ability to use that application because of the 350 00:19:14,359 --> 00:19:18,280 Speaker 5: way that it is queried through our system as opposed 351 00:19:18,320 --> 00:19:20,880 Speaker 5: to just through the regular network. So it doesn't slow 352 00:19:20,880 --> 00:19:23,119 Speaker 5: it down, and in fact, it makes it run more efficiently. 353 00:19:23,840 --> 00:19:29,080 Speaker 5: That's just one minor example. But back to the healthcare question. I, 354 00:19:29,119 --> 00:19:31,480 Speaker 5: as a patient, want my doctors accessing all the technology 355 00:19:31,520 --> 00:19:33,560 Speaker 5: and talking to each other and connecting the dots behind 356 00:19:33,600 --> 00:19:36,159 Speaker 5: the scenes. I also want my data to stay private, 357 00:19:36,720 --> 00:19:41,880 Speaker 5: and so having both a consulting partner who understands how 358 00:19:41,920 --> 00:19:44,320 Speaker 5: to ask questions of the environment and of the applications 359 00:19:44,320 --> 00:19:47,200 Speaker 5: you're using, and who understands the industry inside and out, 360 00:19:47,640 --> 00:19:50,439 Speaker 5: and a technology partner that builds and stays ahead of 361 00:19:50,480 --> 00:19:53,080 Speaker 5: all of the different threats, come together and advise you, 362 00:19:53,160 --> 00:19:57,119 Speaker 5: I think is super important.
When you bring in a 363 00:19:57,160 --> 00:20:01,160 Speaker 5: partner like IBM, with a platform like Palo Alto that covers, 364 00:20:01,560 --> 00:20:05,320 Speaker 5: you know, all the different parts of your environment, you're 365 00:20:05,359 --> 00:20:08,879 Speaker 5: able to say, look, where are the vulnerabilities in 366 00:20:08,920 --> 00:20:11,960 Speaker 5: the system, where are the different endpoints that we need 367 00:20:12,000 --> 00:20:13,920 Speaker 5: to have covered, and then just make sure you get 368 00:20:13,920 --> 00:20:17,520 Speaker 5: that breadth of coverage, and then you're better able to say, yes, 369 00:20:17,560 --> 00:20:19,639 Speaker 5: you've increased the risk, but then you've mitigated it. 370 00:20:20,400 --> 00:20:24,720 Speaker 3: So before I retire my healthcare analogy, 371 00:20:25,119 --> 00:20:28,399 Speaker 3: because I was thinking about just trying to understand the 372 00:20:28,440 --> 00:20:33,239 Speaker 3: importance of this idea of having a single platform. So 373 00:20:33,560 --> 00:20:36,840 Speaker 3: if this model healthcare network is typical, I've acquired a 374 00:20:36,880 --> 00:20:39,320 Speaker 3: whole series of companies over the last ten years. I bought 375 00:20:39,359 --> 00:20:42,160 Speaker 3: a hospital over here, I've got some physician practices 376 00:20:42,160 --> 00:20:45,320 Speaker 3: that I snapped up over here, I bought a diagnostics company, 377 00:20:45,760 --> 00:20:49,159 Speaker 3: and so I have all of these legacy systems, and, 378 00:20:49,200 --> 00:20:51,359 Speaker 3: like you said, maybe I've got some stuff 379 00:20:51,359 --> 00:20:53,560 Speaker 3: in the cloud with one company, some stuff in the cloud with another.
380 00:20:53,760 --> 00:20:56,480 Speaker 3: And what you're saying is the first step is to 381 00:20:56,560 --> 00:21:00,520 Speaker 3: kind of rationalize that, put it on a single platform, so 382 00:21:00,560 --> 00:21:04,040 Speaker 3: you understand where your points of weakness are, as opposed 383 00:21:04,040 --> 00:21:05,640 Speaker 3: to being blind to your points of weakness. 384 00:21:07,400 --> 00:21:12,359 Speaker 5: Yes, although anyone who's done any kind of M 385 00:21:12,400 --> 00:21:15,000 Speaker 5: and A knows that that's a long journey, right. So 386 00:21:15,080 --> 00:21:18,679 Speaker 5: I think the first step is just understanding where everything is. 387 00:21:19,320 --> 00:21:20,560 Speaker 5: And then you get on a path and you say, 388 00:21:20,560 --> 00:21:23,399 Speaker 5: where's the biggest risk? Let's neutralize or mitigate that 389 00:21:23,480 --> 00:21:27,040 Speaker 5: risk one at a time. The thing about open and secure, 390 00:21:27,760 --> 00:21:30,320 Speaker 5: you know, at Palo Alto we keep touting the benefits of 391 00:21:30,359 --> 00:21:33,040 Speaker 5: the platform: everything on Palo Alto, your risk is going 392 00:21:33,080 --> 00:21:35,040 Speaker 5: to be mitigated and you're going to have the full visibility. 393 00:21:35,720 --> 00:21:39,119 Speaker 5: But you can't get there overnight. And so we've got, 394 00:21:39,320 --> 00:21:42,639 Speaker 5: you know, thousands of integrations with other technology companies, including 395 00:21:42,640 --> 00:21:45,280 Speaker 5: our partners, to make sure that we can capture and 396 00:21:45,480 --> 00:21:49,679 Speaker 5: have visibility into those endpoints and those systems as well. 397 00:21:49,880 --> 00:21:51,959 Speaker 5: And so I think step one is just figure out 398 00:21:51,960 --> 00:21:54,280 Speaker 5: where everything is.
Just get the scan. Palo Alto has a 399 00:21:54,280 --> 00:21:56,199 Speaker 5: couple of products where you can kind of deploy and 400 00:21:56,240 --> 00:21:59,400 Speaker 5: get a view of your attack surface. I love the analogy 401 00:21:59,520 --> 00:22:02,159 Speaker 5: that a digital environment is like a house, right? And 402 00:22:02,200 --> 00:22:04,159 Speaker 5: so you have your front door lock, of course, 403 00:22:04,359 --> 00:22:06,280 Speaker 5: because probably they're going to try the front door first. 404 00:22:07,040 --> 00:22:08,520 Speaker 5: But that's not all you're going to do, right? You're 405 00:22:08,520 --> 00:22:10,159 Speaker 5: going to make sure the, you know, the windows 406 00:22:10,160 --> 00:22:12,320 Speaker 5: are locked and there's an alarm system and all of that. 407 00:22:13,680 --> 00:22:15,719 Speaker 5: And I think that's how you have to think about it: 408 00:22:15,800 --> 00:22:17,639 Speaker 5: just, how do we cover the whole surface? 409 00:22:18,240 --> 00:22:21,640 Speaker 3: So lay people like me have been bombarded, 410 00:22:21,800 --> 00:22:25,159 Speaker 3: it seems like, over the last year with one thing or 411 00:22:25,200 --> 00:22:28,480 Speaker 3: another about how quickly AI is moving forward and how 412 00:22:28,560 --> 00:22:30,199 Speaker 3: big a deal it suddenly is going to 413 00:22:30,200 --> 00:22:34,360 Speaker 3: be in the economy. What is the impact of that 414 00:22:35,320 --> 00:22:41,240 Speaker 3: dramatic change in AI's capabilities on this cybersecurity question? So 415 00:22:41,359 --> 00:22:43,760 Speaker 3: what does it mean if you're defending somebody that you 416 00:22:43,840 --> 00:22:46,320 Speaker 3: now have these sophisticated AI tools at your disposal?
417 00:22:47,680 --> 00:22:50,960 Speaker 4: I think that AI becomes the force multiplier for cyber. 418 00:22:52,680 --> 00:22:57,400 Speaker 4: Think about cyber before: it was just locking your doors, 419 00:22:57,920 --> 00:23:01,120 Speaker 4: locking the windows. If you were really good, you had 420 00:23:01,160 --> 00:23:02,360 Speaker 4: an alarm system. 421 00:23:02,680 --> 00:23:02,879 Speaker 5: You know. 422 00:23:04,720 --> 00:23:09,000 Speaker 4: Now with AI, you can say, well, I can predict 423 00:23:09,080 --> 00:23:11,400 Speaker 4: what's going to happen, I can see around the corner. 424 00:23:11,680 --> 00:23:14,199 Speaker 4: I know I can leave my windows open upstairs and 425 00:23:14,240 --> 00:23:15,560 Speaker 4: it's fine and it's okay. 426 00:23:15,760 --> 00:23:18,360 Speaker 3: I mean, why? Because the AI is running a 427 00:23:18,400 --> 00:23:20,280 Speaker 3: million simulations? 428 00:23:20,640 --> 00:23:25,280 Speaker 4: It can, and that's exactly it. It becomes the intelligent part of 429 00:23:25,440 --> 00:23:28,760 Speaker 4: that AI. It's not artificial, it's augmented. So you now 430 00:23:28,760 --> 00:23:32,359 Speaker 4: have this new capability to see around corners, and so 431 00:23:32,600 --> 00:23:36,360 Speaker 4: you're able to do the jobs of yesterday more effectively. 432 00:23:37,359 --> 00:23:41,560 Speaker 4: And the queries that you were doing, and that's all 433 00:23:41,600 --> 00:23:44,520 Speaker 4: you're really doing, now you're doing them, you know, faster, 434 00:23:44,760 --> 00:23:48,480 Speaker 4: you're able to access even more data, and you're able 435 00:23:48,520 --> 00:23:53,080 Speaker 4: to then make it more secure. So that's why AI 436 00:23:53,200 --> 00:23:54,520 Speaker 4: becomes a force multiplier. 437 00:23:54,800 --> 00:23:58,840 Speaker 3: Yeah, and just talk about the faster part. What does 438 00:23:59,040 --> 00:24:02,560 Speaker 3: faster mean in practical terms?
If you're trying to defend 439 00:24:02,600 --> 00:24:05,320 Speaker 3: an enterprise against a cyber attack, why does speed matter 440 00:24:05,359 --> 00:24:06,080 Speaker 3: in that environment? 441 00:24:07,080 --> 00:24:10,600 Speaker 4: You're always trying to find a way through. I go 442 00:24:10,640 --> 00:24:12,760 Speaker 4: back to, we brought up the army: you always ask, 443 00:24:12,880 --> 00:24:14,600 Speaker 4: how do you break the line? How do you find 444 00:24:14,600 --> 00:24:17,280 Speaker 4: a penetration point? And when you think about, you know, 445 00:24:17,720 --> 00:24:21,679 Speaker 4: pen testing, penetration testing, where are those points? So if you're 446 00:24:21,720 --> 00:24:24,520 Speaker 4: able to do that faster than the bad guys, and 447 00:24:24,560 --> 00:24:28,159 Speaker 4: not only faster, but you're picking more probable points. This 448 00:24:28,320 --> 00:24:31,200 Speaker 4: is back to the intelligence. I could waste time doing 449 00:24:31,400 --> 00:24:34,439 Speaker 4: penetration testing someplace it doesn't matter. That's why I mentioned leaving the windows open: if 450 00:24:34,480 --> 00:24:37,280 Speaker 4: they can't get in the second-story windows, why are 451 00:24:37,320 --> 00:24:40,600 Speaker 4: you spending time trying it? So that becomes more effective. 452 00:24:41,400 --> 00:24:44,200 Speaker 4: So when I think of speed, that's what I 453 00:24:44,240 --> 00:24:46,720 Speaker 4: think of, because it's not just speed, I think it's 454 00:24:46,720 --> 00:24:47,800 Speaker 4: also what's effective. 455 00:24:48,200 --> 00:24:50,119 Speaker 5: Just to put a fine point on it: so I 456 00:24:50,160 --> 00:24:52,240 Speaker 5: found a way in. Okay, now what? I don't know 457 00:24:52,280 --> 00:24:54,119 Speaker 5: where the jewelry is, so I have to look around 458 00:24:54,200 --> 00:24:56,800 Speaker 5: and see if there's any hidden gems and try to 459 00:24:56,840 --> 00:24:58,960 Speaker 5: find my way.
That used to take a week, two 460 00:24:58,960 --> 00:25:02,680 Speaker 5: weeks, seven or fourteen days. Now it's hours, right? 461 00:25:02,760 --> 00:25:05,160 Speaker 5: So they're in, and they can actually exfiltrate data within 462 00:25:05,480 --> 00:25:08,639 Speaker 5: less than a day. The metric we use in the 463 00:25:08,720 --> 00:25:11,560 Speaker 5: security operations center is mean time to detect, so to see 464 00:25:11,600 --> 00:25:14,600 Speaker 5: anyone's there, and mean time to respond and remediate, to get them 465 00:25:14,600 --> 00:25:18,840 Speaker 5: out, right? That used to be also, you know, seven, eight, nine, 466 00:25:18,880 --> 00:25:22,960 Speaker 5: ten days. Now it needs to be less than an hour, 467 00:25:23,720 --> 00:25:27,640 Speaker 5: and with our AI-based security operations platform, it is. 468 00:25:28,119 --> 00:25:31,000 Speaker 5: Now you've got one tool, whether it's all Palo Alto 469 00:25:31,040 --> 00:25:33,360 Speaker 5: Networks or whether it's just, you know, hoovering data from 470 00:25:33,359 --> 00:25:35,520 Speaker 5: other places, then you're able to see it all together. 471 00:25:35,560 --> 00:25:37,600 Speaker 5: So you actually get fewer alerts. So you get from 472 00:25:38,000 --> 00:25:41,320 Speaker 5: thousands of alerts down to one hundred alerts, right? And 473 00:25:41,359 --> 00:25:43,840 Speaker 5: you can investigate them, and you investigate them using AI too. 474 00:25:44,160 --> 00:25:47,360 Speaker 5: And AI is today's threat, but 475 00:25:47,440 --> 00:25:49,560 Speaker 5: you know, you think about threat and opportunity, you think 476 00:25:49,560 --> 00:25:51,480 Speaker 5: about what's next. You always have to be kind of 477 00:25:51,520 --> 00:25:52,840 Speaker 5: evolving, and you have to think. 478 00:25:53,240 --> 00:25:55,760 Speaker 4: We talk about threat and risk.
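The two security operations center metrics described here, mean time to detect and mean time to respond and remediate, are simple averages over incident timelines. A minimal sketch in Python, using hypothetical incident records (the field names and timestamps are illustrative, not any vendor's actual telemetry schema):

```python
from datetime import datetime, timedelta

# Hypothetical incident records: when the intrusion began, when it was
# detected, and when it was remediated. Purely illustrative data.
incidents = [
    {"start": datetime(2024, 5, 1, 9, 0),
     "detected": datetime(2024, 5, 1, 9, 40),
     "remediated": datetime(2024, 5, 1, 10, 30)},
    {"start": datetime(2024, 5, 3, 14, 0),
     "detected": datetime(2024, 5, 3, 14, 20),
     "remediated": datetime(2024, 5, 3, 15, 0)},
]

def mean_minutes(deltas):
    """Average a list of timedeltas, expressed in minutes."""
    total = sum(deltas, timedelta())
    return total.total_seconds() / 60 / len(deltas)

# Mean time to detect: intrusion start -> detection.
mttd = mean_minutes([i["detected"] - i["start"] for i in incidents])
# Mean time to respond: detection -> remediation.
mttr = mean_minutes([i["remediated"] - i["detected"] for i in incidents])

print(f"MTTD: {mttd:.0f} min, MTTR: {mttr:.0f} min")
```

The point of the conversation is the target for these numbers: days a few years ago, under an hour now.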
You know, we didn't 479 00:25:55,760 --> 00:25:59,520 Speaker 4: talk about, you know, what is the cost of some 480 00:25:59,640 --> 00:26:03,359 Speaker 4: type of cyber penetration. The typical cost is about four and 481 00:26:03,359 --> 00:26:09,159 Speaker 4: a half million dollars, and that's just in labor and remediation. 482 00:26:09,840 --> 00:26:14,040 Speaker 4: If you think about reputational risk as well, our Institute 483 00:26:14,040 --> 00:26:16,199 Speaker 4: for Business Value did a study and found that in 484 00:26:16,320 --> 00:26:19,840 Speaker 4: twenty twenty three there were thirty-nine banks that we 485 00:26:19,960 --> 00:26:25,439 Speaker 4: watched that suffered a reputational hit to market value of one 486 00:26:25,520 --> 00:26:29,040 Speaker 4: hundred and thirty billion dollars. And so you start to think, wow, 487 00:26:29,119 --> 00:26:34,120 Speaker 4: that's just reputational risk. So that's what's at stake here, 488 00:26:34,640 --> 00:26:36,919 Speaker 4: and that is only going to get bigger. 489 00:26:37,680 --> 00:26:40,240 Speaker 5: So one of the pieces we haven't talked about with 490 00:26:40,240 --> 00:26:42,680 Speaker 5: AI that I find super interesting, because we've been talking 491 00:26:42,760 --> 00:26:46,520 Speaker 5: essentially about, like, the Terminator, the robots fighting robots, right? 492 00:26:46,520 --> 00:26:49,320 Speaker 5: Like, whose robots are quicker? Like, I'm designing attacks and 493 00:26:49,359 --> 00:26:51,440 Speaker 5: I'm defending against attacks, and I think that's 494 00:26:51,480 --> 00:26:55,639 Speaker 5: super important. But we recently launched, and are working with 495 00:26:55,680 --> 00:26:58,520 Speaker 5: IBM on, our AI security product to actually secure the 496 00:26:58,600 --> 00:27:00,440 Speaker 5: use of AI, because it also opens up another 497 00:27:00,520 --> 00:27:04,600 Speaker 5: set of threat vectors. I'll give you an example.
I'm 498 00:27:04,600 --> 00:27:07,120 Speaker 5: a marketing executive now for your hospital. So I work 499 00:27:07,200 --> 00:27:10,000 Speaker 5: for you, and you want to announce the launch of 500 00:27:10,040 --> 00:27:14,040 Speaker 5: a new center, and so I upload all the information 501 00:27:14,119 --> 00:27:16,320 Speaker 5: about all the patients and, you know, how we 502 00:27:16,359 --> 00:27:18,879 Speaker 5: do things into ChatGPT to write the PR for me. Well, 503 00:27:18,880 --> 00:27:21,000 Speaker 5: I've also just uploaded to ChatGPT a whole bunch 504 00:27:21,040 --> 00:27:25,600 Speaker 5: of secrets, right? So it's how employees are using AI, 505 00:27:25,600 --> 00:27:27,800 Speaker 5: because, I think, you know, some companies are sort of 506 00:27:27,800 --> 00:27:30,199 Speaker 5: building their own language models and their own AI applications 507 00:27:30,200 --> 00:27:32,919 Speaker 5: that they want to keep secure. Others are just curious 508 00:27:32,920 --> 00:27:35,679 Speaker 5: about how their employees are using AI applications off the shelf. 509 00:27:36,720 --> 00:27:38,760 Speaker 5: And so we announced in May a product where you 510 00:27:38,800 --> 00:27:42,520 Speaker 5: can actually scan and see how AI is being used 511 00:27:42,520 --> 00:27:45,520 Speaker 5: in your enterprise. The 512 00:27:45,640 --> 00:27:47,560 Speaker 5: GA was last month, but we made the announcement 513 00:27:47,560 --> 00:27:50,440 Speaker 5: in May, and we had immediately thousands of CIOs signing 514 00:27:50,520 --> 00:27:53,760 Speaker 5: up, because just understanding, you know, who's using what, it's 515 00:27:53,800 --> 00:27:56,600 Speaker 5: another open question. Because, you know, we talk about AI 516 00:27:57,040 --> 00:27:59,280 Speaker 5: enhancing productivity and all the benefits it's going to bring, 517 00:27:59,320 --> 00:28:02,000 Speaker 5: but it brings risks, not just in
how 518 00:28:02,000 --> 00:28:04,480 Speaker 5: it's being used by the threat actors, but also, you 519 00:28:04,520 --> 00:28:06,400 Speaker 5: know, what other vulnerabilities it opens up. 520 00:28:06,320 --> 00:28:09,920 Speaker 3: Does that system 521 00:28:10,080 --> 00:28:11,840 Speaker 3: tell you what's a problematic use? 522 00:28:12,640 --> 00:28:15,080 Speaker 5: It does. So what it does, and you've got 523 00:28:15,080 --> 00:28:16,800 Speaker 5: to train it, right? But what it does is say, 524 00:28:16,840 --> 00:28:19,879 Speaker 5: this is outside of your policy. So CIOs 525 00:28:19,880 --> 00:28:22,399 Speaker 5: will set policies on here's what is acceptable and not 526 00:28:22,440 --> 00:28:24,320 Speaker 5: acceptable use. So we'll be able to scan and say 527 00:28:24,320 --> 00:28:27,000 Speaker 5: these following uses are outside of policy, and then 528 00:28:27,040 --> 00:28:29,080 Speaker 5: it'll prompt and say, I think this is too restrictive, 529 00:28:29,080 --> 00:28:30,679 Speaker 5: I think this is too permissive, and then you can 530 00:28:30,680 --> 00:28:34,360 Speaker 5: sort of update your policies from there. That's just sort 531 00:28:34,359 --> 00:28:36,840 Speaker 5: of the visibility piece, and then there's the runtime piece, 532 00:28:36,840 --> 00:28:38,800 Speaker 5: which will actually stop you from using it. So you 533 00:28:38,840 --> 00:28:41,480 Speaker 5: go and say, okay, here's all my patients' social security numbers, 534 00:28:41,520 --> 00:28:44,080 Speaker 5: I'm going to upload them to ChatGPT to, you know, 535 00:28:44,680 --> 00:28:47,000 Speaker 5: get an understanding of, like, where they all live. I 536 00:28:47,000 --> 00:28:49,200 Speaker 5: don't know why you would possibly do that, but 537 00:28:49,240 --> 00:28:51,640 Speaker 5: let's say you were, and then, you know, it'll note 538 00:28:51,680 --> 00:28:53,480 Speaker 5: that looks like a social security number.
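The runtime piece described here, noticing that a prompt contains something that looks like a social security number and stopping it before it leaves the enterprise, can be illustrated with a toy check. This is a simplified sketch, not the actual product logic; the regex and the `check_prompt` function are hypothetical, and real data-loss-prevention engines use validated detectors rather than a single pattern:

```python
import re

# Illustrative pattern for a US Social Security number in NNN-NN-NNNN form.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def check_prompt(prompt: str):
    """Return (allowed, reason) for a prompt about to be sent to an
    external AI service; block anything that looks like an SSN."""
    if SSN_PATTERN.search(prompt):
        return False, "blocked: prompt appears to contain a Social Security number"
    return True, "allowed"

# A prompt carrying an SSN-shaped token is blocked; a clean one passes.
print(check_prompt("Summarize patient 123-45-6789's visit history"))
print(check_prompt("Draft a press release about our new cardiac center"))
```

In a real deployment this kind of check sits in line with the request, which is why the next exchange describes it as stopping you before the upload happens.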
You can't upload 539 00:28:53,560 --> 00:28:54,840 Speaker 5: that into your prompt, so it 540 00:28:54,800 --> 00:29:00,520 Speaker 3: will stop you before you... Yeah, a thoughtful voice over your shoulder, 541 00:29:01,120 --> 00:29:04,040 Speaker 3: just to remind you not to do something silly, exactly. 542 00:29:04,400 --> 00:29:06,920 Speaker 3: But let's just talk a little bit more about 543 00:29:07,120 --> 00:29:11,000 Speaker 3: adding AI into this mix. You say it's a force multiplier. 544 00:29:11,080 --> 00:29:15,040 Speaker 3: It's really interesting; dig into that. What are other instances of 545 00:29:15,040 --> 00:29:19,400 Speaker 3: what that means? How does the balance between AI and 546 00:29:20,640 --> 00:29:25,640 Speaker 3: human expertise work in the kind of next generation of cybersecurity? 547 00:29:26,480 --> 00:29:30,720 Speaker 4: I think the common way to look at it, 548 00:29:30,920 --> 00:29:34,400 Speaker 4: back to the force multiplier, is it's not going to be, 549 00:29:34,640 --> 00:29:36,880 Speaker 4: is your AI better, but can you use it better? 550 00:29:37,280 --> 00:29:41,000 Speaker 4: Can you ask your AI the right questions? Are you 551 00:29:41,040 --> 00:29:44,520 Speaker 4: well trained? So the competition really becomes your use of AI, 552 00:29:45,320 --> 00:29:48,680 Speaker 4: and are you pointing it in the right direction? You 553 00:29:48,720 --> 00:29:51,800 Speaker 4: have fifty people: can they do the work of two 554 00:29:51,880 --> 00:29:54,800 Speaker 4: hundred and fifty, and can they do it in a 555 00:29:54,840 --> 00:29:57,960 Speaker 4: safe and secure manner, so you're not opening up more 556 00:29:58,040 --> 00:30:01,280 Speaker 4: risk, or too much risk beyond your risk 557 00:30:01,360 --> 00:30:04,040 Speaker 4: tolerance, in order to get the outcome? So that's why 558 00:30:04,240 --> 00:30:07,400 Speaker 4: I think there's the opportunity.
And so you see this 559 00:30:07,920 --> 00:30:10,680 Speaker 4: truly as a force multiplier, because the first thing people go is, oh, 560 00:30:10,680 --> 00:30:13,480 Speaker 4: you're going to get rid of people. No, the people 561 00:30:13,560 --> 00:30:17,560 Speaker 4: portion is still going to be just as important, 562 00:30:17,560 --> 00:30:19,400 Speaker 4: because they're doing that other piece of work. 563 00:30:19,480 --> 00:30:21,640 Speaker 5: One of my favorite statistics is that there are now 564 00:30:21,720 --> 00:30:23,840 Speaker 5: more bank tellers in the US than there were in 565 00:30:23,920 --> 00:30:27,080 Speaker 5: nineteen sixty, before the ATM was invented. Right? But 566 00:30:27,120 --> 00:30:28,800 Speaker 5: it used to be you would go to your bank 567 00:30:29,280 --> 00:30:31,640 Speaker 5: because you had to. I remember doing this. You go, 568 00:30:31,760 --> 00:30:33,560 Speaker 5: you fill out your deposit slip, you hand it to 569 00:30:33,600 --> 00:30:36,120 Speaker 5: the teller, and they give you your cash. And then 570 00:30:36,160 --> 00:30:38,000 Speaker 5: ATMs are invented, and it's like, oh no, what's going 571 00:30:38,040 --> 00:30:40,040 Speaker 5: to happen to all these jobs? And now there's more, right? 572 00:30:39,880 --> 00:30:43,600 Speaker 5: But you're not withdrawing money from a bank teller. You're 573 00:30:43,640 --> 00:30:47,080 Speaker 5: now doing more sophisticated transactions. And so I think it's 574 00:30:47,080 --> 00:30:49,640 Speaker 5: similar with AI, right? Like, you want people doing things 575 00:30:49,680 --> 00:30:50,600 Speaker 5: that only people can do. 576 00:30:51,080 --> 00:30:53,880 Speaker 3: The human element remains absolutely central in all of this.
577 00:30:55,440 --> 00:30:59,960 Speaker 3: How do you make sure that your cybersecurity folks 578 00:31:00,040 --> 00:31:03,080 Speaker 3: are equipped to handle high-value tasks, are ready for 579 00:31:03,120 --> 00:31:04,520 Speaker 3: this increasing responsibility? 580 00:31:05,720 --> 00:31:07,680 Speaker 5: There's a couple of ways to answer this, but I 581 00:31:07,680 --> 00:31:12,120 Speaker 5: think the more you're able to automate the routine and 582 00:31:12,160 --> 00:31:17,040 Speaker 5: the mundane tasks, the better. For example, the bulk of cybersecurity happens 583 00:31:17,040 --> 00:31:19,960 Speaker 5: in the security operations center. There's analysts who are sitting 584 00:31:20,000 --> 00:31:23,080 Speaker 5: in that center. If they're spending all day either configuring 585 00:31:24,160 --> 00:31:26,760 Speaker 5: alerts or responding to alerts, they're not able to do 586 00:31:26,800 --> 00:31:29,560 Speaker 5: the advanced sort of threat hunting and analysis work. And 587 00:31:29,600 --> 00:31:31,280 Speaker 5: so I think a big chunk of it is just 588 00:31:31,320 --> 00:31:34,080 Speaker 5: freeing up their time to be able to do the 589 00:31:34,080 --> 00:31:37,040 Speaker 5: more advanced strategic work. And a lot of the automation 590 00:31:37,120 --> 00:31:41,080 Speaker 5: tools based on AI, like our Cortex XSIAM product, 591 00:31:41,760 --> 00:31:44,000 Speaker 5: are designed to free up their time in order to 592 00:31:44,000 --> 00:31:44,640 Speaker 5: be able to do that. 593 00:31:45,280 --> 00:31:48,640 Speaker 4: And from our perspective, it's making sure that it's 594 00:31:48,640 --> 00:31:52,600 Speaker 4: a requirement that you have the qualifications, 595 00:31:52,600 --> 00:31:54,920 Speaker 4: because people can easily get used to doing what they've 596 00:31:54,960 --> 00:31:58,160 Speaker 4: always done: I know this, and that's what I do.
597 00:31:58,480 --> 00:32:02,600 Speaker 4: You say, well, now all the threat actors are learning 598 00:32:02,680 --> 00:32:05,640 Speaker 4: on the fly. They're trying to always outsmart you. So it's 599 00:32:05,680 --> 00:32:08,560 Speaker 4: in your best interest, our best interest, our clients' best 600 00:32:08,760 --> 00:32:12,360 Speaker 4: interest, and our partners' best interest that you are on the leading 601 00:32:12,480 --> 00:32:14,280 Speaker 4: edge of that learning capability. 602 00:32:14,680 --> 00:32:16,640 Speaker 3: If you're talking to a client who wants to develop 603 00:32:16,720 --> 00:32:22,120 Speaker 3: a kind of unified cybersecurity strategy, what's the best single 604 00:32:22,160 --> 00:32:25,880 Speaker 3: piece of advice you can give them? 605 00:32:25,920 --> 00:32:30,040 Speaker 5: You should have a single platform. It's hard not to 606 00:32:30,040 --> 00:32:32,640 Speaker 5: answer that, but it is true. I mean, all joking 607 00:32:32,640 --> 00:32:36,360 Speaker 5: aside, having, you know, the best-of-breed solutions that 608 00:32:36,360 --> 00:32:38,320 Speaker 5: are all talking to each other and able to stitch 609 00:32:38,320 --> 00:32:41,600 Speaker 5: together and identify threats before a human might be able to, 610 00:32:42,520 --> 00:32:44,960 Speaker 5: that's number one. And number two is making sure you 611 00:32:44,960 --> 00:32:48,560 Speaker 5: have visibility on all elements, so you're able to cover 612 00:32:48,600 --> 00:32:51,239 Speaker 5: your whole environment and understand how people are accessing it. 613 00:32:51,480 --> 00:32:55,520 Speaker 4: I'd say, think like a bad actor. Yeah, always think 614 00:32:55,560 --> 00:32:58,880 Speaker 4: outside in, because you get comfortable the other way around.
615 00:33:00,560 --> 00:33:04,800 Speaker 3: You guys worked together with a client, and 616 00:33:05,080 --> 00:33:06,680 Speaker 3: I'd love for you to talk a little bit about that, to 617 00:33:06,880 --> 00:33:08,960 Speaker 3: use it as a kind of case study for what 618 00:33:09,040 --> 00:33:13,520 Speaker 3: this collaboration between your two companies looks like 619 00:33:13,880 --> 00:33:15,480 Speaker 3: when you work with the cloud. 620 00:33:15,640 --> 00:33:18,600 Speaker 5: It really was, you know, IBM leading on a digital 621 00:33:18,600 --> 00:33:21,960 Speaker 5: transformation for this client that wanted to move their applications 622 00:33:21,960 --> 00:33:23,920 Speaker 5: into the cloud. And so you're asking a lot of 623 00:33:24,000 --> 00:33:26,680 Speaker 5: questions about how AI increases the risk and the 624 00:33:26,720 --> 00:33:29,080 Speaker 5: surface area. Those same questions ten years ago were asked 625 00:33:29,120 --> 00:33:31,480 Speaker 5: about the cloud, and we're still on the journey where 626 00:33:32,120 --> 00:33:34,800 Speaker 5: companies are migrating to the cloud. We're not anywhere near 627 00:33:34,880 --> 00:33:36,960 Speaker 5: finished with that yet. And so there's two pieces to a 628 00:33:36,960 --> 00:33:39,160 Speaker 5: cloud migration. One is just refactoring for the cloud, to 629 00:33:39,160 --> 00:33:41,760 Speaker 5: make sure the application works effectively in the cloud. And 630 00:33:41,800 --> 00:33:44,120 Speaker 5: the second is security. And then you build in security 631 00:33:44,200 --> 00:33:47,320 Speaker 5: by design, using Palo Alto's Prisma Cloud products to 632 00:33:47,320 --> 00:33:49,960 Speaker 5: make sure that not only do you have the visibility, 633 00:33:50,080 --> 00:33:52,040 Speaker 5: so with our cloud product you can scan and see where 634 00:33:52,040 --> 00:33:55,240 Speaker 5: the vulnerabilities are.
And then there's also, you know, cloud 635 00:33:55,760 --> 00:33:59,680 Speaker 5: firewalls, essentially, that will keep bad actors out and keep 636 00:33:59,720 --> 00:34:01,200 Speaker 5: the cloud instance secure. 637 00:34:02,280 --> 00:34:05,160 Speaker 3: If we sit down and have this conversation five years 638 00:34:05,160 --> 00:34:07,680 Speaker 3: from now, which I actually hope we do, it'd be fun. 639 00:34:08,920 --> 00:34:13,440 Speaker 3: Let's pretend it's twenty twenty nine. Tell me, what are 640 00:34:13,480 --> 00:34:15,080 Speaker 3: you happy about in twenty twenty nine? 641 00:34:15,320 --> 00:34:20,960 Speaker 4: I think in twenty twenty nine quantum computing is mainstream. I 642 00:34:20,960 --> 00:34:26,680 Speaker 4: think quantum computing is now quantum safe, where we're using 643 00:34:27,480 --> 00:34:32,320 Speaker 4: quantum computing to make sure that those bad actors aren't 644 00:34:32,320 --> 00:34:34,640 Speaker 4: as bad as they used to be back in twenty 645 00:34:34,680 --> 00:34:38,840 Speaker 4: twenty four, and that we're seeing around the corners, and 646 00:34:38,880 --> 00:34:44,000 Speaker 4: that we're empowering our Palo Alto relationship, that in twenty 647 00:34:44,080 --> 00:34:48,879 Speaker 4: twenty nine it's the premier type of capability that people 648 00:34:48,920 --> 00:34:51,960 Speaker 4: are looking at when they think of what used to 649 00:34:51,960 --> 00:34:54,879 Speaker 4: be AI and now is quantum capability. 650 00:34:55,120 --> 00:34:55,400 Speaker 1: Yeah. 651 00:34:56,160 --> 00:35:00,239 Speaker 5: Yeah, I think for AI, everyone's just using it as part 652 00:35:00,280 --> 00:35:03,360 Speaker 5: of their job, the way email was an innovation in 653 00:35:03,400 --> 00:35:05,960 Speaker 5: the nineties, the way, you know, cloud was an innovation 654 00:35:06,040 --> 00:35:08,520 Speaker 5: in the twenty tens, and we thought, how are we 655 00:35:08,560 --> 00:35:10,120 Speaker 5: going to use this?
What impact is it going to 656 00:35:10,200 --> 00:35:12,759 Speaker 5: have on productivity? All these people who are spending their 657 00:35:12,840 --> 00:35:15,000 Speaker 5: days typing up memos, like, what are they going to do? 658 00:35:15,239 --> 00:35:17,640 Speaker 5: We're going to be past that fear, and we're all 659 00:35:17,680 --> 00:35:21,160 Speaker 5: going to understand that it is this truly positive 660 00:35:21,200 --> 00:35:24,560 Speaker 5: force multiplier, where, you know, every employee is able to 661 00:35:24,600 --> 00:35:28,680 Speaker 5: do their best work and spend their time on the 662 00:35:28,719 --> 00:35:30,520 Speaker 5: things that only they can do, and then the AI 663 00:35:30,719 --> 00:35:32,080 Speaker 5: is doing the rest of that for them. 664 00:35:32,160 --> 00:35:36,800 Speaker 4: Right, AI is going to enable many things to work together. 665 00:35:37,000 --> 00:35:40,399 Speaker 4: It won't be just one language model. We won't even 666 00:35:40,440 --> 00:35:43,640 Speaker 4: think about it. It will be the difference between, you know, 667 00:35:44,560 --> 00:35:48,120 Speaker 4: Malcolm having a fax machine, a stereo, a telephone, 668 00:35:48,960 --> 00:35:52,040 Speaker 4: and a memo board; now it's in your pocket and 669 00:35:52,080 --> 00:35:54,279 Speaker 4: it's all one thing, and you don't even call it that, 670 00:35:54,360 --> 00:35:55,920 Speaker 4: you know. I showed a Walkman to my kids the 671 00:35:56,000 --> 00:35:58,920 Speaker 4: other day and they're like, what's a Walkman? So 672 00:35:59,280 --> 00:36:01,839 Speaker 4: I do think it'll be part of the past, 673 00:36:01,880 --> 00:36:05,120 Speaker 4: and it will be the start of the seamless connection.
674 00:36:05,280 --> 00:36:12,279 Speaker 4: That is, a secure, seamless connection of HR, of finance, of distribution, logistics, 675 00:36:12,320 --> 00:36:15,960 Speaker 4: of billing. All of those will have a capability to 676 00:36:16,000 --> 00:36:16,960 Speaker 4: work together. 677 00:36:17,360 --> 00:36:20,720 Speaker 3: Yeah. I have to do some quick fire questions. 678 00:36:21,280 --> 00:36:24,960 Speaker 3: You guys ready? All right. What's the number one thing 679 00:36:25,000 --> 00:36:26,640 Speaker 3: that people misunderstand about AI? 680 00:36:28,040 --> 00:36:29,400 Speaker 4: The reliance on data. 681 00:36:30,560 --> 00:36:31,279 Speaker 1: What do you mean by that? 682 00:36:32,080 --> 00:36:34,640 Speaker 4: I think that it's just assumed that it's happening and 683 00:36:34,800 --> 00:36:37,320 Speaker 4: it can just go out and grab data anywhere. 684 00:36:38,320 --> 00:36:39,960 Speaker 3: Yeah. 685 00:36:39,920 --> 00:36:43,120 Speaker 4: You have to have good data, reliable data, and access to the data. 686 00:36:43,840 --> 00:36:45,600 Speaker 5: I mean, people are too afraid of it. 687 00:36:46,040 --> 00:36:49,080 Speaker 3: Chatbots and image generators are the biggest things in consumer 688 00:36:49,120 --> 00:36:51,040 Speaker 3: AI right now. What do you think is the next 689 00:36:51,080 --> 00:36:52,400 Speaker 3: big business application? 690 00:36:52,560 --> 00:36:56,200 Speaker 4: Jason, I think it's the tying together of multiple capabilities. 691 00:36:56,200 --> 00:36:59,080 Speaker 4: I was hinting towards this earlier: I think tying 692 00:36:59,120 --> 00:37:02,200 Speaker 4: together the disparate systems that sit in different parts of 693 00:37:02,200 --> 00:37:04,920 Speaker 4: the organization, front office, back office, making it one office 694 00:37:05,080 --> 00:37:07,160 Speaker 4: and tying together those different functions, that's it.
695 00:37:08,280 --> 00:37:10,680 Speaker 5: I mean, it's workflow automation. I think, back to your 696 00:37:10,680 --> 00:37:13,920 Speaker 5: point on the reliance on data, it seems easy. It's a 697 00:37:13,920 --> 00:37:15,360 Speaker 5: lot harder than you think, because you have to have 698 00:37:15,400 --> 00:37:17,400 Speaker 5: everything set up in exactly the right way to get 699 00:37:17,600 --> 00:37:20,279 Speaker 5: all of your systems automated and, sort of, the 700 00:37:20,280 --> 00:37:22,480 Speaker 5: more boring jobs taken care of so that humans can 701 00:37:22,480 --> 00:37:23,520 Speaker 5: do the strategic ones. 702 00:37:25,200 --> 00:37:27,680 Speaker 3: How are you already using AI in your day to 703 00:37:27,719 --> 00:37:28,240 Speaker 3: day life? 704 00:37:30,440 --> 00:37:33,360 Speaker 5: I mean, I use it at work all the time, 705 00:37:33,960 --> 00:37:36,000 Speaker 5: and then I've found right now I go to 706 00:37:36,040 --> 00:37:39,120 Speaker 5: ChatGPT instead of Google to look things up. I like 707 00:37:39,160 --> 00:37:40,080 Speaker 5: having a conversation. 708 00:37:41,440 --> 00:37:45,520 Speaker 4: We have a wonderful capability in our consulting business called 709 00:37:46,160 --> 00:37:50,319 Speaker 4: our consulting assistant, and Consulting Advantage is the proper name 710 00:37:50,360 --> 00:37:52,400 Speaker 4: for it, but I look at it as that assistant, 711 00:37:52,480 --> 00:37:55,000 Speaker 4: and it's a force multiplier for me. So if I 712 00:37:55,120 --> 00:38:00,120 Speaker 4: need to pull together content proposals with the teams, we 713 00:38:00,160 --> 00:38:00,759 Speaker 4: go straight to that. 714 00:38:01,560 --> 00:38:04,040 Speaker 3: All right, one more. We hear so many definitions of 715 00:38:04,120 --> 00:38:07,400 Speaker 3: open related to technology. How do you define it, and 716 00:38:07,480 --> 00:38:09,600 Speaker 3: how does the concept help you innovate?
717 00:38:10,719 --> 00:38:14,200 Speaker 5: By definition, in cybersecurity, you don't want to be too open, right? 718 00:38:14,680 --> 00:38:18,640 Speaker 5: So I think we enable openness with this concept of 719 00:38:18,719 --> 00:38:21,000 Speaker 5: zero trust, saying everyone's invited in as long 720 00:38:21,000 --> 00:38:23,759 Speaker 5: as you have the right credentials, right. So that's 721 00:38:23,760 --> 00:38:25,640 Speaker 5: one way, and then the other way is just making 722 00:38:25,640 --> 00:38:29,160 Speaker 5: sure you're connected to all the different systems in order 723 00:38:29,200 --> 00:38:31,480 Speaker 5: to be able to have that visibility and see what's happening. 724 00:38:31,520 --> 00:38:35,319 Speaker 5: Because if you are blind, that's the minute you have 725 00:38:35,360 --> 00:38:36,400 Speaker 5: that vulnerability. 726 00:38:37,040 --> 00:38:43,440 Speaker 4: And I'd say it's moving quickly with security. It sounds contradictory: 727 00:38:43,840 --> 00:38:46,239 Speaker 4: open? Well, then it means you're not safe. No, you 728 00:38:46,280 --> 00:38:47,560 Speaker 4: are safe and you can move faster. 729 00:38:47,880 --> 00:38:49,560 Speaker 3: Yeah, thank you so much. That was fun. 730 00:38:49,719 --> 00:38:51,319 Speaker 5: Thanks a lot. Thank you. Great, we'll see you in 731 00:38:51,320 --> 00:38:51,799 Speaker 5: five years. 732 00:38:52,760 --> 00:38:58,839 Speaker 3: Five years, man, I'll be old in five years. Thank 733 00:38:58,880 --> 00:39:02,120 Speaker 3: you to Jason Kelly at IBM and Christy Fredericks 734 00:39:02,160 --> 00:39:06,640 Speaker 3: at Palo Alto Networks for that fascinating conversation about the 735 00:39:06,680 --> 00:39:12,880 Speaker 3: threats and opportunities in cybersecurity today. As Jason and Christy stressed, 736 00:39:13,239 --> 00:39:17,719 Speaker 3: AI can be a force multiplier for enterprises across industries.
737 00:39:18,760 --> 00:39:21,200 Speaker 3: When you're working with multiple products and have your data 738 00:39:21,239 --> 00:39:25,560 Speaker 3: in distributed environments, you need technology that will work across 739 00:39:25,600 --> 00:39:30,160 Speaker 3: your organization, and with Palo Alto Networks' platform, you can 740 00:39:30,280 --> 00:39:36,239 Speaker 3: enhance cyber resiliency and simplify your operations. Through their collaboration, 741 00:39:36,600 --> 00:39:40,160 Speaker 3: IBM and Palo Alto Networks are charting the future of 742 00:39:40,280 --> 00:39:47,520 Speaker 3: fully integrated, open, end-to-end security solutions. Smart Talks 743 00:39:47,520 --> 00:39:50,800 Speaker 3: with IBM is produced by Matt Romano, Joey Fishgrund, 744 00:39:50,960 --> 00:39:55,680 Speaker 3: Amy Gaines McQuaid, and Jacob Goldstein. We're edited by Lidia 745 00:39:55,760 --> 00:39:59,560 Speaker 3: Jean Kott. Our engineers are Sarah Bruguiere and Ben Tolliday. 746 00:40:00,040 --> 00:40:02,799 Speaker 3: Theme song by Gramoscope. Special thanks to the eight Bar 747 00:40:03,160 --> 00:40:06,799 Speaker 3: and IBM teams, as well as the Pushkin marketing team. 748 00:40:07,280 --> 00:40:10,000 Speaker 3: Smart Talks with IBM is a production of Pushkin Industries 749 00:40:10,280 --> 00:40:14,560 Speaker 3: and Ruby Studio at iHeartMedia. To find more Pushkin podcasts, 750 00:40:14,920 --> 00:40:18,680 Speaker 3: listen on the iHeartRadio app, Apple Podcasts, or wherever you 751 00:40:18,840 --> 00:40:24,080 Speaker 3: listen to podcasts. I'm Malcolm Gladwell. This is a paid 752 00:40:24,160 --> 00:40:29,000 Speaker 3: advertisement from IBM. The conversations on this podcast don't necessarily 753 00:40:29,040 --> 00:40:43,840 Speaker 3: represent IBM's positions, strategies, or opinions.