Speaker 1: You're listening to a Sharesies podcast. Could you tell us a little bit more about the trends we're seeing in cybersecurity?

Speaker 2: There's a statistic in our predictions that eighty percent of breaches are now happening in the cloud, and I think that is going to continue to ramp up. If you think about all the cloud service providers, the usual suspects of Google, AWS, Azure, we have what's called a shared responsibility model, in that they have their own security; however, most companies also have other security that they implement on top. That whole shared responsibility model is becoming quite interesting from a cloud perspective, because when something does go wrong, the line's not always clear. Let's say, as an example, that you at Sharesies decide you're not going to have any other security. You're just going to go all in with one of the cloud service providers, and then something happens. Chances are the CSP is going to say, well, it's not my problem, we did our part of the responsibility piece, and you should have had X, Y, Z in there. So that one's quite interesting.

I'd also point to the prevalence of what we were talking about earlier with gen AI and LLMs, and where that's going. We have to be thinking about fighting AI with AI. If whatever organization you're working with from a cyber perspective has not invested in using AI and machine learning tools to get that kind of automation, you're going to be in a real world of hurt, because you cannot fight that with a human. You've got to keep the AI and machine learning smarter than what the bad actors are doing. You absolutely have to fight AI with AI.
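To make the fighting-AI-with-AI idea concrete, here is a minimal sketch of the kind of machine-learning anomaly detection being described, using scikit-learn's IsolationForest. The features, thresholds, and data are invented for illustration; this is not any vendor's actual tooling.

```python
# Illustrative only: a tiny anomaly detector over login events,
# standing in for "AI and machine learning automation" in security.
# The feature set and data are invented for this example.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# One row per login event: [hour_of_day, failed_attempts, MB_downloaded]
normal = np.column_stack([
    rng.normal(13, 2, 500),   # logins cluster around business hours
    rng.poisson(0.2, 500),    # almost no failed attempts
    rng.normal(20, 5, 500),   # modest data transfer per session
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# A 3 a.m. login with many failures and an exfiltration-sized download
suspicious = np.array([[3.0, 9.0, 900.0]])
print(model.predict(suspicious))  # expected: [-1], i.e. flagged as an outlier
```

The point of automating this is scale: a model can score every event as it arrives, which a human analyst cannot.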
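And going back to the shared responsibility model: a simplified sketch of how the split typically falls for an IaaS-style service. The exact boundaries vary by provider, service, and contract, which is exactly why the line is not always clear when something goes wrong.

```python
# Simplified sketch of a shared responsibility split for an
# IaaS-style service; boundaries vary by provider and contract,
# so treat this mapping as illustrative, not contractual.
RESPONSIBILITY_IAAS = {
    "physical data centers":  "provider",
    "hypervisor and hosts":   "provider",
    "network infrastructure": "provider",
    "guest OS patching":      "customer",
    "application code":       "customer",
    "data classification":    "customer",
    "identity and access":    "customer",
}

def who_owns(layer: str) -> str:
    """Return which party is expected to secure a given layer."""
    return RESPONSIBILITY_IAAS.get(layer, "unclear: negotiate it up front")

print(who_owns("guest OS patching"))  # customer
print(who_owns("incident response"))  # unclear: negotiate it up front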
Speaker 1: Has there been a big step up in investment into the AI space?

Speaker 2: Absolutely, yeah, one hundred percent. We've got a few different categories, but it's Precision AI; that's our term, which we've trademarked. From a generative AI perspective, it's making sure that you're able to understand exactly where things are coming from, and that not only is the integrity of the AI correct, but it's also secure. We also talked earlier about cloud and developers, and you'll know a lot of developers are grabbing things to develop through AI now, so we're making sure that's secure. And we have an AI security posture assessment, so we can go into an organization, run some tests and diagnostics, and basically come back and say: this is your security posture from an AI perspective.

Speaker 1: So if you had to summarize what you think should be top of mind for investors listening who are now thinking about cybersecurity, what would that be?

Speaker 2: One thing I haven't talked about: we do think twenty twenty five is the year of the browser. What I mean by that is that such a high percentage of work is done in a browser, if you think about it. Part of the way we've addressed that is that we acquired a company about a year ago, and it's now been integrated in, so we all use it as well. It's called Prisma Access Browser, and it can basically tighten down anything. So within your organization, if you have contractors or you've got BYOD, you're able, from a usage perspective, to say: okay, we know this is safe and secure whenever they're working on our stuff, because we've locked down the browser. In the past, that was a bit of sprawl. So if I were an investor, I'd probably be having a look around that. It's a big bet.

Speaker 1: Something we all just trust and use, isn't it?

Speaker 2: Exactly, and that's the thing. You're spot on. It's something a lot of us trust, our phones and all kinds of different things, but from a security perspective, it's just something else to think about.
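A hypothetical sketch of what an AI security posture assessment like the one mentioned above might check. The asset fields, check names, and scoring are invented for illustration and are not the vendor's actual diagnostics.

```python
# Hypothetical sketch of an "AI security posture" check; the
# controls and names below are invented for illustration only.
from dataclasses import dataclass

@dataclass
class AiAsset:
    name: str
    endpoint_authenticated: bool      # is the model endpoint behind auth?
    training_data_inventoried: bool   # do we know what it was trained on?
    prompt_injection_tested: bool     # has it been probed for injection?
    output_logging_enabled: bool      # are responses logged for review?

def posture_score(asset: AiAsset) -> float:
    """Fraction of baseline controls in place for one AI asset."""
    checks = [
        asset.endpoint_authenticated,
        asset.training_data_inventoried,
        asset.prompt_injection_tested,
        asset.output_logging_enabled,
    ]
    return sum(checks) / len(checks)

bot = AiAsset("support-chatbot", True, False, False, True)
print(f"{bot.name}: {posture_score(bot):.0%} of baseline controls")  # 50%
```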
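And on locking down the browser: a hypothetical sketch of the kind of policy an enterprise browser could enforce for contractors and BYOD devices. The rule names and checks are invented and are not Prisma Access Browser's real configuration.

```python
# Hypothetical sketch of enterprise-browser lockdown rules for
# contractors and BYOD; rule names are invented for illustration.
POLICY = {
    "allowed_domains": {"intranet.example.com", "crm.example.com"},
    "block_downloads": True,
    "require_device_screen_lock": True,
}

def allow_navigation(domain: str, device_has_screen_lock: bool) -> bool:
    """Apply the lockdown rules to a single navigation request."""
    if POLICY["require_device_screen_lock"] and not device_has_screen_lock:
        return False  # an unmanaged BYOD device fails the device check
    return domain in POLICY["allowed_domains"]

print(allow_navigation("crm.example.com", True))  # True: on the allow list
print(allow_navigation("random-site.com", True))  # False: blocked by policy
```

The design idea is that the policy travels with the browser rather than the device, which is what makes contractor and BYOD access tractable.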
Speaker 1: Investing involves the risk that you might lose the money you start with. We recommend talking to a licensed financial advisor. We also recommend reading product disclosure documents before deciding to invest.