1 00:00:02,520 --> 00:00:07,880 Speaker 1: Bloomberg Audio Studios, podcasts, radio, news. 2 00:00:07,960 --> 00:00:11,280 Speaker 2: Nvidia and Cisco announcing plans this week to extend their partnership, 3 00:00:11,320 --> 00:00:15,200 Speaker 2: creating their most advanced AI architecture package that will provide 4 00:00:15,200 --> 00:00:18,880 Speaker 2: secure enterprise AI networking. I'm very pleased to say the 5 00:00:18,960 --> 00:00:22,160 Speaker 2: Cisco Chief Product Officer and Executive Vice President Jeetu 6 00:00:22,160 --> 00:00:24,160 Speaker 2: Patel joins us now for more. Jeetu, it's good to 7 00:00:24,160 --> 00:00:24,640 Speaker 2: see you, sir. 8 00:00:24,800 --> 00:00:25,520 Speaker 3: It's great to see you. 9 00:00:25,560 --> 00:00:27,000 Speaker 2: Thank you for having me. Thank you for being here. 10 00:00:27,040 --> 00:00:28,240 Speaker 2: I've wanted to do this for a while with you, 11 00:00:28,280 --> 00:00:30,400 Speaker 2: we all have. Let's just take a giant step back 12 00:00:30,440 --> 00:00:32,600 Speaker 2: before we get into this partnership. You know what I 13 00:00:32,640 --> 00:00:35,559 Speaker 2: worry about. I worry that AI will do to services 14 00:00:35,560 --> 00:00:39,239 Speaker 2: what globalization did to manufacturing. What is the 15 00:00:39,320 --> 00:00:41,919 Speaker 2: risk of that happening over the next several decades? 16 00:00:42,240 --> 00:00:46,320 Speaker 4: Look, in any replatforming that happens, you'll always have some 17 00:00:46,600 --> 00:00:48,680 Speaker 4: level of displacement of jobs, and we need to make 18 00:00:48,680 --> 00:00:51,040 Speaker 4: sure that we make it as painless as possible for society to 19 00:00:51,400 --> 00:00:54,480 Speaker 4: make that transition. But I worry less about AI 20 00:00:54,600 --> 00:00:57,320 Speaker 4: taking my job. I worry more about someone who uses 21 00:00:57,360 --> 00:00:58,960 Speaker 4: AI really well taking my job.
22 00:01:00,000 --> 00:01:01,160 Speaker 3: So if you take a step back. 23 00:01:01,320 --> 00:01:04,959 Speaker 4: It's actually really exciting to see what the possibilities are 24 00:01:04,959 --> 00:01:08,240 Speaker 4: with AI because, even despite all the hype, 25 00:01:08,240 --> 00:01:12,440 Speaker 4: what people grossly underestimate is the original insights that AI 26 00:01:12,600 --> 00:01:15,480 Speaker 4: is going to be able to go derive that don't 27 00:01:15,480 --> 00:01:17,680 Speaker 4: exist in the human corpus of knowledge, that allow us 28 00:01:17,720 --> 00:01:20,039 Speaker 4: to solve problems that we have not been able to 29 00:01:20,040 --> 00:01:22,520 Speaker 4: solve until now. And so, you know, whether it be 30 00:01:22,520 --> 00:01:26,560 Speaker 4: in healthcare or financial services, or transportation or whatever industry 31 00:01:26,600 --> 00:01:29,160 Speaker 4: you pick, you will actually have a reimagination of the 32 00:01:29,200 --> 00:01:31,160 Speaker 4: ways in which we can solve problems that have not 33 00:01:31,280 --> 00:01:34,000 Speaker 4: been solvable up until now. And that's exciting, frankly, because you 34 00:01:34,120 --> 00:01:37,440 Speaker 4: have this, you know, population of eight billion people that 35 00:01:37,480 --> 00:01:40,880 Speaker 4: will feel like it's got the throughput capacity of eighty billion. 36 00:01:41,080 --> 00:01:43,160 Speaker 2: Are we seeing that happen right now? Is that a 37 00:01:43,240 --> 00:01:45,480 Speaker 2: dream for the future or something that's being realized in 38 00:01:45,520 --> 00:01:46,160 Speaker 2: this very moment? 39 00:01:46,319 --> 00:01:47,640 Speaker 3: You're going to start seeing it happen.
40 00:01:47,680 --> 00:01:51,080 Speaker 4: In twenty twenty five, for example, there will be, you know, 41 00:01:52,480 --> 00:01:56,000 Speaker 4: a meaningful amount of code that will actually start getting 42 00:01:56,000 --> 00:01:59,000 Speaker 4: autonomously written, and so you'll actually start to see so 43 00:01:59,080 --> 00:02:02,840 Speaker 4: much more. The capacity of engineers will actually ten x 44 00:02:03,360 --> 00:02:04,880 Speaker 4: during the course of twenty twenty five. 45 00:02:05,280 --> 00:02:06,920 Speaker 2: It's exciting to see. This, as you know, is the 46 00:02:06,960 --> 00:02:09,320 Speaker 2: societal issue that I'd be slightly concerned about. So there 47 00:02:09,320 --> 00:02:11,239 Speaker 2: are a bunch of graduates who are graduating over the 48 00:02:11,320 --> 00:02:13,480 Speaker 2: next couple of years or so, and they went to 49 00:02:13,560 --> 00:02:15,240 Speaker 2: learn to code because they were told this is what 50 00:02:15,280 --> 00:02:17,119 Speaker 2: you need to go and learn to do. And they're 51 00:02:17,160 --> 00:02:19,600 Speaker 2: coming out into a workforce where maybe they feel like 52 00:02:19,600 --> 00:02:22,120 Speaker 2: they're no longer needed. What's the advice you'd give to 53 00:02:22,200 --> 00:02:23,519 Speaker 2: them as they graduate? 54 00:02:23,919 --> 00:02:27,480 Speaker 4: Here's the thing: I feel like people might 55 00:02:27,560 --> 00:02:30,680 Speaker 4: overestimate the capabilities on that front, because AI is still 56 00:02:30,760 --> 00:02:33,120 Speaker 4: not going to have the human instinct and the judgment 57 00:02:33,280 --> 00:02:34,760 Speaker 4: that you're going to need to have. You're still going 58 00:02:34,800 --> 00:02:37,280 Speaker 4: to need to do oversight. There's a human in the loop.
59 00:02:37,320 --> 00:02:39,600 Speaker 4: But now that developer is going to have a companion 60 00:02:39,720 --> 00:02:41,679 Speaker 4: that can actually do the stuff that they didn't really 61 00:02:41,720 --> 00:02:43,800 Speaker 4: care to do or they weren't as good at doing. 62 00:02:43,919 --> 00:02:46,920 Speaker 4: So this is a net positive in my mind, and 63 00:02:46,960 --> 00:02:48,959 Speaker 4: you're going to see a lot of progress in society 64 00:02:49,000 --> 00:02:51,520 Speaker 4: because of it, rather than a regression that happens. 65 00:02:51,600 --> 00:02:53,800 Speaker 1: Not to build on negativity, because I actually am a 66 00:02:53,880 --> 00:02:57,000 Speaker 1: very big believer in the positivity and I'm excited for it. 67 00:02:57,480 --> 00:02:59,120 Speaker 1: But there is this question of who's going to lose 68 00:02:59,120 --> 00:03:01,799 Speaker 1: their jobs in the meantime, and John alluded to this, 69 00:03:01,800 --> 00:03:03,720 Speaker 1: this idea that it's more of a white collar issue 70 00:03:03,760 --> 00:03:06,840 Speaker 1: than a blue collar issue. And I'm thinking of things 71 00:03:06,880 --> 00:03:08,519 Speaker 1: that we've heard about, like at banks where there have 72 00:03:08,600 --> 00:03:11,080 Speaker 1: been fat fingers that cause transfers or whatever else, and 73 00:03:11,120 --> 00:03:13,960 Speaker 1: that some of those efforts are being shifted to AI. 74 00:03:14,600 --> 00:03:17,240 Speaker 1: How significant do you think that transition is going to be? 75 00:03:17,520 --> 00:03:20,720 Speaker 4: You will always have some degree of jobs that will 76 00:03:20,720 --> 00:03:23,520 Speaker 4: get displaced, and then there'll be new jobs that actually 77 00:03:23,520 --> 00:03:26,359 Speaker 4: get created.
And the thing that we have to keep 78 00:03:26,360 --> 00:03:28,840 Speaker 4: in mind is there's a lot of jobs that 79 00:03:28,960 --> 00:03:31,760 Speaker 4: right now people don't have enough labor to get done, 80 00:03:32,880 --> 00:03:36,120 Speaker 4: that we just don't have the resources 81 00:03:36,160 --> 00:03:37,360 Speaker 4: to do, that actually don't 82 00:03:37,200 --> 00:03:37,920 Speaker 3: happen right now. 83 00:03:38,320 --> 00:03:40,280 Speaker 4: And so what AI will allow us to do is 84 00:03:40,480 --> 00:03:43,200 Speaker 4: not just go do things that humans don't do as well, 85 00:03:43,560 --> 00:03:45,680 Speaker 4: not just do things that replace humans, but actually do 86 00:03:45,760 --> 00:03:47,640 Speaker 4: things that humans don't have the time to get done. 87 00:03:47,840 --> 00:03:50,080 Speaker 1: Which industries do you think are being the most aggressive 88 00:03:50,080 --> 00:03:53,400 Speaker 1: about adapting to different AI tools and bringing them into 89 00:03:53,440 --> 00:03:53,800 Speaker 1: their industry? 90 00:03:53,800 --> 00:03:55,360 Speaker 3: I think it's across the board. You're seeing it right now. 91 00:03:55,440 --> 00:03:58,880 Speaker 4: In fact, I'd say, you know, two years ago, when 92 00:03:58,920 --> 00:04:01,800 Speaker 4: ChatGPT came out, almost every company in every industry 93 00:04:02,000 --> 00:04:04,080 Speaker 4: was asking, what's our AI strategy going to look like? 94 00:04:04,520 --> 00:04:06,280 Speaker 3: And we've now had a couple of years to go 95 00:04:06,360 --> 00:04:06,720 Speaker 3: bake that.
96 00:04:06,920 --> 00:04:09,040 Speaker 4: And in twenty twenty five you're going to see this 97 00:04:09,120 --> 00:04:13,960 Speaker 4: really exciting move from just chatbots to agents, where 98 00:04:14,000 --> 00:04:18,200 Speaker 4: you'll actually have workflows get automated in every industry, in 99 00:04:18,240 --> 00:04:21,160 Speaker 4: every sector, in every segment of the market, and every geography. 100 00:04:21,240 --> 00:04:22,720 Speaker 3: So I feel like, you know. 101 00:04:22,640 --> 00:04:24,159 Speaker 4: There's only going to be two kinds of companies in 102 00:04:24,160 --> 00:04:26,239 Speaker 4: the world: those that are going to be really dexterous 103 00:04:26,279 --> 00:04:27,719 Speaker 4: in the use of AI and others that are going 104 00:04:27,800 --> 00:04:28,880 Speaker 4: to really struggle for relevance. 105 00:04:29,000 --> 00:04:32,120 Speaker 2: Before we talk about facilitating that for corporate America, Jeetu, 106 00:04:32,120 --> 00:04:34,400 Speaker 2: can we talk about how things are changing inside Cisco? 107 00:04:34,800 --> 00:04:38,279 Speaker 2: What's different about how Cisco's operating with these advancements that 108 00:04:38,320 --> 00:04:39,239 Speaker 2: you're talking about? 109 00:04:39,400 --> 00:04:42,320 Speaker 4: We are lucky in the sense that we've actually got 110 00:04:42,320 --> 00:04:45,640 Speaker 4: a front row seat on what's happening, and, you know, 111 00:04:45,760 --> 00:04:48,640 Speaker 4: everything from the way in which we're building out infrastructure 112 00:04:48,680 --> 00:04:50,560 Speaker 4: to the way in which we are getting our security 113 00:04:50,560 --> 00:04:53,159 Speaker 4: and safety parameters in the right way, and then making 114 00:04:53,160 --> 00:04:56,520 Speaker 4: sure that the workflows get automated. So, you know, how 115 00:04:56,520 --> 00:04:59,320 Speaker 4: are we using AI in engineering?
How are we using 116 00:04:59,360 --> 00:05:03,320 Speaker 4: AI in our contact center, so that when customers call because 117 00:05:03,320 --> 00:05:05,680 Speaker 4: they've got a problem, we're using AI to make sure 118 00:05:05,680 --> 00:05:07,560 Speaker 4: that we can help them solve the problems faster? What 119 00:05:07,600 --> 00:05:09,760 Speaker 4: are we doing in legal? What are we doing in marketing? 120 00:05:09,839 --> 00:05:11,760 Speaker 4: What are we doing in sales? Each one of these 121 00:05:11,800 --> 00:05:14,839 Speaker 4: areas is getting rethought and reimagined. It's actually exciting to see. 122 00:05:14,800 --> 00:05:17,600 Speaker 2: The whole world's got to do this. Yes, every company's 123 00:05:17,600 --> 00:05:18,440 Speaker 2: got to do the same thing. 124 00:05:18,920 --> 00:05:20,960 Speaker 3: Every company, every industry. 125 00:05:20,360 --> 00:05:22,960 Speaker 2: This partnership with Nvidia, talk to us about that 126 00:05:23,040 --> 00:05:25,320 Speaker 2: and what's going to change over the next several years 127 00:05:25,320 --> 00:05:26,400 Speaker 2: with that partnership in mind. 128 00:05:26,960 --> 00:05:27,640 Speaker 3: So we're excited. 129 00:05:28,240 --> 00:05:30,480 Speaker 4: Jensen, the CEO of Nvidia, put it best. He said, 130 00:05:30,560 --> 00:05:34,839 Speaker 4: Cisco and Nvidia are putting together the blueprint for securing AI. 131 00:05:35,600 --> 00:05:38,440 Speaker 4: And, John, I go talk to a lot of customers 132 00:05:38,480 --> 00:05:40,440 Speaker 4: all the time, and when you go 133 00:05:40,520 --> 00:05:43,960 Speaker 4: ask customers how enthusiastic they are about AI.
Virtually every 134 00:05:43,960 --> 00:05:46,760 Speaker 4: CEO. We had a study we recently did: ninety seven 135 00:05:46,800 --> 00:05:48,760 Speaker 4: percent of the CEOs said they were excited about AI, 136 00:05:48,800 --> 00:05:52,279 Speaker 4: but only one point seven percent felt prepared, right? And 137 00:05:52,320 --> 00:05:55,280 Speaker 4: so there's this kind of dissonance between the preparedness and 138 00:05:55,320 --> 00:05:57,440 Speaker 4: the level of enthusiasm that's there. And then you ask 139 00:05:57,480 --> 00:06:00,560 Speaker 4: the next level question: why do you not feel prepared? 140 00:06:01,360 --> 00:06:02,680 Speaker 3: There's two reasons that come up. 141 00:06:02,760 --> 00:06:04,640 Speaker 4: The first one is they don't feel like they've got 142 00:06:04,640 --> 00:06:07,760 Speaker 4: the right infrastructure in place to make sure that they'll 143 00:06:07,800 --> 00:06:09,720 Speaker 4: be able to harness the full potential of AI. And 144 00:06:09,760 --> 00:06:12,120 Speaker 4: the second one is safety and security actually holds them 145 00:06:12,120 --> 00:06:14,400 Speaker 4: back, because they feel like it also is going to 146 00:06:14,440 --> 00:06:17,360 Speaker 4: have a negative effect on adoption, because if you don't 147 00:06:17,360 --> 00:06:19,360 Speaker 4: trust the system, you're not going to use it, right? 148 00:06:19,760 --> 00:06:22,560 Speaker 4: And so those are two areas where Cisco, 149 00:06:23,320 --> 00:06:25,600 Speaker 4: you know, helps customers out, and what we've done with 150 00:06:25,760 --> 00:06:28,640 Speaker 4: Nvidia is we just announced a partnership for this thing 151 00:06:28,760 --> 00:06:30,919 Speaker 4: called the Cisco Secure AI Factory. 152 00:06:31,440 --> 00:06:32,600 Speaker 3: Now, what is an AI factory? 153 00:06:32,600 --> 00:06:37,520 Speaker 4: An AI factory essentially is a data center that's meticulously.
154 00:06:36,800 --> 00:06:38,560 Speaker 3: Built for AI workloads. 155 00:06:39,240 --> 00:06:41,680 Speaker 4: And what it allows you to do is make sure 156 00:06:41,720 --> 00:06:45,479 Speaker 4: that you can create these data 157 00:06:45,560 --> 00:06:49,360 Speaker 4: centers in the enterprise that can allow organizations to fully 158 00:06:49,360 --> 00:06:52,360 Speaker 4: harness the potential of AI, and so we are jointly 159 00:06:52,400 --> 00:06:56,480 Speaker 4: partnering over there with our networking, our security, their technologies. Look, 160 00:06:56,520 --> 00:06:59,120 Speaker 4: Cisco is one of the greatest networking companies in the world, 161 00:06:59,440 --> 00:07:02,880 Speaker 4: and Nvidia is a pioneer on the chips and actually 162 00:07:02,920 --> 00:07:04,880 Speaker 4: created, to some degree, the movement in AI. And 163 00:07:04,920 --> 00:07:08,560 Speaker 4: with the combination of the two companies together, I think 164 00:07:08,800 --> 00:07:10,480 Speaker 4: you're going to see more and more of this, where 165 00:07:10,640 --> 00:07:13,120 Speaker 4: companies are going to partner with each other in this 166 00:07:13,200 --> 00:07:18,239 Speaker 4: ecosystem to really go serve the needs of customers and work backwards, 167 00:07:18,320 --> 00:07:20,200 Speaker 4: even if there might be a slight competitive element. 168 00:07:20,400 --> 00:07:22,720 Speaker 1: This is fascinating to me, because what it signifies is 169 00:07:22,720 --> 00:07:25,560 Speaker 1: there's a growing number of companies that don't want to 170 00:07:25,600 --> 00:07:28,640 Speaker 1: rely on the hyperscalers, that don't want to rely on 171 00:07:28,680 --> 00:07:32,480 Speaker 1: the cloud systems to provide the artificial intelligence for their businesses.
172 00:07:32,960 --> 00:07:35,440 Speaker 1: And it raises a question about the model that everyone 173 00:07:35,440 --> 00:07:37,560 Speaker 1: has bought into in the market, which is that they 174 00:07:37,600 --> 00:07:40,920 Speaker 1: will be the big beneficiaries. How much is that changing, 175 00:07:40,960 --> 00:07:45,200 Speaker 1: with companies not wanting to necessarily develop their artificial intelligence 176 00:07:45,240 --> 00:07:47,800 Speaker 1: in the cloud for security reasons, and wanting to have 177 00:07:47,840 --> 00:07:51,680 Speaker 1: it in secure locations that keep their data encapsulated? 178 00:07:51,840 --> 00:07:52,800 Speaker 3: I think you're going to have both. 179 00:07:52,840 --> 00:07:55,480 Speaker 4: I think there's going to be probably the fastest growth 180 00:07:55,480 --> 00:07:59,400 Speaker 4: that hyperscalers have ever experienced with AI, and 181 00:07:59,440 --> 00:08:02,000 Speaker 4: there will be the fastest growth that enterprises have, because 182 00:08:02,000 --> 00:08:04,800 Speaker 4: there will be some data centers that they'll want to 183 00:08:04,800 --> 00:08:07,480 Speaker 4: repatriate back within the enterprise so that they can make 184 00:08:07,480 --> 00:08:10,200 Speaker 4: sure that, especially in areas like inference, you'll actually see 185 00:08:10,200 --> 00:08:10,920 Speaker 4: a lot more of that. 186 00:08:11,520 --> 00:08:12,600 Speaker 3: There's two big areas. 187 00:08:12,640 --> 00:08:15,440 Speaker 4: There's training infrastructure that's needed to train the models, and 188 00:08:15,480 --> 00:08:18,040 Speaker 4: then there's inferencing, which is what happens when you use 189 00:08:18,080 --> 00:08:20,320 Speaker 4: the models. And on the inferencing side, you're seeing a 190 00:08:20,320 --> 00:08:23,560 Speaker 4: fair amount of repatriation of data centers back in the enterprise.
191 00:08:23,600 --> 00:08:24,760 Speaker 4: And I think you're going to see a little bit 192 00:08:24,800 --> 00:08:26,720 Speaker 4: of both. And we want to make sure that whether 193 00:08:26,760 --> 00:08:30,520 Speaker 4: you're a hyperscaler, you're a service provider, or you're an enterprise, 194 00:08:30,960 --> 00:08:32,760 Speaker 4: Cisco wants to serve each one of them in a 195 00:08:32,760 --> 00:08:35,160 Speaker 4: way that they can actually have infrastructure be plug and 196 00:08:35,160 --> 00:08:37,880 Speaker 4: play and massively simplified. 197 00:08:37,880 --> 00:08:38,959 Speaker 3: It's still too complicated. 198 00:08:39,240 --> 00:08:42,240 Speaker 1: What's the role of government in this? There's tremendous competition 199 00:08:42,280 --> 00:08:44,920 Speaker 1: around the world. Obviously, I'm thinking of Washington and Beijing. 200 00:08:44,960 --> 00:08:46,800 Speaker 3: Do you like what you're hearing out of the administration? 201 00:08:47,800 --> 00:08:50,760 Speaker 4: I'm actually very optimistic about the kind of things that 202 00:08:51,480 --> 00:08:54,120 Speaker 4: you're seeing across the board with the enthusiasm on AI. 203 00:08:54,160 --> 00:08:56,800 Speaker 4: If we weren't talking about AI, I would be concerned. 204 00:08:57,400 --> 00:08:59,400 Speaker 4: But right now the governments are talking about it. They're 205 00:08:59,400 --> 00:09:02,160 Speaker 4: talking about how automation happens within the government, how automation 206 00:09:02,280 --> 00:09:04,360 Speaker 4: is happening in the private sector, how they're going 207 00:09:04,400 --> 00:09:06,360 Speaker 4: to make sure that they can accelerate the innovation in 208 00:09:06,440 --> 00:09:06,960 Speaker 4: those areas. 209 00:09:07,000 --> 00:09:08,160 Speaker 3: So I think it's net positive. 210 00:09:08,320 --> 00:09:10,760 Speaker 2: Are we going to be replaced? That's what we care about.
211 00:09:10,880 --> 00:09:13,800 Speaker 4: No, John. You'll still be here five years from now. 212 00:09:13,840 --> 00:09:15,760 Speaker 4: I'm still going to come and talk to you after 213 00:09:15,760 --> 00:09:18,520 Speaker 4: the show, and it's not going to be an AI humanoid. 214 00:09:18,520 --> 00:09:19,439 Speaker 3: It's going to be John. 215 00:09:19,600 --> 00:09:21,720 Speaker 2: That's all I wanted to know. That's all I care about. 216 00:09:21,720 --> 00:09:23,280 Speaker 1: How do you know I haven't already been replaced? 217 00:09:23,320 --> 00:09:24,640 Speaker 2: That would be incredible. 218 00:09:26,000 --> 00:09:26,280 Speaker 3: Sorry. 219 00:09:26,320 --> 00:09:28,240 Speaker 2: Kerry had a guest recently. It was remote and I 220 00:09:28,280 --> 00:09:30,360 Speaker 2: worried it might be AI. Just didn't have a clue. 221 00:09:30,400 --> 00:09:32,440 Speaker 2: You never know, you never know, you never know. 222 00:09:32,520 --> 00:09:34,880 Speaker 3: Jeetu, you're an optimist. Someday this is going to be 223 00:09:34,920 --> 00:09:35,440 Speaker 3: a fun world. 224 00:09:36,160 --> 00:09:38,400 Speaker 2: You're slightly converting me. I'm looking forward to being on this 225 00:09:38,480 --> 00:09:40,400 Speaker 2: journey with you, Jeetu, one of the best. Appreciate it, 226 00:09:40,400 --> 00:09:42,400 Speaker 2: thanks for being with us. Thank you. Jeetu Patel 227 00:09:42,400 --> 00:09:45,640 Speaker 2: there, the Cisco Chief Product Officer and Executive Vice President.