1 00:00:02,720 --> 00:00:09,600 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. Okay, perfect transition to 2 00:00:09,800 --> 00:00:12,280 Speaker 1: get to our next guest, because we're talking about the 3 00:00:12,440 --> 00:00:15,240 Speaker 1: increased energy usage that we've seen. A lot of that 4 00:00:15,280 --> 00:00:17,360 Speaker 1: has to do with AI. Check out all the news 5 00:00:17,360 --> 00:00:20,200 Speaker 1: about AI just in the last couple of days, Carol. 6 00:00:20,200 --> 00:00:22,400 Speaker 1: Lots of headlines. Apple rolling out this five hundred and 7 00:00:22,440 --> 00:00:25,520 Speaker 1: fifty nine dollar iPhone, five hundred and ninety nine dollars, 8 00:00:25,520 --> 00:00:28,920 Speaker 1: excuse me, iPhone sixteen E, with an AI bid for growth. 9 00:00:29,440 --> 00:00:33,040 Speaker 1: Microsoft unveiling AI tools that can create video game scenes 10 00:00:33,080 --> 00:00:36,159 Speaker 1: that would normally have to be programmed and animated by 11 00:00:36,159 --> 00:00:38,440 Speaker 1: a human. It's a model it built using data collected 12 00:00:38,800 --> 00:00:41,560 Speaker 1: from Xbox gamers and their controllers. 13 00:00:41,640 --> 00:00:43,640 Speaker 2: There's so much going on, and don't forget this comes 14 00:00:43,640 --> 00:00:46,080 Speaker 2: after the big news earlier this week. We talked about this. 15 00:00:46,200 --> 00:00:49,360 Speaker 2: Mira Murati, the former CTO over at OpenAI, 16 00:00:49,520 --> 00:00:52,680 Speaker 2: launching a new AI startup. And then you had OpenAI 17 00:00:52,720 --> 00:00:56,160 Speaker 2: co-founder Ilya Sutskever raising more than a 18 00:00:56,240 --> 00:00:59,520 Speaker 2: billion dollars for his startup at a valuation of over thirty 19 00:00:59,680 --> 00:01:00,600 Speaker 2: billion dollars. 20 00:01:00,720 --> 00:01:02,639 Speaker 1: Okay, lots of money coming into this, and Jeetu 21 00:01:02,640 --> 00:01:05,120 Speaker 1: Patel is on top of all of it. He's Executive 22 00:01:05,160 --> 00:01:07,720 Speaker 1: VP and Chief Product Officer at Cisco, where he's charged 23 00:01:07,760 --> 00:01:11,399 Speaker 1: with making sure that Cisco's portfolio works in the world 24 00:01:11,440 --> 00:01:15,440 Speaker 1: of AI. He joins us from Silicon Valley. Jeetu, 25 00:01:16,040 --> 00:01:21,039 Speaker 1: where do you see Cisco's place in this 26 00:01:21,120 --> 00:01:21,960 Speaker 1: new world of AI? 27 00:01:24,160 --> 00:01:26,480 Speaker 3: You know, the way that we think about this, firstly, Carol and Tim, 28 00:01:26,520 --> 00:01:28,959 Speaker 3: thank you for having me on the show. And you know, 29 00:01:29,000 --> 00:01:31,200 Speaker 3: if you look at the way that the world is 30 00:01:31,240 --> 00:01:35,480 Speaker 3: evolving right now, it's moving really fast. But if 31 00:01:35,520 --> 00:01:37,959 Speaker 3: you look back at the Gold Rush, the way to think about Cisco's 32 00:01:37,959 --> 00:01:40,920 Speaker 3: position over here is we're going to provide the infrastructure 33 00:01:41,720 --> 00:01:45,160 Speaker 3: and the safety and security guardrails so that organizations 34 00:01:45,200 --> 00:01:48,720 Speaker 3: can actually use AI scalably in the market.
And if 35 00:01:48,720 --> 00:01:51,160 Speaker 3: you think about a common sentiment that you have right 36 00:01:51,200 --> 00:01:54,240 Speaker 3: now in the market, and we've done a study to 37 00:01:54,280 --> 00:01:59,280 Speaker 3: actually prove this as well, there's an overwhelming majority of 38 00:01:59,320 --> 00:02:03,160 Speaker 3: companies where the CEOs are very excited about the possibilities 39 00:02:03,160 --> 00:02:05,080 Speaker 3: of AI, but virtually all of 40 00:02:05,040 --> 00:02:06,480 Speaker 4: them feel very underprepared. 41 00:02:06,560 --> 00:02:08,880 Speaker 3: In fact, in the study that we did, we found that 42 00:02:09,000 --> 00:02:12,800 Speaker 3: ninety seven percent of the CEOs were actually pushing 43 00:02:13,000 --> 00:02:16,320 Speaker 3: AI projects that they were really excited about, yet only one 44 00:02:16,360 --> 00:02:19,360 Speaker 3: point seven percent of them felt fully prepared, and I 45 00:02:19,400 --> 00:02:22,120 Speaker 3: totally get it. I mean, this is a 46 00:02:22,200 --> 00:02:25,000 Speaker 3: very fast moving market. People have to kind of think differently, 47 00:02:25,639 --> 00:02:28,480 Speaker 3: and so that's an area where we feel like, you know, 48 00:02:28,560 --> 00:02:32,280 Speaker 3: having preparedness and infrastructure, making sure that safety and security 49 00:02:32,800 --> 00:02:36,320 Speaker 3: don't become impediments for the adoption of AI, and having 50 00:02:36,400 --> 00:02:37,600 Speaker 3: the skills 51 00:02:37,120 --> 00:02:41,560 Speaker 4: gap really get addressed are the three 52 00:02:41,360 --> 00:02:44,160 Speaker 3: big challenges that most organizations have that we at Cisco 53 00:02:44,240 --> 00:02:46,880 Speaker 3: want to try to go out and help our customers 54 00:02:46,880 --> 00:02:47,919 Speaker 3: with. 55 00:02:47,919 --> 00:02:49,320 Speaker 2: Jeetu, one thing I think about is, you know, there's the 56 00:02:49,360 --> 00:02:52,799 Speaker 2: idea of move fast, break things, I get it. There's also, well, 57 00:02:52,840 --> 00:02:54,760 Speaker 2: wait a minute, we want to see where the dust 58 00:02:54,760 --> 00:02:57,240 Speaker 2: settles when it comes to AI a little bit, because 59 00:02:57,720 --> 00:02:59,920 Speaker 2: we're at this point more than two years in 60 00:03:00,040 --> 00:03:03,359 Speaker 2: of just the euphoria and the build and the spend, 61 00:03:03,480 --> 00:03:05,760 Speaker 2: and now we want to see, okay, what do we 62 00:03:05,760 --> 00:03:07,800 Speaker 2: get for our money? Is there a little bit maybe 63 00:03:08,320 --> 00:03:11,639 Speaker 2: of that going on, that there are executives, especially at 64 00:03:11,639 --> 00:03:14,000 Speaker 2: smaller, mid-sized companies, who are saying, we've got to 65 00:03:14,040 --> 00:03:16,760 Speaker 2: kind of hold off until we really understand how do 66 00:03:16,760 --> 00:03:17,680 Speaker 2: we monetize this? 67 00:03:19,440 --> 00:03:21,680 Speaker 3: Yes, if you take a step back and, broadly, 68 00:03:22,160 --> 00:03:24,160 Speaker 3: take a look at what's happening 69 00:03:24,160 --> 00:03:26,239 Speaker 3: in the market, there's only two kinds of companies that 70 00:03:26,280 --> 00:03:29,119 Speaker 3: are going to exist in the market.
There's a first kind, 71 00:03:29,120 --> 00:03:31,239 Speaker 3: which is going to get highly dexterous with the use 72 00:03:31,240 --> 00:03:33,880 Speaker 3: of AI, and the second, which is going to be, 73 00:03:34,840 --> 00:03:37,200 Speaker 3: you know, struggling for relevance. Now, if you look at 74 00:03:37,240 --> 00:03:40,960 Speaker 3: the great ones, the majority of them are getting slowed down 75 00:03:41,400 --> 00:03:46,240 Speaker 3: because of a few very foundational reasons. The first one is, 76 00:03:47,360 --> 00:03:49,480 Speaker 3: you know, safety and security gets to be a big concern, 77 00:03:49,560 --> 00:03:52,280 Speaker 3: especially for large regulated organizations. They want to make sure 78 00:03:52,280 --> 00:03:55,320 Speaker 3: that they tread carefully on that front. And as you 79 00:03:55,400 --> 00:03:58,520 Speaker 3: have things like DeepSeek and, you know, the cost 80 00:03:58,560 --> 00:04:01,000 Speaker 3: of models gets to be lower and lower, and the 81 00:04:01,000 --> 00:04:03,880 Speaker 3: cost curve goes down precipitously, you're going to have more 82 00:04:03,880 --> 00:04:06,320 Speaker 3: and more models that are out there. And these models 83 00:04:06,320 --> 00:04:09,760 Speaker 3: by definition are unpredictable, and we've found that it's pretty 84 00:04:09,800 --> 00:04:12,480 Speaker 3: easy to jailbreak these models, and so that safety and 85 00:04:12,560 --> 00:04:14,200 Speaker 3: security concern is a real one. 86 00:04:14,880 --> 00:04:16,360 Speaker 4: The second area is that 87 00:04:16,279 --> 00:04:19,680 Speaker 3: people just aren't quite equipped for going out and making 88 00:04:19,720 --> 00:04:23,000 Speaker 3: sure that the infrastructure is ready for AI. And then the 89 00:04:23,040 --> 00:04:25,360 Speaker 3: third one is this notion of skills, and if you 90 00:04:25,400 --> 00:04:28,960 Speaker 3: think about skills, I personally believe that staying on the 91 00:04:29,040 --> 00:04:32,320 Speaker 3: sidelines is not going to help you go up the 92 00:04:32,400 --> 00:04:35,440 Speaker 3: learning curve. So you have to jump in, you 93 00:04:35,480 --> 00:04:36,960 Speaker 3: have to jump into the fray and make sure that 94 00:04:37,000 --> 00:04:41,000 Speaker 3: you're starting to take some projects and learn from those 95 00:04:41,040 --> 00:04:45,000 Speaker 3: projects and have these practical experiences as a company, so 96 00:04:45,040 --> 00:04:47,200 Speaker 3: that you get a better feel, a better intuition of 97 00:04:47,200 --> 00:04:49,360 Speaker 3: what's happening in AI and what it's going to take 98 00:04:49,400 --> 00:04:52,200 Speaker 3: to succeed. But just staying on the sidelines, I think, 99 00:04:52,279 --> 00:04:55,080 Speaker 3: is a really bad strategy that's going to let others 100 00:04:55,120 --> 00:04:58,279 Speaker 3: get ahead of you, and you'll 101 00:04:57,680 --> 00:04:58,279 Speaker 4: be left behind. 102 00:04:58,839 --> 00:05:02,160 Speaker 1: One thing I love to hear from folks in your position, 103 00:05:02,320 --> 00:05:05,120 Speaker 1: who are interacting with customers each and every day, is 104 00:05:05,440 --> 00:05:09,480 Speaker 1: to try to get an understanding of where demand is, who's buying, 105 00:05:09,560 --> 00:05:12,560 Speaker 1: what the spend is when it comes to IT. What 106 00:05:12,560 --> 00:05:16,600 Speaker 1: can you tell us about the environment right now? 107 00:05:16,360 --> 00:05:19,440 Speaker 4: So there's no shortage of demand signal on AI right now.
108 00:05:19,480 --> 00:05:22,120 Speaker 3: In fact, what's happening is there's a lot of money 109 00:05:22,200 --> 00:05:25,760 Speaker 3: coming in from different parts of IT that is going into AI. 110 00:05:26,240 --> 00:05:29,000 Speaker 3: And then the question becomes, what are the use cases 111 00:05:29,040 --> 00:05:30,839 Speaker 3: where AI is being used really well? So if you 112 00:05:30,880 --> 00:05:34,560 Speaker 3: think about some of the, you know, really highly adopted 113 00:05:34,640 --> 00:05:39,880 Speaker 3: use cases right now in large companies, software coding and 114 00:05:39,920 --> 00:05:44,359 Speaker 3: software development: in twenty twenty five, you will actually be 115 00:05:44,440 --> 00:05:50,760 Speaker 3: able to have, you know, a meaningful amount of your 116 00:05:50,800 --> 00:05:53,279 Speaker 3: code for a mid-level engineer actually be done 117 00:05:53,279 --> 00:05:55,640 Speaker 3: by AI. And so that's an area that's going to 118 00:05:55,640 --> 00:05:59,680 Speaker 3: be, you know, hugely growing. The second area that we've 119 00:05:59,720 --> 00:06:02,360 Speaker 3: seen a fair amount of movement and momentum on is 120 00:06:02,440 --> 00:06:05,279 Speaker 3: this notion of a contact center agent. Each one of us, 121 00:06:05,640 --> 00:06:08,080 Speaker 3: Carol and Tim, we all probably will spend about four 122 00:06:08,080 --> 00:06:11,400 Speaker 3: to six weeks of our lifetimes on hold waiting for 123 00:06:11,880 --> 00:06:14,600 Speaker 3: some agent when you call a company and need 124 00:06:14,640 --> 00:06:17,120 Speaker 3: to make sure that they actually solve your problem. That's 125 00:06:17,160 --> 00:06:20,440 Speaker 3: an enormous waste of time for humanity at large. And 126 00:06:21,240 --> 00:06:23,480 Speaker 3: I think ninety to ninety five 127 00:06:23,520 --> 00:06:27,159 Speaker 3: percent of these calls will be addressed with a voice 128 00:06:27,200 --> 00:06:29,159 Speaker 3: agent that is going to be an AI agent, where 129 00:06:29,200 --> 00:06:31,040 Speaker 3: it'll sound just like a human. 130 00:06:32,720 --> 00:06:34,520 Speaker 1: Really? I'm really still waiting for that to happen. 131 00:06:34,560 --> 00:06:37,200 Speaker 2: One that actually works and is, I don't know, like, 132 00:06:37,240 --> 00:06:39,680 Speaker 2: I do feel like we've all gone through automated 133 00:06:39,240 --> 00:06:43,680 Speaker 1: systems and, representative, representative, uh, representative. 134 00:06:44,120 --> 00:06:44,719 Speaker 4: That's what I do. 135 00:06:45,680 --> 00:06:47,000 Speaker 3: That's exactly right. 136 00:06:47,920 --> 00:06:51,680 Speaker 2: So is it really going to be a very different experience? 137 00:06:53,080 --> 00:06:54,800 Speaker 3: Well, I think one of the challenges that we've had, 138 00:06:54,839 --> 00:06:59,159 Speaker 3: if you think about, you know, calling an agent: 139 00:06:59,200 --> 00:07:02,440 Speaker 3: when you have something that's an automated agent, the reason 140 00:07:02,480 --> 00:07:04,440 Speaker 3: it's so frustrating is that it sounds like a 141 00:07:04,520 --> 00:07:08,960 Speaker 3: robot and you can't interrupt it. It doesn't understand tonality, 142 00:07:08,960 --> 00:07:12,400 Speaker 3: it doesn't understand expression, doesn't understand the emotion that the 143 00:07:12,440 --> 00:07:15,480 Speaker 3: customer might be feeling.
And so what you're now starting 144 00:07:15,520 --> 00:07:18,640 Speaker 3: to see happen is these voice agents are getting so 145 00:07:18,640 --> 00:07:22,240 Speaker 3: sophisticated that you can interrupt its speech. It speaks at 146 00:07:22,240 --> 00:07:24,520 Speaker 3: the speed that a human would be able to speak. 147 00:07:24,560 --> 00:07:27,720 Speaker 3: You can redirect it, you can throw curveballs. It'll understand 148 00:07:27,800 --> 00:07:29,800 Speaker 3: your tone if you're frustrated. It's not going to ask 149 00:07:29,840 --> 00:07:34,040 Speaker 3: you silly questions when you're frustrated, things 150 00:07:34,080 --> 00:07:36,000 Speaker 3: like how's the weather, how are you feeling, because that's 151 00:07:36,040 --> 00:07:38,920 Speaker 3: not what the customer might want to 152 00:07:38,960 --> 00:07:43,080 Speaker 3: talk about. So those kinds of dynamics will fundamentally change 153 00:07:43,560 --> 00:07:45,480 Speaker 3: the take rate and adoption of this. 154 00:07:45,560 --> 00:07:46,520 Speaker 4: So in the next two 155 00:07:46,440 --> 00:07:51,400 Speaker 3: years, I assert that it'll probably be about eighty to 156 00:07:51,440 --> 00:07:55,120 Speaker 3: eighty five percent of the inbound calls that come in 157 00:07:55,200 --> 00:07:57,040 Speaker 3: will be able to be handled by an AI agent, 158 00:07:57,080 --> 00:07:59,559 Speaker 3: and for the ones that aren't handled by an AI agent, 159 00:08:00,040 --> 00:08:01,880 Speaker 3: the AI agent will be smart enough to hand it 160 00:08:01,920 --> 00:08:04,960 Speaker 3: off to a human agent with all the context, so 161 00:08:05,000 --> 00:08:07,000 Speaker 3: you don't have to repeat your account number again and 162 00:08:07,040 --> 00:08:09,920 Speaker 3: you don't have to go out and, you know, kind 163 00:08:09,920 --> 00:08:12,320 Speaker 3: of restate your problem, just the way that we actually 164 00:08:12,360 --> 00:08:15,240 Speaker 3: have to right now. And that's what gets the frustration levels high. 165 00:08:15,720 --> 00:08:18,680 Speaker 3: And I'm really hopeful about that entire category. 166 00:08:18,720 --> 00:08:20,600 Speaker 3: So that's the second category where you're seeing a fair 167 00:08:20,640 --> 00:08:24,680 Speaker 3: amount of momentum, and then there's many more use 168 00:08:24,720 --> 00:08:27,520 Speaker 3: cases and automated workflows that you'll start to just see 169 00:08:27,560 --> 00:08:33,120 Speaker 3: get more and more proliferated throughout the entire corporate landscape. 170 00:08:33,200 --> 00:08:35,560 Speaker 2: Well, don't judge me, Jeetu, but I just started 171 00:08:35,559 --> 00:08:38,400 Speaker 2: really playing with ChatGPT, and I came in and 172 00:08:38,440 --> 00:08:40,440 Speaker 2: I was, like, kind of blown away. I came in 173 00:08:40,440 --> 00:08:42,440 Speaker 2: one morning like, Tim, you've got to start playing with it. 174 00:08:42,320 --> 00:08:43,880 Speaker 1: You're saying soon you're going to be paying two hundred 175 00:08:43,880 --> 00:08:45,760 Speaker 1: bucks a month for the agent future. 176 00:08:46,000 --> 00:08:46,440 Speaker 4: I see it. 177 00:08:46,600 --> 00:08:47,760 Speaker 2: Everything's a monthly fee. 178 00:08:48,520 --> 00:08:50,640 Speaker 3: I don't know if you folks have tried out this Operator. 179 00:08:50,800 --> 00:08:53,839 Speaker 3: It is unbelievable what it can do, because... 180 00:08:54,800 --> 00:08:57,079 Speaker 1: The two hundred dollar a month version, right? Yeah.
181 00:08:56,920 --> 00:08:58,960 Speaker 3: It's the two hundred dollars a month version. And 182 00:08:59,000 --> 00:09:00,920 Speaker 3: by the way, the interesting part is they're losing money 183 00:09:00,920 --> 00:09:02,719 Speaker 3: on the two hundred dollars a month, which tells you 184 00:09:03,200 --> 00:09:06,320 Speaker 3: the amount of impact and the amount of adoption rate 185 00:09:06,360 --> 00:09:08,600 Speaker 4: that there is. Most people think it's bad. I think it's 186 00:09:08,600 --> 00:09:09,200 Speaker 4: a really 187 00:09:08,960 --> 00:09:12,040 Speaker 3: good sign for AI, because there's no shortage of people 188 00:09:12,120 --> 00:09:14,600 Speaker 3: wanting to use this. They're solving a real problem. So 189 00:09:14,760 --> 00:09:18,480 Speaker 3: you can literally, Carol, tell this agent, hey, I 190 00:09:18,760 --> 00:09:20,640 Speaker 3: want to go out for a movie with my spouse 191 00:09:20,679 --> 00:09:22,480 Speaker 3: tonight and I want to make sure that I can 192 00:09:22,559 --> 00:09:25,760 Speaker 3: have dinner. Can you go find me a dinner reservation? 193 00:09:27,200 --> 00:09:28,319 Speaker 4: And it'll just do it for you. 194 00:09:29,000 --> 00:09:32,240 Speaker 2: Yeah, listen, come back soon. I really would like to 195 00:09:32,280 --> 00:09:35,000 Speaker 2: continue this conversation with you. Really enjoyed it. Jeetu Patel, 196 00:09:35,320 --> 00:09:38,240 Speaker 2: executive vice president and chief product officer over at Cisco, 197 00:09:38,640 --> 00:09:40,800 Speaker 2: joining us from Silicon Valley.