1 00:00:02,520 --> 00:00:07,040 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. 2 00:00:07,800 --> 00:00:09,119 Speaker 2: Let's head over to Davos. 3 00:00:09,160 --> 00:00:13,040 Speaker 3: Salesforce CEO Marc Benioff speaks with Bloomberg's Brad Stone and Mustapha. 4 00:00:13,080 --> 00:00:15,560 Speaker 3: They were not getting along well on panels that they were 5 00:00:15,680 --> 00:00:17,680 Speaker 3: on here at Davos. You can go back and see that. 6 00:00:18,320 --> 00:00:21,440 Speaker 3: I remember one specific panel that I was in and 7 00:00:21,520 --> 00:00:24,000 Speaker 3: it was thought they were going to come to blows, 8 00:00:24,000 --> 00:00:25,120 Speaker 3: but you know, it didn't happen. 9 00:00:25,360 --> 00:00:28,240 Speaker 1: And so has Larry Ellison and Oracle, you think, stepped 10 00:00:28,240 --> 00:00:32,800 Speaker 1: in as a technology partner or a source of funding 11 00:00:32,880 --> 00:00:33,520 Speaker 1: for OpenAI? 12 00:00:35,000 --> 00:00:37,959 Speaker 3: I can't speak to those specifics, but what I can 13 00:00:38,000 --> 00:00:40,920 Speaker 3: tell you is that I think it's a big moment 14 00:00:40,960 --> 00:00:43,280 Speaker 3: in the history of OpenAI that they're not running 15 00:00:43,320 --> 00:00:44,480 Speaker 3: on a Microsoft data center. 16 00:00:44,520 --> 00:00:48,680 Speaker 1: And how does it impact, I guess, Microsoft's competitiveness 17 00:00:48,920 --> 00:00:52,159 Speaker 1: on AI going forward? We'll talk about Copilot and the like. 18 00:00:52,240 --> 00:00:56,200 Speaker 1: But did you feel like, you know, because of these developments, 19 00:00:57,000 --> 00:01:00,480 Speaker 1: Satya and Microsoft are in an unfavorable strategic position? 20 00:01:00,840 --> 00:01:03,360 Speaker 3: Well, I think, as you know, Microsoft doesn't have today 21 00:01:03,400 --> 00:01:06,759 Speaker 3: their own frontier model, so they rely on OpenAI. 22 00:01:06,920 --> 00:01:09,240 Speaker 3: And now OpenAI is saying that they're going to 23 00:01:09,319 --> 00:01:12,800 Speaker 3: another data center structure. So it's a big moment. 24 00:01:12,720 --> 00:01:17,360 Speaker 1: Right, and just to close the loop on Stargate. You know, 25 00:01:17,600 --> 00:01:20,759 Speaker 1: part of this is obviously driven by national security concerns 26 00:01:20,760 --> 00:01:24,000 Speaker 1: and the idea that, you know, it is vitally 27 00:01:24,040 --> 00:01:26,399 Speaker 1: important for the US to maintain this lead. I mean, 28 00:01:26,440 --> 00:01:28,760 Speaker 1: do you share the concern and the urgency, and do 29 00:01:28,800 --> 00:01:32,959 Speaker 1: you think China is close to AI parity? 30 00:01:33,160 --> 00:01:38,520 Speaker 3: Well, you have, what, six questions in there. Let's kind 31 00:01:38,520 --> 00:01:41,320 Speaker 3: of peel it apart one by one. I think that 32 00:01:41,400 --> 00:01:46,520 Speaker 3: we're just seeing an incredible amount of investment in AI overall, 33 00:01:46,600 --> 00:01:50,400 Speaker 3: which is because it's really exciting. Look at what's happening 34 00:01:50,440 --> 00:01:52,480 Speaker 3: here at the conference. I just showed you, you know, that 35 00:01:52,560 --> 00:01:57,040 Speaker 3: Salesforce has run the information management for the World Economic 36 00:01:57,080 --> 00:02:01,160 Speaker 3: Forum for more than a decade. So everything 37 00:02:01,200 --> 00:02:05,120 Speaker 3: that's happening here is running on Salesforce.
And we've 38 00:02:05,200 --> 00:02:08,679 Speaker 3: used our Einstein platform here for years, but this is 39 00:02:08,720 --> 00:02:11,440 Speaker 3: the year that we're using Agentforce. So now you 40 00:02:11,560 --> 00:02:13,600 Speaker 3: have an agent that's running the front end of your app. 41 00:02:13,720 --> 00:02:18,960 Speaker 3: Probably. Okay, well, what it is, is you 42 00:02:19,080 --> 00:02:22,480 Speaker 3: have the ability now to use an agent, a kind 43 00:02:22,520 --> 00:02:25,280 Speaker 3: of, almost like a digital worker, to help you to 44 00:02:25,680 --> 00:02:30,160 Speaker 3: kind of get your job done. So here I am, 45 00:02:30,240 --> 00:02:33,000 Speaker 3: I have my app, I'm on my phone. I'm normally 46 00:02:33,080 --> 00:02:35,400 Speaker 3: just bringing up my World Economic Forum app. I'm going 47 00:02:35,480 --> 00:02:37,600 Speaker 3: through trying to find all my sessions. So when is 48 00:02:37,639 --> 00:02:40,840 Speaker 3: Brad speaking? How do I know Brad? Can I contact 49 00:02:40,840 --> 00:02:43,480 Speaker 3: Brad? Instead, now I can just say to the agent, hey, 50 00:02:43,800 --> 00:02:46,480 Speaker 3: when is Brad speaking? And let it contact him for 51 00:02:46,639 --> 00:02:49,760 Speaker 3: me and help me make this happen. At Salesforce, we 52 00:02:49,919 --> 00:02:54,040 Speaker 3: just deployed our Agentforce technology on help dot salesforce dot com, 53 00:02:54,040 --> 00:02:58,040 Speaker 3: which is how we deliver help to our customers. Thirty 54 00:02:58,080 --> 00:03:01,160 Speaker 3: six thousand customers a week come to help dot salesforce 55 00:03:01,240 --> 00:03:03,680 Speaker 3: dot com. If you had come there a month or 56 00:03:03,680 --> 00:03:07,440 Speaker 3: so ago, about ten thousand of those customers had 57 00:03:07,520 --> 00:03:10,080 Speaker 3: to move off and get support from human beings, and 58 00:03:10,120 --> 00:03:13,200 Speaker 3: about twenty six thousand got resolved by our software. Now 59 00:03:13,240 --> 00:03:16,959 Speaker 3: with Agentforce, only five thousand are going off to 60 00:03:17,040 --> 00:03:19,280 Speaker 3: human beings. So our nine thousand support agents have a 61 00:03:19,320 --> 00:03:21,920 Speaker 3: little bit less to do because the software is able 62 00:03:21,919 --> 00:03:23,600 Speaker 3: to resolve a lot of this on its own. And 63 00:03:23,639 --> 00:03:27,400 Speaker 3: that's a very exciting moment, that AI and agents can 64 00:03:27,440 --> 00:03:29,840 Speaker 3: kind of give us this incredible productivity that we couldn't 65 00:03:29,880 --> 00:03:32,760 Speaker 3: have, whether we're trying to navigate the World Economic Forum 66 00:03:32,760 --> 00:03:36,400 Speaker 3: in Davos or whether Marc Benioff is trying to answer his customers' 67 00:03:37,200 --> 00:03:40,160 Speaker 3: support issues. Agents are making a huge difference in our 68 00:03:40,160 --> 00:03:41,200 Speaker 3: ability to be successful. 69 00:03:41,240 --> 00:03:42,840 Speaker 1: I want to come back to that, but I just 70 00:03:42,880 --> 00:03:45,480 Speaker 1: want to make sure we answered the previous question on, 71 00:03:45,640 --> 00:03:49,280 Speaker 1: kind of, AI parity between China and the US. 72 00:03:49,360 --> 00:03:51,280 Speaker 3: I don't think there is AI parity today. No.
I 73 00:03:51,320 --> 00:03:54,360 Speaker 3: think that the US has a definitive lead, and I 74 00:03:54,400 --> 00:03:57,560 Speaker 3: think most of the analysis shows that. And I think obviously you 75 00:03:57,560 --> 00:03:59,600 Speaker 3: have a big shootout going on at the very top 76 00:03:59,680 --> 00:04:02,960 Speaker 3: end of the frontier models between Google, OpenAI, 77 00:04:03,960 --> 00:04:09,240 Speaker 3: and Anthropic. Salesforce is obviously an investor in Anthropic. We 78 00:04:09,280 --> 00:04:14,720 Speaker 3: obviously have relationships with OpenAI as well, and also Google. 79 00:04:16,080 --> 00:04:20,440 Speaker 3: The Google technology I think is amazing. Niall Ferguson just 80 00:04:20,480 --> 00:04:23,120 Speaker 3: wrote a tremendous analysis of the three models and kind 81 00:04:23,160 --> 00:04:26,560 Speaker 3: of said it appears that Google, so Gemini, is a player. 82 00:04:26,920 --> 00:04:29,320 Speaker 3: I think Gemini is more than a player. It's probably ahead. 83 00:04:30,279 --> 00:04:32,400 Speaker 1: Okay. And by the way, Dario preceded you on stage 84 00:04:32,400 --> 00:04:33,400 Speaker 1: earlier this afternoon. 85 00:04:33,920 --> 00:04:35,920 Speaker 3: He has an outstanding model and is doing a great 86 00:04:35,960 --> 00:04:36,760 Speaker 3: job in his company. 87 00:04:37,040 --> 00:04:38,320 Speaker 2: Okay. So back to Agentforce. 88 00:04:38,320 --> 00:04:42,840 Speaker 1: You said a few less things for your employees to do 89 00:04:42,920 --> 00:04:45,360 Speaker 1: with Agentforce. But does it also 90 00:04:45,160 --> 00:04:46,400 Speaker 2: mean fewer employees? 91 00:04:46,440 --> 00:04:49,960 Speaker 1: What are the ramifications of this kind of new age 92 00:04:50,000 --> 00:04:54,960 Speaker 1: of agentic models going off unsupervised to do our tasks for us? 93 00:04:55,040 --> 00:04:57,960 Speaker 3: No question. I mean, of course, in history, look, 94 00:04:58,000 --> 00:05:01,160 Speaker 3: technology is always getting lower cost and easier to use. 95 00:05:01,200 --> 00:05:03,720 Speaker 3: There aren't any, you know, horses out on the streets 96 00:05:03,800 --> 00:05:06,760 Speaker 3: right now; we have cars now. 97 00:05:07,040 --> 00:05:08,960 Speaker 3: And in the same way, I think we should just 98 00:05:09,000 --> 00:05:11,800 Speaker 3: think about the analogy I just gave you, which is 99 00:05:11,800 --> 00:05:17,000 Speaker 3: that we have a lot less work for our support 100 00:05:17,040 --> 00:05:19,839 Speaker 3: agents to do because we've deployed Agentforce at Salesforce, 101 00:05:20,240 --> 00:05:23,840 Speaker 3: so I can redeploy that resource into other parts of 102 00:05:23,880 --> 00:05:26,480 Speaker 3: the company. That's very exciting for me. I have plenty 103 00:05:26,520 --> 00:05:29,960 Speaker 3: to do, so I can kind of say, okay, these 104 00:05:29,960 --> 00:05:32,320 Speaker 3: folks can work on this now instead of being over here. 105 00:05:32,440 --> 00:05:36,000 Speaker 1: Has it changed the arc of your hiring plans going forward? 106 00:05:36,000 --> 00:05:39,280 Speaker 1: And what does the widespread deployment of this technology mean 107 00:05:39,360 --> 00:05:41,840 Speaker 1: for you in a full employment labor market? 108 00:05:42,960 --> 00:05:45,560 Speaker 3: Well, that is really the powerful thing. We're in a 109 00:05:45,640 --> 00:05:48,080 Speaker 3: labor market where it's really hard to hire people.
There 110 00:05:48,120 --> 00:05:51,600 Speaker 3: aren't people to hire. I want to radically expand sales 111 00:05:51,640 --> 00:05:54,479 Speaker 3: and service and marketing at Salesforce because we are seeing 112 00:05:54,880 --> 00:05:58,960 Speaker 3: a huge amount of demand in moving and deploying our 113 00:05:59,000 --> 00:06:02,520 Speaker 3: new technologies. Finding those people is incredibly difficult. That I have 114 00:06:02,600 --> 00:06:06,360 Speaker 3: agents at my disposal is tremendous. So look, I want 115 00:06:06,360 --> 00:06:10,720 Speaker 3: an unlimited workforce. I think everybody does. Agentforce, AI agents, 116 00:06:11,240 --> 00:06:15,320 Speaker 3: that's the beginning of an unlimited workforce. I'll give you an example. We 117 00:06:15,360 --> 00:06:18,919 Speaker 3: have a customer called Wiley. They're a bookseller in the US. 118 00:06:19,000 --> 00:06:20,840 Speaker 3: You probably have picked up some of their books when 119 00:06:20,839 --> 00:06:24,360 Speaker 3: you were in college. You know them well. And you know, 120 00:06:24,560 --> 00:06:27,800 Speaker 3: they have this moment every year called back to school, 121 00:06:27,839 --> 00:06:29,560 Speaker 3: and they normally have to hire a lot of gig 122 00:06:29,600 --> 00:06:31,839 Speaker 3: workers to go into sales or into service or whatever 123 00:06:31,880 --> 00:06:33,720 Speaker 3: it is. This is a year they did not have 124 00:06:33,800 --> 00:06:36,359 Speaker 3: to do it. They were one of our launch customers 125 00:06:36,400 --> 00:06:39,440 Speaker 3: on Agentforce. They have the ability to just scale 126 00:06:39,640 --> 00:06:41,840 Speaker 3: their sales and service when they needed it and then 127 00:06:41,880 --> 00:06:45,720 Speaker 3: bring it back down when that demand spike ended. And 128 00:06:46,080 --> 00:06:49,160 Speaker 3: we have now, you know, lots of examples of that. 129 00:06:49,200 --> 00:06:53,920 Speaker 3: So we released the code late October. We did 130 00:06:53,920 --> 00:06:56,080 Speaker 3: about two hundred deals in 131 00:06:56,040 --> 00:06:57,360 Speaker 2: our third quarter. 132 00:06:57,560 --> 00:07:02,120 Speaker 3: In our fourth quarter, we'll see thousands of Agentforce deals, 133 00:07:02,400 --> 00:07:06,680 Speaker 3: and I've never seen anything go as fast at Salesforce 134 00:07:06,760 --> 00:07:10,160 Speaker 3: as this. I've never been as excited about the technology 135 00:07:10,160 --> 00:07:11,360 Speaker 3: industry as I am right now. 136 00:07:11,520 --> 00:07:14,240 Speaker 1: So in the tech industry, we love a good old 137 00:07:14,240 --> 00:07:16,240 Speaker 1: fashioned feud. We used to have a lot of them, 138 00:07:16,360 --> 00:07:19,960 Speaker 1: like Larry and Bill sparring. You know, Kendrick Lamar and 139 00:07:20,040 --> 00:07:23,280 Speaker 1: Drake have nothing on the tech industry. But recently, 140 00:07:23,320 --> 00:07:27,320 Speaker 1: you know, you've been tweeting, or posting on X, as 141 00:07:27,360 --> 00:07:32,640 Speaker 1: you describe Agentforce and drawing juxtapositions with Microsoft and 142 00:07:32,800 --> 00:07:36,720 Speaker 1: Copilot. Not to do more Microsoft bashing here, but 143 00:07:37,040 --> 00:07:39,280 Speaker 1: I think you compared it to... Yeah. 144 00:07:39,400 --> 00:07:41,640 Speaker 2: No, but I want you to, because I don't know 145 00:07:41,680 --> 00:07:44,600 Speaker 2: that I really fully got my brain around it.
146 00:07:44,720 --> 00:07:48,960 Speaker 1: How is the Agentforce model different from the Copilot model? 147 00:07:48,760 --> 00:07:51,000 Speaker 3: I can make it very simple. Well, first of all, 148 00:07:51,040 --> 00:07:54,280 Speaker 3: you're right, Microsoft has disappointed, you know, a lot of 149 00:07:54,320 --> 00:07:56,560 Speaker 3: customers with Copilot, and the reason why is... 150 00:07:56,520 --> 00:07:57,880 Speaker 2: I'm not sure everybody agrees. 151 00:07:57,920 --> 00:08:00,520 Speaker 3: Okay, well, I mean, you can read the Gartner analysis, 152 00:08:00,520 --> 00:08:03,040 Speaker 3: you can read a lot of the media. But I'm 153 00:08:03,040 --> 00:08:05,440 Speaker 3: sure you're a big Copilot user yourself, is that right? 154 00:08:05,680 --> 00:08:07,800 Speaker 1: I try to moderate my... Oh, I see, you know, 155 00:08:07,920 --> 00:08:09,680 Speaker 1: I don't want to... All right, but you're like most people, 156 00:08:09,760 --> 00:08:12,200 Speaker 1: you don't use it. All right, thank you, thank you 157 00:08:12,240 --> 00:08:13,040 Speaker 1: for making my case. 158 00:08:13,080 --> 00:08:17,520 Speaker 3: But anyway, the point of it is, all Copilot is 159 00:08:17,520 --> 00:08:20,080 Speaker 3: is repackaged ChatGPT. So in a chart I just 160 00:08:20,120 --> 00:08:23,560 Speaker 3: saw of, like, the twenty most used apps currently on the Internet, 161 00:08:23,600 --> 00:08:26,440 Speaker 3: OpenAI was very much at the top. 162 00:08:26,560 --> 00:08:29,240 Speaker 3: Copilot was not on the list. I just retweeted it, actually. 163 00:08:29,280 --> 00:08:32,600 Speaker 3: So when you look at that, why that is is 164 00:08:32,640 --> 00:08:36,880 Speaker 3: because, why use it? It's not really offering me any value. 165 00:08:37,000 --> 00:08:41,040 Speaker 3: So Agentforce is really what AI was meant to be: 166 00:08:41,120 --> 00:08:44,040 Speaker 3: the ability to give a company, a user, a customer 167 00:08:44,600 --> 00:08:47,520 Speaker 3: really the value of artificial intelligence. That's why I was 168 00:08:47,559 --> 00:08:51,400 Speaker 3: excited to bring it to the show, excited that Davos 169 00:08:51,400 --> 00:08:54,439 Speaker 3: has deployed it and we can really show customers here 170 00:08:54,559 --> 00:08:57,520 Speaker 3: this is a really great opportunity. As an example, yesterday 171 00:08:57,559 --> 00:09:00,599 Speaker 3: we were in a session with Royal Bank of Canada, who just 172 00:09:00,640 --> 00:09:03,559 Speaker 3: deployed it for their wealth management. Look, their ability to 173 00:09:03,760 --> 00:09:06,320 Speaker 3: scale and compete against the big US banks when their 174 00:09:06,360 --> 00:09:09,480 Speaker 3: wealth management is augmented because they're using Agentforce, 175 00:09:09,520 --> 00:09:12,840 Speaker 3: that's very exciting for them. Their CEO, Dave McKay, is here. 176 00:09:12,880 --> 00:09:15,520 Speaker 3: It was really a great moment to hear, 177 00:09:15,559 --> 00:09:16,599 Speaker 3: you know, what they're doing with this. 178 00:09:17,600 --> 00:09:21,480 Speaker 1: So before, you mentioned the abundant capital flowing into AI. 179 00:09:21,960 --> 00:09:24,920 Speaker 1: You know, Anthropic, too. Even today there was news of 180 00:09:24,960 --> 00:09:29,200 Speaker 1: another one billion dollars from Google. Is there 181 00:09:29,520 --> 00:09:33,079 Speaker 1: any case to be made that AI is over invested 182 00:09:33,120 --> 00:09:33,760 Speaker 1: at this point?
183 00:09:34,679 --> 00:09:37,920 Speaker 3: No, there's not. I would say that we are really 184 00:09:38,000 --> 00:09:43,280 Speaker 3: at a point where everything is changing. I think the evidence... 185 00:09:43,360 --> 00:09:45,240 Speaker 3: Like, we just did a demo with your associate in 186 00:09:45,280 --> 00:09:47,600 Speaker 3: the back here where I just took out my phone 187 00:09:47,600 --> 00:09:50,360 Speaker 3: and we were demoing the World Economic Forum app. And 188 00:09:50,440 --> 00:09:52,240 Speaker 3: if we were here a year ago, we'd be looking 189 00:09:52,280 --> 00:09:54,200 Speaker 3: at a very similar app, kind of going through: what 190 00:09:54,360 --> 00:09:57,200 Speaker 3: sessions are we going to? Who's coming to the conference? 191 00:09:57,280 --> 00:09:59,600 Speaker 3: I need to contact this person. All the typical things. 192 00:10:00,040 --> 00:10:02,040 Speaker 3: And instead I have this agent that I can just 193 00:10:02,640 --> 00:10:05,200 Speaker 3: go right to and say, hey, do all this for me. 194 00:10:06,240 --> 00:10:10,360 Speaker 3: That level of productivity, that's why in the third quarter 195 00:10:10,480 --> 00:10:14,240 Speaker 3: of last year we saw productivity rise in the US 196 00:10:14,280 --> 00:10:18,319 Speaker 3: without any additional workers. That really hasn't happened in a while. 197 00:10:18,520 --> 00:10:22,120 Speaker 3: That went right to society's margin there. I think that is 198 00:10:22,160 --> 00:10:24,400 Speaker 3: exactly what is going on. I think AI is kind 199 00:10:24,400 --> 00:10:27,000 Speaker 3: of kicking in right now. I'm writing the business plan 200 00:10:27,040 --> 00:10:30,200 Speaker 3: for my next fiscal year, which starts February one, and 201 00:10:30,240 --> 00:10:31,640 Speaker 3: this will be the third year in a row that 202 00:10:31,679 --> 00:10:34,400 Speaker 3: I'm working with an AI to write my business plan 203 00:10:34,480 --> 00:10:38,240 Speaker 3: for the year. So it's like I have a colleague. I 204 00:10:38,440 --> 00:10:42,000 Speaker 3: will write different parts of the plan, and then I 205 00:10:42,040 --> 00:10:44,080 Speaker 3: will talk to the AI and I will say, hey, 206 00:10:44,080 --> 00:10:46,040 Speaker 3: what do you think of this? How does this compete? 207 00:10:48,720 --> 00:10:51,000 Speaker 3: We have some great technology. And then 208 00:10:51,040 --> 00:10:54,520 Speaker 3: I'll say, hey, how does this affect 209 00:10:54,600 --> 00:10:57,839 Speaker 3: this competitor? How does it affect that competitor? And I 210 00:10:57,880 --> 00:11:01,400 Speaker 3: have been shocked at how it's impacted me, that it 211 00:11:01,440 --> 00:11:04,320 Speaker 3: has made me more productive, but also expands my own 212 00:11:04,360 --> 00:11:07,520 Speaker 3: consciousness on what I should be doing and leading and 213 00:11:07,559 --> 00:11:10,560 Speaker 3: moving forward. So AI is definitely a part of it. 214 00:11:10,559 --> 00:11:11,160 Speaker 1: I happen to be 215 00:11:11,160 --> 00:11:13,280 Speaker 1: a little bit more skeptical, and when I... 216 00:11:13,280 --> 00:11:14,880 Speaker 3: I didn't know that. How long have I known you for? 217 00:11:15,080 --> 00:11:17,960 Speaker 3: I never knew this. But I'm so happy to know 218 00:11:18,000 --> 00:11:18,880 Speaker 3: this about you.
219 00:11:18,920 --> 00:11:22,280 Speaker 1: When I, when I hear things like Larry, your former boss, 220 00:11:22,320 --> 00:11:27,760 Speaker 1: Larry Ellison, saying today that there will be targeted vaccines 221 00:11:27,800 --> 00:11:31,040 Speaker 1: for cancer using AI within forty eight hours, I mean, 222 00:11:31,480 --> 00:11:33,440 Speaker 1: it just feels like the exuberance is... 223 00:11:34,040 --> 00:11:35,839 Speaker 3: I don't think he means in the next forty eight hours. 224 00:11:35,840 --> 00:11:37,119 Speaker 3: I think what he's saying is... 225 00:11:36,960 --> 00:11:39,320 Speaker 2: We understand, forty eight hours to design it, okay. 226 00:11:39,120 --> 00:11:41,800 Speaker 3: So, okay. I've actually talked to him 227 00:11:41,840 --> 00:11:45,400 Speaker 3: about what he's doing. It is pretty incredible, his vision. 228 00:11:46,120 --> 00:11:49,120 Speaker 3: You know, his vision is that, you know, there's these 229 00:11:49,160 --> 00:11:51,840 Speaker 3: amazing kind of cancer tests that have kind of been developed, 230 00:11:51,880 --> 00:11:54,400 Speaker 3: like Grail and others, where you can get a blood 231 00:11:54,400 --> 00:11:56,839 Speaker 3: test and it shows you if you have these fragments 232 00:11:56,840 --> 00:11:58,800 Speaker 3: of cancer in your bloodstream and so forth, and then 233 00:11:58,880 --> 00:12:01,679 Speaker 3: they go to try to find the cancer in this case. 234 00:12:01,920 --> 00:12:03,880 Speaker 3: And I'm not going to describe it as beautifully as 235 00:12:03,880 --> 00:12:07,840 Speaker 3: he did today, because it's his vision. But his idea 236 00:12:08,559 --> 00:12:10,800 Speaker 3: is that you're gonna be able to run that same 237 00:12:10,920 --> 00:12:15,000 Speaker 3: test and then use AI to have much more affinity 238 00:12:15,000 --> 00:12:18,640 Speaker 3: on what's really going on, and then be able to 239 00:12:18,760 --> 00:12:25,680 Speaker 3: use our mRNA technology to develop a highly personalized vaccine 240 00:12:26,040 --> 00:12:29,520 Speaker 3: for that cancer and eradicate it from your body. That is 241 00:12:29,600 --> 00:12:34,160 Speaker 3: pretty extraordinary. Obviously it hasn't happened yet, but the vision 242 00:12:34,240 --> 00:12:36,480 Speaker 3: is awesome. And you know what, I think we should 243 00:12:36,480 --> 00:12:39,240 Speaker 3: be looking for innovation in the world, and it's exciting 244 00:12:39,280 --> 00:12:42,600 Speaker 3: to see someone like Larry Ellison having a huge vision 245 00:12:42,640 --> 00:12:46,920 Speaker 3: for cancer. He has invested lots of his own money 246 00:12:46,960 --> 00:12:50,720 Speaker 3: in cancer treatment and has centers in LA and in 247 00:12:50,760 --> 00:12:53,120 Speaker 3: Oxford, and it's been very impressive to see what 248 00:12:53,160 --> 00:12:53,560 Speaker 3: he's done. 249 00:12:53,600 --> 00:12:55,840 Speaker 1: I want to hit a couple more topics before we 250 00:12:55,880 --> 00:12:58,720 Speaker 1: are supplanted by a head of state. I think last 251 00:12:58,800 --> 00:13:01,040 Speaker 1: year we talked about social media and you sounded the 252 00:13:01,080 --> 00:13:04,520 Speaker 1: alarm on misinformation and, you 253 00:13:04,480 --> 00:13:06,280 Speaker 2: know, the lack of proper oversight. 254 00:13:07,000 --> 00:13:09,920 Speaker 1: You've used some salty language. And in the last few 255 00:13:09,960 --> 00:13:14,400 Speaker 1: months we have seen a retreat on moderation.
You know, 256 00:13:14,400 --> 00:13:17,960 Speaker 1: what do you make of recent moves at Meta and the 257 00:13:18,000 --> 00:13:21,240 Speaker 1: company formerly known as Twitter? And do you think it's dangerous? 258 00:13:21,920 --> 00:13:25,240 Speaker 3: Well, I think, you know, my thoughts on social media 259 00:13:25,320 --> 00:13:27,680 Speaker 3: are really well known. But I think that, you know, 260 00:13:27,720 --> 00:13:29,679 Speaker 3: we need to continue to look at the fundamental 261 00:13:29,679 --> 00:13:33,040 Speaker 3: reshaping of, you know, Section 230. I think if 262 00:13:33,080 --> 00:13:36,840 Speaker 3: we are going to let these organizations go to every 263 00:13:36,880 --> 00:13:41,079 Speaker 3: possible extreme, then they should be held accountable and liable 264 00:13:41,160 --> 00:13:43,760 Speaker 3: for whatever the content is. And I think that when 265 00:13:43,760 --> 00:13:45,920 Speaker 3: we look at Section 230, it wasn't written for 266 00:13:45,960 --> 00:13:48,959 Speaker 3: this kind of moment, and especially with the age of AI, 267 00:13:49,520 --> 00:13:50,840 Speaker 3: this is a time when we have to go 268 00:13:50,920 --> 00:13:52,720 Speaker 3: back and do that. And I hope that the Trump 269 00:13:52,760 --> 00:13:55,920 Speaker 3: administration will evaluate that. They've made a number of comments 270 00:13:55,960 --> 00:13:58,560 Speaker 3: along those lines, and I think that that 271 00:13:58,720 --> 00:14:02,480 Speaker 3: is appropriate, to get rid of that regulation so 272 00:14:02,520 --> 00:14:07,240 Speaker 3: that these organizations can be held accountable. You're 273 00:14:07,240 --> 00:14:09,439 Speaker 3: a journalist; you're held accountable for what you write 274 00:14:09,480 --> 00:14:11,640 Speaker 3: and what you put on your platform. So should they be. 275 00:14:12,840 --> 00:14:16,720 Speaker 1: You have come to Davos in years past and made 276 00:14:16,880 --> 00:14:18,320 Speaker 1: some pretty strenuous... 277 00:14:17,920 --> 00:14:20,960 Speaker 3: One thing we said was, yes, positive. Well... 278 00:14:20,840 --> 00:14:23,560 Speaker 1: We've still got three and a half minutes, and you've 279 00:14:23,600 --> 00:14:28,800 Speaker 1: laid out some strenuous climate goals. I 280 00:14:28,840 --> 00:14:31,640 Speaker 1: think there was planting a trillion trees by twenty thirty. I mean, how... 281 00:14:31,920 --> 00:14:35,040 Speaker 3: In fact, today, right now, actually, we're 282 00:14:35,040 --> 00:14:43,400 Speaker 3: announcing that we're delivering the largest forest reserve in Earth's 283 00:14:43,440 --> 00:14:46,640 Speaker 3: history, in the Congo, and the President of the Congo is making 284 00:14:46,640 --> 00:14:49,680 Speaker 3: the announcement, part of the trillion tree announcement that we 285 00:14:49,680 --> 00:14:52,520 Speaker 3: did five years ago. And we're almost at two hundred 286 00:14:52,560 --> 00:14:55,640 Speaker 3: billion trees. So we're not quite at a trillion, but 287 00:14:55,680 --> 00:14:58,960 Speaker 3: we've got to keep going. 288 00:14:59,000 --> 00:15:02,000 Speaker 3: Okay, I'm glad you're excited about that. 289 00:15:02,840 --> 00:15:04,200 Speaker 2: I love trees. We love trees.
290 00:15:04,520 --> 00:15:06,360 Speaker 3: I'm not sure you do, based on that. But 291 00:15:07,040 --> 00:15:09,760 Speaker 3: we spent exactly about five seconds on trees and a 292 00:15:09,800 --> 00:15:12,960 Speaker 3: lot on Larry Ellison. I think you like Larry Ellison 293 00:15:12,960 --> 00:15:18,040 Speaker 3: more than trees, Brad. I'm just saying. 294 00:15:17,800 --> 00:15:22,280 Speaker 1: My question is about the energy that is required to run these AI data centers, if 295 00:15:22,320 --> 00:15:26,240 Speaker 1: the industry is sort of conveniently waving it away talking 296 00:15:26,280 --> 00:15:31,160 Speaker 1: about nuclear, if we've really accounted for the requirements that 297 00:15:31,160 --> 00:15:32,080 Speaker 1: these companies are 298 00:15:31,920 --> 00:15:34,120 Speaker 2: going to need as they scale up their operations. 299 00:15:34,800 --> 00:15:37,400 Speaker 3: Yeah, well, I think that whole idea, that energy 300 00:15:37,480 --> 00:15:40,040 Speaker 3: still needs to get rationalized against what's happening with these 301 00:15:40,120 --> 00:15:43,400 Speaker 3: data center demands. And some of the stuff that's happening, 302 00:15:43,480 --> 00:15:46,400 Speaker 3: I think, is kind of... there's some kind of indiscriminate 303 00:15:46,760 --> 00:15:50,280 Speaker 3: kind of development, and people just using them as much 304 00:15:50,320 --> 00:15:52,560 Speaker 3: as they can and so forth. I think at Salesforce, 305 00:15:52,640 --> 00:15:54,800 Speaker 3: we continue to try to do our best to be 306 00:15:54,840 --> 00:15:57,400 Speaker 3: a net zero company, including in the work of our 307 00:15:57,440 --> 00:16:01,000 Speaker 3: AI. We'll deliver about two trillion AI actions this week. 308 00:16:01,880 --> 00:16:03,720 Speaker 3: You know, we've been doing that for about a decade. 309 00:16:03,800 --> 00:16:07,760 Speaker 3: You mentioned Einstein, now Agentforce. And even in development 310 00:16:07,760 --> 00:16:09,480 Speaker 3: of our own models and all of these things, we 311 00:16:09,520 --> 00:16:13,760 Speaker 3: have not had these same exhaustive energy demands. And part 312 00:16:13,760 --> 00:16:17,440 Speaker 3: of that's because of how we operate our own systems. 313 00:16:18,160 --> 00:16:20,320 Speaker 3: But I think for everybody there's probably an opportunity for 314 00:16:20,360 --> 00:16:21,560 Speaker 3: another level of efficiency. 315 00:16:21,920 --> 00:16:24,520 Speaker 2: And what is the solution? Is it nuclear? Is 316 00:16:24,560 --> 00:16:25,760 Speaker 2: it fuel cells? 317 00:16:25,800 --> 00:16:25,880 Speaker 3: Like... 318 00:16:25,920 --> 00:16:29,240 Speaker 1: How does the industry catch up to the energy use 319 00:16:29,240 --> 00:16:30,120 Speaker 1: that will be required? 320 00:16:31,560 --> 00:16:33,480 Speaker 3: I think that it's going to... How are we going 321 00:16:33,560 --> 00:16:35,520 Speaker 3: to get more energy in the world? I mean, I'm 322 00:16:35,560 --> 00:16:39,040 Speaker 3: not really an energy expert. I'm obviously a huge bull 323 00:16:39,120 --> 00:16:42,680 Speaker 3: on fusion. I think what Commonwealth Fusion is doing is incredible. 324 00:16:42,720 --> 00:16:44,720 Speaker 3: I'm an investor in the company and I hope that 325 00:16:45,160 --> 00:16:47,520 Speaker 3: it's a huge success. And then, yes, I also think 326 00:16:47,920 --> 00:16:52,720 Speaker 3: these kinds of SMRs, small modular nuclear plants.
I think 327 00:16:52,720 --> 00:16:55,440 Speaker 3: there's a new one just going in in Texas, for example, 328 00:16:55,520 --> 00:16:58,480 Speaker 3: right now, and in other places. I mean, if you think 329 00:16:58,520 --> 00:17:01,000 Speaker 3: about it, you know, we've run small nuclear for a 330 00:17:01,040 --> 00:17:05,720 Speaker 3: long time. Everyone knows about nuclear submarines or nuclear aircraft carriers, right? 331 00:17:06,240 --> 00:17:08,800 Speaker 3: And that's basically what we're talking about. Why don't we 332 00:17:08,840 --> 00:17:10,480 Speaker 3: just take one of those and just plug it into 333 00:17:10,480 --> 00:17:13,680 Speaker 3: our grid? We just don't think that way. But that's 334 00:17:13,720 --> 00:17:16,960 Speaker 3: all we're really talking about: being able to just use 335 00:17:17,560 --> 00:17:19,520 Speaker 3: the energy that we have in the world that's available 336 00:17:19,520 --> 00:17:21,600 Speaker 3: to us, and deliver it in a clean way. I think 337 00:17:21,760 --> 00:17:23,879 Speaker 3: obviously we have to think about the amount of carbon 338 00:17:23,880 --> 00:17:27,280 Speaker 3: we're putting in the atmosphere. It's very significant, not just 339 00:17:27,320 --> 00:17:30,080 Speaker 3: from our emissions, but, as you mentioned, the trees. And 340 00:17:30,119 --> 00:17:31,479 Speaker 3: I hate to go back there because I know you 341 00:17:31,480 --> 00:17:34,080 Speaker 3: have a distaste for them. But you know, we 342 00:17:34,160 --> 00:17:36,280 Speaker 3: used to have six trillion trees on the planet. Now 343 00:17:36,280 --> 00:17:38,840 Speaker 3: we have less than three trillion. Every trillion trees is 344 00:17:38,840 --> 00:17:43,080 Speaker 3: two hundred gigatons of carbon, so that's six hundred gigatons 345 00:17:43,119 --> 00:17:48,280 Speaker 3: of carbon that we emitted through deforestation. And connecting that 346 00:17:48,440 --> 00:17:53,520 Speaker 3: back to industrialization, we've only emitted about two hundred gigatons 347 00:17:54,000 --> 00:17:57,800 Speaker 3: through all industrialization. So you can see deforestation remains a 348 00:17:57,880 --> 00:17:58,520 Speaker 3: major issue. 349 00:17:58,560 --> 00:18:00,480 Speaker 1: Last question. You can give me a yes or no. You're 350 00:18:00,600 --> 00:18:00,920 Speaker 1: the owner... 351 00:18:01,000 --> 00:18:02,640 Speaker 3: I know, we've probably exhausted the trees now. 352 00:18:03,560 --> 00:18:06,000 Speaker 1: Well, no, this is actually... I understand you're the owner of... 353 00:18:06,000 --> 00:18:07,920 Speaker 3: No one's going to accuse you of being the Lorax. 354 00:18:08,080 --> 00:18:11,840 Speaker 1: You're the owner of Time magazine, a tree-based publication 355 00:18:12,119 --> 00:18:14,760 Speaker 1: at a time when it's very uncomfortable 356 00:18:14,119 --> 00:18:15,160 Speaker 2: for media owners. 357 00:18:15,400 --> 00:18:17,520 Speaker 1: There were reports that you're going to sell. Are you? Are you going to? 358 00:18:17,640 --> 00:18:20,919 Speaker 3: There's no deal on the table 359 00:18:20,960 --> 00:18:23,520 Speaker 3: for Time, so if we do sign something, I 360 00:18:23,520 --> 00:18:24,520 Speaker 3: will call you right away. 361 00:18:25,240 --> 00:18:26,640 Speaker 2: Marc Benioff, thank you very much. 362 00:18:26,680 --> 00:18:27,160 Speaker 3: Thank you.