1 00:00:02,520 --> 00:00:08,559 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. Now, let's head over 2 00:00:08,640 --> 00:00:11,360 Speaker 1: to Davos where Bloomberg's Emily Chang is standing by. 3 00:00:11,400 --> 00:00:14,960 Speaker 2: Thank you so much, Fawnie. I'm here in 4 00:00:15,080 --> 00:00:18,279 Speaker 2: Davos with Marc Benioff, CEO and chair and founder, of 5 00:00:18,280 --> 00:00:21,720 Speaker 2: course, of Salesforce. Good to be with you in another country. 6 00:00:21,960 --> 00:00:26,160 Speaker 1: Thank you, Emily. Aloha, aloha from the snow, or the calm, 7 00:00:26,480 --> 00:00:27,080 Speaker 1: of Davos. 8 00:00:27,440 --> 00:00:29,960 Speaker 2: All right, so this is my first Davos. This is 9 00:00:30,000 --> 00:00:32,560 Speaker 2: your twenty-fourth Davos, right? 10 00:00:32,600 --> 00:00:35,800 Speaker 1: What's it like being a nice Hawaiian girl here in Davos? 11 00:00:35,960 --> 00:00:39,760 Speaker 2: Oh, it's chilly, but I'm liking the vibes. What's the 12 00:00:39,800 --> 00:00:44,040 Speaker 2: mood this year? Are people anxious? Are they optimistic? Or 13 00:00:44,080 --> 00:00:46,120 Speaker 2: are they confused about the future? 14 00:00:46,920 --> 00:00:50,159 Speaker 1: Well, it's an exciting moment, there's no question. There is 15 00:00:50,600 --> 00:00:54,080 Speaker 1: just an unbelievable amount of change. And I think when 16 00:00:54,080 --> 00:00:57,880 Speaker 1: there's this much change and this much excitement, people are 17 00:00:57,960 --> 00:00:59,880 Speaker 1: a little frenetic. And I think that, you know, some 18 00:01:00,040 --> 00:01:02,880 Speaker 1: times Davos is like a measurement of the economy. That's 19 00:01:02,880 --> 00:01:05,000 Speaker 1: not what's going on. Well, is the economy good? Is 20 00:01:05,040 --> 00:01:07,600 Speaker 1: it bad? How are people feeling about the economy? No. 21 00:01:07,800 --> 00:01:10,560 Speaker 1: This is more like, well, what's happening exactly in the world, 22 00:01:10,560 --> 00:01:13,920 Speaker 1: because you have the AI situation, you have the political situation, 23 00:01:14,920 --> 00:01:17,959 Speaker 1: you've got, you know, these very broad shifts, and I 24 00:01:17,959 --> 00:01:20,720 Speaker 1: think that is what's really happening here. There's a lot 25 00:01:20,760 --> 00:01:24,160 Speaker 1: of discussion. And you know, Davos was built on the 26 00:01:24,200 --> 00:01:27,000 Speaker 1: concept of a multi-stakeholder dialogue, the idea that we 27 00:01:27,080 --> 00:01:30,080 Speaker 1: bring in lots of different kinds of people: political leaders, 28 00:01:30,080 --> 00:01:34,840 Speaker 1: business leaders, spiritual leaders, cultural leaders, and have a conversation. 29 00:01:35,440 --> 00:01:39,240 Speaker 1: And this year's Davos is about the spirit of dialogue, 30 00:01:39,360 --> 00:01:41,880 Speaker 1: and that is playing out in spades. 31 00:01:42,120 --> 00:01:44,600 Speaker 2: Well, talking about that spirit, we're waiting for President Trump 32 00:01:44,640 --> 00:01:47,800 Speaker 2: to arrive. You've said you want to see 33 00:01:47,840 --> 00:01:50,680 Speaker 2: him prioritize US businesses. What does that mean?
34 00:01:51,440 --> 00:01:53,320 Speaker 1: Well, I think the number one thing is, you know, 35 00:01:53,360 --> 00:01:56,920 Speaker 1: for the US president, as he is doing his 36 00:01:57,040 --> 00:01:59,920 Speaker 1: job all over the world, he needs to support US 37 00:02:00,240 --> 00:02:03,320 Speaker 1: businesses and make them successful. And I was with Macron 38 00:02:03,440 --> 00:02:07,400 Speaker 1: yesterday and his job is to make French businesses successful. 39 00:02:07,400 --> 00:02:10,240 Speaker 1: And there's no difference, you know. That's the role of 40 00:02:10,280 --> 00:02:13,240 Speaker 1: the nation-state leader, to help the people of 41 00:02:13,320 --> 00:02:15,280 Speaker 1: their state. And so I have a lot of respect 42 00:02:15,320 --> 00:02:19,120 Speaker 1: for President Trump and how he always supports the businesses 43 00:02:19,120 --> 00:02:19,920 Speaker 1: of the United States. 44 00:02:20,200 --> 00:02:22,880 Speaker 2: So you're using Davos sort of as a live demo 45 00:02:23,040 --> 00:02:25,680 Speaker 2: for some of Salesforce's new digital labor tools. You know, 46 00:02:25,720 --> 00:02:29,040 Speaker 2: I know you were prepping on Slackbot, but a 47 00:02:29,040 --> 00:02:32,040 Speaker 2: lot of AI customers are saying... 48 00:02:31,680 --> 00:02:33,959 Speaker 1: I'm not supposed to give away my secrets. I believe that's not right. 49 00:02:34,040 --> 00:02:36,160 Speaker 2: I don't think that's too big a secret, but a 50 00:02:36,200 --> 00:02:38,960 Speaker 2: lot of customers are saying they're sort of stuck in 51 00:02:39,000 --> 00:02:42,640 Speaker 2: AI pilot purgatory, like they're trying stuff, but 52 00:02:42,680 --> 00:02:43,560 Speaker 2: it's just not working. 53 00:02:43,760 --> 00:02:46,280 Speaker 1: It's unfortunate, because there is so much that you can do 54 00:02:46,400 --> 00:02:49,120 Speaker 1: right now in AI. Unfortunately, there are so many false narratives. 55 00:02:49,639 --> 00:02:53,520 Speaker 1: So I think because so many companies and pundits have 56 00:02:53,600 --> 00:02:56,399 Speaker 1: said things that are not true, it kind of freezes 57 00:02:56,480 --> 00:02:58,520 Speaker 1: certain customers and they don't know who to believe. So 58 00:02:58,520 --> 00:03:01,200 Speaker 1: that's why, like we were talking before we went on, 59 00:03:01,320 --> 00:03:04,720 Speaker 1: Ramon is here, the CEO of Pepsi. You know, 60 00:03:04,840 --> 00:03:08,320 Speaker 1: it's a new generation, and he just deployed one hundred 61 00:03:08,320 --> 00:03:10,919 Speaker 1: and twenty-five thousand of his salespeople, one hundred and 62 00:03:10,919 --> 00:03:14,239 Speaker 1: twenty-five thousand, supporting millions of businesses, a lot of 63 00:03:14,280 --> 00:03:16,280 Speaker 1: small and medium businesses all over the world, to sell 64 00:03:16,360 --> 00:03:20,200 Speaker 1: Pepsi using new AI generation tools that we've built for him. 65 00:03:20,560 --> 00:03:23,000 Speaker 1: It really only took us a few months and they're getting 66 00:03:23,040 --> 00:03:26,640 Speaker 1: fantastic results. And that all started actually last year. I 67 00:03:26,680 --> 00:03:30,040 Speaker 1: saw Ramon and he said, I'm ready to get 68 00:03:30,040 --> 00:03:33,080 Speaker 1: a new level of efficacy in Pepsi. And Raj 69 00:03:33,240 --> 00:03:35,480 Speaker 1: is here from FedEx, the CEO of FedEx. We 70 00:03:35,560 --> 00:03:38,080 Speaker 1: just finished with him.
A year ago he was saying 71 00:03:38,080 --> 00:03:41,720 Speaker 1: I've got trouble selling internationally. We identified all of his 72 00:03:41,800 --> 00:03:46,000 Speaker 1: domestic customers that work with his salespeople and 73 00:03:46,120 --> 00:03:49,400 Speaker 1: his website but haven't done international, and we surfaced 74 00:03:49,440 --> 00:03:52,800 Speaker 1: them as an opportunity for international. It's generated hundreds of 75 00:03:52,800 --> 00:03:56,920 Speaker 1: millions of dollars for FedEx. So there are great opportunities right 76 00:03:56,960 --> 00:04:01,000 Speaker 1: now to deploy AI. And look, that's our job. Our 77 00:04:01,080 --> 00:04:04,080 Speaker 1: job is to sell it, to convince, show the use cases, 78 00:04:04,120 --> 00:04:07,720 Speaker 1: make people excited about it, and that's what we're doing here. 79 00:04:07,760 --> 00:04:10,640 Speaker 1: We will meet with more than five hundred CEOs one 80 00:04:10,680 --> 00:04:13,520 Speaker 1: on one here. 81 00:04:13,040 --> 00:04:16,240 Speaker 2: You know, you obviously have also taken on a sort 82 00:04:16,279 --> 00:04:20,960 Speaker 2: of, you know, a policy bent. You recently apologized for 83 00:04:21,120 --> 00:04:23,760 Speaker 2: endorsing President Trump sending the National Guard to San Francisco. 84 00:04:24,760 --> 00:04:27,360 Speaker 2: What did you learn from that moment, 85 00:04:27,400 --> 00:04:30,960 Speaker 2: and how will you continue to reconcile the sort of 86 00:04:31,000 --> 00:04:35,520 Speaker 2: Salesforce Ohana values with the administration's policies, whether it's ICE 87 00:04:35,600 --> 00:04:38,799 Speaker 2: raids in Minneapolis or taking over Greenland? 88 00:04:38,960 --> 00:04:41,240 Speaker 1: That's such a great question, you know, Emily. The way 89 00:04:41,279 --> 00:04:42,719 Speaker 1: I look at it, and I think that this is 90 00:04:42,880 --> 00:04:45,040 Speaker 1: probably the right way to look at it, is I've 91 00:04:45,040 --> 00:04:48,200 Speaker 1: been doing this for twenty-six years now. Salesforce, amazingly, 92 00:04:48,279 --> 00:04:51,359 Speaker 1: is twenty-six years old, eighty thousand employees, will do 93 00:04:51,400 --> 00:04:55,040 Speaker 1: more than forty-one billion in revenue this year. Presidents 94 00:04:55,080 --> 00:04:58,400 Speaker 1: are constantly changing, but our core values are not changing, 95 00:04:58,520 --> 00:05:01,560 Speaker 1: and so we're just anchored down into our core values 96 00:05:01,600 --> 00:05:04,200 Speaker 1: and then we just speak to that and what we're 97 00:05:04,240 --> 00:05:07,520 Speaker 1: trying to achieve as a business, to reflect back into 98 00:05:07,560 --> 00:05:10,040 Speaker 1: our customers and our communities what those values are. As 99 00:05:10,080 --> 00:05:12,719 Speaker 1: you know, in San Francisco, where we are the largest 100 00:05:13,120 --> 00:05:15,640 Speaker 1: tech company, there have been a lot of challenges, and 101 00:05:15,680 --> 00:05:19,200 Speaker 1: so yeah, I always have anxiety about safety for our employees, 102 00:05:19,240 --> 00:05:24,320 Speaker 1: and I think that that anxiety just manifests into how 103 00:05:24,320 --> 00:05:27,400 Speaker 1: can we create a greater San Francisco? And so I'm 104 00:05:27,440 --> 00:05:30,200 Speaker 1: excited to be working with everyone from Mayor Lurie to 105 00:05:30,240 --> 00:05:31,359 Speaker 1: President Trump to do that.
106 00:05:33,160 --> 00:05:37,080 Speaker 2: You're really focused on AI and especially its impact on teens, 107 00:05:37,160 --> 00:05:40,760 Speaker 2: and you've been sort of raising the alarm about AI 108 00:05:40,880 --> 00:05:43,960 Speaker 2: influencing teen suicides, which is, you know, it's horrible that 109 00:05:43,960 --> 00:05:45,680 Speaker 2: we even have to talk about this. You've called them 110 00:05:45,720 --> 00:05:50,200 Speaker 2: suicide coaches. Like, what should companies be doing about this? 111 00:05:50,320 --> 00:05:52,080 Speaker 1: Well, I think we should talk about that. We are 112 00:05:52,120 --> 00:05:54,960 Speaker 1: dealing with a new kind of very unwieldy technology which, 113 00:05:54,960 --> 00:05:58,440 Speaker 1: left by itself, these large language models, they can 114 00:05:58,480 --> 00:06:01,000 Speaker 1: do all kinds of terrible things. I mean, they can 115 00:06:01,040 --> 00:06:02,680 Speaker 1: do a lot of great things too. You've used them, 116 00:06:02,960 --> 00:06:05,440 Speaker 1: you know, they can summarize things, and you can ask 117 00:06:05,480 --> 00:06:08,719 Speaker 1: them queries and you can get information. But unfortunately, we 118 00:06:08,760 --> 00:06:12,720 Speaker 1: saw this year this horrible example of this company, Character 119 00:06:12,760 --> 00:06:18,039 Speaker 1: AI specifically, which started to find that their product was turning 120 00:06:18,040 --> 00:06:22,320 Speaker 1: into a suicide coach for their customers. And one of 121 00:06:22,320 --> 00:06:24,600 Speaker 1: the most horrific things I've ever seen in technology is 122 00:06:24,680 --> 00:06:28,839 Speaker 1: these children, you know, died and it was completely unnecessary, 123 00:06:29,200 --> 00:06:30,680 Speaker 1: and I think it needs to be a wake-up 124 00:06:30,720 --> 00:06:34,200 Speaker 1: call that, you know, we're letting all of this AI 125 00:06:34,320 --> 00:06:38,240 Speaker 1: technology out fully unregulated. And by the way, you remember 126 00:06:38,600 --> 00:06:40,560 Speaker 1: that's what we did with social media as well. 127 00:06:40,960 --> 00:06:44,240 Speaker 1: And you probably remember what I said in twenty eighteen here, 128 00:06:44,480 --> 00:06:47,039 Speaker 1: you know, which was social media was becoming the same 129 00:06:47,200 --> 00:06:50,360 Speaker 1: kind of very dangerous technology. And look, a lot of 130 00:06:50,400 --> 00:06:52,760 Speaker 1: governments have made changes around social media now, so you 131 00:06:52,800 --> 00:06:55,520 Speaker 1: go to a lot of countries, if you're not seventeen 132 00:06:55,600 --> 00:06:57,640 Speaker 1: or eighteen years old, you're not going to use social media. 133 00:06:57,680 --> 00:07:00,680 Speaker 1: There are controls, there are protections. We need to get 134 00:07:00,680 --> 00:07:03,560 Speaker 1: there with AI now. There don't need to be any 135 00:07:03,600 --> 00:07:05,919 Speaker 1: more deaths. You know, we need to be like taking 136 00:07:05,960 --> 00:07:11,320 Speaker 1: this seriously. And AI is great, it's incredible, but these models, 137 00:07:11,640 --> 00:07:15,800 Speaker 1: you know, they're inaccurate, they hallucinate, they're kind of hard 138 00:07:15,800 --> 00:07:16,400 Speaker 1: to control. 139 00:07:16,560 --> 00:07:19,000 Speaker 2: So what should we do? I mean, should CEOs be 140 00:07:19,080 --> 00:07:22,360 Speaker 2: held personally liable?
I was talking to Demis Hassabis, 141 00:07:22,360 --> 00:07:26,119 Speaker 2: the CEO of Google DeepMind, and I asked him, look, 142 00:07:26,160 --> 00:07:28,760 Speaker 2: if we knew every company in every country would pause, 143 00:07:29,040 --> 00:07:33,200 Speaker 2: should we pause AI so society and regulation can catch up? 144 00:07:33,400 --> 00:07:35,760 Speaker 2: And he said yes, he would be in favor of that. 145 00:07:35,800 --> 00:07:36,640 Speaker 2: Would you be in favor? 146 00:07:36,720 --> 00:07:39,760 Speaker 1: Emily, number one, tech hates regulation. You know that they 147 00:07:39,800 --> 00:07:41,840 Speaker 1: hate it, right? They go on about, oh, it slows 148 00:07:41,840 --> 00:07:44,800 Speaker 1: down innovation. They hate regulation, they hate it, they hate it, 149 00:07:44,800 --> 00:07:47,960 Speaker 1: they hate it, except one regulation they love, Section two thirty, 150 00:07:48,360 --> 00:07:51,240 Speaker 1: which basically says they're not held responsible or accountable for 151 00:07:51,280 --> 00:07:53,960 Speaker 1: these deaths or for anything that their social media or 152 00:07:54,000 --> 00:07:57,120 Speaker 1: their AI does. And that needs to change, because here 153 00:07:57,120 --> 00:08:00,320 Speaker 1: at Bloomberg, if you say something, you're held accountable, and 154 00:08:00,360 --> 00:08:02,720 Speaker 1: you know, I own Time, and if we say something, 155 00:08:02,720 --> 00:08:05,240 Speaker 1: we're held accountable. But these companies are not held accountable 156 00:08:05,240 --> 00:08:08,840 Speaker 1: because of a very specific federal regulation called Section two thirty. 157 00:08:08,840 --> 00:08:13,200 Speaker 1: It needs to be reshaped, reformed, you know, basically restructured 158 00:08:13,280 --> 00:08:16,720 Speaker 1: to really reflect social media and AI and the dangers 159 00:08:16,760 --> 00:08:19,200 Speaker 1: that we see. And that is something that we should 160 00:08:19,240 --> 00:08:22,480 Speaker 1: all be pushing for, that these companies need to 161 00:08:22,480 --> 00:08:24,440 Speaker 1: be held accountable, and right now they're not going to 162 00:08:24,480 --> 00:08:28,440 Speaker 1: be held accountable because of this regulation. 163 00:08:29,000 --> 00:08:31,280 Speaker 2: Right, and what regulation is the right regulation? 164 00:08:31,360 --> 00:08:34,040 Speaker 2: Like, California tried to regulate. David Sacks has his own version. 165 00:08:34,080 --> 00:08:35,880 Speaker 2: I know you've just chatted with him, like is 166 00:08:35,880 --> 00:08:36,600 Speaker 2: there any regulation? 167 00:08:36,760 --> 00:08:38,400 Speaker 1: Well, I'm going to have a one-on-one conversation 168 00:08:38,480 --> 00:08:41,000 Speaker 1: with David, who is a close friend of mine, later today.
169 00:08:41,240 --> 00:08:43,240 Speaker 1: I hope you tune in and we're going to talk 170 00:08:43,280 --> 00:08:47,080 Speaker 1: about that because, look, I think social media kind of 171 00:08:47,280 --> 00:08:49,680 Speaker 1: is almost like a best practice that we can learn 172 00:08:49,720 --> 00:08:52,400 Speaker 1: from, in that if we look around the world, you'll see 173 00:08:52,400 --> 00:08:55,520 Speaker 1: that a lot of countries, not necessarily ours, but a 174 00:08:55,520 --> 00:08:58,680 Speaker 1: lot of countries have made really good changes, and we 175 00:08:58,720 --> 00:09:01,920 Speaker 1: could make those changes too, because, you know, it's kind 176 00:09:01,920 --> 00:09:04,240 Speaker 1: of funny. I'm like a large language model myself. You know, 177 00:09:04,280 --> 00:09:06,840 Speaker 1: I feel like I'm putting one word after another, but I 178 00:09:06,920 --> 00:09:09,040 Speaker 1: have context. I have a life. I was born, I 179 00:09:09,040 --> 00:09:11,160 Speaker 1: have a childhood, I have friends, hopefully you're one of them. 180 00:09:11,960 --> 00:09:14,319 Speaker 1: And you know, I have a creator. I have a 181 00:09:14,400 --> 00:09:17,760 Speaker 1: higher power that I'm connected to. These large language models, 182 00:09:18,200 --> 00:09:20,720 Speaker 1: they're running by themselves. They were not born, 183 00:09:20,800 --> 00:09:23,360 Speaker 1: they don't have a childhood, they don't have context, they're 184 00:09:23,360 --> 00:09:27,520 Speaker 1: not connected to a creator, so they're running kind 185 00:09:27,559 --> 00:09:29,319 Speaker 1: of in a different way. And then all of a sudden, 186 00:09:29,559 --> 00:09:31,920 Speaker 1: it feels very familiar. It feels like, oh, this is 187 00:09:31,920 --> 00:09:35,280 Speaker 1: my friend. It's not your friend. So we need to 188 00:09:35,320 --> 00:09:38,480 Speaker 1: like step in now, and look, we can kind of 189 00:09:38,520 --> 00:09:40,680 Speaker 1: see where it's going. It's not completely out of control. 190 00:09:41,160 --> 00:09:43,360 Speaker 1: So now is the moment. And I think people like 191 00:09:43,440 --> 00:09:46,040 Speaker 1: you and others should be aggressive about this. 192 00:09:45,960 --> 00:09:48,720 Speaker 2: Last quick question, and it feels almost awkward going to the stock. 193 00:09:48,800 --> 00:09:50,760 Speaker 2: But you know, you're trying to build your company on 194 00:09:50,960 --> 00:09:54,280 Speaker 2: this new emerging technology. Wall Street's really down on software. 195 00:09:54,320 --> 00:09:57,360 Speaker 2: Salesforce stock is taking a beating, 196 00:09:57,800 --> 00:10:01,200 Speaker 2: you know, quickly, what needs to happen? 197 00:10:01,800 --> 00:10:03,880 Speaker 1: We've been doing this for twenty-six years, and every now and then the 198 00:10:03,880 --> 00:10:06,000 Speaker 1: software sector kind of falls out of favor. It doesn't 199 00:10:06,000 --> 00:10:07,760 Speaker 1: mean our financials are out of favor. I mean, we'll 200 00:10:07,760 --> 00:10:10,720 Speaker 1: do more than forty-one billion in revenue this year, 201 00:10:10,840 --> 00:10:13,760 Speaker 1: more than fifteen billion in cash flow, our margins are 202 00:10:13,800 --> 00:10:16,560 Speaker 1: more than thirty-four percent, all record numbers, by the way, 203 00:10:17,240 --> 00:10:20,160 Speaker 1: and we just introduced an amazing new version of Slack.
204 00:10:20,240 --> 00:10:23,160 Speaker 1: You know, with Slackbot, the new AI-enabled 205 00:10:23,200 --> 00:10:25,839 Speaker 1: Slack that has the context and the AI built in. 206 00:10:26,280 --> 00:10:29,600 Speaker 1: We have Agentforce helping our customers deliver customer agents. 207 00:10:29,600 --> 00:10:32,880 Speaker 1: So we do employee agents and customer agents. We've rebuilt 208 00:10:32,880 --> 00:10:35,680 Speaker 1: our architecture to have a data layer, an application layer, 209 00:10:35,960 --> 00:10:39,000 Speaker 1: the Agentforce layer, and then these agents that sit 210 00:10:39,040 --> 00:10:41,240 Speaker 1: on the top, and the ecosystem. That's been a lot 211 00:10:41,240 --> 00:10:43,560 Speaker 1: of work in the last three years, selling a lot 212 00:10:43,559 --> 00:10:46,760 Speaker 1: of huge deals. But you know, you sometimes have to 213 00:10:46,760 --> 00:10:49,480 Speaker 1: wait around for Wall Street to realize, well, this is 214 00:10:49,520 --> 00:10:53,120 Speaker 1: a huge opportunity, and we're just going to keep executing on that. 215 00:10:53,240 --> 00:10:54,920 Speaker 2: Yes or no? Are you at one billion yet? One 216 00:10:54,920 --> 00:10:55,600 Speaker 2: billion agents? 217 00:10:56,280 --> 00:10:59,280 Speaker 1: I think we're beyond that. I haven't actually counted them all, 218 00:10:59,320 --> 00:11:02,160 Speaker 1: but, you know, here's the way to think about it. 219 00:11:02,240 --> 00:11:05,520 Speaker 1: We now just passed... I think we're the number one 220 00:11:05,640 --> 00:11:09,120 Speaker 1: or number two enterprise software company, or maybe software company 221 00:11:09,160 --> 00:11:12,480 Speaker 1: overall, in the use of tokens. You know, tokens are 222 00:11:12,520 --> 00:11:16,200 Speaker 1: what these models generate to generate their intelligence. We just 223 00:11:16,200 --> 00:11:19,240 Speaker 1: passed eleven point one trillion tokens. And you remember we 224 00:11:19,320 --> 00:11:22,640 Speaker 1: talked after our earnings and we were at about three trillion tokens. 225 00:11:23,559 --> 00:11:26,959 Speaker 1: And that idea that as an enterprise software company, we're 226 00:11:27,000 --> 00:11:30,040 Speaker 1: publicly reporting and talking about the number of tokens. And 227 00:11:30,080 --> 00:11:32,800 Speaker 1: I know all of those people, you know, who love 228 00:11:32,840 --> 00:11:35,960 Speaker 1: Bilbo Baggins are wondering if those are the tokens. It's different tokens. 229 00:11:37,520 --> 00:11:39,920 Speaker 2: All right, Marc, next time we'll have to do this 230 00:11:39,960 --> 00:11:43,880 Speaker 2: in Hawaii. Thank you. Yes, thank you. Okay, I'm good, 231 00:11:44,000 --> 00:11:44,720 Speaker 2: I'm surviving. 232 00:11:44,760 --> 00:11:47,160 Speaker 1: Thank you, it's a beautiful day. Thank you for joining 233 00:11:47,280 --> 00:11:49,320 Speaker 1: us. Next time it's Waikiki Beach, let's do that. 234 00:11:49,640 --> 00:11:51,600 Speaker 2: Absolutely. Marc Benioff, CEO of Salesforce.