Speaker 1: You're a veteran of the World Economic Forum.

Speaker 2: It's Sam's first. I'm curious, how receptive do you think world leaders will be to, you know, that message that we just heard, maybe leaning into the disruptive changes that AI will bring and perhaps, you know, not fighting them as much?

Speaker 4: Yeah, well, I think that, you know, we're in a really incredible moment in technology, and I'm sure everybody knows that and can feel it and has experienced it. It's awesome. And we were just at this UK Safety Summit together, and this is where regulators from around the world are really coming together in a really unique way, because I had never seen the ministers of every technology country coming into a forum to say, we are going to regulate this technology. You know, you look at social media over the last decade. It's been a fucking shit show, what's been going on, and regulators have not done their job. And so to see them actually start to take this very seriously, that is amazing and very important. And I think this idea that companies like OpenAI or Microsoft and others want to create digital people that can do all kinds of tasks that we do, you know, will they be taxed, how will they be regulated, what are the dangers involved? All of these are important questions. At the same time, we want to have these incredible benefits of AI that we're all experiencing: better healthcare, better education, you know, augmentation of ourselves. Let me tell you an incredible story. I was just in Milan with Gucci, and they brought us in to work on their call center, very excited. How do we use AI? How do we create better customer service? But we don't really know what their end goal is. Is it more productivity? You know, is it just better customer relationships? Is it higher margins? Do they want to reduce staff? Do they want to increase staff?
Speaker 4: But when we applied the technology that we're seeing now, what we saw was the current state of AI, which is really the ability to augment human beings, human performance. Revenues went up thirty percent, because these individuals who were call center service agents also then simultaneously became sales agents and marketing agents. They were able to do all these things that they just could not do before. And that's a tremendous theme I think that's going on. I feel that way myself, you know, when I'm with ChatGPT or Copilot or Anthropic's Claude or Mistral or whatever it is. I get that same feeling: oh wow, I've got a little bit more capability than I had before, because I'm being augmented through AI. And I'm sure it's true in a lot of disciplines. Certainly it's true in healthcare, and it's going to be true in a lot more things. But we're on an arc, and we all know that, you know, it's going to start to be able to do things that will be fundamentally a surrogate for what we all do today.

Speaker 2: And what will determine whether a company can be successful with AI or not?

Speaker 4: How long will it be until I come to Davos to be interviewed by the Bloomberg AI? That's where I really want to go. Am I going to be replaced? Well, maybe, you know. I mean, we've been doing this together for a long time, right? What, a couple of decades? A few decades? I'm not that old, but okay. But you know what I mean, where it's like, you know, I don't know when we get to this point where there's not just all of us in the room, but there's an AI as well participating in this conversation. And we're about to create digital people: digital salespeople, digital service people, digital marketing people. And, you know, certainly we can see it in radiology already. You know, I had a CTA scan, you know, where they look at your heart with a CT scanner. Over the last fifteen years, I've had three of them.
Everything good? Everything's good. Great. What was interesting was, fifteen years ago when I did it, the guys who did it in LA, who kind of invented calcium scoring and all that kind of stuff, they were there looking at the screen, and then the computer comes out and gives a number, like six five four. Then five years ago it was actually quite a bit better, and the CT scan got a lot better. And last year it was amazing. But then they took the scan from last year, and they can just put it into this new software product, and they got the same result as if the radiologist was reading it there. And I think that is very interesting, because, hey, I can go to an incredible scanning center in LA, but think about all the places in the world that may not have that top flight radiologist. Now they have it. That's a level of democratization and equanimity made possible by AI and technology that we just didn't have before, and it's a digital radiologist. So yeah, we're going to have to have government step in. We're going to have to have partnership. It's going to have to be a multi-stakeholder dialogue, and we're going to have to get to a higher level when it comes to AI, and we're going to have to visualize the future and think about what's going to happen, because, as Sam likes to say, we don't really know what's going to happen next. But we all saw Minority Report, you know, we saw WarGames, we saw the movie Her. It doesn't take a rocket scientist to fill it in. A lot of people who are in this room could come in here and sit in this chair and kind of give their prophecy on AI, because we've all been living in this, you know. Science fiction is becoming a reality, and that's what's really interesting, and I think we should just admit, wow, this is really amazing and we don't completely know.
Speaker 2: Well, so, you know, turning away from sci-fi and projecting based on what you know, what's your current thinking about the extent of the coming disruption to labor markets? And do you worry about the potential political ramifications of dislocation, and what that might do to the political process in terms of fueling candidates that might take advantage of that dissatisfaction?

Speaker 4: I just got out of a meeting with two hundred CEOs for the last four hours. And you know what CEOs want: they want more margin, they want more productivity, they want higher value customer relationships. These are the three things that universally every CEO wants. Now, you know, how are they going to get it? Okay, what will it do for them? Well, they're going to get it through this kind of incredible AI technology. That's why we're feathering it into all of our apps and our platform, so our Einstein product, which already does a trillion AI transactions a week, both predictive and generative. Well, that technology must augment our customers, employees, and their customers as well. So whether they're doing commerce or sales or service or marketing, whatever, it's augmentation of the human experience. This is a tremendous opportunity that's at hand right now. So that radiologist who's reading that scan can have a partner next to them, which is that AI. And that idea that we can be augmented, or that Gucci, you know, service rep is now being augmented by that AI, that is what we're really seeing as the opportunity in today's present moment. With the current generation of AI, with the current technology, it's really about human augmentation. We can do a lot to augment human beings in these incredibly powerful new ways, and we should focus on that, ourselves, for our kids. We can give them incredible, you know, tutors that are helping them, as long as the human remains in the loop.
Because, as we all know, these things lie pretty badly, or they call them hallucinations to kind of taper it down. But I'm sure we've all had this experience, where you put some prompt in and all of a sudden it's like, no, that's not right, we know it's not right, and then we go, oh, that's weird, and then we'll turn to our friend who works at one of these companies: oh yeah, it's a hallucination. Like, wow, it looks like a lie to me.

Speaker 2: Do you worry about the impact this might have on the democratic process as we're heading into a year with seventy-seven elections around the world?

Speaker 4: Well, I mean, I think I started worrying about that when we saw what was happening with these social media companies. Let's start there. That still has not been addressed, isn't that right? How many articles have you written about that? Do I need to pull out all their headlines? I mean, the reality is that that remains the number one issue. And I think that idea that's kind of out there, that we need to address tech companies' core values, what is really important to these tech companies and how they operate, is everybody's business. And you know, we can see that also even with the AI companies. Hey, I own Time, you know that, and Bloomberg and the New York Times, we're all finding our intellectual property, your stories, your work that Bloomberg paid for or Time paid for or the New York Times paid for, surfacing in these results, because all the training data has been stolen. That is a pretty big thought. There's a commodity user interface; we can see that because you can get your iPhone or Google phone and go to the Play Store or the App Store and download the OpenAI app or download the Copilot app. Commodity UI on the front end. In the middle, you have what's becoming, I think, somewhat highly commoditized large language models.
So we can see that, because the open source models and even the best of the proprietary models are pretty close. And then you have this broad set of training data, which is kind of the third tier of it, which has just been basically ripped off.

Speaker 1: Right.

Speaker 2: Okay, so you probably didn't have time to negotiate with Sam while he was on his way out, but what's the fair price for Time's content?

Speaker 4: Well, we're waiting for Bloomberg to actually complete their negotiations, and then we're going to leverage that point. I mean, nobody really exactly knows, because this has all happened just in a period of hours. You know, we saw that Axel Springer got a deal, and you know, we see the New York Times suing, and I don't know what Bloomberg is doing, and I don't even know really what Time is doing. But I know that, you know, if you're going to use this data, I think that probably there's a pretty great company to be built on a standardized set of training data that lets all these companies play a fair game and lets content creators like yourself get paid fairly for their work. And I think that bridge has not yet been crossed, and that's a mistake by the AI companies. It is very easy to do.

Speaker 2: Do you worry that the proliferation of these models could further undermine the news business?

Speaker 4: No. I think that, you know, we just saw what happened, you know, with the Arena Group, right, where they published a couple of AI articles and the CEO basically got, you know, shot in the street by the market, which said, whoa, that wasn't exactly right. And I'm sure, I don't know if you have this experience, but I have the experience where I'll even read an article written by an AI about me and I go, whoa, yeah, I guess I'm actually pretty good, or that's a lot better than I even remember.
So, you know, you can tell the difference, and you've read the AI stories and so have I, between, hey, a first draft that is then, you know, shaped and really worked on by a professional writer like yourself, and something that is just an AI's and gets shot out into the network. And we're still not at that point yet where, you know, that AI is ready to be unleashed, and I don't know exactly how we're going to, you know, be ready for that moment, but it will be a moment that we all have to think about.

Speaker 2: I want to hit a couple more issues before we run out of time. We had Satya Nadella here this morning. Microsoft Teams is competing with Salesforce and Slack in the marketplace. I know you guys have had some success in getting the EU to stop Microsoft from bundling Teams together with its Office suite. Do you think that with Microsoft, now, I haven't checked the market today, but perhaps the most valuable company in the world, there should be more attention on Microsoft and its practices in terms of bundling software?

Speaker 4: Well, this all happened way before we bought Slack. Microsoft, of course, has a multi-decade history of bundling products and new initiatives into core products where they have a monopoly, to achieve market share in new places. And we all know those stories, the Netscape story, you know, and many other stories. And I think that there is a case that has been made, that the EU is looking at, that they did this also to the Slack founders, and I think that's going to be up to the EU to decide. They've obviously made some very aggressive statements. But I think, look, it gets back to core values. You know, what kind of company are you building, what are your values, how are you operating? Salesforce is now two and a half decades old.
You know, we've got seventy thousand employees, and we've given guidance that, you know, we're looking to do revenue in the mid thirties this year. It's a huge software company, the third largest software company, the second largest in Japan. And what I'm most proud of is the core values that we've been able to maintain, which is our focus on trust as the number one point, customer success, the level of innovation, the importance of equality and equity in how we operate our business, and also sustainability. I think I'm out of values.

Speaker 2: Okay, let me get you to unlock your inner Larry Ellison. Does Microsoft abuse its market power?

Speaker 4: Well, that's my mentor, as you know. And look, I think, look, it's been decided by the courts before that that has happened many other times. So, you know, I don't know if it's going to happen again. Let's see what the courts decide, and let's see how the regulators decide. But all the data, it's very clear. I mean, everybody knows what the data is, and a lot of people here probably have their own opinion. But I don't want to come in and take on the role of the regulator. I don't think that's fair for me to take that.

Speaker 2: Okay, yesterday in the US, Donald Trump won the Iowa caucus, and I just wanted to ask you, I asked Satya and Sam, and it's on my mind: you know, what's at stake in this monumental election in the US? I know it's fashionable every four years to say this is the most important national election in a lifetime. Where's your head at right now as we look at the election in the US and seventy-seven other national elections around the world? What do you feel is at stake for this planet, and maybe, a little more provincially, for Salesforce, for you, for your family?
Speaker 4: Well, I think that, for me, you know, I've been through this now so many times as the CEO of Salesforce. Administrations change, but getting back to my previous answer, my core values don't change. The company's core values do not change. And I think this is extremely important. Obviously, the US government is a large customer of Salesforce, and depending on who's in office, it creates a whole stir with a different part of our employee base. So that's just a reality. But the reality is that, hey, we are the same company regardless of when that election is going to occur and regardless of who that president will be, and we have to operate with a set of core values, a true north, know what we're doing, know where we're going, and operate in that way. And I hope that we've been able to do that for two and a half decades. Not that we haven't made mistakes. Every CEO needs forgiveness, and every company needs forgiveness. Every reporter also needs forgiveness. But yes, every company except yours, I meant to say. But I think that we have to realize that, you know, I don't think you can put a CEO in that box and say, now what, because there's no answer. Yeah.

Speaker 1: Sorry, one more.

Speaker 2: Have you been surprised at the level of pushback, maybe that's not the right word, by some voices in Silicon Valley against DEI and ESG? It feels like a vocal tide that's rising. Maybe I'm checking Twitter too often, but I know you're spending too much time on it too.

Speaker 4: Yeah, no, I am, and I'm trying to cut back. You know, I think I had to put a time limit on myself.

Speaker 2: So, yeah, what is going on? And how are you reading the room right now in terms of these kinds of counter-movements against diversity, against ESG?

Speaker 4: You know, let me tell you how I look at it, which is, obviously, I have lived this, lived it. Should we pay women the same as what a man gets paid? Let's start there.
Should men and women be paid the same for the same work? Yes or no? Is it a debate still? Is it a question, Harley?

Speaker 1: For Elon and his...

Speaker 4: Well, I actually don't know. I've never had that conversation with him. I don't know the answer, actually, for him. But I know that in my own data set, in my company, when I looked at whether we pay men and women equally for equal work, we did not. And as we acquired companies, because we've acquired about sixty companies over twenty-five years, in all of those data sets there was this huge variation. And it's like, really? It's twenty twenty-four. Can we not agree that we're going to pay men and women equally for the same work? Is this not something we can agree to? Because the data says that we do not. And when we've made those slight adjustments, all of a sudden, in our own company, it all of a sudden gets off again. So it's an equity issue, or an equality issue, right? It's pay equality, we call it. And I think that that's a really good place to start any debate or conversation. And I do, by the way, because I'll have folks, not just yourself but other CEOs, who want to take me on and say, hey, what's your position on equity today, or what's your position on equality today? And I say, look, I cannot speak on all equality issues. I don't understand them all myself. It's too complex. You know, I'm just the CEO of a company. But I do know that I have a lot of incredible women in my company, and I think they deserve to be paid the same as men.