Speaker 2: Thank you for joining us. We talked about maybe moving this event to the Caribbean next year. You can watch some cricket, have some good weather.

Speaker 3: That sounds great.

Speaker 2: If you want to lob a question up here for Satya, please scan the QR code and you can submit a question. But I want to start with a question, Satya, that I know you're just going to love, which is acknowledging this remarkable moment at the end of last week where the market capitalization of Microsoft surpassed Apple's and Microsoft became the most valuable public company in the world. It reminded me of a moment in twenty ten, when Apple's market cap exceeded Microsoft's and Steve Jobs sent an email to employees. He said, stocks go up and down, but he wanted to recognize an extraordinary moment, and he added, "And remember, Apple is only as good as our next amazing product." I wanted to know, and I suspect the answer's no, whether you acknowledged this moment to Microsoft employees, and if you had, how you would finish that sentence: "Microsoft is only as good as..."

Speaker 3: "Microsoft is only as good as"... what? I think, you know, what Jobs added is right. If I had to sort of pick it up: we're only as good as our ability to execute, to prosecute our mission. Because in some sense there's no God-given right for companies to even exist forever, right? They have to serve a social purpose. And our mission is to empower every person and every organization on the planet to achieve more. And I feel that if we're building products and services that speak to that mission, in the sense that they are relevant to people and organizations, that is, I think, pretty unique about Microsoft. We think about people as first class, but we think about the institutions and organizations that people build to outlast them as also first class. And as long as
we're building things for that, then I think we have a right to exist.

Speaker 2: Did you take even a moment personally with employees to recognize it?

Speaker 3: In my thirty-two years at Microsoft, we have gone up and down. And that's why I think the most important thing to focus on is... this is, after all, a lagging indicator, not a leading indicator. So the last thing you want to do is fixate on the stock price, which, you know, means nothing in terms of what happens tomorrow, especially in our industry, which really has no franchise value, quite frankly. I mean, the problem in some sense, for all of us, is whether we can bet it all on what comes next, which means it's very, very hard, right?

Speaker 2: And speaking of what comes next: you guys have been fairly aggressive and out in front on adding AI features and capabilities to your products, and yesterday you made an announcement expanding the rollout of the Copilot tool, fueled partly by your partnership with OpenAI, in Microsoft products like Outlook and Word and Excel. Talk a little bit about that. How widely do you expect those tools to be used?

Speaker 3: Yeah. I mean, if I step back and look at what has happened even in the last, I'd say, sixteen months, right, you have to go back to November of twenty twenty-two, when ChatGPT first came out. I think that was the moment which I like to describe as the Mosaic-like moment. In fact, interestingly enough, it was November of ninety-three, a year after I joined Microsoft, when Mosaic first came out. And I think ChatGPT is the first AI product that we all could relate to and get a real sense of what this generation of AI can do in our lives.
But for me, maybe the product that really helped crystallize the potential was GitHub Copilot, which probably came six, eight months before that, especially when we scaled, you know, from GPT-3 to 3.5. That's around the time when we felt that if you can take something like software development, which is, let's call it, the most elite knowledge work there is, and have a tool that helps software developers, that in fact brings joy back to software development, keeps them in flow, gets them to finish tasks, that, to us, solidified the idea that you can have a copilot for pretty much every human task. And so we have been on that journey, and we are anchored on two real things, right. The first breakthrough is the user interface. In fact, seventy years of computing history has always been about whether you can build the most intuitive user experience. That's kind of what led to graphical interfaces, or the multi-touch phone, or what have you. But now, with natural language, you ultimately, in some sense, have arrived at the point where it's not about us understanding computers, but computers understanding us. So that's one major breakthrough. The other breakthrough is that we now have a new reasoning engine, a neural reasoning engine, because the other seventy-year history of computing was that we digitized people, places, and things and tried to make sense of it, to reason about it. And so we now have a new capability. So you put these two things together, a new user interface that's much more intuitive, grounded in natural language, multimodal, multi-turn, multi-domain, plus a reasoning engine, and pretty much every software category (what is productivity, what is an operating system, what's a browser?) in some sense collapses. And so that's why we have Copilot. Just as maybe in the past we were known as an Office company, or a Windows company, or a cloud company,
I think going forward we will be a copilot company. We have Copilot, we have a Copilot stack in Azure, which is all the APIs, and that's sort of what our core focus is right now.

Speaker 2: Tell us, let's talk a little bit about the relationship with OpenAI and how we should understand it. Copilot is powered by OpenAI. We saw some instability in the relationship back in November, which you seem to have now come through. Is Microsoft outsourcing what you're describing as a core capability going forward?

Speaker 3: Yeah. So if you sort of step back, in fact, it's probably helpful to understand that I grew up in a Microsoft which sort of had these massive partnerships. The first partnership, at least when I joined, was around Intel and Microsoft. I don't think Windows would have existed without Intel, and Intel wouldn't have had the success without Windows. Subsequently, and in fact it's interesting, I worked on the SQL Server product with SAP. I don't think the SQL database would have existed without SAP, and SAP's success, in being able to support SQL Server, also helped them a lot. And so in the same way I think of OpenAI and Microsoft. In fact, a lot of people talk about organic development, which of course is the core, and people talk about M&A, but it's not talked about as much how much enterprise value gets created by partnering effectively. So that's the spirit with which I think about OpenAI. So there's a whole lot we do. So when you say outsourcing: who's outsourcing what to whom is the real question, right? We build the compute; they then use the compute to do the training; we then take that and put it into products. And so in some sense it's a partnership that is based on each of us really reinforcing what the other does, and then ultimately being competitive in the marketplace.
There's room for what I call horizontal specialization. There is room for vertical specialization. Sometimes some business models are in vogue. I'm a big believer in horizontal specialization, especially if you can vertically integrate everything.

Speaker 2: Do you have to worry about being over-indexed on and over-reliant on a company, a partner, whose ultimate goals and mission might be different from Microsoft's?

Speaker 3: I mean, you don't go into any partnership... First of all, there is independence in a partnership: there are two different companies, answerable to two sets of different stakeholders with different interests. So therefore you have to create a commercial partnership that is mutually beneficial. That's why partnerships where one side is trying to take advantage of the other are not long-term stable. But if two partners can do that, and that's why I go back to the history of enterprise value that was created with the partnerships that at least I've been involved in across my career at Microsoft. So yes, I feel very good about the construct we have. I feel at the same time very capable of controlling our own destiny. It's not like we are single-threaded, even today, on Azure. And this is not even about OpenAI; it's all a reflection of what our customers want. For every customer that comes to Azure, for example, and in fact for our own products, it's not about one model. We care deeply about having the best frontier model, which happens to be, for example today, GPT-4. But we also have Mistral's models as a service inside of Azure, we use Llama in places, and we have Phi, which is the best SLM, from Microsoft. So there is going to be a diversity of capability and models that we will have and that we will invest in. But we will partner very, very deeply with OpenAI.

Speaker 2: What is the right operating model for a company like OpenAI?
I mean, currently it's a capped for-profit company owned by a nonprofit, a very unorthodox arrangement that probably contributed to some of the drama and instability in November. Have they figured it out yet? Are you comfortable now that you've got a partner with a stable operating model?

Speaker 3: You're talking to Sam later?

Speaker 2: I am, and I will ask him as well. But I'm asking for your opinion.

Speaker 3: He needs to answer that question, and his board, obviously, and I'm sure they're working through it. Look, I always say this: we invested, we partnered when they were whatever they were, and whatever they are today, right, in terms of being a capped-profit, a nonprofit, what have you. So I'm comfortable; I have no issues with any structure. What we just want is good stability. And as I said, we don't even need, like... I'm not even interested in a board seat. We don't need control; we definitely don't have control. We just want to have a good commercial partnership, and we want to be investors in the entity, even in the way they are structured. So what I would like is good governance and real stability. That's it.

Speaker 2: You have a board observer, non-voting. I was joking backstage that it feels like having somebody at the back of the bus without a seatbelt.

Speaker 3: Yeah. I mean, it doesn't matter to me, right? I mean, the board seat is not the critical path at all for us. What is most important, as I said, is that we just want a board that cares about OpenAI on the OpenAI side, and that's all we care about. I mean, that's all we can ask for, and we just want stability in the partnership so that we can then continue to invest in it. That's it.

Speaker 2: I'm always a little cautious in these sorts of hype cycles in Silicon Valley, where certain technologies are being kind of overhyped, over-promised. So far, for example, with the integration of ChatGPT into Bing: have the results met your expectations, or has it been overhyped?

Speaker 3: I mean, if you take the combination of ChatGPT usage and even Bing usage or Copilot usage, I think at this point you have to ask yourself about your own user habits, right? How many times do you go there, for example? I think the real question here is about the largest software business there is, which, as we know, is search, and the question is: is that stable? I think, like all big things, it will take time. But I think at this point the idea that you go to one of these agents that gets you to the answer quicker is pretty clear. It doesn't mean that search as we know it today goes away. I mean, Google obviously is super strong. They have the defaults on Apple, they have their own Android defaults, Chrome on Windows has the largest browser share, and what have you. So it's structurally a fantastic position that they have. But that said, I think search as we know it is going to change, and the web as we know it is going to change, and so we have a real opportunity, whether it's with Bing, but also even independent of Bing. That's why, for example, I think about Copilot as our real product, right. To me, the relationship we all will have with computers is now going to be with an agent which will be on all your computers, and to me, that, I think, is going to be the defining category of this next generation.

Speaker 2: I want to change gears and ask you about this year's elections.
I think there are seventy democratic elections around the world this year; more than half of the people on Earth will have access to vote in an election. Donald Trump yesterday won the Iowa caucuses. I know it's sort of fashionable to say that these are the most important elections of our lifetime, but I'm curious what you think is at stake for Microsoft, particularly with the US election and for the safe stewardship of AI. Does this feel momentous to you?

Speaker 3: I mean, you know, if I step back from AI... I mean, the one thing, to your core question: as a multinational company, you know, the one thing that I'm always grounded on is that we're also an American company. So the state of the United States, politically, economically, socially, and its stature in the world across those dimensions, matters a lot, because that's our passport when we show up anywhere. At the end of the day, we are an American company, and to the degree to which America has relationships where it's welcome, then they welcome the companies that are born in the United States. So that's, I think, the fundamental thing. In that context, obviously, in our democratic process, having that process be, you know, well administered, and that people have trust in it, I think is super important. So we've always, I mean for years, done a lot around what it means to support the democratic process, whether it's support for the parties or support for the election process itself. Of course, the thing with AI: it's not like this is the first election where disinformation or misinformation and election interference are going to be a real challenge that we all have to tackle. We as a company have to do our best work, right. In the context of AI,
we have lots of initiatives around content IDs and other things that will help us at least vouch for the veracity of any content out there, and that's, I think, the work that we need to do. But ultimately, in the democratic process, ensuring the integrity of elections is one of the fundamental challenges we have to face up to.

Speaker 2: Whichever administration takes over, they will probably make it more difficult for Microsoft to do business in China. How are you thinking about the technology you develop in China? And how long do you think you'll be able to employ engineers working on technologies like AI in China?

Speaker 3: So, a couple of things. I mean, China is not a large business for Microsoft. In fact, if you look at our P&L, I think this is one of those things that is probably not as well understood: fundamentally, we do business in China in order to support other multinationals going into China. So this is the German automakers, or American automakers, or CPG firms, or what have you, around the world, who depend on having commonality of infrastructure between the rest of the world and China. They depend on Microsoft, and that's why in some sense we have to be in China, in order to support our customers. And the same is true of Chinese multinationals going outside of China. So that's really our business; there is not a domestic business to speak of. In terms of human capital, the way I look at it is that the greatness of the United States has been all about being able to attract the best talent. We definitely want people to come to the United States, we want them to work in the United States, but we also want to tap into human capital around the world to be able to contribute to what is essentially American intellectual property at the end of the day.
Right now, when I look at some of the machine learning papers and so on, there's as much being written in Mandarin as is written in English. And so, to the degree to which we believe that knowledge creation doesn't have boundaries... in fact, the worst mistake we could make is to somehow shut ourselves off. I mean, the lesson of history, at least as I read it, is that the worst mistake any civilization, any society, can make is to shut itself off from knowledge that's being created elsewhere. So to us, if there's great talent in China that wants to work at Microsoft, contributing to what is essentially an American company's intellectual property, we welcome it. At the same time, we are very, very clear that this is our intellectual property, and we are definitely not going to have any collaborations that are not in alignment with the United States' national interest.

Speaker 2: Okay. I want to go back to Microsoft's investment in OpenAI, which is being scrutinized now by the EU and others. It feels to me like, to the extent that there was a holiday for Microsoft in terms of antitrust scrutiny, the holiday has ended. Do you feel like Microsoft going forward will now be more limited in not only the kinds of acquisitions it can do, but also the kinds of investments it can make?

Speaker 3: Look, I mean, I think it's inevitable that, you know, regulators everywhere, antitrust folks, are going to look at whatever a company of our size and scale does. And so in this context, all I say is: if we want competition in AI against, you know, some players who are completely vertically integrated, I think partnerships are one avenue of, in fact, having competition. So I'm sure the regulators will look at it and ask: is this a pro-competition partnership or not? And to me, I think it's a no-brainer.
I mean, think about this, right: if Microsoft had not taken the highly risky bet, and this is now all conventional wisdom, but when we made those investments and backed OpenAI, we went, you know, all in on a particular form of compute that led to all of these breakthroughs, we wouldn't have had what we have, and more importantly, you know, the incumbents would have been the winners, right?

Speaker 2: It took twenty-one months for Microsoft's acquisition of Activision to be approved, and then, I think it was a couple of weeks later, Adobe's attempted acquisition of Figma was going to be rejected and they walked away from it. Is the era of big deals over?

Speaker 3: I think this is where, perhaps... whenever I talk to antitrust folks, I always ask them: have they talked to venture capitalists? Because I feel the best way to make sense of it is not by the size of any one company, or by what is a big deal and what is a small deal. There is conventional wisdom about the no-fly zones; every VC you meet will tell you where the no-fly zones are. See, all you've got to do is track that, because at the end of the day you want new entrants, right? That's the core of making sure you have vibrant competition: that there is room for new entrants and new innovation, and that's where venture money is focused. And so you look at that category by category. Take productivity, right, sort of a place where we have had, you know, some great success in the past. But think about the number of new companies that have been born even in the last ten, fifteen years: Zoom, Slack, Notion. I mean, you wake up and there's a big company. Why is that? It is because there's a real opportunity for new entrants.
Then you say, okay, how many new search engines have been born? None. And so, to some degree, that I think is the analysis: the unit of analysis is as simple as looking at where venture money is going, and in which categories.

Speaker 2: And gaming?

Speaker 3: You know, we've been at gaming, we love gaming. In fact, Flight Simulator was created before even Windows. But we were number three, number four, and now with Activision I think we have a chance of being a good publisher, quite frankly, on Sony and Nintendo and PCs and Xbox. So yeah, we're excited about that acquisition closing; I'm glad we got it through. But I think the review goes category by category.

Speaker 2: I want to quickly ask an audience question, and perhaps this audience member went to CES, where of course AI was a theme and a lot of AI-powered gadgets were on display. Do you feel like we're coming to the end of the smartphone era? What does an AI successor to the smartphone look like? And does Microsoft eventually play in that category?

Speaker 3: Yeah, I mean, I think CES this year was very interesting. Obviously, I thought the demo of the Rabbit OS and the device was fantastic. I must say, after Jobs' launch of the iPhone, it was probably one of the most impressive presentations I've seen at capturing the vision of what is possible going forward for an agent-centric operating system and interface. And I think that's what everybody is seeking: which device will make it, and so on. It's unclear. But I think it's very, very clear, and I go back to that, right:
if you have a breakthrough in natural interface, where this idea that you have to go one app at a time, with all of the cognitive load on you as the human, goes away, it does seem like there can be a real breakthrough. Because, you know, in the past, when we had the first generation, whether it was Cortana or Alexa or Siri or what have you, it was just... it was too brittle. We didn't have these transformers, these large language models, whereas now we have, I think, the tech to go and come up with a new app model. And once you have a new interface and a new app model, I think new hardware is also possible.

Speaker 2: And is that an opportunity for Microsoft, or are you moving away from hardware?

Speaker 3: I mean, look, it's always an opportunity for us. And so, yeah, we make hardware today: we have Surface devices, we make mixed-reality devices. The biggest hardware business we happen to have is our cloud; we stream from the cloud. So therefore I think you'll see us exercise the full stack of it.

Speaker 2: Last question, since we're out of time, on a topic that I know is near and dear to your heart, which is cricket. You're sponsoring a league here in the US. How do you convince Americans, who can barely get through a soccer game, to fall in love with cricket?

Speaker 3: Well, look, you know, there's room in the United States for all kinds of things. But quite honestly, since you brought up cricket: actually, the next T20 World Cup is in the United States, and I'm looking forward to it.

Speaker 2: India and Pakistan... there's an India-Pakistan game in New York. Will you be going?

Speaker 3: I hope so, if I can get tickets, that is.

Speaker 2: I suspect you can probably get a ticket.

Speaker 3: But, you know, it's a sport that obviously, for us South Asians, is a big deal.
It's a religion for us, so we're obsessed with it, and I love the sport. And I'm glad that it's now being played even in the United States. In fact, originally, the first Test match, I think, was US versus Canada, and I think the US won. But the US rejected cricket as a British sport after the American Revolution. We can bring it back, though.

Speaker 2: Satya, thank you very much.

Speaker 3: Thank you so much.