1 00:00:02,360 --> 00:00:07,480 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. We now want to 2 00:00:07,520 --> 00:00:11,720 Speaker 1: welcome our TV and radio audiences worldwide. Amazon's president 3 00:00:11,760 --> 00:00:13,760 Speaker 1: and CEO, Andy Jassy. Great to have you. 4 00:00:13,720 --> 00:00:15,239 Speaker 2: In New York. Thanks for having me. It's great to 5 00:00:15,240 --> 00:00:15,600 Speaker 2: be here. 6 00:00:15,520 --> 00:00:17,919 Speaker 1: Back home where you grew up. And you are 7 00:00:18,120 --> 00:00:23,120 Speaker 1: unveiling generative AI infused Alexa Plus. How is this going 8 00:00:23,160 --> 00:00:26,599 Speaker 1: to really excite your customer base? What are you 9 00:00:26,600 --> 00:00:27,520 Speaker 1: most excited for? 10 00:00:27,680 --> 00:00:31,120 Speaker 2: Well, you know, Alexa has been around for 11 00:00:31,120 --> 00:00:34,560 Speaker 2: ten years. We have six hundred million devices in customers' 12 00:00:34,600 --> 00:00:39,840 Speaker 2: homes and offices, and Alexa Plus is our next generation 13 00:00:40,159 --> 00:00:44,480 Speaker 2: Alexa personal assistant, and she's meaningfully smarter and more capable 14 00:00:44,479 --> 00:00:47,320 Speaker 2: and useful than her prior self. You can do all 15 00:00:47,360 --> 00:00:49,240 Speaker 2: the things that you used to do, but every single 16 00:00:49,320 --> 00:00:51,360 Speaker 2: one of those functions is better. And I can give 17 00:00:51,400 --> 00:00:53,280 Speaker 2: you just an example. There are so many examples, but 18 00:00:53,800 --> 00:00:58,800 Speaker 2: you know, if you have smart home controls with Alexa now, 19 00:00:58,880 --> 00:01:02,720 Speaker 2: you can say, Hey, Alexa, I have guests coming over 20 00:01:02,840 --> 00:01:08,000 Speaker 2: at seven pm. Could you raise the drapes, could you 21 00:01:08,640 --> 00:01:11,360 Speaker 2: raise the temperature by five degrees, turn on the porch 22 00:01:11,440 --> 00:01:14,440 Speaker 2: lights and the driveway lights, and put on mellow dinner 23 00:01:14,520 --> 00:01:17,640 Speaker 2: music in the dining room. And maybe you're not me: 24 00:01:17,760 --> 00:01:20,160 Speaker 2: I'd choose Foo Fighters, but most people will choose mellow 25 00:01:20,280 --> 00:01:22,920 Speaker 2: dinner music. But you can do all of that verbally and 26 00:01:23,000 --> 00:01:26,600 Speaker 2: simply, using conversational language like that. You don't need an app. 27 00:01:26,840 --> 00:01:30,360 Speaker 2: It just happens. And so Alexa Plus, with what we've 28 00:01:30,440 --> 00:01:33,120 Speaker 2: just announced and what we're launching, it really is... There 29 00:01:33,120 --> 00:01:35,480 Speaker 2: have been a lot of chatbots around that are good 30 00:01:35,520 --> 00:01:39,279 Speaker 2: at answering questions, but they don't take actions. Alexa Plus 31 00:01:39,360 --> 00:01:41,400 Speaker 2: is really... Alexa is going to be the first one 32 00:01:41,400 --> 00:01:44,840 Speaker 2: that not only is highly intelligent in answering various questions, 33 00:01:45,040 --> 00:01:47,440 Speaker 2: but she can do so many things for you. She 34 00:01:47,480 --> 00:01:50,080 Speaker 2: can play music and play video, and control your smart 35 00:01:50,080 --> 00:01:53,960 Speaker 2: home and make reservations for you and hire people to 36 00:01:54,000 --> 00:01:57,600 Speaker 2: fix your oven. I mean, it really is the first big, 37 00:01:57,880 --> 00:02:01,440 Speaker 2: large scale practical use of gen AI
that consumers are 38 00:02:01,440 --> 00:02:03,120 Speaker 2: going to be able to see and use naturally. 39 00:02:03,400 --> 00:02:05,400 Speaker 1: I can see how you're going to obsess over it, 40 00:02:05,680 --> 00:02:07,720 Speaker 1: whether it's ordering the latest Buffalo wings or, in New 41 00:02:07,800 --> 00:02:09,560 Speaker 1: York, seeing where the best place is 42 00:02:09,560 --> 00:02:10,000 Speaker 2: to do that. 43 00:02:10,040 --> 00:02:12,840 Speaker 1: But how are your stakeholders, your shareholder base, going to 44 00:02:12,840 --> 00:02:16,240 Speaker 1: obsess about this? About generative AI in Alexa Plus. 45 00:02:16,240 --> 00:02:17,639 Speaker 1: How is that adding value for them? 46 00:02:17,840 --> 00:02:21,919 Speaker 2: Well, you know, I think if you think about what 47 00:02:22,000 --> 00:02:26,079 Speaker 2: Alexa allows customers to do, it makes shopping easier 48 00:02:26,200 --> 00:02:28,679 Speaker 2: because it's so much more intuitive now to buy products. 49 00:02:29,080 --> 00:02:33,440 Speaker 2: It makes enjoying music easier. It makes enjoying video and 50 00:02:33,480 --> 00:02:37,200 Speaker 2: streaming media better. It allows you to control your smart 51 00:02:37,200 --> 00:02:39,240 Speaker 2: home in a different way. So every single one of 52 00:02:39,280 --> 00:02:43,639 Speaker 2: our consumer customer experiences gets better with Alexa Plus. And 53 00:02:43,680 --> 00:02:46,720 Speaker 2: then, you know, of course, Alexa has its own business model. 54 00:02:46,800 --> 00:02:49,200 Speaker 2: We have a brand new lineup of devices that are 55 00:02:49,200 --> 00:02:50,959 Speaker 2: coming in the fall that I think are beautiful, that I 56 00:02:51,000 --> 00:02:54,600 Speaker 2: think people are going to really like. We have opportunities 57 00:02:55,440 --> 00:02:58,960 Speaker 2: to surface new products and advertising in various interfaces, like 58 00:02:59,000 --> 00:03:01,720 Speaker 2: our mobile and our desktop interface that's coming in Alexa. 59 00:03:02,160 --> 00:03:04,840 Speaker 2: And then we have subscriptions, and so, you know, I 60 00:03:04,840 --> 00:03:07,480 Speaker 2: think there's a sustainable business model there as well. 61 00:03:07,880 --> 00:03:10,760 Speaker 1: Talk to me about the subscriptions, because you're getting it free 62 00:03:10,800 --> 00:03:11,400 Speaker 2: if you've got 63 00:03:11,240 --> 00:03:13,800 Speaker 1: Prime. I get a lot with Prime. Now are you 64 00:03:13,880 --> 00:03:15,639 Speaker 1: able to increase the prices there, do you think? 65 00:03:15,639 --> 00:03:20,200 Speaker 2: Of Prime? Well, Prime is an incredible value, 66 00:03:20,240 --> 00:03:23,320 Speaker 2: if you think about getting free shipping on three hundred 67 00:03:23,320 --> 00:03:25,720 Speaker 2: million plus items. You know, when we launched Prime, it was 68 00:03:25,720 --> 00:03:27,639 Speaker 2: free shipping on about a million items. Today, it's 69 00:03:27,639 --> 00:03:29,639 Speaker 2: three hundred million items. And most of the time you're 70 00:03:29,639 --> 00:03:31,760 Speaker 2: getting your products now inside of a day. And so, 71 00:03:32,360 --> 00:03:34,400 Speaker 2: you know, between that and what you get with Prime 72 00:03:34,480 --> 00:03:38,360 Speaker 2: Video and Prime Music and the grocery subscription, and, you 73 00:03:38,400 --> 00:03:42,840 Speaker 2: know, our unique selling events, Prime is an incredible value. And 74 00:03:43,000 --> 00:03:46,800 Speaker 2: to add on top of it Alexa Plus,
you know, 75 00:03:46,800 --> 00:03:49,760 Speaker 2: it's just great value. So there's no plan right now, 76 00:03:49,800 --> 00:03:52,480 Speaker 2: but I do think Prime is unusual value and 77 00:03:52,480 --> 00:03:54,200 Speaker 2: it's why people use it so expansively. 78 00:03:54,840 --> 00:03:58,400 Speaker 1: And to increase that use has been this invention, this 79 00:03:58,520 --> 00:04:02,640 Speaker 1: innovation of generative AI. And that costs money. And you actually 80 00:04:02,680 --> 00:04:04,880 Speaker 1: just took to the stage yesterday to say, out of 81 00:04:04,880 --> 00:04:07,880 Speaker 1: all companies, you are spending the most on AI. How 82 00:04:07,960 --> 00:04:09,320 Speaker 1: much are you spending on AI? 83 00:04:09,480 --> 00:04:12,960 Speaker 2: Well, you know, I think we don't disclose the exact amount, 84 00:04:13,000 --> 00:04:15,720 Speaker 2: but, you know, we're spending a pretty 85 00:04:15,720 --> 00:04:19,440 Speaker 2: significant amount of capex, and the lion's share of it 86 00:04:19,520 --> 00:04:23,640 Speaker 2: is on generative AI. We've said in our AWS business, 87 00:04:23,760 --> 00:04:26,800 Speaker 2: even though generative AI for us is a multi-billion 88 00:04:26,839 --> 00:04:29,800 Speaker 2: dollar a year business growing triple digit percentages year 89 00:04:29,800 --> 00:04:32,000 Speaker 2: over year, that if we had even more capacity, we 90 00:04:32,040 --> 00:04:34,520 Speaker 2: could use it and monetize it. And we have this 91 00:04:34,680 --> 00:04:40,920 Speaker 2: really interesting and very fortunate flywheel in AI inside of Amazon, 92 00:04:40,960 --> 00:04:44,520 Speaker 2: which is, if your mission is to make customers' lives 93 00:04:44,560 --> 00:04:46,800 Speaker 2: easier and better every day, which it is for us, 94 00:04:47,160 --> 00:04:51,039 Speaker 2: and if you believe that all the customer experiences we 95 00:04:51,120 --> 00:04:53,479 Speaker 2: know of today are going to be reinvented through generative AI, 96 00:04:53,560 --> 00:04:56,200 Speaker 2: which we also believe, then if you believe those two things, you're 97 00:04:56,240 --> 00:04:58,520 Speaker 2: going to be building a lot of generative AI apps. 98 00:04:59,080 --> 00:05:00,960 Speaker 2: And if you build a lot of generative AI apps, by 99 00:05:00,960 --> 00:05:03,279 Speaker 2: the way, other companies are too, on top of AWS, 100 00:05:03,279 --> 00:05:06,360 Speaker 2: which is the leading technology infrastructure platform, so if there 101 00:05:06,360 --> 00:05:08,599 Speaker 2: are a lot of generative AI apps being built 102 00:05:08,640 --> 00:05:10,599 Speaker 2: on top of you, you can't help but get a 103 00:05:10,600 --> 00:05:13,520 Speaker 2: lot of feedback from people on how they want those 104 00:05:13,560 --> 00:05:16,440 Speaker 2: building blocks that create generative AI to be better. And 105 00:05:16,480 --> 00:05:19,360 Speaker 2: if you're willing to invest in those building blocks, which 106 00:05:19,400 --> 00:05:21,440 Speaker 2: we are, as you know, with our own chips with 107 00:05:21,520 --> 00:05:24,599 Speaker 2: Trainium, and with our own frontier model with Amazon Nova, 108 00:05:24,680 --> 00:05:27,840 Speaker 2: and model building services in SageMaker AI and Bedrock.
109 00:05:27,839 --> 00:05:30,640 Speaker 2: If you're willing to invest in those building blocks and 110 00:05:30,640 --> 00:05:32,880 Speaker 2: you're getting a lot of feedback, they get better much 111 00:05:32,920 --> 00:05:35,240 Speaker 2: more quickly, which can't help but make it easier and 112 00:05:35,279 --> 00:05:38,200 Speaker 2: quicker for people to build generative AI applications, which means 113 00:05:38,200 --> 00:05:40,360 Speaker 2: you get more running on the platform. So that flywheel 114 00:05:40,880 --> 00:05:42,880 Speaker 2: is very unique for Amazon. 115 00:05:43,320 --> 00:05:46,599 Speaker 1: The lion's share, though. You did say basically a one hundred 116 00:05:46,600 --> 00:05:49,960 Speaker 1: billion dollar run rate for capex expenditure. Can you give 117 00:05:50,040 --> 00:05:51,919 Speaker 1: us even like a percentage breakdown of how much of that 118 00:05:51,960 --> 00:05:54,480 Speaker 1: goes to distribution and logistics and how much goes to AI? 119 00:05:54,760 --> 00:05:55,479 Speaker 1: The lion's share? 120 00:05:55,760 --> 00:05:57,800 Speaker 2: You know, I can tell you most of it. You 121 00:05:57,800 --> 00:05:59,800 Speaker 2: know, most of it. You know, the lion's share. 122 00:05:59,520 --> 00:06:02,000 Speaker 1: More than fifty? Yes. Is it more than eighty percent? 123 00:06:03,720 --> 00:06:05,840 Speaker 2: We're playing the warmer and colder game? Yes, exactly. 124 00:06:06,160 --> 00:06:09,080 Speaker 1: I love a game. Talk to us about that capacity that 125 00:06:09,120 --> 00:06:12,920 Speaker 1: you talked about. Yeah, AWS could grow even faster if 126 00:06:12,960 --> 00:06:15,440 Speaker 1: you had all the chips that you needed, all the 127 00:06:15,440 --> 00:06:18,120 Speaker 1: power you needed, the motherboards you needed. How much faster 128 00:06:18,160 --> 00:06:19,120 Speaker 1: could AWS grow? 129 00:06:20,000 --> 00:06:22,840 Speaker 2: It's hard to put an exact percentage on it, but 130 00:06:23,520 --> 00:06:26,359 Speaker 2: I do think it could be growing faster. I'm confident 131 00:06:26,400 --> 00:06:30,200 Speaker 2: it could be growing faster. And, you know, there is, 132 00:06:30,440 --> 00:06:32,599 Speaker 2: you know, for a long time... there still aren't as 133 00:06:32,640 --> 00:06:35,839 Speaker 2: many chips as we all want. We're fortunate 134 00:06:35,880 --> 00:06:38,560 Speaker 2: in that we're very big partners with Nvidia. But then 135 00:06:38,560 --> 00:06:41,680 Speaker 2: we also have our own custom AI silicon in Trainium 2, 136 00:06:41,680 --> 00:06:44,320 Speaker 2: which we just released at re:Invent, which is thirty to 137 00:06:44,320 --> 00:06:47,719 Speaker 2: forty percent better price performance than the GPU powered instances, 138 00:06:47,720 --> 00:06:49,600 Speaker 2: which is a big deal at scale if you're doing 139 00:06:49,640 --> 00:06:52,040 Speaker 2: generative AI on the inference side. So we have 140 00:06:52,200 --> 00:06:54,560 Speaker 2: maybe more chips than some others might have access to, 141 00:06:54,640 --> 00:06:56,400 Speaker 2: but we still don't have enough. And then there's just 142 00:06:56,560 --> 00:06:59,640 Speaker 2: not enough power in the world right now, and 143 00:07:00,320 --> 00:07:02,560 Speaker 2: we're all working really hard on that.
I expect that 144 00:07:02,680 --> 00:07:05,680 Speaker 2: to relieve some in the second half of this year, but 145 00:07:06,400 --> 00:07:08,400 Speaker 2: right now, and, you know, the world can change, but 146 00:07:08,560 --> 00:07:11,040 Speaker 2: right now we have just insatiable demand. 147 00:07:12,240 --> 00:07:15,280 Speaker 1: We heard about amazing demand coming from Jensen Huang, who had 148 00:07:15,320 --> 00:07:19,000 Speaker 1: his numbers out yesterday at Nvidia. Was there a limitation 149 00:07:19,120 --> 00:07:21,280 Speaker 1: on the Nvidia chips in particular that pulled back 150 00:07:21,280 --> 00:07:23,160 Speaker 1: capacity? And at what point do you think you can depend 151 00:07:23,160 --> 00:07:26,520 Speaker 1: even more on your own in-house built chips to offset 152 00:07:26,560 --> 00:07:26,960 Speaker 1: any of that? 153 00:07:27,400 --> 00:07:29,960 Speaker 2: Well, you know, I would say, I mean, there's a 154 00:07:29,960 --> 00:07:32,400 Speaker 2: lot of demand for generative AI right now. People are 155 00:07:32,520 --> 00:07:35,760 Speaker 2: very excited about it. I think that all the different 156 00:07:35,760 --> 00:07:39,680 Speaker 2: providers of chips have been constrained to some extent. I 157 00:07:39,720 --> 00:07:43,200 Speaker 2: think some of the new generations maybe 158 00:07:43,280 --> 00:07:45,560 Speaker 2: have gone through different evolutions, and when they're going to 159 00:07:45,600 --> 00:07:48,440 Speaker 2: be released may be a little later than people thought. There 160 00:07:48,480 --> 00:07:51,240 Speaker 2: are some components, like motherboards and things like that, that 161 00:07:51,600 --> 00:07:55,119 Speaker 2: we all use, particular ones that are in shorter 162 00:07:55,200 --> 00:07:58,040 Speaker 2: supply than others. So, you know, I do think there 163 00:07:58,040 --> 00:08:00,920 Speaker 2: are some supply chain issues, which I expect 164 00:08:00,960 --> 00:08:04,080 Speaker 2: to get better. I do... you know, people are very 165 00:08:04,080 --> 00:08:06,920 Speaker 2: excited about Trainium 2, to your question of whether we 166 00:08:06,920 --> 00:08:09,880 Speaker 2: could see more demand there, and we have gone back 167 00:08:10,400 --> 00:08:13,240 Speaker 2: on at least a couple of occasions to make more Trainium 2 168 00:08:13,240 --> 00:08:15,440 Speaker 2: than we'd intended because we have so much demand. So 169 00:08:15,920 --> 00:08:18,560 Speaker 2: I expect there will be customers, for as long as 170 00:08:18,600 --> 00:08:21,920 Speaker 2: I can foresee, wanting to run compute on instances that 171 00:08:21,960 --> 00:08:24,160 Speaker 2: have Nvidia chips. But I think a lot of 172 00:08:24,160 --> 00:08:27,240 Speaker 2: the demand will also be served by Trainium. 173 00:08:27,200 --> 00:08:32,320 Speaker 1: The customers that you couldn't serve because of the limited capacity: 174 00:08:32,920 --> 00:08:34,160 Speaker 1: is it that you just have to pull back 175 00:08:34,160 --> 00:08:36,319 Speaker 1: from everyone a little bit more generally, or who lost 176 00:08:36,320 --> 00:08:36,679 Speaker 1: out here, do you think? 177 00:08:36,760 --> 00:08:42,400 Speaker 2: It's always a combination. I mean, 178 00:08:42,440 --> 00:08:46,280 Speaker 2: for people that just need a very small number of 179 00:08:46,400 --> 00:08:49,160 Speaker 2: accelerators, they don't usually have a problem. 180 00:08:49,200 --> 00:08:51,719 Speaker 2: We have something called Capacity Blocks.
It's kind of like 181 00:08:51,760 --> 00:08:57,760 Speaker 2: an on-demand way to use accelerators and generative 182 00:08:57,880 --> 00:09:00,480 Speaker 2: AI chips, and that continues to grow. It's really the 183 00:09:00,480 --> 00:09:04,800 Speaker 2: folks who, you know, have an idea for 184 00:09:04,840 --> 00:09:08,079 Speaker 2: a new application but need a lot of chips, 185 00:09:08,160 --> 00:09:10,840 Speaker 2: where, if we don't have the capacity, we have to, 186 00:09:11,280 --> 00:09:12,800 Speaker 2: you know, we have to give them what we can 187 00:09:12,840 --> 00:09:14,360 Speaker 2: give them and give it to them as fast as 188 00:09:14,360 --> 00:09:16,880 Speaker 2: we can and push our partners to get it in sooner. 189 00:09:16,960 --> 00:09:19,720 Speaker 2: And they can't get their initiatives done as quickly as 190 00:09:19,720 --> 00:09:21,679 Speaker 2: they want to if the capacity isn't there. 191 00:09:21,760 --> 00:09:23,800 Speaker 1: Has Anthropic had all the capacity it needs? You've got 192 00:09:23,800 --> 00:09:24,640 Speaker 1: a close relationship. 193 00:09:24,679 --> 00:09:27,880 Speaker 2: Yeah, we have a very close partnership with Anthropic. We 194 00:09:27,920 --> 00:09:30,440 Speaker 2: have this project called Project Rainier with them. They're 195 00:09:30,440 --> 00:09:33,760 Speaker 2: building their next model, their next version of their frontier 196 00:09:33,800 --> 00:09:37,000 Speaker 2: model, on top of Trainium 2, our custom AI silicon. 197 00:09:37,360 --> 00:09:39,959 Speaker 2: They're going to use over four hundred thousand Trainium 2 198 00:09:40,040 --> 00:09:46,160 Speaker 2: chips, and so yes, they have capacity. They're ramping 199 00:09:46,240 --> 00:09:48,520 Speaker 2: up, and we're excited about that partnership and what they're building. 200 00:09:48,840 --> 00:09:51,960 Speaker 1: You mentioned the power side. How much is that something 201 00:09:51,960 --> 00:09:53,920 Speaker 1: you're talking to the administration about? How much is the 202 00:09:53,960 --> 00:09:56,680 Speaker 1: administration supportive of the buildout that you need to do? 203 00:09:57,520 --> 00:10:01,560 Speaker 2: We have been talking to the administration, you know, multiple administrations 204 00:10:01,600 --> 00:10:03,680 Speaker 2: in this country over the years, as well as in 205 00:10:03,760 --> 00:10:06,720 Speaker 2: other countries. And I think the power shortage 206 00:10:06,800 --> 00:10:10,000 Speaker 2: really snuck up on people, you know, really right after 207 00:10:10,040 --> 00:10:13,800 Speaker 2: the pandemic. And I would say that the current administration 208 00:10:13,880 --> 00:10:16,839 Speaker 2: is very receptive to it. They understand the constraints it's 209 00:10:16,880 --> 00:10:19,440 Speaker 2: having on the economy right now and are convicted about 210 00:10:19,480 --> 00:10:19,920 Speaker 2: solving it. 211 00:10:20,600 --> 00:10:25,120 Speaker 1: What about the restrictions around chips? What's interesting is 212 00:10:25,160 --> 00:10:28,520 Speaker 1: Microsoft just called on the administration today to say this 213 00:10:28,640 --> 00:10:33,559 Speaker 1: limitation on chip access for some of our close allies 214 00:10:33,840 --> 00:10:36,200 Speaker 1: around the world is going to limit our global business. 215 00:10:36,240 --> 00:10:37,400 Speaker 1: You are a global business. 216 00:10:37,480 --> 00:10:38,559 Speaker 1: Is that something you're worried about?
217 00:10:39,040 --> 00:10:42,400 Speaker 2: Hey, we are. You know, I think that 218 00:10:43,720 --> 00:10:46,800 Speaker 2: you're really talking about the AI diffusion rule, I think, 219 00:10:46,960 --> 00:10:50,560 Speaker 2: and I'm going to be curious to see where that goes. 220 00:10:50,600 --> 00:10:53,800 Speaker 2: I mean, it was enacted pretty quickly at 221 00:10:53,800 --> 00:10:56,760 Speaker 2: the very end of the last administration. I don't know 222 00:10:56,800 --> 00:10:59,480 Speaker 2: how this administration feels about it, but I would say 223 00:10:59,480 --> 00:11:03,080 Speaker 2: that we share the concern that it has limitations on 224 00:11:03,160 --> 00:11:07,120 Speaker 2: certain countries who are natural allies of the US. For those countries 225 00:11:07,320 --> 00:11:09,720 Speaker 2: just to be able to do their business, and those 226 00:11:09,760 --> 00:11:11,439 Speaker 2: companies to be able to get done what they want 227 00:11:11,480 --> 00:11:14,120 Speaker 2: to get done on top of these technology infrastructure platforms 228 00:11:14,160 --> 00:11:16,880 Speaker 2: like AWS, they're going to need more chips. And so 229 00:11:17,200 --> 00:11:19,360 Speaker 2: I think if we don't do it, we're going to 230 00:11:19,400 --> 00:11:24,440 Speaker 2: basically give up that business and those relationships to other 231 00:11:24,480 --> 00:11:27,240 Speaker 2: countries who can provide those chips, and I think we're 232 00:11:27,240 --> 00:11:28,640 Speaker 2: better off being partners with them. 233 00:11:28,720 --> 00:11:30,000 Speaker 1: Is it a risk to AWS? 234 00:11:30,360 --> 00:11:33,400 Speaker 2: It's not so much a risk. I mean, 235 00:11:33,440 --> 00:11:37,559 Speaker 2: you know, in the scheme of things, it's 236 00:11:37,600 --> 00:11:41,079 Speaker 2: not a big swing factor. But I also think that there 237 00:11:41,080 --> 00:11:43,880 Speaker 2: are so many countries who are in the early stages 238 00:11:43,880 --> 00:11:48,280 Speaker 2: of their economic development, who both really need access to 239 00:11:48,360 --> 00:11:51,600 Speaker 2: the most cutting-edge, sophisticated technology to build the right 240 00:11:51,600 --> 00:11:55,600 Speaker 2: customer experiences and could be big geographic markets for 241 00:11:55,800 --> 00:11:59,000 Speaker 2: companies like ours and lots of other technology companies, where 242 00:11:59,000 --> 00:12:00,640 Speaker 2: I think it would be a shame to limit them 243 00:12:00,640 --> 00:12:01,720 Speaker 2: and to limit the companies. 244 00:12:02,760 --> 00:12:05,440 Speaker 1: The AI diffusion rules were brought in very swiftly 245 00:12:05,480 --> 00:12:07,560 Speaker 1: by the Biden administration. All of this is in the 246 00:12:07,559 --> 00:12:11,040 Speaker 1: context of US versus China, and not wanting to get 247 00:12:11,080 --> 00:12:14,600 Speaker 1: the most sophisticated equipment and chips and technology into China. 248 00:12:15,720 --> 00:12:19,440 Speaker 1: Rate for us for a moment the administration, and 249 00:12:19,480 --> 00:12:22,600 Speaker 1: whether it's been positive or negative for your business when 250 00:12:22,600 --> 00:12:25,480 Speaker 1: it comes to China. De minimis, for example: the fact that 251 00:12:26,280 --> 00:12:28,480 Speaker 1: Chinese competitors, if we call them that, Shein and the 252 00:12:28,640 --> 00:12:30,240 Speaker 1: like, can have to pay more to get goods 253 00:12:30,280 --> 00:12:32,319 Speaker 1: into the country.
Does that help or hinder you? 254 00:12:33,320 --> 00:12:38,440 Speaker 2: Well, you know, I would say, you know, on de minimis specifically, 255 00:12:38,800 --> 00:12:41,880 Speaker 2: you know, we have a certain number of items that 256 00:12:41,880 --> 00:12:44,360 Speaker 2: are shipped in that way as well, for things like Haul, 257 00:12:45,160 --> 00:12:48,280 Speaker 2: which is our new low-price offering. We maybe have 258 00:12:48,400 --> 00:12:50,360 Speaker 2: less of it than some other companies, like the ones 259 00:12:50,400 --> 00:12:54,600 Speaker 2: that you mentioned, but I think it's early in this administration. 260 00:12:54,679 --> 00:12:58,120 Speaker 2: But what I would say is that it is encouraging 261 00:12:58,200 --> 00:13:00,920 Speaker 2: to us that we have an administration that wants to 262 00:13:00,960 --> 00:13:03,640 Speaker 2: hear from business. I would say that, you know, we've 263 00:13:04,080 --> 00:13:07,320 Speaker 2: been a business through six administrations. In every single one of 264 00:13:07,320 --> 00:13:10,079 Speaker 2: them, our primary focus is to take care of customers. 265 00:13:10,080 --> 00:13:12,160 Speaker 2: But we try to build a productive relationship with the 266 00:13:12,160 --> 00:13:14,600 Speaker 2: administration because we want to help the country, and I 267 00:13:14,600 --> 00:13:17,360 Speaker 2: would say that some administrations are more receptive to it 268 00:13:17,360 --> 00:13:22,280 Speaker 2: than others. But this administration cares about what business thinks. 269 00:13:22,360 --> 00:13:28,120 Speaker 2: And I've always been surprised that it isn't obvious that 270 00:13:28,200 --> 00:13:31,280 Speaker 2: the best economic results for a country are going to 271 00:13:31,280 --> 00:13:34,000 Speaker 2: be when the public and the private sector collaborate. And, 272 00:13:34,160 --> 00:13:38,160 Speaker 2: you know, I don't expect the government to kowtow to 273 00:13:38,240 --> 00:13:40,960 Speaker 2: what companies want, but they should get their feedback and 274 00:13:41,000 --> 00:13:44,280 Speaker 2: their input, because they're going to make policies better together 275 00:13:44,360 --> 00:13:47,080 Speaker 2: if they collaborate. And I'm encouraged early on that this 276 00:13:47,120 --> 00:13:48,960 Speaker 2: administration wants to talk to businesses. 277 00:13:49,080 --> 00:13:50,600 Speaker 1: Are you taking calls personally? 278 00:13:50,920 --> 00:13:55,440 Speaker 2: You know, I take calls. We talk to, you know... 279 00:13:55,520 --> 00:13:57,679 Speaker 2: it's the same thing with all the administrations. We talk to 280 00:13:57,720 --> 00:14:00,200 Speaker 2: people in the administration. We share what's working for us, 281 00:14:00,240 --> 00:14:03,280 Speaker 2: what's not working for us, concerns that we have. As 282 00:14:03,280 --> 00:14:06,400 Speaker 2: I said, some administrations care more about our feedback than others. 283 00:14:06,520 --> 00:14:07,600 Speaker 1: Has Trump cared about yours? Have you spoken to him? 284 00:14:07,640 --> 00:14:10,600 Speaker 2: I have spoken to the President. Look, 285 00:14:10,800 --> 00:14:13,360 Speaker 2: as I said, this administration has been pretty busy the 286 00:14:13,400 --> 00:14:17,160 Speaker 2: first month, but I am encouraged that they are 287 00:14:17,240 --> 00:14:22,080 Speaker 2: having conversations with businesses and they do care about our 288 00:14:22,080 --> 00:14:24,920 Speaker 2: feedback, and we'll see what happens.
But, you know, 289 00:14:24,960 --> 00:14:26,840 Speaker 2: it starts with a dialogue. You have to have a 290 00:14:26,880 --> 00:14:29,200 Speaker 2: dialogue to have any kind of relationship. 291 00:14:29,160 --> 00:14:32,200 Speaker 1: And they care about AI infrastructure. Just take Stargate. Are we 292 00:14:32,200 --> 00:14:35,120 Speaker 1: going to hear more from you on how much you're 293 00:14:35,120 --> 00:14:37,520 Speaker 1: investing here in the United States on the AI infrastructure 294 00:14:37,560 --> 00:14:38,280 Speaker 1: buildout as well? 295 00:14:38,520 --> 00:14:41,560 Speaker 2: Well, you know, as we talked about earlier, we said 296 00:14:41,600 --> 00:14:45,680 Speaker 2: it was, you know, directionally right in terms of the 297 00:14:45,720 --> 00:14:49,120 Speaker 2: run rate on our capex. But, you know, we're spending 298 00:14:49,160 --> 00:14:52,240 Speaker 2: a lot on AI infrastructure, and the lion's share of 299 00:14:52,280 --> 00:14:54,280 Speaker 2: it is not just on AI, but also in this 300 00:14:54,360 --> 00:14:57,280 Speaker 2: country, in the US. We spend elsewhere because we have 301 00:14:57,280 --> 00:14:59,960 Speaker 2: a global business. We have customers everywhere. We have customers in, 302 00:15:00,120 --> 00:15:03,760 Speaker 2: you know, a couple hundred countries. But we have a 303 00:15:03,800 --> 00:15:08,800 Speaker 2: pretty substantial investment that I don't expect to attenuate soon. 304 00:15:09,080 --> 00:15:13,360 Speaker 1: If only we could get a number. I'm interested in something 305 00:15:13,360 --> 00:15:15,920 Speaker 1: that perhaps is going to feel like a more sensitive topic, 306 00:15:16,520 --> 00:15:19,920 Speaker 1: and it comes around perhaps some words that were missing 307 00:15:20,000 --> 00:15:23,120 Speaker 1: in your annual report this year, which were diversity and inclusion. 308 00:15:23,160 --> 00:15:25,200 Speaker 1: I put this in the context of the administration as 309 00:15:25,240 --> 00:15:28,160 Speaker 1: it stands, because I know that Amazon strives to be 310 00:15:28,240 --> 00:15:32,040 Speaker 1: the Earth's best employer. And I'm just wondering how your 311 00:15:32,080 --> 00:15:36,480 Speaker 1: employees react to perhaps the lack of certain words now, 312 00:15:36,880 --> 00:15:39,480 Speaker 1: and whether or not programs might be forced to change 313 00:15:39,560 --> 00:15:39,800 Speaker 1: or not. 314 00:15:40,200 --> 00:15:42,320 Speaker 2: Yeah, well, what I would tell you, just at a 315 00:15:42,400 --> 00:15:46,160 Speaker 2: high level: if you serve as many customers as we do, 316 00:15:46,280 --> 00:15:50,440 Speaker 2: in as many diverse groups as we do, and we 317 00:15:50,520 --> 00:15:53,640 Speaker 2: intend to moving forward, you have to have a diverse 318 00:15:53,720 --> 00:15:57,640 Speaker 2: team to be able to build products that work for everybody. 319 00:15:57,720 --> 00:16:00,640 Speaker 2: And that has always been our intention and continues to 320 00:16:00,640 --> 00:16:06,080 Speaker 2: be our intention.
I think that, you know, there were 321 00:16:06,160 --> 00:16:09,440 Speaker 2: so many programs that we launched and other companies launched 322 00:16:10,440 --> 00:16:13,520 Speaker 2: in the pandemic, and as you probably have seen over 323 00:16:13,560 --> 00:16:17,240 Speaker 2: the last three years, we've gone through very thoroughly every 324 00:16:17,360 --> 00:16:20,680 Speaker 2: single one of our business areas, and the programs that 325 00:16:20,760 --> 00:16:23,720 Speaker 2: we had conviction about we doubled down on, and the 326 00:16:23,760 --> 00:16:26,760 Speaker 2: programs that we don't have conviction about, we streamlined and 327 00:16:26,800 --> 00:16:28,720 Speaker 2: we stopped doing. And so we did the same thing 328 00:16:28,760 --> 00:16:32,520 Speaker 2: in looking at all of our programs on diversity. And, 329 00:16:32,920 --> 00:16:34,880 Speaker 2: you know, we have some programs that we have doubled 330 00:16:34,880 --> 00:16:37,560 Speaker 2: down on. A good example is our Career Choice program inside 331 00:16:37,560 --> 00:16:41,400 Speaker 2: our fulfillment network, where our fulfillment center teammates are able to 332 00:16:41,400 --> 00:16:44,840 Speaker 2: get an advanced education for free, on us, to advance 333 00:16:44,920 --> 00:16:48,360 Speaker 2: their career and their own development. And that has been very, 334 00:16:48,480 --> 00:16:50,840 Speaker 2: very successful and very meaningful, and so we doubled down 335 00:16:50,880 --> 00:16:53,480 Speaker 2: on a program like that. There are other programs that really 336 00:16:53,520 --> 00:16:56,520 Speaker 2: just haven't been that successful and haven't moved the needle much, 337 00:16:56,560 --> 00:16:59,720 Speaker 2: and those we just moved away from. But first and 338 00:17:00,080 --> 00:17:02,080 Speaker 2: foremost, we're trying to continue to build out a 339 00:17:02,120 --> 00:17:03,240 Speaker 2: diverse group, and that won't change. 340 00:17:03,120 --> 00:17:07,040 Speaker 1: And you can... you can still use that word, ultimately? 341 00:17:07,040 --> 00:17:08,159 Speaker 1: You're not having to reframe it? 342 00:17:08,840 --> 00:17:11,920 Speaker 2: That is... I mean, look, lots of people call it lots of 343 00:17:11,920 --> 00:17:15,240 Speaker 2: different things, but we have a giant 344 00:17:15,280 --> 00:17:18,399 Speaker 2: customer base of every imaginable group of people, where we 345 00:17:18,440 --> 00:17:20,440 Speaker 2: want builders who can build for them. 346 00:17:20,760 --> 00:17:24,640 Speaker 1: Culture is key. Yeah, and you just talked about how 347 00:17:24,680 --> 00:17:27,600 Speaker 1: you're doing more with less. It's always a focus on frugality. 348 00:17:27,720 --> 00:17:30,240 Speaker 1: That's in your very principles. That's something you've been doing 349 00:17:30,280 --> 00:17:32,720 Speaker 1: at the employee base as well. You wrote a note, made 350 00:17:32,720 --> 00:17:35,200 Speaker 1: it very clear, that you were going to be reducing layers. 351 00:17:35,600 --> 00:17:38,280 Speaker 2: How is that going? Yeah, it's gone well. You know, 352 00:17:38,320 --> 00:17:42,120 Speaker 2: look, if you have a company where the culture 353 00:17:42,280 --> 00:17:45,360 Speaker 2: is an important ingredient in your success, which has absolutely 354 00:17:45,400 --> 00:17:49,560 Speaker 2: been true for Amazon,
it's not your birthright to 355 00:17:49,640 --> 00:17:51,840 Speaker 2: keep having a strong culture, you know, especially as you 356 00:17:51,840 --> 00:17:54,080 Speaker 2: grow the number of people, the number of businesses you're in, 357 00:17:54,119 --> 00:17:56,840 Speaker 2: the geographies that you're in, and so you have to 358 00:17:56,880 --> 00:17:59,359 Speaker 2: work at it all the time. And for us, you know, 359 00:17:59,359 --> 00:18:01,560 Speaker 2: there were two areas, when we looked at it as 360 00:18:01,560 --> 00:18:04,760 Speaker 2: a leadership team last year, that we wanted to strengthen. 361 00:18:04,800 --> 00:18:11,000 Speaker 2: One was, we have always hired really strong owners, smart, ambitious, 362 00:18:11,080 --> 00:18:14,480 Speaker 2: strong owners who get to own the lion's share of, 363 00:18:16,119 --> 00:18:19,000 Speaker 2: in this case, I would say, you know, ninety-plus 364 00:18:19,000 --> 00:18:24,840 Speaker 2: percent of the decisions. And, you know, as you 365 00:18:24,960 --> 00:18:28,080 Speaker 2: add a lot of people, you end up with a 366 00:18:28,119 --> 00:18:31,320 Speaker 2: lot of middle managers, and those middle managers, all well intended, 367 00:18:31,400 --> 00:18:33,600 Speaker 2: want to put their fingerprint on everything. So you end 368 00:18:33,720 --> 00:18:36,119 Speaker 2: up with these people being in the pre-meeting for 369 00:18:36,160 --> 00:18:38,240 Speaker 2: the pre-meeting for the pre-meeting for the decision meeting, 370 00:18:38,560 --> 00:18:41,359 Speaker 2: and not always making recommendations and owning things the way 371 00:18:41,400 --> 00:18:43,720 Speaker 2: we want, with that type of ownership. So we took a 372 00:18:43,760 --> 00:18:48,439 Speaker 2: goal collectively to increase the ratio of individual contributors to 373 00:18:48,480 --> 00:18:50,840 Speaker 2: managers by over fifteen percent as a company by the 374 00:18:50,880 --> 00:18:53,320 Speaker 2: end of this quarter. We've made very good progress on 375 00:18:53,359 --> 00:18:55,679 Speaker 2: that; we've beat that already. And it's going to allow 376 00:18:55,840 --> 00:18:57,800 Speaker 2: the people that are doing the work... they're 377 00:18:57,800 --> 00:18:59,320 Speaker 2: going to have more ownership and they're going to be 378 00:18:59,320 --> 00:19:01,480 Speaker 2: able to move more quickly. And then I think the 379 00:19:01,520 --> 00:19:04,600 Speaker 2: other thing we saw was that if you're a culture 380 00:19:04,600 --> 00:19:07,360 Speaker 2: that invents a lot and collaborates a lot, like we do, 381 00:19:08,640 --> 00:19:11,359 Speaker 2: if you don't have people in the office together doing 382 00:19:11,400 --> 00:19:13,879 Speaker 2: that invention, it's just meaningfully worse. And, you know, you 383 00:19:14,320 --> 00:19:16,880 Speaker 2: don't invent the same way, you don't collaborate the same way, 384 00:19:16,920 --> 00:19:18,520 Speaker 2: you don't connect with each other the same way, you 385 00:19:18,520 --> 00:19:21,560 Speaker 2: don't learn the culture the same way. And so, having 386 00:19:21,560 --> 00:19:24,680 Speaker 2: people back in the office together more frequently, we felt 387 00:19:24,720 --> 00:19:26,159 Speaker 2: very strongly. We bet it's going to be better for 388 00:19:26,200 --> 00:19:27,159 Speaker 2: customers and the business. 389 00:19:27,480 --> 00:19:31,000 Speaker 1: Let's talk about that invention and that collaboration.
We've got 390 00:19:31,000 --> 00:19:33,160 Speaker 1: a new quantum chip, Ocelot. Love the name. 391 00:19:34,119 --> 00:19:36,000 Speaker 1: What does that mean for you to have this more 392 00:19:36,040 --> 00:19:39,240 Speaker 1: efficient chip at this exact moment? Will we start seeing 393 00:19:39,280 --> 00:19:40,800 Speaker 1: it be practically useful soon? 394 00:19:41,280 --> 00:19:45,400 Speaker 2: Well, quantum computing is very high potential. It has 395 00:19:45,480 --> 00:19:50,199 Speaker 2: the chance to solve some very computationally intense problems, and 396 00:19:50,720 --> 00:19:54,800 Speaker 2: I still think it's realistically a few years away from 397 00:19:54,880 --> 00:19:56,919 Speaker 2: having a real shot at solving those problems. But you 398 00:19:56,920 --> 00:20:00,520 Speaker 2: have to solve a bunch of these challenges that relate 399 00:20:00,520 --> 00:20:03,280 Speaker 2: to quantum computing along the way, and one of them 400 00:20:03,320 --> 00:20:06,320 Speaker 2: really is around error correction on the qubits, and that's 401 00:20:06,320 --> 00:20:09,320 Speaker 2: what Ocelot does. It's a very unique, inventive way to 402 00:20:09,400 --> 00:20:12,479 Speaker 2: do error correction on the qubits that makes a meaningful difference, 403 00:20:12,560 --> 00:20:15,240 Speaker 2: and we're excited about that milestone. You can't get to 404 00:20:15,400 --> 00:20:18,200 Speaker 2: something that has real impact unless you get those milestones 405 00:20:18,200 --> 00:20:20,120 Speaker 2: done along the way, and you invent along the way, 406 00:20:20,160 --> 00:20:21,720 Speaker 2: and it's just another example of invention. 407 00:20:22,160 --> 00:20:26,119 Speaker 1: I feel like there's a lesson in how generative AI suddenly became 408 00:20:26,760 --> 00:20:30,879 Speaker 1: not just business parlance, but suddenly everyone was 409 00:20:30,880 --> 00:20:34,119 Speaker 1: discussing it. I think people are aware quantum could do 410 00:20:34,240 --> 00:20:36,640 Speaker 1: the same thing. So when you hear people debating between 411 00:20:36,680 --> 00:20:38,720 Speaker 1: two decades or five years, where do you sit on 412 00:20:38,760 --> 00:20:39,760 Speaker 1: the grander scope of that? 413 00:20:41,080 --> 00:20:44,520 Speaker 2: Gosh, you know, I don't know for sure. You know, 414 00:20:44,600 --> 00:20:47,600 Speaker 2: I would say I'm hopeful that it's more in the 415 00:20:48,160 --> 00:20:50,960 Speaker 2: five-ish year range than it is the twenty-year range. 416 00:20:51,160 --> 00:20:54,800 Speaker 2: You know, all these things that are successful are seven-, 417 00:20:55,040 --> 00:20:57,919 Speaker 2: ten-, twenty-year overnight successes. You know, it wasn't like... 418 00:20:58,000 --> 00:21:00,800 Speaker 2: you know, if you look at generative AI, it's 419 00:21:00,840 --> 00:21:03,199 Speaker 2: just kind of another evolution of AI. But we've been 420 00:21:03,240 --> 00:21:06,199 Speaker 2: working on AI for fifty-some-odd years, I mean, 421 00:21:06,840 --> 00:21:09,359 Speaker 2: and it's just boom, it happened, and it really shocked 422 00:21:09,400 --> 00:21:12,600 Speaker 2: us when it actually was more accessible and worked.
And 423 00:21:12,800 --> 00:21:15,240 Speaker 2: I think the same thing could happen with quantum computing, 424 00:21:15,240 --> 00:21:17,199 Speaker 2: which is, you know, it takes a long time, takes a 425 00:21:17,240 --> 00:21:18,639 Speaker 2: long time, takes a long time, and then all of a 426 00:21:18,640 --> 00:21:22,639 Speaker 2: sudden it's functional and solves problems that you couldn't solve 427 00:21:22,680 --> 00:21:25,199 Speaker 2: easily or cost effectively before. And it just feels like 428 00:21:25,200 --> 00:21:28,040 Speaker 2: it happened overnight. But quantum computing we've been working on 429 00:21:28,160 --> 00:21:29,600 Speaker 2: now for ten-plus years. 430 00:21:29,960 --> 00:21:32,960 Speaker 1: And then the euphoria comes, and then people try to 431 00:21:33,040 --> 00:21:36,280 Speaker 1: make head or tail of how long that euphoria lasts. 432 00:21:37,240 --> 00:21:39,560 Speaker 1: Going back to the investment that you make, particularly 433 00:21:39,600 --> 00:21:41,639 Speaker 1: in generative AI, do you think it's going to be 434 00:21:41,680 --> 00:21:44,879 Speaker 1: peak here for generative AI in terms of that investment 435 00:21:44,880 --> 00:21:47,199 Speaker 1: that Amazon makes? 436 00:21:47,280 --> 00:21:51,560 Speaker 2: I don't know. You know, it's funny, I have this feeling that you're 437 00:21:51,560 --> 00:21:54,440 Speaker 2: going to end up with some people that feel disillusioned 438 00:21:54,440 --> 00:21:57,480 Speaker 2: about generative AI because of the investment versus 439 00:21:57,520 --> 00:22:00,960 Speaker 2: getting the commensurate return. It's still very early. So many companies 440 00:22:00,960 --> 00:22:03,119 Speaker 2: are really doing pilots right now of their generative 441 00:22:03,160 --> 00:22:06,320 Speaker 2: AI applications, and you already see a little bit of it. 442 00:22:07,359 --> 00:22:10,919 Speaker 2: But I think that the smart companies are going to 443 00:22:10,960 --> 00:22:14,280 Speaker 2: figure out which are the initiatives that can really change 444 00:22:14,320 --> 00:22:16,960 Speaker 2: their customer experience and their businesses and keep investing in 445 00:22:17,000 --> 00:22:22,200 Speaker 2: generative AI, and the slower companies are going to wait 446 00:22:22,240 --> 00:22:24,520 Speaker 2: to see if it's safe to go outside, and they'll 447 00:22:24,560 --> 00:22:27,560 Speaker 2: be behind by two or three years, maybe more. Because 448 00:22:27,600 --> 00:22:31,280 Speaker 2: the reality is, even more so than software development, generative 449 00:22:31,280 --> 00:22:34,760 Speaker 2: AI is very iterative. It's not like software development, 450 00:22:34,800 --> 00:22:36,560 Speaker 2: where you can get on a whiteboard with a team of 451 00:22:36,640 --> 00:22:39,480 Speaker 2: architects and design something, and maybe it doesn't work exactly 452 00:22:39,520 --> 00:22:42,240 Speaker 2: as you design it, but largely it does, you know. Whereas in 453 00:22:42,280 --> 00:22:45,880 Speaker 2: generative AI, the models get better at kind 454 00:22:45,920 --> 00:22:49,080 Speaker 2: of disproportionate rates. Sometimes, you know, scaling laws... 455 00:22:49,160 --> 00:22:50,359 Speaker 1: Have we got new scaling laws?
456 00:22:50,640 --> 00:22:52,920 Speaker 2: I mean, I think that a lot of times when 457 00:22:52,920 --> 00:22:55,040 Speaker 2: you're building models and the model gets so much better 458 00:22:55,040 --> 00:22:57,040 Speaker 2: and you talk to the scientists and the team, they 459 00:22:57,119 --> 00:22:58,919 Speaker 2: just can't believe how much better it got, because the 460 00:22:58,920 --> 00:23:02,280 Speaker 2: model is learning itself. And so I think that if 461 00:23:02,280 --> 00:23:04,960 Speaker 2: you actually aren't investing in generative AI, you're going to 462 00:23:04,960 --> 00:23:07,640 Speaker 2: be behind by even the amount of time that you waited, 463 00:23:08,000 --> 00:23:10,720 Speaker 2: because there are so many lessons you get from iterating 464 00:23:10,720 --> 00:23:12,440 Speaker 2: and building applications. 465 00:23:11,800 --> 00:23:14,920 Speaker 1: And you want to be the supermarket of AI today, 466 00:23:15,240 --> 00:23:18,000 Speaker 1: it feels like, agnostic to the models, as it is with 467 00:23:18,119 --> 00:23:19,200 Speaker 1: Amazon Alexa Plus. 468 00:23:19,600 --> 00:23:22,960 Speaker 2: Well, you know, look, the truth is, what we care 469 00:23:23,040 --> 00:23:25,960 Speaker 2: most about in all our businesses, and as it relates 470 00:23:25,960 --> 00:23:29,119 Speaker 2: to AWS and AI, is we want our customers to be 471 00:23:29,119 --> 00:23:32,439 Speaker 2: able to change their customer experiences and improve their businesses 472 00:23:32,720 --> 00:23:35,160 Speaker 2: so they can last over a long period of time successfully. 473 00:23:35,200 --> 00:23:37,800 Speaker 2: And if we do right by our customers and take 474 00:23:37,800 --> 00:23:40,560 Speaker 2: the long-term approach we do, and they're able to 475 00:23:40,680 --> 00:23:45,240 Speaker 2: run their applications successfully on top of our technology infrastructure services, 476 00:23:45,760 --> 00:23:48,439 Speaker 2: then they're successful and we ride along with them. And 477 00:23:48,520 --> 00:23:51,040 Speaker 2: so, you know, in all these areas, we have 478 00:23:51,160 --> 00:23:53,439 Speaker 2: services we build ourselves. In the models area, we have 479 00:23:53,480 --> 00:23:56,640 Speaker 2: Amazon Nova, which people are really excited about because it's 480 00:23:56,680 --> 00:23:59,320 Speaker 2: got comparable intelligence to the leading models in the world, 481 00:23:59,359 --> 00:24:03,160 Speaker 2: but it's meaningfully less expensive and lower latency. But if 482 00:24:03,200 --> 00:24:06,119 Speaker 2: customers prefer to run other models, we have huge partnerships 483 00:24:06,119 --> 00:24:10,679 Speaker 2: with Anthropic, and with Llama and Mistral and DeepSeek. I mean, 484 00:24:10,720 --> 00:24:14,040 Speaker 2: we have the largest collection of leading foundation models in 485 00:24:14,040 --> 00:24:17,520 Speaker 2: the world, and if customers are having success with those, 486 00:24:17,880 --> 00:24:19,800 Speaker 2: then we're happy. And the truth is, if you build 487 00:24:19,840 --> 00:24:22,679 Speaker 2: a lot of generative AI applications, people don't realize this: 488 00:24:23,119 --> 00:24:26,840 Speaker 2: you use multiple model types, often in the same application.
489 00:24:26,920 --> 00:24:30,640 Speaker 2: Even Alexa Plus, as we talked about, uses multiple foundation models, 490 00:24:31,040 --> 00:24:33,879 Speaker 2: and so we want people to use the right model 491 00:24:34,040 --> 00:24:36,840 Speaker 2: for their applications, and then we make it easy for 492 00:24:36,840 --> 00:24:39,040 Speaker 2: them to switch between them and run it easily and 493 00:24:39,080 --> 00:24:40,920 Speaker 2: successfully in AWS. 494 00:24:40,920 --> 00:24:43,919 Speaker 1: Andy Jassy, the perfect place to leave it: we started with Alexa 495 00:24:43,960 --> 00:24:46,760 Speaker 1: Plus and end on Alexa Plus, and all the 496 00:24:47,080 --> 00:24:49,280 Speaker 1: generative AI that comes with it. We thank you so much. 497 00:24:49,359 --> 00:24:52,080 Speaker 1: Thanks for having me. Andy Jassy, President and CEO of 498 00:24:52,119 --> 00:24:52,560 Speaker 1: Amazon