Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

Speaker 2: OpenAI's top of mind across the world of technology. That includes for Elon Musk, who posted on X overnight that his EV company Tesla is having to boost compensation to ward off approaches for his AI talent. Those approaches, claims Musk, include OpenAI. Ethan Knight, a member of Tesla's team working on computer vision for FSD, was approached by OpenAI, so Musk moved him to his own AI company, xAI. Let's get to the OpenAI side of the story. Let's talk about talent. Let's talk about what's happening in the world of artificial intelligence. Brad Lightcap, OpenAI COO, joins us here in New York City. There's no accusation that anything wrong was done, but it highlights an interesting point. Right now, there is a lot being done on the training of prior generation models, future generation models. There is a finite amount of talent. Give the OpenAI side of this equation: how much of a premium are you having to pay out? I don't know if it's salary or stock or engineering freedom to get the top names.
Speaker 1: Well, thank you, Ed, Caroline, for having me. It's great to be with you. You know, look, this is a competitive market.

Speaker 2: For sure.

Speaker 1: It's an amazing place that we're at, because there are a small number of people who can really make such a dramatic impact in this field. But we try and pull people into our mission, and I think that's historically what's been our strength: people want to build in the place where we're pushing the frontier the furthest, and also where we really think about the impact of the technology, both on a societal level and in how to apply it at an industry level.

Speaker 2: How big is OpenAI now? How many employees, and where are they?

Speaker 1: We're about twelve hundred people. Twelve hundred, yes, mostly in San Francisco, but also in London and Dublin and increasingly other places in the world.

Speaker 2: Increasingly other places in the world. The reporting was that Japan is the next frontier for you.
What's the strategy there, and how do you choose, I don't know whether you call it opening a new market, where you choose to have operations?

Speaker 1: Well, we're very fortunate: we have a very global base of demand, so we want to show up where our customers are. We think this is going to be a global transformation. We feel a lot of pull from places like Japan and Asia broadly, and so we'll show up where we feel like we need to be.

Speaker 3: And let's talk about where you show up from an industry perspective. You're here in New York talking to partners. We understand Sam and the teams have been in Hollywood discussing how Sora can be used, but ultimately how you can have a working relationship with companies and industries that think they're going to be upended. Tell us about how those conversations are going, for example, with Hollywood and publications.

Speaker 1: Yeah, you know, we take a very unique perspective here. We really model OpenAI as a partnerships company, and we think this transformation is going to be very broad based. And so we're one company. We can't do everything ourselves.
Being able to find ways to partner with industry, with countries, with specific companies, that's really important to our mission and to the way that we think about bringing technology to the world. By way of recent example, we actually work really closely with the publishing industry right now on ways that we can deploy AI into publishing to make the quality of journalism better, to make the quality of reader experiences better. We've heard really a lot of enthusiasm from that industry, and so we're always looking at ways that we can partner with industry. Expect a lot more from us in the future on this front; it's a great example of where we really are excited about the impact that we can make.

Speaker 3: In media, it's interesting that media companies over in Europe, for example, seem to have got with that quite quickly. Axel Springer is a key deal that you sort of birthed early days, and Axel Springer therefore sees the business side, and perhaps a little bit more blue links within ChatGPT.
The New York Times isn't on that discussion point right now. How is that legal fight, and indeed the behind-the-scenes discussion, going with those sorts of companies?

Speaker 1: Well, I can't comment on the New York Times case specifically, other than to say we think it's without merit. But the overwhelming majority of what we hear from the publishing industry is actually quite positive. There's a high level of excitement everywhere about what these tools can do, both to unlock better journalism, to give readers more access to information, and to enhance the way that people engage with content, so even the way that we think about the fundamental business of information engagement and consumption. These things are all really possible now with language models, and increasingly with multimodal models too. So we think we're just starting to scratch the surface on that opportunity. We've got a lot more to share there soon, but stay tuned.

Speaker 3: You're being positive, being optimistic, of course you are. What's interesting is I come away from meetings where Hollywood is deeply anxious. People in my industry are deeply anxious.
Goodness, everyone with a job is trying to work out how they adopt AI. And also there's this ongoing question of open versus closed. You're going through another legal fight at the moment; Elon Musk has dragged you into it and wants you to rename yourself. Where do you sit on closed versus open? How much more open do you have to become?

Speaker 1: Yeah. Well, look, I think we really focus just on where we can deploy the technology and what's the best way to do that. And at the end of the day, we're going to focus on where our customers are and the way that they want to deploy it. We work with thousands of enterprises across the world. We spend a lot of time working with those enterprises on ways that we can configure the technology specific to their use case, specific to their stack, and we think that the easiest way to do that is the way that we serve it. But that's not to say that, you know, we wouldn't look at other ways in the future.
We really ultimately want to be where we think the best place to deploy the technology is going to be.

Speaker 2: To our Bloomberg Television and radio audience worldwide, we're in New York City speaking to OpenAI COO Brad Lightcap. I want to learn more about where OpenAI sits in its growth, its day-to-day operations. You know, you just said thousands of enterprises. We came to know you because of the explosive response to ChatGPT, and that was principally in the context of the consumer-facing chatbot. But it sounds very much like growth now is focused on that enterprise arrangement, licensing of technology. Can you give me proportions, the rate of growth for those two pillars?

Speaker 1: Well, we've seen tremendous growth in the enterprise. Twenty-four is going to be the year of adoption for AI in the enterprise. We felt really like twenty-three was the year that people just started to wrap their heads around what was possible with AI in the enterprise. But increasingly the market is pulling us toward real applications that are delivering real business results and a very broad focus on AI enablement.
So how do we think about bringing AI both to our operations, but also to our workforces, and then also to new product experiences that we can deliver to customers? We work with plenty of companies that are doing all three of those things at the same time. We think that's very possible, and it's having really dramatic bottom-line impact today. This is where we think the pull is going to be, and we're ready to support it.

Speaker 2: Brad, you're the COO, so I'm assuming you're also responsible for some of the purse strings involved in OpenAI's finances. Compute must be a really considerable cost right now, and with that the energy consideration as well. Are you making enough revenue from this enterprise business to keep the lights on when you factor in the compute costs involved?

Speaker 1: We are, yes. We think that we have, well, first of all, a very diversified business, and so we serve hundreds of millions of users with our ChatGPT product, and many of those users are paying users.
Of course, we now serve over six hundred thousand individual users in the enterprise with ChatGPT Enterprise, and so we're just seeing tremendous momentum. We're in very early innings on this, and so we look at it as, hey, this is the start of something, and ultimately our priority is just to be able to deliver the best intelligence, whether it's in the enterprise or to individuals, that's the most useful and that drives the biggest business impact.

Speaker 2: Caroline, I'm thinking back to the conversation we had the other day with Daniela Amodei from Anthropic, and what she explained, Brad, was the relationship with AWS and Bedrock; they're getting a lot of business through that. You have a quite similar, and actually a pre-existing, situation with Microsoft and Azure. Are you able to explain how much new business comes through that conduit of the Microsoft relationship, and what were probably existing Azure enterprise customers?

Speaker 1: We're really fortunate to have Microsoft as a partner. They've been an amazing partner with us for a long time.
We've increasingly looked at ways that we can bring OpenAI technology to life together. We're very fortunate that they're offering our models through their Azure OpenAI Service. But we have an independent business, so we serve customers independently of Microsoft.

Speaker 2: And is one bigger than the Microsoft avenue?

Speaker 1: Yeah, look, you know, it's fluid between the two, but we really are focused on what's best for the companies we work with, and we're very fortunate to be able to diversify the surfaces on which we offer our products. We think the demand for this technology is massive, and so, being a smaller company, obviously having a partner that can help bring the technology to scale is really, really advantageous for us, and we're excited to support them in the future.

Speaker 3: It's an extraordinary, very novel kind of partnership, and in fact the whole of AI is birthing these kinds of frenemy relationships. I just think also of the acqui-hire deals that seem to be going on now. Microsoft has just seemingly acqui-hired the whole of Inflection.
What did you make of that particular move, particularly when suddenly you've got the co-founder of DeepMind coming in to be the head of consumer AI at Microsoft? And ultimately you put out a response to that along the lines of "congratulations." But how did that make you feel as the previous partner of choice for Microsoft?

Speaker 1: Well, we're excited for them. We congratulate Mustafa and Karén and the whole team, and we work very closely with Microsoft, and so we think that's going to be a major advantage for our partnership as well. We're focused on what we can control on the OpenAI side, which for us right now is really thinking about the next wave of the technology and thinking about ways that that wave is going to start to enable businesses. And the more we can think about that problem together, the better we'll all be.

Speaker 2: A lot of CEOs I speak to, some of them are your customers, say they're technology agnostic right now. In other words, they might use GPT or another foundation model; they might also be building their own in parallel.
And when we told our audience who was coming on the show, a lot of people wanted to get your thoughts on the Palantir relationship. It's an interesting case study. They seem to be one of the beneficiaries of the enterprise investment in AI, or at least the data analytics side. What's it like working with Alex Karp?

Speaker 1: You know, we actually work with a bunch of partners that help us bring the technology to life, and there are so many ways to deploy the technology in the enterprise. That's what's kind of amazing about this technology: it's not just one function in the enterprise or one part of the enterprise stack. It's so diverse and it's so broad based. And so we work with many partners to bring that to life, and we'll continue to look for partners to work with to help us do that. But yeah, look, we're excited with the way Palantir has been enabling their customers on top of our technology, and we'll continue to support them going forward.

Speaker 2: Brad, every day it crosses my desk, you know that.
I've done quite a lot of reporting on Sam's chip ambitions as well. There's some relationship going on between the UAE and OpenAI, fundraising or technology partnerships. Can you clarify what's going on there, and why you need to raise funds from somewhere like that?

Speaker 1: Yeah, well, look, I won't comment specifically on that. Broadly, though, we believe that there's just not enough total supply in the world.

Speaker 2: Of AI data centers specifically?

Speaker 1: Of AI accelerators, and broadly that the supply chain will need to adapt to what we think is going to be this highly inflected and nearly exponential demand in the next ten years. And so we spend a lot of time thinking about ways that we can make sure that that demand is met. Part of it is just being able to bring our models to bear, but upstream of that is obviously making sure that the actual underlying hardware and infrastructure exists to be able to build the systems that we need to ultimately serve that demand.
And so, you know, we think a lot about it, and our hope there is that the supply chain starts to think about it the same way we do.

Speaker 3: Brad, your mantra, your manifesto, your focus is to build artificial general intelligence for the good of humanity. And I speak to a lot of humans who are terrified. Can you encapsulate how you can do all of this, without regulation being in place yet and with the EU sort of out in front there, without disrupting too much, too quickly, too hot?

Speaker 1: Yeah. Well, look, central to our mission is to make sure that we bring the technology in a way that's safe and that creates broad benefit. It's really encoded into how OpenAI was structured, and it is actually the thing that we spend our time thinking about how to deliver in real-world results today. And so we're very optimistic, and I can understand where the fear comes from. Things are moving very fast; even for us at the center of it, it's sometimes dizzying. But look, our perspective is that this is an enabling technology.
At its core, it's going to make businesses more efficient, it can make people more productive, and it can be dramatically inflecting in how people learn and how people understand information. And so we think that has a very broadly positive effect on the world, and the world that we wake up to ten years from now will be one that we look back on and really are excited to have been able to go build today. That's really what we wake up every day and want to go do.

Speaker 3: Well, we want to thank you for being in the house with us. The COO of OpenAI, Brad Lightcap there, joining us in New York. We thank him.