Speaker 1: Bloomberg Audio Studios, podcasts, radio news. Let's listen in live at Bloomberg House in Davos right now, where Bloomberg's Shirin Ghaffary is in conversation with Sarah Friar, the CFO of OpenAI.
Speaker 2: So, first of all, thank you, Bloomberg, for having us today. It's been amazing to spend my first twenty-four hours in Davos and to see that AI is really top of the agenda, and I think appropriately so. You're right, there's been a lot of hype. Sam did tell everyone to chill, so the message is just chill a little, and he's right. We do internally feel that we have a path towards AGI. We had a big breakthrough in twenty twenty-four around reasoning models in particular. So you saw us move from a world of more chat models, chatbots, ChatGPT, the fastest app in the world to one hundred million users and today well over three hundred million users. But last year we really started to see the progression into reasoning and what we call our o series of models. So o1 preview was launched, and now we're all the way to o1. Hopefully you've all tried it. What those reasoning models do is they really start to help you think about difficult problems. So you see the model. One of my favorite things to do is actually watch what it's doing in real time. You see it go down chains of thought trying to answer the prompt that you've put in. You see it sometimes hit a cul-de-sac, it doesn't quite get to the answer, and so it almost has to reverse course and try a different approach to solving that problem, a lot like the way we all do, to ultimately give you the best answer. And frankly, I think often the answer is not a definitive, singular answer. It's an exploration, it's a way to continue a conversation. We are upping the pace. So we just went from o1 to a preview of o3, just the second model.
We like to confuse sometimes with our nomenclature, but o3, that just took three months. And if you think about the length of time from GPT-3 to GPT-4, that took about two years. So the pace of innovation is really accelerating.
Speaker 1: What's it like to work in a place like that?
Speaker 2: It's exhilarating, it's inspiring, it's sometimes exhausting, and it's definitely done with a sense of privilege. I think all of us at OpenAI think a lot about the magnitude of what we're bringing into the world. We think about it with regard to safety, with alignment, but also with the optimism of what it can do. When you can put that sort of human intelligence into the hands of kids in schools, when you put it into the hands of teachers, when you put it into the hands of our colleagues at work. And we're just getting started.
Speaker 1: Absolutely. So you mentioned reasoning. Can you tell us a little bit about how this kind of really advanced reasoning that OpenAI is coming out with, how does that translate into real customer value for you? How are you seeing OpenAI's users actually gain productivity, gain insights from that?
Speaker 2: Yeah, so reasoning is on the path to agents, and we really feel that we're now moving into this era of agentic technology and into a world of agents. So what we see today with the reasoning models is they are for some of the most difficult problems. We see it used in areas like pharma, for example, deep research into new molecules that help us think about drugs and drug discovery. Those have been some of the areas that I have been most excited about the progress so far and how people are taking the o series and using it. But as I talked about with agents, and I'm sure you're probably going to ask.
Speaker 1: Yes, let's dive in. Everyone has a different sort of idea, but what's your take on what an agent is?
And you know how Sam said agents are going to be acting as employees, right, in twenty twenty-five actually. So where do you think we'll actually see agents start to do the work of real people first? What sectors do you think it's going to hit?
Speaker 2: So it's important with agents, first of all, to think about why is it even possible. And so that's why I made the point about, with reasoning models, that kind of cul-de-sac that a reasoning model goes down and then has to back up and try a different way to solve. Because in an old world of deterministic software, effectively the developer or the architect had to in some ways envision all the possible routes that a question or an output could take. Today, a reasoning model acts much more in the way that we humans think about problems, and so that starts to bring forth this idea of agents within our workplace, maybe even just as task workers alongside us every day. So some of the earlier things that we are looking at is how do we use agents to just solve day-to-day problems. Like, I'm a working mom. Often I'm in that, and all the women in the audience just smiled, you're in that moment. You're like, oh no, it's five o'clock. I have still got multiple hours of work. I've got to get home. There's supposed to be something on the table to feed my children. And right now my husband is really useful, but maybe not all the time the most useful. What am I going to do? And this is a moment when you think about an agent that can perhaps figure out what am I going to get delivered to the home. I don't want to have to be specific. It's like, please get me something healthy. We had pizza last night, not Italian. Let's not do too many carbs again.
It might know my budget, might know what local restaurants I typically have ordered from, might come up with a new idea. So that would be an agent more in my personal life. But you are also starting to see this dawn of agents in our workforce, and so it may be areas like software development, it could be areas like research within labs, deep science. We see many folks here at Davos, you know, heralding that age of agents in areas like customer relationship management, for example. So I think that you're going to see a lot of companies talking about new agents coming to the forefront. Customer success is another one. But you've got to remember what that is built on top of. And it's really built on top of these reasoning models that we at OpenAI have brought into the world as the frontier technology and the frontier company.
Speaker 1: Great. Let's talk a little bit about you, your bread and butter, which is the world of finance, right. How do you top a record-setting VC funding round? What is next? What do you expect this year in terms of, and obviously AI costs a lot, right, the cost of computing increases year over year, so what can we expect? Will there be another mega round in twenty twenty-five?
Speaker 2: Our technology is built on three things. Great people who are building the most frontier algorithms and models in the world. It's built atop a tremendous amount of compute, and I think we're just scratching the surface on that. And then of course data. And in the world of compute, in order to buy that compute, to have access to it, to control our own destiny, we've had to do an extraordinary amount of fundraising. Luckily, we also have a business model that supports it. I already talked about ChatGPT for consumer, over three hundred million users today. And it's a workhorse when it comes to revenue, revenue growth, and ultimately profitability.
But we are seeing now enterprises of every size embrace this technology, and we really see ourselves also as the enterprise company. In fact, there's a really good symbiosis between those two areas, because often when I meet customers, and I'm meeting a lot of them here in Davos, it's actually their personal experience they start with. When I say, hey, how are you using ChatGPT, they'll actually give me a personal anecdote first. And for those of you who sell into enterprises, you know that if you've won your customer's heart in just their day-to-day, being able to go in and then sell them into their enterprise environment gets an awful lot easier. And so that enterprise model is really building across every sector of the economy, every type of company, every scale of company, and I'd be happy to talk through examples. Morgan Stanley is a great example, since we're talking financing, and they're also a good partner on that front. But they're using our technology in areas like wealth management, in areas like even their investment banking, and they've been doing it now for multiple years. So in terms of the future from a financing perspective, I suspect we will continue to have to finance at pace, but we will do that on the merits of our business as well.
Speaker 1: How about an IPO? What would that look like for OpenAI? You mentioned profitability. You know, how far do you think we are from that? You're someone who's taken two companies public. What would the steps be for OpenAI to get there? And how far are we from profitability?
Speaker 2: Yeah. So, as I've told every company I've been associated with on an IPO, it's not a destination. An IPO is just a marker on a journey. And if you get wrapped around the idea that the IPO is the destination, that is a very kind of dangerous world to
live in, because there's a feeling of finality, when in fact going public is just the beginning of another very interesting part of your journey. Like, what are the positives in an IPO? Number one, I think it's a very strong credentializing moment for any company. I love that moment of sunlight, of being able to show your financials externally, to show how your business is building. It's the best disinfectant in any room. It's a great way to fundraise, because it opens the door not just to equity and selling your equity in that moment, but it also starts opening the door to many more areas of financing, starting with mezzanine debt, structured debt, and so on. And again, in a world where we're buying a lot of compute, we need to get there fast, because equity is an expensive way to raise capital and to deploy capital. We need to make sure we continue to bring down that weighted average cost of capital. And then the third thing that I like about an IPO, I say that somewhat tongue in cheek, is it does add an element of rigor and discipline to the company's cadence. That is a very good thing. The reason why I kind of caveat the statement is you need to be very careful it doesn't make you short-term in how you think about your company. It's very easy to get wrapped around ninety-day cycles. But folks who operate companies do not work on ninety-day cycles, particularly companies in our sector right now that are having to take a very long-term perspective. Right, the o series of models, the work for that was really done probably two years previous, when we kind of put shovels in the ground, built data centers, built up infrastructure to do it.
And so this means that during an IPO process, it's incredibly important to bring along the right type of investor, the investor that understands the longevity that will be required to build the business, probably a little bit more like biotech investors in some ways. So it's always a potential station on the journey that we're on, but I don't want to make it the destination.
Speaker 1: All right. So we talked about funding, mentioned the recent round, which was really the first major accomplishment, right, under your tenure as CFO of OpenAI. Now there's another big task at hand, which is the restructure of OpenAI from a nonprofit to a for-profit. Now OpenAI plans to make it a public benefit, for-profit corporation, but nonetheless a for-profit. Why is this an important change for the company? Why do you need this? And particularly, why is it important to investors?
Speaker 2: Yeah. So, first and foremost, as we think through the restructure, the most important thing for those of us at OpenAI focused on it is to also make sure that the nonprofit is front and center. Our mission is everything: to create AGI that benefits humanity, that really benefits everyone, people in the world. And that is what the board, and remember, today the board is the board of the nonprofit, that is what they are most focused on. In regards to shifting over to being a PBC, the reason that we are thinking through that as the right corporate structure, and it's not a done deal, is that a PBC does allow us to balance both being mission-focused as well as being shareholder-focused, economically focused. It's not the be-all and end-all, because we have proven that we are able to access capital, that we're able to continue to grow the business, and we're able to attract great people.
But those are the three tenets that I look at in thinking about why we might want to look more like a traditional company. Because it is fun to innovate, but it's better to stay innovating on the innovation edge that we want to be on, which is less about our corporate structure and much more about our models and the products that we're building.
Speaker 1: Right. And so, as you said, it's not a done deal, there are steps to go through. What are some of the challenges of navigating this restructure, especially as some people, including Elon Musk, have criticized the company for the move, for the intended move?
Speaker 2: So for Elon, and Sam has been quite public about this, as have others like Greg, our co-founder, we see Elon as a competitor. We think he's a strong competitor, but we hope that he won't keep resorting to kind of using law and lawfare to compete. I think even in his own words, AI and building AI is a very capital-intensive business, and I think even he recognized very early on that it would require us to be much more than a nonprofit. So I actually think he's even said that we shouldn't be a nonprofit, to be able to raise the capital required. So for right now, you kind of asked, is it complicated internally? We try not to distract. We want to make sure our researchers are able to do the job they want to do, to be curious, to be serendipitous. We want to make sure those that are taking that research and turning it into product are customer-focused first and foremost, but also thinking about innovation, things customers may never have thought about. Make sure our go-to-market folks do what they do. And so we really try to make sure our focus from a restructure perspective stays in the right context and at the right scale.
Speaker 1: So there's a legal challenge from, right, Elon Musk, and regulatory approvals, but there's also just, this changes, you know, the structure for existing investors, and, you know, kind of reassessing the equity in the company. So what is that going to look like with existing investors like Microsoft? And what kind of equity might we expect? I know Altman has said that some of the numbers out there are maybe, you know, not quite right. But what do you think may happen with equity at the company for him and others?
Speaker 2: Yeah. So if you look at the folks really involved from a restructure perspective, it's not that many parties at the table. As we said, the nonprofit is actually the key focus for those of us at OpenAI, making sure that the nonprofit remains well capitalized, well funded, and can really help us live up to this very grand, inspirational mission that we have. For others like Microsoft, a really early partner for us, both from a capital perspective, for helping us build infrastructure initially, to being a partner even in how we think about some of our go-to-market, that is more of a negotiation. For the investors who came in last year, we actually raised that round with them sitting more atop, so for them it will be much more simple. It'll just be a conversion of the six point six billion they put in over the one hundred and fifty-seven billion post-money valuation. That's the percentage of the company that they owned at that moment. It's easy math. It feels much more like a growth equity round.
Speaker 1: Well, we're probably not going to get to the bottom of equity numbers today, so I'm going to move on. But I mean, we put out there, seven percent was a number that was out there. I think Altman said, yeah, that's not right. Any kind of ballpark of what we might expect?
Speaker 2: Nothing that we can discuss at this point. I'm not trying to shirk the question, but frankly, we don't have a definitive answer, so I'm not really able to talk about it.
Speaker 1: Let's talk about global policy and infrastructure. So just on the policy front, last night we all watched tech leaders, including Sam Altman, attend Trump's inauguration. How does the company plan on working with this new administration?
Speaker 2: So we have released multiple blueprints now about infrastructure being destiny, and what we mean by that is we recognize that this new era of AI is going to require a lot of both public and private sector relationship building in bringing it to the world. That's because of the capital requirements, but also how we think about the right regulatory environment. How do we make sure we build this in a way where it benefits humanity? And so with that, our Infrastructure Is Destiny blueprint really leans into things like building up economic zones, recognizing, a little bit with a US lens right now, that it is about productivity but also about national security. But I sit in front of you as both an American, a Brit, and a European, so growing up in Northern Ireland turns out had some really good advantages. And so from a US perspective, we see the productivity build already beginning to happen, frankly, and that's just good for people, right: better standard of life, better quality of life, better way to use their time. But we also want to make sure that it's a Western alliance, maybe, is one way to put it, so that everyone is embracing it, because we do have competitors that are going to come after this also from a national level, and I think we need to make sure, also from a national security perspective, that we're investing. So we've thought about things like economic development zones, how do governments really reach into their populace, help them think about reskilling and retooling.
We're doing a lot of work in the education sector. We've struck some kind of incredible deals. ASU is a good example of a university in the US that has two hundred and eighty thousand seats deployed on ChatGPT. My alma mater, Oxford, has also been a really good customer for us, for example, and we love getting down into that level because we see students embracing this technology. So that's been the second piece of infrastructure is destiny. And then beyond that, how do we think about working in the right regulatory environment so that, again, we deploy technology safely, but we also recognize that people want to be able to use it, and that governments, for the benefit of their citizens, need to make sure they're on the forefront as well.
Speaker 1: So OpenAI recently put out an economic blueprint, right, of the company's thoughts on what US AI policy should be. And a big message there was the sort of technological arms race in AI between the US and China, and that the US should invest in and support the build-out of infrastructure. Any indication of whether there is support for this in the Trump administration, and why are we hearing more rhetoric about the US versus China sort of AI competition now?
Speaker 2: I don't think it's rhetoric. I think it's fact. There is absolutely competition going on right now between the US and China. China is absolutely investing in this area. They absolutely know how critical it is to their economy, but also from a security perspective, so we should not be naive on that front. We absolutely see already with the Trump administration a real willingness to lean in, I think, to be very kind of on the economic front foot, and whether that comes from perhaps being more open from a regulatory standpoint, more open from just a business competition standpoint, we're excited to actually get to work.
Speaker 1: Let's talk about Europe maybe for a second, because you mentioned your European heritage and we are here in Europe.
You know, it's no secret that sort of policymakers in the EU are considered tougher on tech. We have seen AI regulation come out of the EU, right, whereas in the US we don't have federal legislation really on the matter. Do you think that regulation can threaten the pace of AI innovation? And are we seeing maybe a changing appetite from some governments on that front? You know, I know the UK is different, but the PM's recent announcements, right, there about fast-tracking AI infrastructure, similar to some of the policy proposals that you're putting forward in the US. So what do you think about that balance, right, in the EU of regulation being tough on tech versus innovation?
Speaker 2: Yeah, I mean, I would kind of start, if I'm sitting in the EU or the UK, and think about productivity and what that means for the growth of my economy. Right, we've seen a real divergence between the United States and other economies around the world, and it really kind of hurts my soul in some ways to see the growth rates, the UK's in particular being quite a low growth rate, relatively speaking. And I think technology is certainly a major accelerant factor to higher GDP growth, better quality of life for citizens. We just did a survey that asked, what do people want when they think about AI, and it was really three things. Number one, they want a better quality of life. Generally, they want to make sure that they're getting jobs, that they're getting a lift from the income that they take home every day. The second thing they want is better healthcare. And the third thing that they're looking for is just, save me time, help me be more efficient, help me just have that life where I have time to spend with my kids or have time to spend on the hobby that matters to me.
And so we think that from a government perspective, you really want to be mindful of finding that balance between overregulating and then perhaps not allowing technology to bear the fruit that I think it's capable of. I think the US has found a really nice path on that. The blueprint that you're referencing has a whole conversation about the advent of cars into society, actually originally innovated in the UK. But at the time we used people walking in front of cars with red flags to remind pedestrians there was this car coming. How could that be right? People don't walk that fast. It's because we also regulated the speed of the car, I think, down to something like three miles an hour. And so hence the US took up that mantle, created an incredible automobile industry, and really took the baton at a moment when maybe the UK should have thrived on that innovation in particular. So it is but one example, and one example is not everything, but I think it's a good metaphor for thinking about how do you balance leaning into this new technology while doing it in a way that feels safe and for the good of your citizens.
Speaker 1: I want to wrap up with a few more personal questions. So you are, you know, one of the top-ranking female executives in tech right now. There is a lot of discussion about DEI in the tech field, about, you know, how do women succeed in the tech industry, what's the best way to do that. How have you been able to navigate this, and what are your thoughts on the topic?
Speaker 2: It's something very near and dear to my heart. I started my career as an engineer. In fact, my very first job out of college, or my first internship, was working on a gold mine in Ghana, believe it or not. I wrote my master's on how to extract gold out of sulfide ores.
If anyone wants to have a conversation later. And I say that because I did that, and I came back to the UK and thought, that is not a job I can be successful in, because there were no other women. And so I strongly grab hold of that statement about it's very hard to be what you can't see. And so I think, as someone who has the privilege today of sitting in my seat, first of all, it's showing that women can be, and so hopefully that generation that's coming up behind can see what they are capable of. I think the second thing is, how do we lean into technology, to go back to what we can do with AI: you really can start to create personal tutors. I just saw an amazing piece out of the World Bank where they did a study in Nigeria. In particular, they looked at the gender dynamic. So they created, using ChatGPT, effectively an after-school tutoring program, and I think for girls, I think they zeroed in on the girls in particular, they saw almost a two-year leap in education outcomes in just a few months of using the technology, because it really does create a personalized moment. Like, one of the things that hurts my soul a lot is when I hear, particularly young girls, say, I'm not good at math, which is just not true. They would never say, I'm not good at speaking German, because you don't know yet. You are just starting your life. You do not know, but it may not be being taught to you in a way that resonates. Computer science is often taught first through gaming. My daughter, when she first took her CS classes, said, Mom, I just don't even like this, I don't want to build games. Once she recognized that she could use it to create websites, she could pull in her artistic side, and she was fascinated, drawn to it.
Today, you know, I'm really proud of the fact that she is a hard scientist, taking chemistry in university and doing computer science. So I think a lot of it comes down to how do we think about approaching people where they are. And then in the workforce, how do I turn around at the table, reach behind myself, pull people up of all different types of diversity, not just women. Because I think in the end, the more diversity we have around the table, the better the products that we build, the better the outcomes we have for society at large. And you cannot do that when you have a homogeneous point of view sitting in the room.
Speaker 1: This is a question that I got from others, so I know it's on people's minds. But what is Sam Altman like as a boss?
Speaker 2: Let's see, where's o1 when I need it? Sam is a delight. He's super fun to work with, because his brain runs at one thousand miles an hour, and so, as someone who's really hyper curious, I love that, because it keeps me on my toes. He has a good heart, and I think that's really important. But he's always pushing. And in fact, I had a conversation with him recently, you talk about, you know, when you're having that personal moment, where I said, Sam, hey, just something to know about me: when you're giving me feedback, if it's always on the thing that could be a little bit better, that is good, bring it on, but sometimes it's really helpful to kind of also remind me what went really well; like, I will respond better to you. And actually he was like, that is really helpful, because I'm someone who likes tough feedback all the time, but I'm really going to use that and think about that.
And I've seen him shift how we interact. And so again, it's like, this is why diversity is important, because we need to be able to talk to each other about the different ways that we respond, and I hope actually that helps him think about others that he's surrounding himself with. So yeah, overall, it's been a wild and incredible and inspirational first kind of run at the company. It's an incredible group of people. You are literally working with the best in the world at what they do, and you really have to work with very different personalities, and I love it.
Speaker 1: I would call that reinforcement learning.
Speaker 2: It is reinforcement learning.
Speaker 1: Talking IPOs and general intelligence, that conversation with OpenAI CFO, of course, Sarah Friar, talking to Bloomberg at Bloomberg House in Davos.