1 00:00:03,120 --> 00:00:11,160 Speaker 1: Bloomberg Audio Studios, Podcasts, radio news. 2 00:00:14,240 --> 00:00:17,120 Speaker 2: Well, Elon Musk is now the richest person on the planet. 3 00:00:17,480 --> 00:00:20,319 Speaker 2: More than half the satellites in space are owned and 4 00:00:20,440 --> 00:00:24,640 Speaker 2: controlled by one man. Well, he's a legitimate super genius, 5 00:00:24,840 --> 00:00:25,759 Speaker 2: I mean legitimate. 6 00:00:25,960 --> 00:00:28,600 Speaker 1: He says he's always voted for Democrats, but this year 7 00:00:28,640 --> 00:00:29,400 Speaker 1: it will be different. 8 00:00:29,440 --> 00:00:30,400 Speaker 3: He'll vote Republican. 9 00:00:30,680 --> 00:00:33,159 Speaker 2: There is a reason the US government is so reliant 10 00:00:33,200 --> 00:00:35,720 Speaker 2: on him. Elon Musk is a scam artist and he's 11 00:00:35,760 --> 00:00:36,400 Speaker 2: done nothing. 12 00:00:36,479 --> 00:00:40,720 Speaker 4: Anything he does, he's fascinating people. 13 00:00:50,479 --> 00:00:54,600 Speaker 2: Hey, welcome to Elon, Inc., Bloomberg's weekly podcast about Elon Musk. 14 00:00:54,880 --> 00:00:58,440 Speaker 2: It's March fifth, twenty twenty four. I'm Max Chafkin in 15 00:00:58,600 --> 00:01:03,080 Speaker 2: for David Papadopoulos. This week we have courtroom drama for you. 16 00:01:03,160 --> 00:01:07,680 Speaker 2: We're looking at an Elon feud between him and OpenAI, 17 00:01:07,840 --> 00:01:12,240 Speaker 2: the maker of ChatGPT. Last Thursday, Elon sued Sam Altman, 18 00:01:12,280 --> 00:01:14,839 Speaker 2: the co-founder of OpenAI, claiming that he breached 19 00:01:14,880 --> 00:01:18,399 Speaker 2: his contract by pursuing profit over the public good. The 20 00:01:18,440 --> 00:01:22,600 Speaker 2: funny thing is that Elon has his own for profit 21 00:01:22,920 --> 00:01:26,320 Speaker 2: AI company, xAI. So there's some question about 22 00:01:26,360 --> 00:01:29,319 Speaker 2: whether or not this is like a high-minded activity 23 00:01:29,640 --> 00:01:33,760 Speaker 2: or another sort of brilliant troll by a very troll 24 00:01:33,800 --> 00:01:36,200 Speaker 2: happy billionaire. We're going to talk about that, and then 25 00:01:36,319 --> 00:01:38,360 Speaker 2: later in the episode we're going to be talking about 26 00:01:38,400 --> 00:01:41,960 Speaker 2: another bit of legal action on Elon Musk's docket. The 27 00:01:42,640 --> 00:01:46,280 Speaker 2: lawyers who got Elon Musk's pay package, the fifty five 28 00:01:46,319 --> 00:01:49,480 Speaker 2: billion dollar pay package from Tesla, voided, they want to 29 00:01:49,520 --> 00:01:52,200 Speaker 2: be paid, get this, six billion dollars, and they don't 30 00:01:52,200 --> 00:01:54,800 Speaker 2: want cash, they want Tesla stock. So we will get 31 00:01:54,840 --> 00:01:57,200 Speaker 2: into all of that, and to do so, we've got 32 00:01:57,200 --> 00:01:59,919 Speaker 2: a great panel for you. We have gathered Shirin Ghaffary, 33 00:02:00,240 --> 00:02:05,040 Speaker 2: who covers AI for Bloomberg. Hey, Dana Hull, now officially 34 00:02:05,160 --> 00:02:10,840 Speaker 2: Bloomberg's chief Elon Musk reporter. Hello, Dana. Hey. And Matt Levine, 35 00:02:10,919 --> 00:02:14,440 Speaker 2: author of Bloomberg Opinion's Money Stuff newsletter, who always seems 36 00:02:14,480 --> 00:02:18,080 Speaker 2: to get his vacations interrupted by Elon news. Hello, Matt. 37 00:02:18,600 --> 00:02:21,440 Speaker 4: Yeah, I'm Bloomberg's unofficial Elon Musk correspondent.
38 00:02:22,440 --> 00:02:25,000 Speaker 2: So this lawsuit, first, let's start with Shirin. Can you 39 00:02:25,080 --> 00:02:27,680 Speaker 2: just give us an overview of what is in this complaint? 40 00:02:27,760 --> 00:02:29,280 Speaker 2: What is Elon Musk alleging? 41 00:02:29,639 --> 00:02:33,000 Speaker 1: Yeah. So, basically, Elon is claiming that when he invested 42 00:02:33,040 --> 00:02:36,200 Speaker 1: in OpenAI and made a deal with Sam Altman 43 00:02:36,280 --> 00:02:40,200 Speaker 1: and the founders of the company that the plan was 44 00:02:40,200 --> 00:02:43,040 Speaker 1: for OpenAI to be a nonprofit. And Elon's claim 45 00:02:43,080 --> 00:02:45,840 Speaker 1: is that OpenAI has breached this contract and this 46 00:02:45,919 --> 00:02:50,160 Speaker 1: agreement by turning into a for profit company with, you know, 47 00:02:50,200 --> 00:02:54,000 Speaker 1: heavy investment by Microsoft, and, you know, the outcome is, 48 00:02:54,040 --> 00:02:57,520 Speaker 1: Elon wants to see OpenAI's research become public and 49 00:02:57,600 --> 00:03:01,400 Speaker 1: wants to rid OpenAI of its agreement with Microsoft. 50 00:03:01,440 --> 00:03:05,360 Speaker 2: At this point, Matt, does this make sense? Like, can 51 00:03:05,400 --> 00:03:10,040 Speaker 2: a nonprofit donor sue a nonprofit for, you know, not 52 00:03:10,160 --> 00:03:12,360 Speaker 2: being sufficiently good to humanity? 53 00:03:12,760 --> 00:03:17,840 Speaker 4: Not really. I mean, like, normally the enforcement of nonprofit 54 00:03:17,919 --> 00:03:20,400 Speaker 4: missions is a matter for, like, the state attorney general, 55 00:03:20,480 --> 00:03:23,359 Speaker 4: not for individual donors, right? Otherwise you'd all the time 56 00:03:23,440 --> 00:03:25,520 Speaker 4: see donors saying, you know, I gave one hundred dollars 57 00:03:25,520 --> 00:03:27,080 Speaker 4: to this charity and they didn't do what I wanted. 58 00:03:27,480 --> 00:03:30,120 Speaker 4: But so that's why he's bringing a breach of contract action. 59 00:03:30,360 --> 00:03:32,919 Speaker 4: And it's very strange because they didn't sign any sort 60 00:03:32,960 --> 00:03:34,880 Speaker 4: of contract, right? Like, he's like, well, I have this 61 00:03:34,920 --> 00:03:38,080 Speaker 4: email from Sam Altman from twenty fifteen that's basically a contract. 62 00:03:38,360 --> 00:03:40,600 Speaker 4: So it's a bit of a stretch as a legal action. 63 00:03:41,120 --> 00:03:44,080 Speaker 2: Yeah, there's an email, I think, and it's like there's 64 00:03:44,240 --> 00:03:46,400 Speaker 2: just sort of a bunch of vague principles and 65 00:03:46,440 --> 00:03:49,600 Speaker 2: then the articles of incorporation, and I guess there's 66 00:03:49,640 --> 00:03:53,480 Speaker 2: some maybe additional agreement in Elon's head. Dana, why is 67 00:03:53,520 --> 00:03:55,560 Speaker 2: he doing this? I mean, I think it's tempting to 68 00:03:55,640 --> 00:03:58,160 Speaker 2: just sort of say, like, this is just a guy 69 00:03:58,240 --> 00:04:00,920 Speaker 2: who loves a good stunt, keeping himself in the middle 70 00:04:00,920 --> 00:04:01,400 Speaker 2: of the news. 71 00:04:01,640 --> 00:04:03,520 Speaker 5: I think Elon just always has to be the man 72 00:04:03,560 --> 00:04:07,560 Speaker 5: in the arena. And Tesla's valuation was all about being 73 00:04:07,560 --> 00:04:11,200 Speaker 5: an EV company with first mover advantage. Now it very much 74 00:04:11,240 --> 00:04:13,640 Speaker 5: rests on it being an AI company.
And he sees 75 00:04:13,720 --> 00:04:17,320 Speaker 5: Sam as this big threat. I mean Sam is like 76 00:04:17,440 --> 00:04:20,000 Speaker 5: the golden boy now, right, like raising all this money, 77 00:04:20,080 --> 00:04:22,840 Speaker 5: and you know he's got this deal with Microsoft. 78 00:04:23,000 --> 00:04:25,719 Speaker 5: He's back, you know, running OpenAI. He's supposedly raising 79 00:04:25,760 --> 00:04:28,479 Speaker 5: money for like a chip venture, and like Elon is 80 00:04:28,640 --> 00:04:31,080 Speaker 5: very competitive, like he's got to be the alpha male. 81 00:04:31,200 --> 00:04:34,440 Speaker 5: And if you look at the landscape of AI titans, 82 00:04:34,560 --> 00:04:37,680 Speaker 5: Sam is like number one, and so Elon is going 83 00:04:37,680 --> 00:04:38,280 Speaker 5: for the jugular. 84 00:04:38,440 --> 00:04:40,480 Speaker 2: Yeah, I mean, a lot of the 85 00:04:40,560 --> 00:04:43,359 Speaker 2: language in this lawsuit I noticed from, like, listening to 86 00:04:43,400 --> 00:04:45,600 Speaker 2: all these podcasts that Elon's been doing over the last 87 00:04:45,680 --> 00:04:47,919 Speaker 2: year or so, where he's sort of been, I don't know, 88 00:04:48,000 --> 00:04:50,719 Speaker 2: kind of grumbling about not getting enough credit for 89 00:04:50,800 --> 00:04:54,200 Speaker 2: this and also criticizing OpenAI. What is he actually 90 00:04:54,240 --> 00:04:57,279 Speaker 2: looking for here? Like, what's the endgame? Is there any 91 00:04:57,360 --> 00:04:59,920 Speaker 2: kind of hope for some kind of resolution that would 92 00:04:59,920 --> 00:05:02,400 Speaker 2: be, you know, advantageous to him? 93 00:05:02,680 --> 00:05:05,480 Speaker 5: I guess in his ideal world, like, the whole 94 00:05:05,560 --> 00:05:08,720 Speaker 5: OpenAI project falls apart, and like they have to 95 00:05:08,880 --> 00:05:11,120 Speaker 5: like reveal to the whole world what they're working on, 96 00:05:11,320 --> 00:05:14,000 Speaker 5: and I mean Shirin knows better than I do what 97 00:05:14,360 --> 00:05:15,679 Speaker 5: the endgame would be here. 98 00:05:16,080 --> 00:05:19,920 Speaker 1: Yeah. So Elon makes an interesting case here that Open 99 00:05:19,960 --> 00:05:24,039 Speaker 1: AI has already reached AGI, or artificial general intelligence, this 100 00:05:24,120 --> 00:05:28,640 Speaker 1: idea of when AI essentially surpasses, or matches at least, 101 00:05:28,800 --> 00:05:31,919 Speaker 1: the intelligence of humanity. And you know that is 102 00:05:32,000 --> 00:05:34,600 Speaker 1: OpenAI's mission and always has been, right, to build AGI, 103 00:05:34,800 --> 00:05:37,839 Speaker 1: as is Google's, as is several other leading AI companies'. 104 00:05:38,120 --> 00:05:41,120 Speaker 1: Elon's making the case in this lawsuit that with GPT-4, 105 00:05:41,600 --> 00:05:44,640 Speaker 1: OpenAI has already achieved AGI. And at that point, 106 00:05:45,080 --> 00:05:48,400 Speaker 1: it actually is in their agreement with Microsoft that Open 107 00:05:48,440 --> 00:05:52,359 Speaker 1: AI and Microsoft's contractual agreements no longer hold once 108 00:05:52,160 --> 00:05:53,440 Speaker 2: OpenAI has reached AGI. 109 00:05:53,960 --> 00:05:56,000 Speaker 1: Now OpenAI says, no, we are far off from 110 00:05:56,040 --> 00:05:59,120 Speaker 1: that; GPT-4, our latest model, is nowhere near being AGI.
111 00:06:00,120 --> 00:06:02,200 Speaker 1: The case Elon is making is 112 00:06:02,240 --> 00:06:05,160 Speaker 1: a bold one, and he actually cites some Microsoft research. 113 00:06:05,200 --> 00:06:08,400 Speaker 1: It's very interesting: a few Microsoft researchers did do 114 00:06:08,440 --> 00:06:11,960 Speaker 1: this paper recently saying that we see sparks of AGI 115 00:06:12,279 --> 00:06:15,360 Speaker 1: in the latest model. So he kind of uses Microsoft's 116 00:06:15,360 --> 00:06:19,719 Speaker 1: own researchers' work against them in this suit. 117 00:06:20,200 --> 00:06:23,520 Speaker 2: And we should say that OpenAI has not yet 118 00:06:23,680 --> 00:06:26,919 Speaker 2: commented on this lawsuit, although Bloomberg did report over the 119 00:06:26,960 --> 00:06:30,000 Speaker 2: weekend on a couple of memos in which 120 00:06:30,040 --> 00:06:33,360 Speaker 2: they're waving it off, saying it's all bogus. Matt, where 121 00:06:33,360 --> 00:06:35,039 Speaker 2: do you think this goes? I mean, do you think 122 00:06:35,040 --> 00:06:38,000 Speaker 2: this could wind up in court? Or is this gonna 123 00:06:38,000 --> 00:06:40,400 Speaker 2: be one of those lawsuits that, I don't know, we 124 00:06:40,440 --> 00:06:41,360 Speaker 2: just never hear about again? 125 00:06:42,240 --> 00:06:44,760 Speaker 4: I mean, he doesn't really settle, and like what are 126 00:06:44,760 --> 00:06:47,400 Speaker 4: they going to pay him? Like there's no settlement here, right? 127 00:06:47,680 --> 00:06:50,000 Speaker 4: So it's going to go to court. My suspicion is 128 00:06:50,000 --> 00:06:53,520 Speaker 4: that he will lose fairly early on because of sort 129 00:06:53,560 --> 00:06:55,960 Speaker 4: of basic things like there was no contract, he has 130 00:06:56,040 --> 00:06:59,360 Speaker 4: no standing to sue, et cetera. But if that's not true, 131 00:06:59,360 --> 00:07:02,000 Speaker 4: I don't know. I mean, one strategic reason 132 00:07:02,000 --> 00:07:04,080 Speaker 4: he might be bringing this lawsuit is because if he 133 00:07:04,160 --> 00:07:07,000 Speaker 4: gets anywhere, he can get discovery, right, and he can 134 00:07:07,080 --> 00:07:09,720 Speaker 4: like find out stuff about the inner workings of OpenAI, 135 00:07:09,920 --> 00:07:11,520 Speaker 4: which is really useful to him as a guy who's 136 00:07:11,520 --> 00:07:13,200 Speaker 4: building a competitor to OpenAI. 137 00:07:13,320 --> 00:07:15,320 Speaker 2: And amazing tweet material too. 138 00:07:15,400 --> 00:07:18,080 Speaker 4: Yeah, I mean, I think that a lot of lawsuits 139 00:07:18,080 --> 00:07:21,080 Speaker 4: settle because for most people it is unpleasant and expensive 140 00:07:21,120 --> 00:07:24,640 Speaker 4: to conduct a lawsuit. For Elon Musk, it's extremely fun 141 00:07:24,640 --> 00:07:27,000 Speaker 4: and he has a lot of money. So I think 142 00:07:27,040 --> 00:07:30,520 Speaker 4: that he will end up not winning any substantive relief 143 00:07:30,560 --> 00:07:32,080 Speaker 4: from this lawsuit, but he'll have a lot of fun 144 00:07:32,120 --> 00:07:32,440 Speaker 4: doing it. 145 00:07:32,520 --> 00:07:34,560 Speaker 2: To me anyway, as a stunt, it's like a win win, right? 146 00:07:34,560 --> 00:07:37,280 Speaker 2: Because on one hand, the judge could say, yes, Open 147 00:07:37,320 --> 00:07:40,080 Speaker 2: AI got to AGI and therefore it has to give up
148 00:07:40,120 --> 00:07:42,480 Speaker 2: you know, this Microsoft partnership, the partnership is dead, 149 00:07:42,800 --> 00:07:44,880 Speaker 2: or on the other hand, a judge will have to say, no, 150 00:07:45,000 --> 00:07:47,280 Speaker 2: your AI isn't very good, which would also be 151 00:07:47,320 --> 00:07:49,960 Speaker 2: good for Elon Musk, I think. Matt, how do you 152 00:07:50,040 --> 00:07:53,640 Speaker 2: rank this in the sort of pantheon of dubious 153 00:07:53,680 --> 00:07:55,160 Speaker 2: Elon Musk lawsuits? 154 00:07:55,680 --> 00:07:57,800 Speaker 4: I'm sympathetic to this one. Like, I don't think he's 155 00:07:57,800 --> 00:07:59,560 Speaker 4: gonna win, you know, like I don't think that 156 00:07:59,560 --> 00:08:01,400 Speaker 4: there was a contract, and I think, like, the legal 157 00:08:01,400 --> 00:08:03,920 Speaker 4: theory here is dubious. But I look at this as 158 00:08:03,960 --> 00:08:06,480 Speaker 4: like not a piece of strategy or whatever. I look 159 00:08:06,480 --> 00:08:08,880 Speaker 4: at it as like he donated tens of millions of 160 00:08:08,960 --> 00:08:11,440 Speaker 4: dollars to a nonprofit that is now an eighty six 161 00:08:11,440 --> 00:08:15,240 Speaker 4: billion dollar startup that's raising money and commercializing stuff for Microsoft. 162 00:08:15,240 --> 00:08:16,800 Speaker 4: And he's like, well, how did that work? Like, I 163 00:08:16,800 --> 00:08:19,480 Speaker 4: think that's a very reasonable complaint, like not as a 164 00:08:19,560 --> 00:08:21,920 Speaker 4: legal matter, just as, like, a, you know, personal matter. 165 00:08:22,120 --> 00:08:23,880 Speaker 4: And so of course he's mad and he's bringing this 166 00:08:23,960 --> 00:08:25,840 Speaker 4: lawsuit. I don't know, I'm sympathetic to that. 167 00:08:26,040 --> 00:08:29,160 Speaker 2: He's pointing to some real hypocrisy with OpenAI, which 168 00:08:29,240 --> 00:08:32,679 Speaker 2: is that, like, Sam Altman has gone around saying, 169 00:08:33,240 --> 00:08:34,840 Speaker 2: you know, AI is really out of control, we 170 00:08:34,960 --> 00:08:37,040 Speaker 2: gotta save the world from AI, and his solution 171 00:08:37,120 --> 00:08:40,520 Speaker 2: seems to be putting the AI into, like, every Microsoft product. 172 00:08:40,760 --> 00:08:42,440 Speaker 2: And there's a Musk quote on 173 00:08:42,480 --> 00:08:44,720 Speaker 2: this that's been circulating where he said, it's as if 174 00:08:44,840 --> 00:08:47,640 Speaker 2: you gave, you know, money to save the Amazon and 175 00:08:47,679 --> 00:08:50,240 Speaker 2: instead it turned out that the save-the-Amazon charity was 176 00:08:50,240 --> 00:08:53,200 Speaker 2: actually just, like, cutting down trees. Dana, what do you 177 00:08:53,200 --> 00:08:53,600 Speaker 2: make of that? 178 00:08:53,800 --> 00:08:56,240 Speaker 5: I guess on its face, the idea that a nonprofit 179 00:08:56,440 --> 00:09:00,600 Speaker 5: is now, like, a financially very lucrative, highly valued part 180 00:09:00,640 --> 00:09:04,079 Speaker 5: of Microsoft maybe is problematic. I could see 181 00:09:04,080 --> 00:09:05,719 Speaker 5: the argument for that. But I think the 182 00:09:05,800 --> 00:09:08,880 Speaker 5: larger context here is that, like, Elon isn't just, like, 183 00:09:08,960 --> 00:09:11,240 Speaker 5: an investor who feels like he was duped. He is 184 00:09:11,280 --> 00:09:13,760 Speaker 5: a competitor.
I mean, he has his own AI company, 185 00:09:13,800 --> 00:09:16,319 Speaker 5: and this is the same guy, let us remind everyone, 186 00:09:16,720 --> 00:09:19,120 Speaker 5: who made this big show of, like, signing a letter 187 00:09:19,280 --> 00:09:22,720 Speaker 5: saying that AI should be paused while he was, like, 188 00:09:22,800 --> 00:09:25,920 Speaker 5: secretly creating his own startup. So, like, I just don't 189 00:09:25,960 --> 00:09:28,000 Speaker 5: have a lot of empathy for Elon in this regard 190 00:09:28,160 --> 00:09:31,520 Speaker 5: because he's always got his own interests at play, and 191 00:09:31,600 --> 00:09:34,800 Speaker 5: right now his own interest is that through X and 192 00:09:35,040 --> 00:09:37,520 Speaker 5: xAI, he's trying to start his own competitor 193 00:09:37,640 --> 00:09:39,200 Speaker 5: to ChatGPT, which is 194 00:09:39,240 --> 00:09:43,400 Speaker 2: Grok. And so yeah, they're two competitors. Actually xAI, and 195 00:09:43,520 --> 00:09:46,280 Speaker 2: Tesla is a competitor, right, in a sense. 196 00:09:46,120 --> 00:09:48,040 Speaker 5: If you take the view that, like, yeah, 197 00:09:48,040 --> 00:09:50,080 Speaker 5: Tesla is trying to do full self driving, and like 198 00:09:50,120 --> 00:09:52,360 Speaker 5: Tesla has all this data and Tesla is training all 199 00:09:52,400 --> 00:09:54,760 Speaker 5: these models all the time. But I just want to 200 00:09:54,760 --> 00:09:57,160 Speaker 5: make sure, you know, that everyone realizes that, 201 00:09:57,280 --> 00:09:59,920 Speaker 5: like, you know, Elon is not just, like, an investor 202 00:10:00,080 --> 00:10:03,080 Speaker 5: who's saying that OpenAI is no longer a nonprofit, 203 00:10:03,240 --> 00:10:06,200 Speaker 5: or is it really a nonprofit. He's, like, a tech 204 00:10:06,280 --> 00:10:09,959 Speaker 5: titan who has his own AI ambitions and sees Sam 205 00:10:10,000 --> 00:10:12,839 Speaker 5: as the number one threat to what he's trying to accomplish. 206 00:10:13,040 --> 00:10:15,800 Speaker 2: Yeah, Shirin, your read on this, like, as you look 207 00:10:15,840 --> 00:10:18,640 Speaker 2: at the competitive landscape. You know, we've talked with you 208 00:10:18,840 --> 00:10:21,600 Speaker 2: on earlier episodes about Grok, this kind of, like, you know, 209 00:10:21,720 --> 00:10:25,800 Speaker 2: surprisingly competent ChatGPT clone, like ChatGPT 210 00:10:25,960 --> 00:10:27,920 Speaker 2: but a little less woke. I mean, is this as 211 00:10:27,960 --> 00:10:31,040 Speaker 2: simple as, like, he could harm a competitor with this lawsuit? 212 00:10:31,040 --> 00:10:32,840 Speaker 2: He's, you know, at the very least, you know, 213 00:10:32,880 --> 00:10:35,600 Speaker 2: making things a little annoying for Sam Altman, if not worse. 214 00:10:35,880 --> 00:10:37,120 Speaker 2: I think two things can be true. 215 00:10:37,200 --> 00:10:41,280 Speaker 1: I think OpenAI is clearly a competitor to Elon's company, 216 00:10:41,360 --> 00:10:43,920 Speaker 1: so he has an incentive there to go up against them.
217 00:10:44,160 --> 00:10:46,079 Speaker 1: On the other hand, it is true that OpenAI 218 00:10:46,240 --> 00:10:49,360 Speaker 1: started as a nonprofit and now it's not, and Elon 219 00:10:49,480 --> 00:10:51,360 Speaker 1: did donate a lot of money toward it. And so 220 00:10:51,640 --> 00:10:54,959 Speaker 1: I think that Elon's case brings up some valid questions 221 00:10:55,400 --> 00:10:58,440 Speaker 1: about the trajectory of OpenAI. And I think, you know, 222 00:10:58,520 --> 00:11:01,320 Speaker 1: his personal motivations and incentives can certainly be wrapped 223 00:11:01,320 --> 00:11:03,880 Speaker 1: up in this case. It's hard to separate that out. But 224 00:11:03,960 --> 00:11:06,080 Speaker 1: I do think this is bringing some really, you know, 225 00:11:06,280 --> 00:11:07,760 Speaker 1: worthwhile questions to the surface. 226 00:11:08,240 --> 00:11:11,840 Speaker 2: What's, like, the realistic outcome, though? Could the judge be like, yes, 227 00:11:12,240 --> 00:11:17,000 Speaker 2: you know, gavel, gavel, ChatGPT, GPT-4 is dead, sorry, 228 00:11:17,160 --> 00:11:19,960 Speaker 2: you know, and now we have to, like, use Grok 229 00:11:20,200 --> 00:11:22,600 Speaker 2: when we're doing Excel or whatever in the future? 230 00:11:23,440 --> 00:11:25,360 Speaker 1: I mean, I don't think that the court is going 231 00:11:25,400 --> 00:11:28,079 Speaker 1: to mandate that you use any certain software, certainly 232 00:11:28,160 --> 00:11:31,560 Speaker 1: not Grok. But I do think, I mean, look, I'm 233 00:11:31,559 --> 00:11:33,960 Speaker 1: not a legal expert, but from the legal analysis I 234 00:11:34,000 --> 00:11:36,760 Speaker 1: have seen, it seems like certainly this would be a 235 00:11:36,800 --> 00:11:41,280 Speaker 1: difficult case. But you could essentially force OpenAI, sure, 236 00:11:41,360 --> 00:11:44,240 Speaker 1: to, like, open-source its models, or to go about 237 00:11:44,280 --> 00:11:46,760 Speaker 1: abiding by its contract in a different way, in the 238 00:11:46,800 --> 00:11:48,600 Speaker 1: way that Elon wants it to, which is to say 239 00:11:48,640 --> 00:11:51,840 Speaker 1: that now OpenAI's contract with Microsoft is severed, we have 240 00:11:51,920 --> 00:11:54,200 Speaker 1: to open up, just license this stuff to the public, to 241 00:11:54,280 --> 00:11:56,640 Speaker 1: Elon and anyone else. I think it's more about open 242 00:11:56,679 --> 00:11:59,959 Speaker 1: sourcing the software of OpenAI rather than any certain 243 00:12:00,080 --> 00:12:01,280 Speaker 1: person using any certain tool. 244 00:12:01,640 --> 00:12:04,360 Speaker 2: So, like Matt's saying, a public good. Right, right. 245 00:12:04,679 --> 00:12:06,360 Speaker 4: One thing I think is interesting about this case is 246 00:12:06,400 --> 00:12:09,240 Speaker 4: that, like, Sam Altman and Elon Musk got together 247 00:12:09,320 --> 00:12:11,880 Speaker 4: to found a nonprofit. These are guys who are, like, 248 00:12:12,040 --> 00:12:17,079 Speaker 4: so deeply part of, like, the startup tech Silicon Valley ecosystem, 249 00:12:17,120 --> 00:12:20,079 Speaker 4: where, like, Elon Musk clearly thinks he's doing a 250 00:12:20,080 --> 00:12:22,920 Speaker 4: lot of good for the world by starting for profit companies.
251 00:12:23,120 --> 00:12:25,760 Speaker 4: It's just strange that they decided the way to do 252 00:12:25,800 --> 00:12:27,400 Speaker 4: their most good for the world is by starting a 253 00:12:27,440 --> 00:12:30,000 Speaker 4: nonprofit, because they never had that thought before about all 254 00:12:30,000 --> 00:12:32,199 Speaker 4: of their other businesses. And, like, Elon Musk is running 255 00:12:32,360 --> 00:12:34,640 Speaker 4: his own AI business out of a for profit company. 256 00:12:35,000 --> 00:12:39,400 Speaker 4: It just feels like everyone here is unfamiliar with what 257 00:12:39,440 --> 00:12:43,240 Speaker 4: a nonprofit is and uncomfortable with it, and everyone just 258 00:12:43,280 --> 00:12:44,800 Speaker 4: kind of wants it to be a for profit business. 259 00:12:44,840 --> 00:12:48,040 Speaker 4: But it gives Elon some leverage to mess with 260 00:12:48,800 --> 00:12:50,559 Speaker 4: Sam Altman's for profit nonprofit. 261 00:12:50,720 --> 00:12:52,680 Speaker 1: I do think being a nonprofit was part of the 262 00:12:52,679 --> 00:12:55,960 Speaker 1: pitch that appealed to certain researchers in the AI community. 263 00:12:56,520 --> 00:12:58,160 Speaker 1: I think we have to remember that, like, there's such 264 00:12:58,240 --> 00:13:00,600 Speaker 1: fierce competition for these people who are, you know, 265 00:13:00,679 --> 00:13:04,439 Speaker 1: PhDs and really understand AI, and so I do think 266 00:13:04,480 --> 00:13:06,679 Speaker 1: that was a good way to differentiate what OpenAI was 267 00:13:06,720 --> 00:13:08,719 Speaker 1: doing compared to Google at the time. And that's really, 268 00:13:08,760 --> 00:13:10,360 Speaker 1: if you look at the history of OpenAI, it was 269 00:13:10,360 --> 00:13:14,600 Speaker 1: created as this antidote to Google's private development of potentially 270 00:13:14,640 --> 00:13:15,960 Speaker 1: this life-changing AGI. 271 00:13:16,760 --> 00:13:18,080 Speaker 2: So I think that's why it started. 272 00:13:18,120 --> 00:13:20,360 Speaker 1: Now, like you're pointing out, whether everyone really understood 273 00:13:20,360 --> 00:13:22,880 Speaker 1: what it means to be a nonprofit is a different question. 274 00:13:23,080 --> 00:13:26,280 Speaker 4: Yeah. I think also another thing that was very important 275 00:13:26,400 --> 00:13:29,080 Speaker 4: to attracting researchers was paying them a lot of money, 276 00:13:29,200 --> 00:13:32,120 Speaker 4: including in stock options, which you can't really do at 277 00:13:32,160 --> 00:13:35,560 Speaker 4: a nonprofit, and they did anyway. So, like, yeah, the 278 00:13:35,640 --> 00:13:38,720 Speaker 4: for profit is also very important to attracting researchers. 279 00:13:39,000 --> 00:13:42,199 Speaker 3: Yeah, absolutely. Well, and not only that, but, like, it's 280 00:13:42,240 --> 00:13:45,960 Speaker 3: not only a form of governance that they're all unfamiliar with, 281 00:13:46,000 --> 00:13:48,360 Speaker 3: but it's one that they've in previous years talked about 282 00:13:48,360 --> 00:13:50,560 Speaker 3: how bad it is, right? Like, the whole point, 283 00:13:50,600 --> 00:13:54,240 Speaker 3: the whole premise of Elon Musk's, like, thing is that, 284 00:13:54,320 --> 00:13:57,000 Speaker 3: you know, Silicon Valley, this, like, tech startup way 285 00:13:57,000 --> 00:13:57,520 Speaker 3: of doing 286 00:13:57,360 --> 00:14:01,160 Speaker 2: things is uniquely, you know, innovative.
It's, like, 287 00:14:01,200 --> 00:14:03,480 Speaker 2: basically the way to solve all the world's problems, and 288 00:14:03,520 --> 00:14:07,000 Speaker 2: you kind of have OpenAI maybe pursuing that, and, 289 00:14:07,440 --> 00:14:09,360 Speaker 2: like, Elon Musk is on the outside of it for 290 00:14:09,400 --> 00:14:12,400 Speaker 2: a change and is, like, I don't know, a little jealous. 291 00:14:12,480 --> 00:14:14,560 Speaker 4: Yeah. I think if you asked Elon Musk on any 292 00:14:14,640 --> 00:14:17,679 Speaker 4: day except last week, who does more good for the world, 293 00:14:17,760 --> 00:14:20,080 Speaker 4: nonprofits or for profit startups, he would have said for 294 00:14:20,120 --> 00:14:22,920 Speaker 4: profit startups, right? But then last week it's like nonprofits. 295 00:14:23,160 --> 00:14:26,480 Speaker 2: Okay, that's it for AI. We're gonna leave Shirin, let 296 00:14:26,480 --> 00:14:29,000 Speaker 2: her get back to reporting on this as it develops. 297 00:14:29,040 --> 00:14:30,880 Speaker 2: Thank you, Shirin, for your time today. 298 00:14:31,080 --> 00:14:32,120 Speaker 1: Great, thanks for having me. 299 00:14:35,880 --> 00:14:39,680 Speaker 2: All right, let's turn to Delaware, where in January Elon 300 00:14:39,760 --> 00:14:43,760 Speaker 2: Musk's enormous fifty five billion dollar pay package was invalidated 301 00:14:43,800 --> 00:14:45,920 Speaker 2: by a judge, and now we have a new fight 302 00:14:46,080 --> 00:14:48,800 Speaker 2: over how much the lawyers who won that class action 303 00:14:48,880 --> 00:14:51,160 Speaker 2: lawsuit are going to get paid. And I just 304 00:14:51,240 --> 00:14:54,240 Speaker 2: read this in Matt's newsletter: the plaintiff lawyers want to 305 00:14:54,280 --> 00:14:58,680 Speaker 2: be paid in stock, I believe six billion dollars in 306 00:14:59,000 --> 00:15:02,520 Speaker 2: Tesla stock. Matt, you wrote, the obvious thing to say 307 00:15:02,720 --> 00:15:03,720 Speaker 2: is this is absurd. 308 00:15:04,200 --> 00:15:07,400 Speaker 4: It's a lot of money for taking away his pay package. 309 00:15:08,120 --> 00:15:11,600 Speaker 4: Like, what I said was that, you know, Tesla's shareholders 310 00:15:11,760 --> 00:15:15,280 Speaker 4: did vote to give Elon Musk fifty six billion dollars 311 00:15:15,320 --> 00:15:18,400 Speaker 4: of Tesla's stock if he could raise Tesla's market cap 312 00:15:18,520 --> 00:15:20,920 Speaker 4: by, like, one thousand percent. And then he did do 313 00:15:21,000 --> 00:15:23,200 Speaker 4: that thing, and they wanted to give him the stock, 314 00:15:23,600 --> 00:15:26,160 Speaker 4: and these lawyers went to court and clawed it back 315 00:15:26,160 --> 00:15:27,680 Speaker 4: from him. And now the lawyers are like, we want 316 00:15:27,720 --> 00:15:29,520 Speaker 4: ten percent of that amount of money. Now, have they 317 00:15:29,600 --> 00:15:32,000 Speaker 4: raised Tesla's market cap by six hundred billion dollars? No, 318 00:15:32,040 --> 00:15:34,400 Speaker 4: they have not, or even by sixty billion dollars. So 319 00:15:34,760 --> 00:15:37,800 Speaker 4: it is strange. It's a very large ask. 320 00:15:38,080 --> 00:15:41,080 Speaker 5: It's, like, the largest lawyer fee that anyone has 321 00:15:41,120 --> 00:15:43,440 Speaker 5: ever asked for. I mean, in what other 322 00:15:43,760 --> 00:15:46,600 Speaker 5: scenario has any plaintiff lawyer asked for six 323 00:15:46,640 --> 00:15:47,960 Speaker 5: billion dollars' worth of stock?
324 00:15:48,080 --> 00:15:48,680 Speaker 2: Like never, right? 325 00:15:48,840 --> 00:15:51,200 Speaker 4: Well, I mean, their argument is they got fifty six 326 00:15:51,240 --> 00:15:53,160 Speaker 4: billion dollars of value for the shareholders. 327 00:15:53,280 --> 00:15:55,560 Speaker 5: Yeah, but Ann Lipton was telling us that 328 00:15:55,600 --> 00:15:58,600 Speaker 5: this is, like, the largest fee request she's ever heard of, 329 00:15:58,800 --> 00:16:01,840 Speaker 5: and yeah, it's absurd, but it's also cheaper than, like, 330 00:16:01,960 --> 00:16:04,480 Speaker 5: most lawyers' fees, right? Like, usually if you win, you 331 00:16:04,520 --> 00:16:07,240 Speaker 5: get, like, thirty percent of, like, the final judgment, and 332 00:16:07,320 --> 00:16:10,200 Speaker 5: this would be, like, eleven percent. So, like, it's, like, 333 00:16:10,240 --> 00:16:14,120 Speaker 5: absurdly large and yet less than what most lawyers get. 334 00:16:14,640 --> 00:16:17,640 Speaker 5: And the plaintiffs' lawyers are arguing that, like, they don't 335 00:16:17,640 --> 00:16:19,720 Speaker 5: want to cripple Tesla, they don't want Tesla to have 336 00:16:19,760 --> 00:16:21,960 Speaker 5: to give them six billion in cash, so they'd just 337 00:16:22,000 --> 00:16:25,080 Speaker 5: really like to have six billion worth of stock, which 338 00:16:25,480 --> 00:16:27,720 Speaker 5: everyone is, like, outraged about. I mean, Elon is, 339 00:16:27,760 --> 00:16:29,880 Speaker 5: like, this is criminal, and all the fans and a 340 00:16:29,920 --> 00:16:32,320 Speaker 5: lot of Tesla executives were, like, on X all weekend, 341 00:16:32,440 --> 00:16:34,680 Speaker 5: just talking about how ridiculous it is. 342 00:16:34,680 --> 00:16:37,400 Speaker 2: Is there any precedent? Like, do people ever, like, sue 343 00:16:37,520 --> 00:16:40,320 Speaker 2: Kroger for slipping and falling and then get Kroger shares? 344 00:16:40,440 --> 00:16:42,560 Speaker 2: Like, does this ever happen, where you get shares in 345 00:16:42,600 --> 00:16:43,960 Speaker 2: the company you sue? 346 00:16:44,280 --> 00:16:49,280 Speaker 4: Yeah, like, especially in those bankruptcies, tobacco bankruptcies. Like, the answer is, 347 00:16:49,320 --> 00:16:51,160 Speaker 4: if you, like, take all of a company's money, then 348 00:16:51,160 --> 00:16:53,480 Speaker 4: you get shares, because they don't have any money. That's 349 00:16:53,480 --> 00:16:55,720 Speaker 4: not the case here, but yes, like, they 350 00:16:55,720 --> 00:16:59,920 Speaker 4: don't want to cause financial distress by taking six billion 351 00:17:00,080 --> 00:17:02,960 Speaker 4: dollars in cash from Tesla, and so the only way 352 00:17:03,280 --> 00:17:06,040 Speaker 4: for them to ask for an absolutely absurd size of 353 00:17:06,119 --> 00:17:08,399 Speaker 4: award is by asking for it in stock rather than cash. 354 00:17:08,520 --> 00:17:10,200 Speaker 4: Like, that's what's happening here. They don't want the stock. 355 00:17:10,240 --> 00:17:12,399 Speaker 4: They just can't with a straight face ask for 356 00:17:12,440 --> 00:17:14,359 Speaker 4: six billion dollars of cash, so they're asking for stock. 357 00:17:14,840 --> 00:17:18,080 Speaker 2: It really plays into kind of Elon Musk's narrative about 358 00:17:18,080 --> 00:17:20,000 Speaker 2: this case, right? Because he has said all along, this 359 00:17:20,119 --> 00:17:23,159 Speaker 2: is just the lawyers.
You know, this pay package actually 360 00:17:23,160 --> 00:17:26,360 Speaker 2: made Tesla stronger. And now, Dana, you have the lawyers 361 00:17:26,440 --> 00:17:29,399 Speaker 2: essentially, like, playing right into his hands, at least his 362 00:17:29,480 --> 00:17:33,159 Speaker 2: sort of, like, social media narrative. 363 00:17:33,880 --> 00:17:35,959 Speaker 5: Yeah, and I mean, Elon does have a point. 364 00:17:36,000 --> 00:17:38,200 Speaker 5: Like, the plaintiff here is like a pocket plaintiff. 365 00:17:38,200 --> 00:17:40,200 Speaker 5: I mean, it really is the law firm that took 366 00:17:40,240 --> 00:17:43,880 Speaker 5: this on and, you know, like, had a rare win. 367 00:17:43,920 --> 00:17:46,440 Speaker 5: I mean, Elon almost never loses in any kind of court. 368 00:17:46,560 --> 00:17:49,119 Speaker 5: And, you know, not only is he pissed about it, 369 00:17:49,119 --> 00:17:50,840 Speaker 5: but it'll be interesting to see what happens. Like, the 370 00:17:50,920 --> 00:17:52,600 Speaker 5: judge is going to have to have a hearing on 371 00:17:52,640 --> 00:17:54,960 Speaker 5: this fee request, and there's going to be scores of 372 00:17:55,040 --> 00:17:58,119 Speaker 5: objections, and is she gonna knock down the fee or 373 00:17:58,160 --> 00:17:59,720 Speaker 5: go along with it? I mean, it's hard for me 374 00:17:59,760 --> 00:18:02,959 Speaker 5: to imagine her going along with a fee of this size. 375 00:18:03,000 --> 00:18:05,719 Speaker 5: But I mean, it was this rare, you know, like, 376 00:18:05,800 --> 00:18:10,680 Speaker 5: his pay package was this, like, moonshot, unheralded, unprecedented pay package. 377 00:18:10,800 --> 00:18:14,720 Speaker 5: Now we have the same kind of moonshot, unprecedented fee request. 378 00:18:14,720 --> 00:18:16,119 Speaker 5: I don't know, Matt, what do you think? 379 00:18:16,200 --> 00:18:19,199 Speaker 4: Oh, yeah, I mean, you're completely right. Like, this 380 00:18:19,240 --> 00:18:21,440 Speaker 4: is an entrepreneurial law firm that is in the business 381 00:18:21,800 --> 00:18:26,000 Speaker 4: of finding problems with companies and seeing if it can, 382 00:18:26,160 --> 00:18:28,879 Speaker 4: you know, spin up some cash for itself by identifying 383 00:18:28,880 --> 00:18:31,919 Speaker 4: those problems. And this was a case 384 00:18:31,960 --> 00:18:33,760 Speaker 4: that they put a lot of time and effort 385 00:18:33,760 --> 00:18:35,800 Speaker 4: into and took a lot of risk on, because, you know, 386 00:18:35,800 --> 00:18:37,760 Speaker 4: it's a long shot case. If you win a case 387 00:18:37,800 --> 00:18:40,400 Speaker 4: like this, you do owe it to yourself to ask 388 00:18:40,440 --> 00:18:42,560 Speaker 4: for six billion dollars. Right, you may not get the 389 00:18:42,560 --> 00:18:44,880 Speaker 4: six billion dollars, but, like, you'd feel like a fool 390 00:18:44,880 --> 00:18:47,480 Speaker 4: if you didn't ask, because, like, right, it's modest, that's 391 00:18:47,520 --> 00:18:49,280 Speaker 4: like only ten percent of the amount they recovered, and 392 00:18:49,320 --> 00:18:51,600 Speaker 4: they did such a great job for shareholders, it'd be 393 00:18:51,640 --> 00:18:54,520 Speaker 4: crazy not to ask for it. I assume they expect 394 00:18:54,520 --> 00:18:57,240 Speaker 4: it to be cut back to a somewhat more reasonable number.
395 00:18:57,280 --> 00:18:58,840 Speaker 4: But if you start high, you get cut back to 396 00:18:58,920 --> 00:19:00,680 Speaker 4: a still very large number. 397 00:19:01,080 --> 00:19:04,719 Speaker 2: So, like, it'll still be enormous, but not six billion. 398 00:19:05,040 --> 00:19:07,320 Speaker 4: It'll buy all of them a lot of yachts. Yeah, 399 00:19:08,359 --> 00:19:10,280 Speaker 4: I mean, it's like, this is not like an 400 00:19:10,320 --> 00:19:12,919 Speaker 4: industrial company that needs to invest, you know. Like, this 401 00:19:13,000 --> 00:19:14,840 Speaker 4: is, like, you know, a dozen lawyers or whatever, right? 402 00:19:14,880 --> 00:19:18,119 Speaker 4: Like, they'll have a nice, you know, Christmas 403 00:19:18,160 --> 00:19:18,560 Speaker 4: this year. 404 00:19:18,920 --> 00:19:21,520 Speaker 2: So I have two questions for both of you. One is, 405 00:19:21,600 --> 00:19:24,600 Speaker 2: has Elon appealed? I know we've been expecting him to appeal. 406 00:19:24,880 --> 00:19:27,800 Speaker 2: And then how do they actually get the money back? 407 00:19:27,880 --> 00:19:29,560 Speaker 2: What does that look like? Are these options? Has he 408 00:19:29,640 --> 00:19:33,000 Speaker 2: exercised these options? Does he write a fifty five billion 409 00:19:33,080 --> 00:19:36,760 Speaker 2: dollar check? Or, like, what does the actual payback look like? 410 00:19:37,280 --> 00:19:39,040 Speaker 5: So there's sort of, like, a sequence of things that 411 00:19:39,119 --> 00:19:41,119 Speaker 5: have to happen. First, there needs to be a hearing 412 00:19:41,160 --> 00:19:43,520 Speaker 5: on the fee request, and then the judge has to 413 00:19:43,640 --> 00:19:46,280 Speaker 5: enter, like, her final judgment, and then that starts the 414 00:19:46,320 --> 00:19:49,600 Speaker 5: thirty day clock by which Elon has to appeal. So 415 00:19:50,320 --> 00:19:52,560 Speaker 5: he hasn't appealed yet because the clock hasn't started yet, 416 00:19:52,600 --> 00:19:54,760 Speaker 5: because the judge hasn't entered her final ruling yet, because 417 00:19:54,760 --> 00:19:56,240 Speaker 5: there hasn't been the hearing on the fee yet. So, 418 00:19:56,320 --> 00:19:59,159 Speaker 5: like, we all expect him to appeal, but it's not 419 00:19:59,160 --> 00:20:01,359 Speaker 5: going to happen till later in the spring. And then 420 00:20:01,880 --> 00:20:05,679 Speaker 5: the shares get returned to Tesla, like, it's a derivative lawsuit. 421 00:20:05,840 --> 00:20:09,119 Speaker 5: So the sort of argument is that, you know, Elon 422 00:20:09,200 --> 00:20:11,399 Speaker 5: doesn't have to cut a check, but, like, all the 423 00:20:11,440 --> 00:20:14,000 Speaker 5: other shareholders who were diluted by the fact that he 424 00:20:14,040 --> 00:20:17,080 Speaker 5: got this pay package will now not be diluted, because 425 00:20:17,080 --> 00:20:20,360 Speaker 5: all the shares go back to, like, the pool, 426 00:20:20,840 --> 00:20:24,080 Speaker 5: which is why Elon is, surprise surprise, like, basically letting 427 00:20:24,119 --> 00:20:25,520 Speaker 5: it be known that he wants a new pay 428 00:20:25,320 --> 00:20:29,040 Speaker 2: package. But that's a separate pay package. I mean, his idea, 429 00:20:29,160 --> 00:20:32,040 Speaker 2: right, is to keep the fifty 430 00:20:32,040 --> 00:20:33,679 Speaker 2: five billion he was already paid and then get a 431 00:20:33,760 --> 00:20:36,960 Speaker 2: new fifty five billion or something like that.
432 00:20:37,240 --> 00:20:39,439 Speaker 5: Regardless of whether he wins or loses on appeal, he 433 00:20:39,480 --> 00:20:42,000 Speaker 5: still wants another pay package so that he can keep 434 00:20:42,040 --> 00:20:45,040 Speaker 5: doing AI and robotics. But I mean, it's hard to 435 00:20:45,080 --> 00:20:47,439 Speaker 5: imagine him winning on appeal. I mean, I don't know. 436 00:20:47,920 --> 00:20:51,320 Speaker 5: Jef Feeley, our colleague who covers Delaware's Chancery Court, doesn't think 437 00:20:51,880 --> 00:20:54,560 Speaker 5: that Elon is likely to win on appeal, just because 438 00:20:54,600 --> 00:20:57,640 Speaker 5: the judge's ruling was, like, this two hundred page opinion 439 00:20:57,720 --> 00:20:59,639 Speaker 5: that was very carefully written. 440 00:21:00,280 --> 00:21:02,520 Speaker 4: You could have objections to this ruling, and I think 441 00:21:02,560 --> 00:21:05,480 Speaker 4: it's possible to imagine a court reversing it on appeal. 442 00:21:05,480 --> 00:21:08,040 Speaker 4: I probably agree that it's less likely than not. By 443 00:21:08,080 --> 00:21:09,959 Speaker 4: the way, the other thing, like, it's not all 444 00:21:10,040 --> 00:21:12,680 Speaker 4: or nothing, like, he wins on appeal or he gets 445 00:21:12,720 --> 00:21:14,560 Speaker 4: the fifty five billion taken away. First of all, the 446 00:21:14,600 --> 00:21:17,280 Speaker 4: fifty five billion number is fake. Like, it's stock options; 447 00:21:17,280 --> 00:21:19,520 Speaker 4: he has no stock. The options are not worth fifty 448 00:21:19,520 --> 00:21:21,680 Speaker 4: five billion; that was, like, a number in the disclosure. 449 00:21:21,680 --> 00:21:23,320 Speaker 4: They're worth a little bit less than that now, but, 450 00:21:23,359 --> 00:21:25,159 Speaker 4: like, they've been worth more or less than that as 451 00:21:25,200 --> 00:21:28,240 Speaker 4: the stock price moves. But also, like, the fix here 452 00:21:28,359 --> 00:21:31,760 Speaker 4: is not necessarily, like, win on appeal. It's have Tesla 453 00:21:32,480 --> 00:21:35,520 Speaker 4: give him that pay package again, and then maybe also 454 00:21:35,520 --> 00:21:38,679 Speaker 4: another pay package. Right? Like, what this ruling says is 455 00:21:38,720 --> 00:21:41,240 Speaker 4: that the pay package they gave him in twenty eighteen 456 00:21:41,359 --> 00:21:44,640 Speaker 4: was invalid. But it's still a company with a board 457 00:21:44,680 --> 00:21:46,840 Speaker 4: of directors that can make decisions, and if they made 458 00:21:46,880 --> 00:21:51,080 Speaker 4: a decision, like, we think he deserved half of that 459 00:21:51,119 --> 00:21:53,720 Speaker 4: pay package, like, we take the judge's point that he 460 00:21:53,880 --> 00:21:57,119 Speaker 4: couldn't deserve fifty five billion, but he deserves thirty billion, 461 00:21:57,560 --> 00:21:59,359 Speaker 4: and we're going to submit it to a vote of 462 00:21:59,359 --> 00:22:03,160 Speaker 4: the shareholders. We're going to fix what happened last time, 463 00:22:03,160 --> 00:22:05,480 Speaker 4: where the judge found that the shareholder vote was not 464 00:22:05,520 --> 00:22:08,119 Speaker 4: fully informed.
So we're going to give the shareholders more 465 00:22:08,119 --> 00:22:10,280 Speaker 4: information about the decision making, and we're going to ask 466 00:22:10,320 --> 00:22:12,720 Speaker 4: the shareholders to vote again to give him thirty billion 467 00:22:12,760 --> 00:22:17,080 Speaker 4: dollars of options based on his twenty eighteen performance. Like, 468 00:22:17,160 --> 00:22:19,680 Speaker 4: the shareholders vote yes, right, and then it's just the 469 00:22:19,760 --> 00:22:21,560 Speaker 4: question of, like, will that hold up in court? And, like, 470 00:22:21,600 --> 00:22:23,760 Speaker 4: I think they can do things to fix the problem 471 00:22:23,800 --> 00:22:25,520 Speaker 4: and make it more bulletproof in court next time. 472 00:22:25,560 --> 00:22:26,840 Speaker 4: So I don't think that this is, like, he gets 473 00:22:26,920 --> 00:22:28,560 Speaker 4: nothing or he gets fifty five billion dollars. And I 474 00:22:28,600 --> 00:22:31,040 Speaker 4: think even if he loses on appeal, there are ways 475 00:22:31,040 --> 00:22:34,159 Speaker 4: for Tesla to continue pumping money to him. 476 00:22:34,320 --> 00:22:36,280 Speaker 5: If they move to Texas, right? That could be, 477 00:22:36,280 --> 00:22:38,600 Speaker 5: that's one avenue, right? If they move the incorporation 478 00:22:38,640 --> 00:22:40,280 Speaker 5: of the company to Texas, that's a 479 00:22:40,200 --> 00:22:43,399 Speaker 4: really complicated avenue where someone will sue to prevent them 480 00:22:43,400 --> 00:22:45,160 Speaker 4: from moving to Texas, and then we'll have a really, 481 00:22:45,200 --> 00:22:47,439 Speaker 4: really fun court case. But I think even without that, 482 00:22:47,440 --> 00:22:51,159 Speaker 4: like, I think the right move strategically and 483 00:22:51,240 --> 00:22:53,920 Speaker 4: legally is for them to pay him more in Delaware, 484 00:22:54,160 --> 00:22:56,960 Speaker 4: even before trying to move to Texas. But, like, Elon 485 00:22:57,040 --> 00:22:58,080 Speaker 4: may not want to hear that advice. 486 00:23:02,240 --> 00:23:03,760 Speaker 2: It sort of feels like to me that these two 487 00:23:03,840 --> 00:23:06,400 Speaker 2: things are connected, these two topics we've been talking about, 488 00:23:06,400 --> 00:23:10,320 Speaker 2: you know, because Elon is threatening essentially to do his 489 00:23:10,480 --> 00:23:14,480 Speaker 2: AI stuff outside of Tesla unless he gets another big 490 00:23:14,520 --> 00:23:18,239 Speaker 2: pay package. At the same time, he's filing this, you know, 491 00:23:18,480 --> 00:23:22,400 Speaker 2: mega lawsuit, super fun, chock full of details and sort 492 00:23:22,440 --> 00:23:25,679 Speaker 2: of hilarity, about how instrumental he was in the creation 493 00:23:25,760 --> 00:23:28,200 Speaker 2: of OpenAI, this super valuable startup. I mean, these 494 00:23:28,200 --> 00:23:30,400 Speaker 2: two things are related, right? Like, this is all 495 00:23:30,480 --> 00:23:33,440 Speaker 2: part of some kind of strategy to get the Tesla 496 00:23:33,520 --> 00:23:36,200 Speaker 2: board to pay him another fifty billion dollars. 497 00:23:36,640 --> 00:23:38,639 Speaker 4: I mean, he doesn't need that much strategy. Like, he 498 00:23:38,680 --> 00:23:40,359 Speaker 4: could go to Tesla and be like, please pay me 499 00:23:40,359 --> 00:23:41,880 Speaker 4: one hundred million dollars, and then it'd be like, here 500 00:23:41,880 --> 00:23:44,679 Speaker 4: you go.
Now there's, like, some legal strategy in 501 00:23:44,680 --> 00:23:47,400 Speaker 4: the background where, like, yes, he needs a credible 502 00:23:47,400 --> 00:23:48,879 Speaker 4: threat, so that when they go back to court, he 503 00:23:48,920 --> 00:23:50,919 Speaker 4: can be like, see, I would have left if they 504 00:23:50,960 --> 00:23:51,879 Speaker 4: didn't pay me that money. 505 00:23:52,119 --> 00:23:55,399 Speaker 2: But do you think it strengthens his case, like, 506 00:23:55,600 --> 00:23:58,600 Speaker 2: on appeal or whatever, that he's an AI pioneer, that 507 00:23:58,640 --> 00:24:01,720 Speaker 2: he's clearly thought about leaving before, you know, like, 508 00:24:01,760 --> 00:24:03,200 Speaker 2: that this is swirling around it? 509 00:24:03,200 --> 00:24:05,960 Speaker 4: A little bit. I think that, like, when you read 510 00:24:06,240 --> 00:24:09,359 Speaker 4: the judge's decision in this case, she's like, you know, 511 00:24:09,400 --> 00:24:11,240 Speaker 4: they said they needed to pay him this much to 512 00:24:11,280 --> 00:24:13,240 Speaker 4: retain him, but in fact, he had said things like, 513 00:24:13,240 --> 00:24:15,320 Speaker 4: I'm a Tesla lifer, I'm never going anywhere, this is 514 00:24:15,359 --> 00:24:18,399 Speaker 4: my legacy, and so they didn't really need to pay 515 00:24:18,480 --> 00:24:20,040 Speaker 4: him this much to retain him. And that was all 516 00:24:20,160 --> 00:24:23,199 Speaker 4: stuff he said in twenty eighteen. Like, now, you know, 517 00:24:23,400 --> 00:24:27,240 Speaker 4: now he's, like, got seven other companies and he's, you know, 518 00:24:27,280 --> 00:24:30,720 Speaker 4: publicly saying how distracted he is and how he's lost 519 00:24:30,720 --> 00:24:32,520 Speaker 4: interest in Tesla. Like, they do need to pay him 520 00:24:32,520 --> 00:24:34,520 Speaker 4: more to retain him. So I think they will have 521 00:24:34,560 --> 00:24:36,760 Speaker 4: a, you know, the next pay package they give him, 522 00:24:36,760 --> 00:24:39,280 Speaker 4: they will have a better case that they need to 523 00:24:39,280 --> 00:24:41,680 Speaker 4: do it to motivate him, which is, like, a double 524 00:24:41,760 --> 00:24:44,560 Speaker 4: edged sword. It sort of suggests 525 00:24:44,560 --> 00:24:47,639 Speaker 4: that he is not fulfilling his fiduciary duties to 526 00:24:47,640 --> 00:24:50,040 Speaker 4: Tesla shareholders in some way. But I think that, like, it 527 00:24:50,080 --> 00:24:51,399 Speaker 4: probably does help him at the margin. 528 00:24:51,560 --> 00:24:53,760 Speaker 5: I also wonder if this is sort of an attempt 529 00:24:53,760 --> 00:24:56,679 Speaker 5: to just try to bring some real change to Tesla's board. 530 00:24:56,720 --> 00:24:59,400 Speaker 5: I mean, it seems like the board is really the problem, right? 531 00:24:59,480 --> 00:25:01,800 Speaker 5: Elon has stacked the board with very close friends. They 532 00:25:01,880 --> 00:25:04,119 Speaker 5: vacation together. I mean, this all came out in testimony. 533 00:25:04,200 --> 00:25:06,760 Speaker 5: So I wonder if, like, part of the judge's role 534 00:25:06,840 --> 00:25:10,240 Speaker 5: here is to try to push for real governance change.
535 00:25:10,280 --> 00:25:14,320 Speaker 5: But then that's hard to imagine too, because I just 536 00:25:14,560 --> 00:25:16,920 Speaker 5: don't really see a lot of big outsiders coming 537 00:25:16,960 --> 00:25:19,800 Speaker 5: in and, like, starting, like, a board fight with Elon. 538 00:25:19,880 --> 00:25:22,640 Speaker 5: I mean, it's Elon's board; it always has been. 539 00:25:22,880 --> 00:25:25,320 Speaker 5: And you need a way to motivate Elon. Like, how 540 00:25:25,359 --> 00:25:27,960 Speaker 5: do you motivate the richest person on the planet to 541 00:25:28,040 --> 00:25:30,719 Speaker 5: stick with the company when he has, like, a billion 542 00:25:30,840 --> 00:25:34,760 Speaker 5: startup ideas in his own head and is 543 00:25:34,800 --> 00:25:38,080 Speaker 5: currently running six companies? And so Tesla has always struggled 544 00:25:38,080 --> 00:25:41,320 Speaker 5: with how to keep him engaged in Tesla, because if 545 00:25:41,320 --> 00:25:44,440 Speaker 5: he walked completely, like, the valuation of that company would 546 00:25:44,440 --> 00:25:46,680 Speaker 5: be very much in question. And so the board has 547 00:25:46,720 --> 00:25:49,520 Speaker 5: always wrangled with, like, oh, how do we keep Elon engaged? 548 00:25:49,640 --> 00:25:52,240 Speaker 5: Like, how do we keep him here when his first 549 00:25:52,240 --> 00:25:55,919 Speaker 5: love is clearly SpaceX? And it's wild, I mean, because 550 00:25:56,000 --> 00:25:59,399 Speaker 5: he is a part time CEO. I mean, there's no 551 00:25:59,480 --> 00:26:03,200 Speaker 5: other CEO who just works part time like this. It's 552 00:26:03,440 --> 00:26:08,480 Speaker 5: pretty amazing how little time Elon spends at Tesla. 553 00:26:08,560 --> 00:26:11,040 Speaker 5: I mean, granted, like, you know, he would argue that 554 00:26:11,080 --> 00:26:13,200 Speaker 5: he sleeps there when he needs to and he's working 555 00:26:13,240 --> 00:26:16,080 Speaker 5: around the clock, but I mean, the man hyper focuses, 556 00:26:16,080 --> 00:26:17,879 Speaker 5: but then he is, like, AWOL, doing his 557 00:26:17,960 --> 00:26:18,520 Speaker 5: other things. 558 00:26:22,840 --> 00:26:26,240 Speaker 2: We're going to go to our ending segment, called Is 559 00:26:26,320 --> 00:26:29,240 Speaker 2: This a Thing?, in which you two have to 560 00:26:29,280 --> 00:26:32,159 Speaker 2: tell me, is this a thing? And the topic of 561 00:26:32,200 --> 00:26:37,080 Speaker 2: this is a tweet from Elon Musk on February twenty eighth: Tonight, 562 00:26:37,200 --> 00:26:40,240 Speaker 2: we radically increased the design goals for the new Tesla Roadster. 563 00:26:40,520 --> 00:26:42,560 Speaker 2: There will never be another car like this, if you 564 00:26:42,600 --> 00:26:46,200 Speaker 2: could even call it a car. Tesla slash SpaceX collab. 565 00:26:46,440 --> 00:26:49,160 Speaker 2: Production design complete and unveil at the end of the year, 566 00:26:49,200 --> 00:26:53,400 Speaker 2: aiming to ship next year. Dana, is this real? 567 00:26:53,480 --> 00:26:55,240 Speaker 5: Yes, I think it's real. Like, are they gonna stick to 568 00:26:55,240 --> 00:26:55,919 Speaker 5: the timeline? 569 00:26:56,119 --> 00:26:56,320 Speaker 4: No. 570 00:26:56,520 --> 00:26:59,199 Speaker 5: But the backdrop of this is that, like, BYD is 571 00:26:59,240 --> 00:27:02,360 Speaker 5: now Tesla's biggest competitor. They're the Chinese company that makes 572 00:27:02,400 --> 00:27:05,160 Speaker 5: a lot of cars.
They're like the leader in EV sales, 573 00:27:05,200 --> 00:27:07,280 Speaker 5: and they also have some kind of, like, new supercar. 574 00:27:08,200 --> 00:27:11,119 Speaker 5: Let me remind everyone: Tesla first said that they were 575 00:27:11,160 --> 00:27:14,720 Speaker 5: working on a next generation Roadster in twenty seventeen. 576 00:27:14,640 --> 00:27:18,159 Speaker 6: We started Tesla with a sports car called the Tesla Roadster. It was the 577 00:27:18,160 --> 00:27:20,400 Speaker 6: foundation of the whole company, was the Tesla Roadster. People 578 00:27:20,400 --> 00:27:22,000 Speaker 6: have asked us for a long time, when are you 579 00:27:22,080 --> 00:27:25,120 Speaker 6: going to make a new Roadster? We are making it now. 580 00:27:26,400 --> 00:27:28,119 Speaker 5: I was there at that event. That was back in 581 00:27:28,160 --> 00:27:31,440 Speaker 5: the day when, like, Tesla invited, you know, mainstream financial 582 00:27:31,480 --> 00:27:34,440 Speaker 5: journalists to their product launches, and it was this awesome event. 583 00:27:34,560 --> 00:27:36,919 Speaker 5: I mean, Tesla pulled out all the stops. It was 584 00:27:36,960 --> 00:27:39,639 Speaker 5: in LA. Franz von Holzhausen, the lead designer for Tesla, 585 00:27:39,800 --> 00:27:42,120 Speaker 5: was actually, like, in the Roadster as it came 586 00:27:42,160 --> 00:27:44,480 Speaker 5: out of the back of the Semi. But what's weird 587 00:27:44,520 --> 00:27:47,040 Speaker 5: to me is that, like, Tesla needs to go down 588 00:27:47,040 --> 00:27:50,160 Speaker 5: market and make this next generation, cheaper, twenty five thousand 589 00:27:50,200 --> 00:27:52,960 Speaker 5: dollar car in order to remain competitive. So I'm not 590 00:27:53,040 --> 00:27:55,600 Speaker 5: really sure why they're, like, now talking about this high 591 00:27:55,720 --> 00:27:58,240 Speaker 5: end niche product when, like, the future of the company 592 00:27:58,320 --> 00:28:01,280 Speaker 5: is really about the down market, cheaper product. But I 593 00:28:01,320 --> 00:28:03,560 Speaker 5: think he's just trying to remind everyone, oh yeah, that 594 00:28:03,680 --> 00:28:06,440 Speaker 5: this, like, product that we promised years ago is still happening. 595 00:28:06,760 --> 00:28:08,719 Speaker 5: And I'm sure that when he tweeted that, that 596 00:28:08,760 --> 00:28:11,399 Speaker 5: was news to probably all the engineers, like, 597 00:28:11,480 --> 00:28:14,240 Speaker 5: oh, now this is back front and center again. Matt, 598 00:28:14,280 --> 00:28:16,480 Speaker 2: what's your take on this? Is this a thing? Are 599 00:28:16,560 --> 00:28:19,240 Speaker 2: you going to put down your deposit for the new, 600 00:28:19,400 --> 00:28:20,480 Speaker 2: new Tesla Roadster? 601 00:28:20,840 --> 00:28:22,879 Speaker 4: I'm the wrong person to ask that. I will say that, 602 00:28:22,920 --> 00:28:26,000 Speaker 4: like, after he tweeted about radically increasing the design goals, 603 00:28:26,000 --> 00:28:28,159 Speaker 4: which I don't know what that means, it's a funny phrase, 604 00:28:28,520 --> 00:28:31,639 Speaker 4: he also did a callback to his twenty eighteen tweet where 605 00:28:31,800 --> 00:28:35,160 Speaker 4: he said, SpaceX option package for new Tesla Roadster will 606 00:28:35,200 --> 00:28:39,160 Speaker 4: include ten small rocket thrusters arranged seamlessly around car. Maybe 607 00:28:39,160 --> 00:28:42,200 Speaker 4: they will even allow a Tesla to fly.
So, you know, 608 00:28:42,680 --> 00:28:44,720 Speaker 4: it's always, like, a little bit of not a thing. 609 00:28:44,800 --> 00:28:46,600 Speaker 4: But, you know, I believe that on that, like, there 610 00:28:46,640 --> 00:28:48,600 Speaker 4: will, you know, there'll be a Roadster and stuff like that. 611 00:28:48,720 --> 00:28:51,440 Speaker 2: So do you think it's, like, eleven rocket boosters with 612 00:28:51,520 --> 00:28:54,520 Speaker 2: the radically increased design goals? Maybe that's what 613 00:28:54,520 --> 00:28:55,720 Speaker 2: he means. He 614 00:28:55,640 --> 00:29:01,920 Speaker 4: did retweet, or re-Grok, whatever, he did call that back. 615 00:29:02,080 --> 00:29:04,000 Speaker 4: Sarah he wants. 616 00:29:04,360 --> 00:29:06,800 Speaker 5: He also had some joke about, like, oh, like, we 617 00:29:06,800 --> 00:29:09,120 Speaker 5: were promised flying cars and all we got was two 618 00:29:09,160 --> 00:29:12,120 Speaker 5: hundred and eighty characters, and now he's basically kind of promising, 619 00:29:12,160 --> 00:29:14,040 Speaker 5: wink wink, that there will be flying cars. 620 00:29:14,600 --> 00:29:17,040 Speaker 2: All right. On that note, let's end here. Thank you 621 00:29:17,080 --> 00:29:20,160 Speaker 2: for listening to Elon, Inc., and thanks to Dana and Matt. 622 00:29:20,840 --> 00:29:25,360 Speaker 2: Thank you, thank you, thanks guys. Before we go, a 623 00:29:25,400 --> 00:29:28,680 Speaker 2: programming note: Elon Inc. is going to South by Southwest 624 00:29:28,880 --> 00:29:31,520 Speaker 2: next week in Austin, Texas. We're going to be doing 625 00:29:31,560 --> 00:29:34,920 Speaker 2: a live taping on Tuesday, March twelfth at ten a.m. 626 00:29:35,040 --> 00:29:37,560 Speaker 2: If you're in Austin at South by Southwest, please come 627 00:29:37,680 --> 00:29:39,840 Speaker 2: say hi to us. We'll be talking to Sewell Chan, 628 00:29:40,160 --> 00:29:43,680 Speaker 2: the editor in chief of the Texas Tribune, and Rachel Monroe, 629 00:29:43,800 --> 00:29:46,880 Speaker 2: the New Yorker's Texas correspondent. We're going to be talking 630 00:29:46,880 --> 00:29:50,400 Speaker 2: about how much, or how little, Elon has impacted his 631 00:29:50,440 --> 00:29:53,080 Speaker 2: home state, and perhaps how much his home state has 632 00:29:53,120 --> 00:30:03,800 Speaker 2: impacted Elon. This episode was produced by Stacy Wong. Naomi 633 00:30:03,800 --> 00:30:07,239 Speaker 2: Shavin and Rayhan Harmanci are senior editors. The idea for 634 00:30:07,320 --> 00:30:11,000 Speaker 2: this very show came from Rayhan. Blake Maples handles engineering, 635 00:30:11,040 --> 00:30:14,360 Speaker 2: and we get special editing assistance from Jeff Grocott. Our 636 00:30:14,360 --> 00:30:18,280 Speaker 2: supervising producer is Magnus Hendrickson. Huge thanks to Joel Weber, 637 00:30:18,400 --> 00:30:22,160 Speaker 2: Sean Wen and Angel Recchio. The Elon Inc. theme is 638 00:30:22,200 --> 00:30:27,040 Speaker 2: written and performed by Taka Yasuzawa and Alex Suguiera. Sage 639 00:30:27,040 --> 00:30:30,520 Speaker 2: Bauman is the head of Bloomberg Podcasts, and Brendan Newnham 640 00:30:30,880 --> 00:30:33,560 Speaker 2: is our executive producer. I'm Max Chafkin. If you have 641 00:30:33,600 --> 00:30:36,200 Speaker 2: a minute, rate and review our show. It'll help others 642 00:30:36,400 --> 00:30:43,640 Speaker 2: find us and we will appreciate it. See you next week.