1 00:00:00,240 --> 00:00:02,400 Speaker 1: I know your commitment to free speech. I respect that 2 00:00:02,480 --> 00:00:04,920 Speaker 1: because I think it's integral, it's the foundational thing 3 00:00:04,960 --> 00:00:09,160 Speaker 1: of democracies really. But I also know your opposition to 4 00:00:09,160 --> 00:00:12,200 Speaker 1: anti-Semitism. You've spoken about it and tweeted about it. 5 00:00:12,760 --> 00:00:15,800 Speaker 2: That was Israeli Prime Minister Bibi Netanyahu talking 6 00:00:15,840 --> 00:00:18,600 Speaker 2: to Elon Musk at a Tesla factory two months ago. 7 00:00:19,320 --> 00:00:21,960 Speaker 2: A lot has happened since then, and much of it 8 00:00:21,960 --> 00:00:25,799 Speaker 2: has been fairly disastrous for Elon's social media platform of 9 00:00:25,920 --> 00:00:29,840 Speaker 2: choice, X. The fallout continues for Elon Musk after the 10 00:00:29,840 --> 00:00:31,800 Speaker 2: billionaire endorsed an anti-Semitic 11 00:00:31,480 --> 00:00:35,800 Speaker 3: post. Lucrative advertisers like Disney, Apple, and IBM have pulled 12 00:00:35,840 --> 00:00:38,159 Speaker 3: their ads from the social media site X. 13 00:00:38,280 --> 00:00:40,960 Speaker 4: Yesterday, Musk denied claims he's anti-Semitic. 14 00:00:41,000 --> 00:00:44,400 Speaker 2: He also threatened a thermonuclear lawsuit against a media 15 00:00:44,479 --> 00:00:47,559 Speaker 2: watchdog company. 16 00:00:47,600 --> 00:00:50,159 Speaker 4: Elon Musk is now the richest person on the planet. 17 00:00:50,640 --> 00:00:53,479 Speaker 2: More than half the satellites in space are owned and 18 00:00:53,560 --> 00:00:55,720 Speaker 2: controlled by one man. 19 00:00:55,880 --> 00:00:58,600 Speaker 3: Starting his own artificial intelligence company. 20 00:00:58,840 --> 00:01:02,360 Speaker 2: Well, he's a legitimate super genius, and legitimate.
He says 21 00:01:02,360 --> 00:01:04,959 Speaker 2: he's always voted for Democrats, but this year it will 22 00:01:05,000 --> 00:01:05,440 Speaker 2: be different. 23 00:01:05,480 --> 00:01:06,440 Speaker 5: He'll vote Republican. 24 00:01:06,720 --> 00:01:09,200 Speaker 3: There is a reason the US government is so reliant 25 00:01:09,240 --> 00:01:09,600 Speaker 3: on him. 26 00:01:09,800 --> 00:01:13,200 Speaker 6: Elon Musk is a scam artist and he's done nothing. 27 00:01:14,000 --> 00:01:16,760 Speaker 1: Anything he does, he's fascinating people. 28 00:01:26,520 --> 00:01:29,360 Speaker 2: Welcome to Elon, Inc., where we discuss Elon Musk's vast 29 00:01:29,360 --> 00:01:32,600 Speaker 2: corporate empire, his latest gambits and antics, and how to 30 00:01:32,640 --> 00:01:36,039 Speaker 2: make sense of it all. I'm your host, David Papadopoulos. 31 00:01:37,680 --> 00:01:41,600 Speaker 2: Last Thursday night, we released an emergency podcast after Musk 32 00:01:41,640 --> 00:01:45,800 Speaker 2: went further in amplifying and agreeing with anti-Semitic ideas 33 00:01:45,840 --> 00:01:49,280 Speaker 2: than he ever had before. The fallout from those posts 34 00:01:49,320 --> 00:01:53,720 Speaker 2: is ongoing. Musk has rejected accusations of bigotry, but advertisers 35 00:01:53,720 --> 00:01:57,280 Speaker 2: are still fleeing the site, and his CEO, Linda Yaccarino, 36 00:01:57,520 --> 00:02:00,840 Speaker 2: is feeling the heat. Here with me today to discuss 37 00:02:00,880 --> 00:02:04,200 Speaker 2: the latest in all things X and another wild development 38 00:02:04,200 --> 00:02:07,880 Speaker 2: in the tech world, the power struggle at OpenAI, are 39 00:02:07,920 --> 00:02:12,880 Speaker 2: our regular Muskologists: Sarah Frier, who oversees our coverage of 40 00:02:12,919 --> 00:02:17,840 Speaker 2: Silicon Valley's biggest companies.
Hello. Max Chafkin, senior reporter at 41 00:02:17,840 --> 00:02:23,919 Speaker 2: Bloomberg Businessweek. Hey, hey Max. And Dana Hull, who covers Tesla 42 00:02:23,960 --> 00:02:26,720 Speaker 2: and has also been writing about the reaction to Elon's 43 00:02:26,760 --> 00:02:27,880 Speaker 2: controversial comments. 44 00:02:28,240 --> 00:02:29,040 Speaker 5: Always a pleasure. 45 00:02:29,800 --> 00:02:33,080 Speaker 2: Okay. So, Sarah, the past few days have been action-packed. 46 00:02:33,520 --> 00:02:36,560 Speaker 2: Give us a bit of a refresher on what Elon 47 00:02:36,720 --> 00:02:40,120 Speaker 2: said last week that people found so objectionable. 48 00:02:42,200 --> 00:02:47,079 Speaker 6: Well, he was responding to a tweet that espoused 49 00:02:47,280 --> 00:02:53,600 Speaker 6: a very racist theory about Jewish people that was actually 50 00:02:53,639 --> 00:02:55,679 Speaker 6: tied to, I'm not going to repeat it again. You 51 00:02:55,720 --> 00:02:58,440 Speaker 6: can listen to our emergency episode. But he said that 52 00:02:58,600 --> 00:03:02,119 Speaker 6: is the actual truth, and then it caused this 53 00:03:02,480 --> 00:03:08,040 Speaker 6: backlash and real shock from all of his supporters 54 00:03:08,080 --> 00:03:12,919 Speaker 6: and also his advertisers. Even the White House put out 55 00:03:12,919 --> 00:03:17,280 Speaker 6: a statement saying that this wasn't something that they were 56 00:03:17,320 --> 00:03:22,320 Speaker 6: aligned with. And furthermore, Linda Yaccarino, who is the CEO 57 00:03:23,080 --> 00:03:25,960 Speaker 6: of X and who is supposed to be in this 58 00:03:26,080 --> 00:03:30,720 Speaker 6: job to bring back advertisers to the platform, basically had 59 00:03:30,760 --> 00:03:32,920 Speaker 6: to ignore it, had to ignore what he said and 60 00:03:33,000 --> 00:03:37,200 Speaker 6: change the subject to retaliation against a research group.
So 61 00:03:37,240 --> 00:03:38,960 Speaker 6: that's where we are today. 62 00:03:39,000 --> 00:03:41,760 Speaker 2: All right, Max. So in the last few days now, what 63 00:03:41,960 --> 00:03:45,400 Speaker 2: has Elon's response been? Well, so, the 64 00:03:45,080 --> 00:03:48,760 Speaker 3: most interesting thing about this to me is that Elon 65 00:03:48,880 --> 00:03:52,560 Speaker 3: Musk, typically in these situations, somebody says a racist thing, 66 00:03:53,120 --> 00:03:56,400 Speaker 3: they apologize. And Elon Musk has not apologized, which is 67 00:03:56,520 --> 00:03:59,520 Speaker 3: very much in character for Elon Musk. He's very much 68 00:03:59,560 --> 00:04:02,520 Speaker 3: a, you know, never-back-down kind of fellow. What 69 00:04:02,640 --> 00:04:07,280 Speaker 3: he's done is, as Sarah said, denied the sort of 70 00:04:07,480 --> 00:04:10,800 Speaker 3: central allegation. He's denied that he's anti-Semitic, and he's 71 00:04:10,800 --> 00:04:13,240 Speaker 3: had some sort of help from some of his buddies. 72 00:04:13,480 --> 00:04:15,240 Speaker 3: But what he's largely tried to do is sort of 73 00:04:15,320 --> 00:04:18,839 Speaker 3: change the subject. He's attacked Media Matters. 74 00:04:18,400 --> 00:04:20,200 Speaker 2: And what has Media Matters done here? 75 00:04:20,240 --> 00:04:24,719 Speaker 3: Media Matters, before this tweet, had put out a report 76 00:04:25,120 --> 00:04:29,760 Speaker 3: alleging that X, the platform formerly known as Twitter, was 77 00:04:29,800 --> 00:04:34,120 Speaker 3: showing ads next to anti-Semitic content, and then Elon 78 00:04:34,240 --> 00:04:39,240 Speaker 3: Musk tweets this thing, which really, I would say, reinforces 79 00:04:39,320 --> 00:04:42,400 Speaker 3: the problem that Media Matters is calling out, because it's 80 00:04:42,440 --> 00:04:44,640 Speaker 3: not just, like, Media Matters saying there were some 81 00:04:44,680 --> 00:04:47,160 Speaker 3: anti-Semites on this platform.
The call is coming from 82 00:04:47,160 --> 00:04:50,279 Speaker 3: inside the house. So that totally reinforces the point of 83 00:04:50,320 --> 00:04:52,600 Speaker 3: Media Matters' report, and I think is part of 84 00:04:52,640 --> 00:04:57,560 Speaker 3: what kind of kicked off the cascade of, essentially, advertiser boycott. 85 00:04:57,640 --> 00:04:59,680 Speaker 2: Just before we go through the Media 86 00:04:59,720 --> 00:05:02,479 Speaker 2: Matters lawsuit, Dan, give us the exact lineup and the exact 87 00:05:02,560 --> 00:05:06,960 Speaker 2: rundown: who has broken now and called off advertising with 88 00:05:07,320 --> 00:05:08,400 Speaker 2: X? Well, I just want 89 00:05:08,240 --> 00:05:10,239 Speaker 5: to back up for one moment and make the point 90 00:05:10,279 --> 00:05:13,000 Speaker 5: that Musk is blaming this on the media, that, you know, 91 00:05:13,600 --> 00:05:17,520 Speaker 5: Media Matters, this research group, and journalists like us have 92 00:05:17,839 --> 00:05:19,760 Speaker 5: made this into a maelstrom. But I want to be 93 00:05:19,880 --> 00:05:22,920 Speaker 5: very clear that pretty quickly after Musk did this tweet, 94 00:05:23,200 --> 00:05:26,040 Speaker 5: the American Jewish Committee came out with a very strong 95 00:05:26,160 --> 00:05:30,119 Speaker 5: statement that said Elon Musk's agreement with the user 96 00:05:30,160 --> 00:05:33,360 Speaker 5: promoting elements of the Great Replacement theory isn't the truth. 97 00:05:33,760 --> 00:05:37,080 Speaker 5: It is the deadliest anti-Semitic conspiracy theory in modern 98 00:05:37,200 --> 00:05:41,880 Speaker 5: US history and motivated the Pittsburgh synagogue shooting.
Like he 99 00:05:42,040 --> 00:05:45,960 Speaker 5: was called out by pretty significant organizations, then the media 100 00:05:46,000 --> 00:05:49,880 Speaker 5: wrote about it, then the advertiser boycott started happening, and 101 00:05:49,920 --> 00:05:51,840 Speaker 5: so that's where we are today. But I just think 102 00:05:51,839 --> 00:05:54,880 Speaker 5: it's disingenuous to sort of blame this on reporters when 103 00:05:54,880 --> 00:05:58,400 Speaker 5: you have some pretty large and significant organizations calling him 104 00:05:58,400 --> 00:05:59,799 Speaker 5: out for the tweet in the first place. 105 00:06:00,200 --> 00:06:02,599 Speaker 3: One other thing, which is that one way he 106 00:06:02,640 --> 00:06:05,480 Speaker 3: tried to cover up the damage, or to do a 107 00:06:05,480 --> 00:06:08,480 Speaker 3: little bit of damage control, was by sort of redirecting 108 00:06:08,520 --> 00:06:12,679 Speaker 3: this to the Israeli-Palestinian crisis. So he's essentially trying 109 00:06:12,720 --> 00:06:14,920 Speaker 3: to say I could not be an anti-Semite because 110 00:06:14,960 --> 00:06:19,000 Speaker 3: I'm incredibly pro-Israel, which, just to say, 111 00:06:19,080 --> 00:06:23,039 Speaker 3: those two things are not necessarily 112 00:06:23,080 --> 00:06:23,839 Speaker 3: mutually exclusive. 113 00:06:23,839 --> 00:06:26,560 Speaker 6: And also we haven't seen him enforce that.
And also 114 00:06:27,520 --> 00:06:31,000 Speaker 6: it was this moment where you saw a lot of 115 00:06:31,080 --> 00:06:36,400 Speaker 6: Jewish leaders who normally would probably have criticized him say hey, 116 00:06:36,480 --> 00:06:38,640 Speaker 6: that's great that you're doing that, and they sort of 117 00:06:38,680 --> 00:06:44,040 Speaker 6: seem to prioritize Elon Musk's power and influence over the conversation, 118 00:06:44,800 --> 00:06:49,560 Speaker 6: over any personal anti-Semitism or racism he may be espousing. 119 00:06:50,040 --> 00:06:54,839 Speaker 6: And that just goes to show how much control this 120 00:06:54,960 --> 00:06:57,960 Speaker 6: man has and how it's very difficult for people to 121 00:06:58,320 --> 00:06:59,720 Speaker 6: make an enemy of him. 122 00:06:59,800 --> 00:07:01,040 Speaker 3: For all. 123 00:07:01,120 --> 00:07:03,680 Speaker 2: Right. But Dan, can you give me that rundown of 124 00:07:03,720 --> 00:07:07,200 Speaker 2: companies that have pulled advertising, or paused? 125 00:07:07,360 --> 00:07:10,280 Speaker 5: This is what I believe they're calling it. Sarah, back 126 00:07:10,280 --> 00:07:14,920 Speaker 5: me up here. I believe it's Apple, Disney, IBM are 127 00:07:14,920 --> 00:07:15,440 Speaker 5: the big three. 128 00:07:16,000 --> 00:07:18,880 Speaker 6: Those are the big ones. There are a few banks, 129 00:07:18,920 --> 00:07:24,960 Speaker 6: a few communications companies, a lot of companies that already 130 00:07:25,040 --> 00:07:28,200 Speaker 6: had a pause in place that are just continuing that pause. 131 00:07:28,920 --> 00:07:33,720 Speaker 6: The sixty percent drop in advertising revenue that Musk has 132 00:07:33,800 --> 00:07:40,240 Speaker 6: talked about, it doesn't seem to be reversing. 133 00:07:40,280 --> 00:07:43,200 Speaker 2: All right. So now let's get into this.
You know, Musk described 134 00:07:43,200 --> 00:07:46,920 Speaker 2: that, I believe, as a thermonuclear lawsuit against Media Matters. 135 00:07:47,000 --> 00:07:48,760 Speaker 2: And that lawsuit has dropped, Max. 136 00:07:48,800 --> 00:07:54,520 Speaker 3: And it says what? The lawsuit says that Media Matters' 137 00:07:54,560 --> 00:07:56,040 Speaker 3: report was contrived. 138 00:07:56,520 --> 00:07:58,920 Speaker 2: It doesn't stipulate? 139 00:07:59,240 --> 00:08:03,400 Speaker 3: Yeah. It doesn't say that Media Matters made up these adjacencies. 140 00:08:03,720 --> 00:08:07,960 Speaker 3: It says that Media Matters refreshed their Twitter feeds 141 00:08:07,640 --> 00:08:08,440 Speaker 2: so many times. 142 00:08:08,480 --> 00:08:11,160 Speaker 3: They, they cherry-picked. Yeah. And I'm 143 00:08:11,240 --> 00:08:13,880 Speaker 3: not sure. Look, I'm not a legal scholar. I don't 144 00:08:13,920 --> 00:08:17,080 Speaker 3: know if this is gonna hold up in court or not. 145 00:08:17,560 --> 00:08:19,920 Speaker 3: But I would say that the goal here, I don't 146 00:08:19,960 --> 00:08:24,120 Speaker 3: think, is to extract a settlement from Media Matters. It's 147 00:08:24,240 --> 00:08:29,440 Speaker 3: to basically respond to this general swirl of allegations in 148 00:08:29,480 --> 00:08:33,480 Speaker 3: the most aggressive way possible. And also, you know, I think, unfortunately, 149 00:08:33,760 --> 00:08:35,880 Speaker 3: to kind of create a chilling effect, to make it 150 00:08:35,920 --> 00:08:38,640 Speaker 3: more difficult for nonprofits and, yes, journalists. 151 00:08:38,720 --> 00:08:41,120 Speaker 6: By the way, the Center for Countering Digital Hate was 152 00:08:41,160 --> 00:08:45,400 Speaker 6: already facing legal complaints from X earlier this year. This 153 00:08:45,440 --> 00:08:47,480 Speaker 6: is a pattern.
He said he would go after the 154 00:08:47,520 --> 00:08:50,679 Speaker 6: ADL, the Anti-Defamation League, as well. They ended 155 00:08:50,760 --> 00:08:54,160 Speaker 6: up having somewhat of a truce. And this is 156 00:08:54,320 --> 00:08:59,679 Speaker 6: just a company that has made their API, which researchers 157 00:08:59,760 --> 00:09:03,400 Speaker 6: used, that's an application programming interface. But what it 158 00:09:03,440 --> 00:09:06,880 Speaker 6: means is that the way that researchers used to look 159 00:09:07,440 --> 00:09:10,160 Speaker 6: at the content on X, to figure out if there 160 00:09:10,200 --> 00:09:13,280 Speaker 6: was misinformation spreading ahead of an election, to figure out 161 00:09:13,280 --> 00:09:16,439 Speaker 6: what was going viral, what was happening in global conversation, 162 00:09:17,080 --> 00:09:19,640 Speaker 6: now that costs at a minimum forty-two thousand dollars 163 00:09:19,720 --> 00:09:23,480 Speaker 6: a month, which is incredibly unaffordable for any nonprofit to 164 00:09:23,640 --> 00:09:26,560 Speaker 6: even hold X accountable in the way that they used to. 165 00:09:26,760 --> 00:09:29,960 Speaker 6: So all of this to say, X would prefer that 166 00:09:30,000 --> 00:09:33,640 Speaker 6: the only data that we believe about that company comes 167 00:09:33,640 --> 00:09:34,640 Speaker 6: from the company itself. 168 00:09:34,880 --> 00:09:38,200 Speaker 2: So do we believe that this attempt to essentially silence them, 169 00:09:38,440 --> 00:09:41,840 Speaker 2: or have something of a chilling effect on the media, 170 00:09:42,280 --> 00:09:43,200 Speaker 2: will be successful? 171 00:09:44,080 --> 00:09:44,320 Speaker 4: Max. 172 00:09:44,880 --> 00:09:48,880 Speaker 3: I think the way that media chilling effects work, it's 173 00:09:48,920 --> 00:09:50,400 Speaker 3: not like a one-to-one thing.
No, I don't 174 00:09:50,440 --> 00:09:53,160 Speaker 3: think it's going to be successful, because in certain ways 175 00:09:53,160 --> 00:09:57,040 Speaker 3: the lawsuit actually confirms certain aspects of the Media Matters report. 176 00:09:57,040 --> 00:09:57,680 Speaker 2: I mean, he doesn't. 177 00:09:57,840 --> 00:10:00,559 Speaker 3: The lawsuit doesn't dispute that these were real screenshots. 178 00:10:00,559 --> 00:10:03,400 Speaker 3: It doesn't dispute that there were, you know, anti-Semites 179 00:10:03,400 --> 00:10:07,880 Speaker 3: on this platform. It says that Media Matters was overstating 180 00:10:07,880 --> 00:10:10,840 Speaker 3: the likelihood that you would see an ad 181 00:10:10,880 --> 00:10:12,880 Speaker 3: next to that anti-Semitic content. And I don't think it's going to 182 00:10:12,920 --> 00:10:15,400 Speaker 3: have an immediate chilling effect. But when you see, like, 183 00:10:15,400 --> 00:10:18,640 Speaker 3: a cascade of things like this, you know, it 184 00:10:19,120 --> 00:10:23,200 Speaker 3: doesn't make people more anxious to call out discrimination 185 00:10:23,440 --> 00:10:25,839 Speaker 3: and so on. And honestly, I think it's kind 186 00:10:25,840 --> 00:10:28,960 Speaker 3: of a clever way for him to respond to these 187 00:10:28,960 --> 00:10:33,040 Speaker 3: allegations without apologizing for them, because he almost can't apologize 188 00:10:33,040 --> 00:10:35,720 Speaker 3: for them, because this is sort of important to his, 189 00:10:35,880 --> 00:10:38,000 Speaker 3: like, right-wing persona. A lot of his fans 190 00:10:38,000 --> 00:10:42,000 Speaker 3: would be really disappointed if he said, no, I totally misspoke, 191 00:10:42,040 --> 00:10:44,120 Speaker 3: I need a moment of reflection. He doesn't want to 192 00:10:44,160 --> 00:10:46,199 Speaker 3: do that, so this is a way to kind of zig.
193 00:10:46,320 --> 00:10:48,840 Speaker 5: What strikes me is that it's kind of like with politics, 194 00:10:48,880 --> 00:10:51,280 Speaker 5: you know how, like, the politician does something bad and 195 00:10:51,320 --> 00:10:53,960 Speaker 5: then the surrogates rally to the defense. You're seeing that 196 00:10:54,000 --> 00:10:57,440 Speaker 5: play out now. Like, so Elon creates this horrific news 197 00:10:57,440 --> 00:11:01,600 Speaker 5: cycle last week, never apologizes, the board doesn't apologize, and 198 00:11:01,640 --> 00:11:04,040 Speaker 5: believe me, like, they had ample opportunity to. No one 199 00:11:04,080 --> 00:11:06,280 Speaker 5: puts out a statement. But then you see, like, Bill 200 00:11:06,280 --> 00:11:08,680 Speaker 5: Ackman ride to the rescue over the weekend, and then 201 00:11:08,800 --> 00:11:12,600 Speaker 5: yesterday Sam Teller, Elon's former chief of staff, puts 202 00:11:12,640 --> 00:11:15,600 Speaker 5: out a tweet: I spent nearly five years as Elon's 203 00:11:15,679 --> 00:11:17,920 Speaker 5: chief of staff, and in that time I never observed a 204 00:11:17,960 --> 00:11:20,800 Speaker 5: trace of anti-Semitism. He goes on, and then, like, 205 00:11:21,080 --> 00:11:24,640 Speaker 5: you know, Elon responds, thanks Sam, with a heart emoji. 206 00:11:24,679 --> 00:11:27,720 Speaker 6: This is all a distraction from what's actually happening at 207 00:11:27,720 --> 00:11:30,800 Speaker 6: this company, which is that they've really reduced their content 208 00:11:30,880 --> 00:11:34,640 Speaker 6: moderation rules, because Elon Musk says he wants anyone to 209 00:11:35,080 --> 00:11:38,000 Speaker 6: speak freely, and in order to fix it, they're relying 210 00:11:38,040 --> 00:11:40,600 Speaker 6: on this Community Notes 211 00:11:40,360 --> 00:11:45,000 Speaker 2: tool, which is citizen journalists, basically thousands
212 00:11:44,679 --> 00:11:48,280 Speaker 6: of people who just rate tweets and explain why they're 213 00:11:48,320 --> 00:11:52,400 Speaker 6: wrong or add context where needed. And Bloomberg actually has 214 00:11:52,440 --> 00:11:58,080 Speaker 6: an investigation out today showing that this system is incredibly slow. 215 00:11:58,880 --> 00:12:02,760 Speaker 6: It can take hours for viral misinformation to get a 216 00:12:02,840 --> 00:12:07,040 Speaker 6: label put on it, because it's this voting process that 217 00:12:07,080 --> 00:12:11,000 Speaker 6: takes place. And then it's also really inconsistent, and it 218 00:12:11,080 --> 00:12:15,440 Speaker 6: allows for repeat offenders who would have been banned under 219 00:12:15,480 --> 00:12:21,360 Speaker 6: the prior Twitter regime to continue to post at their pleasure. 220 00:12:21,920 --> 00:12:25,960 Speaker 2: Okay, Sarah. So earlier on you reminded us that X's revenue, 221 00:12:26,160 --> 00:12:29,880 Speaker 2: its advertising revenue, is down roughly sixty percent since Elon 222 00:12:29,960 --> 00:12:34,400 Speaker 2: took over the company. We've had additional advertisers pull out 223 00:12:34,480 --> 00:12:37,360 Speaker 2: now. You know, how much damage has been done, and 224 00:12:37,400 --> 00:12:40,600 Speaker 2: how existential is this? I mean, listen, Elon himself in 225 00:12:40,640 --> 00:12:43,880 Speaker 2: the recent past has thrown out the word bankruptcy. Those 226 00:12:43,920 --> 00:12:48,960 Speaker 2: are his words. And how does Linda Yaccarino turn this around? 227 00:12:50,160 --> 00:12:53,200 Speaker 6: Well, a lot of damage has been done to the 228 00:12:53,240 --> 00:12:58,720 Speaker 6: revenue model.
That said, they think that they can eventually 229 00:12:58,760 --> 00:13:02,720 Speaker 6: replace it with the subscription business of paying for X 230 00:13:02,800 --> 00:13:09,440 Speaker 6: Premium. That is still not reaching even a fraction of 231 00:13:09,480 --> 00:13:11,160 Speaker 6: the users that they would need for it to reach 232 00:13:11,320 --> 00:13:14,760 Speaker 6: to replace the revenue that they've lost from advertisers. 233 00:13:15,240 --> 00:13:18,200 Speaker 6: Linda Yaccarino. Oh my goodness, I don't know where 234 00:13:18,240 --> 00:13:23,319 Speaker 6: I start. I thought that she would last a few weeks, 235 00:13:23,320 --> 00:13:24,960 Speaker 6: maybe, before she realized. 236 00:13:24,640 --> 00:13:27,200 Speaker 2: Oh, wrong, Sarah. I was so wrong. 237 00:13:27,559 --> 00:13:29,200 Speaker 6: She's in it to win it. She's in it for 238 00:13:29,240 --> 00:13:33,160 Speaker 6: the long haul. I am, I'm actually, you know, impressed 239 00:13:33,360 --> 00:13:37,320 Speaker 6: with her ability to withstand, you know, all the pressures 240 00:13:37,360 --> 00:13:40,600 Speaker 6: of this job and still survive in the role. Her 241 00:13:40,640 --> 00:13:42,680 Speaker 6: main goal seems to be to not get fired. 242 00:13:43,200 --> 00:13:48,720 Speaker 6: She is incredibly devoted to espousing whatever Musk wants her to. 243 00:13:49,160 --> 00:13:52,960 Speaker 6: So she was also part of this weekend's spin of 244 00:13:53,080 --> 00:13:56,920 Speaker 6: turning this anti-Semitic tweet backlash into a backlash over 245 00:13:57,000 --> 00:14:00,000 Speaker 6: Media Matters and bogus reports from journalists. 246 00:14:00,400 --> 00:14:03,160 Speaker 3: What I love about this, though, is honestly she is trying. 247 00:14:03,200 --> 00:14:05,240 Speaker 3: Her plan, as far as I can tell, which Sarah 248 00:14:05,320 --> 00:14:08,400 Speaker 3: was alluding to, is that they're going to just tack right.
249 00:14:08,559 --> 00:14:12,040 Speaker 3: They see the twenty twenty-four election is coming. You've got 250 00:14:12,040 --> 00:14:14,720 Speaker 3: a nice-sized conservative audience on X right now. I 251 00:14:14,720 --> 00:14:16,040 Speaker 3: mean, you know, we could have a 252 00:14:16,040 --> 00:14:19,800 Speaker 3: debate about how big the audience is overall and what 253 00:14:19,880 --> 00:14:22,280 Speaker 3: engagement is like, but it's definitely clear that you've got 254 00:14:22,280 --> 00:14:23,760 Speaker 3: a lot of right-wingers spending a lot of 255 00:14:23,800 --> 00:14:26,640 Speaker 3: time there, and so her plan seems to be to 256 00:14:27,120 --> 00:14:30,520 Speaker 3: use this and to sell advertisements to, you know, 257 00:14:30,560 --> 00:14:35,240 Speaker 3: political action committees and, I guess, candidates and nonprofits. Again, 258 00:14:35,480 --> 00:14:38,640 Speaker 3: this is not a huge market, and Apple and these 259 00:14:38,720 --> 00:14:42,000 Speaker 3: advertisers that are pulling out are huge advertisers, and it 260 00:14:42,080 --> 00:14:45,880 Speaker 3: seems exceedingly unlikely that you would actually 261 00:14:45,920 --> 00:14:47,880 Speaker 3: be able to make up the difference. On the other hand, 262 00:14:48,040 --> 00:14:50,760 Speaker 3: maybe they don't need to, because it's a much smaller company. 263 00:14:50,760 --> 00:14:52,360 Speaker 3: They've laid off, you know, a huge amount of the 264 00:14:52,360 --> 00:14:54,600 Speaker 3: staff, costs are lower, and, you know, you have this 265 00:14:54,680 --> 00:14:58,360 Speaker 3: trickle of revenue from subscribers. I'm just trying 266 00:14:58,360 --> 00:15:01,120 Speaker 3: to give you the Linda, you know, the best-case 267 00:15:01,160 --> 00:15:02,840 Speaker 3: scenario from Linda's point of view.
268 00:15:02,680 --> 00:15:05,080 Speaker 5: Andrew Tate is going to, like, is going to advertise, 269 00:15:05,160 --> 00:15:07,920 Speaker 5: right? Like, these right-wing influencers are offering to sort 270 00:15:07,920 --> 00:15:08,560 Speaker 5: of pay money. 271 00:15:08,600 --> 00:15:11,480 Speaker 6: Okay, but two things. Right-wing influencers hate being on 272 00:15:11,520 --> 00:15:15,280 Speaker 6: a platform where there are no left-wing woke people 273 00:15:15,320 --> 00:15:19,360 Speaker 6: to troll, and if there aren't any, then they'll be bored. 274 00:15:20,320 --> 00:15:24,440 Speaker 6: And the other thing is, all of these advertisers who 275 00:15:24,440 --> 00:15:28,920 Speaker 6: are pulling out and making this big statement of, you know, 276 00:15:29,280 --> 00:15:32,880 Speaker 6: we don't endorse anti-Semitism. It's not that big of 277 00:15:32,920 --> 00:15:35,600 Speaker 6: a deal for their balance sheets to pull out of 278 00:15:35,640 --> 00:15:39,360 Speaker 6: Twitter advertising. It's not that effective. They have Meta, they 279 00:15:39,440 --> 00:15:42,200 Speaker 6: have Google, they have these other places to advertise. They 280 00:15:42,200 --> 00:15:44,000 Speaker 6: don't need to be on Twitter. They were on Twitter 281 00:15:44,000 --> 00:15:47,840 Speaker 6: because of the relationships. And where it counts for Elon 282 00:15:47,960 --> 00:15:51,920 Speaker 6: Musk, in his government contracts, in his relationships with bankers, 283 00:15:52,520 --> 00:15:53,680 Speaker 6: people aren't pulling out. 284 00:15:56,720 --> 00:15:59,920 Speaker 2: Okay, welcome back. So somehow X isn't even the biggest 285 00:16:00,120 --> 00:16:03,360 Speaker 2: news story in the tech world right now.
OpenAI, 286 00:16:03,800 --> 00:16:07,640 Speaker 2: a company that Elon started with Sam Altman before he 287 00:16:07,880 --> 00:16:11,560 Speaker 2: would leave in a twenty eighteen power struggle. Max, what 288 00:16:12,040 --> 00:16:14,560 Speaker 2: exactly has it been facing? For the love of god, 289 00:16:14,640 --> 00:16:15,800 Speaker 2: what is going on here? 290 00:16:16,480 --> 00:16:19,440 Speaker 3: OpenAI is pretty much the hottest thing, or was 291 00:16:19,640 --> 00:16:22,480 Speaker 3: until a few days ago, pretty much the hottest 292 00:16:22,000 --> 00:16:22,920 Speaker 2: thing in tech. 293 00:16:23,160 --> 00:16:26,720 Speaker 3: OpenAI created ChatGPT, the chatbot that 294 00:16:26,800 --> 00:16:28,760 Speaker 3: many people are very excited about, many companies are very 295 00:16:28,800 --> 00:16:31,600 Speaker 3: excited about. And Microsoft, you know, one of the world's 296 00:16:31,680 --> 00:16:35,480 Speaker 3: largest companies, has essentially put it at the center 297 00:16:35,480 --> 00:16:36,600 Speaker 3: of an entire, like, 298 00:16:36,560 --> 00:16:38,360 Speaker 2: corporate overhaul effort.
299 00:16:38,960 --> 00:16:40,800 Speaker 3: We should say, first of all, OpenAI was 300 00:16:40,920 --> 00:16:44,920 Speaker 3: co-founded by Elon Musk, or Elon Musk played a key 301 00:16:45,040 --> 00:16:48,680 Speaker 3: role in the creation of this startup, was 302 00:16:48,720 --> 00:16:52,840 Speaker 3: the main funder early on. And for years OpenAI's 303 00:16:52,880 --> 00:16:55,880 Speaker 3: CEO Sam Altman, who was kind of Musk's partner in 304 00:16:55,920 --> 00:16:58,720 Speaker 3: the creation of the entity, was 305 00:16:58,760 --> 00:17:02,640 Speaker 3: going around saying this technology is so amazing, it could 306 00:17:02,720 --> 00:17:04,880 Speaker 3: end the world, like we really need to be careful 307 00:17:04,880 --> 00:17:08,440 Speaker 3: here, because AI could take over and the robots could 308 00:17:08,440 --> 00:17:11,080 Speaker 3: somehow decide to annihilate the human race. And Elon Musk 309 00:17:11,080 --> 00:17:13,440 Speaker 3: actually played a big role in kind of pumping 310 00:17:13,119 --> 00:17:14,080 Speaker 4: up this idea. 311 00:17:14,200 --> 00:17:17,199 Speaker 3: Now, I don't think that OpenAI's senior management, I 312 00:17:17,200 --> 00:17:19,919 Speaker 3: don't think Microsoft, I don't think Sam Altman ever really 313 00:17:19,960 --> 00:17:23,080 Speaker 3: took seriously the idea that OpenAI, with its little 314 00:17:23,160 --> 00:17:25,760 Speaker 3: chatbot that can write an email for you, was going 315 00:17:25,800 --> 00:17:27,960 Speaker 3: to turn around and annihilate the whole human race. 316 00:17:28,200 --> 00:17:31,240 Speaker 2: But that is the very argument that they apparently made here. 317 00:17:31,280 --> 00:17:37,280 Speaker 3: But crucially, unfortunately for Sam Altman, some OpenAI board members, 318 00:17:37,680 --> 00:17:40,359 Speaker 3: I think, do really buy into that idea.
And we 319 00:17:40,480 --> 00:17:45,639 Speaker 3: had this amazingly sudden coup that happened on Friday, where 320 00:17:45,720 --> 00:17:48,399 Speaker 3: a number of OpenAI board members, OpenAI was 321 00:17:48,440 --> 00:17:50,680 Speaker 3: governed by a nonprofit, it's kind of a weird structure, 322 00:17:51,080 --> 00:17:54,760 Speaker 3: voted Sam Altman off the island, fired him. And they 323 00:17:54,760 --> 00:17:58,320 Speaker 3: were able to do so, strangely, without consulting any of 324 00:17:58,320 --> 00:18:01,200 Speaker 3: the venture capitalists who had invested in this thing, and 325 00:18:01,240 --> 00:18:04,560 Speaker 3: without consulting with Microsoft, which had put thirteen billion dollars 326 00:18:04,600 --> 00:18:05,160 Speaker 3: into this thing. 327 00:18:05,600 --> 00:18:08,439 Speaker 2: You know, it's funny, because I grew up journalistically in 328 00:18:08,520 --> 00:18:11,600 Speaker 2: Latin America and I witnessed a lot of coups. Some worked, 329 00:18:11,840 --> 00:18:15,000 Speaker 2: some didn't. This has got to be the worst coup 330 00:18:15,560 --> 00:18:17,879 Speaker 2: I've seen since the two thousand and two attempt to 331 00:18:17,920 --> 00:18:20,480 Speaker 2: oust Hugo Chávez. It's been pretty bad. And right now, 332 00:18:20,520 --> 00:18:25,800 Speaker 2: as we speak, Sarah Frier, there's a mutiny going on 333 00:18:26,640 --> 00:18:29,680 Speaker 2: inside OpenAI, correct? 334 00:18:30,240 --> 00:18:36,000 Speaker 6: Yes. About ninety percent or above of the existing employees 335 00:18:36,040 --> 00:18:38,920 Speaker 6: at OpenAI have signed a petition, by the way 336 00:18:39,000 --> 00:18:42,400 Speaker 6: including the chief scientist, who was on the board pushing 337 00:18:42,480 --> 00:18:46,760 Speaker 6: out Sam Altman and Greg Brockman.
These people have all 338 00:18:46,760 --> 00:18:51,040 Speaker 6: signed this memo saying that if they aren't reinstated, they're 339 00:18:51,080 --> 00:18:54,919 Speaker 6: out of there. Which is, it's wild, right? 340 00:18:54,680 --> 00:18:55,240 Speaker 2: Are all? 341 00:18:55,560 --> 00:18:58,480 Speaker 6: Is Microsoft going to end up acquiring OpenAI for 342 00:18:58,600 --> 00:19:02,480 Speaker 6: like zero dollars? And you know, the rest of this 343 00:19:02,560 --> 00:19:06,600 Speaker 6: is in their hands. Are these leaders who were ousted 344 00:19:06,640 --> 00:19:09,760 Speaker 6: going to come back and try to say business is running 345 00:19:09,800 --> 00:19:10,320 Speaker 6: as usual? 346 00:19:11,640 --> 00:19:12,280 Speaker 2: It is. 347 00:19:13,760 --> 00:19:17,040 Speaker 6: One of the craziest. Like, people have compared it to, 348 00:19:17,080 --> 00:19:19,480 Speaker 6: like, Steve Jobs getting kicked out of Apple and brought back, 349 00:19:19,920 --> 00:19:22,359 Speaker 6: or Jack Dorsey getting kicked out of Twitter and 350 00:19:22,400 --> 00:19:26,520 Speaker 6: brought back. Those two situations happened over a matter of years. 351 00:19:27,080 --> 00:19:30,320 Speaker 6: This is happening in a matter of days. So we're 352 00:19:30,359 --> 00:19:31,720 Speaker 6: all not sleeping that much. 353 00:19:31,760 --> 00:19:34,800 Speaker 7: But I do think he played some role, right, which 354 00:19:34,800 --> 00:19:36,439 Speaker 7: is what I want to get at. Because, you know, 355 00:19:36,480 --> 00:19:39,160 Speaker 7: we don't ultimately know, all should be revealed at some point, 356 00:19:39,320 --> 00:19:41,760 Speaker 7: but, you know, to a certain degree, yeah, it is 357 00:19:41,800 --> 00:19:44,080 Speaker 7: interesting without really knowing anything. 358 00:19:44,160 --> 00:19:46,040 Speaker 2: God, it seems like his fingerprints would be all over this.
359 00:19:46,119 --> 00:19:48,320 Speaker 2: I mean, you told us the other day, Max, that 360 00:19:48,359 --> 00:19:51,760 Speaker 2: that Elon Musk-Sam Altman rivalry was a top 361 00:19:51,840 --> 00:19:54,160 Speaker 2: twenty Musk rivalry. I had it, I think I 362 00:19:54,080 --> 00:19:56,760 Speaker 3: underrated it, right. I mean, clearly this was a bigger, 363 00:19:57,320 --> 00:20:01,080 Speaker 3: more consequential rivalry than I had even thought. Yeah, last 364 00:20:01,080 --> 00:20:03,480 Speaker 3: week we talked about Elon Musk going on the Lex 365 00:20:03,520 --> 00:20:06,520 Speaker 3: Friedman podcast, and he said a few things that kind 366 00:20:06,560 --> 00:20:10,399 Speaker 3: of anticipated this. He brought up Ilya Sutskever, the 367 00:20:10,440 --> 00:20:14,520 Speaker 3: chief scientist, and basically took credit for hiring him, 368 00:20:14,960 --> 00:20:18,119 Speaker 3: and heaped a lot of praise on him, and 369 00:20:18,280 --> 00:20:21,520 Speaker 3: also made a kind of complaint about Sam Altman and OpenAI. 370 00:20:21,560 --> 00:20:23,480 Speaker 3: He said, OpenAI was supposed to be open source, 371 00:20:23,520 --> 00:20:27,280 Speaker 3: and now it's profitable. He's been complaining for years that 372 00:20:27,400 --> 00:20:30,800 Speaker 3: this is like a startup worth billions of dollars that 373 00:20:30,840 --> 00:20:33,600 Speaker 3: he founded as a nonprofit. And what we saw with 374 00:20:33,720 --> 00:20:37,280 Speaker 3: the board, right, was this effort to take this startup, 375 00:20:37,320 --> 00:20:39,639 Speaker 3: take this thing that Microsoft is so excited about, that 376 00:20:39,720 --> 00:20:42,480 Speaker 3: venture capitalists are so excited about, and wrestle it back 377 00:20:42,680 --> 00:20:45,800 Speaker 3: to its nonprofit roots, which, you know, does two things 378 00:20:45,840 --> 00:20:48,159 Speaker 3: for Elon Musk.
One is it validates this thing 379 00:20:48,200 --> 00:20:50,119 Speaker 3: that he's been saying, he can claim to 380 00:20:50,119 --> 00:20:52,159 Speaker 3: be right, which he loves to do. And also it 381 00:20:52,240 --> 00:20:54,159 Speaker 3: kind of does him a favor, because Elon Musk just 382 00:20:54,240 --> 00:21:00,800 Speaker 3: launched this kind of haphazard chatbot, Grok. Yes, I'm saying, 383 00:21:00,960 --> 00:21:03,560 Speaker 3: and I regret to say, I know that Grok has 384 00:21:03,400 --> 00:21:06,640 Speaker 3: its defenders and everything like that, but basically this thing 385 00:21:06,920 --> 00:21:10,280 Speaker 3: Grok was assembled in a matter of months. Yeah, 386 00:21:10,320 --> 00:21:12,480 Speaker 3: I'd say there's a little excitement among Elon's fans, but 387 00:21:12,480 --> 00:21:14,720 Speaker 3: I don't think it's exactly caught fire. But now all 388 00:21:14,720 --> 00:21:17,879 Speaker 3: of a sudden you have this utter crisis at the 389 00:21:17,960 --> 00:21:20,480 Speaker 3: leading player. I mean, I think that could only help 390 00:21:20,720 --> 00:21:23,119 Speaker 3: Elon Musk, either in terms of recruiting or just in 391 00:21:23,200 --> 00:21:26,200 Speaker 3: terms of getting people to try a different chatbot. 392 00:21:26,280 --> 00:21:28,600 Speaker 2: Right now, one other thing on this, Sarah. Do we 393 00:21:28,640 --> 00:21:31,640 Speaker 2: have any sense, right, because indeed, as Max said here, 394 00:21:31,760 --> 00:21:34,000 Speaker 2: Elon played a role in the founding of OpenAI. 395 00:21:35,040 --> 00:21:37,359 Speaker 6: He was the co-founder. He put one hundred million 396 00:21:37,400 --> 00:21:41,160 Speaker 6: of his own money into it. He tried to take 397 00:21:41,280 --> 00:21:45,040 Speaker 6: full control at some point and that didn't work out.
398 00:21:45,200 --> 00:21:51,520 Speaker 6: But he was very passionate about trying to prevent AI 399 00:21:51,840 --> 00:21:55,480 Speaker 6: from getting out of control. And of course this narrative 400 00:21:55,520 --> 00:21:57,840 Speaker 6: of, like, we've got to prevent AI from taking over 401 00:21:57,960 --> 00:22:00,400 Speaker 6: is also somewhat self-serving, because it's like, the only 402 00:22:00,440 --> 00:22:01,960 Speaker 6: one who can save you from it is me. 403 00:22:03,000 --> 00:22:07,240 Speaker 3: We should also say, for future feud-watching potential, Musk 404 00:22:07,280 --> 00:22:10,800 Speaker 3: is weighing in here, Musk, you know, hinting darkly that there 405 00:22:10,840 --> 00:22:13,080 Speaker 3: must have been something going on that the public needs 406 00:22:13,080 --> 00:22:16,720 Speaker 3: to know about. That is a big problem for Microsoft, 407 00:22:16,760 --> 00:22:19,320 Speaker 3: which is like really trying to just clean up this mess, 408 00:22:19,400 --> 00:22:22,000 Speaker 3: and it's a big problem for the venture capitalists who 409 00:22:22,000 --> 00:22:24,000 Speaker 3: invested in OpenAI. I mean, part of what makes 410 00:22:24,000 --> 00:22:29,199 Speaker 3: this whole thing so wild is that, like, overnight, 411 00:22:29,440 --> 00:22:32,400 Speaker 3: this incredibly valuable startup that all of these very powerful 412 00:22:32,400 --> 00:22:35,560 Speaker 3: people were sort of banking on is getting taken apart. 413 00:22:35,640 --> 00:22:39,040 Speaker 2: Right. It's amazing, Max. This thing went from apparently a 414 00:22:39,119 --> 00:22:43,840 Speaker 2: value of eighty six billion dollars to perhaps zero in seconds. 415 00:22:43,880 --> 00:22:47,080 Speaker 2: That's one hundred percent, right? It's staggering if it 416 00:22:47,119 --> 00:22:48,200 Speaker 2: really goes down this way.
417 00:22:48,320 --> 00:22:50,439 Speaker 3: Yeah, it's not going to go down to zero, but, 418 00:22:50,600 --> 00:22:52,080 Speaker 3: at least, I don't think it will. 419 00:22:52,240 --> 00:22:57,400 Speaker 6: That's the thing, Max. This tension, these relationships, they all 420 00:22:57,440 --> 00:23:00,560 Speaker 6: go way back. Elon Musk has known Sam Altman forever. 421 00:23:00,840 --> 00:23:03,800 Speaker 6: Sam Altman, by the way, was in the first Y 422 00:23:03,880 --> 00:23:07,680 Speaker 6: Combinator class with Emmett Shear, who's now the CEO taking 423 00:23:07,720 --> 00:23:12,359 Speaker 6: over, maybe on an interim basis, we don't know, for 424 00:23:12,440 --> 00:23:12,960 Speaker 6: OpenAI. 425 00:23:13,440 --> 00:23:17,440 Speaker 2: All these ties. Here, Sarah, Emmett Shear is just simply 426 00:23:17,920 --> 00:23:20,040 Speaker 2: sitting in that position, keeping it warm, I think, 427 00:23:20,040 --> 00:23:22,639 Speaker 2: as you once said, until Elon Musk takes that seat. 428 00:23:22,760 --> 00:23:27,160 Speaker 6: Oh gosh. Yeah, Thanksgiving's coming up. One thing I'm grateful 429 00:23:27,200 --> 00:23:30,440 Speaker 6: for is that Elon Musk didn't end up getting 430 00:23:30,480 --> 00:23:32,280 Speaker 6: that CEO role over the weekend. 431 00:23:32,880 --> 00:23:35,639 Speaker 2: Hey, note that we're recording this on Tuesday afternoon, and 432 00:23:35,680 --> 00:23:39,280 Speaker 2: this is a very quickly developing story. The latest here 433 00:23:39,359 --> 00:23:42,240 Speaker 2: is that Altman is apparently now in talks to return 434 00:23:42,320 --> 00:23:46,280 Speaker 2: to OpenAI. Crazy times. So before we head into 435 00:23:46,320 --> 00:23:49,480 Speaker 2: our final segment here, we should mention that the SpaceX 436 00:23:49,560 --> 00:23:52,760 Speaker 2: Starship launch that we previewed last week did indeed happen 437 00:23:52,800 --> 00:23:57,480 Speaker 2: on Saturday, and alas, ended in flames.
The flames, though, 438 00:23:57,680 --> 00:24:00,320 Speaker 2: came a little later than the last time one of these 439 00:24:00,359 --> 00:24:03,400 Speaker 2: things went up in space, so Elon is a touch 440 00:24:03,480 --> 00:24:07,199 Speaker 2: closer to that Mars colony he's dreaming of. If you 441 00:24:07,200 --> 00:24:08,960 Speaker 2: want to know what's at stake with the launch and 442 00:24:09,000 --> 00:24:11,800 Speaker 2: all those going forward, be sure to check out last week's pod. 443 00:24:14,720 --> 00:24:18,399 Speaker 2: Thanksgiving is just two days away, everyone, and our thoughts 444 00:24:18,440 --> 00:24:22,160 Speaker 2: here, at least, as always, turn to the Musk family table. 445 00:24:22,800 --> 00:24:26,600 Speaker 2: In the past, it's been a pretty raucous and wild event. Max, 446 00:24:26,720 --> 00:24:29,240 Speaker 2: paint us a picture here. What's it going to be 447 00:24:29,359 --> 00:24:29,960 Speaker 2: like this year? 448 00:24:30,560 --> 00:24:33,320 Speaker 3: I feel like all Thanksgiving dinners to some extent are 449 00:24:33,359 --> 00:24:36,960 Speaker 3: exercises in, like, avoiding politics, and you're just trying to, 450 00:24:37,119 --> 00:24:38,880 Speaker 3: like, sit there with your family, just 451 00:24:38,840 --> 00:24:42,760 Speaker 4: like, hoping, head down, you just can't avoid it.
And so 452 00:24:42,840 --> 00:24:45,360 Speaker 4: I mean, I imagine it's kind of the same situation, 453 00:24:45,640 --> 00:24:48,680 Speaker 4: except it's all of them just hoping that Elon will, 454 00:24:48,760 --> 00:24:51,960 Speaker 4: like, you know, not go on a wokeness rant, which, 455 00:24:52,000 --> 00:24:55,880 Speaker 4: I again, like, that feels like a common Thanksgiving tableau, 456 00:24:56,040 --> 00:24:58,920 Speaker 4: kind of brings everyone down. And I imagine that 457 00:24:59,119 --> 00:25:01,880 Speaker 4: that's, you know, almost an inevitability, and the only 458 00:25:01,960 --> 00:25:03,760 Speaker 4: real question is how do they kind of steer away from it? 459 00:25:04,320 --> 00:25:06,119 Speaker 2: Well, listen, the one thing that I'm hoping for this 460 00:25:06,200 --> 00:25:09,440 Speaker 2: Thanksgiving is that Sarah Friar can get some rest 461 00:25:09,480 --> 00:25:13,720 Speaker 2: and that there isn't some, like, enormous OpenAI slash 462 00:25:14,080 --> 00:25:15,720 Speaker 2: X news on Thursday morning. 463 00:25:16,240 --> 00:25:17,440 Speaker 6: Okay, let's hope. 464 00:25:20,320 --> 00:25:22,720 Speaker 2: All right, that's it. Thanks for listening to Elon, Inc. 465 00:25:22,840 --> 00:25:26,879 Speaker 2: And thanks to our panel: Sarah, Dana, Max. 466 00:25:27,280 --> 00:25:29,000 Speaker 3: Great to be here, have a great week. 467 00:25:29,720 --> 00:25:30,200 Speaker 6: Thanks all. 468 00:25:37,240 --> 00:25:41,040 Speaker 2: This episode was produced by Stacy Wong. Naomi Shavin and 469 00:25:41,080 --> 00:25:45,480 Speaker 2: Rayhan Harmanci are our senior editors. The idea for this very 470 00:25:45,560 --> 00:25:49,960 Speaker 2: show also came from Rayhan. Blake Maples handles engineering, and 471 00:25:49,960 --> 00:25:54,320 Speaker 2: we get special editing assistance from Jeff Gropat.
Our supervising 472 00:25:54,320 --> 00:25:58,120 Speaker 2: producer is Magnus Hendrickson. Thanks a bunch to Angel Rascio 473 00:25:58,520 --> 00:26:02,640 Speaker 2: and to BusinessWeek editor Weber as well. The Elon, Inc. 474 00:26:02,800 --> 00:26:07,440 Speaker 2: theme is written and performed by Taka Yasuzawa and Alex Sugiura. 475 00:26:08,240 --> 00:26:10,800 Speaker 2: Sage Bauman is the head of Bloomberg Podcasts and our 476 00:26:10,840 --> 00:26:15,520 Speaker 2: executive producer. I am David Papadopoulos. If you have a minute, 477 00:26:15,760 --> 00:26:18,879 Speaker 2: rate and review our show. It'll help other listeners find 478 00:26:18,960 --> 00:26:22,160 Speaker 2: us. Happy Thanksgiving, all. See you next week.