1 00:00:00,160 --> 00:00:03,520 Speaker 1: Elon Musk is, on most days, the richest person on Earth. 2 00:00:03,800 --> 00:00:06,280 Speaker 1: He's the co-founder and/or head of no fewer 3 00:00:06,320 --> 00:00:11,240 Speaker 1: than six companies: Tesla, SpaceX, Boring, Neuralink, X, and xAI. 4 00:00:11,960 --> 00:00:15,120 Speaker 1: Of course, he also co-founded PayPal. He employs more 5 00:00:15,120 --> 00:00:18,119 Speaker 1: than one hundred thousand people working on technologies that impact 6 00:00:18,120 --> 00:00:20,720 Speaker 1: our day to day lives and could usher in our future. 7 00:00:21,360 --> 00:00:25,040 Speaker 1: His influence, whether it's in space, social media, the automotive industry, 8 00:00:25,160 --> 00:00:30,040 Speaker 1: or increasingly in geopolitics, is vast. Every day Elon is 9 00:00:30,080 --> 00:00:32,839 Speaker 1: making headlines, and so it can be hard to keep 10 00:00:32,920 --> 00:00:36,360 Speaker 1: up with the onslaught. To help you figure out what matters, 11 00:00:36,440 --> 00:00:38,919 Speaker 1: we have assembled a panel of Bloomberg experts on all 12 00:00:38,960 --> 00:00:42,720 Speaker 1: things Elon to discuss, and sometimes to argue about, what 13 00:00:42,760 --> 00:00:43,760 Speaker 1: he's doing in the world. 14 00:00:47,000 --> 00:00:49,920 Speaker 2: Well, Elon Musk is now the richest person on the planet. 15 00:00:50,240 --> 00:00:53,080 Speaker 1: More than half the satellites in space are owned and 16 00:00:53,159 --> 00:00:55,320 Speaker 1: controlled by one man. 17 00:00:55,480 --> 00:00:58,200 Speaker 2: Starting his own artificial intelligence company. 18 00:00:58,400 --> 00:00:59,960 Speaker 3: Well, he's a legitimate super genius. 19 00:01:00,840 --> 00:01:01,400 Speaker 4: Legitimate. 20 00:01:01,600 --> 00:01:01,960 Speaker 2: He says 21 00:01:01,960 --> 00:01:04,560 Speaker 1: he's always voted for Democrats, but this year it will 22 00:01:04,600 --> 00:01:05,039 Speaker 1: be different. 23 00:01:05,080 --> 00:01:06,039 Speaker 3: He'll vote Republican. 24 00:01:06,280 --> 00:01:08,800 Speaker 5: There is a reason the US government is so reliant 25 00:01:08,840 --> 00:01:09,200 Speaker 5: on him. 26 00:01:09,400 --> 00:01:13,000 Speaker 3: Elon Musk is a scam artist and he's done nothing. 27 00:01:13,600 --> 00:01:16,360 Speaker 2: Anything he does, he's fascinating to people. 28 00:01:26,400 --> 00:01:29,679 Speaker 1: Hello and welcome to the inaugural episode of Elon Inc. 29 00:01:30,120 --> 00:01:33,520 Speaker 1: My name is David Papadopoulos. This is a brand new 30 00:01:33,520 --> 00:01:36,880 Speaker 1: weekly podcast that will dissect the biggest stories about one 31 00:01:36,880 --> 00:01:41,280 Speaker 1: of the world's most powerful and polarizing men. This week, 32 00:01:41,319 --> 00:01:43,800 Speaker 1: we'll start with AI. In the last few days, Musk 33 00:01:43,880 --> 00:01:46,800 Speaker 1: sat down with UK Prime Minister Rishi Sunak to discuss 34 00:01:46,840 --> 00:01:50,800 Speaker 1: the opportunities and risks AI brings, and he rolled out 35 00:01:50,840 --> 00:01:56,200 Speaker 1: Grok, his answer to AI Internet sensation ChatGPT. Meanwhile, 36 00:01:56,360 --> 00:01:59,160 Speaker 1: other pieces of his empire are demanding his attention.
As 37 00:01:59,240 --> 00:02:02,840 Speaker 1: is usual, the United Auto Workers union president set 38 00:02:02,840 --> 00:02:05,680 Speaker 1: his sights on Tesla just days after scoring an historic 39 00:02:05,680 --> 00:02:09,040 Speaker 1: pay hike for workers in Detroit. The billionaire could face his 40 00:02:09,120 --> 00:02:12,399 Speaker 1: most serious labor fight yet. To discuss this and more, 41 00:02:12,440 --> 00:02:15,160 Speaker 1: we brought in Sarah Frier, who oversees our coverage of 42 00:02:15,200 --> 00:02:20,280 Speaker 1: the biggest companies in Silicon Valley. Hello, Hello, Sarah. Dana Hull, 43 00:02:20,320 --> 00:02:22,720 Speaker 1: who began covering Tesla for us when the stock was 44 00:02:22,760 --> 00:02:25,920 Speaker 1: worth less than one tenth of what it is today. Dana, 45 00:02:26,200 --> 00:02:31,239 Speaker 1: good morning, and Max Chafkin, reporter, editor and troublemaker at 46 00:02:31,240 --> 00:02:36,440 Speaker 1: Bloomberg Businessweek. Hey, Hey, so we're going to start 47 00:02:36,520 --> 00:02:40,160 Speaker 1: indeed with xAI, and we're going to start with 48 00:02:40,240 --> 00:02:40,799 Speaker 2: you, Sarah. 49 00:02:41,080 --> 00:02:44,200 Speaker 1: What does Grok do, and vis-a-vis all the rivals 50 00:02:44,200 --> 00:02:47,200 Speaker 1: in the market today, how does it stack up? Any idea? 51 00:02:47,760 --> 00:02:50,160 Speaker 6: Well, first of all, the first thing you need to know 52 00:02:50,200 --> 00:02:53,040 Speaker 6: about Grok is none of the media has had an 53 00:02:53,040 --> 00:02:56,000 Speaker 6: opportunity to test this thing. It is yet another chat 54 00:02:56,000 --> 00:02:59,520 Speaker 6: bot in the mix, much like ChatGPT or Bard 55 00:02:59,560 --> 00:03:03,920 Speaker 6: or any of the others. And its top selling point 56 00:03:04,240 --> 00:03:08,360 Speaker 6: is that it is supposed to be snarky, irreverent, funny. 57 00:03:08,480 --> 00:03:13,080 Speaker 6: Elon Musk posted on X, formerly known as Twitter, where 58 00:03:13,080 --> 00:03:16,959 Speaker 6: he asked the chatbot how to make cocaine and it 59 00:03:17,080 --> 00:03:22,400 Speaker 6: responded with, you know, a joking set of instructions. 60 00:03:22,520 --> 00:03:23,320 Speaker 1: It was a good answer though. 61 00:03:23,360 --> 00:03:27,440 Speaker 6: It was a good answer. So that's kind of the vibe. 62 00:03:27,560 --> 00:03:32,440 Speaker 6: And I think it's important to know that this is 63 00:03:33,000 --> 00:03:38,400 Speaker 6: deeply integrated with X, the social media platform, but it 64 00:03:38,480 --> 00:03:41,600 Speaker 6: is not part of X. It is a separate company, xAI, 65 00:03:42,320 --> 00:03:48,400 Speaker 6: and this partnership provides the top selling point for this chatbot, 66 00:03:48,440 --> 00:03:50,600 Speaker 6: which is that it will be trained on real time 67 00:03:50,720 --> 00:03:52,880 Speaker 6: Twitter data, sorry, X data. 68 00:03:53,080 --> 00:03:55,920 Speaker 1: Yeah, so yeah, they, you know, I heard Musk 69 00:03:56,320 --> 00:04:00,200 Speaker 1: spinning that as something that really will separate it from rivals. 70 00:04:00,200 --> 00:04:03,000 Speaker 6: It will absolutely separate it from rivals to have 71 00:04:03,040 --> 00:04:06,480 Speaker 6: that real time information. However, that is also the scary 72 00:04:06,520 --> 00:04:10,560 Speaker 6: thing because, as we know, X can be accessible sometimes. 73 00:04:10,200 --> 00:04:12,640 Speaker 1: You know, Max.
One of the things that jumps out 74 00:04:12,680 --> 00:04:15,800 Speaker 1: at me a little bit is when you look at 75 00:04:16,080 --> 00:04:19,720 Speaker 1: the stock market in twenty twenty three and what's driving 76 00:04:20,200 --> 00:04:24,560 Speaker 1: all the gains in the stock market, it has largely 77 00:04:24,680 --> 00:04:28,480 Speaker 1: been AI. Right? By one crude measurement, AI has 78 00:04:28,480 --> 00:04:31,520 Speaker 1: added twenty percentage points to the Nasdaq's rally this year. 79 00:04:31,960 --> 00:04:34,440 Speaker 1: Seen another way, you take Microsoft and Nvidia, two 80 00:04:34,600 --> 00:04:37,080 Speaker 1: leaders in AI, or the two leaders in AI, and 81 00:04:37,120 --> 00:04:39,400 Speaker 1: they've contributed more to the gains in the S and 82 00:04:39,440 --> 00:04:43,240 Speaker 1: P five hundred than any other stock by orders of magnitude. 83 00:04:43,720 --> 00:04:49,560 Speaker 1: So I'm wondering, is he rushing Grok out now, perhaps before 84 00:04:49,600 --> 00:04:52,279 Speaker 1: it's ready, to try to tap into this storyline? 85 00:04:52,600 --> 00:04:56,679 Speaker 2: Obviously, yes, he is obviously rushing Grok out before it's ready. 86 00:04:56,960 --> 00:05:01,000 Speaker 2: The press release slash website that they put up notes 87 00:05:01,080 --> 00:05:03,559 Speaker 2: that they've only had something like two months of work 88 00:05:03,680 --> 00:05:07,480 Speaker 2: on this thing. They just started this company 89 00:05:07,760 --> 00:05:11,080 Speaker 2: really in the past few months. The website lists a 90 00:05:11,080 --> 00:05:14,920 Speaker 2: lot of like very detailed technical information about Grok. It 91 00:05:15,240 --> 00:05:17,679 Speaker 2: claims that there are certain benchmarks it's sort of holding 92 00:05:17,760 --> 00:05:21,920 Speaker 2: up well against, like the Facebook version of ChatGPT, 93 00:05:22,200 --> 00:05:25,560 Speaker 2: but maybe a little behind GPT-4, which is like 94 00:05:25,600 --> 00:05:28,880 Speaker 2: the most up to date version of the best one 95 00:05:28,880 --> 00:05:30,719 Speaker 2: of these models. The important thing to keep in 96 00:05:30,760 --> 00:05:34,840 Speaker 2: mind is, like, until people start actually using this, 97 00:05:35,080 --> 00:05:37,360 Speaker 2: I think it's worth like thinking about this as vaporware 98 00:05:37,600 --> 00:05:39,640 Speaker 2: or not even taking it seriously as a real product. 99 00:05:39,839 --> 00:05:44,000 Speaker 6: The one tangible outcome of Grok right now is that 100 00:05:44,040 --> 00:05:48,560 Speaker 6: it is the reason, the only reason, to buy X 101 00:05:48,640 --> 00:05:52,560 Speaker 6: Premium Plus, because it is only available through X Premium Plus. 102 00:05:52,640 --> 00:05:58,280 Speaker 6: So as a little background, there are some tiers of 103 00:05:58,240 --> 00:06:00,920 Speaker 2: subscription for X, which is a new thing. 104 00:06:00,800 --> 00:06:03,720 Speaker 6: Which is a new thing, this just started. It used 105 00:06:03,720 --> 00:06:05,960 Speaker 6: to just be that there was one premium tier, 106 00:06:06,000 --> 00:06:08,559 Speaker 6: but now there are three, and the top tier gives 107 00:06:08,560 --> 00:06:12,120 Speaker 6: you access to this chatbot when it comes out of beta.
108 00:06:12,160 --> 00:06:14,520 Speaker 6: We don't know when that will happen, but that is 109 00:06:14,720 --> 00:06:17,719 Speaker 6: one reason people are spending sixteen dollars a month, 110 00:06:17,839 --> 00:06:19,679 Speaker 6: or more, you know, if you're buying from the app. 111 00:06:19,880 --> 00:06:21,680 Speaker 1: So, Sarah Frier, is this going to be the thing? 112 00:06:21,800 --> 00:06:23,880 Speaker 1: Is this a silver bullet for X? X has been 113 00:06:23,920 --> 00:06:26,480 Speaker 1: obviously struggling. It's a well known story from the moment 114 00:06:26,520 --> 00:06:30,760 Speaker 1: that Musk purchased it a year ago. Does this 115 00:06:30,839 --> 00:06:31,800 Speaker 1: help X? 116 00:06:32,040 --> 00:06:34,320 Speaker 6: I mean, every little bit helps. They're so desperate 117 00:06:34,320 --> 00:06:37,159 Speaker 6: they're trying to, like, sell user names for money, for 118 00:06:37,320 --> 00:06:40,599 Speaker 6: like a few tens of thousands of dollars. I think, 119 00:06:41,000 --> 00:06:42,880 Speaker 6: you know, this is a company that has 120 00:06:42,960 --> 00:06:48,440 Speaker 6: lost more than sixty percent of its advertising revenue. They 121 00:06:48,480 --> 00:06:52,599 Speaker 6: have had a drop in users. They're struggling, they have 122 00:06:52,920 --> 00:06:56,880 Speaker 6: mountains of debt to repay to banks, and this is 123 00:06:57,320 --> 00:07:01,679 Speaker 6: a way to give people something tangible to buy. 124 00:07:02,160 --> 00:07:07,760 Speaker 1: So, Dana, he's got xAI, an AI company, and it's 125 00:07:07,920 --> 00:07:11,920 Speaker 1: and it's very interrelated with X. But then of 126 00:07:12,000 --> 00:07:14,440 Speaker 1: course he's got Tesla, where he's got a whole nother 127 00:07:14,960 --> 00:07:17,840 Speaker 1: artificial intelligence thing going on, and that is truly like 128 00:07:17,880 --> 00:07:21,440 Speaker 1: the crown jewel of his artificial intelligence efforts. 129 00:07:21,360 --> 00:07:24,080 Speaker 3: Right, right. And to Max's point, he's always trying to 130 00:07:24,400 --> 00:07:27,200 Speaker 3: position Tesla as being at the vanguard of what's next. So, 131 00:07:27,280 --> 00:07:30,120 Speaker 3: back in twenty twenty one, on an earnings call, Musk said, 132 00:07:30,120 --> 00:07:32,640 Speaker 3: and this is a direct quote, I think long term 133 00:07:32,760 --> 00:07:35,559 Speaker 3: people will think of Tesla as much as an AI 134 00:07:35,680 --> 00:07:38,240 Speaker 3: and robotics company as we are a car company or 135 00:07:38,280 --> 00:07:41,520 Speaker 3: an energy company. And so, you know, he's part, like 136 00:07:41,520 --> 00:07:44,520 Speaker 3: Tesla's part of the Magnificent Seven because they're using AI 137 00:07:44,800 --> 00:07:45,560 Speaker 3: and neural nets. 138 00:07:45,800 --> 00:07:48,040 Speaker 1: Are we gonna, we're gonna use a Magnificent Seven term? 139 00:07:48,160 --> 00:07:50,360 Speaker 2: What is that? It's a Bloomberg... 140 00:07:51,680 --> 00:07:54,800 Speaker 1: It's the seven most magnificent stocks in the world. 141 00:07:55,240 --> 00:07:57,200 Speaker 3: Oh, I thought that was like a finance thing that 142 00:07:57,240 --> 00:07:59,600 Speaker 3: everyone knew. It's like, it's like the top seven companies 143 00:07:59,640 --> 00:08:02,000 Speaker 3: in the... It's like Nvidia, it's like the AI companies. 144 00:08:02,120 --> 00:08:05,360 Speaker 1: By the way, it's four here, a magnificent four.
Yeah, 145 00:08:05,520 --> 00:08:07,920 Speaker 1: but in any event. Right, so he's positioning it 146 00:08:07,960 --> 00:08:10,880 Speaker 1: as, it's certainly not just about 147 00:08:11,400 --> 00:08:14,360 Speaker 1: the hardware, but we are an AI company. 148 00:08:14,480 --> 00:08:17,560 Speaker 3: Well, right, so electric vehicles are becoming mainstream, how does 149 00:08:17,600 --> 00:08:20,480 Speaker 3: Tesla differentiate itself? Well, we don't just make electric cars. 150 00:08:20,480 --> 00:08:22,800 Speaker 3: We make electric cars that can drive themselves. Not really, 151 00:08:22,840 --> 00:08:25,600 Speaker 3: like they've never actually accomplished that, but they are using 152 00:08:25,640 --> 00:08:28,400 Speaker 3: AI to train the cars to try to 153 00:08:28,480 --> 00:08:32,560 Speaker 3: drive themselves. And so that's like a big promise of 154 00:08:32,679 --> 00:08:35,400 Speaker 3: Tesla's valuation right now, is that they're not just a 155 00:08:35,400 --> 00:08:37,959 Speaker 3: car company, they're an AI company. And that's the drum 156 00:08:38,000 --> 00:08:41,880 Speaker 3: that Elon has been beating for years now. 157 00:08:45,320 --> 00:08:48,120 Speaker 1: I mentioned at the beginning that when you first started covering Tesla, 158 00:08:48,200 --> 00:08:50,760 Speaker 1: it was worth a tiny fraction of what it is today. Indeed, 159 00:08:50,800 --> 00:08:52,840 Speaker 1: a lot of that great valuation, I don't know the 160 00:08:52,840 --> 00:08:54,240 Speaker 1: exact number off the top of my head, but I 161 00:08:54,240 --> 00:08:57,120 Speaker 1: think it's worth something around eight hundred billion dollars, a 162 00:08:57,160 --> 00:08:59,760 Speaker 1: lot of that is that whole AI play, that value, right, 163 00:08:59,760 --> 00:09:02,760 Speaker 1: that sense that down the road AI and what it 164 00:09:02,800 --> 00:09:05,600 Speaker 1: does in terms of autonomous driving and all that is 165 00:09:05,679 --> 00:09:08,200 Speaker 1: going to make this company oodles and oodles of money. 166 00:09:08,280 --> 00:09:11,439 Speaker 3: Yeah, and they're using the same system to train Optimus, 167 00:09:11,440 --> 00:09:14,320 Speaker 3: the robot, which is the robot that is eventually going 168 00:09:14,360 --> 00:09:16,800 Speaker 3: to replace the factory workers. But I think, you know, 169 00:09:16,840 --> 00:09:19,120 Speaker 3: back to the point of X dot AI, like, Max, 170 00:09:19,200 --> 00:09:21,040 Speaker 3: is it even a real company? Like, do we even 171 00:09:21,080 --> 00:09:22,360 Speaker 3: know how many people work at this? 172 00:09:24,280 --> 00:09:26,800 Speaker 2: I mean, okay, he does silly stuff. He makes a 173 00:09:26,800 --> 00:09:29,280 Speaker 2: fool of himself, very easy to make fun of him 174 00:09:30,000 --> 00:09:33,640 Speaker 2: for the stuff he tweets or posts or whatever. That said, 175 00:09:33,920 --> 00:09:37,920 Speaker 2: the guy has a lot of respect from scientists and engineers, 176 00:09:37,960 --> 00:09:40,040 Speaker 2: and there are real AI engineers who have 177 00:09:40,080 --> 00:09:43,400 Speaker 2: signed up to work at this thing. That said, what 178 00:09:43,559 --> 00:09:46,640 Speaker 2: David described and what Sarah was describing earlier about the 179 00:09:46,720 --> 00:09:50,440 Speaker 2: company does not make, like, a ton of sense. So 180 00:09:50,920 --> 00:09:54,760 Speaker 2: large language models depend on, like, truth.
And that's why, 181 00:09:54,920 --> 00:09:57,480 Speaker 2: like, OpenAI, when it was, like, figuring out how 182 00:09:57,480 --> 00:10:02,080 Speaker 2: to build its model, it's scanning books, encyclopedias, Wikipedia, stuff 183 00:10:02,080 --> 00:10:04,360 Speaker 2: that has a lot of eyeballs on it and is 184 00:10:04,400 --> 00:10:07,200 Speaker 2: like known to be true. Because large language models, even 185 00:10:07,240 --> 00:10:10,760 Speaker 2: when they're based on stuff that's true, make stuff up. Twitter, 186 00:10:11,240 --> 00:10:14,840 Speaker 2: if anybody's been using X over the past few months, 187 00:10:14,920 --> 00:10:18,720 Speaker 2: you know, during the, you know, conflict in Israel, during 188 00:10:18,760 --> 00:10:22,480 Speaker 2: the war in Ukraine and so on, it's full 189 00:10:22,520 --> 00:10:25,080 Speaker 2: of crap. It's full of stuff that is false or 190 00:10:25,120 --> 00:10:28,240 Speaker 2: made up or exaggerated or whatever, and it's gotten worse 191 00:10:28,280 --> 00:10:29,880 Speaker 2: and worse. It's not the kind of thing you want 192 00:10:29,880 --> 00:10:33,280 Speaker 2: to build a model on. The second thing is, these, 193 00:10:33,400 --> 00:10:36,400 Speaker 2: OpenAI and so on, they cannot do current information. 194 00:10:36,520 --> 00:10:38,560 Speaker 2: The reason they can't do current information is because you 195 00:10:38,600 --> 00:10:42,680 Speaker 2: can't retrain your model constantly. It costs a huge amount 196 00:10:42,679 --> 00:10:46,400 Speaker 2: of money to retrain these models. So the ambition to 197 00:10:46,480 --> 00:10:49,120 Speaker 2: make this Groc or Gronk or whatever it's going 198 00:10:49,200 --> 00:10:53,160 Speaker 2: to be, Grok, to make it, like, up to date 199 00:10:53,240 --> 00:10:56,000 Speaker 2: with tweets and stuff, if that were to come to pass, 200 00:10:56,080 --> 00:10:58,360 Speaker 2: it would cost a fortune. It would cost even more. 201 00:10:58,360 --> 00:11:01,360 Speaker 2: OpenAI, the best of these companies, is losing huge 202 00:11:01,400 --> 00:11:03,920 Speaker 2: sums of money. If Elon Musk is able to do this, 203 00:11:04,000 --> 00:11:06,640 Speaker 2: it will be very, very expensive. You know, it might 204 00:11:06,679 --> 00:11:10,199 Speaker 2: actually draw people to X Super Plus Premium or whatever 205 00:11:10,320 --> 00:11:13,360 Speaker 2: this, like, high end version is. X Premium Plus, 206 00:11:13,440 --> 00:11:15,800 Speaker 2: is that it? But it's gonna cost way more than 207 00:11:15,800 --> 00:11:16,719 Speaker 2: twenty two dollars a month. 208 00:11:16,760 --> 00:11:18,720 Speaker 1: But there is a thought there that you're losing money now, 209 00:11:18,760 --> 00:11:21,720 Speaker 1: but you're, you know, building volume, and you're scaling, and that 210 00:11:21,800 --> 00:11:22,520 Speaker 1: down the road... 211 00:11:22,440 --> 00:11:24,679 Speaker 2: Prices are going to come up in theory, right, people, 212 00:11:24,800 --> 00:11:27,040 Speaker 2: Microsoft and these big companies are able to extract value 213 00:11:27,040 --> 00:11:27,200 Speaker 2: out of it. 214 00:11:27,280 --> 00:11:31,520 Speaker 6: But has losing money ever been something that Musk worries 215 00:11:31,559 --> 00:11:35,080 Speaker 6: about when he's building something new?
I mean, I 216 00:11:35,080 --> 00:11:37,400 Speaker 6: always think, you know, with 217 00:11:37,480 --> 00:11:39,920 Speaker 6: Musk we have to think about his psychology, there's 218 00:11:39,960 --> 00:11:44,720 Speaker 6: a lot of history here. He was the person 219 00:11:44,760 --> 00:11:48,160 Speaker 6: who started OpenAI alongside Sam Altman, back when it 220 00:11:48,240 --> 00:11:52,280 Speaker 6: was supposedly a nonprofit. He gave tens of millions of 221 00:11:52,320 --> 00:11:56,000 Speaker 6: dollars to the cause. He was quite concerned about, what's it, 222 00:11:56,480 --> 00:11:59,200 Speaker 6: isn't it, the artificial general... 223 00:11:59,080 --> 00:12:01,000 Speaker 1: Right, in general, Sarah, hasn't he been one of the 224 00:12:01,040 --> 00:12:03,320 Speaker 1: biggest voices out there warning of the 225 00:12:03,400 --> 00:12:03,960 Speaker 1: dangers of... 226 00:12:04,000 --> 00:12:07,559 Speaker 6: I was about to say. He's talked to world leaders about that recently. 227 00:12:07,640 --> 00:12:12,760 Speaker 6: You know, he's concerned about AI, you know, replacing 228 00:12:12,840 --> 00:12:16,120 Speaker 6: human consciousness. He is on a mission to preserve 229 00:12:16,200 --> 00:12:19,880 Speaker 6: human consciousness. That's maybe the reason he backed into 230 00:12:20,000 --> 00:12:22,960 Speaker 6: for why he bought Twitter. Whether or not that's 231 00:12:23,840 --> 00:12:25,400 Speaker 6: something he came up with after the fact, I 232 00:12:25,440 --> 00:12:27,760 Speaker 6: don't know. But there's a lot of 233 00:12:27,960 --> 00:12:31,720 Speaker 6: existing tension between him and Sam Altman, 234 00:12:31,760 --> 00:12:36,880 Speaker 6: who's now the leader of OpenAI. OpenAI just 235 00:12:36,960 --> 00:12:39,480 Speaker 6: had its demo day, like, a couple of days after 236 00:12:40,640 --> 00:12:44,679 Speaker 6: Musk released Grok. So I think there's some competitive 237 00:12:45,000 --> 00:12:46,040 Speaker 6: landscape at play there. 238 00:12:46,120 --> 00:12:48,480 Speaker 3: Oh, Elon is totally competitive. He's got 239 00:12:48,520 --> 00:12:50,240 Speaker 3: to be. He's got to be 240 00:12:50,240 --> 00:12:52,080 Speaker 3: in the arena, you know, like it's not enough to 241 00:12:52,080 --> 00:12:54,319 Speaker 3: do cars and rockets. If AI is the hot new thing, 242 00:12:54,440 --> 00:12:55,880 Speaker 3: he's got to be in there. And if there's, like, 243 00:12:55,960 --> 00:12:58,800 Speaker 3: personal history, he's going to go for the jugular. And 244 00:12:58,800 --> 00:12:59,880 Speaker 3: they've been trading barbs. 245 00:13:00,440 --> 00:13:03,040 Speaker 2: Yeah. And to answer your question, like, it's the same 246 00:13:03,080 --> 00:13:07,600 Speaker 2: as with Autopilot, right? Tesla's full self driving 247 00:13:07,679 --> 00:13:10,400 Speaker 2: capabilities do not, the Tesla does not drive itself without 248 00:13:10,440 --> 00:13:13,640 Speaker 2: a human yet. But the plan, obviously, if you're a 249 00:13:13,640 --> 00:13:15,560 Speaker 2: bull and you really believe in the stock, is that, 250 00:13:15,640 --> 00:13:17,720 Speaker 2: you know, in the long run, this company is going 251 00:13:17,760 --> 00:13:21,280 Speaker 2: to replace essentially all of transportation. We're all gonna, you know, 252 00:13:21,559 --> 00:13:24,600 Speaker 2: have robotic Teslas driving us around.
It's going to completely 253 00:13:24,800 --> 00:13:27,920 Speaker 2: reimagine the world. Same thing here, right? Like, Elon Musk's 254 00:13:28,000 --> 00:13:30,760 Speaker 2: stated goal, and to some extent, I think the goal 255 00:13:30,880 --> 00:13:32,800 Speaker 2: of the venture capitalists and so on who are backing 256 00:13:32,880 --> 00:13:37,120 Speaker 2: similar companies, is artificial general intelligence, like we'll 257 00:13:37,160 --> 00:13:39,559 Speaker 2: have software that, you know, can solve any problem for us. 258 00:13:39,600 --> 00:13:43,240 Speaker 2: I think there are really good reasons to be skeptical 259 00:13:43,760 --> 00:13:47,559 Speaker 2: both of, you know, the immediate prospects of artificial general intelligence, 260 00:13:47,559 --> 00:13:50,360 Speaker 2: even the long term ones, as well as the prospects 261 00:13:50,360 --> 00:13:54,119 Speaker 2: of, like, full self driving, like both of these things being close. 262 00:13:53,880 --> 00:13:56,599 Speaker 6: The ones that are warning us about the future of AI, and 263 00:13:57,559 --> 00:14:00,600 Speaker 6: Musk was among the leaders that called for a pause 264 00:14:00,640 --> 00:14:04,199 Speaker 6: in development for ethical reasons, are also the ones that 265 00:14:04,600 --> 00:14:05,760 Speaker 6: are saying, hey, trust me. 266 00:14:06,000 --> 00:14:08,000 Speaker 1: So, Sarah, it was just six months ago that 267 00:14:08,040 --> 00:14:10,600 Speaker 1: he called for that pause, and then he launches this. 268 00:14:10,760 --> 00:14:12,680 Speaker 1: But I guess what is his argument? Well, I called 269 00:14:12,679 --> 00:14:14,600 Speaker 1: for a pause, no one listened to the pause, so 270 00:14:14,720 --> 00:14:15,200 Speaker 1: the hell with it? 271 00:14:15,200 --> 00:14:17,240 Speaker 3: Well, he just, like, signed a letter. Like, he 272 00:14:17,320 --> 00:14:19,760 Speaker 3: just was a signatory on a letter about a pause 273 00:14:19,920 --> 00:14:21,720 Speaker 3: while he was building his own company. Like, give me 274 00:14:21,720 --> 00:14:22,000 Speaker 3: a break. 275 00:14:22,080 --> 00:14:24,080 Speaker 2: I mean, also the pause stuff, I think it's worth 276 00:14:24,160 --> 00:14:27,080 Speaker 2: like taking all that with some skepticism just because, like, 277 00:14:27,280 --> 00:14:30,280 Speaker 2: it's amazing marketing. Like, I think this is true with Elon, 278 00:14:30,400 --> 00:14:32,880 Speaker 2: this is true with Sam Altman, it's true with all these guys. 279 00:14:33,440 --> 00:14:36,040 Speaker 2: If you're like, you're like, this technology is so dangerous 280 00:14:36,200 --> 00:14:38,480 Speaker 2: it could eventually, like, take over the world, and you 281 00:14:38,520 --> 00:14:40,320 Speaker 2: can have it for yourself for only twenty two dollars 282 00:14:40,360 --> 00:14:44,680 Speaker 2: a month, it's a really powerful enticement if you're trying 283 00:14:44,680 --> 00:14:47,720 Speaker 2: to sign people up for a technology where the use 284 00:14:47,760 --> 00:14:50,920 Speaker 2: cases are not totally clear, like beyond making funny posts 285 00:14:50,960 --> 00:14:52,480 Speaker 2: on social media. It also makes...
286 00:14:52,360 --> 00:14:56,840 Speaker 6: It seem inevitable, right, and all these ethical concerns, misinformation concerns, 287 00:14:56,880 --> 00:14:59,840 Speaker 6: things that are wrapped into AI, are just, you know, 288 00:15:00,160 --> 00:15:02,640 Speaker 6: going to happen no matter what, and they're everyone's problem, 289 00:15:02,720 --> 00:15:05,680 Speaker 6: and that takes accountability away from the people who are 290 00:15:05,720 --> 00:15:06,720 Speaker 6: building it. 291 00:15:06,800 --> 00:15:10,080 Speaker 1: So, Sarah, he also recently sat down with Rishi Sunak, the 292 00:15:10,160 --> 00:15:13,200 Speaker 1: Prime Minister of the UK, to discuss AI. How did 293 00:15:13,240 --> 00:15:17,640 Speaker 1: that conversation go? Did we learn anything new 294 00:15:17,720 --> 00:15:19,280 Speaker 1: from this long sit down? 295 00:15:19,120 --> 00:15:22,160 Speaker 5: Well, it was great for Elon, I think. I 296 00:15:22,160 --> 00:15:23,200 Speaker 5: think Sunak 297 00:15:22,960 --> 00:15:28,440 Speaker 6: was quite excited for the opportunity, as some people 298 00:15:28,480 --> 00:15:31,560 Speaker 6: are when they're in Elon's orbit. I think you got 299 00:15:31,600 --> 00:15:34,120 Speaker 6: to keep him close, and you never know when... 300 00:15:35,280 --> 00:15:38,560 Speaker 6: The fun thing about Elon Musk is he's involved in geopolitics, 301 00:15:38,560 --> 00:15:42,560 Speaker 6: he's involved in every industry, and so love him 302 00:15:42,640 --> 00:15:44,240 Speaker 6: or hate him, you got to be friends with him 303 00:15:44,400 --> 00:15:47,160 Speaker 6: in some respects. I thought it was really interesting, he, 304 00:15:48,640 --> 00:15:51,200 Speaker 6: who knows what he truly believes here, but he said 305 00:15:51,240 --> 00:15:55,160 Speaker 6: that eventually jobs won't be necessary. 306 00:15:54,760 --> 00:15:57,800 Speaker 4: There will come a point where no job is needed. 307 00:15:58,000 --> 00:15:59,520 Speaker 6: You can have a job if you want to have 308 00:15:59,560 --> 00:16:03,760 Speaker 6: a job for sort of personal satisfaction. But the AI 309 00:16:03,880 --> 00:16:06,280 Speaker 6: will be able to do everything, we won't have jobs. 310 00:16:06,800 --> 00:16:12,560 Speaker 6: He praised Sunak for involving China in the 311 00:16:12,640 --> 00:16:15,760 Speaker 6: AI future conference, said that, you know, we need to 312 00:16:15,920 --> 00:16:18,920 Speaker 6: be thinking about their role here too. But yeah, I 313 00:16:18,920 --> 00:16:21,280 Speaker 6: think, you know, he's been doing this sort 314 00:16:21,320 --> 00:16:23,800 Speaker 6: of world tour. I think that he wishes... 315 00:16:23,880 --> 00:16:28,800 Speaker 1: But he recently met with Israel's Prime Minister, Bibi, yeah. 316 00:16:28,640 --> 00:16:32,320 Speaker 6: Before the conflict. Met with Erdogan from Turkey. It's a long, 317 00:16:32,400 --> 00:16:34,880 Speaker 6: long list. He was part of the panel, the Schumer AI panel, 318 00:16:34,960 --> 00:16:38,000 Speaker 6: and I think, I think honestly he wishes he 319 00:16:38,080 --> 00:16:42,400 Speaker 6: had the same reverence from Joe Biden. I think that 320 00:16:42,400 --> 00:16:44,800 Speaker 6: is one takeaway that I heard from 321 00:16:44,840 --> 00:16:47,320 Speaker 6: a lot of the Elon fans, like, oh man, I wish, 322 00:16:47,440 --> 00:16:49,880 Speaker 6: I wish Joe Biden looked at Elon Musk that way.
323 00:16:51,160 --> 00:16:54,000 Speaker 1: All right. So before we talk unions 324 00:16:54,000 --> 00:16:56,680 Speaker 1: and cars in a second, I want one last question 325 00:16:56,720 --> 00:16:58,880 Speaker 1: on this, which is for you, Max. So the Labour 326 00:16:58,920 --> 00:17:02,840 Speaker 1: Party in England has indeed been on Rishi Sunak's 327 00:17:02,920 --> 00:17:06,199 Speaker 1: case, saying that he was so effusive in 328 00:17:06,240 --> 00:17:08,560 Speaker 1: his praise and excited to be there. It seemed like 329 00:17:08,600 --> 00:17:12,639 Speaker 1: he was angling for his next job post 330 00:17:12,680 --> 00:17:14,800 Speaker 1: prime ministership. So I guess he would basically keep 331 00:17:15,040 --> 00:17:17,920 Speaker 1: trying to be pulling a Yaccarino, as it were, right, 332 00:17:17,960 --> 00:17:22,119 Speaker 1: like Linda Yaccarino, where you interview Musk en 333 00:17:22,160 --> 00:17:22,720 Speaker 1: route to landing... 334 00:17:23,880 --> 00:17:26,679 Speaker 5: She's the CEO of X now, after... 335 00:17:26,480 --> 00:17:30,400 Speaker 1: She had interviewed Musk on stage a few months back. 336 00:17:30,440 --> 00:17:33,280 Speaker 1: So if he pulls this off, Max, what job does 337 00:17:33,320 --> 00:17:34,440 Speaker 1: Elon Musk give Rishi? 338 00:17:35,000 --> 00:17:37,440 Speaker 2: Well, first of all, he should call Linda Yaccarino before 339 00:17:37,560 --> 00:17:39,800 Speaker 2: he, you know, follows through with this plan, because, like, 340 00:17:40,119 --> 00:17:43,680 Speaker 2: I know, officially, Linda Yaccarino is really excited about sports 341 00:17:43,720 --> 00:17:46,560 Speaker 2: on Twitter and is really excited about everything about X 342 00:17:46,640 --> 00:17:47,080 Speaker 2: and everything. 343 00:17:47,119 --> 00:17:50,200 Speaker 5: But there is a job. She loves everything. 344 00:17:50,240 --> 00:17:52,399 Speaker 2: This is not a good job. And so, Rishi, 345 00:17:52,480 --> 00:17:55,239 Speaker 2: if you're listening, uh, I just want you to think 346 00:17:55,280 --> 00:17:59,159 Speaker 2: about it carefully. That said, I think the Boring Company. Uh, 347 00:17:59,720 --> 00:18:02,760 Speaker 2: you know, Elon Musk is not gonna, you're not gonna, 348 00:18:02,760 --> 00:18:04,920 Speaker 2: you're not talking one of the primo jobs, I'd say, 349 00:18:04,920 --> 00:18:06,639 Speaker 2: in that Elonverse, you're talking about sort of a 350 00:18:07,000 --> 00:18:09,919 Speaker 2: lower level job. Then again, he was a chief executive, 351 00:18:09,960 --> 00:18:12,560 Speaker 2: so I think a CEO-type role would be, uh, 352 00:18:12,680 --> 00:18:16,160 Speaker 2: would be appropriate. So I'm going Boring, but also tunnels, 353 00:18:16,240 --> 00:18:19,800 Speaker 2: infrastructure, kind of in line for someone 354 00:18:19,800 --> 00:18:23,000 Speaker 2: who has a tunnel, right? Yeah, there's a tunnel. 355 00:18:23,080 --> 00:18:24,840 Speaker 1: A channel. Very nice, I like that. I think 356 00:18:24,840 --> 00:18:26,639 Speaker 1: that's a good call. That's one to watch. We 357 00:18:26,680 --> 00:18:29,080 Speaker 1: will keep an eye on that going forward.
358 00:18:34,000 --> 00:18:38,720 Speaker 1: Welcome back. Okay, so, Dana, on the home front, Musk 359 00:18:38,800 --> 00:18:42,960 Speaker 1: has an issue looming on the horizon, which is, uh, 360 00:18:43,240 --> 00:18:45,359 Speaker 1: the fact that Sean Fain, the leader of the 361 00:18:45,440 --> 00:18:50,480 Speaker 1: UAW, the United Auto Workers, has targeted Tesla and 362 00:18:50,560 --> 00:18:53,720 Speaker 1: said he's coming for Tesla. After scoring historic wage 363 00:18:53,720 --> 00:18:56,399 Speaker 1: gains for auto workers at Ford, General Motors and Stellantis, 364 00:18:56,960 --> 00:18:59,600 Speaker 1: he says he's coming for Tesla. Here's Fain in his 365 00:18:59,640 --> 00:18:59,960 Speaker 1: own words. 366 00:19:00,520 --> 00:19:04,000 Speaker 4: For weeks, we've seen an army of analysts and pundits 367 00:19:04,520 --> 00:19:08,560 Speaker 4: crying that our union was going too far, that we 368 00:19:08,560 --> 00:19:11,760 Speaker 4: were demanding too much. They said they weren't sure if 369 00:19:11,760 --> 00:19:14,160 Speaker 4: we would ever get a deal, and then we got 370 00:19:14,200 --> 00:19:14,640 Speaker 4: all three. 371 00:19:15,800 --> 00:19:19,000 Speaker 1: So, Dana, who exactly is Sean Fain? What does he 372 00:19:19,080 --> 00:19:24,960 Speaker 1: represent, and does he stand a chance at actually organizing 373 00:19:25,040 --> 00:19:27,600 Speaker 1: labor at Tesla, something that has failed time and again 374 00:19:27,640 --> 00:19:28,359 Speaker 1: in the past? 375 00:19:28,640 --> 00:19:31,080 Speaker 3: So Sean Fain is the new face of the United 376 00:19:31,119 --> 00:19:34,879 Speaker 3: Auto Workers. He's this kind of, like, militant, you know, 377 00:19:34,920 --> 00:19:36,560 Speaker 3: and I don't say that in a disparaging way, but 378 00:19:36,640 --> 00:19:39,400 Speaker 3: like he's this, like, new militant head of the UAW. 379 00:19:39,440 --> 00:19:43,359 Speaker 3: He just won this historic victory in Detroit after six weeks, 380 00:19:43,440 --> 00:19:46,040 Speaker 3: and now he's going to try to organize everywhere. And 381 00:19:46,640 --> 00:19:48,160 Speaker 3: there are a lot of auto plants in the United 382 00:19:48,200 --> 00:19:50,560 Speaker 3: States that are not union. It's not just Tesla, it's 383 00:19:50,600 --> 00:19:53,440 Speaker 3: like everywhere in the South. Like, you know, Volkswagen, Toyota, 384 00:19:53,440 --> 00:19:56,359 Speaker 3: and Nissan, they all have these plants that are not unionized. 385 00:19:56,400 --> 00:19:59,119 Speaker 3: And I think the UAW just has to be really strategic. 386 00:19:59,160 --> 00:20:01,320 Speaker 3: They just won this huge victory in Detroit. Where are 387 00:20:01,359 --> 00:20:03,439 Speaker 3: they going to organize next? And 388 00:20:03,520 --> 00:20:05,920 Speaker 3: of course they're going to say we're organizing at Tesla. 389 00:20:06,000 --> 00:20:07,880 Speaker 3: I mean, they're going to organize everywhere. But I think 390 00:20:07,920 --> 00:20:10,280 Speaker 3: what will be interesting to watch is where do they 391 00:20:10,320 --> 00:20:13,040 Speaker 3: actually put in their resources to develop a real campaign. 392 00:20:13,640 --> 00:20:15,440 Speaker 3: I think that they would have a much easier time 393 00:20:15,520 --> 00:20:18,120 Speaker 3: trying to organize a plant where they had 394 00:20:18,160 --> 00:20:20,480 Speaker 3: like at least gotten to a vote before.
At Tesla, 395 00:20:20,520 --> 00:20:21,960 Speaker 3: they've never even gotten to a vote. 396 00:20:22,040 --> 00:20:23,280 Speaker 2: So what would some... 397 00:20:23,200 --> 00:20:25,200 Speaker 1: Of those plants be where they've had votes, some of 398 00:20:25,240 --> 00:20:26,920 Speaker 1: them, some of the plants in the South? 399 00:20:27,119 --> 00:20:30,199 Speaker 3: Yeah, plants in the South. Like, I mean, Volkswagen's 400 00:20:30,280 --> 00:20:33,960 Speaker 3: plant in Chattanooga, Tennessee comes to mind. So, like, organizing 401 00:20:33,960 --> 00:20:36,639 Speaker 3: a campaign, it's like a long process. You need to 402 00:20:36,640 --> 00:20:39,439 Speaker 3: get, like, multiple workers on multiple shifts to all be 403 00:20:39,520 --> 00:20:41,800 Speaker 3: pro union, and they don't have that at Tesla right now. 404 00:20:41,840 --> 00:20:43,680 Speaker 3: Like, we reported a couple of days ago that there's 405 00:20:44,040 --> 00:20:47,359 Speaker 3: an organizing committee at Tesla. Okay, that's great, but like 406 00:20:47,560 --> 00:20:50,560 Speaker 3: in order to have, like, a union, you need, like, 407 00:20:50,800 --> 00:20:53,919 Speaker 3: I mean, Tesla's Fremont plant employs twenty thousand people, 408 00:20:53,960 --> 00:20:56,800 Speaker 3: you need like thousands of people to be very pro 409 00:20:56,920 --> 00:20:59,280 Speaker 3: union in order to ever get the momentum to get 410 00:20:59,320 --> 00:21:02,119 Speaker 3: to a vote. Like, when they tried last time in 411 00:21:02,119 --> 00:21:04,440 Speaker 3: twenty eighteen, it was like a handful 412 00:21:04,480 --> 00:21:07,480 Speaker 3: of people, like it never progressed, it like fizzled out, 413 00:21:07,960 --> 00:21:09,400 Speaker 3: never got to a vote. 414 00:21:09,600 --> 00:21:13,880 Speaker 1: Yeah, so tell me this. So Elon has pushed 415 00:21:13,920 --> 00:21:15,760 Speaker 1: back, and he's pushed back hard on unions, and he's 416 00:21:15,800 --> 00:21:18,919 Speaker 1: been successful in the vast bulk of his plants. What 417 00:21:19,000 --> 00:21:21,520 Speaker 1: is his argument? What does he say he is able 418 00:21:21,560 --> 00:21:24,600 Speaker 1: to do for these workers that they can't get from 419 00:21:24,640 --> 00:21:25,080 Speaker 1: a union? 420 00:21:25,160 --> 00:21:27,240 Speaker 3: So Elon's biggest argument, and I think it's a very 421 00:21:27,240 --> 00:21:30,880 Speaker 3: powerful one, is equity. Like, everyone who works at Tesla 422 00:21:31,280 --> 00:21:34,800 Speaker 3: gets equity in the company, stock. The executives get 423 00:21:35,119 --> 00:21:39,720 Speaker 3: stock options, the regular employees get RSUs, which are restricted 424 00:21:39,720 --> 00:21:42,199 Speaker 3: stock units that vest typically over a four year period, 425 00:21:42,440 --> 00:21:44,639 Speaker 3: and then every employee can also buy stock at a 426 00:21:44,680 --> 00:21:48,480 Speaker 3: discount, a fifteen percent discount, through the employee stock purchase plan. 427 00:21:48,960 --> 00:21:52,000 Speaker 3: That is a powerful lever, and a lot of people, 428 00:21:52,080 --> 00:21:56,199 Speaker 3: frankly, see the UAW as being corrupt and out of 429 00:21:56,280 --> 00:21:58,960 Speaker 3: touch with what they want.
And I mean there's some 430 00:21:59,160 --> 00:22:02,840 Speaker 3: legitimacy to that. Like, the former UAW president went to 431 00:22:02,880 --> 00:22:06,760 Speaker 3: prison for embezzling union funds, and, like, Elon will bring 432 00:22:06,800 --> 00:22:09,199 Speaker 3: that up time and time again. And so I just 433 00:22:09,200 --> 00:22:11,359 Speaker 3: think Sean Fain is going to face an uphill battle 434 00:22:11,440 --> 00:22:15,000 Speaker 3: trying to organize Tesla. They will certainly try, but there 435 00:22:15,040 --> 00:22:17,359 Speaker 3: are a lot of non union plants in the United States. 436 00:22:17,600 --> 00:22:17,879 Speaker 4: Max. 437 00:22:17,920 --> 00:22:20,800 Speaker 1: This is a very different moment for labor than it 438 00:22:20,960 --> 00:22:23,320 Speaker 1: was back in twenty eighteen and some of those previous episodes, 439 00:22:23,320 --> 00:22:25,440 Speaker 1: I mean, to be clear, coming out of the pandemic, 440 00:22:25,560 --> 00:22:28,359 Speaker 1: and history actually shows, the long arc of history dating 441 00:22:28,400 --> 00:22:31,880 Speaker 1: back centuries shows, labor has a tendency to rise up 442 00:22:31,960 --> 00:22:34,480 Speaker 1: after pandemics. This has always been the case, and so 443 00:22:34,600 --> 00:22:38,760 Speaker 1: it's just an interesting time. Yeah, perhaps it won't prove 444 00:22:38,840 --> 00:22:39,520 Speaker 1: quite so easy. 445 00:22:39,680 --> 00:22:41,320 Speaker 2: So a couple of things going on that are worth 446 00:22:41,400 --> 00:22:43,919 Speaker 2: keeping in mind. One is, like, the equation may have 447 00:22:44,000 --> 00:22:46,560 Speaker 2: changed for a Tesla worker in Fremont. So with this 448 00:22:46,680 --> 00:22:51,240 Speaker 2: big pay increase, twenty five percent for Detroit autoworkers, right, 449 00:22:52,080 --> 00:22:55,919 Speaker 2: maybe the union package or, like, the comparative advantage of 450 00:22:55,960 --> 00:22:58,320 Speaker 2: working for Tesla is not quite what it was 451 00:22:58,440 --> 00:23:01,879 Speaker 2: in twenty eighteen. The other thing is Tesla cannot grow forever. 452 00:23:02,040 --> 00:23:05,080 Speaker 2: Right, in twenty eighteen it was a very different company, right at 453 00:23:05,119 --> 00:23:08,639 Speaker 2: the beginning of the Model 3 production, it had a 454 00:23:08,680 --> 00:23:10,960 Speaker 2: lot of sort of new stuff ahead of it. At 455 00:23:10,960 --> 00:23:13,520 Speaker 2: the moment, you know, Tesla's worth around seven hundred billion 456 00:23:13,600 --> 00:23:18,000 Speaker 2: dollars, with a lot of sort of futuristic Elon Musk, 457 00:23:18,520 --> 00:23:21,720 Speaker 2: you know, fairy dust assumptions kind of baked into the 458 00:23:21,760 --> 00:23:25,600 Speaker 2: stock. It's also been relatively flat over the last year. 459 00:23:25,720 --> 00:23:27,760 Speaker 2: And then the other thing is we have seen a 460 00:23:27,760 --> 00:23:32,160 Speaker 2: lot of success by labor unions kind of targeting companies 461 00:23:32,280 --> 00:23:36,560 Speaker 2: like Tesla. And so think about the efforts to unionize Starbucks, 462 00:23:36,600 --> 00:23:38,960 Speaker 2: which for a very long time was this kind of 463 00:23:39,080 --> 00:23:43,280 Speaker 2: left leaning, sort of much admired company, admired by a 464 00:23:43,320 --> 00:23:46,240 Speaker 2: lot of liberals, that now has, you know, found itself 465 00:23:46,320 --> 00:23:48,880 Speaker 2: kind of in the crosshairs of organized labor.
The same 466 00:23:49,160 --> 00:23:54,399 Speaker 2: labor union, Workers United, that went after Starbucks is 467 00:23:54,440 --> 00:23:58,200 Speaker 2: making an effort to unionize Tesla's Buffalo plant. So there's 468 00:23:58,200 --> 00:24:00,640 Speaker 2: actually a lot of stuff happening here. And the last 469 00:24:00,680 --> 00:24:02,960 Speaker 2: thing I'll say that I think is significant is, you know, 470 00:24:03,080 --> 00:24:05,520 Speaker 2: Joe Biden got on the picket lines during the 471 00:24:05,600 --> 00:24:09,720 Speaker 2: UAW strike, and there's a real, there's like a potential 472 00:24:09,760 --> 00:24:12,800 Speaker 2: for a political risk to Tesla where you have, on 473 00:24:12,800 --> 00:24:15,240 Speaker 2: one hand, Elon Musk kind of tacking to the right, 474 00:24:15,480 --> 00:24:19,000 Speaker 2: sort of going on X and palling around with, you know, 475 00:24:19,119 --> 00:24:22,840 Speaker 2: Vivek Ramaswamy and Ron DeSantis, and lately, most recently, I 476 00:24:22,840 --> 00:24:25,000 Speaker 2: think he was tweeting some sort of Stop the Steal 477 00:24:25,119 --> 00:24:30,000 Speaker 2: adjacent videos. And you have the Democratic candidate, you know, 478 00:24:30,080 --> 00:24:33,040 Speaker 2: sort of going around saying I'm siding with unions 479 00:24:33,080 --> 00:24:35,560 Speaker 2: and so on. You know, politics have 480 00:24:35,680 --> 00:24:37,680 Speaker 2: changed a little bit in the Bay Area maybe over 481 00:24:37,720 --> 00:24:39,760 Speaker 2: the past few years, but it is a left leaning, 482 00:24:40,760 --> 00:24:43,320 Speaker 2: you know, it's a left leaning area. And if this 483 00:24:43,400 --> 00:24:47,960 Speaker 2: becomes super politicized, that could also affect Tesla's ability to 484 00:24:48,040 --> 00:24:49,600 Speaker 2: keep its workers from forming a union. 485 00:24:50,960 --> 00:24:53,000 Speaker 3: But Musk can also shift production, so he also has 486 00:24:53,040 --> 00:24:55,280 Speaker 3: a huge plant in Texas, which is a right to 487 00:24:55,320 --> 00:24:57,359 Speaker 3: work state. So, like, I just sort of feel like 488 00:24:57,400 --> 00:25:00,320 Speaker 3: it's a risk to the Tesla workers. Like, okay, they 489 00:25:00,320 --> 00:25:02,240 Speaker 3: try to organize the union, then Elon could be like, 490 00:25:02,240 --> 00:25:03,840 Speaker 3: you know what, like, forget it, like, I'm going to 491 00:25:03,880 --> 00:25:05,800 Speaker 3: shift production to the South. I've got this new plant 492 00:25:05,840 --> 00:25:08,879 Speaker 3: in Mexico. Like, there's risks sort of all around. But, 493 00:25:09,000 --> 00:25:10,919 Speaker 3: you know, I think what Tesla lacks right now is, 494 00:25:11,640 --> 00:25:14,480 Speaker 3: Christian Smalls is the, like, famous Amazon employee who, like, 495 00:25:14,600 --> 00:25:16,960 Speaker 3: organized the Amazon union. Like, I don't know who that 496 00:25:17,080 --> 00:25:19,760 Speaker 3: is at Tesla. Is it Sean Fain coming from Detroit 497 00:25:19,840 --> 00:25:22,199 Speaker 3: to, like, try to organize, or is there, like, someone 498 00:25:22,240 --> 00:25:25,000 Speaker 3: internally who's going to, like, rise up and be that worker? 499 00:25:25,119 --> 00:25:27,560 Speaker 3: Like, I just think that organizing works best when it's internal.
500 00:25:27,880 --> 00:25:30,440 Speaker 3: The UAW has never been able to penetrate that plant, 501 00:25:30,520 --> 00:25:33,560 Speaker 3: and yes, they will try again, but I just still 502 00:25:33,560 --> 00:25:35,600 Speaker 3: think the odds of success are long. 503 00:25:35,640 --> 00:25:41,000 Speaker 1: So, Dana, but in Europe, he actually faces some labor pressure there, right? 504 00:25:40,840 --> 00:25:42,200 Speaker 3: In Sweden and in Germany. 505 00:25:42,280 --> 00:25:47,360 Speaker 1: Yeah, and in general, those pressures in Europe, and any 506 00:25:47,400 --> 00:25:49,840 Speaker 1: wage pressures that may bubble up here in the US, 507 00:25:49,920 --> 00:25:51,960 Speaker 1: they come at a slightly tricky time for Tesla in 508 00:25:52,000 --> 00:25:55,359 Speaker 1: the sense that margins are shrinking right now. Profit margins 509 00:25:55,400 --> 00:25:56,280 Speaker 1: are shrinking at Tesla. 510 00:25:56,840 --> 00:25:58,560 Speaker 3: Yeah, I mean, listen, this is a great time for 511 00:25:58,640 --> 00:26:01,440 Speaker 3: labor to make its move, no doubt, and they will. 512 00:26:01,480 --> 00:26:03,840 Speaker 3: And in Europe, like, the trade unions are very powerful, 513 00:26:03,880 --> 00:26:07,080 Speaker 3: so the trade union effort in places like Germany, they 514 00:26:07,119 --> 00:26:09,439 Speaker 3: do have levers and they can push, and, like, you know, 515 00:26:09,480 --> 00:26:12,919 Speaker 3: Tesla will probably raise wages to accommodate that. But in 516 00:26:12,960 --> 00:26:15,399 Speaker 3: the United States, like, I think we have to be 517 00:26:15,440 --> 00:26:18,240 Speaker 3: honest that, like, the cage match that we really want 518 00:26:18,320 --> 00:26:20,600 Speaker 3: is Sean Fain versus Elon Musk. That's what everyone is 519 00:26:20,640 --> 00:26:23,479 Speaker 3: gunning for. Like, you know, the Zuckerberg thing is never 520 00:26:23,520 --> 00:26:26,359 Speaker 3: going to happen, but people want Sean Fain to come. 521 00:26:26,960 --> 00:26:29,399 Speaker 3: They really want, people want to see what this looks like, 522 00:26:29,560 --> 00:26:30,440 Speaker 3: because, well... 523 00:26:31,040 --> 00:26:32,919 Speaker 6: But if we don't think that it's going to happen, 524 00:26:33,080 --> 00:26:35,960 Speaker 6: then I think that the biggest takeaway is what is 525 00:26:36,000 --> 00:26:39,639 Speaker 6: this going to do to Elon's... If there is a 526 00:26:39,760 --> 00:26:42,200 Speaker 6: war declared, Sean Fain versus Elon 527 00:26:41,920 --> 00:26:44,480 Speaker 5: Musk, he's going to take that really personally. 528 00:26:44,560 --> 00:26:47,160 Speaker 6: I mean, Sean Fain has said that he doesn't think 529 00:26:47,200 --> 00:26:51,879 Speaker 6: billionaires should exist. He has a lot of 530 00:26:52,000 --> 00:26:55,359 Speaker 6: feeling around that, and we could see that come in and 531 00:26:55,480 --> 00:26:58,679 Speaker 6: impact his politics for the twenty twenty four election. On 532 00:26:59,000 --> 00:27:02,879 Speaker 6: X, he's already, as Max said, cozied up to some 533 00:27:03,440 --> 00:27:07,280 Speaker 6: of the far right and the conservative candidates. This is 534 00:27:07,280 --> 00:27:12,920 Speaker 6: something, he now has power over this megaphone. 535 00:27:12,800 --> 00:27:14,840 Speaker 3: Yeah, he can shadow ban the UAW tweets. I mean, 536 00:27:14,840 --> 00:27:15,520 Speaker 3: he could do all, yeah.
537 00:27:15,880 --> 00:27:17,800 Speaker 6: I mean, you know, who knows what he'll do. But 538 00:27:17,880 --> 00:27:20,000 Speaker 6: I do think that we need to think of all 539 00:27:20,080 --> 00:27:22,679 Speaker 6: of these things in the context of his new power 540 00:27:23,160 --> 00:27:25,720 Speaker 6: over the conversation ahead of a major presidential election. 541 00:27:25,920 --> 00:27:29,800 Speaker 1: Yep. Okay, so we have a very pro union president 542 00:27:29,800 --> 00:27:32,320 Speaker 1: in the White House, we have Elon Musk, who's always 543 00:27:32,320 --> 00:27:35,119 Speaker 1: been a libertarian, tacking further to the right. We have 544 00:27:35,240 --> 00:27:38,119 Speaker 1: this labor moment and the rise of Sean Fain, and 545 00:27:38,160 --> 00:27:41,520 Speaker 1: Sean Fain setting his sights on Tesla. Max, what does 546 00:27:41,560 --> 00:27:45,639 Speaker 1: all this mean for the election one year from today? 547 00:27:45,720 --> 00:27:48,640 Speaker 2: All right, so here's my sort of take on what's 548 00:27:48,720 --> 00:27:52,439 Speaker 2: going through Elon's head and, like, why he's, you know, 549 00:27:52,440 --> 00:27:55,040 Speaker 2: doing the things he's doing politically. Like, if you're wondering 550 00:27:55,119 --> 00:27:57,800 Speaker 2: why Elon Musk, who maybe ten years ago you thought 551 00:27:57,840 --> 00:28:01,880 Speaker 2: of as, like, an Obama adjacent, kind of left of center, uh, 552 00:28:02,480 --> 00:28:05,320 Speaker 2: Silicon Valley solutionist, you know, kind of like a green 553 00:28:05,520 --> 00:28:08,479 Speaker 2: capitalist who's building solar panels, like, why is he suddenly 554 00:28:08,480 --> 00:28:12,360 Speaker 2: tweeting about voting machines in Georgia and, you know, complaining 555 00:28:12,400 --> 00:28:16,639 Speaker 2: about, like, transgender rights? It's partly because 556 00:28:16,680 --> 00:28:20,040 Speaker 2: of this union thing. It's because he's dealing with 557 00:28:20,320 --> 00:28:24,359 Speaker 2: political headwinds that are new, and Joe Biden, after getting 558 00:28:24,400 --> 00:28:28,760 Speaker 2: elected president, really embraced Detroit and these union shops, even 559 00:28:29,080 --> 00:28:32,840 Speaker 2: around electric cars. Elon Musk felt kind of squeezed out 560 00:28:32,840 --> 00:28:34,480 Speaker 2: of that. And I think both on a 561 00:28:34,520 --> 00:28:37,159 Speaker 2: personal level, like, felt kind of, like, offended that the 562 00:28:37,200 --> 00:28:39,800 Speaker 2: Democrats, like, weren't thanking him more for all the 563 00:28:39,840 --> 00:28:42,520 Speaker 2: work he'd done to popularize the technology, and 564 00:28:42,560 --> 00:28:43,320 Speaker 2: on a political level... 565 00:28:43,360 --> 00:28:45,960 Speaker 1: But to be clear, I mean, Tesla absolutely positively does 566 00:28:46,080 --> 00:28:48,160 Speaker 1: dominate the EV space in this country. 567 00:28:48,720 --> 00:28:52,240 Speaker 2: Right, and so maybe his aggrievement was justified. 568 00:28:52,280 --> 00:28:54,120 Speaker 2: But you got to also just look at the 569 00:28:54,120 --> 00:28:57,400 Speaker 2: political reality, which is that the Democrats consider organized labor 570 00:28:57,400 --> 00:29:00,880 Speaker 2: an important constituency, and Elon Musk has set himself kind 571 00:29:00,880 --> 00:29:03,200 Speaker 2: of opposed to that.
And I think as you've seen 572 00:29:03,280 --> 00:29:07,240 Speaker 2: Biden kind of, you know, move closer to Detroit 573 00:29:07,440 --> 00:29:10,440 Speaker 2: and to the UAW, you're seeing Elon sort 574 00:29:10,480 --> 00:29:14,040 Speaker 2: of embrace, you know, right wing causes and in particular 575 00:29:14,120 --> 00:29:16,440 Speaker 2: Republican candidates, you know. And initially it looked like he 576 00:29:16,560 --> 00:29:20,080 Speaker 2: was gonna maybe be a DeSantis supporter. Then DeSantis's, like, 577 00:29:20,120 --> 00:29:23,280 Speaker 2: political prospects haven't looked totally great if you're following the polls. 578 00:29:23,320 --> 00:29:26,400 Speaker 2: Lately he sort of tweaked, uh, you know, pivoted to 579 00:29:26,520 --> 00:29:29,240 Speaker 2: Vivek Ramaswamy. That's not looking so good. And lately we're 580 00:29:29,280 --> 00:29:32,520 Speaker 2: seeing, you know, Donald Trump, the likely Republican candidate, 581 00:29:32,560 --> 00:29:35,120 Speaker 2: and I think Elon is right now trying to find 582 00:29:35,120 --> 00:29:38,640 Speaker 2: his way, you know, back into, you know, the 583 00:29:38,680 --> 00:29:41,840 Speaker 2: good graces of MAGA, because of course, for a few 584 00:29:41,920 --> 00:29:43,680 Speaker 2: years there, there was a little bit of a 585 00:29:43,680 --> 00:29:46,600 Speaker 2: strange estrangement between Elon World and Trump 586 00:29:46,320 --> 00:29:48,720 Speaker 6: World, even though he's been on the record in Walter 587 00:29:48,760 --> 00:29:51,760 Speaker 6: Isaacson's book saying he compared Trump to a con man. 588 00:29:52,280 --> 00:29:54,440 Speaker 2: Yeah, I think it's a good bet that Trump has 589 00:29:54,480 --> 00:29:57,400 Speaker 2: not read that, and I think that's certainly 590 00:29:58,000 --> 00:29:58,840 Speaker 2: Elon's bet. 591 00:29:59,000 --> 00:30:03,400 Speaker 1: Okay. So we have our final segment. We 592 00:30:03,760 --> 00:30:06,080 Speaker 1: are going to be bringing this back periodically. We're calling 593 00:30:06,160 --> 00:30:10,040 Speaker 1: it Cage Match Watch. And, you know, Dana tried to 594 00:30:10,040 --> 00:30:12,400 Speaker 1: give us this whole Sean Fain-Elon Musk thing here, 595 00:30:12,600 --> 00:30:15,760 Speaker 1: that's not the cage match. The cage match we're all 596 00:30:15,800 --> 00:30:19,400 Speaker 1: waiting for, and it's gonna happen: Elon Musk versus Mark Zuckerberg. 597 00:30:19,640 --> 00:30:19,920 Speaker 2: Max. 598 00:30:19,920 --> 00:30:21,200 Speaker 1: There have been some developments. 599 00:30:21,240 --> 00:30:23,080 Speaker 2: Well, as you know, David, I have done a lot 600 00:30:23,120 --> 00:30:25,480 Speaker 2: of analysis, a lot of careful thinking on this. And 601 00:30:25,920 --> 00:30:28,080 Speaker 2: you also know that I am an Elon truther. You know, 602 00:30:28,160 --> 00:30:31,760 Speaker 2: I say, conventional wisdom says that Mark Zuckerberg, who trains 603 00:30:31,800 --> 00:30:35,120 Speaker 2: with, you know, famous mixed martial arts fighters, he looks 604 00:30:35,200 --> 00:30:36,800 Speaker 2: kind of jacked when you see pictures of him with 605 00:30:36,840 --> 00:30:38,840 Speaker 2: his shirt off, that he is a shoo-in to win. 606 00:30:39,480 --> 00:30:42,720 Speaker 2: I've argued, as you know, that Elon is very big. 607 00:30:42,760 --> 00:30:44,600 Speaker 2: He's got a long reach.
He, you know, he's like 608 00:30:44,680 --> 00:30:47,480 Speaker 2: six foot four or something, around two hundred and 609 00:30:47,480 --> 00:30:49,560 Speaker 2: forty, two hundred and fifty pounds. You know, he's gonna 610 00:30:49,600 --> 00:30:51,800 Speaker 2: have a big advantage. But I was listening to Joe 611 00:30:51,880 --> 00:30:55,160 Speaker 2: Rogan's podcast, and I have an update, a strategy update 612 00:30:55,600 --> 00:30:58,680 Speaker 2: to share about Elon, which is that I don't think 613 00:30:59,120 --> 00:31:01,880 Speaker 2: he knows what he's doing. There were long stretches in 614 00:31:01,920 --> 00:31:02,400 Speaker 2: this podcast... 615 00:31:02,680 --> 00:31:04,800 Speaker 1: But he's a lifelong brawler. I mean, the reason 616 00:31:04,880 --> 00:31:07,360 Speaker 1: why I'm coming around to Musk a little bit, which 617 00:31:07,400 --> 00:31:10,480 Speaker 1: I was initially skeptical, he's been fighting for decades. 618 00:31:10,640 --> 00:31:13,080 Speaker 2: Yes, he said he's got a PhD in street fighting. 619 00:31:14,000 --> 00:31:16,880 Speaker 2: He, uh, there were these long sections. 620 00:31:16,920 --> 00:31:19,560 Speaker 2: Everyone should listen to this. This is very important. Everyone, stop 621 00:31:19,600 --> 00:31:21,760 Speaker 2: what you're doing right now, pull your car over. Listen 622 00:31:21,840 --> 00:31:24,240 Speaker 2: to forty five minutes of Elon and Joe Rogan talking 623 00:31:24,240 --> 00:31:30,160 Speaker 2: about this. But there are long sections of discussion around 624 00:31:30,240 --> 00:31:33,520 Speaker 2: moves, and I gotta say Elon Musk did not 625 00:31:33,840 --> 00:31:37,640 Speaker 2: sound very confident or well informed on the tactics of 626 00:31:37,760 --> 00:31:38,600 Speaker 2: jiu jitsu or 627 00:31:38,520 --> 00:31:41,000 Speaker 1: any other... In the heat of the moment, 628 00:31:41,080 --> 00:31:43,080 Speaker 1: he's in the ring, it's just gonna, that 629 00:31:43,080 --> 00:31:45,560 Speaker 1: street brawler in him is just gonna come out. 630 00:31:45,640 --> 00:31:48,320 Speaker 3: There's another development, Sarah, like what happened to Mark 631 00:31:48,480 --> 00:31:51,800 Speaker 6: Zuckerberg. Tore his ACL, has just got surgery, which is 632 00:31:51,880 --> 00:31:54,440 Speaker 6: kind of interesting because Musk said that he 633 00:31:54,520 --> 00:31:56,960 Speaker 6: might need to get surgery to strengthen his neck, which 634 00:31:57,000 --> 00:31:59,400 Speaker 6: is, I don't think that's what surgery does. He said 635 00:31:59,400 --> 00:32:02,440 Speaker 6: that, but he's not a doctor. You know, that's true. I 636 00:32:02,480 --> 00:32:05,560 Speaker 6: don't want to spread misinformation on this podcast. These men 637 00:32:05,800 --> 00:32:09,800 Speaker 6: spend millions of dollars on personal physical security. They have, 638 00:32:10,120 --> 00:32:14,600 Speaker 6: they have, like, former Secret Service agents following them around, 639 00:32:15,120 --> 00:32:18,360 Speaker 6: and yet they're putting themselves in physical danger all the time. 640 00:32:19,600 --> 00:32:23,120 Speaker 2: The only security these guys need are their own fists. 641 00:32:25,160 --> 00:32:26,320 Speaker 2: All right, that's it. 642 00:32:26,600 --> 00:32:28,320 Speaker 1: Enough of the cage match talk. 643 00:32:28,680 --> 00:32:29,480 Speaker 2: Let's call it quits.
644 00:32:29,920 --> 00:32:32,440 Speaker 1: Thanks for listening to the first ever episode of Elon Inc. 645 00:32:32,720 --> 00:32:36,760 Speaker 1: And thanks to our contributors Max, Sarah, Dana. 646 00:32:36,880 --> 00:32:37,440 Speaker 3: My pleasure. 647 00:32:37,720 --> 00:32:39,800 Speaker 2: Thank you, always a pleasure. 648 00:32:47,000 --> 00:32:51,240 Speaker 1: Our supervising producer is Magnus Hendrickson. This episode was produced by 649 00:32:51,240 --> 00:32:55,400 Speaker 1: Stacy Wang. Naomi Shaven and Rayhan Harmanci are our senior editors. 650 00:32:55,760 --> 00:32:59,360 Speaker 1: The idea for this very show also came from Rayhan. 651 00:33:00,080 --> 00:33:03,640 Speaker 1: Blake Maples handles engineering, and we get special editing assistance 652 00:33:03,680 --> 00:33:07,560 Speaker 1: from Jeff Grocott. Thanks a bunch to Businessweek editor Joel 653 00:33:07,600 --> 00:33:10,840 Speaker 1: Weber as well. The Elon Inc. theme is written and 654 00:33:10,880 --> 00:33:16,040 Speaker 1: performed by Taka Yasuzawa and Alex Sugiura. Sage Bauman is 655 00:33:16,040 --> 00:33:19,240 Speaker 1: the head of Bloomberg Podcasts and our executive producer. I 656 00:33:19,480 --> 00:33:22,800 Speaker 1: am David Papadopoulos. If you have a minute, rate and 657 00:33:22,840 --> 00:33:25,560 Speaker 1: review the show, it'll help other listeners find us. 658 00:33:25,920 --> 00:33:26,800 Speaker 2: See you next week.