1 00:00:02,480 --> 00:00:08,560 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. Let me tell you 2 00:00:08,600 --> 00:00:11,559 Speaker 1: we have a new star. A star is born. 3 00:00:11,640 --> 00:00:14,720 Speaker 2: Elon comes up on Mars Juson Kennemy. 4 00:00:14,880 --> 00:00:18,520 Speaker 3: He is the Thomas Edison plus plus plus of our age. 5 00:00:18,640 --> 00:00:21,079 Speaker 2: Probably his whole life is from a position of insecurity. 6 00:00:21,079 --> 00:00:23,239 Speaker 1: I feel for the guy. I would say ninety eight 7 00:00:23,239 --> 00:00:25,360 Speaker 1: percent really appreciate what he does. 8 00:00:25,400 --> 00:00:28,480 Speaker 4: But those two percent that are nasty, they are up 9 00:00:28,880 --> 00:00:29,880 Speaker 4: in fourth post. 10 00:00:30,280 --> 00:00:32,360 Speaker 1: We are meant for great things in the United States 11 00:00:32,360 --> 00:00:36,440 Speaker 1: of America, and Elon reminds us of that. We don't 12 00:00:36,440 --> 00:00:46,320 Speaker 1: have a fourth branch of government called Elon Musk. Welcome 13 00:00:46,320 --> 00:00:50,480 Speaker 1: to Elon, Inc., Bloomberg's weekly podcast about Elon Musk. It's Tuesday, 14 00:00:50,560 --> 00:00:54,840 Speaker 1: April twenty second. I'm your host, David Papadopoulos. Now, sometimes 15 00:00:54,920 --> 00:00:57,639 Speaker 1: lost in the shuffle of all things Elon is one 16 00:00:57,680 --> 00:01:02,280 Speaker 1: of his most high tech and potentially consequential companies, Neuralink. 17 00:01:02,600 --> 00:01:04,600 Speaker 1: Last year, the company managed to insert a chip in 18 00:01:04,640 --> 00:01:06,880 Speaker 1: the brain of a paralyzed man that allows him to, 19 00:01:07,200 --> 00:01:11,399 Speaker 1: among other things, move the pieces of an online chessboard 20 00:01:11,520 --> 00:01:15,520 Speaker 1: just by thinking, which is pretty cool.
The company has 21 00:01:15,560 --> 00:01:17,520 Speaker 1: big plans for twenty twenty five and is trying to 22 00:01:17,520 --> 00:01:20,200 Speaker 1: move fast in an industry that is quickly heating up. 23 00:01:20,200 --> 00:01:23,639 Speaker 1: So I asked Sarah Frier, longtime regular on this show, 24 00:01:23,640 --> 00:01:25,880 Speaker 1: to hop on and catch us all up on the 25 00:01:26,000 --> 00:01:31,000 Speaker 1: race to control human brain waves. But first, it's Tesla 26 00:01:31,160 --> 00:01:34,520 Speaker 1: earnings day. Now, I know, I know it can often 27 00:01:34,560 --> 00:01:38,760 Speaker 1: seem like every Tesla earnings call is a do or 28 00:01:38,920 --> 00:01:42,240 Speaker 1: die moment for Elon, but this one really does feel big, 29 00:01:42,800 --> 00:01:46,520 Speaker 1: with DOGE a lightning rod of controversy, Tesla's sales and 30 00:01:46,560 --> 00:01:51,000 Speaker 1: stock price sinking, and global protests raging. Even longtime Tesla 31 00:01:51,000 --> 00:01:54,240 Speaker 1: bulls are worried, very worried that permanent damage has been 32 00:01:54,240 --> 00:01:56,840 Speaker 1: done to the brand. So when Elon takes the mic 33 00:01:56,880 --> 00:02:00,280 Speaker 1: this evening, the stakes will be very high. To talk 34 00:02:00,320 --> 00:02:03,200 Speaker 1: about what to expect from the earnings report and the call, 35 00:02:03,320 --> 00:02:06,800 Speaker 1: and what Elon can and cannot achieve on it, I'm 36 00:02:06,840 --> 00:02:10,480 Speaker 1: sitting here with our regulars, Max Chafkin and Dana Hull. 37 00:02:10,840 --> 00:02:14,799 Speaker 1: Max, hello to you, sir. Hello David. Hey Dana. Hello again. 38 00:02:15,320 --> 00:02:19,480 Speaker 1: Hey Dana. So, Dana.
I will note that, as I 39 00:02:19,680 --> 00:02:24,680 Speaker 1: was being the good Bloomberg employee, working hard 40 00:02:24,880 --> 00:02:27,840 Speaker 1: at night Sunday night, I saw a story with the 41 00:02:27,880 --> 00:02:32,720 Speaker 1: headline drop: Tesla bull calls code red, saying Musk needs 42 00:02:32,760 --> 00:02:37,520 Speaker 1: to leave DOGE. This is our good friend, the bull 43 00:02:37,560 --> 00:02:41,560 Speaker 1: of bulls, Dan Ives, the man who is to Tesla 44 00:02:41,639 --> 00:02:46,520 Speaker 1: bulls what Max Chafkin is to Mets fandom. Okay, and 45 00:02:46,760 --> 00:02:49,200 Speaker 1: you have him quoted in here from his report, Dana, 46 00:02:49,280 --> 00:02:54,240 Speaker 1: saying Tesla is Musk, and Musk is Tesla, and anyone 47 00:02:54,320 --> 00:02:57,400 Speaker 1: who thinks the brand damage Musk has inflicted is not a 48 00:02:57,440 --> 00:03:00,680 Speaker 1: real thing should just spend some time speaking to car buyers 49 00:03:00,680 --> 00:03:05,400 Speaker 1: in the US, Europe and Asia. Then, Dana, just minutes 50 00:03:05,639 --> 00:03:08,120 Speaker 1: before we came up to record this show, you dropped 51 00:03:08,160 --> 00:03:13,079 Speaker 1: this piece. Treasurers of eight US states, including California and Illinois, 52 00:03:13,639 --> 00:03:17,399 Speaker 1: shot a letter to the Tesla board that essentially says, Hey, 53 00:03:17,400 --> 00:03:22,480 Speaker 1: Tesla board, the company is driverless. There are no driverless cars, 54 00:03:22,480 --> 00:03:26,200 Speaker 1: but we do have a driver-less company. Where's the CEO? 55 00:03:26,840 --> 00:03:30,680 Speaker 1: Why aren't you bringing the CEO to heel? So it 56 00:03:30,840 --> 00:03:33,519 Speaker 1: strikes me that on the cusp of this earnings call, 57 00:03:33,960 --> 00:03:36,720 Speaker 1: the mood is getting bleaker by the minute.
58 00:03:37,120 --> 00:03:39,640 Speaker 3: This is all true, and at the same time, like 59 00:03:39,840 --> 00:03:41,720 Speaker 3: I feel like I'm in Groundhog Day because I have 60 00:03:41,840 --> 00:03:45,960 Speaker 3: been at this very point countless other times, like most 61 00:03:46,120 --> 00:03:50,080 Speaker 3: recently when Musk bought Twitter, which is now called X. 62 00:03:50,160 --> 00:03:53,040 Speaker 3: And like the board and these pension funds and these 63 00:03:53,080 --> 00:03:55,960 Speaker 3: states like all went through the same exercise. So it's like, 64 00:03:56,040 --> 00:03:58,560 Speaker 3: I don't want to downplay the severity of the crisis 65 00:03:58,600 --> 00:04:01,520 Speaker 3: that Tesla is in, I just want to table-set by 66 00:04:01,520 --> 00:04:03,840 Speaker 3: saying that I have been here before. And typically what 67 00:04:03,920 --> 00:04:07,280 Speaker 3: happens is things are dire. Investors are upset. There's all 68 00:04:07,320 --> 00:04:10,160 Speaker 3: this like scrambling. People are like, why doesn't Tesla have 69 00:04:10,200 --> 00:04:13,200 Speaker 3: a COO or a president? What's the succession plan? Why 70 00:04:13,200 --> 00:04:15,920 Speaker 3: doesn't Elon ever focus on this company? And then he 71 00:04:16,000 --> 00:04:18,719 Speaker 3: pulls something out of a hat and he gives investors 72 00:04:18,880 --> 00:04:21,719 Speaker 3: enough to be excited about, whether it's AI or the 73 00:04:21,800 --> 00:04:25,720 Speaker 3: Robotaxi or his end-to-end neural nets or whatever, 74 00:04:25,839 --> 00:04:29,279 Speaker 3: that like they get like excited again about the rosy future. 75 00:04:29,400 --> 00:04:33,880 Speaker 3: And so once again we're coming into an earnings call where, yes, 76 00:04:34,040 --> 00:04:38,000 Speaker 3: like sales in the first quarter were terrible, gross margins 77 00:04:38,000 --> 00:04:40,240 Speaker 3: are going to be terrible.
Free cash flow is something 78 00:04:40,240 --> 00:04:43,080 Speaker 3: to look at, but like investors already know that it's 79 00:04:43,080 --> 00:04:46,080 Speaker 3: a terrible quarter. So what's going to be important is 80 00:04:46,600 --> 00:04:49,360 Speaker 3: Elon's tone on the call, and Max and I often 81 00:04:49,400 --> 00:04:53,240 Speaker 3: like text each other like, oh today we have dire Elon. 82 00:04:53,000 --> 00:04:55,200 Speaker 4: Or oh no, he's actually ebullient Elon. 83 00:04:55,360 --> 00:04:58,320 Speaker 3: And I think the big question is: is he going to 84 00:04:58,400 --> 00:05:00,920 Speaker 3: say in his prepared remarks his time at DOGE is 85 00:05:00,960 --> 00:05:02,720 Speaker 3: coming to an end? Or are any of the sell 86 00:05:02,800 --> 00:05:06,440 Speaker 3: side analysts actually gonna have the balls to ask him 87 00:05:06,480 --> 00:05:08,880 Speaker 3: when he's coming back? Or are they going to 88 00:05:08,920 --> 00:05:11,200 Speaker 3: avoid it completely? Like, that is the elephant in the room. 89 00:05:11,279 --> 00:05:14,280 Speaker 3: Like, the brand damage is all because of his work 90 00:05:14,040 --> 00:05:18,640 Speaker 1: with DOGE. And Max, that is true that there's always 91 00:05:18,640 --> 00:05:21,279 Speaker 1: been some sort of rabbit in the hat that Elon 92 00:05:21,400 --> 00:05:23,240 Speaker 1: has been able to pull out in these earnings calls 93 00:05:23,240 --> 00:05:26,720 Speaker 1: and hey, look over here, don't mind the fundamentals or 94 00:05:26,720 --> 00:05:28,960 Speaker 1: this or that. Look at what's coming in three months, 95 00:05:29,000 --> 00:05:31,559 Speaker 1: six months, twelve years, thirty years. That's what you should 96 00:05:31,560 --> 00:05:35,760 Speaker 1: be excited about.
My question is, given what we've seen 97 00:05:35,839 --> 00:05:40,200 Speaker 1: in the last few months and the state of affairs 98 00:05:40,240 --> 00:05:45,200 Speaker 1: there at Tesla, are those rabbits gonna work? Unless the 99 00:05:45,320 --> 00:05:48,400 Speaker 1: rabbit is: I, Elon Musk, am leaving DOGE in the 100 00:05:48,440 --> 00:05:51,440 Speaker 1: next twenty four hours and I will start paying, 101 00:05:51,640 --> 00:05:55,040 Speaker 1: really paying, attention to Tesla, along with SpaceX and xAI 102 00:05:55,279 --> 00:05:57,960 Speaker 1: and Neuralink and all these other things. But no, I'm 103 00:05:58,000 --> 00:06:01,040 Speaker 1: really doing that. I'm really leaving DOGE. Okay, that would 104 00:06:01,040 --> 00:06:03,919 Speaker 1: be a hell of a rabbit. Do the other rabbits 105 00:06:04,000 --> 00:06:05,400 Speaker 1: work right now? 106 00:06:05,360 --> 00:06:09,919 Speaker 5: I think it's getting harder to see what he could say, because, 107 00:06:10,120 --> 00:06:13,760 Speaker 5: as Dana is saying, like for years now, he has 108 00:06:13,880 --> 00:06:18,320 Speaker 5: been trying to spin the story forward towards driverless cars, 109 00:06:18,800 --> 00:06:24,680 Speaker 5: towards humanoid robots, towards giant supercomputers or whatever, and like 110 00:06:24,760 --> 00:06:27,839 Speaker 5: at a certain point there's the like, okay, like show 111 00:06:27,839 --> 00:06:31,120 Speaker 5: me the money, show me the driverless car.
I mean, 112 00:06:31,200 --> 00:06:34,960 Speaker 5: Tesla's shareholders have been very patient with Elon Musk, and 113 00:06:35,000 --> 00:06:36,599 Speaker 5: I think, like there's no reason to think that their 114 00:06:36,680 --> 00:06:42,040 Speaker 5: patience is gonna instantly snap. Cathie Wood and her ilk, 115 00:06:42,200 --> 00:06:46,160 Speaker 5: who have been basically counting on the Robotaxi to create 116 00:06:46,440 --> 00:06:50,320 Speaker 5: this enormous, you know, multi trillion dollar economy that will 117 00:06:50,400 --> 00:06:54,120 Speaker 5: essentially like replace the entire auto industry, like, I don't 118 00:06:54,120 --> 00:06:58,640 Speaker 5: think enough has changed to cause them to doubt that thesis. 119 00:06:58,720 --> 00:07:02,960 Speaker 5: So I think it is more likely that we're gonna 120 00:07:03,040 --> 00:07:04,800 Speaker 5: hear more of the same from Elon Musk. He's going 121 00:07:04,880 --> 00:07:08,800 Speaker 5: to talk about the Austin Robotaxi. Like, they are framing 122 00:07:08,839 --> 00:07:11,600 Speaker 5: this as an earnings call and a company update. Like 123 00:07:11,720 --> 00:07:15,679 Speaker 5: my prediction is that the company update is just, it's 124 00:07:15,760 --> 00:07:17,240 Speaker 5: kind of the thing we know about. It's that 125 00:07:17,600 --> 00:07:21,000 Speaker 5: there's gonna be some sort of Robotaxi trial in Austin 126 00:07:21,560 --> 00:07:24,679 Speaker 5: in June soon. And I think you're absolutely right, David, 127 00:07:24,680 --> 00:07:27,400 Speaker 5: that if Elon Musk said that. Now, the way I think 128 00:07:27,400 --> 00:07:28,440 Speaker 5: you framed it is not how 129 00:07:28,360 --> 00:07:28,920 Speaker 1: he would do it. 130 00:07:29,160 --> 00:07:31,560 Speaker 5: What he would do is he would declare victory. He 131 00:07:31,600 --> 00:07:34,720 Speaker 5: would say, you know, we've successfully remade the government.
132 00:07:34,960 --> 00:07:36,480 Speaker 1: We brought the bureaucrats down. 133 00:07:36,720 --> 00:07:39,679 Speaker 5: Now I feel like it's safe to go back to Tesla. 134 00:07:40,240 --> 00:07:42,960 Speaker 5: I don't know that he is ready to do that. 135 00:07:43,480 --> 00:07:46,360 Speaker 5: I haven't seen a lot of signs that he is 136 00:07:46,720 --> 00:07:50,360 Speaker 5: withdrawing from Trump. It does seem like Trump World is 137 00:07:50,440 --> 00:07:53,120 Speaker 5: maybe attempting to put a little bit of distance between 138 00:07:53,160 --> 00:07:56,120 Speaker 5: itself and Elon. But I'm not sure Elon is ready 139 00:07:56,160 --> 00:07:57,720 Speaker 5: to quit Trump. And at the end of the day, 140 00:07:57,880 --> 00:07:59,520 Speaker 5: I don't know that Trump is ready to quit Elon. 141 00:07:59,560 --> 00:08:01,200 Speaker 3: A couple of things I want to mention. You know, 142 00:08:01,400 --> 00:08:05,760 Speaker 3: the macro environment is crappy. I mean, treasury yields are rising, 143 00:08:06,000 --> 00:08:09,200 Speaker 3: the stock market is tanking, like, everyone is talking 144 00:08:09,200 --> 00:08:12,800 Speaker 3: about a recession. This tariff policy that has everyone super 145 00:08:12,840 --> 00:08:16,040 Speaker 3: confused is not good. And Tesla is not immune from tariffs. 146 00:08:16,040 --> 00:08:18,640 Speaker 3: I mean they import a lot of their supply chain, 147 00:08:18,720 --> 00:08:22,400 Speaker 3: including their LFP batteries, which come from China. So that's like 148 00:08:22,440 --> 00:08:24,920 Speaker 3: a whole other thing that is weighing on the stock 149 00:08:25,040 --> 00:08:27,720 Speaker 3: besides Elon and DOGE. It's just like the macro environment 150 00:08:27,800 --> 00:08:30,400 Speaker 3: is awful. Interest rates are still really high.
People typically 151 00:08:30,440 --> 00:08:33,120 Speaker 3: do not buy cars when interest rates are high and 152 00:08:33,120 --> 00:08:35,040 Speaker 3: when people are losing their jobs left and right. So 153 00:08:35,840 --> 00:08:38,439 Speaker 3: that's like a whole other thing, the tariff implication. 154 00:08:38,600 --> 00:08:42,920 Speaker 3: But then there's also just continuing market confusion about 155 00:08:42,920 --> 00:08:44,959 Speaker 3: this idea that Tesla is going to come out with 156 00:08:45,240 --> 00:08:48,200 Speaker 3: a cheaper car. Like, Ed Ludlow and I reported a 157 00:08:48,240 --> 00:08:50,880 Speaker 3: year ago that there really is no cheaper car. Like 158 00:08:51,000 --> 00:08:54,200 Speaker 3: the quote unquote cheaper mass market car is maybe like 159 00:08:54,320 --> 00:08:57,559 Speaker 3: a cheaper version of the Model Y and the Model 160 00:08:57,600 --> 00:09:01,680 Speaker 3: three that's like lower range, decontented, but like there's no 161 00:09:01,960 --> 00:09:05,000 Speaker 3: new model. And you know, a couple of quarters ago, 162 00:09:05,080 --> 00:09:07,240 Speaker 3: Tesla said, oh, we're going to be making this cheaper 163 00:09:07,280 --> 00:09:09,560 Speaker 3: car in the first half of the year. Now, Reuters 164 00:09:09,600 --> 00:09:11,959 Speaker 3: had a story out saying that that has been delayed, 165 00:09:12,000 --> 00:09:14,880 Speaker 3: but it's like, of course it's delayed, like it's not 166 00:09:14,960 --> 00:09:18,480 Speaker 3: really a new model. So I just think that Tesla 167 00:09:18,760 --> 00:09:21,080 Speaker 3: pivoted very clearly, like a year ago, to the idea that the 168 00:09:21,120 --> 00:09:25,640 Speaker 3: future is AI, autonomy, robots and not high volumes. 169 00:09:25,679 --> 00:09:28,199 Speaker 4: But that's another thing that people are going to 170 00:09:28,240 --> 00:09:28,760 Speaker 4: be looking for.
171 00:09:29,120 --> 00:09:32,640 Speaker 1: They've achieved that: the volumes are going down fast, and 172 00:09:32,920 --> 00:09:37,760 Speaker 1: their one new rollout, the Cybertruck, which you know, 173 00:09:37,800 --> 00:09:39,920 Speaker 1: it's funny. We have mentioned on the show that it 174 00:09:40,000 --> 00:09:42,960 Speaker 1: hasn't gotten off to a roaring start, but I guess 175 00:09:42,960 --> 00:09:46,560 Speaker 1: it was only this morning that I saw the actual numbers. 176 00:09:46,559 --> 00:09:48,559 Speaker 1: And you guys feel free to yell at me if 177 00:09:48,559 --> 00:09:50,440 Speaker 1: you've said these numbers on the show and I just 178 00:09:50,480 --> 00:09:56,880 Speaker 1: failed to properly pay attention. But apparently Cox Automotive estimates 179 00:09:56,920 --> 00:10:01,640 Speaker 1: that Cybertruck sales dropped fifty percent first quarter 180 00:10:01,760 --> 00:10:04,920 Speaker 1: versus fourth quarter. Now, I know there's some, you know, 181 00:10:05,040 --> 00:10:08,760 Speaker 1: imperfections about doing a quarter on quarter comparison like that, 182 00:10:08,800 --> 00:10:12,880 Speaker 1: but still, Jesus Christ, I mean, come on, 183 00:10:12,920 --> 00:10:15,240 Speaker 1: they should be ramping up wildly, right? 184 00:10:15,360 --> 00:10:17,439 Speaker 3: Well, they've had a lot of recalls of the Cybertruck, 185 00:10:17,640 --> 00:10:20,520 Speaker 3: and the Cybertruck is still expensive. And the fourth 186 00:10:20,559 --> 00:10:22,760 Speaker 3: quarter is always the big quarter because you know there's 187 00:10:22,760 --> 00:10:25,199 Speaker 3: a big sort of end of the year push. But yeah, no, 188 00:10:25,280 --> 00:10:27,640 Speaker 3: the Cybertruck is a, I mean, Cybertruck is 189 00:10:27,640 --> 00:10:28,959 Speaker 3: not a big seller. You're about to 190 00:10:28,880 --> 00:10:32,240 Speaker 1: say disaster. You were going to say disaster.
Is 191 00:10:32,280 --> 00:10:33,600 Speaker 1: it a disaster, Dana Hull? 192 00:10:34,360 --> 00:10:35,840 Speaker 3: But the other thing that I think is so funny is 193 00:10:35,840 --> 00:10:38,480 Speaker 3: that the Cybertruck, of all the cars in Tesla's 194 00:10:38,480 --> 00:10:41,240 Speaker 3: lineup, is like the epitome of Elon Musk himself. 195 00:10:41,360 --> 00:10:44,480 Speaker 3: So that's why you're seeing like dog walkers in Brooklyn 196 00:10:44,640 --> 00:10:47,880 Speaker 3: leaving their, like, you know, bags on the 197 00:10:47,920 --> 00:10:50,600 Speaker 3: truck. It is the epitome of Elon himself 198 00:10:50,640 --> 00:10:55,160 Speaker 3: because he wanted this thing. His other executives wanted, like, 199 00:10:55,600 --> 00:10:58,120 Speaker 3: you know, to go high volume. Like, the Cybertruck 200 00:10:58,200 --> 00:11:00,160 Speaker 3: is Elon, and Elon is the Cybertruck. So as 201 00:11:00,200 --> 00:11:03,679 Speaker 3: much as the backlash is impacting the brand, it is 202 00:11:03,720 --> 00:11:06,240 Speaker 3: that vehicle that is bearing the brunt of it. 203 00:11:06,240 --> 00:11:10,680 Speaker 5: It's the new Elon. It's the, it's the trolling, social 204 00:11:10,800 --> 00:11:16,840 Speaker 5: media addicted, you know, hyper partisan Trump supporter version of Elon. 205 00:11:16,920 --> 00:11:19,360 Speaker 5: Of course, like the Model three is also Elon. I 206 00:11:19,360 --> 00:11:20,960 Speaker 5: mean, and that is the thing that like is 207 00:11:21,000 --> 00:11:25,080 Speaker 5: so weird about what has happened.
Like Tesla was able 208 00:11:25,120 --> 00:11:27,720 Speaker 5: to break the mold essentially for an automaker in a 209 00:11:27,720 --> 00:11:30,400 Speaker 5: bunch of different ways, and one of the things 210 00:11:30,400 --> 00:11:32,640 Speaker 5: it had going for it was Elon Musk and the 211 00:11:32,679 --> 00:11:36,400 Speaker 5: brand that he has like cultivated around himself. And what 212 00:11:36,440 --> 00:11:40,600 Speaker 5: we've seen is like taking that asset, that thing that 213 00:11:40,559 --> 00:11:45,680 Speaker 6: made Tesla special and made it so successful, and throwing 214 00:11:45,720 --> 00:11:49,400 Speaker 6: it in the wood chipper along with USAID and, you know, 215 00:11:49,679 --> 00:11:51,439 Speaker 6: other parts of the federal government. 216 00:11:56,000 --> 00:12:01,200 Speaker 1: Now, one thing we haven't mentioned yet is Optimus. Going 217 00:12:01,280 --> 00:12:03,080 Speaker 1: to space? Or are we going to get a lot 218 00:12:03,080 --> 00:12:05,960 Speaker 1: of Optimus talk on this call? Optimus the robot. 219 00:12:05,880 --> 00:12:07,640 Speaker 4: Oh yeah, come on, sure. 220 00:12:08,160 --> 00:12:10,439 Speaker 3: I mean, hey, Musk says that Optimus is going to 221 00:12:10,520 --> 00:12:12,719 Speaker 3: go to Mars on Starship by like the end of 222 00:12:12,800 --> 00:12:16,040 Speaker 3: next year, and he's very excited about Optimus. 223 00:12:16,040 --> 00:12:17,199 Speaker 4: I mean, that is like. 224 00:12:17,320 --> 00:12:19,360 Speaker 3: I think on the last call he talked about Optimus 225 00:12:19,400 --> 00:12:21,640 Speaker 3: more than the Robotaxi. So yeah, I'm sure we'll get 226 00:12:21,679 --> 00:12:22,680 Speaker 3: an Optimus update. 227 00:12:23,200 --> 00:12:23,400 Speaker 1: You know.
228 00:12:23,640 --> 00:12:25,280 Speaker 3: The other thing that I think is just so weird 229 00:12:25,280 --> 00:12:27,520 Speaker 3: about the earnings calls, and Max and I have listened 230 00:12:27,520 --> 00:12:29,640 Speaker 3: to like so many of them over the years, is 231 00:12:29,640 --> 00:12:33,480 Speaker 3: that they never introduce who the other executives are. So like, 232 00:12:33,520 --> 00:12:35,440 Speaker 3: I'm always, right, I'm 233 00:12:35,280 --> 00:12:36,079 Speaker 4: always annoyed. 234 00:12:36,160 --> 00:12:38,280 Speaker 3: They never say, and now we're gonna hear from, 235 00:12:38,360 --> 00:12:41,480 Speaker 3: you know, like so and so who leads, like, Autopilot software. 236 00:12:41,679 --> 00:12:45,440 Speaker 3: Like, Elon talks and then like some other random executive pipes up. 237 00:12:44,960 --> 00:12:49,200 Speaker 1: You ain't Elon Musk. You're just, you're just a sidekick, 238 00:12:49,440 --> 00:12:52,360 Speaker 1: an anonymous, faceless sidekick. 239 00:12:52,440 --> 00:12:55,040 Speaker 3: I mean, like, multiple chats, and everyone's like, who's speaking? 240 00:12:55,040 --> 00:12:57,040 Speaker 3: And I'm like, you know, this sounds like Lars. Oh no, 241 00:12:57,120 --> 00:13:00,040 Speaker 3: it's not, that's Franz. Like, whatever. You know, so they 242 00:13:00,120 --> 00:13:03,080 Speaker 3: don't bother introducing the other executives. And then even if 243 00:13:03,080 --> 00:13:06,000 Speaker 3: the other executives talk, Elon talks over them and interrupts them. 244 00:13:06,080 --> 00:13:09,520 Speaker 3: It's just like, it's unlike any other earnings call 245 00:13:09,480 --> 00:13:10,880 Speaker 1: that you've ever listened to. 246 00:13:11,160 --> 00:13:12,960 Speaker 3: And it's like, Max and I are so 247 00:13:13,040 --> 00:13:15,480 Speaker 3: used to it because we usually play bingo. But like, 248 00:13:15,520 --> 00:13:17,800 Speaker 3: if you've never listened to a Tesla earnings call,
You 249 00:13:17,920 --> 00:13:20,600 Speaker 3: really should. It will give you a lot of insight 250 00:13:20,679 --> 00:13:24,160 Speaker 3: into how these things go. And it's also just frankly 251 00:13:24,280 --> 00:13:27,200 Speaker 3: like so disappointing. I mean, can I say this on 252 00:13:27,240 --> 00:13:30,480 Speaker 3: a Bloomberg show? Like, the sell side analysts, they're all 253 00:13:30,480 --> 00:13:33,079 Speaker 3: about, like, trying to perfect their models. They don't actually 254 00:13:33,120 --> 00:13:36,080 Speaker 3: ask the hard questions. The best person who really would 255 00:13:36,160 --> 00:13:39,520 Speaker 3: try to nail Elon on particulars was Toni Sacconaghi at Bernstein, 256 00:13:40,320 --> 00:13:41,280 Speaker 3: and he retired. 257 00:13:41,360 --> 00:13:45,040 Speaker 1: But Dan Ives, as worked up as he is right now, 258 00:13:45,520 --> 00:13:48,679 Speaker 1: goes out and grills Elon with a tough question or two? 259 00:13:49,240 --> 00:13:51,920 Speaker 3: He almost never gets called on to ask 260 00:13:51,920 --> 00:13:52,520 Speaker 3: the questions. 261 00:13:52,640 --> 00:13:56,439 Speaker 1: Okay, okay, Max, we need to wrap up here on Tesla, 262 00:13:56,520 --> 00:13:59,320 Speaker 1: but let me ask you this before we do. 263 00:14:00,120 --> 00:14:04,560 Speaker 1: Tesla has been saying this isn't just a quarterly earnings report, 264 00:14:04,640 --> 00:14:08,040 Speaker 1: it's a company update. Are they already starting to spin 265 00:14:08,160 --> 00:14:10,840 Speaker 1: us here? Have we got some big grand reveal? 266 00:14:11,280 --> 00:14:14,719 Speaker 5: I mean, I think he needs, given how poor we're 267 00:14:14,760 --> 00:14:17,440 Speaker 5: expecting the earnings to be, given how bad the sales 268 00:14:17,440 --> 00:14:20,960 Speaker 5: have been, I think he's going to be looking for 269 00:14:21,080 --> 00:14:24,160 Speaker 5: some way, as we've been talking about, to spin this forward.
270 00:14:24,240 --> 00:14:27,960 Speaker 5: And I think that company update, it could be, as 271 00:14:28,120 --> 00:14:31,080 Speaker 5: Dana said, an update about the way that the 272 00:14:31,120 --> 00:14:33,800 Speaker 5: CEO is spending his time, although again, I 273 00:14:33,920 --> 00:14:36,920 Speaker 5: kind of doubt it. It could be an update 274 00:14:37,000 --> 00:14:40,920 Speaker 5: about Robotaxi efforts in Austin, or it could be 275 00:14:41,280 --> 00:14:44,560 Speaker 5: about something further down the pipe. And I think you 276 00:14:44,600 --> 00:14:48,000 Speaker 5: asked Dana about Optimus. It is certain he will bring 277 00:14:48,080 --> 00:14:52,200 Speaker 5: up Optimus, because what Elon Musk has done, always has done, 278 00:14:52,360 --> 00:14:55,360 Speaker 5: is found ways to sort of push the camera 279 00:14:55,440 --> 00:14:58,840 Speaker 5: out, like the focal point, way into the future to 280 00:14:58,920 --> 00:15:01,880 Speaker 5: this like really optimistic future. And at this point, 281 00:15:02,160 --> 00:15:06,480 Speaker 5: like robotaxis are kind of real, and they're also kind 282 00:15:06,520 --> 00:15:11,520 Speaker 5: of obviously underwhelming. And so you know, Waymo has been 283 00:15:11,520 --> 00:15:14,600 Speaker 5: doing the thing that Elon Musk has been promising 284 00:15:15,040 --> 00:15:18,200 Speaker 5: for years, and last I checked, it hasn't created 285 00:15:18,240 --> 00:15:21,960 Speaker 5: a thirty trillion dollar industry.
So like, as we get 286 00:15:21,960 --> 00:15:25,720 Speaker 5: closer to the actual rollout of the Robotaxi, whatever it is, 287 00:15:25,800 --> 00:15:27,840 Speaker 5: and I promise you it's not going to be 288 00:15:27,920 --> 00:15:29,880 Speaker 5: coast to coast. Really, 289 00:15:29,920 --> 00:15:32,320 Speaker 5: the Robotaxi is going to be like Waymo or something: 290 00:15:32,680 --> 00:15:35,520 Speaker 5: in one city, which has been happening for years and years. 291 00:15:35,640 --> 00:15:38,640 Speaker 5: So Elon's going to have to find some other way 292 00:15:38,960 --> 00:15:41,840 Speaker 5: to keep that focal point in the future. And I think, 293 00:15:41,880 --> 00:15:43,960 Speaker 5: you know, humanoid robots is probably what it is. 294 00:15:43,960 --> 00:15:47,720 Speaker 1: Well, well, this is now truly the last thought I 295 00:15:47,720 --> 00:15:50,400 Speaker 1: will leave you guys with on this: that our good 296 00:15:50,440 --> 00:15:53,640 Speaker 1: friend of the pod, Esha Dey, in her piece previewing 297 00:15:53,680 --> 00:15:57,200 Speaker 1: the earnings this morning, had a quote from somebody saying, 298 00:15:57,600 --> 00:16:00,600 Speaker 1: because as much as Tesla's stock has fallen, it 299 00:16:00,600 --> 00:16:04,200 Speaker 1: still trades at seventy eight times forward earnings, which, for 300 00:16:04,240 --> 00:16:07,320 Speaker 1: those who aren't familiar with that metric, is an awful 301 00:16:07,360 --> 00:16:10,040 Speaker 1: lot of times forward earnings. And the quote 302 00:16:10,040 --> 00:16:13,680 Speaker 1: from this one investor is: of all the stocks in 303 00:16:13,760 --> 00:16:16,600 Speaker 1: the Magnificent Seven, the tech stocks in the Mag Seven, it 304 00:16:16,640 --> 00:16:22,040 Speaker 1: is Tesla whose valuation is almost entirely about the future. Basically, 305 00:16:22,080 --> 00:16:25,440 Speaker 1: Max, the stock is bulled up.
306 00:16:25,960 --> 00:16:27,560 Speaker 6: And that's the thing, like it could be. 307 00:16:27,800 --> 00:16:31,080 Speaker 5: I think one of the big sort of like misunderstandings 308 00:16:31,120 --> 00:16:34,960 Speaker 5: of Elon Musk's support for Donald Trump is this sense 309 00:16:35,040 --> 00:16:37,600 Speaker 5: that like he was coming from a place of strength, 310 00:16:37,600 --> 00:16:39,800 Speaker 5: and of course he was, he's the world's richest man. 311 00:16:39,960 --> 00:16:42,400 Speaker 1: But in other ways this is a company that 312 00:16:44,000 --> 00:16:47,480 Speaker 5: has always kind of been teetering, and it's always halfway 313 00:16:47,480 --> 00:16:51,760 Speaker 5: between, you know, taking over the world and, like, wildly 314 00:16:52,680 --> 00:16:56,520 Speaker 5: underperforming expectations. And that's why people like Dana and me 315 00:16:56,880 --> 00:16:59,320 Speaker 5: are a little bit like, you know, like, why it's 316 00:16:59,320 --> 00:17:01,840 Speaker 5: hard to get excited about any one call. Like, oh, 317 00:17:01,840 --> 00:17:04,119 Speaker 5: this is going to be, I mean, we are excited 318 00:17:04,160 --> 00:17:06,280 Speaker 5: because this is our Super Bowl, but you know, we 319 00:17:06,320 --> 00:17:09,040 Speaker 5: see this movie every three months. So, like, yeah, 320 00:17:09,119 --> 00:17:10,480 Speaker 5: like, I don't think this is going to be the 321 00:17:10,520 --> 00:17:14,080 Speaker 5: thing that does in Elon as CEO of Tesla. Like, you know, 322 00:17:14,160 --> 00:17:15,520 Speaker 5: but, but on the other hand, 323 00:17:15,320 --> 00:17:15,719 Speaker 6: I don't know. 324 00:17:15,920 --> 00:17:18,080 Speaker 5: We live in, we live in crazy times. 325 00:17:18,240 --> 00:17:20,600 Speaker 3: I just want to say one thing: nothing will compare, 326 00:17:20,840 --> 00:17:21,680 Speaker 3: as crazy 327 00:17:21,400 --> 00:17:22,760 Speaker 4: as now is with Trump.
328 00:17:23,040 --> 00:17:26,240 Speaker 3: Nothing will compare to twenty eighteen, which, lest I remind you, 329 00:17:26,400 --> 00:17:28,919 Speaker 3: is the year that Tesla struggled to ramp up the 330 00:17:28,960 --> 00:17:32,000 Speaker 3: Model three. Yes, Elon tried to take the company private 331 00:17:32,040 --> 00:17:34,280 Speaker 3: through the four-twenty thing. He smoked pot with 332 00:17:34,359 --> 00:17:38,200 Speaker 3: Joe Rogan. He called the Thai cave rescuer a pedo guy. 333 00:17:38,880 --> 00:17:41,720 Speaker 3: Executives were quitting left and right, like in real time, 334 00:17:41,800 --> 00:17:45,320 Speaker 3: including the chief accounting officer. Like, for me, that 335 00:17:45,359 --> 00:17:48,760 Speaker 3: would be, that is twenty eighteen. 336 00:17:49,400 --> 00:17:51,600 Speaker 1: Those are all good points, to which, you know, I 337 00:17:51,600 --> 00:17:55,359 Speaker 1: would say: so Max, at the time that Dana is describing, 338 00:17:55,760 --> 00:17:58,480 Speaker 1: maybe truly the company was teetering. It was trying to 339 00:17:58,520 --> 00:18:01,080 Speaker 1: scale up and get to where it is now. 340 00:18:01,440 --> 00:18:04,800 Speaker 1: I don't think the company is teetering so much now. 341 00:18:04,880 --> 00:18:06,800 Speaker 1: I think it's not. When you, it's not. 342 00:18:08,200 --> 00:18:11,199 Speaker 7: Still have a car company? Yeah, right. What you have 343 00:18:11,520 --> 00:18:16,680 Speaker 7: is a stock, right, that is just incredibly valued even 344 00:18:16,760 --> 00:18:18,359 Speaker 7: after a fifty percent drop. 345 00:18:18,840 --> 00:18:21,080 Speaker 1: That's the, you know, the big question. So when I said, 346 00:18:21,119 --> 00:18:23,240 Speaker 1: I should also reframe it: the stock is bulled up, 347 00:18:23,760 --> 00:18:26,240 Speaker 1: but boy, there's a lot of hocus pocus there. Yeah.
348 00:18:26,320 --> 00:18:29,120 Speaker 5: Yeah, Tesla the business is a very good 349 00:18:29,160 --> 00:18:34,240 Speaker 5: business. But the stock, yeah, Tesla the 350 00:18:34,280 --> 00:18:38,840 Speaker 5: stock though, as you're saying, 351 00:18:38,840 --> 00:18:43,159 Speaker 5: a lot of that value is 352 00:18:43,200 --> 00:18:46,159 Speaker 5: about the future and about assumptions that the 353 00:18:46,240 --> 00:18:50,240 Speaker 5: market has made, and I'm not totally sure 354 00:18:50,520 --> 00:18:53,280 Speaker 5: whether there's much 355 00:18:53,280 --> 00:18:56,359 Speaker 5: of a basis in reality for those assumptions. 356 00:18:56,560 --> 00:19:05,639 Speaker 1: Dana, thank you as always. Thanks, thank you. Okay, so 357 00:19:05,920 --> 00:19:09,760 Speaker 1: now I'm joined by one of the OGs of the show, 358 00:19:10,680 --> 00:19:14,639 Speaker 1: Sarah Frier, big tech team leader here at Bloomberg. Sarah, 359 00:19:14,680 --> 00:19:17,920 Speaker 1: great to have you on. Thanks for having me. Okay, Sarah. 360 00:19:18,000 --> 00:19:21,280 Speaker 1: So we're going to talk now about one of Elon's 361 00:19:21,320 --> 00:19:24,639 Speaker 1: companies that, well, for us at least, has kind of 362 00:19:24,680 --> 00:19:28,720 Speaker 1: fallen off the radar just because it's been drowned out 363 00:19:28,800 --> 00:19:35,320 Speaker 1: by Tesla and Doze and Doze and Doze and SpaceX 364 00:19:35,400 --> 00:19:40,159 Speaker 1: and Tesla and Doze. And that's Neuralink, which is 365 00:19:40,200 --> 00:19:44,200 Speaker 1: a pretty fascinating company, but that again just doesn't 366 00:19:44,520 --> 00:19:48,320 Speaker 1: make news every second of every day. For starters,
remind 367 00:19:48,680 --> 00:19:53,840 Speaker 1: our listeners what exactly Neuralink is and where it was 368 00:19:54,520 --> 00:19:57,560 Speaker 1: as we last understood it, so that we can get 369 00:19:57,560 --> 00:19:58,720 Speaker 1: into new developments. 370 00:19:59,359 --> 00:20:02,080 Speaker 2: Must be nice to be at the Musk company that's 371 00:20:02,119 --> 00:20:05,880 Speaker 2: not under the microscope, not super involved with what's going 372 00:20:05,880 --> 00:20:08,600 Speaker 2: on in DC, thinking about the future. Like, that's the 373 00:20:08,600 --> 00:20:09,560 Speaker 2: sweet spot, right? 374 00:20:09,840 --> 00:20:12,240 Speaker 1: Just quietly beavering away doing their work. 375 00:20:13,280 --> 00:20:18,679 Speaker 2: So, Neuralink is one of Musk's companies that is in 376 00:20:18,720 --> 00:20:21,960 Speaker 2: the business of putting chips in people's brains to help 377 00:20:22,000 --> 00:20:26,120 Speaker 2: them do things that their bodies can't do. Right now, 378 00:20:26,160 --> 00:20:30,439 Speaker 2: they're focusing on quadriplegics, people who have no ability to 379 00:20:30,600 --> 00:20:34,439 Speaker 2: use their arms or legs, and they've only had so 380 00:20:34,600 --> 00:20:39,760 Speaker 2: far, publicly, three patients for this device, which again involves 381 00:20:40,400 --> 00:20:45,440 Speaker 2: surgical implantation in the brain, attaching electrodes to your brain 382 00:20:45,480 --> 00:20:51,240 Speaker 2: tissue to help you move a cursor on a computer, 383 00:20:51,960 --> 00:20:56,680 Speaker 2: to help you make your needs known via your 384 00:20:56,720 --> 00:21:01,840 Speaker 2: actual brain waves. Musk has talked about it eventually leading us 385 00:21:01,880 --> 00:21:07,600 Speaker 2: to symbiosis with artificial intelligence.
That's extremely far away from 386 00:21:07,640 --> 00:21:10,879 Speaker 2: the current reality, which is still very exciting, which is 387 00:21:10,960 --> 00:21:14,680 Speaker 2: that they have this human trial that started last year 388 00:21:14,760 --> 00:21:21,920 Speaker 2: where they are implanting devices into people's brains for real. 389 00:21:22,720 --> 00:21:27,240 Speaker 2: They put their first chip in a patient named Nolan Arbaugh. 390 00:21:27,600 --> 00:21:30,560 Speaker 2: We talked about that on the podcast in the past. 391 00:21:31,040 --> 00:21:35,000 Speaker 2: Since then, they've implanted the device in a few more people, 392 00:21:35,000 --> 00:21:37,159 Speaker 2: and they've improved the technique. 393 00:21:37,480 --> 00:21:40,480 Speaker 1: For a second, before we talk about that and the 394 00:21:40,520 --> 00:21:44,159 Speaker 1: improvement in the technique, tell us about Nolan Arbaugh and 395 00:21:44,200 --> 00:21:46,440 Speaker 1: what made him the right candidate for this chip. 396 00:21:47,240 --> 00:21:51,760 Speaker 2: Well, Nolan Arbaugh was a quadriplegic, so he fit the 397 00:21:51,840 --> 00:21:55,280 Speaker 2: parameters of what they needed to prove that they could 398 00:21:55,480 --> 00:21:59,399 Speaker 2: actually make something that read somebody's thoughts, that would allow 399 00:21:59,520 --> 00:22:04,520 Speaker 2: him to now, for instance, play video games independently, which 400 00:22:04,560 --> 00:22:09,040 Speaker 2: is really remarkable. This is on the road to helping 401 00:22:09,080 --> 00:22:12,720 Speaker 2: with other motor conditions such as Parkinson's. 402 00:22:13,320 --> 00:22:15,440 Speaker 4: They have a few
403 00:22:16,880 --> 00:22:22,240 Speaker 2: initiatives looking into how they could work on vision. This is 404 00:23:23,280 --> 00:22:26,960 Speaker 2: a breakthrough-status project. Breakthrough status is what the FDA gives 405 00:22:26,960 --> 00:22:29,840 Speaker 2: you if you're working on something that is so important 406 00:22:29,840 --> 00:22:33,159 Speaker 2: it should be accelerated. Neuralink has one called Blindsight, 407 00:22:33,760 --> 00:22:39,080 Speaker 2: which is supposed to help with eye issues. So unfortunately 408 00:22:39,160 --> 00:22:43,760 Speaker 2: for Arbaugh, some of the electrodes detached from his 409 00:22:43,920 --> 00:22:48,040 Speaker 2: brain tissue over the course of wearing the implant, and 410 00:22:48,119 --> 00:22:50,439 Speaker 2: so the company had to rectify that with 411 00:22:50,520 --> 00:22:53,480 Speaker 2: future patients. They now have a 412 00:22:54,280 --> 00:22:57,480 Speaker 2: total of three: there was a second later in 413 00:22:57,560 --> 00:23:01,840 Speaker 2: twenty twenty four, then there was somebody early in twenty 414 00:23:01,880 --> 00:23:05,920 Speaker 2: twenty five. This year they've tried to reduce the space 415 00:23:06,200 --> 00:23:10,240 Speaker 2: between the chip and the brain—no air bubbles. Now, 416 00:23:10,440 --> 00:23:15,040 Speaker 2: even though this is a company that is relatively quiet 417 00:23:15,119 --> 00:23:17,720 Speaker 2: in the Muskverse, this is going to be a big 418 00:23:17,760 --> 00:23:19,919 Speaker 2: year for them. They've said that there might be twenty 419 00:23:19,920 --> 00:23:23,000 Speaker 2: to thirty more patients this year. 420 00:23:23,160 --> 00:23:26,320 Speaker 1: Well, whoa, whoa, whoa. So you go from basically, if 421 00:23:26,320 --> 00:23:29,919 Speaker 1: my math is right, one, not even one a quarter, 422 00:23:30,359 --> 00:23:33,439 Speaker 1: one every four months or so.
So now we're going 423 00:23:33,520 --> 00:23:35,600 Speaker 1: to be doing a few a month. 424 00:23:36,760 --> 00:23:39,080 Speaker 4: Well, that is the hope, 425 00:23:39,160 --> 00:23:43,160 Speaker 2: I think. They just said that they are opening their 426 00:23:43,480 --> 00:23:47,959 Speaker 2: PRIME trial, which is basically their integrated medical 427 00:23:47,960 --> 00:23:51,480 Speaker 2: device trial, the trial for this telepathy chip. They tweeted 428 00:23:51,520 --> 00:23:53,320 Speaker 2: saying they're calling for patients. 429 00:23:53,359 --> 00:23:55,080 Speaker 4: So that is a 430 00:23:56,680 --> 00:23:58,960 Speaker 2: sign to the world like, listen, we're ready for more. 431 00:23:59,880 --> 00:24:03,199 Speaker 2: I think that that may have something to do also 432 00:24:03,320 --> 00:24:06,440 Speaker 2: with the amount of competition that's cropping up. There are 433 00:24:06,480 --> 00:24:10,359 Speaker 2: a lot of startups that are doing more in this 434 00:24:10,560 --> 00:24:14,520 Speaker 2: space as we speak. Just recently, just a few days ago, 435 00:24:14,680 --> 00:24:19,280 Speaker 2: Science Corp raised over one hundred million from Khosla. They're 436 00:24:19,359 --> 00:24:23,520 Speaker 2: starting with a retina implant to help with eye issues. 437 00:24:24,000 --> 00:24:26,679 Speaker 2: They also got breakthrough status from the FDA. But there 438 00:24:26,680 --> 00:24:32,240 Speaker 2: are a few other startups, Paradromics, Precision Neuroscience, that are 439 00:24:32,280 --> 00:24:33,800 Speaker 2: working on similar things. 440 00:24:34,280 --> 00:24:36,439 Speaker 1: You're saying the space is getting even more heated and 441 00:24:36,480 --> 00:24:40,960 Speaker 1: more competitive, and Neuralink was chasing to begin with, right? 442 00:24:41,400 --> 00:24:41,800 Speaker 2: But I 443 00:24:41,720 --> 00:24:45,640 Speaker 2: think that Neuralink was very well funded.
I mean, they 444 00:24:45,640 --> 00:24:48,080 Speaker 2: have more than six hundred million in funding. They last 445 00:24:48,200 --> 00:24:51,080 Speaker 2: raised in twenty twenty three at a three point five 446 00:24:51,119 --> 00:24:54,359 Speaker 2: billion valuation. I think a lot of that excitement and 447 00:24:54,480 --> 00:24:58,359 Speaker 2: funding around Neuralink is linked to Elon Musk himself. And 448 00:24:58,440 --> 00:25:00,399 Speaker 2: I also think that you can't really get away 449 00:25:00,440 --> 00:25:03,040 Speaker 2: with cutting corners in this market the way you might 450 00:25:03,119 --> 00:25:06,080 Speaker 2: be able to in a Tesla supply chain. 451 00:25:06,280 --> 00:25:08,119 Speaker 1: You cut corners and people die. 452 00:25:08,960 --> 00:25:11,719 Speaker 2: Yes, yes, and that would be very bad for business. 453 00:25:11,960 --> 00:25:15,479 Speaker 1: Okay, now, so you mentioned that Nolan Arbaugh is able 454 00:25:15,520 --> 00:25:20,199 Speaker 1: to play video games with his mind. Now, he and the 455 00:25:20,240 --> 00:25:25,520 Speaker 1: other patients, what concretely, if anything beyond playing video games, 456 00:25:25,520 --> 00:25:28,080 Speaker 1: are they able to do with this technology? Understood, of course, 457 00:25:28,080 --> 00:25:30,679 Speaker 1: that this technology is still developing, and the idea is that 458 00:25:30,840 --> 00:25:32,720 Speaker 1: slowly but surely they're going to be able to add 459 00:25:32,720 --> 00:25:35,560 Speaker 1: and add and add capacity, the aspiration being at some 460 00:25:35,760 --> 00:25:40,120 Speaker 1: point that these quadriplegics are even able to stand up 461 00:25:40,280 --> 00:25:44,199 Speaker 1: and walk. But as of right now, beyond playing video games, 462 00:25:44,359 --> 00:25:46,600 Speaker 1: are they doing anything else?
463 00:25:47,359 --> 00:25:51,520 Speaker 2: It's basically that Nolan Arbaugh can control a computer with 464 00:25:51,560 --> 00:25:54,400 Speaker 2: his thoughts, which goes a long way in today's world. 465 00:25:54,520 --> 00:25:56,600 Speaker 2: Right, you can move a cursor around and click on 466 00:25:56,720 --> 00:26:00,679 Speaker 2: things the way you would do with your hand or 467 00:26:01,040 --> 00:26:03,359 Speaker 2: on a phone, but you can't if you're a quadriplegic. 468 00:26:03,480 --> 00:26:07,560 Speaker 2: So that opens up a lot of possibilities to communicate, 469 00:26:08,280 --> 00:26:11,600 Speaker 2: to make choices on the internet, and yes, to play 470 00:26:11,680 --> 00:26:14,560 Speaker 2: video games. So I think that that is the main 471 00:26:14,920 --> 00:26:18,200 Speaker 2: claim to fame at the moment. Eventually, yes, they hope 472 00:26:18,240 --> 00:26:22,639 Speaker 2: to make patients able to do all sorts of things. Again, 473 00:26:23,080 --> 00:26:27,040 Speaker 2: they plan to use this chip to work on conditions 474 00:26:27,080 --> 00:26:31,359 Speaker 2: from obesity to bipolar to Parkinson's, and then eventually communication 475 00:26:31,560 --> 00:26:37,200 Speaker 2: between humans and computers directly, that kind of telepathic symbiosis, 476 00:26:37,280 --> 00:26:38,919 Speaker 2: as Musk has said. 477 00:26:39,080 --> 00:26:41,239 Speaker 1: Yeah, and I suppose it is possible, and I just 478 00:26:41,640 --> 00:26:44,960 Speaker 1: can't exactly—I'm just not sure what I should make 479 00:26:44,960 --> 00:26:46,320 Speaker 1: of this. But at some point we're going to be 480 00:26:46,320 --> 00:26:50,639 Speaker 1: able to simply download, like, entire encyclopedias into our brains 481 00:26:50,720 --> 00:26:53,199 Speaker 1: and just retain them there, and I don't know, it 482 00:26:53,240 --> 00:26:54,720 Speaker 1: all just starts getting really weird.
483 00:26:55,800 --> 00:27:00,880 Speaker 2: It should eventually help us with memory retention, with, you know, 484 00:27:01,040 --> 00:27:04,400 Speaker 2: keeping us true to our diet plans, that sort of thing. 485 00:27:04,400 --> 00:27:06,359 Speaker 2: I mean, that's all kind of what's been talked about 486 00:27:06,400 --> 00:27:08,119 Speaker 2: in this realm. 487 00:27:08,400 --> 00:27:10,120 Speaker 1: I guess I just wonder this, though: if I start 488 00:27:10,200 --> 00:27:14,359 Speaker 1: downloading volumes of encyclopedias into my brain, will I still 489 00:27:14,480 --> 00:27:18,359 Speaker 1: have enough bandwidth to remember all the 490 00:27:18,480 --> 00:27:19,240 Speaker 1: names in my mind? 491 00:27:19,320 --> 00:27:21,720 Speaker 2: I expect that it'll be the computer that has the 492 00:27:21,800 --> 00:27:26,240 Speaker 2: volumes of encyclopedias and you just access 493 00:27:26,280 --> 00:27:27,120 Speaker 2: it as you need. 494 00:27:27,359 --> 00:27:29,960 Speaker 1: I see, like it's up there in the cloud. 495 00:27:29,800 --> 00:27:32,320 Speaker 2: Like the way we do now with the internet. 496 00:27:32,520 --> 00:27:34,439 Speaker 2: But, you know, it's not like our 497 00:27:34,440 --> 00:27:36,760 Speaker 2: brains are going to have—well, who knows, right? I 498 00:27:36,760 --> 00:27:39,000 Speaker 2: don't know what the future holds. What I do know 499 00:27:39,359 --> 00:27:43,560 Speaker 2: is, while companies like Neuralink are talking about this pie 500 00:27:43,560 --> 00:27:46,520 Speaker 2: in the sky future while working on the direct 501 00:27:46,600 --> 00:27:54,560 Speaker 2: medical science, this idea of brain control of devices is 502 00:27:54,720 --> 00:28:00,800 Speaker 2: actually pretty mainstream now in tech.
We saw Meta come out 503 00:28:00,920 --> 00:28:05,160 Speaker 2: with research just earlier this year on how they would 504 00:28:05,160 --> 00:28:08,640 Speaker 2: be able to determine, based on brain waves, what you're 505 00:28:08,720 --> 00:28:12,000 Speaker 2: going to type. Wow. And that's not with an invasive chip. 506 00:28:12,040 --> 00:28:15,119 Speaker 2: That's just from putting electrodes on your head, on 507 00:28:15,160 --> 00:28:18,719 Speaker 2: the outside. And they read people's activity while they typed 508 00:28:18,840 --> 00:28:20,920 Speaker 2: and tried to make connections. 509 00:28:20,760 --> 00:28:22,200 Speaker 1: And the hit rate was pretty good. 510 00:28:22,880 --> 00:28:25,160 Speaker 2: It was better than people expected. Yeah, it was pretty good. 511 00:28:25,760 --> 00:28:29,080 Speaker 2: It wasn't perfect, but it's baby steps. This is all 512 00:28:29,119 --> 00:28:31,200 Speaker 2: baby steps. What people say is happening in their research, 513 00:28:31,200 --> 00:28:33,720 Speaker 2: it's, you know, these studies are very tailored to try 514 00:28:33,760 --> 00:28:38,640 Speaker 2: to produce something narrow. 515 00:28:38,560 --> 00:28:41,240 Speaker 1: Okay, but the point is that the 516 00:28:41,760 --> 00:28:45,200 Speaker 1: entire industry is tacking, and tacking quickly, in this direction. 517 00:28:45,800 --> 00:28:50,320 Speaker 2: Apple has a patent about reading your brain waves through 518 00:28:50,480 --> 00:28:55,640 Speaker 2: AirPods, and that, yeah, exactly, that sounds very mainstream 519 00:28:55,680 --> 00:28:58,360 Speaker 2: now. This is all early days, but it's 520 00:28:58,400 --> 00:29:03,040 Speaker 2: spooky enough that the idea that your brain activity might 521 00:29:03,080 --> 00:29:07,480 Speaker 2: be accessible, yes, has caused a wave of reactions from 522 00:29:07,560 --> 00:29:11,720 Speaker 2: state governments.
There have been some early attempts to try 523 00:29:11,760 --> 00:29:17,320 Speaker 2: to regulate the privacy of brainwave data. This is so 524 00:29:17,680 --> 00:29:21,120 Speaker 2: early in the science, but governments have seen what happened 525 00:29:21,120 --> 00:29:25,280 Speaker 2: with other technologies that have grown too quickly without regulation, 526 00:29:25,400 --> 00:29:30,080 Speaker 2: and so we've seen some states try to decide now 527 00:29:30,160 --> 00:29:34,360 Speaker 2: that your brain wave data should remain yours, private and 528 00:29:34,440 --> 00:29:36,800 Speaker 2: not used to sell you ads or something. 529 00:29:37,440 --> 00:29:42,200 Speaker 1: Okay, now, Sarah, with most of Elon's companies—yeah, most— 530 00:29:42,440 --> 00:29:45,800 Speaker 1: we initially saw, when Trump was elected, and of course 531 00:29:45,880 --> 00:29:48,760 Speaker 1: Elon went to great lengths to help get him elected, 532 00:29:49,120 --> 00:29:54,400 Speaker 1: a real spike in the valuation of his companies. Tesla 533 00:29:54,520 --> 00:29:58,960 Speaker 1: initially surged wildly. That, of course, for reasons stated again 534 00:29:58,960 --> 00:30:01,920 Speaker 1: and again on this show, has since reversed and gone 535 00:30:01,960 --> 00:30:04,080 Speaker 1: down quite a bit. But the valuation of many of 536 00:30:04,120 --> 00:30:08,760 Speaker 1: his private companies—SpaceX, X, xAI, which has now taken 537 00:30:08,800 --> 00:30:12,720 Speaker 1: over X. You mentioned that back in twenty twenty three, 538 00:30:12,880 --> 00:30:16,520 Speaker 1: Neuralink had a valuation of roughly three and a half billion dollars. 539 00:30:16,880 --> 00:30:20,320 Speaker 1: Do you know what it is today and whether it 540 00:30:20,520 --> 00:30:24,640 Speaker 1: also got that Trump election boost?
541 00:30:25,040 --> 00:30:28,360 Speaker 2: Well, so, without a new funding round, and we haven't 542 00:30:28,520 --> 00:30:32,040 Speaker 2: heard of one happening yet, we can't say for sure 543 00:30:33,080 --> 00:30:37,400 Speaker 2: what the valuation is. But what Reuters found in the middle of 544 00:30:37,520 --> 00:30:40,800 Speaker 2: last year was that there were some share trades on 545 00:30:40,880 --> 00:30:44,880 Speaker 2: the secondary market, which is the market for, you know, 546 00:30:44,880 --> 00:30:49,200 Speaker 2: if we're private holders of Neuralink stock, we could offload 547 00:30:49,240 --> 00:30:52,840 Speaker 2: that stock or trade it for something else on a 548 00:30:52,880 --> 00:30:57,800 Speaker 2: private market that's not necessarily accessible to everyone. They were 549 00:30:57,840 --> 00:31:03,520 Speaker 2: trading at a five billion dollar valuation, so that is higher, but 550 00:31:03,600 --> 00:31:06,360 Speaker 2: I would caution: that doesn't mean that 551 00:31:06,400 --> 00:31:10,240 Speaker 2: Neuralink now has a five billion dollar valuation. Reuters attributed 552 00:31:10,320 --> 00:31:14,760 Speaker 2: that to the interest in the company's shares just because 553 00:31:14,800 --> 00:31:20,320 Speaker 2: of Elon Musk's prominence, the fact that if you invest 554 00:31:20,400 --> 00:31:24,040 Speaker 2: in an Elon Musk company, there's going to be some 555 00:31:24,240 --> 00:31:28,200 Speaker 2: bright future for that company kind of no matter what, 556 00:31:28,400 --> 00:31:31,880 Speaker 2: because he will figure it out. That's sort of 557 00:31:30,800 --> 00:31:34,440 Speaker 2: the Elon effect, the belief around his abilities. 558 00:31:35,200 --> 00:31:39,920 Speaker 1: Any sense of whether Elon's proximity to this government, the 559 00:31:39,960 --> 00:31:44,080 Speaker 1: Trump administration—any sense that that will be helpful?
There's 560 00:31:44,120 --> 00:31:46,320 Speaker 1: been a lot of speculation as to how that applies 561 00:31:46,360 --> 00:31:52,560 Speaker 1: to SpaceX and Tesla and X itself. How about for Neuralink? 562 00:31:53,280 --> 00:31:59,960 Speaker 2: I mean, there was some concern about Musk's influence earlier 563 00:32:00,240 --> 00:32:05,880 Speaker 2: this year when the Doze cuts resulted in the firing of 564 00:32:06,240 --> 00:32:12,840 Speaker 2: FDA staffers who were tasked with reviewing Neuralink. That was 565 00:32:12,880 --> 00:32:17,520 Speaker 2: actually reversed. Those staffers were reinstated, according to reports. 566 00:32:17,560 --> 00:32:24,200 Speaker 2: But I think chaos at the FDA is not great 567 00:32:24,240 --> 00:32:28,240 Speaker 2: for any company in this kind of line of work. 568 00:32:28,920 --> 00:32:32,040 Speaker 1: So, in other words, whereas some of his 569 00:32:32,040 --> 00:32:36,840 Speaker 1: other companies might benefit from less regulation, this is 570 00:32:36,840 --> 00:32:39,440 Speaker 1: a company that needs the FDA to say, hey, Neuralink, 571 00:32:39,520 --> 00:32:43,080 Speaker 1: you are now allowed to do X or Y. 572 00:32:43,040 --> 00:32:47,760 Speaker 2: And you can't do that unless you have FDA staffers to communicate with, 573 00:32:47,920 --> 00:32:50,920 Speaker 2: to design your clinical studies with, to make sure that 574 00:32:50,320 --> 00:32:53,520 Speaker 2: they're aligned with what the FDA wants to see 575 00:32:53,520 --> 00:32:56,080 Speaker 2: in order to grant approval.
I mean, these companies, all 576 00:32:56,160 --> 00:32:59,000 Speaker 2: the companies in the space, are in very close, constant 577 00:32:59,040 --> 00:33:03,600 Speaker 2: contact with the FDA about what next steps to take 578 00:33:04,080 --> 00:33:08,040 Speaker 2: to get to that milestone of commercial availability for their products, 579 00:33:08,080 --> 00:33:11,520 Speaker 2: which might not happen for Neuralink for years. So it's 580 00:33:11,680 --> 00:33:16,480 Speaker 2: very important that the FDA is still in this communication 581 00:33:16,920 --> 00:33:21,360 Speaker 2: with all these device startups. They also were just cleared 582 00:33:21,480 --> 00:33:23,960 Speaker 2: to start a trial in Canada, so it's not just 583 00:33:24,000 --> 00:33:28,040 Speaker 2: a US company anymore. And certainly a lot of 584 00:33:28,080 --> 00:33:32,080 Speaker 2: biotech companies are working on international markets first before coming 585 00:33:32,120 --> 00:33:35,760 Speaker 2: to the US, because it just can be easier. But yeah, 586 00:33:35,800 --> 00:33:38,760 Speaker 2: cuts at the FDA, and also all of the cuts 587 00:33:39,000 --> 00:33:43,800 Speaker 2: in research generally at universities. Academic research is very much 588 00:33:43,840 --> 00:33:48,520 Speaker 2: a part of the building blocks for what comes next 589 00:33:48,720 --> 00:33:56,160 Speaker 2: in AI and in neuroscience, understanding what those brain impulses actually mean. 590 00:33:56,960 --> 00:33:59,040 Speaker 2: All the research kind of builds on the foundation of 591 00:33:59,040 --> 00:34:01,160 Speaker 2: what's been there before, and so the fact that a 592 00:34:01,160 --> 00:34:04,800 Speaker 2: lot of federal funding for research in general has been 593 00:34:04,920 --> 00:34:09,320 Speaker 2: cut, or has not been given to academics without intense 594 00:34:09,360 --> 00:34:11,840 Speaker 2: restrictions on things that have nothing to do with that research—
595 00:34:12,160 --> 00:34:15,640 Speaker 2: that's not great for a company like Neuralink or anything 596 00:34:15,640 --> 00:34:16,560 Speaker 2: in the biotech space. 597 00:34:17,040 --> 00:34:18,800 Speaker 1: Got it. Sarah, thanks again. 598 00:34:18,920 --> 00:34:19,600 Speaker 2: Thank you so much. 599 00:34:19,800 --> 00:34:20,200 Speaker 4: Take care. 600 00:34:27,040 --> 00:34:30,200 Speaker 1: This episode is produced by Stacey Wong. Anna Mazarakis 601 00:34:30,280 --> 00:34:33,440 Speaker 1: is our editor, Blake Maples handles engineering, and Dave Purcell 602 00:34:33,560 --> 00:34:37,719 Speaker 1: fact checks. Our supervising producer is Magnus Henriksen. The 603 00:34:37,920 --> 00:34:41,000 Speaker 1: Elon, Inc. theme is written and performed by Taka Yasuzawa and 604 00:34:41,080 --> 00:34:44,799 Speaker 1: Alex Sugiura. Brendan Francis Newnam is our executive producer, and 605 00:34:44,880 --> 00:34:48,240 Speaker 1: Sage Bauman is the head of Bloomberg Podcasts. A big 606 00:34:48,280 --> 00:34:51,560 Speaker 1: thanks as always to our supporters Joel Weber and Brad Stone. 607 00:34:51,880 --> 00:34:54,759 Speaker 1: I'm David Papadopoulos. If you have a minute, rate 608 00:34:54,800 --> 00:34:57,439 Speaker 1: and review our show; it'll help other listeners find us. 609 00:34:57,680 --> 00:34:58,520 Speaker 1: See you next week. 610 00:35:01,000 --> 00:35:04,360 Speaker 6: At the STA