1 00:00:03,960 --> 00:00:06,720 Speaker 1: Well, Elon Musk is now the richest person on the planet. 2 00:00:07,200 --> 00:00:10,000 Speaker 2: More than half the satellites in space are owned and 3 00:00:10,119 --> 00:00:12,120 Speaker 2: controlled by one man. 4 00:00:12,400 --> 00:00:15,160 Speaker 1: Starting his own artificial intelligence company. 5 00:00:15,360 --> 00:00:17,200 Speaker 3: Well, he's a legitimate super genius. 6 00:00:17,800 --> 00:00:18,919 Speaker 4: Legitimate, he says. 7 00:00:18,920 --> 00:00:21,520 Speaker 3: He's always voted for Democrats, but this year it will 8 00:00:21,560 --> 00:00:23,000 Speaker 3: be different. He'll vote Republican. 9 00:00:23,280 --> 00:00:25,760 Speaker 5: There is a reason the US government is so reliant 10 00:00:25,800 --> 00:00:26,200 Speaker 5: on him. 11 00:00:26,360 --> 00:00:29,040 Speaker 4: Elon Musk is a scam artist and he's done nothing. 12 00:00:30,560 --> 00:00:33,320 Speaker 4: Anything he does is fascinating to people. 13 00:00:42,640 --> 00:00:45,600 Speaker 2: Welcome to Elon, Inc., where we discuss Elon Musk's vast 14 00:00:45,600 --> 00:00:48,640 Speaker 2: corporate empire, his latest gambits and antics, and how to 15 00:00:48,640 --> 00:00:51,879 Speaker 2: make sense of it all. I'm your host, David Papadopoulos. 16 00:00:52,680 --> 00:00:55,640 Speaker 2: The news is coming fast and furious in the Elon-verse. 17 00:00:56,080 --> 00:00:59,720 Speaker 2: Last night, Elon announced that Neuralink had successfully implanted a 18 00:00:59,760 --> 00:01:03,240 Speaker 2: chip into a human. Big news, clearly, but just how 19 00:01:03,280 --> 00:01:08,280 Speaker 2: excited or terrified should we be? Meanwhile, over at X, 20 00:01:08,600 --> 00:01:11,160 Speaker 2: the team had a strange response to deep fake porn 21 00:01:11,160 --> 00:01:14,440 Speaker 2: images of Taylor Swift that had gone viral on the platform. 
22 00:01:14,840 --> 00:01:17,200 Speaker 2: They made it impossible to search for her at all. 23 00:01:17,760 --> 00:01:20,800 Speaker 2: Oh, and Tesla earnings came and went, and the stock 24 00:01:20,840 --> 00:01:25,080 Speaker 2: market gave Elon a big thumbs down. To talk about this, 25 00:01:25,280 --> 00:01:27,840 Speaker 2: I'm sitting here in the studio with Max Chafkin, senior 26 00:01:27,880 --> 00:01:33,720 Speaker 2: reporter at Bloomberg Businessweek. Hello Max. Hey, you missed me, Max? 27 00:01:33,600 --> 00:01:35,240 Speaker 4: I did. It's so good to see you, David. 28 00:01:36,280 --> 00:01:38,759 Speaker 2: Later, we'll be joined by Kurt Wagner and Dana Hull 29 00:01:38,880 --> 00:01:41,759 Speaker 2: to talk about X and Tesla, respectively. But first we 30 00:01:41,760 --> 00:01:44,960 Speaker 2: welcome Sarah McBride to the show. Hello Sarah. 31 00:01:45,000 --> 00:01:45,280 Speaker 6: Hello. 32 00:01:46,160 --> 00:01:48,920 Speaker 2: Sarah is Bloomberg's Neuralink reporter and will help guide us 33 00:01:48,920 --> 00:01:54,360 Speaker 2: through this cerebral quantum leap. So, Sarah, what do we 34 00:01:54,480 --> 00:01:58,160 Speaker 2: actually know here? What happened? A chip went in, the 35 00:01:58,240 --> 00:02:00,640 Speaker 2: patient survived. What else do we know? 36 00:02:01,560 --> 00:02:06,160 Speaker 6: Well, in his tweet yesterday, Elon said that signals were 37 00:02:06,520 --> 00:02:10,440 Speaker 6: working in the patient's brain, that neurons were firing, or 38 00:02:10,480 --> 00:02:14,200 Speaker 6: words to that effect, which is great, but it would 39 00:02:14,240 --> 00:02:17,120 Speaker 6: have been terrible news had that not been the case. 40 00:02:17,480 --> 00:02:21,160 Speaker 2: And what exactly does neurons firing mean? What is that, for 41 00:02:21,560 --> 00:02:24,000 Speaker 2: the lay people out there like me and Max? What does 42 00:02:24,040 --> 00:02:24,480 Speaker 2: that mean? 
43 00:02:24,840 --> 00:02:29,560 Speaker 6: In his post yesterday, Elon wrote that initial results showed 44 00:02:29,720 --> 00:02:34,200 Speaker 6: promising neuron spike detection, and all that means is that 45 00:02:34,480 --> 00:02:37,560 Speaker 6: there are almost one hundred billion neurons in the average 46 00:02:37,760 --> 00:02:41,520 Speaker 6: human brain, and the neurons, which are the cells 47 00:02:41,560 --> 00:02:45,920 Speaker 6: of the brain that are near this device, are working there. 48 00:02:46,720 --> 00:02:49,560 Speaker 6: When a neuron works, you can tell because it kind 49 00:02:49,600 --> 00:02:53,640 Speaker 6: of activates; at the end, there's a tiny electrical signal. 50 00:02:53,760 --> 00:02:57,480 Speaker 6: So that's what they're picking up with the device 51 00:02:57,600 --> 00:02:59,480 Speaker 6: after it got implanted on Sunday. 52 00:03:00,080 --> 00:03:03,079 Speaker 2: Pretty encouraging. I mean, hard to know exactly how much 53 00:03:03,120 --> 00:03:05,760 Speaker 2: to read into it, but certainly positive. For starters, the 54 00:03:05,760 --> 00:03:07,240 Speaker 2: patient survived. 55 00:03:06,760 --> 00:03:10,959 Speaker 6: Right, but that doesn't mean that the device is working. It means 56 00:03:11,000 --> 00:03:14,360 Speaker 6: that it didn't destroy anything in the patient's brain, and 57 00:03:14,480 --> 00:03:17,520 Speaker 6: probably a few days or weeks down the line, then 58 00:03:17,560 --> 00:03:21,800 Speaker 6: they'll see if it can actually result in the patient 59 00:03:21,880 --> 00:03:25,360 Speaker 6: being able to, for example, move a cursor on a computer 60 00:03:25,720 --> 00:03:27,639 Speaker 6: outside their body. 61 00:03:27,760 --> 00:03:29,760 Speaker 2: So we don't know that yet. We're not there yet. 62 00:03:29,840 --> 00:03:32,880 Speaker 4: We don't know that yet, not even close, not even close. 
63 00:03:34,160 --> 00:03:36,760 Speaker 1: Sorry not to be the wet blanket, but I 64 00:03:36,840 --> 00:03:39,600 Speaker 1: think we need to approach the claims being made here 65 00:03:39,600 --> 00:03:42,480 Speaker 1: with some skepticism. First of all, it's Elon Musk we're 66 00:03:42,480 --> 00:03:45,320 Speaker 1: talking about. He's a guy who tends to present, you know, 67 00:03:45,400 --> 00:03:48,840 Speaker 1: let's just say, the rosiest possible version of any 68 00:03:48,920 --> 00:03:53,160 Speaker 1: prospective technology. And this is a safety trial; it's not 69 00:03:53,240 --> 00:03:54,200 Speaker 1: an efficacy trial. 70 00:03:54,240 --> 00:03:56,120 Speaker 4: That is the point here: to put this 71 00:03:56,080 --> 00:03:58,880 Speaker 1: in a few people's brains and see if anything bad 72 00:03:58,920 --> 00:04:02,000 Speaker 1: happens to them. And so far, according to a single 73 00:04:02,000 --> 00:04:04,000 Speaker 1: tweet by Elon Musk, we don't know. We haven't heard 74 00:04:04,000 --> 00:04:06,600 Speaker 1: anything from the patient. We haven't heard anything else. The patient's doing 75 00:04:06,920 --> 00:04:07,320 Speaker 4: okay. 76 00:04:07,440 --> 00:04:10,760 Speaker 1: I just want to remind everyone, in the monkey trials 77 00:04:10,840 --> 00:04:13,800 Speaker 1: that Neuralink did, there were a lot of serious complications 78 00:04:13,840 --> 00:04:16,000 Speaker 1: for the monkeys. And I went back and reread the 79 00:04:16,040 --> 00:04:18,440 Speaker 1: story that Wired published on this. Right, these were 80 00:04:18,480 --> 00:04:22,200 Speaker 1: not necessarily happening the next day. So I think there's 81 00:04:22,240 --> 00:04:25,520 Speaker 1: still some time even before we can say these are safe. 
82 00:04:25,600 --> 00:04:29,120 Speaker 2: Right. So I was sort of thinking that the fact 83 00:04:29,120 --> 00:04:32,800 Speaker 2: that here we are, TK hours later, the patient is 84 00:04:32,839 --> 00:04:37,000 Speaker 2: still alive was fairly significant, that that's a positive sign. 85 00:04:37,360 --> 00:04:39,720 Speaker 2: Max is sort of suggesting that, well, okay, great, he made 86 00:04:39,760 --> 00:04:42,160 Speaker 2: it through the surgery, but, you know, these things tend 87 00:04:42,160 --> 00:04:45,719 Speaker 2: to pop up down the road. Just how significant is 88 00:04:46,240 --> 00:04:49,240 Speaker 2: initial survival? Obviously, again, great for the patient, but... 89 00:04:49,240 --> 00:04:52,880 Speaker 6: Yeah, I have to say it's very significant. This has 90 00:04:52,920 --> 00:04:55,720 Speaker 6: been years in the making. This has been Elon promising 91 00:04:55,760 --> 00:04:57,520 Speaker 6: every year, like, oh yeah, we're going to have this 92 00:04:57,600 --> 00:05:00,560 Speaker 6: device in a few months or next year, and finally, 93 00:05:00,640 --> 00:05:04,200 Speaker 6: literally years after he first promised, it is in 94 00:05:04,560 --> 00:05:09,520 Speaker 6: a human brain. That said, other companies that have done 95 00:05:09,560 --> 00:05:13,080 Speaker 6: similar work have tended to wait a few days before 96 00:05:13,279 --> 00:05:14,560 Speaker 6: making similar announcements. 97 00:05:15,360 --> 00:05:18,000 Speaker 2: But this is Elon Musk. You knew he wasn't gonna wait. 98 00:05:18,120 --> 00:05:22,520 Speaker 6: This is Elon. Also, this patient is, you know... monkeys 99 00:05:23,320 --> 00:05:27,040 Speaker 6: can't indicate when there's something wrong. Monkeys try to scratch 100 00:05:27,120 --> 00:05:30,720 Speaker 6: their heads and mess with the scar sites. I'm not 101 00:05:30,880 --> 00:05:34,600 Speaker 6: saying that those surgeries weren't botched. 
I've read some of 102 00:05:34,680 --> 00:05:38,960 Speaker 6: the UC Davis reports, and maybe there were mistakes made 103 00:05:39,480 --> 00:05:43,040 Speaker 6: during the surgery, but this is a human patient, and 104 00:05:43,880 --> 00:05:47,960 Speaker 6: hopefully a lot was learned from those monkey experiments. They've 105 00:05:48,000 --> 00:05:52,920 Speaker 6: done many implants in primates since those botched ones, so 106 00:05:53,120 --> 00:05:56,040 Speaker 6: I think they know the stakes here, and this one 107 00:05:56,279 --> 00:05:57,720 Speaker 6: was super carefully done. 108 00:05:57,800 --> 00:06:00,520 Speaker 2: Now, on the surgery itself, it is not a human 109 00:06:00,680 --> 00:06:04,279 Speaker 2: doing the surgery itself, right? It is a robot. 110 00:06:04,360 --> 00:06:06,160 Speaker 4: Correct. There's a 111 00:06:06,240 --> 00:06:09,960 Speaker 6: robot assisting in the surgery, but it's not like the 112 00:06:09,960 --> 00:06:12,640 Speaker 6: patient goes in and the surgeons are like, okay, it's up to 113 00:06:12,680 --> 00:06:15,440 Speaker 6: the robot, we're out of here. There are still a lot 114 00:06:15,480 --> 00:06:19,679 Speaker 6: of humans in the operating room kind of managing the robot. 115 00:06:20,120 --> 00:06:22,440 Speaker 2: Well, that's reassuring, that there are humans involved here. 116 00:06:22,480 --> 00:06:22,680 Speaker 4: Now. 117 00:06:22,800 --> 00:06:25,720 Speaker 2: Listen, Max made it sound like this 118 00:06:25,800 --> 00:06:28,640 Speaker 2: is all about the safety of the product and not 119 00:06:28,680 --> 00:06:32,000 Speaker 2: so much about efficacy. But when 120 00:06:32,000 --> 00:06:36,279 Speaker 2: you were talking about whether this person will be able 121 00:06:36,320 --> 00:06:38,880 Speaker 2: to move a cursor with his or her brain, that 122 00:06:38,960 --> 00:06:41,839 Speaker 2: sounds like efficacy to me, not safety. 
123 00:06:41,880 --> 00:06:44,520 Speaker 6: So they definitely are interested in safety. That is, to 124 00:06:44,600 --> 00:06:47,560 Speaker 6: Max's point, the top priority. But it's a little different 125 00:06:47,560 --> 00:06:51,000 Speaker 6: from drug trials. Device trials don't go through the 126 00:06:51,040 --> 00:06:53,359 Speaker 6: same phase one, phase two, phase three we might be 127 00:06:53,440 --> 00:06:57,640 Speaker 6: familiar with from drugs. There are different stages, and they're 128 00:06:57,680 --> 00:07:01,000 Speaker 6: analogous to some of those phases one, two, three. But 129 00:07:01,400 --> 00:07:04,200 Speaker 6: you don't put a device like this in somebody's brain 130 00:07:04,800 --> 00:07:08,360 Speaker 6: and not know that it's working before you go ahead 131 00:07:08,360 --> 00:07:09,920 Speaker 6: and implant the next patient. 132 00:07:10,080 --> 00:07:13,040 Speaker 1: So, but just to be clear, Sarah, there's another phase 133 00:07:13,080 --> 00:07:14,520 Speaker 1: that they're gonna need to do, right? 134 00:07:14,720 --> 00:07:17,640 Speaker 6: Yeah, there are two more phases there. We're looking at 135 00:07:17,840 --> 00:07:19,320 Speaker 6: years of trials, right. 136 00:07:19,360 --> 00:07:22,560 Speaker 1: Well, nonetheless, Elon Musk, in addition to announcing that the 137 00:07:22,640 --> 00:07:25,880 Speaker 1: neurons are firing, or whatever he said, whatever that means, 138 00:07:26,080 --> 00:07:28,760 Speaker 1: now there's a product name for this. It's called Telepathy. 139 00:07:29,080 --> 00:07:32,560 Speaker 1: It sounds wonderful. I mean, telepathy sounds pretty good. 140 00:07:32,840 --> 00:07:35,360 Speaker 6: Well, he's changed the name a few times, so let's 141 00:07:35,360 --> 00:07:37,720 Speaker 6: see if Telepathy sticks. But it's catchy. 142 00:07:37,960 --> 00:07:39,920 Speaker 2: Max seems to like it, so there we go. 
143 00:07:40,360 --> 00:07:43,280 Speaker 1: I just... it's just funny how we've gone from, we 144 00:07:43,360 --> 00:07:46,360 Speaker 1: put this in the guy, the guy is not dead yet, 145 00:07:46,640 --> 00:07:49,680 Speaker 1: thank god, to, it's called Telepathy, and, you know, I 146 00:07:49,680 --> 00:07:52,400 Speaker 1: feel like we're one tweet away from, like, pre-order one now. 147 00:07:52,480 --> 00:07:54,360 Speaker 4: Now, it'll only be... 148 00:07:55,920 --> 00:07:56,760 Speaker 2: Put your deposit in. 149 00:07:57,120 --> 00:07:58,840 Speaker 6: I have a feeling it's not going to be called 150 00:07:58,880 --> 00:08:02,280 Speaker 6: Telepathy when it finally gets released to the market. Somebody's 151 00:08:02,360 --> 00:08:04,880 Speaker 6: going to put the kibosh on that. But yeah, it's 152 00:08:04,880 --> 00:08:05,440 Speaker 6: a great name. 153 00:08:05,800 --> 00:08:09,520 Speaker 2: So what do stage two, Sarah, and stage three look like? 154 00:08:09,560 --> 00:08:12,080 Speaker 2: And when do we all go around with these things 155 00:08:12,120 --> 00:08:12,840 Speaker 2: in our heads? 156 00:08:13,160 --> 00:08:16,160 Speaker 6: Elon said that he wanted to get it in over 157 00:08:16,280 --> 00:08:21,680 Speaker 6: ten patients this year, which is ambitious, but they could 158 00:08:21,760 --> 00:08:23,400 Speaker 6: do it, you know, if they did one a month. 159 00:08:23,720 --> 00:08:28,280 Speaker 6: And then they have to analyze all the data, submit 160 00:08:28,360 --> 00:08:31,400 Speaker 6: that to the FDA. Once the FDA's reviewed it, then 161 00:08:31,480 --> 00:08:34,640 Speaker 6: they can start the next set of trials. And then 162 00:08:34,679 --> 00:08:38,240 Speaker 6: the third set is called the pivotal trials, and those 163 00:08:38,400 --> 00:08:42,000 Speaker 6: are the most exciting. You know, that's closest to 164 00:08:42,040 --> 00:08:46,440 Speaker 6: an actual product. 
Those will be hundreds, if not thousands, 165 00:08:46,480 --> 00:08:51,480 Speaker 6: of people, but that's probably five years away. Yeah, if 166 00:08:51,760 --> 00:08:53,280 Speaker 6: everything works as hoped. 167 00:08:53,600 --> 00:08:56,280 Speaker 2: Now, let me ask you: as you said, in a 168 00:08:56,320 --> 00:08:59,320 Speaker 2: good scenario, they're able to do about one a month. 169 00:08:59,840 --> 00:09:03,120 Speaker 2: I guess what I'm curious about is, what is 170 00:09:03,160 --> 00:09:06,520 Speaker 2: it about these that is so difficult that you can 171 00:09:06,559 --> 00:09:08,280 Speaker 2: only do one a month? Is it simply the fact 172 00:09:08,320 --> 00:09:11,120 Speaker 2: that these are just baby steps for them, 173 00:09:11,160 --> 00:09:13,280 Speaker 2: and you have to be super cautious at 174 00:09:13,280 --> 00:09:14,280 Speaker 2: this stage? 175 00:09:14,320 --> 00:09:17,839 Speaker 6: I mean, yeah, it's the first time they're doing this. 176 00:09:18,520 --> 00:09:26,000 Speaker 6: You're putting electrodes several millimeters deep into somebody's cortex. You're 177 00:09:26,040 --> 00:09:29,160 Speaker 6: slicing open their skull to do it, you're cutting into 178 00:09:29,200 --> 00:09:31,800 Speaker 6: their dura. You want to make sure that the first 179 00:09:31,840 --> 00:09:35,720 Speaker 6: one goes pretty well before you even try patient two. 180 00:09:35,840 --> 00:09:38,000 Speaker 6: Then you want to make sure patient two is doing 181 00:09:38,040 --> 00:09:41,880 Speaker 6: pretty well before you try patient three, and then you 182 00:09:41,960 --> 00:09:45,400 Speaker 6: need to start actually doing the stuff you're trying to do: 183 00:09:45,480 --> 00:09:48,800 Speaker 6: get the patient to move cursors and so on, control 184 00:09:49,120 --> 00:09:52,920 Speaker 6: external devices. 
Then you have to build all that data, 185 00:09:53,400 --> 00:09:56,960 Speaker 6: collate the results, present it to the FDA, let the 186 00:09:57,000 --> 00:10:00,199 Speaker 6: FDA review it. And this is assuming that everything 187 00:10:00,280 --> 00:10:03,960 Speaker 6: goes well. If you have a setback, like some patient 188 00:10:04,000 --> 00:10:07,280 Speaker 6: whose scarring heals in a weird way, that could 189 00:10:07,480 --> 00:10:08,760 Speaker 6: delay everything by months or more. 190 00:10:08,800 --> 00:10:11,960 Speaker 2: Sure. Listen, last question before we let you go here. 191 00:10:12,080 --> 00:10:18,280 Speaker 2: So Neuralink, in the Musk constellation of companies, certainly 192 00:10:18,320 --> 00:10:20,840 Speaker 2: isn't the most valuable, or, I don't even think, toward 193 00:10:21,040 --> 00:10:23,480 Speaker 2: the top end of those. But this is a big 194 00:10:23,559 --> 00:10:27,680 Speaker 2: step, right? It's a privately held company: 195 00:10:28,200 --> 00:10:32,000 Speaker 2: where has its valuation roughly been, and what, if anything, 196 00:10:32,160 --> 00:10:34,800 Speaker 2: does this development do for that valuation? 197 00:10:36,360 --> 00:10:39,559 Speaker 6: Yeah, I mean, it'll probably increase the valuation. And as 198 00:10:39,600 --> 00:10:42,720 Speaker 6: to where it is in the constellation of Musk companies, 199 00:10:42,760 --> 00:10:45,800 Speaker 6: it's toward the bottom, but toward the bottom for a 200 00:10:45,880 --> 00:10:49,200 Speaker 6: Musk company. In this case, Neuralink's worth three point five 201 00:10:49,240 --> 00:10:53,600 Speaker 6: billion dollars. For a medical device company that just implanted 202 00:10:53,679 --> 00:10:58,640 Speaker 6: its first device in a human, that is huge, just unbelievable. 203 00:10:58,800 --> 00:11:03,640 Speaker 6: So that's the Musk factor driving up the valuation. 
And 204 00:11:03,720 --> 00:11:06,120 Speaker 6: then, you know, if they make it to the next trial, 205 00:11:06,200 --> 00:11:10,000 Speaker 6: it'll have another outsized jump that'll make every other medical 206 00:11:10,040 --> 00:11:13,480 Speaker 6: device company jealous. And you're seeing it already. I saw 207 00:11:14,200 --> 00:11:18,240 Speaker 6: the CEO of a rival company called Synchron last night. 208 00:11:18,400 --> 00:11:20,760 Speaker 6: I just think he couldn't help himself. He had to 209 00:11:20,800 --> 00:11:24,000 Speaker 6: post a tweet of, I think it 210 00:11:24,080 --> 00:11:27,920 Speaker 6: was, one of his patients manipulating Pong with his mind, 211 00:11:28,360 --> 00:11:30,920 Speaker 6: just like, we're still here, don't forget about us, we've 212 00:11:30,920 --> 00:11:33,960 Speaker 6: got devices in patients still. 213 00:11:34,280 --> 00:11:36,560 Speaker 2: And in many ways, as you've reported many times, 214 00:11:36,600 --> 00:11:39,439 Speaker 2: many steps ahead of Neuralink. 215 00:11:39,480 --> 00:11:43,320 Speaker 6: Right, yeah, exactly. Well, in some ways they're in patients, 216 00:11:43,400 --> 00:11:46,800 Speaker 6: but in other ways their device isn't as sophisticated as 217 00:11:46,800 --> 00:11:51,079 Speaker 6: the Neuralink device. Neuralink has more than 218 00:11:51,080 --> 00:11:55,120 Speaker 6: a thousand electrodes on its device, and Synchron doesn't have that many. 219 00:11:55,200 --> 00:11:57,080 Speaker 6: So you can go back and forth and debate the 220 00:11:57,120 --> 00:12:01,480 Speaker 6: pros and cons. But yeah, they were in patients before Neuralink. 
221 00:12:01,120 --> 00:12:04,160 Speaker 1: And they're way behind on names, because, I mean, 222 00:12:05,080 --> 00:12:08,360 Speaker 1: because a normal, sober-minded medical device company would not 223 00:12:08,679 --> 00:12:11,600 Speaker 1: just, like, go around, you know, marketing a highly 224 00:12:11,600 --> 00:12:15,000 Speaker 1: speculative brand name on their, you know, very-much-in-development, 225 00:12:15,080 --> 00:12:17,559 Speaker 1: maybe-it-works, maybe-it's-safe, we-don't-know-yet product. 226 00:12:17,559 --> 00:12:20,280 Speaker 1: But Elon, he'll go ahead and 227 00:12:20,320 --> 00:12:23,079 Speaker 1: do that. And investors, you know, for better or worse, they 228 00:12:23,160 --> 00:12:24,200 Speaker 1: like that, they love it. 229 00:12:24,400 --> 00:12:27,800 Speaker 2: Okay, Sarah, listen, thank you very much for being with us. 230 00:12:27,880 --> 00:12:31,000 Speaker 2: We will have you on as soon as we know more, 231 00:12:31,040 --> 00:12:33,960 Speaker 2: as soon as you know more, about our first patient here. 232 00:12:33,640 --> 00:12:36,040 Speaker 6: Sounds good. Thanks for having me. 233 00:12:41,360 --> 00:12:43,720 Speaker 2: Max and I are now joined by Kurt Wagner, a tech 234 00:12:43,760 --> 00:12:47,800 Speaker 2: reporter here at Bloomberg covering social media and the author 235 00:12:47,880 --> 00:12:51,000 Speaker 2: of the soon-to-be-released book Battle for the Bird, 236 00:12:51,360 --> 00:12:56,600 Speaker 2: about Twitter and the battle for control of the company. Kurt, 237 00:12:56,800 --> 00:12:59,960 Speaker 2: I know there's something about Taylor Swift and deep 238 00:13:00,120 --> 00:13:03,600 Speaker 2: fake porn, and then Musk's people came back and they, 239 00:13:03,679 --> 00:13:06,760 Speaker 2: you know, hit the nuclear option, and you suddenly couldn't 240 00:13:07,120 --> 00:13:11,280 Speaker 2: search at all for her. What the heck happened? 
241 00:13:11,360 --> 00:13:14,480 Speaker 5: I mean, you had all the ingredients right there of 242 00:13:14,520 --> 00:13:17,320 Speaker 5: exactly what happened. So at the end of last week we 243 00:13:17,559 --> 00:13:20,440 Speaker 5: saw that there was deep fake porn 244 00:13:20,520 --> 00:13:23,040 Speaker 5: of arguably the most famous woman in the 245 00:13:23,040 --> 00:13:27,800 Speaker 5: world right now, Taylor Swift, circulating on X, and this 246 00:13:27,920 --> 00:13:32,240 Speaker 5: violates several of the company's policies, first and foremost. And 247 00:13:32,320 --> 00:13:36,360 Speaker 5: so what we saw was, naturally, people seeking it out, 248 00:13:36,559 --> 00:13:40,880 Speaker 5: sharing it, commenting on it, outrage of course from Swifties 249 00:13:40,920 --> 00:13:43,600 Speaker 5: and other fans of not only Taylor, but just 250 00:13:43,640 --> 00:13:46,240 Speaker 5: people who realized that this is not a good thing 251 00:13:46,320 --> 00:13:49,280 Speaker 5: to be happening on any platform. And then you saw 252 00:13:49,320 --> 00:13:51,560 Speaker 5: the company sort of scrambling to try and figure out 253 00:13:51,600 --> 00:13:54,480 Speaker 5: what to do about this, right. And, David, you mentioned 254 00:13:54,840 --> 00:13:58,560 Speaker 5: they ultimately, for a couple of days, literally blocked people 255 00:13:58,679 --> 00:14:02,080 Speaker 5: from having search results when they searched for Taylor Swift's name. 256 00:14:02,200 --> 00:14:04,240 Speaker 5: I'm happy to get into sort of how 257 00:14:04,280 --> 00:14:04,839 Speaker 5: extreme that is. 258 00:14:04,880 --> 00:14:07,280 Speaker 2: Right, why did they take such an extreme step? 259 00:14:07,320 --> 00:14:10,120 Speaker 2: Why could they not be more precise in 260 00:14:10,200 --> 00:14:11,319 Speaker 2: their crackdown on this? 
261 00:14:11,640 --> 00:14:14,000 Speaker 5: So you do something like this when you're having trouble 262 00:14:15,120 --> 00:14:17,319 Speaker 5: cleaning up all of the stuff on your own, right? This 263 00:14:17,800 --> 00:14:20,160 Speaker 5: is what I would consider a last resort, or close 264 00:14:20,200 --> 00:14:23,000 Speaker 5: to a last resort. And to me, the fact that 265 00:14:23,040 --> 00:14:26,200 Speaker 5: they simply had to block people from even searching for 266 00:14:26,280 --> 00:14:28,520 Speaker 5: Taylor Swift is a sign that they were not able 267 00:14:28,560 --> 00:14:31,440 Speaker 5: to get the problem under control on their own, and 268 00:14:31,480 --> 00:14:33,480 Speaker 5: so they had to take a drastic measure like this. 269 00:14:33,520 --> 00:14:37,760 Speaker 5: I've covered Twitter for ten years. I don't remember this 270 00:14:37,880 --> 00:14:41,360 Speaker 5: happening before in terms of the search ban, certainly not 271 00:14:41,480 --> 00:14:44,400 Speaker 5: as it relates to a high-profile user like this. 272 00:14:45,000 --> 00:14:48,360 Speaker 2: And that tells you what, then, about the current state of 273 00:14:48,480 --> 00:14:51,000 Speaker 5: X? Well, that the trust and safety function over there 274 00:14:51,120 --> 00:14:53,760 Speaker 5: is not operating in the way that it was before 275 00:14:53,800 --> 00:14:57,160 Speaker 5: Elon took over, and it's not operating very smoothly, right? 276 00:14:57,200 --> 00:15:01,000 Speaker 5: I mean, presumably, how something like this would normally be 277 00:15:01,160 --> 00:15:04,600 Speaker 5: handled is it would be flagged to the company, you know, 278 00:15:04,680 --> 00:15:08,160 Speaker 5: oftentimes maybe not even by Taylor directly, but probably 279 00:15:08,160 --> 00:15:11,200 Speaker 5: her team, right? They'd have a partnerships contact at the 280 00:15:11,200 --> 00:15:14,320 Speaker 5: company that they would go to. 
The company would not 281 00:15:14,360 --> 00:15:16,800 Speaker 5: only immediately take down the original tweet, but they would 282 00:15:17,000 --> 00:15:20,680 Speaker 5: probably take those photos or videos and start to scan 283 00:15:20,800 --> 00:15:24,200 Speaker 5: the rest of the service for either identical matches to them 284 00:15:24,760 --> 00:15:27,400 Speaker 5: or other photos or videos that look very similar. Maybe 285 00:15:27,440 --> 00:15:29,640 Speaker 5: it's the same photo but a watermark has 286 00:15:29,640 --> 00:15:31,480 Speaker 5: been added, or something like that, right? And they would 287 00:15:31,520 --> 00:15:33,920 Speaker 5: sort of automate the process of taking the stuff down 288 00:15:33,960 --> 00:15:37,200 Speaker 5: as quickly as possible. What we know is that the 289 00:15:37,240 --> 00:15:41,960 Speaker 5: original content was up, I believe, for something like seventeen hours. 290 00:15:41,760 --> 00:15:44,440 Speaker 2: It's now down? Ultimately, the original content is down? 291 00:15:44,480 --> 00:15:45,120 Speaker 5: It's now down. 292 00:15:45,280 --> 00:15:45,720 Speaker 4: That's right. 293 00:15:45,960 --> 00:15:48,200 Speaker 5: And, to be clear, starting last night, so 294 00:15:48,280 --> 00:15:51,320 Speaker 5: Monday night, they started letting people search for Taylor Swift again. 295 00:15:51,360 --> 00:15:53,520 Speaker 5: So they feel like they've gotten the process, or, 296 00:15:53,640 --> 00:15:56,160 Speaker 5: excuse me, the problem, under control. But it took several 297 00:15:56,240 --> 00:15:57,440 Speaker 5: days to get there. 298 00:15:57,960 --> 00:16:00,880 Speaker 2: You mentioned the NFL game. The NFL game: as 299 00:16:01,400 --> 00:16:05,360 Speaker 2: basically all of mankind knows, Taylor Swift is dating the 300 00:16:05,400 --> 00:16:08,920 Speaker 2: tight end for the Kansas City Chiefs, 
Travis Kelce. The Kansas 301 00:16:08,880 --> 00:16:12,960 Speaker 2: City Chiefs played on Sunday in the semifinals of 302 00:16:13,400 --> 00:16:16,160 Speaker 2: the NFL playoffs. They won; they're going to the Super Bowl. 303 00:16:16,480 --> 00:16:20,720 Speaker 2: So this ban, though, on Taylor Swift searches is happening 304 00:16:20,880 --> 00:16:23,120 Speaker 2: as this playoff game, this Kansas City Chiefs game, is 305 00:16:23,160 --> 00:16:26,760 Speaker 2: going on. And, Kurt, the NFL has been, right, one 306 00:16:26,800 --> 00:16:30,280 Speaker 2: of the few bright spots for X of late in 307 00:16:30,360 --> 00:16:35,280 Speaker 2: terms of generating buzz and content and all that. Horrible timing. 308 00:16:35,000 --> 00:16:38,160 Speaker 5: Oh, terrible timing. I mean, imagine how many people are 309 00:16:38,200 --> 00:16:42,120 Speaker 5: watching that game Sunday and see, you know, CBS cut away 310 00:16:42,160 --> 00:16:44,320 Speaker 5: to a shot of Taylor Swift jumping around in the, 311 00:16:44,640 --> 00:16:47,960 Speaker 5: you know, the Kelce box or whatever, and want to 312 00:16:48,000 --> 00:16:50,160 Speaker 5: go to X and search about it or to talk 313 00:16:50,200 --> 00:16:53,560 Speaker 5: about it or whatever, right? And suddenly a huge part 314 00:16:53,920 --> 00:16:57,440 Speaker 5: of the service is not working for that exact use case. 315 00:16:57,360 --> 00:17:01,120 Speaker 2: Kurt, so the nuclear option: somebody there at X hit 316 00:17:01,200 --> 00:17:03,640 Speaker 2: the button, there was a very, very large button that 317 00:17:03,760 --> 00:17:07,680 Speaker 2: just says "nuke," and they hit it, and Taylor Swift 318 00:17:07,720 --> 00:17:13,920 Speaker 2: disappeared from X for basically a day. 
Is that what 319 00:17:13,960 --> 00:17:16,679 Speaker 2: we're going to expect going forward? Is that the blueprint, 320 00:17:16,800 --> 00:17:19,720 Speaker 2: then, as this comes up in coming days and weeks, 321 00:17:19,720 --> 00:17:21,800 Speaker 2: including perhaps during the Super Bowl in two weeks? 322 00:17:23,200 --> 00:17:25,600 Speaker 5: I mean, that can't be the long-term blueprint, right? 323 00:17:25,600 --> 00:17:28,399 Speaker 5: That's just not a good strategy, it's not a good 324 00:17:28,440 --> 00:17:32,959 Speaker 5: business model. It's a bad way to handle this, but 325 00:17:33,080 --> 00:17:36,280 Speaker 5: it's what you do when you're desperate. So my hope 326 00:17:36,440 --> 00:17:39,080 Speaker 5: is that there were a lot of learnings from the 327 00:17:39,119 --> 00:17:43,359 Speaker 5: past couple days, that the team that is there figured out, okay, 328 00:17:43,560 --> 00:17:46,439 Speaker 5: you know, here's what we did wrong, or, you know, 329 00:17:46,520 --> 00:17:50,280 Speaker 5: maybe added some new level of kind of monitoring for 330 00:17:50,359 --> 00:17:53,040 Speaker 5: some high-profile accounts. As you mentioned, the Super Bowl 331 00:17:53,240 --> 00:17:55,359 Speaker 5: is two weeks from now. If this stuff starts to 332 00:17:55,400 --> 00:17:58,040 Speaker 5: circulate again, like, will they have made enough of a 333 00:17:58,200 --> 00:18:02,600 Speaker 5: change in the next two weeks? I'd be surprised. So 334 00:18:03,080 --> 00:18:05,560 Speaker 5: it may be that we see something like this happen again. 335 00:18:05,840 --> 00:18:08,720 Speaker 5: What I will point out real quick is that they 336 00:18:08,840 --> 00:18:12,240 Speaker 5: did say, and they're announcing this very intentionally this 337 00:18:12,320 --> 00:18:15,600 Speaker 5: week, they're going to open a trust and safety office 338 00:18:15,680 --> 00:18:18,200 Speaker 5: in Austin, Texas. 
They claim they're going to hire one 339 00:18:18,240 --> 00:18:21,920 Speaker 5: hundred full time employees. Now, one hundred is a 340 00:18:21,920 --> 00:18:24,600 Speaker 5: small amount, to be clear, but full time employees at 341 00:18:24,800 --> 00:18:27,840 Speaker 5: X, one hundred is actually a pretty decent size. And 342 00:18:27,840 --> 00:18:30,919 Speaker 5: they're announcing this because Linda Yaccarino is going to be 343 00:18:30,960 --> 00:18:35,480 Speaker 5: speaking before Congress on Wednesday about protecting children online. 344 00:18:35,560 --> 00:18:38,560 Speaker 2: Is it actually meaningful, or is it just simply PR 345 00:18:38,840 --> 00:18:40,960 Speaker 2: and for appearances? 346 00:18:41,160 --> 00:18:45,159 Speaker 5: I think it's more PR than it is meaningful. 347 00:18:45,320 --> 00:18:48,080 Speaker 2: What would a reasonable number be for a task like that, 348 00:18:48,200 --> 00:18:52,360 Speaker 5: Kurt? Well, it's tough, because so many of these companies, 349 00:18:52,359 --> 00:18:55,399 Speaker 5: like Meta, I think, was touting tens of thousands of 350 00:18:55,400 --> 00:18:58,399 Speaker 5: content moderators, but most of them are contractors, right? They're 351 00:18:58,440 --> 00:19:02,760 Speaker 5: like basically just sifting through this never ending wheel of 352 00:19:02,880 --> 00:19:05,840 Speaker 5: terrible content and clicking allow or not allow or whatever. 353 00:19:06,640 --> 00:19:10,480 Speaker 5: Having one hundred full time employees working on this could 354 00:19:10,480 --> 00:19:15,360 Speaker 5: be meaningful if those people are, you know, writing policy, 355 00:19:15,400 --> 00:19:18,879 Speaker 5: handling high profile accounts like Taylor Swift. I need to 356 00:19:18,920 --> 00:19:21,399 Speaker 5: see what these people's job descriptions are going to 357 00:19:21,440 --> 00:19:23,840 Speaker 5: be before I can really say how meaningful it is.
358 00:19:23,960 --> 00:19:27,160 Speaker 2: But okay, but what if you are a much lesser 359 00:19:27,359 --> 00:19:30,720 Speaker 2: public personality and do not have the influence or the 360 00:19:30,760 --> 00:19:33,680 Speaker 2: deep pockets that Taylor Swift has? What if you too 361 00:19:33,720 --> 00:19:36,960 Speaker 2: are targeted in something like this? What is your fate? 362 00:19:37,720 --> 00:19:40,520 Speaker 5: I think you're in trouble in that situation, quite frankly. 363 00:19:40,560 --> 00:19:44,119 Speaker 5: I mean, if it takes seventeen hours and several days 364 00:19:43,720 --> 00:19:47,000 Speaker 5: to figure this out for Taylor Swift, you know, what 365 00:19:47,160 --> 00:19:51,639 Speaker 5: chance does someone have who has one hundred followers, or maybe 366 00:19:51,640 --> 00:19:54,120 Speaker 5: someone who's not even on X, right, but yet their 367 00:19:54,840 --> 00:19:58,680 Speaker 5: face or their content or their body has been posted 368 00:19:58,680 --> 00:20:01,320 Speaker 5: to X, right? They might not even really be aware 369 00:20:01,359 --> 00:20:03,800 Speaker 5: that it's up there. This is not just an X problem, 370 00:20:03,840 --> 00:20:05,639 Speaker 5: so I don't want to continue. You know, we're talking 371 00:20:05,640 --> 00:20:08,280 Speaker 5: about X because of the situation over the last couple 372 00:20:08,280 --> 00:20:11,320 Speaker 5: of days. But like, this idea of, you know, women 373 00:20:11,359 --> 00:20:14,359 Speaker 5: in particular having their bodies shared online without their consent, 374 00:20:14,480 --> 00:20:17,439 Speaker 5: like, this happens on websites all over the place, and 375 00:20:17,520 --> 00:20:20,080 Speaker 5: it's not okay, right? But you just hold some of 376 00:20:20,119 --> 00:20:23,439 Speaker 5: these larger companies to a higher standard because they have 377 00:20:23,480 --> 00:20:26,000 Speaker 5: so much power, so much money, so much influence.
You 378 00:20:26,080 --> 00:20:28,920 Speaker 5: hope that the Metas and the X's of the world 379 00:20:29,000 --> 00:20:31,040 Speaker 5: can get this right. Because if they can't get it right, 380 00:20:31,200 --> 00:20:33,280 Speaker 5: what kind of hope do we have for anybody? 381 00:20:33,320 --> 00:20:35,600 Speaker 1: Yeah, we should say there's been reporting suggesting this was 382 00:20:35,640 --> 00:20:38,280 Speaker 1: a Microsoft tool that was used to create this, or 383 00:20:38,280 --> 00:20:40,480 Speaker 1: could at least create some of these images. Google, 384 00:20:40,560 --> 00:20:44,640 Speaker 1: Alphabet, has had its own battles with basically 385 00:20:44,680 --> 00:20:47,480 Speaker 1: celebrity deep fakes. So yeah, I mean, a lot 386 00:20:47,520 --> 00:20:49,440 Speaker 1: of big companies, a lot of big companies that have, 387 00:20:49,720 --> 00:20:52,600 Speaker 1: you know, gigantic, vast teams of people you would think 388 00:20:52,640 --> 00:20:55,120 Speaker 1: trying to stop this stuff, have also struggled. But they 389 00:20:55,160 --> 00:20:59,879 Speaker 1: just haven't struggled in such a spectacularly comic way as the 390 00:21:00,080 --> 00:21:02,359 Speaker 1: banning of Taylor Swift. Also, the risk isn't just like 391 00:21:02,400 --> 00:21:05,159 Speaker 1: getting sued by Taylor Swift. It's like upsetting your users. 392 00:21:05,240 --> 00:21:06,840 Speaker 1: I don't know if other people have noticed this, but 393 00:21:06,920 --> 00:21:09,800 Speaker 1: like, sometimes you're just searching for something on Twitter and 394 00:21:09,840 --> 00:21:12,040 Speaker 1: there will just be like porn that'll just like 395 00:21:12,160 --> 00:21:14,800 Speaker 1: find its way in, that has found its way past 396 00:21:14,840 --> 00:21:17,600 Speaker 1: the filters and into your feed.
And 397 00:21:17,880 --> 00:21:20,240 Speaker 1: again, when you're talking about like the Taylor Swift fandom, 398 00:21:20,240 --> 00:21:21,880 Speaker 1: I probably don't have to tell people, people feel 399 00:21:22,000 --> 00:21:23,880 Speaker 1: very strongly about it. Like, I don't think they want 400 00:21:23,920 --> 00:21:25,639 Speaker 1: to see, normal people don't want to 401 00:21:25,640 --> 00:21:27,440 Speaker 4: see Taylor Swift revenge porn. 402 00:21:27,800 --> 00:21:30,000 Speaker 1: And that might be enough to turn, you know, large 403 00:21:30,040 --> 00:21:33,239 Speaker 1: numbers of people off of X permanently. 404 00:21:33,400 --> 00:21:37,720 Speaker 5: And advertisers, Max, right? Like, we could have an hour 405 00:21:37,760 --> 00:21:41,959 Speaker 5: long conversation about advertisers fleeing X and Elon, but this 406 00:21:42,040 --> 00:21:44,480 Speaker 5: type of thing does not help them at all, because 407 00:21:44,680 --> 00:21:48,520 Speaker 5: Coca Cola, Apple, Disney, pick your brand, the last thing 408 00:21:48,560 --> 00:21:50,320 Speaker 5: they want to do is be handing over their money 409 00:21:50,320 --> 00:21:54,920 Speaker 5: to a platform that is, you know, known for Taylor Swift. 410 00:21:55,280 --> 00:21:59,159 Speaker 1: Well, it's quite the advertising deck, like Linda on Madison 411 00:21:59,200 --> 00:22:01,720 Speaker 1: Avenue being like, listen, we've got some great products for you. 412 00:22:01,760 --> 00:22:04,800 Speaker 4: We've got the NFL, we've got deep fake porn on 413 00:22:04,800 --> 00:22:05,960 Speaker 4: one side, exactly. 414 00:22:06,119 --> 00:22:09,439 Speaker 2: Yeah, yeah, Kurt, we will see how they do in 415 00:22:09,480 --> 00:22:13,320 Speaker 2: two weeks when the Kansas City Chiefs, Travis Kelce, and 416 00:22:13,520 --> 00:22:15,800 Speaker 2: Taylor Swift take on the San Francisco Forty Niners, and 417 00:22:15,800 --> 00:22:19,120 Speaker 2: we'll see how X holds up then.
Thanks for joining us. Yeah, 418 00:22:19,200 --> 00:22:19,679 Speaker 2: my pleasure. 419 00:22:19,720 --> 00:22:20,080 Speaker 5: Thank you. 420 00:22:24,080 --> 00:22:27,720 Speaker 2: We are now joined by our ace Tesla reporter, Dana 421 00:22:27,760 --> 00:22:32,439 Speaker 2: Hull. Hello, Dana. Thanks for having me. Okay, so last 422 00:22:32,520 --> 00:22:36,680 Speaker 2: week, and we previewed this on the show, Tesla released 423 00:22:37,119 --> 00:22:41,680 Speaker 2: fourth quarter earnings and the market gave it, I said 424 00:22:41,680 --> 00:22:43,640 Speaker 2: earlier, a thumbs down. I actually think they gave it two 425 00:22:43,680 --> 00:22:48,120 Speaker 2: thumbs down. The stock cratered towards the end of last week. Actually, 426 00:22:48,160 --> 00:22:50,520 Speaker 2: in all of the S and P five hundred index, 427 00:22:50,520 --> 00:22:52,600 Speaker 2: there are only two companies year to date that have 428 00:22:52,680 --> 00:22:55,720 Speaker 2: fallen more than Tesla, and they are two companies 429 00:22:56,000 --> 00:23:00,959 Speaker 2: mired in major crises. I know the stock is bouncing 430 00:23:01,000 --> 00:23:04,119 Speaker 2: back some now, but still it seems like this was 431 00:23:04,160 --> 00:23:07,760 Speaker 2: a flop. Dana, tell us, why did it fall so much? 432 00:23:08,320 --> 00:23:11,119 Speaker 2: Was it something that was actually in the earnings report, 433 00:23:11,200 --> 00:23:14,560 Speaker 2: or was it something that was said that unnerved investors 434 00:23:14,560 --> 00:23:14,960 Speaker 2: so much? 435 00:23:16,680 --> 00:23:19,439 Speaker 3: Well, it was both. The January earnings call is all 436 00:23:19,480 --> 00:23:22,840 Speaker 3: about forward looking guidance, and this was a company that 437 00:23:22,880 --> 00:23:24,919 Speaker 3: did not really give any guidance.
They just said that 438 00:23:24,960 --> 00:23:26,960 Speaker 3: things were going to be lower and that they're in 439 00:23:27,040 --> 00:23:30,320 Speaker 3: between two waves of growth. Tesla is currently between two 440 00:23:30,400 --> 00:23:31,320 Speaker 3: major growth waves. 441 00:23:31,760 --> 00:23:33,760 Speaker 5: We're focused on making sure that our next growth wave, 442 00:23:33,840 --> 00:23:37,399 Speaker 5: driven by next gen vehicle, energy storage, full self driving 443 00:23:37,400 --> 00:23:39,880 Speaker 5: and other projects, is executed as well as possible. 444 00:23:40,280 --> 00:23:42,920 Speaker 3: And you know, there was not a number, like how 445 00:23:42,960 --> 00:23:45,240 Speaker 3: many cars are they going to deliver in twenty twenty four. 446 00:23:45,320 --> 00:23:47,480 Speaker 3: They didn't say two million, they didn't say two point 447 00:23:47,560 --> 00:23:50,280 Speaker 3: five million. They just said that growth could be lower 448 00:23:50,880 --> 00:23:53,560 Speaker 3: and that they're in between two waves. And that was 449 00:23:53,600 --> 00:23:56,800 Speaker 3: sort of unnerving. And then as the call went on, 450 00:23:57,119 --> 00:23:59,760 Speaker 3: like, things just sort of, you know, Elon was very 451 00:23:59,800 --> 00:24:02,760 Speaker 3: like sunny and optimistic and upbeat in terms of his tone, 452 00:24:02,960 --> 00:24:07,440 Speaker 3: but the lack of guidance was really disturbing. And yeah, 453 00:24:07,480 --> 00:24:09,560 Speaker 3: the stock is now down twenty two percent so far 454 00:24:09,640 --> 00:24:10,520 Speaker 3: in twenty twenty four. 455 00:24:11,040 --> 00:24:13,520 Speaker 2: You know, Max, my take, as I was looking 456 00:24:13,560 --> 00:24:17,119 Speaker 2: at this briefly, was that you have a company in Tesla. 457 00:24:17,160 --> 00:24:19,840 Speaker 2: It has long been this great growth stock.
Right. All 458 00:24:19,880 --> 00:24:22,640 Speaker 2: stocks, investors see stocks in two categories, either a growth 459 00:24:22,640 --> 00:24:25,119 Speaker 2: stock or a value stock, and a growth stock is 460 00:24:25,800 --> 00:24:28,160 Speaker 2: a stock that you were buying in anticipation of great 461 00:24:28,200 --> 00:24:30,679 Speaker 2: growth going forward. In the last few years, you know, 462 00:24:30,760 --> 00:24:34,480 Speaker 2: around sixty percent a year; that has slowed to twenty percent, 463 00:24:34,520 --> 00:24:37,080 Speaker 2: and that seems to be roughly what people are forecasting 464 00:24:37,119 --> 00:24:40,439 Speaker 2: going forward. So I guess, you know, when you're valued 465 00:24:40,440 --> 00:24:43,000 Speaker 2: as much as it was, eight hundred billion dollars at 466 00:24:43,000 --> 00:24:45,639 Speaker 2: one point not too recently, I guess this is just 467 00:24:45,640 --> 00:24:47,920 Speaker 2: sort of like a gravity moment here, right? Yeah. 468 00:24:47,960 --> 00:24:50,520 Speaker 1: I mean, I think that's, and Elon himself kind of 469 00:24:50,560 --> 00:24:52,959 Speaker 1: said that. You know, he's sort of attempting, as 470 00:24:53,040 --> 00:24:55,720 Speaker 1: Dana says, to kind of lower expectations. He said, you know, 471 00:24:55,760 --> 00:24:57,240 Speaker 1: at a certain point, you know, the growth is going 472 00:24:57,320 --> 00:24:59,400 Speaker 1: to, like, the law of large numbers comes in 473 00:24:59,440 --> 00:25:02,520 Speaker 1: and you can't grow fifty percent a year.
I 474 00:25:02,600 --> 00:25:06,280 Speaker 1: think part of what's going on here is 475 00:25:06,320 --> 00:25:11,200 Speaker 1: it's not just that Elon Musk is giving these predictions 476 00:25:11,280 --> 00:25:13,800 Speaker 1: or non predictions that are dour, but when he's trying 477 00:25:13,840 --> 00:25:16,960 Speaker 1: to tell the growth story, it's getting harder and harder 478 00:25:17,280 --> 00:25:19,840 Speaker 1: to kind of conceptualize. Like, during the call, he talked 479 00:25:19,880 --> 00:25:24,439 Speaker 1: about Optimus, the Tesla robot, as this kind of be 480 00:25:24,560 --> 00:25:25,000 Speaker 1: all end all. 481 00:25:25,040 --> 00:25:26,679 Speaker 4: He said it is going to be the most valuable 482 00:25:26,720 --> 00:25:27,280 Speaker 4: product of 483 00:25:27,280 --> 00:25:29,679 Speaker 1: all time. And on the call itself, you had like 484 00:25:29,760 --> 00:25:33,040 Speaker 1: one of Elon's own employees kind of jumping in and saying, well, 485 00:25:33,119 --> 00:25:35,440 Speaker 1: the big problem is, for now, we can't figure out 486 00:25:35,440 --> 00:25:37,760 Speaker 1: anything to do with Optimus. So there's kind of this 487 00:25:37,960 --> 00:25:42,560 Speaker 1: gap between Elon's version of what the company is and 488 00:25:42,640 --> 00:25:44,880 Speaker 1: the kind of Wall Street version of the company. 489 00:25:44,960 --> 00:25:49,359 Speaker 2: Do we believe that that employee is still employed at Tesla? 490 00:25:49,640 --> 00:25:52,639 Speaker 1: It did not sound good, because, you know, 491 00:25:52,720 --> 00:25:55,040 Speaker 1: when you're getting into an argument with the boss on 492 00:25:55,160 --> 00:25:56,760 Speaker 1: the call, it's.
493 00:25:56,960 --> 00:25:59,080 Speaker 2: There's a pretty big gap between the greatest product of 494 00:25:59,080 --> 00:26:00,560 Speaker 2: all time and we have no idea what to 495 00:26:00,680 --> 00:26:01,240 Speaker 2: do with this thing. 496 00:26:01,440 --> 00:26:03,920 Speaker 3: I think what's really interesting is that, okay, so Tesla 497 00:26:04,000 --> 00:26:07,040 Speaker 3: fundamentally is a car company. Most of their revenue comes 498 00:26:07,080 --> 00:26:10,120 Speaker 3: from cars. However, they don't really have a car right now. 499 00:26:10,160 --> 00:26:13,480 Speaker 3: Like, the Cybertruck is barely in production. I mean, 500 00:26:13,600 --> 00:26:15,760 Speaker 3: the Y is the bestseller, but like, the Y is 501 00:26:15,800 --> 00:26:17,560 Speaker 3: getting old. So you're not going to see this like 502 00:26:17,640 --> 00:26:19,920 Speaker 3: big wave of growth until they come out with their 503 00:26:20,359 --> 00:26:23,800 Speaker 3: next generation platform, which they basically said would be like 504 00:26:24,040 --> 00:26:27,479 Speaker 3: late twenty twenty five, maybe. So in the meantime, what 505 00:26:27,520 --> 00:26:29,760 Speaker 3: do you tell Wall Street about? You talk about, oh, 506 00:26:29,800 --> 00:26:32,240 Speaker 3: we're an AI company, we're a robots company, and they've 507 00:26:32,280 --> 00:26:35,400 Speaker 3: got this Optimus robot. We've got this Dojo computer, which 508 00:26:35,440 --> 00:26:38,200 Speaker 3: is now a long shot. We have an energy business. 509 00:26:38,200 --> 00:26:40,160 Speaker 3: So they have to talk about their other 510 00:26:40,240 --> 00:26:45,959 Speaker 3: products, because the core automotive product is just, like, you know, 511 00:26:46,040 --> 00:26:47,960 Speaker 3: they barely talked about the Cybertruck on the call 512 00:26:48,000 --> 00:26:49,520 Speaker 3: at all.
I mean, it was like a 513 00:26:49,600 --> 00:26:50,320 Speaker 3: huge omission. 514 00:26:50,359 --> 00:26:54,600 Speaker 2: But having said that, I do see on my Tesla 515 00:26:54,680 --> 00:27:00,280 Speaker 2: earnings Bingo card here the Cybertruck box was checked. And 516 00:27:00,320 --> 00:27:02,600 Speaker 2: I want to talk about this Bingo card. I missed 517 00:27:02,640 --> 00:27:06,120 Speaker 2: this last week. I didn't play, and I'm a little disappointed. 518 00:27:06,480 --> 00:27:09,600 Speaker 2: But it seems like we came close to getting 519 00:27:09,600 --> 00:27:13,000 Speaker 2: Bingo but came up short here. Is that right? We 520 00:27:13,000 --> 00:27:14,520 Speaker 2: didn't quite, we couldn't quite pull it off. 521 00:27:15,240 --> 00:27:18,400 Speaker 1: No. And I will say that Elon Musk's, you know, frantic, 522 00:27:19,000 --> 00:27:21,399 Speaker 1: you know, citing of all the future products, trying to come 523 00:27:21,480 --> 00:27:22,960 Speaker 1: up with some kind of growth story, was gold for 524 00:27:23,000 --> 00:27:26,040 Speaker 1: Bingo, because we had Dojo, we had Optimus, we had, 525 00:27:26,080 --> 00:27:27,920 Speaker 1: you know, demand for Teslas that is unlimited. 526 00:27:27,960 --> 00:27:30,000 Speaker 4: This was good for Bingo, but there was. 527 00:27:30,000 --> 00:27:32,080 Speaker 2: But bad for the stock. Good for Bingo, bad for 528 00:27:32,119 --> 00:27:34,399 Speaker 2: the stock, which has been a hypothesis of 529 00:27:34,440 --> 00:27:35,040 Speaker 2: mine for many. 530 00:27:35,000 --> 00:27:37,200 Speaker 4: There are hedge funds all over the building with models that, you 531 00:27:37,240 --> 00:27:37,800 Speaker 4: know, correlate.
532 00:27:37,960 --> 00:27:39,960 Speaker 2: But I'll also say this, and I don't know, Dana, 533 00:27:40,040 --> 00:27:42,439 Speaker 2: if you, if you played a hand in formulating this 534 00:27:42,520 --> 00:27:45,439 Speaker 2: Bingo card here, I feel like it was slightly rigged 535 00:27:45,440 --> 00:27:48,160 Speaker 2: for us not to get it. I mean, audibly puffs a 536 00:27:48,240 --> 00:27:51,360 Speaker 2: joint was only put on the card to block. There 537 00:27:51,440 --> 00:27:55,600 Speaker 2: was no chance he was going to audibly puff a joint. No, no, no, no, no. 538 00:27:55,760 --> 00:27:58,720 Speaker 2: Hold on. Also, digging our own grave. I mean, those 539 00:27:58,720 --> 00:28:01,400 Speaker 2: are the two that blocked you hitting Bingo. He wasn't 540 00:28:01,440 --> 00:28:04,200 Speaker 2: going to repeat your own. You guys rigged this thing. 541 00:28:04,280 --> 00:28:04,879 Speaker 2: It was written. 542 00:28:04,960 --> 00:28:07,399 Speaker 3: No, he repeats himself. He repeats himself all 543 00:28:07,440 --> 00:28:10,119 Speaker 3: the time. I mean, that's the thing about Elon. As 544 00:28:10,560 --> 00:28:12,880 Speaker 3: our former colleague Sean O'Kane used to say, he's 545 00:28:12,880 --> 00:28:16,600 Speaker 3: like a comedian practicing his bit. Like, he says a 546 00:28:16,600 --> 00:28:19,240 Speaker 3: lot of stock phrases over and over and over again.
547 00:28:19,240 --> 00:28:21,359 Speaker 1: But I want to direct your attention to, I believe 548 00:28:21,359 --> 00:28:25,360 Speaker 1: it's N5, uh, production is hard, which is something 549 00:28:25,400 --> 00:28:28,400 Speaker 1: that Elon Musk has reiterated over and over and over 550 00:28:28,440 --> 00:28:32,240 Speaker 1: again recently, and he very easily could have said that 551 00:28:32,320 --> 00:28:35,320 Speaker 1: on the, uh, call. You don't need audibly puffs joint, 552 00:28:35,480 --> 00:28:39,520 Speaker 1: and I want to remind you, audibly puffs joint was realistic, you 553 00:28:39,440 --> 00:28:41,880 Speaker 2: know. I stand corrected. Can you give me my 554 00:28:41,880 --> 00:28:46,280 Speaker 2: bingo card back? All right, let's end it there. Thank 555 00:28:46,320 --> 00:28:49,200 Speaker 2: you for listening to Elon Inc. and thanks to Dana 556 00:28:49,760 --> 00:28:50,360 Speaker 2: and Max. 557 00:28:51,160 --> 00:28:51,960 Speaker 4: Great to be here. 558 00:28:52,960 --> 00:28:53,719 Speaker 3: Always a pleasure. 559 00:29:01,040 --> 00:29:04,600 Speaker 2: This episode was produced by Stacy Wong. Naomi Shaven and 560 00:29:04,680 --> 00:29:08,760 Speaker 2: Rayhan Harmanci are our senior editors. The idea for this 561 00:29:09,000 --> 00:29:13,800 Speaker 2: very show also came from Rayhan. Blake Maples handles engineering, 562 00:29:13,840 --> 00:29:17,320 Speaker 2: and we get special editing assistance from Jeff Grocott. Our 563 00:29:17,480 --> 00:29:21,719 Speaker 2: supervising producer is Magnus Henrickson. Huge thanks to Angel Rascio 564 00:29:21,760 --> 00:29:25,200 Speaker 2: and Joel Weber. The Elon Inc. theme is written and 565 00:29:25,240 --> 00:29:29,720 Speaker 2: performed by Taka Yasuzawa and Alex Sugiura. Sage Bauman is 566 00:29:29,760 --> 00:29:32,719 Speaker 2: the head of Bloomberg Podcasts and our executive producer. I 567 00:29:32,760 --> 00:29:36,040 Speaker 2: am David Papadopoulos.
If you have a minute, rate and 568 00:29:36,080 --> 00:29:39,440 Speaker 2: review our show. It'll help other listeners find us. See 569 00:29:39,480 --> 00:29:40,080 Speaker 2: you next week.