1 00:00:02,480 --> 00:00:07,000 Speaker 1: Bloomberg Audio Studios, Podcasts, radio news. 2 00:00:07,920 --> 00:00:10,559 Speaker 2: Let me tell you, we have a new star. A 3 00:00:10,720 --> 00:00:14,680 Speaker 2: star is born. Elon On Mars Juthan Kennemy. 4 00:00:14,880 --> 00:00:18,520 Speaker 3: He is the Thomas Edison plus, plus, plus of our age. 5 00:00:18,600 --> 00:00:21,040 Speaker 2: Probably his whole life is from a position of insecurity. 6 00:00:21,079 --> 00:00:23,200 Speaker 2: I feel for the guy. I would say ninety eight 7 00:00:23,239 --> 00:00:25,319 Speaker 2: percent really appreciate what he does. 8 00:00:25,400 --> 00:00:28,440 Speaker 3: But those two percent that are nasty, they are up 9 00:00:28,560 --> 00:00:29,800 Speaker 3: pay in four fos. 10 00:00:30,280 --> 00:00:32,280 Speaker 2: We are meant for great things in the United States 11 00:00:32,320 --> 00:00:34,560 Speaker 2: of America, and Elon reminds 12 00:00:34,320 --> 00:00:34,640 Speaker 4: us of that. 13 00:00:35,840 --> 00:00:37,400 Speaker 2: I'm very disappointed in Elon. 14 00:00:37,479 --> 00:00:39,600 Speaker 3: I've helped Elon a lot. 15 00:00:38,680 --> 00:00:38,959 Speaker 1: A lot. 16 00:00:46,520 --> 00:00:50,600 Speaker 2: Welcome to Elon, Inc., Bloomberg's weekly podcast about Elon Musk. It's Tuesday, 17 00:00:50,680 --> 00:00:54,960 Speaker 2: June tenth. I'm your host, David Papadopoulos. So the wild feud, 18 00:00:55,480 --> 00:00:58,360 Speaker 2: the feud to end all feuds between Elon and the 19 00:00:58,360 --> 00:01:01,600 Speaker 2: President, appears to have simmered down a bit for the moment. 20 00:01:02,480 --> 00:01:06,720 Speaker 2: Salacious posts are being deleted, niceties exchanged, but the 21 00:01:06,840 --> 00:01:09,959 Speaker 2: damage to their alliance is done, and we'll break it 22 00:01:10,040 --> 00:01:13,800 Speaker 2: all down. Also, a big date is upon us this Thursday. 23 00:01:13,920 --> 00:01:18,840 Speaker 2: Tesla will apparently finally start its long-awaited robotaxi service 24 00:01:18,840 --> 00:01:21,800 Speaker 2: in Austin, Texas. Now, the torching of several self-driving 25 00:01:21,800 --> 00:01:24,360 Speaker 2: Waymos in LA is perhaps an ominous sign for the launch, 26 00:01:24,560 --> 00:01:27,720 Speaker 2: as is the bombshell report by our very own Dana 27 00:01:27,800 --> 00:01:30,920 Speaker 2: Hall about a lethal accident involving a Tesla and its 28 00:01:31,000 --> 00:01:34,520 Speaker 2: self-driving mode. But perhaps for Musk Inc. it is an opportunity 29 00:01:34,560 --> 00:01:37,800 Speaker 2: to draw attention to something other than his brawl with Trump. 30 00:01:38,560 --> 00:01:40,959 Speaker 2: Then at the end, we're gonna have a little feud watch. 31 00:01:41,440 --> 00:01:44,000 Speaker 2: So lots and lots to go through, and let's get 32 00:01:44,080 --> 00:01:48,120 Speaker 2: to it with our regulars, the aforementioned Dana Hall and 33 00:01:48,280 --> 00:01:50,080 Speaker 2: Max Chafkin. Hello to you both. 34 00:01:50,520 --> 00:01:51,880 Speaker 3: Hey, David, happy to be here. 35 00:01:52,040 --> 00:01:52,880 Speaker 5: Good day, David. 36 00:01:53,440 --> 00:01:55,880 Speaker 2: How are you guys doing? You look, you're smiling again, 37 00:01:55,920 --> 00:02:01,200 Speaker 1: Max. I mean, crazy week last week. I'm still, I'm 38 00:02:01,280 --> 00:02:02,240 Speaker 1: still, still abuzz. 39 00:02:02,240 --> 00:02:06,240 Speaker 2: You're still aglow. So we spoke.
You did an 40 00:02:06,280 --> 00:02:09,880 Speaker 2: emergency pod on Thursday with Josh, and then you and 41 00:02:09,960 --> 00:02:15,120 Speaker 2: I did the Friday one. Stuff keeps happening, although I 42 00:02:15,120 --> 00:02:18,080 Speaker 2: guess maybe tacking back in the other direction. Bring us 43 00:02:18,160 --> 00:02:19,000 Speaker 2: now up to speed. 44 00:02:19,280 --> 00:02:22,280 Speaker 1: Yeah, we, I mean, I think we started to kind 45 00:02:22,280 --> 00:02:24,840 Speaker 1: of hint at this in our conversation on 46 00:02:24,760 --> 00:02:27,760 Speaker 5: Friday. Elon Musk, I mean, 47 00:02:27,680 --> 00:02:32,120 Speaker 1: you know, pretty quickly started to at least turn 48 00:02:32,480 --> 00:02:38,880 Speaker 1: the dial down. He deleted the most incendiary tweet, 49 00:02:39,080 --> 00:02:42,680 Speaker 1: the one about Jeffrey Epstein. He, I believe, deleted the 50 00:02:42,680 --> 00:02:44,880 Speaker 1: tweet calling for Donald Trump to be impeached. 51 00:02:45,080 --> 00:02:46,480 Speaker 2: He's deleted at least two. 52 00:02:46,880 --> 00:02:47,200 Speaker 5: Yeah. 53 00:02:47,280 --> 00:02:52,000 Speaker 1: He's also gone back to sort of something resembling his 54 00:02:52,400 --> 00:02:55,359 Speaker 1: regularly scheduled programming. At the end of last week, he 55 00:02:55,880 --> 00:02:58,520 Speaker 1: like just started tweeting about, you know, a big win 56 00:02:58,680 --> 00:03:01,520 Speaker 1: for Starlink in Dominica, you know, the island, I guess, 57 00:03:01,560 --> 00:03:03,239 Speaker 1: which I thought was kind of funny because it's like, okay, 58 00:03:03,240 --> 00:03:06,200 Speaker 1: here's a world leader who 59 00:03:06,200 --> 00:03:09,200 Speaker 1: does not hate me. But in any case, he's been 60 00:03:09,240 --> 00:03:13,880 Speaker 1: talking about his companies again. He is also sort of 61 00:03:14,720 --> 00:03:19,800 Speaker 1: amplifying Trump administration messaging on the protests. I believe he 62 00:03:20,000 --> 00:03:23,160 Speaker 1: praise-tweeted a JD Vance statement as well as a 63 00:03:23,240 --> 00:03:27,679 Speaker 1: Donald Trump Truth. He refollowed Stephen Miller, for those who 64 00:03:27,680 --> 00:03:29,120 Speaker 1: are following that drama. 65 00:03:29,040 --> 00:03:31,120 Speaker 5: Got it, so so yeah. 66 00:03:31,160 --> 00:03:34,560 Speaker 1: I mean, he's sort of trying to edge his way 67 00:03:35,360 --> 00:03:40,280 Speaker 1: back into the good graces of Donald Trump without explicitly apologizing. 68 00:03:40,360 --> 00:03:42,960 Speaker 1: He hasn't done that or really come close to doing that. 69 00:03:43,200 --> 00:03:46,800 Speaker 2: Now Trump is still keeping him at arm's length, though. 70 00:03:47,320 --> 00:03:47,640 Speaker 5: Yeah. 71 00:03:47,800 --> 00:03:52,800 Speaker 1: Trump was asked yesterday about Elon Musk's drug use, and 72 00:03:53,360 --> 00:03:56,840 Speaker 1: he answered the question in a way that seemed very diplomatic. 73 00:03:56,960 --> 00:03:59,200 Speaker 5: We should just listen to that, David, I think. Yeah, 74 00:03:59,280 --> 00:03:59,840 Speaker 5: let's do that.
75 00:04:00,080 --> 00:04:03,080 Speaker 4: There was this New York Times report that he did 76 00:04:03,080 --> 00:04:06,680 Speaker 4: not want to talk about Elon Musk, that alleged that he, 77 00:04:06,920 --> 00:04:08,360 Speaker 4: towards the end of his time in the White House, 78 00:04:09,040 --> 00:04:12,640 Speaker 4: was blurring the lines between recreational use of drugs and 79 00:04:12,720 --> 00:04:14,360 Speaker 4: medicinal. 80 00:04:14,320 --> 00:04:16,080 Speaker 2: Do you think he ever had drugs here at the White House? 81 00:04:16,120 --> 00:04:17,960 Speaker 2: I really don't know. I don't think so. I hope not. 82 00:04:18,960 --> 00:04:20,800 Speaker 2: Look, I wish him well. You understand, we had a 83 00:04:20,800 --> 00:04:24,960 Speaker 2: good relationship, and I just wish him well, very well. 84 00:04:25,000 --> 00:04:30,479 Speaker 1: Actually, I mean, it's not exactly a total denial, but 85 00:04:30,760 --> 00:04:34,480 Speaker 1: he also said he didn't see anything concerning, you know. 86 00:04:34,720 --> 00:04:37,640 Speaker 1: To me, that's sort of a read that tells you 87 00:04:37,720 --> 00:04:41,960 Speaker 1: that this feud is still going on, although it does 88 00:04:42,000 --> 00:04:44,719 Speaker 1: appear that Donald Trump's trying to turn that down as well. 89 00:04:44,839 --> 00:04:48,159 Speaker 2: A day or so earlier, the Washington Post had, I 90 00:04:48,160 --> 00:04:50,720 Speaker 2: think this was over the weekend, reporting that Trump, in 91 00:04:51,200 --> 00:04:55,159 Speaker 2: phone calls with some of his advisors and such, had 92 00:04:55,240 --> 00:04:59,320 Speaker 2: called Musk a big-time drug addict. So yes, certainly 93 00:04:59,400 --> 00:05:02,960 Speaker 2: a very different tone here in this press conference. 94 00:05:03,120 --> 00:05:05,760 Speaker 1: It's a different thing to say I never saw somebody 95 00:05:05,800 --> 00:05:08,400 Speaker 1: take drugs versus I know he's not a drug addict, 96 00:05:08,520 --> 00:05:10,800 Speaker 1: right? Like, so there's, like, he could have denied it 97 00:05:10,839 --> 00:05:14,680 Speaker 1: in a more complete way, although he does back up 98 00:05:14,680 --> 00:05:16,040 Speaker 1: Elon's denials to some extent. 99 00:05:16,160 --> 00:05:18,760 Speaker 2: Okay, now, Dana, we missed you last week and we 100 00:05:18,800 --> 00:05:22,039 Speaker 2: didn't get your hot takes towards the end of the week. 101 00:05:22,120 --> 00:05:25,320 Speaker 2: So as you were observing it all, what 102 00:05:25,440 --> 00:05:26,600 Speaker 2: is your take from there? 103 00:05:26,920 --> 00:05:27,080 Speaker 1: Yeah? 104 00:05:27,120 --> 00:05:27,320 Speaker 2: I was. 105 00:05:27,360 --> 00:05:29,200 Speaker 3: I was off at the end of last week, trying 106 00:05:29,240 --> 00:05:31,839 Speaker 3: to be like a normal person. So I was following 107 00:05:31,880 --> 00:05:34,839 Speaker 3: this like tit-for-tat back and forth sort of 108 00:05:34,920 --> 00:05:37,200 Speaker 3: on my phone, but not in real time, and then 109 00:05:37,240 --> 00:05:39,440 Speaker 3: it was very hard to play catch-up anyway. It 110 00:05:39,520 --> 00:05:42,040 Speaker 3: was, it was just sort of crazy. But I thought 111 00:05:42,040 --> 00:05:44,919 Speaker 3: that what Trump said in that clip, I wish 112 00:05:44,960 --> 00:05:48,040 Speaker 3: him well. I wish him very well.
Actually, it's like, 113 00:05:48,279 --> 00:05:50,160 Speaker 3: is he wishing him well because he thinks he's a 114 00:05:50,200 --> 00:05:52,719 Speaker 3: drug addict, or is he wishing him well because the 115 00:05:52,760 --> 00:05:55,599 Speaker 3: relationship is over, or is he wishing him well because 116 00:05:55,640 --> 00:05:58,960 Speaker 3: he still needs him to finance the midterms? Like, it 117 00:05:59,040 --> 00:06:02,360 Speaker 3: was just an interesting, interesting response. 118 00:06:02,720 --> 00:06:06,359 Speaker 2: My reading of that was door number three. I wish 119 00:06:06,440 --> 00:06:09,520 Speaker 2: him well. Musk is working very hard to try to simmer 120 00:06:09,600 --> 00:06:11,560 Speaker 2: things down, and I'm gonna take a step in that 121 00:06:11,600 --> 00:06:14,680 Speaker 2: direction as well. I wish him well so that he 122 00:06:14,760 --> 00:06:15,840 Speaker 2: keeps sending me money. 123 00:06:16,640 --> 00:06:20,039 Speaker 3: Yeah, you know, my hot take was it totally reminded 124 00:06:20,080 --> 00:06:22,719 Speaker 3: me of the whole pedo guy thing from twenty eighteen, 125 00:06:22,839 --> 00:06:27,640 Speaker 3: when, like, Elon, with zero, zero proof whatsoever, basically just 126 00:06:27,680 --> 00:06:31,400 Speaker 3: like went on this rampage and called this cave rescuer 127 00:06:32,000 --> 00:06:34,680 Speaker 3: a pedo guy, and like then doubled down on it, 128 00:06:34,800 --> 00:06:37,360 Speaker 3: and just, like, when Elon is certain that he's 129 00:06:37,440 --> 00:06:39,440 Speaker 3: right about something, or when he's upset or when he 130 00:06:39,480 --> 00:06:43,479 Speaker 3: feels like he's being attacked, he just goes thermonuclear. 131 00:06:43,600 --> 00:06:46,839 Speaker 2: And specifically you're referencing the Epstein thing. 132 00:06:47,120 --> 00:06:48,919 Speaker 3: Yeah, like this is a thing, like Elon is like 133 00:06:48,960 --> 00:06:51,360 Speaker 3: obsessed with this idea that people are pedophiles. 134 00:06:51,560 --> 00:06:54,719 Speaker 1: It's the pedo guy insult all over again. I mean, the 135 00:06:54,760 --> 00:06:58,240 Speaker 1: allegations around Jeffrey Epstein, it's exactly the same insult, except 136 00:06:58,400 --> 00:07:02,520 Speaker 1: the difference, as I see it: one of those guys, one of 137 00:07:02,560 --> 00:07:05,520 Speaker 1: those guys, is president, and one was a guy who 138 00:07:05,520 --> 00:07:07,720 Speaker 1: had, you know, been on TV once. Exactly. 139 00:07:08,080 --> 00:07:09,960 Speaker 3: But for him to level this at the president, and 140 00:07:10,000 --> 00:07:12,720 Speaker 3: to do the twofer of he's in the Epstein files 141 00:07:12,760 --> 00:07:15,000 Speaker 3: and that's why they haven't been released and he should 142 00:07:15,040 --> 00:07:18,840 Speaker 3: be impeached, I was like, wow, okay, someone really needs 143 00:07:18,840 --> 00:07:20,800 Speaker 3: to, like, step in and take his phone away. In 144 00:07:20,840 --> 00:07:23,320 Speaker 3: the old days, that would have been Sam Teller, his 145 00:07:23,640 --> 00:07:25,440 Speaker 3: chief of staff. I don't know who takes the phone 146 00:07:25,440 --> 00:07:26,960 Speaker 3: away these days. Is it Katie Miller? 147 00:07:27,320 --> 00:07:28,120 Speaker 5: Is it someone else? 148 00:07:28,200 --> 00:07:31,200 Speaker 3: Like, I would love to just know, like, the apparatus.
149 00:07:31,240 --> 00:07:32,960 Speaker 3: And then there was a lot of reporting about how 150 00:07:32,960 --> 00:07:36,360 Speaker 3: this détente was reached because of intermediaries, and I'm like, well, 151 00:07:36,640 --> 00:07:39,480 Speaker 3: who exactly are the intermediaries at 152 00:07:39,480 --> 00:07:40,360 Speaker 3: this stage of the game? 153 00:07:40,880 --> 00:07:43,560 Speaker 1: It's Bill Ackman and Kanye West and some anon 154 00:07:43,800 --> 00:07:48,080 Speaker 1: on Twitter, Alaska or whatever. He was like, hey, Elon, 155 00:07:48,120 --> 00:07:49,760 Speaker 1: I think you should cool down. And Elon was like, 156 00:07:49,800 --> 00:07:52,360 Speaker 1: you know, that sounds like a pretty smart idea, random 157 00:07:52,360 --> 00:07:54,080 Speaker 1: guy with a couple hundred Twitter followers. 158 00:07:54,480 --> 00:07:57,160 Speaker 5: You know, one thing I've been thinking about in the 159 00:07:57,200 --> 00:07:58,440 Speaker 5: sober light of day. 160 00:07:59,400 --> 00:08:02,640 Speaker 1: You know, as we've been talking about the 161 00:08:02,640 --> 00:08:07,640 Speaker 1: Elon-Trump kind of relationship, we have a lot of Trump reporters, 162 00:08:07,680 --> 00:08:10,720 Speaker 1: including Josh Green, who was on the podcast last week. 163 00:08:10,800 --> 00:08:12,360 Speaker 2: I said I told you so. Yeah. 164 00:08:12,360 --> 00:08:13,960 Speaker 1: I have said I told you so, yeah. And I 165 00:08:14,000 --> 00:08:17,480 Speaker 1: am happily eating crow. Although I do think we're starting 166 00:08:17,480 --> 00:08:21,000 Speaker 1: to see that my take, that this relationship may continue, 167 00:08:21,480 --> 00:08:21,960 Speaker 1: I don't know. 168 00:08:21,880 --> 00:08:24,000 Speaker 2: You want to share the crow with other people? 169 00:08:24,080 --> 00:08:25,680 Speaker 5: Yeah, it's not, like, as bad as it was last week. 170 00:08:25,960 --> 00:08:29,160 Speaker 1: I do wonder if Elon is sort of a combination 171 00:08:29,840 --> 00:08:33,520 Speaker 1: of, like, one of these Trump advisors who were sort 172 00:08:33,559 --> 00:08:35,480 Speaker 1: of chewed up and spit out by the White House, 173 00:08:35,960 --> 00:08:39,560 Speaker 1: and a world leader. You know, Elon is part Scaramucci 174 00:08:39,920 --> 00:08:43,520 Speaker 1: and part Kim Jong Un. And I say that because Elon, 175 00:08:44,440 --> 00:08:48,640 Speaker 1: because that's quite a merging, has leverage, you know, and 176 00:08:48,679 --> 00:08:51,280 Speaker 1: we saw it over, we saw it last week when 177 00:08:51,320 --> 00:08:55,040 Speaker 1: he was threatening to, you know, cancel the Dragon program. 178 00:08:55,280 --> 00:08:59,520 Speaker 2: He has the, kind of, like, the, you know, campaign donations, 179 00:09:00,160 --> 00:09:02,920 Speaker 2: Dragon, all those things, right. But Max, at the 180 00:09:02,960 --> 00:09:05,679 Speaker 2: same time, you can't tell me that he doesn't have 181 00:09:05,720 --> 00:09:08,040 Speaker 2: more to lose in all this than Trump. No, 182 00:09:08,120 --> 00:09:09,320 Speaker 2: he totally does.
183 00:09:09,360 --> 00:09:11,640 Speaker 1: But I just think there's a way to reach kind 184 00:09:11,679 --> 00:09:15,680 Speaker 1: of an equilibrium where they are neither friends nor enemies, 185 00:09:15,679 --> 00:09:17,360 Speaker 1: which is kind of the way it works with all 186 00:09:17,400 --> 00:09:20,199 Speaker 1: these world leaders, where Trump one day will be hinting 187 00:09:20,240 --> 00:09:23,319 Speaker 1: at the possibility of nuclear war, the next minute saying 188 00:09:23,360 --> 00:09:25,000 Speaker 1: I love this guy, you know. And we've seen it 189 00:09:25,000 --> 00:09:27,520 Speaker 1: with Kim Jong Un, we've seen it with Xi Jinping, we've 190 00:09:27,520 --> 00:09:30,560 Speaker 1: seen it with Vladimir Putin, where he will swing rather 191 00:09:30,720 --> 00:09:34,280 Speaker 1: wildly in the context of a negotiation. And I kind 192 00:09:34,280 --> 00:09:36,640 Speaker 1: of think that's a little bit of what we saw 193 00:09:36,840 --> 00:09:37,320 Speaker 1: last week. 194 00:09:38,240 --> 00:09:39,880 Speaker 3: I would just like to point out that we had 195 00:09:39,920 --> 00:09:42,640 Speaker 3: all this BS for so many episodes about the cage 196 00:09:42,679 --> 00:09:45,120 Speaker 3: match and what the real cage match was, and David, 197 00:09:45,640 --> 00:09:48,560 Speaker 3: this was the cage match. This is the true cage match. 198 00:09:48,679 --> 00:09:53,199 Speaker 3: Trump and, like, Bannon in one corner, Katie Miller in 199 00:09:53,240 --> 00:09:53,680 Speaker 3: another, I 200 00:09:53,679 --> 00:09:57,320 Speaker 2: don't know. Like, it's just, Bannon is Trump's cut man. 201 00:09:57,400 --> 00:10:00,600 Speaker 2: Who, who's Musk's cut man? Yeah, that's, it's not, it's 202 00:10:00,600 --> 00:10:04,959 Speaker 2: not Katie Miller. Ackman, or Kanye, or Ackman the ref? No, 203 00:10:05,080 --> 00:10:08,200 Speaker 2: it's true. It's the cage match to end all cage matches, 204 00:10:08,400 --> 00:10:10,120 Speaker 2: and now, right, each side has gone to 205 00:10:10,160 --> 00:10:18,480 Speaker 2: their respective corners, at least for the time being. Okay, now, Dana. 206 00:10:18,600 --> 00:10:24,160 Speaker 2: On Thursday, Tesla has a big day of sorts, the 207 00:10:24,400 --> 00:10:29,240 Speaker 2: launch of the vaunted robotaxi. Tell us, or remind us, 208 00:10:29,280 --> 00:10:32,600 Speaker 2: I should say, about the specifics and what's at stake 209 00:10:32,679 --> 00:10:34,160 Speaker 2: here for Tesla and Musk. 210 00:10:34,679 --> 00:10:36,240 Speaker 3: I don't want to go overboard and say it's like 211 00:10:36,280 --> 00:10:38,920 Speaker 3: a big launch event, you know. Bloomberg has reported, 212 00:10:39,040 --> 00:10:41,880 Speaker 3: my colleague Ed Ludlow has reported, that Tesla was sort 213 00:10:41,880 --> 00:10:44,240 Speaker 3: of eyeing June twelfth as the day that they would 214 00:10:44,280 --> 00:10:49,040 Speaker 3: begin some kind of robotaxi service in Austin, Texas. We 215 00:10:49,120 --> 00:10:52,600 Speaker 3: don't know if this is paid customers, if it's employees, 216 00:10:52,720 --> 00:10:56,400 Speaker 3: if it's just, like, delivering, you know, vehicles from the 217 00:10:56,440 --> 00:10:59,559 Speaker 3: factory to new owners.
We know that Tesla has been 218 00:11:01,160 --> 00:11:04,960 Speaker 3: running FSD Supervised on some Austin streets downtown, but, like, 219 00:11:05,120 --> 00:11:07,280 Speaker 3: there's no event. Like, no, I don't know of any 220 00:11:07,320 --> 00:11:10,640 Speaker 3: investor or sell-side Wall Street analyst who's gotten, like, an 221 00:11:10,679 --> 00:11:14,440 Speaker 3: invitation to some big show. So it could be like 222 00:11:14,480 --> 00:11:18,600 Speaker 3: a very soft launch, and, you know, people within Austin, 223 00:11:18,679 --> 00:11:20,800 Speaker 3: in terms of, like, Austin city officials, have been pretty mum 224 00:11:20,880 --> 00:11:22,360 Speaker 3: about it. So there's still just, like, a lot that 225 00:11:22,400 --> 00:11:24,320 Speaker 3: we don't know. I mean, if you live in Austin, Texas, 226 00:11:24,600 --> 00:11:26,560 Speaker 3: give us a holler, because we're trying to figure it out. 227 00:11:26,640 --> 00:11:28,840 Speaker 3: I mean, I was thinking, should I go to Austin, 228 00:11:28,920 --> 00:11:30,400 Speaker 3: and then I'm like, well, I don't know what I 229 00:11:30,400 --> 00:11:33,720 Speaker 3: would actually see, because I just think that they are 230 00:11:33,920 --> 00:11:36,360 Speaker 3: trying to tamp down expectations a little bit as to what 231 00:11:36,400 --> 00:11:37,160 Speaker 3: exactly this will be. 232 00:11:37,440 --> 00:11:40,040 Speaker 2: And by the way, so for those listeners in Austin, Texas, 233 00:11:40,040 --> 00:11:42,480 Speaker 2: they should write in to us. Max, what is our 234 00:11:42,520 --> 00:11:43,960 Speaker 2: address again? Elon Inc. 235 00:11:44,000 --> 00:11:45,480 Speaker 5: at Bloomberg dot net. 236 00:11:45,400 --> 00:11:47,480 Speaker 2: And give us any hot tips on that. But I 237 00:11:47,520 --> 00:11:50,560 Speaker 2: do wonder, Max, if Dana's last point is exactly it, 238 00:11:51,000 --> 00:11:53,280 Speaker 2: that this went from, a year or so ago, being, 239 00:11:53,320 --> 00:11:54,920 Speaker 2: man, this is going to be big, it's gonna 240 00:11:54,920 --> 00:11:56,960 Speaker 2: be huge, we're gonna have bunting up all over the 241 00:11:57,040 --> 00:11:58,200 Speaker 2: stadium, and it's going to be 242 00:11:58,200 --> 00:12:01,720 Speaker 1: the biggest launch you could possibly imagine. And 243 00:12:01,720 --> 00:12:05,240 Speaker 1: it's particularly noteworthy because this is a company, this 244 00:12:05,320 --> 00:12:07,960 Speaker 1: is a guy who is kind of known for these 245 00:12:08,160 --> 00:12:14,160 Speaker 1: big events. Elon Musk really likes to do these grand shows. 246 00:12:14,200 --> 00:12:15,800 Speaker 2: Why, he's going to be so bitter that he misses 247 00:12:15,800 --> 00:12:18,200 Speaker 2: the crowds screaming his name. 248 00:12:18,400 --> 00:12:21,520 Speaker 5: And you know, they started to signal this during the 249 00:12:21,559 --> 00:12:22,079 Speaker 5: earnings call. 250 00:12:22,120 --> 00:12:24,520 Speaker 1: We talked about this on the episode that we 251 00:12:24,559 --> 00:12:27,480 Speaker 1: did immediately after the last Tesla earnings. We're talking about 252 00:12:27,880 --> 00:12:35,079 Speaker 1: ten to twenty taxis, with teleoperation, in a very restricted area. 253 00:12:35,200 --> 00:12:38,079 Speaker 1: It also sounded like on the earnings call that they 254 00:12:38,080 --> 00:12:41,239 Speaker 1: are kitting these things out with more sensors than 255 00:12:41,280 --> 00:12:45,840 Speaker 1: at least the standard Model
Y has. So a lot 256 00:12:45,880 --> 00:12:49,280 Speaker 1: of reasons to not look at this as a true commercial 257 00:12:48,920 --> 00:12:52,560 Speaker 5: launch, and really not anywhere near a true commercial launch. 258 00:12:52,640 --> 00:12:54,600 Speaker 5: But, you know, maybe, who knows. 259 00:12:55,000 --> 00:12:58,360 Speaker 1: I'm prepared to be surprised, because that's what Elon Musk does. 260 00:12:58,400 --> 00:13:01,800 Speaker 1: But we have not seen any signs that this is 261 00:13:01,880 --> 00:13:05,360 Speaker 1: going to be a big event, and in fact, I 262 00:13:05,360 --> 00:13:08,160 Speaker 1: think it's gonna be the kind of thing where investors, 263 00:13:08,280 --> 00:13:11,720 Speaker 1: maybe not on Thursday, but start to ask, hey, didn't 264 00:13:11,760 --> 00:13:13,680 Speaker 1: he promise that we would have a commercial launch? 265 00:13:13,679 --> 00:13:14,120 Speaker 5: You know what I mean? 266 00:13:14,160 --> 00:13:17,439 Speaker 2: Didn't he promise it? Didn't we friggin' bid this price 267 00:13:17,480 --> 00:13:19,400 Speaker 2: of the stock up a whole heck of a lot 268 00:13:19,480 --> 00:13:22,000 Speaker 2: based on all this, you know. And so, but 269 00:13:22,120 --> 00:13:23,800 Speaker 2: you know what, one of the interesting things is, 270 00:13:24,160 --> 00:13:26,160 Speaker 2: they always seem to forget that they had bid it, 271 00:13:26,400 --> 00:13:29,320 Speaker 2: they did price the stock up, and then when it doesn't, 272 00:13:29,400 --> 00:13:31,680 Speaker 2: if it doesn't come to pass, then they just go, well, 273 00:13:31,880 --> 00:13:34,360 Speaker 2: we're just gonna bid it up on some other thing, 274 00:13:34,480 --> 00:13:37,000 Speaker 2: as you say, Max, coming down the road. Now, Dana, 275 00:13:37,120 --> 00:13:40,760 Speaker 2: tell us a little bit about a story, a fantastic 276 00:13:40,880 --> 00:13:43,280 Speaker 2: story that you and friend of the pod Craig Trudell 277 00:13:43,360 --> 00:13:46,640 Speaker 2: had last week about something that is related, at the 278 00:13:46,760 --> 00:13:51,000 Speaker 2: very least, you know, tangentially related, to this whole robotaxi launch. 279 00:13:51,440 --> 00:13:52,679 Speaker 2: Hit us with the details. 280 00:13:53,000 --> 00:13:56,800 Speaker 3: Yeah, so this was just sort of a horrible, tragic 281 00:13:56,960 --> 00:14:00,880 Speaker 3: tale of how a seventy-one-year-old grandmother in 282 00:14:00,960 --> 00:14:05,200 Speaker 3: Arizona was killed by a Tesla, and it all came 283 00:14:05,280 --> 00:14:08,280 Speaker 3: to light in a very weird sequence of events, which 284 00:14:08,440 --> 00:14:13,960 Speaker 3: was, in October, NHTSA announced that they were investigating Tesla 285 00:14:14,240 --> 00:14:18,360 Speaker 3: FSD because of four separate crashes where the vehicle with 286 00:14:18,520 --> 00:14:22,800 Speaker 3: FSD engaged seemed to have difficulty where there was fog, 287 00:14:23,800 --> 00:14:27,160 Speaker 3: dust or glare. And they listed these four crashes, and 288 00:14:27,200 --> 00:14:31,600 Speaker 3: one of them was listed as fatality, pedestrian, Arizona. And 289 00:14:31,640 --> 00:14:34,480 Speaker 3: I was like, wait, what? Like, a Tesla using FSD 290 00:14:35,080 --> 00:14:37,320 Speaker 3: killed a pedestrian? Like, I didn't hear about that. Like, 291 00:14:37,520 --> 00:14:41,040 Speaker 3: what was that? And we asked Tesla, we asked NHTSA, 292 00:14:41,120 --> 00:14:43,680 Speaker 3: we asked for more details.
Nobody had anything. And so 293 00:14:43,800 --> 00:14:47,560 Speaker 3: then I went to my old friend public records, which, 294 00:14:48,560 --> 00:14:51,120 Speaker 3: you know, I just can't stress enough the importance of 295 00:14:51,240 --> 00:14:53,720 Speaker 3: digging for public records. And I filed a Public Records 296 00:14:53,760 --> 00:14:56,920 Speaker 3: Act request with the Arizona Department of Public Safety, and 297 00:14:56,960 --> 00:14:59,320 Speaker 3: I was like, do you have any details about this 298 00:14:59,480 --> 00:15:02,200 Speaker 3: crash where apparently a pedestrian was killed? And I asked 299 00:15:02,200 --> 00:15:06,800 Speaker 3: for the crash report, photographs, and any camera footage taken 300 00:15:06,840 --> 00:15:09,520 Speaker 3: from vehicles at the scene, and, like, lo and behold, 301 00:15:09,640 --> 00:15:12,400 Speaker 3: several months later, in, like, late March, I got back 302 00:15:12,480 --> 00:15:17,480 Speaker 3: this trove of data, including the video footage of the 303 00:15:17,520 --> 00:15:21,120 Speaker 3: crash taken from the Tesla itself. And I have to say, 304 00:15:21,160 --> 00:15:23,440 Speaker 3: like, I have looked at a lot of crash footage 305 00:15:23,480 --> 00:15:26,000 Speaker 3: and a lot of crash reports in my time as 306 00:15:26,000 --> 00:15:30,120 Speaker 3: an automotive reporter, I've never gotten back footage from a 307 00:15:30,160 --> 00:15:32,960 Speaker 3: Tesla like this. I mean, the state trooper who had 308 00:15:32,960 --> 00:15:35,760 Speaker 3: the wherewithal to download it right at the scene has 309 00:15:35,800 --> 00:15:38,000 Speaker 3: my kudos. And so we just got back this, like, 310 00:15:38,040 --> 00:15:41,840 Speaker 3: really harrowing footage, and it showed that sun glare was 311 00:15:41,880 --> 00:15:45,040 Speaker 3: the issue, that the Tesla that was driving could not 312 00:15:45,160 --> 00:15:47,920 Speaker 3: see in front of it because of the glare on 313 00:15:48,040 --> 00:15:51,680 Speaker 3: the camera. And this was all the more intriguing to 314 00:15:51,720 --> 00:15:54,480 Speaker 3: me because on the last earnings call, Elon was specifically 315 00:15:54,560 --> 00:15:56,960 Speaker 3: asked about sun glare and gave this, like, very long 316 00:15:56,960 --> 00:15:59,960 Speaker 3: winded response about it. So the stars just 317 00:16:00,120 --> 00:16:01,680 Speaker 3: kind of aligned where we had all these elements. 318 00:16:01,680 --> 00:16:04,600 Speaker 2: Oh, but the long-winded response was what? Hey, 319 00:16:04,920 --> 00:16:06,320 Speaker 2: don't sweat it. We got it. 320 00:16:06,640 --> 00:16:06,800 Speaker 1: Yeah. 321 00:16:06,840 --> 00:16:09,320 Speaker 3: It was basically like, we've solved this problem. We had 322 00:16:09,360 --> 00:16:12,800 Speaker 3: this breakthrough recently, we can bypass the image processor, we're 323 00:16:12,800 --> 00:16:13,960 Speaker 3: doing photon counting. 324 00:16:14,240 --> 00:16:16,520 Speaker 2: Well, so this is one of the questions I had 325 00:16:16,560 --> 00:16:19,360 Speaker 2: for you, which is, this accident did happen a couple 326 00:16:19,400 --> 00:16:23,160 Speaker 2: of years ago.
So is there any confidence out there, 327 00:16:23,280 --> 00:16:25,800 Speaker 2: not just among Elon and his inner circle, but among 328 00:16:26,240 --> 00:16:31,400 Speaker 2: broader Tesla followers, that they indeed have made advancements and 329 00:16:31,440 --> 00:16:34,320 Speaker 2: developments that would fix such a problem? 330 00:16:34,800 --> 00:16:36,920 Speaker 3: Yeah, I mean, to be clear, you know, Tesla is 331 00:16:37,120 --> 00:16:40,920 Speaker 3: always updating its hardware stack and its software stack, and 332 00:16:41,200 --> 00:16:46,320 Speaker 3: so what version of hardware and software the Tesla that 333 00:16:46,480 --> 00:16:49,440 Speaker 3: was involved in this fatal crash had in November twenty 334 00:16:49,480 --> 00:16:52,560 Speaker 3: twenty-three versus now, I don't really know, because I 335 00:16:52,600 --> 00:16:55,080 Speaker 3: asked Tesla if they could tell me, like, what was 336 00:16:55,080 --> 00:16:57,000 Speaker 3: the hardware version and what was the software version? And 337 00:16:57,040 --> 00:16:57,840 Speaker 3: they didn't respond. 338 00:16:58,040 --> 00:17:00,840 Speaker 2: Did Tesla respond at all to anything you asked them? 339 00:17:01,400 --> 00:17:03,320 Speaker 3: No. I mean, I sent a long list of 340 00:17:03,400 --> 00:17:07,119 Speaker 3: questions to several executives. But it's reasonable to 341 00:17:07,200 --> 00:17:10,720 Speaker 3: presume that, yes, the current version out there is a 342 00:17:11,160 --> 00:17:15,320 Speaker 3: different version. Maybe they've solved it, maybe they haven't. Doesn't matter. 343 00:17:15,520 --> 00:17:19,040 Speaker 3: Like, this woman is dead, and NHTSA is investigating this 344 00:17:19,160 --> 00:17:24,280 Speaker 3: crash as part of a potential defect investigation. And, you know, 345 00:17:25,160 --> 00:17:27,760 Speaker 3: we only know about this crash because Tesla reported it 346 00:17:27,800 --> 00:17:28,359 Speaker 3: to NHTSA. 347 00:17:29,320 --> 00:17:32,440 Speaker 2: Dana, of course, NHTSA being the National 348 00:17:32,200 --> 00:17:34,920 Speaker 3: Highway Traffic Safety Administration. 349 00:17:35,000 --> 00:17:41,399 Speaker 2: Right, basically the men and women who regulate the transportation sector. Now, Max, 350 00:17:41,760 --> 00:17:44,800 Speaker 2: as Dana is saying here, regardless of what happens with 351 00:17:44,840 --> 00:17:47,479 Speaker 2: the technology going forward, that's not going to bring this 352 00:17:47,520 --> 00:17:51,320 Speaker 2: woman back. Of course, humans, as we've discussed, and 353 00:17:51,440 --> 00:17:54,880 Speaker 2: as we've argued about in the past, human drivers kill 354 00:17:55,080 --> 00:17:57,240 Speaker 2: thousands of people a year on the road, I 355 00:17:57,240 --> 00:18:00,119 Speaker 2: should also say, as the story points out. But I 356 00:18:00,160 --> 00:18:04,720 Speaker 2: guess the question is, is society ready to accept the 357 00:18:04,840 --> 00:18:08,560 Speaker 2: deaths caused by the robots the same way it's willing 358 00:18:08,640 --> 00:18:12,320 Speaker 2: to say, hey, God-awful, horrible that this human driver 359 00:18:12,520 --> 00:18:15,359 Speaker 2: killed someone else, but it's just part of, you know, 360 00:18:15,520 --> 00:18:16,280 Speaker 2: just the way things go. 361 00:18:16,680 --> 00:18:19,480 Speaker 1: Obviously society is not ready to accept that. There are 362 00:18:19,520 --> 00:18:22,359 Speaker 1: two problems here.
One is that Tesla has all of 363 00:18:22,400 --> 00:18:26,679 Speaker 1: these statistics about the number of miles driven by what 364 00:18:26,720 --> 00:18:32,280 Speaker 1: it calls Full Self-Driving, parentheses, Supervised. FSD Supervised. Those 365 00:18:32,640 --> 00:18:37,880 Speaker 1: statistics are interesting, but they are not the same as 366 00:18:37,920 --> 00:18:41,800 Speaker 1: what Tesla is purporting to launch in Austin or anywhere else, 367 00:18:41,920 --> 00:18:44,359 Speaker 1: which is the same system but without a human in 368 00:18:44,359 --> 00:18:46,040 Speaker 2: the loop. The exact same system. 369 00:18:46,400 --> 00:18:48,240 Speaker 5: Yeah, and that's what Elon Musk has been saying. 370 00:18:48,280 --> 00:18:51,520 Speaker 2: So they just drop the supervised and get ready 371 00:18:51,280 --> 00:18:53,360 Speaker 5: for the lack of supervision. 372 00:18:53,000 --> 00:18:55,120 Speaker 3: FSD Unsupervised, basically. 373 00:18:55,920 --> 00:18:59,240 Speaker 2: But in the nitty-gritty of it, all you're doing 374 00:18:59,280 --> 00:19:00,880 Speaker 2: is stripping out one word. 375 00:19:00,840 --> 00:19:02,960 Speaker 1: We heard a couple of, I mean, there are a few 376 00:19:03,000 --> 00:19:06,040 Speaker 1: modifications that they appear to be making in 377 00:19:06,160 --> 00:19:09,879 Speaker 1: Austin, because it's not ready. But my point is, 378 00:19:09,920 --> 00:19:15,120 Speaker 1: we're taking one thing and attempting to make a judgment 379 00:19:15,160 --> 00:19:17,439 Speaker 1: about another thing that is different, and it is different 380 00:19:17,480 --> 00:19:20,800 Speaker 1: in, like, a hugely important way. The second thing 381 00:19:20,920 --> 00:19:24,560 Speaker 1: is, like, these self-driving systems are very limited. They 382 00:19:24,600 --> 00:19:27,359 Speaker 1: are only used on certain roads at certain times of day, 383 00:19:27,760 --> 00:19:34,960 Speaker 1: and they deliberately exclude difficult situations, so, like, snow, sun glare; 384 00:19:35,080 --> 00:19:38,399 Speaker 1: maybe they turn off when they can't handle it. And 385 00:19:38,440 --> 00:19:43,159 Speaker 1: so we actually don't know. Like, human accidents, although they 386 00:19:43,200 --> 00:19:45,560 Speaker 1: add up to a lot, it is not a lot 387 00:19:45,600 --> 00:19:48,880 Speaker 1: in terms of miles driven, and we don't know how 388 00:19:49,359 --> 00:19:52,679 Speaker 1: those limitations, the limitations that are in those self-driving 389 00:19:52,720 --> 00:19:56,640 Speaker 1: systems today, and that people use them in specific ways, right, 390 00:19:56,640 --> 00:19:59,280 Speaker 1: they don't necessarily use them if they're in a really 391 00:19:59,359 --> 00:20:03,679 Speaker 1: chaotic city driving situation, the kinds of situations, in other words, 392 00:20:03,800 --> 00:20:06,760 Speaker 1: that are more likely to lead to accidents. So these 393 00:20:06,760 --> 00:20:10,439 Speaker 1: statistics are interesting, but they don't necessarily prove anything. And 394 00:20:10,480 --> 00:20:14,040 Speaker 1: that's why it's kind of so crazy that Elon Musk 395 00:20:14,119 --> 00:20:16,240 Speaker 1: is attempting to just kind of do this. 396 00:20:16,880 --> 00:20:18,919 Speaker 2: Dana, any final points on this before we move on? 397 00:20:19,760 --> 00:20:21,639 Speaker 3: Yeah, I just think that there's going to be a 398 00:20:21,680 --> 00:20:25,080 Speaker 3: lot of questions about the rollout, the parameters of it.
399 00:20:25,840 --> 00:20:28,560 Speaker 3: You know, what happens if there is an accident in Austin? 400 00:20:28,760 --> 00:20:30,960 Speaker 3: Is the city of Austin going to do anything, or 401 00:20:31,000 --> 00:20:33,600 Speaker 3: the county, or the state of Texas, or NHTSA? And 402 00:20:33,640 --> 00:20:38,800 Speaker 3: then, meanwhile, this defect investigation into how the vehicles perform 403 00:20:39,000 --> 00:20:42,639 Speaker 3: in things like fog and sun glare is ongoing, and I 404 00:20:42,640 --> 00:20:44,720 Speaker 3: think it's just telling that, you know, when we talk 405 00:20:44,760 --> 00:20:47,680 Speaker 3: about Trump and Musk and, like, the levers and the 406 00:20:47,720 --> 00:20:50,440 Speaker 3: kompromat that each have on the other. Like, what is 407 00:20:50,560 --> 00:20:52,560 Speaker 3: NHTSA going to do here? Like, are they just going 408 00:20:52,600 --> 00:20:55,400 Speaker 3: to wrap up this investigation and say Tesla told us 409 00:20:55,440 --> 00:20:58,840 Speaker 3: everything's fixed, and this investigation dies on the vine, or 410 00:20:58,880 --> 00:21:02,440 Speaker 3: does it continue? So that's another thing to keep watching on. 411 00:21:02,440 --> 00:21:04,400 Speaker 1: One other thing about Dana's story, and this is kind 412 00:21:04,400 --> 00:21:06,719 Speaker 1: of important when you're talking about cities like Austin. 413 00:21:07,160 --> 00:21:09,120 Speaker 2: Do read it, by the way. It ran as a 414 00:21:09,480 --> 00:21:12,000 Speaker 2: Big Take, a Bloomberg Big Take, last Wednesday. 415 00:21:12,440 --> 00:21:16,280 Speaker 1: Oh, so one important thing also about the story is, 416 00:21:17,160 --> 00:21:21,879 Speaker 1: AV systems, autonomous vehicle systems, have trouble with non-vehicles, 417 00:21:21,960 --> 00:21:26,280 Speaker 1: so with pedestrians, with cyclists, and with children. And, like, 418 00:21:26,359 --> 00:21:29,000 Speaker 1: this is not the first time an autonomous vehicle has 419 00:21:29,119 --> 00:21:31,639 Speaker 1: killed a pedestrian. That, of course, happened with an 420 00:21:31,720 --> 00:21:34,680 Speaker 1: Uber autonomous vehicle, like, about eight years ago or so. 421 00:21:35,880 --> 00:21:38,959 Speaker 1: And of course cities are driving environments where you have 422 00:21:39,040 --> 00:21:42,800 Speaker 1: lots and lots of pedestrians, where the interaction between pedestrians 423 00:21:42,800 --> 00:21:45,359 Speaker 1: and cars is really important. We've seen issues in San 424 00:21:45,359 --> 00:21:49,840 Speaker 1: Francisco where autonomous vehicles have severely hurt a pedestrian. It 425 00:21:49,880 --> 00:21:53,000 Speaker 1: was a Cruise vehicle that dragged someone under its wheel, 426 00:21:53,040 --> 00:21:56,280 Speaker 1: and as a result, Cruise essentially shut down its entire 427 00:21:56,640 --> 00:22:00,919 Speaker 1: AV operation. We've also seen these kinds of interactions with 428 00:22:01,000 --> 00:22:05,200 Speaker 1: pedestrians that are not necessarily dangerous but are problematic for 429 00:22:05,240 --> 00:22:07,560 Speaker 1: the autonomous vehicles, where pedestrians just sort of get in 430 00:22:07,600 --> 00:22:10,639 Speaker 1: their way, try to mess with them, try to confuse them.
431 00:22:10,960 --> 00:22:14,880 Speaker 1: And so I would expect that if this rollout happens 432 00:22:14,920 --> 00:22:17,560 Speaker 1: on any kind of scale, there are going to be 433 00:22:17,640 --> 00:22:20,240 Speaker 1: things like that, there are going to be issues with 434 00:22:20,560 --> 00:22:23,760 Speaker 1: protesters, or with people just kind of messing with, slash, 435 00:22:23,880 --> 00:22:27,600 Speaker 1: vandalizing these vehicles, and, like, it's just creating, like, a 436 00:22:27,680 --> 00:22:32,959 Speaker 1: lot of different vectors for chaos, or something worse. And 437 00:22:33,000 --> 00:22:35,080 Speaker 1: I think that's one of the reasons why this is 438 00:22:35,119 --> 00:22:37,320 Speaker 1: such a limited launch to start with. This is only 439 00:22:37,359 --> 00:22:38,320 Speaker 1: ten to twenty cars. 440 00:22:38,720 --> 00:22:42,119 Speaker 2: Listen, I think for sure those things are going to happen, 441 00:22:42,200 --> 00:22:45,200 Speaker 2: especially as you roll out more of these things. I mean, 442 00:22:45,200 --> 00:22:48,280 Speaker 2: it's interesting to me. Unlike you, I happen to pretty 443 00:22:48,320 --> 00:22:51,320 Speaker 2: much firmly believe that human drivers are lousy, and that 444 00:22:51,400 --> 00:22:54,199 Speaker 2: if given the opportunity, the robots will eventually do it 445 00:22:54,200 --> 00:22:56,840 Speaker 2: better than us. But you're also starting to convince me that, 446 00:22:56,920 --> 00:23:01,159 Speaker 2: you know what, we the humans may never allow the 447 00:23:01,320 --> 00:23:04,240 Speaker 2: robots to be tested enough to get to that point. 448 00:23:04,320 --> 00:23:06,240 Speaker 1: And that is, that's, like, one of the reasons, when 449 00:23:06,240 --> 00:23:08,639 Speaker 1: you talk to people in the industry who are skeptical, 450 00:23:08,800 --> 00:23:11,480 Speaker 1: that is their skepticism. They're not saying the robots could 451 00:23:11,480 --> 00:23:13,960 Speaker 1: never do this, we could never get there. It's that 452 00:23:14,000 --> 00:23:16,280 Speaker 1: the things we would have to do are things we 453 00:23:16,359 --> 00:23:18,360 Speaker 1: are not going to be willing to do. Like, are 454 00:23:18,400 --> 00:23:21,479 Speaker 1: you going to be willing to accept a technology that 455 00:23:21,600 --> 00:23:24,480 Speaker 1: kills pedestrians at a higher rate than human drivers? 456 00:23:24,920 --> 00:23:27,359 Speaker 5: I mean initially, or maybe in total? 457 00:23:27,440 --> 00:23:31,040 Speaker 1: Like, fewer people die. Well, what if fewer people in 458 00:23:31,119 --> 00:23:33,720 Speaker 1: cars die, but more pedestrians die? Would... you know, that's 459 00:23:33,760 --> 00:23:36,879 Speaker 1: still, of course, not okay. Well, that's going to be 460 00:23:36,880 --> 00:23:37,400 Speaker 1: a problem. 461 00:23:38,680 --> 00:23:41,000 Speaker 2: Well, they'd call the whole thing off, call the whole 462 00:23:41,560 --> 00:23:43,000 Speaker 2: robotaxi launch off. 463 00:23:43,280 --> 00:23:46,119 Speaker 3: Yeah, I guess the question is, like, are you willing... So 464 00:23:46,560 --> 00:23:49,760 Speaker 3: roughly forty thousand people die in traffic accidents every year 465 00:23:49,760 --> 00:23:53,080 Speaker 3: in the United States. Alcohol, drug use, speeding, and 466 00:23:53,119 --> 00:23:56,160 Speaker 3: not wearing seatbelts are all huge factors in these crashes.
467 00:23:56,240 --> 00:23:59,719 Speaker 3: But the premise of autonomous driving is that we can 468 00:23:59,760 --> 00:24:03,680 Speaker 3: get to crash zero, but on the way to crash zero, 469 00:24:03,840 --> 00:24:07,919 Speaker 3: are we happy if robots, like, end up killing people? 470 00:24:08,000 --> 00:24:10,400 Speaker 3: I don't think so. I mean, so that's the calculus 471 00:24:10,400 --> 00:24:13,359 Speaker 3: that everyone's trying to make. And what was so sad 472 00:24:13,400 --> 00:24:15,880 Speaker 3: about this crash is it's like a double-fault accident, 473 00:24:15,960 --> 00:24:18,439 Speaker 3: right? Like, Tesla legally always says that the driver 474 00:24:18,520 --> 00:24:21,560 Speaker 3: has to maintain control at all times. So clearly the 475 00:24:21,680 --> 00:24:24,919 Speaker 3: driver was not paying attention and did not realize that 476 00:24:24,960 --> 00:24:26,840 Speaker 3: the glare was really bad, and perhaps they should have 477 00:24:26,840 --> 00:24:29,639 Speaker 3: slowed down or moved over before coming upon all these cars. 478 00:24:29,640 --> 00:24:33,120 Speaker 3: But then also FSD was apparently engaged, per NHTSA, 479 00:24:33,160 --> 00:24:37,040 Speaker 3: and that didn't see all the warning signs either, including 480 00:24:37,440 --> 00:24:39,959 Speaker 3: cars that were pulled over with hazard lights on. I mean, 481 00:24:39,960 --> 00:24:42,600 Speaker 3: there were a lot of stationary objects that one would 482 00:24:42,600 --> 00:24:44,400 Speaker 3: think an FSD system would have seen. 483 00:24:45,040 --> 00:24:47,640 Speaker 1: I mean, this is the kind of scenario where, I mean, 484 00:24:47,760 --> 00:24:50,520 Speaker 1: and again we don't know for sure, it's very dangerous 485 00:24:50,840 --> 00:24:52,760 Speaker 1: anytime you get out of your car on the highway, 486 00:24:52,760 --> 00:24:53,560 Speaker 1: it's very dangerous. 487 00:24:53,640 --> 00:24:55,520 Speaker 5: Yes, but this does sound like 488 00:24:55,440 --> 00:24:57,840 Speaker 1: a scenario where a human driver might have done a 489 00:24:57,920 --> 00:25:00,840 Speaker 1: better job. And, like, as Dana is saying, 490 00:25:00,880 --> 00:25:04,199 Speaker 1: there's lots, there's a lot of context around this 491 00:25:04,200 --> 00:25:06,560 Speaker 1: that a human driver might pick up. It's also kind 492 00:25:06,600 --> 00:25:09,840 Speaker 1: of upsetting that the pedestrian who got out of the 493 00:25:09,840 --> 00:25:12,679 Speaker 1: car was attempting to help, was attempting to kind of, 494 00:25:12,760 --> 00:25:16,800 Speaker 1: like, you know, direct traffic around this incident, and in 495 00:25:16,840 --> 00:25:20,440 Speaker 1: a way that, like, again, maybe a human driver, an 496 00:25:20,560 --> 00:25:23,000 Speaker 1: attentive human driver anyway, is able 497 00:25:22,800 --> 00:25:24,880 Speaker 5: to avoid, and avoid this tragedy. 498 00:25:25,359 --> 00:25:28,119 Speaker 2: All right, I'm sure next week we will have a 499 00:25:28,160 --> 00:25:32,600 Speaker 2: bit of a recap of just exactly what transpires down 500 00:25:32,600 --> 00:25:41,960 Speaker 2: there in Austin. Hard pivot now, Max, hard, hard pivot, 501 00:25:42,440 --> 00:25:46,720 Speaker 2: to a mini feud that has been thrown off, that 502 00:25:46,840 --> 00:25:49,880 Speaker 2: is a shard of sorts of the feud to end 503 00:25:49,880 --> 00:25:51,200 Speaker 2: all feuds. Tell us about it. 504 00:25:51,600 --> 00:25:53,000 Speaker 5: Yeah, Elon versus Grok.
505 00:25:53,200 --> 00:25:57,640 Speaker 1: Now, this isn't the first time Elon and his favorite 506 00:25:57,760 --> 00:26:02,119 Speaker 1: chatbot have beefed in public. But in the 507 00:26:02,240 --> 00:26:07,520 Speaker 1: kind of, like, back and forth over Elon Musk and 508 00:26:07,560 --> 00:26:14,040 Speaker 1: the Trump administration, Grok cited a tweet from Elon Musk 509 00:26:14,200 --> 00:26:17,120 Speaker 1: that was, that was fake. Grok essentially fell 510 00:26:17,160 --> 00:26:20,800 Speaker 1: for a meme. I'm gonna try to avoid getting too 511 00:26:20,880 --> 00:26:23,600 Speaker 1: far into the meme because I don't want to defame anyone, 512 00:26:23,880 --> 00:26:26,240 Speaker 1: but it's a left-wing meme without a lot of 513 00:26:26,280 --> 00:26:30,960 Speaker 1: basis around Elon Musk and a love triangle within 514 00:26:30,800 --> 00:26:35,040 Speaker 5: the Trump administration. Grok cited this tweet from 515 00:26:34,960 --> 00:26:38,280 Speaker 1: Elon purporting to confirm this and acting as if 516 00:26:38,320 --> 00:26:41,160 Speaker 1: it was real, and Elon wrote, OFFS, which 517 00:26:41,200 --> 00:26:43,840 Speaker 1: means, oh, for f's sake, this is fake. 518 00:26:44,200 --> 00:26:45,320 Speaker 2: With some sort of emoji. 519 00:26:45,720 --> 00:26:46,639 Speaker 5: Yeah, I can't remember that. 520 00:26:47,480 --> 00:26:49,400 Speaker 2: Like the hanging hands, what's the one where they have 521 00:26:49,480 --> 00:26:54,879 Speaker 2: you put that hand, which to me just screams of, Grok, 522 00:26:55,359 --> 00:26:59,399 Speaker 2: I'm so, so disappointed in you. So, like, you, 523 00:26:59,720 --> 00:27:02,000 Speaker 1: Grok is not a person, I just want to say, 524 00:27:02,040 --> 00:27:06,440 Speaker 1: and we should not be anthropomorphizing, uh, Grok or any 525 00:27:06,440 --> 00:27:07,000 Speaker 1: other AI. 526 00:27:07,320 --> 00:27:09,280 Speaker 5: It's just a, it's just like a, it's like a 527 00:27:09,320 --> 00:27:11,679 Speaker 5: spell checker or something. You cannot fall in love with 528 00:27:11,720 --> 00:27:14,040 Speaker 5: your spell checker. You cannot be mad at it, you can't 529 00:27:14,040 --> 00:27:15,040 Speaker 5: be disappointed in it. 530 00:27:15,400 --> 00:27:18,880 Speaker 1: On the other hand, it is kind of funny that 531 00:27:18,880 --> 00:27:23,280 Speaker 1: Elon Musk is simultaneously saying that this AI 532 00:27:23,400 --> 00:27:27,040 Speaker 1: is basically gonna, like, lead to a superintelligence, and 533 00:27:27,119 --> 00:27:29,200 Speaker 1: at the same time maybe getting into these. 534 00:27:29,119 --> 00:27:32,440 Speaker 2: Baby steps, Max, baby steps. By the way, 535 00:27:32,480 --> 00:27:35,760 Speaker 2: something came to me. Uh, if there is truly a 536 00:27:35,880 --> 00:27:40,159 Speaker 2: détente between Musk and Trump in coming days, there's no 537 00:27:40,280 --> 00:27:42,880 Speaker 2: better place to end it, or, or, or, I should 538 00:27:42,960 --> 00:27:45,480 Speaker 2: say, to ring in that détente, than at the military parade. 539 00:27:45,520 --> 00:27:47,560 Speaker 2: And as it happens, I think you have Musk at some 540 00:27:47,640 --> 00:27:50,399 Speaker 2: point in one of the tanks, and then he just 541 00:27:50,440 --> 00:27:52,880 Speaker 2: suddenly pops up from the turret, right? I mean. 542 00:27:54,960 --> 00:27:55,640 Speaker 3: The Cybertruck. 543 00:27:55,720 --> 00:28:00,040 Speaker 5: I mean, the Cybertruck could be there.
It's like the 544 00:28:00,040 --> 00:28:02,000 Speaker 5: Cybertruck of the Russian warlord. 545 00:28:02,200 --> 00:28:06,400 Speaker 2: Damn. Yes, exactly. Uh, Cybertruck of the Russian warlord, that's good. We should send 546 00:28:06,400 --> 00:28:07,000 Speaker 2: that over to them. 547 00:28:07,040 --> 00:28:10,119 Speaker 1: But look, this is a lesson that you can't believe 548 00:28:10,320 --> 00:28:13,879 Speaker 1: every screenshot you see on the internet, and you definitely 549 00:28:13,920 --> 00:28:19,800 Speaker 1: shouldn't believe every screenshot cited by an AI chatbot. And 550 00:28:19,880 --> 00:28:22,159 Speaker 1: Elon Musk knows that well, which is, you 551 00:28:22,160 --> 00:28:23,359 Speaker 1: know, why he's disappointed. 552 00:28:23,920 --> 00:28:28,879 Speaker 2: So disappointed, very, very disappointed. Max, Dana, thanks as always. 553 00:28:28,640 --> 00:28:30,480 Speaker 5: Thanks, David. Always a pleasure. 554 00:28:39,040 --> 00:28:41,840 Speaker 2: This episode was produced by Stacey Wong and edited by 555 00:28:41,840 --> 00:28:46,000 Speaker 2: Anna Mazarakis. Blake Maples handles engineering, and Dave Purcell fact checks. 556 00:28:46,240 --> 00:28:49,840 Speaker 2: Our supervising producer is Magnus Henriksson. The Elon, Inc. theme 557 00:28:49,960 --> 00:28:52,680 Speaker 2: is written and performed by Taka Yasuzawa and Alex 558 00:28:52,760 --> 00:28:55,960 Speaker 2: Sugiura. Brendan Francis Newnham is our executive producer, and 559 00:28:56,040 --> 00:28:59,800 Speaker 2: Sage Bauman is the head of Bloomberg Podcasts. A big thanks, 560 00:29:00,080 --> 00:29:03,120 Speaker 2: as always, to our supporters, Joel Weber and Brad Stone. 561 00:29:03,520 --> 00:29:06,840 Speaker 2: I'm David Papadopoulos. If you have a minute, rate and 562 00:29:06,880 --> 00:29:09,960 Speaker 2: review our show, it'll help other listeners find us. See 563 00:29:10,000 --> 00:29:10,560 Speaker 2: you next week.