Speaker 1: Well, Elon Musk is now the richest person on the planet.
Speaker 2: More than half the satellites in space are owned and controlled by one man.
Speaker 1: Starting his own artificial intelligence company.
Speaker 2: Well, he's a legitimate super genius. I mean legitimate.
Speaker 2: He says he's always voted for Democrats, but this year it will be different.
Speaker 3: He'll vote Republican.
Speaker 2: There is a reason the US government is so reliant on him.
Speaker 1: Elon Musk is a scam artist and he's done nothing.
Speaker 2: Anything he does is fascinating to people.
Speaker 2: Welcome to Elon, Inc., where we discuss Elon Musk's vast corporate empire, his latest gambits and antics, and how to make sense of it all. I'm your host, David Papadopoulos. We were surprised this week to learn from our colleague Shirin Ghaffary that Grok, the AI chatbot from Elon Musk's very new AI company, is actually giving programs like ChatGPT a serious run for their money. We were less surprised when we saw the EU is investigating X for illicit content and disinformation on the platform. And we did raise our eyebrows when a recall alert for Tesla's Autopilot software went out last week. Now, this wasn't the kind of recall we're all used to. The company called for users to update their cars' software remotely. It felt like a peek into the future of cars. To discuss all of this and more, we've beamed in Shirin, who reports on AI for us in San Francisco. Hello there. And Max Chafkin, senior reporter and editor at Bloomberg Businessweek.
Speaker 4: Hello, David. Hello, Max.
Speaker 2: Later on, Dana Hull, our longtime Tesla reporter, will join us as well. Okay, so now, Max, we know, okay, that recently you had a little cameo on Last Week Tonight with John Oliver. They found some old tape that they put on.
Speaker 4: So in that way, he's kind of the Henry Ford of space.
Speaker 2: But you know, Max, we also pulled some old tape of yours, from our very first podcast, our debut episode, in which you said this about Grok.
Speaker 1: Until people start actually using this, I think it's worth, like, thinking about this as vaporware, or not even taking it seriously as a real product.
Speaker 2: All right. Well, stand by... okay, so, Max, we'll give you your time. Just pipe down, you'll get your chance to respond. But Shirin, you've tried Grok out for a few days recently. What did you find?
Speaker 3: It exceeded my expectations. I, like Max, given, first of all, how much of a leg up other companies had over Elon with this one, right? OpenAI's ChatGPT came out over a year ago. Google's been working on AI forever. I didn't think that in a matter of a couple months a small team could spin up something that would be significantly better than any of the other AI chatbots out there. However, I did find that, especially with one important use case for me, which is summarizing real-time events, Grok performed really well.
Speaker 2: And when you say really well, tell us what that looked like, what really well looked like. And is really well better than the competition?
Speaker 3: Yes. I should say really well compared to other chatbots that are out there right now. I think none of these tools are anywhere near perfect. They're still wrong a good percentage of the time. I will caveat that my test was not a scientific one. You know, I used it for three days and asked a bunch of news questions, basically. And you know how it did better? A lot of times, and I've written about this in the past, the mainstream chatbots out there, the leading, most cutting-edge ones, can be really smart if you ask them to solve, you know, a math problem, or to write you a poem in the style of Shakespeare.
Speaker 3: But when you ask them about current events, they can just totally get stumped and flub the answer, because even if they're being updated with real-time information, it takes training to be able to tell the chatbot, hey, this is right, this is wrong, and you can't constantly be training something in real time. At least they haven't figured that out quite yet. And so where Grok excelled was that when I asked it about the news, it actually was able to give me, in my experience, a better answer, more or less, than the other ones out there, like ChatGPT or Google's Bard.
Speaker 2: And so, you're talking about, right, you asked it things about the Israel-Hamas war, you asked it things about the COP28 climate summit, and it did, right, better often than ChatGPT and Bard. So, Max Chafkin, what say you?
Speaker 1: I agree with everything Shirin's saying. I mean, it's a really...
Speaker 2: You retract your statements from week one?
Speaker 4: No, because I do.
Speaker 1: First of all, it's not clear how quickly they put this thing together. Like, it might have been true, you know, four weeks ago when I said it, and maybe they actually moved really quickly. I also think, you know, it's sort of hard to know how to think about this, because on one hand, you know, it's like, wow, what an impressive showing by Elon Musk. He's still got it, right? Like, with a small team, despite all of the crazy things swirling around him, he's able to recruit competent engineers who are able to then build a product. I mean, the other way of looking at it is, I guess, maybe ChatGPT wasn't quite as good as we thought. Maybe, like, that thirteen billion dollars that OpenAI spent, or, you know, this enormous amount of money, many years spent.
Speaker 1: The fact that one guy could replicate it in a matter of, I don't know, we don't know exactly how long they built this, but probably a few months. What does that say about the state of the art? Maybe these chatbots just aren't as great as we thought they were. Or maybe a better way to say it would be, maybe these chatbots aren't as, like, sort of defensible as we thought they were. Maybe, like, there's not that much unique in anything.
Speaker 2: There's not much of, in Buffett terms, a moat around them, as one would have thought. I mean, now, Shirin, you're saying you peppered it with news questions, to help you be up to date on what's going on in the OpenAI space, in the artificial intelligence space. How important is that functionality? Is that something that people would actually legitimately use it a lot for?
Speaker 3: Look, I think, you know, you don't always need to know what's happening right away, like right now. For example, if you're just interested in using these bots to code, to help you code, it doesn't really matter. Code doesn't, you know, change minute by minute. But if you care... and think about when you use Google every day, right? How many times does it matter if information is fresh when you Google? A lot of times it matters, like if you want to know when the Super Bowl is, or if you want to know, like, when to go to the movies. So I asked it, you know, about the Israel-Hamas war, and I did that because in the past, when I had asked Google's chatbot and ChatGPT, it was flat-out wrong. It was telling me, you know, in the heat of the war, it was telling me there had been a ceasefire, which was just not true. And when I asked Grok, it gave a pretty nuanced and correct answer.
Speaker 2: What exactly was that answer?
Speaker 3: It was pretty concise. It was about a paragraph.
Speaker 3: It said, you know, the war is continuing, there's no end in sight. It listed, you know, an estimate of the casualties and gave sources for those casualties. And, you know, the reason I think that Grok may be better with this than some of its competitors is because, first of all, it's willing to engage on some of this stuff. So when you asked Bard about the Israel-Hamas war, for example, after Bard, a couple of months ago, embarrassingly...
Speaker 2: Declined to comment.
Speaker 3: Basically, Bard declined to comment, right. And this is, mind you, after my reporting that it flunked on this question a few months ago, so now it seems like it's shut down the answer entirely on what's happening with the Israel-Hamas war. ChatGPT did give a directionally correct answer, actually performed better than I thought it would. But Grok's answer was more concise, and it listed references better, I thought. And I think a big reason for that is because it's pulling from tweets, or I should call them X posts. And, you know, what was really surprising to me was that I thought, okay, even if it's pulling from people's X posts, a lot of them could be junky. I mean, just looking at how my own For You feed has degraded in the past couple of months, or a year, I should say. However, when I asked Grok, it seemed like it was filtering, where the vast majority of the sources it was pulling from in these tweets, or X posts, were, you know, mainstream news sources, like Bloomberg, like the Financial Times, Sky News.
Speaker 2: Wait, so they're citing Bloomberg? Oh, we love Grok, then. Grok's great.
Speaker 3: Now, of course they're going to be right if they're citing Bloomberg.
Speaker 2: Now, Max, in that opening episode, when you were expressing those concerns, and, you know, the fact that you were a bit dubious about it, that was specifically one of the things you brought up: that, hey, if you're pulling from X, which is what Grok is going to largely be doing, and indeed, as Shirin was saying, the content on X is taking a turn for the worse, how reliable can it be? So I guess maybe they've managed to come up with the formula to make this work.
Speaker 1: I mean, to me, what's most... from the perspective of, sort of, if you were an investor in one of Elon Musk's companies, or you were his friend or something, what would be most encouraging about this is that you have the rhetoric, right? You have Elon Musk talking twenty-four seven about the woke mind virus, and then you look at what Grok is actually doing. I was reading Shirin's story, and I saw Financial Times, Bloomberg, Wall Street Journal. Like, where's Catturd? And you realize, like, this thing has the woke mind virus, at least as Elon Musk has articulated it. And so clearly, he's able to... you know, this would make you think, okay, maybe what's going on in his own personal, you know, Twitter feed, X feed, is a performance, and when it really matters, when it gets down to business, he's making better decisions, or people underneath him are able to say, like, look, let's not, like, exclusively compose these answers out of, like, the right-wing influencers that are, you know, friends with Elon.
Speaker 2: Let's use usable sources. So, Grok is so far better than expected. It's doing well. It's, surprisingly, passed some early tests. Shirin, does it matter, though? Like, in terms of making this in some way scalable, monetizable, in a way that matters for xAI and for X and for Musk? Does this have any actual import?
Speaker 3: Look, if X really builds a reputation for being the chatbot that will give you the most accurate information about the world as it's happening, I can definitely see that being monetizable, or being a competitive asset. But it's quite, you know, a paradox, because at the same time, as you're pointing out, Musk is under fire for quite the opposite, right? For, you know, how unreliable some people are finding X to be. So it's kind of a hard sell, too. Even if Grok is exceeding expectations in this one area, it's the exact area of weakness for him right now in terms of X's perception.
Speaker 2: Now, Max, indeed, you are in general dubious about AI, not just as it applies to Grok, but across the board. What's your take on this?
Speaker 1: I mean, I think, look, in certain ways it makes a lot of sense, just because Elon Musk needs to create some reason for people to use this site and to subscribe to X Premium, and Grok would be a way to do that.
Speaker 2: So who does indeed have access to Grok? Who can use Grok right now?
Speaker 1: I believe it's... I hope I'm getting the branding right. I think it's X Premium Plus users. This is the higher tier, up from the old Twitter Blue.
Speaker 2: And apparently that group includes Shirin, from what we can tell.
Speaker 1: Shirin, is it like twenty bucks a month, or how much? How much are you shelling out?
Speaker 4: That's right, I'm a VIP user.
Speaker 3: I believe it's around sixteen dollars a month, and it is the highest package you can get on the platform. And I got it specifically for this reason, mind you, not for my own vanity, for the check mark. I got it just because I wanted to test it out.
Speaker 1: Likely story. I mean, there are some really interesting issues around corporate governance, because it seems like Grok is a separate company, and yet, clearly, it's very closely tied in with X.
Speaker 1: It seems to be part of Elon Musk's plan to, like, you know, save the value of his investment. So I think that's one thing. The other thing is, you know, for a long time, even going back to the old Twitter, everyone knew there was a lot of value in the data contained in tweets. And Twitter had a really reliable revenue stream, I think to some extent still does, selling that data, called the fire hose, to companies that would essentially pay for it and analyze it and so on. And so, you know, in certain ways, Grok is, like, the logical next step. On the other hand, in other ways, the kind of Elon Musk media-troll way he has, he's undermined Twitter, X, as a sort of reliable record of sentiment. Whereas now it's more like a reliable record of sentiment in certain domains, like sports and so on. I bet you Grok will be great on, you know, the question of, like, who to start in your fantasy league or whatever, and less reliable on some of these more controversial issues, where we've really seen an impact of Elon Musk's, you know, leadership, and sort of the right-wingification of the site.
Speaker 2: Max, and in that vein, then, the EU investigation of X, which was announced, I believe, earlier this week, for disinformation. They're investigating disinformation and illicit content. Does the fact that Grok is a more reliable source in any way, shape, or form help them fend off that EU investigation? Or are these totally unrelated things?
Speaker 1: You know, I'm not an expert on European technology regulation. I'm not sure that anybody is. But I don't think it does. I mean, this is very much a niche product. If what you're worried about is misinformation, hate speech... by the way, in parts of Europe, of course, in Germany, certain kinds of hate speech that are, you know, allowed in the United States are not allowed there.
Speaker 1: So if you're worried about, like, bad content broadcasting widely, this doesn't do anything to address that, right? You're talking about a very narrow segment that has access to this thing, using it in a narrow way. I don't think Shirin is the kind of person who is, like, you know, liable to, like, retweet, you know, the latest Catturd or whatever. And that's the kind of thing that European regulators are worried about. Although I should correct myself and say I don't think they're specifically concerned about Catturd. I'm using that as a stand-in for the overall issue of disinformation. When I was in college, we had a cat called Claws, and so it's kind of like the same thing.
Speaker 4: It's like a small part.
Speaker 2: Used to describe the whole cat. Understood. By the way, and I don't know if anybody saw this, or if anybody can make heads or tails of it: so there's another Grok out there. Is this right? Is anybody...
Speaker 4: Yes.
Speaker 3: This is so confusing.
Speaker 2: This may help us, Shirin.
Speaker 4: Okay.
Speaker 3: So, Elon Musk's former partner and, you know, mother to his children, she came out with a rocket-ship AI plushie from Curio named Grok, same name. And this came out, this was announced, you know, I believe, let's see, just... was it last week? She announced it on Instagram. It's this really cute-looking, you know... I saw it. It's basically a stuffed animal that's AI-powered.
Speaker 2: What kind of odd... so, a Grok custody battle. Yeah, I don't know exactly. So, Grok. So they're fighting over the kids and now they're fighting over Grok. And, yeah, it's odd. We went from having zero Groks in our lives to now multiple, in the span of weeks. So do we think that this was deliberate?
Speaker 4: Like, is this... I mean, it would have to be, right?
Speaker 1: There's no way that they could have just independently arrived at the same...
Speaker 3: So, I saw a TechCrunch headline today that she had the trademark before he did, or something. I don't know if that means she came up with it first.
Speaker 2: But we're going to say goodbye to Shirin for now. That was great, Shirin. We're going to have you back later in the show, and we're going to have you back on once you, you know, test other elements of Grok. We'll bring you back. Okay, so, welcome back. Max and I are still here, and we're bringing in Dana Hull, our longtime Tesla reporter, to help us talk about the recall. Hello, Dana.
Speaker 5: Hello, hello.
Speaker 2: Okay. So, Tesla indeed recalled a few cars the other day, and, you know, it's attracted a lot of attention, because even my mother, who doesn't happen to own a Tesla, asked me what was going on. So, Dana, help her and everyone else out. What exactly did Tesla do, and how big a deal is this?
Speaker 5: So, the first thing to know is that the word recall is a little bit of an anachronism when it comes to the auto industry these days. You hear recall and you think, oh my god, like, all of these cars, like, two million cars, are going to be yanked off the roads. It's like, you know, like, a baby's toy is recalled because there's a choking hazard. Like, the idea of a recall makes you think that everything is off the market. But, you know, now a lot of automakers, and Tesla in particular, can improve the product via over-the-air software updates. So, yes, this was a recall, but the remedy is that, you know, Tesla, free to consumers, is going to beam an over-the-air software update to everyone, and the consumer does not need to do anything. You don't need to bring it into a shop. You can still drive it.
Speaker 5: This is Tesla basically admitting, voluntarily, with NHTSA's guidance, that Autopilot is a problem, and that drivers tend to get lulled into kind of, like, a sense of complacency, and that in certain conditions drivers may not pay attention. So the remedy is that Tesla's going to nag you a little bit more to keep your hands on the wheel and your eyes on the road. And I think the significance here is that, you know, for years, like, NHTSA has been working with Tesla to kind of improve the company's safety culture. They investigated all these accidents where Tesla drivers on Autopilot, or with Autopilot engaged, seemed to keep smashing into stationary fire trucks, and it was like, hmm, what's going on with that? And then, you know, this kind of recall that was announced last week was basically the result of those two years of negotiations back and forth between the regulatory agency and the company.
Speaker 1: Dana's right that, like, you know, you could read this in a very literal way and think, oh my god, they're going to take all these cars, and you're going to have to give them back, and they're going to be, whatever, rebuilt.
Speaker 2: A literal sense, in the old sense of the word.
Speaker 1: Well, maybe. But, I mean, a lot of recalls work like this. Like, if your Ikea bookshelf was recalled, they don't take the bookshelf back. They just send you, like, the new pins to, like, hold the shelves in better. But it is bad. And it's bad especially in relation to what Elon Musk has been saying for years and years and years. We are so far away from the swaggering, you know, we're going to have robotaxis immediately, you barely have to look at this thing. You know, before going on, like, I went back, and, you know, there was that video that they ran, the Paint It Black video. I think... Dana, is that the right song? From twenty sixteen?
Speaker 5: Yeah.
Speaker 1: Yeah, it's just, like, this awesome video where the car is, like, driving around, and there's a little thing on the bottom of the screen that says the driver is not doing anything, he's just there for legal reasons.
Speaker 2: Right, and that's seven years ago. Yeah, exactly.
Speaker 1: And then you had, like, you know, I think shortly after that, an adult film actress, you know, tweeted a video, you know, having sex in a Tesla, and Elon Musk was like, ha ha ha, isn't it great, another use. And, like, this was all kind of an exaggeration. Like, this thing doesn't work the way Elon Musk has said it works for years and years and years and years. And not only that, it doesn't work the way many, many Tesla investors have sort of been assuming it would work. So in that sense, like, I think this is damaging.
Speaker 2: But, Max, to that point, right, because you're saying Tesla investors have been assuming that the technology works, or is going to work like that soon. Ergo, that is part of the very, very lofty valuation that Tesla has. It's worth more than all the other automakers combined, and by a lot. But the stock price didn't move at all off this.
Speaker 5: Well, the thing is that even though Tesla markets it as Autopilot, the truth is that it is a Level 2 system, which means that the driver is always in control. And there's, like, sort of daylight between how it's marketed and how it's actually used. Legally, like, if you get a Tesla and you read all the documents, and even look at their website, they make it very clear that the driver is supposed to be in control at all times.
Speaker 2: But then, presumably, NHTSA's concern, though, is that no, you know, reasonable or typical user is actually going to read all the fine print. Is that right? And so they are going to perhaps think that, like, oh, whatever, Autopilot's got it. Is that right?
Speaker 5: I mean, there may be an increased risk of a collision if Autosteer is engaged and you're not fully paying attention.
Speaker 2: And the new warning that you see, where does that pop up, by the way? That pops up on your screen as you sit down in the car?
Speaker 5: So the remedy is that there will be increased visual alerts when you're driving, and, like, if you ignore all the alerts completely, like, the car could basically grind to a halt and pull you over. Tesla already kind of, like, nags drivers to pay attention. Now the nagging is more pronounced and more severe. But, I mean, what's interesting is that, you know, Tesla initiated this recall. This wasn't NHTSA coming in and, like, forcing a recall. Tesla basically begrudgingly acknowledged it and said they would do it. They said that they didn't agree with NHTSA's findings, but they were going along with it.
Speaker 1: I mean, one reason for that is that there are worse things that NHTSA could do.
Speaker 2: And so, like what?
Speaker 1: Well, I mean, they could force them to change the name. I mean, they could, like, you know, more heavily regulate, or try to stop, the way Tesla has talked about these products, both Autopilot and its, you know, quote unquote Full Self-Driving product. And I think you could make an argument that both of those names are a little bit misleading. And, you know, I think there's an assumption, right, that this is going to sort of settle the matter for NHTSA, although NHTSA, you know, released a statement saying that they have not closed this investigation.
Speaker 2: Yeah.
Speaker 5: And it's interesting, too, because there's a lot of litigation coming up on behalf of, you know, drivers and their families who were injured or killed in accidents where Autopilot was or was not engaged.
Speaker 5: And you have all these plaintiffs' attorneys who are very keen to kind of look at this recall as part of their strategy going forward. There are some big trials that are coming up, or slated for twenty twenty-four, involving some of the more high-profile Autopilot cases.
Speaker 2: What do those lawsuits allege?
Speaker 5: Well, they basically allege that the product is defective, and that Tesla knowingly marketed this defective product that, like, lulled these drivers into this false sense of complacency. But it's tricky, because, you know, from Tesla's perspective, like, all of the fine print says that you've got to pay attention, and if you didn't, then it's on you. And just, you know, when you're in a car and there's something like Autopilot engaged, like, you know, as a driver, it can be confusing, like, in terms of how much is it actually capable of? When do you take control? It's, like, that murky middle. It's not a full robotaxi, but it's helping you, like, keep lanes, and, you know, like, keep speed. And, you know, it's a driver-assistance product. But, to Max's point, a big part of the valuation of Tesla is the promise of robotaxis, and Musk raised billions of dollars from Wall Street with that idea, and, like, a full robotaxi is always kind of right around the corner. And that's been going on, like, year after year now.
Speaker 1: Yeah, I was gonna say, that's a regulatory problem. Like, they are so far away from getting the federal government to say, okay, you can use your Tesla as a robotaxi. I mean, we've seen, you know, Cruise essentially, like, lay off a huge segment of the company, basically slam the brakes on its robotaxi business. It really just seems like, you know, good for Elon to solve this problem, but again, you are so far away from what that ultimate goal is, which is, you know, the driver's not looking at all.
Speaker 2: Yeah. And, I mean, to go back to twenty sixteen, which was your reference to that ad that you remembered from back then. I remember being involved in a story at the time. It absolutely wasn't just a Tesla thing, but across the board, there was this sense out there in the industry that, like, you know, we're days, we're hours away from everybody just, you know, sitting in the back seat and sleeping while their car drives. Yeah, that was a myth.
Speaker 1: By the way, that was caused, and you know, I've done some reporting on this, that was caused in part by Elon. Because Elon went out there and said that this is, like, definitely gonna happen. Then you had other CEOs promising to buy certain numbers of Teslas. I believe it was the CEO of a big ride-sharing company promising to buy, you know, a big, big quantity of Teslas. And that caused everyone to make models, right, showing enormous sales. For, you know... like, it caused this whole, like, Wall Street apparatus to, like, kick into gear and develop all these justifications.
Speaker 2: The Wall Street military-industrial complex.
Speaker 1: And a lot of that has kind of gone away. Like, people have sort of woken up to the fact that, like, this stuff isn't happening as quickly as expected. With one important exception, which is Tesla's stock price. So it's probably worth saying that, you know, there's this other case going on, which is the case of the Nikola founder, who has been found guilty of defrauding investors around, you know, basically the capabilities of what was essentially a vaporware truck. Now, a much, much more serious case.
Speaker 5: I mean, and look, investors who have Autopilot... I mean, a lot of them love it. They see it getting better, they experience it getting better. They feel like, okay, Elon might be wrong on his timelines, but if anyone's going to figure out this problem, it will be him.
Speaker 5: And Tesla has a lot of data. I mean, that's the thing. The other thing is, like, you know, Waymo and all these other companies that kind of tried to do full robotaxis are, you know, training small numbers of vehicles in these kind of geofenced locations, whereas Tesla is learning from all the Autopilot data that it has just unleashed on the universe. And, you know, they've got a really big fleet now, so their model is based on a lot more real-world data.
Speaker 2: All right. So the last question on this topic that I have for both of you, and this kind of brings us back to the beginning of the conversation on it, is: is recall, then, in the end, the right word? And is it the word we're going to use going forward on something like this? Dana?
Speaker 5: You could say a software recall, because ultimately the remedy is to update the software. You could say a software recall.
Speaker 4: It's a recall. It's... I mean, this whole conversation about, is it a recall?
Speaker 1: Yeah, there are lots of recalls that are satisfied without taking the product in. But, like, if you're having to, like, make a change to your product, that is because it's unsafe.
Speaker 4: And that's what NHTSA is saying.
Speaker 2: So, Shirin, welcome back. Now, you need to know that one thing you'll hear, that is a source of never-ending, raging debate on the show, is the looming death match, cage match, between Elon Musk and his rival Mark Zuckerberg. Max has very strong opinions, I have very strong opinions. Dana and Sarah Frier, who's not with us, think we're crazy and should move on and do other things with our lives. Shirin, who do you think wins this match, when it happens?
Speaker 3: It's tough, because, I mean, look, Elon's a bigger guy, right? He's six one.
Speaker 2: Zuck's five seven, I've been saying. But let her answer. So, who wins?
Speaker 3: But I think Mark Zuckerberg. And, look, Mark Zuckerberg also recently had an injury, so...
Speaker 2: No flip-flopping, no hemming and hawing. Who's gonna win?
Speaker 4: Okay, okay. On a good day...
Speaker 3: I think Mark, if he's recovered from his injury. I think it's Zuck, because I think he's trained for this, he's conditioned.
Speaker 1: He posted on Instagram that he's doing rehab. He's going to be better than ever.
Speaker 2: So, Zuck it is.
Speaker 4: Okay, good. Something to keep in mind.
Speaker 2: But we decided to see just how brilliant Grok is, right? We said, why don't we hit Grok with the question? So, Shirin, you did. You asked Grok who will win the fight, and it told you what?
Speaker 3: It, also like myself, gave a very kind of reasoned response. And it said, you know, listen, Zuck is smaller. It gave his height, his weight. It said, as for their fighting styles, Zuckerberg's jiu-jitsu training could give him an advantage in grappling and ground fighting, while Musk's height and reach advantage could be useful in striking at a distance.
Speaker 2: Very clinical answer.
Speaker 3: Yeah. And it said, ultimately, the outcome of the fight would depend on various factors, such as training, preparation, and strategy. So let's wait and see if this hypothetical match ever becomes reality.
Speaker 2: So, a non-answer. That's a lame answer. And it says Musk is known for his interest in physical fitness and martial arts. What it doesn't say is that Musk is a lifelong brawler, that he's been fighting since he was, you know, since he got out of the crib, and that's why he's going to win. Indeed, Shirin, it's kind of a lame answer. So, it might have done great on the news questions; the cage match answer, I give it a zero.
Speaker 3: Yeah, here's the thing. They advertise it as a chatbot with a rebellious streak, right, and ready to answer spicy questions. And, you know, you would think on something like this it would weigh in.
By the way, I 619 00:29:31,880 --> 00:29:33,800 Speaker 3: fact checked Elon Musk's height just now. 620 00:29:34,000 --> 00:29:34,400 Speaker 4: Correction. 621 00:29:34,800 --> 00:29:37,400 Speaker 3: According to the Internet, he is six ' two, not 622 00:29:37,440 --> 00:29:40,360 Speaker 3: six ' one, so Grock maybe even underestimated his height. 623 00:29:42,280 --> 00:29:45,240 Speaker 2: Okay, enough with the cage match talk. Let's end it there. 624 00:29:45,760 --> 00:29:48,120 Speaker 2: Thanks for listening. To Elon Ing and thanks to our 625 00:29:48,160 --> 00:29:51,400 Speaker 2: panel Scharene Dana Mac. 626 00:29:52,160 --> 00:29:55,520 Speaker 5: Great to be here, always a pleasure, Thanks for having me. 627 00:30:02,160 --> 00:30:05,560 Speaker 2: This episode was produced by Stacy Wong. Naomi Shavin and 628 00:30:05,600 --> 00:30:09,760 Speaker 2: Rayhan Harmansi are senior editors. The idea for this very 629 00:30:09,880 --> 00:30:14,320 Speaker 2: show also came from Rayhan. Blake Maples handles engineering, and 630 00:30:14,320 --> 00:30:18,720 Speaker 2: we get special editing assistants from Jeff Grocott. Our supervising 631 00:30:18,760 --> 00:30:22,640 Speaker 2: producer is Magnus Henrickson, thanks a bunch of BusinessWeek editor 632 00:30:22,720 --> 00:30:26,600 Speaker 2: Joel Weber. The Elon Inc. Theme is written and performed 633 00:30:26,600 --> 00:30:30,920 Speaker 2: by Taka Yasuzawa and Alex Suira. Sage Bauman is the 634 00:30:30,920 --> 00:30:34,160 Speaker 2: head of Bloomberg Podcast and our executive producer. I am 635 00:30:34,240 --> 00:30:37,840 Speaker 2: David Papadopolis. If you have a minute, rate and review 636 00:30:37,880 --> 00:30:41,200 Speaker 2: our show, it'll help other listeners find us. See you 637 00:30:41,240 --> 00:30:41,680 Speaker 2: next week.