1 00:00:08,560 --> 00:00:13,560 Speaker 1: Welcome back to the Bob Lefsetz Podcast. My guest 2 00:00:13,600 --> 00:00:17,240 Speaker 1: today is Ryan Mac, co-author of the book Character Limit: 3 00:00:17,600 --> 00:00:22,960 Speaker 1: How Elon Musk Destroyed Twitter. Ryan, what does the average 4 00:00:23,000 --> 00:00:24,840 Speaker 1: person not know about Elon Musk? 5 00:00:26,320 --> 00:00:30,120 Speaker 2: Oh man, where to start? Gosh, so much, especially in 6 00:00:30,120 --> 00:00:33,720 Speaker 2: the last couple weeks, days, that he's been in the news. 7 00:00:33,760 --> 00:00:37,160 Speaker 2: But I don't know. One of the things that I've 8 00:00:37,960 --> 00:00:42,200 Speaker 2: found very interesting is explaining to people how much power 9 00:00:42,240 --> 00:00:47,520 Speaker 2: he has now, especially with this position he has close 10 00:00:47,560 --> 00:00:50,040 Speaker 2: to the White House. But even before then, you know, 11 00:00:50,120 --> 00:00:53,199 Speaker 2: he's, you know, one of the most successful entrepreneurs of 12 00:00:53,240 --> 00:00:57,800 Speaker 2: his time. He controls these major companies in Tesla and SpaceX, 13 00:00:58,720 --> 00:01:02,600 Speaker 2: and in some ways he is the second most important 14 00:01:02,600 --> 00:01:04,560 Speaker 2: person in the White House next to Donald Trump now. 15 00:01:04,640 --> 00:01:08,520 Speaker 2: So yeah, we can get into all that, but there's 16 00:01:08,560 --> 00:01:09,320 Speaker 2: a lot to unpack. 17 00:01:10,040 --> 00:01:12,279 Speaker 1: Well, why don't we start there, because that's what's happening 18 00:01:12,400 --> 00:01:17,720 Speaker 1: right this second. He is working with the Department of 19 00:01:18,280 --> 00:01:22,520 Speaker 1: Government Efficiency. You know, a lot of people say, oh, 20 00:01:22,560 --> 00:01:25,320 Speaker 1: the guy cut all these people at Twitter, it's gonna 21 00:01:25,360 --> 00:01:29,280 Speaker 1: be great.
What might we anticipate in his work there? 22 00:01:31,400 --> 00:01:34,560 Speaker 2: It's actually kind of crazy reporting out what happened at 23 00:01:34,560 --> 00:01:39,040 Speaker 2: Twitter and seeing almost the exact same playbook being talked 24 00:01:39,080 --> 00:01:43,440 Speaker 2: about with this Department of Government Efficiency. I think first 25 00:01:43,480 --> 00:01:46,240 Speaker 2: of all, what should be made clear is that we 26 00:01:46,280 --> 00:01:50,120 Speaker 2: really don't know what this DOGE is. Obviously it's a 27 00:01:50,240 --> 00:01:54,960 Speaker 2: reference to a meme cryptocurrency, and it's the abbreviation of 28 00:01:55,160 --> 00:01:58,560 Speaker 2: Department of Government Efficiency. But we don't know what this 29 00:01:58,600 --> 00:02:01,600 Speaker 2: is gonna do. They've talked about, you know, cutting 30 00:02:01,680 --> 00:02:05,840 Speaker 2: two trillion dollars, you know, Elon and Vivek Ramaswamy, his 31 00:02:06,280 --> 00:02:12,399 Speaker 2: co-head of this department, making decisions over hiring and firing. 32 00:02:13,760 --> 00:02:16,359 Speaker 2: But what they've talked about, we've compared to what we 33 00:02:16,480 --> 00:02:20,480 Speaker 2: reported on at Twitter. You know, for example, at Twitter, 34 00:02:20,560 --> 00:02:23,400 Speaker 2: he cut more than, at this point, eighty percent of 35 00:02:23,440 --> 00:02:26,120 Speaker 2: the staff, you know, compared to when 36 00:02:26,280 --> 00:02:29,200 Speaker 2: he took over. That's thousands of jobs. You know, 37 00:02:29,200 --> 00:02:33,240 Speaker 2: if he applies that to the federal government, that's tens 38 00:02:33,240 --> 00:02:36,240 Speaker 2: of thousands of jobs that could be removed. 39 00:02:37,600 --> 00:02:40,760 Speaker 2: You know.
He slashed costs kind of ruthlessly at Twitter 40 00:02:40,800 --> 00:02:47,320 Speaker 2: as well, cutting things from rent payments to janitorial services, 41 00:02:47,400 --> 00:02:52,600 Speaker 2: to food services, to systems that help make the website run. 42 00:02:53,440 --> 00:02:55,520 Speaker 2: And we could see that kind of slash 43 00:02:55,520 --> 00:02:59,400 Speaker 2: and burn technique play out in the government to an 44 00:02:59,440 --> 00:03:03,240 Speaker 2: extent we've never seen before. So, you know, we have 45 00:03:03,280 --> 00:03:05,560 Speaker 2: to talk very generally about what he's saying he's doing. 46 00:03:05,639 --> 00:03:08,000 Speaker 2: And of course Elon is someone that makes a lot 47 00:03:08,000 --> 00:03:12,320 Speaker 2: of promises, but the precedent has been kind of set 48 00:03:12,360 --> 00:03:14,040 Speaker 2: with what he did at Twitter. 49 00:03:15,520 --> 00:03:18,920 Speaker 1: Let's start from the beginning. What is your gig? What 50 00:03:19,080 --> 00:03:21,520 Speaker 1: is your purview with The New York Times? 51 00:03:22,080 --> 00:03:25,680 Speaker 2: Sure, so, I'm a reporter on our tech desk. 52 00:03:25,760 --> 00:03:29,120 Speaker 2: We have people around the world who cover the technology industry. 53 00:03:30,440 --> 00:03:34,720 Speaker 2: On that desk, I do what we call accountability reporting, 54 00:03:34,760 --> 00:03:37,200 Speaker 2: you know, a lot of investigative work, a lot of 55 00:03:37,200 --> 00:03:41,800 Speaker 2: deeper dives into the movers and shakers of the industry. 56 00:03:44,160 --> 00:03:47,320 Speaker 2: I've covered Facebook and Mark Zuckerberg in the past. I've 57 00:03:47,360 --> 00:03:50,920 Speaker 2: covered Jack Dorsey, and I've covered Elon not only in 58 00:03:50,960 --> 00:03:54,320 Speaker 2: this job but in previous gigs.
I've worked at BuzzFeed 59 00:03:54,360 --> 00:03:59,680 Speaker 2: News before this, at Forbes magazine as well, and my 60 00:03:59,720 --> 00:04:03,560 Speaker 2: remit at The Times is to, you know, find these stories, 61 00:04:03,640 --> 00:04:07,920 Speaker 2: dig deeper into these guys, and, you know, report out 62 00:04:08,080 --> 00:04:13,960 Speaker 2: these monumental business decisions or happenings in the industry. So, 63 00:04:14,480 --> 00:04:16,200 Speaker 2: you know, two and a half years ago, when Elon 64 00:04:16,240 --> 00:04:19,240 Speaker 2: decided to buy Twitter, I was one of a couple 65 00:04:19,279 --> 00:04:23,160 Speaker 2: of reporters that was thrown head on into covering this 66 00:04:23,279 --> 00:04:27,880 Speaker 2: kind of unprecedented transaction, this transaction of one man, you know, 67 00:04:28,080 --> 00:04:30,520 Speaker 2: the world's richest man, worth two hundred billion dollars at 68 00:04:30,560 --> 00:04:34,520 Speaker 2: the time, buying a whole company for forty four billion dollars, 69 00:04:35,320 --> 00:04:38,120 Speaker 2: and one thing led to another. I was teamed up 70 00:04:38,120 --> 00:04:40,160 Speaker 2: with Kate Conger, my co-author now on the book, 71 00:04:40,200 --> 00:04:43,279 Speaker 2: and we were reporting two or three stories a day 72 00:04:43,520 --> 00:04:46,039 Speaker 2: to the point where we couldn't fit everything into the 73 00:04:46,040 --> 00:04:49,359 Speaker 2: pages of The New York Times. And you know, that's 74 00:04:49,600 --> 00:04:52,200 Speaker 2: where we got this book and kind of continued it 75 00:04:52,240 --> 00:04:57,000 Speaker 2: from there. But at the Times these days, I effectively 76 00:04:57,000 --> 00:04:59,800 Speaker 2: cover Elon full time.
You know, he is someone 77 00:05:00,520 --> 00:05:04,800 Speaker 2: that is incredibly powerful, incredibly interesting, and I'm one of 78 00:05:04,839 --> 00:05:07,640 Speaker 2: a couple of reporters who are dedicated to, you know, 79 00:05:07,720 --> 00:05:10,720 Speaker 2: following him and understanding what he's up to, whether that's 80 00:05:10,880 --> 00:05:14,000 Speaker 2: at Tesla or SpaceX or now in the White House. 81 00:05:14,839 --> 00:05:16,680 Speaker 1: Okay, let's go back a step. How did you get 82 00:05:16,720 --> 00:05:18,280 Speaker 1: the gig at the New York Times? 83 00:05:22,000 --> 00:05:24,960 Speaker 2: Gosh, I guess I have to start from the very beginning. 84 00:05:25,720 --> 00:05:31,040 Speaker 2: Go for it, let's do it. Coming out of college, I, 85 00:05:32,240 --> 00:05:33,920 Speaker 2: you know, in college, didn't really know what I wanted 86 00:05:33,960 --> 00:05:34,120 Speaker 2: to do. 87 00:05:34,160 --> 00:05:35,600 Speaker 1: And where'd you go to college? 88 00:05:36,120 --> 00:05:39,920 Speaker 2: I went to Stanford, to Stanford, and like a good 89 00:05:39,960 --> 00:05:43,719 Speaker 2: immigrant son, you know, son of immigrants, I was on 90 00:05:43,760 --> 00:05:47,360 Speaker 2: the med school track and looking to become a doctor, 91 00:05:48,040 --> 00:05:51,760 Speaker 2: and realizing that the life sciences weren't necessarily for me. 92 00:05:53,040 --> 00:05:55,400 Speaker 2: But I was always doing the school paper in school 93 00:05:55,560 --> 00:05:58,320 Speaker 2: and having a lot of fun with it and realizing, 94 00:05:58,360 --> 00:06:01,760 Speaker 2: you know, maybe I could pursue this as a profession. 95 00:06:01,839 --> 00:06:05,120 Speaker 2: You know, I started doing internships while in school. I 96 00:06:05,160 --> 00:06:08,200 Speaker 2: worked for the Half Moon Bay Review in Half Moon Bay, California.
97 00:06:09,400 --> 00:06:12,080 Speaker 2: I worked at the Orange County Register, my hometown paper, 98 00:06:13,040 --> 00:06:16,479 Speaker 2: where I grew up in Orange County. I ended up 99 00:06:16,520 --> 00:06:20,200 Speaker 2: working for The Times as an intern and at Bloomberg, 100 00:06:21,360 --> 00:06:24,440 Speaker 2: and eventually thought, you know, what the heck, I'll 101 00:06:24,440 --> 00:06:28,120 Speaker 2: give this a shot. And after school I went to 102 00:06:28,200 --> 00:06:31,200 Speaker 2: Forbes, worked there for five and a half years as 103 00:06:31,200 --> 00:06:35,440 Speaker 2: a staff writer on the magazine covering Silicon Valley. I 104 00:06:36,400 --> 00:06:37,960 Speaker 2: was based in San Francisco, so it made it a 105 00:06:37,960 --> 00:06:43,680 Speaker 2: lot easier. And actually at Forbes, I started on this 106 00:06:43,720 --> 00:06:46,359 Speaker 2: thing called the Wealth Team, which is the team that 107 00:06:46,400 --> 00:06:49,480 Speaker 2: puts together these kind of Forbes rich lists, you know, the 108 00:06:49,520 --> 00:06:53,920 Speaker 2: Forbes billionaires, the Forbes four hundred richest Americans. And actually 109 00:06:54,240 --> 00:06:56,799 Speaker 2: it's a great place to start. As a cub reporter, 110 00:06:57,120 --> 00:07:00,599 Speaker 2: you are essentially doing these calculations that everyone sees, 111 00:07:00,680 --> 00:07:03,760 Speaker 2: of, you know, so-and-so is worth billions of dollars. 112 00:07:04,360 --> 00:07:07,600 Speaker 2: But behind the scenes, as reporters, you are doing the 113 00:07:07,720 --> 00:07:11,120 Speaker 2: nitty gritty of those calculations. You are tallying up SEC 114 00:07:11,520 --> 00:07:15,360 Speaker 2: documents and looking at the financial filings. You are valuing 115 00:07:15,400 --> 00:07:19,640 Speaker 2: things like megayachts and wine collections and art.
You're talking 116 00:07:19,720 --> 00:07:22,720 Speaker 2: to, you know, some of these people talk to us 117 00:07:22,760 --> 00:07:25,160 Speaker 2: and, like, you know, say, you know, I'm actually up 118 00:07:25,240 --> 00:07:28,920 Speaker 2: this year. You're missing my Picasso that I just bought, 119 00:07:29,200 --> 00:07:32,720 Speaker 2: or, you know, that wine collection is too low 120 00:07:32,760 --> 00:07:35,040 Speaker 2: for me, actually it should be higher because the price 121 00:07:35,120 --> 00:07:38,360 Speaker 2: of Burgundy went up. So as a twenty two year 122 00:07:38,400 --> 00:07:44,320 Speaker 2: old reporter who was making peanuts, thirty five thousand dollars 123 00:07:44,360 --> 00:07:47,440 Speaker 2: a year, talking to the richest people in the world, 124 00:07:47,440 --> 00:07:51,520 Speaker 2: it was kind of surreal for me. And actually that's 125 00:07:51,560 --> 00:07:54,600 Speaker 2: where I developed kind of my early network, my Rolodex. 126 00:07:54,720 --> 00:07:56,160 Speaker 2: As a reporter, you know, you're as good as your 127 00:07:56,160 --> 00:07:59,160 Speaker 2: sources and the people you talk to, and it was 128 00:07:59,240 --> 00:08:02,320 Speaker 2: kind of a start for me, because, 129 00:08:02,360 --> 00:08:05,920 Speaker 2: you know, when you're that young, you don't usually get introduced 130 00:08:05,960 --> 00:08:09,320 Speaker 2: into those circles. And from there I kind of built 131 00:08:09,320 --> 00:08:11,600 Speaker 2: a platform, you know. I moved off that team, I 132 00:08:11,680 --> 00:08:14,480 Speaker 2: was able to generate my own story ideas, did a 133 00:08:14,520 --> 00:08:18,360 Speaker 2: lot of profiles of CEOs and those types.
I actually 134 00:08:18,400 --> 00:08:22,320 Speaker 2: covered music briefly with Zack O'Malley Greenberg at Forbes, 135 00:08:22,320 --> 00:08:25,960 Speaker 2: who was a music business guy, and kind of 136 00:08:26,040 --> 00:08:31,400 Speaker 2: learned from him as well. And, uh, yeah, moved to 137 00:08:31,440 --> 00:08:34,800 Speaker 2: BuzzFeed a couple of years later, kind of pivoted a 138 00:08:34,800 --> 00:08:37,840 Speaker 2: little bit to do more investigative type reporting, where I 139 00:08:37,880 --> 00:08:41,480 Speaker 2: reported quite a bit on Facebook and Mark Zuckerberg and 140 00:08:42,200 --> 00:08:44,280 Speaker 2: won some awards there, and ended up at the Times 141 00:08:45,559 --> 00:08:46,439 Speaker 2: three and a half years ago. 142 00:08:46,480 --> 00:08:50,360 Speaker 1: Now, so, okay, are you just working at BuzzFeed and 143 00:08:50,600 --> 00:08:54,360 Speaker 1: The Times calls you? Also, BuzzFeed took an ax to 144 00:08:54,400 --> 00:08:56,720 Speaker 1: their news department. So how did the transition happen? 145 00:08:57,520 --> 00:09:01,640 Speaker 2: Very very sad. Yeah, because BuzzFeed, if you remember that 146 00:09:01,720 --> 00:09:06,600 Speaker 2: kind of optimistic era of digital media, right, you have BuzzFeed, 147 00:09:06,640 --> 00:09:10,240 Speaker 2: you have Vox, you have all these startups popping up 148 00:09:11,080 --> 00:09:14,559 Speaker 2: with the promise that they could figure out, you know, 149 00:09:14,600 --> 00:09:18,800 Speaker 2: how to put things online better than the legacy publications. 150 00:09:19,280 --> 00:09:23,400 Speaker 2: And you know, ultimately what we learned is that they 151 00:09:23,400 --> 00:09:26,360 Speaker 2: were at the whims of these tech platforms.
These giants, 152 00:09:26,440 --> 00:09:29,960 Speaker 2: right. And Facebook, you know, loved that content initially and 153 00:09:29,960 --> 00:09:32,040 Speaker 2: then didn't like it so much and turned off the spigot, 154 00:09:32,080 --> 00:09:35,120 Speaker 2: and that's what caused the collapse. But in that kind 155 00:09:35,120 --> 00:09:37,839 Speaker 2: of heyday, which I kind of joined towards the end of, 156 00:09:38,240 --> 00:09:42,520 Speaker 2: you know, the venture capital fueled digital media revolution, it 157 00:09:42,640 --> 00:09:46,440 Speaker 2: was great. You had these amazing resources to pursue these 158 00:09:46,480 --> 00:09:48,600 Speaker 2: long term stories. You know, you didn't have to deal 159 00:09:48,640 --> 00:09:54,160 Speaker 2: with the nitty gritty day to day coverage of the news cycle. 160 00:09:54,400 --> 00:09:57,040 Speaker 2: I could kind of pull out and do a three 161 00:09:57,040 --> 00:10:02,199 Speaker 2: month investigation into Facebook or Google, or I covered a 162 00:10:02,200 --> 00:10:04,360 Speaker 2: lot of Peter Thiel, for example, a billionaire 163 00:10:04,360 --> 00:10:07,319 Speaker 2: who was on the Facebook board. And I kind of 164 00:10:07,320 --> 00:10:10,640 Speaker 2: built my name on those stories. And BuzzFeed had the 165 00:10:10,679 --> 00:10:14,800 Speaker 2: resources to do that and gave the opportunity to kind 166 00:10:14,840 --> 00:10:18,320 Speaker 2: of mid career or early to mid career reporters like myself. 167 00:10:19,640 --> 00:10:21,640 Speaker 2: So it was a great place to work, super collegial. 168 00:10:22,000 --> 00:10:25,800 Speaker 2: Everyone is like friends and sharing each other's work, you know, 169 00:10:27,440 --> 00:10:31,199 Speaker 2: and just an amazing place to be, a great energy.
Unfortunately, 170 00:10:31,280 --> 00:10:34,360 Speaker 2: what we learned is that that model wasn't a great business, 171 00:10:34,520 --> 00:10:37,920 Speaker 2: you know. And when the spigot was turned off by 172 00:10:37,960 --> 00:10:41,959 Speaker 2: the digital platforms, there were layoffs, and quite a few, 173 00:10:42,040 --> 00:10:46,200 Speaker 2: until the point where the news division is effectively dead. 174 00:10:47,840 --> 00:10:52,480 Speaker 2: I left before the very very end of BuzzFeed News. 175 00:10:53,000 --> 00:10:54,800 Speaker 2: The New York Times saw my work and liked it 176 00:10:55,040 --> 00:10:59,520 Speaker 2: and recruited me. But yeah, that's kind of how 177 00:10:59,559 --> 00:11:00,000 Speaker 2: that played out. 178 00:11:00,240 --> 00:11:04,000 Speaker 1: Okay. At this late date, in the wake of the election, 179 00:11:04,640 --> 00:11:08,480 Speaker 1: the New York Times, the paper of record, irrespective of the truth, 180 00:11:08,640 --> 00:11:12,000 Speaker 1: is seen as more of a left wing publication 181 00:11:12,880 --> 00:11:17,000 Speaker 1: in light of the election and the election of Trump. 182 00:11:17,559 --> 00:11:21,400 Speaker 1: Has there been any reevaluation or adjustment at the 183 00:11:21,520 --> 00:11:22,600 Speaker 1: Times in general? 184 00:11:24,640 --> 00:11:26,560 Speaker 2: You know, I don't want to speak for the paper. 185 00:11:26,559 --> 00:11:28,280 Speaker 2: I've only been there for three and a half years. 186 00:11:28,520 --> 00:11:31,319 Speaker 2: I know that the paper has focused so much on independence, 187 00:11:31,480 --> 00:11:35,040 Speaker 2: you know. Obviously there's a perception of bias one way or 188 00:11:35,040 --> 00:11:37,360 Speaker 2: the other.
I mean, you could talk to progressives, for example, 189 00:11:37,400 --> 00:11:42,640 Speaker 2: who think the paper is actually conservative. But I 190 00:11:42,640 --> 00:11:45,719 Speaker 2: think above all it's this idea of editorial independence that 191 00:11:45,800 --> 00:11:49,880 Speaker 2: the paper values more than most. And honestly, you know, 192 00:11:49,920 --> 00:11:52,240 Speaker 2: that could sound like me blowing smoke up people's asses, 193 00:11:52,280 --> 00:11:58,160 Speaker 2: but in reality, you know, there is no political bent 194 00:11:58,360 --> 00:12:01,160 Speaker 2: to the paper that I've experienced. And you know, on 195 00:12:01,800 --> 00:12:06,920 Speaker 2: my desk, you know, I cover technology. There's no 196 00:12:07,000 --> 00:12:10,240 Speaker 2: remit to go after whatever, report on certain targets, whatever. 197 00:12:10,320 --> 00:12:13,040 Speaker 2: But, you know, for me, it's always been about 198 00:12:13,040 --> 00:12:15,960 Speaker 2: interrogating power. That for me has kind of been 199 00:12:15,960 --> 00:12:21,480 Speaker 2: my north star, regardless of political affiliation. And you know, 200 00:12:21,520 --> 00:12:23,280 Speaker 2: I think my career has kind of proven that, that 201 00:12:23,440 --> 00:12:28,480 Speaker 2: I have focused on the powers, the titans of industry, 202 00:12:28,520 --> 00:12:30,560 Speaker 2: you know, the powers that be, you know, the richest 203 00:12:30,559 --> 00:12:33,080 Speaker 2: people in the world, the people that control these platforms, 204 00:12:33,120 --> 00:12:35,920 Speaker 2: that control these companies that shape our daily lives. And 205 00:12:36,920 --> 00:12:40,160 Speaker 2: that's how I hope I continue to report, you know, 206 00:12:40,280 --> 00:12:44,080 Speaker 2: and, you know, I think that's kind of okay. 207 00:12:44,960 --> 00:12:49,000 Speaker 1: At this point in time, the mainstream media is a pejorative.
Yeah, 208 00:12:49,120 --> 00:12:54,000 Speaker 1: what do people not understand about The New York Times? 209 00:12:54,040 --> 00:12:59,319 Speaker 2: Hm, you know, that's a good question. I mean, 210 00:13:00,120 --> 00:13:03,319 Speaker 2: the mainstream media, it's so easy to criticize, right? 211 00:13:03,400 --> 00:13:08,080 Speaker 2: And in my world of looking at social media and seeing 212 00:13:08,120 --> 00:13:12,840 Speaker 2: people that criticize the New York Times, particularly, 213 00:13:13,000 --> 00:13:17,839 Speaker 2: I'll use Elon Musk as an example, right, who says, 214 00:13:17,880 --> 00:13:21,040 Speaker 2: you know, you shouldn't get your news from the 215 00:13:21,040 --> 00:13:23,520 Speaker 2: New York Times or the Wall Street Journal or the Washington Post. 216 00:13:23,600 --> 00:13:26,160 Speaker 2: Everything is actually going to be on X. You know, 217 00:13:26,320 --> 00:13:29,760 Speaker 2: X is the news source, you are the 218 00:13:29,800 --> 00:13:33,400 Speaker 2: news source, because it's people feeding X. Well, a 219 00:13:33,440 --> 00:13:35,600 Speaker 2: lot of that news comes from mainstream media. You know, 220 00:13:35,640 --> 00:13:37,959 Speaker 2: we're the ones reporting it, and that is then 221 00:13:38,000 --> 00:13:42,600 Speaker 2: aggregated into those tweets or shared in those videos. 222 00:13:43,280 --> 00:13:49,200 Speaker 2: And I think that's the kind of, uh, shortsightedness 223 00:13:49,240 --> 00:13:52,959 Speaker 2: in all this: that, like, when 224 00:13:53,000 --> 00:13:55,360 Speaker 2: I cover social media, for example, when people talk about 225 00:13:55,360 --> 00:14:00,760 Speaker 2: social media replacing newspapers or CNN or whatever, a lot 226 00:14:00,760 --> 00:14:03,400 Speaker 2: of what they draw from is what we work 227 00:14:03,440 --> 00:14:06,160 Speaker 2: on every day.
You know, that information couldn't come out 228 00:14:06,200 --> 00:14:08,280 Speaker 2: without a reporter at the New York Times calling someone 229 00:14:08,360 --> 00:14:12,320 Speaker 2: up and writing up a story, which then gets aggregated 230 00:14:12,559 --> 00:14:15,120 Speaker 2: or pulled out into a tweet or put on TikTok, 231 00:14:15,240 --> 00:14:19,320 Speaker 2: for example. And it's that ecosystem that I think people 232 00:14:19,360 --> 00:14:26,600 Speaker 2: don't understand. And yeah, it's kind of sad, 233 00:14:26,720 --> 00:14:31,760 Speaker 2: because Elon Musk and Linda Yaccarino, the CEO of X, 234 00:14:31,840 --> 00:14:34,000 Speaker 2: now it's called X because that's what they've changed the 235 00:14:34,080 --> 00:14:38,120 Speaker 2: name to, have made this kind of antagonistic point that, 236 00:14:38,200 --> 00:14:40,800 Speaker 2: like, X is actually the replacement for the New York 237 00:14:40,840 --> 00:14:44,680 Speaker 2: Times or for whatever fill-in news source. You know, 238 00:14:44,800 --> 00:14:48,600 Speaker 2: but the last two decades of Twitter have shown that, 239 00:14:49,200 --> 00:14:52,080 Speaker 2: you know, the platform has been super reliant on news 240 00:14:52,080 --> 00:14:57,240 Speaker 2: generators and reporters, and to alienate them seems pretty shortsighted. 241 00:14:58,280 --> 00:15:00,760 Speaker 2: And not to understand that ecosystem seems kind of 242 00:15:01,200 --> 00:15:01,800 Speaker 2: absurd to me. 243 00:15:09,800 --> 00:15:16,200 Speaker 1: Okay, so you're working at the New York Times. Can 244 00:15:16,240 --> 00:15:20,360 Speaker 1: you literally follow anything you want? Or do they say, 245 00:15:20,400 --> 00:15:21,160 Speaker 1: stay on this? 246 00:15:23,680 --> 00:15:25,640 Speaker 2: It's a bit of give and take.
You know, I'm, 247 00:15:25,680 --> 00:15:29,080 Speaker 2: hopefully, you know, if I wake up one morning and my 248 00:15:29,160 --> 00:15:31,600 Speaker 2: editor is like, you know, we should pay more attention 249 00:15:31,680 --> 00:15:35,960 Speaker 2: to this, I'll definitely listen. You know, it's not 250 00:15:36,160 --> 00:15:42,720 Speaker 2: completely me generating everything, but there's also a give and take. 251 00:15:42,800 --> 00:15:44,760 Speaker 2: And if I say, like, look, we should be paying 252 00:15:44,800 --> 00:15:48,200 Speaker 2: attention to this more, I think this is undercovered, I 253 00:15:48,280 --> 00:15:50,640 Speaker 2: have that freedom, and it's a nice place 254 00:15:50,680 --> 00:15:53,920 Speaker 2: to be in your career. I don't necessarily have a 255 00:15:53,920 --> 00:15:56,040 Speaker 2: beat at the company. You know, we have 256 00:15:56,080 --> 00:15:59,600 Speaker 2: reporters that cover Tesla. We have reporters that cover X. Kate, 257 00:15:59,720 --> 00:16:03,120 Speaker 2: my co-author, covers X on a full time basis. 258 00:16:04,080 --> 00:16:06,640 Speaker 2: We have reporters that cover Google and, you know, name 259 00:16:06,680 --> 00:16:12,360 Speaker 2: whatever company. I help those reporters, I'll supplement them. But 260 00:16:12,400 --> 00:16:14,400 Speaker 2: if there's a deep dive that we need to go 261 00:16:14,480 --> 00:16:19,360 Speaker 2: into, or, you know, understand Musk's relationship with 262 00:16:19,400 --> 00:16:21,160 Speaker 2: the White House, for example, which is something we've been 263 00:16:21,160 --> 00:16:23,600 Speaker 2: trying to do over the last couple weeks to months, 264 00:16:24,040 --> 00:16:26,640 Speaker 2: you know, they deploy me on that and I'll kind 265 00:16:26,640 --> 00:16:28,760 Speaker 2: of shift my resources and my efforts to that kind 266 00:16:28,800 --> 00:16:29,200 Speaker 2: of story.
267 00:16:30,200 --> 00:16:32,640 Speaker 1: So you've been covering, starting at Forbes, 268 00:16:32,720 --> 00:16:36,920 Speaker 1: the wealthy, forgetting people who inherit their wealth, people who 269 00:16:36,960 --> 00:16:40,680 Speaker 1: make their wealth. What similarities do all these people have? 270 00:16:44,280 --> 00:16:47,800 Speaker 2: That's something I've always thought about, and I don't think 271 00:16:47,800 --> 00:16:51,680 Speaker 2: there is one unique characteristic. You know, it's actually really funny. 272 00:16:51,720 --> 00:16:55,760 Speaker 2: I have been around them. I guess I've 273 00:16:55,760 --> 00:16:58,520 Speaker 2: been around more rich people 274 00:16:58,600 --> 00:17:02,160 Speaker 2: relative to the average person. But there's all types, you know, 275 00:17:02,200 --> 00:17:06,680 Speaker 2: there's all types of people. For example, 276 00:17:07,119 --> 00:17:09,400 Speaker 2: some folks do not want to talk to us about 277 00:17:09,400 --> 00:17:11,120 Speaker 2: their wealth at all. You know, they would 278 00:17:11,160 --> 00:17:12,680 Speaker 2: hang up the phone as soon as I called, never 279 00:17:12,720 --> 00:17:17,120 Speaker 2: return my emails, that kind of thing. Some people want 280 00:17:17,160 --> 00:17:19,560 Speaker 2: to get down to every last detail. You know, 281 00:17:19,600 --> 00:17:22,320 Speaker 2: you missed this.
I think, my holding over here in 282 00:17:22,359 --> 00:17:24,439 Speaker 2: these stocks, and, you know, I'm actually up by one 283 00:17:24,480 --> 00:17:28,359 Speaker 2: hundred million over here and down. And 284 00:17:28,520 --> 00:17:32,479 Speaker 2: it was like such an exercise in egos in a way, 285 00:17:32,600 --> 00:17:36,760 Speaker 2: and you start to realize that, like, it's gonna sound 286 00:17:37,600 --> 00:17:40,040 Speaker 2: weird, but, like, you know, they're normal people like us, 287 00:17:40,320 --> 00:17:45,439 Speaker 2: you know. They have the same hang ups or, 288 00:17:46,119 --> 00:17:53,880 Speaker 2: you know, same motivations in some ways, same fears. And 289 00:17:54,200 --> 00:17:58,560 Speaker 2: really, it took me a while to 290 00:17:58,640 --> 00:18:02,120 Speaker 2: understand that. Like, there wasn't one gene, for example, 291 00:18:02,160 --> 00:18:05,520 Speaker 2: that causes you to become super wealthy. I think there's 292 00:18:05,560 --> 00:18:08,160 Speaker 2: a lot of luck at play, a lot of where 293 00:18:08,200 --> 00:18:13,600 Speaker 2: you start in life. Sure, there's things like hard work 294 00:18:13,640 --> 00:18:17,200 Speaker 2: and that kind of thing, and effort, but yeah, there's 295 00:18:17,359 --> 00:18:21,480 Speaker 2: no one characteristic that I could ascribe it to, or 296 00:18:21,520 --> 00:18:23,639 Speaker 2: else I would have followed that and made myself a 297 00:18:23,680 --> 00:18:24,560 Speaker 2: billionaire.
298 00:18:25,160 --> 00:18:28,040 Speaker 1: Okay, we live in an era where money is everything, 299 00:18:28,760 --> 00:18:31,639 Speaker 1: and people who have money, usually 300 00:18:31,680 --> 00:18:35,159 Speaker 1: they made the money in one specific vertical, and 301 00:18:35,160 --> 00:18:40,000 Speaker 1: there's a conception by the hoi polloi 302 00:18:40,760 --> 00:18:44,640 Speaker 1: that these people are experts in all areas. Having covered 303 00:18:44,680 --> 00:18:49,280 Speaker 1: these people, are they experts in all areas, or are 304 00:18:49,280 --> 00:18:51,840 Speaker 1: they just super smart in the one area they're in, 305 00:18:52,000 --> 00:18:54,879 Speaker 1: and they're as blind as the next person in another area? 306 00:18:55,720 --> 00:18:58,520 Speaker 2: Well, I've read your review of our book. I think 307 00:18:58,520 --> 00:19:01,240 Speaker 2: we're aligned on this. But that's one of the findings of 308 00:19:01,240 --> 00:19:04,680 Speaker 2: our book. Let's talk about Elon specifically here, right? 309 00:19:05,480 --> 00:19:08,680 Speaker 2: I mean, a generational entrepreneur, I'm not 310 00:19:08,760 --> 00:19:12,359 Speaker 2: going to deny that by any means. He brought in 311 00:19:12,520 --> 00:19:18,159 Speaker 2: the electric car revolution. He foresaw this industry to privatize 312 00:19:18,200 --> 00:19:23,399 Speaker 2: space and get things into orbit, in which now he 313 00:19:23,720 --> 00:19:25,960 Speaker 2: has a virtual monopoly with the US government in getting 314 00:19:26,000 --> 00:19:29,560 Speaker 2: things into space. You know, to see that and be 315 00:19:29,640 --> 00:19:34,000 Speaker 2: successful in that not once but twice, there's a 316 00:19:34,000 --> 00:19:37,640 Speaker 2: lot of credit there. At the same time, it doesn't necessarily 317 00:19:37,720 --> 00:19:41,560 Speaker 2: make him, you know, smart in another area.
He is 318 00:19:41,600 --> 00:19:46,040 Speaker 2: not an epidemiologist, you know, when he was making comments 319 00:19:46,080 --> 00:19:52,160 Speaker 2: about COVID, for example. He is not by any means 320 00:19:52,160 --> 00:19:53,840 Speaker 2: an expert in social media and how to run a 321 00:19:53,880 --> 00:19:58,280 Speaker 2: social media platform, which I think is in some ways 322 00:19:58,280 --> 00:20:01,639 Speaker 2: the original sin when he bought Twitter, right? It's this 323 00:20:01,760 --> 00:20:04,720 Speaker 2: idea that he had been so successful in these other 324 00:20:04,920 --> 00:20:08,760 Speaker 2: areas that, you know, how hard could it be? This 325 00:20:08,800 --> 00:20:12,280 Speaker 2: is a social media company. It's, you know, in his view, 326 00:20:12,400 --> 00:20:16,119 Speaker 2: servers and a website. You know, you're just plugging it 327 00:20:16,160 --> 00:20:19,399 Speaker 2: in and letting it go and keeping it online. How 328 00:20:19,720 --> 00:20:23,880 Speaker 2: hard could that be compared to, you know, developing Autopilot 329 00:20:24,240 --> 00:20:29,480 Speaker 2: or making a rocket land on a ship in the 330 00:20:29,480 --> 00:20:33,359 Speaker 2: middle of the ocean? And I think that is the 331 00:20:33,400 --> 00:20:36,840 Speaker 2: original sin, because what he didn't realize is social media 332 00:20:36,880 --> 00:20:40,080 Speaker 2: isn't a tech problem. It's actually a human problem. 333 00:20:40,720 --> 00:20:43,040 Speaker 2: It's a societal problem. It's a social problem.
You're connecting 334 00:20:43,080 --> 00:20:47,840 Speaker 2: people online and figuring out ways for them to communicate 335 00:20:47,880 --> 00:20:52,120 Speaker 2: without killing each other or, you know, causing huge disturbances, 336 00:20:53,040 --> 00:20:55,680 Speaker 2: and making everyone, or at least most people, feel like 337 00:20:55,720 --> 00:21:01,520 Speaker 2: they're welcome to express themselves. And yeah, so 338 00:21:01,640 --> 00:21:05,440 Speaker 2: I mean, that kind of central understanding, which powered 339 00:21:05,640 --> 00:21:08,720 Speaker 2: our book, is one that I come back to a 340 00:21:08,720 --> 00:21:11,880 Speaker 2: lot: that oftentimes there are people that are very successful 341 00:21:11,920 --> 00:21:15,320 Speaker 2: that think, because they've reached such success in one area, 342 00:21:15,359 --> 00:21:18,000 Speaker 2: they can be successful in another, and oftentimes that's not 343 00:21:18,040 --> 00:21:18,440 Speaker 2: the case. 344 00:21:19,280 --> 00:21:25,040 Speaker 1: Okay, when Elon makes the offer to buy Twitter, 345 00:21:25,160 --> 00:21:28,400 Speaker 1: were you on the Elon beat? 346 00:21:30,920 --> 00:21:37,240 Speaker 2: Hmm. I started covering him, like really covering him, in 347 00:21:37,280 --> 00:21:42,919 Speaker 2: twenty eighteen at BuzzFeed. There were some issues at his factories. 348 00:21:42,960 --> 00:21:46,880 Speaker 2: This was a period of turmoil for him, and there 349 00:21:46,880 --> 00:21:50,480 Speaker 2: were some kind of investigative stories, and he was a 350 00:21:50,480 --> 00:21:52,840 Speaker 2: subject I would return to every now and then, but 351 00:21:52,880 --> 00:21:56,119 Speaker 2: it wasn't by any means like my main focus.
And 352 00:21:56,160 --> 00:21:58,479 Speaker 2: by the time he made the offer for Twitter in 353 00:21:58,520 --> 00:22:03,119 Speaker 2: April of twenty twenty two, I don't think I 354 00:22:03,160 --> 00:22:05,760 Speaker 2: had written a single story on him for The Times. 355 00:22:05,800 --> 00:22:09,879 Speaker 2: I had joined, gosh, I think the year prior, year 356 00:22:09,920 --> 00:22:12,760 Speaker 2: and a half prior. I hadn't written about him for 357 00:22:12,760 --> 00:22:17,800 Speaker 2: The Times, but they had known my past work covering him. 358 00:22:18,280 --> 00:22:20,280 Speaker 2: There's actually a trial we can talk about that I 359 00:22:20,320 --> 00:22:24,439 Speaker 2: was kind of, in some ways, involved with with him. 360 00:22:24,520 --> 00:22:26,600 Speaker 2: But at The Times, you know, they saw my past 361 00:22:26,600 --> 00:22:30,160 Speaker 2: work and they're like, okay, this is a pretty monumental event. 362 00:22:30,800 --> 00:22:33,640 Speaker 2: Let's actually take some time and understand how serious this is. 363 00:22:34,280 --> 00:22:36,000 Speaker 2: Because what you have to remember is, when he actually 364 00:22:36,040 --> 00:22:38,600 Speaker 2: made the bid, people didn't know if he was serious. 365 00:22:38,640 --> 00:22:41,560 Speaker 2: You know, he made a reference to a 366 00:22:41,560 --> 00:22:44,920 Speaker 2: weed joke: fifty four twenty, fifty four 367 00:22:44,880 --> 00:22:47,840 Speaker 2: dollars and twenty cents per share was his offer, which amounted to forty 368 00:22:47,880 --> 00:22:52,040 Speaker 2: four billion dollars. His offer letter was pretty thin when 369 00:22:52,080 --> 00:22:54,719 Speaker 2: he made the offer. It wasn't clear if 370 00:22:54,720 --> 00:22:56,040 Speaker 2: he was going to go through with it. And then obviously he 371 00:22:56,600 --> 00:23:00,159 Speaker 2: tried to pull out of it a couple weeks later.
372 00:23:02,040 --> 00:23:04,359 Speaker 2: But they had me spend a lot of time trying 373 00:23:04,359 --> 00:23:06,639 Speaker 2: to interrogate whether or not this thing was going to happen, 374 00:23:06,840 --> 00:23:10,240 Speaker 2: and, you know, what was the mindset and what was 375 00:23:10,280 --> 00:23:13,640 Speaker 2: the thinking. So I started to dig in and work 376 00:23:13,680 --> 00:23:16,960 Speaker 2: closely with Kate. And when he finally did take over 377 00:23:17,000 --> 00:23:20,199 Speaker 2: the company, after all the kind of legal wrangling, in 378 00:23:20,240 --> 00:23:24,040 Speaker 2: October twenty twenty two, it was kind of all hands 379 00:23:24,040 --> 00:23:25,959 Speaker 2: on deck. You know, it was two or three stories 380 00:23:25,960 --> 00:23:29,160 Speaker 2: a day. It was, oh my gosh, he's 381 00:23:29,200 --> 00:23:31,160 Speaker 2: laying off people over in this part of the company, 382 00:23:31,280 --> 00:23:34,120 Speaker 2: or he stopped paying rent over here. At one point 383 00:23:34,200 --> 00:23:37,120 Speaker 2: we heard the janitorial services were cut, so like there 384 00:23:37,160 --> 00:23:39,280 Speaker 2: was no toilet paper in the offices. Everything was 385 00:23:39,280 --> 00:23:42,359 Speaker 2: like a fire. Every day we'd wake up and we'd 386 00:23:42,400 --> 00:23:45,120 Speaker 2: have two or three things to report on about his takeover. 387 00:23:46,560 --> 00:23:48,600 Speaker 2: And since then, it kind of hasn't stopped. You know, 388 00:23:48,880 --> 00:23:51,320 Speaker 2: it's led from one thing to another: into the book, 389 00:23:51,960 --> 00:23:54,280 Speaker 2: into our coverage of him now as a political figure, 390 00:23:55,920 --> 00:23:59,520 Speaker 2: and that roller coaster hasn't stopped.
391 00:24:00,200 --> 00:24:03,040 Speaker 1: Okay, there's a legendary story, before you were at The 392 00:24:03,080 --> 00:24:08,680 Speaker 1: New York Times, where a reporter drove a Tesla, wrote a 393 00:24:08,800 --> 00:24:12,920 Speaker 1: review, and Elon came back and said, hey, there's data. I checked 394 00:24:12,920 --> 00:24:16,640 Speaker 1: the data in the car and it contradicted your reporting. 395 00:24:17,680 --> 00:24:21,200 Speaker 1: Are you hearing from Elon? You or Kate or somebody 396 00:24:21,240 --> 00:24:22,720 Speaker 1: else, saying you got it wrong? 397 00:24:25,400 --> 00:24:29,639 Speaker 2: Actually, I've had a couple interactions with Elon, so we 398 00:24:29,640 --> 00:24:32,600 Speaker 2: can start from the beginning. And actually I know that anecdote, 399 00:24:32,600 --> 00:24:37,280 Speaker 2: and what I would say about that is, like, fighting 400 00:24:37,320 --> 00:24:41,280 Speaker 2: against the media has been core to Elon's beliefs since 401 00:24:41,920 --> 00:24:45,200 Speaker 2: he started his companies. You know, remember there 402 00:24:45,240 --> 00:24:47,880 Speaker 2: was a lot of doubt about his companies, particularly Tesla, 403 00:24:48,880 --> 00:24:52,600 Speaker 2: where he in some ways relied on the media to 404 00:24:52,600 --> 00:24:55,080 Speaker 2: get out his message. He was very big on getting 405 00:24:55,080 --> 00:24:58,879 Speaker 2: on magazine covers and in stories, to the point where, 406 00:24:58,960 --> 00:25:00,800 Speaker 2: we were talking to folks who used to work with him, 407 00:25:01,760 --> 00:25:03,760 Speaker 2: and he'd be up at three a.m. reading, like, some 408 00:25:03,840 --> 00:25:07,480 Speaker 2: obscure Belgian blog. You know, every 409 00:25:07,520 --> 00:25:09,320 Speaker 2: piece of coverage was near and dear to him. He 410 00:25:09,320 --> 00:25:12,520 Speaker 2: had a Google Alert on his own name.
He cared so much, 411 00:25:12,800 --> 00:25:16,639 Speaker 2: you know. And on the other hand, he battled with 412 00:25:16,680 --> 00:25:20,840 Speaker 2: the media anytime they got something wrong. And so that 413 00:25:21,160 --> 00:25:24,359 Speaker 2: story you reference actually is very telling, and it 414 00:25:24,440 --> 00:25:27,920 Speaker 2: kind of, in some ways, proved his belief that 415 00:25:28,240 --> 00:25:31,879 Speaker 2: there was this kind of media force against him and 416 00:25:31,920 --> 00:25:34,560 Speaker 2: against his success, and he was out to kind of 417 00:25:34,600 --> 00:25:38,240 Speaker 2: prove them wrong, which in some ways you can see 418 00:25:38,280 --> 00:25:42,840 Speaker 2: the line that's drawn to the Twitter acquisition. In terms 419 00:25:42,840 --> 00:25:49,200 Speaker 2: of my interactions with him throughout my career, I don't 420 00:25:49,200 --> 00:25:55,880 Speaker 2: think he's ever denied a story I've reported. My famous 421 00:25:55,960 --> 00:25:57,479 Speaker 2: run-in with him was when I was at 422 00:25:57,480 --> 00:26:01,640 Speaker 2: BuzzFeed in twenty eighteen. Do you remember when those 423 00:26:01,720 --> 00:26:05,400 Speaker 2: kids got caught in the cave in Thailand? Of course, man, 424 00:26:05,480 --> 00:26:10,720 Speaker 2: the whole bit, exactly. So for the listeners: you know, a 425 00:26:10,840 --> 00:26:13,640 Speaker 2: cave rescuer gets involved, a British guy living in Thailand. 426 00:26:14,520 --> 00:26:17,480 Speaker 2: Elon then gets involved. It's like this crazy thing. He 427 00:26:17,480 --> 00:26:19,879 Speaker 2: builds a submarine to get these kids out of this 428 00:26:19,960 --> 00:26:24,119 Speaker 2: winding underwater cave.
They don't end up using the submarine, 429 00:26:24,160 --> 00:26:26,320 Speaker 2: but Elon gets a lot of press out of it, 430 00:26:26,880 --> 00:26:30,119 Speaker 2: and one of the rescuers goes on CNN and says, like, 431 00:26:30,200 --> 00:26:33,240 Speaker 2: you know, that didn't work, he can stick that 432 00:26:33,320 --> 00:26:35,520 Speaker 2: sub up his butt, you know, and kind 433 00:26:35,520 --> 00:26:39,119 Speaker 2: of was like, it was a complete distraction and he 434 00:26:39,160 --> 00:26:42,000 Speaker 2: didn't help at all. And in return, Elon calls this 435 00:26:42,040 --> 00:26:45,280 Speaker 2: guy "pedo guy," you know, on Twitter, and 436 00:26:45,320 --> 00:26:47,359 Speaker 2: I mean, those reactions now 437 00:26:47,359 --> 00:26:50,719 Speaker 2: are kind of part and parcel of who he is 438 00:26:50,720 --> 00:26:52,720 Speaker 2: on Twitter, but at that point in time, it was 439 00:26:52,800 --> 00:26:56,880 Speaker 2: kind of an insane accusation to call someone a pedophile 440 00:26:56,920 --> 00:27:04,359 Speaker 2: on Twitter with no basis. So in that process I 441 00:27:04,440 --> 00:27:08,879 Speaker 2: ended up emailing Elon asking him why he thought this 442 00:27:08,920 --> 00:27:12,560 Speaker 2: guy was a pedophile. Elon emails me back with this 443 00:27:12,680 --> 00:27:16,440 Speaker 2: kind of screed. Bear in mind, I'd never really talked 444 00:27:16,440 --> 00:27:19,720 Speaker 2: to him. A screed on why this guy was a 445 00:27:19,760 --> 00:27:22,159 Speaker 2: quote "child rapist" who had a twelve-year-old bride, 446 00:27:22,600 --> 00:27:24,439 Speaker 2: essentially trying to get me, as a reporter at 447 00:27:24,440 --> 00:27:28,840 Speaker 2: BuzzFeed, to write a story about this guy being a pedophile. 448 00:27:30,960 --> 00:27:33,520 Speaker 2: We did not end up doing that.
We 449 00:27:34,160 --> 00:27:36,040 Speaker 2: reported it out a little more, and it turns out 450 00:27:36,040 --> 00:27:39,199 Speaker 2: that Elon was kind of making all this up and 451 00:27:39,240 --> 00:27:41,840 Speaker 2: trying to get a reporter to dump this out into 452 00:27:41,840 --> 00:27:45,160 Speaker 2: the world, and so we ended up publishing Elon's email 453 00:27:45,200 --> 00:27:47,760 Speaker 2: in full, to kind of show his state of mind, 454 00:27:47,960 --> 00:27:50,679 Speaker 2: you know, to show that he was willing to destroy 455 00:27:50,720 --> 00:27:57,240 Speaker 2: someone's reputation, and to show that this was 456 00:27:57,320 --> 00:28:00,359 Speaker 2: kind of atypical behavior for anyone, much less 457 00:28:00,520 --> 00:28:02,879 Speaker 2: someone who runs, you know, a major company in Tesla. 458 00:28:04,840 --> 00:28:09,119 Speaker 2: Fast forward: that cave rescuer ends up suing Elon. 459 00:28:09,920 --> 00:28:15,480 Speaker 2: Elon tries to depose me for the case. 460 00:28:16,440 --> 00:28:19,399 Speaker 2: But, you know, that was a crazy 461 00:28:19,400 --> 00:28:21,080 Speaker 2: interaction in my career and one of the few times that 462 00:28:21,119 --> 00:28:22,840 Speaker 2: I've ever gotten to interact with him. 463 00:28:22,880 --> 00:28:27,520 Speaker 1: So, okay, a lot of these people can push back 464 00:28:27,840 --> 00:28:34,080 Speaker 1: very hard, accurately or inaccurately. I've dealt with some reporters 465 00:28:34,119 --> 00:28:38,800 Speaker 1: who laugh it off. I know in my own particular case, sometimes, 466 00:28:39,200 --> 00:28:41,840 Speaker 1: you know, you hold your ground, but you shake when 467 00:28:41,880 --> 00:28:44,480 Speaker 1: these people push back. Do you ever feel any personal 468 00:28:44,520 --> 00:28:47,760 Speaker 1: anxiety? Where you say, no, I'm a reporter, I'm getting 469 00:28:47,760 --> 00:28:49,360 Speaker 1: it right.
I work for The New York Times. Or 470 00:28:49,400 --> 00:28:52,640 Speaker 1: sometimes do you say, well, shit, do I really have it right? 471 00:28:55,160 --> 00:28:58,480 Speaker 2: I think about that every day. And you know, 472 00:28:58,480 --> 00:29:01,360 Speaker 2: as a reporter, the worst thing that can happen is 473 00:29:01,400 --> 00:29:03,360 Speaker 2: for you to get something wrong, especially in such a 474 00:29:03,400 --> 00:29:08,320 Speaker 2: key story, where the people that you're reporting on have 475 00:29:08,440 --> 00:29:13,280 Speaker 2: unlimited resources to challenge you or to possibly sue you. 476 00:29:13,720 --> 00:29:16,560 Speaker 2: You know, one of the biggest scoops in my 477 00:29:16,640 --> 00:29:19,760 Speaker 2: career was finding out that Peter Thiel was funding the 478 00:29:19,760 --> 00:29:24,280 Speaker 2: Gawker case, the whole Hulk Hogan Gawker case that took out Gawker. 479 00:29:24,360 --> 00:29:27,760 Speaker 2: You know, I broke that story, and I 480 00:29:27,800 --> 00:29:30,440 Speaker 2: was a cub reporter, a younger reporter, at Forbes, 481 00:29:30,480 --> 00:29:33,400 Speaker 2: and before we published, you know, there was a doubt 482 00:29:33,600 --> 00:29:35,680 Speaker 2: in my mind that flashed. You know, I was so 483 00:29:35,680 --> 00:29:38,800 Speaker 2: solid on the reporting, I'd spent months doing it, but 484 00:29:39,280 --> 00:29:41,600 Speaker 2: I was like, you know, what if that point zero zero 485 00:29:41,600 --> 00:29:44,000 Speaker 2: one percent chance is right? You know, what if I've 486 00:29:44,000 --> 00:29:48,280 Speaker 2: been duped the whole time into reporting this out? What happens, 487 00:29:48,320 --> 00:29:52,040 Speaker 2: you know? And that feeling never goes away.
488 00:29:52,080 --> 00:29:54,360 Speaker 2: And I think being paranoid in some ways as a 489 00:29:54,360 --> 00:29:57,480 Speaker 2: reporter helps you, because that will lead you to make 490 00:29:57,520 --> 00:29:59,640 Speaker 2: the next call, right? You'll be like, actually, you know, 491 00:29:59,680 --> 00:30:01,920 Speaker 2: I have sources, maybe I'll go for six or seven, 492 00:30:02,360 --> 00:30:05,120 Speaker 2: or maybe I'll make that extra call, send that extra email. 493 00:30:07,560 --> 00:30:09,320 Speaker 2: You know, it's helped in my career that I've been 494 00:30:09,360 --> 00:30:14,120 Speaker 2: supported by major publications that have great resources, in Forbes 495 00:30:14,120 --> 00:30:19,040 Speaker 2: and The New York Times. But, you know, that 496 00:30:19,160 --> 00:30:21,640 Speaker 2: fear stays with you. And I don't know, I could 497 00:30:21,680 --> 00:30:24,960 Speaker 2: be eighty years old and, you know, have worked at 498 00:30:24,960 --> 00:30:29,200 Speaker 2: The New York Times for fifty years; I'd be worried if 499 00:30:29,200 --> 00:30:31,440 Speaker 2: that fear wasn't part of me still, you know, because 500 00:30:31,480 --> 00:30:34,600 Speaker 2: you always want to have a little bit of paranoia, 501 00:30:34,640 --> 00:30:39,320 Speaker 2: I guess. And yeah, when it's someone like Elon, 502 00:30:40,520 --> 00:30:43,360 Speaker 2: you gotta do everything by the book. 503 00:30:43,520 --> 00:30:46,720 Speaker 2: You gotta be so solid on your reporting.
504 00:30:47,560 --> 00:30:54,680 Speaker 2: And I've always led with that. You know, 505 00:30:54,680 --> 00:30:56,960 Speaker 2: I've always relied on the reporting and, you know, 506 00:30:57,000 --> 00:31:02,280 Speaker 2: what my sources are saying and everything like that. But 507 00:31:02,360 --> 00:31:04,240 Speaker 2: you know, if someone tries to intimidate us at the 508 00:31:04,280 --> 00:31:07,240 Speaker 2: same time, I'm not going to back down. You know, 509 00:31:07,280 --> 00:31:09,240 Speaker 2: if we know something is right, we're gonna 510 00:31:09,280 --> 00:31:13,320 Speaker 2: go with it. And it's kind of that weird balance. 511 00:31:21,120 --> 00:31:25,760 Speaker 1: Okay, it's the summer of twenty twenty two. Elon's 512 00:31:26,240 --> 00:31:29,800 Speaker 1: trying to wiggle out. We know there's gonna be a 513 00:31:29,960 --> 00:31:34,320 Speaker 1: case in the Delaware court. You're at The New York Times. 514 00:31:35,480 --> 00:31:38,160 Speaker 1: I agree with you one hundred percent: The New York Times 515 00:31:38,240 --> 00:31:41,080 Speaker 1: has boots on the ground everywhere, whereas with all these other 516 00:31:41,120 --> 00:31:46,160 Speaker 1: people, it's just opinion at the source. Do you think, hey, 517 00:31:46,160 --> 00:31:48,320 Speaker 1: this is going to close, based on the history of 518 00:31:48,320 --> 00:31:51,560 Speaker 1: the Delaware court? And to what degree are you penetrating 519 00:31:51,600 --> 00:31:55,200 Speaker 1: the company in terms of sources, such that when the 520 00:31:55,240 --> 00:31:57,600 Speaker 1: case goes down, you're prepared? 521 00:31:59,160 --> 00:32:01,760 Speaker 2: You're doing everything. You're calling the people on the deal. 522 00:32:01,800 --> 00:32:07,320 Speaker 2: You're calling the lawyers, the bankers, the PR people, you know, 523 00:32:07,440 --> 00:32:09,160 Speaker 2: people who you think might be in the room.
524 00:32:09,200 --> 00:32:12,200 Speaker 2: You know, these are very high-level conversations too. So 525 00:32:12,400 --> 00:32:16,720 Speaker 2: when you're talking about sourcing, you're talking about C-suite level. 526 00:32:16,880 --> 00:32:18,800 Speaker 2: You know, those are the people that are having conversations. 527 00:32:19,720 --> 00:32:22,280 Speaker 2: And what we realized is that on the Twitter side 528 00:32:22,280 --> 00:32:25,160 Speaker 2: of things, they had no idea what to expect from Elon. 529 00:32:25,440 --> 00:32:28,920 Speaker 2: You know, they had some meetings about what it would be like. 530 00:32:29,480 --> 00:32:31,400 Speaker 2: Bear in mind, as he was backing out, there were 531 00:32:31,480 --> 00:32:34,160 Speaker 2: still some meetings that were going on between him and 532 00:32:34,200 --> 00:32:38,560 Speaker 2: the company. There was that 533 00:32:38,600 --> 00:32:40,840 Speaker 2: bot argument, for example, and they were trying 534 00:32:40,840 --> 00:32:47,000 Speaker 2: to assuage his concerns about bots, because the company was 535 00:32:47,000 --> 00:32:48,680 Speaker 2: bound by its fiduciary duty to make this 536 00:32:48,720 --> 00:32:50,760 Speaker 2: deal happen, right, which is why they sued him. 537 00:32:52,320 --> 00:32:56,640 Speaker 2: But, you know, gosh, that was a crazy period. 538 00:32:57,920 --> 00:33:00,640 Speaker 2: Kate was ready to go to Delaware and spend 539 00:33:00,680 --> 00:33:03,360 Speaker 2: like weeks there. I was having to read up on 540 00:33:03,440 --> 00:33:06,160 Speaker 2: chancery court, which I knew very little about until that 541 00:33:06,280 --> 00:33:10,640 Speaker 2: moment.
But that's kind of the 542 00:33:10,680 --> 00:33:14,200 Speaker 2: beauty of reporting, right? You're thrown into the deep 543 00:33:14,320 --> 00:33:17,160 Speaker 2: end of these weird situations you might have never been 544 00:33:17,160 --> 00:33:19,600 Speaker 2: in before. You might have never thought about the court 545 00:33:19,600 --> 00:33:24,080 Speaker 2: of chancery, and suddenly you're making calls to trial lawyers 546 00:33:24,080 --> 00:33:29,000 Speaker 2: who have, you know, practiced there or represented other companies. 547 00:33:29,000 --> 00:33:33,240 Speaker 2: You're trying to find comparisons to maybe legal precedents that 548 00:33:33,280 --> 00:33:36,280 Speaker 2: played out similar to this, where, you know, the 549 00:33:36,320 --> 00:33:38,720 Speaker 2: court has compelled someone to buy a company like that. 550 00:33:39,840 --> 00:33:41,160 Speaker 2: What did you think? Did you 551 00:33:41,160 --> 00:33:42,360 Speaker 2: think it was going to happen at that time? 552 00:33:42,400 --> 00:33:45,280 Speaker 1: Absolutely, because of the Delaware court. It blew my mind, 553 00:33:45,320 --> 00:33:47,800 Speaker 1: all these people. You know, I've got a bug up 554 00:33:47,840 --> 00:33:50,640 Speaker 1: my ass about Kara Swisher anyway. Oh no, no, no, 555 00:33:50,640 --> 00:33:53,040 Speaker 1: no, no, there's law involved, I happen to be a 556 00:33:53,120 --> 00:33:55,640 Speaker 1: lawyer or whatever, and I know why 557 00:33:55,720 --> 00:33:58,800 Speaker 1: all this stuff is in Delaware. But let's go back 558 00:33:58,800 --> 00:34:02,960 Speaker 1: to your story. Will everybody talk 559 00:34:03,000 --> 00:34:05,680 Speaker 1: to you? And to what degree is the fact that 560 00:34:05,720 --> 00:34:08,040 Speaker 1: you're writing for The New York Times to your advantage 561 00:34:08,080 --> 00:34:09,040 Speaker 1: or disadvantage?
562 00:34:12,080 --> 00:34:14,359 Speaker 2: It goes both ways, actually, and I've been on both 563 00:34:14,400 --> 00:34:16,319 Speaker 2: sides of it. When I was at BuzzFeed, I used 564 00:34:16,320 --> 00:34:18,120 Speaker 2: to think, you know, those reporters at The New York Times, 565 00:34:18,160 --> 00:34:20,319 Speaker 2: they must get everything handed to them on a silver 566 00:34:20,400 --> 00:34:23,319 Speaker 2: platter because they're The New York Times. Who wouldn't want 567 00:34:23,360 --> 00:34:25,960 Speaker 2: to talk to them? And now, being on this 568 00:34:26,000 --> 00:34:29,640 Speaker 2: side of things, there are instances where people are like, sorry, 569 00:34:29,640 --> 00:34:31,760 Speaker 2: you guys are too big. I'm worried about my reputation. 570 00:34:31,800 --> 00:34:33,239 Speaker 2: I want this information to be out there, but 571 00:34:33,280 --> 00:34:36,040 Speaker 2: in a smaller setting. And so it's kind of funny 572 00:34:36,360 --> 00:34:42,080 Speaker 2: seeing that play out in a way. That being said, 573 00:34:42,080 --> 00:34:45,880 Speaker 2: when you're at The New York Times, you know, it 574 00:34:46,000 --> 00:34:48,960 Speaker 2: carries the gravitas, it has the name recognition. 575 00:34:49,239 --> 00:34:52,400 Speaker 2: When you call someone, they're not asking you to spell 576 00:34:52,440 --> 00:34:55,280 Speaker 2: it out: you know, what does that mean? BuzzFeed? 577 00:34:55,360 --> 00:34:57,200 Speaker 2: Can you spell that for me? When you're at 578 00:34:57,239 --> 00:34:59,080 Speaker 2: The New York Times, they're like, oh, okay, I get it. 579 00:34:59,160 --> 00:35:03,799 Speaker 2: You know where you're coming from.
But in 580 00:35:03,840 --> 00:35:07,360 Speaker 2: those moments, you kind of 581 00:35:07,400 --> 00:35:09,600 Speaker 2: have to be lucky, and sometimes people don't want to 582 00:35:09,600 --> 00:35:11,520 Speaker 2: talk to you at all. Bear in mind, you know, 583 00:35:11,600 --> 00:35:15,319 Speaker 2: the people around the Twitter deal, especially on the 584 00:35:15,320 --> 00:35:19,279 Speaker 2: Twitter side, are scared, scared shitless. You know, 585 00:35:19,320 --> 00:35:20,600 Speaker 2: they don't want to ruin the deal. They don't want 586 00:35:20,600 --> 00:35:24,239 Speaker 2: to say anything to a reporter that could scupper this 587 00:35:24,320 --> 00:35:29,560 Speaker 2: monumental deal, this payout for shareholders and whatever. But in 588 00:35:29,600 --> 00:35:32,840 Speaker 2: writing a book, what I found is that, oftentimes, after 589 00:35:32,920 --> 00:35:36,480 Speaker 2: the fact, if you say, you know, actually my book 590 00:35:36,520 --> 00:35:37,799 Speaker 2: is going to come out in a year or two 591 00:35:37,840 --> 00:35:41,439 Speaker 2: years' time, people might be open to talking 592 00:35:41,440 --> 00:35:43,120 Speaker 2: to you for that, because it's going to be past 593 00:35:43,280 --> 00:35:47,160 Speaker 2: the period where it's sensitive for them. You know, this 594 00:35:47,200 --> 00:35:48,960 Speaker 2: is something that is for history, in a way. It's 595 00:35:49,000 --> 00:35:52,120 Speaker 2: not just a daily story, and so we used that 596 00:35:52,120 --> 00:35:54,880 Speaker 2: to our advantage for the book.
You know, some people 597 00:35:54,920 --> 00:35:57,320 Speaker 2: that didn't talk to us for those stories in 598 00:35:57,360 --> 00:36:01,080 Speaker 2: The New York Times came back to us towards 599 00:36:01,080 --> 00:36:02,640 Speaker 2: the very end of the process, and they were like, actually, 600 00:36:02,680 --> 00:36:04,440 Speaker 2: I want to make this one thing very clear, 601 00:36:04,520 --> 00:36:07,640 Speaker 2: just that specific thing. But it ended up being very helpful. 602 00:36:09,760 --> 00:36:13,600 Speaker 2: So you get all types. I'll say, in the moment, 603 00:36:13,640 --> 00:36:17,040 Speaker 2: it's quite hard to get people to speak immediately. But 604 00:36:17,080 --> 00:36:18,960 Speaker 2: if you give them a little time, if you work 605 00:36:18,960 --> 00:36:21,560 Speaker 2: on them a little bit, you keep reminding them, there's 606 00:36:21,560 --> 00:36:25,440 Speaker 2: always a chance. And I've learned that perseverance 607 00:36:25,560 --> 00:36:28,319 Speaker 2: is, you know, one of my most 608 00:36:28,320 --> 00:36:29,520 Speaker 2: important traits as a reporter. 609 00:36:30,600 --> 00:36:32,800 Speaker 1: Okay, I've dealt with the press a lot calling me 610 00:36:32,960 --> 00:36:36,480 Speaker 1: for stuff, like being on television news. The 611 00:36:36,520 --> 00:36:38,640 Speaker 1: classic story of the Today Show: they provide you a 612 00:36:38,680 --> 00:36:43,560 Speaker 1: limo there, but not back. You're just fodder, you're just 613 00:36:43,680 --> 00:36:46,280 Speaker 1: hit and run. That's happened to me a lot of times. 614 00:36:46,280 --> 00:36:48,839 Speaker 1: You know, with music, 615 00:36:49,520 --> 00:36:51,520 Speaker 1: a lot of these papers have already fired all the 616 00:36:51,600 --> 00:36:53,839 Speaker 1: music people.
A lot of times 617 00:36:53,880 --> 00:36:56,960 Speaker 1: they'll put some junior person who's ignorant on the issue. 618 00:36:57,280 --> 00:37:01,040 Speaker 1: But if you get a reporter from a publication, a 619 00:37:01,080 --> 00:37:04,839 Speaker 1: lot of times they're aggressive. And most people who deal 620 00:37:04,840 --> 00:37:07,160 Speaker 1: with the press deal with the press infrequently. What 621 00:37:07,320 --> 00:37:08,440 Speaker 1: is your style? 622 00:37:11,920 --> 00:37:14,640 Speaker 2: I tend not to be aggressive, largely because I am 623 00:37:14,719 --> 00:37:20,160 Speaker 2: not a beat reporter. I don't have to file 624 00:37:20,280 --> 00:37:23,880 Speaker 2: tomorrow about, you know, the news of the day that 625 00:37:24,000 --> 00:37:26,799 Speaker 2: happened at Facebook or Twitter or X 626 00:37:26,800 --> 00:37:30,960 Speaker 2: or Google. I have time to build relationships, 627 00:37:30,960 --> 00:37:34,560 Speaker 2: and any good reporter is doing that. And everyone has 628 00:37:34,560 --> 00:37:36,840 Speaker 2: their own style, but I've never found the kind of, 629 00:37:37,880 --> 00:37:40,399 Speaker 2: you know, give me the information before everyone else. 630 00:37:41,360 --> 00:37:44,719 Speaker 2: Some people use kind of a bullying style, you know: 631 00:37:44,800 --> 00:37:46,839 Speaker 2: why did you talk to The Washington Post or The 632 00:37:46,920 --> 00:37:49,839 Speaker 2: Wall Street Journal and not to me? Next time I want 633 00:37:49,880 --> 00:37:52,200 Speaker 2: you to come to me first. I've never found that 634 00:37:52,239 --> 00:37:55,960 Speaker 2: works for me. For some people it works great, and it 635 00:37:56,000 --> 00:38:01,080 Speaker 2: makes them really dogged reporters. My style is to talk 636 00:38:01,080 --> 00:38:05,240 Speaker 2: to people nonstop.
You know, I think the worst 637 00:38:05,280 --> 00:38:08,680 Speaker 2: thing is for someone to 638 00:38:08,719 --> 00:38:13,120 Speaker 2: feel used, in a way. And I don't know, maybe 639 00:38:13,160 --> 00:38:15,960 Speaker 2: social engineering is the wrong word here, 640 00:38:16,040 --> 00:38:20,040 Speaker 2: something lighter than that. But I want to make people 641 00:38:20,040 --> 00:38:22,399 Speaker 2: feel like they can talk to me about anything, and 642 00:38:22,520 --> 00:38:26,120 Speaker 2: so, you know, it's not just me checking in for 643 00:38:26,200 --> 00:38:28,000 Speaker 2: information when I need it. You know, I'm talking to 644 00:38:28,040 --> 00:38:31,040 Speaker 2: them about their holidays: you know, what did you 645 00:38:31,040 --> 00:38:34,280 Speaker 2: do for Thanksgiving? How was your Christmas break? 646 00:38:34,840 --> 00:38:37,480 Speaker 2: How is your soccer team doing? I'm building 647 00:38:37,480 --> 00:38:41,560 Speaker 2: those relationships. I'm meeting people for drinks, to the point 648 00:38:41,560 --> 00:38:46,239 Speaker 2: where, when it comes to talking about something that is 649 00:38:46,280 --> 00:38:48,560 Speaker 2: relevant to my reporting, it feels natural for them to 650 00:38:48,880 --> 00:38:52,480 Speaker 2: talk to me about it. And that isn't like a 651 00:38:52,640 --> 00:38:56,120 Speaker 2: unique insight. A banker could do that, 652 00:38:56,239 --> 00:38:58,520 Speaker 2: or a lawyer. It's just relationship building 653 00:38:58,520 --> 00:39:01,879 Speaker 2: and people skills. And it took me a long time 654 00:39:01,920 --> 00:39:04,440 Speaker 2: to realize that. You know, I wasn't the biggest chatterbox 655 00:39:04,480 --> 00:39:07,480 Speaker 2: in college, and I still am not.
But like, you 656 00:39:07,560 --> 00:39:10,520 Speaker 2: don't ever want to make people feel like you're just 657 00:39:10,560 --> 00:39:13,560 Speaker 2: pumping them for information and then they'll never 658 00:39:13,560 --> 00:39:16,719 Speaker 2: see you again. That just 659 00:39:16,719 --> 00:39:19,440 Speaker 2: feels weird and not human to me anyway. So I 660 00:39:19,440 --> 00:39:22,359 Speaker 2: guess, yeah, playing the human card is actually 661 00:39:22,400 --> 00:39:23,200 Speaker 2: pretty important to me. 662 00:39:24,480 --> 00:39:27,160 Speaker 1: Okay, let's talk about the deal now. The first thing 663 00:39:27,200 --> 00:39:30,880 Speaker 1: that is striking in the book, that comes back to 664 00:39:30,960 --> 00:39:35,960 Speaker 1: haunt Elon, is he makes a deal with no due diligence. 665 00:39:36,360 --> 00:39:37,359 Speaker 1: Tell us about that. 666 00:39:39,120 --> 00:39:41,160 Speaker 2: It's one of the crazier things 667 00:39:41,160 --> 00:39:45,759 Speaker 2: about this whole thing. You know, these mergers or takeovers 668 00:39:45,800 --> 00:39:52,320 Speaker 2: take months, sometimes years, to close, and that's because bankers 669 00:39:52,360 --> 00:39:56,600 Speaker 2: and lawyers are haggling over every small detail. They're running 670 00:39:56,600 --> 00:40:00,239 Speaker 2: their fine-tooth combs through every legal filing and financial 671 00:40:00,320 --> 00:40:05,680 Speaker 2: filing and making sure everything is up to standard. And 672 00:40:05,719 --> 00:40:09,800 Speaker 2: in this case, Elon wanted to bulldoze a deal through 673 00:40:10,000 --> 00:40:13,240 Speaker 2: in days, as fast as he could, right after 674 00:40:13,360 --> 00:40:17,200 Speaker 2: his offer letter. Remember, he was on the board. 675 00:40:18,080 --> 00:40:20,319 Speaker 2: He had bought this chunk of the company.
He 676 00:40:20,360 --> 00:40:22,279 Speaker 2: was on the board because Twitter brought him in to 677 00:40:22,360 --> 00:40:26,120 Speaker 2: be closer to them. He ended up getting 678 00:40:26,120 --> 00:40:28,239 Speaker 2: into a fight with the CEO and offering to buy 679 00:40:28,239 --> 00:40:31,719 Speaker 2: the company kind of in a huff, right. And right 680 00:40:31,760 --> 00:40:35,040 Speaker 2: after that, he wants the deal to close immediately, so 681 00:40:35,200 --> 00:40:38,280 Speaker 2: much so that he tells his bankers and his lawyers, 682 00:40:38,760 --> 00:40:41,040 Speaker 2: you know, I don't need to 683 00:40:41,080 --> 00:40:43,680 Speaker 2: look at the books here, I don't need any 684 00:40:43,680 --> 00:40:46,680 Speaker 2: non-public information, I don't need to sign an NDA to 685 00:40:46,719 --> 00:40:49,720 Speaker 2: get that non-public information. I just want the company, 686 00:40:49,960 --> 00:40:51,919 Speaker 2: and you're going to force that deal through 687 00:40:52,000 --> 00:40:54,719 Speaker 2: for me so I have that company by the end 688 00:40:54,760 --> 00:40:57,400 Speaker 2: of the month. And that's what he did. There was 689 00:40:57,520 --> 00:41:02,120 Speaker 2: no diligence. You remember when he steps on stage at 690 00:41:02,120 --> 00:41:04,680 Speaker 2: the TED conference, I think it was in Vancouver, 691 00:41:05,360 --> 00:41:08,759 Speaker 2: the day after his offer letter comes out, he 692 00:41:08,840 --> 00:41:11,279 Speaker 2: makes all these grand proclamations. He says 693 00:41:11,280 --> 00:41:13,040 Speaker 2: he's going to get rid of the bots, he's going 694 00:41:13,080 --> 00:41:19,120 Speaker 2: to save free speech, and that was said without any 695 00:41:19,400 --> 00:41:24,480 Speaker 2: understanding of the business.
So yeah, there 696 00:41:24,520 --> 00:41:27,480 Speaker 2: was no looking under the hood, effectively, and that was 697 00:41:27,520 --> 00:41:30,360 Speaker 2: one of the major quirks of this deal that caused 698 00:41:30,440 --> 00:41:33,920 Speaker 2: him a lot of problems down the line, basically immediately, 699 00:41:33,920 --> 00:41:37,360 Speaker 2: as soon as the merger agreement is signed. 700 00:41:39,719 --> 00:41:43,920 Speaker 1: Is that evidence of how Elon works? He's impulsive: 701 00:41:44,520 --> 00:41:47,880 Speaker 1: let me move forward, worry about the repercussions after. 702 00:41:49,880 --> 00:41:53,799 Speaker 2: Totally. I mean, I'll reference something he said about the 703 00:41:53,840 --> 00:41:57,400 Speaker 2: Department of Government Efficiency. He said something to the effect of, 704 00:41:57,480 --> 00:41:59,839 Speaker 2: you know, if something we cut goes wrong, 705 00:42:00,120 --> 00:42:02,080 Speaker 2: just put it back. It's not like, I'm 706 00:42:02,080 --> 00:42:04,560 Speaker 2: going to learn about this thing and what it does. I'm 707 00:42:04,560 --> 00:42:08,600 Speaker 2: going to cut it, see if something breaks, if some people suffer, 708 00:42:09,239 --> 00:42:11,160 Speaker 2: and if it's enough pain, then I'll just put it back. 709 00:42:11,880 --> 00:42:15,200 Speaker 2: And this kind of impulsiveness is how he's operated through 710 00:42:15,200 --> 00:42:21,200 Speaker 2: his whole career. It's very gut instinct, it's very primal 711 00:42:21,239 --> 00:42:24,360 Speaker 2: in some ways, and it's served him well. You know, 712 00:42:24,520 --> 00:42:27,440 Speaker 2: he's going to look back on his successes, his net worth, 713 00:42:27,560 --> 00:42:30,440 Speaker 2: you know, he's worth more than three hundred billion dollars, 714 00:42:30,480 --> 00:42:33,480 Speaker 2: he has two major companies.
He's gonna be like, 715 00:42:33,760 --> 00:42:36,759 Speaker 2: this has served me so well throughout my life, 716 00:42:36,800 --> 00:42:40,560 Speaker 2: why shouldn't I do things this way? And in 717 00:42:40,600 --> 00:42:42,759 Speaker 2: this case, in the Twitter case, it obviously came back to 718 00:42:42,760 --> 00:42:47,120 Speaker 2: bite him, because he doesn't do the due diligence. He 719 00:42:47,160 --> 00:42:49,919 Speaker 2: signs the merger agreement and then he starts to get 720 00:42:49,920 --> 00:42:52,759 Speaker 2: really cold feet and tries to find ways to back out. 721 00:42:52,840 --> 00:42:55,880 Speaker 2: He starts to name issues around bots, for example, 722 00:42:58,600 --> 00:43:03,680 Speaker 2: and that's kind of the source of his pain now with this deal. 723 00:43:03,760 --> 00:43:08,280 Speaker 1: Okay, since you cover Elon in general, 724 00:43:08,360 --> 00:43:11,120 Speaker 1: let's go to Tesla for a second. So we have 725 00:43:11,200 --> 00:43:14,560 Speaker 1: the Tesla self-driving. They're the first movers in that area. 726 00:43:15,040 --> 00:43:18,680 Speaker 1: He gets rid of lidar, relying solely on cameras. 727 00:43:19,400 --> 00:43:25,279 Speaker 1: The competitors go to lidar. Okay. Meanwhile, the government is 728 00:43:25,320 --> 00:43:30,080 Speaker 1: investigating crashes. Is he doubling down based on his 729 00:43:30,239 --> 00:43:34,080 Speaker 1: personality, saying fuck you? Do you think he really 730 00:43:34,160 --> 00:43:38,040 Speaker 1: evaluated whether cameras are enough, or has he convinced himself 731 00:43:38,120 --> 00:43:38,959 Speaker 1: cameras are enough? 732 00:43:39,800 --> 00:43:42,080 Speaker 2: I think in these beliefs he's convinced himself. I think 733 00:43:42,120 --> 00:43:44,759 Speaker 2: it's very key to understanding him: he's not just saying things, right, 734 00:43:44,800 --> 00:43:48,799 Speaker 2: he fundamentally believes them.
And the thing with lidar is 735 00:43:48,840 --> 00:43:52,160 Speaker 2: that he believes that cars should see like humans do, 736 00:43:52,520 --> 00:43:55,359 Speaker 2: and cameras will be enough, and cameras are 737 00:43:55,400 --> 00:43:58,600 Speaker 2: much cheaper than lidar. That being said, 738 00:43:58,760 --> 00:44:04,040 Speaker 2: look at the progress of where Tesla is relative to Waymo, 739 00:44:04,200 --> 00:44:06,560 Speaker 2: for example. Waymo actually has cars on the streets that 740 00:44:06,600 --> 00:44:10,160 Speaker 2: are taxiing people around with no drivers, and Tesla is 741 00:44:10,200 --> 00:44:14,640 Speaker 2: still kind of making these half-promises about full self-driving. 742 00:44:14,719 --> 00:44:16,799 Speaker 2: Oh, actually it's not full self-driving, the driver has to 743 00:44:16,880 --> 00:44:21,480 Speaker 2: stay engaged. And he's kind of jerked 744 00:44:21,480 --> 00:44:26,840 Speaker 2: people around on the meaning of Autopilot. You know, he 745 00:44:26,920 --> 00:44:30,560 Speaker 2: had that Robotaxi event a couple months ago. I'm sure 746 00:44:30,600 --> 00:44:33,520 Speaker 2: you saw it, right? It's 747 00:44:33,560 --> 00:44:35,920 Speaker 2: a staged event. You're literally on a movie 748 00:44:35,960 --> 00:44:39,840 Speaker 2: set in Burbank, driving people around these pretend cities in 749 00:44:39,880 --> 00:44:44,840 Speaker 2: these supposedly autonomous vehicles. And he's selling that to people 750 00:44:44,880 --> 00:44:48,160 Speaker 2: as if it's the next revolution, when if you 751 00:44:48,280 --> 00:44:52,160 Speaker 2: walk over to Hollywood you can find a Waymo 752 00:44:52,239 --> 00:44:56,840 Speaker 2: going up and down the street with no driver.
So yeah, 753 00:44:56,920 --> 00:45:00,600 Speaker 2: I mean, I think the one thing to understand about 754 00:45:00,640 --> 00:45:03,279 Speaker 2: Elon is that one of his superpowers, if not his 755 00:45:03,320 --> 00:45:07,560 Speaker 2: most important superpower, is his ability as a salesman, 756 00:45:07,680 --> 00:45:10,000 Speaker 2: his ability to sell you on a mission and a vision. 757 00:45:10,760 --> 00:45:14,360 Speaker 2: And people love that. You know, we 758 00:45:14,440 --> 00:45:18,480 Speaker 2: can debate whether or not he hits 759 00:45:18,480 --> 00:45:23,120 Speaker 2: his goals or his projections. He said, 760 00:45:23,200 --> 00:45:25,600 Speaker 2: for example, we'll get to Mars by twenty twenty five. 761 00:45:26,400 --> 00:45:30,680 Speaker 2: I don't think that's going to happen. But 762 00:45:31,800 --> 00:45:34,160 Speaker 2: you look at this Robotaxi event, and me as a reporter, 763 00:45:34,200 --> 00:45:36,560 Speaker 2: I'm very skeptical of this. But you 764 00:45:36,640 --> 00:45:39,880 Speaker 2: have people that are screaming and cheering and posting online. 765 00:45:40,239 --> 00:45:42,319 Speaker 2: They're loving the robots that are serving the 766 00:45:42,360 --> 00:45:45,120 Speaker 2: beers and everything, and you're like, wow, he really has 767 00:45:45,120 --> 00:45:48,960 Speaker 2: sold this to people. People really believe him on this stuff. 768 00:45:49,120 --> 00:45:54,680 Speaker 2: And I don't know. It's 769 00:45:54,719 --> 00:45:56,320 Speaker 2: an interesting quality that he has. 770 00:45:56,440 --> 00:46:09,080 Speaker 1: So, okay. You also say that he has these advisors 771 00:46:09,200 --> 00:46:12,319 Speaker 1: that he trusts in the deal. They really have no 772 00:46:12,480 --> 00:46:17,439 Speaker 1: portfolio, no expertise in this area whatsoever.
773 00:46:18,960 --> 00:46:22,680 Speaker 2: Aside from having large Twitter followings, yeah. 774 00:46:22,719 --> 00:46:24,879 Speaker 2: He's in the room with some of these venture capitalists, 775 00:46:25,239 --> 00:46:27,920 Speaker 2: people that have had good careers in technology, have built 776 00:46:28,320 --> 00:46:31,680 Speaker 2: massive fortunes investing. But in terms of running a social 777 00:46:31,680 --> 00:46:34,759 Speaker 2: media company at the scale of Twitter, which few 778 00:46:34,760 --> 00:46:40,440 Speaker 2: companies are at, you know, there's Facebook, there's Instagram, Reddit, 779 00:46:41,160 --> 00:46:44,040 Speaker 2: these guys don't have an understanding of this, and they 780 00:46:44,520 --> 00:46:48,080 Speaker 2: again view this as a technical problem as opposed to 781 00:46:48,080 --> 00:46:52,840 Speaker 2: a people problem. And so he's taking advice from people 782 00:46:53,680 --> 00:46:55,600 Speaker 2: of his same background, of the same ilk, 783 00:46:55,680 --> 00:47:01,000 Speaker 2: like David Sacks, Jason Calacanis, these investors that have 784 00:47:01,040 --> 00:47:03,040 Speaker 2: put money into this deal. He at one point takes 785 00:47:03,080 --> 00:47:08,000 Speaker 2: advice from his own biographer, Walter Isaacson. And 786 00:47:08,160 --> 00:47:11,600 Speaker 2: also in that process he decides to fire people that have 787 00:47:11,920 --> 00:47:14,560 Speaker 2: worked at the company for years, studying this stuff and 788 00:47:14,600 --> 00:47:19,400 Speaker 2: working on it, largely because he believes they're inept or 789 00:47:19,440 --> 00:47:25,080 Speaker 2: they're too woke or whatever. And that's, I think, the hubris.
790 00:47:25,600 --> 00:47:28,040 Speaker 2: Again, we talk about these kinds of gut decisions that 791 00:47:28,080 --> 00:47:32,560 Speaker 2: he makes, but that is what has hampered him throughout 792 00:47:32,560 --> 00:47:36,080 Speaker 2: the deal and throughout the takeover, to the point where 793 00:47:36,080 --> 00:47:39,600 Speaker 2: the company's valuation has crashed, its 794 00:47:39,640 --> 00:47:42,120 Speaker 2: advertising has crashed, the user numbers have crashed. 795 00:47:43,840 --> 00:47:47,800 Speaker 1: Yeah, but even his lawyer, even the people 796 00:47:47,880 --> 00:47:50,840 Speaker 1: he used, he's not using blue-chip people 797 00:47:51,080 --> 00:47:52,160 Speaker 1: even to make the deal. 798 00:47:54,000 --> 00:47:58,000 Speaker 2: Yeah. Alex Spiro, who, I'm sure you've 799 00:47:58,040 --> 00:48:00,879 Speaker 2: come across, is a big celebrity lawyer. 800 00:48:00,880 --> 00:48:05,640 Speaker 2: He represented Megan Thee Stallion and Robert Kraft. He's 801 00:48:05,719 --> 00:48:09,200 Speaker 2: Jay-Z's lawyer, and he actually becomes Elon's lawyer in the 802 00:48:09,239 --> 00:48:12,080 Speaker 2: "pedo guy" case, where he defends him successfully against defamation claims. 803 00:48:13,800 --> 00:48:16,240 Speaker 2: He ends up being one of the lead lawyers 804 00:48:16,280 --> 00:48:20,319 Speaker 2: on this deal, and also someone who ends up 805 00:48:21,560 --> 00:48:24,160 Speaker 2: somewhat leading the legal department in the early days of Twitter. 806 00:48:24,960 --> 00:48:27,239 Speaker 2: You know, this is a trial lawyer. 807 00:48:27,680 --> 00:48:31,319 Speaker 2: He may be wildly successful, but does he have the 808 00:48:31,360 --> 00:48:35,040 Speaker 2: expertise to understand what it means when a consent 809 00:48:35,120 --> 00:48:38,680 Speaker 2: decree comes in from the FTC?
Does he understand what 810 00:48:39,160 --> 00:48:43,279 Speaker 2: happens when the EU demands information? Does he have 811 00:48:43,320 --> 00:48:48,080 Speaker 2: the international law expertise to grasp that? I 812 00:48:48,080 --> 00:48:50,920 Speaker 2: don't think so. And Elon fired a lot 813 00:48:50,960 --> 00:48:52,959 Speaker 2: of the people, or encouraged them to leave, 814 00:48:53,719 --> 00:48:55,880 Speaker 2: folks that had specialized in these areas and 815 00:48:55,920 --> 00:49:00,440 Speaker 2: worked in them for their whole careers. And again, that's 816 00:49:00,480 --> 00:49:02,160 Speaker 2: just one of the wilder things of this, 817 00:49:03,280 --> 00:49:06,799 Speaker 2: of this whole thing. There was really no transition in 818 00:49:06,880 --> 00:49:10,160 Speaker 2: place to understand how Twitter operated when he took over 819 00:49:10,200 --> 00:49:12,799 Speaker 2: the company. There were a handful of meetings, but there 820 00:49:12,880 --> 00:49:16,799 Speaker 2: wasn't a transition plan. There wasn't a, okay, these 821 00:49:16,880 --> 00:49:19,319 Speaker 2: people work over here, I understand that this is 822 00:49:19,320 --> 00:49:22,160 Speaker 2: how the advertising business works, let me understand that. That 823 00:49:22,200 --> 00:49:24,759 Speaker 2: came after he took over the company and after he 824 00:49:24,840 --> 00:49:31,520 Speaker 2: laid off thousands of people, and that has 825 00:49:31,560 --> 00:49:32,920 Speaker 2: fundamentally damaged the company. 826 00:49:34,000 --> 00:49:37,160 Speaker 1: Okay, since you mentioned it, let's stay with the FTC 827 00:49:37,280 --> 00:49:41,479 Speaker 1: for one second. There was the lawsuit where Elon said 828 00:49:41,520 --> 00:49:46,640 Speaker 1: he was going to take the company private. This is Tesla. Okay, 829 00:49:46,800 --> 00:49:49,160 Speaker 1: the government sued him.
He made an agreement that he's 830 00:49:49,200 --> 00:49:53,319 Speaker 1: been bitching about ever since, trying to overturn. You make 831 00:49:53,360 --> 00:49:55,520 Speaker 1: a point that there are certain deals with the FTC: 832 00:49:55,840 --> 00:50:00,480 Speaker 1: the FTC keeps asking for compliance and they don't comply. 833 00:50:01,560 --> 00:50:04,360 Speaker 1: Does this basically say, if you're one of these guys, 834 00:50:04,400 --> 00:50:07,200 Speaker 1: if you have these balls, the rules really don't apply? 835 00:50:09,040 --> 00:50:14,160 Speaker 2: So, yeah, the funding-secured 836 00:50:14,200 --> 00:50:16,400 Speaker 2: stuff, the Tesla take-private, that 837 00:50:16,440 --> 00:50:19,640 Speaker 2: was the Securities and Exchange Commission, the SEC. But 838 00:50:19,640 --> 00:50:24,960 Speaker 2: that case, I feel, is where he draws 839 00:50:25,040 --> 00:50:29,000 Speaker 2: his understanding of how unaccountable he is. There is nothing 840 00:50:29,000 --> 00:50:32,359 Speaker 2: that can hold him accountable. And I'll explain why. 841 00:50:32,400 --> 00:50:34,440 Speaker 2: In twenty eighteen, he tweets that he has the 842 00:50:34,480 --> 00:50:38,799 Speaker 2: money to take Tesla private, funding secured. At 843 00:50:38,840 --> 00:50:40,759 Speaker 2: four hundred and twenty dollars a share, he's going to 844 00:50:40,800 --> 00:50:43,560 Speaker 2: take Tesla private, take it off the public market, where 845 00:50:43,560 --> 00:50:45,800 Speaker 2: it's not doing well because he's not hitting his numbers 846 00:50:45,800 --> 00:50:48,640 Speaker 2: with the Model three rollout, and he says, screw it, 847 00:50:48,800 --> 00:50:50,279 Speaker 2: I don't need the public market, I have 848 00:50:50,320 --> 00:50:52,600 Speaker 2: my private investors.
You know, he didn't have that money; 849 00:50:52,840 --> 00:50:54,960 Speaker 2: the money that was going to come from Saudi Arabia wasn't coming. 850 00:50:55,920 --> 00:50:59,600 Speaker 2: And the SEC sues him, because he puts out a 851 00:50:59,600 --> 00:51:05,240 Speaker 2: statement that manipulates the markets and offers a false promise 852 00:51:05,320 --> 00:51:10,400 Speaker 2: to investors, yadda yadda. They sue him. He fights it 853 00:51:10,440 --> 00:51:15,319 Speaker 2: briefly, settles. And what does he 854 00:51:15,360 --> 00:51:17,120 Speaker 2: have to pay for that? He personally pays 855 00:51:17,120 --> 00:51:19,319 Speaker 2: a twenty million dollar fine, Tesla pays a twenty million 856 00:51:19,360 --> 00:51:23,280 Speaker 2: dollar fine, he's no longer able to be chairman, whatever 857 00:51:23,320 --> 00:51:26,279 Speaker 2: that means. But what does forty 858 00:51:26,320 --> 00:51:30,720 Speaker 2: million dollars mean to someone who's worth one hundred billion dollars? 859 00:51:30,800 --> 00:51:33,000 Speaker 2: It's a parking ticket for you and me. 860 00:51:33,960 --> 00:51:39,520 Speaker 2: And his fundamental understanding of that, that he could either 861 00:51:39,560 --> 00:51:43,239 Speaker 2: fight something or simply pay the fine, is crucial to 862 00:51:43,280 --> 00:51:46,280 Speaker 2: how he operates now. Because when you get to something 863 00:51:46,360 --> 00:51:49,800 Speaker 2: like operating Twitter and complying with an FTC consent decree, 864 00:51:50,760 --> 00:51:52,439 Speaker 2: he'll be like, you know, I don't care about that. 865 00:51:52,600 --> 00:51:54,080 Speaker 2: What are they going to do to me? 866 00:51:54,680 --> 00:51:55,920 Speaker 2: They're not going to take the 867 00:51:55,960 --> 00:51:59,000 Speaker 2: company away from me. They'll levy a fine. What is 868 00:51:59,000 --> 00:52:01,240 Speaker 2: the fine going to be?
Tens of millions of dollars? Hundreds 869 00:52:01,239 --> 00:52:03,239 Speaker 2: of millions of dollars? I'm now worth two 870 00:52:03,280 --> 00:52:06,239 Speaker 2: hundred billion dollars. And not only that, I'm going to 871 00:52:06,320 --> 00:52:11,000 Speaker 2: fight on every level, using the legal system to really 872 00:52:11,000 --> 00:52:15,640 Speaker 2: clog this process up. Right now, he is being asked 873 00:52:15,760 --> 00:52:20,360 Speaker 2: to sit for a deposition by the SEC 874 00:52:20,440 --> 00:52:23,279 Speaker 2: in a related securities case. He's simply not showing up 875 00:52:23,280 --> 00:52:27,440 Speaker 2: to his depositions, and he's flouting the SEC. 876 00:52:27,640 --> 00:52:29,239 Speaker 2: He had a tweet the other day, which is a 877 00:52:29,239 --> 00:52:33,000 Speaker 2: play on the letters, a kind of childish thing he said, 878 00:52:33,000 --> 00:52:37,759 Speaker 2: but he says the SEC stands for, you know, the 879 00:52:38,200 --> 00:52:41,239 Speaker 2: E stands for Elon's, so, you know, 880 00:52:41,680 --> 00:52:44,279 Speaker 2: suck Elon's blank. That was 881 00:52:44,280 --> 00:52:46,839 Speaker 2: his tweet the other day. And 882 00:52:47,080 --> 00:52:49,880 Speaker 2: you start to realize that in some ways he's 883 00:52:51,520 --> 00:52:54,799 Speaker 2: blown past the gravitational pull of accountability. He 884 00:52:54,880 --> 00:52:58,840 Speaker 2: is so rich, he has so many resources, the government 885 00:52:58,920 --> 00:53:02,440 Speaker 2: is so reliant on him, that he 886 00:53:02,480 --> 00:53:06,080 Speaker 2: doesn't care. And that, I think, is crucial to 887 00:53:06,200 --> 00:53:08,360 Speaker 2: understanding how he operates. 888 00:53:09,280 --> 00:53:12,840 Speaker 1: Okay, to make the deal, it's not all his own money.
889 00:53:12,880 --> 00:53:16,400 Speaker 1: He raises some Silicon Valley money, raises some Wall Street money. 890 00:53:17,080 --> 00:53:22,080 Speaker 1: Anybody who followed this closely knew that was a bad deal. 891 00:53:23,520 --> 00:53:26,040 Speaker 1: Why did they put the money in? It's been 892 00:53:26,080 --> 00:53:29,200 Speaker 1: proven to be a bad deal; they can't lay the 893 00:53:29,239 --> 00:53:29,759 Speaker 1: money off. 894 00:53:30,680 --> 00:53:34,400 Speaker 2: Yeah, he raised, what was publicly disclosed, about seven 895 00:53:34,480 --> 00:53:38,600 Speaker 2: billion dollars of external capital, not including the debt, which 896 00:53:38,600 --> 00:53:40,080 Speaker 2: we can talk about later, which is kind of a 897 00:53:40,960 --> 00:53:46,400 Speaker 2: millstone around the company's neck. But this external 898 00:53:46,480 --> 00:53:49,640 Speaker 2: venture capital, some of it came from Qatar, for example. 899 00:53:49,680 --> 00:53:53,000 Speaker 2: Some of it came from Kingdom Holding, the Saudi prince's firm, 900 00:53:53,040 --> 00:53:58,399 Speaker 2: which is effectively overseen by MBS, Mohammed bin Salman. He 901 00:53:58,480 --> 00:54:03,080 Speaker 2: rolled over his stake, the Twitter shares he already owned, 902 00:54:03,080 --> 00:54:05,160 Speaker 2: about two billion 903 00:54:05,200 --> 00:54:08,719 Speaker 2: dollars, into Elon's deal. But a lot of this 904 00:54:08,719 --> 00:54:11,879 Speaker 2: was blue-chip, leading venture capital from Silicon Valley. 905 00:54:11,880 --> 00:54:17,000 Speaker 2: We're talking about Sequoia Capital, Andreessen Horowitz, 906 00:54:17,080 --> 00:54:20,960 Speaker 2: and these firms believe in him as an entrepreneur. 907 00:54:21,239 --> 00:54:23,920 Speaker 2: They see him as a generational talent.
908 00:54:24,600 --> 00:54:26,840 Speaker 2: He has done so much with SpaceX and with Tesla 909 00:54:27,440 --> 00:54:29,200 Speaker 2: that they're just going to take a punt on this 910 00:54:29,280 --> 00:54:33,919 Speaker 2: Twitter deal. And even if it's a money loser, they're 911 00:54:33,960 --> 00:54:37,240 Speaker 2: still close to him. It's essentially buying 912 00:54:37,280 --> 00:54:40,080 Speaker 2: loyalty, in a way, where on his next deal he 913 00:54:40,160 --> 00:54:42,440 Speaker 2: might come back to them. And so I think about this 914 00:54:42,560 --> 00:54:45,160 Speaker 2: message that we reported in the book from Marc Andreessen, 915 00:54:45,200 --> 00:54:49,720 Speaker 2: the head of Andreessen Horowitz, who offered to invest, 916 00:54:49,800 --> 00:54:51,919 Speaker 2: I think initially, two hundred million dollars. The number 917 00:54:52,000 --> 00:54:55,360 Speaker 2: changed after, but the messages came out. He's like, 918 00:54:55,360 --> 00:54:57,080 Speaker 2: two hundred million, sure, we don't even need 919 00:54:57,160 --> 00:55:00,200 Speaker 2: to do diligence; whatever he wants, he gets the check. 920 00:55:01,239 --> 00:55:04,560 Speaker 2: And that's the level of influence he has over these people. 921 00:55:05,480 --> 00:55:07,880 Speaker 2: And if you actually think about this deal, yes, 922 00:55:08,080 --> 00:55:11,200 Speaker 2: individually it's a money loser for the Sequoia Capitals of 923 00:55:11,200 --> 00:55:13,560 Speaker 2: the world, for the Andreessen Horowitzes of the world. 924 00:55:14,560 --> 00:55:18,279 Speaker 2: But these same firms invested in this AI company that 925 00:55:18,320 --> 00:55:21,880 Speaker 2: Elon spun up off of the data from Twitter, xAI. 926 00:55:23,280 --> 00:55:26,880 Speaker 2: And yes, it's funny money, but that company was recently 927 00:55:27,000 --> 00:55:32,040 Speaker 2: valued at fifty billion dollars in a private venture round.
928 00:55:32,800 --> 00:55:35,440 Speaker 2: So if you're looking at the pluses and minuses 929 00:55:35,520 --> 00:55:38,719 Speaker 2: here: Twitter goes down, it's now worth about eight to nine 930 00:55:38,719 --> 00:55:41,360 Speaker 2: billion dollars, if you believe Fidelity's valuation of the company. 931 00:55:41,880 --> 00:55:44,000 Speaker 2: But he's been able to create xAI out of zero, 932 00:55:44,320 --> 00:55:47,600 Speaker 2: zero to fifty. And so you're like, you know, maybe 933 00:55:47,600 --> 00:55:51,760 Speaker 2: they have a point. Maybe they've gamed 934 00:55:51,760 --> 00:55:54,680 Speaker 2: the system enough where they understand that just being close 935 00:55:54,719 --> 00:55:56,800 Speaker 2: to Elon begets more wealth. 936 00:55:57,719 --> 00:56:00,080 Speaker 1: And what about the banks that loaned him the money? Do they think that? 937 00:56:00,800 --> 00:56:04,200 Speaker 2: Yeah, that's where it gets... 938 00:56:05,640 --> 00:56:05,920 Speaker 1: Rough. 939 00:56:06,239 --> 00:56:08,480 Speaker 2: You know, I'd have to look back on the actual 940 00:56:08,680 --> 00:56:13,279 Speaker 2: debt that was raised. I think it was, gosh, 941 00:56:13,480 --> 00:56:16,359 Speaker 2: off the top of my head, like twelve, thirteen billion dollars. It 942 00:56:16,400 --> 00:56:20,879 Speaker 2: amounts to a billion dollars in interest payments alone a year, 943 00:56:22,120 --> 00:56:27,080 Speaker 2: and I mean, it's the worst 944 00:56:27,120 --> 00:56:30,799 Speaker 2: deal since the financial crisis for these banks. These banks 945 00:56:30,800 --> 00:56:33,160 Speaker 2: are stuck holding it whole; they weren't able to sell on this debt, 946 00:56:33,760 --> 00:56:40,120 Speaker 2: and they're holding this horrible debt. 947 00:56:40,120 --> 00:56:42,600 Speaker 2: They're just getting crushed by it.
948 00:56:43,960 --> 00:56:45,960 Speaker 2: These are the Morgan Stanleys of the world; Bank of 949 00:56:46,000 --> 00:56:51,320 Speaker 2: America had some of the debt, Mizuho as well. And 950 00:56:51,719 --> 00:56:55,840 Speaker 2: for them, I think the calculus is somewhat similar. 951 00:56:56,000 --> 00:56:59,399 Speaker 2: You know, maybe we take this hit, a really ugly hit, 952 00:56:59,760 --> 00:57:03,000 Speaker 2: on our books from this Twitter deal. But let's 953 00:57:03,000 --> 00:57:06,440 Speaker 2: say SpaceX goes public in the next couple of years; 954 00:57:06,680 --> 00:57:08,759 Speaker 2: we want to be on that IPO. You know, 955 00:57:08,840 --> 00:57:11,360 Speaker 2: that's a billion-dollar IPO for us. 956 00:57:12,320 --> 00:57:15,160 Speaker 2: And who knows what happens with Neuralink and 957 00:57:15,320 --> 00:57:20,640 Speaker 2: the Boring Company. They're probably doing the plus-minus 958 00:57:20,680 --> 00:57:23,080 Speaker 2: calculation and thinking they will still come out on top, 959 00:57:23,120 --> 00:57:26,680 Speaker 2: even if Twitter is a historically bad deal. 960 00:57:27,800 --> 00:57:31,200 Speaker 1: Okay, one of the reasons he wanted to buy what 961 00:57:31,400 --> 00:57:37,040 Speaker 1: was then called Twitter was because of content moderation. 962 00:57:37,160 --> 00:57:40,760 Speaker 1: He felt it was too tight. He comes in and 963 00:57:40,800 --> 00:57:45,160 Speaker 1: he immediately blows out everybody involved in content moderation. 965 00:57:47,600 --> 00:57:51,400 Speaker 2: Yeah, he doesn't hide what he wants 966 00:57:51,400 --> 00:57:54,400 Speaker 2: to do, you know. He was mad about the 967 00:57:54,400 --> 00:57:57,720 Speaker 2: Babylon Bee being suspended, that kind of 968 00:57:57,760 --> 00:58:01,880 Speaker 2: right-wing satirical account.
He was upset with Trump 969 00:58:02,000 --> 00:58:05,000 Speaker 2: not being on the platform, even though he technically didn't 970 00:58:05,120 --> 00:58:08,960 Speaker 2: like Trump. He thought it had bad implications for 971 00:58:09,000 --> 00:58:11,880 Speaker 2: free speech, whatever, and we'll talk about what 972 00:58:11,920 --> 00:58:17,440 Speaker 2: he means by free speech. But actually, 973 00:58:17,480 --> 00:58:20,120 Speaker 2: what happened was he tried to play nice with some 974 00:58:20,160 --> 00:58:22,640 Speaker 2: of the content moderation folks. You remember this guy Yoel Roth, 975 00:58:23,640 --> 00:58:26,960 Speaker 2: whom he ended up severely harassing on the platform, to 976 00:58:27,000 --> 00:58:31,120 Speaker 2: the point where Roth had to sell his home because 977 00:58:31,120 --> 00:58:35,880 Speaker 2: he was being chased in real life by this online harassment. 978 00:58:37,160 --> 00:58:41,080 Speaker 2: They were trying to get along. Elon was taking advice 979 00:58:41,120 --> 00:58:43,240 Speaker 2: from this guy Yoel Roth, the head of Trust and Safety. 980 00:58:44,240 --> 00:58:48,640 Speaker 2: Yoel eventually resigns, I think two or three 981 00:58:48,680 --> 00:58:52,080 Speaker 2: weeks after the deal closes, and then kind of all 982 00:58:52,080 --> 00:58:54,520 Speaker 2: hell breaks loose. He goes after Yoel, he continues 983 00:58:54,520 --> 00:58:59,720 Speaker 2: to fire Trust and Safety people. But yeah, content moderation 984 00:59:00,120 --> 00:59:04,400 Speaker 2: for Elon was one of his big bugaboos, and 985 00:59:04,480 --> 00:59:07,240 Speaker 2: it still is to this day.
You know, he 986 00:59:08,120 --> 00:59:11,880 Speaker 2: still thinks that Twitter, or the previous administration of it, 987 00:59:12,120 --> 00:59:18,600 Speaker 2: was influenced by progressive wokeism or whatever, and that 988 00:59:18,720 --> 00:59:22,919 Speaker 2: everything flowed from that. And yeah, 989 00:59:22,920 --> 00:59:26,000 Speaker 2: that's kind of how you can understand 990 00:59:26,080 --> 00:59:28,080 Speaker 2: the content that's on Twitter these days. 991 00:59:28,440 --> 00:59:31,720 Speaker 1: Okay, it's not only the United States of America. There's 992 00:59:31,760 --> 00:59:33,480 Speaker 1: been the recent thing with Brazil. 993 00:59:34,600 --> 00:59:35,960 Speaker 2: Yeah. 994 00:59:37,760 --> 00:59:41,600 Speaker 1: Is he just flying naked? At some point, 995 00:59:41,760 --> 00:59:46,720 Speaker 1: with no content moderation team in place, is it gonna 996 00:59:46,840 --> 00:59:52,800 Speaker 1: ultimately impact Twitter, just as a business, negatively in a 997 00:59:52,960 --> 00:59:55,880 Speaker 1: huge way? I mean, we had the situation in Brazil, 998 00:59:56,480 --> 00:59:59,560 Speaker 1: and the person said, I'm standing up for everybody else; 999 00:59:59,560 --> 01:00:00,640 Speaker 1: who else will stand up? 1000 01:00:01,440 --> 01:00:04,760 Speaker 2: Yeah. So, a couple of things there. There is still 1001 01:00:04,760 --> 01:00:06,920 Speaker 2: a little bit of content moderation. There's automated processes in 1002 01:00:06,960 --> 01:00:11,080 Speaker 2: place at X, and there's still people, there's 1003 01:00:11,120 --> 01:00:14,360 Speaker 2: still some humans in the building. He's gotten rid of 1004 01:00:14,360 --> 01:00:18,120 Speaker 2: a lot of them, but there's still some around with 1005 01:00:18,240 --> 01:00:21,040 Speaker 2: content moderation.
There's a couple things you 1006 01:00:21,080 --> 01:00:23,680 Speaker 2: have to understand, which is that advertisers have been driven 1007 01:00:23,720 --> 01:00:26,439 Speaker 2: away by the lack of content moderation, right? Like, it's 1008 01:00:26,480 --> 01:00:32,600 Speaker 2: caused major brands to leave or spend less money, 1009 01:00:32,760 --> 01:00:35,320 Speaker 2: because why would they want their content to be next 1010 01:00:35,360 --> 01:00:39,080 Speaker 2: to a white nationalist who's been let back on the platform? 1011 01:00:39,120 --> 01:00:42,640 Speaker 2: Why would they want it next to tweets about Nazism 1012 01:00:43,080 --> 01:00:48,200 Speaker 2: or, you know, videos of violence, which are somewhat commonplace 1013 01:00:48,240 --> 01:00:53,680 Speaker 2: on the platform now. The Brazil situation is very interesting 1014 01:00:53,720 --> 01:00:58,200 Speaker 2: to me, and it goes back to Elon's political alliances 1015 01:00:58,240 --> 01:01:03,000 Speaker 2: around the world. Elon was friends with Jair Bolsonaro, the 1016 01:01:03,040 --> 01:01:08,520 Speaker 2: previous president in Brazil, who was voted out essentially and 1017 01:01:08,560 --> 01:01:16,400 Speaker 2: replaced by the leftist president. And then, you know, 1018 01:01:16,440 --> 01:01:17,760 Speaker 2: they had their own kind of 1019 01:01:17,840 --> 01:01:22,320 Speaker 2: January sixth moment, where supporters of Bolsonaro stormed government buildings 1020 01:01:22,360 --> 01:01:26,240 Speaker 2: in Brazil, which led to these decisions by the courts 1021 01:01:26,240 --> 01:01:32,240 Speaker 2: down in Brazil to essentially demand that Twitter, X, remove 1022 01:01:33,400 --> 01:01:37,120 Speaker 2: these voices that caused people to question the election 1023 01:01:37,600 --> 01:01:39,480 Speaker 2: and caused this kind of unrest.
You know, we can 1024 01:01:39,520 --> 01:01:42,720 Speaker 2: debate whether or not it's good for any government body 1025 01:01:42,720 --> 01:01:46,280 Speaker 2: to get involved in online speech, but that was the 1026 01:01:46,360 --> 01:01:50,360 Speaker 2: law of the land in Brazil, and Elon decided to 1027 01:01:50,400 --> 01:01:55,880 Speaker 2: fight that, in part because he was aligned with, you know, 1028 01:01:55,920 --> 01:01:59,160 Speaker 2: the old president, and he made a huge thing about 1029 01:01:59,160 --> 01:02:02,040 Speaker 2: this, to the point where X was pulled out of 1030 01:02:02,080 --> 01:02:05,880 Speaker 2: the country for a couple of weeks, and then ultimately folded. 1031 01:02:05,920 --> 01:02:07,800 Speaker 2: You know, he complied with the orders and now the 1032 01:02:09,880 --> 01:02:13,120 Speaker 2: platform is back in the country. If you analyze 1033 01:02:13,640 --> 01:02:16,920 Speaker 2: this case, he loves to make himself 1034 01:02:17,080 --> 01:02:19,080 Speaker 2: kind of the free speech 1035 01:02:19,160 --> 01:02:22,280 Speaker 2: advocate, or be seen as the free speech advocate. But let's 1036 01:02:22,280 --> 01:02:24,640 Speaker 2: compare this to what's happening in India, for example, where 1037 01:02:24,640 --> 01:02:27,479 Speaker 2: he's very close with the Prime Minister there, Narendra Modi. 1038 01:02:28,000 --> 01:02:32,400 Speaker 2: The Modi government has demanded that Twitter censor content regularly. 1039 01:02:32,640 --> 01:02:34,880 Speaker 2: I think of something like a documentary that was put 1040 01:02:34,960 --> 01:02:38,880 Speaker 2: up by the BBC about Modi himself. It was put 1041 01:02:38,920 --> 01:02:42,320 Speaker 2: on YouTube and shared across platforms. You know, Twitter complied 1042 01:02:42,360 --> 01:02:45,960 Speaker 2: with those takedown orders.
There was no standing up for 1043 01:02:46,000 --> 01:02:50,760 Speaker 2: free speech in that case by Elon, no opposing 1044 01:02:50,760 --> 01:02:54,960 Speaker 2: that government takedown request. He complied very easily. He's complied 1045 01:02:55,000 --> 01:02:57,760 Speaker 2: with government takedown requests in Turkey, for example, where he's 1046 01:02:58,080 --> 01:03:03,320 Speaker 2: aligned with Erdogan. So this kind of free speech mantra 1047 01:03:04,360 --> 01:03:09,320 Speaker 2: that he purports is very selectively applied, and really depends 1048 01:03:09,360 --> 01:03:13,320 Speaker 2: on the market, who he's aligned with, as well as 1049 01:03:13,320 --> 01:03:14,920 Speaker 2: his own kind of personal interests. 1050 01:03:22,520 --> 01:03:26,600 Speaker 1: Okay, since we're doing a three-D view on Elon. Yeah, 1051 01:03:26,760 --> 01:03:30,439 Speaker 1: just one thing, a little loose gravel here. He makes 1052 01:03:30,480 --> 01:03:33,280 Speaker 1: all these deals with these other car companies that they 1053 01:03:33,280 --> 01:03:37,680 Speaker 1: can use his superchargers. Then he wipes out his complete 1054 01:03:37,720 --> 01:03:41,120 Speaker 1: supercharger team. What was going on there? 1055 01:03:43,240 --> 01:03:46,160 Speaker 2: Your guess is as good as mine. He 1056 01:03:46,240 --> 01:03:50,280 Speaker 2: sometimes does this, where he decides like there 1057 01:03:50,320 --> 01:03:53,520 Speaker 2: needs to be a kick in the ass for a part 1058 01:03:53,520 --> 01:03:56,600 Speaker 2: of his company, and he'll like severely jolt it, you know. 1059 01:03:56,640 --> 01:03:58,200 Speaker 2: And again we go back to that idea of, if 1060 01:03:58,280 --> 01:04:00,919 Speaker 2: I break it, I can always bring it back. 1061 01:04:00,960 --> 01:04:03,360 Speaker 2: I can always reintroduce it. And actually he did that.
1062 01:04:03,440 --> 01:04:06,920 Speaker 2: He fired his whole supercharging team, and they actually ended 1063 01:04:07,000 --> 01:04:11,480 Speaker 2: up bringing a lot of them back. He's done 1064 01:04:11,480 --> 01:04:15,400 Speaker 2: this throughout his career. But if you're GM, 1065 01:04:15,760 --> 01:04:18,560 Speaker 2: if you're Chevy, if you're even Rivian, you know, 1066 01:04:18,720 --> 01:04:22,080 Speaker 2: you're probably raising your eyebrows and being like, damn, do we really, 1067 01:04:23,360 --> 01:04:25,640 Speaker 2: you know, do we want to reevaluate this relationship? We 1068 01:04:25,680 --> 01:04:28,560 Speaker 2: got in bed with our main competitor, and he is 1069 01:04:28,680 --> 01:04:32,520 Speaker 2: completely erratic, and we don't know why he did it. 1070 01:04:32,640 --> 01:04:36,040 Speaker 2: Still, like, was it because one day he woke 1071 01:04:36,120 --> 01:04:37,800 Speaker 2: up and was like, actually, you know what, I don't 1072 01:04:37,840 --> 01:04:40,960 Speaker 2: think I want my competitors to use my... Like, why 1073 01:04:40,960 --> 01:04:43,360 Speaker 2: would I give away my secret sauce? You know, let 1074 01:04:43,360 --> 01:04:46,520 Speaker 2: them build their own, you know, that's capitalism. 1075 01:04:46,520 --> 01:04:51,720 Speaker 2: I should be able to control my own chargers. But 1076 01:04:51,800 --> 01:04:56,760 Speaker 2: this kind of erratic nature and, you know, unexplained decisions 1077 01:04:56,840 --> 01:05:00,880 Speaker 2: are part and parcel of him. And it is 1078 01:05:01,360 --> 01:05:03,400 Speaker 2: fascinating to watch it play out over and 1079 01:05:03,440 --> 01:05:07,880 Speaker 2: over again as a reporter.
But yeah, if 1080 01:05:07,920 --> 01:05:12,240 Speaker 2: I'm Mary Barra at GM, I'm watching this with 1081 01:05:12,240 --> 01:05:16,480 Speaker 2: raised eyebrows and wondering what's going on, and it just 1082 01:05:17,120 --> 01:05:19,480 Speaker 2: always seems like there's a five-alarm fire with him. 1083 01:05:20,200 --> 01:05:25,200 Speaker 1: Okay, so he's at Twitter, he starts bringing people in. If 1084 01:05:25,200 --> 01:05:28,040 Speaker 1: you don't tell him what he wants to hear, he 1085 01:05:28,080 --> 01:05:32,000 Speaker 1: immediately gets rid of you. So is this also a 1086 01:05:32,080 --> 01:05:35,120 Speaker 1: characteristic of his, that he surrounds himself with yes 1087 01:05:35,200 --> 01:05:37,040 Speaker 1: people and can't hear a contrary opinion? 1088 01:05:40,320 --> 01:05:43,760 Speaker 2: So it's a bit more nuanced. Actually, in the book 1089 01:05:43,800 --> 01:05:46,240 Speaker 2: we go into this a little bit, but one on 1090 01:05:46,240 --> 01:05:49,640 Speaker 2: one he can actually be receptive to criticism, as 1091 01:05:49,640 --> 01:05:52,120 Speaker 2: long as you have the domain expertise, you have the 1092 01:05:52,200 --> 01:05:55,919 Speaker 2: numbers to back it up, the details. You know, if 1093 01:05:55,920 --> 01:05:59,240 Speaker 2: he asks you a question, you're able to answer it. 1094 01:05:59,400 --> 01:06:04,520 Speaker 2: And, like we talked about, he likes to 1095 01:06:04,520 --> 01:06:07,040 Speaker 2: break things down to what he calls first principles, like, 1096 01:06:07,160 --> 01:06:09,840 Speaker 2: how do things operate? And if you're able to explain 1097 01:06:09,920 --> 01:06:12,120 Speaker 2: things to him in those first principles, he'll appreciate that.
1098 01:06:12,920 --> 01:06:15,920 Speaker 2: But what he doesn't tolerate is being criticized in a 1099 01:06:15,920 --> 01:06:19,560 Speaker 2: group setting, or being made to look like he is 1100 01:06:22,880 --> 01:06:26,680 Speaker 2: dumb in a group setting, essentially. We opened the book 1101 01:06:26,760 --> 01:06:30,880 Speaker 2: with this instance. Actually, well, this is 1102 01:06:30,880 --> 01:06:33,280 Speaker 2: a one on one thing, so it's kind of contradictory 1103 01:06:33,280 --> 01:06:35,320 Speaker 2: to what I said. But it's this instance of a 1104 01:06:35,400 --> 01:06:41,200 Speaker 2: data scientist challenging him on why he tweeted out that 1105 01:06:41,240 --> 01:06:45,360 Speaker 2: conspiracy theory about Paul Pelosi. Do you remember that? Yes? Yeah. 1106 01:06:45,400 --> 01:06:49,320 Speaker 2: So Paul Pelosi, Nancy Pelosi's husband, was attacked, and on far 1107 01:06:49,400 --> 01:06:52,200 Speaker 2: right media it became this kind of rumor that he 1108 01:06:52,280 --> 01:06:57,960 Speaker 2: was attacked by a jilted lover, and Elon actually shared 1109 01:06:57,960 --> 01:07:02,440 Speaker 2: that conspiracy theory, essentially. This came within days after 1110 01:07:02,480 --> 01:07:09,920 Speaker 2: his acquisition. And this data scientist, who's already 1111 01:07:09,920 --> 01:07:12,480 Speaker 2: going to quit, manages to get a one on one 1112 01:07:12,480 --> 01:07:15,240 Speaker 2: meeting with Elon and tell him. He goes into the 1113 01:07:15,280 --> 01:07:19,840 Speaker 2: meeting with the hope that he could maybe change him 1114 01:07:19,920 --> 01:07:22,640 Speaker 2: or change his perspective. And this is a data scientist 1115 01:07:22,920 --> 01:07:27,240 Speaker 2: who had studied misinformation and how it spreads on social platforms.
1116 01:07:27,960 --> 01:07:30,240 Speaker 2: And he says, like, look, I can't believe you would 1117 01:07:30,240 --> 01:07:34,400 Speaker 2: believe this. You know, you're in like the top tenth percentile 1118 01:07:34,720 --> 01:07:37,840 Speaker 2: of folks who would fall for something this ridiculous. Like, 1119 01:07:37,880 --> 01:07:40,200 Speaker 2: how could you believe that? You really need to, like, 1120 01:07:40,880 --> 01:07:45,320 Speaker 2: understand online sources better, and, like, how information flows, and 1121 01:07:45,520 --> 01:07:48,680 Speaker 2: really be more skeptical of what you read online. And, 1122 01:07:48,760 --> 01:07:51,520 Speaker 2: like, they get into a shouting match. Elon says, fuck you, 1123 01:07:51,520 --> 01:07:55,240 Speaker 2: you know, and, like, fires him, though the guy was going 1124 01:07:55,280 --> 01:07:59,360 Speaker 2: to quit anyway. And you start to 1125 01:07:59,400 --> 01:08:02,640 Speaker 2: realize that, yes, he doesn't like to be 1126 01:08:02,720 --> 01:08:05,560 Speaker 2: challenged very often. And there are other scenes in the 1127 01:08:05,600 --> 01:08:07,560 Speaker 2: book where he fires people on the spot as well. 1128 01:08:10,000 --> 01:08:12,600 Speaker 2: And we go back to, you know, operating on gut instinct. 1129 01:08:15,800 --> 01:08:17,640 Speaker 2: It's worked for him in the past. It's worked for 1130 01:08:17,720 --> 01:08:20,479 Speaker 2: him at Tesla. You know, he has worked people to 1131 01:08:20,520 --> 01:08:22,519 Speaker 2: the bone, and if they don't get the things he 1132 01:08:22,560 --> 01:08:24,880 Speaker 2: wants done for him, he'll just find the next person 1133 01:08:24,920 --> 01:08:30,800 Speaker 2: who will. And he's built these major companies off those techniques, 1134 01:08:31,000 --> 01:08:33,840 Speaker 2: and he has simply applied those to Twitter. And you 1135 01:08:33,880 --> 01:08:35,360 Speaker 2: see that over and over again in the book.
1136 01:08:36,400 --> 01:08:41,360 Speaker 1: Okay, he starts taking cost cutting to an extreme level. 1137 01:08:41,840 --> 01:08:44,800 Speaker 1: You talk about getting rid of janitorial services. You talk 1138 01:08:44,840 --> 01:08:49,200 Speaker 1: about him actually ripping out a server. Is this like 1139 01:08:49,320 --> 01:08:52,400 Speaker 1: an emotional thing? Let me put 1140 01:08:52,400 --> 01:08:58,160 Speaker 1: it differently: is he setting an example, or, for 1141 01:08:58,200 --> 01:09:00,400 Speaker 1: a guy who's worth north of two hundred billion, 1142 01:09:00,840 --> 01:09:03,160 Speaker 1: is he really trying to save every penny? 1143 01:09:04,920 --> 01:09:07,760 Speaker 2: We had this bit of reporting even before Twitter, at 1144 01:09:07,800 --> 01:09:13,000 Speaker 2: Tesla, and it was during the, I think, 1145 01:09:13,040 --> 01:09:16,880 Speaker 2: the Model Y ramp up, you know, one of the 1146 01:09:16,960 --> 01:09:22,639 Speaker 2: crossover SUVs, whatever. And he decides to motivate people by 1147 01:09:22,720 --> 01:09:25,479 Speaker 2: removing the cereal from the offices and the factories. This 1148 01:09:26,040 --> 01:09:29,000 Speaker 2: free cereal that people get, it's the only free food 1149 01:09:29,000 --> 01:09:33,760 Speaker 2: that they get. And everyone's just, like, up in arms. 1150 01:09:33,760 --> 01:09:35,920 Speaker 2: They're like, you're gonna starve the people in the factory 1151 01:09:35,920 --> 01:09:38,240 Speaker 2: who are working for you, who, like, literally rely on 1152 01:09:38,240 --> 01:09:41,920 Speaker 2: this for breakfast and sustenance. And what are you saving here? 1153 01:09:42,040 --> 01:09:46,000 Speaker 2: How much are Lucky Charms? You know, mass 1154 01:09:46,040 --> 01:09:48,759 Speaker 2: produced Lucky Charms, you know, a couple thousand bucks.
1155 01:09:49,600 --> 01:09:51,839 Speaker 2: But for him, it's proving a point, right? It's, 1156 01:09:52,400 --> 01:09:55,639 Speaker 2: if I can endure this pain, you can endure this pain, 1157 01:09:56,200 --> 01:09:58,519 Speaker 2: and I am willing to cut whatever we need to 1158 01:09:58,560 --> 01:10:03,200 Speaker 2: cut to make this company successful, whether that's Lucky Charms, 1159 01:10:03,400 --> 01:10:09,639 Speaker 2: whether that's servers, whether that's toilet paper, which I keep 1160 01:10:09,680 --> 01:10:12,600 Speaker 2: referencing but actually happened. They didn't have toilet paper in the 1161 01:10:12,640 --> 01:10:17,240 Speaker 2: offices for a bit. But it's like a 1162 01:10:17,240 --> 01:10:22,840 Speaker 2: psychological thing, almost, right? And what he believes is he's 1163 01:10:22,880 --> 01:10:24,599 Speaker 2: going to be the one enduring this pain as well. 1164 01:10:24,640 --> 01:10:27,160 Speaker 2: He'll be the one sleeping at the factory. If he 1165 01:10:27,240 --> 01:10:30,080 Speaker 2: can not eat and be on the factory line, or 1166 01:10:30,080 --> 01:10:31,679 Speaker 2: be on the factory floor, and if he can sleep 1167 01:10:31,680 --> 01:10:34,720 Speaker 2: in the office, which he did, he set 1168 01:10:34,760 --> 01:10:38,720 Speaker 2: up a bed in one of the conference rooms, then, 1169 01:10:38,760 --> 01:10:41,600 Speaker 2: like, individual employees should be able to tolerate that 1170 01:10:41,640 --> 01:10:44,760 Speaker 2: pain too. Now, on the flip side of that, you know, 1171 01:10:44,800 --> 01:10:46,720 Speaker 2: I talked to a lot of people about this, and 1172 01:10:46,840 --> 01:10:49,880 Speaker 2: it does motivate a certain amount of people. But then 1173 01:10:50,000 --> 01:10:52,120 Speaker 2: these people start to realize, like, the upside for them, 1174 01:10:52,600 --> 01:10:54,320 Speaker 2: like, what's in it for them? Like, what do they get?
1175 01:10:54,360 --> 01:10:59,000 Speaker 2: They get a couple shares, they get a good salary, 1176 01:10:59,600 --> 01:11:03,960 Speaker 2: but his upside is, you know, millions of shares that 1177 01:11:04,040 --> 01:11:05,920 Speaker 2: could increase in value. So he's the one seeing the 1178 01:11:05,960 --> 01:11:10,760 Speaker 2: upside from the pain endured by these folks, 1179 01:11:10,800 --> 01:11:13,679 Speaker 2: whereas, you know, the individual contributor, the individual employee, 1180 01:11:13,720 --> 01:11:17,880 Speaker 2: isn't seeing the same kind of payoff. I guess he 1181 01:11:17,880 --> 01:11:21,920 Speaker 2: would argue, you know, he's made many people at Tesla millionaires, 1182 01:11:21,960 --> 01:11:26,720 Speaker 2: but, you know, the cutting has always been a part 1183 01:11:26,760 --> 01:11:30,639 Speaker 2: of him and his identity, and it's something he's now 1184 01:11:30,680 --> 01:11:32,479 Speaker 2: taking into the White House. 1185 01:11:33,720 --> 01:11:39,519 Speaker 1: Okay, he doesn't pay all these severance fees, etc. What 1186 01:11:39,720 --> 01:11:42,960 Speaker 1: is the status of the people who were owed money, both 1187 01:11:43,000 --> 01:11:46,880 Speaker 1: in terms of severance, any stock payouts, even people who are 1188 01:11:46,960 --> 01:11:50,519 Speaker 1: owed money for office space? Are those people continuing to 1189 01:11:50,560 --> 01:11:53,200 Speaker 1: be stiffed, or did anybody ever get paid? 1190 01:11:54,479 --> 01:11:56,960 Speaker 2: So there've been a series of lawsuits that have been filed. 1191 01:11:57,160 --> 01:11:59,880 Speaker 2: When we talk about the employees, there's thousands of employees 1192 01:12:00,080 --> 01:12:03,960 Speaker 2: that are currently awaiting arbitration. A lot of them had 1193 01:12:04,040 --> 01:12:10,160 Speaker 2: arbitration clauses in their contracts.
So they're fighting, essentially, to 1194 01:12:10,200 --> 01:12:15,880 Speaker 2: get that severance paid to them, which is being overseen 1195 01:12:15,920 --> 01:12:21,040 Speaker 2: by a couple of different legal firms. But Elon has 1196 01:12:21,080 --> 01:12:23,800 Speaker 2: not relented. He has not signaled that he's just going 1197 01:12:23,840 --> 01:12:25,800 Speaker 2: to simply settle and pay these things. He's going to fight, 1198 01:12:26,000 --> 01:12:30,960 Speaker 2: which ultimately may cost him more, or, you know, maybe 1199 01:12:31,040 --> 01:12:33,200 Speaker 2: his calculation is that some people would simply give up 1200 01:12:33,560 --> 01:12:37,599 Speaker 2: and forget about it. He's taking the same tack with 1201 01:12:38,840 --> 01:12:41,479 Speaker 2: the four executives that he fired, like, literally in the 1202 01:12:41,479 --> 01:12:46,040 Speaker 2: minutes after taking over. He fired the CEO, chief legal officer, 1203 01:12:46,800 --> 01:12:51,000 Speaker 2: general counsel, and chief financial officer. That was literally his 1204 01:12:51,000 --> 01:12:53,160 Speaker 2: first action upon taking over the company in October twenty 1205 01:12:53,160 --> 01:12:57,040 Speaker 2: twenty two. And these four executives were due 1206 01:12:57,200 --> 01:13:02,040 Speaker 2: golden parachutes on a change of control, a transfer of power, which, uh, 1207 01:13:02,360 --> 01:13:04,679 Speaker 2: you know, with any kind of merger or anything, they would 1208 01:13:04,680 --> 01:13:08,639 Speaker 2: get these golden parachutes, which amounted to, I think, 1209 01:13:08,640 --> 01:13:10,519 Speaker 2: around one hundred and twenty million dollars total for 1210 01:13:10,560 --> 01:13:14,720 Speaker 2: all four of them combined.
So he fired them 1211 01:13:14,840 --> 01:13:18,960 Speaker 2: for cause, which allowed him to say, actually, you know, 1212 01:13:19,000 --> 01:13:20,880 Speaker 2: I fired you for cause, therefore you don't deserve your 1213 01:13:20,880 --> 01:13:23,400 Speaker 2: golden parachutes, or, I don't have to 1214 01:13:23,400 --> 01:13:26,600 Speaker 2: pay you those. And of course those executives sued. 1215 01:13:27,200 --> 01:13:27,679 Speaker 1: And. 1216 01:13:29,240 --> 01:13:31,879 Speaker 2: You know, now they're paying millions of dollars to lawyers 1217 01:13:31,920 --> 01:13:35,719 Speaker 2: to litigate that, and eventually there'll probably be a settlement 1218 01:13:37,200 --> 01:13:40,599 Speaker 2: where those executives get tired, and they maybe don't get 1219 01:13:40,600 --> 01:13:44,400 Speaker 2: their full package, but, you know, they get some 1220 01:13:44,520 --> 01:13:50,559 Speaker 2: part of it. And that's kind of his strategy. You know, 1221 01:13:50,600 --> 01:13:54,720 Speaker 2: with rent as well, he stopped paying rent, just, you know, 1222 01:13:55,120 --> 01:13:58,439 Speaker 2: just didn't write the checks, and continued to occupy the 1223 01:13:58,439 --> 01:14:02,200 Speaker 2: buildings until the real estate companies were forced to come 1224 01:14:02,240 --> 01:14:06,880 Speaker 2: to the table and negotiate with him. We often think about, 1225 01:14:06,920 --> 01:14:09,479 Speaker 2: like, too big to fail, and we use that with banks, 1226 01:14:10,200 --> 01:14:17,599 Speaker 2: obviously, from the financial crisis. There's 1227 01:14:17,640 --> 01:14:19,479 Speaker 2: a quality of that to Elon as well. You know, 1228 01:14:19,600 --> 01:14:23,240 Speaker 2: this idea that he is so big that people have 1229 01:14:23,280 --> 01:14:26,559 Speaker 2: to play on his terms.
He's not going to simply 1230 01:14:26,600 --> 01:14:30,559 Speaker 2: do things because he signed a contract or because he's 1231 01:14:30,640 --> 01:14:34,200 Speaker 2: legally obligated to. But he can bend the rules and 1232 01:14:34,240 --> 01:14:36,760 Speaker 2: get people to play on his playing field if he 1233 01:14:36,760 --> 01:14:39,480 Speaker 2: wants to, and he's done that constantly with Twitter. 1234 01:14:39,600 --> 01:14:42,439 Speaker 1: Okay, he changes the name of the company, puts up a 1235 01:14:42,560 --> 01:14:45,280 Speaker 1: huge sign which breaks the law, and has to take 1236 01:14:45,320 --> 01:14:50,600 Speaker 1: it down. Is he aware that he's breaking laws and 1237 01:14:50,680 --> 01:14:54,360 Speaker 1: regulations, or is he just doing whatever he's doing because 1238 01:14:54,400 --> 01:14:58,480 Speaker 1: he's blind to the rest of the world? 1239 01:15:00,000 --> 01:15:02,080 Speaker 2: I think he just simply doesn't care. I mean, his lawyer, 1240 01:15:02,080 --> 01:15:04,400 Speaker 2: we quote his lawyer in the book: you know, 1241 01:15:05,080 --> 01:15:09,960 Speaker 2: Elon does not care about FTC consent decrees. Now, that 1242 01:15:10,040 --> 01:15:17,200 Speaker 2: applies to building permits, it applies to whatever rules govern signage. 1243 01:15:18,240 --> 01:15:25,200 Speaker 2: In San Francisco, the city sent inspectors to 1244 01:15:25,400 --> 01:15:29,800 Speaker 2: inspect the so-called Twitter hotel, the building where 1245 01:15:29,800 --> 01:15:32,200 Speaker 2: he built these hotel rooms for employees to stay, because 1246 01:15:32,240 --> 01:15:34,799 Speaker 2: he didn't want to pay for out-of-town employees 1247 01:15:34,800 --> 01:15:37,559 Speaker 2: to stay in hotels, so they would sleep in conference 1248 01:15:37,640 --> 01:15:41,840 Speaker 2: rooms on makeshift beds and shower in the office and 1249 01:15:41,880 --> 01:15:45,080 Speaker 2: that kind of thing.
So the city sent inspectors, and, 1250 01:15:45,200 --> 01:15:47,800 Speaker 2: like, you know, they continued to operate and have people 1251 01:15:47,840 --> 01:15:53,320 Speaker 2: stay there. He just doesn't care, you know. And that 1252 01:15:53,479 --> 01:15:58,479 Speaker 2: is, again, going back to accountability. That's just a 1253 01:15:58,560 --> 01:16:01,800 Speaker 2: key understanding that he has: he is bigger than any 1254 01:16:01,880 --> 01:16:04,000 Speaker 2: law or any regulator or any agency. 1255 01:16:04,800 --> 01:16:08,320 Speaker 1: If we pull the lens all the way back: those 1256 01:16:08,360 --> 01:16:12,160 Speaker 1: of us who were active on Twitter and were aware 1257 01:16:12,200 --> 01:16:16,720 Speaker 1: of the changes he made expected the service to 1258 01:16:16,800 --> 01:16:19,000 Speaker 1: go down. There were a couple of hiccups, but nothing 1259 01:16:19,080 --> 01:16:23,880 Speaker 1: significant, and for the whole thing to collapse on some level. 1260 01:16:24,240 --> 01:16:29,160 Speaker 1: It has not collapsed. So on some level do we say, hey, 1261 01:16:29,320 --> 01:16:32,800 Speaker 1: by cutting the vast majority of employees and costs, he was right? 1262 01:16:32,960 --> 01:16:37,640 Speaker 2: You're right in that it hasn't collapsed, but 1263 01:16:37,680 --> 01:16:41,960 Speaker 2: there's been some pretty severe outages. Australia in December twenty 1264 01:16:41,960 --> 01:16:44,120 Speaker 2: twenty two was without it for a couple of days.
1265 01:16:46,240 --> 01:16:48,160 Speaker 2: You know, he's taken down a lot of the tools 1266 01:16:48,160 --> 01:16:53,160 Speaker 2: that oversee content moderation. You know, 1267 01:16:54,560 --> 01:16:56,599 Speaker 2: the bot problem is still a major issue, you know, 1268 01:16:56,680 --> 01:16:57,960 Speaker 2: and there were a lot of tools that 1269 01:16:57,960 --> 01:17:02,080 Speaker 2: he took out that were overseeing that. When we talk 1270 01:17:02,120 --> 01:17:04,400 Speaker 2: about cuts, it's not just the platform itself, it's the 1271 01:17:04,400 --> 01:17:08,640 Speaker 2: people that also oversee advertising. And what happened with advertising 1272 01:17:08,760 --> 01:17:14,799 Speaker 2: at the company? It's completely cratered. So yes, I think 1273 01:17:15,360 --> 01:17:20,880 Speaker 2: at the time, all the speculation about 1274 01:17:21,040 --> 01:17:25,080 Speaker 2: Twitter collapsing after he did all the layoffs, I think, 1275 01:17:25,160 --> 01:17:30,880 Speaker 2: was a bit premature, and Twitter over the years had 1276 01:17:30,880 --> 01:17:33,759 Speaker 2: built up a pretty robust infrastructure to keep itself online. 1277 01:17:34,120 --> 01:17:36,920 Speaker 2: You know, obviously we all remember the days of the 1278 01:17:36,960 --> 01:17:39,960 Speaker 2: fail whale, and, you know, those error messages that 1279 01:17:40,040 --> 01:17:43,280 Speaker 2: we'd get, or the site crashing during big events, 1280 01:17:43,680 --> 01:17:46,160 Speaker 2: not the Super Bowl, but the 1281 01:17:46,240 --> 01:17:50,679 Speaker 2: World Cup and stuff like that. But by and large, 1282 01:17:50,720 --> 01:17:53,400 Speaker 2: it's a pretty robust system that Elon 1283 01:17:53,479 --> 01:17:55,519 Speaker 2: has been able to take over, and so it stayed online.
1284 01:17:56,560 --> 01:17:59,320 Speaker 2: That being said, you know, it doesn't guarantee that it's 1285 01:17:59,320 --> 01:18:04,839 Speaker 2: going to stay online forever. And who knows. I reserve judgment, 1286 01:18:04,840 --> 01:18:05,880 Speaker 2: I guess, is what I'll say to that. 1287 01:18:06,479 --> 01:18:09,360 Speaker 1: Okay, he has this vision of turning what is now 1288 01:18:09,439 --> 01:18:14,439 Speaker 1: called X into a universal payment system, you know, akin 1289 01:18:14,479 --> 01:18:17,680 Speaker 1: to what they have in China, in light of what 1290 01:18:17,720 --> 01:18:21,040 Speaker 1: you were talking about, self driving cars, et cetera. Is 1291 01:18:21,080 --> 01:18:24,160 Speaker 1: this complete fantasy? He says, you know, that's what he 1292 01:18:24,200 --> 01:18:27,360 Speaker 1: wanted to do with PayPal. Legend is that's why he 1293 01:18:27,479 --> 01:18:30,880 Speaker 1: was squeezed out. Should we pay any heed to that? 1294 01:18:33,600 --> 01:18:36,599 Speaker 2: Elon is a man that makes big, big promises. That's 1295 01:18:36,600 --> 01:18:40,439 Speaker 2: been his whole career. He's going to get humans to Mars. 1296 01:18:40,720 --> 01:18:46,080 Speaker 2: He's going to defeat climate change. He is going to 1297 01:18:46,120 --> 01:18:48,880 Speaker 2: build the everything app, and that's what he has pitched 1298 01:18:48,880 --> 01:18:51,559 Speaker 2: to investors. In our book, we 1299 01:18:51,600 --> 01:18:54,000 Speaker 2: go over some of the projections that he had 1300 01:18:54,080 --> 01:18:56,840 Speaker 2: for three years from now, five years from now, ten 1301 01:18:56,920 --> 01:19:03,120 Speaker 2: years from now, and some of those numbers were, you know, fantastical. 1302 01:19:03,240 --> 01:19:06,920 Speaker 2: You know, they were just fantasy, looking at it now.
1303 01:19:07,880 --> 01:19:10,320 Speaker 2: And he sold investors on this idea that he could 1304 01:19:11,000 --> 01:19:14,040 Speaker 2: build the WeChat of the US, you know, a 1305 01:19:14,080 --> 01:19:18,040 Speaker 2: place where you could watch videos, you would hail your cab, 1306 01:19:18,160 --> 01:19:21,320 Speaker 2: you would order your food, you would have your bank. 1307 01:19:22,400 --> 01:19:25,160 Speaker 2: And by and large, that vision has not been achieved. 1308 01:19:25,280 --> 01:19:27,879 Speaker 2: You know, they've been still working on those money transmitter 1309 01:19:27,960 --> 01:19:34,240 Speaker 2: licenses in all those states. You know, there's really, you 1310 01:19:34,280 --> 01:19:36,880 Speaker 2: can't pay for anything with X at this point in time. 1311 01:19:37,880 --> 01:19:42,799 Speaker 2: But yeah, it's again these promises that he continues 1312 01:19:42,800 --> 01:19:46,160 Speaker 2: to make, and that's how he's operated his companies. It's 1313 01:19:46,200 --> 01:19:48,439 Speaker 2: always on this idea of a vision, this mission. 1314 01:19:48,479 --> 01:19:53,040 Speaker 2: And for X, that mission 1315 01:19:53,080 --> 01:19:55,880 Speaker 2: has been the everything app. By the way, he 1316 01:19:56,280 --> 01:19:59,720 Speaker 2: hasn't mentioned that very recently. If you notice how he's 1317 01:19:59,760 --> 01:20:02,240 Speaker 2: talked about it, he doesn't really talk about payments anymore. 1318 01:20:02,240 --> 01:20:04,960 Speaker 2: He doesn't talk about the everything app. So maybe that 1319 01:20:05,000 --> 01:20:07,920 Speaker 2: was a convenient message for him to raise capital at 1320 01:20:07,960 --> 01:20:12,600 Speaker 2: the time. But yeah, I guess that's kind of a 1321 01:20:12,640 --> 01:20:14,639 Speaker 2: wait and see moment still as well. 1322 01:20:22,200 --> 01:20:25,400 Speaker 1: Okay, let's talk about Linda Yaccarino.
So we read all 1323 01:20:25,439 --> 01:20:28,920 Speaker 1: about this stuff. I went to a small conference where she 1324 01:20:29,040 --> 01:20:37,000 Speaker 1: spoke. Very unimpressive, a saleslady at best. You know, 1325 01:20:37,200 --> 01:20:40,559 Speaker 1: the news was all about how much power she would 1326 01:20:40,600 --> 01:20:42,479 Speaker 1: be given. I mean, I got to meet the woman 1327 01:20:42,520 --> 01:20:44,360 Speaker 1: for three minutes, enough to know you're not going to put 1328 01:20:44,360 --> 01:20:47,479 Speaker 1: her in control. What was going on with 1329 01:20:47,640 --> 01:20:48,559 Speaker 1: Linda Yaccarino? 1330 01:20:50,960 --> 01:20:53,240 Speaker 2: Linda Yaccarino was passed over for the head job at 1331 01:20:53,400 --> 01:20:58,800 Speaker 2: NBCUniversal, and was very successful at what she did there. You know, 1332 01:20:58,880 --> 01:21:04,720 Speaker 2: she's a big advertising person, a pretty high-up executive who worked 1333 01:21:04,720 --> 01:21:07,120 Speaker 2: her way up, but kind of saw the writing on 1334 01:21:07,160 --> 01:21:10,400 Speaker 2: the wall when she didn't get the gig, and had 1335 01:21:10,439 --> 01:21:16,599 Speaker 2: always been angling for a top job. Her politics aligned 1336 01:21:16,760 --> 01:21:21,040 Speaker 2: with Elon. You know, she was involved with some Trump 1337 01:21:21,080 --> 01:21:26,040 Speaker 2: councils in the first Trump administration, and her Twitter feed 1338 01:21:26,200 --> 01:21:28,479 Speaker 2: was pretty obviously right leaning. There were a lot of 1339 01:21:28,520 --> 01:21:30,920 Speaker 2: engagements with right wing accounts and that kind of thing, 1340 01:21:31,880 --> 01:21:36,400 Speaker 2: and she saw an opportunity, so they began a conversation.
1341 01:21:37,840 --> 01:21:41,960 Speaker 2: Remember Elon had that very bizarre period in December twenty 1342 01:21:42,000 --> 01:21:43,920 Speaker 2: twenty two where he had a lot of self doubt 1343 01:21:43,960 --> 01:21:46,360 Speaker 2: about whether he wanted to run the company, and he 1344 01:21:46,400 --> 01:21:48,320 Speaker 2: tweeted out that thing about, you know, should I still 1345 01:21:48,360 --> 01:21:53,120 Speaker 2: be CEO of Twitter. It's this dark moment in the book, 1346 01:21:53,160 --> 01:21:56,559 Speaker 2: and I think one thing you realize about Elon 1347 01:21:56,640 --> 01:21:59,080 Speaker 2: is he's very human. He goes through these very ups 1348 01:21:59,120 --> 01:22:03,240 Speaker 2: and downs, and at that point in time, our reporting 1349 01:22:04,640 --> 01:22:06,559 Speaker 2: put him in like a very dark place. You know, 1350 01:22:06,600 --> 01:22:09,400 Speaker 2: he had a lot of self doubt, and so there 1351 01:22:09,520 --> 01:22:11,920 Speaker 2: was kind of some public chatter about, you know, maybe 1352 01:22:11,960 --> 01:22:14,000 Speaker 2: he's looking for someone to like take the burden off. 1353 01:22:14,840 --> 01:22:16,800 Speaker 2: And so Linda Yaccarino comes in and they start to 1354 01:22:16,840 --> 01:22:19,800 Speaker 2: have those conversations. She interviews him at an 1355 01:22:19,880 --> 01:22:23,559 Speaker 2: ads conference where they have decent rapport on stage, and 1356 01:22:23,600 --> 01:22:27,080 Speaker 2: a couple months later she's named as the CEO. You know, 1357 01:22:27,160 --> 01:22:29,679 Speaker 2: I think anyone looking at that company knows who calls 1358 01:22:29,720 --> 01:22:34,320 Speaker 2: the shots.
But also Linda oversees a lot of the 1359 01:22:34,360 --> 01:22:37,720 Speaker 2: aspects of the business that Elon just has no care for, 1360 01:22:37,880 --> 01:22:44,200 Speaker 2: and that particular part is advertising, because Twitter, despite what 1361 01:22:44,240 --> 01:22:46,559 Speaker 2: Elon has said about subscriptions and despite what he has 1362 01:22:46,600 --> 01:22:51,640 Speaker 2: said about wanting to develop new revenue streams, is supremely 1363 01:22:52,000 --> 01:22:55,439 Speaker 2: dependent on advertising, and so Linda is there to do 1364 01:22:55,479 --> 01:23:01,559 Speaker 2: that but also carry his message. So you often get 1365 01:23:01,560 --> 01:23:05,800 Speaker 2: this very weird dynamic where she has to speak out 1366 01:23:05,840 --> 01:23:08,240 Speaker 2: of both sides of her mouth, or she has to 1367 01:23:08,280 --> 01:23:13,679 Speaker 2: parrot what Elon says, and yeah, it's 1368 01:23:13,680 --> 01:23:18,040 Speaker 2: fascinating watching her and what she says online. 1369 01:23:18,280 --> 01:23:22,320 Speaker 1: Okay, then you have the famous meeting with the advertisers 1370 01:23:22,720 --> 01:23:25,840 Speaker 1: where Elon basically says, fuck you, we don't need you. 1371 01:23:26,520 --> 01:23:30,920 Speaker 1: What's the status of advertising today on X? 1372 01:23:32,560 --> 01:23:34,680 Speaker 2: So I think you're referencing the DealBook conference where 1373 01:23:34,680 --> 01:23:37,360 Speaker 2: he says, go fuck yourself. Yes, 1374 01:23:37,360 --> 01:23:39,960 Speaker 2: Bob Iger's in the audience, that kind of thing, you know. 1375 01:23:40,160 --> 01:23:45,040 Speaker 2: I often forget about that moment, 1376 01:23:45,400 --> 01:23:48,120 Speaker 2: but I also remember where I was for it, 1377 01:23:48,160 --> 01:23:52,160 Speaker 2: and to think, that feels like a decade ago now.
1378 01:23:52,200 --> 01:23:57,880 Speaker 2: But the status of advertising on X is it's still declining. 1379 01:23:58,560 --> 01:24:00,680 Speaker 2: We get kind of piecemeal numbers. Obviously they 1380 01:24:00,680 --> 01:24:04,120 Speaker 2: don't have to report those anymore. But we get some 1381 01:24:04,200 --> 01:24:07,920 Speaker 2: numbers internally; in the US, for example, it's 1382 01:24:07,920 --> 01:24:11,559 Speaker 2: still declining year over year, after a pretty bad 1383 01:24:11,640 --> 01:24:15,280 Speaker 2: year in the previous twelve months, it's still declining. 1384 01:24:16,080 --> 01:24:20,320 Speaker 2: And so Elon has wanted to put out this message 1385 01:24:20,360 --> 01:24:25,360 Speaker 2: that actually advertisers are returning, but we should 1386 01:24:25,360 --> 01:24:29,160 Speaker 2: all be kind of skeptical about what returning means. Does 1387 01:24:29,160 --> 01:24:31,880 Speaker 2: that mean they're returning and spending five dollars? You know, 1388 01:24:31,880 --> 01:24:34,720 Speaker 2: what's the actual spend is actually the main question. And 1389 01:24:35,600 --> 01:24:39,120 Speaker 2: I pay attention a lot to 1390 01:24:39,160 --> 01:24:42,719 Speaker 2: what's on the Explore page, the trending topics. 1391 01:24:43,080 --> 01:24:46,240 Speaker 2: You see that, right? So that big, like, banner 1392 01:24:46,320 --> 01:24:49,040 Speaker 2: ad is anywhere from two 1393 01:24:49,080 --> 01:24:51,200 Speaker 2: hundred fifty to five hundred thousand dollars a day to 1394 01:24:51,280 --> 01:24:53,640 Speaker 2: take over that page and run that banner ad, and 1395 01:24:53,680 --> 01:24:56,080 Speaker 2: it's often very empty.
You know, that used to be 1396 01:24:56,120 --> 01:25:00,360 Speaker 2: a place where artists promoted their new albums, or there 1397 01:25:00,360 --> 01:25:03,719 Speaker 2: were movie promotions, or if there was a new Apple, 1398 01:25:03,960 --> 01:25:07,680 Speaker 2: you know, whatever, iPhone launch or whatever, they would take 1399 01:25:07,720 --> 01:25:10,400 Speaker 2: over that space. And that goes empty a lot of 1400 01:25:10,439 --> 01:25:13,479 Speaker 2: the time. And that just speaks to where the platform 1401 01:25:13,560 --> 01:25:17,240 Speaker 2: is at with advertisers being very queasy about the content 1402 01:25:17,600 --> 01:25:18,880 Speaker 2: and the people that remain on X. 1403 01:25:20,800 --> 01:25:26,160 Speaker 1: Okay. A month or two ago, Tim Cook, 1404 01:25:26,280 --> 01:25:31,120 Speaker 1: Tim Apple, very publicly said, oh, yeah, we're advertising on 1405 01:25:31,439 --> 01:25:34,840 Speaker 1: X. So you talk about Andreessen Horowitz, talk about 1406 01:25:34,840 --> 01:25:38,080 Speaker 1: Morgan Stanley, these people looking to the future. To what 1407 01:25:38,240 --> 01:25:42,400 Speaker 1: degree are other companies afraid of Elon? 1408 01:25:43,680 --> 01:25:49,640 Speaker 2: Oh, completely. You know, Cook got in a fight with 1409 01:25:49,680 --> 01:25:52,960 Speaker 2: Elon very famously and then had to like invite him 1410 01:25:53,000 --> 01:25:55,600 Speaker 2: to headquarters to like smooth it over. And that was 1411 01:25:55,640 --> 01:25:59,920 Speaker 2: over the thirty percent fee in the App Store, right. 1412 01:26:00,320 --> 01:26:07,120 Speaker 2: And you know, I think even the top 1413 01:26:07,160 --> 01:26:10,360 Speaker 2: of the top, the top CEOs, Sundar Pichai, Satya Nadella, 1414 01:26:11,040 --> 01:26:14,360 Speaker 2: Reid Hoffman, they don't want to be seen on the 1415 01:26:14,400 --> 01:26:17,559 Speaker 2: other end of a shotgun blast from Elon Musk.
You know, 1416 01:26:17,600 --> 01:26:23,040 Speaker 2: it creates a huge headache for you. And on some level, 1417 01:26:23,360 --> 01:26:25,880 Speaker 2: you know, like if he tweets at you, you're just gonna 1418 01:26:25,920 --> 01:26:30,360 Speaker 2: have millions of people coming after you and denigrating you 1419 01:26:30,479 --> 01:26:37,080 Speaker 2: and, you know, just criticizing you. And I think 1420 01:26:37,280 --> 01:26:39,639 Speaker 2: all of them believe that it's just easier to play 1421 01:26:39,720 --> 01:26:42,120 Speaker 2: nice with him. And so when Tim Cook says we're 1422 01:26:42,160 --> 01:26:45,920 Speaker 2: still advertising on X, it's interesting, right? Like, they still 1423 01:26:45,920 --> 01:26:48,200 Speaker 2: could be advertising on X, but what is the amount? 1424 01:26:48,439 --> 01:26:52,080 Speaker 2: You know, how does that compare to the past? And 1425 01:26:52,120 --> 01:26:55,880 Speaker 2: bear in mind that these companies also 1426 01:26:55,880 --> 01:26:57,519 Speaker 2: have to do what's best for their business as well, 1427 01:26:57,560 --> 01:27:00,479 Speaker 2: and so if they are trying to break out of 1428 01:27:00,520 --> 01:27:06,000 Speaker 2: the duopoly of Google and Facebook in online advertising, you know, 1429 01:27:06,439 --> 01:27:11,160 Speaker 2: X offers a pretty reliable place, I guess, or at 1430 01:27:11,240 --> 01:27:14,479 Speaker 2: least an alternative. And that's not new to Elon. That's 1431 01:27:14,479 --> 01:27:16,880 Speaker 2: how it's always been. That's why Twitter was able to 1432 01:27:16,880 --> 01:27:19,840 Speaker 2: survive as a kind of alternative to the 1433 01:27:19,840 --> 01:27:26,160 Speaker 2: two giants. But when Tim Cook says something like that, 1434 01:27:26,240 --> 01:27:28,920 Speaker 2: it definitely feels like a very placating move, right? 1435 01:27:28,960 --> 01:27:33,240 Speaker 2: It's an olive branch to Elon.
1436 01:27:34,080 --> 01:27:38,040 Speaker 1: Okay, you follow this very closely. What's the long term, 1437 01:27:38,560 --> 01:27:42,040 Speaker 1: maybe not that long term, not immediate, but short term 1438 01:27:42,320 --> 01:27:48,720 Speaker 1: prognosis with X? Do they miss the debt payments? Is 1439 01:27:48,760 --> 01:27:51,639 Speaker 1: it essentially like Truth Social, it's gonna live forever 1440 01:27:51,880 --> 01:27:54,760 Speaker 1: irrelevant of the economics? What's gonna happen here? 1441 01:27:56,000 --> 01:27:58,759 Speaker 2: I don't think they miss the debt payments. I wouldn't 1442 01:27:58,760 --> 01:28:01,640 Speaker 2: be surprised if Elon's treasury and finance people did some negotiation; 1443 01:28:01,680 --> 01:28:04,760 Speaker 2: that's probably actually already happened, there's been reporting on 1444 01:28:04,800 --> 01:28:09,599 Speaker 2: that. X will operate as long as Elon sees fit. 1445 01:28:09,720 --> 01:28:12,639 Speaker 2: You know, it's his favorite toy. He's created the thing 1446 01:28:15,360 --> 01:28:18,000 Speaker 2: that he wants the most. It's essentially his echo chamber, right. 1447 01:28:18,240 --> 01:28:20,759 Speaker 2: It's the place where he's now become the most followed user, 1448 01:28:21,240 --> 01:28:24,519 Speaker 2: he's the most engaged-with account. This wasn't his intent 1449 01:28:24,560 --> 01:28:26,200 Speaker 2: when he bought the company, but he's been able to 1450 01:28:26,320 --> 01:28:31,599 Speaker 2: use it as leverage and get the presidential candidate 1451 01:28:31,600 --> 01:28:35,760 Speaker 2: he wanted into the White House. So for him, 1452 01:28:35,760 --> 01:28:40,200 Speaker 2: absent all the discussion about finances 1453 01:28:40,200 --> 01:28:46,519 Speaker 2: and everything, it's been a great success, and he's shaped 1454 01:28:46,560 --> 01:28:50,160 Speaker 2: it in a way.
He's driven away the reporters that 1455 01:28:50,280 --> 01:28:52,960 Speaker 2: used to thrive on there, the news outlets that used 1456 01:28:52,960 --> 01:28:55,320 Speaker 2: to thrive on there, the people who used to criticize them, 1457 01:28:56,280 --> 01:28:59,960 Speaker 2: and recreated it to become this place where his favorite 1458 01:29:00,200 --> 01:29:02,679 Speaker 2: people have the blue check marks, they've bought them. They're 1459 01:29:02,680 --> 01:29:05,880 Speaker 2: elevated in his replies. They're the ones being shared on 1460 01:29:05,920 --> 01:29:12,559 Speaker 2: the For You page, the algorithmically curated page. And yeah, 1461 01:29:12,680 --> 01:29:15,920 Speaker 2: I don't think he could be happier, in some ways. You know, 1462 01:29:16,360 --> 01:29:19,519 Speaker 2: his net worth has never been higher. You know, he's 1463 01:29:19,560 --> 01:29:21,719 Speaker 2: now worth three hundred and twenty, three twenty five billion 1464 01:29:21,760 --> 01:29:25,080 Speaker 2: dollars, give or take, on any given day. So we 1465 01:29:25,120 --> 01:29:29,559 Speaker 2: talk about the financial problems that Twitter has had, but 1466 01:29:29,600 --> 01:29:32,320 Speaker 2: again, in the plus minus column, like, 1467 01:29:32,479 --> 01:29:38,040 Speaker 2: he's up, you know. So, yeah, 1468 01:29:38,160 --> 01:29:41,000 Speaker 2: I think X continues to exist for however long 1469 01:29:41,040 --> 01:29:41,600 Speaker 2: he wants it to. 1470 01:29:42,280 --> 01:29:48,880 Speaker 1: If Elon doesn't buy Twitter, Elon doesn't put money into 1471 01:29:48,920 --> 01:29:51,360 Speaker 1: the election process, does Trump win? 1472 01:29:54,600 --> 01:30:04,280 Speaker 2: Oh man, I gotta get my DeLorean out. I don't know. 1473 01:30:05,000 --> 01:30:06,560 Speaker 2: It's a bad place to be in as a reporter, to 1474 01:30:06,600 --> 01:30:13,240 Speaker 2: make these kinds of predictions. Yeah. Maybe.
If 1475 01:30:13,240 --> 01:30:18,400 Speaker 2: he doesn't buy Twitter, he doesn't become so steeped in 1476 01:30:18,600 --> 01:30:23,600 Speaker 2: right wing culture. You know, after he buys Twitter, he 1477 01:30:24,120 --> 01:30:28,760 Speaker 2: becomes really invested in immigration and starts pushing, effectively, the 1478 01:30:28,800 --> 01:30:33,920 Speaker 2: great replacement theory. He starts caring about voter fraud and, 1479 01:30:34,080 --> 01:30:36,599 Speaker 2: you know, the things that have been talked about 1480 01:30:36,600 --> 01:30:41,080 Speaker 2: ad nauseam on Fox News. And he starts engaging with 1481 01:30:41,080 --> 01:30:43,479 Speaker 2: folks like Tucker Carlson and that kind of stuff. So 1482 01:30:45,120 --> 01:30:50,439 Speaker 2: I view that process as radicalizing him towards Trump, because 1483 01:30:51,160 --> 01:30:54,160 Speaker 2: you have to remember, he was never a Trumper. He 1484 01:30:54,240 --> 01:30:58,599 Speaker 2: supported Ron DeSantis as late as May twenty 1485 01:30:58,680 --> 01:31:01,760 Speaker 2: twenty three. You remember that Twitter Space he had where 1486 01:31:01,800 --> 01:31:04,640 Speaker 2: DeSantis launched his, you know, his disastrous, but he 1487 01:31:04,720 --> 01:31:11,120 Speaker 2: launched his presidential campaign on X with Elon. But 1488 01:31:12,360 --> 01:31:17,599 Speaker 2: owning Twitter, engaging in that sphere, really pushed him further 1489 01:31:17,680 --> 01:31:20,920 Speaker 2: and further to the right, to the point where he 1490 01:31:21,040 --> 01:31:25,960 Speaker 2: arrived on Trump's doorstep and they began a fast and 1491 01:31:26,479 --> 01:31:29,120 Speaker 2: aggressive relationship that kind of continues to this day. 1492 01:31:29,920 --> 01:31:34,479 Speaker 1: Okay, is this bromance going to continue? Really? Elon?
1493 01:31:34,560 --> 01:31:38,799 Speaker 1: Although South African, he's a classic American one-man show. 1494 01:31:39,439 --> 01:31:43,600 Speaker 1: What do we know? Trump is uneducated, stupid, and impulsive, 1495 01:31:44,320 --> 01:31:48,040 Speaker 1: so therefore, seeing over the overall landscape and being manipulative, 1496 01:31:48,479 --> 01:31:51,160 Speaker 1: what you see is what you get. What we learned 1497 01:31:51,280 --> 01:31:55,920 Speaker 1: with Trump is, hey, barring, you know, our worst fears, 1498 01:31:56,640 --> 01:32:01,320 Speaker 1: he's constrained by the government. Also, he may not agree 1499 01:32:01,600 --> 01:32:06,799 Speaker 1: with Elon on certain things. Also, Elon likes to move fast; 1500 01:32:06,880 --> 01:32:11,559 Speaker 1: the government moves at a glacial pace. To continue would 1501 01:32:11,600 --> 01:32:17,240 Speaker 1: absolutely require compromise on Elon's part. Is that going to happen, 1502 01:32:17,800 --> 01:32:19,640 Speaker 1: or is Elon going to jump the rails at some 1503 01:32:19,720 --> 01:32:22,400 Speaker 1: point and say, you know, this is just fucked up. You 1504 01:32:22,439 --> 01:32:24,160 Speaker 1: don't know what you're doing. I gotta lead. 1505 01:32:25,320 --> 01:32:28,360 Speaker 2: I think that's a big question. I think Trump does want 1506 01:32:28,400 --> 01:32:32,080 Speaker 2: Elon to break some of those processes, right. That's 1507 01:32:32,080 --> 01:32:35,080 Speaker 2: why he's brought him in for the Department of 1508 01:32:35,120 --> 01:32:42,200 Speaker 2: Government Efficiency, right. I think it's more analyzing it 1509 01:32:42,240 --> 01:32:45,280 Speaker 2: on a kind of one to one relationship standpoint as 1510 01:32:45,280 --> 01:32:51,280 Speaker 2: opposed to anything policy related, like, can their egos exist 1511 01:32:51,400 --> 01:32:54,599 Speaker 2: in the same space for however long?
We've already started 1512 01:32:54,600 --> 01:32:58,160 Speaker 2: to see reports about folks around Trump tiring of the 1513 01:32:58,240 --> 01:33:03,280 Speaker 2: amount of time that Elon has spent at the president's 1514 01:33:03,280 --> 01:33:06,960 Speaker 2: side in Mar-a-Lago. That's the big question, you know. 1515 01:33:07,360 --> 01:33:09,360 Speaker 2: I wish I was in the game of predictions, but 1516 01:33:10,280 --> 01:33:14,400 Speaker 2: historically they have tended not to get along well with 1517 01:33:14,479 --> 01:33:21,639 Speaker 2: other people that have similar egos, similar desires to take 1518 01:33:21,640 --> 01:33:27,600 Speaker 2: control and be the leader. In my opinion, though, and 1519 01:33:27,720 --> 01:33:29,960 Speaker 2: this is complete speculation, I think both of them are 1520 01:33:30,000 --> 01:33:34,760 Speaker 2: aware of that, and they understand that, like, they're being 1521 01:33:34,800 --> 01:33:37,800 Speaker 2: watched in this way, and so to do things as 1522 01:33:37,800 --> 01:33:41,559 Speaker 2: they've always done them might not be the way to go. 1523 01:33:42,560 --> 01:33:47,280 Speaker 2: But who knows. I mean, these guys make split-second decisions, 1524 01:33:47,439 --> 01:33:51,880 Speaker 2: especially Elon, and one day he could wake up and 1525 01:33:51,880 --> 01:33:55,599 Speaker 2: find himself on the other side of Trump's good list 1526 01:33:55,680 --> 01:33:59,600 Speaker 2: or bad list, and it may lead to fireworks, but 1527 01:33:59,640 --> 01:34:01,479 Speaker 2: hopefully we're there to report it out.
1528 01:34:02,439 --> 01:34:07,680 Speaker 1: Okay. So for two years Elon has been heavily invested 1529 01:34:07,720 --> 01:34:12,639 Speaker 1: in X. Prior to him becoming very right wing, when 1530 01:34:12,680 --> 01:34:17,080 Speaker 1: the lefties were still gung ho on buying Teslas, the question was, 1531 01:34:18,240 --> 01:34:21,360 Speaker 1: and this was an issue for the stock, who's running 1532 01:34:21,400 --> 01:34:24,679 Speaker 1: the ship? He's so busy at X. Is the real 1533 01:34:24,720 --> 01:34:28,559 Speaker 1: story that he's just a figurehead and these companies run 1534 01:34:28,600 --> 01:34:29,480 Speaker 1: by themselves? 1535 01:34:30,680 --> 01:34:34,120 Speaker 2: It's actually really interesting that he's developed these kind of 1536 01:34:34,160 --> 01:34:38,040 Speaker 2: deep benches at these companies, and that's largely because 1537 01:34:38,200 --> 01:34:40,720 Speaker 2: at Tesla and SpaceX he's been able to shape them 1538 01:34:40,720 --> 01:34:43,160 Speaker 2: from the ground up. You know, Tesla, he was an 1539 01:34:43,160 --> 01:34:47,240 Speaker 2: early investor that took over; SpaceX he founded. But 1540 01:34:47,320 --> 01:34:49,640 Speaker 2: he was able to mold these companies and put in 1541 01:34:49,840 --> 01:34:53,960 Speaker 2: place the people he wants to oversee, you know, at 1542 01:34:54,000 --> 01:34:59,360 Speaker 2: SpaceX, rocket manufacturing or design. So all these people are 1543 01:34:59,840 --> 01:35:02,400 Speaker 2: very, very familiar with how he works, and what we've 1544 01:35:02,439 --> 01:35:05,640 Speaker 2: heard from people at these companies is he's kind of 1545 01:35:05,640 --> 01:35:09,479 Speaker 2: the troubleshooter.
If there's a major problem that's happening at 1546 01:35:09,520 --> 01:35:12,080 Speaker 2: Tesla or SpaceX, let's say in twenty eighteen, for example, 1547 01:35:12,080 --> 01:35:16,000 Speaker 2: on the factory line manufacturing Model 3s, 1548 01:35:16,040 --> 01:35:22,160 Speaker 2: he'll get in and try and problem solve very 1549 01:35:23,479 --> 01:35:27,160 Speaker 2: aggressively and spend all his time on that problem until 1550 01:35:27,200 --> 01:35:31,360 Speaker 2: it's fixed. And so actually, if you think about it, 1551 01:35:31,400 --> 01:35:35,439 Speaker 2: like, people call Elon the sun at these companies, and 1552 01:35:35,520 --> 01:35:37,360 Speaker 2: you don't want to get too close to the sun or you get burned. 1553 01:35:37,800 --> 01:35:40,120 Speaker 2: So like, if he's working with you 1554 01:35:41,439 --> 01:35:44,559 Speaker 2: on your part of the company or whatever project, 1555 01:35:46,000 --> 01:35:48,040 Speaker 2: you're probably fucked. You're probably 1556 01:35:48,080 --> 01:35:49,680 Speaker 2: in deep shit, right. You don't want to be in 1557 01:35:49,720 --> 01:35:51,000 Speaker 2: that position. You don't want to be in the same 1558 01:35:51,080 --> 01:35:55,120 Speaker 2: room with him. And so, by and large, these companies, 1559 01:35:55,240 --> 01:35:58,680 Speaker 2: when there is no crisis, run themselves. SpaceX has 1560 01:35:58,720 --> 01:36:02,000 Speaker 2: a very strong number two in Gwynne Shotwell, and he 1561 01:36:02,080 --> 01:36:04,559 Speaker 2: has other VPs around him as well that 1562 01:36:04,600 --> 01:36:08,880 Speaker 2: run that company and know way more about rocket 1563 01:36:08,920 --> 01:36:11,759 Speaker 2: science or manufacturing than he does, and he defers to them. 1564 01:36:12,120 --> 01:36:16,679 Speaker 2: Same thing at Tesla. With X, he didn't have that, right?
1565 01:36:17,200 --> 01:36:20,760 Speaker 2: This wasn't a company he built up from the ground up, 1566 01:36:20,880 --> 01:36:23,439 Speaker 2: and so when he came in and fired everyone, there 1567 01:36:23,479 --> 01:36:26,080 Speaker 2: was really no one he could trust. He 1568 01:36:26,080 --> 01:36:29,840 Speaker 2: couldn't leave and, like, let this thing operate week to week. 1569 01:36:30,600 --> 01:36:33,439 Speaker 2: So that's kind of what explains why he spent so 1570 01:36:33,520 --> 01:36:36,799 Speaker 2: much time there after he took over, and explains 1571 01:36:36,800 --> 01:36:41,519 Speaker 2: why he eventually hired Linda Yaccarino. But that's just 1572 01:36:41,520 --> 01:36:44,040 Speaker 2: an understanding of how he operates. So X is, I 1573 01:36:44,040 --> 01:36:46,080 Speaker 2: guess, getting to the point where it can maybe 1574 01:36:46,080 --> 01:36:48,080 Speaker 2: be a little more self-sustainable and he's not spending 1575 01:36:48,120 --> 01:36:52,040 Speaker 2: all his time there. But yeah, that's just how 1576 01:36:52,080 --> 01:36:52,479 Speaker 2: he works. 1577 01:36:52,880 --> 01:36:59,479 Speaker 1: Okay. Steve Jobs, the legendary entrepreneur before Elon Musk. It's 1578 01:36:59,520 --> 01:37:04,640 Speaker 1: been well documented that the other Steve, Wozniak, was the 1579 01:37:04,680 --> 01:37:09,960 Speaker 1: brilliant engineer, and Steve Jobs was an incredible marketer and 1580 01:37:10,000 --> 01:37:15,840 Speaker 1: an incredible trend spotter. What is Elon's expertise? Is he 1581 01:37:15,920 --> 01:37:17,480 Speaker 1: really that great an engineer? 1582 01:37:20,160 --> 01:37:22,639 Speaker 2: As I said earlier, he's a great salesman.
He's 1583 01:37:22,680 --> 01:37:27,120 Speaker 2: great at pushing people towards a vision and 1584 01:37:27,200 --> 01:37:31,280 Speaker 2: motivating them, collecting them in a room and getting them 1585 01:37:31,439 --> 01:37:34,800 Speaker 2: marching in the same direction. And that's for the cause 1586 01:37:34,880 --> 01:37:40,480 Speaker 2: of making humanity multiplanetary. It's for the cause of electrifying 1587 01:37:41,280 --> 01:37:43,959 Speaker 2: cars and bringing about the EV revolution. 1588 01:37:46,000 --> 01:37:50,719 Speaker 2: And it's funny, because Jobs and Musk are often 1589 01:37:50,960 --> 01:37:54,960 Speaker 2: talked about as having this quote reality distortion field, right, 1590 01:37:55,040 --> 01:38:00,040 Speaker 2: this ability to, you know, if it's not real in 1591 01:38:00,040 --> 01:38:02,880 Speaker 2: real life, they're able to make it happen. You know, 1592 01:38:03,600 --> 01:38:06,920 Speaker 2: no one saw the iPhone before Steve Jobs, no one 1593 01:38:06,960 --> 01:38:11,160 Speaker 2: saw self-landing rockets before Elon. And they're able to 1594 01:38:11,200 --> 01:38:14,599 Speaker 2: push people and the boundaries to get them working towards 1595 01:38:14,640 --> 01:38:21,880 Speaker 2: these once impossible goals. And you know, I 1596 01:38:21,880 --> 01:38:24,880 Speaker 2: think it's borrowed from Star Trek. I'm not a Trekkie, 1597 01:38:24,960 --> 01:38:27,280 Speaker 2: but I think the reality distortion field is a Star 1598 01:38:27,360 --> 01:38:29,720 Speaker 2: Trek thing. But we've talked to so many people that 1599 01:38:30,240 --> 01:38:34,120 Speaker 2: say that about someone like Elon, right, like that he 1600 01:38:34,280 --> 01:38:39,040 Speaker 2: has that ability to make people believe. And you see 1601 01:38:39,040 --> 01:38:41,439 Speaker 2: that quality not just at his companies, but outside his companies.
1602 01:38:41,479 --> 01:38:44,599 Speaker 2: You talk about the Robotaxi event. Again, he got people 1603 01:38:44,640 --> 01:38:48,000 Speaker 2: believing in this thing that's not even real. You know, 1604 01:38:48,040 --> 01:38:50,519 Speaker 2: that was on a movie set, you know, ferrying people 1605 01:38:50,600 --> 01:38:53,639 Speaker 2: around from fake city to fake city on a movie set. 1606 01:38:54,720 --> 01:38:58,439 Speaker 2: And when we talk about, like, his greatest quality as 1607 01:38:58,439 --> 01:39:00,200 Speaker 2: an entrepreneur, I think that's it. 1608 01:39:02,080 --> 01:39:03,880 Speaker 1: Okay. But is he a good engineer? 1609 01:39:08,680 --> 01:39:13,880 Speaker 2: Uh, I guess it depends. Yeah, he's given himself the 1610 01:39:13,960 --> 01:39:16,840 Speaker 2: chief engineer title, I think, at some of his companies. 1611 01:39:21,720 --> 01:39:23,080 Speaker 2: I don't know enough about his day to day at 1612 01:39:23,120 --> 01:39:25,960 Speaker 2: Tesla and SpaceX. At Twitter, for example, he is not 1613 01:39:26,080 --> 01:39:32,200 Speaker 2: in the code, he is not tweaking the algorithm. In fact, 1614 01:39:32,320 --> 01:39:35,120 Speaker 2: what we reported is he barely even uses a 1615 01:39:35,160 --> 01:39:38,040 Speaker 2: computer. You know, he had to have 1616 01:39:38,080 --> 01:39:40,320 Speaker 2: an assistant that set up his computer for him whenever 1617 01:39:40,360 --> 01:39:43,920 Speaker 2: he needed to really use a computer. But he views 1618 01:39:43,960 --> 01:39:47,600 Speaker 2: everything on his iPhone. Presentations have to be delivered to 1619 01:39:47,680 --> 01:39:50,240 Speaker 2: him so they're readable on, you know, whatever the 1620 01:39:50,280 --> 01:39:53,599 Speaker 2: size screen of an iPhone is, and from there he's 1621 01:39:53,600 --> 01:39:56,400 Speaker 2: able to dictate his vision and, you know, tell people 1622 01:39:56,520 --> 01:40:00,000 Speaker 2: and direct them.
But I don't think he's doing much 1623 01:40:00,080 --> 01:40:04,000 Speaker 2: engineering these days, particularly at Twitter or X. 1624 01:40:04,760 --> 01:40:08,839 Speaker 1: Okay, my dealings with Walter Isaacson have been relatively minimal, 1625 01:40:09,560 --> 01:40:12,080 Speaker 1: but I got a bug up my ass based on 1626 01:40:12,200 --> 01:40:17,120 Speaker 1: that dealing and the reverence this guy gets. So he 1627 01:40:17,160 --> 01:40:21,360 Speaker 1: writes a book on Elon. People who don't 1628 01:40:21,400 --> 01:40:25,120 Speaker 1: follow this space rave, and then everybody who follows 1629 01:40:25,120 --> 01:40:28,640 Speaker 1: this space said this guy drank the Kool-Aid. And 1630 01:40:28,680 --> 01:40:33,559 Speaker 1: then reading your book, where Elon literally asked Walter Isaacson 1631 01:40:33,640 --> 01:40:36,880 Speaker 1: for advice, whether he was conscious of what this would do for the 1632 01:40:36,880 --> 01:40:42,040 Speaker 1: book or not, that compromises the entire reporting. But as 1633 01:40:42,080 --> 01:40:44,760 Speaker 1: a reporter, I'm sure you've read the book. What do 1634 01:40:44,840 --> 01:40:46,479 Speaker 1: you say about Isaacson's book? 1635 01:40:47,400 --> 01:40:52,200 Speaker 2: I've read the book, and I'll say this: I 1636 01:40:52,240 --> 01:40:54,800 Speaker 2: read the Steve Jobs book cover to cover. 1637 01:40:55,880 --> 01:40:58,680 Speaker 2: I loved it. You know, I learned so much about him. 1638 01:40:59,000 --> 01:41:01,240 Speaker 2: But then what I started to realize is that there are 1639 01:41:02,080 --> 01:41:06,439 Speaker 2: issues with these kind of great man biographies.
Right? You're 1640 01:41:06,479 --> 01:41:11,200 Speaker 2: starting with this premise that these are great men and 1641 01:41:14,840 --> 01:41:17,400 Speaker 2: that they can be forgiven in some instances because they 1642 01:41:17,400 --> 01:41:19,519 Speaker 2: are such great men, because they have done such great things. 1643 01:41:20,560 --> 01:41:23,800 Speaker 2: And I think, as a reporter, leading with that 1644 01:41:25,240 --> 01:41:30,160 Speaker 2: is uncomfortable for me, particularly because I view my job 1645 01:41:30,200 --> 01:41:33,479 Speaker 2: as having to interrogate power. Like, yes, on one hand, 1646 01:41:33,520 --> 01:41:36,640 Speaker 2: I can understand that he's built great companies, but on 1647 01:41:36,680 --> 01:41:39,840 Speaker 2: the other hand, if he is harming, if people are 1648 01:41:39,840 --> 01:41:43,479 Speaker 2: getting hurt in his factories, if he is breaking the law, like, 1649 01:41:44,040 --> 01:41:46,439 Speaker 2: it's my job to report that and expose that and 1650 01:41:46,479 --> 01:41:51,040 Speaker 2: show that, and understand that there are costs to being 1651 01:41:52,200 --> 01:41:56,519 Speaker 2: this supposed great man. And I think, you know, with 1652 01:41:56,560 --> 01:41:59,160 Speaker 2: the Isaacson case, we'll just let the reporting do the talking. 1653 01:41:59,200 --> 01:42:02,799 Speaker 2: You know, our reporting showed that he was providing advice 1654 01:42:03,040 --> 01:42:07,280 Speaker 2: to Elon on Twitter Blue, the subscription service. He was 1655 01:42:07,600 --> 01:42:09,880 Speaker 2: telling him that a lot of people would use the service, 1656 01:42:09,920 --> 01:42:12,559 Speaker 2: that, you know, he should price it low 1657 01:42:12,640 --> 01:42:16,200 Speaker 2: because it's going to be such a game changer. He 1658 01:42:16,400 --> 01:42:20,160 Speaker 2: was giving Elon advice on labeling.
You know, there was 1659 01:42:20,160 --> 01:42:23,720 Speaker 2: that kind of tiff with NPR and whether or not 1660 01:42:23,880 --> 01:42:27,120 Speaker 2: it should be labeled as state-sponsored media, and apparently 1661 01:42:27,120 --> 01:42:31,120 Speaker 2: Isaacson got involved with that as well. And as a reporter, 1662 01:42:31,280 --> 01:42:33,120 Speaker 2: that's just not a place I want to be in. 1663 01:42:33,240 --> 01:42:36,080 Speaker 2: You know, yes, 1664 01:42:36,120 --> 01:42:37,559 Speaker 2: I want to be in the room with my subject 1665 01:42:37,600 --> 01:42:39,760 Speaker 2: if they allow me to. But I'm not going to 1666 01:42:39,760 --> 01:42:43,960 Speaker 2: compromise my integrity as a reporter. I don't want to be 1667 01:42:44,000 --> 01:42:47,880 Speaker 2: an advisor to him by any means. And I think 1668 01:42:47,920 --> 01:42:53,439 Speaker 2: that's the trap that he fell into. You know, Elon 1669 01:42:53,560 --> 01:42:57,760 Speaker 2: is extremely influential. He has a gravitational pull. 1670 01:42:58,000 --> 01:43:01,360 Speaker 2: We see this with all the people that come to 1671 01:43:01,439 --> 01:43:04,320 Speaker 2: kind of kiss his ring, and, you know, a lot 1672 01:43:04,320 --> 01:43:09,280 Speaker 2: of people described it as a king's court, and 1673 01:43:09,439 --> 01:43:12,160 Speaker 2: I guess if you're around it long enough, 1674 01:43:12,160 --> 01:43:15,800 Speaker 2: you fall into that. And we see that time and 1675 01:43:15,840 --> 01:43:19,559 Speaker 2: time again in our book with Walter, and I think 1676 01:43:19,600 --> 01:43:23,080 Speaker 2: that clouds your judgment when it comes to reporting. I 1677 01:43:23,120 --> 01:43:25,479 Speaker 2: think one of my main issues with his book is 1678 01:43:25,479 --> 01:43:29,040 Speaker 2: that he took Elon at face value on a lot 1679 01:43:29,080 --> 01:43:32,560 Speaker 2: of reporting.
You know, as a reporter, you want to 1680 01:43:32,560 --> 01:43:39,280 Speaker 2: get multiple perspectives. And so for example, Elon's daughter has 1681 01:43:39,320 --> 01:43:41,760 Speaker 2: come out as trans and said, you know, you've 1682 01:43:41,760 --> 01:43:43,720 Speaker 2: written about me a lot in your book, but you 1683 01:43:43,840 --> 01:43:46,080 Speaker 2: never reached out to me to get my perspective, and 1684 01:43:46,120 --> 01:43:48,320 Speaker 2: you just took my dad's. And they don't have a 1685 01:43:48,400 --> 01:43:50,920 Speaker 2: very good relationship. In fact, Elon has said that his daughter 1686 01:43:51,000 --> 01:43:54,240 Speaker 2: is dead to him. So he simply took Elon's perspective 1687 01:43:54,360 --> 01:44:00,600 Speaker 2: about his trans daughter and ran with it. And I 1688 01:44:00,640 --> 01:44:04,120 Speaker 2: don't think that's good reporting. You know, that 1689 01:44:04,160 --> 01:44:06,759 Speaker 2: would never fly in a basic Reporting 101 class, 1690 01:44:08,400 --> 01:44:10,320 Speaker 2: and yeah, that's kind of my view on it. 1691 01:44:12,800 --> 01:44:16,960 Speaker 1: Okay, Starlink. We have the issue in the Ukraine War, 1692 01:44:17,760 --> 01:44:21,280 Speaker 1: we read about the plethora of satellites. No one has 1693 01:44:21,280 --> 01:44:26,160 Speaker 1: ever made satellite communications profitable. What's the future of Starlink, 1694 01:44:26,600 --> 01:44:30,400 Speaker 1: and to what degree do we have to fear Elon 1695 01:44:30,600 --> 01:44:33,599 Speaker 1: Musk's actions on the world stage? 1696 01:44:37,040 --> 01:44:41,360 Speaker 2: Starlink is another source of his power, and one that, 1697 01:44:43,120 --> 01:44:49,960 Speaker 2: gosh, I mean, it makes him so influential.
You 1698 01:44:49,960 --> 01:44:52,960 Speaker 2: think about the Ukraine War and him essentially controlling the 1699 01:44:52,960 --> 01:44:58,360 Speaker 2: communications and pipes for the Ukraine defense effort as he 1700 01:44:58,760 --> 01:45:01,960 Speaker 2: simultaneously is calling for a cease fire, an end of 1701 01:45:02,000 --> 01:45:07,160 Speaker 2: the war, and an appeasing of Putin, right? It's kind 1702 01:45:07,200 --> 01:45:09,400 Speaker 2: of just nuts. Like, we've really never seen this, 1703 01:45:09,479 --> 01:45:11,680 Speaker 2: and I think Starlink will only get more powerful, 1704 01:45:12,160 --> 01:45:16,599 Speaker 2: because Elon has a virtual monopoly on getting things into space, 1705 01:45:16,600 --> 01:45:20,639 Speaker 2: at least in private industry. Like, he controls how 1706 01:45:20,680 --> 01:45:24,800 Speaker 2: those satellites get up, and he knows that, and it 1707 01:45:25,360 --> 01:45:27,200 Speaker 2: gives him a lot of power. I think of the 1708 01:45:27,280 --> 01:45:32,840 Speaker 2: hurricanes that just hit, you know, the Southeast, and how 1709 01:45:32,880 --> 01:45:37,200 Speaker 2: FEMA was reliant on Starlink in getting communications to that region, 1710 01:45:38,960 --> 01:45:45,000 Speaker 2: and Elon is literally criticizing FEMA on X, 1711 01:45:45,120 --> 01:45:48,960 Speaker 2: he is shitting on the federal government, he's shitting on Biden, 1712 01:45:49,720 --> 01:45:51,599 Speaker 2: and yet FEMA has to play ball with him because 1713 01:45:51,640 --> 01:45:56,840 Speaker 2: he controls Starlink. Like, it's an incredible position to be in, 1714 01:45:58,240 --> 01:46:02,120 Speaker 2: and I think Starlink will become more and more a 1715 01:46:02,120 --> 01:46:03,800 Speaker 2: source of his power.
I think I saw today in 1716 01:46:03,840 --> 01:46:06,200 Speaker 2: the news that they had signed a contract with T 1717 01:46:06,360 --> 01:46:13,400 Speaker 2: Mobile to provide, I guess, calling service. So, you know, 1718 01:46:13,400 --> 01:46:15,800 Speaker 2: we start to see more of those deals overseas. It's 1719 01:46:15,880 --> 01:46:23,480 Speaker 2: quite influential as well. And yeah, it's definitely 1720 01:46:23,600 --> 01:46:25,320 Speaker 2: an underrated part of his empire. 1721 01:46:28,320 --> 01:46:32,240 Speaker 1: Okay, and the Boring Company, which he never really took 1722 01:46:32,280 --> 01:46:37,800 Speaker 1: that seriously, but we almost never hear about it. Neuralink, 1723 01:46:37,880 --> 01:46:42,240 Speaker 1: he had his press conference. Are those businesses or are those hobbies? 1724 01:46:44,840 --> 01:46:46,559 Speaker 2: I mean, they're businesses in the sense that they've raised 1725 01:46:46,600 --> 01:46:51,879 Speaker 2: venture capital and they employ people, and you know, Neuralink 1726 01:46:51,920 --> 01:46:56,759 Speaker 2: had its first patient who got something implanted. It's very, 1727 01:46:57,000 --> 01:47:02,240 Speaker 2: very early developmental stages. The Boring Company, I guess it 1728 01:47:02,240 --> 01:47:05,360 Speaker 2: built that tunnel in Vegas that, you know, shuttles people 1729 01:47:05,439 --> 01:47:10,680 Speaker 2: a mile, cars can drive through it, but there hasn't been 1730 01:47:10,760 --> 01:47:15,280 Speaker 2: very much from that company either. I would 1731 01:47:15,320 --> 01:47:17,880 Speaker 2: say those are not his main focuses. You know, obviously 1732 01:47:17,880 --> 01:47:20,080 Speaker 2: he'll dip into Neuralink stuff when they have an announcement 1733 01:47:20,320 --> 01:47:24,000 Speaker 2: or things like that.
But when we talk about those 1734 01:47:24,080 --> 01:47:26,479 Speaker 2: kinds of teams that he's built up from the ground up, 1735 01:47:27,040 --> 01:47:29,160 Speaker 2: those companies are good examples of that, where they kind 1736 01:47:29,200 --> 01:47:32,040 Speaker 2: of self operate without him. He'll dip in when he 1737 01:47:32,080 --> 01:47:36,800 Speaker 2: needs to, and when there's a big announcement he can 1738 01:47:36,880 --> 01:47:40,160 Speaker 2: draw attention to it. But it certainly adds to his lore, 1739 01:47:40,320 --> 01:47:45,800 Speaker 2: right? Like, he's the man that runs six companies, ostensibly, 1740 01:47:45,880 --> 01:47:49,680 Speaker 2: and it gives him a great legend. And oftentimes I 1741 01:47:49,720 --> 01:47:51,559 Speaker 2: get asked, you know, how is he able to 1742 01:47:51,600 --> 01:47:54,559 Speaker 2: do this? Like, I can't even do my own job, 1743 01:47:54,600 --> 01:47:56,960 Speaker 2: how can he do six and, like, run all these companies? Well, 1744 01:47:56,960 --> 01:48:00,760 Speaker 2: the truth is there's more to it, and he 1745 01:48:00,800 --> 01:48:03,960 Speaker 2: certainly loves to profit off this image. But it's 1746 01:48:03,960 --> 01:48:06,240 Speaker 2: not like he is going into Neuralink 1747 01:48:06,280 --> 01:48:08,200 Speaker 2: every day and working on that from nine to five. 1748 01:48:12,080 --> 01:48:16,920 Speaker 1: So what have you learned now that the book is out? 1749 01:48:20,000 --> 01:48:26,960 Speaker 2: Hmmm. So we end our book in March, when he 1750 01:48:27,000 --> 01:48:28,920 Speaker 2: meets Donald Trump for the first time in a long 1751 01:48:28,960 --> 01:48:31,760 Speaker 2: time and they go in there in Florida, at the 1752 01:48:31,800 --> 01:48:35,120 Speaker 2: home of Nelson Peltz, the activist investor who tried to take 1753 01:48:35,160 --> 01:48:39,559 Speaker 2: over Disney recently, and it was a purposeful one.
We couldn't 1754 01:48:39,560 --> 01:48:43,719 Speaker 2: continue the book forever, and we wanted 1755 01:48:43,920 --> 01:48:47,280 Speaker 2: a kind of clean cut. You know, I 1756 01:48:47,280 --> 01:48:50,040 Speaker 2: think people understand that from that moment on is when 1757 01:48:50,040 --> 01:48:53,760 Speaker 2: he, you know, really got in with 1758 01:48:53,800 --> 01:48:58,479 Speaker 2: Trump. But when we ended the book, I do 1759 01:48:58,560 --> 01:49:02,639 Speaker 2: not think we could have predicted the extent 1760 01:49:02,680 --> 01:49:05,880 Speaker 2: to which he would support Donald Trump. 1761 01:49:06,640 --> 01:49:09,160 Speaker 2: You know, there is a different world in which I 1762 01:49:09,160 --> 01:49:11,000 Speaker 2: thought it would have been a much quieter support. 1763 01:49:11,080 --> 01:49:13,960 Speaker 2: He would have maybe announced his endorsement a couple of 1764 01:49:14,040 --> 01:49:17,840 Speaker 2: days before the election, and he would not have done, 1765 01:49:18,760 --> 01:49:20,880 Speaker 2: by any means, what we saw happen, you know, which 1766 01:49:20,920 --> 01:49:27,000 Speaker 2: is to pour two hundred million dollars into his campaign 1767 01:49:27,200 --> 01:49:30,880 Speaker 2: and to literally go to rallies in Pennsylvania and stuff 1768 01:49:30,920 --> 01:49:36,920 Speaker 2: for him and have this, whatever, charity lottery giveaway to 1769 01:49:37,040 --> 01:49:41,480 Speaker 2: people who registered to vote for him. That was unfathomable 1770 01:49:41,520 --> 01:49:43,120 Speaker 2: to me. Like, that wasn't even in the 1771 01:49:43,160 --> 01:49:48,800 Speaker 2: cards for me. And what I've learned from that is 1772 01:49:49,960 --> 01:49:53,160 Speaker 2: just his convictions, and, when you have that much money, 1773 01:49:53,280 --> 01:49:56,800 Speaker 2: what you're able to do about those convictions.
You know, 1774 01:49:56,840 --> 01:49:59,479 Speaker 2: he spent two hundred million dollars on Donald Trump's campaign, 1775 01:50:00,640 --> 01:50:03,720 Speaker 2: and that seems like a really smart bet considering what 1776 01:50:03,800 --> 01:50:05,760 Speaker 2: he'll get out of that. You know, he'll be able 1777 01:50:05,800 --> 01:50:09,479 Speaker 2: to appoint people that he wants in the government. He 1778 01:50:09,600 --> 01:50:15,120 Speaker 2: has the influence and ear of the president, and 1779 01:50:15,200 --> 01:50:18,800 Speaker 2: so that two hundred million dollars looks incredibly cheap, 1780 01:50:18,960 --> 01:50:22,240 Speaker 2: whereas that's unfathomable. You know, who else could do that 1781 01:50:22,320 --> 01:50:27,080 Speaker 2: in the world? And so yeah, I've learned 1782 01:50:27,080 --> 01:50:31,440 Speaker 2: how that works, I guess, or how that looks. 1783 01:50:31,120 --> 01:50:35,400 Speaker 1: So, well, I'm actually also interested in what you learned personally. 1784 01:50:35,960 --> 01:50:40,040 Speaker 1: You write a book, you're with your co author, you 1785 01:50:40,160 --> 01:50:42,240 Speaker 1: both work at the New York Times, it comes out on 1786 01:50:42,320 --> 01:50:47,479 Speaker 1: a major publisher. Relative to your expectations 1787 01:50:47,520 --> 01:50:49,360 Speaker 1: and feedback, what did you learn? 1788 01:50:51,200 --> 01:50:54,080 Speaker 2: About writing a book, or moving through the 1789 01:50:54,000 --> 01:50:58,639 Speaker 1: world? Or writing this book? Yeah, writing this book, 1790 01:50:58,680 --> 01:51:07,439 Speaker 1: and this book coming out, what do we learn? Let 1791 01:51:07,479 --> 01:51:10,040 Speaker 1: me throw in a couple of things. Yeah, sure, sure, sure, 1792 01:51:10,400 --> 01:51:14,880 Speaker 1: everybody's focused on their life. You're leading up to publication date.
1793 01:51:15,000 --> 01:51:19,080 Speaker 1: In your world, this is the most important thing, given 1794 01:51:19,120 --> 01:51:22,439 Speaker 1: your status as the writer, the subject matter of the book, 1795 01:51:22,479 --> 01:51:27,360 Speaker 1: the publisher. There's a lot of ink, media coverage, around 1796 01:51:27,360 --> 01:51:34,040 Speaker 1: its publication; then the media moves on. Okay, a lot 1797 01:51:34,040 --> 01:51:37,519 Speaker 1: of people have written books recently. I had 1798 01:51:37,520 --> 01:51:39,760 Speaker 1: to listen to Kara Swisher write about and talk 1799 01:51:39,800 --> 01:51:43,120 Speaker 1: about her fucking book for over a year; it came out, 1800 01:51:43,479 --> 01:51:46,439 Speaker 1: was gone within like a week, and really no one 1801 01:51:46,520 --> 01:51:48,679 Speaker 1: bought it. And there are a lot of other people 1802 01:51:48,840 --> 01:51:53,200 Speaker 1: like that. Okay, your book certainly has a higher pedigree, 1803 01:51:53,320 --> 01:51:57,599 Speaker 1: more interest. Do you find you're getting continued feedback? Do 1804 01:51:57,640 --> 01:52:00,320 Speaker 1: you find that you're hearing from people that you didn't 1805 01:52:00,360 --> 01:52:04,000 Speaker 1: use to hear from? Do you find that you're being criticized? 1806 01:52:04,360 --> 01:52:07,200 Speaker 1: Are you stunned that the world has moved on? What 1807 01:52:07,320 --> 01:52:11,040 Speaker 1: has been your emotional experience, in addition to what has 1808 01:52:11,040 --> 01:52:11,719 Speaker 1: been incoming?
1809 01:52:14,360 --> 01:52:17,879 Speaker 2: So I've been surprised, because we've had a renewed interest 1810 01:52:17,960 --> 01:52:21,000 Speaker 2: in our book after the election, because more people want 1811 01:52:21,040 --> 01:52:23,120 Speaker 2: to understand what he's going to do with government, how 1812 01:52:23,160 --> 01:52:27,960 Speaker 2: he operates, and people can read about that in 1813 01:52:28,040 --> 01:52:29,800 Speaker 2: what he did with Twitter. So we just came 1814 01:52:29,880 --> 01:52:32,000 Speaker 2: back from a tour of Europe, actually, where we 1815 01:52:32,000 --> 01:52:34,960 Speaker 2: talked about the book. We spoke at Web Summit in Lisbon, 1816 01:52:35,439 --> 01:52:39,840 Speaker 2: we did some events in London. So on that level, 1817 01:52:40,080 --> 01:52:43,080 Speaker 2: honestly, I talked to a lot of authors 1818 01:52:43,120 --> 01:52:44,680 Speaker 2: who are like, you know, enjoy it while you can, 1819 01:52:44,840 --> 01:52:48,800 Speaker 2: because a week later, like, people aren't gonna be talking about 1820 01:52:48,800 --> 01:52:51,840 Speaker 2: your book. That's just the natural progression of things. And 1821 01:52:51,880 --> 01:52:55,160 Speaker 2: for us to be continuing to talk to people two 1822 01:52:55,160 --> 01:52:57,679 Speaker 2: and a half months out from publication, to people like yourself, 1823 01:52:57,800 --> 01:53:00,960 Speaker 2: has been an incredible privilege. And so I'm just trying 1824 01:53:00,960 --> 01:53:04,880 Speaker 2: to kind of ride that wave and enjoy it and 1825 01:53:05,920 --> 01:53:11,160 Speaker 2: kind of impart the knowledge that we've learned about him to, 1826 01:53:11,840 --> 01:53:14,640 Speaker 2: you know, people that are interested. 1827 01:53:15,680 --> 01:53:19,559 Speaker 3: Well, Ryan, this has been incredibly edifying. You're a very 1828 01:53:19,640 --> 01:53:24,439 Speaker 3: sharp guy. You're on the case.
There's certainly a movie 1829 01:53:24,760 --> 01:53:27,000 Speaker 3: unspooling in front of us. I know you will be 1830 01:53:27,160 --> 01:53:29,720 Speaker 3: reporting on it. I want to thank you so much 1831 01:53:29,760 --> 01:53:32,040 Speaker 3: for taking this time to speak with my audience. 1832 01:53:33,200 --> 01:53:34,479 Speaker 2: Thanks so much, Bob. I want to say this has 1833 01:53:34,479 --> 01:53:36,639 Speaker 2: been an honor. I've read you for so long. 1834 01:53:36,800 --> 01:53:39,519 Speaker 2: You're such a titan in the industry. So thank you 1835 01:53:39,560 --> 01:53:39,840 Speaker 2: so much. 1836 01:53:39,880 --> 01:53:44,080 Speaker 3: Thanks, man. Till next time, this is Bob Lefsetz.