1 00:00:01,240 --> 00:00:01,680 Speaker 1: Welcome. 2 00:00:01,720 --> 00:00:04,640 Speaker 2: It is Verdict with Senator Ted Cruz, Ben Ferguson with you, 3 00:00:04,680 --> 00:00:08,640 Speaker 2: and this is a special Week in Review. We had 4 00:00:08,640 --> 00:00:12,160 Speaker 2: the awesome opportunity to sit down with Elon Musk at 5 00:00:12,160 --> 00:00:14,200 Speaker 2: the White House and we did it as a two 6 00:00:14,200 --> 00:00:16,680 Speaker 2: part series this week. Some of you may have missed it, 7 00:00:16,720 --> 00:00:18,840 Speaker 2: or you may want to share that with your family 8 00:00:18,840 --> 00:00:21,840 Speaker 2: and your friends. So if you would do us a favor, 9 00:00:21,920 --> 00:00:24,160 Speaker 2: because what Elon Musk had to say, we want 10 00:00:24,040 --> 00:00:25,320 Speaker 3: everyone to hear it. 11 00:00:25,000 --> 00:00:28,040 Speaker 2: It's a lot of info the media refuses to cover. 12 00:00:28,400 --> 00:00:28,560 Speaker 3: Now. 13 00:00:28,600 --> 00:00:32,320 Speaker 2: The good news is this interview went viral on X. 14 00:00:32,640 --> 00:00:36,000 Speaker 2: But if you are on other social media platforms, please 15 00:00:36,360 --> 00:00:39,239 Speaker 2: share this podcast right now so that everyone can hear 16 00:00:39,280 --> 00:00:42,800 Speaker 2: what Musk said about waste, fraud and abuse in our government. 17 00:00:43,040 --> 00:00:45,000 Speaker 2: He also had some amazing things to talk about when 18 00:00:45,040 --> 00:00:48,440 Speaker 2: it comes to colonizing Mars and the idea that may 19 00:00:48,520 --> 00:00:49,880 Speaker 2: actually become a reality. 20 00:00:50,440 --> 00:00:52,479 Speaker 1: So grab this and share it. 21 00:00:52,640 --> 00:00:57,480 Speaker 2: This is the entire interview, uninterrupted, with Elon Musk at 22 00:00:57,520 --> 00:00:58,160 Speaker 2: the White House.
23 00:00:58,600 --> 00:01:00,639 Speaker 4: Well, we're in the White House right now, and 24 00:01:00,680 --> 00:01:04,080 Speaker 4: we're here with my friend Elon Musk, who really has 25 00:01:04,080 --> 00:01:06,399 Speaker 4: not been doing much of anything, has not made any news, 26 00:01:06,800 --> 00:01:10,319 Speaker 4: and nobody has noticed. Yeah, the impact. 27 00:01:10,800 --> 00:01:15,360 Speaker 1: Welcome, Elon. Thank you. Holy crap. Ah, yes, wow. Let 28 00:01:15,360 --> 00:01:19,080 Speaker 1: me just say, never a dull moment. Never a dull moment. 29 00:01:19,160 --> 00:01:21,800 Speaker 4: The first fifty days the president has spent in office, 30 00:01:22,880 --> 00:01:26,120 Speaker 4: over the top, and the first fifty days you've spent. 31 00:01:26,600 --> 00:01:28,319 Speaker 4: I don't think there's ever been anyone to have an 32 00:01:28,360 --> 00:01:29,360 Speaker 4: impact the way you have 33 00:01:29,440 --> 00:01:30,040 Speaker 3: at the beginning. 34 00:01:31,280 --> 00:01:32,240 Speaker 1: Let me start with a question. 35 00:01:32,280 --> 00:01:35,440 Speaker 4: You know a lot about both. Which was worse, the mess 36 00:01:35,480 --> 00:01:38,800 Speaker 4: you found at Twitter or the mess you found in 37 00:01:38,840 --> 00:01:39,520 Speaker 4: the federal government? 38 00:01:39,600 --> 00:01:41,200 Speaker 3: Well, it's hard to compete with the federal government. 39 00:01:42,440 --> 00:01:44,880 Speaker 4: What surprised you about the federal government? I assume you 40 00:01:44,920 --> 00:01:47,280 Speaker 4: came in and assumed it was bad. Is it worse 41 00:01:47,319 --> 00:01:48,080 Speaker 4: than you expected? 42 00:01:49,600 --> 00:01:52,440 Speaker 3: It is worse than I expected. But on the plus side, 43 00:01:52,440 --> 00:01:56,000 Speaker 3: that means there's more opportunity for improvement.
So look, if 44 00:01:56,000 --> 00:01:59,440 Speaker 3: you look on the bright side, there's actually a lot 45 00:01:59,480 --> 00:02:03,360 Speaker 3: of opportunity for improvement in federal government expenditures because it's 46 00:02:03,360 --> 00:02:06,200 Speaker 3: so bad. If it was a well run ship, it 47 00:02:06,200 --> 00:02:09,240 Speaker 3: would be very difficult to improve. So 48 00:02:09,280 --> 00:02:11,560 Speaker 3: now it's like, people say, well, how do you 49 00:02:11,560 --> 00:02:13,519 Speaker 3: figure out how to save money in the federal government? Well, 50 00:02:13,560 --> 00:02:15,840 Speaker 3: it's like being in a room where the walls, the roof, 51 00:02:15,880 --> 00:02:20,400 Speaker 3: and the floor are all targets. Aim in any direction. Yeah, and 52 00:02:20,400 --> 00:02:24,200 Speaker 3: you can't miss. Wow, yeah, I'm sure you would agree. 53 00:02:24,840 --> 00:02:27,359 Speaker 1: So a lot of folks have talked about, like. 54 00:02:27,880 --> 00:02:31,520 Speaker 3: You can't miss, right? It's going in any direction. 55 00:02:33,000 --> 00:02:35,720 Speaker 1: A lot of the crazy expenditures, things like two 56 00:02:35,720 --> 00:02:38,960 Speaker 1: million bucks for sex change surgeries in Guatemala, and, essentially, 57 00:02:41,560 --> 00:02:45,400 Speaker 1: you know, transgender mice, and Sesame Street in Iraq. 58 00:02:45,480 --> 00:02:47,080 Speaker 1: A lot of that has gotten attention. But some 59 00:02:47,120 --> 00:02:49,920 Speaker 1: of the stuff you've told me about, like, tell us 60 00:02:49,960 --> 00:02:52,799 Speaker 1: about computer licenses and government agencies. 61 00:02:52,880 --> 00:02:55,399 Speaker 3: Yeah, so most of what DOGE is finding, you don't 62 00:02:55,440 --> 00:02:59,480 Speaker 3: need Sherlock Holmes. It's very obvious, basic stuff. So, 63 00:02:59,639 --> 00:03:03,040 Speaker 3: in every government department.
I say every because we've 64 00:03:03,080 --> 00:03:06,160 Speaker 3: not yet found a single exception. There are far too 65 00:03:06,160 --> 00:03:12,639 Speaker 3: many software licenses and media subscriptions, meaning many more software 66 00:03:12,680 --> 00:03:15,639 Speaker 3: licenses and media subscriptions than there are humans in the department. 67 00:03:16,280 --> 00:03:18,280 Speaker 4: Like you were saying, like an agency with fifteen thousand 68 00:03:18,320 --> 00:03:22,720 Speaker 4: people might have thirty thousand licenses. Yes. And even of 69 00:03:22,760 --> 00:03:25,240 Speaker 4: the fifteen thousand employees, a good chunk of them hadn't 70 00:03:25,320 --> 00:03:29,320 Speaker 4: used the license, had never logged on or used the application. 71 00:03:29,560 --> 00:03:34,839 Speaker 3: Yes, we found entire situations of software licenses or media 72 00:03:35,120 --> 00:03:37,760 Speaker 3: subscriptions where there were zero logins. 73 00:03:38,280 --> 00:03:40,200 Speaker 1: So it went unused, and yet we were paying for it. 74 00:03:40,320 --> 00:03:43,680 Speaker 3: Yes, the government's paying for thousands of licenses of software 75 00:03:44,560 --> 00:03:47,040 Speaker 3: or media subscriptions and no one had ever logged in 76 00:03:47,120 --> 00:03:47,640 Speaker 3: even once. 77 00:03:48,240 --> 00:03:50,400 Speaker 1: Or credit cards. You found the same thing with government 78 00:03:50,400 --> 00:03:51,080 Speaker 1: credit cards. 79 00:03:51,360 --> 00:03:53,480 Speaker 3: We found that there are twice as many credit cards 80 00:03:53,560 --> 00:03:56,840 Speaker 3: as there are humans. And I still don't have a 81 00:03:56,840 --> 00:03:59,200 Speaker 3: good explanation for why this is the case. And these 82 00:03:59,200 --> 00:04:02,320 Speaker 3: are ten thousand dollar limit cards, so it's a lot 83 00:04:02,360 --> 00:04:02,760 Speaker 3: of money.
84 00:04:03,480 --> 00:04:06,200 Speaker 2: Is it incompetence that you're finding, or is this like 85 00:04:06,320 --> 00:04:09,000 Speaker 2: the biggest money laundering scheme in the history of the 86 00:04:09,000 --> 00:04:09,960 Speaker 2: world that you're finding? 87 00:04:10,800 --> 00:04:13,839 Speaker 3: Okay, I think it's mostly, if you say, look, what's 88 00:04:13,880 --> 00:04:17,560 Speaker 3: the waste to fraud ratio? Yeah, in my opinion, it's 89 00:04:18,160 --> 00:04:21,440 Speaker 3: like eighty percent waste, twenty percent fraud. But you 90 00:04:21,480 --> 00:04:25,760 Speaker 3: do have these sort of gray areas. For example, 91 00:04:25,920 --> 00:04:29,800 Speaker 3: uh, we saw a lot of payments going 92 00:04:29,800 --> 00:04:33,640 Speaker 3: out of Treasury that had no payment code and no 93 00:04:33,720 --> 00:04:37,760 Speaker 3: explanation for the payment. And then we're 94 00:04:37,760 --> 00:04:39,760 Speaker 3: trying to figure out what that payment is and we'd 95 00:04:39,800 --> 00:04:42,480 Speaker 3: see that, okay, that contract was supposed to be shut off, 96 00:04:43,240 --> 00:04:45,880 Speaker 3: but someone forgot to shut off that contract 97 00:04:46,120 --> 00:04:50,120 Speaker 3: and so the company kept getting money. Wow. Now is 98 00:04:50,160 --> 00:04:57,040 Speaker 3: that waste or fraud? Both. Yeah, you're not supposed 99 00:04:57,040 --> 00:04:59,560 Speaker 3: to get it, but the 100 00:04:59,680 --> 00:05:02,000 Speaker 3: government sent it to you, and nobody 101 00:05:02,040 --> 00:05:04,920 Speaker 3: from the government asked for it back. Take, for example, 102 00:05:05,040 --> 00:05:07,159 Speaker 3: the one point nine billion dollars given to 103 00:05:07,160 --> 00:05:08,120 Speaker 3: Stacey Abrams. 104 00:05:08,400 --> 00:05:10,760 Speaker 1: Yeah, fake NGO, utter insanity.
105 00:05:11,080 --> 00:05:14,000 Speaker 3: Explain the story. 106 00:05:13,080 --> 00:05:17,200 Speaker 1: That's just corrupt. I think that's paying off cronies 107 00:05:17,240 --> 00:05:17,680 Speaker 1: at that point. 108 00:05:18,640 --> 00:05:18,800 Speaker 3: Yeah. 109 00:05:18,960 --> 00:05:19,159 Speaker 1: Yeah. 110 00:05:19,360 --> 00:05:21,320 Speaker 4: And by the way, she knew. Like, when you get 111 00:05:21,360 --> 00:05:23,359 Speaker 4: two billion dollars, you don't miss that. 112 00:05:23,080 --> 00:05:24,480 Speaker 1: That's not an accident. 113 00:05:25,040 --> 00:05:28,760 Speaker 3: Allegedly it was for, like, uh, you know, environmentally 114 00:05:28,760 --> 00:05:31,960 Speaker 3: friendly appliances or something. And they've given like a 115 00:05:32,080 --> 00:05:35,120 Speaker 3: hundred appliances so far for two billion dollars. That's a very 116 00:05:35,120 --> 00:05:41,920 Speaker 3: expensive toaster. That fridge is nice. It's just obviously 117 00:05:42,440 --> 00:05:45,720 Speaker 3: one of the biggest scam loopholes we've uncovered, which is 118 00:05:45,800 --> 00:05:50,440 Speaker 3: really crazy, is that the government 119 00:05:50,480 --> 00:05:54,359 Speaker 3: can give money to a so called nonprofit with 120 00:05:54,560 --> 00:05:57,279 Speaker 3: very few controls, and there's no 121 00:05:57,360 --> 00:06:01,560 Speaker 3: auditing subsequently of that nonprofit. So this 122 00:06:01,600 --> 00:06:03,520 Speaker 3: is where, with the, you know, one point nine billion 123 00:06:03,560 --> 00:06:06,880 Speaker 3: of Stacey Abrams, who's to say they didn't give themselves 124 00:06:06,920 --> 00:06:11,520 Speaker 3: extremely lavish, like, insane salaries, expense everything, yep, to the 125 00:06:11,560 --> 00:06:15,279 Speaker 3: nonprofit, you know, buy jets and homes and 126 00:06:15,320 --> 00:06:15,920 Speaker 3: all sorts of.
127 00:06:15,839 --> 00:06:19,159 Speaker 1: Things, live like kings and queens. Yes, on the taxpayer's dime. 128 00:06:19,279 --> 00:06:22,360 Speaker 3: Correct. You mentioned this is happening at scale. It's not 129 00:06:22,440 --> 00:06:24,600 Speaker 3: just one or two. We're seeing this everywhere. 130 00:06:24,760 --> 00:06:26,560 Speaker 4: Now, one of the things you told me about is 131 00:06:27,120 --> 00:06:31,840 Speaker 4: what you call, say, magic money computers. Well, 132 00:06:32,080 --> 00:06:33,400 Speaker 4: so tell us about it, because I've never heard of 133 00:06:33,400 --> 00:06:34,600 Speaker 4: that until you brought that up. 134 00:06:35,040 --> 00:06:39,040 Speaker 3: Okay. So you may think that the government computers 135 00:06:39,800 --> 00:06:42,839 Speaker 3: all talk to each other, they synchronize, they add 136 00:06:42,920 --> 00:06:46,000 Speaker 3: up what funds are going somewhere, and it's, you know, 137 00:06:47,120 --> 00:06:50,800 Speaker 3: coherent, and 138 00:06:50,839 --> 00:06:52,799 Speaker 3: that the numbers, for example, that you're presented 139 00:06:52,839 --> 00:06:56,520 Speaker 3: as a senator are actually the real numbers. And one 140 00:06:56,520 --> 00:07:01,520 Speaker 3: would think. Well, they're not, okay. I mean, 141 00:07:01,520 --> 00:07:03,599 Speaker 3: they're not totally wrong, but they're probably off by five 142 00:07:03,600 --> 00:07:07,680 Speaker 3: percent or ten percent in some cases. So I call 143 00:07:07,720 --> 00:07:09,760 Speaker 3: it a magic money computer, any computer which can just make 144 00:07:09,800 --> 00:07:12,600 Speaker 3: money out of thin air is a magic money computer. 145 00:07:12,640 --> 00:07:13,559 Speaker 1: So how does that work? 146 00:07:13,880 --> 00:07:15,000 Speaker 3: It just issues payments.
147 00:07:16,200 --> 00:07:18,800 Speaker 4: And you said something like eleven of these computers at 148 00:07:18,880 --> 00:07:21,840 Speaker 4: Treasury that are sending out trillions in payments. 149 00:07:21,960 --> 00:07:26,920 Speaker 3: They're mostly at Treasury. Some are at HHS, 150 00:07:27,120 --> 00:07:29,560 Speaker 3: there's one or two at State, 151 00:07:30,400 --> 00:07:34,040 Speaker 3: there's some at DoD. I think we found now fourteen 152 00:07:34,200 --> 00:07:38,400 Speaker 3: magic money computers. Jeez. Okay, they just send money out 153 00:07:38,440 --> 00:07:39,320 Speaker 3: of nothing. 154 00:07:40,160 --> 00:07:43,560 Speaker 4: You have an ability to see where leverage points are 155 00:07:44,680 --> 00:07:48,080 Speaker 4: and how things actually happen. So I remember back, I 156 00:07:48,080 --> 00:07:51,000 Speaker 4: think it was September, October of this year, before the election. 157 00:07:51,080 --> 00:07:52,560 Speaker 4: We didn't know who was going to win, and I 158 00:07:52,640 --> 00:07:54,800 Speaker 4: was at your house in Austin. We were talking about it, 159 00:07:54,840 --> 00:07:58,280 Speaker 4: and you said, look, I don't want a 160 00:07:58,360 --> 00:08:00,600 Speaker 4: job in Washington, and you said, all I want is 161 00:08:00,600 --> 00:08:03,720 Speaker 4: the login for every computer. And I remember thinking 162 00:08:03,720 --> 00:08:05,400 Speaker 4: at the time that sounded kind of weird, like I 163 00:08:05,480 --> 00:08:08,239 Speaker 4: just didn't get it. And I have to say what's 164 00:08:08,440 --> 00:08:13,120 Speaker 4: interesting on this. If I would have thought, like, okay, 165 00:08:13,120 --> 00:08:15,560 Speaker 4: how do you reform government?
Like sort of the traditional 166 00:08:15,600 --> 00:08:17,080 Speaker 4: way to think about it is, okay, give me an 167 00:08:17,160 --> 00:08:18,960 Speaker 4: org chart, let me sit down with the people who 168 00:08:18,960 --> 00:08:23,240 Speaker 4: are running agencies. And what you saw immediately is, to 169 00:08:23,320 --> 00:08:26,679 Speaker 4: understand what's really going on, get to the payment systems, 170 00:08:26,720 --> 00:08:32,320 Speaker 4: get to the computers. Yeah. Like, why is getting to 171 00:08:32,400 --> 00:08:36,160 Speaker 4: the computers so critical to understanding what's actually happening? 172 00:08:36,800 --> 00:08:39,480 Speaker 3: Well, the government is run by computers. So you've got 173 00:08:41,200 --> 00:08:45,480 Speaker 3: essentially several hundred computers that effectively run the government. And 174 00:08:45,520 --> 00:08:48,959 Speaker 3: if you want to know. Did you know that? No. Yeah, 175 00:08:49,000 --> 00:08:51,800 Speaker 3: so when somebody, like even when the president, issues an 176 00:08:51,800 --> 00:08:53,559 Speaker 3: executive order, that's going to go through a whole bunch 177 00:08:53,559 --> 00:08:56,280 Speaker 3: of people until ultimately it is implemented at a computer somewhere. 178 00:08:57,520 --> 00:08:59,720 Speaker 3: And if you want to know what the situation is 179 00:08:59,760 --> 00:09:02,160 Speaker 3: with the accounting and you're trying to reconcile accounting and 180 00:09:02,160 --> 00:09:04,240 Speaker 3: get rid of waste and fraud, you must be able 181 00:09:04,240 --> 00:09:07,000 Speaker 3: to analyze the computer databases.
Otherwise you can't figure it 182 00:09:07,000 --> 00:09:10,360 Speaker 3: out, because all you're doing is asking a human who 183 00:09:10,440 --> 00:09:13,440 Speaker 3: will then ask another human, ask another human, and finally 184 00:09:13,520 --> 00:09:17,000 Speaker 3: usually some contractor, who will ask another contractor, to run your 185 00:09:17,080 --> 00:09:21,440 Speaker 3: query on the computer. Wow. That's how it actually works. So 186 00:09:21,600 --> 00:09:24,840 Speaker 3: it's many layers deep. So the only way to reconcile 187 00:09:24,840 --> 00:09:26,680 Speaker 3: the databases and get rid of waste and fraud is 188 00:09:26,720 --> 00:09:30,319 Speaker 3: to actually look at the computers and see what's 189 00:09:30,360 --> 00:09:33,880 Speaker 3: going on. So that's, like, 190 00:09:34,760 --> 00:09:38,120 Speaker 3: why I sort of cryptically referred to reprogramming 191 00:09:38,200 --> 00:09:40,680 Speaker 3: the matrix. You have to understand what's going on in the computers. 192 00:09:40,720 --> 00:09:43,480 Speaker 3: You have to reconcile the computer databases in order to 193 00:09:43,480 --> 00:09:45,000 Speaker 3: identify the waste and fraud. 194 00:09:45,320 --> 00:09:51,000 Speaker 4: I don't know that there was anyone in Congress who understood, 195 00:09:51,040 --> 00:09:54,600 Speaker 4: certainly myself included, who understood the leverage that 196 00:09:54,640 --> 00:09:57,040 Speaker 1: comes from the computer and the data. 197 00:09:57,160 --> 00:10:00,520 Speaker 4: In particular, Congress would think about, give me a 198 00:10:00,559 --> 00:10:04,240 Speaker 4: report on what your expenditures are, rather than actually getting 199 00:10:04,240 --> 00:10:07,000 Speaker 4: into the pipes. And I think that has been fascinating, 200 00:10:07,320 --> 00:10:10,240 Speaker 4: that it's let you uncover a bunch of crap that 201 00:10:10,360 --> 00:10:11,439 Speaker 4: just nobody knew.
202 00:10:11,920 --> 00:10:14,600 Speaker 3: Yes. I mean, in order for money to go to 203 00:10:14,600 --> 00:10:17,479 Speaker 3: a bank account. It's not like we're sending truckloads 204 00:10:17,480 --> 00:10:20,520 Speaker 3: of cash all over the place. We're 205 00:10:20,559 --> 00:10:23,079 Speaker 3: wiring money. Right, we're sending money through the ACH system 206 00:10:23,160 --> 00:10:25,600 Speaker 3: or through the SWIFT system. So in order for money 207 00:10:25,600 --> 00:10:28,240 Speaker 3: to flow, it's going to flow electronically. So that's 208 00:10:28,240 --> 00:10:29,360 Speaker 3: what you need to look at. You need to look 209 00:10:29,360 --> 00:10:31,800 Speaker 3: at the actual electronic money flows. 210 00:10:31,920 --> 00:10:34,160 Speaker 4: At Tesla and all your companies, you have accounting and 211 00:10:34,200 --> 00:10:36,480 Speaker 4: you have every expenditure. You have it coded for what 212 00:10:36,520 --> 00:10:39,560 Speaker 4: it's going for. The federal government doesn't work that way. They 213 00:10:39,559 --> 00:10:40,680 Speaker 4: don't code what the money is going for. 214 00:10:40,720 --> 00:10:42,920 Speaker 3: They do now, but they didn't. They didn't. 215 00:10:43,760 --> 00:10:46,240 Speaker 4: And like, one of the things that you told me, 216 00:10:46,280 --> 00:10:48,600 Speaker 4: you said, if any company kept its books the way 217 00:10:48,600 --> 00:10:51,800 Speaker 4: the federal government does, they'd arrest the officers and put 218 00:10:51,800 --> 00:10:52,280 Speaker 4: them in jail. 219 00:10:52,400 --> 00:10:55,000 Speaker 3: Yes, if it was a public company, it would be delisted immediately, 220 00:10:55,120 --> 00:10:57,839 Speaker 3: it would fail its audit, and the officers of the 221 00:10:57,880 --> 00:11:01,800 Speaker 3: company would be imprisoned. That's the level of egregiousness we're in 222 00:11:01,800 --> 00:11:06,280 Speaker 3: the middle of.
Is it deliberate, or do you think this 223 00:11:06,320 --> 00:11:09,440 Speaker 3: is incompetence? Again, it's eighty percent. It's eighty percent 224 00:11:09,800 --> 00:11:12,120 Speaker 3: incompetence and twenty percent malice. 225 00:11:13,200 --> 00:11:15,560 Speaker 2: If you look at DOGE now and you look at 226 00:11:15,600 --> 00:11:19,720 Speaker 2: the government and what you're finding, what percentage have you 227 00:11:19,760 --> 00:11:21,800 Speaker 2: guys even gotten to, and how much of it is 228 00:11:22,440 --> 00:11:24,520 Speaker 2: Mars, where you haven't even gotten there yet, because there's 229 00:11:24,600 --> 00:11:27,040 Speaker 2: so much you're finding out here? I mean, 230 00:11:27,320 --> 00:11:29,040 Speaker 2: you seem like a timeline guy. When you say, all 231 00:11:29,160 --> 00:11:30,280 Speaker 2: I want to get in there and get all these, 232 00:11:30,559 --> 00:11:33,040 Speaker 2: you know, numbers and things. How far are we from 233 00:11:33,040 --> 00:11:35,840 Speaker 2: the end game, where you've seen it all, been able 234 00:11:35,880 --> 00:11:38,839 Speaker 2: to process it all and fix it? 235 00:11:38,960 --> 00:11:43,240 Speaker 3: I mean, are we years away? Months away? Not years. 236 00:11:43,520 --> 00:11:46,600 Speaker 3: I'm reasonably confident that we'll be able to get 237 00:11:46,760 --> 00:11:51,800 Speaker 3: a trillion dollars of waste and fraud out, 238 00:11:52,440 --> 00:11:54,920 Speaker 3: meaning that we'll have a net 239 00:11:54,960 --> 00:11:57,560 Speaker 3: savings in FY twenty six, which starts in October 240 00:11:57,559 --> 00:12:01,800 Speaker 3: obviously, of a trillion dollars, provided we're 241 00:12:01,800 --> 00:12:04,880 Speaker 3: allowed to continue and our progress is not impeded. And 242 00:12:05,200 --> 00:12:06,720 Speaker 3: we're very public about what we do.
243 00:12:06,800 --> 00:12:09,160 Speaker 1: Yeah, put it on. 244 00:12:08,440 --> 00:12:11,600 Speaker 3: The website. I don't know how we could be more transparent. Literally 245 00:12:11,640 --> 00:12:14,240 Speaker 3: every action we do, small or large, we put on the 246 00:12:14,240 --> 00:12:17,640 Speaker 3: doge.gov website and we post on the 247 00:12:17,800 --> 00:12:21,880 Speaker 3: X handle. And when people complain about it and they say, oh, 248 00:12:21,880 --> 00:12:24,880 Speaker 3: you're doing something terrible, I'm like, well, which of these cuts? 249 00:12:25,040 --> 00:12:27,640 Speaker 1: It's in daylight. Everyone knows exactly what you're doing. 250 00:12:27,760 --> 00:12:31,920 Speaker 3: Extremely transparent. Yeah, I don't think anything's been this 251 00:12:32,000 --> 00:12:32,760 Speaker 3: transparent ever. 252 00:12:33,040 --> 00:12:37,839 Speaker 4: So five years ago you were a hero to the left. Cool, 253 00:12:38,080 --> 00:12:41,680 Speaker 4: you had electric cars, you had space. And in five 254 00:12:41,760 --> 00:12:43,600 Speaker 4: years you've got. 255 00:12:42,480 --> 00:12:44,480 Speaker 3: I used to go to a party in Hollywood and not get 256 00:12:44,480 --> 00:12:47,920 Speaker 3: dirty looks. In fact, yeah, and now you might not 257 00:12:47,920 --> 00:12:50,400 Speaker 3: even get invited. I still get invited, but I don't know if 258 00:12:50,400 --> 00:12:50,840 Speaker 3: I need to go. 259 00:12:52,840 --> 00:12:55,400 Speaker 4: And I don't think it's an exaggeration to say today, 260 00:12:56,400 --> 00:12:59,920 Speaker 4: after Donald Trump, the left hates you more than any 261 00:13:00,120 --> 00:13:00,839 Speaker 4: person on earth. 262 00:13:01,400 --> 00:13:03,800 Speaker 3: Yes, I appear to be number two. I mean, if 263 00:13:03,840 --> 00:13:05,760 Speaker 3: you're judged by the various signs.
264 00:13:06,280 --> 00:13:10,680 Speaker 1: The derangements. It's Trump Derangement Syndrome and Elon Derangement Syndrome. 265 00:13:10,920 --> 00:13:11,920 Speaker 1: How is that for you? 266 00:13:12,000 --> 00:13:14,680 Speaker 4: That's a little bit of whiplash, going from being 267 00:13:14,880 --> 00:13:18,680 Speaker 4: like mister cool to the devil incarnate in just a 268 00:13:18,679 --> 00:13:20,560 Speaker 4: couple of years. Is that kind of weird, 269 00:13:20,600 --> 00:13:22,040 Speaker 4: to experience that transformation? 270 00:13:22,320 --> 00:13:22,640 Speaker 3: Yes. 271 00:13:23,559 --> 00:13:24,840 Speaker 1: Why do they hate you so much? 272 00:13:25,640 --> 00:13:30,280 Speaker 3: Well, because we're clearly over the target. If DOGE 273 00:13:30,400 --> 00:13:32,800 Speaker 3: was ineffective, if we were not actually getting rid of 274 00:13:33,440 --> 00:13:35,360 Speaker 3: a bunch of waste and fraud. 275 00:13:35,400 --> 00:13:40,200 Speaker 3: I mean, the fraud we're seeing is overwhelmingly on 276 00:13:40,240 --> 00:13:43,920 Speaker 3: the left. I mean, it's not zero on 277 00:13:43,960 --> 00:13:47,839 Speaker 3: the right, but these NGOs are almost all left wing 278 00:13:47,960 --> 00:13:52,800 Speaker 3: NGOs that are being funded, for example. Yep. So they 279 00:13:52,880 --> 00:13:56,560 Speaker 3: hate me because DOGE is being effective and DOGE is 280 00:13:56,559 --> 00:13:59,600 Speaker 3: getting rid of a lot of waste and fraud that 281 00:13:59,640 --> 00:14:02,559 Speaker 3: people on the left were taking advantage of. 282 00:14:02,720 --> 00:14:04,959 Speaker 3: That's what it comes down to.
And 283 00:14:04,679 --> 00:14:08,080 Speaker 3: the single biggest thing that they're worried about 284 00:14:08,280 --> 00:14:13,880 Speaker 3: is that DOGE is going to turn off fraudulent payments 285 00:14:14,080 --> 00:14:18,360 Speaker 3: of entitlements. I mean everything from Social Security, Medicare, 286 00:14:18,559 --> 00:14:24,600 Speaker 3: you know, unemployment, disability, Small Business Administration loans, turn them 287 00:14:24,600 --> 00:14:28,880 Speaker 3: off to illegals. This is the crux of the matter. 288 00:14:29,240 --> 00:14:31,920 Speaker 3: Okay, this is the 289 00:14:31,960 --> 00:14:34,080 Speaker 3: thing, that's why they really hate my guts and want 290 00:14:34,080 --> 00:14:34,480 Speaker 3: me to die. 291 00:14:35,640 --> 00:14:38,320 Speaker 4: And do you think that's billions? Hundreds of billions? What 292 00:14:38,320 --> 00:14:39,560 Speaker 4: do you think the scale is of that? 293 00:14:40,080 --> 00:14:42,480 Speaker 3: I think across the country it's, well, 294 00:14:42,480 --> 00:14:47,680 Speaker 3: north of one hundred billion, maybe two hundred billion. So, 295 00:14:48,440 --> 00:14:51,920 Speaker 3: by using entitlements fraud, the Democrats have been able 296 00:14:51,960 --> 00:14:58,080 Speaker 3: to attract and retain vast numbers of illegal immigrants. 297 00:14:57,960 --> 00:15:00,520 Speaker 1: And buy voters, and buy 298 00:15:00,680 --> 00:15:07,200 Speaker 3: voters, exactly. Basically bring in ten, twenty million people 299 00:15:07,520 --> 00:15:11,720 Speaker 3: who are beholden to the Democrats for government handouts and 300 00:15:11,760 --> 00:15:15,600 Speaker 3: will vote overwhelmingly Democrat, as has been demonstrated in California. 301 00:15:15,800 --> 00:15:20,240 Speaker 3: This is an election strategy.
Yes, it's worked. Yes, 302 00:15:20,320 --> 00:15:23,040 Speaker 3: and it doesn't take much to turn the swing states blue. 303 00:15:23,280 --> 00:15:25,040 Speaker 3: I mean, often a swing state will be won 304 00:15:25,080 --> 00:15:28,040 Speaker 3: by ten, twenty thousand votes. Sure. So the Dems can 305 00:15:28,080 --> 00:15:31,360 Speaker 3: bring in two hundred thousand illegals and over time get 306 00:15:31,360 --> 00:15:34,880 Speaker 3: them legalized, not counting any cheating that takes place, because 307 00:15:34,880 --> 00:15:38,880 Speaker 3: there is some cheating. But even without cheating, 308 00:15:39,400 --> 00:15:41,080 Speaker 3: if you bring in illegals that are ten 309 00:15:41,240 --> 00:15:44,240 Speaker 3: X the vote differential in a swing state, it will 310 00:15:44,280 --> 00:15:47,600 Speaker 3: no longer be a swing state, right, and the Dems 311 00:15:47,600 --> 00:15:49,760 Speaker 3: will win all the swing states. It's just a matter of time, 312 00:15:51,200 --> 00:15:56,200 Speaker 3: and America will be a permanent deep blue socialist state. And 313 00:15:56,400 --> 00:16:00,440 Speaker 3: the House, the Senate, yep, the Presidency, and the Supreme 314 00:16:00,480 --> 00:16:03,600 Speaker 3: Court will all go hardcore left. They will then further 315 00:16:03,640 --> 00:16:07,880 Speaker 3: cement that by bringing in even more aliens so you can't 316 00:16:07,920 --> 00:16:11,320 Speaker 3: vote your way out of it. Their objective is to 317 00:16:11,360 --> 00:16:13,600 Speaker 3: make it a one party socialist state, and it will be 318 00:16:13,680 --> 00:16:16,000 Speaker 3: much worse than California, because at least California is mitigated 319 00:16:16,040 --> 00:16:17,840 Speaker 3: by the fact that someone can leave California. 320 00:16:17,880 --> 00:16:18,640 Speaker 1: You can go to Texas. 321 00:16:18,760 --> 00:16:22,880 Speaker 3: Yeah, exactly. So they're gonna make everywhere California, but worse.
322 00:16:22,920 --> 00:16:25,200 Speaker 4: By the way, in the middle of the pandemic, I spent 323 00:16:25,280 --> 00:16:27,080 Speaker 4: forty five minutes on the phone with Elon. 324 00:16:27,240 --> 00:16:28,520 Speaker 1: He was still in California. 325 00:16:28,840 --> 00:16:32,080 Speaker 4: I was walking my dog Snowflake and trying to convince 326 00:16:32,080 --> 00:16:35,640 Speaker 4: you to come to Texas. The commies in California can't stand you. 327 00:16:36,120 --> 00:16:38,200 Speaker 4: We love you, we want you here. And you didn't 328 00:16:38,280 --> 00:16:40,680 Speaker 4: quite go then, but you went not that long afterwards. 329 00:16:41,120 --> 00:16:46,480 Speaker 3: I mean, the COVID actions almost killed Tesla, because 330 00:16:46,560 --> 00:16:48,800 Speaker 3: every other auto plant in the country was allowed 331 00:16:48,800 --> 00:16:52,000 Speaker 3: to open, but ours, which was in California, was not 332 00:16:52,040 --> 00:16:55,680 Speaker 3: allowed to open. Wow. Wow. So they almost killed Tesla. 333 00:16:56,280 --> 00:17:00,800 Speaker 4: So as a personal matter, do you ever regret it? 334 00:17:00,880 --> 00:17:02,800 Speaker 4: Like, five years ago you go to the Oscars and 335 00:17:03,120 --> 00:17:06,280 Speaker 4: you're mister cool, and now you've got death threats every day. 336 00:17:06,320 --> 00:17:07,160 Speaker 4: Like, do you? Well? 337 00:17:07,200 --> 00:17:10,879 Speaker 3: These days the Oscars are boring. I wouldn't want to go. 338 00:17:11,000 --> 00:17:12,439 Speaker 1: God, the movies they nominate. 339 00:17:12,480 --> 00:17:14,280 Speaker 4: No one on earth has ever seen. Like, could 340 00:17:14,320 --> 00:17:16,800 Speaker 4: they actually nominate a movie that human beings go watch? 341 00:17:17,320 --> 00:17:20,040 Speaker 3: I mean, how many great movies have come out in 342 00:17:20,080 --> 00:17:21,160 Speaker 3: the last several years?
343 00:17:21,440 --> 00:17:22,760 Speaker 1: Very few, depressingly few. 344 00:17:23,000 --> 00:17:25,640 Speaker 3: Yeah, very few. The last Oscars came and went, I didn't 345 00:17:25,720 --> 00:17:26,960 Speaker 3: watch it. There's nothing to see. 346 00:17:27,400 --> 00:17:30,400 Speaker 4: I was sad that Gene Hackman just passed away, because 347 00:17:30,480 --> 00:17:32,800 Speaker 4: Unforgiven was spectacular, but that was a long time ago. 348 00:17:33,119 --> 00:17:33,920 Speaker 3: Unforgiven came out. 349 00:17:34,320 --> 00:17:39,400 Speaker 2: You've mentioned today here and before about the possibility of 350 00:17:39,760 --> 00:17:40,600 Speaker 2: someone wanting 351 00:17:40,400 --> 00:17:43,879 Speaker 3: to take you out, dealing with the death threats we see. 352 00:17:44,000 --> 00:17:45,760 Speaker 3: It's not in my imagination. You could just look on 353 00:17:45,800 --> 00:17:49,840 Speaker 3: social media. Yeah, but like, it's very clear. 354 00:17:50,000 --> 00:17:51,840 Speaker 1: Yeah. And look, I'm very familiar with it. 355 00:17:51,920 --> 00:17:55,119 Speaker 3: And they've got signs. There are people with signs at demonstrations, uh, 356 00:17:55,400 --> 00:17:56,479 Speaker 3: saying that I need to die. 357 00:17:57,000 --> 00:18:00,840 Speaker 4: Do you think these are just whack jobs? Or do 358 00:18:00,920 --> 00:18:05,120 Speaker 4: you think there are foreign... Do you think 359 00:18:05,200 --> 00:18:07,360 Speaker 4: there are foreign entities behind this? Do you think there are 360 00:18:07,400 --> 00:18:10,359 Speaker 4: domestic entities behind the threats? And also the attacks on, 361 00:18:10,440 --> 00:18:14,040 Speaker 4: not Twitter, but Tesla. I mean, you know, 362 00:18:14,080 --> 00:18:17,960 Speaker 4: you're getting Tesla charging stations lit on fire. Do you 363 00:18:18,400 --> 00:18:20,240 Speaker 4: think that's organized and paid for?
364 00:18:21,160 --> 00:18:23,160 Speaker 3: Yes, at least some of it is organized and paid 365 00:18:23,200 --> 00:18:30,520 Speaker 3: for, I think by domestic, you know, basically left wing 366 00:18:30,560 --> 00:18:37,399 Speaker 3: organizations in America funded by left wing billionaires, essentially. Is 367 00:18:37,440 --> 00:18:39,840 Speaker 3: it like Act Blue, or what? Act Blue is one 368 00:18:39,840 --> 00:18:44,920 Speaker 3: of them. You know, Arabella, you know, the classic, it's 369 00:18:44,920 --> 00:18:49,680 Speaker 3: funded by the, you know, the blue, basically the left 370 00:18:49,720 --> 00:18:51,880 Speaker 3: wing NGO cabal. 371 00:18:52,520 --> 00:18:54,600 Speaker 2: How big of a threat is it to what you've 372 00:18:54,640 --> 00:18:57,040 Speaker 2: built at Tesla? I mean, I remember when Teslas came out, 373 00:18:57,080 --> 00:18:59,840 Speaker 2: it was people that didn't want to have gas cars. 374 00:19:00,119 --> 00:19:02,960 Speaker 2: A lot of it was environmental reasons. I jokingly said, 375 00:19:03,000 --> 00:19:04,800 Speaker 2: I was like, I'm a Texas guy, I'm always going 376 00:19:04,840 --> 00:19:08,000 Speaker 2: to have something that burns gas. My kids now, all 377 00:19:08,000 --> 00:19:11,840 Speaker 2: three of my boys, think that Teslas are awesome. 378 00:19:11,880 --> 00:19:14,360 Speaker 2: The Cybertruck is the car they want their dad 379 00:19:14,440 --> 00:19:16,560 Speaker 2: to buy, which I laugh at, because I never could have 380 00:19:16,600 --> 00:19:18,560 Speaker 2: imagined that five years ago. 381 00:19:19,040 --> 00:19:20,840 Speaker 3: And now I'm looking at, we're at 382 00:19:20,600 --> 00:19:23,480 Speaker 4: the White House and the President's Tesla's parked. Yeah, I mean, 383 00:19:23,520 --> 00:19:25,400 Speaker 4: which is the wildest thing. But I mean, you've 384 00:19:25,480 --> 00:19:26,240 Speaker 4: changed a generation.
385 00:19:26,359 --> 00:19:28,120 Speaker 2: When you look at, my kids are six and eight 386 00:19:28,119 --> 00:19:31,280 Speaker 2: and they're going, Dad, buy a Cybertruck, and I'm 387 00:19:31,320 --> 00:19:34,480 Speaker 2: considering it. That's a full circle in a 388 00:19:34,520 --> 00:19:35,040 Speaker 2: weird way. 389 00:19:35,480 --> 00:19:37,239 Speaker 3: Yeah. Well, I do have the theory that the most 390 00:19:37,359 --> 00:19:41,960 Speaker 3: entertaining outcome is the most likely. So, yeah, it seems 391 00:19:42,040 --> 00:19:45,600 Speaker 3: often to be true. Like, what, uh, what twist or 392 00:19:45,680 --> 00:19:49,159 Speaker 3: turn of fate, uh, would get the highest ratings 393 00:19:49,160 --> 00:19:52,120 Speaker 3: if this was, if we were a TV show? What twist or 394 00:19:52,119 --> 00:19:54,360 Speaker 3: turn of fate would generate the highest ratings? There's 395 00:19:54,359 --> 00:19:55,320 Speaker 3: a good chance that happens. 396 00:19:55,560 --> 00:19:59,480 Speaker 4: Well, I will say, if Act Blue and the Arabella 397 00:19:59,520 --> 00:20:01,920 Speaker 4: Network, that was a huge scam, next level. Do you think 398 00:20:01,920 --> 00:20:04,280 Speaker 4: it's foreign money? Chinese money? Where do you think the 399 00:20:04,320 --> 00:20:05,960 Speaker 4: money in Act Blue is coming from? How do you 400 00:20:05,960 --> 00:20:06,640 Speaker 4: figure that out? 401 00:20:07,160 --> 00:20:09,560 Speaker 3: Well, it's not coming from a whole bunch of, 402 00:20:10,080 --> 00:20:14,800 Speaker 3: from a groundswell of public support, because when people looked 403 00:20:14,880 --> 00:20:17,520 Speaker 3: at the individual donors at Act Blue, a bunch of them turned 404 00:20:17,520 --> 00:20:20,600 Speaker 3: out to be like diehard Republicans, people who have never given 405 00:20:20,640 --> 00:20:22,520 Speaker 3: money in their life.
So you're going to track down 406 00:20:22,520 --> 00:20:24,520 Speaker 3: a bunch of these where it says, oh, I gave 407 00:20:24,560 --> 00:20:26,840 Speaker 3: sixteen thousand dollars, and they're like, I didn't give sixteen 408 00:20:26,840 --> 00:20:30,919 Speaker 3: thousand dollars. What are we talking about? Well, a whole bunch 409 00:20:30,920 --> 00:20:32,760 Speaker 3: of friends of mine found themselves on the Act 410 00:20:32,760 --> 00:20:33,560 Speaker 3: Blue list, like. 411 00:20:34,960 --> 00:20:38,200 Speaker 4: So if it can actually be shown that they 412 00:20:38,200 --> 00:20:43,080 Speaker 4: are funding firebombing of Tesla charging stations, that's objectively a 413 00:20:43,080 --> 00:20:46,040 Speaker 4: criminal act. That is funding terrorist activity, and the 414 00:20:46,080 --> 00:20:50,320 Speaker 4: statutes make clear that an incendiary device qualifies. 415 00:20:49,720 --> 00:20:52,119 Speaker 3: So burning that down is a terrorist activity. 416 00:20:52,800 --> 00:20:57,360 Speaker 4: Let me ask about AI. In ten years, how is life 417 00:20:57,440 --> 00:21:00,119 Speaker 4: going to be different because of AI for just a 418 00:21:00,160 --> 00:21:00,800 Speaker 4: normal person? 419 00:21:01,160 --> 00:21:04,000 Speaker 3: Well, ten years is a long time. In ten years, 420 00:21:04,040 --> 00:21:08,080 Speaker 3: probably AI can do anything better than a human can cognitively, 421 00:21:08,560 --> 00:21:11,840 Speaker 3: probably, almost. I think in ten years, based on the 422 00:21:11,840 --> 00:21:14,720 Speaker 3: current rate of improvement, AI will be smarter than the 423 00:21:14,720 --> 00:21:15,400 Speaker 3: smartest human. 424 00:21:16,119 --> 00:21:16,320 Speaker 1: Yeah. 425 00:21:16,400 --> 00:21:20,520 Speaker 3: Yeah. There will also be a massive number of robots, 426 00:21:20,920 --> 00:21:22,120 Speaker 3: humanoid robots.
427 00:21:22,600 --> 00:21:24,000 Speaker 4: By the way, I've got to ask, how come your 428 00:21:24,080 --> 00:21:26,760 Speaker 4: robots look so much like the creepy robots from I, Robot? 429 00:21:27,200 --> 00:21:28,600 Speaker 1: Was that intentional, or 430 00:21:28,560 --> 00:21:32,040 Speaker 3: just... I was hoping he was gonna say, yeah, just to 431 00:21:32,119 --> 00:21:35,920 Speaker 3: mess with you. It's not meant to look like any 432 00:21:36,800 --> 00:21:41,399 Speaker 3: prior robot, and we'll iterate the design, and you'll be 433 00:21:41,440 --> 00:21:44,600 Speaker 3: able to, a lot of the robot parts are cosmetic. 434 00:21:44,680 --> 00:21:47,639 Speaker 3: You'll be able to switch out the kind of snap 435 00:21:47,680 --> 00:21:50,000 Speaker 3: on cosmetic parts of the robot, make it look like 436 00:21:50,040 --> 00:21:55,440 Speaker 3: something else if you like. So there'll be ultimately billions 437 00:21:55,440 --> 00:21:59,399 Speaker 3: of humanoid robots. All cars will be self driving in 438 00:21:59,440 --> 00:22:03,440 Speaker 3: ten years. In ten years, probably ninety percent of miles 439 00:22:03,480 --> 00:22:08,280 Speaker 3: driven will be autonomous. Huh, wow, that fast. Yeah. In 440 00:22:08,280 --> 00:22:12,119 Speaker 3: five years, probably fifty percent of miles driven will 441 00:22:12,160 --> 00:22:12,720 Speaker 3: be autonomous. 442 00:22:12,760 --> 00:22:15,640 Speaker 4: Now, if AI will be smarter than any person, how 443 00:22:15,680 --> 00:22:19,920 Speaker 4: many jobs go away because of that? And what do 444 00:22:20,040 --> 00:22:22,159 Speaker 4: people do? If you've got billions of people that are 445 00:22:22,160 --> 00:22:24,400 Speaker 4: losing their jobs like that, a lot of people are 446 00:22:24,640 --> 00:22:26,080 Speaker 4: understandably freaked out about that. 447 00:22:26,480 --> 00:22:32,520 Speaker 3: Well, goods and services will become close to free.
So 448 00:22:33,200 --> 00:22:35,040 Speaker 3: it's not as though people will be wanting in terms 449 00:22:35,080 --> 00:22:35,960 Speaker 3: of goods and services. 450 00:22:37,200 --> 00:22:37,919 Speaker 1: So why is that? 451 00:22:37,960 --> 00:22:38,400 Speaker 3: What? 452 00:22:38,440 --> 00:22:40,960 Speaker 4: Why are goods and services free in an AI world, 453 00:22:41,480 --> 00:22:42,360 Speaker 4: or close to free? 454 00:22:42,680 --> 00:22:45,600 Speaker 3: Well, you have, I don't know, call it tens of 455 00:22:45,600 --> 00:22:50,119 Speaker 3: billions of robots. They will make you anything 456 00:22:50,280 --> 00:22:54,600 Speaker 3: or provide any service you want for basically next to nothing. 457 00:22:56,600 --> 00:22:59,359 Speaker 3: It's not that people will have a 458 00:22:59,359 --> 00:23:01,480 Speaker 3: lower standard of living. I mean, they'll actually have a much higher standard 459 00:23:01,520 --> 00:23:06,359 Speaker 3: of living. But the challenge will be fulfillment. How do 460 00:23:06,400 --> 00:23:08,760 Speaker 3: you derive fulfillment and meaning in life? 461 00:23:09,320 --> 00:23:14,720 Speaker 4: Is Skynet real? Like, you get the apocalyptic visions of AI. 462 00:23:15,680 --> 00:23:21,280 Speaker 4: How real is the prospect of killer robots annihilating humanity? 463 00:23:21,800 --> 00:23:23,439 Speaker 3: Likely? Maybe ten percent. 464 00:23:23,960 --> 00:23:24,879 Speaker 1: On what timeframe? 465 00:23:25,359 --> 00:23:27,480 Speaker 3: That ten years. So soon? 466 00:23:27,880 --> 00:23:30,560 Speaker 1: Like, you see a world where that's possible? 467 00:23:31,400 --> 00:23:32,879 Speaker 3: Yeah. But I mean, you can look at it like 468 00:23:32,920 --> 00:23:37,520 Speaker 3: the glass is eighty, ninety percent full, meaning like eighty percent 469 00:23:37,800 --> 00:23:41,200 Speaker 3: likely we'll have extreme prosperity for all.
470 00:23:41,720 --> 00:23:44,119 Speaker 4: Now, I guess my view, we're in a race to 471 00:23:44,560 --> 00:23:47,480 Speaker 4: win AI. We're in a race with China, and my 472 00:23:47,600 --> 00:23:49,440 Speaker 4: view is, if there are going to be killer robots, I'd 473 00:23:49,480 --> 00:23:53,679 Speaker 4: rather they be American killer robots than Chinese. How likely 474 00:23:54,359 --> 00:23:56,680 Speaker 4: are we to win right now? Is America winning right now? 475 00:23:56,720 --> 00:23:59,000 Speaker 4: And how likely is America to win the race for 476 00:23:59,080 --> 00:24:01,200 Speaker 4: AI versus China or anyone else? 477 00:24:01,640 --> 00:24:03,880 Speaker 3: Well, the next few years, I think America is likely 478 00:24:03,920 --> 00:24:06,800 Speaker 3: to win. Then it will be a function of who 479 00:24:06,800 --> 00:24:11,919 Speaker 3: controls the AI chip fabrication, the factories that make 480 00:24:11,920 --> 00:24:14,720 Speaker 3: the AI chips, who controls them. If 481 00:24:15,280 --> 00:24:17,120 Speaker 3: more of them are controlled by China, then China 482 00:24:17,119 --> 00:24:17,639 Speaker 3: will win. 483 00:24:18,080 --> 00:24:20,720 Speaker 4: More of the factories that are making the AI chips, 484 00:24:20,960 --> 00:24:23,719 Speaker 4: you think that will determine it? Yes. And how are 485 00:24:23,760 --> 00:24:25,600 Speaker 4: we doing versus China on that front? 486 00:24:26,200 --> 00:24:31,920 Speaker 3: Well, right now, almost all the advanced AI chip factories, 487 00:24:31,920 --> 00:24:35,360 Speaker 3: they call them fabs, are in Taiwan. 488 00:24:35,560 --> 00:24:38,600 Speaker 1: And what if China invaded? One hundred miles away. What 489 00:24:38,680 --> 00:24:39,640 Speaker 1: happens if China, 490 00:24:39,920 --> 00:24:43,080 Speaker 4: if China invades Taiwan, what happens to the world?
491 00:24:43,400 --> 00:24:45,879 Speaker 3: Well, if they were to invade in the near term, 492 00:24:47,040 --> 00:24:49,800 Speaker 3: the world would be cut off from advanced AI chips. 493 00:24:50,160 --> 00:24:53,360 Speaker 3: And currently one hundred percent of advanced AI chips are made 494 00:24:53,400 --> 00:24:53,880 Speaker 3: in Taiwan. 495 00:24:54,240 --> 00:24:56,160 Speaker 2: How fast can we put that online in America? How 496 00:24:56,160 --> 00:24:57,680 Speaker 2: important is that for national security? 497 00:24:58,160 --> 00:25:01,199 Speaker 3: I think it's essential for national security, and we're not 498 00:25:01,240 --> 00:25:01,800 Speaker 3: doing enough. 499 00:25:02,080 --> 00:25:04,760 Speaker 4: You're fifty three years old. I'm one hundred and eighteen 500 00:25:04,840 --> 00:25:06,720 Speaker 4: days older than you, by the way. What the hell have I 501 00:25:06,760 --> 00:25:07,480 Speaker 4: done in my life? 502 00:25:07,520 --> 00:25:07,720 Speaker 1: I know, 503 00:25:07,840 --> 00:25:08,119 Speaker 3: right? 504 00:25:08,560 --> 00:25:09,640 Speaker 1: Fifty three years old. 505 00:25:10,240 --> 00:25:14,720 Speaker 3: Well, so seventy one was a great year. 506 00:25:15,240 --> 00:25:17,720 Speaker 4: And I was December seventy, because I was just 507 00:25:17,840 --> 00:25:20,120 Speaker 4: right before you. You were the summer of seventy one. 508 00:25:21,560 --> 00:25:23,520 Speaker 3: I was born sixty nine days after four twenty. 509 00:25:23,920 --> 00:25:27,440 Speaker 1: Wow. I did ask Ben, this is true. 510 00:25:27,520 --> 00:25:32,119 Speaker 4: Look, I did ask Ben, should I show up and 511 00:25:32,160 --> 00:25:35,000 Speaker 4: pull out a joint and say, can we beat Rogan's views? 512 00:25:35,040 --> 00:25:38,439 Speaker 4: But I was pretty sure it might cause a scandal 513 00:25:38,480 --> 00:25:39,880 Speaker 4: if we spent a podcast.
514 00:25:40,680 --> 00:25:45,480 Speaker 3: It just turned out to be like a chocolate cigar. Yeah. 515 00:25:45,560 --> 00:25:47,920 Speaker 4: Let me ask you, if today was your last day 516 00:25:47,920 --> 00:25:51,439 Speaker 4: on Earth. Yeah, well, I'm not suggesting it's going to be, 517 00:25:51,440 --> 00:25:52,800 Speaker 4: but if it were, what do you think your biggest 518 00:25:52,840 --> 00:25:55,880 Speaker 4: legacy would be, of everything you've done? One hundred years 519 00:25:55,920 --> 00:25:58,159 Speaker 4: from now, what do you think people would remember, 520 00:25:58,600 --> 00:26:00,919 Speaker 4: if it were zero to today? 521 00:26:01,040 --> 00:26:04,520 Speaker 3: And will you ever go to space? In the 522 00:26:04,560 --> 00:26:06,960 Speaker 3: distant future, one hundred or one thousand years from now, 523 00:26:07,520 --> 00:26:10,840 Speaker 3: if SpaceX got humans to Mars, that's what they would 524 00:26:10,880 --> 00:26:11,440 Speaker 3: remember me for. 525 00:26:12,119 --> 00:26:13,760 Speaker 1: All right, final set of questions. 526 00:26:13,960 --> 00:26:16,480 Speaker 4: Who's the smartest guy you've ever met? You hang out 527 00:26:16,520 --> 00:26:18,720 Speaker 4: with some brilliant people. Like, when you look at 528 00:26:18,800 --> 00:26:22,399 Speaker 4: other CEOs, other than yourself, what CEO 529 00:26:22,480 --> 00:26:23,080 Speaker 3: do you say, 530 00:26:24,080 --> 00:26:25,280 Speaker 1: damn, that guy's good? 531 00:26:25,720 --> 00:26:30,600 Speaker 3: Larry Ellison's very smart. So I'd say Larry Ellison's one 532 00:26:30,600 --> 00:26:35,119 Speaker 3: of the smartest people I know. Larry Page. I mean, 533 00:26:35,160 --> 00:26:36,280 Speaker 3: there are a lot of people that are very smart. 534 00:26:36,280 --> 00:26:38,760 Speaker 3: It's hard to say.
Like, you know, I think the test of 535 00:26:38,760 --> 00:26:41,920 Speaker 3: whether someone is smart is, you know, 536 00:26:42,000 --> 00:26:48,840 Speaker 3: what have they done that is difficult and significant? 537 00:26:49,040 --> 00:26:51,920 Speaker 3: You know, Jeff Bezos has done a lot of difficult 538 00:26:51,960 --> 00:26:55,359 Speaker 3: and significant things. I mean, there are a lot of 539 00:26:55,400 --> 00:26:58,480 Speaker 3: smart humans. I call them smart for a human. 540 00:26:58,680 --> 00:27:00,080 Speaker 3: A lot of people are in the smart for 541 00:27:00,119 --> 00:27:00,920 Speaker 3: a human category. 542 00:27:01,440 --> 00:27:04,760 Speaker 1: All right, final lightning round. Star Wars or Star Trek? 543 00:27:05,320 --> 00:27:07,440 Speaker 3: The first movie I saw in the theater was Star Wars, 544 00:27:07,440 --> 00:27:09,159 Speaker 3: so I think it had a profound effect on me. 545 00:27:09,640 --> 00:27:13,040 Speaker 3: I was six years old, I think. Imagine, the first movie 546 00:27:13,040 --> 00:27:16,200 Speaker 3: you ever see in a theater is Star Wars. It's 547 00:27:16,200 --> 00:27:16,920 Speaker 3: gonna blow your mind. 548 00:27:17,000 --> 00:27:18,040 Speaker 1: Best Star Wars 549 00:27:17,760 --> 00:27:20,480 Speaker 3: movie? Empire Strikes Back. 550 00:27:20,920 --> 00:27:22,520 Speaker 1: The only objectively right answer. 551 00:27:22,560 --> 00:27:24,480 Speaker 4: I stood in line for three hours with my dad 552 00:27:24,480 --> 00:27:25,600 Speaker 4: to see it on opening day. 553 00:27:26,440 --> 00:27:29,440 Speaker 3: Kirk or Picard? I like them both. 554 00:27:29,440 --> 00:27:34,199 Speaker 4: For Kirk, again the objectively right answer. By the way, James T.
555 00:27:34,359 --> 00:27:37,359 Speaker 4: Kirk is a Republican and Picard is a Democrat, and 556 00:27:37,560 --> 00:27:41,080 Speaker 4: the left gets very mad when I say that. Best 557 00:27:41,119 --> 00:27:41,840 Speaker 4: Star Trek 558 00:27:41,640 --> 00:27:44,359 Speaker 3: movie? I mean, the original, the first Star Trek movie. 559 00:27:45,160 --> 00:27:51,880 Speaker 3: That's okay. Both of the Wrath of Khans 560 00:27:51,880 --> 00:27:54,280 Speaker 3: were pretty good. But yeah, the original Wrath of Khan. 561 00:27:54,720 --> 00:27:59,439 Speaker 1: Ricardo Montalban. Revenge is a dish best served cold. It 562 00:27:59,560 --> 00:28:00,240 Speaker 1: is very 563 00:28:00,400 --> 00:28:05,160 Speaker 4: cold in space. Although I will say Wrath of Khan is objectively 564 00:28:05,200 --> 00:28:07,560 Speaker 4: the right answer, Four is a sleeper. When 565 00:28:07,600 --> 00:28:10,480 Speaker 4: they go back to San Francisco and go 566 00:28:10,520 --> 00:28:13,720 Speaker 4: find the whales, and, you know, Scotty picks up 567 00:28:14,119 --> 00:28:15,920 Speaker 4: the mouse and talks to it and goes, 568 00:28:16,000 --> 00:28:16,640 Speaker 4: a keyboard, 569 00:28:16,880 --> 00:28:19,320 Speaker 1: how quaint. That's a sleeper. 570 00:28:19,359 --> 00:28:19,680 Speaker 3: All right, 571 00:28:19,760 --> 00:28:22,160 Speaker 1: last question. Did Han shoot first? 572 00:28:24,040 --> 00:28:25,359 Speaker 3: It seemed like he shot second. 573 00:28:25,920 --> 00:28:26,639 Speaker 1: This is Verdict. 574 00:28:26,720 --> 00:28:29,200 Speaker 4: And by the way, I apologize. Ben, so Ben was 575 00:28:29,240 --> 00:28:31,600 Speaker 4: a jock and played tennis at Ole Miss, and so 576 00:28:31,600 --> 00:28:34,680 Speaker 4: occasionally when we geek out a little, and 577 00:28:34,720 --> 00:28:35,360 Speaker 4: y'all geek
578 00:28:35,200 --> 00:28:38,600 Speaker 3: out over there. Still, on the question, 579 00:28:38,840 --> 00:28:41,600 Speaker 3: the alien misses his first shot. So why 580 00:28:41,640 --> 00:28:43,400 Speaker 3: does he miss his first shot? Must have been because 581 00:28:43,400 --> 00:28:46,040 Speaker 3: he got shot first and he's missing a point blank 582 00:28:46,120 --> 00:28:48,680 Speaker 3: blaster shot, unless he got knocked off kilter. 583 00:28:48,720 --> 00:28:52,560 Speaker 4: But it's a real question, which is, is Han 584 00:28:52,680 --> 00:28:55,520 Speaker 4: Solo simply a hero or an anti hero? And 585 00:28:55,600 --> 00:28:58,840 Speaker 4: so I'm in the Han shot first category. I don't 586 00:28:58,880 --> 00:29:00,360 Speaker 4: like sanitized stories. 587 00:29:00,320 --> 00:29:02,400 Speaker 3: He would have had to have shot first, because otherwise why would 588 00:29:02,400 --> 00:29:05,680 Speaker 3: the alien miss at point blank range? Are you ever going 589 00:29:05,760 --> 00:29:07,360 Speaker 3: to go to outer space? Is that in your 590 00:29:07,400 --> 00:29:09,320 Speaker 3: life goals? Yeah, I'd like to go to Mars at 591 00:29:09,360 --> 00:29:12,160 Speaker 3: some point. And people have said, do I want to 592 00:29:12,200 --> 00:29:15,160 Speaker 3: die on Mars, and I say yes, just not on impact. 593 00:29:15,400 --> 00:29:16,760 Speaker 1: Now, that's a very good answer. 594 00:29:17,280 --> 00:29:21,480 Speaker 4: The astronauts on the space station, are they political prisoners? 595 00:29:22,480 --> 00:29:25,640 Speaker 4: Some of them are, because you could have given 596 00:29:25,640 --> 00:29:26,600 Speaker 4: them a ride back, and 597 00:29:28,360 --> 00:29:30,560 Speaker 1: Joe Biden said no, purely for politics. 598 00:29:30,920 --> 00:29:34,360 Speaker 3: Yeah, I mean, you know, there's been some debate about 599 00:29:34,360 --> 00:29:36,680 Speaker 3: this online.
But the thing is that it was 600 00:29:36,720 --> 00:29:39,920 Speaker 3: a very high level decision, so it wasn't really even 601 00:29:39,960 --> 00:29:43,960 Speaker 3: a NASA decision. It was just that the Biden White 602 00:29:43,960 --> 00:29:46,920 Speaker 3: House did not want to have someone who is pro 603 00:29:47,080 --> 00:29:52,240 Speaker 3: Trump rescuing astronauts right before the election, so they pushed it. 604 00:29:52,320 --> 00:29:53,880 Speaker 4: Well, if you're one of those astronauts, you've got to 605 00:29:53,880 --> 00:29:55,160 Speaker 4: be pretty pissed off about that. 606 00:29:55,560 --> 00:30:01,560 Speaker 3: Well, if they're a Democrat, yes. A Republican, like, everything's fine. 607 00:30:01,600 --> 00:30:03,960 Speaker 3: Fair enough. So I think one of them is a 608 00:30:03,960 --> 00:30:06,600 Speaker 3: Republican, and it depends on which one you ask. 609 00:30:06,920 --> 00:30:09,120 Speaker 1: Well, thank you, Elon. This was awesome. 610 00:30:09,320 --> 00:30:12,400 Speaker 4: And let me say, by the way, I put 611 00:30:12,440 --> 00:30:15,000 Speaker 4: out on X the day before yesterday, if you were 612 00:30:15,000 --> 00:30:16,880 Speaker 4: having a beer with Elon and could ask him anything, 613 00:30:16,960 --> 00:30:19,960 Speaker 4: what would you ask? And got lots of responses. The 614 00:30:20,000 --> 00:30:24,080 Speaker 4: most common response people said is, say thank you. Look, 615 00:30:24,280 --> 00:30:27,320 Speaker 4: Texans and the American people appreciate what you're doing. You 616 00:30:27,320 --> 00:30:29,320 Speaker 4: don't have to put up with this BS, and you're 617 00:30:29,360 --> 00:30:31,960 Speaker 4: doing it. I'm grateful. You're making a hell of a 618 00:30:32,000 --> 00:30:35,000 Speaker 4: difference for this country. I appreciate you, and the Americans 619 00:30:35,040 --> 00:30:35,640 Speaker 4: appreciate you.
620 00:30:35,920 --> 00:30:38,160 Speaker 3: Yeah, it's essential for the future of civilization. Otherwise I 621 00:30:38,200 --> 00:30:39,920 Speaker 3: wouldn't be doing it. Yes, it's not like I want 622 00:30:39,960 --> 00:30:43,040 Speaker 3: to get death threats, you know. Now, what 623 00:30:42,920 --> 00:30:45,600 Speaker 4: year does man first set foot on Mars? 624 00:30:45,920 --> 00:30:50,360 Speaker 3: I think the soonest would be twenty nine. Twenty nine, yes, 625 00:30:50,680 --> 00:30:53,240 Speaker 3: and I don't think it's more than two to four 626 00:30:53,320 --> 00:30:54,000 Speaker 3: years beyond that. 627 00:30:54,320 --> 00:30:56,959 Speaker 4: And that's not unmanned, that's a human being 628 00:30:57,280 --> 00:30:58,640 Speaker 4: putting his foot on the surface. 629 00:30:58,920 --> 00:31:00,560 Speaker 3: Yes, best case twenty nine. 630 00:31:00,680 --> 00:31:04,680 Speaker 4: And what do you put the odds of finding either 631 00:31:04,720 --> 00:31:07,120 Speaker 4: alien life or evidence of alien life? 632 00:31:07,160 --> 00:31:08,520 Speaker 3: I don't think we're going to find aliens. 633 00:31:08,600 --> 00:31:11,480 Speaker 1: Okay, but do we find ruins? Do we find remnants? 634 00:31:11,680 --> 00:31:13,400 Speaker 3: We may find the ruins of a long 635 00:31:13,440 --> 00:31:18,040 Speaker 3: dead alien civilization. That's possible. And we may find subterranean 636 00:31:18,480 --> 00:31:20,120 Speaker 3: microbial life. That's possible. 637 00:31:20,280 --> 00:31:20,600 Speaker 1: All right. 638 00:31:20,640 --> 00:31:23,800 Speaker 4: If man lands on Mars in twenty nine, how soon 639 00:31:23,880 --> 00:31:25,640 Speaker 4: after that do you land on 640 00:31:25,600 --> 00:31:28,040 Speaker 3: Mars? Remains to be seen. I'm not sure.
The important 641 00:31:28,080 --> 00:31:32,000 Speaker 3: thing is that we build a self sustaining city on 642 00:31:32,040 --> 00:31:37,320 Speaker 3: Mars as quickly as possible. The key threshold is when 643 00:31:37,360 --> 00:31:41,240 Speaker 3: that city can continue to grow, continue to prosper, even 644 00:31:41,280 --> 00:31:45,240 Speaker 3: when the supply ships from Earth stop coming. At that point, 645 00:31:45,440 --> 00:31:48,920 Speaker 3: even if something were to happen on Earth, it might not 646 00:31:48,960 --> 00:31:49,560 Speaker 3: be World War 647 00:31:49,520 --> 00:31:53,320 Speaker 1: Three, but it might be a bad virus. 648 00:31:53,920 --> 00:31:56,480 Speaker 3: Yeah, it might not be anything. It's like, let's say 649 00:31:56,480 --> 00:31:58,360 Speaker 3: civilization could die with a bang or a whimper. It 650 00:31:58,360 --> 00:32:00,960 Speaker 3: may be that civilization dies with a whimper rather 651 00:32:01,000 --> 00:32:04,720 Speaker 3: than a bang and simply loses the ability to send 652 00:32:04,760 --> 00:32:08,600 Speaker 3: ships to Mars. So you want Mars to become 653 00:32:08,680 --> 00:32:12,120 Speaker 3: self sustaining and be able to grow by itself before 654 00:32:12,520 --> 00:32:15,080 Speaker 3: the resupply ships from Earth stop coming. That is the 655 00:32:15,360 --> 00:32:20,440 Speaker 3: critical civilizational threshold, beyond which the probable lifespan 656 00:32:20,520 --> 00:32:22,080 Speaker 3: of civilization is much greater. 657 00:32:22,400 --> 00:32:25,160 Speaker 4: And how close are we technologically to being able to 658 00:32:25,200 --> 00:32:29,000 Speaker 4: do that, to have a self sustaining settlement on the 659 00:32:29,000 --> 00:32:29,720 Speaker 4: surface of Mars? 660 00:32:29,960 --> 00:32:31,840 Speaker 3: I think it can be done in twenty years, but it 661 00:32:31,800 --> 00:32:34,120 Speaker 4: would take twenty years. So we're not, in twenty nine,
662 00:32:34,160 --> 00:32:35,600 Speaker 4: we're not there. What are we missing? 663 00:32:35,680 --> 00:32:37,040 Speaker 1: What are the big technologies 664 00:32:37,240 --> 00:32:39,800 Speaker 3: we don't have? A few people running around the surface 665 00:32:40,240 --> 00:32:41,800 Speaker 3: in a hostile environment is not going to make it 666 00:32:41,800 --> 00:32:43,840 Speaker 3: self sustaining. So you're going to need on the order 667 00:32:43,880 --> 00:32:46,440 Speaker 3: of a million people, maybe a million tons of cargo. 668 00:32:46,680 --> 00:32:48,320 Speaker 4: So, but you think we could have a million people 669 00:32:48,320 --> 00:32:52,600 Speaker 4: on Mars in twenty years? Yes. And what's the technology 670 00:32:52,640 --> 00:32:54,840 Speaker 4: we're missing right now? When you think about a million 671 00:32:54,840 --> 00:32:57,400 Speaker 4: people on Mars, do we have the ability to get water, 672 00:32:57,480 --> 00:32:59,880 Speaker 4: to get food, to keep them safe? 673 00:33:00,000 --> 00:33:01,880 Speaker 1: I mean, what do we need to make that happen? 674 00:33:01,920 --> 00:33:04,320 Speaker 3: Well, you need to recreate the entire industrial base of Earth. 675 00:33:04,520 --> 00:33:07,600 Speaker 3: So, you know, we're here at the top of a 676 00:33:07,640 --> 00:33:12,240 Speaker 3: massive pyramid of industry that starts with mining a vast array 677 00:33:12,280 --> 00:33:16,720 Speaker 3: of materials. Those materials go through hundreds of steps of refinement. 678 00:33:17,800 --> 00:33:21,680 Speaker 3: We grow food, obviously. We grow trees, we make things 679 00:33:21,720 --> 00:33:24,240 Speaker 3: out of the trees. You've got to 680 00:33:24,280 --> 00:33:26,120 Speaker 3: build all that on Mars, and Mars 681 00:33:26,160 --> 00:33:29,160 Speaker 3: is a hostile environment.
It, you know, sometimes gets 682 00:33:29,160 --> 00:33:32,160 Speaker 3: above zero on a warm summer day near the equator 683 00:33:32,160 --> 00:33:35,000 Speaker 3: on Mars. Mainly, it's quite cold. How do you prep 684 00:33:35,040 --> 00:33:37,600 Speaker 3: for that? Well, in the beginning on Mars, you have 685 00:33:37,640 --> 00:33:42,160 Speaker 3: to have a life support habitation module, right? Like, you 686 00:33:42,400 --> 00:33:44,320 Speaker 3: can't just live outdoors. You can't breathe the 687 00:33:44,280 --> 00:33:46,320 Speaker 1: air. Like a dome, you think, is likely? 688 00:33:46,440 --> 00:33:47,920 Speaker 3: Yeah, glass domes, that type of thing. 689 00:33:48,080 --> 00:33:52,200 Speaker 4: Have you identified a location on Mars that is likely 690 00:33:52,280 --> 00:33:54,040 Speaker 4: to be ideal for a habitat? 691 00:33:54,520 --> 00:33:58,320 Speaker 3: Arcadia Planitia is 692 00:33:58,720 --> 00:34:01,840 Speaker 3: one of the good options. One of my daughters 693 00:34:01,880 --> 00:34:03,120 Speaker 3: is named Arcadia, after that. 694 00:34:04,160 --> 00:34:05,520 Speaker 1: And what makes that attractive? 695 00:34:05,760 --> 00:34:09,680 Speaker 3: My eldest son's middle name is Ares, Mars. You've been 696 00:34:09,719 --> 00:34:11,239 Speaker 3: thinking about this for a long time, if you're naming 697 00:34:11,239 --> 00:34:13,920 Speaker 3: your kids around it. My eldest kid's middle 698 00:34:14,000 --> 00:34:17,080 Speaker 3: name is essentially Mars. When did you get the dream? Like, 699 00:34:17,200 --> 00:34:22,200 Speaker 3: I mean, is it twenty, twenty one years? This is a decades old dream? Yeah. 700 00:34:22,320 --> 00:34:23,920 Speaker 4: So, like, when you were ten, did you look up 701 00:34:23,960 --> 00:34:24,960 Speaker 4: and say, I'm going to Mars? 702 00:34:25,320 --> 00:34:29,000 Speaker 3: No. No.
I read a lot of science fiction books 703 00:34:29,040 --> 00:34:32,799 Speaker 3: and programmed computers. But, funnily enough, the first 704 00:34:32,880 --> 00:34:36,560 Speaker 3: video game that I sold was a space video game 705 00:34:36,600 --> 00:34:38,520 Speaker 3: called Blastar. Maybe I was born this way? 706 00:34:39,040 --> 00:34:39,520 Speaker 1: How do you do it? 707 00:34:40,960 --> 00:34:44,680 Speaker 4: How do you become Elon Musk? Look, you're obviously smart 708 00:34:44,680 --> 00:34:46,719 Speaker 4: as hell, but there are 709 00:34:46,719 --> 00:34:49,200 Speaker 4: a lot of smart people that don't do squat. And you've 710 00:34:49,280 --> 00:34:56,160 Speaker 4: managed... everything you've touched has been an extraordinary success. Yeah, yeah. Look, 711 00:34:56,200 --> 00:34:57,799 Speaker 4: I mean, that's just objectively right. 712 00:34:57,880 --> 00:34:59,160 Speaker 1: So what has led to that? 713 00:34:59,200 --> 00:35:01,280 Speaker 4: Because there are other people for whom that's not true, 714 00:35:01,600 --> 00:35:03,760 Speaker 4: and they gaze at their navel and they don't do anything. 715 00:35:03,880 --> 00:35:06,920 Speaker 4: So what do you do differently that makes you so effective? 716 00:35:07,000 --> 00:35:09,400 Speaker 3: Well, I suppose I have a philosophy of curiosity. I 717 00:35:09,440 --> 00:35:11,920 Speaker 3: want to find out the nature of the universe, understand 718 00:35:11,920 --> 00:35:14,759 Speaker 3: the universe. And in order to do that, we have 719 00:35:14,800 --> 00:35:17,680 Speaker 3: to travel to other planets, see other star systems, maybe 720 00:35:17,680 --> 00:35:21,879 Speaker 3: other galaxies, find perhaps other alien civilizations, or at least 721 00:35:21,880 --> 00:35:25,560 Speaker 3: the remnants of alien civilizations, gain a better understanding of 722 00:35:25,560 --> 00:35:28,719 Speaker 3: where's the universe going, where did it come from?
And what questions 723 00:35:28,920 --> 00:35:31,160 Speaker 3: do we not yet know to ask about the answer 724 00:35:31,239 --> 00:35:32,080 Speaker 3: that is the universe? 725 00:35:32,400 --> 00:35:36,040 Speaker 4: So let's go back twenty-five years, late nineties. 726 00:35:36,280 --> 00:35:38,880 Speaker 1: You're at PayPal. How do you turn PayPal into the 727 00:35:38,920 --> 00:35:41,480 Speaker 1: success it was, which then helped launch you to 728 00:35:41,480 --> 00:35:41,960 Speaker 1: the next one? 729 00:35:41,960 --> 00:35:43,920 Speaker 3: And the next one, yeah. So I studied physics and 730 00:35:44,000 --> 00:35:47,280 Speaker 3: economics in college, which is a good foundation for understanding 731 00:35:47,280 --> 00:35:51,920 Speaker 3: how the economy works and how reality works, and then 732 00:35:52,239 --> 00:35:56,640 Speaker 3: was going to do a PhD at Stanford in advanced 733 00:35:57,120 --> 00:36:02,480 Speaker 3: ultracapacitors, actually, as a potential means of energy storage 734 00:36:02,520 --> 00:36:06,759 Speaker 3: for electric transport. Put that on hold to start an 735 00:36:06,760 --> 00:36:10,480 Speaker 3: Internet company. Essentially came to the conclusion that the Internet 736 00:36:10,680 --> 00:36:13,080 Speaker 3: was one of those rare things, and I could either 737 00:36:13,120 --> 00:36:15,800 Speaker 3: watch it happen while a grad student or participate in it. And 738 00:36:15,880 --> 00:36:18,000 Speaker 3: I figured I can always go back to grad school. Grad 739 00:36:18,000 --> 00:36:19,680 Speaker 3: school is going to be kind of the same. But 740 00:36:20,920 --> 00:36:23,080 Speaker 3: I couldn't bear the thought of just watching the Internet happen. 741 00:36:23,320 --> 00:36:24,759 Speaker 3: So I wanted to be a part of building it. 742 00:36:24,880 --> 00:36:28,080 Speaker 3: So I created an Internet company.
We did the 743 00:36:28,080 --> 00:36:31,880 Speaker 3: first maps, directions, yellow pages, white pages on the Internet. 744 00:36:31,880 --> 00:36:34,200 Speaker 3: I actually wrote the first version of the software just by 745 00:36:34,200 --> 00:36:38,200 Speaker 3: myself in ninety-five, and we ended up selling that 746 00:36:38,239 --> 00:36:42,319 Speaker 3: to Compaq, a Texas company, I guess, for about three hundred 747 00:36:42,320 --> 00:36:45,320 Speaker 3: million dollars in cash, about four years after I graduated. Wow. 748 00:36:45,880 --> 00:36:49,240 Speaker 3: I should say, just to preface, that I graduated 749 00:36:49,280 --> 00:36:51,440 Speaker 3: with about one hundred thousand dollars of student debt. So 750 00:36:51,440 --> 00:36:56,480 Speaker 3: it wasn't... Yeah, you and me both. Yeah, yeah, I know. 751 00:36:57,680 --> 00:36:59,520 Speaker 3: And when I first arrived in North America, I arrived 752 00:36:59,520 --> 00:37:02,080 Speaker 3: with twenty-five hundred dollars, a bag of books and 753 00:37:02,520 --> 00:37:03,360 Speaker 3: a bag of clothes. 754 00:37:03,480 --> 00:37:05,720 Speaker 4: All right. So you sell the company for three hundred million. 755 00:37:05,880 --> 00:37:07,360 Speaker 4: How much does that change your life? 756 00:37:07,400 --> 00:37:13,200 Speaker 3: Well, I got twenty-one million dollars from that. But 757 00:37:13,239 --> 00:37:16,080 Speaker 3: I wanted to do more on the internet, so I started 758 00:37:16,080 --> 00:37:18,960 Speaker 3: a company called x dot com, which merged with a 759 00:37:19,000 --> 00:37:23,080 Speaker 3: company called Confinity, which was Peter Thiel and Max Levchin. Yep. 760 00:37:23,239 --> 00:37:26,319 Speaker 3: And the combined company was actually at first still called 761 00:37:26,480 --> 00:37:28,680 Speaker 3: x dot com, but we later changed the name 762 00:37:28,719 --> 00:37:31,879 Speaker 3: of the company to PayPal.
Because of all the name changes, 763 00:37:31,880 --> 00:37:34,200 Speaker 3: it's kind of confusing. But the company that people 764 00:37:34,320 --> 00:37:38,160 Speaker 3: know as PayPal today, I actually filed the 765 00:37:38,160 --> 00:37:40,360 Speaker 3: incorporation documents for that company. Interesting. 766 00:37:40,640 --> 00:37:43,640 Speaker 4: Yeah. Well, as you know, Peter Thiel and I 767 00:37:43,719 --> 00:37:46,319 Speaker 4: were buddies back in the mid-nineties, before he went 768 00:37:46,360 --> 00:37:48,200 Speaker 4: and did any of this. You know, I became 769 00:37:48,239 --> 00:37:50,319 Speaker 4: friends with him when he was a corporate lawyer in 770 00:37:50,320 --> 00:37:52,560 Speaker 4: New York and just sort of a young libertarian with 771 00:37:52,960 --> 00:37:53,760 Speaker 4: a lot of dreams. 772 00:37:53,960 --> 00:37:55,520 Speaker 1: So it's been a heck of a journey. 773 00:37:56,200 --> 00:37:59,000 Speaker 3: Yeah, yeah. Now, Peter was involved in a coup. 774 00:38:00,360 --> 00:38:03,160 Speaker 3: We had a little sort of knifing-in-the-Senate situation, 775 00:38:03,840 --> 00:38:09,880 Speaker 3: uh, where, you know, they did a coup on me 776 00:38:10,440 --> 00:38:11,120 Speaker 3: at PayPal. 777 00:38:12,480 --> 00:38:14,560 Speaker 1: Did you all make peace after that? 778 00:38:14,640 --> 00:38:17,600 Speaker 3: Yeah, yeah, yeah. I mean, I was doing a lot 779 00:38:17,600 --> 00:38:19,680 Speaker 3: of sort of risky moves that I think ultimately would 780 00:38:19,719 --> 00:38:23,080 Speaker 3: have been successful. But I then went on a two- 781 00:38:23,120 --> 00:38:28,480 Speaker 3: week trip, which was a dual money-raising trip and honeymoon. 782 00:38:28,560 --> 00:38:31,239 Speaker 3: I had not done my honeymoon earlier in the year, 783 00:38:31,400 --> 00:38:33,640 Speaker 3: so I was raising money while doing the honeymoon.
But 784 00:38:33,680 --> 00:38:35,560 Speaker 3: I was kind of away. How did that go over, by 785 00:38:35,600 --> 00:38:38,480 Speaker 3: the way? It worked. It worked. There you go. Kind 786 00:38:38,520 --> 00:38:41,320 Speaker 3: of worked. I raised money, yeah, and we had a honeymoon. 787 00:38:41,400 --> 00:38:44,520 Speaker 3: There you go. So, yeah. But you don't want to 788 00:38:44,520 --> 00:38:48,640 Speaker 3: be away from the battle when things are scary, so 789 00:38:48,680 --> 00:38:50,840 Speaker 3: I was not there to assuage the concerns of the troops. 790 00:38:51,719 --> 00:38:56,960 Speaker 3: And anyway, uh, we patched things up, and have 791 00:38:57,120 --> 00:39:01,600 Speaker 3: been friends nonetheless, and you know, these days I'll, like, 792 00:39:01,680 --> 00:39:03,799 Speaker 3: stay at his house and stuff. So, super friends. And 793 00:39:03,840 --> 00:39:06,160 Speaker 3: he's also invested in most of my companies. 794 00:39:07,000 --> 00:39:09,680 Speaker 4: All right. So, two thousand and two, you start SpaceX. 795 00:39:10,040 --> 00:39:12,120 Speaker 4: Like, how do you start a rocket company? Like, what's 796 00:39:12,160 --> 00:39:14,680 Speaker 4: the first day where you're like, I want to make 797 00:39:14,760 --> 00:39:16,200 Speaker 4: rockets and I want to go to Mars? 798 00:39:16,239 --> 00:39:17,440 Speaker 1: Like, what do you do on day one? 799 00:39:17,640 --> 00:39:19,600 Speaker 3: So I think you have to start with some 800 00:39:19,640 --> 00:39:23,000 Speaker 3: sort of philosophical premise. In order 801 00:39:23,040 --> 00:39:24,960 Speaker 3: to be 802 00:39:25,000 --> 00:39:29,719 Speaker 3: highly motivated, you have to have some philosophical foundation.
In 803 00:39:29,719 --> 00:39:33,160 Speaker 3: my case, it was that we want to expand 804 00:39:33,239 --> 00:39:37,440 Speaker 3: the scope and scale of consciousness to better understand the 805 00:39:37,520 --> 00:39:41,359 Speaker 3: nature of the universe. And in order to 806 00:39:42,080 --> 00:39:45,560 Speaker 3: expand consciousness, we need to go beyond one planet. On 807 00:39:45,640 --> 00:39:48,400 Speaker 3: one planet, there's too much risk. You know, hopefully 808 00:39:48,400 --> 00:39:51,360 Speaker 3: Earth civilization prospers very far into the future, but it 809 00:39:51,360 --> 00:39:53,640 Speaker 3: may not. There's always some risk that we 810 00:39:53,680 --> 00:39:56,839 Speaker 3: self-annihilate through nuclear war, or that there's a big 811 00:39:56,920 --> 00:39:59,319 Speaker 3: meteor that takes us out like the dinosaurs. Yep, there's 812 00:39:59,320 --> 00:40:01,360 Speaker 3: always some risk if all your eggs are in one basket. 813 00:40:01,400 --> 00:40:04,880 Speaker 3: So it's going to be better if we're a multiplanet species. 814 00:40:04,960 --> 00:40:07,000 Speaker 3: And then once we're a multiplanet species, the next 815 00:40:07,000 --> 00:40:09,440 Speaker 3: step would be to be multi-stellar and have 816 00:40:09,800 --> 00:40:14,000 Speaker 3: a civilization among many different star systems. So in 817 00:40:14,000 --> 00:40:15,720 Speaker 3: two thousand and one, I didn't think that I could, 818 00:40:15,840 --> 00:40:17,400 Speaker 3: I didn't think I could start a rocket company.
So 819 00:40:17,440 --> 00:40:20,800 Speaker 3: I thought I'd take some of the money from PayPal, 820 00:40:21,040 --> 00:40:23,400 Speaker 3: which in that case I think was about one hundred and 821 00:40:23,440 --> 00:40:26,719 Speaker 3: eighty-eight million dollars after tax, something like that, and 822 00:40:27,200 --> 00:40:28,840 Speaker 3: I thought, you know, I don't need one hundred and 823 00:40:28,840 --> 00:40:31,560 Speaker 3: eighty million dollars, so I'll spend a bunch of it 824 00:40:31,640 --> 00:40:36,680 Speaker 3: on a philanthropic Mars mission to get the public excited 825 00:40:36,880 --> 00:40:39,279 Speaker 3: about going to Mars. We're going to Mars, I 826 00:40:39,320 --> 00:40:42,040 Speaker 3: should say. Yeah, Mars was always going to be the 827 00:40:42,080 --> 00:40:45,000 Speaker 3: destination after the Moon, right? In fact, if you told 828 00:40:45,040 --> 00:40:47,760 Speaker 3: people in nineteen sixty-nine that it would be twenty 829 00:40:47,800 --> 00:40:49,680 Speaker 3: twenty-five and we'd not even gone back to 830 00:40:49,640 --> 00:40:51,440 Speaker 1: the Moon, let alone... it's hard to believe. 831 00:40:51,680 --> 00:40:54,520 Speaker 3: Let alone Mars. They'd be like, what happened? Did the 832 00:40:54,560 --> 00:40:59,239 Speaker 3: civilization collapse? It would be incomprehensible that we've not 833 00:40:59,360 --> 00:41:01,879 Speaker 3: been to Mars, if you told people this after 834 00:41:01,960 --> 00:41:03,120 Speaker 3: landing on the Moon in sixty-nine. 835 00:41:03,160 --> 00:41:05,120 Speaker 1: What do you think? In fifty years, America never 836 00:41:05,160 --> 00:41:05,919 Speaker 1: went back to the moon.
837 00:41:06,040 --> 00:41:08,640 Speaker 3: Well, we destroyed the Saturn V rocket that 838 00:41:08,920 --> 00:41:10,960 Speaker 3: could take you to the Moon, and had the Space Shuttle, 839 00:41:10,960 --> 00:41:13,680 Speaker 3: which could only go to low Earth orbit, and then there 840 00:41:13,680 --> 00:41:17,719 Speaker 3: really hasn't been anything to replace it. No vehicle has 841 00:41:17,719 --> 00:41:20,480 Speaker 3: been made since then that can go to the Moon 842 00:41:20,600 --> 00:41:24,520 Speaker 3: or to Mars, until the SpaceX Starship rocket. Yeah. So 843 00:41:24,840 --> 00:41:26,479 Speaker 3: you can't go to Mars if you don't have the ride. 844 00:41:26,719 --> 00:41:30,360 Speaker 4: So I remember you and I first met in twenty 845 00:41:30,400 --> 00:41:33,560 Speaker 4: thirteen, when I was a brand-new baby senator. Yeah. 846 00:41:33,680 --> 00:41:35,920 Speaker 4: And I was still down in the basement office. They 847 00:41:35,960 --> 00:41:38,279 Speaker 4: stick freshman senators in the basement 848 00:41:37,960 --> 00:41:38,960 Speaker 1: office, kind of like hazing. 849 00:41:39,040 --> 00:41:39,839 Speaker 3: Yeah, yeah, that's what they say. 850 00:41:39,840 --> 00:41:42,759 Speaker 1: There are a hundred Senate offices, but for 851 00:41:42,800 --> 00:41:43,920 Speaker 1: six months you stay in the basement. 852 00:41:44,760 --> 00:41:48,360 Speaker 3: It's like, well, I mean, you know where you're supposed to be. 853 00:41:48,400 --> 00:41:49,040 Speaker 1: I got to stay. 854 00:41:49,040 --> 00:41:50,759 Speaker 4: Now, thirteen years into it, I think there's a lot 855 00:41:50,760 --> 00:41:54,120 Speaker 4: of wisdom to doing that.
But you were down in 856 00:41:54,120 --> 00:41:55,960 Speaker 4: the basement office, and I remember you were coming and 857 00:41:55,960 --> 00:41:58,759 Speaker 4: sitting down, with SpaceX, and at the time the Air 858 00:41:58,800 --> 00:42:01,280 Speaker 4: Force was not letting you all bid to launch satellites. 859 00:42:01,320 --> 00:42:03,080 Speaker 4: And so you were coming and saying, look, we've got 860 00:42:03,080 --> 00:42:04,440 Speaker 4: a company, I think we can do a really good 861 00:42:04,520 --> 00:42:06,200 Speaker 4: job of this, and yet we're locked out of this. 862 00:42:06,719 --> 00:42:09,719 Speaker 4: It's a little amazing to think of the journey SpaceX has 863 00:42:09,760 --> 00:42:11,359 Speaker 4: gone on from then to now. 864 00:42:11,880 --> 00:42:13,839 Speaker 3: Yes, it's hard to believe that this is all real. 865 00:42:15,040 --> 00:42:18,600 Speaker 3: Because originally, consistent with my belief that we need to 866 00:42:18,640 --> 00:42:21,040 Speaker 3: become a multiplanet species, I thought the only way to 867 00:42:21,040 --> 00:42:23,839 Speaker 3: do that would be through NASA. So I think 868 00:42:23,920 --> 00:42:26,040 Speaker 3: I thought, well, if I can just get the public 869 00:42:26,040 --> 00:42:28,720 Speaker 3: excited about Mars, then they'll do a mission to Mars. 870 00:42:29,040 --> 00:42:31,759 Speaker 3: And so initially my thought was to send 871 00:42:31,800 --> 00:42:36,000 Speaker 3: a small greenhouse with seeds and dehydrated nutrient gel, then 872 00:42:36,200 --> 00:42:39,439 Speaker 3: land the greenhouse, hydrate the seeds, and you'd see 873 00:42:39,840 --> 00:42:42,040 Speaker 3: the sort of money shot. The money shot would 874 00:42:42,080 --> 00:42:45,680 Speaker 3: be green plants on a red background. I've also recently 875 00:42:45,760 --> 00:42:48,319 Speaker 3: learned that money shot has a different meaning in some 876 00:42:48,560 --> 00:42:55,960 Speaker 3: other arenas.
But, yeah. What I'm trying to 877 00:42:56,000 --> 00:43:00,560 Speaker 3: say is the captivating shot would be the green plants 878 00:43:00,600 --> 00:43:03,520 Speaker 3: on a red background, and then hopefully, if 879 00:43:03,520 --> 00:43:05,320 Speaker 3: you did something like that, that would get the public excited 880 00:43:05,320 --> 00:43:07,759 Speaker 3: about Mars, that would increase NASA's budget, and then we 881 00:43:07,800 --> 00:43:11,400 Speaker 3: could send people to Mars. The dream was for NASA to do this? Yes, 882 00:43:11,640 --> 00:43:16,359 Speaker 3: not me. The original plan was literally to take 883 00:43:16,360 --> 00:43:19,160 Speaker 3: a bunch of the money from PayPal and, I guess 884 00:43:19,160 --> 00:43:23,200 Speaker 3: by some people's definition, waste it, with no profit, on a 885 00:43:23,239 --> 00:43:26,239 Speaker 3: non-profit thing. I wanted to spend a whole bunch 886 00:43:26,280 --> 00:43:28,239 Speaker 3: of my money for free to get NASA's budget to 887 00:43:28,239 --> 00:43:30,760 Speaker 3: be bigger so we could go to friggin' Mars, right? Wow. 888 00:43:31,400 --> 00:43:33,680 Speaker 3: That's what I wanted. So that was the holy grail. 889 00:43:33,920 --> 00:43:35,560 Speaker 3: That's what I wanted. I was like... So, when did 890 00:43:35,600 --> 00:43:38,359 Speaker 3: you decide to do Mars yourself? That's what I want to know. 891 00:43:38,480 --> 00:43:41,040 Speaker 1: Well, when did it strike you, okay, you're going to 892 00:43:41,040 --> 00:43:42,200 Speaker 3: have to do this? If you want, I'll tell you. 893 00:43:42,239 --> 00:43:44,799 Speaker 3: It gets crazier, all right? It gets crazier. So 894 00:43:44,840 --> 00:43:46,839 Speaker 3: I couldn't afford any of the US rockets, because, 895 00:43:46,880 --> 00:43:48,560 Speaker 3: as you know, the US rockets are way too expensive. 896 00:43:48,800 --> 00:43:51,360 Speaker 3: Boeing, Lockheed rockets are crazy money.
897 00:43:51,840 --> 00:43:53,400 Speaker 3: I didn't think, even with one hundred eighty million, there was 898 00:43:53,400 --> 00:43:55,760 Speaker 3: any way I could have afforded them back then. Well, actually, 899 00:43:56,080 --> 00:44:00,960 Speaker 3: with the additional stage to get to Mars, it 900 00:44:00,960 --> 00:44:03,040 Speaker 3: would have been about, like, eighty million. So technically I 901 00:44:03,040 --> 00:44:05,319 Speaker 3: could have afforded one of them, but I wanted to 902 00:44:05,320 --> 00:44:08,040 Speaker 3: do two in case one of them didn't work. Yeah. So, 903 00:44:09,000 --> 00:44:10,520 Speaker 3: and then I didn't have enough money for that. Yeah. 904 00:44:10,520 --> 00:44:12,920 Speaker 3: And I was sort of prepared to, you know, I 905 00:44:12,920 --> 00:44:16,600 Speaker 3: don't know, waste half the money. And I figured if 906 00:44:16,600 --> 00:44:18,400 Speaker 3: I had ninety million left, that'd be fine, you know, 907 00:44:19,040 --> 00:44:21,600 Speaker 3: but ideally not all of it. So I went to 908 00:44:21,680 --> 00:44:26,200 Speaker 3: Russia twice to try to buy ICBMs. How'd that go? 909 00:44:26,600 --> 00:44:29,360 Speaker 3: And who do you call? The Russian rocket forces? 910 00:44:29,920 --> 00:44:31,719 Speaker 1: Do they sell ICBMs? Does that work? 911 00:44:31,920 --> 00:44:32,200 Speaker 3: Yeah. 912 00:44:33,760 --> 00:44:35,359 Speaker 2: You've got to tell us the story. I want 913 00:44:35,360 --> 00:44:38,280 Speaker 2: to know, who do you call? Can you buy anything in Russia? 914 00:44:38,360 --> 00:44:41,680 Speaker 3: Yeah. 915 00:44:41,800 --> 00:44:43,680 Speaker 2: Like, please walk me down that. I want to know how you made that phone call, 916 00:44:43,760 --> 00:44:45,600 Speaker 2: and when you got there, how did that work? 917 00:44:45,600 --> 00:44:48,080 Speaker 3: And what do you tell your friends? Yeah, listen, I'm 918 00:44:48,120 --> 00:44:50,840 Speaker 3: going to Russia to buy some ICBMs.
I might not return, 919 00:44:51,120 --> 00:44:57,319 Speaker 3: you know. In this situation? Literally. Yeah. So, I guess it's 920 00:44:57,360 --> 00:45:02,480 Speaker 3: slightly less insane when you understand that the Russians 921 00:45:02,520 --> 00:45:06,160 Speaker 3: had to demolish a bunch of their ICBMs because of, 922 00:45:07,280 --> 00:45:10,480 Speaker 3: you know, the SALT talks. Basically, an 923 00:45:10,480 --> 00:45:13,799 Speaker 3: agreement between the United States and Russia to reduce the 924 00:45:13,800 --> 00:45:17,760 Speaker 3: total number of ICBMs. Russia was actually obligated to scrap 925 00:45:17,840 --> 00:45:19,680 Speaker 3: a bunch of their ICBMs. So you take the 926 00:45:19,800 --> 00:45:22,960 Speaker 3: very biggest ICBMs, you could convert those into a rocket, 927 00:45:23,120 --> 00:45:25,880 Speaker 3: add an additional stage, and send something to Mars. 928 00:45:26,080 --> 00:45:28,600 Speaker 4: So those are big enough, with one more stage, to 929 00:45:28,640 --> 00:45:29,520 Speaker 4: get to Mars? 930 00:45:29,760 --> 00:45:31,920 Speaker 3: To send a small payload to Mars, yeah. So, the 931 00:45:32,040 --> 00:45:32,680 Speaker 3: SS-18. 932 00:45:33,080 --> 00:45:35,440 Speaker 4: So you try to buy ICBMs. Do you succeed, or no? 933 00:45:35,680 --> 00:45:37,560 Speaker 4: Or do you figure out you've got to build your 934 00:45:37,600 --> 00:45:38,160 Speaker 4: own instead? 935 00:45:38,440 --> 00:45:41,920 Speaker 3: They kept raising the price on me. Because I 936 00:45:41,920 --> 00:45:43,440 Speaker 3: figured, like, look, they're going to throw these things in the 937 00:45:43,480 --> 00:45:46,480 Speaker 3: scrap heap anyway, you should get a really good deal, right? 938 00:45:47,400 --> 00:45:51,120 Speaker 3: So the price started out at four million. Then the 939 00:45:51,120 --> 00:45:53,560 Speaker 3: next conversation they were at eight million.
Then the next 940 00:45:53,560 --> 00:45:56,279 Speaker 3: conversation they were at, like, nineteen million. And I'm like, 941 00:45:56,440 --> 00:45:57,640 Speaker 3: this is before we've signed a contract. 942 00:45:57,680 --> 00:46:00,200 Speaker 4: By the way, was there another bidder, or were 943 00:46:00,200 --> 00:46:01,799 Speaker 4: you the only one trying to buy them? 944 00:46:02,120 --> 00:46:04,040 Speaker 3: I don't know if there were other bids, 945 00:46:04,040 --> 00:46:06,600 Speaker 3: but they didn't mention any other bids. But I was like, man, 946 00:46:06,600 --> 00:46:08,680 Speaker 3: if the price is increasing this much before the 947 00:46:08,680 --> 00:46:11,720 Speaker 3: contract's signed, I'm really going to get fleeced after the contract's signed. 948 00:46:12,280 --> 00:46:16,560 Speaker 3: So I got pretty frustrated there, actually. In some 949 00:46:16,600 --> 00:46:21,560 Speaker 3: cases we got into, like, shouting matches in Moscow, some 950 00:46:21,600 --> 00:46:23,640 Speaker 3: guys shouting at me in Russian, and I'm shouting back 951 00:46:23,640 --> 00:46:29,600 Speaker 3: at them, really badly, you know. I'm like, you're all... 952 00:46:29,640 --> 00:46:31,359 Speaker 3: I mean, you're all... 953 00:46:35,040 --> 00:46:35,640 Speaker 1: In Moscow? 954 00:46:35,920 --> 00:46:39,960 Speaker 3: Yeah. So, uh, man, I should have recorded that. That would 955 00:46:40,000 --> 00:46:40,920 Speaker 3: have been one for the ages. 956 00:46:40,920 --> 00:46:43,120 Speaker 2: How many days were you there negotiating that first time? 957 00:46:43,280 --> 00:46:44,799 Speaker 2: I mean, was this, like, ongoing? 958 00:46:44,920 --> 00:46:48,319 Speaker 3: Yeah, yeah. These conversations took place 959 00:46:48,320 --> 00:46:52,839 Speaker 3: over probably six months or so. Wow.
And then 960 00:46:52,880 --> 00:46:56,600 Speaker 3: the final trip there was 961 00:46:56,600 --> 00:47:01,640 Speaker 3: with Mike Griffin, who later became NASA administrator. I actually realized 962 00:47:01,719 --> 00:47:04,840 Speaker 3: in the course of this that my original premise was wrong, 963 00:47:05,080 --> 00:47:07,880 Speaker 3: that America actually has plenty of will to go 964 00:47:07,920 --> 00:47:11,319 Speaker 3: to Mars; it just needs a way to get to 965 00:47:11,560 --> 00:47:15,560 Speaker 3: Mars that is affordable and that doesn't break the budget. 966 00:47:15,600 --> 00:47:17,000 Speaker 1: You know, as you know, we couldn't even get 967 00:47:17,000 --> 00:47:19,239 Speaker 1: to the space station. We needed the Russians to get 968 00:47:19,320 --> 00:47:20,360 Speaker 1: us to our own space station. 969 00:47:20,440 --> 00:47:21,080 Speaker 3: That was embarrassing. 970 00:47:21,239 --> 00:47:22,400 Speaker 1: It really was pitiful. 971 00:47:22,520 --> 00:47:24,880 Speaker 3: I'm not sure most Americans know just how much we 972 00:47:24,880 --> 00:47:26,400 Speaker 3: were being fleeced. Like, I think they got up to, 973 00:47:26,440 --> 00:47:29,960 Speaker 3: like, ninety million dollars a seat. Yeah. Wow. Yeah. For 974 00:47:30,000 --> 00:47:33,200 Speaker 3: a seat that cost them like ten. Price gouging, obviously, but 975 00:47:33,239 --> 00:47:37,160 Speaker 3: it was the only ride. Yeah, it was before SpaceX. Ninety 976 00:47:37,160 --> 00:47:39,160 Speaker 3: million dollars a seat, for a seat that cost them 977 00:47:39,160 --> 00:47:43,840 Speaker 3: ten million, is high. Yeah, that's a lot of money. Yeah. 978 00:47:44,000 --> 00:47:46,360 Speaker 4: So a few months ago you and I were down 979 00:47:46,640 --> 00:47:49,240 Speaker 4: in Boca Chica with the President for a Starship launch, 980 00:47:49,560 --> 00:47:52,480 Speaker 4: and it is incredible what you've built in Boca Chica.
981 00:47:52,719 --> 00:47:55,440 Speaker 4: You know, five years ago it was an empty beach. 982 00:47:55,640 --> 00:47:58,640 Speaker 3: At the southern tip of a sandbar. Yeah. 983 00:47:58,520 --> 00:48:01,760 Speaker 4: And it's now a city and factory where you're building a rocket 984 00:48:01,760 --> 00:48:05,120 Speaker 4: ship a month with incredible precision. But one of 985 00:48:05,160 --> 00:48:06,920 Speaker 4: the things you said to me when we were down 986 00:48:06,960 --> 00:48:09,400 Speaker 4: there that really stood out to me is you 987 00:48:09,440 --> 00:48:12,960 Speaker 4: said your philosophy on intellectual property. I've talked to lots of 988 00:48:13,000 --> 00:48:16,239 Speaker 4: CEOs who say we fight to guard our IP, and you 989 00:48:16,239 --> 00:48:19,160 Speaker 4: had a very different approach. What's your view of IP? 990 00:48:19,400 --> 00:48:23,920 Speaker 3: Patents are for the weak. Patents are for those that innovate slowly. 991 00:48:24,239 --> 00:48:27,359 Speaker 4: I literally do not know anyone else in business who 992 00:48:27,360 --> 00:48:29,200 Speaker 4: would say something like that. Like, it was 993 00:48:29,560 --> 00:48:32,440 Speaker 4: startling. And what Elon said down there 994 00:48:32,520 --> 00:48:34,879 Speaker 4: is, he said, look, this stuff, I assume everyone will 995 00:48:34,880 --> 00:48:36,799 Speaker 4: steal everything, but by the time they steal it we'll 996 00:48:36,800 --> 00:48:38,600 Speaker 4: be five generations beyond and it won't matter. 997 00:48:38,719 --> 00:48:42,400 Speaker 3: Yes. At Tesla, we actually open-sourced all our patents. So 998 00:48:42,400 --> 00:48:45,120 Speaker 3: we said anyone can use our patents for free. 999 00:48:45,280 --> 00:48:48,880 Speaker 3: Really? Yeah. Uh, the only reason we really file patents at Tesla 1000 00:48:49,160 --> 00:48:54,680 Speaker 3: is to avoid patent trolls causing trouble.
So we'll 1001 00:48:54,719 --> 00:48:56,880 Speaker 3: try to look ahead and say, okay, patent trolls are 1002 00:48:56,960 --> 00:48:59,520 Speaker 3: going to file patents to block things, so 1003 00:48:59,560 --> 00:49:02,040 Speaker 3: we'll file patents and then open-source them, make them free. 1004 00:49:02,320 --> 00:49:04,160 Speaker 3: I mean, as I say, patents are for the weak. Now, 1005 00:49:04,320 --> 00:49:08,080 Speaker 3: there are a few cases, say with pharmaceuticals, where 1006 00:49:08,120 --> 00:49:10,000 Speaker 3: it might cost you a billion dollars to do a phase 1007 00:49:10,000 --> 00:49:14,600 Speaker 3: three human trial, but then subsequently the drug is very 1008 00:49:14,680 --> 00:49:18,120 Speaker 3: cheap to manufacture. So there are some cases, but in my opinion 1009 00:49:18,120 --> 00:49:22,399 Speaker 3: we should massively reduce what can be patented, 1010 00:49:22,560 --> 00:49:26,160 Speaker 3: because the whole point of patenting is to maximize innovation, 1011 00:49:26,440 --> 00:49:31,080 Speaker 3: not inhibit it. And in my opinion, maybe a controversial opinion, 1012 00:49:31,600 --> 00:49:35,239 Speaker 3: most patents inhibit innovation; they do not help it. But 1013 00:49:35,280 --> 00:49:37,120 Speaker 3: there are single 1014 00:49:37,320 --> 00:49:40,000 Speaker 3: cases, such as a phase three clinical trial 1015 00:49:40,040 --> 00:49:42,120 Speaker 3: that might cost a billion dollars, but then the drugs 1016 00:49:42,120 --> 00:49:46,600 Speaker 3: thereafter cost a few dollars to manufacture, and if you 1017 00:49:46,640 --> 00:49:48,960 Speaker 3: can then immediately copy those drugs for a few dollars, 1018 00:49:48,960 --> 00:49:52,680 Speaker 3: no one will pay for the billion dollars. The free-rider problem. Yeah, exactly. 1019 00:49:52,719 --> 00:49:54,600 Speaker 3: So you have to address the free-rider problem.
But 1020 00:49:54,760 --> 00:49:57,000 Speaker 3: other than that, there should be no patents. The ideas 1021 00:49:57,000 --> 00:49:57,760 Speaker 3: are easy. 1022 00:49:58,000 --> 00:50:01,399 Speaker 2: You want ideas to flow to the maximum number of people, to get 1023 00:50:01,520 --> 00:50:03,400 Speaker 2: there faster and do things bigger. 1024 00:50:03,800 --> 00:50:06,960 Speaker 3: The idea is the easy part. The execution is 1025 00:50:06,960 --> 00:50:09,359 Speaker 3: the hard part. As the old saying goes, it's one 1026 00:50:09,360 --> 00:50:12,560 Speaker 3: percent inspiration, if not less than one percent, and ninety-nine 1027 00:50:12,600 --> 00:50:13,560 Speaker 3: percent perspiration. 1028 00:50:13,920 --> 00:50:16,839 Speaker 4: But I'll say, the perspiration part you're really damn good at 1029 00:50:16,840 --> 00:50:20,080 Speaker 4: also. Because the companies you're building 1030 00:50:20,600 --> 00:50:24,640 Speaker 4: are actually building stuff. They're building cars, they're building spaceships, 1031 00:50:24,680 --> 00:50:26,440 Speaker 4: they're building things that, if they don't work, it's a 1032 00:50:26,480 --> 00:50:31,279 Speaker 4: real problem. And the precision you manufacture things with... How 1033 00:50:31,280 --> 00:50:33,080 Speaker 4: do you get that level of precision? How do you 1034 00:50:33,120 --> 00:50:36,239 Speaker 4: build a culture? You're 1035 00:50:36,280 --> 00:50:39,719 Speaker 4: amazing at thinking outside the box. But what's interesting is 1036 00:50:39,760 --> 00:50:43,560 Speaker 4: you may even be better at execution. How 1037 00:50:43,560 --> 00:50:45,440 Speaker 4: do you execute so effectively? 1038 00:50:45,440 --> 00:50:48,799 Speaker 3: Well, I take a first-principles approach to everything.
It's not 1039 00:50:48,840 --> 00:50:52,440 Speaker 3: as though I wanted to insource manufacturing. It's just 1040 00:50:52,440 --> 00:50:56,799 Speaker 3: that I was unable to outsource it effectively. So, you know, 1041 00:50:56,880 --> 00:50:58,719 Speaker 3: the idea at the beginning of Tesla was that we 1042 00:50:58,800 --> 00:51:03,040 Speaker 3: would outsource almost all the manufacturing. But then it turned 1043 00:51:03,080 --> 00:51:06,319 Speaker 3: out there were no good companies to 1044 00:51:06,440 --> 00:51:10,319 Speaker 3: outsource manufacturing to. It really 1045 00:51:10,320 --> 00:51:14,719 Speaker 3: wasn't feasible. Outsourced manufacturing is actually the exception, not the rule. 1046 00:51:15,719 --> 00:51:18,000 Speaker 3: And just over time, we had to insource 1047 00:51:18,040 --> 00:51:21,000 Speaker 3: almost everything for Tesla, and the same for SpaceX. I became 1048 00:51:21,120 --> 00:51:24,280 Speaker 3: very good at manufacturing because I had to. I had no choice. 1049 00:51:24,440 --> 00:51:27,080 Speaker 3: At this point, I might know more about manufacturing than 1050 00:51:27,080 --> 00:51:29,800 Speaker 3: any human ever has, because I've 1051 00:51:29,880 --> 00:51:32,560 Speaker 3: manufactured so many different things in so many different arenas. 1052 00:51:33,440 --> 00:51:35,359 Speaker 3: I think probably more than anyone ever has. 1053 00:51:35,719 --> 00:51:39,120 Speaker 4: Look, that sounds like an astonishing statement, but it's 1054 00:51:39,120 --> 00:51:43,160 Speaker 4: not a crazy statement. And you're somehow running Tesla and 1055 00:51:43,280 --> 00:51:46,920 Speaker 4: running SpaceX and running X and running the Boring Company 1056 00:51:46,960 --> 00:51:49,240 Speaker 4: and running Neuralink and doing DOGE. 1057 00:51:49,840 --> 00:51:51,319 Speaker 1: How much do you sleep in a given night?
1058 00:51:51,719 --> 00:51:53,560 Speaker 3: About six hours on average. 1059 00:51:53,360 --> 00:51:55,719 Speaker 4: Six. So, it wouldn't have shocked me if you 1060 00:51:55,719 --> 00:51:56,439 Speaker 4: said three or four. 1061 00:51:56,600 --> 00:51:58,319 Speaker 3: The correct question is how many hours do you 1062 00:51:58,400 --> 00:52:01,400 Speaker 3: work a day? I work almost every waking. 1063 00:52:01,120 --> 00:52:04,439 Speaker 4: Hour. And Ben, he's not kidding. Like, when Elon 1064 00:52:04,480 --> 00:52:07,480 Speaker 4: and I were first getting to know each other, I suggested, 1065 00:52:07,560 --> 00:52:09,440 Speaker 4: I said, hey, let's grab dinner sometime. 1066 00:52:09,480 --> 00:52:10,640 Speaker 1: And I don't know if you remember what you said. 1067 00:52:10,680 --> 00:52:12,440 Speaker 1: You said, I don't eat dinner. 1068 00:52:12,840 --> 00:52:14,760 Speaker 3: I don't have social dinners, really. 1069 00:52:14,719 --> 00:52:17,440 Speaker 1: Right, I mean, yeah, I mean, you obviously eat food, but. 1070 00:52:17,480 --> 00:52:20,200 Speaker 3: Yeah, the idea of going to a restaurant for two hours. But. 1071 00:52:20,280 --> 00:52:23,200 Speaker 4: The idea of, like... but it 1072 00:52:23,239 --> 00:52:25,239 Speaker 4: was just kind of matter of fact: why would I 1073 00:52:25,520 --> 00:52:28,000 Speaker 4: go to dinner? 1074 00:52:28,080 --> 00:52:28,479 Speaker 1: You work? 1075 00:52:28,880 --> 00:52:31,200 Speaker 3: Uh, yeah, I literally just thought, I'll have lunch 1076 00:52:31,200 --> 00:52:33,640 Speaker 3: brought in during meetings and continue working. 1077 00:52:33,600 --> 00:52:36,880 Speaker 2: How many nights have you slept at your offices, 1078 00:52:36,920 --> 00:52:38,880 Speaker 2: do you think, in your career, percentage
1079 00:52:38,320 --> 00:52:40,759 Speaker 2: wise, where you say, I just got to take this 1080 00:52:40,840 --> 00:52:43,080 Speaker 2: nap, basically because my body forces me to, and I 1081 00:52:43,120 --> 00:52:45,600 Speaker 2: got to get back to work fast and efficiently without 1082 00:52:45,640 --> 00:52:46,520 Speaker 2: going somewhere else. 1083 00:52:46,880 --> 00:52:50,600 Speaker 3: Well, I guess it started out even with the first company, 1084 00:52:51,160 --> 00:52:53,880 Speaker 3: Zip2, which is a terrible name, but the 1085 00:52:53,920 --> 00:52:58,040 Speaker 3: first internet company. We were able to rent an office, 1086 00:52:58,440 --> 00:53:01,120 Speaker 3: which was like a leaky attic, for essentially four or five 1087 00:53:01,200 --> 00:53:05,439 Speaker 3: hundred dollars a month, and the cheapest apartment we could 1088 00:53:05,440 --> 00:53:08,360 Speaker 3: find was eight hundred dollars a month. And 1089 00:53:08,400 --> 00:53:10,960 Speaker 3: we only had about five thousand dollars between my brother 1090 00:53:10,960 --> 00:53:14,319 Speaker 3: and me, so we thought, we'll just stay in 1091 00:53:14,320 --> 00:53:17,960 Speaker 3: the office. Yep. So we got some couches that converted 1092 00:53:18,000 --> 00:53:21,759 Speaker 3: into beds, and we'd sleep at night, and then 1093 00:53:21,760 --> 00:53:25,200 Speaker 3: we'd just turn the beds back into couches 1094 00:53:25,840 --> 00:53:28,880 Speaker 3: before anyone came, and then we would shower at the YMCA 1095 00:53:28,960 --> 00:53:31,160 Speaker 3: down the road. And that 1096 00:53:31,160 --> 00:53:34,440 Speaker 3: literally was, for several months, what we did. I 1097 00:53:34,480 --> 00:53:37,000 Speaker 3: was in great shape, you know, working out at the Y. 1098 00:53:37,960 --> 00:53:41,360 Speaker 3: I still remember that YMCA at Page Mill and El 1099 00:53:41,400 --> 00:53:43,240 Speaker 3: Camino in Palo Alto.
1100 00:53:43,600 --> 00:53:44,560 Speaker 1: So that was a long time ago. 1101 00:53:44,560 --> 00:53:46,960 Speaker 3: So it's been, I don't know, I've never thought 1102 00:53:47,000 --> 00:53:50,279 Speaker 3: to count it, but several hundred days maybe, I don't know. 1103 00:53:50,719 --> 00:53:52,520 Speaker 1: So you're now the richest man on Earth. 1104 00:53:52,680 --> 00:53:55,680 Speaker 4: Do you still sleep in the office? Well, that's true, 1105 00:53:56,080 --> 00:53:56,680 Speaker 4: maybe Mars. 1106 00:53:56,680 --> 00:53:58,319 Speaker 1: We'll find someone else, but. 1107 00:53:58,680 --> 00:54:01,160 Speaker 3: I think if someone is a sovereign head of a country, 1108 00:54:01,200 --> 00:54:03,040 Speaker 3: they're de facto richer by a lot. 1109 00:54:03,040 --> 00:54:04,520 Speaker 1: Do you still sleep at the office now? 1110 00:54:04,760 --> 00:54:06,680 Speaker 3: I have sometimes slept at the office. Yeah. 1111 00:54:06,840 --> 00:54:10,160 Speaker 2: As always, thank you for listening to Verdict with Senator 1112 00:54:10,200 --> 00:54:12,640 Speaker 2: Ted Cruz, Ben Ferguson with you. Don't forget to download 1113 00:54:12,719 --> 00:54:14,920 Speaker 2: my podcast, and you can listen to my podcast 1114 00:54:15,000 --> 00:54:16,960 Speaker 2: every other day when you're not listening to Verdict, or each 1115 00:54:17,040 --> 00:54:19,480 Speaker 2: day after you listen to Verdict. I'd love to 1116 00:54:19,520 --> 00:54:22,600 Speaker 2: have you as a listener to, again, the Ben Ferguson Podcast, 1117 00:54:22,600 --> 00:54:25,120 Speaker 2: and we will see you back here on Monday morning.