1 00:00:00,160 --> 00:00:03,760 Speaker 1: Welcome. It is Verdict with Senator Ted Cruz, Ben Ferguson 2 00:00:03,840 --> 00:00:07,680 Speaker 1: with you. And today is a very special day as 3 00:00:07,720 --> 00:00:11,280 Speaker 1: we are doing part one of our two part conversation 4 00:00:11,640 --> 00:00:14,320 Speaker 1: at the White House with Elon Musk. 5 00:00:14,800 --> 00:00:15,120 Speaker 2: Now. 6 00:00:15,280 --> 00:00:19,040 Speaker 1: Elon is going to break news during today and Wednesday's 7 00:00:19,079 --> 00:00:23,760 Speaker 1: show on DOGE and some incredible, shocking information about corruption 8 00:00:23,880 --> 00:00:26,360 Speaker 1: within our government. I want to make sure you hit 9 00:00:26,360 --> 00:00:30,200 Speaker 1: that subscribe button, that auto download button right now so 10 00:00:30,280 --> 00:00:35,360 Speaker 1: you do not miss part two on Wednesday. Also, please 11 00:00:35,440 --> 00:00:38,920 Speaker 1: help this go viral as we're exposing government waste by 12 00:00:39,080 --> 00:00:43,559 Speaker 1: sharing this episode on any social media platform that you 13 00:00:43,600 --> 00:00:46,519 Speaker 1: are on. But before we get to that: after more 14 00:00:46,560 --> 00:00:48,960 Speaker 1: than a year of war, terror and pain in Israel, 15 00:00:49,080 --> 00:00:52,800 Speaker 1: the need for security essentials and support for first responders 16 00:00:53,000 --> 00:00:57,120 Speaker 1: is still critical. Even in times of ceasefire, Israel must 17 00:00:57,200 --> 00:01:00,400 Speaker 1: be prepared for the next attack wherever it may come from, 18 00:01:00,440 --> 00:01:03,920 Speaker 1: as Israel is surrounded by enemies on all sides. That 19 00:01:04,040 --> 00:01:07,240 Speaker 1: is where the International Fellowship of Christians and Jews and 20 00:01:07,400 --> 00:01:10,160 Speaker 1: you come in. They are the ones that help give 21 00:01:10,200 --> 00:01:12,680 Speaker 1: the support that is needed and that the people of Israel 22 00:01:12,880 --> 00:01:16,679 Speaker 1: need now more than ever before: life saving security essentials. 23 00:01:16,760 --> 00:01:19,520 Speaker 1: And your gift today will help save lives by providing 24 00:01:19,560 --> 00:01:24,720 Speaker 1: bomb shelters, armored security vehicles and ambulances, firefighting equipment, flak 25 00:01:24,800 --> 00:01:28,520 Speaker 1: jackets and bulletproof vests, and so much more. Your generous 26 00:01:28,560 --> 00:01:31,280 Speaker 1: donation today will help ensure the people of Israel are 27 00:01:31,319 --> 00:01:33,920 Speaker 1: safe and secure in the days to come. So give 28 00:01:33,959 --> 00:01:36,440 Speaker 1: a gift to bless Israel and our people by visiting 29 00:01:36,680 --> 00:01:43,640 Speaker 1: support IFCJ dot org. That's one word, support IFCJ dot org. 30 00:01:43,880 --> 00:01:47,880 Speaker 1: Or call eight eight eight four eight eight IFCJ. That's 31 00:01:48,040 --> 00:01:51,800 Speaker 1: eight eight eight four eight eight IFCJ, eight eight eight 32 00:01:51,920 --> 00:01:55,160 Speaker 1: four eight eight four three two five, or support IFCJ 33 00:01:55,320 --> 00:01:59,640 Speaker 1: dot org. So, without any further ado, here is part 34 00:01:59,720 --> 00:02:03,360 Speaker 1: one of our sit down conversation at the White House 35 00:02:03,520 --> 00:02:04,480 Speaker 1: with Elon Musk. 36 00:02:04,800 --> 00:02:06,800 Speaker 3: Well, we're in the White House right now.
And 37 00:02:06,840 --> 00:02:10,240 Speaker 3: we're here with my friend Elon Musk, who really has 38 00:02:10,280 --> 00:02:12,239 Speaker 3: not been doing much of anything, has not made any 39 00:02:12,280 --> 00:02:17,919 Speaker 3: news, and nobody has noticed the impact. Welcome, Elon. 40 00:02:18,200 --> 00:02:18,560 Speaker 2: Thank you. 41 00:02:18,600 --> 00:02:19,360 Speaker 3: Holy crap. 42 00:02:19,960 --> 00:02:24,120 Speaker 2: Oh yes, wow. Let me just say, never a dull moment. 43 00:02:24,480 --> 00:02:27,200 Speaker 3: Never a dull moment. The first fifty days the president 44 00:02:27,200 --> 00:02:31,360 Speaker 3: has spent in office: over the top. And the first 45 00:02:31,400 --> 00:02:34,079 Speaker 3: fifty days you've spent. I don't think there's ever been 46 00:02:34,080 --> 00:02:35,960 Speaker 3: anyone to have an impact the way you have. At 47 00:02:35,960 --> 00:02:38,880 Speaker 3: the beginning, let me start with a question. You know 48 00:02:38,960 --> 00:02:42,360 Speaker 3: a lot about both: which was worse, the mess you found 49 00:02:42,440 --> 00:02:45,960 Speaker 3: at Twitter or the mess you found in the federal government? 50 00:02:46,000 --> 00:02:47,640 Speaker 2: Well, it's hard to compete with the federal government. 51 00:02:48,880 --> 00:02:51,280 Speaker 3: What surprised you about the federal government? I assume you 52 00:02:51,360 --> 00:02:53,680 Speaker 3: came in and assumed it was bad. Is it worse 53 00:02:53,720 --> 00:02:54,480 Speaker 3: than you expected? 54 00:02:56,000 --> 00:02:58,840 Speaker 4: It is worse than I expected. But on the plus side, 55 00:02:58,880 --> 00:02:59,519 Speaker 4: that means 56 00:02:59,320 --> 00:03:00,760 Speaker 2: there's more opportunity for improvement. 57 00:03:01,560 --> 00:03:05,040 Speaker 4: So look, if you look on the bright side, there's 58 00:03:05,160 --> 00:03:07,920 Speaker 4: actually a lot of opportunity for improvement in federal 59 00:03:07,960 --> 00:03:11,800 Speaker 4: government expenditures, because it's so bad. If it was a 60 00:03:11,800 --> 00:03:13,840 Speaker 4: well run ship, it would be very difficult to improve. 61 00:03:14,320 --> 00:03:16,760 Speaker 4: So now it's like, if people say, well, 62 00:03:16,880 --> 00:03:19,040 Speaker 4: how do you figure out how to save money 63 00:03:19,040 --> 00:03:20,600 Speaker 4: in the federal government? Well, it's like being in a 64 00:03:20,680 --> 00:03:22,760 Speaker 4: room where the walls, the roof, and the floor are 65 00:03:22,760 --> 00:03:29,600 Speaker 4: all targets. You can shoot in any direction and you can't miss. Wow. Yeah, I'm 66 00:03:29,639 --> 00:03:30,640 Speaker 4: sure you would agree. 67 00:03:31,280 --> 00:03:33,200 Speaker 3: So a lot of folks have talked about, like... 68 00:03:33,440 --> 00:03:37,920 Speaker 4: Like, you can't miss, right? Just shoot in any direction. 69 00:03:39,440 --> 00:03:42,080 Speaker 3: A lot of the crazy expenditures, things like two 70 00:03:42,160 --> 00:03:45,360 Speaker 3: million bucks for sex change surgeries in Guatemala, and, 71 00:03:47,960 --> 00:03:51,800 Speaker 3: you know, transgender mice and Sesame Street in Iraq. 72 00:03:51,880 --> 00:03:53,480 Speaker 3: A lot of that has gotten attention. But some 73 00:03:53,520 --> 00:03:56,119 Speaker 3: of the stuff you've told me about, like, tell 74 00:03:56,200 --> 00:03:59,200 Speaker 3: us about computer licenses in government agencies.
75 00:03:59,280 --> 00:04:01,800 Speaker 4: Yeah, so most of what DOGE is finding, you don't 76 00:04:01,840 --> 00:04:05,280 Speaker 4: need to be Sherlock Holmes. It's very obvious, basic stuff. 77 00:04:05,760 --> 00:04:09,480 Speaker 4: So in every government department, I say every because we've 78 00:04:09,480 --> 00:04:12,520 Speaker 4: not yet found a single exception, there are far too 79 00:04:12,600 --> 00:04:19,080 Speaker 4: many software licenses and media subscriptions, meaning many more software 80 00:04:19,080 --> 00:04:22,080 Speaker 4: licenses and media subscriptions than there are humans in the department. 81 00:04:22,680 --> 00:04:24,719 Speaker 3: Like you were saying, like an agency with fifteen thousand 82 00:04:24,760 --> 00:04:29,120 Speaker 3: people might have thirty thousand licenses. Yes, and even of 83 00:04:29,160 --> 00:04:31,640 Speaker 3: the fifteen thousand employees, a good chunk of them hadn't 84 00:04:31,720 --> 00:04:35,760 Speaker 3: used the license, had never logged on or used the application. 85 00:04:36,000 --> 00:04:41,280 Speaker 4: Yes, we found entire situations of software licenses or media 86 00:04:41,560 --> 00:04:44,280 Speaker 4: subscriptions where there were zero logins. 87 00:04:44,720 --> 00:04:46,600 Speaker 3: So no one had used it, and yet we were paying for it. 88 00:04:46,720 --> 00:04:50,120 Speaker 4: Yes, the government's paying for thousands of licenses of software 89 00:04:50,960 --> 00:04:53,480 Speaker 4: or media subscriptions and no one had ever logged in 90 00:04:53,560 --> 00:04:54,080 Speaker 4: even once. 91 00:04:54,680 --> 00:04:56,800 Speaker 3: Or credit cards. You found the same thing with government 92 00:04:56,800 --> 00:04:57,520 Speaker 3: credit cards. 93 00:04:57,800 --> 00:05:00,120 Speaker 4: We found that there are twice as many credit cards 94 00:05:00,120 --> 00:05:03,400 Speaker 4: as there are humans. And I still don't have a good 95 00:05:03,400 --> 00:05:05,760 Speaker 4: explanation for why this is the case. And these are 96 00:05:05,760 --> 00:05:09,159 Speaker 4: ten thousand dollar limit cards. So it's a lot of money. 97 00:05:09,920 --> 00:05:12,640 Speaker 1: Is it incompetence that you're finding, or is this like 98 00:05:12,760 --> 00:05:15,400 Speaker 1: the biggest money laundering scheme in the history of the 99 00:05:15,400 --> 00:05:16,400 Speaker 1: world that you're finding? 100 00:05:17,200 --> 00:05:20,279 Speaker 4: Okay, I think it's mostly... if you say, look, what's 101 00:05:20,320 --> 00:05:23,840 Speaker 4: the waste to fraud ratio? Yeah, in my opinion, 102 00:05:24,600 --> 00:05:27,880 Speaker 4: it's like eighty percent waste, twenty percent fraud. But you 103 00:05:27,920 --> 00:05:32,200 Speaker 4: do have these sort of gray areas. For example, 104 00:05:32,320 --> 00:05:36,400 Speaker 4: so we saw a lot of payments going out 105 00:05:36,400 --> 00:05:40,680 Speaker 4: of Treasury that had no payment code and no explanation 106 00:05:40,760 --> 00:05:44,680 Speaker 4: for the payment, and then we're trying to figure 107 00:05:44,680 --> 00:05:46,839 Speaker 4: out what that payment is, and we'd see that, okay, 108 00:05:46,880 --> 00:05:50,280 Speaker 4: that contract was supposed to be shut off, but someone 109 00:05:50,320 --> 00:05:52,960 Speaker 4: forgot to shut off that contract, and so the 110 00:05:52,960 --> 00:05:57,320 Speaker 4: company kept getting money. Wow. Now, is that waste or 111 00:05:57,320 --> 00:05:59,719 Speaker 4: fraud? Both. Both.
112 00:06:00,120 --> 00:06:05,280 Speaker 1: Yeah, you're not supposed to 113 00:06:05,320 --> 00:06:07,479 Speaker 1: get it, but the government sent it to 114 00:06:07,560 --> 00:06:09,760 Speaker 1: you and nobody from the government asked for it back. 115 00:06:10,720 --> 00:06:12,960 Speaker 4: Take, for example, the one point nine billion 116 00:06:13,000 --> 00:06:14,560 Speaker 4: dollars given to Stacey Abrams. 117 00:06:14,800 --> 00:06:21,520 Speaker 3: Yeah, a fake NGO, utter insanity. Explain the story. That's 118 00:06:21,600 --> 00:06:24,120 Speaker 3: just corrupt. I think that's paying off cronies at that point. 119 00:06:25,040 --> 00:06:25,240 Speaker 4: Yeah. 120 00:06:25,400 --> 00:06:25,599 Speaker 2: Yeah. 121 00:06:25,800 --> 00:06:27,760 Speaker 3: And by the way, she knew. Like, when you get 122 00:06:27,760 --> 00:06:30,880 Speaker 3: two billion dollars, you don't miss that. That's not an accident. 123 00:06:31,480 --> 00:06:35,159 Speaker 4: Allegedly it was for, like, you know, environmentally 124 00:06:35,160 --> 00:06:38,440 Speaker 4: friendly appliances or something. And they've given like one 125 00:06:38,480 --> 00:06:41,040 Speaker 4: hundred appliances so far for two billion dollars. 126 00:06:41,120 --> 00:06:46,680 Speaker 3: That's very expensive toast. That's some fridge. It's nice. 127 00:06:47,440 --> 00:06:51,799 Speaker 4: It is obviously one of the biggest scam loopholes we've uncovered, 128 00:06:51,800 --> 00:06:55,600 Speaker 4: which is really crazy, is that 129 00:06:56,480 --> 00:06:59,479 Speaker 4: the government can give money to a so called nonprofit 130 00:07:00,440 --> 00:07:03,360 Speaker 4: with very few controls, and there's 131 00:07:03,360 --> 00:07:07,080 Speaker 4: no auditing subsequently of that nonprofit. 132 00:07:07,800 --> 00:07:09,520 Speaker 4: So this is where, with the one point 133 00:07:09,560 --> 00:07:12,720 Speaker 4: nine billion of Stacey Abrams, who's to say they didn't 134 00:07:12,720 --> 00:07:17,520 Speaker 4: give themselves extremely lavish, like insane salaries, expense everything, yep, 135 00:07:17,720 --> 00:07:21,080 Speaker 4: to the nonprofit, you know, buy jets and 136 00:07:21,200 --> 00:07:22,320 Speaker 4: homes and all sorts of 137 00:07:22,280 --> 00:07:24,120 Speaker 3: things, live like kings and queens. 138 00:07:24,240 --> 00:07:27,480 Speaker 2: Yes, on the taxpayer dime, correct. You mentioned this is 139 00:07:27,480 --> 00:07:29,920 Speaker 2: happening at scale. It's not just one or two. We're 140 00:07:29,960 --> 00:07:31,000 Speaker 2: seeing this everywhere. 141 00:07:31,160 --> 00:07:32,960 Speaker 3: Now, one of the things you told me about that 142 00:07:33,000 --> 00:07:38,240 Speaker 3: is great is what you call magic money computers. Well, 143 00:07:38,520 --> 00:07:39,840 Speaker 3: so tell us about it, because I've never heard of 144 00:07:39,840 --> 00:07:41,040 Speaker 3: that until you brought that up.
145 00:07:41,440 --> 00:07:45,560 Speaker 4: Okay. So you may think that the government computers 146 00:07:46,240 --> 00:07:49,280 Speaker 4: all talk to each other, they synchronize, they add 147 00:07:49,360 --> 00:07:52,400 Speaker 4: up what funds are going where, and it's, you know, 148 00:07:53,520 --> 00:07:57,240 Speaker 4: coherent, and 149 00:07:57,240 --> 00:07:59,239 Speaker 4: that the numbers, for example, that you're presented 150 00:07:59,280 --> 00:08:01,480 Speaker 4: as a senator, yeah, are actually the 151 00:08:01,440 --> 00:08:04,000 Speaker 3: real numbers, one would think. Well, I would think 152 00:08:04,360 --> 00:08:04,680 Speaker 3: they're not. 153 00:08:04,760 --> 00:08:08,000 Speaker 2: Yeah, okay, I mean. 154 00:08:07,960 --> 00:08:09,960 Speaker 4: They're not totally wrong, but they're probably off by five 155 00:08:10,040 --> 00:08:14,120 Speaker 4: percent or ten percent in some cases. So I call 156 00:08:14,120 --> 00:08:16,200 Speaker 4: it a magic money computer, any computer which can just make 157 00:08:16,240 --> 00:08:19,000 Speaker 4: money out of thin air. That's magic money. 158 00:08:19,080 --> 00:08:20,000 Speaker 3: So how does that work? 159 00:08:20,280 --> 00:08:22,920 Speaker 2: It just issues payments, and you 160 00:08:22,880 --> 00:08:25,800 Speaker 3: said something like eleven of these computers at Treasury that 161 00:08:25,880 --> 00:08:28,760 Speaker 3: are sending out trillions in payments. 162 00:08:28,400 --> 00:08:33,360 Speaker 4: They're mostly at Treasury, some are at HHS, 163 00:08:33,520 --> 00:08:35,960 Speaker 4: there's one or two at State, 164 00:08:36,800 --> 00:08:40,400 Speaker 4: there's some at DoD. I think we found now fourteen 165 00:08:40,600 --> 00:08:41,880 Speaker 4: magic money computers. 166 00:08:42,720 --> 00:08:42,960 Speaker 3: Jeez. 167 00:08:42,960 --> 00:08:45,760 Speaker 2: Okay, they just send money out of nothing. 168 00:08:46,559 --> 00:08:50,000 Speaker 3: You have an ability to see where leverage points are 169 00:08:51,080 --> 00:08:54,480 Speaker 3: and how things actually happen. So I remember, back I 170 00:08:54,520 --> 00:08:57,440 Speaker 3: think it was September, October of this year, before the election. 171 00:08:57,520 --> 00:08:59,000 Speaker 3: We didn't know who was going to win, and I 172 00:08:59,080 --> 00:09:01,120 Speaker 3: was at your house in Austin. We were talking about 173 00:09:01,160 --> 00:09:04,640 Speaker 3: it and you said, look, I don't want 174 00:09:04,640 --> 00:09:06,880 Speaker 3: a job in Washington, and you said, all I want 175 00:09:06,960 --> 00:09:09,840 Speaker 3: is the login for every computer. And I remember 176 00:09:09,880 --> 00:09:11,720 Speaker 3: thinking at the time that sounded kind of weird, like 177 00:09:11,760 --> 00:09:14,240 Speaker 3: I just didn't get it. And I have to say, 178 00:09:14,280 --> 00:09:19,480 Speaker 3: what's interesting on this is, if I would have thought, like, okay, 179 00:09:19,559 --> 00:09:22,000 Speaker 3: how do you reform government? Like, sort of the traditional 180 00:09:22,000 --> 00:09:23,520 Speaker 3: way to think about it is, okay, give me an 181 00:09:23,600 --> 00:09:25,360 Speaker 3: org chart, let me sit down with the people who 182 00:09:25,400 --> 00:09:29,679 Speaker 3: are running agencies.
And what you saw immediately is to 183 00:09:29,760 --> 00:09:33,079 Speaker 3: understand what's really going on, get to the payment systems, 184 00:09:33,120 --> 00:09:39,600 Speaker 3: get to the computers. Yeah. Like, why is getting to 185 00:09:39,640 --> 00:09:43,400 Speaker 3: the computers so critical to understanding what's actually happening? 186 00:09:44,040 --> 00:09:46,719 Speaker 4: Well, the government is run by computers. So you've got 187 00:09:48,480 --> 00:09:51,760 Speaker 4: essentially several hundred computers that effectively run the government. 188 00:09:52,600 --> 00:09:54,280 Speaker 2: And if you want to know... Did you know that 189 00:09:54,360 --> 00:09:54,800 Speaker 2: then? No. 190 00:09:55,120 --> 00:09:58,600 Speaker 4: Like, yeah, so when somebody, like even when the president 191 00:09:58,640 --> 00:10:00,400 Speaker 4: issues an executive order, that's going to go through a 192 00:10:00,440 --> 00:10:02,720 Speaker 4: whole bunch of people until ultimately it is implemented at 193 00:10:02,720 --> 00:10:05,599 Speaker 4: a computer somewhere. And if you want to know what 194 00:10:06,200 --> 00:10:08,480 Speaker 4: the situation is with the accounting, and you're trying to 195 00:10:08,480 --> 00:10:10,600 Speaker 4: reconcile accounting and get rid of waste and fraud, you 196 00:10:10,679 --> 00:10:13,680 Speaker 4: must be able to analyze the computer databases. Otherwise you 197 00:10:13,679 --> 00:10:16,640 Speaker 4: can't figure it out, because all you're doing is asking 198 00:10:16,679 --> 00:10:20,200 Speaker 4: a human, who will then ask another human, ask another human, 199 00:10:20,240 --> 00:10:23,760 Speaker 4: and finally usually some contractor will ask another contractor 200 00:10:24,040 --> 00:10:27,520 Speaker 4: to do a query on the computer. Wow. That's how 201 00:10:27,559 --> 00:10:31,120 Speaker 4: it actually works. So it's many layers deep. So the 202 00:10:31,160 --> 00:10:33,160 Speaker 4: only way to reconcile the databases and get rid of 203 00:10:33,160 --> 00:10:36,439 Speaker 4: waste and fraud is to actually look at the 204 00:10:36,440 --> 00:10:39,600 Speaker 4: computers and see what's going on. So that's, 205 00:10:39,640 --> 00:10:44,000 Speaker 4: that's why, when I sort of 206 00:10:44,640 --> 00:10:47,800 Speaker 4: cryptically referred to reprogramming the matrix: you have to understand 207 00:10:47,840 --> 00:10:49,600 Speaker 4: what's going on in the computers. You have to reconcile the 208 00:10:49,600 --> 00:10:53,120 Speaker 4: computer databases in order to identify the waste and fraud. 209 00:10:53,360 --> 00:10:59,040 Speaker 3: I don't know that there was anyone in Congress who understood, 210 00:10:59,120 --> 00:11:03,520 Speaker 3: certainly myself included, the leverage that comes from 211 00:11:03,559 --> 00:11:07,440 Speaker 3: the computers and the data in particular. Congress 212 00:11:07,480 --> 00:11:09,400 Speaker 3: would think about, give me a report on what your 213 00:11:09,400 --> 00:11:13,240 Speaker 3: expenditures are, rather than actually getting into the pipes. And 214 00:11:13,280 --> 00:11:16,280 Speaker 3: I think that has been fascinating, that it's let you 215 00:11:16,559 --> 00:11:19,520 Speaker 3: uncover a bunch of graft that just nobody knew.
216 00:11:20,000 --> 00:11:22,640 Speaker 4: Yes. I mean, in order for money to go to 217 00:11:22,679 --> 00:11:25,640 Speaker 4: a bank account, it's not like we're sending truckloads of 218 00:11:25,640 --> 00:11:29,600 Speaker 4: cash all over the place. We're wiring money, right? 219 00:11:29,679 --> 00:11:31,480 Speaker 4: We're sending money through the ACH system or through the 220 00:11:31,480 --> 00:11:34,199 Speaker 4: SWIFT system. So in order for money to flow, it's 221 00:11:34,200 --> 00:11:36,680 Speaker 4: going to flow electronically. So that's what you need 222 00:11:36,720 --> 00:11:39,319 Speaker 4: to look at. You can look at the actual electronic 223 00:11:39,320 --> 00:11:40,280 Speaker 4: money flows. 224 00:11:40,000 --> 00:11:42,240 Speaker 3: At Tesla and all your companies, you have accounting and 225 00:11:42,280 --> 00:11:44,560 Speaker 3: you have every expenditure, you have it coded for what 226 00:11:44,600 --> 00:11:47,640 Speaker 3: it's going for. The federal government doesn't work that way. They 227 00:11:47,640 --> 00:11:48,760 Speaker 3: don't code what the money's going for. 228 00:11:48,800 --> 00:11:51,000 Speaker 2: They do now, but they didn't. They didn't. 229 00:11:51,840 --> 00:11:54,320 Speaker 3: And like one of the things that you told me, 230 00:11:54,360 --> 00:11:56,679 Speaker 3: you said, if any company kept its books the way 231 00:11:56,679 --> 00:11:59,800 Speaker 3: the federal government does, they'd arrest the officers and put 232 00:11:59,800 --> 00:12:00,360 Speaker 3: them in jail. 233 00:12:00,440 --> 00:12:03,080 Speaker 4: Yes, if it was a public company, it would be delisted immediately, 234 00:12:03,200 --> 00:12:05,920 Speaker 4: it would fail its audit, and the officers of the 235 00:12:05,960 --> 00:12:09,800 Speaker 4: company would be imprisoned. That's the level of sleaziness 236 00:12:09,800 --> 00:12:10,200 Speaker 4: we're in the middle of. 237 00:12:10,640 --> 00:12:14,559 Speaker 1: Unfortunately. Is it deliberate, or do you think this is 238 00:12:14,760 --> 00:12:15,720 Speaker 1: incompetence again? 239 00:12:15,600 --> 00:12:20,280 Speaker 4: It's eighty percent. It's eighty percent incompetence and 240 00:12:20,400 --> 00:12:21,280 Speaker 4: twenty percent malice. 241 00:12:21,400 --> 00:12:23,839 Speaker 1: So if you look at it, if you look at 242 00:12:23,880 --> 00:12:26,760 Speaker 1: DOGE now and you look at the government and what 243 00:12:26,800 --> 00:12:31,520 Speaker 1: you're finding, what percentage have you guys even gotten to, 244 00:12:31,640 --> 00:12:34,040 Speaker 1: and how much of it is areas where you haven't 245 00:12:34,040 --> 00:12:36,480 Speaker 1: even gotten there yet, because there's so much you're finding 246 00:12:36,520 --> 00:12:38,640 Speaker 1: out here? I mean, you seem like a 247 00:12:38,679 --> 00:12:40,320 Speaker 1: timeline guy. When you say all you want is to get 248 00:12:40,320 --> 00:12:42,400 Speaker 1: in there and to get all these, you know, numbers 249 00:12:42,400 --> 00:12:44,440 Speaker 1: and things, how far are we from the end game, 250 00:12:44,640 --> 00:12:47,360 Speaker 1: where you've seen it all, been able to process it 251 00:12:47,440 --> 00:12:50,839 Speaker 1: all and fix it? I mean, are we years away? 252 00:12:50,960 --> 00:12:51,600 Speaker 1: Months away? 253 00:12:52,880 --> 00:12:53,400 Speaker 2: Not years.
254 00:12:54,320 --> 00:12:57,360 Speaker 4: I mean, I'm reasonably confident that we'll be able to get 255 00:12:57,559 --> 00:13:03,080 Speaker 4: a trillion dollars of waste and fraud cut, 256 00:13:03,200 --> 00:13:06,120 Speaker 4: meaning that we'll have a net savings 257 00:13:06,160 --> 00:13:07,960 Speaker 4: in FY twenty six, which starts 258 00:13:07,760 --> 00:13:10,960 Speaker 2: in October, obviously, of a trillion dollars. 259 00:13:11,040 --> 00:13:13,559 Speaker 4: But provided we're allowed to continue, and 260 00:13:13,800 --> 00:13:16,360 Speaker 4: our progress is not impeded. And we're very 261 00:13:16,400 --> 00:13:19,240 Speaker 4: public about what we do. Yeah, I don't know 262 00:13:19,240 --> 00:13:22,360 Speaker 4: how we could be more transparent. Literally, 263 00:13:22,400 --> 00:13:24,880 Speaker 4: every action we do, small or large, we put on 264 00:13:24,920 --> 00:13:28,040 Speaker 4: the DOGE website and we post on 265 00:13:28,320 --> 00:13:31,559 Speaker 4: the X handle. And when people complain about it 266 00:13:32,160 --> 00:13:34,120 Speaker 4: and they say, oh, you're doing something unconstitutional, I'm like, well, 267 00:13:34,120 --> 00:13:36,160 Speaker 4: which of these cuts? It's all in the daylight. 268 00:13:36,840 --> 00:13:38,400 Speaker 3: Everyone knows exactly what you're doing. 269 00:13:38,520 --> 00:13:39,559 Speaker 2: Extremely transparent. 270 00:13:39,679 --> 00:13:43,520 Speaker 4: Yeah, I don't think anything's been this transparent ever. 271 00:13:43,800 --> 00:13:48,600 Speaker 3: So five years ago you were a hero to the left. Cool, 272 00:13:48,840 --> 00:13:52,440 Speaker 3: you had electric cars, you had space, and in five 273 00:13:52,520 --> 00:13:53,280 Speaker 3: years you've... 274 00:13:53,080 --> 00:13:55,080 Speaker 4: I can't go to a party in Hollywood and not 275 00:13:55,120 --> 00:13:55,760 Speaker 4: get dirty looks. 276 00:13:56,040 --> 00:13:58,880 Speaker 2: Yeah. In fact, yeah, and now you might not even 277 00:13:58,880 --> 00:14:01,000 Speaker 2: get invited. I think I'm invited, but I don't know 278 00:14:01,040 --> 00:14:01,640 Speaker 2: if I should go. 279 00:14:03,600 --> 00:14:06,160 Speaker 3: And I don't think it's an exaggeration to say, today, 280 00:14:07,200 --> 00:14:10,800 Speaker 3: after Donald Trump, the left hates you more than any 281 00:14:10,880 --> 00:14:11,600 Speaker 3: person on earth. 282 00:14:12,160 --> 00:14:14,600 Speaker 4: Yes, I appear to be number two, I mean, if 283 00:14:14,600 --> 00:14:16,920 Speaker 4: you're judging by the various signs. 284 00:14:17,040 --> 00:14:21,440 Speaker 3: The derangements: it's Trump derangement syndrome and Elon derangement syndrome. 285 00:14:21,680 --> 00:14:23,440 Speaker 3: How is that for you? That's a little bit of 286 00:14:23,440 --> 00:14:27,920 Speaker 3: whiplash, going from being, like, mister cool to the 287 00:14:27,920 --> 00:14:30,440 Speaker 3: devil incarnate in just a couple of years. Is that 288 00:14:30,600 --> 00:14:32,840 Speaker 3: kind of weird, to experience that transformation? 289 00:14:33,080 --> 00:14:33,440 Speaker 2: Yes. 290 00:14:34,360 --> 00:14:35,640 Speaker 3: Why do they hate you so much?
291 00:14:36,400 --> 00:14:41,040 Speaker 4: Well, because we're clearly over the target. If DOGE 292 00:14:41,160 --> 00:14:43,920 Speaker 4: was ineffective, if we were not actually getting rid of 293 00:14:44,200 --> 00:14:46,000 Speaker 4: a bunch of waste and fraud... 294 00:14:46,000 --> 00:14:49,680 Speaker 4: I mean, the fraud we're seeing is 295 00:14:50,200 --> 00:14:54,160 Speaker 4: overwhelmingly on the left. I mean, it's not 296 00:14:54,280 --> 00:14:57,920 Speaker 4: zero on the right, but these NGOs are almost all 297 00:14:58,240 --> 00:15:01,800 Speaker 4: left wing NGOs that are being funded. So 298 00:15:03,120 --> 00:15:07,240 Speaker 4: they hate me because DOGE is being effective and DOGE 299 00:15:07,280 --> 00:15:10,200 Speaker 4: is getting rid of a lot of waste and fraud 300 00:15:10,240 --> 00:15:13,520 Speaker 4: that people on the left were taking advantage of. 301 00:15:13,520 --> 00:15:15,160 Speaker 4: That's what it comes down to. And 302 00:15:15,160 --> 00:15:18,520 Speaker 4: the single biggest thing that they're worried 303 00:15:18,520 --> 00:15:22,960 Speaker 4: about is that DOGE is going to turn 304 00:15:23,040 --> 00:15:28,680 Speaker 4: off fraudulent payments of entitlements. I mean everything from Social Security, Medicare, 305 00:15:28,920 --> 00:15:35,120 Speaker 4: you know, unemployment, disability, Small Business Administration loans, turn 306 00:15:35,200 --> 00:15:39,640 Speaker 4: them off to illegals. This is the crux of the matter. 307 00:15:40,000 --> 00:15:42,720 Speaker 4: Yeah. Okay, this is the 308 00:15:42,760 --> 00:15:44,920 Speaker 4: thing, this is why they really hate my guts, want me 309 00:15:44,960 --> 00:15:45,360 Speaker 4: to die. 310 00:15:46,440 --> 00:15:49,080 Speaker 3: And do you think that's billions, hundreds of billions? What 311 00:15:49,080 --> 00:15:50,320 Speaker 3: do you think the scale is of that? 312 00:15:50,840 --> 00:15:52,960 Speaker 4: I think across the country, it's 313 00:15:53,040 --> 00:15:55,320 Speaker 4: well north of one hundred billion, maybe two hundred billion. 314 00:15:56,800 --> 00:16:02,280 Speaker 4: So by using entitlements fraud, the Democrats have 315 00:16:02,320 --> 00:16:07,680 Speaker 4: been able to attract and retain vast numbers of illegal 316 00:16:07,720 --> 00:16:13,560 Speaker 4: immigrants and buy voters. And buy voters, exactly. They 317 00:16:13,680 --> 00:16:19,160 Speaker 4: basically bring in ten, twenty million people who are beholden 318 00:16:19,280 --> 00:16:23,720 Speaker 4: to the Democrats for government handouts and will vote overwhelmingly 319 00:16:23,720 --> 00:16:26,920 Speaker 4: Democrat, as has been demonstrated in California. This is 320 00:16:27,960 --> 00:16:31,640 Speaker 4: an election strategy. Yes, it's worked, yes. And it 321 00:16:31,680 --> 00:16:34,120 Speaker 4: doesn't take much to turn the swing states blue. I 322 00:16:34,120 --> 00:16:36,000 Speaker 4: mean, often a swing state will be won by 323 00:16:36,160 --> 00:16:38,840 Speaker 4: ten, twenty thousand votes.
Sure. So if the Dems can 324 00:16:38,840 --> 00:16:42,120 Speaker 4: bring in two hundred thousand illegals and over time get 325 00:16:42,120 --> 00:16:45,680 Speaker 4: them legalized, not counting any cheating that takes place, because 326 00:16:45,680 --> 00:16:49,640 Speaker 4: there is some cheating, but even without cheating, if you 327 00:16:50,160 --> 00:16:51,840 Speaker 4: bring in illegals that are ten 328 00:16:52,000 --> 00:16:55,040 Speaker 4: x the vote differential in a swing state, it will 329 00:16:55,040 --> 00:16:58,360 Speaker 4: no longer be a swing state, right, and the Dems 330 00:16:58,400 --> 00:17:00,520 Speaker 4: will win all the swing states. It's just a matter of time, 331 00:17:01,960 --> 00:17:07,000 Speaker 4: and America will be a permanent deep blue socialist state. 332 00:17:07,160 --> 00:17:11,520 Speaker 4: The House, the Senate, yep, the Presidency, and the Supreme 333 00:17:11,240 --> 00:17:13,160 Speaker 2: Court will all go hardcore Dem. 334 00:17:13,640 --> 00:17:17,200 Speaker 4: They will then further cement that by bringing in even more 335 00:17:17,240 --> 00:17:20,640 Speaker 4: aliens, so you can't vote your way out of it. 336 00:17:20,920 --> 00:17:24,080 Speaker 4: Their objective is to make a one party socialist state, and 337 00:17:24,119 --> 00:17:25,840 Speaker 4: it will be much worse than California. Because at least 338 00:17:25,840 --> 00:17:28,600 Speaker 4: California is mitigated by the fact that someone can leave California. 339 00:17:28,640 --> 00:17:29,399 Speaker 3: You can go to Texas. 340 00:17:29,520 --> 00:17:33,640 Speaker 4: Yeah, exactly. So they're going to make everywhere California, but worse. 341 00:17:33,720 --> 00:17:35,960 Speaker 3: By the way, in the middle of the pandemic, I spent 342 00:17:36,040 --> 00:17:38,240 Speaker 3: forty five minutes on the phone with Elon. He was 343 00:17:38,280 --> 00:17:41,919 Speaker 3: still in California. I was walking my dog Snowflake and 344 00:17:42,040 --> 00:17:44,960 Speaker 3: trying to convince you to come to Texas. The commies in 345 00:17:45,000 --> 00:17:48,280 Speaker 3: California can't stand you. We love you, we want you here. 346 00:17:48,320 --> 00:17:50,199 Speaker 3: And you didn't quite go then, but you went not 347 00:17:50,359 --> 00:17:51,439 Speaker 3: that long afterwards. 348 00:17:51,920 --> 00:17:57,240 Speaker 4: I mean, the COVID actions almost killed Tesla, because 349 00:17:57,359 --> 00:17:59,560 Speaker 4: every other auto plant in the country was allowed 350 00:17:59,560 --> 00:18:01,720 Speaker 4: to open, but ours, which was 351 00:18:01,680 --> 00:18:03,359 Speaker 2: in California, was not allowed to open. 352 00:18:03,440 --> 00:18:04,080 Speaker 3: Wow wow. 353 00:18:05,200 --> 00:18:06,439 Speaker 2: So they almost killed Tesla. 354 00:18:07,040 --> 00:18:11,560 Speaker 3: So as a personal matter, do you ever regret it? 355 00:18:11,640 --> 00:18:13,560 Speaker 3: Like, five years ago you'd go to the Oscars and 356 00:18:13,600 --> 00:18:16,600 Speaker 3: were mister cool, and now you've got death threats 357 00:18:16,600 --> 00:18:17,960 Speaker 3: every day. Like, do you... well? 358 00:18:17,960 --> 00:18:21,800 Speaker 2: These days the Oscars are boring. I wouldn't want to go.
359 00:18:21,800 --> 00:18:23,800 Speaker 3: God bless. The movies they nominate, no one on earth 360 00:18:23,800 --> 00:18:26,320 Speaker 3: has ever seen. Like, could they actually nominate a movie 361 00:18:26,320 --> 00:18:27,560 Speaker 3: that human beings go watch? 362 00:18:28,119 --> 00:18:30,840 Speaker 4: I mean, how many great movies have come out in 363 00:18:30,840 --> 00:18:34,520 Speaker 4: the last several years? Very few. Depressingly few. Yeah, very few. 364 00:18:34,920 --> 00:18:36,800 Speaker 4: The last Oscars came and went; I didn't watch it. 365 00:18:36,880 --> 00:18:37,920 Speaker 2: There's nothing to see. 366 00:18:38,160 --> 00:18:41,160 Speaker 3: I was sad that Gene Hackman just passed away, because 367 00:18:41,240 --> 00:18:43,600 Speaker 3: Unforgiven was spectacular. But that was a long time ago 368 00:18:43,640 --> 00:18:44,680 Speaker 3: when Unforgiven came out. 369 00:18:45,119 --> 00:18:50,160 Speaker 1: You've mentioned today here, and before, about the possibility of 370 00:18:50,560 --> 00:18:53,440 Speaker 1: someone wanting to take you out, dealing with the death threats. 371 00:18:54,160 --> 00:18:56,280 Speaker 2: You see, it's not in my imagination. You can just 372 00:18:56,280 --> 00:18:59,960 Speaker 2: look on social media. Yeah, and they make it 373 00:19:00,280 --> 00:19:01,000 Speaker 2: very clear. Yeah. 374 00:19:01,000 --> 00:19:02,600 Speaker 3: And look, I'm very familiar with... 375 00:19:02,680 --> 00:19:06,119 Speaker 4: And they've got signs, the people with signs at demonstrations 376 00:19:06,160 --> 00:19:07,080 Speaker 4: saying that I need to die. 377 00:19:08,119 --> 00:19:11,600 Speaker 3: Do you think these are just whack jobs, or do 378 00:19:11,680 --> 00:19:12,159 Speaker 3: you think... 379 00:19:12,160 --> 00:19:14,920 Speaker 2: They are hopefully just insane people. 380 00:19:15,280 --> 00:19:17,720 Speaker 3: Do you think there are foreign entities behind this? Do 381 00:19:17,760 --> 00:19:20,480 Speaker 3: you think there are domestic entities behind the threats? And also 382 00:19:20,520 --> 00:19:24,600 Speaker 3: the attacks, not to Twitter but to Tesla. I mean, 383 00:19:24,640 --> 00:19:28,520 Speaker 3: you know, you're getting Tesla charging stations lit on fire. 384 00:19:28,560 --> 00:19:31,000 Speaker 3: Do you think that's organized and paid for? 385 00:19:31,960 --> 00:19:33,960 Speaker 4: Yes, at least some of it is organized and paid 386 00:19:33,960 --> 00:19:41,199 Speaker 4: for, I think by domestic, you know, basically left wing 387 00:19:41,359 --> 00:19:48,199 Speaker 4: organizations in America funded by left wing billionaires, essentially. Is 388 00:19:48,200 --> 00:19:50,520 Speaker 4: it like ActBlue or what? ActBlue 389 00:19:50,400 --> 00:19:52,120 Speaker 2: is one of them. 390 00:19:52,800 --> 00:19:56,159 Speaker 4: You know, Arabella, you know, the classics. It's funded by 391 00:19:56,200 --> 00:20:01,760 Speaker 4: the, you know, basically the left wing 392 00:20:02,080 --> 00:20:03,920 Speaker 4: NGO cabal. 393 00:20:05,200 --> 00:20:07,320 Speaker 1: How big of a threat is this to, like, what you 394 00:20:07,359 --> 00:20:09,719 Speaker 1: built at Tesla? I mean, I remember when Teslas came out, 395 00:20:09,800 --> 00:20:12,560 Speaker 1: it was people that didn't want to have gas cars. 396 00:20:12,840 --> 00:20:15,680 Speaker 1: A lot of it was environmental reasons.
I jokingly said, 397 00:20:15,680 --> 00:20:17,520 Speaker 1: I was like, I'm a Texas guy, I'm always going 398 00:20:17,560 --> 00:20:20,680 Speaker 1: to have something that burns gas. My kids now, all 399 00:20:20,720 --> 00:20:24,959 Speaker 1: three of my boys, think that Teslas are awesome. 400 00:20:25,000 --> 00:20:27,480 Speaker 1: The Cybertruck is the car they want their dad 401 00:20:27,520 --> 00:20:29,680 Speaker 1: to buy, which I laugh at because I never could have 402 00:20:29,720 --> 00:20:33,240 Speaker 1: imagined that five years ago. And now I'm looking at... 403 00:20:33,320 --> 00:20:33,880 Speaker 1: we're at 404 00:20:33,720 --> 00:20:36,600 Speaker 3: the White House, and the President's Tesla is parked outside. Yeah, I mean, 405 00:20:36,640 --> 00:20:38,800 Speaker 3: which is the wildest thing. But I mean, you've changed 406 00:20:38,800 --> 00:20:39,360 Speaker 3: a generation. 407 00:20:39,440 --> 00:20:41,240 Speaker 1: When you look at it, my kids are six and eight 408 00:20:41,240 --> 00:20:44,360 Speaker 1: and they're going, Dad, buy a Cybertruck, and I'm 409 00:20:44,440 --> 00:20:47,600 Speaker 1: considering it. That's a full circle in a 410 00:20:47,600 --> 00:20:48,120 Speaker 1: weird way. 411 00:20:48,600 --> 00:20:50,639 Speaker 4: Yeah. Well, I do have this theory that the most 412 00:20:50,760 --> 00:20:56,199 Speaker 4: entertaining outcome is the most likely. So yeah, it seems 413 00:20:56,240 --> 00:21:02,000 Speaker 4: often to be true. What twist or turn of fate? Well, 414 00:21:02,040 --> 00:21:04,200 Speaker 4: what would get the highest ratings? If this was, if we're 415 00:21:04,240 --> 00:21:07,160 Speaker 4: a TV show, what twist or turn of fate would generate 416 00:21:07,200 --> 00:21:09,520 Speaker 4: the highest ratings? There's a good chance that happens. 417 00:21:09,760 --> 00:21:14,160 Speaker 3: Well, I will say, if ActBlue and the Arabella 418 00:21:13,720 --> 00:21:16,560 Speaker 2: network... ActBlue is a huge scam, next level. Do 419 00:21:16,560 --> 00:21:18,919 Speaker 3: you think it's foreign money, Chinese money? Where do you 420 00:21:18,920 --> 00:21:20,520 Speaker 3: think the money in ActBlue is coming from? And 421 00:21:20,520 --> 00:21:21,560 Speaker 3: how do you figure that out? 422 00:21:22,040 --> 00:21:24,280 Speaker 4: Well, it's not coming from a whole bunch 423 00:21:24,320 --> 00:21:28,600 Speaker 4: of... from a ground swell of public support, because when 424 00:21:29,240 --> 00:21:32,400 Speaker 4: individual donors on ActBlue were looked at, they actually turned 425 00:21:32,400 --> 00:21:35,240 Speaker 4: out to be like diehard Republicans, right, people who have never 426 00:21:35,240 --> 00:21:37,200 Speaker 4: given money in their life. So you go and track 427 00:21:37,240 --> 00:21:39,159 Speaker 4: down a bunch of these where it says, oh, I 428 00:21:39,200 --> 00:21:41,400 Speaker 4: gave sixteen thousand dollars, and they're like, I didn't give 429 00:21:41,400 --> 00:21:42,160 Speaker 4: sixteen thousand dollars. 430 00:21:42,200 --> 00:21:43,600 Speaker 2: What are you talking about? 431 00:21:44,600 --> 00:21:46,880 Speaker 4: Well, a few of them were friends of mine, who found 432 00:21:46,880 --> 00:21:48,439 Speaker 4: themselves on the ActBlue list, and they're like...
433 00:21:49,840 --> 00:21:53,080 Speaker 3: So if it can actually be shown that they 434 00:21:53,080 --> 00:21:58,560 Speaker 3: are funding the firebombing of Tesla charging stations, that's objectively a 435 00:21:58,560 --> 00:22:01,520 Speaker 3: criminal act. That is funding terrorist activity. And the 436 00:22:01,560 --> 00:22:05,639 Speaker 3: statutes make clear that an incendiary device qualifies, so burning one 437 00:22:06,080 --> 00:22:10,119 Speaker 3: down is a terrorist activity. Yeah. Let me ask about AI. 438 00:22:11,200 --> 00:22:14,399 Speaker 3: In ten years, how is life going to be different 439 00:22:14,440 --> 00:22:17,000 Speaker 3: because of AI for just a normal person? 440 00:22:17,680 --> 00:22:19,320 Speaker 2: Well, ten years is a long time. 441 00:22:20,400 --> 00:22:24,000 Speaker 4: In ten years, probably AI could do anything better than 442 00:22:24,040 --> 00:22:25,600 Speaker 4: a human can cognitively. 443 00:22:26,080 --> 00:22:26,760 Speaker 2: Probably, almost. 444 00:22:27,200 --> 00:22:29,760 Speaker 4: I think in ten years, based on the current rate 445 00:22:29,800 --> 00:22:32,920 Speaker 4: of improvement, AI will be smarter than the smartest human. 446 00:22:33,880 --> 00:22:34,120 Speaker 2: Yeah. 447 00:22:34,200 --> 00:22:38,600 Speaker 4: Yeah, there will also be a massive number of robots, 448 00:22:39,040 --> 00:22:40,359 Speaker 4: humanoid robots. 449 00:22:40,720 --> 00:22:42,119 Speaker 3: By the way, I got to ask, how come your 450 00:22:42,160 --> 00:22:44,879 Speaker 3: robots look so much like the creepy robots from I, Robot? 451 00:22:45,760 --> 00:22:47,280 Speaker 3: Was that intentional or just... 452 00:22:49,160 --> 00:22:50,920 Speaker 1: I was hoping he was gonna say, yeah, just to mess 453 00:22:50,960 --> 00:22:51,159 Speaker 1: with you. 454 00:22:52,920 --> 00:22:56,080 Speaker 4: It's not meant to look like any prior robot. 455 00:22:56,840 --> 00:23:00,400 Speaker 4: And we'll iterate the design, and you'll be able to... 456 00:23:00,600 --> 00:23:03,400 Speaker 4: a lot of the robot parts are cosmetic. You'll 457 00:23:03,440 --> 00:23:06,879 Speaker 4: be able to switch out the kind of snap on 458 00:23:07,119 --> 00:23:09,360 Speaker 4: cosmetic parts of the robot, make it look like something 459 00:23:09,359 --> 00:23:15,919 Speaker 4: else if you like. So there'll be ultimately billions of 460 00:23:16,000 --> 00:23:17,320 Speaker 4: humanoid robots. 461 00:23:17,600 --> 00:23:21,120 Speaker 2: All cars will be self driving in ten years. 462 00:23:21,760 --> 00:23:25,640 Speaker 4: In ten years, probably ninety percent of miles driven will 463 00:23:25,680 --> 00:23:26,880 Speaker 4: be autonomous. 464 00:23:26,960 --> 00:23:28,439 Speaker 3: Huh, wow, that fast. 465 00:23:29,560 --> 00:23:30,600 Speaker 2: Yeah. 466 00:23:30,840 --> 00:23:34,320 Speaker 4: In five years, probably fifty percent of all miles 467 00:23:34,440 --> 00:23:35,520 Speaker 4: driven will be autonomous. 468 00:23:35,520 --> 00:23:38,400 Speaker 3: Now, if AI will be smarter than any person, how 469 00:23:38,400 --> 00:23:42,680 Speaker 3: many jobs go away because of that? And what do 470 00:23:42,760 --> 00:23:45,160 Speaker 3: people do if you've got billions of people losing 471 00:23:45,160 --> 00:23:47,959 Speaker 3: their jobs? Like, a lot of people are understandably 472 00:23:48,000 --> 00:23:48,880 Speaker 3: freaked out about that.
473 00:23:49,760 --> 00:23:55,720 Speaker 2: Well, goods and services will become close to free, 474 00:23:56,600 --> 00:23:59,399 Speaker 2: so it's not as though people will be wanting in 475 00:23:59,480 --> 00:24:04,399 Speaker 2: terms of goods and services. 476 00:24:02,720 --> 00:24:03,480 Speaker 3: So why is that? 477 00:24:03,480 --> 00:24:03,960 Speaker 2: What? 478 00:24:04,000 --> 00:24:06,880 Speaker 3: Why are goods and services free in an AI world, 479 00:24:07,440 --> 00:24:09,040 Speaker 3: or close to free? 480 00:24:09,280 --> 00:24:12,200 Speaker 4: Well, you have, I don't know, call it tens of 481 00:24:12,200 --> 00:24:16,320 Speaker 4: billions of robots that will make you 482 00:24:16,400 --> 00:24:21,880 Speaker 4: anything or provide any service you want, for basically next 483 00:24:21,880 --> 00:24:27,520 Speaker 4: to nothing. It's not that people will 484 00:24:27,560 --> 00:24:29,040 Speaker 4: have a lower standard of living. They will actually have a 485 00:24:29,119 --> 00:24:35,679 Speaker 4: much higher standard of living. The challenge will be fulfillment. 486 00:24:35,720 --> 00:24:39,040 Speaker 4: How do you derive fulfillment and meaning in life? 487 00:24:39,359 --> 00:24:44,760 Speaker 3: Is Skynet real? Like, you get the apocalyptic visions of AI. 488 00:24:46,080 --> 00:24:52,280 Speaker 3: How real is the prospect of killer robots annihilating humanity? 489 00:24:52,800 --> 00:24:57,600 Speaker 2: Likely, maybe ten percent. On what timeframe? Probably ten years. 490 00:24:58,520 --> 00:24:59,080 Speaker 2: So soon? 491 00:24:59,520 --> 00:25:02,200 Speaker 3: Like, you see a world where that's possible? 492 00:25:03,520 --> 00:25:05,000 Speaker 4: Yeah. But I mean, you can look at it like 493 00:25:05,040 --> 00:25:09,639 Speaker 4: the glass is eighty, ninety percent full, meaning, like, eighty percent 494 00:25:09,920 --> 00:25:13,960 Speaker 4: likely we'll have extreme prosperity for all. 495 00:25:15,119 --> 00:25:17,560 Speaker 3: Now, I guess my view is we're in a race to 496 00:25:17,960 --> 00:25:21,240 Speaker 3: win AI. We're in a race with China, and my 497 00:25:21,359 --> 00:25:23,160 Speaker 3: view is, if there are going to be killer robots, I'd 498 00:25:23,240 --> 00:25:27,920 Speaker 3: rather they be American killer robots than Chinese. How likely 499 00:25:28,640 --> 00:25:30,960 Speaker 3: are we to win? Is America winning right now? 500 00:25:31,000 --> 00:25:33,280 Speaker 3: And how likely is America to win the race for 501 00:25:33,359 --> 00:25:35,480 Speaker 3: AI vis-à-vis China or anyone else? 502 00:25:36,200 --> 00:25:38,440 Speaker 4: Well, the next few years, I think America is likely 503 00:25:38,480 --> 00:25:41,360 Speaker 4: to win. Then it will be a function of who 504 00:25:41,359 --> 00:25:46,560 Speaker 4: controls the AI chip fabrication, the factories that make the 505 00:25:46,560 --> 00:25:50,000 Speaker 4: AI chips, who controls them. If 506 00:25:50,000 --> 00:25:52,200 Speaker 4: more of them are controlled by China, then China will win. 507 00:25:53,119 --> 00:25:55,760 Speaker 3: More of the factories that are making the AI chips, 508 00:25:56,000 --> 00:25:57,199 Speaker 3: you think that will determine it? 509 00:25:57,600 --> 00:25:57,919 Speaker 2: Yes. 510 00:25:58,720 --> 00:26:01,040 Speaker 3: And how are we doing versus China on that front?
511 00:26:02,080 --> 00:26:07,800 Speaker 2: Well, right now, almost all the advanced AI chip factories, 512 00:26:07,840 --> 00:26:12,160 Speaker 2: they call them fabs, are in Taiwan. 513 00:26:12,359 --> 00:26:14,840 Speaker 3: And what if China invades? A hundred miles away. Yeah, 514 00:26:15,160 --> 00:26:20,200 Speaker 3: what happens if China... yeah, if China invades Taiwan, what happens 515 00:26:19,320 --> 00:26:19,879 Speaker 2: to the world? 516 00:26:20,960 --> 00:26:24,639 Speaker 4: Well, if they were to invade in the near term, the 517 00:26:24,680 --> 00:26:28,320 Speaker 4: world would be cut off from advanced AI chips. And 518 00:26:28,640 --> 00:26:31,440 Speaker 4: currently a hundred percent of advanced AI chips are all made 519 00:26:31,440 --> 00:26:31,920 Speaker 4: in Taiwan. 520 00:26:32,240 --> 00:26:34,159 Speaker 1: How fast can we put that online in America? How 521 00:26:34,200 --> 00:26:36,840 Speaker 1: important is that for national security? 522 00:26:36,960 --> 00:26:40,000 Speaker 2: I think it's essential for national security, and we're not 523 00:26:40,040 --> 00:26:40,600 Speaker 2: doing enough. 524 00:26:41,440 --> 00:26:44,399 Speaker 3: You're fifty three years old. I'm one hundred and eighteen 525 00:26:44,480 --> 00:26:46,320 Speaker 3: days older than you. But what the hell have I 526 00:26:46,400 --> 00:26:47,359 Speaker 3: done in my life? I know, 527 00:26:47,480 --> 00:26:47,720 Speaker 2: right? 528 00:26:48,520 --> 00:26:49,680 Speaker 3: Fifty three years old. 529 00:26:49,920 --> 00:26:54,439 Speaker 2: Pretty good. Well, so seventy one was a great 530 00:26:54,359 --> 00:26:57,600 Speaker 3: year. And I was December seventy, because I was just, 531 00:26:57,600 --> 00:27:00,240 Speaker 3: just right before you. You were the summer of seventy one. 532 00:27:01,520 --> 00:27:03,440 Speaker 2: I was born sixty nine days after four twenty. 533 00:27:04,359 --> 00:27:10,960 Speaker 3: Wow. I did ask Ben, this is true. Look, I 534 00:27:11,000 --> 00:27:12,919 Speaker 3: did ask Ben, should I show up and pull out 535 00:27:12,960 --> 00:27:15,640 Speaker 3: a joint and say, can we beat Rogan's views? 536 00:27:15,640 --> 00:27:18,880 Speaker 3: But I was pretty sure it might cause a scandal 537 00:27:18,880 --> 00:27:20,640 Speaker 3: if we did that on a podcast. 538 00:27:21,080 --> 00:27:22,800 Speaker 2: It just turned out to be like a Chocolates of garden. 539 00:27:23,240 --> 00:27:27,879 Speaker 3: Yeah, let me ask you: if today was your 540 00:27:27,960 --> 00:27:28,760 Speaker 3: last day on Earth... 541 00:27:29,000 --> 00:27:30,240 Speaker 2: Yeah, what, what? 542 00:27:30,440 --> 00:27:32,440 Speaker 3: I'm not suggesting it's going to be. But if it were, 543 00:27:32,440 --> 00:27:34,200 Speaker 3: what do you think your biggest legacy would be of everything 544 00:27:34,200 --> 00:27:37,000 Speaker 3: you've done? One hundred years from now, what do you 545 00:27:37,040 --> 00:27:40,320 Speaker 3: think people would remember, if it were 546 00:27:40,480 --> 00:27:41,680 Speaker 3: zero to today? 547 00:27:41,480 --> 00:27:43,080 Speaker 1: And were you ever going to space? 548 00:27:44,640 --> 00:27:47,560 Speaker 4: In the distant future, one hundred or one 549 00:27:47,560 --> 00:27:51,560 Speaker 4: thousand years from now, if SpaceX got humans to Mars, that's 550 00:27:51,640 --> 00:27:53,159 Speaker 4: what they would remember me for. 551 00:27:54,440 --> 00:27:57,280 Speaker 3: All right, final set of questions.
Who's the smartest guy 552 00:27:57,320 --> 00:28:01,040 Speaker 3: you've ever met? You hang out with brilliant people. Like, 553 00:28:01,040 --> 00:28:03,480 Speaker 3: when you look at CEOs, you look 554 00:28:03,520 --> 00:28:07,800 Speaker 3: at, other than yourself, what CEO do you say, damn, 555 00:28:07,920 --> 00:28:08,640 Speaker 3: that guy's good? 556 00:28:10,320 --> 00:28:16,119 Speaker 4: Larry Ellison's very smart. So I'd say Larry Ellison's one 557 00:28:16,119 --> 00:28:21,320 Speaker 4: of the smartest people. You know, Larry Page. I mean, 558 00:28:21,359 --> 00:28:22,160 Speaker 4: there are a lot of people 559 00:28:21,960 --> 00:28:22,480 Speaker 2: that are very smart. 560 00:28:22,480 --> 00:28:25,000 Speaker 4: It's hard to say. Like, you know, I think 561 00:28:25,000 --> 00:28:29,359 Speaker 4: smart is as smart does, so you know, 562 00:28:29,400 --> 00:28:37,439 Speaker 4: what have they done that is difficult and significant? You know, 563 00:28:37,560 --> 00:28:41,040 Speaker 4: Jeff Bezos has done a lot of difficult and significant things. 564 00:28:43,640 --> 00:28:46,120 Speaker 4: I mean, there are a lot of smart humans. I 565 00:28:46,200 --> 00:28:48,479 Speaker 4: call them smart for a human. A lot 566 00:28:48,520 --> 00:28:50,520 Speaker 4: of people are in the smart for a human category. 567 00:28:51,080 --> 00:28:54,760 Speaker 3: All right, final lightning round: Star Wars or Star Trek? 568 00:28:55,960 --> 00:28:58,080 Speaker 4: The first movie I saw in a theater was Star Wars, 569 00:28:58,120 --> 00:28:59,800 Speaker 4: so I think it had a profound effect on me. 570 00:29:00,920 --> 00:29:01,760 Speaker 2: I was six years old. 571 00:29:01,920 --> 00:29:06,720 Speaker 4: I mean, imagine the best first movie you ever see in 572 00:29:06,800 --> 00:29:07,880 Speaker 4: a theater is Star Wars. 573 00:29:07,920 --> 00:29:08,800 Speaker 2: It's going to blow your mind. 574 00:29:08,880 --> 00:29:09,960 Speaker 3: Best Star Wars 575 00:29:09,640 --> 00:29:13,200 Speaker 2: movie? The Empire Strikes Back. 576 00:29:13,400 --> 00:29:16,120 Speaker 3: The only objectively right answer. I stood in line 577 00:29:16,160 --> 00:29:18,280 Speaker 3: three hours with my dad to see it on opening day. 578 00:29:19,120 --> 00:29:22,120 Speaker 2: Kirk or Picard? I like them both. 579 00:29:22,160 --> 00:29:26,920 Speaker 3: I'm for Kirk. Again, objectively right answer. By the way, James T. 580 00:29:27,040 --> 00:29:30,080 Speaker 3: Kirk is a Republican and Picard is a Democrat, and 581 00:29:30,280 --> 00:29:33,800 Speaker 3: the left gets very mad when I say that. Best 582 00:29:33,800 --> 00:29:35,080 Speaker 3: Star Trek movie? 583 00:29:35,920 --> 00:29:37,960 Speaker 2: I mean the original, the first Star Trek movie. 584 00:29:38,680 --> 00:29:45,040 Speaker 4: That's okay. Both of the 585 00:29:45,080 --> 00:29:47,400 Speaker 2: Wrath of Khans were pretty good, but yeah, the original 586 00:29:47,400 --> 00:29:48,160 Speaker 2: Wrath of Khan. 587 00:29:48,840 --> 00:29:53,920 Speaker 3: Ricardo Montalban. Revenge is a dish best served cold. It 588 00:29:54,000 --> 00:29:58,880 Speaker 3: is very cold in space. Although I will say Wrath of Khan 589 00:29:59,000 --> 00:30:01,320 Speaker 3: is objectively the right answer.
But Four is 590 00:30:01,360 --> 00:30:04,320 Speaker 3: a sleeper, when they go back to San Francisco 591 00:30:04,320 --> 00:30:06,640 Speaker 3: and go find the whales, and, you know, 592 00:30:06,720 --> 00:30:09,880 Speaker 3: Scotty picks up the mouse and talks to 593 00:30:09,920 --> 00:30:13,760 Speaker 3: it and goes, a keyboard, how quaint. That's a sleeper. 594 00:30:13,800 --> 00:30:16,640 Speaker 3: All right, last question. Did Han shoot first? 595 00:30:18,480 --> 00:30:21,640 Speaker 2: It seemed like he shot second. This is Verdict. 596 00:30:21,720 --> 00:30:24,200 Speaker 3: And by the way, I apologize. Ben, so Ben was 597 00:30:24,240 --> 00:30:26,600 Speaker 3: a jock and played tennis at Ole Miss, and so 598 00:30:26,600 --> 00:30:30,120 Speaker 3: occasionally we geek out a little. Watching y'all 599 00:30:29,960 --> 00:30:33,880 Speaker 4: geek out over there. I'm still on the question. You 600 00:30:33,960 --> 00:30:36,760 Speaker 4: missed it: the alien misses his blaster shot. So why does he 601 00:30:36,760 --> 00:30:38,720 Speaker 4: miss his blaster shot? Must have been because he got 602 00:30:38,880 --> 00:30:41,959 Speaker 4: shot first. He's missing a point blank blaster shot 603 00:30:41,960 --> 00:30:43,680 Speaker 4: unless he got knocked off kilter. 604 00:30:43,680 --> 00:30:47,520 Speaker 3: But it's a question of, really, which is: is Han 605 00:30:47,680 --> 00:30:50,520 Speaker 3: Solo simply a hero or an antihero? And 606 00:30:50,600 --> 00:30:53,360 Speaker 3: so I'm in the Han shot first category. I think 607 00:30:53,360 --> 00:30:54,880 Speaker 3: I don't like sanitized stories. 608 00:30:54,960 --> 00:30:57,400 Speaker 4: He would have shot first, because why would 609 00:30:57,400 --> 00:30:59,480 Speaker 4: the alien miss at point blank range? 610 00:30:59,800 --> 00:31:00,000 Speaker 2: Are you, 611 00:31:00,000 --> 00:31:01,640 Speaker 1: are you ever going to go to outer space? 612 00:31:01,720 --> 00:31:03,000 Speaker 3: Is that a thing in your life goals? 613 00:31:03,240 --> 00:31:04,640 Speaker 4: Yeah, I'd like to go to Mars at some point. 614 00:31:04,760 --> 00:31:08,240 Speaker 4: And people have said, do I want to die on Mars, 615 00:31:09,040 --> 00:31:10,680 Speaker 4: and I say yes, just not on impact. 616 00:31:11,520 --> 00:31:14,600 Speaker 3: Now, that's a very good answer. The astronauts on the 617 00:31:14,600 --> 00:31:19,320 Speaker 3: space station, are they political prisoners? Some of them are, 618 00:31:20,400 --> 00:31:22,560 Speaker 3: because you could have given them a ride back, 619 00:31:22,640 --> 00:31:26,719 Speaker 3: and Joe Biden said no, purely for politics. 620 00:31:27,040 --> 00:31:30,480 Speaker 4: Yeah, I mean, you know, there's been some debate about 621 00:31:30,480 --> 00:31:32,840 Speaker 4: this online. But the thing is that it was 622 00:31:32,880 --> 00:31:35,880 Speaker 4: a very high level decision, so it wasn't really 623 00:31:35,920 --> 00:31:39,880 Speaker 4: even a NASA decision. It was just that the Biden 624 00:31:39,920 --> 00:31:42,600 Speaker 4: White House did not want to have someone who is 625 00:31:42,920 --> 00:31:48,000 Speaker 4: pro Trump rescuing astronauts right before the election, so they 626 00:31:48,000 --> 00:31:48,360 Speaker 4: pushed it. 627 00:31:48,480 --> 00:31:50,080 Speaker 3: Well, if you're one of those astronauts, you gotta be 628 00:31:50,080 --> 00:31:51,280 Speaker 3: pretty pissed off about that.
629 00:31:52,280 --> 00:31:58,280 Speaker 4: Well, if they're a Republican, yes. A Democrat, like, everything's fine. 630 00:31:58,360 --> 00:32:00,680 Speaker 4: Fair enough. So I think one of them is a 631 00:32:00,680 --> 00:32:03,360 Speaker 4: Republican and one's an independent. So it depends on which one you ask. 632 00:32:03,640 --> 00:32:06,400 Speaker 3: Well, thank you, Elon. This was awesome. And let 633 00:32:06,400 --> 00:32:09,440 Speaker 3: me say, and by the way, I put out on 634 00:32:09,760 --> 00:32:12,000 Speaker 3: X the day before yesterday, if you were having a 635 00:32:12,000 --> 00:32:14,000 Speaker 3: beer with Elon and could ask him anything, what would 636 00:32:14,040 --> 00:32:17,360 Speaker 3: you ask? And got lots of responses. The most common 637 00:32:17,400 --> 00:32:21,480 Speaker 3: response people said is, say thank you. Look, Texans 638 00:32:21,480 --> 00:32:24,280 Speaker 3: and the American people appreciate what you're doing. You don't 639 00:32:24,320 --> 00:32:26,520 Speaker 3: have to put up with this BS, and you're doing it. 640 00:32:26,560 --> 00:32:29,160 Speaker 3: I'm grateful. You're making a hell of a difference for 641 00:32:29,200 --> 00:32:32,360 Speaker 3: this country. I appreciate you, and the American people appreciate you. 642 00:32:32,640 --> 00:32:34,920 Speaker 4: Yeah, it's essential for the future of civilization. Otherwise I 643 00:32:34,920 --> 00:32:36,640 Speaker 4: wouldn't be doing it. It's not like I want 644 00:32:36,680 --> 00:32:37,640 Speaker 4: to get death threats. 645 00:32:37,440 --> 00:32:38,000 Speaker 2: You know. No. 646 00:32:38,480 --> 00:32:41,080 Speaker 1: All right now, part two of this interview with Elon 647 00:32:41,160 --> 00:32:43,920 Speaker 1: Musk and some bigger breaking news that we're going to 648 00:32:44,000 --> 00:32:47,240 Speaker 1: have for you about corruption in the government will hit on 649 00:32:48,000 --> 00:32:51,240 Speaker 1: Wednesday morning. So make sure you hit that subscribe 650 00:32:50,680 --> 00:32:52,800 Speaker 3: or auto download button right 651 00:32:52,600 --> 00:32:55,800 Speaker 1: now. And again, please help this show go viral so 652 00:32:55,840 --> 00:32:59,120 Speaker 1: that we can continue to expose government waste. Wherever you're 653 00:32:59,160 --> 00:33:02,720 Speaker 1: on social media, hit that little forward button and post 654 00:33:02,880 --> 00:33:05,840 Speaker 1: this episode on social media, and the Senator and I 655 00:33:05,880 --> 00:33:08,440 Speaker 1: will see you back here with Elon Musk, part two, 656 00:33:08,600 --> 00:33:09,320 Speaker 1: Wednesday morning.