1 00:00:01,120 --> 00:00:05,160 Speaker 1: Today is a very special day as we are doing 2 00:00:05,360 --> 00:00:09,120 Speaker 1: part one of our two part conversation at the White 3 00:00:09,119 --> 00:00:13,160 Speaker 1: House with Elon Musk. Now, Elon is going to break 4 00:00:13,200 --> 00:00:18,480 Speaker 1: news during today and Wednesday's show on Doge and some incredible, 5 00:00:18,560 --> 00:00:22,360 Speaker 1: shocking information about corruption within our government. I want to 6 00:00:22,400 --> 00:00:25,520 Speaker 1: make sure you hit that subscribe button, that auto download 7 00:00:25,600 --> 00:00:28,840 Speaker 1: button right now so you do not miss part two 8 00:00:29,360 --> 00:00:33,880 Speaker 1: on Wednesday. Also, please help this go viral as we're 9 00:00:33,920 --> 00:00:38,839 Speaker 1: exposing government waste by sharing this episode on any social 10 00:00:38,840 --> 00:00:42,040 Speaker 1: media platform that you are on. But before we get 11 00:00:42,080 --> 00:00:44,640 Speaker 1: to that: after more than a year of war, terror 12 00:00:44,760 --> 00:00:47,720 Speaker 1: and pain in Israel, the need for security essentials and 13 00:00:47,800 --> 00:00:51,800 Speaker 1: support for first responders is still critical. Even in times 14 00:00:51,800 --> 00:00:55,440 Speaker 1: of ceasefire, Israel must be prepared for the next attack, 15 00:00:55,640 --> 00:00:57,080 Speaker 1: wherever it may come from, 16 00:00:57,160 --> 00:00:59,240 Speaker 2: as Israel is surrounded 17 00:00:58,640 --> 00:01:01,800 Speaker 1: by enemies on all sides. That is where the International 18 00:01:01,840 --> 00:01:05,559 Speaker 1: Fellowship of Christians and Jews and you come in.
They 19 00:01:05,680 --> 00:01:07,720 Speaker 1: are the ones that help give the support that is 20 00:01:07,800 --> 00:01:10,640 Speaker 1: needed, and the people of Israel need, now more than 21 00:01:10,720 --> 00:01:14,520 Speaker 1: ever before, life saving security essentials. Your gift today 22 00:01:14,600 --> 00:01:18,839 Speaker 1: will help save lives by providing bomb shelters, armored security vehicles 23 00:01:18,920 --> 00:01:23,640 Speaker 1: and ambulances, firefighting equipment, flak jackets and bulletproof vests, and 24 00:01:23,720 --> 00:01:26,759 Speaker 1: so much more. Your generous donation today will help ensure 25 00:01:26,800 --> 00:01:29,360 Speaker 1: the people of Israel are safe and secure in the 26 00:01:29,440 --> 00:01:31,839 Speaker 1: days to come. So give a gift to bless Israel 27 00:01:31,840 --> 00:01:37,280 Speaker 1: and our people by visiting SUPPORTIFCJ dot org. That's one word, 28 00:01:37,720 --> 00:01:42,280 Speaker 1: support IFCJ dot org. Or call eight eight eight, four 29 00:01:42,640 --> 00:01:46,759 Speaker 1: eight eight, IFCJ. That's eight eight eight, four eight eight, 30 00:01:46,920 --> 00:01:50,400 Speaker 1: IFCJ: eight eight eight, four eight eight, four three two 31 00:01:50,440 --> 00:01:54,160 Speaker 1: five, or support IFCJ dot org. Here is part one 32 00:01:54,640 --> 00:01:57,760 Speaker 1: of our sit down conversation at the White House with 33 00:01:57,880 --> 00:01:58,639 Speaker 1: Elon Musk. 34 00:01:59,080 --> 00:02:00,920 Speaker 3: Well, we're in the White House. What, right now? 35 00:02:00,960 --> 00:02:04,360 Speaker 3: And we're here with my friend Elon Musk, who really 36 00:02:04,400 --> 00:02:06,320 Speaker 3: has not been doing much of anything, has not made 37 00:02:06,320 --> 00:02:11,200 Speaker 3: any news, and nobody has noticed. Yeah, the impact. 38 00:02:11,520 --> 00:02:16,680 Speaker 3: Welcome, Elon. Holy crap. Ah, yes, wow.
Let me just say, 39 00:02:17,600 --> 00:02:18,400 Speaker 3: never a dull moment. 40 00:02:18,760 --> 00:02:19,799 Speaker 2: Never a dull moment. 41 00:02:19,840 --> 00:02:22,520 Speaker 3: The first fifty days the president has spent in office, 42 00:02:23,600 --> 00:02:26,840 Speaker 3: over the top. And the first fifty days you've spent, 43 00:02:27,320 --> 00:02:29,040 Speaker 3: I don't think there's ever been anyone to have an 44 00:02:29,040 --> 00:02:32,200 Speaker 3: impact the way you have at the beginning. Let me 45 00:02:32,200 --> 00:02:35,040 Speaker 3: start with a question you know a lot about: which 46 00:02:35,200 --> 00:02:38,720 Speaker 3: was worse, the mess you found at Twitter or the 47 00:02:38,760 --> 00:02:40,240 Speaker 3: mess you found in the federal government? 48 00:02:40,280 --> 00:02:41,920 Speaker 4: Well, it's hard to compete with the federal government. 49 00:02:43,160 --> 00:02:45,600 Speaker 3: What surprised you about the federal government? I assume you 50 00:02:45,639 --> 00:02:47,959 Speaker 3: came in and assumed it was bad. Is it worse 51 00:02:48,000 --> 00:02:48,760 Speaker 3: than you expected? 52 00:02:50,320 --> 00:02:53,160 Speaker 5: It is worse than I expected. But on the plus side, 53 00:02:53,160 --> 00:02:56,720 Speaker 5: that means there's more opportunity for improvement. So look, if 54 00:02:56,720 --> 00:03:00,200 Speaker 5: you look on the bright side, there's actually a lot 55 00:03:00,240 --> 00:03:04,079 Speaker 5: of opportunity for improvement in federal government expenditures, because it's 56 00:03:04,080 --> 00:03:04,799 Speaker 5: so bad. 57 00:03:05,360 --> 00:03:07,200 Speaker 4: If it was a well run ship, it would be 58 00:03:07,240 --> 00:03:08,120 Speaker 4: very difficult to improve.
59 00:03:08,600 --> 00:03:11,079 Speaker 5: So like, but so now it's like people say, well, 60 00:03:11,200 --> 00:03:13,320 Speaker 5: how how do you figure out how to save money 61 00:03:13,320 --> 00:03:14,920 Speaker 5: in the federal government? Well, it's like being in a 62 00:03:14,960 --> 00:03:17,080 Speaker 5: room where the walls, the roof, and the floor are 63 00:03:17,080 --> 00:03:23,720 Speaker 5: all targets. Any direction you shoot, you can't miss. Wow, yeah, 64 00:03:23,760 --> 00:03:24,800 Speaker 5: I'm sure you would agree. 65 00:03:25,560 --> 00:03:28,079 Speaker 2: So a lot of folks have talked about, like, like. 66 00:03:28,600 --> 00:03:32,240 Speaker 4: You can't miss, right, shooting in any direction. 67 00:03:33,720 --> 00:03:36,400 Speaker 3: A lot of the crazy expenditures, things like like two 68 00:03:36,440 --> 00:03:39,680 Speaker 3: million bucks for sex change surgeries in Guatemala, essential, 69 00:03:42,240 --> 00:03:46,120 Speaker 3: you know, transgendered mice, and Sesame Street in Iraq. 70 00:03:46,200 --> 00:03:47,880 Speaker 3: A lot of that has gotten attention. But some of 71 00:03:47,920 --> 00:03:51,040 Speaker 3: the stuff you've told me about, like, tell us about 72 00:03:51,360 --> 00:03:53,520 Speaker 3: computer licenses in government agencies. 73 00:03:53,560 --> 00:03:56,119 Speaker 5: Yeah, so most of what Doge is finding, you don't 74 00:03:56,160 --> 00:03:59,560 Speaker 5: need to be Sherlock Holmes. Okay, it's very obvious, basic stuff. 75 00:04:00,120 --> 00:04:03,760 Speaker 5: So in every government department, I say every because we've 76 00:04:03,760 --> 00:04:06,800 Speaker 5: not yet found a single exception, there are far too 77 00:04:06,880 --> 00:04:13,360 Speaker 5: many software licenses and media subscriptions, meaning many more software 78 00:04:13,400 --> 00:04:16,800 Speaker 5: licenses and media subscriptions than there are humans in the department.
79 00:04:17,000 --> 00:04:19,000 Speaker 3: Like you were saying, like, an agency with fifteen thousand 80 00:04:19,040 --> 00:04:23,400 Speaker 3: people might have thirty thousand licenses. Yes. And even of 81 00:04:23,440 --> 00:04:25,960 Speaker 3: the fifteen thousand employees, a good chunk of them hadn't 82 00:04:26,040 --> 00:04:30,040 Speaker 3: used the license, had never logged on or used the application. 83 00:04:30,279 --> 00:04:35,560 Speaker 5: Yes, we found entire situations of software licenses or media 84 00:04:35,839 --> 00:04:38,440 Speaker 5: subscriptions where there were zero logins. 85 00:04:39,000 --> 00:04:40,880 Speaker 2: And yet we were paying for it. 86 00:04:41,000 --> 00:04:44,400 Speaker 5: Yes, the government's paying for thousands of licenses of software 87 00:04:45,240 --> 00:04:47,760 Speaker 5: or media subscriptions, and no one had ever logged in 88 00:04:47,839 --> 00:04:48,360 Speaker 5: even once. 89 00:04:48,960 --> 00:04:51,600 Speaker 2: Or credit cards. You found the same thing with government credit 90 00:04:51,400 --> 00:04:53,919 Speaker 5: cards. We found that there are twice as many credit 91 00:04:53,960 --> 00:04:57,400 Speaker 5: cards as there are humans, and I still don't have 92 00:04:57,480 --> 00:04:59,680 Speaker 5: a good explanation for why this is the case. And 93 00:04:59,720 --> 00:05:02,840 Speaker 5: these are ten thousand dollar limit cards, so it's a 94 00:05:02,880 --> 00:05:03,480 Speaker 5: lot of money. 95 00:05:04,200 --> 00:05:06,919 Speaker 1: Is it incompetence that you're finding, or is this like 96 00:05:07,040 --> 00:05:09,640 Speaker 1: the biggest money laundering scheme in the history of the 97 00:05:09,720 --> 00:05:10,679 Speaker 1: world that you're finding?
98 00:05:11,520 --> 00:05:14,560 Speaker 5: Okay, I think it's mostly... If you say, look, what's 99 00:05:14,600 --> 00:05:18,279 Speaker 5: the waste to fraud ratio? Yeah, in my opinion, it's 100 00:05:18,880 --> 00:05:20,800 Speaker 5: it's like eighty percent waste, twenty percent fraud. 101 00:05:21,120 --> 00:05:24,440 Speaker 4: But you do have these sort of gray areas. 102 00:05:25,080 --> 00:05:29,520 Speaker 5: For example, an example would be, so, uh, we saw a lot 103 00:05:29,520 --> 00:05:33,760 Speaker 5: of payments going out of Treasury that had no payment 104 00:05:33,800 --> 00:05:36,720 Speaker 5: code and no explanation for the payment, and then we're 105 00:05:36,839 --> 00:05:39,800 Speaker 5: we're we're trying to figure out what that payment is, 106 00:05:40,200 --> 00:05:42,520 Speaker 5: and we'd see that, okay, that contract was supposed to 107 00:05:42,560 --> 00:05:45,480 Speaker 5: be shut off, but but someone forgot to shut off 108 00:05:45,600 --> 00:05:49,240 Speaker 5: that that contract, and so the company kept getting money. 109 00:05:49,480 --> 00:05:54,000 Speaker 4: Wow. Now is that waste or fraud? Both, both. 110 00:05:54,320 --> 00:05:57,400 Speaker 1: Yeah, you're not. 111 00:05:57,400 --> 00:06:00,279 Speaker 5: Supposed to get, you're not supposed to get it, 112 00:06:00,400 --> 00:06:02,920 Speaker 5: but the government sent it to you, and nobody from 113 00:06:02,920 --> 00:06:05,839 Speaker 5: the government asked for it back. Take, for example, the 114 00:06:05,880 --> 00:06:09,359 Speaker 5: one, the one point nine billion dollars given to Stacey Abrams' 115 00:06:09,640 --> 00:06:14,719 Speaker 5: fake NGO. Utter insanity. Explain the story. 116 00:06:13,800 --> 00:06:17,919 Speaker 3: That's that's just corrupt. I think that's paying off cronies 117 00:06:17,960 --> 00:06:18,400 Speaker 3: at that point. 118 00:06:19,360 --> 00:06:19,880 Speaker 4: Yeah. Yeah.
119 00:06:20,040 --> 00:06:22,040 Speaker 3: And by the way, she knew. Like, when you get 120 00:06:22,040 --> 00:06:24,720 Speaker 3: two billion dollars, you don't miss that. That's not an 121 00:06:24,760 --> 00:06:28,839 Speaker 5: accident. Allegedly it was for, like, uh, you know, 122 00:06:28,920 --> 00:06:32,640 Speaker 5: environmentally friendly appliances or something. And they've given like, like, 123 00:06:32,640 --> 00:06:35,600 Speaker 5: one hundred appliances so far for two billion dollars. It's 124 00:06:35,720 --> 00:06:40,560 Speaker 5: a very expensive toaster, that's a very expensive fridge. 125 00:06:40,520 --> 00:06:41,000 Speaker 4: It's nice. 126 00:06:41,600 --> 00:06:45,320 Speaker 5: It's just obviously... One of the biggest scam loopholes 127 00:06:45,440 --> 00:06:49,360 Speaker 5: we've uncovered, which is really crazy, is uh, is that 128 00:06:49,560 --> 00:06:52,680 Speaker 5: is that the government can give money to a so 129 00:06:52,800 --> 00:06:56,919 Speaker 5: called nonprofit with with very few controls, and then there's 130 00:06:57,320 --> 00:07:01,040 Speaker 5: there's no auditing subsequently of that nonprofit. So 131 00:07:01,080 --> 00:07:03,479 Speaker 5: this is where, with the, you know, 132 00:07:03,480 --> 00:07:06,760 Speaker 5: one point nine billion to Stacey Abrams, who's to say they 133 00:07:06,760 --> 00:07:11,800 Speaker 5: didn't give themselves extremely lavish, like insane, salaries, expense everything, yep, 134 00:07:12,000 --> 00:07:15,360 Speaker 5: to the to the nonprofit, you know, buy jets and 135 00:07:15,480 --> 00:07:17,120 Speaker 5: homes and all sorts of things. 136 00:07:17,360 --> 00:07:20,320 Speaker 2: Live like kings and queens. Yes, on the taxpayer's dime, correct. 137 00:07:20,680 --> 00:07:23,240 Speaker 4: You mentioned this is happening at scale. It's not just 138 00:07:23,240 --> 00:07:25,280 Speaker 4: one or two. We're seeing this everywhere.
139 00:07:25,480 --> 00:07:27,280 Speaker 3: Now, one of the things you told me about is 140 00:07:27,280 --> 00:07:32,560 Speaker 3: what you call the magic money computers. Well, 141 00:07:32,800 --> 00:07:34,120 Speaker 3: so tell us about it, because I never heard of 142 00:07:34,120 --> 00:07:35,320 Speaker 3: that until you brought that up. 143 00:07:35,760 --> 00:07:39,760 Speaker 5: Okay. So you may think that these, the government computers, 144 00:07:40,520 --> 00:07:43,560 Speaker 5: like, all talk to each other, they synchronize, they add 145 00:07:43,640 --> 00:07:46,680 Speaker 5: up what funds are going somewhere, and it's, you know, 146 00:07:47,840 --> 00:07:51,520 Speaker 5: it's coherent, that that that, you know, 147 00:07:51,560 --> 00:07:53,520 Speaker 5: that the numbers, for example, that you're presented 148 00:07:53,560 --> 00:07:56,320 Speaker 5: as a senator, yeah, are actually the real numbers. 149 00:07:56,760 --> 00:08:01,920 Speaker 4: And one would think... well, I would think they're not. Okay. 150 00:07:59,880 --> 00:08:03,920 Speaker 5: I mean, they're not totally wrong, but they're probably off 151 00:08:03,920 --> 00:08:07,600 Speaker 5: by five percent or ten percent in some cases. So 152 00:08:08,120 --> 00:08:10,160 Speaker 5: I call it a magic money computer: any computer which can 153 00:08:10,240 --> 00:08:13,320 Speaker 5: just make money out of thin air is a magic money computer. 154 00:08:13,360 --> 00:08:14,280 Speaker 2: So how does that work? 155 00:08:14,560 --> 00:08:15,720 Speaker 4: It just issues payments. 156 00:08:16,920 --> 00:08:17,480 Speaker 2: And you said 157 00:08:17,560 --> 00:08:20,280 Speaker 3: something like eleven of these computers at Treasury that are 158 00:08:20,400 --> 00:08:23,080 Speaker 3: that are sending out trillions in payments. They're
159 00:08:22,920 --> 00:08:27,640 Speaker 5: mostly at Treasury, some are at HHS, 160 00:08:27,800 --> 00:08:31,600 Speaker 5: there's one or two at State, there's some 161 00:08:31,720 --> 00:08:36,000 Speaker 5: at DoD. I think we found now fourteen magic money computers. 162 00:08:37,040 --> 00:08:37,240 Speaker 2: Geez. 163 00:08:37,280 --> 00:08:40,040 Speaker 4: Okay, they just send money out of nothing. 164 00:08:40,880 --> 00:08:44,280 Speaker 3: You have an ability to see where leverage points are 165 00:08:45,400 --> 00:08:48,800 Speaker 3: and how things actually happen. So I remember back, I 166 00:08:48,800 --> 00:08:51,720 Speaker 3: think it was September, October of this year, before the election. 167 00:08:51,800 --> 00:08:53,280 Speaker 3: We didn't know who was going to win, and I 168 00:08:53,360 --> 00:08:55,520 Speaker 3: was at your house in Austin. We were talking about it, 169 00:08:55,559 --> 00:08:59,000 Speaker 3: and you said, you said, look, I don't want a 170 00:08:59,080 --> 00:09:01,200 Speaker 3: job in Washington, D.C. You said, all I want 171 00:09:01,240 --> 00:09:01,880 Speaker 3: is the login 172 00:09:01,800 --> 00:09:02,440 Speaker 2: for every computer. 173 00:09:03,160 --> 00:09:05,560 Speaker 3: And I remember thinking at the time that sounded kind 174 00:09:05,600 --> 00:09:07,840 Speaker 3: of weird, like, I just didn't get it. And I 175 00:09:07,960 --> 00:09:12,560 Speaker 3: have to say, what's interesting on this: if I would 176 00:09:12,600 --> 00:09:15,400 Speaker 3: have thought, like, okay, how do you reform government? Like, 177 00:09:15,440 --> 00:09:17,440 Speaker 3: sort of the traditional way to think about it is, okay, 178 00:09:17,480 --> 00:09:19,240 Speaker 3: give me an org chart, let me sit down with 179 00:09:19,240 --> 00:09:23,000 Speaker 3: the people who are running agencies.
And what you saw 180 00:09:23,080 --> 00:09:26,480 Speaker 3: immediately is, to understand what's really going on, get to 181 00:09:26,480 --> 00:09:28,600 Speaker 3: the payment systems, get to the computers. 182 00:09:28,760 --> 00:09:29,560 Speaker 4: Yeah. Like. 183 00:09:32,840 --> 00:09:36,239 Speaker 3: Why is getting to the computers so critical to understanding 184 00:09:36,240 --> 00:09:37,680 Speaker 3: what's actually happening? 185 00:09:38,320 --> 00:09:42,160 Speaker 5: Well, the government is run by computers. So you've got 186 00:09:42,760 --> 00:09:46,040 Speaker 5: essentially several hundred computers that effectively run the government. 187 00:09:46,920 --> 00:09:48,600 Speaker 4: And if you want to know... did you know that 188 00:09:48,640 --> 00:09:49,120 Speaker 4: then? No. 189 00:09:49,440 --> 00:09:52,840 Speaker 5: Like, yeah. So when somebody, like, even when the president 190 00:09:52,920 --> 00:09:54,719 Speaker 5: issues an executive order, that's going to go through a 191 00:09:54,720 --> 00:09:57,000 Speaker 5: whole bunch of people, and ultimately it is implemented at 192 00:09:57,000 --> 00:10:00,199 Speaker 5: a computer somewhere. And if you want to know what 193 00:10:00,200 --> 00:10:02,719 Speaker 5: the situation is with the accounting, and you're trying 194 00:10:02,720 --> 00:10:04,760 Speaker 5: to reconcile accounting and get rid of waste and fraud, 195 00:10:04,840 --> 00:10:07,880 Speaker 5: you must be able to analyze the computer databases. Otherwise 196 00:10:07,880 --> 00:10:10,640 Speaker 5: you can't figure it out, because all you're doing is 197 00:10:10,640 --> 00:10:13,959 Speaker 5: asking a human, who will then ask another human, ask 198 00:10:14,000 --> 00:10:17,160 Speaker 5: another human, and finally usually some contractor will ask 199 00:10:17,200 --> 00:10:19,560 Speaker 5: another contractor to query the computer. 200 00:10:20,559 --> 00:10:22,560 Speaker 4: Wow. That's how it actually works.
201 00:10:22,840 --> 00:10:25,880 Speaker 5: So it's many layers deep. So the only way to 202 00:10:25,920 --> 00:10:28,040 Speaker 5: reconcile the databases and get rid of waste and fraud 203 00:10:28,160 --> 00:10:31,680 Speaker 5: is to to actually look at the computers and see 204 00:10:31,720 --> 00:10:35,400 Speaker 5: what's going on. So that's, like, 205 00:10:37,120 --> 00:10:40,559 Speaker 5: that's why, when I sort of cryptically referred to reprogramming 206 00:10:40,559 --> 00:10:43,079 Speaker 5: the matrix: you have to understand what's going on in the computers. 207 00:10:43,080 --> 00:10:45,800 Speaker 5: You have to reconcile the computer databases in order to 208 00:10:45,840 --> 00:10:47,360 Speaker 5: identify the waste and fraud. 209 00:10:47,679 --> 00:10:53,400 Speaker 3: I don't know that there was anyone in Congress who understood, 210 00:10:53,400 --> 00:10:57,840 Speaker 3: certainly myself included, who understood the leverage that comes from 211 00:10:57,840 --> 00:11:01,760 Speaker 3: the computer, and the data in particular. That Congress 212 00:11:01,760 --> 00:11:03,679 Speaker 3: would think about, give me a report on what your 213 00:11:03,720 --> 00:11:07,560 Speaker 3: expenditures are, rather than actually getting into the pipes. And 214 00:11:07,600 --> 00:11:10,600 Speaker 3: I think that has been fascinating, that it's let you 215 00:11:10,840 --> 00:11:13,840 Speaker 3: uncover a bunch of graft that just nobody knew. 216 00:11:14,320 --> 00:11:16,960 Speaker 5: Yes. I mean, in order for money to go to 217 00:11:17,000 --> 00:11:19,840 Speaker 5: a bank account, it's it's not like we're sending truckloads 218 00:11:19,840 --> 00:11:22,920 Speaker 5: of cash all over the place. We're 219 00:11:22,920 --> 00:11:25,439 Speaker 5: wiring money, right? We're sending money through the ACH system 220 00:11:25,520 --> 00:11:27,920 Speaker 5: or through the Swift system.
So in order for money 221 00:11:27,960 --> 00:11:30,600 Speaker 5: to flow, it's going to flow electronically. So that's that's 222 00:11:30,600 --> 00:11:31,720 Speaker 5: what you need to look at. You need to look 223 00:11:31,760 --> 00:11:34,200 Speaker 5: at the actual electronic money flows. 224 00:11:34,320 --> 00:11:36,520 Speaker 3: And at Tesla and all your companies, you have accounting, and 225 00:11:36,559 --> 00:11:38,840 Speaker 3: you have every expenditure, you have it coded for what 226 00:11:38,920 --> 00:11:41,800 Speaker 3: it's going for. The federal government doesn't work that way. 227 00:11:41,840 --> 00:11:43,240 Speaker 2: They don't code what the money is going for? They 228 00:11:43,240 --> 00:11:44,720 Speaker 2: do not. But they didn't. 229 00:11:44,880 --> 00:11:45,280 Speaker 4: They didn't. 230 00:11:46,120 --> 00:11:48,600 Speaker 3: And like, one of the things that that you told me, 231 00:11:48,679 --> 00:11:50,959 Speaker 3: you said, if any company kept its books the way 232 00:11:51,000 --> 00:11:54,320 Speaker 3: the federal government does, they'd arrest the officers and put 233 00:11:54,200 --> 00:11:54,680 Speaker 2: them in jail. 234 00:11:54,760 --> 00:11:57,400 Speaker 5: Yes. If it was a public company, it would be delisted immediately, 235 00:11:57,480 --> 00:12:00,240 Speaker 5: it would fail its audit, and the officers of the 236 00:12:00,240 --> 00:12:03,720 Speaker 5: company would be imprisoned. That's the level of malfeasance 237 00:12:03,720 --> 00:12:04,480 Speaker 5: in the federal government. 238 00:12:04,920 --> 00:12:08,839 Speaker 1: Unfortunately. Is it deliberate, or do you think this is 239 00:12:09,040 --> 00:12:10,079 Speaker 1: incompetence, again? 240 00:12:09,880 --> 00:12:14,560 Speaker 5: It's eighty percent, it's eighty percent incompetence and 241 00:12:14,679 --> 00:12:15,600 Speaker 5: twenty percent malice.
242 00:12:15,720 --> 00:12:18,600 Speaker 1: So if you look at, if you look at DOGE 243 00:12:18,640 --> 00:12:21,840 Speaker 1: now, and you look at the government and what you're finding, 244 00:12:22,840 --> 00:12:26,200 Speaker 1: what percentage have you guys even gotten to? And how 245 00:12:26,280 --> 00:12:28,720 Speaker 1: much of it is areas where you haven't even gotten 246 00:12:28,760 --> 00:12:31,280 Speaker 1: there yet, because there's so much you're finding out here? 247 00:12:31,360 --> 00:12:33,600 Speaker 1: I mean, how many... you seem like a timeline guy, 248 00:12:33,600 --> 00:12:34,640 Speaker 1: when you say, all right, I want to get in 249 00:12:34,679 --> 00:12:37,120 Speaker 1: there and get all these, you know, numbers and things. 250 00:12:37,240 --> 00:12:39,360 Speaker 1: How far are we from the end game, where you've 251 00:12:39,400 --> 00:12:42,959 Speaker 1: seen it all, been able to process it all, and 252 00:12:43,400 --> 00:12:46,160 Speaker 1: fix it? I mean, are we years away, months away? 253 00:12:47,160 --> 00:12:54,400 Speaker 5: Not years. I mean, I'm reasonably confident that we'll be able 254 00:12:54,440 --> 00:12:58,520 Speaker 5: to get a trillion dollars of waste and fraud out, 255 00:13:00,720 --> 00:13:03,760 Speaker 5: and that, that meaning that we'll have 256 00:13:03,760 --> 00:13:06,080 Speaker 5: a net savings in FY twenty six, which starts 257 00:13:06,120 --> 00:13:06,640 Speaker 5: in October. 258 00:13:06,679 --> 00:13:09,960 Speaker 4: Obviously, of a trillion dollars, provided 259 00:13:10,000 --> 00:13:12,480 Speaker 5: we're allowed to, we're allowed to continue, and 260 00:13:12,520 --> 00:13:13,679 Speaker 5: our progress is not impeded. 261 00:13:13,960 --> 00:13:15,840 Speaker 4: And we're very public about what we do. 262 00:13:15,920 --> 00:13:18,040 Speaker 2: Yeah, put it all on the website. 263 00:13:18,360 --> 00:13:19,760 Speaker 4: How could we be more transparent?
264 00:13:20,320 --> 00:13:23,160 Speaker 5: Literally every action we do, small or large, we put 265 00:13:23,160 --> 00:13:26,319 Speaker 5: on the doge dot gov website, and we post 266 00:13:26,360 --> 00:13:29,880 Speaker 5: on the X handle. And when people complain about it 267 00:13:30,520 --> 00:13:32,480 Speaker 5: and they say, oh, you're doing something unconstitutional, I'm like, well, 268 00:13:32,480 --> 00:13:34,520 Speaker 5: which of these? It's all in the daylight. 269 00:13:35,200 --> 00:13:36,760 Speaker 2: Everyone knows exactly what you're doing. 270 00:13:36,880 --> 00:13:37,920 Speaker 4: Extremely transparent. 271 00:13:38,040 --> 00:13:41,920 Speaker 5: Yeah, I don't think anything's been this transparent, ever. 272 00:13:42,160 --> 00:13:45,880 Speaker 3: So five years ago you were a hero to the left. 273 00:13:46,280 --> 00:13:50,280 Speaker 3: You were cool, you had electric cars, you had space. And 274 00:13:50,360 --> 00:13:52,720 Speaker 3: in five years you've got... 275 00:13:51,600 --> 00:13:53,640 Speaker 4: To go to a party in Hollywood and not get 276 00:13:53,640 --> 00:13:56,760 Speaker 4: dirty looks. Yeah, in fact, yeah. 277 00:13:56,120 --> 00:13:58,160 Speaker 2: And now you might not even get invited. I think 278 00:13:58,160 --> 00:13:59,960 Speaker 2: I'd still get invited, but I don't know if I'd go. 279 00:14:01,960 --> 00:14:04,520 Speaker 3: And I don't think it's an exaggeration to say, today, 280 00:14:05,559 --> 00:14:09,200 Speaker 3: after Donald Trump, the left hates you more than any 281 00:14:09,240 --> 00:14:09,960 Speaker 3: person on earth. 282 00:14:10,520 --> 00:14:12,960 Speaker 5: Yes, I appear to be number two, I mean, if 283 00:14:12,960 --> 00:14:14,960 Speaker 5: you're judged by the various signs. 284 00:14:15,400 --> 00:14:19,800 Speaker 3: The derangements: it's Trump derangement syndrome and Elon derangement syndrome. 285 00:14:20,040 --> 00:14:21,080 Speaker 2: How is that for you?
286 00:14:21,120 --> 00:14:23,800 Speaker 3: That's a little bit of whiplash, going from being 287 00:14:24,000 --> 00:14:27,800 Speaker 3: like mister cool to the devil incarnate in just a 288 00:14:27,840 --> 00:14:29,720 Speaker 3: couple of years. Is that, is that kind of weird, 289 00:14:29,720 --> 00:14:31,200 Speaker 3: to experience that transformation? 290 00:14:31,440 --> 00:14:31,800 Speaker 4: Yes. 291 00:14:32,720 --> 00:14:34,000 Speaker 2: Why do they hate you so much? 292 00:14:34,760 --> 00:14:39,400 Speaker 5: Well, because we're we're clearly over the target. If Doge 293 00:14:39,520 --> 00:14:42,280 Speaker 5: was ineffective, if we were not actually getting rid of 294 00:14:42,560 --> 00:14:44,360 Speaker 5: a bunch of waste and a bunch of 295 00:14:44,360 --> 00:14:48,040 Speaker 5: fraud... I mean, the fraud we're seeing is 296 00:14:48,560 --> 00:14:52,520 Speaker 5: overwhelmingly on the on the left. I mean, it's not 297 00:14:52,640 --> 00:14:56,240 Speaker 5: zero on the right, but these NGOs are almost all 298 00:14:56,600 --> 00:14:58,960 Speaker 5: left wing NGOs that are being funded, for example. Yep. 299 00:15:00,160 --> 00:15:05,160 Speaker 5: So they hate me because Doge is being effective, and 300 00:15:05,320 --> 00:15:08,360 Speaker 5: Doge is getting rid of a lot of waste and fraud 301 00:15:08,480 --> 00:15:11,120 Speaker 5: that people on the left were taking advantage 302 00:15:11,160 --> 00:15:13,200 Speaker 5: of. That's that's that's what it comes down to. 303 00:15:13,360 --> 00:15:16,600 Speaker 5: And and the single biggest thing that they're that they're 304 00:15:16,640 --> 00:15:21,320 Speaker 5: worried about is that Doge is is going to turn 305 00:15:21,440 --> 00:15:27,360 Speaker 5: off fraudulent payments of entitlements.
I mean everything from Social Security, Medicare, uh, 306 00:15:27,680 --> 00:15:33,720 Speaker 5: you know, unemployment, disability, Small Business Administration loans: turn them 307 00:15:33,720 --> 00:15:35,480 Speaker 5: off to illegals. 308 00:15:36,520 --> 00:15:38,040 Speaker 4: This is the crux of the matter. 309 00:15:38,360 --> 00:15:41,080 Speaker 5: Yeah, okay, this is, this is the, this is the 310 00:15:41,120 --> 00:15:43,280 Speaker 5: thing, that's why they really hate my guts, want me 311 00:15:43,320 --> 00:15:43,760 Speaker 5: to die. 312 00:15:44,800 --> 00:15:47,440 Speaker 3: And do you think that's billions? Hundreds of billions? What 313 00:15:47,480 --> 00:15:48,680 Speaker 3: do you think the scale is of that? 314 00:15:49,200 --> 00:15:51,320 Speaker 5: I think across the country, it's, it's in the, it's 315 00:15:51,400 --> 00:15:53,680 Speaker 5: well north of one hundred billion, maybe two hundred billion. 316 00:15:55,160 --> 00:15:56,560 Speaker 4: So, uh. 317 00:15:57,600 --> 00:16:01,080 Speaker 5: By by using entitlements fraud, the Democrats have been able 318 00:16:01,120 --> 00:16:07,120 Speaker 5: to attract and retain vast numbers of illegal immigrants and 319 00:16:07,480 --> 00:16:12,960 Speaker 5: buy voters. And buy voters, exactly. It's basically bringing 320 00:16:13,000 --> 00:16:17,840 Speaker 5: in ten, twenty million people who are beholden to the 321 00:16:17,840 --> 00:16:22,680 Speaker 5: Democrats for government handouts and will vote overwhelmingly Democrat, 322 00:16:22,800 --> 00:16:26,640 Speaker 5: as has been demonstrated in California. This is, it's an 323 00:16:26,640 --> 00:16:30,480 Speaker 5: election strategy. Yes, it's powerful. Yes. And it doesn't take 324 00:16:30,560 --> 00:16:32,800 Speaker 5: much to turn the swing states blue. I mean, often 325 00:16:32,840 --> 00:16:34,960 Speaker 5: a swing state will be won by ten, twenty 326 00:16:35,040 --> 00:16:37,520 Speaker 5: thousand votes.
Sure, so if the Dems can bring in 327 00:16:37,560 --> 00:16:41,680 Speaker 5: two hundred thousand illegals and over time get them legalized, 328 00:16:41,880 --> 00:16:44,240 Speaker 5: not counting any cheating that takes place, because there is 329 00:16:44,240 --> 00:16:48,680 Speaker 5: some cheating, but even without cheating, if 330 00:16:48,680 --> 00:16:50,200 Speaker 5: you have, if you bring in illegals that are ten 331 00:16:50,360 --> 00:16:53,400 Speaker 5: x the vote differential in a swing state, it will 332 00:16:53,400 --> 00:16:56,720 Speaker 5: no longer be a swing state, right, and the Dems 333 00:16:56,760 --> 00:16:58,880 Speaker 5: will win all the swing states. It's just a matter of time, 334 00:17:00,320 --> 00:17:05,080 Speaker 5: and America will be a permanent deep blue socialist state, where 335 00:17:05,160 --> 00:17:09,880 Speaker 5: the House, the Senate, yep, the Presidency, and the Supreme 336 00:17:09,600 --> 00:17:11,520 Speaker 4: Court will all go hardcore left. 337 00:17:12,000 --> 00:17:15,560 Speaker 5: They will then further cement that by bringing in even more 338 00:17:15,640 --> 00:17:19,040 Speaker 5: illegal aliens, so you can't vote your way out of it. 339 00:17:19,280 --> 00:17:22,040 Speaker 5: Their objective is to make it a one party socialist state, 340 00:17:22,400 --> 00:17:24,000 Speaker 5: and it will be much worse than California, because at 341 00:17:24,040 --> 00:17:26,320 Speaker 5: least California is mitigated by the fact that someone can 342 00:17:26,400 --> 00:17:26,960 Speaker 5: leave California. 343 00:17:27,000 --> 00:17:27,800 Speaker 2: You can go to Texas. 344 00:17:27,880 --> 00:17:32,200 Speaker 5: Yeah, exactly. They're going to make everywhere California, but worse. 345 00:17:32,080 --> 00:17:34,320 Speaker 3: By the way, in the middle of the pandemic, I spent 346 00:17:34,400 --> 00:17:36,199 Speaker 3: forty five minutes on the phone with Elon.
347 00:17:36,400 --> 00:17:37,680 Speaker 2: He was still in California. 348 00:17:37,960 --> 00:17:41,199 Speaker 3: I was walking my dog Snowflake and trying to convince 349 00:17:41,240 --> 00:17:44,800 Speaker 3: you to come to Texas. The commies in California can't stand you. 350 00:17:45,240 --> 00:17:47,359 Speaker 3: We love you, we want you here. And you didn't 351 00:17:47,400 --> 00:17:49,800 Speaker 3: quite go then, but you went not that long afterwards. 352 00:17:50,280 --> 00:17:55,600 Speaker 5: I mean, the COVID actions almost killed Tesla, because 353 00:17:55,720 --> 00:17:57,920 Speaker 5: every other auto plant in the country was allowed 354 00:17:57,920 --> 00:18:01,159 Speaker 5: to open, but ours, which was in California, was not 355 00:18:01,160 --> 00:18:02,399 Speaker 5: allowed to open. Wow, wow. 356 00:18:03,560 --> 00:18:04,840 Speaker 4: So they almost killed Tesla. 357 00:18:05,400 --> 00:18:09,919 Speaker 3: So as a personal matter, do you ever regret it? 358 00:18:10,000 --> 00:18:11,920 Speaker 3: Like, five years ago you'd go to the Oscars and 359 00:18:11,960 --> 00:18:14,960 Speaker 3: were mister cool, and now you've got death threats 360 00:18:14,960 --> 00:18:16,000 Speaker 3: every day. Like, do you? 361 00:18:16,119 --> 00:18:19,840 Speaker 4: Well, these days the Oscars are boring. I wouldn't want 362 00:18:19,840 --> 00:18:20,160 Speaker 4: to go. 363 00:18:20,160 --> 00:18:21,560 Speaker 2: God bless, the movies they nominate, 364 00:18:21,600 --> 00:18:23,560 Speaker 3: no one on earth has ever seen. Like, could they 365 00:18:23,560 --> 00:18:25,920 Speaker 3: actually nominate a movie that human beings go watch? 366 00:18:26,480 --> 00:18:29,200 Speaker 5: I mean, how many great movies have come out in 367 00:18:29,200 --> 00:18:30,280 Speaker 5: the last several years? 368 00:18:30,560 --> 00:18:34,159 Speaker 4: Very few, depressingly few. Yeah, very few.
The last Oscars came 369 00:18:34,200 --> 00:18:36,200 Speaker 4: and went, I didn't watch it. There's nothing to see. 370 00:18:36,520 --> 00:18:39,520 Speaker 3: I was sad that Gene Hackman just passed away, because 371 00:18:39,600 --> 00:18:41,960 Speaker 3: Unforgiven was spectacular, but that was a long time ago. 372 00:18:42,280 --> 00:18:43,040 Speaker 2: Unforgiven came out. 373 00:18:43,480 --> 00:18:48,520 Speaker 1: You've mentioned today here and before about the possibility of 374 00:18:48,920 --> 00:18:49,680 Speaker 1: someone wanting to 375 00:18:49,600 --> 00:18:53,000 Speaker 4: take you out. Dealing with the death threats, we see 376 00:18:53,160 --> 00:18:54,879 Speaker 4: it's not in my imagination. You can just look on 377 00:18:54,920 --> 00:18:55,400 Speaker 4: social media. 378 00:18:55,480 --> 00:18:59,680 Speaker 2: Yeah, but like, is it... because it's very clear. Yeah. And look, 379 00:18:59,720 --> 00:19:00,960 Speaker 2: I'm very familiar with... 380 00:19:01,040 --> 00:19:04,480 Speaker 5: And they've got signs, the people with signs at demonstrations 381 00:19:04,520 --> 00:19:05,440 Speaker 5: saying that I need to die. 382 00:19:06,480 --> 00:19:09,960 Speaker 3: Do you think these are just whack jobs, or do 383 00:19:10,040 --> 00:19:14,480 Speaker 3: you think there are foreign people... do you think there 384 00:19:14,480 --> 00:19:16,879 Speaker 3: are foreign entities behind this? Do you think there are domestic 385 00:19:17,040 --> 00:19:19,879 Speaker 3: entities behind the threats? And also the attacks, 386 00:19:20,480 --> 00:19:23,360 Speaker 3: not to Twitter but Tesla. I mean, you know, you're 387 00:19:23,400 --> 00:19:27,680 Speaker 3: getting Tesla's charging stations lit on fire. Do you think 388 00:19:27,760 --> 00:19:29,360 Speaker 3: that's organized and paid for?
389 00:19:30,320 --> 00:19:32,320 Speaker 5: Yes, at least some of it is organized and paid 390 00:19:32,320 --> 00:19:40,159 Speaker 5: for, I think by domestic, you know, US-based left wing 391 00:19:40,320 --> 00:19:47,159 Speaker 5: organizations in America funded by left wing billionaires, essentially. Is 392 00:19:47,160 --> 00:19:49,520 Speaker 5: it like ActBlue or what? ActBlue is one 393 00:19:49,560 --> 00:19:54,640 Speaker 5: of them. You know, Arabella, you know, the classic. It's 394 00:19:54,640 --> 00:20:00,240 Speaker 5: funded by, you know, the blue-based, the 395 00:20:00,720 --> 00:20:03,480 Speaker 5: left wing, and the globalists. 396 00:20:04,080 --> 00:20:06,040 Speaker 1: While we may have won this election, the fight to 397 00:20:06,080 --> 00:20:09,119 Speaker 1: restore a great nation is only beginning, and now is 398 00:20:09,160 --> 00:20:12,119 Speaker 1: the time to take a stand, and Patriot Mobile is 399 00:20:12,200 --> 00:20:16,080 Speaker 1: leading the charge. It has never been easier to switch 400 00:20:16,119 --> 00:20:19,399 Speaker 1: your cell phone than in twenty twenty five. With technology, 401 00:20:19,480 --> 00:20:22,280 Speaker 1: you can do it literally over the phone with a 402 00:20:22,320 --> 00:20:25,760 Speaker 1: code they send you. It is amazing. But here's the 403 00:20:25,800 --> 00:20:29,200 Speaker 1: reason why I use Patriot Mobile now. One, I save 404 00:20:29,280 --> 00:20:31,359 Speaker 1: money over what I was paying before with Big Mobile. 405 00:20:31,920 --> 00:20:35,320 Speaker 1: And two, I'm no longer supporting the radical left. Now, 406 00:20:35,400 --> 00:20:38,560 Speaker 1: I didn't realize until someone told me just how radical 407 00:20:38,560 --> 00:20:41,639 Speaker 1: Big Mobile had gotten.
Where they are supporting organizations that 408 00:20:41,680 --> 00:20:46,320 Speaker 1: pay for abortions, or supporting radical Democrat causes and organizations. 409 00:20:46,520 --> 00:20:49,080 Speaker 1: They're helping Democrats get elected at the local, state, 410 00:20:49,200 --> 00:20:52,439 Speaker 1: and national level. That is actually why, over a decade 411 00:20:52,440 --> 00:20:55,960 Speaker 1: ago, Patriot Mobile was started, and they are now America's 412 00:20:56,080 --> 00:21:01,600 Speaker 1: only Christian conservative wireless provider. Talk about coverage: in twenty 413 00:21:01,680 --> 00:21:06,320 Speaker 1: twenty five, you have zero problems with coverage anymore, because, well, 414 00:21:06,440 --> 00:21:09,120 Speaker 1: Patriot Mobile uses all three major networks. 415 00:21:09,200 --> 00:21:10,240 Speaker 2: What does that mean for you? 416 00:21:10,359 --> 00:21:12,919 Speaker 1: It means whatever coverage you're used to now is going 417 00:21:13,000 --> 00:21:15,760 Speaker 1: to stay exactly the same, or you may even be 418 00:21:15,800 --> 00:21:18,440 Speaker 1: able to improve it. But there will not be any 419 00:21:18,560 --> 00:21:20,760 Speaker 1: issues when it comes to coverage. In fact, they have 420 00:21:20,800 --> 00:21:24,000 Speaker 1: a coverage guarantee. But when you pay your bill, this 421 00:21:24,080 --> 00:21:26,960 Speaker 1: is where the magic happens. They take a portion of 422 00:21:27,000 --> 00:21:29,119 Speaker 1: your bill every month and they give it back to 423 00:21:29,240 --> 00:21:32,760 Speaker 1: organizations that support our first and our second amendments, that 424 00:21:32,840 --> 00:21:36,440 Speaker 1: support the sanctity of life, that support our veterans, our 425 00:21:36,480 --> 00:21:39,919 Speaker 1: first responders, and our wounded warriors. And like I said, 426 00:21:40,320 --> 00:21:43,600 Speaker 1: switching with technology now has never been easier.
You keep 427 00:21:43,640 --> 00:21:46,320 Speaker 1: the same phone you have now, keep your same phone number, 428 00:21:46,560 --> 00:21:48,280 Speaker 1: and you can even upgrade to a new phone if 429 00:21:48,280 --> 00:21:51,240 Speaker 1: you want to. They have a one hundred percent US 430 00:21:51,320 --> 00:21:54,320 Speaker 1: based customer support team to help you find the perfect 431 00:21:54,359 --> 00:21:56,560 Speaker 1: plan and to save money. And if you own a 432 00:21:56,640 --> 00:21:58,639 Speaker 1: business or a small business, they have a division that 433 00:21:59,000 --> 00:22:02,520 Speaker 1: just works on switching multiple lines over easily. So 434 00:22:02,720 --> 00:22:08,320 Speaker 1: right now, go to Patriotmobile dot com slash Ferguson, or 435 00:22:08,480 --> 00:22:11,200 Speaker 1: call them, nine seven two Patriot. You're going to 436 00:22:11,240 --> 00:22:13,960 Speaker 1: get a free month of service with the promo code Ferguson. 437 00:22:14,240 --> 00:22:16,439 Speaker 1: So switch to Patriot Mobile and make a difference with 438 00:22:16,600 --> 00:22:20,520 Speaker 1: every call you make. Patriotmobile dot com slash Ferguson, 439 00:22:21,040 --> 00:22:24,000 Speaker 1: or nine seven two Patriot. How big of a 440 00:22:24,080 --> 00:22:26,560 Speaker 1: threat is it to, like, what you've built at Tesla? I mean, 441 00:22:26,600 --> 00:22:28,800 Speaker 1: I remember when Teslas came out, it was people that 442 00:22:29,280 --> 00:22:31,480 Speaker 1: didn't want to have gas cars. A lot of 443 00:22:31,480 --> 00:22:34,280 Speaker 1: it was environmental reasons. I jokingly said, I was like, 444 00:22:34,600 --> 00:22:36,160 Speaker 1: I'm a Texas guy, I'm always going to have something 445 00:22:36,160 --> 00:22:39,399 Speaker 1: that burns gas. My kids now, all three of my 446 00:22:39,480 --> 00:22:44,600 Speaker 1: boys, think that Teslas are awesome.
The Cybertruck 447 00:22:44,760 --> 00:22:46,880 Speaker 1: is the car they want their dad to buy, which 448 00:22:47,040 --> 00:22:49,760 Speaker 1: I laughed at, because I never could have imagined that five 449 00:22:49,880 --> 00:22:50,400 Speaker 1: years ago. 450 00:22:50,880 --> 00:22:52,520 Speaker 2: And now I'm looking at... we're at the 451 00:22:52,480 --> 00:22:55,639 Speaker 3: White House, and the President's Tesla's parked... yeah, meaning, which is 452 00:22:55,680 --> 00:22:56,320 Speaker 3: the fullest stand. 453 00:22:56,359 --> 00:22:57,680 Speaker 4: But I mean, you've 454 00:22:57,280 --> 00:22:59,359 Speaker 1: changed a generation. When you look at my kids, who are 455 00:22:59,400 --> 00:23:02,200 Speaker 1: six and eight, they're going, Dad, buy a Cybertruck, 456 00:23:02,680 --> 00:23:05,960 Speaker 1: and I'm considering it. That's a full circle 457 00:23:06,080 --> 00:23:06,840 Speaker 1: in a weird way. 458 00:23:07,320 --> 00:23:08,119 Speaker 4: Yeah. 459 00:23:08,200 --> 00:23:09,880 Speaker 5: Well, I do have this theory that the most entertaining 460 00:23:09,880 --> 00:23:15,520 Speaker 5: outcome is the most likely. So, yeah, it seems often 461 00:23:15,560 --> 00:23:18,920 Speaker 5: to be true. Like, what twist or 462 00:23:18,960 --> 00:23:22,720 Speaker 5: turn of fate would have the highest ratings? If this 463 00:23:22,840 --> 00:23:25,600 Speaker 5: were a TV show, what twist or turn of 464 00:23:25,640 --> 00:23:27,880 Speaker 5: fate would generate the highest ratings? There's a good 465 00:23:27,920 --> 00:23:28,600 Speaker 5: chance that happens. 466 00:23:28,840 --> 00:23:33,240 Speaker 3: Well, I will say, if ActBlue and the Arabella 467 00:23:32,800 --> 00:23:35,720 Speaker 4: network... ActBlue is a huge scam, next level. Do 468 00:23:35,640 --> 00:23:37,400 Speaker 2: you think it's foreign money? Chinese money?
469 00:23:37,680 --> 00:23:39,199 Speaker 3: Where do you think the money in ActBlue is 470 00:23:39,200 --> 00:23:40,639 Speaker 3: coming from? How do you figure that out? 471 00:23:41,119 --> 00:23:43,360 Speaker 5: Well, it's not coming from a whole bunch 472 00:23:43,400 --> 00:23:47,640 Speaker 5: of... from a groundswell of public support, because when 473 00:23:48,359 --> 00:23:51,080 Speaker 5: individual donors on ActBlue were looked at, a bunch 474 00:23:51,080 --> 00:23:53,960 Speaker 5: of them turned out to be, like, diehard Republicans, people 475 00:23:54,000 --> 00:23:56,000 Speaker 5: who have never given money in their life. So you can 476 00:23:56,040 --> 00:23:58,120 Speaker 5: track down a bunch of these people where it says, oh, 477 00:23:58,119 --> 00:23:59,240 Speaker 5: I gave sixteen 478 00:23:58,920 --> 00:24:01,240 Speaker 4: thousand dollars, and they're like, I didn't give sixteen thousand dollars. 479 00:24:01,280 --> 00:24:02,840 Speaker 4: We're talking about this. 480 00:24:03,680 --> 00:24:05,679 Speaker 5: Well, those staunch con friends of mine who 481 00:24:05,720 --> 00:24:09,919 Speaker 5: found themselves on the ActBlue list, like... so that's... 482 00:24:09,920 --> 00:24:12,760 Speaker 3: If it can actually be shown that they are funding 483 00:24:12,880 --> 00:24:14,720 Speaker 3: firebombing of Tesla 484 00:24:14,440 --> 00:24:15,440 Speaker 2: charging stations, 485 00:24:16,720 --> 00:24:20,240 Speaker 3: that's objectively a criminal act. That is funding terrorist activity, 486 00:24:20,280 --> 00:24:24,400 Speaker 3: and the statutes make clear that an incendiary device qualifies, 487 00:24:24,520 --> 00:24:27,840 Speaker 3: so that, hands down, is a terrorist activity. Yeah. Let me 488 00:24:27,880 --> 00:24:34,359 Speaker 3: ask about AI. In ten years, how is life going to 489 00:24:34,400 --> 00:24:35,600 Speaker 3: be different because of AI, 490 00:24:35,840 --> 00:24:37,480 Speaker 2: for just a normal person?
491 00:24:38,960 --> 00:24:42,880 Speaker 5: Well, ten years is a long time. In ten years, 492 00:24:42,880 --> 00:24:48,920 Speaker 5: AI could probably do anything better than a human can, cognitively, 493 00:24:49,440 --> 00:24:52,720 Speaker 5: probably, almost. I think in ten years, based on the 494 00:24:52,720 --> 00:24:55,560 Speaker 5: current rate of improvement, AI will be smarter than the 495 00:24:55,600 --> 00:24:56,280 Speaker 5: smartest human. 496 00:24:57,760 --> 00:24:57,960 Speaker 2: Yeah. 497 00:24:58,080 --> 00:25:02,120 Speaker 5: Yeah. There will also be a massive number of robots, 498 00:25:02,920 --> 00:25:04,120 Speaker 5: so humanoid robots. 499 00:25:04,600 --> 00:25:06,000 Speaker 3: By the way, I've got to ask, how come your 500 00:25:06,080 --> 00:25:08,760 Speaker 3: robots look so much like the creepy robots from I, Robot? 501 00:25:09,640 --> 00:25:11,399 Speaker 2: Was that intentional, or just... uh? 502 00:25:13,040 --> 00:25:14,800 Speaker 1: I was hoping he was gonna say, yeah, just to mess 503 00:25:14,840 --> 00:25:15,040 Speaker 1: with you. 504 00:25:16,800 --> 00:25:19,959 Speaker 5: It's not meant to look like any prior robot, 505 00:25:20,720 --> 00:25:24,239 Speaker 5: and we'll iterate the design, and you'll be able to... 506 00:25:24,480 --> 00:25:27,280 Speaker 5: a lot of the robot parts are cosmetic. You'll 507 00:25:27,320 --> 00:25:30,760 Speaker 5: be able to switch out the kind of snap-on 508 00:25:31,000 --> 00:25:33,240 Speaker 5: cosmetic parts of the robot, make it look like something 509 00:25:33,240 --> 00:25:39,800 Speaker 5: else if you like. So there'll ultimately be billions of 510 00:25:39,840 --> 00:25:43,120 Speaker 5: humanoid robots, and all cars will be self-driving. 511 00:25:44,560 --> 00:25:45,240 Speaker 2: In ten years? 512 00:25:47,240 --> 00:25:51,840 Speaker 5: In ten years, probably ninety percent of miles driven will 513 00:25:51,880 --> 00:25:53,040 Speaker 5: be autonomous.
514 00:25:53,160 --> 00:25:54,639 Speaker 2: Huh, wow, that fast. 515 00:25:55,720 --> 00:26:01,679 Speaker 5: Yeah. In five years, probably fifty percent of miles driven will be autonomous. 516 00:26:01,720 --> 00:26:04,560 Speaker 3: Now, if AI will be smarter than any person, how 517 00:26:04,600 --> 00:26:08,840 Speaker 3: many jobs go away because of that? And what do 518 00:26:08,960 --> 00:26:11,040 Speaker 3: people do? If you've got billions of people that are 519 00:26:11,119 --> 00:26:13,359 Speaker 3: losing their jobs like that, a lot of people are 520 00:26:13,560 --> 00:26:15,040 Speaker 3: understandably freaked out about that. 521 00:26:15,920 --> 00:26:22,320 Speaker 5: Well, goods and services will become close to free, 522 00:26:23,200 --> 00:26:26,040 Speaker 5: so it's not as though people will be wanting in 523 00:26:26,119 --> 00:26:27,240 Speaker 5: terms of goods and services. 524 00:26:29,320 --> 00:26:30,080 Speaker 2: So why is that? 525 00:26:30,080 --> 00:26:30,560 Speaker 4: What? 526 00:26:30,600 --> 00:26:33,119 Speaker 3: Why are goods and services free in an AI world, 527 00:26:34,080 --> 00:26:34,840 Speaker 3: or close to free? 528 00:26:35,880 --> 00:26:38,800 Speaker 5: Well, you have, I don't know, call it tens of 529 00:26:38,800 --> 00:26:42,919 Speaker 5: billions of robots, and they will make you 530 00:26:43,000 --> 00:26:48,480 Speaker 5: anything or provide any service you want for basically next 531 00:26:48,480 --> 00:26:54,120 Speaker 5: to nothing. It's not that people will 532 00:26:54,160 --> 00:26:55,840 Speaker 5: have a lower standard of living. They'll actually have a much 533 00:26:55,880 --> 00:27:02,480 Speaker 5: higher standard of living. The challenge will be fulfillment: how 534 00:27:02,480 --> 00:27:05,640 Speaker 5: do you derive fulfillment and meaning in life? 535 00:27:05,960 --> 00:27:11,399 Speaker 3: Is Skynet real? Like, do you get the apocalyptic visions of AI?
536 00:27:12,720 --> 00:27:17,480 Speaker 3: How real is the prospect of killer robots annihilating humanity? 537 00:27:18,560 --> 00:27:22,480 Speaker 3: Twenty percent likely? Maybe ten percent? On what timeframe? 538 00:27:23,280 --> 00:27:26,000 Speaker 4: Five, ten years. So soon. 539 00:27:26,119 --> 00:27:28,800 Speaker 3: Like, you see a world where that's possible? 540 00:27:30,119 --> 00:27:31,600 Speaker 5: Yeah. But I mean, you can look at it like 541 00:27:31,640 --> 00:27:36,280 Speaker 5: the glass is eighty, ninety percent full, meaning, like, eighty percent 542 00:27:36,520 --> 00:27:40,600 Speaker 5: likely we'll have extreme prosperity for all. 543 00:27:41,720 --> 00:27:44,119 Speaker 3: Now, I guess my view is, we're in a race to 544 00:27:44,560 --> 00:27:47,840 Speaker 3: win AI. We're in a race with China, and my 545 00:27:47,960 --> 00:27:49,760 Speaker 3: view is, if there are going to be killer robots, I'd 546 00:27:49,840 --> 00:27:54,560 Speaker 3: rather they be American killer robots than Chinese. How likely 547 00:27:55,240 --> 00:27:57,560 Speaker 3: are we to win? Is America winning right now? 548 00:27:57,600 --> 00:27:59,800 Speaker 3: And how likely is America to win the race for 549 00:27:59,800 --> 00:28:02,879 Speaker 3: AI vis-a-vis China or anyone else? 550 00:28:02,960 --> 00:28:05,720 Speaker 5: For the next few years, I think America is likely to win. 551 00:28:06,080 --> 00:28:08,639 Speaker 5: Then it will be a function of who controls the 552 00:28:08,640 --> 00:28:13,680 Speaker 5: AI chip fabrication, the factories that make the AI chips. 553 00:28:13,680 --> 00:28:16,800 Speaker 5: Who controls them? If more of 554 00:28:16,840 --> 00:28:18,800 Speaker 5: them are controlled by China, then China will win. 555 00:28:20,320 --> 00:28:22,959 Speaker 3: More of the factories that are making the AI chips, 556 00:28:23,200 --> 00:28:27,160 Speaker 3: you think that will determine it?
Yes. And how are 557 00:28:27,160 --> 00:28:29,000 Speaker 3: we doing versus China on that front? 558 00:28:30,080 --> 00:28:35,800 Speaker 5: Well, right now, almost all the advanced AI chip factories, 559 00:28:35,800 --> 00:28:40,719 Speaker 5: they call them fabs, are in Taiwan. And what if 560 00:28:40,800 --> 00:28:43,440 Speaker 5: China invades, a hundred miles away? Yeah, what 561 00:28:43,440 --> 00:28:44,440 Speaker 2: happens if China... 562 00:28:44,480 --> 00:28:47,880 Speaker 3: if China invades Taiwan, what happens to the world? 563 00:28:50,160 --> 00:28:52,640 Speaker 5: Well, if they were to invade in the near term, 564 00:28:53,800 --> 00:28:57,480 Speaker 5: the world would be cut off from advanced AI chips, 565 00:28:58,680 --> 00:29:02,800 Speaker 5: and currently one hundred percent of advanced AI chips are made in Taiwan. 566 00:29:02,720 --> 00:29:04,640 Speaker 1: How fast can we put that online in America? How 567 00:29:04,680 --> 00:29:06,160 Speaker 1: important is that for national security? 568 00:29:07,440 --> 00:29:10,480 Speaker 5: I think it's essential for national security, and we're not 569 00:29:10,520 --> 00:29:11,080 Speaker 5: doing enough. 570 00:29:11,920 --> 00:29:14,880 Speaker 3: You're fifty three years old. I'm one hundred and eighteen 571 00:29:14,960 --> 00:29:16,840 Speaker 3: days older than you. And what the hell have I 572 00:29:16,880 --> 00:29:17,520 Speaker 3: done in my life? 573 00:29:17,640 --> 00:29:18,200 Speaker 4: I know, right? 574 00:29:19,000 --> 00:29:20,080 Speaker 2: Fifty three years old. 575 00:29:20,400 --> 00:29:20,960 Speaker 4: Pretty well. 576 00:29:22,640 --> 00:29:25,920 Speaker 3: Well, seventy one was a great year, and I 577 00:29:25,960 --> 00:29:28,840 Speaker 3: was December seventy, just right before 578 00:29:29,240 --> 00:29:30,560 Speaker 3: you; you were the summer of seventy one. 579 00:29:32,000 --> 00:29:33,960 Speaker 4: I was born sixty nine days after four twenty.
580 00:29:34,840 --> 00:29:39,560 Speaker 2: Wow. I did ask Ben, this is true, look. All right. 581 00:29:41,360 --> 00:29:42,200 Speaker 2: I did ask Ben, 582 00:29:42,240 --> 00:29:44,560 Speaker 3: should I show up and pull out a joint and say, 583 00:29:44,560 --> 00:29:47,120 Speaker 3: can we beat Rogan's views? But I was pretty sure 584 00:29:48,280 --> 00:29:51,280 Speaker 3: it might cause a scandal if we spent a podcast... 585 00:29:51,560 --> 00:29:56,280 Speaker 4: It would just turn out to be, like, a chocolate cigar. Yeah. 586 00:29:56,480 --> 00:29:58,600 Speaker 3: Let me ask you, if today was your last 587 00:29:58,680 --> 00:29:59,240 Speaker 3: day on Earth... 588 00:29:59,480 --> 00:30:00,719 Speaker 4: Yeah, what? 589 00:30:00,920 --> 00:30:02,920 Speaker 3: I'm not suggesting it's going to be, but if it were, 590 00:30:02,920 --> 00:30:04,360 Speaker 3: what do you think your biggest legacy would be, of 591 00:30:04,360 --> 00:30:07,360 Speaker 3: everything you've done, one hundred years from now? What do 592 00:30:07,400 --> 00:30:10,560 Speaker 3: you think people would remember, if it 593 00:30:10,600 --> 00:30:11,440 Speaker 3: were zero 594 00:30:11,360 --> 00:30:13,560 Speaker 1: to today? And were you ever going to space? 595 00:30:15,160 --> 00:30:18,040 Speaker 5: In the distant future, one hundred or one 596 00:30:18,040 --> 00:30:22,080 Speaker 5: thousand years from now, if SpaceX got humans to Mars, that's 597 00:30:22,120 --> 00:30:23,640 Speaker 5: what they would remember me for. 598 00:30:24,920 --> 00:30:26,600 Speaker 2: All right, final set of questions. 599 00:30:26,760 --> 00:30:30,320 Speaker 3: Who's the smartest guy you've ever met? You hang out 600 00:30:30,320 --> 00:30:32,600 Speaker 3: with some brilliant people. Like, when you look at 601 00:30:32,640 --> 00:30:35,320 Speaker 3: the CEOs, other than yourself, 602 00:30:35,600 --> 00:30:39,120 Speaker 2: what CEO do you say,
Damn, that guy's good. 603 00:30:40,800 --> 00:30:46,600 Speaker 5: Larry Ellison's very smart. So I'd say Larry Ellison's one 604 00:30:46,600 --> 00:30:51,840 Speaker 5: of the smartest people. You know, Larry Page. I mean, 605 00:30:51,840 --> 00:30:52,960 Speaker 5: there are a lot of people that are very smart. 606 00:30:52,960 --> 00:30:56,200 Speaker 5: It's hard to say, like, you know, is someone 607 00:30:56,240 --> 00:31:01,440 Speaker 5: as smart as so-and-so. You 608 00:31:01,520 --> 00:31:05,360 Speaker 5: know, what have they done that is difficult 609 00:31:06,600 --> 00:31:12,680 Speaker 5: and significant? You know, Jeff Bezos has done a lot 610 00:31:12,720 --> 00:31:17,160 Speaker 5: of difficult and significant things. I mean, there are a 611 00:31:17,160 --> 00:31:20,600 Speaker 5: lot of smart humans. I call them smart 612 00:31:20,600 --> 00:31:22,240 Speaker 5: for a human. A lot of people are in 613 00:31:22,280 --> 00:31:23,600 Speaker 5: the smart-for-a-human category. 614 00:31:24,120 --> 00:31:27,800 Speaker 2: All right, final lightning round: Star Wars or Star Trek? 615 00:31:29,040 --> 00:31:31,160 Speaker 5: The first movie I saw in the theater was Star Wars, 616 00:31:31,200 --> 00:31:32,880 Speaker 5: so I think it had a profound effect on me. 617 00:31:34,000 --> 00:31:37,680 Speaker 5: I was six years old, I think. Imagine, the first 618 00:31:37,720 --> 00:31:40,920 Speaker 5: movie you ever see in a theater is Star Wars. 619 00:31:40,960 --> 00:31:41,880 Speaker 5: It's going to blow your mind. 620 00:31:41,960 --> 00:31:43,360 Speaker 2: Best Star Wars movie? 621 00:31:46,560 --> 00:31:47,560 Speaker 4: Empire Strikes Back. 622 00:31:47,800 --> 00:31:50,440 Speaker 3: The only objectively right answer. I stood in line for 623 00:31:50,520 --> 00:31:52,640 Speaker 3: three hours with my dad to see it on opening day. 624 00:31:53,480 --> 00:31:54,240 Speaker 2: Kirk or Picard?
625 00:31:55,880 --> 00:31:57,240 Speaker 4: I like them both, but Kirk. 626 00:31:57,840 --> 00:32:01,680 Speaker 3: Again, the objectively right answer. By the way, James T. Kirk 627 00:32:02,200 --> 00:32:04,600 Speaker 3: is a Republican and Picard is a Democrat, and 628 00:32:04,640 --> 00:32:08,160 Speaker 3: the left gets very mad when I say that. Best 629 00:32:08,160 --> 00:32:10,720 Speaker 3: Star Trek movie? I mean... 630 00:32:10,600 --> 00:32:16,560 Speaker 4: The original... the first Star Trek movie, that's okay. Wrath of Khan. 631 00:32:18,000 --> 00:32:20,680 Speaker 5: Both Wrath of Khans were pretty good, 632 00:32:20,840 --> 00:32:22,360 Speaker 5: but yeah, the original Wrath of 633 00:32:22,120 --> 00:32:28,120 Speaker 3: Khan. Ricardo Montalban. Revenge is a dish best served cold. 634 00:32:28,200 --> 00:32:30,400 Speaker 2: It is very cold in space. 635 00:32:31,120 --> 00:32:34,680 Speaker 3: Although I will say Wrath of Khan is objectively the right answer, 636 00:32:34,720 --> 00:32:36,840 Speaker 3: Four is a sleeper, when they go back 637 00:32:36,880 --> 00:32:40,160 Speaker 3: to San Francisco and go find the whales, 638 00:32:40,200 --> 00:32:43,320 Speaker 3: and, you know, Scotty picks up 639 00:32:43,360 --> 00:32:45,440 Speaker 3: the mouse and talks to it and goes, a keyboard, 640 00:32:45,680 --> 00:32:48,120 Speaker 2: how quaint. That's a sleeper. 641 00:32:48,160 --> 00:32:48,480 Speaker 4: All right. 642 00:32:48,560 --> 00:32:51,000 Speaker 3: Last question: did Han shoot first? 643 00:32:52,840 --> 00:32:54,400 Speaker 4: It seemed like he shot second. 644 00:32:55,320 --> 00:32:56,000 Speaker 2: This is Verdict.
645 00:32:56,080 --> 00:32:58,560 Speaker 3: And by the way, I apologize to Ben. Ben was 646 00:32:58,600 --> 00:33:00,959 Speaker 3: a jock and played tennis at Ole Miss, and so 647 00:33:00,960 --> 00:33:03,440 Speaker 3: occasionally, when we geek out a little, I 648 00:33:03,480 --> 00:33:04,280 Speaker 3: love watching. 649 00:33:04,080 --> 00:33:08,280 Speaker 5: Y'all geek out over there. Still on the question: 650 00:33:08,320 --> 00:33:11,040 Speaker 5: the alien missed his blaster shot. So why did 651 00:33:11,080 --> 00:33:12,800 Speaker 5: he miss his blaster shot? Must have been because he 652 00:33:12,880 --> 00:33:16,080 Speaker 5: got shot first, and he's missing a point blank blaster shot 653 00:33:16,200 --> 00:33:18,040 Speaker 5: because he got knocked off kilter. 654 00:33:18,040 --> 00:33:21,880 Speaker 3: But it's a question of, really, is Han 655 00:33:22,040 --> 00:33:24,880 Speaker 3: Solo simply a hero or an anti-hero? And 656 00:33:24,960 --> 00:33:27,720 Speaker 3: so I'm in the Han-shot-first category. I think... 657 00:33:27,880 --> 00:33:29,240 Speaker 3: I don't like sanitized stories. 658 00:33:29,320 --> 00:33:30,720 Speaker 5: He would have had to have shot first, because... 659 00:33:30,800 --> 00:33:33,000 Speaker 5: why would the alien miss at point 660 00:33:33,040 --> 00:33:33,760 Speaker 5: blank range? 661 00:33:34,200 --> 00:33:36,120 Speaker 1: Are you ever going to go to outer space? Is 662 00:33:36,160 --> 00:33:37,240 Speaker 1: that a thing in your life goals? 663 00:33:37,600 --> 00:33:39,040 Speaker 4: Yeah, I'd like to go to Mars at some point. 664 00:33:39,120 --> 00:33:42,600 Speaker 5: And people have said, do I want to die on Mars, 665 00:33:43,400 --> 00:33:45,040 Speaker 5: and I say yes, just not on impact. 666 00:33:45,880 --> 00:33:47,280 Speaker 2: Now, that's a very good answer.
667 00:33:47,760 --> 00:33:52,000 Speaker 3: The astronauts on the space station, are they political prisoners? 668 00:33:52,960 --> 00:33:56,120 Speaker 3: Some of them are, because you could have given 669 00:33:56,160 --> 00:33:57,080 Speaker 3: them a ride back, and... 670 00:33:58,840 --> 00:34:01,040 Speaker 2: Joe Biden said no, surely for politics. 671 00:34:01,400 --> 00:34:04,840 Speaker 5: Yeah, I mean, you know, there's been some debate about 672 00:34:04,840 --> 00:34:07,160 Speaker 5: this online. But the thing is that it was 673 00:34:07,240 --> 00:34:10,439 Speaker 5: a very high level decision, so it wasn't really even 674 00:34:10,440 --> 00:34:11,240 Speaker 5: a NASA decision. 675 00:34:11,400 --> 00:34:12,040 Speaker 4: It was just 676 00:34:12,040 --> 00:34:15,680 Speaker 5: that the Biden White House did not want to have 677 00:34:16,320 --> 00:34:21,040 Speaker 5: someone who is pro Trump rescuing astronauts right before the election, 678 00:34:22,080 --> 00:34:22,719 Speaker 5: so they pushed it. 679 00:34:22,840 --> 00:34:24,400 Speaker 3: Well, if you're one of those astronauts, you've got to 680 00:34:24,400 --> 00:34:25,640 Speaker 3: be pretty pissed off about that. 681 00:34:26,040 --> 00:34:32,000 Speaker 5: Well, if they're a Democrat... yes, if a Democrat, like, everything's fine, 682 00:34:32,080 --> 00:34:34,400 Speaker 5: fair enough. So I think one of them is a 683 00:34:34,440 --> 00:34:37,040 Speaker 5: Republican and one is a Democrat. It depends on which one you ask. 684 00:34:37,360 --> 00:34:39,640 Speaker 3: Well, thank you, Elon. This was awesome. And 685 00:34:39,960 --> 00:34:43,080 Speaker 3: let me say, by the way, I put out 686 00:34:43,120 --> 00:34:45,640 Speaker 3: on X the day before yesterday: if you were having 687 00:34:45,640 --> 00:34:47,480 Speaker 3: a beer with Elon and could ask him anything, what 688 00:34:47,520 --> 00:34:50,719 Speaker 3: would you ask? And I got lots of responses.
The most 689 00:34:50,760 --> 00:34:54,520 Speaker 3: common response people gave was: say thank you. Look, 690 00:34:54,760 --> 00:34:57,759 Speaker 3: Texans and the American people appreciate what you're doing. You 691 00:34:57,800 --> 00:35:00,120 Speaker 3: don't have to put up with this BS, and you're doing it. 692 00:35:00,320 --> 00:35:02,920 Speaker 3: I'm grateful. You're making a hell of a difference for 693 00:35:02,960 --> 00:35:06,120 Speaker 3: this country. I appreciate you, and Americans appreciate you. 694 00:35:06,360 --> 00:35:08,640 Speaker 5: Yeah, it's essential for the future of civilization, otherwise I 695 00:35:08,680 --> 00:35:10,399 Speaker 5: wouldn't be doing it. Yes, it's not like I want 696 00:35:10,440 --> 00:35:11,360 Speaker 5: to get death threats, you know. 697 00:35:11,680 --> 00:35:11,759 Speaker 2: No. 698 00:35:12,080 --> 00:35:14,640 Speaker 1: All right now, part two of this interview with Elon 699 00:35:14,719 --> 00:35:17,520 Speaker 1: Musk, and some bigger breaking news that we're going to 700 00:35:17,600 --> 00:35:20,680 Speaker 1: have for you about corruption in the government, will hit on 701 00:35:21,600 --> 00:35:24,440 Speaker 1: Wednesday morning. So make sure you hit that subscribe or 702 00:35:24,480 --> 00:35:27,880 Speaker 1: auto download button right now, and again, please help this 703 00:35:27,960 --> 00:35:30,960 Speaker 1: show go viral so that we can continue to expose 704 00:35:31,000 --> 00:35:34,720 Speaker 1: government waste. Wherever you are on social media, hit that little 705 00:35:34,719 --> 00:35:38,719 Speaker 1: forward button and post this episode on social media, and 706 00:35:38,760 --> 00:35:40,520 Speaker 1: I'll see you back here tomorrow.