1 00:00:00,040 --> 00:00:01,279 Speaker 1: Thanks to all of you for being with us. 2 00:00:01,320 --> 00:00:04,920 Speaker 2: Write down our toll-free telephone number if you want 3 00:00:05,000 --> 00:00:07,560 Speaker 2: to be a part of the program. It's eight hundred 4 00:00:07,600 --> 00:00:11,039 Speaker 2: nine four one Sean if you want to 5 00:00:11,600 --> 00:00:14,520 Speaker 2: join us. Uh, wow. We're learning a lot about the 6 00:00:14,520 --> 00:00:17,800 Speaker 2: Epstein files. We're learning a lot about why Joe Biden 7 00:00:17,880 --> 00:00:21,239 Speaker 2: didn't release them for the four years, and we're also 8 00:00:21,360 --> 00:00:25,040 Speaker 2: learning why a bunch of so-called conservatives have been 9 00:00:25,120 --> 00:00:28,960 Speaker 2: so, you know, in a frenzy. You know, why were 10 00:00:28,960 --> 00:00:31,280 Speaker 2: they so quiet for four years? Why didn't they demand 11 00:00:31,320 --> 00:00:33,760 Speaker 2: the release of all this when Biden was president? Because 12 00:00:33,800 --> 00:00:38,520 Speaker 2: now it's impacting Hakeem Jeffries, one congressman, a congresswoman 13 00:00:38,600 --> 00:00:42,360 Speaker 2: taking questions directly during a hearing with Michael Cohen, apparently, 14 00:00:42,400 --> 00:00:47,120 Speaker 2: according to reports and information in the documents, verbatim, 15 00:00:47,240 --> 00:00:51,360 Speaker 2: asking the questions of Jeffrey Epstein at that time, a 16 00:00:51,600 --> 00:00:58,240 Speaker 2: convicted felon for, what was it, soliciting prostitution from 17 00:00:58,280 --> 00:01:00,520 Speaker 2: a minor? By the way, you know, the Bible is 18 00:01:00,600 --> 00:01:03,560 Speaker 2: very clear: you don't hurt the... look, don't hurt children.
19 00:01:04,319 --> 00:01:07,760 Speaker 2: I mean, I just... I shudder to think what level 20 00:01:07,800 --> 00:01:11,480 Speaker 2: of Dante's Inferno people like that will end up in, 21 00:01:12,560 --> 00:01:14,640 Speaker 2: but it's just horrible. I mean, now they're talking 22 00:01:14,680 --> 00:01:17,600 Speaker 2: about thousands of victims. And even with the release of it, 23 00:01:17,640 --> 00:01:19,920 Speaker 2: you know, with all the talk about this, 24 00:01:20,360 --> 00:01:24,120 Speaker 2: thank God they put in there a provision to leave people 25 00:01:24,160 --> 00:01:27,399 Speaker 2: that have been victimized, that probably live with this for 26 00:01:27,440 --> 00:01:30,160 Speaker 2: the rest of their lives, alone, if they want to, 27 00:01:30,480 --> 00:01:34,400 Speaker 2: you know, deal with it privately, and not expose them. 28 00:01:34,640 --> 00:01:37,960 Speaker 2: I can't imagine the pain, the tragedy, you know, and 29 00:01:38,520 --> 00:01:42,160 Speaker 2: the horror of such evil against young children. It is 30 00:01:42,280 --> 00:01:47,319 Speaker 2: disgusting on every level. A new Republican bill in the House, 31 00:01:47,319 --> 00:01:50,680 Speaker 2: by the way, will expedite deportations for criminal illegals. We'll 32 00:01:50,760 --> 00:01:53,320 Speaker 2: hit that one. An illegal immigrant who was caught and released 33 00:01:53,320 --> 00:01:56,840 Speaker 2: by the Biden administration, charged in a brutal hammer attack 34 00:01:57,080 --> 00:02:01,080 Speaker 2: on a Texas woman. Seventeen years old, and arrested and 35 00:02:01,160 --> 00:02:04,440 Speaker 2: released by the federal immigration authorities during the Biden years, 36 00:02:04,880 --> 00:02:07,920 Speaker 2: charged with a brutal hammer attack on a woman jogging 37 00:02:07,960 --> 00:02:09,000 Speaker 2: in a Texas park. 38 00:02:09,840 --> 00:02:10,600 Speaker 1: Unbelievable.
39 00:02:10,960 --> 00:02:15,440 Speaker 2: By the way, did you see Nicki Minaj praising President 40 00:02:15,480 --> 00:02:18,960 Speaker 2: Trump for prioritizing the issue of the persecution and slaughter 41 00:02:19,040 --> 00:02:23,040 Speaker 2: of Christians in Nigeria, thanking him for his leadership on 42 00:02:23,080 --> 00:02:26,359 Speaker 2: the global stage? I mean, I can only imagine the 43 00:02:26,400 --> 00:02:29,920 Speaker 2: attacks have probably begun, you know, with the Hollywood lunatics 44 00:02:29,919 --> 00:02:31,640 Speaker 2: out there. Linda, am I wrong about that, or am 45 00:02:31,680 --> 00:02:34,359 Speaker 2: I right about that? Not at all. It's a very real thing. 46 00:02:35,600 --> 00:02:39,120 Speaker 2: She's being attacked for standing up for people. And President 47 00:02:39,160 --> 00:02:42,840 Speaker 2: Trump said, if they don't stop this, radical Islamists apparently 48 00:02:42,880 --> 00:02:48,080 Speaker 2: attacking and killing innocent Christians. And you know, Donald Trump 49 00:02:48,120 --> 00:02:50,760 Speaker 2: has a moral compass, he has moral clarity. And this 50 00:02:50,840 --> 00:02:54,280 Speaker 2: is why I really don't understand even some people that 51 00:02:54,360 --> 00:02:57,359 Speaker 2: claim to be MAGA, what part of the Trump doctrine 52 00:02:57,360 --> 00:03:00,560 Speaker 2: some of these people have trouble understanding, and why some 53 00:03:00,600 --> 00:03:05,240 Speaker 2: people have interpreted this and misinterpreted this to mean isolationism, 54 00:03:05,880 --> 00:03:09,920 Speaker 2: because some have, and they seem at times to be 55 00:03:10,200 --> 00:03:13,799 Speaker 2: the louder voice in the MAGA movement. I'm all for 56 00:03:14,120 --> 00:03:17,280 Speaker 2: no forever wars.
You know, many years now I've been 57 00:03:17,320 --> 00:03:20,720 Speaker 2: saying on this program, we can never ever allow what 58 00:03:20,919 --> 00:03:24,240 Speaker 2: happened in Baghdad and elsewhere to ever happen again. And 59 00:03:24,280 --> 00:03:27,320 Speaker 2: that is, you know, going door to door, stepping over 60 00:03:27,400 --> 00:03:31,880 Speaker 2: IEDs without up-armored Humvees, and, you know, our 61 00:03:31,919 --> 00:03:35,360 Speaker 2: brave men and women, our soldiers, our heroes, getting their 62 00:03:35,440 --> 00:03:38,800 Speaker 2: legs and their arms blown off and their faces disfigured. 63 00:03:39,080 --> 00:03:41,640 Speaker 2: That's not how modern warfare is going to be fought. 64 00:03:42,240 --> 00:03:44,200 Speaker 2: I mean, you see the beginning stages of it 65 00:03:44,280 --> 00:03:44,520 Speaker 1: Now. 66 00:03:44,600 --> 00:03:47,280 Speaker 2: You saw it with the attack on Iran's nuclear facilities. 67 00:03:47,320 --> 00:03:49,760 Speaker 2: You see it the way, you know, even Israel is 68 00:03:49,800 --> 00:03:53,560 Speaker 2: now being attacked with, you know, bombardments of ballistic missiles, 69 00:03:53,600 --> 00:03:56,040 Speaker 2: and thank God they have missile defense systems that are 70 00:03:56,080 --> 00:03:58,760 Speaker 2: able to shoot those things out of the sky. Back 71 00:03:58,800 --> 00:04:01,960 Speaker 2: in the day they mocked Ronald Reagan. They called it 72 00:04:02,160 --> 00:04:06,320 Speaker 2: Star Wars when it was called strategic defense. Now President 73 00:04:06,360 --> 00:04:09,640 Speaker 2: Trump wants to modernize it and protect our entire country. 74 00:04:09,680 --> 00:04:14,040 Speaker 2: It's called the Golden Dome. Because of my sources, I 75 00:04:14,080 --> 00:04:16,320 Speaker 2: can tell you we are progressing at a level that 76 00:04:16,360 --> 00:04:21,039 Speaker 2: would really impress most of you.
However, I keep some 77 00:04:21,120 --> 00:04:23,640 Speaker 2: sources private. It's not that I want to tease 78 00:04:23,680 --> 00:04:25,520 Speaker 2: you and tell you that I know something you don't know. 79 00:04:25,560 --> 00:04:28,800 Speaker 2: It's just that it's not in our best national defense interests, 80 00:04:29,080 --> 00:04:31,120 Speaker 2: you know. I'm just getting little bits and 81 00:04:31,160 --> 00:04:33,760 Speaker 2: pieces and putting it all together, and I'm really feeling 82 00:04:33,839 --> 00:04:36,920 Speaker 2: very confident that we will achieve that goal of being 83 00:04:36,960 --> 00:04:40,000 Speaker 2: able to stop any type of attack on our country 84 00:04:40,360 --> 00:04:43,560 Speaker 2: with that technology that is advancing and that we're discovering, 85 00:04:44,760 --> 00:04:50,320 Speaker 2: all of which I think is very, very needed and necessary. 86 00:04:50,360 --> 00:04:53,320 Speaker 2: All of you should be angry about what I am 87 00:04:53,360 --> 00:04:58,520 Speaker 2: about to play for you. We have Senators Mark Kelly, 88 00:04:58,760 --> 00:05:04,640 Speaker 2: Elissa Slotkin, and others, all these Democrats. They're actually 89 00:05:04,680 --> 00:05:09,040 Speaker 2: on tape and they are telling people in the intelligence community, 90 00:05:09,640 --> 00:05:14,560 Speaker 2: they are telling people that are in our military, not 91 00:05:14,720 --> 00:05:19,720 Speaker 2: to obey unlawful orders, illegal orders, of the president. The 92 00:05:19,800 --> 00:05:22,719 Speaker 2: problem is they don't identify what the hell they're talking about, 93 00:05:23,360 --> 00:05:25,800 Speaker 2: because I don't think they can, and I don't think 94 00:05:25,839 --> 00:05:28,960 Speaker 2: they have any intention to do so. But there's such 95 00:05:29,080 --> 00:05:32,680 Speaker 2: great damage and danger in what it is that they 96 00:05:32,680 --> 00:05:37,600 Speaker 2: are saying in this tape.
And there's danger, you know. 97 00:05:38,200 --> 00:05:43,080 Speaker 2: It's almost like a subversion. It's almost like they're calling 98 00:05:43,120 --> 00:05:48,600 Speaker 2: for insurrection among our intelligence community and people in our 99 00:05:49,080 --> 00:05:55,440 Speaker 2: armed forces. What has President Trump ordered that caused them 100 00:05:55,839 --> 00:05:59,800 Speaker 2: to tell them they have a duty to disobey orders 101 00:05:59,800 --> 00:06:03,480 Speaker 2: of the commander in chief? I mean, the Constitution is clear. 102 00:06:04,279 --> 00:06:06,599 Speaker 2: Not Elissa Slotkin, not Mark Kelly, not any 103 00:06:06,600 --> 00:06:09,480 Speaker 2: of the people in this video. They're not the commander 104 00:06:09,520 --> 00:06:13,800 Speaker 2: in chief of the United States of America. The President 105 00:06:13,880 --> 00:06:16,880 Speaker 2: is the commander in chief of our armed services. He 106 00:06:17,040 --> 00:06:20,560 Speaker 2: is the guy; he makes the decisions. That's not Congress, 107 00:06:20,600 --> 00:06:23,440 Speaker 2: that's not a senator, that's not a former CIA officer, 108 00:06:23,920 --> 00:06:26,440 Speaker 2: not a former astronaut, not Elissa Slotkin. 109 00:06:27,200 --> 00:06:29,880 Speaker 1: But here's what they say. We want to speak directly 110 00:06:29,920 --> 00:06:31,440 Speaker 1: to members of the military 111 00:06:31,120 --> 00:06:33,840 Speaker 2: and the intelligence community who take risks each day to 112 00:06:33,960 --> 00:06:35,160 Speaker 2: keep Americans safe. 113 00:06:35,200 --> 00:06:38,240 Speaker 1: We know you are under enormous stress and pressure right now. 114 00:06:38,360 --> 00:06:41,480 Speaker 2: Americans trust their military, but that trust is at risk.
115 00:06:41,680 --> 00:06:45,880 Speaker 2: This administration is pitting our uniformed military and intelligence community 116 00:06:45,920 --> 00:06:48,760 Speaker 2: professionals against American citizens like us. 117 00:06:49,080 --> 00:06:52,240 Speaker 1: You all swore an oath to protect and defend this Constitution. 118 00:06:52,600 --> 00:06:55,560 Speaker 2: Right now, the threats to our Constitution aren't just coming 119 00:06:55,560 --> 00:06:57,240 Speaker 2: from abroad, but from right here at home. 120 00:06:57,360 --> 00:07:00,800 Speaker 1: Our laws are clear. You can refuse illegal orders, 121 00:07:01,240 --> 00:07:03,120 Speaker 1: you can refuse illegal orders. 122 00:07:03,360 --> 00:07:06,679 Speaker 2: You must refuse illegal orders. No one has to carry 123 00:07:06,680 --> 00:07:10,080 Speaker 2: out orders that violate the law or our Constitution. We 124 00:07:10,160 --> 00:07:12,320 Speaker 2: know this is hard and that it's a difficult time 125 00:07:12,360 --> 00:07:13,360 Speaker 2: to be a public servant. 126 00:07:13,400 --> 00:07:16,320 Speaker 1: But whether you're serving in the CIA, the Army, or Navy, 127 00:07:16,400 --> 00:07:20,000 Speaker 1: the Air Force, your vigilance is critical, and know that 128 00:07:20,120 --> 00:07:20,840 Speaker 1: we have your 129 00:07:20,680 --> 00:07:25,520 Speaker 2: back. Oh, we have your back. Think about this. 130 00:07:26,800 --> 00:07:32,600 Speaker 2: How scary is that tape? This is TDS, Trump Derangement 131 00:07:32,720 --> 00:07:37,000 Speaker 2: Syndrome, on steroids and human growth hormone.
It is not 132 00:07:37,640 --> 00:07:44,040 Speaker 2: just... it is so morally repugnant and inappropriate, but it's 133 00:07:44,080 --> 00:07:50,200 Speaker 2: downright dangerous what they are saying there. Again, if 134 00:07:50,240 --> 00:07:55,200 Speaker 2: they had a specific issue, if they identified the 135 00:07:55,960 --> 00:08:02,160 Speaker 2: very examples where people must refuse illegal orders, they have 136 00:08:02,200 --> 00:08:05,960 Speaker 2: a duty and obligation to spell it out. 137 00:08:06,400 --> 00:08:07,160 Speaker 1: They didn't do that. 138 00:08:08,000 --> 00:08:13,280 Speaker 2: No. It is just, you know, as far as I'm concerned, constitutionally, 139 00:08:13,840 --> 00:08:18,760 Speaker 2: they are undermining the authority of the president and his 140 00:08:18,960 --> 00:08:23,840 Speaker 2: constitutional authority as commander in chief. I mean, if that... 141 00:08:24,360 --> 00:08:27,440 Speaker 2: if that's not subversion, tell me what it is. 142 00:08:28,400 --> 00:08:32,400 Speaker 2: If that's not a call for some type of insurrection, 143 00:08:32,679 --> 00:08:34,400 Speaker 2: then please tell me what it is. 144 00:08:34,760 --> 00:08:37,440 Speaker 1: I'll give you our number. It's eight hundred 145 00:08:37,520 --> 00:08:38,320 Speaker 1: nine four one Sean.
146 00:08:38,920 --> 00:08:41,600 Speaker 2: I'm open to any points you want to make 147 00:08:41,640 --> 00:08:44,800 Speaker 2: on this. But, you know, they might as well just 148 00:08:44,880 --> 00:08:47,760 Speaker 2: breach the walls of the White House and, you know, 149 00:08:48,040 --> 00:08:52,560 Speaker 2: send in their radical supporters. Because remember, there were five 150 00:08:52,640 --> 00:08:55,640 Speaker 2: hundred and seventy-four official riots in the summer of 151 00:08:55,679 --> 00:08:59,480 Speaker 2: twenty twenty, their radical base, and they never lifted a 152 00:08:59,520 --> 00:09:01,200 Speaker 2: finger to stop them. 153 00:09:01,480 --> 00:09:02,800 Speaker 1: They never criticized them. 154 00:09:03,200 --> 00:09:06,320 Speaker 2: No, they just flat out... either their silence was deafening, 155 00:09:07,120 --> 00:09:09,640 Speaker 2: or they went out there and just flat out lied 156 00:09:09,679 --> 00:09:13,280 Speaker 2: and said they're mostly peaceful, or they went out like, 157 00:09:13,960 --> 00:09:17,040 Speaker 2: you know, their vice presidential candidate at the time, later 158 00:09:17,160 --> 00:09:21,760 Speaker 2: Vice President Kamala Harris, and, you know, said those rioters 159 00:09:21,840 --> 00:09:22,800 Speaker 2: aren't going to stop, 160 00:09:23,360 --> 00:09:25,000 Speaker 1: they shouldn't stop, and 161 00:09:24,880 --> 00:09:28,640 Speaker 2: we're not going to stop supporting them. Okay. And then 162 00:09:28,800 --> 00:09:32,000 Speaker 2: they claimed to only care... they ignored those five hundred and 163 00:09:32,040 --> 00:09:36,480 Speaker 2: seventy-four riots. They rationalized something that wasn't true. Dozens 164 00:09:36,480 --> 00:09:40,560 Speaker 2: of Americans that Summer of Love lost their lives. Thousands 165 00:09:40,640 --> 00:09:44,200 Speaker 2: of cops were injured during the Summer of Love.
You know, 166 00:09:44,520 --> 00:09:47,920 Speaker 2: they were pelted with bricks, rocks, bottles, Molotov cocktails, things 167 00:09:47,960 --> 00:09:52,200 Speaker 2: like that. Billions of dollars in property damage took place. 168 00:09:52,840 --> 00:09:57,320 Speaker 2: You know, where was Bennie Thompson, or whatever his 169 00:09:57,400 --> 00:10:00,600 Speaker 2: name is, and Liz Cheney at the time? They cared 170 00:10:00,640 --> 00:10:04,480 Speaker 2: so much about one riot, where the president actually said 171 00:10:04,520 --> 00:10:08,439 Speaker 2: many of you will peacefully and patriotically, you know, march 172 00:10:08,480 --> 00:10:11,600 Speaker 2: to the Capitol so your voices will be heard. 173 00:10:12,040 --> 00:10:14,920 Speaker 1: That's what he said at the time. But they didn't. 174 00:10:14,960 --> 00:10:16,760 Speaker 2: Of course, that never came up, and then all the 175 00:10:16,800 --> 00:10:20,600 Speaker 2: records magically just disappeared. They're just gone, you know, into 176 00:10:20,679 --> 00:10:23,720 Speaker 2: thin air, you know. And if we're going to talk 177 00:10:23,800 --> 00:10:30,560 Speaker 2: about moments where maybe, just maybe, you know, members of 178 00:10:31,040 --> 00:10:35,400 Speaker 2: our intelligence community, for example, and high-ranking people in 179 00:10:35,520 --> 00:10:39,000 Speaker 2: law enforcement, for example, maybe they should have stood up 180 00:10:39,040 --> 00:10:41,320 Speaker 2: against the Russia hoax that they knew was a hoax. 181 00:10:42,000 --> 00:10:44,720 Speaker 2: We now know they knew it was a hoax. We 182 00:10:44,840 --> 00:10:49,200 Speaker 2: now learned from the classified materials released by Tulsi Gabbard that 183 00:10:49,360 --> 00:10:52,640 Speaker 2: in fact, they had information it was a hoax.
They 184 00:10:52,640 --> 00:10:56,000 Speaker 2: had a meeting about Hillary Clinton's plan to unveil this 185 00:10:56,120 --> 00:11:00,800 Speaker 2: hoax against Donald Trump. And then when career senior 186 00:11:01,080 --> 00:11:06,640 Speaker 2: intelligence officials did their job after the twenty sixteen election 187 00:11:07,280 --> 00:11:10,520 Speaker 2: and came to the determination that, in fact, there was 188 00:11:10,640 --> 00:11:14,920 Speaker 2: no Russia-Trump collusion, well, then it was presented to 189 00:11:15,000 --> 00:11:19,160 Speaker 2: Barack Obama, and according to these documents declassified by Tulsi Gabbard, 190 00:11:19,640 --> 00:11:22,800 Speaker 2: Barack Obama didn't like the conclusion, so he got his 191 00:11:22,880 --> 00:11:26,880 Speaker 2: political operatives involved, and they decided to redo the intelligence 192 00:11:26,920 --> 00:11:29,880 Speaker 2: assessment, come up with a new one that had a 193 00:11:29,920 --> 00:11:35,080 Speaker 2: completely different conclusion, and again based on the dirty Russian 194 00:11:35,080 --> 00:11:39,560 Speaker 2: disinformation dossier, which they also were warned about, that it 195 00:11:39,600 --> 00:11:44,079 Speaker 2: was political, in August of twenty sixteen. They apparently were 196 00:11:44,080 --> 00:11:48,640 Speaker 2: warned multiple times. And again, the statute of limitations has run out 197 00:11:48,679 --> 00:11:52,080 Speaker 2: unless they come up with a grand conspiracy investigation. But 198 00:11:52,520 --> 00:11:55,480 Speaker 2: you know, you think about that: the Russia hoax, the 199 00:11:55,559 --> 00:11:59,800 Speaker 2: Mueller investigation, that witch hunt, two bogus impeachments of Donald 200 00:11:59,800 --> 00:12:02,760 Speaker 2: Trump, and then of course, you know, we've got three 201 00:12:02,800 --> 00:12:08,360 Speaker 2: phases of the grand conspiracy.
One includes no reasonable prosecutor 202 00:12:08,360 --> 00:12:13,240 Speaker 2: would prosecute Hillary Clinton's, you know, servers with top-secret 203 00:12:13,280 --> 00:12:16,840 Speaker 2: classified information on them. They didn't raid her home or office, 204 00:12:16,920 --> 00:12:19,440 Speaker 2: just like later they wouldn't raid Joe Biden's homes or 205 00:12:19,559 --> 00:12:23,840 Speaker 2: offices, the four locations where he had top-secret classified information. 206 00:12:24,559 --> 00:12:28,679 Speaker 2: The Presidential Records Act actually gave Donald Trump a lot more 207 00:12:28,760 --> 00:12:31,200 Speaker 2: leeway than any of them, but they raided Mar-a-Lago. 208 00:12:32,280 --> 00:12:35,480 Speaker 2: That sounds like a dual justice system to me. Using 209 00:12:35,480 --> 00:12:40,120 Speaker 2: the dirty Russian disinformation dossier that not only was not corroborated 210 00:12:40,480 --> 00:12:43,080 Speaker 2: but was completely bogus. And even after they knew 211 00:12:43,120 --> 00:12:46,080 Speaker 2: it was completely bogus and Christopher Steele was long gone, 212 00:12:46,120 --> 00:12:48,640 Speaker 2: they used it three more times for three separate FISA 213 00:12:48,679 --> 00:12:51,800 Speaker 2: warrants, lied to a FISA court. The law requires that 214 00:12:51,880 --> 00:12:54,400 Speaker 2: once you know it's false, you're supposed to go 215 00:12:54,440 --> 00:12:56,880 Speaker 2: to the FISA court.
You know, with the fact that, 216 00:12:57,000 --> 00:12:59,640 Speaker 2: with intelligence now, we were finding out we don't know how 217 00:12:59,640 --> 00:13:04,560 Speaker 2: many Americans, including senators and congressmen and other people and 218 00:13:04,640 --> 00:13:08,120 Speaker 2: groups like Turning Point and so many other conservative organizations, 219 00:13:08,600 --> 00:13:14,520 Speaker 2: were targeted by our government to be spied on 220 00:13:14,720 --> 00:13:18,280 Speaker 2: without any indication that a proper warrant was sought or granted. 221 00:13:18,760 --> 00:13:22,400 Speaker 2: Whatever happened to unreasonable search and seizure? I'm sure there was 222 00:13:22,480 --> 00:13:26,360 Speaker 2: great need to rip through Melania Trump's closet and Barron 223 00:13:26,400 --> 00:13:31,080 Speaker 2: Trump's bedroom, you know. And then four years of weaponization, 224 00:13:31,800 --> 00:13:36,280 Speaker 2: because, well, they couldn't stop him. Remember, originally the dirty 225 00:13:36,320 --> 00:13:41,520 Speaker 2: dossier was designed to prevent Trump from getting elected. Then, 226 00:13:41,760 --> 00:13:46,360 Speaker 2: after the election, it was used to sabotage the incoming president, 227 00:13:46,400 --> 00:13:50,800 Speaker 2: which it did effectively. Then, of course, the suppression of 228 00:13:50,840 --> 00:13:54,840 Speaker 2: the very real laptop from hell that they had verified 229 00:13:54,880 --> 00:13:57,839 Speaker 2: as authentic in March of twenty twenty. And then they 230 00:13:57,880 --> 00:14:01,079 Speaker 2: prebunked it, meeting weekly with big tech companies in twenty 231 00:14:01,160 --> 00:14:04,320 Speaker 2: twenty, because they didn't want the truth about that laptop 232 00:14:04,400 --> 00:14:07,520 Speaker 2: coming out. And then of course big tech asked, well, 233 00:14:07,640 --> 00:14:11,120 Speaker 2: is this information in the New York Post real?
They 234 00:14:11,200 --> 00:14:14,160 Speaker 2: knew the answer, but they wouldn't tell them, and they 235 00:14:14,160 --> 00:14:16,080 Speaker 2: didn't want the American people to know. Then, when that 236 00:14:16,120 --> 00:14:20,000 Speaker 2: didn't work, let's just, you know, take a case where 237 00:14:20,000 --> 00:14:22,360 Speaker 2: the statute of limitations had run out, that's only a 238 00:14:22,360 --> 00:14:25,080 Speaker 2: minor misdemeanor in New York, and let's turn it into 239 00:14:25,080 --> 00:14:27,760 Speaker 2: thirty-four felony counts. And let's value Mar-a-Lago at 240 00:14:27,760 --> 00:14:31,680 Speaker 2: eighteen million dollars, not one point five billion, and let 241 00:14:31,720 --> 00:14:33,880 Speaker 2: everybody not say a word about any of it, because 242 00:14:33,880 --> 00:14:37,120 Speaker 2: we're okay, because it's just to prevent Trump from becoming president. 243 00:14:37,720 --> 00:14:40,960 Speaker 2: This is sick stuff. These are sick people. This is 244 00:14:40,960 --> 00:14:42,600 Speaker 2: a sick time that we're living 245 00:14:42,360 --> 00:14:43,280 Speaker 1: in. All right. 246 00:14:43,400 --> 00:14:46,920 Speaker 2: So I may lose Linda here a little bit. Linda 247 00:14:47,160 --> 00:14:51,680 Speaker 2: doesn't love technology the way I do, in this sense. Now, 248 00:14:51,680 --> 00:14:54,600 Speaker 2: I've never been the most technologically savvy person in the world, 249 00:14:54,600 --> 00:14:57,720 Speaker 2: but I am obsessed with artificial intelligence, because I can 250 00:14:57,800 --> 00:15:02,440 Speaker 2: learn so much from it. A friend of mine who's pretty 251 00:15:02,520 --> 00:15:06,760 Speaker 2: high-ranking with Elon Musk is smart as a whip. 252 00:15:07,040 --> 00:15:10,440 Speaker 2: I mean, this guy is so out-there smart. 253 00:15:10,520 --> 00:15:14,440 Speaker 2: He's spectacularly smart.
You know who I'm talking about, right? 254 00:15:16,200 --> 00:15:19,520 Speaker 2: And he's also the nicest guy you'd ever want to 255 00:15:19,560 --> 00:15:21,800 Speaker 2: meet in your life. He works so hard. I mean, 256 00:15:21,800 --> 00:15:24,440 Speaker 2: when you work for Elon, you don't stop working. By 257 00:15:24,440 --> 00:15:27,640 Speaker 2: the way, you think I'm tough as a boss? I'm 258 00:15:27,680 --> 00:15:31,000 Speaker 2: really not. I'm actually, you know, a pussycat compared 259 00:15:31,040 --> 00:15:36,760 Speaker 2: to Elon Musk. But anyway, they are driving and 260 00:15:36,880 --> 00:15:38,359 Speaker 2: driving and driving. 261 00:15:38,600 --> 00:15:40,920 Speaker 1: I mean, this Elon 262 00:15:40,680 --> 00:15:43,120 Speaker 2: will sleep on his office floor for like an hour 263 00:15:43,160 --> 00:15:44,880 Speaker 2: and a half, then get up and start working again. 264 00:15:45,240 --> 00:15:48,440 Speaker 2: He just doesn't stop. I mean, it's pretty fascinating to 265 00:15:48,480 --> 00:15:48,840 Speaker 2: watch it. 266 00:15:49,360 --> 00:15:49,560 Speaker 1: Now. 267 00:15:49,600 --> 00:15:52,200 Speaker 2: Do I think he was designed for the 268 00:15:52,320 --> 00:15:57,920 Speaker 2: Washington, D.C. swamp or bureaucracy? Absolutely not. It just goes 269 00:15:57,960 --> 00:16:02,280 Speaker 2: against his very nature. He is a genius CEO at 270 00:16:02,280 --> 00:16:06,600 Speaker 2: the highest level, and an innovator and creator. And there's 271 00:16:06,640 --> 00:16:09,080 Speaker 2: a great series on the History Channel, The Men Who 272 00:16:09,120 --> 00:16:12,560 Speaker 2: Built America, and it talks about, you know, the Rockefellers 273 00:16:12,640 --> 00:16:15,840 Speaker 2: and the Carnegies and the Mellons and, you know, the 274 00:16:15,880 --> 00:16:18,840 Speaker 2: Morgans and all these people.
By the way, they're all ruthless, 275 00:16:19,360 --> 00:16:22,200 Speaker 2: ruthless business people. There's a side of them that is rough. 276 00:16:23,440 --> 00:16:26,280 Speaker 2: But I think with genius, I think, you know, when 277 00:16:26,320 --> 00:16:30,040 Speaker 2: people have really spectacular gifts, and I've been 278 00:16:30,040 --> 00:16:32,760 Speaker 2: blessed in my life to meet a lot of brilliant people, 279 00:16:33,280 --> 00:16:36,560 Speaker 2: it's a very common trait that with their gift 280 00:16:37,080 --> 00:16:39,840 Speaker 2: comes a little bit of a curse, and it manifests 281 00:16:39,880 --> 00:16:41,160 Speaker 2: itself in different forms. 282 00:16:41,640 --> 00:16:43,960 Speaker 1: Linda, that makes sense. You're with me so far, right? 283 00:16:44,520 --> 00:16:48,520 Speaker 2: Sure. Okay, but the work ethic that Elon has, 284 00:16:49,480 --> 00:16:54,520 Speaker 2: coupled with the genius that he has, it's 285 00:16:54,640 --> 00:16:58,760 Speaker 2: just undeniable that he is the brightest guy in the 286 00:16:58,760 --> 00:17:01,320 Speaker 2: world that I can think of right now. If there's 287 00:17:01,360 --> 00:17:03,640 Speaker 2: anybody smarter, you can tell me, but I don't think 288 00:17:03,680 --> 00:17:07,840 Speaker 2: there is.
Having spent time with him and spoken with him... 289 00:17:08,400 --> 00:17:13,159 Speaker 2: I was actually at dinner with him one night, and it was outside, 290 00:17:13,640 --> 00:17:16,240 Speaker 2: and, you know, he was looking up at the stars, 291 00:17:16,720 --> 00:17:19,239 Speaker 2: and he's pointing out that that's Venus, that's Mars, that's this, 292 00:17:19,359 --> 00:17:21,320 Speaker 2: that's this, that's this, and he's like pointing all this 293 00:17:21,400 --> 00:17:25,400 Speaker 2: stuff out to me. And the difference is the way 294 00:17:25,480 --> 00:17:30,240 Speaker 2: my mind works, which is, you know, rather simple 295 00:17:30,280 --> 00:17:31,760 Speaker 2: and basic compared to his level. 296 00:17:31,880 --> 00:17:34,400 Speaker 1: Very simple and basic. 297 00:17:34,560 --> 00:17:37,679 Speaker 2: I don't look at Venus and Mars and think that 298 00:17:37,720 --> 00:17:39,880 Speaker 2: I want to figure out a way to travel there 299 00:17:40,400 --> 00:17:44,560 Speaker 2: and maybe establish life there and send people there and 300 00:17:44,600 --> 00:17:48,399 Speaker 2: bring them home. But that's how he thinks. It's just 301 00:17:48,480 --> 00:17:51,520 Speaker 2: a whole different dimension of thinking. I've met people that 302 00:17:51,600 --> 00:17:54,760 Speaker 2: are so insightful in terms of politics. 303 00:17:54,920 --> 00:17:56,399 Speaker 1: You know, one of the geniuses of Rush. 304 00:17:56,400 --> 00:17:58,720 Speaker 2: I always felt Rush had a take that nobody else 305 00:17:58,720 --> 00:18:01,199 Speaker 2: would have, and I think that was part of his 306 00:18:01,320 --> 00:18:04,520 Speaker 2: great creativity and genius, and he was a great entertainer 307 00:18:04,560 --> 00:18:06,439 Speaker 2: on top of it. He was just great at everything that 308 00:18:06,480 --> 00:18:09,600 Speaker 2: he did, which is why, when he passed, God 309 00:18:09,640 --> 00:18:12,160 Speaker 2: rest his soul, we miss him dearly.
You know, I said 310 00:18:12,240 --> 00:18:16,119 Speaker 2: nobody can replace this guy. He's irreplaceable. And 311 00:18:16,240 --> 00:18:18,440 Speaker 2: he would want all of us to do our very 312 00:18:18,480 --> 00:18:22,040 Speaker 2: best, because he believed so much in the 313 00:18:22,359 --> 00:18:26,040 Speaker 2: individual, in the concept of liberty 314 00:18:26,400 --> 00:18:30,399 Speaker 2: and freedom and the talent that God gave every individual, and 315 00:18:30,400 --> 00:18:33,280 Speaker 2: that the only way those talents can be 316 00:18:33,359 --> 00:18:37,040 Speaker 2: brought to fruition is if you live in freedom 317 00:18:37,160 --> 00:18:40,359 Speaker 2: and liberty, which is the antithesis of top-down, 318 00:18:40,400 --> 00:18:43,640 Speaker 2: government-run everything in our lives. So, you know, we've 319 00:18:43,680 --> 00:18:47,880 Speaker 2: gone through many different revolutions, if you will. I mean, 320 00:18:48,000 --> 00:18:50,880 Speaker 2: when the train was first created, that was a big deal. 321 00:18:51,440 --> 00:18:53,760 Speaker 2: You know, going shopping with Linda to buy a little 322 00:18:53,760 --> 00:18:56,840 Speaker 2: toy train for her son at the time was the 323 00:18:56,840 --> 00:18:59,040 Speaker 2: biggest deal I've ever seen in my life. And after 324 00:18:59,119 --> 00:19:02,399 Speaker 2: an hour of picking up every single solitary train and 325 00:19:03,160 --> 00:19:06,000 Speaker 2: turning it over and playing with the wheels and looking 326 00:19:06,000 --> 00:19:09,280 Speaker 2: at the price and comparative shopping, I couldn't take it anymore. 327 00:19:09,600 --> 00:19:11,600 Speaker 2: I went and got a shopping cart and I just 328 00:19:11,600 --> 00:19:13,440 Speaker 2: filled it up full of a bunch of trains.
And 329 00:19:13,480 --> 00:19:16,200 Speaker 2: I figured, let our son pick it out on his own, 330 00:19:16,280 --> 00:19:18,320 Speaker 2: and he'll be happy with more trains rather than a 331 00:19:18,320 --> 00:19:20,480 Speaker 2: few trains. And I don't think he's going to miss 332 00:19:20,480 --> 00:19:22,840 Speaker 2: out on whether or not Linda picked out the perfect train. 333 00:19:23,960 --> 00:19:26,199 Speaker 2: And Linda doesn't like when I tell that story, but 334 00:19:26,280 --> 00:19:29,600 Speaker 2: it's true. So why am I bringing all of this up? 335 00:19:30,359 --> 00:19:33,160 Speaker 2: So there's an article in Business Insider and it says 336 00:19:33,160 --> 00:19:38,960 Speaker 2: Elon Musk says Optimus will eliminate poverty, in a speech 337 00:19:39,320 --> 00:19:42,080 Speaker 2: after his one trillion dollar pay package was approved. This 338 00:19:42,160 --> 00:19:45,760 Speaker 2: is with Tesla, if he meets certain goals and certain 339 00:19:46,440 --> 00:19:50,960 Speaker 2: benchmarks that they have set out for him. And what 340 00:19:51,080 --> 00:19:55,199 Speaker 2: I've been trying to communicate to people is that we 341 00:19:55,280 --> 00:20:00,159 Speaker 2: are living through an age that is going to be 342 00:20:01,240 --> 00:20:05,840 Speaker 2: so impactful. Everybody's going to be 343 00:20:05,880 --> 00:20:09,560 Speaker 2: impacted by it. And it's a matter of, are you 344 00:20:09,760 --> 00:20:13,360 Speaker 2: ready for it, are you ready to adapt to it? 345 00:20:14,040 --> 00:20:16,240 Speaker 2: Are you going to get ahead of the curve or 346 00:20:16,320 --> 00:20:19,560 Speaker 2: are you going to be behind the curve?
And my hope 347 00:20:19,600 --> 00:20:24,000 Speaker 2: is, as we discuss this periodically, to make sure that 348 00:20:24,119 --> 00:20:27,520 Speaker 2: you are ready for it, it doesn't take you by surprise, 349 00:20:27,880 --> 00:20:33,080 Speaker 2: it doesn't overwhelm you, and that you actually take advantage 350 00:20:33,080 --> 00:20:36,359 Speaker 2: of it by having more knowledge than other people and 351 00:20:36,400 --> 00:20:39,679 Speaker 2: are fully aware of the magnitude of what's about to happen. 352 00:20:40,640 --> 00:20:46,520 Speaker 2: It would be the equivalent of, okay, well, communications changed 353 00:20:46,600 --> 00:20:50,639 Speaker 2: when we had telephones. It would be the equivalent of 354 00:20:51,040 --> 00:20:55,639 Speaker 2: electricity versus, you know, having a gas lamp 355 00:20:55,680 --> 00:20:59,520 Speaker 2: in your house. It is that dramatic an innovation with 356 00:20:59,680 --> 00:21:05,760 Speaker 2: artificial intelligence. So, you know, Tesla's Optimus robot, which 357 00:21:05,760 --> 00:21:08,320 Speaker 2: is what he was talking about on this call, he 358 00:21:08,400 --> 00:21:12,760 Speaker 2: actually believes ultimately will eliminate much of the need for 359 00:21:12,880 --> 00:21:17,720 Speaker 2: human labor. Now, that is a little bit shocking, 360 00:21:18,119 --> 00:21:21,560 Speaker 2: earth shattering, but something I want all of 361 00:21:21,560 --> 00:21:24,919 Speaker 2: you to be aware of so that you're prepared for it. 362 00:21:25,680 --> 00:21:29,919 Speaker 2: And he's saying humanoid robots are going to be 363 00:21:29,920 --> 00:21:33,840 Speaker 2: a production challenge and it's not launching anytime soon. But again, 364 00:21:34,080 --> 00:21:36,119 Speaker 2: I want you to be thinking about the future and 365 00:21:36,160 --> 00:21:38,960 Speaker 2: not be shocked or surprised by it.
As I 366 00:21:39,080 --> 00:21:42,320 Speaker 2: have spoken to people that are involved in this industry, 367 00:21:42,359 --> 00:21:46,800 Speaker 2: what they're telling me is, for middle class families, this is 368 00:21:46,840 --> 00:21:49,639 Speaker 2: sort of like the creation of the Model T and, 369 00:21:50,440 --> 00:21:54,360 Speaker 2: you know, Henry Ford and the fact that he made 370 00:21:54,400 --> 00:21:56,800 Speaker 2: it in a way that was affordable so the average 371 00:21:56,840 --> 00:22:01,680 Speaker 2: person can own one. And they're designing these 372 00:22:01,840 --> 00:22:06,280 Speaker 2: robots to take care of mundane tasks for middle 373 00:22:06,280 --> 00:22:07,280 Speaker 2: class Americans. 374 00:22:07,640 --> 00:22:10,760 Speaker 1: We're talking about robots that will clean. 375 00:22:10,560 --> 00:22:15,080 Speaker 2: Your house, cut your lawn, cook your dinner, you know, 376 00:22:15,160 --> 00:22:18,360 Speaker 2: do all of these mundane tasks that you take for granted, 377 00:22:18,920 --> 00:22:20,880 Speaker 2: put the sheets on your bed, et cetera, et cetera. 378 00:22:21,040 --> 00:22:23,199 Speaker 2: It may sound crazy, but I'm telling you this is 379 00:22:23,240 --> 00:22:26,720 Speaker 2: what they're telling me. And he has a bigger vision 380 00:22:26,760 --> 00:22:30,399 Speaker 2: for robots. He wants them to transform the economy. That 381 00:22:30,480 --> 00:22:33,480 Speaker 2: should be scary to everybody. He says, you know, people 382 00:22:33,520 --> 00:22:39,160 Speaker 2: often talk about eliminating poverty, giving everyone amazing medical care. 383 00:22:39,960 --> 00:22:43,040 Speaker 2: And this goes to my conversation with Gary Brecka, who's 384 00:22:43,080 --> 00:22:48,080 Speaker 2: in the health, wellness, fitness, nutrition space.
And Gary Brecka says, 385 00:22:48,359 --> 00:22:50,960 Speaker 2: if you live through the next five years, the odds of 386 00:22:51,000 --> 00:22:53,400 Speaker 2: you living to one hundred will go up exponentially. 387 00:22:54,080 --> 00:22:56,160 Speaker 1: I agree with him. 388 00:22:55,880 --> 00:22:57,760 Speaker 2: And I talked to a friend of mine that's in 389 00:22:57,880 --> 00:23:04,199 Speaker 2: radiology, and they are already using artificial intelligence, and what 390 00:23:04,640 --> 00:23:08,639 Speaker 2: he has discovered is the ability of artificial intelligence to 391 00:23:08,840 --> 00:23:13,440 Speaker 2: pick up spots on an MRI that no human eye 392 00:23:13,440 --> 00:23:16,920 Speaker 2: would be able to detect. In other words, early detection 393 00:23:17,119 --> 00:23:21,120 Speaker 2: is the key to longevity, especially on issues like cancer. 394 00:23:21,640 --> 00:23:24,119 Speaker 2: And I believe we're probably going to look back on 395 00:23:24,160 --> 00:23:27,160 Speaker 2: how we treat cancer today and one day we'll say 396 00:23:27,280 --> 00:23:30,159 Speaker 2: we were living in the dinosaur era, and there are 397 00:23:30,160 --> 00:23:34,639 Speaker 2: going to be far more sophisticated ways of detecting and 398 00:23:34,720 --> 00:23:39,160 Speaker 2: treating cancers long before they get bad. Anyway, so Musk 399 00:23:39,280 --> 00:23:44,200 Speaker 2: doubled down. He said Optimus will actually eliminate poverty. Minutes earlier, 400 00:23:44,240 --> 00:23:47,000 Speaker 2: the crowd cheered and broke out in chants of Elon, 401 00:23:47,080 --> 00:23:50,639 Speaker 2: Elon, and the shareholders approved a trillion dollar pay package, 402 00:23:51,119 --> 00:23:54,480 Speaker 2: and he's the world's richest person.
It will unlock up 403 00:23:54,480 --> 00:23:57,240 Speaker 2: to a trillion dollars in shares if Tesla achieves the 404 00:23:57,640 --> 00:24:02,800 Speaker 2: lofty targets, including selling a million Optimus robots in the 405 00:24:02,840 --> 00:24:06,480 Speaker 2: next decade. I believe that will happen. Musk said that 406 00:24:06,920 --> 00:24:11,719 Speaker 2: Optimus would change life for incarcerated people. He said, 407 00:24:11,800 --> 00:24:15,240 Speaker 2: instead of physically jailing prisoners, you know, Optimus 408 00:24:15,240 --> 00:24:17,760 Speaker 2: will follow you around and stop you from doing crime 409 00:24:18,119 --> 00:24:20,560 Speaker 2: and things like that. I don't know how, but just 410 00:24:20,640 --> 00:24:23,159 Speaker 2: follow me here, these are his remarks, not mine. 411 00:24:23,400 --> 00:24:26,639 Speaker 2: He said the robots would increase the global economy by a 412 00:24:26,680 --> 00:24:30,639 Speaker 2: factor of ten or even possibly one hundred, and I 413 00:24:30,680 --> 00:24:34,600 Speaker 2: would argue even maybe greater than that. He did this 414 00:24:34,680 --> 00:24:38,959 Speaker 2: on the Tesla third quarter earnings call. He imagines a 415 00:24:39,000 --> 00:24:44,199 Speaker 2: world of sustainable abundance, a goal outlined by the Tesla 416 00:24:44,320 --> 00:24:47,720 Speaker 2: Master Plan Part Four, with Optimus leading the way. An 417 00:24:47,760 --> 00:24:51,320 Speaker 2: Optimus robot would have five times the productivity of any 418 00:24:51,400 --> 00:24:54,560 Speaker 2: human being per year, he predicted, because it would be 419 00:24:54,560 --> 00:24:57,960 Speaker 2: able to operate twenty four to seven. Let me stop here. 420 00:24:59,000 --> 00:25:01,320 Speaker 2: There's a company in Australia.
A friend of mine 421 00:25:01,359 --> 00:25:05,639 Speaker 2: sent me this video, and sure enough, in less than 422 00:25:05,680 --> 00:25:12,040 Speaker 2: a week, a massive, big warehouse is being built perfectly, 423 00:25:12,840 --> 00:25:17,280 Speaker 2: no flaws at all, where it would take a construction 424 00:25:17,480 --> 00:25:21,120 Speaker 2: company six months to a year. And you can see it. 425 00:25:21,359 --> 00:25:23,359 Speaker 2: And maybe I should post it online, Linda, because it's 426 00:25:23,400 --> 00:25:28,359 Speaker 2: worth posting. It eliminates waste, and they work twenty 427 00:25:28,359 --> 00:25:31,400 Speaker 2: four hours a day and the job gets done. He said 428 00:25:31,400 --> 00:25:33,879 Speaker 2: there's no limit to how much AI can do in terms 429 00:25:33,880 --> 00:25:37,879 Speaker 2: of enhancing the productivity of humans, and there's not really 430 00:25:37,920 --> 00:25:41,600 Speaker 2: a limit to AI that is embodied, he said. He 431 00:25:41,720 --> 00:25:47,159 Speaker 2: described sustainable abundance and a robotic future. He 432 00:25:47,200 --> 00:25:50,520 Speaker 2: told Joe Rogan, this will transform the economy. He said, 433 00:25:50,560 --> 00:25:53,359 Speaker 2: I came to the conclusion that the only way to 434 00:25:53,400 --> 00:25:55,520 Speaker 2: get us out of the debt crisis and to prevent 435 00:25:55,560 --> 00:26:00,520 Speaker 2: America from going bankrupt is artificial intelligence and robotics. And 436 00:26:00,520 --> 00:26:04,760 Speaker 2: he said that robots like Optimus will make working optional 437 00:26:05,000 --> 00:26:07,520 Speaker 2: in the future. I don't like that part of it, 438 00:26:07,560 --> 00:26:09,719 Speaker 2: as I think people need to be of 439 00:26:09,800 --> 00:26:10,720 Speaker 2: service to other people. 440 00:26:10,840 --> 00:26:13,520 Speaker 1: Really that's spiritual for me. But put that aside.
441 00:26:13,520 --> 00:26:16,840 Speaker 2: These are not my thoughts: we'll have a benign scenario, 442 00:26:17,119 --> 00:26:20,760 Speaker 2: universal income. He's saying that with the wealth that will be 443 00:26:20,840 --> 00:26:25,880 Speaker 2: created and generated, and the food production generated, there won't 444 00:26:25,920 --> 00:26:29,240 Speaker 2: be something like poverty that exists. If you can build 445 00:26:29,280 --> 00:26:31,800 Speaker 2: a home at one hundred and fifty dollars a square foot, 446 00:26:32,200 --> 00:26:36,000 Speaker 2: everybody's going to have their own home. And 447 00:26:36,160 --> 00:26:38,840 Speaker 2: once the robotics are up and running and building homes, 448 00:26:38,960 --> 00:26:41,080 Speaker 2: or building out seventy or seventy five percent of them, 449 00:26:41,080 --> 00:26:43,600 Speaker 2: that's what you're going to have: anyone can 450 00:26:43,720 --> 00:26:46,320 Speaker 2: have any product or service they want, but there'll be 451 00:26:46,400 --> 00:26:49,119 Speaker 2: a lot of trauma and disruption along the way. This is 452 00:26:49,160 --> 00:26:52,000 Speaker 2: why I'm warning all of you. And he's not the 453 00:26:52,000 --> 00:26:55,280 Speaker 2: only business leader that is touting the prospects of universal 454 00:26:55,320 --> 00:26:58,280 Speaker 2: basic income. He's saying that the money is not coming from 455 00:26:58,320 --> 00:27:00,840 Speaker 2: the government; he's not suggesting that. I don't interpret it 456 00:27:00,840 --> 00:27:04,760 Speaker 2: that way, anyway.
And Rogan, you know, said there's an 457 00:27:04,800 --> 00:27:08,639 Speaker 2: economic irony here: the capitalist implementation of AI and robotics, 458 00:27:08,640 --> 00:27:11,920 Speaker 2: assuming he goes down the good path, and there's always 459 00:27:11,960 --> 00:27:13,920 Speaker 2: a danger with technology that it can be used for 460 00:27:14,000 --> 00:27:19,800 Speaker 2: something horrible, you know, could result in this utopia that 461 00:27:20,040 --> 00:27:24,639 Speaker 2: we as conservatives staunchly disagree with. I do believe that 462 00:27:24,720 --> 00:27:27,800 Speaker 2: human beings are designed with talent and gifts from God, 463 00:27:27,840 --> 00:27:30,600 Speaker 2: and you've got to bring them to fruition, and you 464 00:27:30,680 --> 00:27:34,159 Speaker 2: can't be all about being served all day. But he said, 465 00:27:34,320 --> 00:27:37,119 Speaker 2: AI robots will replace a lot of jobs. Working for 466 00:27:37,240 --> 00:27:40,439 Speaker 2: some will be optional, like growing your own vegetables instead 467 00:27:40,440 --> 00:27:44,840 Speaker 2: of buying them is optional. That's a pretty ambitious prediction. 468 00:27:45,320 --> 00:27:47,280 Speaker 2: I'm not saying all of this is going to be true, 469 00:27:47,840 --> 00:27:51,400 Speaker 2: but what I am saying is the economy's changing. And 470 00:27:51,640 --> 00:27:54,080 Speaker 2: what I want to tie this into is Donald Trump's 471 00:27:54,080 --> 00:27:57,440 Speaker 2: securing another trillion dollars. You know, the Saudis 472 00:27:57,480 --> 00:27:59,919 Speaker 2: committed up to a trillion in manufacturing in this country over 473 00:28:00,040 --> 00:28:03,159 Speaker 2: the next four years. Okay, you know, 474 00:28:03,680 --> 00:28:06,680 Speaker 2: we're gonna need workers for these jobs, and I want 475 00:28:06,720 --> 00:28:09,320 Speaker 2: you to be ready to get those jobs.
We're gonna 476 00:28:09,359 --> 00:28:12,920 Speaker 2: have the energy sector building out like never before, high 477 00:28:12,920 --> 00:28:16,280 Speaker 2: paying career jobs, and if you're interested, I want you 478 00:28:16,359 --> 00:28:18,919 Speaker 2: to be ready for those jobs. I want you to 479 00:28:18,920 --> 00:28:25,200 Speaker 2: be ready for the semiconductor manufacturing facilities, the pharmaceutical manufacturing facilities, 480 00:28:25,600 --> 00:28:29,919 Speaker 2: the AI and robotic manufacturing facilities. They're going to 481 00:28:29,960 --> 00:28:32,280 Speaker 2: need workers, and they're also going to need a very 482 00:28:32,359 --> 00:28:38,040 Speaker 2: well educated population to get in on this and be 483 00:28:38,320 --> 00:28:42,000 Speaker 2: productive members of this new technology. And I would add 484 00:28:42,040 --> 00:28:45,239 Speaker 2: to that, the tax cuts automatically are going to kick in, 485 00:28:45,600 --> 00:28:49,080 Speaker 2: and when that does, that's going to be economic growth, prosperity, hopefully 486 00:28:49,080 --> 00:28:52,920 Speaker 2: the golden era that Donald Trump talks about. To me, 487 00:28:53,000 --> 00:28:56,040 Speaker 2: it's just fascinating. Did that interest you at all? 488 00:28:57,720 --> 00:29:00,440 Speaker 2: I wonder if I can hire a bot to replace you? 489 00:29:00,520 --> 00:29:02,320 Speaker 1: Yeah, why don't you? Same sass. 490 00:29:03,080 --> 00:29:06,320 Speaker 2: We'll teach him to be sassy, sarcastic, you know, to 491 00:29:06,760 --> 00:29:09,000 Speaker 2: try and get me angry every day, full 492 00:29:09,000 --> 00:29:09,479 Speaker 2: of vinegar. 493 00:29:09,720 --> 00:29:10,360 Speaker 1: It'll be great.