Speaker 1: Welcome to Verdict with Ted Cruz: Weekend Review. Ben Ferguson with you, and these are the stories you may have missed that we talked about this week. First up, a big story coming out of Washington, D.C. National Guardsmen were shot, one of them killed. And what we now know is the media wanted to blame Donald Trump for this. Well, they have now had a massive backfire, as we're learning that this was clearly a lack of vetting, and there were warnings that the Biden administration was letting people into this country who could be a threat to all of us. I'll have all those details for you in just a moment. We're also joined by Scott Jennings, you know him from CNN. He's talking about his new book and also AI and what you need to know. It's really incredible. And finally, a massive win for the President of the United States of America, telling Americans, I want to get government out of your car. CAFE standards are now changing and going backwards. And what does that mean? A car that you can buy is going to be a lot lower priced. It's the Weekend Review, and it starts right now.
Speaker 1: The media is trying to say that these National Guardsmen that got shot, one of them has died as of the time we are recording this, that it happened because of Donald Trump, rewriting the history and the warnings going back, as we just played there, to September of twenty twenty one, on this exact thing happening in reality.
Speaker 2: Yeah.
Speaker 3: At the same time, I wrote an op-ed in Fox News where I said, unfortunately, I believe we will see American blood spilled because of these foolish mistakes. And I went on to say, the truth is that the Taliban doesn't want to be welcomed into the community of civilized nations. They are vicious terrorists who want to kill us.
Unfortunately, because of the catastrophic mistakes the Biden administration made in Afghanistan, we are much more at risk for deadly attacks by the Taliban or Al Qaeda, which Pentagon officials estimate could regroup in Afghanistan in the next year or two. It's a disgrace, and the fault lies directly with President Joe Biden and his administration. This was obvious at the time. And look, you heard on that clip in September twenty-one, I'm talking about seeing the Doña Ana tent facility. When I showed up there, I was like, well, are there any fences around it? No. Is there any barrier? They're like, no, no, no, this is not a prison facility. And it was a one-star general who was taking me around in the helicopter. And I'm like, well, you know what happens, you're saying anyone can leave. He's like, yeah, they're not detained. This is just, we're just offering them housing. Think of this like an apartment complex. They can stay here if they want, or if they want to go somewhere.
Speaker 1: Else, they can go.
Speaker 3: They can go somewhere else. And yeah, I asked the Army, are y'all doing any vetting here? They said, no, no, no, the State Department does that in Afghanistan, so it's all done before they get there. So they got ten thousand people there, and who knows where they went. They don't keep a record of them. They don't ask, where do you go? I mean, that's part of the infuriating problem: the Biden administration invited into this country Islamists, terrorists who explicitly want to murder us. And by the way, you know what is unbelievable is the shooter in D.C. was one of two Afghans who were arrested this week for threatening violence. Another one happened in Texas. Here's one report from Breitbart.
An Afghan national admitted to the United States under President Biden's Operation Allies Welcome resettlement program was arrested this week after posting a TikTok video in which he appeared to be building a bomb and referenced Fort Worth as a target, according to the Department of Homeland Security. Department of Homeland Security Assistant Secretary Tricia McLaughlin reported the arrest of Mohammad Dawood Alokose, saying he allegedly posted a video on TikTok threatening to blow up a building in Fort Worth, Texas with a bomb. The Texas Department of Public Safety and the FBI Joint Terrorism Task Force teamed up to make the arrest on charges of making terroristic threats. This is happening, and by the way, it continues. Alokose is an Afghan national who was paroled into the United States by the Biden administration. His arrest came one day prior to the vicious attack on two West Virginia National Guard soldiers in Washington, D.C. by another Afghan national. This pattern, when you let people in and you don't vet them, this violence and this terrorism is predictable. And you had a combination of, for four years, an open southern border, so any Hamas or Hezbollah or other terrorists could come in through the southern border.
Speaker 1: You combine what we're witnessing now with what you just mentioned, Senator, at the southern border. We knew for a fact that people on the terrorist watch list were coming across our southern border, the same border that, when Texas tried to secure it, the federal government said, no, no, we're gonna use forklifts to raise up the barbed wire to let more people in. And we have that on video. We talked about that on Verdict. They knew that people were getting in on the terrorist watch list, people that wanted to kill Americans. They knew there was a real threat of sleeper cells in this country coming from multiple places around the world, and there still is. Al Shabaab, exactly, still is.
Speaker 1: And you combine that with this, and you sit there and you go, basically they said, look, we know people are gonna die, but we want an open border. We know terrorists are coming in, we know they're on the watch list, we know we're catching some of them, there's a lot of gotaways that we don't catch. This is just the cost of doing business. There will be human carnage of American citizens, the abuse of children, the rape of children, the sex trafficking of children. There will be murderers and all sorts of people in this country. We know that, and it's worth it to flood America with illegal immigrants. That was a calculated decision they made.
Speaker 3: It was, and the two occurred simultaneously. So on the one hand, they're opening the southern border and letting anyone come in. But on the other hand, they're affirmatively flying in tens of thousands of Afghans. And by the way, we don't have any firm numbers in terms of how many they brought in. In terms of vetting, you know, you don't have anyone from the Biden administration saying, oh yeah, we thoroughly vetted this guy before we brought him here, because their view was there was nothing to vet. If they're bringing in a fifty-year-old dude with an eight-year-old girl that he says is his wife, that shows you how utterly incompetent the vetting is, that they just didn't care. Ideologically, they believed, well, their culture, apparently it's okay to rape little girls, and so they certainly were not engaging in serious vetting. And it's difficult. Look, one of the problems with vetting a place like Afghanistan is there are limited records, and so that's not an easy task. In my view, unless you can conclusively determine that this person is not a terror threat, we should not be bringing him to America. But the Biden administration, I don't believe, was even trying to vet them in any serious way.
They were just bringing in numbers, because this was all about politics rather than the substance of keeping America safe.
Speaker 1: Right. Senator, I want to go back to a little over a year ago, and let's just remind people of the Democratic Party and also how we got to the point we're at right now.
Speaker 3: Well, this just shows the overlap we were talking about before, that ideologically, today's Democrat Party refuses to acknowledge the threat of radical Islamic terrorism, refuses to acknowledge radical Islam and that it is fundamentally anti-American. And so in April of twenty twenty-four there were protesters in Dearborn, Michigan, at the International Al Quds Day rally, where they began chanting death to America. And when that happened, so, Dearborn, Michigan is in Rashida Tlaib's district. She is the member of Congress that represents Dearborn, Michigan. You have a rally in her district chanting death to America, and Fox News quite reasonably came and asked her if she would condemn that. Give a watch and give a listen to how she responded.
Speaker 1: Not a great look.
Speaker 4: The mayor condemned it, the White House condemned it. But what about Congresswoman Rashida Tlaib? She reps Dearborn. Is she okay with her constituents chanting death to America? Fox Business correspondent Hillary Vaughan asked her. Watch.
Speaker 3: Congresswoman Tlaib, Fox News. I don't talk to Fox
Speaker 5: News. At a rally in your district, people were chanting death to America.
Speaker 2: Do you condemn... I don't talk to Fox News.
Speaker 3: Do you condemn chants of death to America?
Speaker 6: I don't talk to people that use racist tropes.
Speaker 5: Why can't you just say whether or not you condemn people chanting death to America?
Speaker 3: Why are you afraid to talk to Fox
Speaker 6: News? It's not... listen, using racist tropes for us,
Speaker 3: my community, is what Fox News is about.
Speaker 1: And I don't talk to Fox News. It's death to America.
Speaker 3: Racist? They were chanting death to America.
Speaker 4: Racist, talking about you guys' racist tropes.
Speaker 3: You know you are.
Speaker 1: You guys know exactly what you do.
Speaker 3: I know you're from a pole, but you guys gotta go deal with it.
Speaker 1: You're on your own self.
Speaker 3: You're not gonna use me.
Speaker 1: All right, Senator. You go back to that, and it's just one of those moments where the warning signs are there. We tell you that these warning signs are there, and then it happens, and I hate it that it happens, but it's so predictable now, you know.
Speaker 3: It reminds me, when Obama was president, John Kerry was Secretary of State, there was an editorial cartoon, and it had the Ayatollah in Iran going, death to America, and it had John Kerry saying, well, can we meet you halfway? Yeah, that's the ideological incoherence. Look, when you walk down the halls of Congress, and you've been there with me, Ben, reporters stop you all the time. By the way, I answer questions from Fox News, I answer questions from CNN, I answer questions from the New York Times, I answer questions from every lefty outlet, just walking along, because if you know what you believe, I gotta say, that's not a gotcha question. It's not a difficult question: do you support death to America? If you are a member of the United States Congress, the Congress of the United States of America, one would think it ought to be easy to say, yeah, of course, death to America is wrong, I'm against that. That should not be difficult for anyone. It speaks volumes that Rashida Tlaib was unwilling to even condemn chants of death to America in her own congressional district. That shows just how dangerous the ideological rot is. And tragically, we saw the very real consequences of that ideological rot in letting unvetted Afghan immigrants into this country, one of whom just murdered one D.C. National Guardsman and may have murdered a second.
We pray that he survives, but at this point our prayers are with him because we don't know if he's going to survive the horrific attack.
Speaker 1: Now, if you want to hear the rest of this conversation, you can go back and listen to the full podcast from earlier this week. Now onto story number two.
Speaker 3: So on AI. AI is an issue I care about a lot, and I agree with you that we are in a race. AI is coming, and either the United States is gonna win or China is going to win. And I think the world is much, much worse off if China wins, and it's much better off if America wins. I will tell you, the polling on AI is terrible. AI is unpopular in America. If you do almost any polling, it's about seventy-thirty. People are terrified. They're afraid they're going to lose their jobs. They don't trust it, they're scared of it. And I get that. And in fact, I talk with a lot of the tech leaders and I say, look, if you don't engage on this issue, the easy political outcome is, every Democrat and a number of Republicans are like, AI is horrible, we oppose it. And if you look at the polling, that's the knee-jerk response. I get that fear. I understand. Anytime you have economic dislocation, it is frightening, it is dangerous. But at the end of the day, I'm not a Luddite. I don't think you can stop technological advancement. I'll give you an example. If I could right now destroy every cell phone in America, I would. I think these are evil portals to everything harmful in the world. All three of us are parents. These things invite every horrible force to our children. But we don't live in that world. We can't end that. I'd also like a world with no nuclear weapons. Like, if I could push a button and every nuclear weapon would disappear, I would. But we don't live in that world.
And if we're going to live in a world with nuclear weapons, I sure as heck want to make sure the United States has enough that China and Russia and our enemies can't dominate us. I view AI the same way. Even if you don't want it to happen, it is coming, and one of two outcomes will occur in five to ten years. Either China will have won the race, in which case AI worldwide will reflect China's values, will be totalitarian, will be controlling, will be censoring, will reflect the values of Communist China. Or America will have won, in which case, hopefully, AI will reflect American values of freedom, free enterprise, free speech. That is a massive shift between those two worlds. World two is much better, where America wins. Two questions. Number one, do you agree with that? And number two, if so, how does the politics change? Do you agree with my point that right now the politics is against AI and that is dangerous?
Speaker 6: Yes, I agree. First of all, on point one, I agree with everything you said about which world we want to live in. Number two, I agree with you on the polling, and I think people have a healthy fear of how this is going to upend their life, which is a reasonable fear. And so the next iteration of this march towards AI is going to have to be not how it's going to make your life worse, but how it's going to make your life better. How has every other technological evolution ultimately made your life better? And so that explanation has to come. And then point four is just going to be, how are we going to power it all? And, you know, are we going to be able to produce enough energy to do what we have to do now?
Speaker 1: Plus what we have.
Speaker 3: To do. AI is energy, and if we don't unleash energy, we cannot win the race.
Speaker 6: For AI,
one other issue on this that is on my mind is how are we going to build enough of these data centers to win this thing. I'm noticing in local communities in a lot of places, folks are rising up against data center development.
Speaker 3: So the polling is terrible in most communities. But I will say, by the way, one of the few exceptions is sort of West Texas. You've got West Texas and the Panhandle, and you're seeing a ton of data centers come in there, and Texans are like, all right, we want jobs, we want investment, we want billions in investment. And there'll be jobs for construction, there'll be jobs for running the data centers. Look, I think in dense urban areas, people are afraid. They're afraid it will suck power, it will drive up their electricity prices, it will suck water. And I get those fears. Look, anyone, if you're given a choice, do you want something that will make your life worse or better? People are naturally going to say, I don't want something that makes my life worse.
Speaker 1: By the way, Senator, I had a guy the other day in the AI industry that said to me, he said, the hardest part for them now is overcoming what you're describing, because he said their internal polling says that the fear in a lot of parts of the country right now is at the level of where it was with nuclear reactors back in the eighties.
Speaker 3: Yes, interesting. Now, there's been a lot of demagoguery, the demagoguery is there, but I will say the fears of AI are real and justified. All right, look, with every great technological change, there's been economic dislocation. So when the automobile was invented, the horse-and-buggy industry was decimated, and if you were in the buggy business, you were screwed. The difference with AI is twofold. Number one, the volume is greater.
I think there are going to be more jobs that are going to be threatened by AI than have been in previous technological innovations. Number two, it's a different contour of jobs. So often in the past it's been blue-collar jobs that have been at risk. Many of the jobs that are at risk with AI are white-collar jobs. If you look at Mamdani, part of what elected Mamdani is you have young college graduates who have two hundred thousand dollars in college debt, who were told, I can go and be an investment banker or a consultant or an accountant, and they're seeing those jobs being eliminated as AI is replacing thousands and thousands of those jobs. That's a level of, I guess I would call it, elite discontent that is politically complicated. And I get it, those concerns are very real. If you've gone and worked hard and you've got a degree and you took out loans and you were told this is what I got to do, and then suddenly you come out and you can't get a job, you're pissed. Like, that's a very real concern. And so I actually think as policymakers, as people in elected office, we need to have real solutions to that problem, because that discontent is not illegitimate, but it is driving much of this resistance, and it's...
Speaker 6: A very real battle we've got. Well, and you have now, for, you know, many years, people teaching their children, here's the course you've got to get on: do well in school, go to college, you can get one of these white-collar jobs. And, you know, for a few generations we've been telling people this is the path. It may not be the path anymore. We're going to have to reorient how we're educating kids, what we're teaching them to do, what other skills can we teach them to do. I agree with you on the white-collar versus blue-collar issue. Also creatives. I mean,
I've seen AI apps in a matter of seconds write songs that sounded like they had a whole team of musicians and writers in California putting them together over weeks. And it literally happened on my phone in a matter of seconds.
Speaker 3: By the way, I'll tell you, in politics, it's interesting. I'll pull up AI and I'll ask something like, what has Cruz said on the following issue? And listen, I've been doing this a long time, so I don't remember every comment I've given in every newspaper interview ten years ago. And in like four seconds it comes out with, he said this here, he said this here, and it traces it, and it's like...
Speaker 1: It's insane.
Speaker 3: It is insane, the amount of power you have on your cell phone.
Speaker 6: Oh yeah. I mean, and it's basically free. I mean, information, for the first time in human history, is basically accessible to all and free to all and instant. And yeah, I can see why the polling on this is bad. Now, the positive outlook here is, how can we use these tools to make your life easier, make you more productive, and maybe make you more economically successful? Maybe you haven't conceived of how that will work yet, but that's the challenge of these companies that are developing the technology.
Speaker 1: As before, if you want to hear the rest of this conversation on this topic, you can go back and download the podcast from earlier this week to hear the entire thing. I want to get to big story number three of the week you may have missed. Senator, on Wednesday you were at the White House for a really cool moment, and that was President Trump saying, I want government to get out of your way and stop making things so expensive for you, specifically cars, having a reset of CAFE standards that were out of control. That is going to lower the cost of cars for many Americans.
We're also getting rid of the obsession with a mandate for electric vehicles, which the Democrats were hell-bent on, and this is going to, again, make cars more affordable for Americans. Everyone should be excited about that.
Speaker 3: Well, that's exactly right. After spending Monday at the White House with the President while he was signing my legislation tripling the monthly stipend for Medal of Honor recipients, after spending Tuesday at the White House with the President, Michael and Susan Dell giving six and a quarter billion dollars to the kids of America for Trump accounts, on Wednesday I joined the President again, this time with the CEOs of the major auto companies in the United States, as the President was implementing legislation, a provision, again, that I wrote, zeroing out the CAFE standards. The CAFE standards were the federal mileage requirements, and the Biden administration jacked them up so high that, number one, it made cars much, much more expensive, but number two, it was part of their effort to eliminate the internal combustion engine, to force everyone to buy an electric car whether they wanted to or not. And as part of the One Big Beautiful Bill, we zeroed that out entirely. I wrote that provision, and on Wednesday the President was implementing it. Here, I want you to give a listen to the President describing the historic steps he took on Wednesday.
Speaker 2: From day one, I've been taking action to make buying a car more affordable. I signed an executive order to end the unfair, expensive electric vehicle mandate. As you know, we had to have an electric car within a very short period of time, even though there was no way of charging them, and lots of other things. It would have cost, of course, five trillion dollars to build the charging plants, and as you know, in certain parts of the Midwest, to build nine chargers they spent eight billion dollars. So that wasn't working out too well. That was done before me.
By the way, I wouldn't have let it go forward. We're canceling the EPA's absurd tailpipe emission standards, one of the most important things. I've never had a group of people come to me more powerfully, and really just devastated that they had to do it. It was killing them, the automobile manufacturers, the tailpipe emission standards. And I can tell you, your people at Ford were coming to me all the time and they were saying, like, please, it doesn't do anything, and it's killing us, and it's driving the costs through the roof. And we revoked Biden's emissions waiver for California, so that California communists could not regulate the automobile industry and ruin the entire nation of automobiles. And they were doing that too, but we have that now under control, also with your governor, who's got much more than he can control. Now, under the new rules being issued today by Secretary Duffy, the Department of Transportation will rescind the Biden fuel economy standards. And I hate to say that, because they were really not economy. They were really, they were anti-economy. They were horrible, what they were doing to the costs, and actually making the car much worse. But these policies forced automakers to build cars using expensive technologies that drove up costs, drove up prices, and made the car much worse. The action is expected to save the typical consumer at least one thousand dollars off the price of a new car, and we think substantially more than that.
Speaker 1: Senator, you hear him talk about this. This is exactly what I think so many Americans voted for. It's government actually not mandating things. It's government getting out of the way of insane regulations to make your life easier. Just give me a little bit of my freedom back. Let the private sector do what the private sector does. Stop putting all this government agenda on top of me, and let me do what I want to do. If I want to buy an EV, I'll buy an EV.
But don't force me to do something. And don't make these standards so insane that you raise the costs of cars to a point where I can't even afford one. It's your choice.
Speaker 3: It should be your choice. If you decide you want to go buy a Tesla, Teslas are amazing automobiles, you can go do that. But if you decide you want to drive your F-150, that ought to be your choice too. And this legislation and what the President did this week benefited consumers, lowered prices, made cars and trucks a lot cheaper. It also made them safer. It saved lives. It also helped produce jobs, thousands and thousands of jobs, as billions of dollars are being invested by car manufacturers in the United States. We're bringing blue-collar jobs back to America. Ben, here's how I explained that in the Oval Office, standing next to the President. Give a listen.
Speaker 2: Anybody else want to say a few words?
Speaker 5: Oh, did I? Did I?
Speaker 3: Generally, I'm easy to miss. Well, Mr. President, I want to congratulate you on behalf of everyone here for the leadership that you're showing. And this is a victory today for consumers. This is a victory for affordability. Your critics like to say the word affordability. Under Joe Biden, the Democrats, they put mandate after mandate after mandate on cars and trucks, and they drove the price up thousands and thousands of dollars. And with the actions you're taking today, and you put on top of that the One Big Beautiful Bill, you know, most of the senators here are on the Senate Commerce Committee, and on the Commerce Committee we worked together to zero out the CAFE standards. We wrote that into the law, that the CAFE standards went to zero. What does that mean? It means you can now, as a consumer, buy the car you want.
It also means people's lives will be safer, because what these regulations did is they forced cars to be more expensive and made of plastic instead of steel, because you had to make them lighter to comply with these standards. You'd get in a wreck and people would die. The result of what you're doing is you're literally saving people's lives, and you're making it where families can afford to get a new car. These actions will drop the cost of cars and trucks thousands of dollars. That makes a real difference. And I'll make one final point. You know, about half the members here are car dealers, and I will say, my good friend Roger Williams out in the lobby, he tried to sell me a car, which didn't surprise me. But I just want to give props to Mike Kelly. He showed he was savvier, because he tried to sell you a car, and you can afford a lot nicer cars than I can. So well done, Mike. Roger, you just went to the wrong customer.
Speaker 2: I was one hundred dollars, chief.
Speaker 5: The thing that Ted said, it's true, plastic instead of steel. Think of that, and people died because of that. The plastic, in an auto accident, it broke up and people were shattered because of that crazy deal.
Speaker 2: But it's a really good point.
Speaker 3: And, Mr. Vice President, I'll tell you, the Senate Commerce Committee, on January fourteenth, we're going to have a hearing with all of the Big Three and Tesla, and the entire hearing is going to focus on how your leadership has reduced the burdens on carmakers. That's lowering costs, that's getting consumers more choices, and it's producing more jobs in America, and it highlights that. That's a real record of success.
Speaker 1: Not only is it a record of success, but as you mentioned there and highlighted, and the President agreed with you, it's an issue of safety, Senator.
Speaker 3: Yeah, yeah. No, I mean, these CAFE standards were making cars much, much more dangerous.
And we'll never know who, but there are people who will live now because of this change, because they'll be able to be in safer cars. Some of it is they may be able to afford a new car now, because this drives down the price of new cars and new trucks. They may sell their old used car and buy a new car because it makes it more affordable, and new cars typically have more safety features, and that enhances their safety. But second, if you're not having to make these cars by removing all the steel, you're much more likely to survive an accident if you're in a car that has some weight, has some steel, and doesn't just crumple up like a beer can.
Speaker 1: As always, thank you for listening to Verdict with Senator Ted Cruz, Ben Ferguson with you. Don't forget to download my podcast. You can listen to my podcast every other day you're not listening to Verdict, or each day when you listen to Verdict, afterwards. I'd love to have you as a listener to, again, the Ben Ferguson Podcast, and we will see you back here on Monday morning.