1 00:00:00,120 --> 00:00:03,360 Speaker 1: Welcome to today's edition of the Clay Travis and Buck 2 00:00:03,400 --> 00:00:08,039 Speaker 1: Sexton Show podcast. Welcome in to the Wednesday edition, Clay Travis 3 00:00:08,039 --> 00:00:10,879 Speaker 1: Buck Sexton Show. Appreciate all of you hanging out with us. 4 00:00:10,960 --> 00:00:16,360 Speaker 1: We have got a bevy of stories to dive into, including: 5 00:00:16,840 --> 00:00:20,320 Speaker 1: the State of Kentucky has picked a nominee to run this 6 00:00:20,520 --> 00:00:25,200 Speaker 1: November against Andy Beshear, the Democrat governor who's been one 7 00:00:25,239 --> 00:00:28,280 Speaker 1: of the worst governors on COVID in the entire country. 8 00:00:28,400 --> 00:00:32,680 Speaker 1: Daniel Cameron wins big. We will see whether Daniel Cameron 9 00:00:32,800 --> 00:00:35,760 Speaker 1: can take down Andy Beshear. We may talk to Daniel 10 00:00:35,760 --> 00:00:40,520 Speaker 1: Cameron today, but we begin, Buck, with what I think 11 00:00:40,840 --> 00:00:46,360 Speaker 1: is probably the most consequential aspect of media as it 12 00:00:46,440 --> 00:00:51,360 Speaker 1: pertains to the upcoming twenty twenty four election, and that 13 00:00:51,560 --> 00:00:54,640 Speaker 1: is: we all know that there was a big tech 14 00:00:54,880 --> 00:00:58,920 Speaker 1: rig job in effect to help drag Joe Biden across 15 00:00:59,080 --> 00:01:01,760 Speaker 1: the finish line. You don't even have to be 16 00:01:02,240 --> 00:01:05,520 Speaker 1: a dyed-in-the-wool Republican or even a Trump 17 00:01:05,640 --> 00:01:12,360 Speaker 1: voter to look at the basic facts and say Facebook, Twitter, Instagram, 18 00:01:12,800 --> 00:01:16,480 Speaker 1: every single big tech platform with any kind of substantial 19 00:01:16,520 --> 00:01:19,880 Speaker 1: audience was all pulling in the direction of Joe Biden 20 00:01:19,920 --> 00:01:22,760 Speaker 1: to be elected, and they were willing to do whatever 21 00:01:22,800 --> 00:01:26,679 Speaker 1: they could on their platforms to help make that happen, 22 00:01:26,720 --> 00:01:31,520 Speaker 1: including certainly keeping the Hunter Biden story from ever seeing 23 00:01:31,600 --> 00:01:34,679 Speaker 1: the light of day. One reason that we have talked 24 00:01:34,720 --> 00:01:38,560 Speaker 1: so much about Elon Musk buying Twitter is because it 25 00:01:38,760 --> 00:01:42,600 Speaker 1: changes the game just in terms of having someone who 26 00:01:42,680 --> 00:01:47,560 Speaker 1: is willing to allow natural, honest conversations to take place 27 00:01:47,800 --> 00:01:51,280 Speaker 1: on Twitter. Now, I'm not saying that everything Elon Musk 28 00:01:51,320 --> 00:01:54,240 Speaker 1: has done is perfect. I'm not even saying that there's 29 00:01:54,280 --> 00:01:57,720 Speaker 1: going to be a great business there or that somehow 30 00:01:57,840 --> 00:02:03,280 Speaker 1: Twitter is going to flourish with Elon Musk in control. 31 00:02:03,560 --> 00:02:06,440 Speaker 1: But if you look at what Elon Musk is saying, 32 00:02:06,880 --> 00:02:09,840 Speaker 1: and if you look at the basic outlines of what 33 00:02:10,120 --> 00:02:15,760 Speaker 1: exactly is going on, this becomes a pretty significant aspect 34 00:02:16,200 --> 00:02:19,880 Speaker 1: of counterintel in many ways. And I want to 35 00:02:20,040 --> 00:02:23,800 Speaker 1: play many of these sound bites from an interview that 36 00:02:23,840 --> 00:02:27,560 Speaker 1: Elon Musk did yesterday with CNBC, and I want to 37 00:02:27,600 --> 00:02:31,880 Speaker 1: start with the most consequential, I think, which is him 38 00:02:32,000 --> 00:02:35,920 Speaker 1: directly saying on CNBC that the suppression of the Hunter 39 00:02:35,960 --> 00:02:40,280 Speaker 1: Biden laptop story is election interference.
And I want to 40 00:02:40,280 --> 00:02:44,079 Speaker 1: play this audio for you because, unless I am wrong, 41 00:02:44,960 --> 00:02:47,040 Speaker 1: and I don't believe I am, I don't 42 00:02:47,080 --> 00:02:51,720 Speaker 1: believe any big tech executive has ever publicly acknowledged that the 43 00:02:51,760 --> 00:02:55,440 Speaker 1: suppression of the Hunter Biden story was election interference. We hear all 44 00:02:55,480 --> 00:02:58,200 Speaker 1: the time about the sanctity of elections in this country. 45 00:02:58,480 --> 00:03:00,640 Speaker 1: We know what the FBI is doing, we know what 46 00:03:00,680 --> 00:03:03,600 Speaker 1: the Department of Justice is doing. We know all of 47 00:03:03,639 --> 00:03:07,440 Speaker 1: the cheating that is going on with Joe Biden's administration, 48 00:03:07,600 --> 00:03:10,600 Speaker 1: both to be elected in twenty twenty and potentially to 49 00:03:10,600 --> 00:03:13,919 Speaker 1: be re-elected in twenty twenty four. Elon Musk's purchase 50 00:03:14,040 --> 00:03:17,520 Speaker 1: of Twitter is the most clarion sound so far that 51 00:03:17,600 --> 00:03:21,560 Speaker 1: we may have some fairness in big tech stories, but 52 00:03:21,600 --> 00:03:25,560 Speaker 1: here is Elon Musk directly saying on CNBC this was 53 00:03:25,600 --> 00:03:29,040 Speaker 1: suppression and it was election interference, and this is super 54 00:03:29,080 --> 00:03:32,079 Speaker 1: important to have a big tech executive saying. Listen. 55 00:03:32,000 --> 00:03:34,880 Speaker 2: You do some tweets that seem to be, or at 56 00:03:34,960 --> 00:03:38,880 Speaker 2: least give support to, some of what others would call conspiracy theories. 57 00:03:39,320 --> 00:03:43,840 Speaker 3: Well, yes, but I mean honestly, you know, some of 58 00:03:43,840 --> 00:03:46,440 Speaker 3: these conspiracy theories have turned out. 59 00:03:46,320 --> 00:03:46,800 Speaker 2: To be true.
60 00:03:47,320 --> 00:03:51,640 Speaker 3: Which ones? Well, like the Hunter Biden laptop, that's true. 61 00:03:51,840 --> 00:03:54,040 Speaker 4: Yeah, Yeah, so you. 62 00:03:54,000 --> 00:03:56,080 Speaker 3: Know that was a pretty big deal. 63 00:03:56,120 --> 00:03:59,440 Speaker 4: There was Twitter and others. 64 00:03:59,160 --> 00:04:01,680 Speaker 3: Engaged in active suppression of the information that was relevant to 65 00:04:01,720 --> 00:04:04,680 Speaker 3: the public. That's a terrible thing that happened. 66 00:04:04,720 --> 00:04:09,120 Speaker 4: That's election interference. It is election interference. Yes. And 67 00:04:09,160 --> 00:04:12,000 Speaker 4: the problem that we are now facing, and I 68 00:04:12,040 --> 00:04:14,200 Speaker 4: think everyone needs to understand: there was a phrase that 69 00:04:14,240 --> 00:04:16,880 Speaker 4: was being used a bit more on the right recently, 70 00:04:16,920 --> 00:04:19,200 Speaker 4: and then it kind of fell out of favor. You know, 71 00:04:19,240 --> 00:04:21,240 Speaker 4: do you know what time it is? And to know 72 00:04:21,240 --> 00:04:24,000 Speaker 4: what time it is right now is to understand that 73 00:04:24,080 --> 00:04:27,719 Speaker 4: the people who were pushing the craziest stuff over COVID, 74 00:04:28,200 --> 00:04:31,560 Speaker 4: the people who pushed the Hunter Biden laptop lie, the 75 00:04:31,600 --> 00:04:37,440 Speaker 4: fifty-one former intelligence officers, and really intelligence agency heads, 76 00:04:37,440 --> 00:04:39,840 Speaker 4: most of them, or a lot of them. The people 77 00:04:39,880 --> 00:04:42,040 Speaker 4: who have done so many of these things, Clay, they 78 00:04:42,080 --> 00:04:44,760 Speaker 4: don't care that they were wrong, right? There is no 79 00:04:44,800 --> 00:04:47,320 Speaker 4: integrity to protect, there is no honor that they seek 80 00:04:47,400 --> 00:04:50,320 Speaker 4: to regain. It was useful to them in the moment.
81 00:04:50,400 --> 00:04:53,039 Speaker 4: They would do it again. And you see that with 82 00:04:53,680 --> 00:04:57,279 Speaker 4: the Durham report that just dropped. There's 83 00:04:57,320 --> 00:04:59,400 Speaker 4: really no one who's coming forward to say, how is 84 00:04:59,400 --> 00:05:04,160 Speaker 4: it possible the entirety of the Democrat commie media apparatus 85 00:05:04,640 --> 00:05:09,239 Speaker 4: got this thing so stunningly wrong? And no one comes 86 00:05:09,240 --> 00:05:12,840 Speaker 4: forward to say, you know, honestly, I fell for it, 87 00:05:12,960 --> 00:05:15,400 Speaker 4: and I'm sorry, and I won't do it again. No 88 00:05:15,440 --> 00:05:19,080 Speaker 4: one apologizes on the left for being wrong anymore, because 89 00:05:19,120 --> 00:05:22,120 Speaker 4: being right isn't the point. Being right isn't the purpose. 90 00:05:22,320 --> 00:05:25,279 Speaker 4: They have chosen to live by lies. And I think 91 00:05:25,320 --> 00:05:28,680 Speaker 4: everybody has to understand that's what time it is now, 92 00:05:28,720 --> 00:05:31,880 Speaker 4: this is what we are up against. And Elon knows 93 00:05:31,920 --> 00:05:34,560 Speaker 4: it and understands this as a threat really to not 94 00:05:34,640 --> 00:05:37,279 Speaker 4: just America, but Western civilization as we know it, the 95 00:05:37,320 --> 00:05:40,320 Speaker 4: notion of free speech and at least the freedom of 96 00:05:40,360 --> 00:05:44,080 Speaker 4: the press at some level. And we're entering a different phase. 97 00:05:44,120 --> 00:05:46,640 Speaker 4: We are in a post-journalism phase of America. 98 00:05:46,720 --> 00:05:50,159 Speaker 1: Now, what about the fact that all of those Pulitzer 99 00:05:50,240 --> 00:05:54,599 Speaker 1: Prizes are still outstanding and journalists are still bragging about 100 00:05:54,640 --> 00:05:58,960 Speaker 1: the ones associated with the Russia collusion?
Why, in an honest world? 101 00:05:59,000 --> 00:06:00,360 Speaker 1: I don't know if you pay a lot of attention 102 00:06:00,400 --> 00:06:03,000 Speaker 1: to this, Buck, but if they find out that you 103 00:06:03,120 --> 00:06:06,480 Speaker 1: cheated in baseball, right, if they find out that you used steroids, 104 00:06:06,680 --> 00:06:09,160 Speaker 1: they won't vote you into the Hall of Fame, because 105 00:06:09,200 --> 00:06:13,000 Speaker 1: they believe that the numbers you posted over your 106 00:06:13,120 --> 00:06:16,960 Speaker 1: career were based on a lie, that you were using 107 00:06:17,040 --> 00:06:19,600 Speaker 1: banned substances, that you were using steroids. And there's lots 108 00:06:19,600 --> 00:06:21,400 Speaker 1: of examples of this in the Baseball Hall of Fame. 109 00:06:21,400 --> 00:06:23,040 Speaker 1: If you're a baseball fan, you know exactly what I'm 110 00:06:23,040 --> 00:06:26,560 Speaker 1: talking about. Even Congress had hearings on this. Why are 111 00:06:26,640 --> 00:06:29,680 Speaker 1: there no consequences for the Pulitzers? If you win a 112 00:06:29,720 --> 00:06:35,159 Speaker 1: Pulitzer Prize for a story that later is proven to 113 00:06:35,360 --> 00:06:41,520 Speaker 1: be a falsehood, isn't the very basic element of the 114 00:06:41,600 --> 00:06:45,720 Speaker 1: prize itself rooted in journalistic accuracy, at least in theory? 115 00:06:45,800 --> 00:06:48,240 Speaker 1: And wouldn't you have to pull them if you cared 116 00:06:48,240 --> 00:06:49,000 Speaker 1: about that at all? 117 00:06:49,279 --> 00:06:52,240 Speaker 4: So this is what I'm getting at in terms of 118 00:06:52,240 --> 00:06:54,960 Speaker 4: the perception shift, and I know you see it.
I 119 00:06:55,000 --> 00:06:59,280 Speaker 4: want everyone at home to understand that journalism now operates 120 00:06:59,800 --> 00:07:03,640 Speaker 4: like a country club where they're rejecting certain people, but 121 00:07:03,680 --> 00:07:06,400 Speaker 4: they'll never say why, because they ultimately know, like, you know, 122 00:07:06,440 --> 00:07:09,440 Speaker 4: we only want, you know, certain people to join. And 123 00:07:09,840 --> 00:07:12,120 Speaker 4: that's, now when you look at the Pulitzers, this 124 00:07:12,200 --> 00:07:15,760 Speaker 4: is an insider's club that is a bunch of activists. 125 00:07:16,000 --> 00:07:18,880 Speaker 4: They don't actually exist day in and day out to 126 00:07:19,040 --> 00:07:21,760 Speaker 4: provide people with the facts. To speak, I mean, to 127 00:07:21,760 --> 00:07:24,160 Speaker 4: speak truth to power? It's a joke. Right? These 128 00:07:24,200 --> 00:07:27,600 Speaker 4: people who are supposed to be covering the Biden administration 129 00:07:27,680 --> 00:07:29,840 Speaker 4: now, and it was the same thing under the Obama administration, 130 00:07:30,280 --> 00:07:32,240 Speaker 4: they are the handmaidens of power. I mean, they are 131 00:07:32,320 --> 00:07:35,720 Speaker 4: people who go above and beyond to spread the propaganda 132 00:07:35,840 --> 00:07:39,960 Speaker 4: of the left, and to do so gleefully. And, you know, 133 00:07:40,520 --> 00:07:43,480 Speaker 4: this is, I think, at least, Clay, the clarifying moment 134 00:07:43,520 --> 00:07:46,760 Speaker 4: we're in. It's really good when someone like Elon comes 135 00:07:46,800 --> 00:07:50,720 Speaker 4: forward and says, uh, it's not about... do we have 136 00:07:50,720 --> 00:07:52,320 Speaker 4: the clip, by the way? We're talking about the clip 137 00:07:52,360 --> 00:07:55,440 Speaker 4: about money. I think it's significant too. Yeah, I'll definitely 138 00:07:55,440 --> 00:07:58,000 Speaker 4: play it. So Elon was pushed.
I have a 139 00:07:58,040 --> 00:08:01,200 Speaker 4: couple things on this. One is journalists get so defensive 140 00:08:01,240 --> 00:08:04,560 Speaker 4: about Soros all the time, and I really mean this. 141 00:08:04,560 --> 00:08:07,760 Speaker 4: This is a foreign-born billionaire who has decided to 142 00:08:07,840 --> 00:08:14,120 Speaker 4: make American cities more dangerous. More people have been, you know, murdered, raped, mugged, assaulted, 143 00:08:14,280 --> 00:08:18,160 Speaker 4: robbed because of policies that no one ever even thought 144 00:08:18,200 --> 00:08:20,800 Speaker 4: to pursue in the way that Soros has. And this 145 00:08:20,840 --> 00:08:23,200 Speaker 4: is a matter of public record. Journalists are always, 146 00:08:23,920 --> 00:08:26,880 Speaker 4: why are you picking on Soros? And it's because I 147 00:08:26,920 --> 00:08:30,560 Speaker 4: think ultimately they have some affinity for Soros, and also 148 00:08:30,600 --> 00:08:33,120 Speaker 4: all the foundations that he funds, including things that are 149 00:08:33,160 --> 00:08:35,839 Speaker 4: just meant to attack people on the right. Soros money 150 00:08:35,880 --> 00:08:39,480 Speaker 4: does go to those entities and organizations. But beyond that, 151 00:08:39,600 --> 00:08:43,160 Speaker 4: Clay, this notion that someone like Elon Musk might have 152 00:08:43,280 --> 00:08:47,040 Speaker 4: some greater purpose, might have something in mind beyond just 153 00:08:47,640 --> 00:08:52,240 Speaker 4: getting the chattering classes clinking their champagne glasses in order 154 00:08:52,280 --> 00:08:56,439 Speaker 4: to talk about how great he is. Here, here's Elon 155 00:08:56,679 --> 00:09:00,440 Speaker 4: when he's pushed by this CNBC journalist and he just says, look, 156 00:09:00,480 --> 00:09:03,360 Speaker 4: there's something more. Play four. 157 00:09:03,760 --> 00:09:06,840 Speaker 2: Do your tweets hurt the company?
Are there Tesla owners 158 00:09:06,840 --> 00:09:09,520 Speaker 2: who say, I don't agree with his political position? And 159 00:09:09,640 --> 00:09:11,679 Speaker 2: I know it because he shares so much of it. 160 00:09:12,080 --> 00:09:14,439 Speaker 2: Or are there advertisers on Twitter that Linda Yaccarino will 161 00:09:14,480 --> 00:09:17,040 Speaker 2: come and say, you gotta stop, man? Or, you know, 162 00:09:17,320 --> 00:09:19,200 Speaker 2: I can't get these ads because of some of the 163 00:09:19,240 --> 00:09:20,040 Speaker 2: things you tweet. 164 00:09:20,400 --> 00:09:22,640 Speaker 3: You know, I'm reminded of a scene in The 165 00:09:22,640 --> 00:09:26,560 Speaker 3: Princess Bride where he confronts the person who killed his 166 00:09:26,640 --> 00:09:31,800 Speaker 3: father and he says, offer me money, offer me power. 167 00:09:32,280 --> 00:09:33,280 Speaker 3: I don't care. 168 00:09:33,920 --> 00:09:34,840 Speaker 4: See, you just don't care. 169 00:09:35,480 --> 00:09:37,400 Speaker 2: You want to share what you have to say. 170 00:09:37,440 --> 00:09:40,679 Speaker 3: I'll say what I want to say, and if 171 00:09:40,679 --> 00:09:42,520 Speaker 3: the consequence of that is losing money, so be it. 172 00:09:43,400 --> 00:09:47,120 Speaker 4: You know, Clay, I tweeted out in June of twenty 173 00:09:47,160 --> 00:09:50,199 Speaker 4: twenty one a thread, and Rush read every word of 174 00:09:50,240 --> 00:09:54,000 Speaker 4: it on his show, so on these airwaves, about how 175 00:09:54,320 --> 00:09:56,199 Speaker 4: if you're a billionaire in this country and you care 176 00:09:56,240 --> 00:09:59,120 Speaker 4: about the country, stop being a wimp. Do something. Step up, 177 00:09:59,160 --> 00:10:01,160 Speaker 4: if you care about freedom, if you care about, not 178 00:10:01,200 --> 00:10:04,280 Speaker 4: even just conservatism. Now there was a lag period.
I'm 179 00:10:04,320 --> 00:10:07,040 Speaker 4: not saying Elon read that tweet, but the idea was there. 180 00:10:07,760 --> 00:10:10,240 Speaker 4: If people who 181 00:10:10,320 --> 00:10:12,720 Speaker 4: aren't worried about paying their bills, aren't worried about what 182 00:10:12,760 --> 00:10:15,680 Speaker 4: happens to the mortgage, don't step up and start doing things, the left wins. So Elon, this 183 00:10:15,720 --> 00:10:18,199 Speaker 4: is why they hate him now so much, because there 184 00:10:18,280 --> 00:10:21,359 Speaker 4: is a rallying effect: other people who are in positions 185 00:10:21,800 --> 00:10:24,960 Speaker 4: where they can say what they want without real material 186 00:10:25,000 --> 00:10:27,160 Speaker 4: consequence to them, other than the New York Times maybe 187 00:10:27,200 --> 00:10:30,200 Speaker 4: criticizing them, other people may want to come forward and say, 188 00:10:30,240 --> 00:10:33,000 Speaker 4: you know what, gender transitions for twelve-year-olds, not 189 00:10:33,040 --> 00:10:36,199 Speaker 4: a good idea. Turning our cities into high-crime hellscapes, 190 00:10:36,320 --> 00:10:39,360 Speaker 4: not a good idea. Using big tech to suppress certain 191 00:10:39,400 --> 00:10:42,200 Speaker 4: thoughts and ideas during a pandemic and to push lies 192 00:10:42,280 --> 00:10:44,360 Speaker 4: during an election, not a good idea. 193 00:10:45,080 --> 00:10:48,959 Speaker 1: Yeah, we agree, and that's why I was so 194 00:10:50,200 --> 00:10:54,360 Speaker 1: happy to hear, joyous almost to hear, that interview with 195 00:10:54,440 --> 00:10:57,120 Speaker 1: Elon Musk when he said that, because we've been hammering 196 00:10:57,160 --> 00:11:00,680 Speaker 1: home the same thing. If you are fortunate enough to 197 00:11:00,840 --> 00:11:02,920 Speaker 1: be very wealthy, and we have a lot of those 198 00:11:02,960 --> 00:11:07,920 Speaker 1: listeners right now, what are you afraid of?
That is 199 00:11:08,000 --> 00:11:12,200 Speaker 1: my question. I understand one hundred billion percent, because 200 00:11:12,240 --> 00:11:14,640 Speaker 1: I've been there. If you have young kids and you're 201 00:11:14,640 --> 00:11:17,360 Speaker 1: worried about paying your mortgage, if you've got a kid 202 00:11:17,400 --> 00:11:21,199 Speaker 1: in college and you're helping to put that kid through college, 203 00:11:21,360 --> 00:11:23,720 Speaker 1: and man, you really have to have your job in 204 00:11:23,840 --> 00:11:27,199 Speaker 1: order to do that, if you're an employee and your 205 00:11:27,240 --> 00:11:30,280 Speaker 1: ability to take care of your family is directly dictated 206 00:11:30,320 --> 00:11:32,960 Speaker 1: by what you make on a week-to-week basis, 207 00:11:33,280 --> 00:11:36,960 Speaker 1: I one hundred percent understand why you may have strong opinions 208 00:11:37,000 --> 00:11:39,320 Speaker 1: but feel like you just got to bite your tongue. 209 00:11:40,320 --> 00:11:44,560 Speaker 1: But if you are worth twenty, thirty, forty million dollars 210 00:11:44,640 --> 00:11:49,040 Speaker 1: or more and you aren't willing to say exactly what 211 00:11:49,120 --> 00:11:53,199 Speaker 1: you think, what are you afraid of? And why are 212 00:11:53,240 --> 00:11:57,400 Speaker 1: you such a coward? And so, whether you agree 213 00:11:57,520 --> 00:11:59,520 Speaker 1: or disagree with Elon Musk, I mean, I think he 214 00:11:59,520 --> 00:12:02,760 Speaker 1: would acknowledge, hey, at forty-four billion dollars, I overpaid 215 00:12:02,760 --> 00:12:06,880 Speaker 1: for Twitter. But Elon Musk is a long-horizon guy. 216 00:12:06,920 --> 00:12:09,320 Speaker 1: I mean, he wants to put people on Mars and 217 00:12:09,400 --> 00:12:12,000 Speaker 1: he wants to leave, 218 00:12:12,320 --> 00:12:15,120 Speaker 1: I really do genuinely believe this, humanity in a better place than humanity was before he 219 00:12:15,200 --> 00:12:17,920 Speaker 1: was born. I think that should be the goal of anyone, 220 00:12:18,040 --> 00:12:20,320 Speaker 1: certainly of anyone who is super wealthy. I would hope 221 00:12:20,320 --> 00:12:22,439 Speaker 1: that that would be their goal. What are you waiting for, 222 00:12:23,280 --> 00:12:26,200 Speaker 1: sitting quietly and meekly on the sideline with all of 223 00:12:26,240 --> 00:12:31,199 Speaker 1: your bank balance behind you? Why are you not weighing in? 224 00:12:31,960 --> 00:12:35,079 Speaker 1: You have nothing to be afraid of. No one can 225 00:12:35,120 --> 00:12:37,080 Speaker 1: do anything to you or your family. You will be 226 00:12:37,120 --> 00:12:38,960 Speaker 1: taken care of, and they will be taken care of 227 00:12:39,320 --> 00:12:42,319 Speaker 1: long into the future, for generations that you can't even see. 228 00:12:44,040 --> 00:12:46,720 Speaker 1: But will the country still be able to stand for 229 00:12:46,800 --> 00:12:50,280 Speaker 1: what you believe in, for what you believe has allowed 230 00:12:50,280 --> 00:12:52,320 Speaker 1: you to have that success? That's a real question that 231 00:12:52,360 --> 00:12:54,800 Speaker 1: I think wealthy people have to answer for themselves. Look in 232 00:12:54,840 --> 00:12:56,160 Speaker 1: the mirror and stop being cowards. 233 00:12:56,640 --> 00:12:59,280 Speaker 4: Yeah, sending a check to a think tank and doing 234 00:12:59,280 --> 00:13:01,080 Speaker 4: a little nice charity work on the side, it's 235 00:13:01,120 --> 00:13:03,120 Speaker 4: not going to save the country. It's not gonna save 236 00:13:03,160 --> 00:13:06,600 Speaker 4: the country.
So for the people out there who are 237 00:13:06,640 --> 00:13:08,840 Speaker 4: on the right, or, I would even say, to borrow 238 00:13:08,880 --> 00:13:11,040 Speaker 4: from you, Clay, that are on the side of the sane 239 00:13:11,760 --> 00:13:15,920 Speaker 4: versus the insane. So for the sane billionaires out 240 00:13:15,920 --> 00:13:18,120 Speaker 4: there, or, you know, hundred-millionaires, or just people 241 00:13:18,120 --> 00:13:20,560 Speaker 4: who know their material needs are met, this is what 242 00:13:20,559 --> 00:13:22,600 Speaker 4: I would say to people. They will call 243 00:13:22,720 --> 00:13:25,160 Speaker 4: or they'll email, they'll say, you know, I want to 244 00:13:25,160 --> 00:13:27,160 Speaker 4: get involved in this fight. I'm gonna speak out at, 245 00:13:27,240 --> 00:13:29,199 Speaker 4: you know, I work at Initech. For those 246 00:13:29,240 --> 00:13:31,319 Speaker 4: of you who are, wasn't that what it was in 247 00:13:31,400 --> 00:13:32,880 Speaker 4: Office Space, Initech? Am I right? 248 00:13:33,600 --> 00:13:34,280 Speaker 1: That's a great question. 249 00:13:34,280 --> 00:13:35,880 Speaker 4: I can't even think if it was called Initech. I 250 00:13:35,880 --> 00:13:37,280 Speaker 4: work at Initech, if that's what it's called 251 00:13:37,320 --> 00:13:39,800 Speaker 4: in Office Space, and I'm gonna speak out at the 252 00:13:39,800 --> 00:13:42,319 Speaker 4: next DEI meeting. No, don't do that. Don't do that, 253 00:13:42,360 --> 00:13:44,640 Speaker 4: it's gonna get you fired. Wait until you're in a 254 00:13:44,640 --> 00:13:47,120 Speaker 4: position to do something. I'm not saying go along with it, 255 00:13:47,400 --> 00:13:50,320 Speaker 4: but don't just charge up the hillside and get mowed 256 00:13:50,360 --> 00:13:52,520 Speaker 4: down by the machine gun, so to speak, because you 257 00:13:52,520 --> 00:13:55,240 Speaker 4: think it's gonna create some hero narrative.
They're just 258 00:13:55,240 --> 00:13:57,680 Speaker 4: going to replace you tomorrow. You're in a 259 00:13:57,679 --> 00:14:00,840 Speaker 4: position sometimes to fight; you're in a position sometimes to rest, 260 00:14:00,880 --> 00:14:04,400 Speaker 4: refit, and recoup, right? And I think, for the people 261 00:14:04,440 --> 00:14:07,600 Speaker 4: out there: Elon is leading the charge 262 00:14:07,600 --> 00:14:10,760 Speaker 4: of people who are saying there are civilizational risks right 263 00:14:10,760 --> 00:14:13,080 Speaker 4: now in America, and for people who don't have to 264 00:14:13,120 --> 00:14:16,000 Speaker 4: worry about paying their bills, sitting on the sidelines and 265 00:14:16,040 --> 00:14:18,320 Speaker 4: being afraid of being called out makes you a coward. 266 00:14:19,000 --> 00:14:20,920 Speaker 4: And I think that's what a lot of us took 267 00:14:21,000 --> 00:14:24,000 Speaker 4: from what he said last night, or 268 00:14:24,040 --> 00:14:26,600 Speaker 4: whenever it was, in this interview, when he said, you 269 00:14:26,600 --> 00:14:28,120 Speaker 4: can give me money, you can give me power. I 270 00:14:28,120 --> 00:14:30,800 Speaker 4: don't care. I think the truth is more important. How 271 00:14:30,800 --> 00:14:35,000 Speaker 4: many people say that these days, Clay? It almost felt revolutionary. 272 00:14:35,360 --> 00:14:39,080 Speaker 1: Yes, I mean, we do it every day on this show. 273 00:14:39,640 --> 00:14:42,800 Speaker 1: But for people who are billionaires who have nothing to risk, 274 00:14:42,840 --> 00:14:44,440 Speaker 1: I mean, with Elon Musk, I mean, I know the 275 00:14:44,520 --> 00:14:46,640 Speaker 1: stock price of Tesla can go up or down every day, 276 00:14:46,640 --> 00:14:49,080 Speaker 1: and it does. But he's worth somewhere in the 277 00:14:49,080 --> 00:14:53,120 Speaker 1: neighborhood of two hundred billion dollars.
You know, he's pretty 278 00:14:53,120 --> 00:14:55,480 Speaker 1: safe in terms of being able to say exactly what 279 00:14:55,560 --> 00:14:57,240 Speaker 1: he wants to say. But there are a lot of people. 280 00:14:57,280 --> 00:14:59,080 Speaker 1: I don't know what the number is, everyone's number is different, 281 00:14:59,120 --> 00:15:01,760 Speaker 1: but if you have twenty, thirty, or forty million dollars, 282 00:15:02,200 --> 00:15:05,960 Speaker 1: you're pretty taken care of, right? Certainly if you have 283 00:15:06,080 --> 00:15:09,200 Speaker 1: billions of dollars or hundreds of millions of dollars. Join 284 00:15:09,240 --> 00:15:11,720 Speaker 1: them and speak out. So many of us are relying 285 00:15:11,840 --> 00:15:15,040 Speaker 1: on our computers and smartphones. Just the idea of losing 286 00:15:15,160 --> 00:15:18,720 Speaker 1: saved data, documents, photos, or, in my case, the book 287 00:15:18,760 --> 00:15:21,040 Speaker 1: I'm working on, gives me chills. Book's going to be 288 00:15:21,040 --> 00:15:23,080 Speaker 1: out in August. I lost it on my computer for 289 00:15:23,160 --> 00:15:25,360 Speaker 1: a while, had to go get it reclaimed. Have you 290 00:15:25,400 --> 00:15:27,920 Speaker 1: ever had that feeling where you're working on a computer, 291 00:15:28,040 --> 00:15:31,000 Speaker 1: you've got a file, and suddenly it vanishes? The next time 292 00:15:31,040 --> 00:15:35,400 Speaker 1: you turn on your computer, imagine it not working. Imagine 293 00:15:35,440 --> 00:15:38,920 Speaker 1: not being able to access anything you've worked on in 294 00:15:38,960 --> 00:15:41,360 Speaker 1: that moment. What would you pay to have all that 295 00:15:41,440 --> 00:15:46,239 Speaker 1: information back? The company IDrive makes that question irrelevant. 296 00:15:46,640 --> 00:15:50,160 Speaker 1: Their systems back up the data on your computer and 297 00:15:50,280 --> 00:15:54,400 Speaker 1: securely store it away.
Eight years in a row, IDrive 298 00:15:54,600 --> 00:15:58,160 Speaker 1: has been awarded the best cloud backup solution by 299 00:15:58,280 --> 00:16:03,560 Speaker 1: PC Magazine. Eight years. That says it all. This is 300 00:16:03,600 --> 00:16:06,560 Speaker 1: the backup system Rush spoke about. It's the same one 301 00:16:06,680 --> 00:16:10,120 Speaker 1: we're now relying on. IDrive is the best in 302 00:16:10,160 --> 00:16:14,880 Speaker 1: the business. Give yourself peace of mind by protecting everything 303 00:16:14,960 --> 00:16:18,160 Speaker 1: on your computer. Plans start at less than seven dollars 304 00:16:18,240 --> 00:16:21,320 Speaker 1: a month. Get ninety percent off your first-year plan 305 00:16:21,440 --> 00:16:24,960 Speaker 1: when you use my name, Clay, as the promo code 306 00:16:25,120 --> 00:16:28,760 Speaker 1: at checkout. IDrive is the easiest, the most secure 307 00:16:28,840 --> 00:16:33,520 Speaker 1: cloud backup solution. IDrive dot com, my name Clay, 308 00:16:33,720 --> 00:16:38,520 Speaker 1: C L A Y, for ninety percent off. The supply chain of 309 00:16:38,640 --> 00:16:44,560 Speaker 1: smarts, sanity, and truth, uninterrupted: Clay Travis and Buck Sexton. 310 00:16:44,920 --> 00:16:47,560 Speaker 5: You link to somebody who's talking about the guy who 311 00:16:47,680 --> 00:16:51,160 Speaker 5: killed children at a mall in Texas, 312 00:16:52,480 --> 00:16:55,120 Speaker 5: you say something like it might be a bad psyop. 313 00:16:55,160 --> 00:16:56,960 Speaker 4: I'm not quite sure what you meant. It was. 314 00:16:57,480 --> 00:17:00,480 Speaker 6: I think incorrectly ascribed to be a white supremacist action. 315 00:17:01,280 --> 00:17:05,879 Speaker 6: And the evidence for that was some obscure Russian website 316 00:17:05,920 --> 00:17:09,080 Speaker 6: that no one's ever heard of that had no followers.
And 317 00:17:09,200 --> 00:17:13,280 Speaker 6: the company that found this is Bellingcat, right? And 318 00:17:13,320 --> 00:17:16,080 Speaker 6: you know what Bellingcat does? Psyops. I'm saying that I 319 00:17:16,080 --> 00:17:18,520 Speaker 6: thought this, the ascribing of it to white supremacy, is 320 00:17:18,680 --> 00:17:19,000 Speaker 6: bogus. 321 00:17:19,320 --> 00:17:21,760 Speaker 4: Okay, there's no proof, by the way, that he was not. 322 00:17:22,440 --> 00:17:24,119 Speaker 3: There's no, I would say that there's no. 323 00:17:24,119 --> 00:17:24,680 Speaker 7: Proof that he is. 324 00:17:25,359 --> 00:17:26,920 Speaker 2: And that's a debate you want to get into on 325 00:17:27,040 --> 00:17:27,600 Speaker 2: Twitter? 326 00:17:28,200 --> 00:17:36,200 Speaker 4: Yes. Elon Musk completely flummoxing a journalist there by applying 327 00:17:36,400 --> 00:17:40,200 Speaker 4: this really rare thing, critical thinking, and a willingness to 328 00:17:40,480 --> 00:17:45,720 Speaker 4: ask the important and even obvious questions. Journalists are like, 329 00:17:45,840 --> 00:17:48,760 Speaker 4: why don't you want to be, Clay, among the sheep? 330 00:17:49,359 --> 00:17:52,080 Speaker 4: The sheep are safe, the sheep are in a herd. 331 00:17:52,720 --> 00:17:53,639 Speaker 4: Just join the sheep. 332 00:17:54,960 --> 00:17:57,960 Speaker 1: It really is staggering how little many of these journalists 333 00:17:58,000 --> 00:18:00,240 Speaker 1: even know about these details, isn't it? 334 00:18:00,520 --> 00:18:02,600 Speaker 4: They don't even care. They're there to push a narrative. 335 00:18:03,440 --> 00:18:06,600 Speaker 4: My fellow gun owners, ammo is really expensive, like so 336 00:18:06,680 --> 00:18:08,560 Speaker 4: many other things these days, but there is a way 337 00:18:08,640 --> 00:18:10,960 Speaker 4: to keep your skills sharp without going to the range.
338 00:18:11,200 --> 00:18:13,879 Speaker 4: It's called the MantisX, a firearms training system that 339 00:18:14,040 --> 00:18:16,800 Speaker 4: is a no-ammo, all-electronic way to improve your 340 00:18:16,840 --> 00:18:19,600 Speaker 4: shooting accuracy. It attaches to your firearm like a 341 00:18:19,680 --> 00:18:22,600 Speaker 4: weapon light, and then you connect your iPhone, or rather 342 00:18:22,640 --> 00:18:25,720 Speaker 4: the iPhone or the Android you've got, to 343 00:18:25,800 --> 00:18:29,800 Speaker 4: the MantisX app. This is called dry fire practice. It's 344 00:18:29,920 --> 00:18:32,119 Speaker 4: something you can do at home or at the range, 345 00:18:32,640 --> 00:18:35,119 Speaker 4: and MantisX guides you through drills and courses to 346 00:18:35,200 --> 00:18:38,760 Speaker 4: give you data-driven, real-time feedback on your technique. 347 00:18:39,080 --> 00:18:42,600 Speaker 4: Most users improve within twenty minutes of using MantisX. 348 00:18:42,880 --> 00:18:45,879 Speaker 4: I know I have. Start improving your shooting accuracy today. 349 00:18:45,960 --> 00:18:48,240 Speaker 4: Get more out of every trip to the range and 350 00:18:48,480 --> 00:18:51,000 Speaker 4: every round you fire, because your skills are going to 351 00:18:51,040 --> 00:18:55,159 Speaker 4: be sharp. Go to mantisx dot com. That's M A N 352 00:18:55,480 --> 00:18:57,920 Speaker 4: T I S X dot com. 353 00:18:58,480 --> 00:19:01,760 Speaker 1: Welcome back in, Clay Travis Buck Sexton Show. Appreciate all 354 00:19:01,840 --> 00:19:06,840 Speaker 1: of you hanging out with us.
I feel like the 355 00:19:06,960 --> 00:19:11,960 Speaker 1: Donald Trump Sixty Minutes interview with Lesley Stahl that many 356 00:19:12,040 --> 00:19:15,200 Speaker 1: of you remember back in October of twenty twenty, that 357 00:19:15,359 --> 00:19:18,080 Speaker 1: went off the rails, and they claimed, oh, he 358 00:19:18,280 --> 00:19:21,400 Speaker 1: stormed out of the interview, and then the Trump campaign 359 00:19:21,560 --> 00:19:24,359 Speaker 1: had their own footage behind the scenes of the Sixty 360 00:19:24,440 --> 00:19:28,639 Speaker 1: Minutes interview and they released it. Everything that Trump said 361 00:19:29,440 --> 00:19:34,560 Speaker 1: to Lesley Stahl has since been proven to be essentially 362 00:19:35,160 --> 00:19:38,560 Speaker 1: one hundred percent correct. Remember, he was telling her all 363 00:19:38,560 --> 00:19:41,639 Speaker 1: about the Hunter Biden laptop and the fact that his 364 00:19:42,119 --> 00:19:45,440 Speaker 1: campaign had been spied on in twenty sixteen, and Lesley 365 00:19:45,560 --> 00:19:50,320 Speaker 1: Stahl was the AI chatbot of journalists in that she 366 00:19:50,520 --> 00:19:53,040 Speaker 1: just responded, there's no proof, we can't talk about that, 367 00:19:53,200 --> 00:19:56,960 Speaker 1: there's no way it's real. Listen to this as a 368 00:19:57,040 --> 00:20:00,680 Speaker 1: flashback now that the Durham report has come out. This 369 00:20:00,840 --> 00:20:05,119 Speaker 1: is in October of twenty twenty, Lesley Stahl being told 370 00:20:05,800 --> 00:20:09,000 Speaker 1: by Donald Trump that his campaign was spied on. And 371 00:20:09,200 --> 00:20:13,160 Speaker 1: listen to her just parrot back left-wing Democrat 372 00:20:13,320 --> 00:20:15,240 Speaker 1: media propaganda talking points. 373 00:20:15,320 --> 00:20:18,840 Speaker 5: Listen, the biggest scandal was when they spied on my campaign. 374 00:20:19,400 --> 00:20:20,520 Speaker 5: They spied on my campaign.
375 00:20:20,560 --> 00:20:22,159 Speaker 1: There's no real evidence of that. 376 00:20:22,359 --> 00:20:22,960 Speaker 7: Of course there is. 377 00:20:23,040 --> 00:20:24,040 Speaker 3: It's all over the place. 378 00:20:24,560 --> 00:20:27,080 Speaker 4: Leslie, they spied on my campaign and they 379 00:20:27,000 --> 00:20:28,359 Speaker 2: got caught. Can I say something? 380 00:20:29,040 --> 00:20:32,720 Speaker 5: You know, this is Sixty Minutes, and we can't put 381 00:20:32,840 --> 00:20:33,560 Speaker 5: on things we 382 00:20:33,600 --> 00:20:35,840 Speaker 7: can't verify. You can't verify it because it's bad for Biden. 383 00:20:35,920 --> 00:20:38,040 Speaker 1: We can't put on things we can't verify. 384 00:20:38,080 --> 00:20:42,000 Speaker 4: Leslie, they spied on my campaign. It's been totally verified. No, 385 00:20:42,760 --> 00:20:44,600 Speaker 4: it's been... just go down and get the papers. 386 00:20:45,400 --> 00:20:46,720 Speaker 7: They spied on my campaign. 387 00:20:46,800 --> 00:20:47,480 Speaker 4: They got caught. 388 00:20:47,840 --> 00:20:50,040 Speaker 7: No, and then they went much further than that, and 389 00:20:50,160 --> 00:20:52,879 Speaker 7: they got caught. And you will see that, Leslie, and 390 00:20:53,000 --> 00:20:54,840 Speaker 7: you know that, but you just don't want to move on. 391 00:20:55,359 --> 00:20:58,720 Speaker 4: As a matter of fact, I don't know that. Okay, 392 00:20:58,800 --> 00:21:01,920 Speaker 4: she was doing her job, Clay. Yeah, but we need 393 00:21:01,920 --> 00:21:04,480 Speaker 4: to, you know, we need to stop playing 394 00:21:04,560 --> 00:21:07,760 Speaker 4: along with the, oh, how did she not know 395 00:21:08,000 --> 00:21:10,680 Speaker 4: that what she was saying was all false? Her job 396 00:21:10,840 --> 00:21:13,439 Speaker 4: is propaganda.
You might as well think about a Sixty 397 00:21:13,440 --> 00:21:15,240 Speaker 4: Minutes anchor the same way you would someone in the 398 00:21:15,280 --> 00:21:18,200 Speaker 4: West Wing who's Biden's press secretary. The job is not 399 00:21:18,359 --> 00:21:20,600 Speaker 4: to tell the truth. Now, does she look like an 400 00:21:20,600 --> 00:21:24,920 Speaker 4: absolute imbecile here? Yes. But I don't even think that. 401 00:21:25,160 --> 00:21:27,240 Speaker 4: Do you, let me ask you this, do you think 402 00:21:27,359 --> 00:21:29,439 Speaker 4: she cares that she got that wrong? Do you think 403 00:21:29,480 --> 00:21:32,040 Speaker 4: there's any part of this, like, oh, my brand is damaged, 404 00:21:32,320 --> 00:21:35,359 Speaker 4: maybe I won't get renewed at Sixty Minutes on my contract? 405 00:21:35,640 --> 00:21:36,160 Speaker 4: I don't think 406 00:21:36,240 --> 00:21:36,280 Speaker 3: so. 407 00:21:37,680 --> 00:21:40,720 Speaker 1: It's such a good question because, and this is letting 408 00:21:40,760 --> 00:21:42,640 Speaker 1: you guys behind the curtain a little bit, because Buck 409 00:21:42,640 --> 00:21:46,399 Speaker 1: and I are obviously in the media, there is a, 410 00:21:48,200 --> 00:21:53,720 Speaker 1: there's almost a willful blindness to actually look into things yourself. 411 00:21:54,400 --> 00:21:57,200 Speaker 1: And part of it is that you're in the service 412 00:21:57,400 --> 00:22:02,119 Speaker 1: of whatever left-wing acolyte you're trying to suck up 413 00:22:02,160 --> 00:22:05,399 Speaker 1: to at that point in time. But there's also, I 414 00:22:05,600 --> 00:22:11,640 Speaker 1: think, a fundamental lack of confidence. I think many people 415 00:22:11,720 --> 00:22:15,159 Speaker 1: in media are not smart enough to trust themselves to 416 00:22:15,359 --> 00:22:18,200 Speaker 1: actually go to the real documents. Let me explain what 417 00:22:18,240 --> 00:22:20,359 Speaker 1: I mean by that, Buck.
When I was in law school, 418 00:22:21,359 --> 00:22:24,200 Speaker 1: I had a contracts professor, and he said, you know 419 00:22:24,240 --> 00:22:28,080 Speaker 1: what the three most important things in any contract dispute are? 420 00:22:28,680 --> 00:22:31,920 Speaker 1: Everybody looks around like nobody knows. The contract, the contract, 421 00:22:31,960 --> 00:22:32,520 Speaker 1: the contract. 422 00:22:33,000 --> 00:22:33,280 Speaker 4: He said. 423 00:22:33,359 --> 00:22:36,159 Speaker 1: My response, anytime somebody's got a contract dispute, is 424 00:22:36,200 --> 00:22:40,680 Speaker 1: go read the contract. And he said, the number of lawyers, 425 00:22:41,359 --> 00:22:44,480 Speaker 1: even lawyers that are being paid well, that aren't interested 426 00:22:44,520 --> 00:22:47,159 Speaker 1: in actually going and reading the contract, because it is 427 00:22:47,359 --> 00:22:49,960 Speaker 1: oftentimes hard work. You have to go line by line, 428 00:22:50,080 --> 00:22:52,280 Speaker 1: you have to go word by word and analyze what 429 00:22:52,359 --> 00:22:55,600 Speaker 1: it means. There's a lot of people who make millions 430 00:22:55,640 --> 00:22:59,120 Speaker 1: of dollars in journalism and are being held up by 431 00:22:59,200 --> 00:23:02,440 Speaker 1: a staff that surrounds them, and they don't actually do 432 00:23:02,640 --> 00:23:06,280 Speaker 1: the work themselves anymore. And that's what, to me, this 433 00:23:06,480 --> 00:23:10,200 Speaker 1: Lesley Stahl thing represents. It's not just that she is 434 00:23:10,840 --> 00:23:14,160 Speaker 1: looking like an imbecile here, it's that she didn't put 435 00:23:14,200 --> 00:23:18,360 Speaker 1: the time in herself to be intellectually curious and look 436 00:23:18,480 --> 00:23:21,440 Speaker 1: into the underlying data herself.
And that's why I think 437 00:23:21,480 --> 00:23:25,600 Speaker 1: Elon Musk is so interesting, Buck, because you can't build 438 00:23:25,640 --> 00:23:27,760 Speaker 1: a brand new way to make a car, and you 439 00:23:27,920 --> 00:23:33,200 Speaker 1: certainly can't send rockets to space better 440 00:23:33,280 --> 00:23:37,080 Speaker 1: than NASA, without getting involved in the nitty gritty yourself, 441 00:23:37,200 --> 00:23:40,960 Speaker 1: without being incredibly detail oriented. And a lot of these 442 00:23:41,200 --> 00:23:45,600 Speaker 1: journalists are not detail-oriented people. They're big picture, and 443 00:23:45,720 --> 00:23:50,280 Speaker 1: they just parrot what everybody else says because they're secretly 444 00:23:50,440 --> 00:23:54,200 Speaker 1: afraid of being a little bit outside the lines and 445 00:23:54,320 --> 00:23:59,160 Speaker 1: bounds of what acceptable information is. And that is why 446 00:23:59,240 --> 00:24:01,199 Speaker 1: so many of you out there listening to us right now, 447 00:24:01,280 --> 00:24:04,080 Speaker 1: I think, have been red-pilled, because you really 448 00:24:04,240 --> 00:24:06,479 Speaker 1: did trust that, and you didn't pay 449 00:24:06,520 --> 00:24:09,119 Speaker 1: attention to it, and now you look behind the curtain 450 00:24:09,160 --> 00:24:11,120 Speaker 1: and you're like, man, there's not a lot of substance here. 451 00:24:11,359 --> 00:24:12,920 Speaker 1: They haven't done the work that I did.
452 00:24:13,200 --> 00:24:16,200 Speaker 4: And I think that's certainly, there are a lot of, 453 00:24:16,359 --> 00:24:17,919 Speaker 4: you know, the other part of this is there are 454 00:24:17,960 --> 00:24:20,840 Speaker 4: the demagogues, there are the ideologues, right, there are 455 00:24:20,880 --> 00:24:24,359 Speaker 4: the people who, there are different roles even in the 456 00:24:25,680 --> 00:24:29,480 Speaker 4: commie Democrat media that people play, depending on, you know, 457 00:24:29,680 --> 00:24:33,439 Speaker 4: are they a serious correspondent who's not really giving opinion 458 00:24:33,560 --> 00:24:38,080 Speaker 4: but actually is making lots of editorial decisions to support, well, 459 00:24:38,119 --> 00:24:40,400 Speaker 4: you know, there's different roles, right? There's like a big 460 00:24:40,480 --> 00:24:43,760 Speaker 4: team where they have different roles. But someone like Lesley Stahl, 461 00:24:44,280 --> 00:24:48,960 Speaker 4: her job is to reflect the sentiments and the political 462 00:24:49,080 --> 00:24:53,359 Speaker 4: feelings of the New York Times editorial page through a 463 00:24:53,560 --> 00:24:57,000 Speaker 4: TV prism on the Sixty Minutes show. Her job is 464 00:24:57,359 --> 00:24:59,920 Speaker 4: not to be knowledgeable. It's not to do the reading, 465 00:25:00,119 --> 00:25:01,960 Speaker 4: it's not to do the homework. To your point, she's 466 00:25:02,000 --> 00:25:04,720 Speaker 4: not doing those things. She's not looking at the primary 467 00:25:04,760 --> 00:25:07,520 Speaker 4: source documents. But I don't even think that that's just 468 00:25:08,160 --> 00:25:12,399 Speaker 4: her being lazy or an oversight. Her job is the 469 00:25:12,480 --> 00:25:15,280 Speaker 4: talking points.
Her job is to sit there and give this, 470 00:25:16,359 --> 00:25:20,280 Speaker 4: you know, this sort of Barbara Walters-esque approach to 471 00:25:20,400 --> 00:25:23,920 Speaker 4: Donald Trump, where it's all, you know, emotional connection and 472 00:25:24,320 --> 00:25:28,359 Speaker 4: seeming to be the sound voice, the reasonable voice in 473 00:25:28,440 --> 00:25:31,720 Speaker 4: the room, and spewing lies in the process. I don't, 474 00:25:31,840 --> 00:25:33,639 Speaker 4: even if all she did was read the New 475 00:25:33,720 --> 00:25:35,919 Speaker 4: York Times front page, she had to know that there 476 00:25:36,000 --> 00:25:38,320 Speaker 4: was something funky going on with Trump. And just for everyone, 477 00:25:38,400 --> 00:25:42,440 Speaker 4: by way of quick review, Clay, they used FISA against 478 00:25:43,000 --> 00:25:47,800 Speaker 4: Carter Page. Carter Page is like the, I know Carter 479 00:25:47,920 --> 00:25:51,080 Speaker 4: Page because of this whole situation. Carter Page, this is 480 00:25:51,160 --> 00:25:52,879 Speaker 4: kind of funny, sat down next to me when I 481 00:25:52,960 --> 00:25:55,359 Speaker 4: was interviewing him. He's like, hey, you know, your 482 00:25:55,400 --> 00:25:58,360 Speaker 4: dad and I went to, like, a Harvard Club 483 00:25:58,400 --> 00:26:01,240 Speaker 4: event talking about, you know, tech or something. And 484 00:26:01,320 --> 00:26:03,600 Speaker 4: he was so nice to me. Like, I sat 485 00:26:03,640 --> 00:26:06,000 Speaker 4: down with Carter Page, I'm like, wait, you and my 486 00:26:06,119 --> 00:26:08,920 Speaker 4: dad knew each other from ten years ago? Carter Page 487 00:26:08,960 --> 00:26:12,760 Speaker 4: served in the Navy. He's a super nice guy.
He 488 00:26:13,160 --> 00:26:16,720 Speaker 4: actually even had worked with the government to help the 489 00:26:16,840 --> 00:26:20,280 Speaker 4: government find people that were trying to, you know, leverage 490 00:26:20,320 --> 00:26:24,800 Speaker 4: contacts in this country for espionage purposes, and they miked 491 00:26:24,880 --> 00:26:28,560 Speaker 4: him up like they would some super spy and violated, really, 492 00:26:28,800 --> 00:26:31,800 Speaker 4: they can say, oh, it's legal because it's FISA. FISA 493 00:26:31,880 --> 00:26:34,080 Speaker 4: is not supposed to be used this way. It was abuse. 494 00:26:34,160 --> 00:26:35,920 Speaker 4: We all know it. And the probe now, or the 495 00:26:36,000 --> 00:26:39,080 Speaker 4: Durham report, makes it clear. I mean, these people acted 496 00:26:39,200 --> 00:26:42,560 Speaker 4: like psychopaths. No FBI agent could sit down, that's 497 00:26:42,600 --> 00:26:44,960 Speaker 4: my point here, with Carter Page or George Papadopoulos and 498 00:26:45,040 --> 00:26:49,000 Speaker 4: be like, yeah, this guy's a Putin puppet who's trying 499 00:26:49,040 --> 00:26:54,840 Speaker 4: to overthrow our democracy. Only emotionally destabilized eunuchs could actually 500 00:26:54,920 --> 00:26:57,159 Speaker 4: believe that. Unfortunately, they were running the FBI. 501 00:26:58,240 --> 00:27:01,120 Speaker 1: And I will say this, that's so well said about, 502 00:27:01,280 --> 00:27:03,080 Speaker 1: like, the lack of power that he had and the 503 00:27:03,160 --> 00:27:05,560 Speaker 1: idea that if Russia were going to create in Donald Trump 504 00:27:05,640 --> 00:27:08,879 Speaker 1: a Manchurian candidate, they would be going through people like 505 00:27:09,000 --> 00:27:13,800 Speaker 1: Carter Page to do so.
Someone like Lesley 506 00:27:13,840 --> 00:27:16,239 Speaker 1: Stahl, to me, is just not smart, and worse than 507 00:27:16,320 --> 00:27:19,560 Speaker 1: not being smart, she's not intellectually curious. That's my take 508 00:27:19,640 --> 00:27:21,840 Speaker 1: on it. But your point about the roles that 509 00:27:21,960 --> 00:27:26,119 Speaker 1: people play: the Jeff Zuckers of the world are very smart, 510 00:27:26,760 --> 00:27:30,480 Speaker 1: and they sit up above the chessboard and decide what 511 00:27:30,760 --> 00:27:34,480 Speaker 1: pieces go where in, for instance, the CNN universe, to 512 00:27:34,680 --> 00:27:37,800 Speaker 1: tell them, hey, these are the stories that we are 513 00:27:37,920 --> 00:27:43,040 Speaker 1: willing to cover. And the fact that there is such 514 00:27:43,160 --> 00:27:44,800 Speaker 1: a, you used the word cabal, and I think it's a 515 00:27:44,840 --> 00:27:50,240 Speaker 1: great one, that there is such this cabal of people 516 00:27:50,280 --> 00:27:52,840 Speaker 1: who are willing to take their marching orders as to 517 00:27:53,000 --> 00:27:55,439 Speaker 1: what the operable facts are. And by the way, there 518 00:27:55,480 --> 00:27:58,160 Speaker 1: are really smart people at Sixty Minutes who likely told 519 00:27:58,240 --> 00:28:01,480 Speaker 1: Lesley Stahl, oh, that's totally made up, that's a far 520 00:28:01,680 --> 00:28:04,640 Speaker 1: right-wing conspiracy. That's the way they try to dismiss things. 521 00:28:05,240 --> 00:28:09,520 Speaker 1: The distance between far right-wing conspiracy and the truth, Buck, 522 00:28:09,520 --> 00:28:11,800 Speaker 1: I don't know about you, but it sure seems to 523 00:28:11,840 --> 00:28:14,639 Speaker 1: be shortening up, the amount of time it takes to 524 00:28:14,720 --> 00:28:18,399 Speaker 1: go from far right-wing conspiracy to the truth.
525 00:28:18,720 --> 00:28:21,760 Speaker 1: It seems like every year a couple of months are 526 00:28:21,800 --> 00:28:23,920 Speaker 1: getting lopped off how long it takes to get there. 527 00:28:25,240 --> 00:28:27,480 Speaker 4: Totally agree. Well, let's take some of your calls on 528 00:28:27,640 --> 00:28:31,880 Speaker 4: all this at eight hundred, two eight two, 529 00:28:32,080 --> 00:28:33,639 Speaker 4: two eight eight two. We'll get back to it in just 530 00:28:33,720 --> 00:28:36,400 Speaker 4: a moment. You know, men's energy comes in large part 531 00:28:36,480 --> 00:28:39,520 Speaker 4: from having a healthy amount of testosterone in the body. 532 00:28:40,000 --> 00:28:41,600 Speaker 4: You'll meet guys during your life that maybe have a 533 00:28:41,640 --> 00:28:44,440 Speaker 4: little too much testosterone. You'll come across guys that could 534 00:28:44,560 --> 00:28:47,120 Speaker 4: use a little boost. It is a natural phenomenon to 535 00:28:47,160 --> 00:28:51,080 Speaker 4: see those levels of T get lower over time. Look, 536 00:28:51,160 --> 00:28:54,480 Speaker 4: it's diet, stress, the speed of our lifestyles, and a lot 537 00:28:54,560 --> 00:28:57,440 Speaker 4: more, genetics. There's a solution to all this, though, and 538 00:28:57,440 --> 00:29:00,960 Speaker 4: it comes from a trusted company called CHOQ. C H O Q. 539 00:29:01,800 --> 00:29:05,160 Speaker 4: Choq dot com is their website, that's c h o q 540 00:29:05,440 --> 00:29:08,480 Speaker 4: dot com. They make daily natural supplements for men and women. 541 00:29:08,720 --> 00:29:12,760 Speaker 4: Their Male Vitality Stack helps replenish testosterone. In three months' time, 542 00:29:13,080 --> 00:29:16,320 Speaker 4: your T levels will increase by twenty percent. That really 543 00:29:16,360 --> 00:29:19,240 Speaker 4: has an impact and you'll feel the difference.
Their Female 544 00:29:19,320 --> 00:29:21,920 Speaker 4: Vitality Stack helps women with hormone health too. By the way, 545 00:29:21,960 --> 00:29:24,160 Speaker 4: you should check that out if you're a lady. Learn 546 00:29:24,240 --> 00:29:27,560 Speaker 4: more about CHOQ's amazing products. Go to Choq dot com, 547 00:29:28,080 --> 00:29:32,320 Speaker 4: c h o q dot com. You can save thirty-five percent off 548 00:29:32,400 --> 00:29:35,080 Speaker 4: any CHOQ product subscription you choose by using my name 549 00:29:35,200 --> 00:29:38,520 Speaker 4: Buck in the purchase process. Start feeling better today in 550 00:29:38,600 --> 00:29:43,760 Speaker 4: a natural way with CHOQ supplements. That's Choq, c h o q dot com. 551 00:29:44,640 --> 00:29:47,840 Speaker 1: Download it now, the new Clay and Buck app. Listen 552 00:29:47,920 --> 00:29:50,560 Speaker 1: to the program live, catch up on any part of 553 00:29:50,600 --> 00:29:53,760 Speaker 1: the show you might have missed. Use your C and B twenty 554 00:29:53,800 --> 00:29:56,680 Speaker 1: four seven subscription to get access to the guys. 555 00:29:57,040 --> 00:29:59,520 Speaker 1: Find the Clay and Buck app in your app store and 556 00:29:59,680 --> 00:30:00,920 Speaker 1: make it part of your day. 557 00:30:01,000 --> 00:30:02,960 Speaker 4: And welcome back to Clay and Buck. Coming up in a 558 00:30:03,040 --> 00:30:06,360 Speaker 4: few minutes, our friend Julio Rosas. You may recall, last week 559 00:30:06,400 --> 00:30:07,680 Speaker 4: we had a little bit of phone trouble, but he 560 00:30:07,760 --> 00:30:10,680 Speaker 4: called in from the border to tell us about how 561 00:30:10,760 --> 00:30:13,400 Speaker 4: it's just basically wide open 562 00:30:13,520 --> 00:30:13,680 Speaker 7: now.
563 00:30:14,120 --> 00:30:15,680 Speaker 4: He's done a lot of reporting not just from the 564 00:30:15,720 --> 00:30:18,560 Speaker 4: border, though, but also from the front lines of the 565 00:30:18,680 --> 00:30:24,239 Speaker 4: Democrat shock troops known as Antifa, and he was at 566 00:30:24,280 --> 00:30:27,920 Speaker 4: a hearing up on Capitol Hill, a House 567 00:30:27,960 --> 00:30:34,360 Speaker 4: hearing on Antifa, and he got a lot of condescension 568 00:30:34,480 --> 00:30:37,760 Speaker 4: from a particularly smug and useless member of Congress, Democrat 569 00:30:37,800 --> 00:30:41,280 Speaker 4: of course, and gave it right back to him. And 570 00:30:41,520 --> 00:30:42,840 Speaker 4: I want to let you hear that. But that's going 571 00:30:42,880 --> 00:30:45,760 Speaker 4: to come up in a few minutes. And also Clay's 572 00:30:45,840 --> 00:30:48,720 Speaker 4: got more for us coming up in the third hour. 573 00:30:48,760 --> 00:30:51,160 Speaker 4: We're talking, I don't know, we'll talk a little debt ceiling, I 574 00:30:51,240 --> 00:30:53,640 Speaker 4: guess. I put out a poll. Most people say 575 00:30:53,720 --> 00:30:55,400 Speaker 4: they know it's just going to get solved and 576 00:30:55,440 --> 00:30:57,320 Speaker 4: they don't really care, it's a lot of posturing. But 577 00:30:57,600 --> 00:31:00,320 Speaker 4: we'll update everybody. We will update everybody on 578 00:31:00,480 --> 00:31:03,440 Speaker 4: what's going on with the debt ceiling. But first we 579 00:31:03,520 --> 00:31:05,080 Speaker 4: got a lot of calls and we wanted to get 580 00:31:05,120 --> 00:31:10,520 Speaker 4: to some of your thoughts. Alan in Fishers, Indiana, 581 00:31:10,640 --> 00:31:11,480 Speaker 4: what's going on, Alan?
582 00:31:13,120 --> 00:31:17,640 Speaker 7: Hey, Buck, I'm with you one hundred percent. I 583 00:31:17,720 --> 00:31:21,200 Speaker 7: think you're dead right on that, that the left 584 00:31:21,400 --> 00:31:26,000 Speaker 7: doesn't care about the truth. They're there to further 585 00:31:26,200 --> 00:31:29,920 Speaker 7: the agenda, create the narratives. I find it 586 00:31:30,000 --> 00:31:33,520 Speaker 7: hard to believe that Lesley Stahl, where she is in 587 00:31:33,640 --> 00:31:39,480 Speaker 7: the journalism hierarchy, didn't have some idea of the details 588 00:31:39,520 --> 00:31:42,520 Speaker 7: of what was going on with all the smearing of 589 00:31:42,600 --> 00:31:46,000 Speaker 1: Trump. But I appreciate that. I don't, I 590 00:31:46,680 --> 00:31:49,720 Speaker 1: have a lot, maybe, again, I used to be more 591 00:31:49,880 --> 00:31:51,000 Speaker 1: like the caller, Buck. 592 00:31:51,120 --> 00:31:52,760 Speaker 4: And Clay, by the way, because I like it when you 593 00:31:52,800 --> 00:31:55,480 Speaker 4: agree with me. Go ahead. No, no, I used to 594 00:31:55,560 --> 00:31:59,360 Speaker 1: believe more like the caller: hey, these people are really smart. 595 00:31:59,720 --> 00:32:02,920 Speaker 1: You know, when I have worked in media, 596 00:32:03,600 --> 00:32:06,920 Speaker 1: I have found that so many people are so intellectually 597 00:32:07,040 --> 00:32:10,640 Speaker 1: lazy and not curious that they don't look beyond the 598 00:32:10,720 --> 00:32:13,920 Speaker 1: first page of whatever briefing book they're getting. They don't 599 00:32:14,080 --> 00:32:19,080 Speaker 1: do the homework. You think, you think Lesley 600 00:32:19,160 --> 00:32:22,840 Speaker 1: Stahl knew that Trump's campaign had been spied on and just 601 00:32:24,480 --> 00:32:27,959 Speaker 4: And she made the political and career decision.
I mean, 602 00:32:28,040 --> 00:32:29,360 Speaker 4: I'd put it to you, by the way. So, Alan, 603 00:32:29,480 --> 00:32:31,880 Speaker 4: I agree with you, Clay does not agree with you, 604 00:32:31,960 --> 00:32:34,360 Speaker 4: but, Alan, I think you are a patriot and a genius, 605 00:32:34,400 --> 00:32:37,280 Speaker 4: so thank you for calling in. Look, I'll say this: 606 00:32:38,160 --> 00:32:40,960 Speaker 4: look at the career trajectories. With the exception of a 607 00:32:41,080 --> 00:32:44,360 Speaker 4: couple of lunatics at CNN who were problems apart from 608 00:32:44,360 --> 00:32:49,640 Speaker 4: the Russia collusion thing, the career trajectories of every Democrat 609 00:32:49,760 --> 00:32:53,440 Speaker 4: in the media and every deep state Democrat inside of 610 00:32:53,480 --> 00:32:58,560 Speaker 4: the federal government became more lucrative and more famous after 611 00:32:58,680 --> 00:33:00,800 Speaker 4: they went along with the whole lie. So all the 612 00:33:00,920 --> 00:33:03,640 Speaker 4: incentives, they knew this, right, all the incentives are aligned. 613 00:33:03,640 --> 00:33:05,800 Speaker 4: I mean, Jim Acosta became a household name. 614 00:33:05,920 --> 00:33:07,920 Speaker 4: Jim Acosta's a, this is a clown. 615 00:33:09,000 --> 00:33:12,200 Speaker 1: I think you're right one hundred percent about that. I 616 00:33:12,360 --> 00:33:16,360 Speaker 1: think that following the herd, though, oftentimes means you never 617 00:33:16,640 --> 00:33:20,320 Speaker 1: actually look and see why you're following the herd. So 618 00:33:20,800 --> 00:33:23,440 Speaker 1: I really think Lesley Stahl, if we had Lesley 619 00:33:23,440 --> 00:33:25,240 Speaker 1: Stahl on the show, this would be a good experiment.
620 00:33:25,640 --> 00:33:27,640 Speaker 1: If we had Lesley Stahl on the show for an 621 00:33:27,760 --> 00:33:31,120 Speaker 1: hour and we, for instance, went through COVID data with her, 622 00:33:31,560 --> 00:33:33,200 Speaker 1: I think she would know none of it. She would 623 00:33:33,280 --> 00:33:35,800 Speaker 1: just have taken Fauci's talking points as part of being 624 00:33:35,840 --> 00:33:36,560 Speaker 1: a good person. 625 00:33:36,760 --> 00:33:39,840 Speaker 4: So, it's interesting, because, okay, if I may, 626 00:33:40,120 --> 00:33:43,160 Speaker 4: then maybe, maybe I'm projecting my knowledge a little bit here. 627 00:33:43,840 --> 00:33:46,000 Speaker 4: You dealt with people in the sports world on the 628 00:33:46,120 --> 00:33:49,440 Speaker 4: left who, I think, you know, especially with the politics stuff, 629 00:33:49,720 --> 00:33:52,000 Speaker 4: they had no idea what was going on, right? That 630 00:33:52,160 --> 00:33:54,520 Speaker 4: is true. You know, you were banned from CNN, 631 00:33:54,640 --> 00:33:58,080 Speaker 4: to be fair. To be fair, you were banned from CNN. 632 00:33:58,680 --> 00:34:02,320 Speaker 4: And that's, to me, still hilarious, still hilarious, a 633 00:34:02,400 --> 00:34:06,440 Speaker 4: moment of true pride. I'm telling you, and everyone who 634 00:34:06,520 --> 00:34:09,480 Speaker 4: knows me knows that I like to call out, Van 635 00:34:09,640 --> 00:34:13,560 Speaker 4: Jones is smart, Jake Tapper is smart. These guys are, 636 00:34:13,600 --> 00:34:15,160 Speaker 4: I mean, you know, you go to the, there are 637 00:34:15,160 --> 00:34:18,200 Speaker 4: a number of people over there. And Chris Cuomo 638 00:34:18,320 --> 00:34:20,359 Speaker 4: is, you know, he comes across as very kind 639 00:34:20,360 --> 00:34:23,520 Speaker 4: of a bro guy. He's actually pretty sharp when he 640 00:34:23,600 --> 00:34:27,520 Speaker 4: wants to be.
These people knew, you know, 641 00:34:27,640 --> 00:34:30,680 Speaker 4: these different people out there on the left knew 642 00:34:30,800 --> 00:34:33,799 Speaker 4: what they were doing, and they made, I think 643 00:34:33,800 --> 00:34:35,600 Speaker 4: they made, a lot of them made a choice. 644 00:34:35,880 --> 00:34:37,480 Speaker 4: So a lot of them made a choice. You think 645 00:34:37,480 --> 00:34:39,319 Speaker 4: that they just had no idea and they don't care? 646 00:34:39,880 --> 00:34:42,399 Speaker 1: If you're right, it's way worse than what I'm saying. 647 00:34:42,560 --> 00:34:46,160 Speaker 1: I'm attributing it to not being that smart or intellectually 648 00:34:46,280 --> 00:34:49,840 Speaker 1: curious and just not doing the homework to actually know. 649 00:34:50,120 --> 00:34:52,719 Speaker 1: In the larger community, you have all these PAs, all these 650 00:34:52,760 --> 00:34:55,560 Speaker 1: production assistants, that put stuff in front of you, and 651 00:34:55,680 --> 00:34:58,120 Speaker 1: you just take the talking points and run with them, 652 00:34:58,560 --> 00:35:02,560 Speaker 1: not out of a desire to intentionally mislead, but because 653 00:35:02,640 --> 00:35:08,040 Speaker 1: you're being played. Your take is far more devious 654 00:35:08,320 --> 00:35:11,640 Speaker 1: and far more destructive of the country than my "just 655 00:35:11,760 --> 00:35:12,840 Speaker 1: intellectually lazy." 656 00:35:12,960 --> 00:35:14,839 Speaker 4: But that would kind of line up, right? 657 00:35:15,200 --> 00:35:20,560 Speaker 1: Yeah. Yeah, I'm just saying Lesley Stahl is truly 658 00:35:20,600 --> 00:35:21,200 Speaker 1: an imbecile. 659 00:35:22,760 --> 00:35:24,840 Speaker 4: I couldn't say that. You know, I know who the 660 00:35:25,080 --> 00:35:28,239 Speaker 4: slick operatives on the left are. I don't know if 661 00:35:28,280 --> 00:35:30,480 Speaker 4: I would put her in that category.
So for me, 662 00:35:30,680 --> 00:35:33,399 Speaker 4: she's borderline. But I'm just saying there are definitely people 663 00:35:33,480 --> 00:35:36,160 Speaker 4: on the left who know all the source material 664 00:35:36,280 --> 00:35:36,920 Speaker 4: and they don't care. 665 00:35:37,440 --> 00:35:40,360 Speaker 1: The Jeff Zuckers of the world are total geniuses, 666 00:35:40,560 --> 00:35:40,719 Speaker 7: right? 667 00:35:40,960 --> 00:35:43,320 Speaker 1: He, to me, is not an unintelligent person. But 668 00:35:43,600 --> 00:35:46,400 Speaker 1: Lesley Stahl, I feel like, is not a very smart person. 669 00:35:46,560 --> 00:35:49,719 Speaker 1: She may really not know the stuff that she's saying. 670 00:35:49,800 --> 00:35:51,120 Speaker 1: She's just parroting talking points. 671 00:35:51,480 --> 00:35:53,080 Speaker 4: John in North Carolina, I wanted to get you in 672 00:35:53,160 --> 00:35:58,040 Speaker 4: here before we close up the hour. What's up, John? John? 673 00:35:58,560 --> 00:36:03,600 Speaker 4: Going once? Okay, John, we can't even get to going twice. 674 00:36:03,640 --> 00:36:04,759 Speaker 4: You're not talking at all, all right. 675 00:36:04,880 --> 00:36:07,719 Speaker 1: Well, what he wanted to say, Buck, was nothing's gonna 676 00:36:07,760 --> 00:36:10,320 Speaker 1: happen with regards to the Durham report, so why do 677 00:36:10,400 --> 00:36:11,320 Speaker 1: we even talk about it? 678 00:36:12,200 --> 00:36:14,600 Speaker 4: It's a great question, and it's what I've been saying 679 00:36:14,640 --> 00:36:17,440 Speaker 4: all along. I was saying this about the Benghazi hearings too. 680 00:36:18,400 --> 00:36:20,279 Speaker 4: If you don't have the political power and the will 681 00:36:20,400 --> 00:36:24,320 Speaker 4: to use it, all this other stuff essentially feels 682 00:36:24,360 --> 00:36:26,279 Speaker 4: a bit like noise.
I do think it's important to 683 00:36:26,320 --> 00:36:29,640 Speaker 4: get it on the record, and that's what this is: what 684 00:36:29,800 --> 00:36:32,960 Speaker 4: a fraud they perpetrated. Beyond any reasonable doubt, the Democrats 685 00:36:33,000 --> 00:36:36,160 Speaker 4: perpetrated a fraud, a soft coup against Trump. But if 686 00:36:36,160 --> 00:36:40,160 Speaker 4: you're looking for justice, oh, this is not justice. You 687 00:36:40,280 --> 00:36:43,439 Speaker 4: need to have political power to get justice, in my mind. 688 00:36:43,520 --> 00:36:46,160 Speaker 4: John, it's a great question. I think that there are still 689 00:36:46,239 --> 00:36:48,680 Speaker 4: persuadable people. And some of you may be laughing at 690 00:36:48,760 --> 00:36:50,960 Speaker 4: me and think that I'm an idiot like Lesley Stahl is, 691 00:36:51,239 --> 00:36:53,759 Speaker 4: but I really genuinely believe that there are people who 692 00:36:53,840 --> 00:36:56,040 Speaker 4: can be convinced to do the right thing when it 693 00:36:56,080 --> 00:36:56,799 Speaker 4: comes to their votes. 694 00:36:56,880 --> 00:36:57,239 Speaker 7: That's why