Speaker 1: You're listening to the Buck Sexton Show podcast. Make sure you subscribe to the podcast on the iHeartRadio app or wherever you get your podcasts. Team, welcome to the Freedom Hunt, the Monday, April twenty-fifth edition of the program. So much to discuss, including Elon's offer to buy Twitter, which may actually be accepted now by the board. We have social emotional learning in textbooks in Florida, the culture war over indoctrination of children continues, and some more on Hunter Biden, the lies to defend him, the Biden administration under siege, the border, by the way, a total mess. We'll cover as much of this as we can today, and obviously every other day this week. That is how we roll. But you know, we've got to stand behind our men and women in law enforcement right now. You may not know this: four Deputy US Marshals were deployed under President Trump to defend the federal courthouse in Portland. This is the courthouse that the psycho libs were trying to burn down, Antifa, the "anti-fascists," trying to burn it to the ground. Well, guess what.
Speaker 1: The Department of Justice under the Biden administration will not provide an attorney to defend these deputy marshals from these crazy allegations. These are brave officers who had to take a lot of punishment: assaults, fires, flying objects, lasers in their eyes. And now the DOJ, to make a political point under Biden, is leaving them high and dry. They need help. They need help to defend themselves. You can help them right now and alleviate the financial burden of their defense. These are family men with stellar records of service. Help them today. Donate at policedefense.org. That's policedefense.org. Step up and help out those who are trying to keep anarchy from ruling on our streets. Oh, Twitter. So here we have former SEC chair Jay Clayton on Musk's Twitter bid. Let's update you on it. What you are seeing is a direction of travel where those uncertainties are being addressed. There are many to go, but we've seen financing be addressed. We've seen due diligence taken off the table.
Speaker 1: You've seen the board of directors do what boards of directors do, with a poison pill and other actions, trying to put some order and process around this. And you know, the market is taking stock, as you've noted, of how shareholders and others feel about this. So there's a direction of travel from uncertainty to trying to eliminate those uncertainties and get greater certainty. There's obviously a way to go, but that's where we are. Elon may in fact be able to buy Twitter. Elon may be able to buy Twitter. Why is it such a big deal? Well, because Twitter is the closest thing that we have to a public square, where you can have the very rapid sharing and elevation of ideas, the exchange of ideas. And most media outlets, because it's instantaneous information sharing that everybody has, most media outlets are very involved in Twitter, both having their employees using it to understand what the zeitgeist of the current moment may be, and what the activists are saying if you're a leftist. But this is a big deal. Elon's a big deal, as we've been discussing here on the show for weeks.
Speaker 1: I mean, this isn't new, but they had initially said that they wouldn't accept it. Now they're in the final stages of a fifty-four-dollars-and-twenty-cents-per-share takeover bid, and he's going to revolutionize the tech game. Elon Musk is going to make Twitter a mega platform if he buys this. I think he's going to make this something that is going to have ramifications well beyond just what you can tweet, because Facebook and other companies will see this and realize, oh, this is now the place where people believe that there is not only a more level playing field for ideas, but that there's also the most clear vision of what this platform can do going forward. Look, they've been whining about, you know, climate change and electric cars for years. By the way, I absolutely love the Twitter exchange between Elon and Bill Gates.
Speaker 1: I shouldn't say Twitter, it's actually a text exchange that was shared on Twitter, where Bill Gates wants Elon to donate money, I think to the Gates Foundation, something like that, having to do with climate change. And Elon says, are you still short Tesla five hundred million dollars? And Gates goes, yes, unfortunately, I am. So this guy's asking... Gates is a scummy fellow, and this is what everyone's kind of realized now. I think it's taken a while. I'm not even talking about the Epstein connections, which there are connections, that is a fact. He spent time at the island now better known as Pedophile Island. But Gates, it just seems like an untrustworthy, sanctimonious lib fraud. Who, really, the software that he built, and the company, around the nineties or whatever, yeah, people say it revolutionized everything, and the software's kind of crappy, really, when you think about it. Does anyone even really want to use Windows these days? Windows is trash. Anyway, neither here nor there. Well, actually, it is both here and there.
Speaker 1: Point here being that when Elon found out that he was still short five hundred million dollars of Tesla, he's like, yeah, no, I don't think I'm writing you a check for your climate change foundation, because Tesla has actually done more to advance a clean energy future than any other company on the planet. You know why? They made better technology, they made it cool, and they made it work. That's the big difference. This isn't "let's use government subsidies, let's pretend that it's not super annoying and inconvenient." This is "hey, Teslas are kind of awesome, I might buy one." I'm gonna tell you all this right now: I might actually, yes, I might get a Tesla. Not because I think that climate change is the problem, I don't. I just think Teslas are cool. I'll be honest, I just think they're... I've been in them. I want to go test drive one and really see. But I think they're great. I think they're great cars. You know, once you go electric scooter, you figure electric car may be the next move. And I'll have you know that the scooter is catching on very rapidly in New York City.
Speaker 1: The scooter is catching on in a way that people, I don't think, ever would have really anticipated beforehand. Lewis Hamilton, the best Formula One driver in the world, some would argue the best Formula One driver of all time, alongside Michael Schumacher, he drives around on race day on an electric scooter. I'll have you know, I've seen it on Netflix. I'm just saying, if it's good enough for Lewis Hamilton, it's good enough for me. But back to Elon and the Twitter situation here. My team is saying stuff about the need for me to get a normal car. Probably, but that's all right, we can get an electric car. So Elon's offer may go through, and that would be fantastic. And I already tweeted at him today. I don't know if he'll see it. He's got, you know, one of the biggest Twitter followings in the world. And I'm like, hey man, can you please make public all of the shadow banning and the rigged-game nonsense that the libs have been pulling for ten years? Because radical transparency is the best antidote to their authoritarianism.
Speaker 1: It is absolutely true, radical transparency about what has been done there. And I think this terrifies the libs almost more than anything else. There has been so much... it's basically cheating that's been going on. I mean, it's like we thought we were doing collegiate wrestling the whole time, you know, may the best man win, but it's really WWE wrestling. No offense to WWE, but you know they know who's going to win the match before it's over, right? We all know this. I don't think I'm telling anybody there's no Santa Claus here. But they told us that it was actually a fair game. It wasn't. It hasn't been a fair game all along. Twitter has been a rigged system, there's no question about it. So if Elon takes this over, it would be an opportunity. Look, we've got to see what he does, but it would be an opportunity for things to be fantastic, honestly, on Twitter. It would be great.
Speaker 1: And look, if Elon can't fix it, well, then we've just got to go to a different platform, right? If Elon's not actually making it free and fair... but why do it if he's not going to do it? Why buy Twitter to make it left-wing? It's already left-wing, right? It doesn't make any sense. So if he's going to spend fifty billion dollars, I'm sure he has a plan in mind to make this a real platform with real free speech. And I'm excited that that is the plan that he has. Look, if you run a small business, who's running your HR? If the answer is "I'll figure it out myself" or "no one," remember: one employee complaint can turn your world upside down. HR is not just about avoiding risk. As a business leader, you should do right by the people you employ. That's why you need Bambee. Bambee is an HR platform built for businesses like yours, so you can automate it. That's right, automate the most important HR practices and get your own dedicated HR manager. Bambee's HR Autopilot automates your core policies, workplace training, and employee feedback.
Speaker 1: Your dedicated HR manager will help you navigate the more complex parts of HR and guide you to compliance, available by phone, email, or real-time chat. An in-house HR manager can cost up to eighty thousand dollars a year, but with Bambee, your dedicated HR manager starts at just ninety-nine dollars a month. No hidden fees, cancel anytime. All you have to do now is go to Bambee, B-A-M-B-E-E, Bambee dot com slash buck, for your free HR audit. So for all the business owners and HR managers and folks out there working in corporate America: Bambee dot com slash buck, Bambee dot com slash buck. Got to go check out Bambee. It is fantastic. I just want to note that there's this continuous game of lying that is going on about the culture war battle in Florida. And by the way, I don't think people should... the left tries to use "culture war" as a term of disparagement. I think, you know, we should just actually fight back in the culture war.
Speaker 1: It shouldn't be a one-way war, which is what it has felt like for the last, oh, you know, ten or fifteen, twenty years. I think it's a good thing that we're finally saying enough is enough and fighting fire with fire. This is the big debate that I've at least seen the last few days between some old-school conservatives of the Weekly Standard, RIP, and National Review type, and other people on the more, I guess you'd call it populist, right, the right that wants to win. Because corporations and politicians have been doing this for a long time: favors for corporations based on politics. To pretend that this isn't going on literally every day is to be blind to reality. That this isn't happening all over the country all the time is to live in a delusion. So we can pretend that there's some neutral space here that we are ceding, or that Ron DeSantis is ceding in the state of Florida by taking action against Disney, or we can say, hold on a second, this is what is happening constantly, all the time, every day. We need to start doing that too.
Speaker 1: Our side, when we have power, when there's a mandate from the voters, from the people that have put in place somebody to make decisions for a state like Florida, our side can change the calculation of these corporations and start to shift the Overton window back to some semblance of reality instead of this mass left-wing delusion. So I'm firmly in favor of taking action here, or else we just keep losing until effectively the right ceases to exist in any meaningful way. It started to feel like we were heading in that direction if we continued on the pathway of "it's a neutral space" and people saying, oh, take it to the courts. I saw this over the weekend: look at what a big victory we had with Masterpiece Cakeshop. Masterpiece Cakeshop! The individual, the proprietor there, is basically bankrupted, and they've ruined his life, and they keep suing him, like, all the time. It's not a victory when someone's able to destroy your life because you won't make a cake with all kinds of either pornographic or satanic stuff or whatever it is that's being demanded.
Speaker 1: That's not victory, friends. But speaking of mass delusions, and mask delusions, I wanted to hit this quickly. Over the weekend on the Bill Maher show, there was a discussion about masks. It sounds like some people on the left are recognizing reality. Mary Katharine Ham is not a leftist, but she was on the panel. Here you go, listen. We know that masks are minimally helpful, but we told a lot of people they were maximally helpful. Didn't they give it away when they said you don't have to wear it while you're eating? Or I see the basketball players playing, gouging each other's eyes out, and then they go to the bench and they put the mask on. It's like, what world am I living in? What, people are not seeing this insanity? We have to decide, okay, well, what is my risk level? Two-year-olds don't have the same risk level as eighty-eight-year-olds. Being indoors doesn't have the same level as being outdoors.
Speaker 1: But we did seemingly all the opposite things, which is: get the kids out of school, who are the least vulnerable, hurt them that way; close the parks and the hiking trails; and then have indoor-outdoor spaces to eat that are actually just indoor. Again, yes, it was all so stupid, wasn't it? None of that is made up. All of that is true, and all of that was indefensible, indefensibly stupid, in fact, all of it. And so what does the left offer up? Fauci's brilliant, he's a genius, mask up between bites. They're wrong. They were wrong about all of it. We were right about all of it. It's worth remembering. I'm gonna have to take you on an update tour of the border tomorrow, friends, because we're doing a quick Quick Hits podcast here on Monday. Back with you tomorrow, though. Shield time.