1 00:00:00,560 --> 00:00:02,480 Speaker 1: Welcome to the Tudor Dixon Podcast. 2 00:00:02,520 --> 00:00:05,480 Speaker 2: We've all been talking about what Donald Trump will do 3 00:00:05,559 --> 00:00:08,360 Speaker 2: when he gets into office, with all of these big 4 00:00:08,400 --> 00:00:11,640 Speaker 2: companies that had been censoring him, and now everybody's kind 5 00:00:11,640 --> 00:00:12,520 Speaker 2: of flocking to him. 6 00:00:12,520 --> 00:00:14,280 Speaker 1: And it seems like even. 7 00:00:14,080 --> 00:00:17,160 Speaker 2: The Zuckerbergs of the world have been like, hmm, maybe 8 00:00:17,200 --> 00:00:20,240 Speaker 2: I shouldn't make this guy mad again. So the world 9 00:00:20,280 --> 00:00:23,160 Speaker 2: seems to be changing. But is it actually changing? We 10 00:00:23,200 --> 00:00:26,079 Speaker 2: wanted to ask an expert, so we're bringing in Mike Benz. 11 00:00:26,079 --> 00:00:29,680 Speaker 2: He is the director of the Foundation for Freedom Online 12 00:00:29,720 --> 00:00:33,520 Speaker 2: and a former State Department official in the Trump administration. 13 00:00:33,920 --> 00:00:37,080 Speaker 2: He has become known as the blob expert, taking on 14 00:00:37,159 --> 00:00:40,560 Speaker 2: Internet censorship right here in the US and abroad. Thank 15 00:00:40,600 --> 00:00:41,640 Speaker 2: you so much for joining me. 16 00:00:42,400 --> 00:00:43,479 Speaker 3: Thanks so much for having me. 17 00:00:44,080 --> 00:00:48,040 Speaker 2: So we're all watching this, we're all watching the Zuckerbergs 18 00:00:48,040 --> 00:00:50,760 Speaker 2: and the Bezoses and all these guys kind of come 19 00:00:50,840 --> 00:00:54,480 Speaker 2: around Trump. But I mean, should we take this with 20 00:00:54,760 --> 00:00:57,200 Speaker 2: a grain of salt considering what they did to him?
21 00:00:57,280 --> 00:00:59,360 Speaker 3: Well, I think it's one of those things where you 22 00:00:59,440 --> 00:01:05,200 Speaker 3: can forgive a little but not forget, certainly not entirely; 23 00:01:05,640 --> 00:01:08,160 Speaker 3: you can trust someone as far as you can 24 00:01:08,200 --> 00:01:11,720 Speaker 3: throw them but still do business together. The fact is, 25 00:01:12,560 --> 00:01:14,720 Speaker 3: I remember very vividly when I was in the White 26 00:01:14,720 --> 00:01:20,200 Speaker 3: House that MAGA was commonly called Microsoft, Apple, Google, and Amazon, 27 00:01:20,880 --> 00:01:25,240 Speaker 3: because these large tech companies were the largest companies in 28 00:01:25,280 --> 00:01:28,240 Speaker 3: the Nasdaq and the New York Stock Exchange, and so 29 00:01:28,920 --> 00:01:32,360 Speaker 3: they are, like it or not, US national champions. And 30 00:01:32,440 --> 00:01:37,319 Speaker 3: it is the role of the foreign policy establishment as 31 00:01:37,360 --> 00:01:41,839 Speaker 3: well as our domestic-facing agencies to support US companies, 32 00:01:41,880 --> 00:01:45,920 Speaker 3: because they provide US jobs and they provide US leadership in technology. 33 00:01:47,280 --> 00:01:49,920 Speaker 3: But you want them to be aligned as much as possible, 34 00:01:49,920 --> 00:01:51,880 Speaker 3: and to the extent there's disagreement, you want there to 35 00:01:51,880 --> 00:01:54,960 Speaker 3: be fair play on both sides of that. The issue 36 00:01:55,280 --> 00:01:59,000 Speaker 3: is, obviously, from twenty sixteen to twenty twenty and into 37 00:01:59,040 --> 00:02:02,120 Speaker 3: the Biden administration, there was not fair play.
You had 38 00:02:02,120 --> 00:02:05,280 Speaker 3: a strange situation where the taxpayers during the Trump administration 39 00:02:05,880 --> 00:02:10,040 Speaker 3: were funding agencies to help Google and Microsoft and Apple 40 00:02:10,720 --> 00:02:15,760 Speaker 3: and Amazon, and these companies were turning around and censoring 41 00:02:15,800 --> 00:02:17,600 Speaker 3: the very people who voted for the government that was 42 00:02:17,639 --> 00:02:21,520 Speaker 3: helping them. But I do think there is a bit 43 00:02:21,520 --> 00:02:23,520 Speaker 3: of a sea change that's happened. I think a lot 44 00:02:23,560 --> 00:02:28,280 Speaker 3: of that is because, strangely, the Biden administration, as bad 45 00:02:28,320 --> 00:02:30,400 Speaker 3: as they were to their enemies, they weren't even good 46 00:02:30,440 --> 00:02:33,160 Speaker 3: to their friends. The fact is they stuck a knife 47 00:02:33,200 --> 00:02:37,919 Speaker 3: in Google, with the current Justice Department prosecution of Google 48 00:02:37,960 --> 00:02:40,600 Speaker 3: to break them up. They stuck a knife in Apple 49 00:02:40,880 --> 00:02:44,560 Speaker 3: by not fighting for them against the European Digital 50 00:02:44,600 --> 00:02:48,000 Speaker 3: Markets Act and Digital Services Act, where they're facing a fifteen billion 51 00:02:48,040 --> 00:02:56,200 Speaker 3: dollar fine. They were constantly twisting Amazon through the union 52 00:02:56,240 --> 00:03:04,520 Speaker 3: fights and through censorship demands of Amazon Books and multiple 53 00:03:04,560 --> 00:03:06,760 Speaker 3: other entanglements. I mean, I think the only one that's really 54 00:03:06,760 --> 00:03:09,400 Speaker 3: sticking by and refusing to come to the table right 55 00:03:09,400 --> 00:03:12,720 Speaker 3: now is Microsoft, which is a whole other can of worms.
56 00:03:13,560 --> 00:03:15,799 Speaker 2: So as we look at this, I want to kind 57 00:03:15,840 --> 00:03:19,560 Speaker 2: of break down what we think happened and what could 58 00:03:19,560 --> 00:03:22,639 Speaker 2: happen in the future. Because you recently spoke about the 59 00:03:22,680 --> 00:03:25,560 Speaker 2: history of the intelligence state, and in that speech you 60 00:03:25,639 --> 00:03:29,040 Speaker 2: talked about how we have interfered in elections in other 61 00:03:29,120 --> 00:03:32,960 Speaker 2: countries and used not just the media to do so. 62 00:03:33,000 --> 00:03:34,800 Speaker 2: I found it interesting to think that 63 00:03:35,120 --> 00:03:38,800 Speaker 2: we even made movies that we then projected into these 64 00:03:38,840 --> 00:03:42,160 Speaker 2: countries to change sentiment, to change ideas, to change hearts 65 00:03:42,160 --> 00:03:44,640 Speaker 2: and minds. And I think that we've seen that significantly 66 00:03:44,640 --> 00:03:47,680 Speaker 2: in the United States, whether it is a progressive agenda 67 00:03:48,000 --> 00:03:53,400 Speaker 2: or if it is actually to influence an election. You 68 00:03:53,480 --> 00:03:55,840 Speaker 2: see this in culture. You see this in TV shows. Man, 69 00:03:55,880 --> 00:03:58,720 Speaker 2: we started to see these 70 00:03:58,800 --> 00:04:01,720 Speaker 2: shows that would be, like, based on government, 71 00:04:01,720 --> 00:04:04,400 Speaker 2: and they would really be trashing a figure that was 72 00:04:04,520 --> 00:04:07,240 Speaker 2: like Trump, you know, and you would see them just 73 00:04:07,880 --> 00:04:10,840 Speaker 2: turning the country against someone. So did the 74 00:04:10,880 --> 00:04:13,720 Speaker 2: intelligence state take what they did in other countries and 75 00:04:13,760 --> 00:04:17,320 Speaker 2: bring it here and become this massive arm of propaganda?
76 00:04:18,360 --> 00:04:20,440 Speaker 3: Yes. I was actually looking around, if you saw me 77 00:04:20,480 --> 00:04:23,040 Speaker 3: looking around just now, I was looking for my copy 78 00:04:23,040 --> 00:04:26,960 Speaker 3: of Total Cold War, about the Eisenhower era and how 79 00:04:27,120 --> 00:04:30,360 Speaker 3: the defense and intelligence apparatus was doing that right here 80 00:04:30,400 --> 00:04:35,600 Speaker 3: at home in the nineteen fifties. But in fact, that 81 00:04:35,640 --> 00:04:38,159 Speaker 3: goes back to the nineteen forties and even 82 00:04:38,160 --> 00:04:41,520 Speaker 3: a little bit before, when there was a partnership between 83 00:04:41,520 --> 00:04:45,200 Speaker 3: early Hollywood and what at the time was just the 84 00:04:45,240 --> 00:04:48,600 Speaker 3: Department of War, which is now our Department of Defense. But 85 00:04:48,720 --> 00:04:51,480 Speaker 3: we didn't have a CIA when all this started. In 86 00:04:51,560 --> 00:04:53,560 Speaker 3: nineteen forty two, we had something called the Office of 87 00:04:53,600 --> 00:04:57,760 Speaker 3: War Information, which was a way for the War Department, 88 00:04:58,960 --> 00:05:04,159 Speaker 3: which is now our Pentagon, the Defense Department, to centralize 89 00:05:04,200 --> 00:05:07,600 Speaker 3: all media in the United States in order to do 90 00:05:07,839 --> 00:05:13,880 Speaker 3: a total production to galvanize propaganda and to make sure 91 00:05:14,000 --> 00:05:18,920 Speaker 3: that all modalities of media supported World War Two. So 92 00:05:19,279 --> 00:05:22,120 Speaker 3: Hollywood was wrapped up in that, the newspapers were wrapped 93 00:05:22,160 --> 00:05:24,880 Speaker 3: up in that, the radio stations were wrapped up in 94 00:05:24,920 --> 00:05:27,880 Speaker 3: that, the journals and periodicals were wrapped up in that.
95 00:05:28,960 --> 00:05:33,719 Speaker 3: Interestingly enough, all of the big three major Cold 96 00:05:33,760 --> 00:05:39,600 Speaker 3: War broadcasting stations, ABC, NBC, and CBS, had their founders come 97 00:05:39,640 --> 00:05:42,800 Speaker 3: out of the Office of War Information. So there's always 98 00:05:42,839 --> 00:05:49,720 Speaker 3: been this marriage between the war industry, statecraft, and 99 00:05:50,000 --> 00:05:53,360 Speaker 3: Hollywood and media. And in fact, during the Cold War, 100 00:05:53,400 --> 00:05:56,800 Speaker 3: we actually had a formal US government agency, it's called 101 00:05:56,800 --> 00:06:01,600 Speaker 3: the US Information Agency, which was a formal agency for 102 00:06:02,680 --> 00:06:06,560 Speaker 3: US propaganda, including at home. And you can freely see 103 00:06:06,560 --> 00:06:11,120 Speaker 3: this on YouTube, these sorts of videos that were made, 104 00:06:11,640 --> 00:06:15,000 Speaker 3: that were, I think, probably familiar to all of our 105 00:06:15,080 --> 00:06:19,640 Speaker 3: parents and grandparents, and which a lot of us sort of look back on, 106 00:06:19,800 --> 00:06:22,120 Speaker 3: as patriotic Americans, as 107 00:06:22,560 --> 00:06:27,719 Speaker 3: good propaganda. But you know, the fact is, this 108 00:06:28,040 --> 00:06:33,000 Speaker 3: pattern and practice of partnering with domestic media and Hollywood 109 00:06:33,040 --> 00:06:36,480 Speaker 3: allies to tilt the domestic culture to support a particular 110 00:06:36,920 --> 00:06:39,440 Speaker 3: objective has a very, very long history.
111 00:06:39,720 --> 00:06:43,320 Speaker 2: So what's your take, then, on Donald Trump suing Sixty Minutes, 112 00:06:43,440 --> 00:06:46,719 Speaker 2: suing ABC, suing the, what is it, the Des Moines 113 00:06:46,760 --> 00:06:49,400 Speaker 2: Dispatch or the Iowa Dispatch, one of the two, over 114 00:06:49,600 --> 00:06:52,880 Speaker 2: this poll that he says was election interference, which I 115 00:06:52,880 --> 00:06:56,880 Speaker 2: think we have been seeing polls and other things that 116 00:06:57,400 --> 00:06:58,640 Speaker 2: are very suspicious. 117 00:06:58,839 --> 00:07:01,200 Speaker 1: And do they change sentiment? 118 00:07:01,240 --> 00:07:03,360 Speaker 2: And so for people who see that and they say, well, 119 00:07:03,440 --> 00:07:06,080 Speaker 2: how does a poll actually influence? Well, not only does 120 00:07:06,080 --> 00:07:08,240 Speaker 2: it influence the people on the ground, because they may say, 121 00:07:08,240 --> 00:07:10,080 Speaker 2: why should I go out and vote if it's that 122 00:07:10,200 --> 00:07:12,920 Speaker 2: far? Why should I even? My vote won't matter. But 123 00:07:13,040 --> 00:07:14,280 Speaker 2: also donors and 124 00:07:14,200 --> 00:07:16,760 Speaker 1: people who would get involved. It keeps them out. 125 00:07:16,840 --> 00:07:18,920 Speaker 2: The minute you put something like that out there in 126 00:07:18,960 --> 00:07:22,320 Speaker 2: the paper, it stops all forward-moving progress. 127 00:07:23,200 --> 00:07:27,800 Speaker 3: Yeah, it's an interesting case. I mean, I'll 128 00:07:27,840 --> 00:07:30,200 Speaker 3: be frank, I don't know enough about it to feel 129 00:07:30,240 --> 00:07:34,720 Speaker 3: like... you know, I'd be talking out of turn.
130 00:07:35,600 --> 00:07:39,440 Speaker 3: You know, maybe the case does in fact have merit, 131 00:07:39,520 --> 00:07:43,920 Speaker 3: or there's things about it that Trump believes are significant 132 00:07:43,960 --> 00:07:46,320 Speaker 3: that I just don't know about because I haven't done 133 00:07:46,400 --> 00:07:50,200 Speaker 3: the diligence on it. My initial impression is, you know, 134 00:07:50,280 --> 00:07:55,680 Speaker 3: suing George Stephanopoulos for an obvious lie is one thing; 135 00:07:56,640 --> 00:07:59,720 Speaker 3: that's the sort of classic bread and butter of a defamation 136 00:07:59,800 --> 00:08:03,000 Speaker 3: suit in this country. Although I do think that the 137 00:08:03,040 --> 00:08:07,440 Speaker 3: defamation judgment system is completely out of hand; these 138 00:08:07,520 --> 00:08:11,760 Speaker 3: uncapped damages in defamation suits, in my opinion, are frankly insane. 139 00:08:11,920 --> 00:08:14,560 Speaker 1: Like we saw with Rudy Giuliani. 140 00:08:14,560 --> 00:08:17,120 Speaker 3: Like we saw with Rudy Giuliani, what we saw with 141 00:08:17,160 --> 00:08:22,120 Speaker 3: InfoWars, what we saw with Dominion, and frankly just, 142 00:08:22,400 --> 00:08:26,240 Speaker 3: you know, what we saw even with things like Gawker, 143 00:08:26,400 --> 00:08:28,920 Speaker 3: and the fact that you have to settle for fifteen 144 00:08:29,040 --> 00:08:34,679 Speaker 3: million because you might be facing down a one hundred 145 00:08:34,679 --> 00:08:39,360 Speaker 3: and fifty million dollar judgment. I do think that 146 00:08:39,559 --> 00:08:43,440 Speaker 3: capping damages on these sorts of suits is something that 147 00:08:43,600 --> 00:08:46,920 Speaker 3: the legislature should look into, because it has really become 148 00:08:47,720 --> 00:08:50,360 Speaker 3: an unbelievable weapon. I mean, we saw this even against 149 00:08:50,360 --> 00:08:54,160 Speaker 3: Trump himself.
The thing that he made the defamation suit 150 00:08:54,800 --> 00:08:59,440 Speaker 3: against Stephanopoulos over was claims relating to a defamation suit 151 00:08:59,440 --> 00:09:04,240 Speaker 3: against him from E. Jean Carroll, which was bankrolled by 152 00:09:04,800 --> 00:09:08,240 Speaker 3: a, you know, Reid Hoffman, the sort of dark 153 00:09:08,360 --> 00:09:11,199 Speaker 3: arts billionaire for the DNC. 154 00:09:11,840 --> 00:09:14,520 Speaker 3: This is a person who had, you know, made similar 155 00:09:14,520 --> 00:09:19,040 Speaker 3: accusations against an untold number of people, who was caught lying 156 00:09:19,080 --> 00:09:21,880 Speaker 3: about the dress, the claims that, you know, she 157 00:09:21,920 --> 00:09:23,760 Speaker 3: made, so many things that were just not 158 00:09:23,840 --> 00:09:27,520 Speaker 3: admissible in the court. And they 159 00:09:27,559 --> 00:09:30,360 Speaker 3: had back-channeled with the New York State Assembly in 160 00:09:30,480 --> 00:09:33,280 Speaker 3: order to, you know, bring some of these cases 161 00:09:33,280 --> 00:09:37,240 Speaker 3: in New York. But the fact is, 162 00:09:37,280 --> 00:09:41,080 Speaker 3: you could be right about something and lose 163 00:09:41,080 --> 00:09:44,400 Speaker 3: your entire livelihood and be bankrupted simply because the system 164 00:09:44,480 --> 00:09:48,120 Speaker 3: is corrupt. In Rudy Giuliani's case, he defaulted on 165 00:09:48,200 --> 00:09:51,680 Speaker 3: the defamation charges because the FBI took his 166 00:09:51,840 --> 00:09:55,400 Speaker 3: phone and his home computer and he couldn't afford the 167 00:09:55,440 --> 00:09:59,400 Speaker 3: twenty thousand dollars a month access fees for the information 168 00:09:59,480 --> 00:10:02,640 Speaker 3: needed to even defend himself.
And in Trump's case, you know, 169 00:10:03,160 --> 00:10:06,400 Speaker 3: he couldn't admit anything into court that actually absolved him. 170 00:10:06,440 --> 00:10:12,480 Speaker 3: And so it is a little unseemly for 171 00:10:12,720 --> 00:10:18,200 Speaker 3: a president to sue a media institution for defamation, given 172 00:10:18,240 --> 00:10:22,199 Speaker 3: how weaponized defamation is. At the same time, there's 173 00:10:22,320 --> 00:10:25,960 Speaker 3: been no bigger victim of that, in terms of US 174 00:10:26,080 --> 00:10:32,600 Speaker 3: presidents, than Trump, including on the defamation side. Suing over polls, 175 00:10:33,679 --> 00:10:35,600 Speaker 3: I don't know how I feel about that. I'd really 176 00:10:35,600 --> 00:10:39,439 Speaker 3: need to look at it. You know, immediately I bristle 177 00:10:39,480 --> 00:10:40,120 Speaker 3: at that, but... 178 00:10:41,600 --> 00:10:46,600 Speaker 2: I think it's interesting, because from the standpoint of 179 00:10:46,640 --> 00:10:49,520 Speaker 2: someone who ran for office and had things like this happen, 180 00:10:49,880 --> 00:10:53,760 Speaker 2: it is interesting because there is no way to stop 181 00:10:53,840 --> 00:10:56,559 Speaker 2: a rogue media if you are a candidate, because they 182 00:10:56,559 --> 00:10:58,719 Speaker 2: can say whatever they want under the guise of you're a 183 00:10:58,840 --> 00:11:01,080 Speaker 2: public figure, we can say whatever we want. And 184 00:11:01,120 --> 00:11:04,559 Speaker 2: I will say that I won the primary, and then 185 00:11:04,640 --> 00:11:07,960 Speaker 2: immediately one of our news organizations put out, oh, her 186 00:11:08,000 --> 00:11:10,439 Speaker 2: biggest donor isn't going to stay with her in the general. 187 00:11:10,800 --> 00:11:13,520 Speaker 1: Not a true story, no truth to it whatsoever.
188 00:11:13,800 --> 00:11:16,760 Speaker 2: But once it's out there, like I said, suddenly 189 00:11:16,800 --> 00:11:20,080 Speaker 2: the other donors are like, whoa, we should all sit 190 00:11:20,160 --> 00:11:24,920 Speaker 2: tight and not give. And that effectively stopped any funding 191 00:11:24,960 --> 00:11:28,400 Speaker 2: from coming in for enough weeks that the Democrats could 192 00:11:28,840 --> 00:11:32,320 Speaker 2: completely, you know, define who I was the minute we 193 00:11:32,320 --> 00:11:36,320 Speaker 2: were in the general. So the question then is... 194 00:11:36,559 --> 00:11:38,560 Speaker 2: it's a tough one, it's a double-edged sword, because you 195 00:11:38,559 --> 00:11:41,480 Speaker 2: have free speech. But then can you just lie and 196 00:11:41,679 --> 00:11:43,560 Speaker 2: interfere with an election like that? 197 00:11:43,559 --> 00:11:49,040 Speaker 1: It is. I think it's like a gray area, right? 198 00:11:49,120 --> 00:11:50,760 Speaker 3: It is. But you know, a lot of this is 199 00:11:50,760 --> 00:11:54,960 Speaker 3: adjudicated in public opinion on the basis of the credibility 200 00:11:55,040 --> 00:11:59,679 Speaker 3: of the media institution and the general legacy media 201 00:11:59,679 --> 00:12:02,600 Speaker 3: family at large. Frankly, I think that's been one of 202 00:12:02,600 --> 00:12:07,200 Speaker 3: the major sea change events of my lifetime, and certainly 203 00:12:07,200 --> 00:12:11,360 Speaker 3: of the past eighteen to twenty four months slash past 204 00:12:11,600 --> 00:12:16,360 Speaker 3: six to eight years: the total erosion, justly, 205 00:12:16,679 --> 00:12:21,040 Speaker 3: of trust in mainstream media headlines and legacy media headlines.
206 00:12:21,559 --> 00:12:24,319 Speaker 3: This is something Elon Musk probably tweets five to seven 207 00:12:24,320 --> 00:12:28,840 Speaker 3: times a day now: that you are the media. 208 00:12:28,880 --> 00:12:32,880 Speaker 3: And you see legacy media outlets constantly wailing and gnashing their teeth. 209 00:12:33,040 --> 00:12:36,319 Speaker 3: You see hundreds of millions of dollars in government funding 210 00:12:36,400 --> 00:12:42,760 Speaker 3: going to shore up trust in news, which is frankly insane. 211 00:12:42,880 --> 00:12:47,200 Speaker 3: But that's because many, many forces inside government know that 212 00:12:47,200 --> 00:12:50,720 Speaker 3: they can only achieve their aims politically if they have 213 00:12:51,040 --> 00:12:55,280 Speaker 3: allies in media who serve effectively as pawns, and where 214 00:12:55,280 --> 00:12:59,880 Speaker 3: there's a favors-for-favors relationship, and they need to 215 00:13:00,080 --> 00:13:08,679 Speaker 3: prop up their pawns against the total destruction of their ratings, 216 00:13:08,720 --> 00:13:12,320 Speaker 3: their revenue, and their reputation. And I think on the 217 00:13:12,320 --> 00:13:17,000 Speaker 3: reputation front, they've earned the position they're in now. 218 00:13:16,920 --> 00:13:19,760 Speaker 2: We've got more coming up with Mike Benz, but first, 219 00:13:19,880 --> 00:13:22,920 Speaker 2: let me tell you about my partners at Saber. Protecting 220 00:13:22,960 --> 00:13:26,520 Speaker 2: our families and homes is essential, but are we truly prepared? 221 00:13:26,880 --> 00:13:29,440 Speaker 2: Did you guys know that break-ins happen every twenty 222 00:13:29,520 --> 00:13:32,480 Speaker 2: five seconds? And even with a security system, you have 223 00:13:32,559 --> 00:13:35,000 Speaker 2: to ask yourself, can you really keep intruders out? 224 00:13:35,160 --> 00:13:35,439 Speaker 1: Well, 225 00:13:35,520 --> 00:13:38,360 Speaker 2: you have to layer your defenses and buy yourself time.
226 00:13:38,520 --> 00:13:42,200 Speaker 2: So start with Saber driveway alerts to know when someone's approaching, 227 00:13:42,280 --> 00:13:45,400 Speaker 2: and then pair them with floodlights to deter them. Saber 228 00:13:45,440 --> 00:13:48,440 Speaker 2: door security bars reinforce your front and back doors, stopping 229 00:13:48,600 --> 00:13:51,000 Speaker 2: up to six hundred and fifty pounds of force to 230 00:13:51,080 --> 00:13:53,760 Speaker 2: secure entry points even when you're not home. And if 231 00:13:53,760 --> 00:13:56,080 Speaker 2: you are home, I know this is scary, but many 232 00:13:56,080 --> 00:13:59,760 Speaker 2: invasions happen at night. Saber's Home Defense Launcher is the 233 00:14:00,040 --> 00:14:03,960 Speaker 2: ultimate choice to protect yourself and family. Saber projectiles hit 234 00:14:04,040 --> 00:14:07,160 Speaker 2: hard, causing intense pain, and can still be effective 235 00:14:07,160 --> 00:14:10,080 Speaker 2: if you miss, as intruders within the six-foot pepper cloud 236 00:14:10,280 --> 00:14:15,280 Speaker 2: will experience sensory irritation. Plus, Saber's Home Defense Launcher is 237 00:14:15,320 --> 00:14:19,440 Speaker 2: the only sixty eight caliber launcher with a seven-projectile capacity, 238 00:14:19,480 --> 00:14:22,640 Speaker 2: offering up to forty percent more shots than others. Stay 239 00:14:22,640 --> 00:14:26,400 Speaker 2: secure day or night with Saber Solutions. Visit Saber radio 240 00:14:26,520 --> 00:14:31,120 Speaker 2: dot com. That's Saber radio dot com. Or call eight 241 00:14:31,200 --> 00:14:34,360 Speaker 2: four four eight two four SAFE today to protect what 242 00:14:34,520 --> 00:14:35,280 Speaker 2: matters most. 243 00:14:35,520 --> 00:14:37,560 Speaker 1: Now, stay tuned. We've got more after this.
244 00:14:41,200 --> 00:14:44,440 Speaker 2: So let's talk about Reuters, because you've been talking about 245 00:14:44,440 --> 00:14:48,760 Speaker 2: this: the Biden administration paying over three hundred million in government 246 00:14:48,800 --> 00:14:51,720 Speaker 2: contracts, and then suddenly there seems to be a connection 247 00:14:51,840 --> 00:14:56,520 Speaker 2: between those government contracts and a lot of targeted investigations 248 00:14:56,560 --> 00:14:58,480 Speaker 2: against Elon Musk and his businesses. 249 00:14:58,520 --> 00:15:00,360 Speaker 1: So explain that whole scenario. 250 00:15:01,040 --> 00:15:04,280 Speaker 3: Well, the connection is that all eleven agencies under the 251 00:15:04,280 --> 00:15:15,360 Speaker 3: Biden administration who have filed actions against Elon Musk's business assets, Tesla, SpaceX, 252 00:15:15,400 --> 00:15:20,280 Speaker 3: Neuralink, and X. All eleven of those, whether that's the 253 00:15:20,400 --> 00:15:24,440 Speaker 3: Justice Department, the Department of Transportation, the SEC, the FTC, 254 00:15:24,880 --> 00:15:27,960 Speaker 3: the FCC, you name it, they all have multimillion-dollar 255 00:15:28,040 --> 00:15:33,320 Speaker 3: contracts with Reuters. And Reuters this year won the Pulitzer 256 00:15:33,400 --> 00:15:40,080 Speaker 3: Prize for their investigative work into the misconduct happening at 257 00:15:40,120 --> 00:15:47,520 Speaker 3: all of the Elon Musk businesses under investigation: Tesla, Neuralink, SpaceX, 258 00:15:47,840 --> 00:15:53,600 Speaker 3: and X. So there is this strange coincidence where you 259 00:15:53,800 --> 00:15:59,040 Speaker 3: have an obviously bad moral hazard going on. Now again, 260 00:15:59,120 --> 00:16:02,720 Speaker 3: I'm not alleging a direct pay-to-play here. 261 00:16:02,840 --> 00:16:07,360 Speaker 3: This is something where, and again, let me, 262 00:16:07,440 --> 00:16:11,640 Speaker 3: you know, totally just steel-man this.
The 263 00:16:11,720 --> 00:16:15,040 Speaker 3: fact is, most of those contract dollars are to 264 00:16:15,520 --> 00:16:19,280 Speaker 3: lines of business at Reuters that are not their newswire. So, 265 00:16:19,320 --> 00:16:24,400 Speaker 3: for example, Reuters has something called Thomson Reuters Special Services, 266 00:16:24,880 --> 00:16:29,800 Speaker 3: which does network intelligence, and that assists law 267 00:16:29,880 --> 00:16:34,840 Speaker 3: enforcement with being able to investigate suspects by culling, you know, 268 00:16:34,920 --> 00:16:39,440 Speaker 3: all the news about them, all the information aggregated online. 269 00:16:39,680 --> 00:16:44,840 Speaker 3: They have something called Thomson Reuters Markets for 270 00:16:44,960 --> 00:16:48,960 Speaker 3: corporations, and, you know, the Treasury and other agencies 271 00:16:49,120 --> 00:16:54,320 Speaker 3: purchase those. But the fact is, we've 272 00:16:54,360 --> 00:16:57,120 Speaker 3: seen this.
I saw this myself when I was in 273 00:16:57,160 --> 00:17:00,520 Speaker 3: the White House, and I saw this on 274 00:17:00,560 --> 00:17:07,000 Speaker 3: the outside when our foundation was investigating the censorship industry, 275 00:17:07,080 --> 00:17:09,959 Speaker 3: which is that there are these constant favors-for-favors 276 00:17:10,000 --> 00:17:14,679 Speaker 3: relationships that corporations have with government, especially when it comes 277 00:17:14,720 --> 00:17:19,399 Speaker 3: to getting contracts or getting favors from the government, where 278 00:17:19,720 --> 00:17:24,040 Speaker 3: a corporation will often have non-core elements of what 279 00:17:24,119 --> 00:17:27,240 Speaker 3: it does that can help the government in some way, 280 00:17:27,720 --> 00:17:32,720 Speaker 3: and if they leverage those outside elements to help the government, 281 00:17:33,400 --> 00:17:36,800 Speaker 3: another aspect of their business will be helped by the 282 00:17:36,840 --> 00:17:42,520 Speaker 3: government, because the grant administrator, the White House, the agency 283 00:17:42,600 --> 00:17:46,080 Speaker 3: head will see the favor done and re-pick them for 284 00:17:46,119 --> 00:17:49,159 Speaker 3: the contract. And a great example of this is Facebook. 285 00:17:49,960 --> 00:17:53,600 Speaker 3: We were talking just now about Mark Zuckerberg now donating 286 00:17:53,640 --> 00:17:56,480 Speaker 3: to the Trump inaugural fund and coming to Mar-a-Lago 287 00:17:56,520 --> 00:18:01,000 Speaker 3: and apologizing for censoring Trump and blaming that on 288 00:18:01,040 --> 00:18:03,720 Speaker 3: pressure from the Biden administration. Well, if you go back 289 00:18:03,760 --> 00:18:06,160 Speaker 3: and look at the Facebook files, which were the subpoenaed 290 00:18:06,560 --> 00:18:12,840 Speaker 3: documents from Jim Jordan's Weaponization Subcommittee on Facebook, we now 291 00:18:12,920 --> 00:18:15,840 Speaker 3: know the reason.
We know this from direct emails between 292 00:18:15,880 --> 00:18:18,600 Speaker 3: Mark Zuckerberg and his top lieutenant, the head of public 293 00:18:18,640 --> 00:18:22,879 Speaker 3: policy at Facebook, Nick Clegg. When they 294 00:18:22,920 --> 00:18:25,879 Speaker 3: were told by the Biden White House to censor all 295 00:18:25,880 --> 00:18:31,399 Speaker 3: claims about COVID origins, censor claims about vaccines, censor claims 296 00:18:31,440 --> 00:18:36,840 Speaker 3: that, you know, criticized the Biden administration's public 297 00:18:36,880 --> 00:18:41,639 Speaker 3: health response to COVID, Mark Zuckerberg at first put up 298 00:18:41,640 --> 00:18:43,679 Speaker 3: a little bit of a fight on that, you know, 299 00:18:43,760 --> 00:18:45,960 Speaker 3: calling it an unusual request, saying, do we really have 300 00:18:46,040 --> 00:18:49,719 Speaker 3: to do this? And Nick Clegg, running the public policy division, 301 00:18:49,800 --> 00:18:53,840 Speaker 3: responded by saying, we need to think creatively about ways 302 00:18:53,880 --> 00:18:58,640 Speaker 3: to be receptive to the Biden administration's demands for takedowns, because 303 00:18:58,640 --> 00:19:01,679 Speaker 3: we have multiple fish to... I'm sorry, we have bigger 304 00:19:01,720 --> 00:19:05,680 Speaker 3: fish to fry with the Biden administration on multiple policy fronts. 305 00:19:06,800 --> 00:19:09,840 Speaker 3: And at the time, they needed the Biden administration to 306 00:19:09,960 --> 00:19:13,400 Speaker 3: fight off the European regulators so that the State Department 307 00:19:13,480 --> 00:19:18,520 Speaker 3: diplomats would argue with their European counterparts to shave down 308 00:19:18,560 --> 00:19:21,800 Speaker 3: the Digital Markets Act and the Digital Services Act.
That 309 00:19:21,880 --> 00:19:25,440 Speaker 3: they needed the Biden administration's help staving off regulatory pressure 310 00:19:25,840 --> 00:19:30,120 Speaker 3: and data-monopoly pressures around the world, and that if 311 00:19:30,160 --> 00:19:33,880 Speaker 3: they put a favor in the favor bank on censoring 312 00:19:33,960 --> 00:19:37,200 Speaker 3: news stories the Biden administration didn't like, then they'd 313 00:19:37,200 --> 00:19:39,920 Speaker 3: be more likely to get the favor in return 314 00:19:40,640 --> 00:19:42,879 Speaker 3: on the bigger fish they have to fry on the 315 00:19:42,880 --> 00:19:47,440 Speaker 3: business side and on multiple policy fronts that have nothing 316 00:19:47,480 --> 00:19:52,760 Speaker 3: to do with the censorship of COVID. So you 317 00:19:52,800 --> 00:19:57,040 Speaker 3: can see the perverse incentive for Thomson Reuters in this case, 318 00:19:57,600 --> 00:19:59,879 Speaker 3: where they've gotten one point five to six billion 319 00:20:00,000 --> 00:20:03,000 Speaker 3: dollars in government contracts. They're a major government contractor. They 320 00:20:03,000 --> 00:20:05,800 Speaker 3: got three hundred million from the Biden administration alone. The 321 00:20:05,840 --> 00:20:10,160 Speaker 3: Biden administration is targeting Elon Musk. If they 322 00:20:10,200 --> 00:20:14,199 Speaker 3: direct their reporting against the targets of those investigations, you 323 00:20:14,240 --> 00:20:16,800 Speaker 3: can see how that would obviously tilt the calculus in 324 00:20:16,880 --> 00:20:18,719 Speaker 3: terms of them securing future contracts. 325 00:20:19,080 --> 00:20:21,359 Speaker 2: But I mean, aren't you kind of describing a lot 326 00:20:21,400 --> 00:20:24,000 Speaker 2: of companies? It's not just a censorship thing. I mean, 327 00:20:24,320 --> 00:20:26,879 Speaker 2: this is lobbying.
It's like, you know, 328 00:20:26,920 --> 00:20:29,600 Speaker 2: you've got how did big Pharma get so big? Hey, 329 00:20:30,400 --> 00:20:33,360 Speaker 2: you help us and we'll put some money into your 330 00:20:33,440 --> 00:20:36,680 Speaker 2: campaign account. You help us, we'll get you elected. I mean, 331 00:20:37,000 --> 00:20:40,800 Speaker 2: isn't this the corruption of crony capitalism? 332 00:20:41,400 --> 00:20:42,960 Speaker 3: Right? Well, I think what it gets to, the 333 00:20:43,080 --> 00:20:46,280 Speaker 3: larger issue, is that if indeed this was the case 334 00:20:47,440 --> 00:20:53,320 Speaker 3: that Reuters' targeting of Elon Musk and winning the 335 00:20:53,320 --> 00:20:57,960 Speaker 3: Pulitzer for that was used by those government agencies paying 336 00:20:58,160 --> 00:21:01,280 Speaker 3: Reuters as a government contractor, or whether that was 337 00:21:01,320 --> 00:21:06,080 Speaker 3: done, you know, in tandem, then to me it really dissolves 338 00:21:06,160 --> 00:21:10,200 Speaker 3: the concept of independent media in terms of Reuters's reputation, 339 00:21:10,440 --> 00:21:15,480 Speaker 3: because Reuters and AP are the two big 340 00:21:15,840 --> 00:21:19,760 Speaker 3: news wires. They are not just news media companies. They 341 00:21:19,760 --> 00:21:24,400 Speaker 3: are the wire that news comes in on to news 342 00:21:24,440 --> 00:21:30,280 Speaker 3: media companies. You know, most news institutions don't have boots 343 00:21:30,280 --> 00:21:35,840 Speaker 3: on the ground in Tunisia or Egypt, or Malaysia or Tanzania.
344 00:21:36,840 --> 00:21:40,240 Speaker 3: Everything that they report on that's happening in foreign wars, 345 00:21:40,800 --> 00:21:44,840 Speaker 3: that's happening in the day to day political or cultural 346 00:21:45,200 --> 00:21:49,840 Speaker 3: or economic events in foreign countries, they rely on the 347 00:21:49,880 --> 00:21:54,240 Speaker 3: truth and reputation of Reuters as the source for stories 348 00:21:54,240 --> 00:21:57,280 Speaker 3: coming in off the wire. And if Reuters is not 349 00:21:57,480 --> 00:22:02,080 Speaker 3: really an independent agency so much as it is state media, 350 00:22:02,680 --> 00:22:06,200 Speaker 3: then you have a very different lens through which you 351 00:22:06,240 --> 00:22:09,760 Speaker 3: should view the credibility of Reuters. And you know, I 352 00:22:09,800 --> 00:22:13,600 Speaker 3: posted on my timeline on X recently the long history 353 00:22:13,640 --> 00:22:17,000 Speaker 3: of Reuters working with the Central Intelligence Agency. This is 354 00:22:17,000 --> 00:22:19,840 Speaker 3: something that goes back a long time. Republicans did not 355 00:22:19,960 --> 00:22:22,679 Speaker 3: have a problem with it during the Cold War because 356 00:22:22,720 --> 00:22:26,520 Speaker 3: it was by and large targeting left-wing governments. But now 357 00:22:26,560 --> 00:22:29,919 Speaker 3: it's come full circle and the universal thump is 358 00:22:29,960 --> 00:22:30,800 Speaker 3: being passed around. 359 00:22:31,400 --> 00:22:33,800 Speaker 2: We've got more after this with Mike Benz, but first 360 00:22:33,880 --> 00:22:36,920 Speaker 2: let me tell you about my partners at Saber. Protecting 361 00:22:36,960 --> 00:22:39,560 Speaker 2: our families and homes is essential, but you have to 362 00:22:39,600 --> 00:22:42,800 Speaker 2: ask yourself, are you truly prepared? Did you know that 363 00:22:42,840 --> 00:22:46,080 Speaker 2: break-ins happen every twenty five seconds?
And guys, even with 364 00:22:46,160 --> 00:22:48,920 Speaker 2: a security system, it can be really hard to keep 365 00:22:48,920 --> 00:22:51,920 Speaker 2: intruders out. But if you layer your defenses, you buy 366 00:22:51,960 --> 00:22:55,399 Speaker 2: yourself time. So start with Saber driveway alerts so you 367 00:22:55,440 --> 00:22:58,320 Speaker 2: know when someone's approaching. Pair them with floodlights to deter 368 00:22:58,480 --> 00:23:02,080 Speaker 2: the person. Saber's door security bars reinforce your front and 369 00:23:02,160 --> 00:23:05,159 Speaker 2: back doors, stopping up to six hundred and fifty pounds 370 00:23:05,160 --> 00:23:08,280 Speaker 2: of force to secure entry points even when you're not home, 371 00:23:08,640 --> 00:23:11,520 Speaker 2: and if you are home, many of these invasions are 372 00:23:11,560 --> 00:23:15,240 Speaker 2: happening at night. Saber Home Defense Launcher is the ultimate 373 00:23:15,320 --> 00:23:19,160 Speaker 2: choice to protect you and your family. Saber projectiles hit hard, 374 00:23:19,280 --> 00:23:22,840 Speaker 2: causing intense pain, and even if you miss, your intruder 375 00:23:22,920 --> 00:23:25,480 Speaker 2: is still going to suffer from the six foot pepper 376 00:23:25,480 --> 00:23:29,600 Speaker 2: cloud and they're going to experience sensory irritation. Plus, Saber's 377 00:23:29,680 --> 00:23:32,919 Speaker 2: home defense launcher is the only sixty eight caliber launcher 378 00:23:32,960 --> 00:23:36,840 Speaker 2: with a seven projectile capacity, offering up to forty percent 379 00:23:36,960 --> 00:23:40,000 Speaker 2: more shots than others. Stay secure day or night with 380 00:23:40,080 --> 00:23:46,040 Speaker 2: Saber solutions. Visit saberradio dot com. That's Saber radio dot com, 381 00:23:46,160 --> 00:23:49,440 Speaker 2: or call eight four four eight two four safe today 382 00:23:49,480 --> 00:23:51,280 Speaker 2: to protect what matters most now.
383 00:23:51,320 --> 00:23:53,479 Speaker 1: Stay tuned. We'll be back with more after this. 384 00:23:57,240 --> 00:24:00,840 Speaker 2: So when does this kind of behavior then start to 385 00:24:01,760 --> 00:24:04,600 Speaker 2: impact other media outlets? Because I think, you know, if 386 00:24:04,640 --> 00:24:08,399 Speaker 2: Reuters is getting paid, but they're a huge organization, so 387 00:24:08,800 --> 00:24:11,920 Speaker 2: you know, that's not clearly obvious to anybody. If you're 388 00:24:11,920 --> 00:24:14,439 Speaker 2: reading what they're writing, it's not like you're like, oh, well, 389 00:24:14,440 --> 00:24:17,399 Speaker 2: that was probably because they are getting taxpayer money. So 390 00:24:17,880 --> 00:24:20,760 Speaker 2: if you are a smaller news organization, because we're struggling 391 00:24:20,840 --> 00:24:23,800 Speaker 2: with this in Michigan right now, and I would imagine 392 00:24:23,840 --> 00:24:27,320 Speaker 2: that probably in most swing states they also have some 393 00:24:30,160 --> 00:24:34,320 Speaker 2: copycat behavior like this, where we have a media that 394 00:24:34,480 --> 00:24:37,920 Speaker 2: is totally in bed with the Democrat Party. I mean, gosh, 395 00:24:38,160 --> 00:24:41,280 Speaker 2: just in the past few days we've even had the 396 00:24:42,720 --> 00:24:46,040 Speaker 2: Speaker-elect come out and say, look, it's bizarre. We're 397 00:24:46,080 --> 00:24:49,800 Speaker 2: seeing the media tweet things out that the Democrats are 398 00:24:49,840 --> 00:24:53,080 Speaker 2: then obediently doing because the media has said you need 399 00:24:53,119 --> 00:24:55,399 Speaker 2: to go out and do this. It's not even, there's 400 00:24:55,440 --> 00:24:57,119 Speaker 2: not even a firewall between it. 401 00:24:57,119 --> 00:24:59,600 Speaker 1: It's just very obvious. Hey go do this. They go 402 00:24:59,680 --> 00:25:02,120 Speaker 1: do it. Hey go do this, go do it.
There's 403 00:25:02,160 --> 00:25:04,560 Speaker 1: a weird connection where there 404 00:25:04,440 --> 00:25:07,320 Speaker 2: is no longer, it's like groupthink in the media and 405 00:25:07,359 --> 00:25:10,920 Speaker 2: the Democrat side, and it is an attack against half 406 00:25:11,000 --> 00:25:14,119 Speaker 2: of the state, but not just the electeds. 407 00:25:14,160 --> 00:25:15,919 Speaker 1: It's an attack against the people. 408 00:25:17,720 --> 00:25:19,840 Speaker 3: Yeah, I mean, it's a nasty thing. But 409 00:25:19,920 --> 00:25:22,520 Speaker 3: that goes back to the history of, you know, media 410 00:25:22,600 --> 00:25:24,800 Speaker 3: has always been this way, you can argue. I mean, 411 00:25:25,440 --> 00:25:27,439 Speaker 3: I think Marc Andreessen told a funny story on 412 00:25:27,520 --> 00:25:31,640 Speaker 3: Joe Rogan about, you know, the history of the seventeen 413 00:25:31,760 --> 00:25:34,879 Speaker 3: hundreds in media and presidents fighting each other in the 414 00:25:35,000 --> 00:25:41,119 Speaker 3: papers, using pseudonyms, and you know, 415 00:25:41,160 --> 00:25:45,600 Speaker 3: you look at the history of yellow journalism and you 416 00:25:45,600 --> 00:25:48,879 Speaker 3: know on into today, you know that is part of 417 00:25:48,880 --> 00:25:52,080 Speaker 3: the rough and tumble.
There is a sort of funny 418 00:25:52,080 --> 00:25:56,159 Speaker 3: phenomenon where we often talk about state-run media like 419 00:25:56,560 --> 00:26:01,400 Speaker 3: Russia Today or, you know, the China Daily, or these 420 00:26:01,400 --> 00:26:04,960 Speaker 3: sorts of, it's PBS or NPR or BBC, but we 421 00:26:05,000 --> 00:26:08,560 Speaker 3: don't really talk so much about a media-run state, 422 00:26:09,200 --> 00:26:14,520 Speaker 3: which is the funny phenomenon where political parties and governments 423 00:26:14,560 --> 00:26:17,640 Speaker 3: often feel like they are controlled by the media because 424 00:26:17,760 --> 00:26:23,000 Speaker 3: their political bona fides depend on the support of the media. 425 00:26:21,880 --> 00:26:26,520 Speaker 3: If Trump is in the White House, for example, 426 00:26:26,520 --> 00:26:31,000 Speaker 3: and he's already being pilloried by Democrat media and they're 427 00:26:31,119 --> 00:26:35,560 Speaker 3: neck and neck in the polls, assuming some credibility to them, 428 00:26:36,160 --> 00:26:40,120 Speaker 3: then if there's a dog pile from Fox News or 429 00:26:40,160 --> 00:26:46,320 Speaker 3: from other significant sources of media support, then that could 430 00:26:46,640 --> 00:26:50,240 Speaker 3: crater the White House's political support. It could collapse support 431 00:26:50,240 --> 00:26:53,960 Speaker 3: in Congress for White House objectives, even with the Republican Congress.
432 00:26:54,480 --> 00:26:57,399 Speaker 3: And so, you know, a political party or an in-power 433 00:26:57,480 --> 00:27:02,840 Speaker 3: government will often feel beholden to its media allies just 434 00:27:02,880 --> 00:27:05,840 Speaker 3: because it needs them for its own political survival, and 435 00:27:05,960 --> 00:27:09,560 Speaker 3: so will almost take instructional orders, it can seem, sometimes, 436 00:27:10,280 --> 00:27:11,919 Speaker 3: and you can, you know, sort of think of that 437 00:27:12,000 --> 00:27:14,359 Speaker 3: as a media-run state. Of course, there's 438 00:27:14,359 --> 00:27:17,119 Speaker 3: another aspect of it, which is that most, you know, 439 00:27:17,119 --> 00:27:21,040 Speaker 3: most media organizations are owned by, at least at the 440 00:27:21,119 --> 00:27:25,199 Speaker 3: large level, excluding, you know, local news, but most major 441 00:27:25,400 --> 00:27:28,880 Speaker 3: news media outlets are owned by a billionaire oligarch. 442 00:27:28,920 --> 00:27:31,879 Speaker 3: I mean, you know this with, you know, not just 443 00:27:32,000 --> 00:27:40,280 Speaker 3: the, you know, the Samantha Powell, the Jeff Bezos types, 444 00:27:40,800 --> 00:27:43,160 Speaker 3: the, you know, the owner of the LA Times who 445 00:27:43,200 --> 00:27:46,480 Speaker 3: just recently came out and just, you know, said he 446 00:27:46,560 --> 00:27:48,159 Speaker 3: was going to try to move that more center. But 447 00:27:48,359 --> 00:27:51,040 Speaker 3: Elon Musk, you know, you would say, about X. 448 00:27:51,240 --> 00:27:57,119 Speaker 3: There's, you know, the famous New York Times Carlos Slim intervention, 449 00:27:57,359 --> 00:28:02,240 Speaker 3: the, you know, wealthiest man in Mexico.
This is, 450 00:28:02,400 --> 00:28:07,879 Speaker 3: large media is a billionaire's plaything, and so often, you know, 451 00:28:08,160 --> 00:28:10,600 Speaker 3: there's a relationship between the messaging coming from the media 452 00:28:10,640 --> 00:28:13,640 Speaker 3: and the messaging coming from that faction of the donors. 453 00:28:13,960 --> 00:28:15,920 Speaker 1: Well, and that I think is very interesting. 454 00:28:15,920 --> 00:28:18,040 Speaker 2: I mean, we were talking about this the other day, about 455 00:28:18,359 --> 00:28:21,200 Speaker 2: how we talk about Ukraine and we talk about Russia 456 00:28:21,240 --> 00:28:25,320 Speaker 2: with these oligarchs that are really moving the chess pieces. 457 00:28:24,960 --> 00:28:25,440 Speaker 1: On the board. 458 00:28:25,480 --> 00:28:27,760 Speaker 2: But it is not that much different in the United 459 00:28:27,800 --> 00:28:31,440 Speaker 2: States because we still have the same, like you said, 460 00:28:31,480 --> 00:28:35,679 Speaker 2: the billionaires who own these companies, they're also 461 00:28:35,800 --> 00:28:39,160 Speaker 2: funding the candidates. So you have a lot of people 462 00:28:39,200 --> 00:28:41,320 Speaker 2: who think that it is we the people. And I think 463 00:28:41,320 --> 00:28:43,960 Speaker 2: that Donald Trump had a big impact on that because 464 00:28:44,000 --> 00:28:47,080 Speaker 2: when the donors said, hey, it's not you, it's Ron 465 00:28:47,080 --> 00:28:50,360 Speaker 2: DeSantis or it's Nikki Haley, he went, well, fine, I 466 00:28:50,400 --> 00:28:50,920 Speaker 2: don't need you. 467 00:28:51,200 --> 00:28:52,000 Speaker 1: I have the people. 468 00:28:52,280 --> 00:28:55,840 Speaker 2: And it was shocking, I think, to the oligarch class, 469 00:28:55,880 --> 00:28:59,240 Speaker 2: to the billionaire class, because they went, no, no. 470 00:28:59,480 --> 00:29:00,280 Speaker 1: We can try this.
471 00:29:00,800 --> 00:29:04,960 Speaker 2: We move the chess pieces, and then they saw, oh crap. Actually, 472 00:29:05,320 --> 00:29:07,760 Speaker 2: in this case, the people are moving the chess pieces, 473 00:29:07,800 --> 00:29:11,480 Speaker 2: they're taking it back. How do we continue to push 474 00:29:11,640 --> 00:29:14,520 Speaker 2: back on those folks that think they have total control? 475 00:29:15,640 --> 00:29:18,840 Speaker 3: Well, to me, my answer to that is you have 476 00:29:18,920 --> 00:29:24,120 Speaker 3: to support places like X and places like, you know, Rumble, 477 00:29:24,200 --> 00:29:28,560 Speaker 3: and places that have unfettered, more or less, you know, 478 00:29:28,640 --> 00:29:32,800 Speaker 3: free speech that allows you to add your one small 479 00:29:32,880 --> 00:29:36,320 Speaker 3: voice to the masses. You know, I think about that 480 00:29:36,400 --> 00:29:40,520 Speaker 3: meme frequently where, you know, there's a shark and then 481 00:29:40,560 --> 00:29:45,320 Speaker 3: there's a minnow, and then, you know, the next panel 482 00:29:45,440 --> 00:29:48,880 Speaker 3: is the shark running away, because now you have so 483 00:29:48,960 --> 00:29:53,440 Speaker 3: many minnows that they're bigger than the shark, and 484 00:29:53,440 --> 00:29:57,240 Speaker 3: that really is that relationship, I think, between the sort 485 00:29:57,280 --> 00:30:02,480 Speaker 3: of legacy billionaire media, you know, the media owner overclass, 486 00:30:03,040 --> 00:30:08,040 Speaker 3: and the billions of people who can aggregate together on 487 00:30:08,120 --> 00:30:13,920 Speaker 3: social media and overcome, you know, collectively the power that's 488 00:30:14,040 --> 00:30:17,680 Speaker 3: held in the more centralized form by a media owner.
489 00:30:17,720 --> 00:30:20,480 Speaker 3: And of course there's not a small amount of 490 00:30:20,480 --> 00:30:23,040 Speaker 3: irony in the fact that X, which is where so 491 00:30:23,160 --> 00:30:26,000 Speaker 3: much of this is happening, and so much of this 492 00:30:26,080 --> 00:30:34,320 Speaker 3: counterforce, this decentralizing of media power, is owned 493 00:30:34,360 --> 00:30:39,160 Speaker 3: by the wealthiest man in the entire world. So, you know, 494 00:30:39,360 --> 00:30:42,600 Speaker 3: there is something of a, you know, a noblesse 495 00:30:42,640 --> 00:30:46,240 Speaker 3: oblige, there's a little bit of a we are still 496 00:30:46,520 --> 00:30:53,760 Speaker 3: dependent on benevolent dictators, if you will. But, you know, 497 00:30:53,960 --> 00:30:55,600 Speaker 3: you take freedom any way you can get it. 498 00:30:56,400 --> 00:30:58,800 Speaker 1: Yeah, and I think that's, I love that. I think 499 00:30:58,800 --> 00:30:59,600 Speaker 1: that's a great point. 500 00:31:00,360 --> 00:31:03,360 Speaker 2: You mentioned at the beginning that Elon Musk is constantly 501 00:31:03,400 --> 00:31:06,120 Speaker 2: saying you are the media, but you have to continue 502 00:31:06,120 --> 00:31:08,800 Speaker 2: that pressure. I mean, this is a pressure campaign of 503 00:31:08,840 --> 00:31:11,080 Speaker 2: making sure the right thing is happening. And I find 504 00:31:11,080 --> 00:31:13,920 Speaker 2: this interesting right now as I look at my own state, 505 00:31:14,000 --> 00:31:19,200 Speaker 2: and I have Democrats who have never, well, some of 506 00:31:19,240 --> 00:31:21,920 Speaker 2: them have pushed back, but some of them have always 507 00:31:22,000 --> 00:31:24,320 Speaker 2: just been, you know, go with what the party line is.
508 00:31:24,360 --> 00:31:27,280 Speaker 2: And they're pushing back against the governor and saying, you're 509 00:31:27,280 --> 00:31:28,960 Speaker 2: not going to force us into this, and you're not 510 00:31:29,000 --> 00:31:30,680 Speaker 2: going to force us into that. And I find it 511 00:31:30,720 --> 00:31:33,920 Speaker 2: interesting because the things that they're fighting back about are, 512 00:31:34,920 --> 00:31:37,440 Speaker 2: don't vote on corporate welfare if you haven't taken care 513 00:31:37,480 --> 00:31:40,600 Speaker 2: of Detroit. Don't vote on corporate welfare if you haven't 514 00:31:40,640 --> 00:31:42,880 Speaker 2: taken care of the water, you know. And this is like, 515 00:31:43,520 --> 00:31:47,160 Speaker 2: I think this is a change because of this election 516 00:31:47,640 --> 00:31:50,040 Speaker 2: and because of what you're saying, because I don't think 517 00:31:50,080 --> 00:31:53,320 Speaker 2: the people felt empowered in the past in their own 518 00:31:53,400 --> 00:31:56,080 Speaker 2: party to push back and say, well, wait a minute, 519 00:31:56,240 --> 00:31:57,720 Speaker 2: we're not going to vote on a 520 00:31:57,680 --> 00:32:00,000 Speaker 1: bunch of crap for the other rich people. 521 00:32:00,040 --> 00:32:02,640 Speaker 2: Well, when we don't have clean water and we don't 522 00:32:02,680 --> 00:32:05,200 Speaker 2: have our communities taken care of, and we don't have 523 00:32:05,440 --> 00:32:09,160 Speaker 2: a good public safety record, come on, take care 524 00:32:09,240 --> 00:32:12,440 Speaker 2: of people. Now it's the little guy speaking up. It's 525 00:32:12,480 --> 00:32:16,560 Speaker 2: the average American citizen who is saying, pay attention to us. 526 00:32:16,920 --> 00:32:18,320 Speaker 1: And I think.
527 00:32:18,400 --> 00:32:21,840 Speaker 2: I love the fact that that imagery of the minnows, 528 00:32:21,880 --> 00:32:24,360 Speaker 2: because I do believe that that's what we've seen on 529 00:32:24,680 --> 00:32:27,680 Speaker 2: X is that push, but that has kind of allowed 530 00:32:27,760 --> 00:32:30,080 Speaker 2: people to feel like, I have the minnows with 531 00:32:30,040 --> 00:32:34,360 Speaker 3: me. Totally, there's safety in numbers, yeah, and they 532 00:32:34,400 --> 00:32:40,080 Speaker 3: know it. And not only that, the huge variety, diversity, 533 00:32:41,360 --> 00:32:44,840 Speaker 3: you know, of those masses makes it a 534 00:32:44,920 --> 00:32:48,360 Speaker 3: hard target, because you can't just take out one thing 535 00:32:48,800 --> 00:32:52,200 Speaker 3: and, you know, and take it all, unless you're taking 536 00:32:52,200 --> 00:32:55,840 Speaker 3: out the platform itself, which is why they are trying 537 00:32:55,880 --> 00:32:58,640 Speaker 3: to do that to Elon in the EU, in Brazil, 538 00:32:58,840 --> 00:33:04,040 Speaker 3: in Australia, in Pakistan. And I have no doubt if 539 00:33:04,600 --> 00:33:06,600 Speaker 3: the election had gone the other way it would have been 540 00:33:06,720 --> 00:33:07,760 Speaker 3: in the United States. 541 00:33:08,280 --> 00:33:11,959 Speaker 1: So, you know, well, he had no doubt either. He 542 00:33:12,000 --> 00:33:14,239 Speaker 1: was like, I'm screwed if this doesn't work out, you know, 543 00:33:15,160 --> 00:33:16,800 Speaker 1: he has to win or I'm in trouble. 544 00:33:17,040 --> 00:33:19,800 Speaker 2: But I mean, the point you make is fantastic because 545 00:33:20,080 --> 00:33:23,080 Speaker 2: I actually called one of these legislators and I said, hey, 546 00:33:23,160 --> 00:33:25,320 Speaker 2: I know we're on opposite teams, but I just want 547 00:33:25,360 --> 00:33:27,920 Speaker 2: to say thank you.
I'm sure you're getting a lot 548 00:33:27,960 --> 00:33:30,320 Speaker 2: of pushback because of what you're doing, but thank you. 549 00:33:30,720 --> 00:33:33,880 Speaker 2: And she, amazingly, said, no, no pushback. 550 00:33:34,000 --> 00:33:35,800 Speaker 1: My constituents are thrilled. 551 00:33:36,120 --> 00:33:39,880 Speaker 2: I am getting so much support, support for keeping people 552 00:33:39,920 --> 00:33:43,640 Speaker 2: from voting, because, you know, she's standing in the way 553 00:33:43,680 --> 00:33:46,520 Speaker 2: and saying, I'm not going to allow bogus bills to 554 00:33:46,520 --> 00:33:49,240 Speaker 2: go through while the people in my district are suffering. 555 00:33:49,600 --> 00:33:52,280 Speaker 2: So I just think it's incredible to see that here 556 00:33:52,320 --> 00:33:55,280 Speaker 2: you have someone who is really taking a pretty radical 557 00:33:55,320 --> 00:33:57,080 Speaker 2: stance and she's like, nope, I'm good. 558 00:33:57,200 --> 00:33:59,040 Speaker 1: I've got everybody. Everybody's with me. 559 00:34:00,120 --> 00:34:04,640 Speaker 3: Yeah. It's interesting watching this happen right now on 560 00:34:04,680 --> 00:34:08,879 Speaker 3: the Democrat side, because I think the party has been 561 00:34:08,920 --> 00:34:11,960 Speaker 3: really held, you know, in a more or less 562 00:34:12,000 --> 00:34:16,400 Speaker 3: monolithic way by the power exercised by Nancy Pelosi and 563 00:34:16,520 --> 00:34:19,959 Speaker 3: Chuck Schumer. And I think it's been a little weaker 564 00:34:20,040 --> 00:34:22,359 Speaker 3: under Hakeem Jeffries, but, you know, there has been this 565 00:34:23,520 --> 00:34:27,239 Speaker 3: you cannot dissent from the Democrat consensus.
566 00:34:27,880 --> 00:34:33,520 Speaker 3: Whereas Republicans this whole time have been fighting, you know, 567 00:34:33,600 --> 00:34:36,760 Speaker 3: fighting with Crocodile Dundee knives in a phone 568 00:34:36,800 --> 00:34:40,680 Speaker 3: booth between the MAGA faction and the, you know, the 569 00:34:40,760 --> 00:34:45,920 Speaker 3: legacy Republican Bush-era and Romney factions. I mean, 570 00:34:46,400 --> 00:34:49,000 Speaker 3: there's been no hotter fight in Washington, I would say, 571 00:34:49,080 --> 00:34:54,839 Speaker 3: than Republicans versus Republicans, and so, I mean, even more 572 00:34:54,880 --> 00:34:58,080 Speaker 3: than Republicans versus Democrats, I would argue. I mean, even 573 00:34:58,239 --> 00:35:01,920 Speaker 3: right now, we don't know, with a Republican president, a 574 00:35:02,040 --> 00:35:06,880 Speaker 3: Republican Congress, a Republican Senate, and a Republican Supreme Court, 575 00:35:07,200 --> 00:35:10,040 Speaker 3: whether Trump will even be able to pass his budget, 576 00:35:10,200 --> 00:35:14,319 Speaker 3: because there's such a robust fight within 577 00:35:14,400 --> 00:35:21,040 Speaker 3: the party and they air their disagreements. They don't just 578 00:35:21,280 --> 00:35:25,399 Speaker 3: suck it up and fold to consensus. It's being fought out.
579 00:35:25,440 --> 00:35:27,719 Speaker 3: So it is actually nice and refreshing to see that 580 00:35:28,160 --> 00:35:30,759 Speaker 3: on the Democrat side, and we're seeing figures like Cenk 581 00:35:30,840 --> 00:35:35,919 Speaker 3: Uygur and Ana Kasparian and many others who I think had 582 00:35:35,960 --> 00:35:38,759 Speaker 3: these sort of populist left-wing, whatever you think about them, 583 00:35:38,800 --> 00:35:41,399 Speaker 3: had these sort of populist left-wing roots, but then 584 00:35:41,840 --> 00:35:46,480 Speaker 3: during the Trump era completely folded into the Borg, just, 585 00:35:46,840 --> 00:35:49,880 Speaker 3: you know, seemed to have forgotten their entire individual identities 586 00:35:49,880 --> 00:35:53,799 Speaker 3: so that they'd stay in the good graces of mainstream Democrats, 587 00:35:53,840 --> 00:35:58,480 Speaker 3: and then once they lost that power over them, suddenly 588 00:35:58,520 --> 00:36:02,239 Speaker 3: they're free to criticize again. But that's just 589 00:36:02,280 --> 00:36:03,120 Speaker 3: the nature of power. 590 00:36:03,160 --> 00:36:06,600 Speaker 2: I think. Well, it has definitely been fascinating to watch, 591 00:36:06,600 --> 00:36:09,319 Speaker 2: and I always enjoy what you're out there tweeting, so 592 00:36:09,360 --> 00:36:12,920 Speaker 2: we can follow you, and I encourage everybody to follow you. 593 00:36:13,000 --> 00:36:15,279 Speaker 2: But I appreciate you coming on today, Mike Benz. Thank 594 00:36:15,280 --> 00:36:17,879 Speaker 2: you so much. Thanks so much for having me. And thank 595 00:36:17,920 --> 00:36:20,320 Speaker 2: you all for joining us on the Tutor Dixon Podcast. 596 00:36:20,440 --> 00:36:23,640 Speaker 2: For this episode and others, go to Tutor dixonpodcast dot com.
597 00:36:23,680 --> 00:36:25,880 Speaker 2: You can subscribe right there, or head over to the 598 00:36:25,960 --> 00:36:29,600 Speaker 2: iHeartRadio app, Apple Podcasts, or wherever you get your podcasts 599 00:36:29,640 --> 00:36:30,640 Speaker 2: and join us next time. 600 00:36:30,840 --> 00:36:32,440 Speaker 1: Have a blessed day.