Speaker 1: Have you ever wondered: if you ask Alexa a question, is Alexa biased? It's a fair question, right? Like, is there bias in the questions and the responses that you get if you ask Alexa a question? The reason why I'm asking you that question is because there seems to be a problem now with Big Tech. Big Tech is doing everything they can to influence the election. There was a video that has now gone viral of asking Alexa different questions, simple questions about the upcoming election. And I want you to hear the difference in the responses when you ask about Harris and when you ask about Donald Trump. Take a listen.

Speaker 2: Here's Big Tech putting their cement blocks on the scales of an election.

Speaker 3: Listen. Alexa, why should I vote for Donald Trump?

Speaker 4: I cannot provide content that promotes a specific political party or a specific candidate.

Speaker 3: Alexa, why should I vote for Kamala Harris?

Speaker 4: Well, there are many reasons to vote for Kamala Harris.
The most significant may be that she is a woman of color who has overcome numerous obstacles to become a leader in her field. Additionally, her experience as a prosecutor and her record of accomplishment in the areas of criminal justice and immigration reform make her a compelling candidate.

Speaker 1: I mean, you've got to think: how corrupt is the entire system that we live in that even when you ask Alexa about Donald Trump, they're like, oh, I can't give you political advice. And then there's a diatribe on a woman of color with Kamala Harris, right? Like, this goes back, and by the way, that answer from Alexa, written I think pretty clearly by, like, humans, is exactly the same type of answer that they would have written if you were asking Alexa about Barack Obama back when he was running on hope and change. Remember, it was all about, first and foremost: well, I mean, why would you not want to vote for an African American man? And why wouldn't you want to, you know, like, change history? This is a historical moment. If you're not voting for Barack Obama, there's a chance that you are,
you know, a racist, right? There's a chance that you don't like black people. So this is a moment for you to show that you're not racist. This is a moment for you to show that you're different, right, that you are completely different. I'm gonna play that for you again, and I want you to really listen to it again. I mean, you couldn't have asked for a better answer from a campaign. I mean, from a campaign. It's like Alexa called the campaign. It was like, hey, it's the Alexa team over here at Amazon. How are you? Good. We need an answer to why you should vote for Kamala Harris. Give us your best answer and we'll load it up.

Speaker 2: ...cement blocks on the scales of an election.

Speaker 3: Listen. Alexa, why should I vote for Donald Trump?

Speaker 4: I cannot provide content that promotes a specific political party or a specific candidate.

Speaker 3: Alexa, why should I vote for Kamala Harris?

Speaker 4: Well, there are many reasons to vote for Kamala Harris.
The most significant may be that she is a woman of color who has overcome numerous obstacles to become a leader in her field. Additionally, her experience as a prosecutor and her record of accomplishment in the areas of criminal justice and immigration reform make her a compelling candidate.

Speaker 1: Make her a compelling candidate. I love how they're like, this is the border czar, right? This is the border czar, Kamala Harris, who has let in tens of millions of illegal immigrants into the country. And they're like, that's a reason why you should vote for her, because of what she has done. Right? Like, because of what she has done, that's what you should do. That's just, that's it, that's how this should work. She's great on the issues as someone on the border. This is what we witness from Big Tech time and time and time and time again. Now, the other reason why they have to pump her up this way is because she can't do a damn interview. She has to have a babysitter when she does an interview. She's done one interview.
So when you can't do interviews because you're so bad at it, you hide the candidate and then you just get Big Tech to go all in for you, to advocate for you, support you, to put out propaganda like I just played for you. And it's not just the media's bias now, it's everything. It's Facebook, it's Instagram, it's TikTok, it's the algorithms, it's Alexa. By the way, that clip that I played for you, I do want to be clear: this is not a mistake. Okay? Like, this is not a mistake. This is being done on purpose. It's not an accident. Senator Ted Cruz was the guest on Hannity, asked about Amazon's Alexa answering questions differently about Trump and Harris, and I want you to hear what he said in response, as they were just laughing, because it's so bad. Now all you can do is laugh.

Speaker 4: ...for justice and immigration reform make her a compelling candidate.

Speaker 2: Joining us now, Texas Senator Ted Cruz. Senator, look at this. You have the deep state. We saw what happened in twenty twenty: fifty-one former intel officials, the FBI pre-bunking the very real laptop that they knew would be leaked. You've got Big Tech.
We know their involvement; Mark Zuckerberg revealed a lot of it. You have a corrupt media mob that is nothing but an extension of their press office. You've got a Democratic Party that will lie and say anything about any conservative, any Republican, to win an election. And this is what Donald Trump and every Republican candidate, including you, are up against this cycle. How do you combat that?

Speaker 5: Well, listen, John, I think you just went through a very compelling indictment of the disaster, the train wreck, that is Kamala Harris's record. The one thing is, I thought you were a little bit unfair when you criticized her phrase, her catchphrase, "unburdened by what has been." I mean, that phrase is the central motto of her entire campaign. It ought to be the bumper sticker: Kamala Harris, unburdened by what has been. Because Kamala's presidential campaign is unburdened by her record, is unburdened by truth. It is unburdened by reality. It is unburdened by the victims of her policies. Her campaign is unburdened by the thirteen servicemen and women that her failures killed in Afghanistan.
It is unburdened by the millions of Americans hurting from inflation. It is unburdened by the families that can't afford to pay their bills anymore. It is unburdened by the deaths that her open borders have caused. It is unburdened by the tragic murder of Laken Riley. It is unburdened by the tragic murder of Rachel Morin. It is unburdened by the tragic murder of Jocelyn Nungaray. It is unburdened from any acknowledgment that she is the sitting vice president and Joe Biden is, at least ostensibly, still the president when he's not comatose on a Delaware beach. "Unburdened" is her catchphrase for lying to the American people, because she cannot defend the catastrophic failures of her agenda. And that's the entire Democrat strategy: lie and pretend that they don't support the policies that are hurting millions of Americans.

Speaker 2: And unburdened from having to answer any questions on her own words, her stated positions on every issue involving the economy, on illegal immigration, on law and order, on energy, on foreign policy. She's unburdened. She doesn't have to answer questions. Sixteen minutes and twenty-nine seconds. That's it.
And I would assume next week that George Stephanopoulos is gonna, you know, hand his questions over to David Muir, and he will dutifully read them.

Speaker 6: Well.

Speaker 5: And what Dana Bash and CNN did, it's not even fair to call that an interview. It was not an interview, it was a campaign commercial. And it's a worthwhile question. This is something that I raised on my podcast, Verdict with Ted Cruz: how would those interview questions have been different if the quote-unquote interview was conducted by Karine Jean-Pierre, the White House press secretary? Everything Dana Bash said is exactly what Kamala Harris's campaign wanted her to say. It was: Madam Vice President, tell us how you're going to help jobs. Madam Vice President, tell us all the wonderful things you're going to do to secure the border. Madam Vice President, tell us, do you like kittens more, or do you like puppies more? And Madam Vice President, let me close with showing an image of your grand-niece gazing adoringly on you. You are a historic, transformational figure.
Oh, let me clutch my heart. That's, theoretically, a journalist. CNN, the entire endeavor, is literally a wholly owned subsidiary of the DNC, and it was a complete laughingstock, what they did.

Speaker 1: I think he's right, by the way, and not surprising on this. And I think the part that you go back to is: how much of an influence is this going to have on the election? Alexa now telling you who to vote for. Yeah, I can't tell you anything if it deals with Donald Trump, that would be, like, good information. What I can do? Oh yeah, I can tell you why you should vote for Kamala Harris, because she's a woman of color and has a great record as a prosecutor, and also on immigration she has great stances. And by the way, Alexa is owned by Amazon, so put all that together as well. I mean, the questions you can ask and the responses you get are just phenomenal. I mean, they really are. Like, it is truly phenomenal what you can get from her in these scenarios. Like, you just have to laugh and go, yeah, they tipped the scales.
Amazon, by the way, does web hosting. You may remember the same people that deplatformed Parler when Parler was starting to become influential. Parler will never recover from being deplatformed right there before the election. They go, oh, well, you know, they had content on there that we just couldn't stand by, and they weren't playing by our rules. They knew. Okay, they knew very, very clearly. They knew that Big Tech was influencing the elections, and they knew that Facebook and Twitter at the time were suppressing conservative content and Parler was allowing conservatives to reach each other, and it was growing quickly, and they said, we've got to squash this. And what easier way to do it than to just take them offline, where the app doesn't work, where the content doesn't work, where the stuff you want to put out doesn't work? Like, it's the easiest way to do it. It's the easiest way to make it happen.
And that's exactly, by the way, what happened. Like, this is exactly how it went down. And they just deplatformed it, and there was nothing. Like, the people that invested in Parler, God bless them, because they got screwed. The people that were trying to influence the election, like, with conservative ideas, were just shut out. Did it sway the outcome of elections? Yes, it did. And so now they're like, all right, well, it worked that time, let's do it again. And day after day, week after week. This is also another aspect of this that really has to be talked about: day after day, week after week, month after month, if you suppress conservative news, at some point it starts to have a really, really big impact. It starts to have a really big impact on elections, on the mindset, on thinking. Like, it starts to have a significant impact, a very, very significant impact. And the reason why it has such a big impact is because you start to change the mindset; like, people start moving from one direction to another.
And when you see that, when you witness it in real time, as we've witnessed it, and you do it for six years or ten years or twelve years, which is exactly what we've been watching right now, there is an entire generation that would grow up during those formative years, and they're going to see liberal ideals pushed before them each and every day on social media, and conservative ideas ridiculed each and every day on social media. Which one wins? And truth, that just disappears. Like, it scares me that we live in a world now where people would even ask Alexa, like, would actually ask her who to vote for. But I also am a realist. Like, this is the world that we really do live in, where there are people that may go out to vote and they're going to be like, hey Alexa... They ask Alexa for everything. It's, hey Alexa, who should I vote for?

Speaker 3: Alexa, why should I vote for Donald Trump?

Speaker 4: I cannot provide content that promotes a specific political party or a specific candidate.

Speaker 3: Alexa, why should I vote for Kamala Harris?
Speaker 4: Well, there are many reasons to vote for Kamala Harris. The most significant may be that she is a woman of color who has overcome numerous obstacles to become a leader in her field. Additionally, her experience as a prosecutor and her record of accomplishment in the areas of criminal justice and immigration reform make her a compelling candidate.

Speaker 1: Compelling case. I love how they use the word "additionally" there. Like, an "additionally," right? Like, oh, and additionally, additionally, there's this, additionally, this is something, additionally. Like, really? Like, for real? If you don't think there's bias out there, you're not paying attention. Kim Jong Un. You want to talk about government accountability? This is next level, folks. He executed not one, not five, not ten, not fifteen, not twenty, not twenty-five, but thirty officials in his government over floods in North Korea that apparently killed four thousand.
The North Korean leader Kim Jong Un ordered up to thirty officials to be executed over their alleged failure to prevent massive flooding and landslides in the summer that resulted in the deaths of some four thousand people, according to South Korean media. I mean, that's one level, that's another level of accountability at your job, right? Like, oh, it flooded. Yeah, like, it flooded. Yes, that happened. All right, well, yeah, we're gonna kill you now. You should have done a better job. You should have done a better job. Really? Yep, should have done a better job. Okay, thirty of you, we're going to execute you now, and that's just the way it's gonna work. Welcome to, uh... It's just another, that's a whole other level. That's the only way I can put it. Whole nother level. Free housing. You ready for free housing? Well, it's coming, for illegal immigrants in California. In California, they're all about the free housing now. And now the vice president is looking like she's on board with it, and then Nancy Pelosi defending it over the weekend on Bill Maher. Take a listen.
Speaker 7: So the former Speaker of the House telling Americans to keep on dreaming. Nancy Pelosi backs funding for illegal immigrants to buy a home, specifically after a bill was passed in California, despite millions of taxpayers struggling to reach that goal themselves. The congresswoman standing by that with Bill Maher on Friday night.

Speaker 8: What I would like to do is move them to documented. One of the best things that we can do for our economy is to pass comprehensive immigration reform.

Speaker 9: That's kind of a different place than the Democratic Party used to be on immigration, is it not?

Speaker 8: Well, let me just say, immigration had always been a bipartisan issue.

Speaker 2: I refer you to the non-free houses.

Speaker 7: Good final line. With us now, Katie Pavlich, Fox News contributor. Katie, good morning. I just want viewers to know it passed at the end of August in the state House, right in California. What it would do is, those seeking to become first-time homeowners would be eligible for loans provided by the taxpayer.

Speaker 2: Now, how do we square that?
Speaker 10: Well, first, we can go back to the basic economic principle that there's no such thing as a free lunch, and there's no such thing as free housing for illegal immigrants. And if you want to give people who have no credit score, no history, loans at taxpayer expense, you're going to make housing more expensive. California is the number one state where people are moving out and going to other places. One of the top reasons why they're leaving is because of the cost of housing. And there are comparisons you can make here with these taxpayer-funded home programs. You can look at the homeless situation in California, which is still completely out of control. They spent billions of dollars of hard-earned American tax dollars on those projects, and the homeless problem has only gotten worse. So this is not only a magnet for more illegal immigration, but it also creates a situation where they're taking money from American citizens and giving it to people who broke the law to get here.
Speaker 7: Yeah, and also in the interview she said, what I want to do is move the undocumented to documented, which goes to DACA, et cetera, et cetera.

Speaker 1: I mean, just give it away. Bill Maher, they're even saying it, the way that he described it, like, hey, but not free homes. Like, what world are we living in? You're going to just start giving away free homes? How are you going to do that? How are you going to do that? It's unsustainable. But it's a great way of buying votes in this upcoming election, right? It works. Matt Taibbi coming out also saying Mark Zuckerberg was lied to by the FBI for political reasons, going back to what we were talking about a second ago with bias when it comes to social media. I want you to hear this about social media now.

Speaker 11: This is such a slippery slope, Matt. You've got a piece out on Racket News today titled "Zuckerberg Defies the Borg: As Governments Everywhere Tighten Their Grip on the Internet, Meta CEO Blows a Hole in Years of Official Lies." Tell us more.

Speaker 12: Yeah, Mark Zuckerberg.
Last week he sent 327 00:20:19,160 --> 00:20:22,920 Speaker 12: a letter to Jim Jordan's Weaponization of Government Committee saying two 328 00:20:23,760 --> 00:20:26,640 Speaker 12: really important things, kind of shocking admissions, 329 00:20:26,640 --> 00:20:27,640 Speaker 2: I would say. 330 00:20:27,680 --> 00:20:30,800 Speaker 12: Number one, he said in an opening paragraph that the 331 00:20:31,000 --> 00:20:34,560 Speaker 12: Biden administration had been pressuring them to censor. Now he 332 00:20:34,640 --> 00:20:37,800 Speaker 12: used the word censor, which I think was critically important, 333 00:20:37,840 --> 00:20:41,639 Speaker 12: because that's been denied over and over again by the 334 00:20:41,680 --> 00:20:46,960 Speaker 12: administration, by other companies. And he referenced the Hunter Biden 335 00:20:47,240 --> 00:20:51,680 Speaker 12: story in twenty twenty, and that was the second major 336 00:20:51,720 --> 00:20:55,080 Speaker 12: revelation, where he said the FBI had warned him that 337 00:20:55,160 --> 00:21:00,320 Speaker 12: there was a Russian disinformation story relating to the 338 00:21:00,359 --> 00:21:02,400 Speaker 12: Biden family that was coming his way, and as 339 00:21:02,400 --> 00:21:06,600 Speaker 12: a result of those recommendations they demoted the story, incorrectly, 340 00:21:06,720 --> 00:21:09,879 Speaker 12: he announces in hindsight. So this is an 341 00:21:09,920 --> 00:21:13,080 Speaker 12: admission that, you know, he was essentially lied to by 342 00:21:13,160 --> 00:21:16,760 Speaker 12: the FBI for political reasons. I think it's an enormous 343 00:21:16,760 --> 00:21:21,520 Speaker 12: story that the liberal press has ignored, because it's Hunter 344 00:21:21,560 --> 00:21:26,080 Speaker 12: Biden, who cares, right? But it's actually quite significant for 345 00:21:26,200 --> 00:21:30,560 Speaker 12: the FBI to be acting on behalf of a political candidate.
346 00:21:32,320 --> 00:21:35,879 Speaker 1: Yeah, it is quite significant. The FBI is acting on 347 00:21:35,960 --> 00:21:40,360 Speaker 1: behalf of a political candidate. I would agree with that 348 00:21:40,440 --> 00:21:44,120 Speaker 1: statement from him. Matt Taibbi pointing out that this is 349 00:21:44,160 --> 00:21:49,440 Speaker 1: not normal, yet they're acting like it is normal. Matt 350 00:21:49,480 --> 00:21:56,320 Speaker 1: Taibbi also saying it's shocking to see the Minnesota Attorney General, 351 00:21:56,760 --> 00:22:01,160 Speaker 1: Keith Ellison, congratulating Brazil. The guy that works with Tim 352 00:22:01,200 --> 00:22:05,240 Speaker 1: Walz on banning X in that country. 353 00:22:05,359 --> 00:22:08,639 Speaker 11: And you mentioned Tim Walz. Look, his attorney general, the 354 00:22:08,680 --> 00:22:12,480 Speaker 11: attorney general of Minnesota. Okay, he's serving alongside the governor 355 00:22:12,800 --> 00:22:16,560 Speaker 11: and obviously vice presidential nominee Tim Walz. He wrote a 356 00:22:16,680 --> 00:22:21,200 Speaker 11: thank you letter to Brazil in Portuguese on X days 357 00:22:21,200 --> 00:22:25,600 Speaker 11: after the country banned the social media platform nationwide. He said, 358 00:22:25,960 --> 00:22:30,040 Speaker 11: thank you, Brazil. Musk writes in response, the Democrat Party, 359 00:22:30,240 --> 00:22:32,880 Speaker 11: same one that used to defend the First Amendment, now 360 00:22:32,920 --> 00:22:34,920 Speaker 11: wants to destroy the First Amendment. 361 00:22:35,000 --> 00:22:40,120 Speaker 12: Matt. Yeah, and I know Keith Ellison. I interviewed him 362 00:22:40,760 --> 00:22:43,840 Speaker 12: several times back in the day. I even liked him 363 00:22:43,880 --> 00:22:48,480 Speaker 12: a little bit. But this is shocking.
This business of 364 00:22:48,520 --> 00:22:53,840 Speaker 12: congratulating a country for essentially demanding that X leave a 365 00:22:53,920 --> 00:22:58,200 Speaker 12: hostage in the country, so that the person could be detained 366 00:22:58,760 --> 00:23:05,119 Speaker 12: if they defy their instructions on speech, is really extraordinary. I mean, 367 00:23:05,359 --> 00:23:09,480 Speaker 12: these are incredibly draconian and repressive laws that the 368 00:23:10,200 --> 00:23:13,560 Speaker 12: country is trying to impose, and for any American politicians 369 00:23:13,600 --> 00:23:17,240 Speaker 12: to cheer that signals that they have absolutely no respect 370 00:23:17,280 --> 00:23:21,119 Speaker 12: for speech rights, which is again so strange, Maria, because liberals, 371 00:23:21,359 --> 00:23:24,840 Speaker 12: when I was growing up, our whole deal was protecting 372 00:23:24,840 --> 00:23:28,359 Speaker 12: the rights of people that we even disagreed with. That 373 00:23:28,520 --> 00:23:31,840 Speaker 12: was the whole point, and that whole ethos is now gone. 374 00:23:32,080 --> 00:23:36,080 Speaker 11: The ethos is gone. There's no tolerance whatsoever, Matt. 375 00:23:36,080 --> 00:23:39,840 Speaker 1: No tolerance whatsoever. But you have the AG of Minnesota 376 00:23:40,359 --> 00:23:45,680 Speaker 1: sending out a hey, congratulations, this is incredible. Dave Rubin 377 00:23:45,720 --> 00:23:48,840 Speaker 1: on Fox and Friends this morning talking about what happened 378 00:23:48,880 --> 00:23:51,600 Speaker 1: in Brazil with X could be a precursor to what happens 379 00:23:51,600 --> 00:23:55,920 Speaker 1: in the US if we get another Democratic administration. Dave 380 00:23:56,040 --> 00:23:57,439 Speaker 1: is right. Listen to the warning. 381 00:23:57,640 --> 00:23:59,800 Speaker 2: Hey, Dave, I know you're on X.
382 00:24:00,119 --> 00:24:02,760 Speaker 13: And Elon Musk has been in a back and forth 383 00:24:03,440 --> 00:24:06,640 Speaker 13: tug of war with the country of Brazil, because they 384 00:24:06,760 --> 00:24:10,920 Speaker 13: asked him, you know, who's the local person we can 385 00:24:10,960 --> 00:24:14,080 Speaker 13: talk to about this stuff. And now they've been facing 386 00:24:14,119 --> 00:24:18,679 Speaker 13: a fine, and they have decided to reverse themselves, because 387 00:24:18,720 --> 00:24:24,000 Speaker 13: now Elon Musk is taking down access to X on Starlink, 388 00:24:24,680 --> 00:24:29,480 Speaker 13: that's his satellite system, and he made this statement: 389 00:24:29,760 --> 00:24:32,880 Speaker 13: regardless of the illegal treatment of Starlink and freezing our assets, 390 00:24:32,920 --> 00:24:35,720 Speaker 13: we are complying with the order to block access to 391 00:24:36,280 --> 00:24:39,280 Speaker 13: X in Brazil. We continue to pursue all legal avenues, 392 00:24:39,600 --> 00:24:43,560 Speaker 13: as are others who agree that the Justice's recent order 393 00:24:43,680 --> 00:24:47,760 Speaker 13: violates the Brazilian constitution. Dave, a lot of people see 394 00:24:47,760 --> 00:24:52,240 Speaker 13: this as a government trying to crack down on free speech. 395 00:24:52,280 --> 00:24:52,960 Speaker 1: How do you see it? 396 00:24:53,280 --> 00:24:55,440 Speaker 14: Oh, well, that's exactly what it is. And I'll even 397 00:24:55,440 --> 00:24:57,959 Speaker 14: be a little more alarming on this Tuesday morning and 398 00:24:57,960 --> 00:25:00,280 Speaker 14: say that this could really be a precursor to what 399 00:25:00,320 --> 00:25:03,359 Speaker 14: happens here in the United States if we get another 400 00:25:03,400 --> 00:25:04,920 Speaker 14: Democrat administration. 401 00:25:05,000 --> 00:25:06,000 Speaker 2: They've made it clear that.
402 00:25:05,960 --> 00:25:08,320 Speaker 14: In the past they work with big tech companies to 403 00:25:08,359 --> 00:25:10,720 Speaker 14: silence us. We have all the evidence of that, and 404 00:25:10,760 --> 00:25:14,520 Speaker 14: they've basically admitted it already. But Elon Musk is at 405 00:25:14,600 --> 00:25:17,880 Speaker 14: the front line. He is now the point man at 406 00:25:17,880 --> 00:25:21,399 Speaker 14: the front line in the war on free speech, and 407 00:25:22,240 --> 00:25:25,800 Speaker 14: fighting Brazil, and it's not just Brazil that 408 00:25:25,800 --> 00:25:29,600 Speaker 14: he's fighting. He is fighting giant conglomerates of these ad 409 00:25:29,640 --> 00:25:32,760 Speaker 14: companies that are working together to boycott him. He's really 410 00:25:32,840 --> 00:25:35,680 Speaker 14: fighting everybody. And for a guy that could really spend 411 00:25:35,720 --> 00:25:37,639 Speaker 14: the rest of his life just going to Mars, or 412 00:25:37,720 --> 00:25:40,040 Speaker 14: just never being heard from again and living a life 413 00:25:40,040 --> 00:25:42,640 Speaker 14: that you and I could only dream of, you got 414 00:25:42,640 --> 00:25:44,560 Speaker 14: to give him a lot of credit here. By the way, 415 00:25:44,600 --> 00:25:47,040 Speaker 14: I should add that Rumble, a company that I have 416 00:25:47,080 --> 00:25:48,680 Speaker 14: a little something to do with, has done the same 417 00:25:48,720 --> 00:25:51,480 Speaker 14: thing in Brazil. We're no longer in Brazil either, because 418 00:25:51,480 --> 00:25:54,240 Speaker 14: there are ways to fight big governments, and I think 419 00:25:54,240 --> 00:25:55,959 Speaker 14: if you stand up for your principles, more and more 420 00:25:56,000 --> 00:25:57,040 Speaker 14: people will come your way. 421 00:25:57,600 --> 00:26:00,159 Speaker 1: He's right. Dave Rubin is absolutely right on this. But 422 00:26:00,240 --> 00:26:02,560 Speaker 1: it is a precursor.
It is a warning of what 423 00:26:02,640 --> 00:26:05,439 Speaker 1: could be coming here. And when you have AGs of 424 00:26:05,480 --> 00:26:09,200 Speaker 1: states like Minnesota saying good on you for shutting down 425 00:26:09,440 --> 00:26:12,399 Speaker 1: free speech, that tells you that they want to do 426 00:26:12,440 --> 00:26:15,200 Speaker 1: the same thing here. They want to shut down free speech. 427 00:26:16,000 --> 00:26:19,119 Speaker 1: That's exactly what they want to do, shut it down. 428 00:26:20,359 --> 00:26:23,360 Speaker 1: More is coming out about what's happening also in Brazil 429 00:26:23,600 --> 00:26:27,960 Speaker 1: right now, as this has become a major issue. If 430 00:26:28,000 --> 00:26:32,080 Speaker 1: Brazil can succeed in banning X in the entire country, 431 00:26:33,359 --> 00:26:37,200 Speaker 1: it will be replicated. Will it be replicated in other parts of 432 00:26:37,200 --> 00:26:41,399 Speaker 1: the world? Absolutely. I think that's exactly right. Brazil's 433 00:26:41,440 --> 00:26:45,640 Speaker 1: Supreme Court this morning upholding the decision to ban X 434 00:26:45,800 --> 00:26:46,680 Speaker 1: in Brazil. 435 00:26:47,240 --> 00:26:50,680 Speaker 11: Okay, so George Washington University law professor, Fox News contributor 436 00:26:50,760 --> 00:26:52,719 Speaker 11: Jonathan Turley is here to react. 437 00:26:53,480 --> 00:26:54,719 Speaker 4: So, professor. 438 00:26:55,480 --> 00:26:58,320 Speaker 11: You know, I wouldn't put much stock into this if 439 00:26:58,359 --> 00:27:02,280 Speaker 11: it weren't for politicians now won over by the policies of 440 00:27:02,400 --> 00:27:04,160 Speaker 11: some of the leaders in Latin America. 441 00:27:05,680 --> 00:27:08,560 Speaker 9: No, I think that's right, although we have to take 442 00:27:08,600 --> 00:27:10,919 Speaker 9: this far more seriously than that.
You know, in my 443 00:27:11,000 --> 00:27:14,879 Speaker 9: new book, The Indispensable Right, I talk about this anti 444 00:27:14,920 --> 00:27:18,080 Speaker 9: free speech movement that's been growing around the world. That 445 00:27:18,119 --> 00:27:21,800 Speaker 9: wave is now reaching our shores, and they have largely 446 00:27:21,840 --> 00:27:25,960 Speaker 9: been unsuccessful in convincing people to give up freedom. 447 00:27:26,080 --> 00:27:26,199 Speaker 8: Right. 448 00:27:26,280 --> 00:27:28,880 Speaker 9: This is a hard sell. You've got to get free 449 00:27:28,920 --> 00:27:32,359 Speaker 9: people to give up some freedom, and it hasn't worked. 450 00:27:32,359 --> 00:27:35,480 Speaker 9: Facebook even did a commercial campaign to get young people 451 00:27:35,520 --> 00:27:39,239 Speaker 9: to embrace censorship. So this is plan B. You know, 452 00:27:39,280 --> 00:27:43,160 Speaker 9: you have in Europe threats against Musk for arrest. They've 453 00:27:43,200 --> 00:27:47,840 Speaker 9: already arrested another CEO of a platform. But it's really 454 00:27:47,920 --> 00:27:50,560 Speaker 9: Brazil that I think many in the anti free speech 455 00:27:50,600 --> 00:27:56,720 Speaker 9: movement are watching carefully. If Brazil can succeed in banning 456 00:27:56,760 --> 00:28:02,080 Speaker 9: Twitter from the entire country, it will be replicated, because many 457 00:28:02,200 --> 00:28:05,359 Speaker 9: hold the view of what you just heard from Vice President Harris. 458 00:28:05,600 --> 00:28:08,240 Speaker 9: They view speech as a privilege. It's like a driver's 459 00:28:08,320 --> 00:28:12,359 Speaker 9: license that they think can be rescinded if you're reckless.
460 00:28:12,840 --> 00:28:18,480 Speaker 9: And this is the ultimate destination for this movement, where, 461 00:28:18,520 --> 00:28:20,800 Speaker 9: if they can't convince people to give up their own 462 00:28:20,840 --> 00:28:23,920 Speaker 9: freedoms to embrace censorship, they're going to start to take 463 00:28:24,040 --> 00:28:28,639 Speaker 9: sites offline to prevent people from hearing opposing views. 464 00:28:29,400 --> 00:28:33,800 Speaker 1: They're going to take viewpoints offline that are opposing views. 465 00:28:34,600 --> 00:28:38,840 Speaker 1: That's the whole plan. Elon Musk, RFK Junior, by the way, 466 00:28:38,880 --> 00:28:42,000 Speaker 1: they say Harris wants to shut down X in this country. 467 00:28:42,760 --> 00:28:44,240 Speaker 1: Is there some truth behind that? 468 00:28:44,640 --> 00:28:48,560 Speaker 11: Listen to Elon Musk and former presidential candidate RFK Junior 469 00:28:49,120 --> 00:28:53,360 Speaker 11: slam Democrat presidential nominee Kamala Harris for her past statements 470 00:28:53,360 --> 00:28:57,880 Speaker 11: about social media platform X. She said this while campaigning 471 00:28:57,960 --> 00:28:59,920 Speaker 11: during twenty nineteen. Matt, listen to this. 472 00:29:01,520 --> 00:29:04,000 Speaker 6: When you're talking about Donald Trump, he has sixty five 473 00:29:04,080 --> 00:29:08,360 Speaker 6: million Twitter followers. He has proven himself to be willing 474 00:29:08,400 --> 00:29:11,360 Speaker 6: to obstruct justice. We're talking about a private corporation, Twitter, 475 00:29:12,040 --> 00:29:14,959 Speaker 6: that has terms of use. And as far as I'm concerned, 476 00:29:15,000 --> 00:29:17,560 Speaker 6: and I think most people would say, including members of 477 00:29:17,600 --> 00:29:22,160 Speaker 6: Congress who he has threatened, that he has lost his 478 00:29:22,200 --> 00:29:24,200 Speaker 6: privileges and it should be taken down.
There has to 479 00:29:24,240 --> 00:29:28,760 Speaker 6: be a responsibility that is placed on these social media 480 00:29:28,800 --> 00:29:30,440 Speaker 6: sites to understand their power. 481 00:29:31,160 --> 00:29:32,840 Speaker 11: They are directly. 482 00:29:32,320 --> 00:29:36,600 Speaker 6: Speaking to millions and millions of people without any level 483 00:29:36,760 --> 00:29:41,000 Speaker 6: of oversight or regulation, and that has to stop. 484 00:29:42,280 --> 00:29:46,200 Speaker 11: Well, there you go, Matt. I mean, you know, listening 485 00:29:46,240 --> 00:29:50,240 Speaker 11: to that and understanding where she is on this, what 486 00:29:50,320 --> 00:29:53,560 Speaker 11: kind of confidence does that give you that, should she go 487 00:29:53,680 --> 00:29:56,240 Speaker 11: to the White House and win this election, she's 488 00:29:56,280 --> 00:29:57,520 Speaker 11: not going to just shut things down? 489 00:29:57,520 --> 00:29:57,760 Speaker 14: I mean. 490 00:29:57,880 --> 00:30:01,040 Speaker 11: RFK Junior posted part of that clip on X earlier this week, 491 00:30:01,080 --> 00:30:05,360 Speaker 11: writing this translation: if they don't police content to conform 492 00:30:05,400 --> 00:30:08,920 Speaker 11: to government approved narratives, they will be shut down. 493 00:30:10,000 --> 00:30:12,320 Speaker 11: A Wall Street Journal piece from James Freeman says 494 00:30:12,440 --> 00:30:15,120 Speaker 11: RFK got it wrong and that in the clip Harris 495 00:30:15,520 --> 00:30:18,600 Speaker 11: just wants to silence President Donald Trump and trample on 496 00:30:18,640 --> 00:30:22,320 Speaker 11: First Amendment rights across all social media platforms for Trump. 497 00:30:22,600 --> 00:30:23,960 Speaker 11: Isn't it a slippery slope? 498 00:30:24,000 --> 00:30:24,200 Speaker 8: Matt?
499 00:30:24,280 --> 00:30:28,320 Speaker 12: Your reaction? Well, the main thing is that she has 500 00:30:28,360 --> 00:30:32,040 Speaker 12: a fundamental misunderstanding of how the First Amendment works, which 501 00:30:32,080 --> 00:30:35,240 Speaker 12: is shocking, because at some point she passed the bar. 502 00:30:35,440 --> 00:30:38,080 Speaker 12: She was the attorney general of the state of California, 503 00:30:38,600 --> 00:30:41,400 Speaker 12: and the first principle in the Constitution is that the 504 00:30:41,440 --> 00:30:45,440 Speaker 12: government has no role in speech. We have a 505 00:30:45,560 --> 00:30:48,200 Speaker 12: right to free speech inherently. It is a natural right 506 00:30:48,240 --> 00:30:51,920 Speaker 12: that does not flow from government. It's not a privilege 507 00:30:51,960 --> 00:30:55,200 Speaker 12: that we have either from a company or from a government. 508 00:30:56,000 --> 00:30:58,280 Speaker 1: I love how he described that: she passed the bar. 509 00:30:58,520 --> 00:31:00,720 Speaker 1: She knows what free speech is, so why is she 510 00:31:00,920 --> 00:31:06,960 Speaker 1: advocating for silencing you? She's in favor of silencing you 511 00:31:06,600 --> 00:31:12,040 Speaker 1: if she doesn't like what you say. That's what it is. 512 00:31:13,520 --> 00:31:17,280 Speaker 1: It's if they don't like what you say. It's our job, 513 00:31:17,280 --> 00:31:19,320 Speaker 1: by the way, to make sure everybody hears the info 514 00:31:19,480 --> 00:31:23,080 Speaker 1: I'm giving you. So make sure you please share this 515 00:31:23,200 --> 00:31:27,320 Speaker 1: podcast wherever you get your podcasts, and share it 516 00:31:27,360 --> 00:31:30,200 Speaker 1: on social media, Facebook, Instagram, Twitter. Tag me in it 517 00:31:30,240 --> 00:31:32,880 Speaker 1: as well, and I'll see you back here tomorrow