Speaker 1: House Republicans have kicked off their investigations. The House Oversight Committee brought former Twitter employees before them, like James Baker and Yoel Roth, to grill them on Twitter's collusion with the FBI to censor the media, to censor Americans. We all know and remember what they did with the Hunter Biden laptop story and what they did to the New York Post.

[Audio clip from the hearing]

Representative: Did the government ever contact you or anyone at Twitter to censor or moderate certain tweets? Yes or no?

Witness: We received legal demands to remove content from the platform from the US government and governments all around the world.

Representative: The FBI was talking to these people for weeks and months, years prior to this leak. They had specifically told you in October that a leak potentially involving Hunter Biden's laptop was coming. They literally prophesied what happened. And you contacted them?

Witness: Sir, I did not.

Representative: Mr. Roth, do you personally think that you have a political bias? Did you have, when you worked at Twitter, a personal political bias?

Witness: No, sir.

Representative: You say you didn't. That's remarkable, because it's pretty clear you did have strong biases, when you compared, ironically using Twitter, people that worked in the Trump White House to Nazis. They were good folks that you simply disagreed with politically in our representative republic, and you compared them to the most evil people on the planet, who murdered sixty million people, or at least were responsible for those deaths. Do you think that was a little bit hyperbolic?

[End of clip]

Speaker 1: We'll talk to Kara Frederick, the director of the Tech Policy Center at the Heritage Foundation, about all of it. What do you need to know about what happened in that hearing? What does it mean, and what does it mean moving forward? We'll also get her take on TikTok and how China uses the social media app to spy on you and brainwash your kids. What do you need to know about that? How worried should you be? All that and more on The Truth with Lisa Boothe. Kara Frederick is my guest. Stay tuned.
Speaker 1: So, Kara, Republicans kicked off their oversight hearings looking at the collusion between the FBI and Twitter. What did you make of these hearings? Obviously we wouldn't know about any of this if Elon Musk hadn't released the Twitter Files. What are your takeaways from all of it?

Speaker 2: Yeah, I thought it was pretty unsatisfying, to be honest. I left with a lot more questions than answers. You know, Yoel Roth, Vijaya Gadde, James Baker, these guys are all coached. They're good at what they do. The other Twitter employee, she was a little more fast and loose, especially when it came to the First Amendment and free speech, but the other guys had it pretty buttoned down. So I was pretty unsatisfied with what came out of it. However, consider what Elon Musk has basically revealed. We had seen little shreds and data points of what we believed to be happening; we wrote about this at the Heritage Foundation back in February 2022. But when Elon Musk actually had those Twitter Files released through a group of independent journalists, we were able to see, in stark reality, what was actually going on. We knew before, from the Intercept reporting, that there had been meetings between the FBI and Big Tech ahead of the election. We knew, remember when Zuckerberg admitted on Joe Rogan that the FBI warned Facebook of a propaganda dump that might look akin to Russian hacking, prior to their decision to suppress the laptop story. But then when Elon came out with it, it was like, okay, bam, it is absolutely indisputable.
So I thought that was a great service. But in my mind, I'm kind of wondering why the rest of the mainstream media and the rest of America isn't as exercised about this collusion, now that we have verifiable proof that it was happening. The FBI effectively ran an influence campaign against Yoel Roth, the former Twitter executive, by bringing him out to Aspen and holding an entire exercise focused on Hunter Biden specifically, and on what a potential laptop leak and document dump could look like if it was part of a foreign influence campaign. So the FBI is, at this point, kind of doing what they accuse other countries with malicious intent of doing. In my mind, Americans should be really, really ticked off. I hope the Oversight Committee keeps going after these people. I hope the FEC reopens its investigation, because the Twitter executives at that point had said that Twitter was not working with the Biden campaign, and these Twitter Files reveal that that was not true. So I'm hoping we get consequences out of this and not just sort of a theatrical hearing. We need legislation to actually be the result of these hearings. There's draft legislation underway; I think those bills should go to a vote, and I think this collusion should end.

Speaker 1: Yeah, I hear you on that. I mean, look, I was getting frustrated too, just because they weren't being honest. We know that the FBI had these weekly meetings with Twitter officials like Yoel Roth, warning them about a hack-and-leak operation heading into all of this, heading into the stopping of this New York Post article. And we also know that the FBI sent Yoel Roth documents the night before they stopped the New York Post from sharing this article.
And I think what concerns me the most is, obviously, if we lose the First Amendment, we cease to be a free country. We cease to be a republic.

Speaker 2: Exactly. And you know, everybody used to sort of say, oh, the First Amendment only applies to government actors, when we as conservatives would be like, hey, these private companies are actually restricting our free speech. But we have a culture of free speech, of course, so that matters too. And then secondly, this is the smoking gun when it comes to government actors restricting our free speech. At least in my estimation, I'm not a lawyer, but that's a pretty clear violation of the First Amendment. And this is another thing that people are only now starting to understand about tech companies. I used to work at Facebook, and we would call ourselves the people building an airplane in mid-flight. So these guys are sort of making it up as they go; they've proven that over the past few years. But now we see that they're ideological actors, sort of retrofitting explanations for enacting their political proclivities. They make the decision, then they find a justification later, like they did with the "hacked materials," which turned out not to be true, and they actually admitted it wasn't true. Then the election happens, they get the result that they, I suppose, want, and now it's mea culpa, mea culpa, we shouldn't have done that. But it's too little, too late, and this happens time and time and time again.

Speaker 1: You mentioned working for Facebook. What's really concerning is just how much influence all of these tech companies have. I mean, if you look collectively at Google, Twitter, Facebook, et cetera, they control so much, almost, you know, most of the information that we see or that we read. Talk a little bit about that.

Speaker 2: Oh, exactly.
And you know, it's not just the content that they're censoring under the term "content moderation," which really amounts to censorship, like you said. It's the information flow, it's access to information. And that's particularly troubling. We as conservatives harp on this vignette over and over again because I think it's important, and that vignette is the Parler one. When Apple and Google decided to purge Parler, which at that time was at the top of Apple's App Store, from their stores, it was like, okay, well, maybe you can get it on the desktop. You could still look at Parler. It was still sort of alive, but on life support. But then, when Amazon Web Services decided to not just restrict but take away cloud hosting services from Parler, it was dead in the water. It didn't exist anymore. So it's not just the top layer, the digital platform layer; it's these mid-tiers of what we call the digital stack that can actually cut consumers off from information. You could not get Parler at that point at all, in any digital form. And this goes further and further down, to things like internet service providers, more foundational layers of that stack. So Americans don't just have to be concerned about how their content is manipulated and restricted, and about the influence and the propaganda that you were speaking of, both cut off and pushed on them by specific companies. They also have to be concerned about their access to information. That, I think, is the sea change; that is the crossing of the Rubicon, in my mind: when you start to get down into those foundational layers of the digital stack, which prevent Americans from actually seeing this information. So it's not just breaking the links in direct messages, as Twitter did.
It's actually lights out for these entire companies if they displease specific Big Tech platforms that, frankly, have an ideological bent.

Speaker 1: And one of the big problems is that so many of these American companies are being led by people who do not believe in freedom. They don't believe in free speech. I mean, Yoel Roth sat there during the hearing talking about how, somehow, censorship preserves speech. How does that even make sense? Or you look at Apple: Apple changed the AirDrop feature in China so protesters couldn't circumvent the government. So even if you make changes in Congress, even if you change policy in Congress, we still have people leading these companies who do not believe in free speech. So where do you go from there?

Speaker 2: No, that's exactly right. When I first came from California to the Hill, I basically would walk around briefing members and say, hey, you guys have to look at these companies differently, because they don't see themselves as American companies. They see themselves as global companies, responsible to a global constituency. And you look at a company like Facebook, or now Meta, which has most of its user base outside of the US and Canada. If they're hauled in front of Congress, they'll pretend that they bleed red, white, and blue: we're incorporated in Delaware, we're American companies. But they really think differently, because they're concerned about growth and their bottom lines first and foremost, and they want to make sure the decisions they make can apply across all the countries within which they're operating, and that's not just America. And I think the biggest manifestation of this was the former CEO of Twitter, Parag Agrawal, when he specifically talked and tweeted about the First Amendment and free speech, right? They're more concerned with safety and harm.
He basically said that free speech is not, you know, absolute, not something that they care about first and foremost; it's about preventing harm. And "harm" is a catch-all for all kinds of things, especially on the left, that we necessarily wouldn't agree with. So, where do you go from there? That's a good question. I think generally, and you'll see this in the China question that you raised, we need to recover a sense of American self-interest, frankly. With China, there's now a big movement to onshore a lot of our capabilities, whether it be PPE for the next pandemic, or the ability to make the semiconductor chips that power pretty much everything we're going to use, from cars to computers to watches, et cetera. So I think we need to press into that momentum, bring things back to America, and recover our sense of what it means to be American innovators who care about America. That has naturally resulted in our being a force for good in the world. We need to get back to where we started from if we're going to, frankly, succeed going forward.

Speaker 1: I mean, it seems to be common sense that you want to be independent, that you want to bring manufacturing back home, that we want to be energy independent and less reliant upon our enemies. It seems like common sense, but Joe Biden doesn't get it. Democrats don't get it as well. But I want to keep talking about China here. There's obviously been this increased focus on TikTok after the Chinese aircraft was going around basically doing God knows what, taking pictures of our nuclear silos and Air Force bases and all of that. But do you think the concerns here over TikTok, and China's influence over TikTok, is that overblown, pun intended, or what's your takeaway?

Speaker 2: Oh, not overblown at all.
And it's really funny that we're even in the position of asking, is this overblown? Because when we were working on this problem, back before the Heritage Foundation was working on it, I was working on it at a different policy center, I felt like we were voices alone in the wilderness, and people were like, you're crazy, what are you talking about? This is a great app. The fact that we've gotten this far, to be able to expose this digital platform for what it is, I think is great, so I'll take that as a win. But not overblown at all. As you said, it has a Beijing-based parent company, ByteDance. That matters, and it matters for a few reasons. The first and most important one is that China has a 2017 National Intelligence Law that effectively requires every private company to provide access to the state. So every private entity is working in service of the Chinese party-state, the Chinese Communist Party, the CCP. So whenever they say, oh, we're totally separate, we store our data in the US and in Singapore, so nobody has to worry about our Chinese ties, those are ties they're actively trying to downplay. We know, from internal communications that were released and reported on, that they actively try to downplay that tie. That matters because when Beijing comes knocking, ByteDance, via TikTok, is not going to say no. They can't. And then we know that ByteDance actively employs former CCP state media members. There's a report that over three hundred profiles declared that they had been members of state media at some point. And further, one of three ByteDance board seats is held by a person with active CCP ties. So this is, I mean, it's very, very obvious.
Those alone, you know, those are facts. And then you look at what's happened since we established those facts for the American people, and we know that US user data has been accessed in China by ByteDance engineers. So again, when they say there's a sundering between our Chinese operations and our US operations, that is not true, because Chinese engineers in China have had access to US user data. Further, a new revelation: Forbes basically reported that TikTok planned to surveil the location of specific American journalists. These were ByteDance employees in China using TikTok to gather, or attempting to gather, IP address location information on these specific journalists to find their sources. So TikTok is being used as a surveillance app by ByteDance employees in China. So anything these companies are saying, and I'm going to be watching with bated breath when the TikTok CEO goes in front of the House Energy and Commerce Committee on March 23 to hear what he has to say, all of these equivocations frankly amount to falsehoods, and they know they're being deliberately misleading. Given what we know from recent reporting on user data being accessed from China, that rift between the Chinese and American company does not exist like they say it does.

Speaker 1: But what's wild is Adam Schiff, who previously was the chairman of the House Intelligence Committee, is on it. I mean, he should know better. And then you have people like Eric Swalwell, who was also on the Intel Committee, who had a relationship with a Chinese spy. You have Senator Dianne Feinstein, who employed a Chinese spy for twenty years. You've got the Biden family, who have made money off of China as well. You've got all these American companies, previously mentioned Apple, you've got the NBA.
298 00:16:35,240 --> 00:16:37,280 Speaker 1: So I mean, I guess, do do we have the 299 00:16:37,320 --> 00:16:41,000 Speaker 1: will to to go after China, to hold China's feet 300 00:16:41,040 --> 00:16:43,840 Speaker 1: to the fire since everyone is still beholding to them. 301 00:16:44,080 --> 00:16:46,920 Speaker 1: I honestly believe that that is the biggest problem that 302 00:16:46,960 --> 00:16:49,440 Speaker 1: we're facing and the biggest problem that the China Select 303 00:16:49,440 --> 00:16:51,480 Speaker 1: Committee is going to face. Two. You know, we need 304 00:16:51,520 --> 00:16:54,440 Speaker 1: to end China's land grabs in the American heartlands. We 305 00:16:54,560 --> 00:16:57,840 Speaker 1: need to end p l A influence in U S universities. 306 00:16:57,920 --> 00:17:00,680 Speaker 1: And when people are, you know, getting part of their 307 00:17:00,680 --> 00:17:04,359 Speaker 1: paycheck from the CCP or Chinese related end z s, 308 00:17:04,480 --> 00:17:07,119 Speaker 1: I think you know that they're they're compromised. UM, and 309 00:17:07,200 --> 00:17:09,880 Speaker 1: a lot of especially the Democratic Party, as you mentioned 310 00:17:10,080 --> 00:17:13,359 Speaker 1: swallow Well shift, even Biden. We know about his dealings 311 00:17:13,400 --> 00:17:15,920 Speaker 1: from the Hunter Biden laptop relations UM and his son's 312 00:17:15,960 --> 00:17:20,360 Speaker 1: dealings with CCP members, the CCP spy master in fact. UM. 313 00:17:20,400 --> 00:17:23,480 Speaker 1: So so you're right, when they're in the pockets of Beijing, 314 00:17:23,800 --> 00:17:26,879 Speaker 1: it's going to be harder and harder to UM figure 315 00:17:26,880 --> 00:17:30,640 Speaker 1: out solutions. And I really believe it's incumbent upon especially 316 00:17:30,680 --> 00:17:33,240 Speaker 1: the Republicans on the China Select Committee. I'm glad that 317 00:17:33,280 --> 00:17:36,960 Speaker 1: there's bipartisan efforts UM. Some of the TikTok bills actually 318 00:17:37,119 --> 00:17:41,520 Speaker 1: UM in the House are bipartisan led efforts of Christian Morthy, 319 00:17:41,840 --> 00:17:45,600 Speaker 1: Democrat Representative is co sponsoring the Bill with Mike Gallagher UM, 320 00:17:45,640 --> 00:17:48,440 Speaker 1: in addition to Mike Warner really sounding the alarm on 321 00:17:48,480 --> 00:17:51,880 Speaker 1: TikTok um at Democrat from Virginia. So I think it's UM. 322 00:17:51,920 --> 00:17:55,560 Speaker 1: It's very important for UM for people to obey the 323 00:17:55,560 --> 00:17:59,560 Speaker 1: dictates of their consciences at this point because UM this 324 00:17:59,640 --> 00:18:03,560 Speaker 1: whole you know, extracting wealth from America. My friend Page 325 00:18:03,560 --> 00:18:06,040 Speaker 1: wildly talks about that. A lot of people sort of 326 00:18:06,040 --> 00:18:09,160 Speaker 1: getting their's and you know, while the rot is UM 327 00:18:09,280 --> 00:18:12,320 Speaker 1: is still pervasive, so that you know this, it carries 328 00:18:12,400 --> 00:18:15,640 Speaker 1: on as it was without UM being you know, who 329 00:18:15,680 --> 00:18:17,879 Speaker 1: we used to be as a country. I think I 330 00:18:17,920 --> 00:18:21,239 Speaker 1: think people are just they're they're trying to get their's uh, 331 00:18:21,280 --> 00:18:24,280 Speaker 1: and then they're trying to secure that for their posterity 332 00:18:24,280 --> 00:18:28,320 Speaker 1: and their own children. 
That's a perfect example of how this looks in the real world, and Americans just have to say no. At the Heritage Foundation, we've talked to a lot of people who are very interested in this question, people who have made a lot of money, businessmen, and they're like, I cannot allow this to go on, out of a sense of morality. And this is where, you know, they're pulling out of China and are able to see it, even when your big tech companies like Apple, which get a lot of their revenue from China, that's a huge market for them, can't see it. But good, solid American business owners can. I think we need more of those men and women. And frankly, it needs to be exposed. I don't think people know what you know about Apple in China, partnering with a CCP propaganda arm, or Google attempting to make a censored search engine in China called Dragonfly. We need to expose the fact that a lot of these big tech companies, these globalists, frankly, are in bed with the CCP, because that's the world they think is successful.

Speaker 1: Those are all really great points. You look at TikTok: more than a hundred million American users, and they spend an average of more than eighty minutes per day on the app. You've got roughly thirty-six percent of Americans over the age of twelve now using TikTok, including sixty-one percent of Americans ages twelve through thirty-four. I mean, massive influence. And beyond just the data privacy issues, I worry about the brainwashing aspect of this as well. There was this article I was reading in the Daily Mail talking about concerns in the UK.
They said something like the hashtag "trans" has been used almost twenty-six billion times. And you look at that, and you look at things like puberty blockers leading to infertility at a time when we face declining birth rates. I mean, if you're China, would you do anything differently?

Speaker 2: Absolutely not. And China is doing all of this while insulating Chinese kids through Douyin, their more wholesome Chinese version of TikTok. It's particularly concerning for the next generation of citizens. TikTok is winning the battle for the next generation in America, frankly. Two-thirds of American teens are on TikTok. We know that parents surveyed by Pew said that their preteens, these are nine- to eleven-year-olds, are on TikTok. And it indoctrinates American youth by pushing harmful content: eating disorder content, suicide content, content with influencers who exhibit verbal and physical tics. There's one report from the Wall Street Journal last year that basically said the young women who went to a Texas hospital exhibiting these verbal and physical tics all traced their habits back to watching a particular TikTok influencer. So this is all happening while China insulates its own children through the more wholesome app, Douyin, the Chinese version of TikTok. And we know that even adults are starting to get their news and a lot of their content on TikTok: the number of American adults using TikTok for news has tripled in the past two years. So we're plugging ourselves into this application that's pumping out these influence campaigns and propaganda.
We know that pro-CCP narratives are also pumped into TikTok's algorithms and feeds. So, understanding what this is doing to the next generation, what effect this could have on our citizens who are going to vote and who are charged with the future of our republic, we haven't even seen the impacts of that yet. It's not just mental health; it's also how they successfully steward this nation going forward. And as you said, if I was the CCP, if I was Xi Jinping, I would not do anything differently. Gen Z is turning out in droves for Democrats, believing abortion, climate change, gun control, all these different things, are the biggest issues, things that, you know, are all going to lead to the demise of the country.

Speaker 1: But I wanted to ask you: if you could wave a magic wand, what would you do about Big Tech? What would you like to see get done in Congress?

Speaker 2: Yeah, well, starting with TikTok, I think this is a no-brainer. It's time to ban it outright from operating within the United States. It's both a CCP spy app and, as you said, something that really brainwashes our children and puts them in the CCP's clutches. They're playing the long game, for blackmail, for espionage, all kinds of things. They have patterns of life on our children now, and we should not be tolerating that. So that would be great, and it absolutely has to happen at the federal level. States are moving on it, and it's great what they're doing when it comes to banning TikTok on government devices, but it's not enough. They can still access the data, especially if you connect to government WiFi, et cetera. So it's only a partial solution. And then I think there's a great suggestion by FCC Commissioner Brendan Carr, who says, hey, remove TikTok from the Google and Apple app stores. You did it to Parler, so why can't you do it to TikTok?
We have a lot of evidence, and a security justification, that this is extremely harmful. Then second, the collusion between Big Tech and the government, that's something we should impose costs on. The federal government should not be permitted to use these tech companies and these executives as agents to chill and police the speech of Americans. So that's huge, too. And then third, I think you need to look at the way these companies do business, like their ad tech models. Senator Lee has a great bill that I hope he reintroduces in Congress that takes a look at their ad tech. You can't be on the buy side and the sell side and also operate the ad exchange. That helps these companies accrue all of that power to themselves, and then they can make censorship decisions that don't redound to the benefit of most American citizens, especially on the right. And it's that consolidation of power that is really preventing Americans from getting the access they need to specific services, as well as giving these companies the potential to influence what Americans see, especially before an election, as we know from Google and Gmail and their spam filters, and how they inordinately filtered conservative candidates, and not Democratic candidates, into spam. So there's a litany of things, but I think those three are the top things you do right now. And then you work on, frankly, data privacy and children. I think that's the next frontier, something we really need to think hard about. Again, there are some really interesting bills percolating, like the Kids Online Safety Act, and I think Republicans need to lean into those, and conservatives as well, and we could get some bipartisan consensus along the way, because everybody sees what this is doing to our children, and nobody is happy about it.
And these tech companies are doing it on purpose. They're deliberately designing these platforms to maximize engagement, and even targeting these, quote, "small but untapped audiences" like preteens. So this is something that should be nipped in the bud right away. America can do it.

Speaker 1: All of this goes in the direction of harming Republicans, as it always does. Kara Frederick, thanks so much for joining the show. Super insightful. I really appreciate your time.

Speaker 2: Thanks, Lisa.

Speaker 1: So that was Kara Frederick, director of the Tech Policy Center at the Heritage Foundation. Interesting conversation, and she obviously knows a lot about the issue. I want to thank you guys at home for listening every Monday and Thursday, but you can listen throughout the week. I want to thank John Cassio, my producer, for putting the show together. Feel free to leave us a review on Apple Podcasts, or give us a rating. I love to look at those. Until next time.