Newt Gingrich: On this episode of Newt's World. I was really intrigued recently when I came across the February seventh report released by the Heritage Foundation entitled Combating Big Tech's Totalitarianism: A Roadmap. The report makes three key points. First, the growing symbiosis between big tech and government gives these companies undue influence over Americans' daily lives and undermines our rights. Second, big tech has increasingly exercised pervasive control of information and access to the digital space in ways that undermine freedom and a functioning republic. Third, it is time for aggressive reforms to ensure that big tech is held accountable, to provide scrutiny and oversight, and to constrain its ability to reshape society. Like many of you, I've seen what big tech has done in the last two years to silence voices they don't agree with, from President Trump being permanently banned from Twitter on January eighth, twenty twenty one, to the latest efforts by Spotify employees to censor podcaster Joe Rogan for his content. Big tech is more like a big government overlord deciding what information you're allowed to see. Here to discuss her report and talk about what needs to happen next, I'm really pleased to welcome my guest, Kara Frederick, Research Fellow in Technology Policy at the Heritage Foundation. Kara, thank you for joining me.

Kara Frederick: Thanks for having me.

Newt Gingrich: As I understand it, you worked for Facebook, where you helped create and lead Facebook's Global Security Counterterrorism Analysis program, and you were also the team lead for Facebook headquarters' Regional Intelligence Team in Menlo Park, California. When did you work at Facebook?

Kara Frederick: So I was there in twenty sixteen, all the way through twenty seventeen, so I was there when Donald Trump was elected. There was a big push at the time to look at what terrorists did on these digital platforms.
So there was a big hiring push at Facebook to sort of get people who'd spent time in the intelligence community, people who'd done actual counterterrorism work, especially in the digital space, in one of these three-letter agencies, and bring them over so we could bring that know-how to Facebook and effectively help them learn how to do this on the platform.

Newt Gingrich: What surprised you about that experience?

Kara Frederick: Oh, I'd say the first thing, you know, I used to work at a place called Naval Special Warfare Development Group. So I was working at a command, you know, I had been to Afghanistan three times, I was sort of bleeding red, white, and blue at the time, and so I got there and I sort of expected everybody to come from that similar foundation. I was at Menlo Park headquarters in California, and I had thought that there would be at least some recognition of the uniqueness of America and its exceptionalism, and a recognition and gratitude, really, for the American system and all of these companies growing up and flourishing within our distinctly American system. But I'd say the thing that struck me the most was they just really didn't think about it. I'd say they took it for granted. And there was a decided lack of gratitude there about America itself, because we were a global corporation, right? We weren't necessarily an American company, even though we were incorporated in Delaware. We were responsible to a global constituency, and we were always trying to grow outside of America. Now, for Facebook, around ninety percent of their user base is outside of the US and Canada, so I mean they are thinking in terms of a global constituency rather than, this is America, what can we do that is in America's interest? That was the number one thing that struck me going from the Department of Defense all the way to Silicon Valley.

Newt Gingrich: So in that sense, these are global companies that happen to have headquarters in America, but they're not American companies that have a global reach.
Kara Frederick: That's exactly what I experienced, and I think these past few years have really borne that out. There's no better demonstration of that than the new Twitter CEO who took over for Jack Dorsey, who basically said the First Amendment is not really a guiding principle for us, it's not something that we're responsible to. So I think that really tells you all you need to know about what's going on in sort of the hearts and minds of a lot of people in these companies. Not all of them, there are some patriots, but at the same time, I would say the prevailing ethos is that of a global corporation, where we are a global company.

Newt Gingrich: Before you came to Heritage, after your time at Menlo Park, you became a fellow for the Technology and National Security program at the Center for a New American Security. Was it sort of comforting to get back into a place where people actually thought defense mattered and national interest mattered?

Kara Frederick: It really was, to be completely honest. I mean, I would go to the coffee shops in Menlo Park and Redwood City and Mountain View, and all people wanted to talk about was their product and shipping their product, or getting a job at Facebook, or the latest thing they came up with in terms of design. And to me, it was unfulfilling. There wasn't that geopolitical awareness that I had felt my entire life in DC. Heck, you know, going to Afghanistan multiple times will force it upon you if it's not top of mind. So it was comforting to really use sort of the technical proficiency I gained, not just in the intel community but as applied at Facebook, and then come back and say, hey, there's a set of emerging technologies that are really going to matter for national security going forward, and if we don't help write the rules of the road now, our enemies are just going to run roughshod over us.
So thinking in those terms, instead of just helping design products and tools and shipping them and iterating and designing and shipping again, that to me was really where my heart was, and it was reassuring that people still cared about the national security implications of, say, what the CCP is doing with their artificial intelligence development.

Newt Gingrich: I want to take two detours before we get to the domestic implications. First of all, when you look back on your Afghan experiences and then you look at how we left Afghanistan, what's your general thinking about how Afghanistan is going to evolve now? And what's your reaction, having watched the sacrifices we made, to then look at the sort of chaotic way in which we ran out of the country?

Kara Frederick: Sir, it was devastating. I mean, between my father, my husband, and me, we have twenty deployments between the three of us. You know, we all sort of came of age, except for my dad, that was the tail end of his career as a Marine, but we all sort of came of age in the war on terror. You know, I was in high school when we watched the towers fall. My husband, same thing. So to devote your best years to twenty years of just being at war and fighting it and really believing in the mission at the time, and then to see that, you know, on a personal note, it was just simply devastating. On a policy note, I mean, to me, this was an absolute dereliction of duty, right? McMaster's book, I remember, used to sit on the shelf of my father's bookcase, and I was always thinking, never again. You know, this is never going to happen again because we know too much. We know too much about the Vietnam disaster, etc. But history again is repeating itself, and it was a botched withdrawal.
It was entirely preventable by the Biden administration, and in my estimation, it stomped on the sacrifices that so many people made, giving their lives, and not just in a sunk-cost-fallacy kind of way; we had the opportunity to at least keep a foothold there, which China is now encroaching upon with Bagram Air Base, and we gave it up. So it was an unforced error in my estimation, and I would be devastated to see it repeated again.

Newt Gingrich: Your dad was a Marine. Did he come in after Vietnam?

Kara Frederick: He did.

Newt Gingrich: Because there are some very eerie similarities, and I do think of McMaster's book Dereliction of Duty, because there are some parallels in the way in which the Chairman of the Joint Chiefs and the Secretary of Defense failed to really communicate clearly to President Biden the dangers he was running and the likelihood of a disaster. The other question I'm going to ask you, just in general terms, is this: as great a threat as some of the American domestic companies are, what's your reaction to the scale of the Chinese effort in this space?

Kara Frederick: Oh, I'm very, very concerned. I think it starts with, you know, you have to recognize this is an adversary. This is not a friendly competition whatsoever. When people tell us who they are, when the CCP tells us who they are, we have to believe them. With their plans for basically leading the race in AI by twenty thirty, everything that they're saying, I think we do to a degree have to take at face value, because they, to be honest, have put their money where their mouth is in terms of their investment in artificial intelligence development. These are technologies that are going to underpin all of the war-fighting technologies going forward. When you talk about drone swarms, when you talk about autonomous weapons, when you talk about even quantum computing, all of these technologies, China is really investing in them in a way that I don't think we've been very serious about.
There are people who understand the threat, but I take it very seriously. With China, clearly we're leading in certain areas, like the design of chips and whatnot, and if you look at sort of the AI research papers that have come out, you know, our quality tends to be better, but they are surging almost to parity with us. And I think we need to be careful and look at where they're investing all of their money, their personnel, their largesse. There's basically the civil-military fusion concept, so they're taking the resources and the abilities of the private sector and commandeering them for the CCP. You've got to make sure you understand that if you're working with a private company in China, you're effectively working with the CCP, because the state can have access to whatever that company does under the National Intelligence Law and the National Cybersecurity Law, passed in China in twenty seventeen. I think they are a very worthy adversary, and we have to posture accordingly.

Newt Gingrich: So given all that background, what brought you to Heritage?

Kara Frederick: Yeah, so I'm going to be completely honest with you. The crossing-the-Rubicon moment for me was in the election cycle of twenty twenty, when Big Tech actively suppressed the Hunter Biden laptop story coming out of The New York Post. I was sitting there at CNAS, which is an amazing organization. They got me my start in the think tank world. I have so much respect, especially for the CEO, Richard Fontaine, an old McCain guy, I mean, one of the best individuals I've ever met, and he runs a great, tight shop. But I sort of realized, okay, if we're working with the tech companies and they are doing this to American citizens, not allowing them to share links to this story even in Twitter direct messages, there is something very, very wrong here. And, you know, maybe I've got to put my money where my mouth is and go to a place that doesn't necessarily work so closely with Big Tech.
And I decided that I would devote the rest of my career to conservative causes that are on the side of freedom of expression, really bolstering the American citizen as sovereign, and our ability to maintain our free republic by refining our thinking against one another. Because what we're seeing coming from big tech companies, especially as they collude with the government, looks less and less like the America I grew up in and more and more like China's authoritarian state. And we need to nip that in the bud and arrest some of the trends that I'm seeing right now. And I decided Heritage was the best juggernaut that I could go to to actually get that message out. And I think I've been proven correct thus far. I think we're doing a great job so far.

Newt Gingrich: Yeah, I think you've got a great new leader in Kevin Roberts.

Kara Frederick: Oh yeah, I think he's going to be very impressive.

Newt Gingrich: Now the Heritage Foundation has launched what they call the Conservative Oversight Project. What is that all about?

Kara Frederick: Yeah, so, you know, the left is really good about holding politicians, people in the public eye, accountable for some of the deleterious effects they say have been visited upon Americans. I think it's time for conservatives to get into the fight. I'm not directly involved with this project yet, but I know the people who are running it, and they are lions for the cause. And they're saying, okay, if the left is going to do this, then we have to make sure that we can fight fire with fire, and we have to understand that holding politicians accountable is not just something that Democrats can do; it's something conservatives have to do. In terms of FOIA requests, transparency is massive. You've seen the damage that FOIA requests have done to all of this teachers-union rhetoric around teaching critical race theory. When you have an element of transparency, there's that whole "sunlight is the best disinfectant" idea.
The FOIA requests have been devastating for them, because they show what's actually going on in our schools, and that's been able to galvanize this grassroots movement of parents against what's going on in terms of the indoctrination of our children. So I think using some of those elements is going to be critical to what the Oversight Project does, and really holding the Biden administration especially accountable. For instance, you look at something like flying illegal immigrants into the country in the dead of night; the American people should understand what's happening, they should know about it. They're doing it in secret for a purpose. So I think it's things like that that we can help bring to light for the American people, so that we can actually have a stake and a voice in our government and not just be shrouded in silence when the Biden administration covers our eyes.

Newt Gingrich: In that context, you're trying to surface things that, in a sense, they're trying to hide. And you know, the left's battle cry now is that they need to police misinformation or false information, which turns out to be conservatism. From that standpoint, aren't you engaged almost every day in swimming uphill, trying to get these ideas across?

Kara Frederick: Oh, absolutely. And now we've been able to put studies and data to the suppression of conservative viewpoints on these big tech platforms, so it is swimming upstream in many ways. The Media Research Center recently had a study, in September, that said Twitter and Facebook censor Republican members of Congress at a rate of fifty-three to one compared to Democrats. And then Facebook created two internal tools after Donald Trump won that suppressed very conservative media outlets on their platform, so those outlets and their articles are not getting shared, because of these internal tools that Facebook has developed. Same thing with Google.
They've stifled conservative-leaning outlets like The Daily Caller, Breitbart, and The Federalist, all during the twenty twenty election season, and Breitbart was able to say that their Google search visibility shrank by ninety-nine percent in the election season of twenty twenty compared to the election season of twenty sixteen. So when you say swimming uphill, absolutely; that's an understatement, if anything.

Newt Gingrich: Yeah, we experience that at Gingrich 360. We see cycles where Google will come in and suddenly we're much further down the list of keywords in order to get to our website. It has a direct impact on how many people can find us. It's interesting to think that somewhere out there in Silicon Valley, some unknown person is making random decisions that either turn on or turn off the flow of information, in ways that I think are a real threat to the country. From your perspective, what is it you think we need to do as a matter of public policy?

Kara Frederick: Yeah, I think there are many ways to sort of redress what I call the imbalance between these tech companies and their users. At the Heritage Foundation, I think first and foremost the ethos that should underpin everything we do is that we are going on offense. We effectively are going to war against big tech, and we've developed a roadmap for what that looks like. You talk about the policy options, so I can run through some of those, but there are other elements here too: there are state legislatures, state attorneys general, there's the grassroots citizenry, there are tech founders who maybe haven't yet started tinkering in their garages. I think we need to look at it as a myriad of tactics; we need to diversify our tactics, with policy being just one element. But concerning policy, because we're in Washington, DC, and that's the water that we swim in: I think Congress and relevant federal agencies should enforce antitrust laws and reform them where necessary.
I think they should scrutinize big tech business models, so really get at the heart of what allows these companies to manipulate and take advantage of Americans: their ad tech models. We need to find ways to hold tech executives liable for real fraud and breach of contract. You saw this with the GoFundMe issue over the Canadian truckers' money. They were going to withhold nine million dollars that had been donated to GoFundMe accounts for the trucker convoy, and when people started saying, wait, you're saying you're going to disseminate it to charities of your choice instead of giving it to whom we thought we were donating it, that might be fraud, then they backed down. So if there's fraud, if there's breach of contract, you need to hold tech executives liable for that; scare them straight. And then I think you also need to require transparency, like we talked about: transparency in content moderation practices; transparency when they make mistakes, which they should admit to the public; algorithmic transparency, what are the impacts of these algorithms on users; and then data transparency, how are they collecting this data, how are they using the data, who are they sharing it with, and when and for how long are they storing this data? There should be truth in dealing with data, like there's truth in lending in other sectors; that is critical. And then I think other elements consist of ensuring that the government doesn't use these companies as agents to chill speech. So Jen Psaki should not be allowed to, in her official capacity, go up to the White House podium and say we're working with Facebook to get these certain accounts off of Facebook because they spew disinformation, and then have Facebook comply less than a month later. No, that should absolutely be prohibited. And joint ventures with the CCP, like the ones Apple engages in to the tune of two hundred and seventy-five billion dollars, should also be prohibited.
Americans should be given new ways to fight back, prompt and meaningful recourse, and they should be given more user control and more privacy. That's sort of the gist of our policy ideas in a nutshell.

Newt Gingrich: There's this issue of reforming Section two thirty, which was originally created in a pre-Internet world. Can you walk us through what it is now and what you think it should be?

Kara Frederick: Of course. So Section two thirty refers to the section of the Communications Decency Act of nineteen ninety-six that basically shields these tech companies. It gives them immunity from civil liability for removing objectionable content or lascivious content, call it what it is, porn and child exploitation and whatnot. But there's a clause in it that says they can remove, with immunity from civil liability, content that's "otherwise objectionable," and they've taken that to mean all sorts of things, right, the things that you and I consider mainstream conservatism, like the New York Post's Hunter Biden laptop story, that kind of thing. We as Americans can't sue them, or we can try, but we're going to lose in court because of Section two thirty. And then, as Clarence Thomas has said, over the years the courts have interpreted this to confer a sweeping immunity, so they've basically gone overboard. I say tech companies were given an inch. The purpose was arguably pretty noble; these are the twenty-six words that created the Internet, according to some scholars. But they've been given an inch and they've taken a mile. So it's time, in our estimation at the Heritage Foundation, to reform Section two thirty. And what you do is you say that we can strip immunity from tech companies if they censor based on political views or other views protected by the Constitution, as clearly outlined in future legislation. So you have to basically say, you don't get that immunity from the people clamoring for recourse if you censor based on political views.
You're going to be opened up to lawsuits if you censor as wantonly as you have been, especially based off of, you know, people just engaging in critical thinking on their platforms.

Newt Gingrich: I was Speaker when we passed this, and our interest at the time was helping get this revolution in capabilities off the ground; we wanted to enable them, for example, to block child pornography. I don't think any of us at the time thought through the notion that it would become almost an anti-American kind of totalitarian tool that would begin to pick winners and losers based on ideology. I would strongly support reforming Section two thirty. You know, I had an interesting experience the other day. I wrote a good friend of mine, Herman Pirchner, who runs the American Foreign Policy Council, because I'd noticed that in Russia there have been very significant letters and other outcries against Putin invading Ukraine. And I wrote and said, wouldn't this kind of make Russia an authoritarian state, whereas China is a totalitarian state? Because you couldn't imagine anybody in China doing the sort of things that people are getting away with in Russia. And he wrote back and said, that's right. He said, you still have independent centers of power that Putin actually can't take head-on, and they do in fact limit him, whereas Xi would disappear you and you just wouldn't exist. It's a fascinating difference. And it seems to me that, unfortunately, many of the elements of big government in this country, combining with elements of big business, come closer to tolerating and accepting the Xi Jinping totalitarian line. And that's amazingly dangerous, because once somebody asserts the right to say, well, now your ideas don't count, then the next person can decide their ideas don't count, and pretty soon you're down to, you know, one person defining what we believe. I mean, do you worry about that kind of creeping totalitarianism?

Kara Frederick: Oh, absolutely. And that's why we titled the paper Combating Big Tech's Totalitarianism, because it was evocative of Rod Dreher's piece, right, where he talks about soft totalitarianism in his book Live Not by Lies, based off the famous Solzhenitsyn quote, right, one word of truth outweighs the whole world. Because right now Americans are being forced to lie by these tech companies working hand in glove with the government. And that is what I'm worried about: the politicization of everything as totalitarianism, as companies fuse with the Biden administration and the prevailing leftist ideology that has sort of captured every institution. We're seeing the manifestation of this today. You can't say that a man is a man and a woman is a woman on Twitter without getting kicked off. Representative Jim Banks had this experience. Christian commentator Allie Beth Stuckey had this experience. And this is also the line coming from the Biden administration. And I'll get more granular in terms of why this fusion of tech companies with leftist ideology, as propagated by the Biden administration and Democratic politicians writ large, matters. When Joe Biden, in a January address on COVID, says, I'm appealing to tech companies, you have to do more to get disinformation off your platforms; when the Surgeon General, in July, the day after Jen Psaki got up to the podium and said we're working with Facebook to do more, said the same thing, that his office is also working with Facebook; these are agents of the government. DHS Secretary Mayorkas said the same thing with regard to elections. So it's not just COVID misinformation or COVID-related ideas; it's questioning the election, it's asking for election integrity to be intensified. It's pervasive. Again, it's gender ideology; you can't talk about biological sex in a way that is truthful. And these tech companies are enforcers of what the Biden regime is saying is the only acceptable line of thought.
You know, there's a reason why Ron DeSantis said that big tech is the censorship arm of the Democratic Party. And when Parler was kicked off by Apple, Google, and Amazon Web Services, which is the real story here, Michelle Obama talked about it, sitting Democratic politicians talked about it. So that pressure from the government absolutely matters when it comes to the policies and practices enacted by these tech platforms. And it's bleeding over into companies beyond the big tech companies and social media: to online fundraising, payment processors, email delivery services, internet service providers. It's every facet of your life, and that equals totalitarianism.

Newt Gingrich: You know, it's very sobering to watch the Canadian government overreact to the truckers, the announcement yesterday that if you allowed the truckers to use one of your trucks, your company could lose its insurance, it could lose its license, the government could freeze its bank account, all of it, by the way, without going to court, without getting a judge, just the arbitrary decision of bureaucracies, which is very similar to the Chinese model. I was surprised at how open and blatant it was. I always think of Canada as a sort of friendly country, and this is a level of ruthlessness I don't know of in any other Western country, a willingness to get right into your personal life and destroy you. And that's what they were saying: basically, we will destroy you unless you obey us.

Kara Frederick: Well, this should be extremely worrying, because while Trudeau pulled the trigger first on the Emergencies Act, all of the underpinnings of the same kind of thing are here in America. It's normalizing the targeting of mainstream citizens with counterterrorism tools. We saw this with the most recent Department of Homeland Security bulletin that said, you know, Americans who share misinformation, disinformation, or malinformation on social media are subject to being treated as terrorists. That should send a chill down everyone's spine. And we've seen it with the National School Boards Association, with the letter they wrote in tandem with the White House to Merrick Garland that basically said parents who object to the teaching of critical race theory in their schools should be treated as domestic extremists. There are a couple of other examples here, like the DOJ starting a domestic terrorism unit that looks at anti-authority or anti-government ideologies. And to me, that's classifying people who maybe just want to question where the government is going with some of these policies we should be skeptical of; yet now we're going to be labeled terrorists. The Canadian government did it first, but I think, alas, it is actually going to come here too.

Newt Gingrich: Yeah, it's almost like the elites all across the Western world are desperately trying to suppress the incorrigibles, the blue-collar workers, the people who don't automatically salute the new left-wing flag, and it's fascinating. As a historian, I find all of this kind of astonishing. I want to really thank you for joining me. I think your new report, Combating Big Tech's Totalitarianism: A Roadmap, is a must-read for everyone concerned about big tech's overreach and what we need to do to reform the system. We're going to include it on our show page, and I hope as you do further work you'll come back and keep briefing us, because this is going to be a very important and, I think, very rapidly changing battlefield, and what you're doing is in some ways more important for the future of freedom in America than anything you ever did in Afghanistan.

Kara Frederick: Thank you, sir. I'll come back anytime.

Newt Gingrich: Thank you to my guest, Kara Frederick. You can read her report, Combating Big Tech's Totalitarianism: A Roadmap, on our show page at newtsworld.com. Newt's World is produced by Gingrich 360 and iHeartMedia. Our executive producer is Garnsey Slow, our producer is Rebecca Howell, and our researcher is Rachel Peterson.
The artwork for the show was created by Steve Penley. Special thanks to the team at Gingrich 360. If you've been enjoying Newt's World, I hope you'll go to Apple Podcasts and both rate us with five stars and give us a review, so others can learn what it's all about. Right now, listeners of Newt's World can sign up for my three free weekly columns at gingrich360.com/newsletter. I'm Newt Gingrich. This is Newt's World.