Michael Knowles: President Trump signs an executive order to stop big tech's censorship of conservatives. Luckily, we can sit down with a guy who has been fighting that very censorship for many years. This is Verdict with Ted Cruz.

Michael Knowles: Welcome back to Verdict with Ted Cruz. I'm Michael Knowles, and the guy who's been fighting this for many years, coincidentally, is Ted Cruz himself. Senator, nice to see you, even though we still have to do this virtually, digitally, and hopefully without any censorship.

Ted Cruz: Well, let's hope so, but we'll see if big tech pulls this podcast down.

Michael Knowles: That's right, Senator. I think everybody knows that big tech has been going after conservatives, and probably a lot of people listening to this right now have felt this kind of thing themselves, if their account has been suspended for sharing a conservative view. I mean, frankly, I think it's probably happened to all of us at this point. So we know that there's a problem, but there's been some debate over how you solve the problem. Do you pass some new regulation, do you break them up like a monopoly with antitrust laws, or do you enforce laws that are already on the books? You focused in on something called Section 230 of the Communications Decency Act of 1996. The President focused in on that as well in his executive order. Can you tell us what that is and how the argument applies to big tech censorship?

Ted Cruz: Well, sure. And in answer to your question about what you do, you listed several options. My answers are yes, yes, and yes. Let's start with 230. What is 230? In 1996, Congress passes a law called the Communications Decency Act. It was actually mainly focused on trying to regulate Internet porn, but it included a portion, Section 230, that gave a special immunity from liability to big tech companies. And here was the reasoning at the time. Remember, this was early; this was right when the Internet was starting.
In fact, one thing, Michael, you and I have talked about: I was clerking at the US Supreme Court in 1996 and 1997, when one of the first challenges to congressional regulation of the Internet came up, and I told a story in the book I wrote a few years ago about sitting with a Supreme Court librarian, with my boss Chief Justice William Rehnquist and with Sandra Day O'Connor, and the librarian pulled up hardcore porn to show the justices. All the law clerks are there, and look, Justice O'Connor was in her seventies, and I still remember, when they pulled up porn on the screen, Justice O'Connor just went, "Oh my." And we're all sitting there, the law clerks are in our twenties, and we're looking at our shoes feeling really, really awkward. And even more awkward is the fact that when they were law students at Stanford, Rehnquist and O'Connor were in the same class. They actually dated for a while. He was number one in the class, she was number three. So picture this, Michael: forty years from now, you're seventy, eighty years old, and you're standing in a room with an ex-girlfriend from forty, fifty years ago, watching porn. Imagine how awkward it was for the two of them.

Michael Knowles: Senator, I know, and the librarians were there. I know you've had a lot of strange experiences in politics. I just have to stop here for a moment to say that the strangest one I've ever heard from you is watching hardcore porn with Justice Sandra Day O'Connor and her ex-boyfriend William Rehnquist. I think that one is absolutely at the top of the list.

Ted Cruz: It was surreal, and I can still hear the "Oh my" echoing twenty-plus years later. But this was right at the beginning of the Internet. So you've got to understand, the justices didn't know what the Internet was. This was librarians saying, okay, this is this Internet thing, you type in things. I still remember, in fact, they turned off the filters, and to get to porn they typed in the word "cantaloupe," misspelled.
I'll let you use your imagination, but I guess it pulled up porn, and the librarians were trying to show them, okay, look, people can stumble into this accidentally all the time. Part of that bill, Section 230, was designed... You had these little Internet startups, and the idea was, listen, it's not fair to sue a tech company for something somebody posts on there, because this is a forum, this is a public square. And so if somebody posts something and you sue the Internet company, you could drive them out of business, and it's not the tech company that's speaking, it's whoever the users are. So if you post something, the users should be liable, but not the forums. And the predicate for this, the policy predicate: everyone understood big tech was going to be what are called neutral public fora. In other words, they were going to allow everyone to speak. It was going to be the new marketplace of ideas, and so Section 230 passed into law. But here's what it means: Google and Facebook and Twitter, they've got an immunity from liability. Nobody else does. You, Michael Knowles, if you go on the radio and you defame someone, you can be sued. Yep. The New York Times: do you know that Google and Facebook have an immunity from liability? The New York Times doesn't have it, Fox News doesn't have it, everyone else, every American citizen, every American company. It is only big tech that gets this special immunity from liability.

Michael Knowles: So, Senator, this is the distinction we've heard about a little bit, which is the platform versus the publisher. Right? If you're a neutral platform, you get these protections, but if you're a publisher like the New York Times or the Daily Wire or Fox News or whatever, then you don't get those protections. And the reason for it, of course, is if you could sue Twitter for every defamatory tweet that has ever been tweeted, Twitter would go out of business in approximately five seconds.
Ted Cruz: That could well be true. But you know the interesting thing? Let's suppose I wrote an op-ed that said, with apologies, that Michael Knowles has carnal relations with barnyard animals, and The New York Times published it. And let's assume, and I'm willing to assume for the sake of the podcast, that that is a totally false statement...

Michael Knowles: Thank you.

Ted Cruz: ...and that I have no basis for it, that I am willfully and recklessly making it up. You, as a citizen, could sue. You could sue me for defamation, and you could also sue The New York Times, because of their choice to publish it. If they publish something that's defamatory about you, sue the hell out of them. If I do the same thing on social media, you can't sue the hell out of them. And the reason is that Congress made a determination twenty years ago that these are special public fora. Now here's what's changed. Big tech has decided, and they've decided it only in the last couple of years, that they don't want to be neutral anymore. They want to be political players. They want to editorialize, they want to silence voices they don't like, and so they're deliberately amplifying lefty voices and silencing conservative voices. And okay, fine, if they want to do that, fine, but one of the obvious steps is you don't get a special immunity from liability that nobody else gets. We're not going to treat you differently, because if you're going to behave like The New York Times, you should face the same risks The New York Times faces.

Michael Knowles: That makes sense. We've heard from some legal scholars on the left, or jurists on the left, that really Section 230 shouldn't be applied this way, that it's not about political stances and conservatives are abusing this. But the argument you've just made, and that you've actually been making for quite some time now, is the argument that was made today by Attorney General William Barr. It's the argument that was made by President Trump.
So it seems that Section 230 is the key here to the conservatives' argument, to stopping this big tech censorship. Can you just go into a little bit of the political debate, or maybe how the Trump administration came to adopt this idea?

Ted Cruz: Look, the simplest thing I have to say about this executive order is: it's about damn time. I have literally been urging this administration to do this for three and a half years. In the last three and a half years, you know, look, I'm in the Senate. I'm a legislator. I can chair hearings; I've chaired hearings highlighting the rampant censorship and political bias. I can introduce legislation; I've advocated for legislation. But I'm not the executive. It's the executive that actually has enforcement ability, that actually has prosecutors and grand juries and subpoenas and can enforce the law. So in the last three years, I have spent hours meeting with Bill Barr, the Attorney General, on this topic. I have spent hours with Jeff Rosen, the Deputy Attorney General, on this topic. I've spent hours with Makan Delrahim, the head of the Antitrust Division at the Department of Justice, on this topic. I've spent hours with Joe Simons, the head of the Federal Trade Commission, who I used to work with and know very well, on this topic. I've spent hours with the President, with the Vice President, with the White House Chief of Staff, with the White House Counsel, urging them. And here's been the problem: for everybody, it's not quite in their jurisdiction, it doesn't quite fit neatly. So everyone says, yeah, yeah, it's a problem. They've agreed it's a problem. And by the way, Barr and Rosen especially agreed it's a real problem. But it doesn't fit neatly into anyone's sort of traditional job description. And so the President... listen, the President has been frustrated and pissed off about this for a long time, but nobody on his team has been willing to do anything about it.
Yeah, and so I do find it kind of ironic that Twitter decided to be such jackasses. You know, look, my view has always been, in the kind of hierarchy of big tech, the worst is Google and YouTube, which they own. You know, Google, their motto used to be "don't be evil." Now their motto is just "be evil." Right under Google has been Twitter, and then under that is Facebook. Facebook's pretty bad too, but there are tiny moments where they're trying. I think Jack Dorsey at Twitter decided, you know what, I'm tired of Google being the worst player on Earth. And so this whole thing was prompted because the idiots decided to fact-check the President's tweet on voter fraud. And by the way, number one, they linked to CNN, which is so profoundly wrong on so many issues it's ridiculous. But Twitter, I think, just pissed off the President, and thank God they did, because what I assume happened, and I don't know this, but I assume he blew his top and told everyone, somebody get off your ass and do something about it, and it motivated them to finally do this. I'm glad they did. I think this is an incredible threat. Big tech censorship is the biggest threat to free speech and to fair elections and democracy we've got.

Michael Knowles: We've gotten into the whole content question, and to Facebook's credit, actually, Mark Zuckerberg came out today and he said, I don't think that big tech should be the arbiters of truth, basically directly contradicting Jack Dorsey at Twitter. So, I mean, that's a good move. What do we think the conclusion of this will be? It seems that the threat would be to get Twitter to back off, to not interfere in the presidential election, to not put their thumb on the scales of how information moves around the Internet. If they don't back off, are we looking at a world in which Twitter really does lose its protections and Twitter goes down?

Ted Cruz: Look, I hope so. Although, listen, I say I hope so, but I love Twitter.
Twitter is a vehicle to engage in public debates, to go back and forth. Twitter as a neutral public forum actually works quite well. It's only recently that they decided to let their crazy lefty side go. And actually, Michael, I'll tell you a Zuckerberg story.

Michael Knowles: Yeah.

Ted Cruz: So, as you know, Zuckerberg testified in front of the Senate. A lot of us pounded him. He and I went back and forth in a very public exchange when he was testifying. Well, last year, Zuckerberg came to DC and reached out to my office and asked if I'd be willing to sit down and get together. He and I had dinner together, and it was, you know, kind of interesting. We thought about actually going to a restaurant, but to be honest, if Mark Zuckerberg and I sat down at a restaurant in DC, people would lose their minds. I mean, I think they'd start running around screaming and lighting their hair on fire. So we didn't do it in a restaurant. We did it at somebody's house, and it was just Zuckerberg and me, and a couple of people on my staff, a couple of people on his staff. So it was a very small dinner, and it was like three hours long, and it was actually, look, it was a lot of fun. Listen, Zuckerberg comes across as a smart, geeky techie, and I'll give him some credit: he's actually trying to wrestle through these issues. And to be honest, we went round and round and round. What I was advocating was, look, how about some basic free speech? How about just letting the marketplace of ideas work? If you disagree with someone, let people argue. Trump tweets all sorts of things I disagree with. I don't think the answer is to let some Silicon Valley billionaire silence him. If you disagree with him, say it, disagree with him. But Zuckerberg is trying.
He actually, shortly after the dinner I had with him, gave a speech, I think it was at Georgetown, somewhere in DC, advocating principles of free speech. Now, look, Facebook has been bumpy on this, but compared to Twitter and Google they've been much, much better. And you can see that Twitter doesn't care anymore. And by the way, YouTube... the CEO of YouTube came by my office to talk about this, and her attitude was essentially, well, I can't even say it, but it was "screw it," although it was said more graphically than that.

Michael Knowles: Yikes.

Ted Cruz: It was simply: we have power and we will use power. And you know what, she actually wanted credit, because she said, well, you know, people on the left want us to completely silence people. So we're talking about Steven Crowder, a friend of ours, a comedian, and YouTube demonetized him. Boy, talk about an Orwellian word: we will take away all your money. The CEO wanted props because, she said, well, we still allow him to post, we didn't silence him altogether. Like, what the hell are you talking about? And she's like, well, that's what the people on the left wanted us to do. I said, listen, the calls for censorship are only coming from the crazy leftists. I'm not asking you to silence Bernie Sanders or AOC. God knows they can prattle on forever. Let them talk. Their ideas are so bad, we'll engage with them on substance.

Michael Knowles: Yep.

Ted Cruz: But I am glad the administration has jumped in. By the way, there are other things they can do. The antitrust agencies, both DOJ and FTC, have been engaged in investigations. These are monopolies, and they're abusing monopoly power. Something else they can do that's in the executive order, and that's important: it's not just Section 230, it's deceptive trade practices. I'm really glad the order tells the Attorney General to work with the state attorneys general.
I've also talked at length with the Texas attorney general, who's leading state lawsuits about these deceptive practices. And Bill Barr today in the Oval Office said it well, and actually he reflected a lot of what he and I talked about over breakfast. He said, listen, these tech companies have built their platforms on a lie. They tell people, if you come to our platform, you can speak, and if you sign up to follow someone, you watch them, you can see what they post, and if they sign up to follow you, they can see what you post. That's the fundamental promise, and that is a lie. We know Twitter shadow bans if they don't like you. People say, I want to follow Michael Knowles, I care what Michael Knowles has to say, and Twitter says, no, no, no, we're just going to silently make it go away. That is a lie. That is fraud. And I'm glad this executive order takes a step towards real legal liability for defrauding consumers, which is what big tech is doing.

Michael Knowles: That's a big key to this whole issue: the dishonesty, the fraud, the very possible, and it seems likely, violation of these laws. So it's very good. We'll see what happens from it. The ball now is sort of in big tech's court; we'll see how they react to it. I want to get your opinion on something, by the way, directly related to this social media marketplace, which is a few incidents that have popped up over the last week or so, and they've really gained national attention because of social media. The most prominent one would be the killing of George Floyd in Minneapolis. The police officer who arrested this man... you've got a white officer, a black perpetrator, I suppose, or alleged perpetrator, and the suspect ends up dead. That guy's got his knee on his neck. I mean, it looks really bad. And then this spreads on social media. Now there are riots erupting around the country, not even just in Minneapolis.
Also in Los Angeles, there is looting going on. I'd like to get your perspective on that video, from both a social media perspective, a social perspective, and also from the perspective of...

Ted Cruz: Well, what happened in Minneapolis was horrific. It was wrong. I've watched that video, and listen, anytime you have an incident with police, sometimes the social media mob is quick to demonize the police officer, and I've long advocated we should wait for the facts to play out. That being said, I watched that video, and you had a man in handcuffs, face on the pavement, with an officer's knee on the back of his neck pushing it into the pavement. He's gasping for breath, he's pleading that he can't breathe, and the officer continues for eight minutes. That is, on the face of it, police brutality, and anyone who believes in liberty should not want to see authoritarianism and abuse of authority. The police officer has been fired, and the Department of Justice has opened a civil rights investigation. I'm glad they have. Watching that pisses me off. And by the way, you know, the social media mob is quick to paint this... as far as they're concerned, that was Donald Trump with his knee on the back of the neck. Let me be clear, this is Minneapolis, Minnesota. You've got a Democratic mayor, you've got a Democratic governor, you've got Democratic senators. This is bright blue. And we keep seeing these things happen, this abuse of power, often in Democratic strongholds, where people that claim to be interested in defending people's rights are not doing a good job of it.

Michael Knowles: Right. I mean, I think a little bit of perspective here is key, and it's why it's so good to get your opinion on this: you're not only somebody with a Twitter account, but you also happen to know quite a bit about the way the criminal justice system works, having worked in it for a long time. I think that perspective really helps.
We saw a less tragic, much more frivolous example of this also, just a few days earlier, which was this altercation that happened in Central Park. There was a man talking to this woman, and he basically said, put your dog on a leash. She said no. The man, for some reason, had dog treats in his bag just for this sort of occasion, when people don't have their dogs on a leash, and he lures them away with the treats. Then this woman sort of lost her temper and had a little bit of an emotional meltdown, and he started filming it. And what was the end result of this? Nothing happened. I mean, after the videos went public, this woman lost her job and lost her dog, but in the moment itself, it seemed a little bit like much ado about nothing. The social media mob is what made it so much worse, so much more sensational. Is there a world in which we should perhaps hope that maybe social media takes it down a few notches, because of this sort of emotion that it can gin up?

Ted Cruz: Yeah, look, I've got to say I have a little bit of a different take on it. There's no doubt that that incident showed the power of social media, and that an iPhone video can suddenly get millions of views all across the world. I have to admit, the woman who was involved, her behavior was atrocious.

Michael Knowles: Of course. Yeah.

Ted Cruz: This was an individual, an African American individual, who was there bird watching, who, at least on the video we saw, didn't do anything remotely threatening to her. And to watch her be willing to... listen, making a false accusation deliberately is an act of violence. And when this woman, Amy Cooper, picks up her phone and calls 911 and says an African American man is physically threatening me and my dog, she calls in what, by all appearances, is a totally false crime report, and it's a dangerous crime report.
Listen, if you call the police and say someone is physically threatening me, you are asking for law enforcement to show up with guns, and the very real consequence of that could be that the person you're wrongly accusing gets shot and killed. Listen, when officers arrive to an assault in progress, there is a risk of something going really wrong. And she was, in a very real sense, endangering his life. Now, I've got to say, the fact that he had dog treats and was trying to feed her dog... if I'm walking my dog, stay the hell away from my dog and don't give him dog treats. That's a little out there. But the attitudes she expressed were clearly racist. They were clearly wrong. And by the way, the same point I made about Minneapolis holds here. You saw the Twitter mob saying, oh, this is the age of Donald Trump. They blamed Donald Trump for it. Well, then the story comes out that, to the shock of nobody, she's apparently a liberal Democratic donor who has donated to Barack Obama, Pete Buttigieg, and John Kerry.

Michael Knowles: Right, and of course she is.

Ted Cruz: But, to be honest, look, actually there is racism on both sides of the aisle. But many on the left love to jump on a soapbox and moralize.

Michael Knowles: That's right. And you bring up such a great point, because in the instance in Central Park, we shouldn't downplay the woman's atrocious behavior just because you point out that there's more to the story than, you know, she's totally wrong and he's totally right. I mean, maybe some of his behavior was a little odd too; maybe it doesn't justify her behavior. Same thing here: obviously, you can look at what is pretty clearly police brutality in Minneapolis and condemn that as being terrible, and not therefore start defending looting and burning down businesses.
You know, it seems that there's this knee-jerk reaction on social media where we immediately have to take a side and say one person was totally wrong and one person was totally right, when really the situations are much more complex than that, and maybe a little bit of perspective on the legal side, on the policing side, on the social side can help us to understand those things.

Ted Cruz: Well, and listen, police brutality undermines not just the community, but it undermines law enforcement as well. I'm blessed to know a lot of men and women who are law enforcement officers, and they feel that whenever something happens, the mob immediately assumes they're at fault. And there are instances, a lot of instances, where an officer is scared for his life and is protecting himself. And it's one thing I saw... I watched the video, and sometimes these videos don't capture everything that happened. There may have been something that happened before that's not on it, so you have to view all of this with skepticism. And there are times when an officer has to use physical force, and serious force, to subdue a dangerous individual. What made this video so damning is that it lasts eight minutes, and the guy is in handcuffs, and his face is there, and he's gasping for breath, and he's pleading, saying he can't breathe, and this officer doesn't do anything other than keep the knee pressing into his neck. That was grotesque and wrong, particularly when there are multiple other officers around. It's very hard to look at that video and suggest there was any reason that officer was afraid for his safety, as compared to just brutalizing someone who was already immobilized. That is not how law enforcement should operate, and that's not how they usually do. And it's why it's good that the Department of Justice is looking into this, and I very much hope justice is served here.
Michael Knowles: That's right, because the effect of this ultimately is going to be to undermine our faith in these very institutions that would be maintaining law and order, maintaining civil society. It has far wider-reaching consequences than many would admit. Before we go, we've got to get to a little bit of mailbag. First question, you know, a really easy one, I'm sure you can answer this in ten or twenty seconds, from Norman: How should pro-American nationalists think about Hong Kong?

Ted Cruz: You know, a really easy topic, China and Hong Kong. So what China is doing is trying to take over Hong Kong completely, trying to subject it to the communist government's authoritarianism, trying to strip their rights of democracy, trying to strip their rights of free speech. It is a power grab, and it is wrong. And by the way, Hong Kong used to be part of the British Empire, and when it rejoined China, China agreed to have two separate systems and to protect freedom and democracy in Hong Kong. China has now changed its mind, and it's crushing freedom. There are a lot of consequences. We just saw this week the State Department issue a report that Hong Kong is no longer autonomous from China. They did so because of legislation that I wrote. I authored legislation, included in a bigger bill, that directed the State Department to assess whether Hong Kong had real autonomy. We just got the report this week, and I talked with the State Department this week right after they issued it. The consequences of that are significant. There are a lot of things that flow from that: number one, the Treasury Department and sanctions that could easily flow. You know, China wants to use Hong Kong as this sort of free-market bastion to get around the restrictions on China, but at the same time they want to trample freedom there. And tariffs, with the US Trade Representative: the tariffs we have against China, Hong Kong is exempt from those.
I think, given this determination, we will see, I believe, a determination from the President and the White House that will result in some very significant legal consequences, basically ending Hong Kong's special status, because China is no longer honoring the agreement they had.

Michael Knowles: That's a very, very good point, because I guess this gets back to what we were talking about with big tech. Some players in the world are trying to have it both ways. They're trying to get certain special protections when actually they're violating the very basis of those protections. Something to look at in China. Then, finally, maybe the most important question, one that we keep getting recurring here when it comes to criminal justice: what is Ted Cruz's stance on the legalization of marijuana?

Ted Cruz: Leave it to the states. Listen, I have very libertarian instincts, and I have to admit, on pot legalization, over the course of my life I've had different views at different times. There are times when I was for legalization. Personally, I'm not for legalization now, so if there were a referendum in Texas on it, I'd vote against it. I think there are some significant negative consequences that come from it. But I believe in federalism. I believe we've got fifty states, and reasonable people can differ on this, and I think it's perfectly fine to let the states operate as laboratories of democracy to see what happens. And so, I mean, listen, when I was a teenager, I smoked pot. I wrote about that in my book also. You know, when I was in high school and early on in college, I smoked pot a number of times. And it's not something... I hope my kids don't. Thankfully they're nine and twelve; I'm pretty sure they haven't gone there yet. But you know, I wasn't much older than they were when I first tried it, and I don't think it's good for kids to do. But I think we can leave it to the states and let the states sort out if and when it should be allowed, or not.
Michael Knowles: So what I'm hearing, Senator, is we're not going to get one of these Elon Musk-Joe Rogan moments where you pull a joint from off camera and start puffing on screen. We're not going to get that.

Ted Cruz: Well, I will say this. If you remember, when Bill Clinton was running for office, he said he smoked pot but he didn't inhale. I have to admit, even then I was cracking up laughing, thinking, so you're saying you didn't do it, right? Like, if you're going to do it, actually do it. I mean, don't screw it up. Listen, I still smoke cigars. Now, you don't inhale cigars, that's actually not how... and you and I have smoked cigars.

Michael Knowles: That's right.

Ted Cruz: And there, one does not... but no, I will not be lighting up a spliff on this particular podcast.

Michael Knowles: Fair enough. You know, you've reminded me, when you mentioned Bill Clinton: those Democrats, wasting things, so fiscally irresponsible, even when it comes down to something such as that. Much more to get to, but alas, we're out of time, Senator. We will have to pick it up again next time. I'm Michael Knowles. This is Verdict with Ted Cruz.

This episode of Verdict with Ted Cruz is being brought to you by Jobs, Freedom and Security PAC, a political action committee dedicated to supporting conservative causes, organizations, and candidates across the country. In 2022, Jobs, Freedom and Security PAC plans to donate to conservative candidates running for Congress and help the Republican Party across the nation.