Liz Wheeler: Hi guys, Liz Wheeler here. I have today for you a preview of The Cloakroom. The Cloakroom is the series I host with Senator Cruz on Verdict Plus. We break down the nitty-gritty legal aspects of some of the political issues that surround us today. You can join us at any time at VerdictWithTedCruz.com/plus. That's VerdictWithTedCruz.com/plus. This episode today is one of my favorite ones. So the Fifth Circuit Court of Appeals recently ruled against Big Tech. The court said that Big Tech is not allowed to censor conservatives based on conservatives' viewpoints. This legal opinion is absolutely something. This very possibly could get in front of the Supreme Court and have what might be the showdown of the century between the courts, the Constitution, and Big Tech. Senator Cruz breaks this all down and answers the question: is this going to be the end of Big Tech censorship of conservatives? I hope you enjoy this episode. This episode of Verdict with Ted Cruz is brought to you by Field of Greens.
Back in the day, people grew what they ate; fresh vegetables and fruits were the core of their diet. It's what they ate. But as Americans became busier and busier, now we eat premade, processed fast food. You know, the easy stuff, but not very healthy, and definitely not the six cups of veggies and fruits a day. But let me tell you about Field of Greens. Field of Greens is packed with a full spectrum of essential vegetables and fruits, plus science-backed herbs and prebiotics. This is what we need to stay healthy. Field of Greens works fast. You'll have more energy, you'll look and feel healthier, and it can even help you lose weight. Next time you're at the doctor and they compare your old lab work to your new lab work, I bet the doctor will tell you, you crushed it. Join me and take Field of Greens. And to help you get started, I got you fifteen percent off your first order and another ten percent off when you subscribe for recurring orders. Visit FieldOfGreens.com and use promo code CACTUS to collect this deal. That's FieldOfGreens.com, promo code CACTUS. FieldOfGreens.com, promo code CACTUS.
Liz Wheeler: Senator, we have a great topic to talk about today. This made me very excited to see, out of your home state of Texas: it's a potential Supreme Court showdown on Big Tech banning conservatives over our viewpoints. We've all experienced this, anybody who is conservative and outspoken. This is the ruling that came out of the Fifth Circuit Court. I want to jump right into the legal aspect of this. This is the ruling from Judge Andrew Oldham. He says: a Texas statute named House Bill 20 generally prohibits large social media platforms from censoring speech based on the viewpoint of its speaker. The platforms urge us to hold that the statute is facially unconstitutional and hence cannot be applied to anyone, at any time and under any circumstance. In urging such sweeping relief, the platforms offer a rather odd inversion of the First Amendment. That Amendment, of course, protects every person's right to the freedom of speech. But the platforms argue that buried somewhere in the person's enumerated right to free speech lies a corporation's unenumerated right to muzzle speech. The implications of the platforms' arguments are staggering.
On the platforms' view, email providers, mobile phone companies, and banks could cancel the accounts of anyone who sends an email, makes a phone call, or spends money in support of a disfavored political party, candidate, or business. What's worse, the platforms argue that a business can acquire a dominant market position by holding itself out as open to everyone, as Twitter did in championing itself as the free speech wing of the free speech party. Then, having cemented itself as the monopolist of the modern public square, Twitter unapologetically argues that it could turn around and ban all pro-LGBT speech for no other reason than its employees want to pick on members of that community. Today we reject the idea that corporations have a freewheeling First Amendment right to censor what people say. Because the district court held otherwise, we reverse its injunction and remand for further proceedings.
Senator Cruz: So this is an incredibly important decision. So let's take it in several pieces. Number one, let me commend the Texas State Legislature and Governor Abbott.
Senator Cruz: The state of Texas passed serious legislation designed to tackle and stop Big Tech censorship. And Texas is leading, unfortunately, in a way that the federal government under Joe Biden won't do, and unfortunately in a way that Congress won't do. I've been pushing for Congress to do something like this, but Democrats are blocking it in DC. With Schumer and Pelosi running Congress, they won't allow this to move forward. And so the Texas state legislature said, to heck with you, we're going to do it. We're going to protect thirty million Texans, we're going to protect free speech rights. And so it made it illegal for the Big Tech companies to censor based on viewpoint, based on politics, and it created an enforcement mechanism, including the ability for the Attorney General to file an injunctive lawsuit against Big Tech to force them to stop censoring. Now, what happened is Big Tech, which has more money than Midas, it has unlimited cash, they print gold. What did they do? They hired an army of lawyers and filed lawsuits seeking to stop the bill, and they succeeded. They got a district court to agree with them.
Senator Cruz: This is the appeal from that injunction. And this opinion is masterful. This is a very, very serious judicial opinion. Let me stop for a second and give a little bit of color on a couple of asides. Number one, the judge who wrote it is a judge named Andy Oldham. I've known Andy for some time. Andy was a law clerk to Judge David Sentelle on the DC Circuit, who was one of the top conservative appellate judges in the country. He was then a law clerk to Justice Sam Alito on the US Supreme Court. Andy is a veteran litigator, and in fact he was in my old office. As you know, I was the Solicitor General of Texas for five and a half years. Well, Andy was in that office after I left. He didn't work for me, but he was Deputy Solicitor General of Texas and did an excellent job. And in fact, after being Deputy SG, he went on to be General Counsel in the Governor's Office for Greg Abbott. And so when Donald Trump was president, there were multiple vacancies on the Fifth Circuit.
Senator Cruz: And the way it works with judicial appointments is that every federal judicial appointment in the state of Texas has to get my sign-off and has to get the sign-off of John Cornyn, and the two of us have a very serious and thorough vetting process. In fact, we have a bipartisan Judicial Evaluation Committee that plays an important role in it. But Andy is someone whom I enthusiastically supported President Trump nominating. Cornyn and I both recommended to President Trump that he nominate him. And for that matter, Greg Abbott, my old boss and good friend, was effusive. Abbott called me and said, this guy is a rock star, you want him on the Court of Appeals. President Trump nominated him, we confirmed him, and I've got to say, look no further than this opinion for confirmation as to why that was an incredibly important decision. This opinion is over one hundred pages long. It is careful, it is thorough, it is scholarly. One other odd bit of commentary: when this opinion came down, I really wanted to read it.
Senator Cruz: It was Saturday afternoon, and I was home in Houston, and I had taken my daughter Catherine, who's eleven, and one of her friends to the nail salon to get a mani-pedi. So they're both sitting there getting their toes done. You know, Dad's a little, kind of, I'm not terribly at ease in a nail salon with the whole mani-pedi thing. And I've got to say, there are lots of, like, women walking in who are doing double takes, you know, what on earth is Cruz doing sitting in the nail salon? And then they said, hey, if you want to sit with the girls while they're doing this, you can. I'm like, yeah, sure, I'll go sit with them. And so they had me seated in the chair for the mani-pedi thing, which had a whole, like, massage thing in the chair. So I did enjoy the, like, back massage thing; I turned on the auto massage. But I'm sitting there.
Senator Cruz: I was not getting the mani-pedi, so I had my shoes on, needless to say, next to the girls. But I kept thinking, you know, someone is going to video me in this mani-pedi thing and it is going to drive lefty Twitter insane. But while I'm sitting there in the chair with my daughter while she's having a blast, I'm reading this opinion on my iPhone, just sitting there for, like, a half hour, just reading the opinion in the mani-pedi chair while Catherine got her toes done.
Liz Wheeler: I was going to ask you if you got a mani-pedi, but then I thought, you know what, I would already know if he did, because if someone was painting his fingernails, there would have been a photo of that on Twitter already. So the question was moot.
Senator Cruz: Now that is true. Nope, I use clippers and do it myself. That's sufficient. So let's talk about the opinion. The opinion is really impressive. So one of the things to understand: this law has not gone into effect in Texas yet. There are two ways, generally, to challenge a law as unconstitutional.
Senator Cruz: One way, and the way a law is typically challenged as unconstitutional, is what's known as "as applied," which is: a law is in effect, and the government comes and tries to enforce it against you in some specific context, and you go file a lawsuit and say, enforcing this law against me, as applied under these facts right here and now, is unconstitutional. That's the way the vast majority of lawsuits about constitutionality are adjudicated, and we talked about that distinction in our last episode. But the other way to challenge it is what was happening here, and it's what's called a facial challenge. It's a challenge that is often, and in this case was, brought before the law even goes into effect. A facial challenge is a much harder thing to prevail on. A facial challenge is saying there is no circumstance in which this law can constitutionally be applied. It is on its face unconstitutional; it doesn't matter how it's applied, this law cannot stand. The most frequent area where a facial challenge is allowed is in the First Amendment context.
Senator Cruz: And there's a doctrine called First Amendment overbreadth, which is that a law will chill speech, it will have such a chilling and deterrent effect that people will refrain from speaking. So this was a facial challenge and an overbreadth challenge under the First Amendment. And by the way, Big Tech had lots of high-priced lawyers who made lots of vigorous arguments about this. What the Fifth Circuit opinion that Judge Oldham wrote does is go systematically through and say, number one, look, this doesn't chill, this law going into effect doesn't chill, any speech. And in fact, Big Tech is not asking for a right to speak; they're asking for a right to censor. Nobody is stopping Big Tech from saying whatever it wants. What this law is saying is that Big Tech cannot silence other speakers. And so it goes through systematically and asks, well, is there a right to censor that is separate from the right to speak? And one of the arguments Big Tech made, they said, well, we're a publisher, and we're engaged in editorial decisions about what to allow other speakers to say and not say.
Senator Cruz: And there is a whole line of cases where, for example, newspapers can choose what op-eds to run and what op-eds not to run. But the opinion goes through systematically and distinguishes those cases and says, well, that's not what Big Tech is doing. They're not holding themselves out as a publisher and saying, we're choosing, we're only running the speech we choose. And in fact, Big Tech routinely represents to Congress, to its consumers, to everybody else, that they're an open marketplace of ideas and they're not responsible for the content of what people say. And in fact Congress, in Section 230 of the Communications Decency Act, explicitly said Big Tech is not a publisher and they're not responsible for the content of what people say. The Fifth Circuit reasonably interpreted that to say, well, they can't claim to be a publisher and claim not to be a publisher simultaneously. Pick one or the other; you don't get to have your cake and eat it too. There's something else the Texas legislature did, which is it regulated them as what's called a common carrier. Now, what is a common carrier?
Senator Cruz: And the opinion does a very good job of going through the history of this. Common carriers initially came up in the transportation context: people who ran, say, a ferry boat across, let's say, a river or a lake. And early, early on, there was common carrier legislation that was understood as being permissible, that said, if you're running a ferry boat, you can't just arbitrarily decide, Liz, I don't like you, so you can't cross the river. That if you hold yourself out to the public as a common carrier, meaning "I carry the public," then you can't discriminate and say, I'll only carry you, but not you. That doctrine has been around for hundreds of years. It's also been applied very specifically in terms of communication, and again the opinion goes through the history of it. It was first applied with the telegraph, where you had telegraph lines, and it used to be that the telegraph companies did discriminate against speech. The telegraph companies were owned by partisans, and so they would suppress, for example, election information that was contrary to their partisan leanings.
Senator Cruz: They wouldn't transmit a telegram that said, hey, hypothetically, the Democrat won here, or the Democrat is winning here, or the Republican won here or is winning there. They would suppress that speech because they disagreed with the politics of it. And Congress came in and regulated it and said, look, if someone sends a telegram, you've got to send the damn thing, whether you agree with it or not. You're a common carrier; carry the message. All of this the Fifth Circuit walked painstakingly through and said, listen, Big Tech today, they are monopolists, they are common carriers. That's how the state regulated them, and you can provide an equal-access rule. This opinion will be challenged; it could easily go to the Supreme Court. But I've got to say, reading this opinion, I would encourage, you know, we did this before on Dobbs. Dobbs was another very, very serious opinion, by Justice Alito. I would encourage people to read this opinion. It is fascinating and exceptionally well done.
Liz Wheeler: Some of the industries that have been regulated as common carriers, that's happened via Congress, meaning our federal Congress.
Liz Wheeler: Yeah, this happened via the state of Texas. Will it become an issue in any way that it was done by the state and not the federal Congress?
Senator Cruz: So, I don't think so. It is true that common carrier legislation, by and large, a lot of it, certainly dealing with, for example, telecom or railroads, is at the federal level. But the question is whether it's permissible under the First Amendment, and the First Amendment analysis is not different whether it's Congress regulating or the state regulating. So we've talked before on the Cloakroom about how the Supreme Court has incorporated the First Amendment against the states. The First Amendment, by its own explicit text, only applies to Congress; "Congress shall make no law" is how it begins. But the Supreme Court has interpreted the free speech protections exactly the same against state and local government. So if it is unconstitutional under the First Amendment for Congress to pass a law restricting speech, it's equally unconstitutional for the state to do that.
Senator Cruz: In this instance, if it would be constitutional for Congress to regulate Big Tech as a common carrier and put on it the same obligations of nondiscrimination that they put on telephone companies or telegraph companies, then it's equally constitutional for the states to do it. The First Amendment doesn't apply differently, and so the principle is the same. And so I don't think that, look, could Big Tech try to come up with some argument on that? I suppose, but I can't think of a good one sitting here.
Liz Wheeler: What is Big Tech's appeal going to be based on? What's their argument against this ruling?
Senator Cruz: That it violates the First Amendment; that they are publishers and they are choosing what speech to transmit and what speech not to. And so, for example, there was a case where the Supreme Court considered a law out of Florida that said if a newspaper published an editorial that was critical of a political candidate, that candidate had a right to respond in equal space.
Senator Cruz: And the Supreme Court said, you can't do that, that's unconstitutional; the paper is exercising its First Amendment rights in deciding what editorials to publish. And it also talked about, in the world of overbreadth, that there was a chilling effect there, that what it could do is discourage papers from talking about politics altogether. Because, look, space in a newspaper costs money, and there's a finite amount of space. And so if a newspaper knows, if we write about politics, we've got to give equal space to the other side, that's giving away free newspaper space, that costs us revenue, then a rational thing for a newspaper to do is, let's just not talk about it altogether. And the Supreme Court said, well, that's chilling speech, that's discouraging speech. One of the things the Fifth Circuit's decision says here is, look, Big Tech is very different from a newspaper. There's not a finite number of tweets. It's not like allowing someone to tweet something they disagree with takes away some other tweet they want. It is essentially infinite. And Big Tech is not purporting.
They don't claim to be endorsing 320 00:19:34,560 --> 00:19:37,160 Speaker 1: what's said on their platform. They don't claim to be 321 00:19:38,000 --> 00:19:41,679 Speaker 1: doing what an editorial page does. They claim to be 322 00:19:41,920 --> 00:19:44,959 Speaker 1: the public square. The whole predicate of Section two thirty is, 323 00:19:45,240 --> 00:19:47,760 Speaker 1: it's not our fault. It's not us speaking, it's somebody 324 00:19:47,800 --> 00:19:50,560 Speaker 1: else speaking. And I thought one of the 325 00:19:50,640 --> 00:19:57,639 Speaker 1: more insightful and clever aspects of the opinion was taking 326 00:19:57,720 --> 00:20:04,920 Speaker 1: the premises behind Section two thirty and essentially flipping them 327 00:20:04,960 --> 00:20:08,679 Speaker 1: against Big Tech and saying, look, you guys have argued 328 00:20:08,760 --> 00:20:11,760 Speaker 1: very persuasively, you're not publishers. Okay, you're not publishers. Well, 329 00:20:11,800 --> 00:20:15,800 Speaker 1: then don't pretend to be publishers. You are a common carrier. 330 00:20:15,840 --> 00:20:19,320 Speaker 1: You're just a vehicle. AT&T is not a publisher. 331 00:20:19,359 --> 00:20:22,080 Speaker 1: If I call and tell you something, AT&T 332 00:20:22,280 --> 00:20:24,199 Speaker 1: is not responsible for what I said to you. The 333 00:20:24,240 --> 00:20:28,399 Speaker 1: same thing if I tweet something at you: a common carrier, likewise, 334 00:20:28,440 --> 00:20:30,399 Speaker 1: it's not their fault. And one of the things they 335 00:20:30,400 --> 00:20:34,080 Speaker 1: pointed out: they said, you know what, if Big 336 00:20:34,119 --> 00:20:40,880 Speaker 1: Tech disagrees, they can say so. So if I tweet 337 00:20:41,000 --> 00:20:45,960 Speaker 1: something that they don't like, if I tweet, there is 338 00:20:45,960 --> 00:20:51,919 Speaker 1: a difference between boys and girls. Big Tech disagrees with that.
339 00:20:52,920 --> 00:20:55,200 Speaker 1: The Fifth Circuit said, you know what, nothing stops Twitter 340 00:20:55,320 --> 00:21:00,480 Speaker 1: from attaching, appending to my tweet: this is a horrible, hateful, 341 00:21:00,800 --> 00:21:04,800 Speaker 1: transphobic statement that we cannot stand and we disagree with, and 342 00:21:04,840 --> 00:21:07,119 Speaker 1: we think boys and girls are exactly the same, and 343 00:21:07,160 --> 00:21:09,159 Speaker 1: no one could possibly think to the contrary. If they 344 00:21:09,200 --> 00:21:14,320 Speaker 1: say that, nobody's stopping them from speaking. And by the 345 00:21:14,400 --> 00:21:19,280 Speaker 1: way, they do that. They're now flagging things they 346 00:21:19,320 --> 00:21:22,119 Speaker 1: disagree with, usually anything right of center, saying, you know, 347 00:21:22,280 --> 00:21:25,320 Speaker 1: this is dubious, this is false. So they're 348 00:21:25,320 --> 00:21:29,080 Speaker 1: doing it right now. The Fifth Circuit says, well, gosh, if 349 00:21:29,080 --> 00:21:32,080 Speaker 1: you want to speak, you can. But you're not claiming 350 00:21:32,080 --> 00:21:34,399 Speaker 1: a right to speak. You're claiming a right to censor. 351 00:21:34,480 --> 00:21:37,560 Speaker 1: You're claiming a right to silence another voice that you 352 00:21:37,600 --> 00:21:42,000 Speaker 1: don't like. That's altogether different. That's a really important distinction. 353 00:21:42,040 --> 00:21:45,240 Speaker 1: I think Big Tech is definitely trying to have their 354 00:21:45,280 --> 00:21:49,640 Speaker 1: cake and eat it too. So they argue, we are 355 00:21:49,720 --> 00:21:53,880 Speaker 1: not a publisher for purposes of liability law, for purposes 356 00:21:53,880 --> 00:21:57,240 Speaker 1: of defamation, for purposes of Section two thirty. But we 357 00:21:57,320 --> 00:22:00,560 Speaker 1: are a publisher for purposes of the First Amendment.
So 358 00:22:00,600 --> 00:22:05,600 Speaker 1: they're arguing it's different sources of law. It's still pretty 359 00:22:05,640 --> 00:22:10,040 Speaker 1: damn inconsistent, and I think this opinion quite rightly points 360 00:22:10,040 --> 00:22:14,200 Speaker 1: out the incoherence of it: you either are or you aren't. And 361 00:22:15,760 --> 00:22:18,360 Speaker 1: I think it does so very effectively. The question, then, 362 00:22:18,480 --> 00:22:20,760 Speaker 1: is, is this going to reach the Supreme Court, and 363 00:22:20,920 --> 00:22:22,800 Speaker 1: how is this Court going to rule on it if so? 364 00:22:24,880 --> 00:22:27,000 Speaker 1: I think it's quite likely to reach the Supreme Court. 365 00:22:27,080 --> 00:22:33,520 Speaker 1: This is a major, major constitutional issue. There's an enormous 366 00:22:33,520 --> 00:22:36,720 Speaker 1: amount of money behind these lawsuits. It is a law 367 00:22:36,800 --> 00:22:43,880 Speaker 1: that affects directly thirty million Americans, thirty million residents of Texas, 368 00:22:45,040 --> 00:22:47,960 Speaker 1: you know, just under ten percent of the US population. 369 00:22:48,840 --> 00:22:51,520 Speaker 1: It is a law that will have dramatic impact on 370 00:22:51,640 --> 00:22:55,680 Speaker 1: Big Tech nationally. If they're not allowed to discriminate in Texas, 371 00:22:57,240 --> 00:22:59,440 Speaker 1: it could well force them to change how they behave 372 00:22:59,520 --> 00:23:04,280 Speaker 1: to people all over the country. So I think it's 373 00:23:04,359 --> 00:23:09,359 Speaker 1: quite likely the Court takes this case. It's not a certainty, 374 00:23:09,560 --> 00:23:11,960 Speaker 1: but if I were a betting man, and I am a betting man, 375 00:23:12,600 --> 00:23:14,760 Speaker 1: I would bet yes, that they would take the case.
376 00:23:15,960 --> 00:23:23,240 Speaker 1: And, you know, the Court had stepped in before 377 00:23:23,680 --> 00:23:28,680 Speaker 1: and disagreed with the Fifth Circuit on staying the application 378 00:23:28,760 --> 00:23:33,280 Speaker 1: of the District Court's injunction. That was at a preliminary stage. 379 00:23:33,320 --> 00:23:36,679 Speaker 1: And we've talked before about how a number of the 380 00:23:36,760 --> 00:23:42,600 Speaker 1: justices on the Court are minimalists and incrementalists. That prior decision, 381 00:23:42,800 --> 00:23:46,440 Speaker 1: I don't think it at all foreshadows where this decision 382 00:23:46,440 --> 00:23:50,720 Speaker 1: on the merits goes. This is now a major, serious 383 00:23:50,760 --> 00:23:57,760 Speaker 1: decision on the merits of the overbreadth challenge. I like 384 00:23:57,960 --> 00:24:02,439 Speaker 1: the chances of five or even six justices at the 385 00:24:02,480 --> 00:24:09,960 Speaker 1: Court agreeing with the Fifth Circuit on this case. Oh, 386 00:24:10,000 --> 00:24:12,560 Speaker 1: I can't wait. I can't wait to see how this unfolds. 387 00:24:12,600 --> 00:24:15,040 Speaker 1: I hope that that is correct. Five or six justices. 388 00:24:15,080 --> 00:24:19,520 Speaker 1: That's a pretty hefty majority right there. Okay, I do 389 00:24:19,600 --> 00:24:21,440 Speaker 1: want to go over to the mailbag for a second, and 390 00:24:21,480 --> 00:24:24,680 Speaker 1: I want to ask, this is a very culturally hot topic. 391 00:24:25,160 --> 00:24:28,000 Speaker 1: Chrissy Teigen, she's a mega celebrity, 392 00:24:28,040 --> 00:24:30,280 Speaker 1: the wife of John Legend. I'm sure you're familiar with her. 393 00:24:30,440 --> 00:24:33,760 Speaker 1: She tragically lost her son halfway through her pregnancy two 394 00:24:33,800 --> 00:24:36,600 Speaker 1: years ago. She had a placental abruption. Essentially, she was 395 00:24:36,680 --> 00:24:39,640 Speaker 1: bleeding out.
They had to induce her labor. He wasn't 396 00:24:39,720 --> 00:24:42,959 Speaker 1: yet viable. She miscarried. She publicly posted the pictures. They 397 00:24:43,000 --> 00:24:47,200 Speaker 1: were heartbreaking pictures. Fast forward two years to right now: 398 00:24:47,520 --> 00:24:53,359 Speaker 1: Chrissy Teigen announces that she recently realized that her miscarriage 399 00:24:53,480 --> 00:24:57,200 Speaker 1: of two years ago was actually an abortion. She said 400 00:24:57,200 --> 00:24:59,960 Speaker 1: she realized this in the wake of Dobbs versus Jackson 401 00:25:00,080 --> 00:25:03,280 Speaker 1: Women's Health, which overturned Roe v. Wade. Do you 402 00:25:03,320 --> 00:25:07,080 Speaker 1: have a response to Chrissy Teigen? If there's a medical 403 00:25:07,160 --> 00:25:11,280 Speaker 1: procedure in that context, it's not an abortion. And it 404 00:25:11,359 --> 00:25:13,480 Speaker 1: is the law in all fifty states, and it should 405 00:25:13,520 --> 00:25:16,040 Speaker 1: be the law in all fifty states, that 406 00:25:16,160 --> 00:25:18,680 Speaker 1: doctors can intervene to save the life of the mother, 407 00:25:18,840 --> 00:25:22,440 Speaker 1: even if it means tragically losing the child. 408 00:25:22,440 --> 00:25:28,520 Speaker 1: There is nobody, even the most robust pro-life advocates, 409 00:25:28,560 --> 00:25:31,639 Speaker 1: nobody argues that when the woman's life is 410 00:25:31,680 --> 00:25:35,840 Speaker 1: in danger, you can't take extraordinary medical steps 411 00:25:35,880 --> 00:25:40,040 Speaker 1: to preserve the mother's life. And so in those circumstances, 412 00:25:40,960 --> 00:25:47,080 Speaker 1: she may want to characterize it as an abortion.
413 00:25:47,840 --> 00:25:52,480 Speaker 1: In this political context, but she described it at the 414 00:25:52,520 --> 00:25:56,919 Speaker 1: time as a miscarriage, and it certainly sounds 415 00:25:56,960 --> 00:25:59,879 Speaker 1: like that was an accurate description, seems to me. She 416 00:26:00,040 --> 00:26:03,159 Speaker 1: described it in quite some detail. Very tragic, every parent's 417 00:26:03,240 --> 00:26:07,520 Speaker 1: worst nightmare. But to now characterize it as an abortion 418 00:26:07,560 --> 00:26:11,600 Speaker 1: for politics also seems to add to that tragedy. Senator, 419 00:26:11,760 --> 00:26:15,080 Speaker 1: thank you for this legal analysis on the Fifth Circuit 420 00:26:15,119 --> 00:26:19,800 Speaker 1: Court ruling on Texas HB, or House Bill, twenty. This 421 00:26:19,880 --> 00:26:23,159 Speaker 1: is the kind of legal stuff that I like the 422 00:26:23,240 --> 00:26:25,840 Speaker 1: best that we do on the Cloakroom. It's 423 00:26:25,880 --> 00:26:27,920 Speaker 1: diving into these issues, not just in tweet form, 424 00:26:27,960 --> 00:26:29,920 Speaker 1: not just in an op-ed, but really getting into 425 00:26:30,000 --> 00:26:33,760 Speaker 1: the background of what underpins this big fight. Big Tech 426 00:26:33,840 --> 00:26:35,320 Speaker 1: is one of the big fights of our time, and 427 00:26:35,359 --> 00:26:37,120 Speaker 1: it's gonna be really fun to see how this plays out. 428 00:26:37,160 --> 00:26:39,560 Speaker 1: So it was good, it was good sitting here with 429 00:26:39,560 --> 00:26:42,119 Speaker 1: you tonight. I'm Liz Wheeler. This is the Cloakroom on 430 00:26:42,240 --> 00:26:42,840 Speaker 1: Verdict Plus.