June Grasso: You're listening to Bloomberg Law with June Grasso from Bloomberg Radio. In the face of more and more disinformation on different platforms, what can be done to ensure that American voters can make informed decisions in free and fair elections? Joining me is Richard Hasen, a professor of law and political science at the University of California, Irvine. His new book is Cheap Speech: How Disinformation Poisons Our Politics and How to Cure It. It's hard to believe that more than a year after the presidential election, there are still so many people who believe that the election was stolen from former President Donald Trump, despite all the court rulings to the contrary. How did he manage this, or how did this come about?

Richard Hasen: Well, a big reason for the continued persistence of belief in the false claim that the election was stolen is that Donald Trump, the former president, was able to communicate this idea directly to voters and to others through his platforms on social media.
If we lived in the old technological days, when there were just a few television stations, some newspapers, and a few other ways of getting news, it would have been hard for someone like Trump to have a platform where he could, in an unmediated way, repeat the false claim. But in our new social media, cable news environment, it's very easy to have repetition of these claims. By one count in the New York Times, Donald Trump went to Twitter about four hundred times between Election Day in early November and January 6 to make the false claim that the election was stolen. And so I think you can draw a pretty straight line from Trump's statements to both the belief among some Republicans today that the election was stolen, as well as the January 6 insurrection. You may remember it was Donald Trump who first on Twitter invited his supporters to a, quote, "wild" time in Washington, D.C. on January 6, the date Congress counted the Electoral College votes.

June Grasso: Describe what cheap speech is.

Richard Hasen: So the term cheap speech is not originally mine.
It originates with Eugene Volokh, a professor at UCLA. He was writing back in 1995 in the Yale Law Journal, in an article called "Cheap Speech and What It Will Do," about the upcoming information revolution we were about to have, where we go from that scarcity of means of communicating, just a few television and radio stations and some newspapers, to a flood of speech. And Volokh saw this as a great opportunity. You know, back in the fifties, if you didn't like something in the New York Times, you could write a letter to the editor, and, you know, if you were lucky, it would get printed, and otherwise you could just yell into the wind. And now, of course, everyone can have a platform and can say whatever they like, and the only limit on how much it spreads is how much people want to read or hear what you have to say. So he saw the loss of intermediaries as mostly a good thing for democracy and for opening up channels of communication. And I think there are many good things that have happened with the change to this cheap speech era, but it also has a dark side.
And in the book Cheap Speech, I argue that there's a second meaning to the term, which is that what we have today is a system that privileges low-value speech over high-value speech. And what I mean by that is, I think for voters in elections, one of the highest-value kinds of speech is investigative journalism, by journalists who spend a lot of time and a lot of effort, at great expense, to figure out what's going on and to hold politicians accountable for what they're doing. That's really hard to produce, really expensive to produce, and the economic model that supports it has collapsed as advertising dollars have shifted to places like Google and Facebook. But creating literally fake news, you know, making things up and putting them on a nice-looking website that looks like the news, that's very cheap to do, and there's actually some demand for that now, for a bunch of complicated reasons. And so we live in a cheap speech system where not everything that voters should have easy access to is available, and lots of things that might distract voters or mislead voters are very easy to get.
And so with the loss of intermediaries, it's harder for voters to make competent decisions.

June Grasso: It strikes me that someone who watches only one news channel, let's say Fox, comes away with different information, a different viewpoint, than someone who watches, let's say, CNN. And as you write in your book, there's no Walter Cronkite anymore, no trusted source of information that people can go to.

Richard Hasen: Right. So, you know, it's just a totally different atmosphere, and the question is, you know, how do you get voters the good information that they need when we're all not just having different opinions about things, but a different set of facts about what's going on.

June Grasso: Talking about your book Cheap Speech: there are fears expressed all the time that we're losing our democracy. Is the threat to democracy from cheap speech more dangerous, or is it the threat from people who are taking over some local and state governments' election systems, you know, the conspiracy theorists, the believers in Trump having won the election? Is that more dangerous than the cheap speech?
Richard Hasen: Well, I wouldn't separate the two, because I think one of the reasons why we had the January 6 insurrection, why there are candidates running for offices that run elections who now say that the election was stolen despite all reliable evidence to the contrary, is because we live in this media environment in which there is no penalty for making these kinds of statements. Not a government penalty; I mean a penalty in the court of public opinion. People don't just laugh away conspiracy theories and theories of stolen elections. But when you have a political system where millions of people don't believe that the election was legitimately run, they might be more willing to tolerate an attempt to steal back the election the next time around. And so, you know, I really think that before we had the social media revolution there was already polarization, but social media has made it more polarized. We already had some people who don't have full respect for our democratic process, but the media environment makes it easier to spread those ideas and also for people to find each other.
So one of the things that we saw about January 6 is that not only were there people who embraced the conspiracy theories about the election being stolen, but they could use Facebook groups to find each other and organize. Right? With the communications revolution, people from all over the country could find each other. There may be only one in a hundred thousand people who are willing to do something, but when you can band them all together from across the country, they could be much more powerful than, you know, if people were trying to find each other in old-fashioned ways. And I'm not saying that we should return to the days of Walter Cronkite. I mean, there was a lot of important news that wasn't covered. And think about the George Floyd racial justice movement: much easier to organize when you have the ability to share information, to share videos of police brutality, et cetera. But we have to think about what we can do to give voters the tools that they need to be able to get accurate information, to understand what's going on in our elections and our democracy.
June Grasso: There are some who say more speech is better, more speech helps. Do you think that there should be laws limiting speech?

Richard Hasen: So generally speaking, I don't favor laws limiting speech. I have only two small exceptions to that in the proposals in my book Cheap Speech, both of which the Supreme Court has indicated are constitutional. One is laws that limit foreign interference in American elections. The Supreme Court, in a case called Bluman v. Federal Election Commission, affirmed a lower court ruling that such laws don't violate the First Amendment. The other is a set of laws that would ban empirically verifiable false election speech, so if you lie about when, where, or how people vote. We have a case that's pending right now about a person who lied, saying that you could vote by text, and he directed those messages to African American voters. In that election, about five thousand people tried to vote by text or hashtag, and we don't know how many of them ultimately realized the truth and were able to vote. That sort of speech should be, I think, outlawed. That's very, very narrow.
That doesn't apply to most of the kinds of troubling statements about elections that we hear. Most of what I want to do is to enhance what voters know. So, for example, I want to improve our disclosure laws so you know if you're actually being targeted. To use an example, in Alabama we had some Democratic operatives who pretended to be very conservative Baptists supporting Roy Moore, claiming that Roy Moore would want to get rid of alcohol in the state of Alabama, as a way of trying to get these Republican voters not to vote. I think that Alabama voters would want to know that these "conservative Baptists" were actually liberal Democrats, or that it wasn't African Americans trying to convince people that Hillary Clinton doesn't believe in Black Lives Matter; it was Russian operatives in a boiler room in St. Petersburg. So improved disclosure, as well as labeling altered videos, so-called deepfakes, as altered, so that people know what they're seeing, whether they can believe what they see with their own eyes. Tools like that on the legal side, I think, would be the most helpful.
June Grasso: In the past, the risk of stolen elections seemed to be minimal. You know, you'd hear a case here and there, in North Carolina, wherever. Are the risks greater today?

Richard Hasen: I think that the amount of election fraud, whether committed by voters or by others, is quite low. There are isolated instances; we see it in every election, a ballot here, a ballot there. This was the most watched and investigated election, I think, in American history, and it turned up very few cases of fraud. You know, the idea is that the fraud is going to be so massive that it's going to swing elections, but so well done that it can't be detected, and it's just not realistic. And, you know, when we see fraud in elections, it tends to be in small elections where moving a few votes could make a difference. I certainly think we need to continue to be on guard against these kinds of attempts to manipulate election outcomes.
But many of the laws that are being passed in the name of preventing voter fraud are really more about pleasing the Republican base, which has been told that our elections are rife with fraud, and that evokes a kind of response from these elected officials.

June Grasso: When I think about the next presidential election, I just wonder what's going to happen. Is it going to be like elections before, or are there going to be outrageous claims, all kinds of problems at election sites and in counting votes? What has to be done now to make it as normal as possible, to avoid some of these problems?

Richard Hasen: Well, in Cheap Speech, I argue that there are some legal steps we could take to improve the information atmosphere for our elections. But that's only one piece of what we need for free and fair elections, elections that people will accept as legitimate the next time. In a piece that just posted at the Harvard Law Review Forum, I advocate a number of steps to avoid election subversion in future elections.
That's where the election loser might try to get declared the election winner, as we saw Trump try to do in 2020, when he tried to manipulate the Electoral College process to turn his defeat into a victory. I mean, there are a number of things that we could do legally, unrelated to speech. For example, Congress could pass a law that would require every state to use voting machines that produce a piece of paper, ballots that could be recounted by hand in the event that there is a problem with how an election is run. You may remember Donald Trump claimed that the Georgia election was fraudulent, and they did a full hand recount of every ballot in the state. You can't do that with fully electronic machines. So there are a number of things that could be done in terms of ballot transparency, voting rules being transparent, and also fixing the Electoral College rules that Congress applies.
There's some talk of a bipartisan compromise there. But I think beyond that, it's going to have to be not just legal change but social and political organizing, to find those people in the center who might disagree about all kinds of things politically but agree that we should have free and fair elections, where the results are not dictated by whoever is counting the votes but instead reflect the will of the people.

June Grasso: I want to talk for a moment about Elon Musk taking over Twitter. He is a free speech absolutist. He has said that there should be minimal interference: if it's a gray area, let the speech exist. How do you think his taking over Twitter will change things?

Richard Hasen: Well, first of all, he might say he's a free speech absolutist, but if he wants to run a platform that's actually going to make money, I don't think he's going to let everything on the platform that the First Amendment would allow. So, you know, if there were no moderation of content, for example, our feeds on Twitter would be filled with hate speech, would be filled with pornography, would be filled with a lot of advertising for male enhancement pills.
I mean, it would just be the kind of place that nobody would want to go. And so I think what he means is that Donald Trump and some of the other conservatives who were removed from the platform should be restored. I think that's very problematic. It's a private company; Twitter can make that decision. But I think that voters and consumers can also object to that, and the employees of the company can object by leaving. You know, the tech world is a very competitive world in terms of hiring employees, and if people at these companies object, you're going to lose some of the top talent. And so, you know, we'll see what decisions Elon Musk ends up making. I don't know that it's going to make things better. I don't know that he's in it for the profit, as opposed to as a kind of vanity project. But we'll see how things go. I'm not at all even certain he'll end up owning the platform, given how things have been going.

June Grasso: What are your final thoughts?

Richard Hasen: So, I think American democracy is in a very challenging period.
We have this highly polarized environment in which false claims about elections are easily spread, and I think the question we really need to ask is how we can assure that we have free and fair elections, as well as a system where people can vigorously compete with each other, but where election lies don't take hold when voters are thinking about how to make their decisions.

June Grasso: Thanks so much, Rick. That's Professor Richard Hasen of the University of California, Irvine, the author of Cheap Speech: How Disinformation Poisons Our Politics and How to Cure It.

This week, the unprecedented leak of a draft opinion overturning Roe v. Wade used up all the oxygen around the Supreme Court. Little noticed was an opinion in which the justices were unanimous in deciding that Boston violated the Constitution by refusing to fly a Christian civic group's flag at City Hall while raising the banners of some fifty other organizations.
The justices rejected the city's contention that its flag-raising program was a form of government speech that wasn't subject to the First Amendment, something that Justice Elena Kagan was unequivocal about during the oral arguments.

Justice Elena Kagan: A program that basically now says, and previously, we welcome all comers except for the most reprehensible, discriminatory speech and religious speech. That's what this program is. And why should we understand that to be government speech? To say everything's good except religion?

June Grasso: Joining me is First Amendment law expert Timothy Zick, a professor at William & Mary Law School. Justice Breyer wrote the majority opinion and said the central question was whether the city had created a public forum for private speech. Can you explain that for us?

Timothy Zick: Sure. So when the government opens up a public space, a plaza or something like that, for private speakers to come in, a sort of diversity of views, to let them communicate in that space, then it's opened up, essentially, a forum for private speakers.
And that's distinguished from the government using a space or a property to communicate its own messages. And that's the dividing line in this case.

June Grasso: If the government was using the flagpole to send out its own messages, then it would be government speech, and it would be immune from First Amendment concerns.

Timothy Zick: Exactly. When the government speaks, it gets to decide what it wants to say, what viewpoint it wants to express, what message it wants to communicate, which is quite different from, you know, when the government regulates private speakers. It has to sort of obey the opposite rule, which is to say it cannot discriminate based on what a private speaker wants to say, what viewpoints they want to communicate.

June Grasso: Breyer set out a three-factor test for deciding whether a message is government speech. Was that a test you've seen before?

Timothy Zick: Yeah, those are factors that the court has relied on in previous cases.
So there have been several government speech cases before this one, involving things like monuments that are placed in a public park, or specialty license plates that the state allows citizens to display on their automobiles, or trademarks in the case of federal trademark law. And so what the court was doing here is sort of pulling factors from those precedents and applying them to, in this case, Boston's flagpole policy. So the factors that are relevant here are the history of the expression at issue: have flagpoles and flags been used historically, you might say, by governments to speak? What's the public's likely perception as to who's doing the speaking, who the statement is going to be attributed to? And then the extent to which the government has actively shaped or controlled the expression. And I think it's fair to say, reading Justice Breyer's opinion, that the last element seemed to be the most prominent one. Indeed, he seemed to suggest that Boston had a good case with respect to the first two elements.
But since it hadn't really exercised 325 00:19:51,080 --> 00:19:54,600 Speaker 1: any control as to what flags went on this third 326 00:19:54,640 --> 00:19:58,040 Speaker 1: flagpole until this case, until the Christian flag was presented 327 00:19:58,080 --> 00:20:01,600 Speaker 1: to it, that was the problem. What I'm curious about 328 00:20:01,840 --> 00:20:05,800 Speaker 1: is this was a unanimous decision, yet the district court 329 00:20:05,880 --> 00:20:09,080 Speaker 1: and the appellate court sided with the city. How did 330 00:20:09,119 --> 00:20:12,359 Speaker 1: they both get it so wrong? Yeah, I think again, 331 00:20:12,359 --> 00:20:14,560 Speaker 1: if you look at the court's factors, which is what 332 00:20:14,600 --> 00:20:16,680 Speaker 1: the lower courts were looking at too, there's the sort 333 00:20:16,680 --> 00:20:21,119 Speaker 1: of history of flags used over time. Governments, you know, 334 00:20:21,520 --> 00:20:26,000 Speaker 1: typically, traditionally, have communicated messages and sentiments through flags. There's 335 00:20:26,000 --> 00:20:28,840 Speaker 1: no question about that. And in a place like the 336 00:20:28,880 --> 00:20:31,520 Speaker 1: plaza, which sits right next to City Hall, and the 337 00:20:31,640 --> 00:20:36,159 Speaker 1: third flagpole that flies flags next to the United 338 00:20:36,160 --> 00:20:40,119 Speaker 1: States flag and the state flag of Massachusetts, one might 339 00:20:40,160 --> 00:20:43,200 Speaker 1: attribute whatever goes on that third flagpole to the government. 340 00:20:43,520 --> 00:20:45,800 Speaker 1: You know, two of the three factors then again seemed 341 00:20:45,840 --> 00:20:49,399 Speaker 1: to point in the city's favor.
And, you know, the 342 00:20:49,440 --> 00:20:52,760 Speaker 1: Supreme Court, as I say, seemed to focus more heavily 343 00:20:52,800 --> 00:20:55,000 Speaker 1: on the fact that there seemed to be no selectivity 344 00:20:55,640 --> 00:20:58,760 Speaker 1: or control, that if you're really trying to communicate something, 345 00:20:58,760 --> 00:21:00,760 Speaker 1: one would expect that you'd want to have some say-so 346 00:21:00,800 --> 00:21:04,119 Speaker 1: in terms of which flags go on this third flagpole. 347 00:21:04,720 --> 00:21:08,600 Speaker 1: And, you know, in the fifty instances and hundreds of 348 00:21:08,760 --> 00:21:12,280 Speaker 1: commemorative ceremonies that had occurred up to this point, Boston 349 00:21:12,400 --> 00:21:16,679 Speaker 1: hadn't exercised that kind of editorial control. Has the 350 00:21:16,720 --> 00:21:21,800 Speaker 1: Supreme Court struggled to distinguish government speech from private speech 351 00:21:21,800 --> 00:21:24,720 Speaker 1: in the past? I think it struggles mightily. There's a 352 00:21:24,760 --> 00:21:27,600 Speaker 1: really basic principle here that government couldn't operate if it 353 00:21:27,600 --> 00:21:31,879 Speaker 1: couldn't communicate its own points of view, positions, messages, et cetera. 354 00:21:32,400 --> 00:21:36,000 Speaker 1: But determining what is and what isn't government speech, when 355 00:21:36,040 --> 00:21:38,680 Speaker 1: in fact the government is speaking as opposed to regulating 356 00:21:38,720 --> 00:21:42,560 Speaker 1: private speech, turns out to be a very difficult endeavor. 357 00:21:42,880 --> 00:21:44,960 Speaker 1: And I think that's what you see in a case 358 00:21:45,000 --> 00:21:48,639 Speaker 1: like this one.
And, you know, in Justice Alito's concurrence, 359 00:21:48,960 --> 00:21:52,400 Speaker 1: he tries to provide a more focused test in terms 360 00:21:52,440 --> 00:21:55,240 Speaker 1: of determining when is it that the government is actually speaking. 361 00:21:55,640 --> 00:21:58,880 Speaker 1: Has it developed a message, has it sort of communicated 362 00:21:58,920 --> 00:22:02,440 Speaker 1: that message consistently, or has it deputized private parties maybe to 363 00:22:02,440 --> 00:22:04,639 Speaker 1: communicate it? So there is an effort here in the 364 00:22:04,720 --> 00:22:08,760 Speaker 1: concurrence to try and narrow down the issue, something other 365 00:22:08,800 --> 00:22:12,119 Speaker 1: than these three sort of amorphous factors. But it's just 366 00:22:12,200 --> 00:22:15,040 Speaker 1: a concurrence, and the majority just applies these factors. So 367 00:22:15,119 --> 00:22:17,360 Speaker 1: it's been each case on its own terms and its 368 00:22:17,359 --> 00:22:20,480 Speaker 1: own facts, which naturally leads to what some might view 369 00:22:20,520 --> 00:22:25,359 Speaker 1: as inconsistent results. You mentioned Justice Alito's concurrence, and he 370 00:22:25,440 --> 00:22:29,760 Speaker 1: seemed to be trying to relitigate the Texas case involving 371 00:22:29,760 --> 00:22:33,840 Speaker 1: the Confederate flag on license plates. Tell us more about 372 00:22:33,920 --> 00:22:37,280 Speaker 1: that case and why he has a problem with it. Yeah, 373 00:22:37,560 --> 00:22:40,720 Speaker 1: I think Justice Alito, you know, looking at the Texas 374 00:22:40,880 --> 00:22:44,680 Speaker 1: specialty license plate program, just found it incredible that 375 00:22:44,840 --> 00:22:48,639 Speaker 1: the public would attribute to the State of Texas the 376 00:22:48,720 --> 00:22:51,840 Speaker 1: sort of multitude of specialty license plates that the state 377 00:22:52,119 --> 00:22:56,200 Speaker 1: had allowed.
And they ranged from, you know, sports teams 378 00:22:56,240 --> 00:23:00,200 Speaker 1: outside of Texas being displayed on these license plates to 379 00:23:00,640 --> 00:23:02,600 Speaker 1: let's-all-go-play-golf. You know, just given the 380 00:23:02,640 --> 00:23:05,520 Speaker 1: multitude of messages, it just seemed to him incredible, 381 00:23:05,880 --> 00:23:08,600 Speaker 1: on the attribution part of it at least, that the 382 00:23:08,640 --> 00:23:12,440 Speaker 1: public would associate a lot of those messages 383 00:23:12,480 --> 00:23:15,680 Speaker 1: with the state. So I think he was dubious there. 384 00:23:16,119 --> 00:23:18,399 Speaker 1: And, you know, he is, as you say, trying to 385 00:23:18,520 --> 00:23:22,320 Speaker 1: relitigate the Texas license plate case, cast it as an outlier 386 00:23:22,359 --> 00:23:24,960 Speaker 1: in terms of the scope of government speech, because if 387 00:23:24,960 --> 00:23:27,800 Speaker 1: that's government speech, then he can imagine, you know, lots 388 00:23:27,800 --> 00:23:30,920 Speaker 1: of other things government does also being characterized that way. 389 00:23:31,040 --> 00:23:33,439 Speaker 1: And the danger here is that the government, in order 390 00:23:33,520 --> 00:23:38,240 Speaker 1: to discriminate against private speech and private speakers, will cast 391 00:23:38,280 --> 00:23:41,920 Speaker 1: itself in lots of cases as the speaker to try 392 00:23:41,960 --> 00:23:45,960 Speaker 1: and avoid the First Amendment content neutrality requirement. So he's 393 00:23:46,000 --> 00:23:49,280 Speaker 1: pushing back, you know, very hard against not just that case, 394 00:23:49,520 --> 00:23:52,840 Speaker 1: but, I think, the doctrine of government speech, and trying 395 00:23:52,880 --> 00:23:55,560 Speaker 1: to cabin it in some way, narrow it to its essence.
396 00:23:55,920 --> 00:24:00,200 Speaker 1: In another concurrence, Justice Gorsuch seemed to have a problem 397 00:24:00,200 --> 00:24:04,280 Speaker 1: with the Lemon case. I think both he and Justice Kavanaugh, 398 00:24:04,320 --> 00:24:08,159 Speaker 1: who wrote his own concurrence, are concerned that what animated 399 00:24:08,160 --> 00:24:11,080 Speaker 1: this controversy, what made this a controversy in the first place, 400 00:24:11,240 --> 00:24:14,760 Speaker 1: was a sort of misunderstanding about the place of religious 401 00:24:14,760 --> 00:24:18,600 Speaker 1: speech in the public square and in public life more generally. 402 00:24:18,960 --> 00:24:23,000 Speaker 1: So it seems to them that Boston's officials recoiled from 403 00:24:23,040 --> 00:24:26,119 Speaker 1: the idea of having any kind of Christian flag or 404 00:24:26,160 --> 00:24:29,199 Speaker 1: religious flag put into the mix, even though they allowed, 405 00:24:29,240 --> 00:24:31,680 Speaker 1: like, credit unions to fly a flag, and lots of 406 00:24:31,720 --> 00:24:35,959 Speaker 1: different nationalities and so forth. And Justice Gorsuch in particular 407 00:24:36,040 --> 00:24:39,119 Speaker 1: is tracing that back to sort of an Establishment Clause 408 00:24:39,359 --> 00:24:42,080 Speaker 1: problem, as he sees it, with what's become a very 409 00:24:42,160 --> 00:24:45,280 Speaker 1: unpopular case, Lemon, which sets forth a test for when 410 00:24:45,280 --> 00:24:48,919 Speaker 1: the Establishment Clause is violated. And so there is an 411 00:24:48,920 --> 00:24:52,120 Speaker 1: effort there to sort of go beyond the speech context 412 00:24:52,200 --> 00:24:56,080 Speaker 1: and the government speech principle and shed some light on 413 00:24:56,280 --> 00:24:59,240 Speaker 1: what they think is a related problem.
So, in essence, 414 00:24:59,280 --> 00:25:01,960 Speaker 1: their position, and the court's position, has been: when 415 00:25:02,000 --> 00:25:05,600 Speaker 1: you open a forum like the flagpole, you can't discriminate 416 00:25:05,640 --> 00:25:10,880 Speaker 1: against religious speakers or religious viewpoints. The Establishment Clause doesn't protect 417 00:25:11,080 --> 00:25:13,800 Speaker 1: that kind of activity; it's actually sort of contrary to 418 00:25:14,320 --> 00:25:18,360 Speaker 1: religious freedom to single out religious speakers for special treatment. 419 00:25:18,760 --> 00:25:21,240 Speaker 1: And that's the fight that I think Justice Gorsuch is just 420 00:25:21,320 --> 00:25:24,880 Speaker 1: pointing to in his concurrence. Does he seem to be 421 00:25:25,040 --> 00:25:27,840 Speaker 1: on a sort of mission? It seems to echo a 422 00:25:27,880 --> 00:25:30,880 Speaker 1: little bit of the oral arguments in the case of 423 00:25:30,920 --> 00:25:33,920 Speaker 1: the football coach who wants to pray on the fifty-yard 424 00:25:34,000 --> 00:25:37,000 Speaker 1: line. Yeah. I think religious freedom in general has 425 00:25:37,040 --> 00:25:39,439 Speaker 1: been a very sort of front-and-center part of 426 00:25:39,440 --> 00:25:43,199 Speaker 1: the Supreme Court supermajority's agenda. So they've taken lots of 427 00:25:43,480 --> 00:25:47,880 Speaker 1: free exercise cases and religious freedom cases, and I think 428 00:25:47,920 --> 00:25:51,160 Speaker 1: there is an effort here to move some of the brush, 429 00:25:51,200 --> 00:25:53,800 Speaker 1: if you will, out, and Lemon is one of those 430 00:25:53,840 --> 00:25:57,639 Speaker 1: precedents that's been problematic and criticized for a long time. 431 00:25:57,920 --> 00:26:00,560 Speaker 1: And so, yeah, I think there are echoes of the 432 00:26:00,600 --> 00:26:04,080 Speaker 1: football coach prayer case.
You mentioned several others that have 433 00:26:04,160 --> 00:26:07,920 Speaker 1: been decided recently, the Masterpiece Cakeshop case, for example, 434 00:26:08,200 --> 00:26:12,040 Speaker 1: public funding for parochial schools, all of them based on 435 00:26:12,080 --> 00:26:14,840 Speaker 1: this sort of neutrality principle: that you cannot, under the 436 00:26:14,880 --> 00:26:19,320 Speaker 1: religion clauses, single out religious viewpoints for burdens or different 437 00:26:19,400 --> 00:26:23,040 Speaker 1: rules than you'd apply to other persons or speakers. Nothing 438 00:26:23,240 --> 00:26:27,639 Speaker 1: prevents Boston from changing its policies going forward, and the 439 00:26:27,720 --> 00:26:30,280 Speaker 1: city has said that in the event of a loss 440 00:26:30,440 --> 00:26:35,040 Speaker 1: at the court, it will probably change its policy. What 441 00:26:35,080 --> 00:26:39,200 Speaker 1: would a policy change entail that would make this government speech? 442 00:26:39,840 --> 00:26:42,520 Speaker 1: I think they have to make it clearer, or clear, 443 00:26:42,600 --> 00:26:44,480 Speaker 1: I guess one would say, that this is not a 444 00:26:44,600 --> 00:26:48,800 Speaker 1: forum for just any comers, anyone who wants to display 445 00:26:48,840 --> 00:26:51,360 Speaker 1: a flag. They've got to exercise the kind of control 446 00:26:51,800 --> 00:26:54,840 Speaker 1: that the court says is missing, the editorial control, in 447 00:26:54,880 --> 00:26:58,720 Speaker 1: their policy.
And the court actually points to different flag-flying 448 00:26:58,760 --> 00:27:02,240 Speaker 1: policies from other jurisdictions, like San Jose, where they 449 00:27:02,240 --> 00:27:04,840 Speaker 1: make clear that flagpoles are not intended to serve as 450 00:27:04,880 --> 00:27:07,040 Speaker 1: a forum for free expression by the public, and then 451 00:27:07,040 --> 00:27:09,880 Speaker 1: they list approved flags that can be flown as an 452 00:27:09,880 --> 00:27:13,280 Speaker 1: expression of the city's official sentiment. And that's not what 453 00:27:13,520 --> 00:27:15,800 Speaker 1: the city of Boston was doing. It was opening up this 454 00:27:15,920 --> 00:27:18,639 Speaker 1: space and inviting speakers of all kinds to come in, 455 00:27:18,680 --> 00:27:21,920 Speaker 1: and then shutting the door on just this one particular speaker. 456 00:27:22,480 --> 00:27:25,080 Speaker 1: I mean, the easiest thing for Boston to do is 457 00:27:25,080 --> 00:27:27,960 Speaker 1: simply to fly its own city flag on that third 458 00:27:28,000 --> 00:27:30,000 Speaker 1: flagpole and be done with it. But if they do 459 00:27:30,080 --> 00:27:33,399 Speaker 1: want to open that flagpole up to different flags of, 460 00:27:33,640 --> 00:27:36,439 Speaker 1: you know, non-city entities, they're going to have to 461 00:27:36,440 --> 00:27:39,919 Speaker 1: be much more selective and much clearer about what it 462 00:27:40,000 --> 00:27:45,000 Speaker 1: is they're trying to communicate to the public. Finally, do 463 00:27:45,080 --> 00:27:48,560 Speaker 1: you think that this opinion will help courts in the 464 00:27:48,640 --> 00:27:52,960 Speaker 1: future to be able to distinguish between government speech and 465 00:27:53,160 --> 00:27:57,000 Speaker 1: private speech? Not very much, unfortunately.
And, as someone 466 00:27:57,040 --> 00:27:59,720 Speaker 1: who teaches this subject, you know, I have an interest in 467 00:28:00,080 --> 00:28:02,680 Speaker 1: getting some clarity in terms of, you know, what the 468 00:28:02,800 --> 00:28:05,800 Speaker 1: rules are, what the standards are, in this area. So on 469 00:28:05,840 --> 00:28:08,080 Speaker 1: the one hand, you know, this is a very outlier 470 00:28:08,119 --> 00:28:11,360 Speaker 1: case, in the sense that the governmental entity here, Boston, 471 00:28:11,880 --> 00:28:15,800 Speaker 1: doesn't seem to have exercised really any selectivity and control. 472 00:28:16,840 --> 00:28:20,840 Speaker 1: In most cases you'll find some selectivity, some editorial control, 473 00:28:20,920 --> 00:28:24,560 Speaker 1: by the governmental entity. It may be that the attribution 474 00:28:24,600 --> 00:28:27,879 Speaker 1: piece of this is a little unclear, or the historical 475 00:28:28,080 --> 00:28:30,919 Speaker 1: use of this kind of, you know, forum for speech 476 00:28:31,040 --> 00:28:34,400 Speaker 1: is less clear. But I think all this court has 477 00:28:34,440 --> 00:28:38,080 Speaker 1: done is simply repeat the factors from cases like Walker, 478 00:28:38,160 --> 00:28:40,440 Speaker 1: the license plate case, and Summum, the public 479 00:28:40,480 --> 00:28:42,600 Speaker 1: park case, and courts are going to have to take 480 00:28:42,680 --> 00:28:45,560 Speaker 1: these up one by one. And I think that's part 481 00:28:45,600 --> 00:28:47,800 Speaker 1: of what Justice Alito is reacting to. We need to 482 00:28:47,840 --> 00:28:51,600 Speaker 1: give more clarity in terms of a test for lower 483 00:28:51,640 --> 00:28:54,800 Speaker 1: courts and litigants; otherwise we're just going to keep seeing 484 00:28:54,800 --> 00:28:58,200 Speaker 1: these cases, each on their own facts. Thanks so much.
485 00:28:58,600 --> 00:29:01,800 Speaker 1: That's Professor Timothy Zick of William and Mary Law School, 486 00:29:02,160 --> 00:29:04,520 Speaker 1: and that's it for this edition of the Bloomberg Law Show. 487 00:29:05,000 --> 00:29:07,360 Speaker 1: Remember, you can always get the latest legal news on 488 00:29:07,400 --> 00:29:10,280 Speaker 1: our Bloomberg Law podcast. You can find them wherever you 489 00:29:10,320 --> 00:29:13,720 Speaker 1: get your favorite podcasts. I'm June Grasso, and you're listening 490 00:29:13,800 --> 00:29:14,560 Speaker 1: to Bloomberg