Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. Today we're continuing our Freedom of Speech series, and I'm going to share with you a conversation I had with an expert whom I admire greatly and with whom I very frequently disagree. Eugene Volokh teaches First Amendment law at the UCLA School of Law. He's the author of a casebook about freedom of speech called The First Amendment and Related Statutes. His law review articles and his friend-of-the-court briefs have been cited in numerous Supreme Court cases. He's also the founder and co-author of The Volokh Conspiracy, a leading legal blog which, depending on the issue, tends toward the conservative or libertarian. In the world of legal academia, where I have my day job, Eugene is universally recognized across the range of political opinions as one of the most significant and influential voices about the freedom of speech. I spoke to Eugene back in March.
Noah Feldman: Eugene, I wonder if you would start by telling our listeners how you got interested in the freedom of speech, because your profile up until the time when you did fancy appellate court and Supreme Court clerkships and became a law professor was extremely unusual. So how did you get interested in this issue in the first place?

Eugene Volokh: I've been interested in constitutional law, and in particular free speech law, since I was in my mid-teens. I actually went to law school planning on becoming a prosecutor. And then I realized in law school two things. One is that prosecutors generally didn't do much with law, unlike most lawyers. They did things with facts, as is right. That's what prosecutors should be mostly focused on: figuring out the facts and proving the facts. But I was interested in law, so that meant I'd either be an appellate lawyer or a professor. And then I saw how much fun my professors were having, so I thought, oh, I'll be a law professor. And it turned out that there were interesting things to say about free speech law. I was delighted.
Noah Feldman: Now, Eugene, you say that you were interested in this from the time you were a teenager, but you were not, by any ordinary standard, a normal teenager. You came to the US when you were seven from the then Soviet Union, and then you graduated from college at UCLA with a degree in computer science and math when you were fifteen. What were you doing when you were starting to think about this as a teenager? Was that when you were actively doing the job of a computer programmer?

Eugene Volokh: Yeah. I'd worked as a computer programmer actually ever since I was twelve, and I was happy to be a programmer, and I was excited about my programming career. But at the same time, I was interested in this on the side, as I think so many people are. Right? My guess is a lot of your listeners who are interested in free speech law, or law, aren't lawyers. They're just people who think that an important part of their civic lives and their intellectual lives is getting a better sense of the rules that govern our society.
So I was interested in it then, kind of on the side, and then I realized it's something I might want to turn into a career. Much as I enjoyed being a computer programmer, I wanted to be more directly involved in these big-picture legal and public policy issues. And I got exactly what I wanted, knock wood.

Noah Feldman: Do you think that having come from the Soviet Union, even as a young kid, was relevant to your eventual formation of views in the First Amendment space? I sometimes hear some of my students whose families came from the former Soviet Union talking about how their stronger pro-rights viewpoints, and in some cases even strongly libertarian perspectives, come as a reaction against seeing what communism could do in practice.

Eugene Volokh: Yeah, you know, it's so hard to say, in part because actually lots of positions on free speech issues are very, very far from the communist position. So, you know, I hesitate to ascribe that much to my background, in part because I can't step out of my background. I can't say, well, what if I had been born in the US, what would I have thought? I don't know. That wouldn't be me.
I will say that earlier in my life I actually took a less speech-protective view. I think my views were perhaps somewhat more open to what people would say are reasonable, moderate regulations of free speech. And then over time I came to the view that the government really can't be trusted with even these supposedly reasonable and moderate regulations. But I'm not sure that that was because of my background and my parents' background in communist Russia.

Noah Feldman: You know, it's interesting you mention the question of trusting the government, and I do think that for a lot of people that comes down to how they form their views about free speech. It's: how much do they trust the government to be able to engage in certain forms of reasonable regulation? Obviously that's not the only way to think about it, but do you think that that is the way that one forms views on free expression? Sort of, you know, the more you distrust the government, the more pro free speech you should be?

Eugene Volokh: I think that's right.
I think you've hit the nail on the head there, at least as a practical matter for most people, and I think not just for how most people think about it, but for how they should think about it. Let's take an example. People talk about fake news, and isn't fake news bad? Well, yes. Certainly outright hoaxes are bad. Lies are bad. Even honest mistakes, especially about important subjects such as vaccination or the coronavirus, or foreign policy or a wide range of other things, those are bad. Well, why not allow the government to prohibit those things? Once there's a trial, let's say, or some hearing in which it's proved that something is false. Well, if you trust the government to sort the true from the false, then that could be a big net plus to the quality of public debate.
On the other hand, if you don't trust the government, not because you think the government is always awful, but because government is composed of people, and people aren't always trustworthy, and people have both subconscious biases generally and also are pushed in particular directions by their own political self-interest, then I think you might come to the view that it's better to tolerate a good deal of falsehood than to give the government the power to ban certain things that might be true on the grounds that they're false. I think one way of thinking about it is thinking about history. So let's say there's some particular dispute, say over whether the killing of Armenians during World War One was a deliberate genocide or just the vicissitudes of war. Now, I'm not an expert on the subject. My understanding is that people who've studied it generally do think that this was a genocide. But how do I know that, to the extent I know it, or let's just say to the extent I believe it?
I believe it because historians seem to have come to that consensus, and the best way we can figure out what is true about history, about the social sciences, and especially about philosophy, religion, the arts, and the like is based on the consensus of people who really have studied things closely. But I know that only because I know that they could study all possible opinions on this and hear all opinions and air opinions, even ones that ultimately their colleagues disagree with, and through the continued process of hearing all those opinions, the consensus emerges and remains broadly accepted by historians. If I learned that it's illegal to deny that this was a genocide, then I would lose confidence in that very consensus, because I would no longer think that this is something that historians are coming to after hearing all the arguments, because some of the arguments are now illegal to make.
So it's actually, I think, better for our understanding of truth if people can make statements, even false statements, because only that way can we be sure that all of the arguments have been aired, and that whatever consensus there is is, as best we can tell, an accurate one.

Noah Feldman: You know, it's interesting that in this particular instance, the law in Turkey actually runs the other way. It's a crime to say that it was a genocide. And that leads to a question that I really want to ask you, which is about how the experiences of different countries might be relevant to creating different rules here. It's sometimes said that if you look at the example of Germany or other European countries, where fascism or national socialism or other ideologies actually managed to swamp liberalism and then led to the emergence of totalitarianism, that in those countries there might be a strong pragmatic reason to outlaw some things that are ideas or opinions that the United States permits, for example, prohibiting racism or prohibiting statements that dehumanize people on the basis of their membership in a group.
The argument is something like: those countries have learned through hard experience that they cannot trust the free marketplace of ideas to actually clear and get people not to believe in these terrible viewpoints. To the contrary, when these views were expressed under relatively free circumstances, they actually led to people adopting them. And then sometimes the conclusion is drawn from that that, depending on your national experience, you should be able as a state to outlaw hate speech and to outlaw other forms of racism, or to outlaw political organizations that rely on these points of view. Do you find yourself sympathetic to that at all? Do you think to yourself, well, maybe in the United States we shouldn't do this, but in Germany it's actually appropriate for them to have a law that prohibits the Nazi Party and its symbols?

Eugene Volokh: No, I don't. I should say I specialize in American free speech law. I know a lot about it. I don't know a lot about foreign rules. So I've got to acknowledge that I have limited expertise in foreign law matters.
But if 177 00:10:03,676 --> 00:10:06,276 Speaker 1: you ask me the question, I think the answer is 178 00:10:06,356 --> 00:10:10,876 Speaker 1: that the same reasons that justify skepticism of the government 179 00:10:10,916 --> 00:10:14,476 Speaker 1: here justify skepticism in the government there. And that's true 180 00:10:14,516 --> 00:10:18,196 Speaker 1: if it's the Germans trying to ban Nazi advocacy or 181 00:10:18,236 --> 00:10:22,996 Speaker 1: more broadly, racist advocacy or of racism is potentially very 182 00:10:23,036 --> 00:10:27,916 Speaker 1: broad and ill defined category or supposedly dehumanizing advocacy, or 183 00:10:27,956 --> 00:10:31,436 Speaker 1: for that matter, if Polls or Ukrainians want to ban 184 00:10:31,556 --> 00:10:35,316 Speaker 1: communist advocacy, communism is of course caused as much misery 185 00:10:35,316 --> 00:10:39,796 Speaker 1: in those countries as Nazism caused in Germany, possibly more so. 186 00:10:40,196 --> 00:10:43,036 Speaker 1: You say, well, those countries have concluded, based in the 187 00:10:43,076 --> 00:10:45,996 Speaker 1: judgment of history, that they can't trust the marketplace of ideas. 188 00:10:46,516 --> 00:10:48,556 Speaker 1: But I should think that the judgment of history has 189 00:10:48,596 --> 00:10:51,276 Speaker 1: made it even clearer that they can't trust the government 190 00:10:51,716 --> 00:10:55,076 Speaker 1: policing the marketplace of ideas. That in fact, as I 191 00:10:55,156 --> 00:10:58,596 Speaker 1: understand it from our Germany did try to suppress the 192 00:10:58,676 --> 00:11:00,556 Speaker 1: Nazi Party, which of course was engaged not just an 193 00:11:00,596 --> 00:11:03,396 Speaker 1: advocacy of it in outright crimes, didn't do a great 194 00:11:03,476 --> 00:11:06,036 Speaker 1: job of it. 
And then of course Nazi Germany, and then East Germany, Communist Germany, tried to regulate what they thought was a badly working marketplace of ideas by suppressing liberal democratic advocacy. So the question is always comparative, is it not? Not whether the marketplace of ideas is perfect, or even very good. The question is whether we're likely to get better results by allowing people to say things, even evil things, even wrongheaded things, or by allowing the government to control what it is that people say. And I'm pretty skeptical that giving the government that kind of power is going to be terribly helpful.

Noah Feldman: We'll be right back.

Noah Feldman: Let me ask you about a concrete case where this issue is actually put very directly into play, and on which you and I disagreed. Some years ago, this case took place at the University of Oklahoma. To summarize it, there was a fraternity where two fraternity brothers were taking a group of pledges on a bus ride, and they had them sing a song that basically said, first of all, there will never be an African American (they did not use that term) in our fraternity.
And then it went on to say you could hang an African American from a tree, but they would not join the fraternity with me, namely with a member of the fraternity. So the song effectively insisted that there would be no blacks in their fraternity. It also threatened violence, at least in some way, by invoking lynching. And when this story got out, David Boren, who was then the president of the University of Oklahoma, acted very quickly and sanctioned the students. I believe they were actually expelled. And as I recall it, your view was that, since the University of Oklahoma is bound by the First Amendment, the president had actually infringed on the free speech rights of the fraternity brothers. Was that in fact your view? And if so, do you want to say a few words about why?

Eugene Volokh: Yeah, that was my view. It continues to be my view. I think it was a clear First Amendment violation. I obviously have no sympathy for the particular speech they engaged in, but we should ask ourselves what would be the rule under which the university is allowed to expel students for that kind of speech.
Note it's not even like traditional speech codes, which were limited to speech on campus. This is speech off campus. This was speech being restricted precisely because of the viewpoints that it was expressing. And if the argument is, well, off-campus speech expressing certain views is going to have on-campus effects, well, that's true of a vast range of off-campus speech. That means that if, I take it, a student were to go to a rally for a racist organization, or an organization that's perceived as racist, then presumably there'd be a similar outcry, and it would be similarly justified for them to be expelled for that.

Noah Feldman: Well, may I try out the alternative view?

Eugene Volokh: Well, so tell me, Noah, what rule you would propose under which this speech would be restrictable but other speech would not be, and why you think that rule is sound, but also is politically defensible, is something that's going to actually be maintained as opposed to just leading to more and more calls for restriction.
Noah Feldman: My view would be that, A, this is on a university campus; B, there should be rules, as indeed are required by federal law, that protect against racial discrimination; C, a fraternity is a university-sanctioned organization; and D, what was wrong here was conduct. The conduct was discrimination: discrimination in association with membership in this particular fraternity. They weren't making a decision on membership in that moment, but through the song, they were making it extremely clear that their fraternity, a campus organization, was a racially discriminatory one. I would add to this that there was also a threat of violence. I don't know how serious it was, but nevertheless there was a threat of violence associated with this. And so, under these circumstances, where what's being punished is the conduct, even if that conduct is achieved via words, via singing something, this was a form of discriminatory conduct that was justifiable to regulate.
And then the analogy that I would draw here is to the regulation of workplace sexual harassment, which, as we know, can be rendered civilly unlawful even when it's achieved just by talking. You know, someone who says to his coworker every day, you're unqualified, you know, because you're a woman, you can't do this job, or a range of other discriminatory things. We recognize that the government can sanction that conduct because it's in the workplace, which is an environment that's a little different than being on the street. And even though it's done by words, what we're doing is punishing the conduct, the conduct of discrimination, rather than the words themselves. That would be the argument that I would mount.

Eugene Volokh: Free speech supporters, actually supporters of all sorts of rights, abortion rights, gun rights, and others, often worry about slippery slopes, and I think that that worry is very justified in a legal system such as ours that's built on precedent and analogy. Let's look in particular at the kind of argument that you're making. So first, as it happens, I have long criticized workplace harassment law.
I think, while private employers are entitled under the First Amendment to try to control what goes on in the workplace in the interest of morale, that at least certain aspects of workplace harassment law go too far in coercing employers to do that. But note some of the defenses of workplace harassment law, in fact including in your own argument: it's in the workplace. But this wasn't in the workplace. It wasn't even on campus; it was on a bus. And you were saying, well, somebody telling a female coworker every day that she is unqualified. They weren't saying that to prospective black applicants to the fraternity. Indeed, my understanding is that they did not expect it to leak out. They did not want it to leak out. It's just that somebody recorded it, and that's what alerted the rest of the public to it. So already you're taking one thing which I think is already at the boundary, or perhaps beyond the boundary, of what is acceptable under the First Amendment, which is workplace harassment law.
And now see how the slippage goes: from the workplace, where the theory is the workplace is for working, it's not for public discourse, to college. Well, now it's getting to college, and it's not even on campus, it's off campus, and it's not speech to a person that is offensive to them but speech about a person. So what this comes down to, the conduct argument, is also, it seems to me, a way of taking things that are clearly speech, since what they're expelled for is what they said, and trying to redefine them as conduct. The strongest argument that I can see, and I think you pointed to it in some measure, is that, well, the university could ban exclusion based on race from a fraternity, and that this was somehow a signal that they would do this. It's like you're saying, not even threatening, because again they didn't expect anybody to see it, but you're kind of saying, I will commit this wrong of discrimination. So maybe we can anticipatorily punish you, not for what you've done or what you've been proven to do, but for what we are expecting you to do. Not something we usually do under our legal system.
But even if that's so, what's 324 00:18:28,876 --> 00:18:34,556 Speaker 1: the typical penalty for students discriminating based on race or religion, 325 00:18:34,636 --> 00:18:37,316 Speaker 1: or sex or sexual orientation or whatever else in 326 00:18:37,396 --> 00:18:41,156 Speaker 1: student group membership? I can bet you that it's never expulsion. 327 00:18:41,316 --> 00:18:44,996 Speaker 1: Maybe it's suspension of the group sometimes, but never expulsion, 328 00:18:45,236 --> 00:18:47,316 Speaker 1: which makes it clear that they weren't just saying, well, 329 00:18:47,356 --> 00:18:49,956 Speaker 1: this is an evenhanded no-discrimination-in-group-membership 330 00:18:49,956 --> 00:18:52,436 Speaker 1: rule, we're just applying it to you regardless of what 331 00:18:52,476 --> 00:18:54,436 Speaker 1: you're saying. They were doing it because of what they 332 00:18:54,436 --> 00:18:56,716 Speaker 1: were saying. And then the last thing is the threat 333 00:18:56,756 --> 00:18:59,476 Speaker 1: of violence. Well, again we have to ask, how do 334 00:18:59,516 --> 00:19:03,436 Speaker 1: we deal with threats of violence in songs, generally speaking? 335 00:19:03,796 --> 00:19:06,916 Speaker 1: Let's say somebody is singing "Cop Killer," and let's say 336 00:19:06,916 --> 00:19:11,316 Speaker 1: the university says, ooh, well, because we employ police officers 337 00:19:11,316 --> 00:19:13,316 Speaker 1: and we heard that you were singing "Cop Killer" at 338 00:19:13,316 --> 00:19:16,876 Speaker 1: a party, which we think praises the killing of cops, then 339 00:19:16,956 --> 00:19:19,596 Speaker 1: in that case, we're going to expel you because you're 340 00:19:19,636 --> 00:19:22,916 Speaker 1: creating an unsafe environment for our police officers. I take 341 00:19:22,956 --> 00:19:25,756 Speaker 1: it we'd say, no, that's not a restriction on conduct. That's 342 00:19:25,756 --> 00:19:28,916 Speaker 1: obviously a restriction on speech. 
It's a restriction on speech that may 343 00:19:28,916 --> 00:19:31,516 Speaker 1: be quite offensive, may even be vile, but it's not 344 00:19:31,596 --> 00:19:34,836 Speaker 1: something the university should be doing. That's the consequence, it 345 00:19:34,876 --> 00:19:38,076 Speaker 1: seems to me, of accepting the rationale that the University 346 00:19:38,076 --> 00:19:40,836 Speaker 1: of Oklahoma used here: that all of these kinds of 347 00:19:40,836 --> 00:19:44,636 Speaker 1: speech could equally be restricted using exactly the same arguments. 348 00:19:44,636 --> 00:19:47,756 Speaker 1: Something you said, Eugene, will, I imagine, get the attention 349 00:19:47,756 --> 00:19:49,876 Speaker 1: of some listeners as it got my attention, and that 350 00:19:49,956 --> 00:19:52,156 Speaker 1: was that you were on the edge, it sounded like, 351 00:19:52,356 --> 00:19:57,076 Speaker 1: of saying that workplace sexual harassment law as it's currently constituted, 352 00:19:57,396 --> 00:20:00,876 Speaker 1: where it's possible to hold someone liable for harassment just 353 00:20:00,916 --> 00:20:03,116 Speaker 1: based on things they've said, you know, without any physical 354 00:20:03,116 --> 00:20:07,436 Speaker 1: touchings or other harms, is of questionable constitutionality. Not only 355 00:20:07,476 --> 00:20:10,116 Speaker 1: did I say it now, I said it in nineteen ninety 356 00:20:10,116 --> 00:20:13,236 Speaker 1: two in what was my student note. It was my 357 00:20:13,356 --> 00:20:16,436 Speaker 1: job talk, eventually. So you've consistently held this view for 358 00:20:16,676 --> 00:20:18,996 Speaker 1: more than twenty-five years? Oh right, I've written literally 359 00:20:19,076 --> 00:20:22,436 Speaker 1: half a dozen articles on the subject. 
So, in light 360 00:20:22,516 --> 00:20:26,596 Speaker 1: of the Me Too movement and the raising of consciousness 361 00:20:26,716 --> 00:20:30,596 Speaker 1: around forms of workplace discrimination, has any of that had 362 00:20:30,676 --> 00:20:32,956 Speaker 1: any effect on your view? I mean, I understand the 363 00:20:32,996 --> 00:20:36,236 Speaker 1: constitutional or legal basis for it, as you just expressed. 364 00:20:36,476 --> 00:20:39,516 Speaker 1: But what about the sort of real-world consequential part 365 00:20:39,516 --> 00:20:42,156 Speaker 1: of the picture? Has that changed or affected your mind 366 00:20:42,156 --> 00:20:47,196 Speaker 1: at all? No. I've been against sexual assault, I'm happy 367 00:20:47,196 --> 00:20:51,556 Speaker 1: to say, all my life. I have been against people, 368 00:20:51,596 --> 00:20:55,356 Speaker 1: for example, engaging in sexual extortion. I made that clear 369 00:20:55,396 --> 00:20:58,116 Speaker 1: in my original article. That is indeed a threat of 370 00:20:58,596 --> 00:21:01,436 Speaker 1: illegal conduct, and that is generally so-called quid pro 371 00:21:01,516 --> 00:21:04,276 Speaker 1: quo sexual harassment: sleep with me or you're fired, and 372 00:21:04,316 --> 00:21:07,636 Speaker 1: that is communicated directly and deliberately to that person. I 373 00:21:07,756 --> 00:21:11,156 Speaker 1: also actually argued in my article that indeed unwanted speech 374 00:21:11,196 --> 00:21:13,316 Speaker 1: to a person, sort of one-to-one speech where 375 00:21:13,316 --> 00:21:17,756 Speaker 1: you're approaching somebody and insulting them, or for that matter, 376 00:21:17,916 --> 00:21:20,116 Speaker 1: persistently asking them out for dates where you're not trying 377 00:21:20,156 --> 00:21:22,876 Speaker 1: to insult them, love speech rather than hate speech. 
378 00:21:22,876 --> 00:21:26,796 Speaker 1: But unwanted love speech, as it were, that could indeed 379 00:21:26,836 --> 00:21:30,716 Speaker 1: be restricted. But speech that's merely overheard, or, in the 380 00:21:31,036 --> 00:21:34,636 Speaker 1: Oklahoma situation, speech that was never expected to be overheard 381 00:21:34,676 --> 00:21:38,516 Speaker 1: but that somebody records and that is then revealed? No, I 382 00:21:38,556 --> 00:21:41,236 Speaker 1: don't think that can be properly restricted by the government 383 00:21:41,316 --> 00:21:44,116 Speaker 1: using workplace harassment law. Let me give you an example. 384 00:21:44,356 --> 00:21:49,876 Speaker 1: Imagine that somebody is talking at a party. A guy 385 00:21:49,996 --> 00:21:52,716 Speaker 1: is talking to other guys saying, you know, I think 386 00:21:52,756 --> 00:21:54,636 Speaker 1: women just don't do a really good job here. I 387 00:21:54,676 --> 00:21:56,676 Speaker 1: think that things were better when we only had men. 388 00:21:57,676 --> 00:22:01,836 Speaker 1: And then somebody records that, that is revealed to women 389 00:22:01,836 --> 00:22:04,396 Speaker 1: at the workplace. Remember, this is all said outside the workplace. 390 00:22:04,676 --> 00:22:08,276 Speaker 1: And then the company is sued under Title VII for 391 00:22:08,636 --> 00:22:13,196 Speaker 1: not firing somebody for his off-the-job sexist statements. 
392 00:22:13,276 --> 00:22:15,556 Speaker 1: I think that would be outrageous, to have such a 393 00:22:15,636 --> 00:22:19,396 Speaker 1: lawsuit proceed. And I should say, to the credit of 394 00:22:19,596 --> 00:22:22,396 Speaker 1: hostile environment harassment law, I don't know of any cases 395 00:22:22,396 --> 00:22:25,556 Speaker 1: that actually do involve a lawsuit proceeding based on this 396 00:22:25,636 --> 00:22:28,916 Speaker 1: person's off-the-job speech. Right, because it's workplace harassment, 397 00:22:28,916 --> 00:22:30,516 Speaker 1: and the fact that it's in the workplace is supposed 398 00:22:30,516 --> 00:22:32,796 Speaker 1: to matter. But could you explain, in your mind, what's 399 00:22:32,836 --> 00:22:38,156 Speaker 1: the magic difference between discriminatory harassment that takes place directed 400 00:22:38,516 --> 00:22:42,036 Speaker 1: at a person and discrimination that takes place behind their backs? 401 00:22:42,036 --> 00:22:45,196 Speaker 1: I mean, it's in the nature of discrimination that it 402 00:22:45,276 --> 00:22:48,676 Speaker 1: can take place either directly or indirectly. I mean, you 403 00:22:48,716 --> 00:22:50,836 Speaker 1: and I can sit around and discriminate against a third person, 404 00:22:50,876 --> 00:22:53,636 Speaker 1: even if the third person doesn't know we're discriminating against her. Surely. 405 00:22:53,796 --> 00:22:56,076 Speaker 1: So you say discrimination, but we're talking about his speech. 406 00:22:56,556 --> 00:22:59,356 Speaker 1: It's like people used to say, well, this isn't speech, 407 00:22:59,436 --> 00:23:02,596 Speaker 1: this is sedition, or this is communist conspiracy. Workplace harassment 408 00:23:02,636 --> 00:23:05,316 Speaker 1: law is based on a statute that outlaws discrimination. It's 409 00:23:05,316 --> 00:23:07,356 Speaker 1: not a statute that outlaws speech. So that's right. 
But 410 00:23:07,396 --> 00:23:10,116 Speaker 1: when it is applied to speech because of what the 411 00:23:10,156 --> 00:23:14,276 Speaker 1: speech communicates, then it becomes a speech restriction. So, the 412 00:23:14,276 --> 00:23:17,036 Speaker 1: difference. You asked, what's the difference between one-to-one 413 00:23:17,076 --> 00:23:20,676 Speaker 1: speech and speech that's overheard, speech that may be talked 414 00:23:20,716 --> 00:23:23,036 Speaker 1: about in the lunchroom and such. I think this 415 00:23:23,116 --> 00:23:25,116 Speaker 1: is a broader point that isn't at all limited to 416 00:23:25,156 --> 00:23:27,356 Speaker 1: hostile environment harassment law, and it has to do with 417 00:23:27,396 --> 00:23:30,356 Speaker 1: the value of the speech: talking to a particular 418 00:23:30,396 --> 00:23:33,356 Speaker 1: person when the person has told you stop talking to me, 419 00:23:33,636 --> 00:23:36,316 Speaker 1: or when it's perfectly clear that a person does not 420 00:23:36,396 --> 00:23:38,996 Speaker 1: want to hear this because these are insults. That's something 421 00:23:39,036 --> 00:23:43,676 Speaker 1: that has very limited First Amendment value because it's not 422 00:23:43,796 --> 00:23:46,836 Speaker 1: likely to persuade or enlighten. It's just likely to offend. 423 00:23:47,436 --> 00:23:50,476 Speaker 1: Whereas speech that's said to the public that is overheard 424 00:23:50,516 --> 00:23:52,476 Speaker 1: by some people who are offended, that could have a 425 00:23:52,476 --> 00:23:55,396 Speaker 1: great deal of First Amendment value. Here's an example from 426 00:23:55,396 --> 00:23:59,676 Speaker 1: a non-hostile-environment-harassment, non-discrimination-law context. 
There's 427 00:23:59,676 --> 00:24:02,996 Speaker 1: a case called Rowan v. Post Office Department that upheld 428 00:24:03,036 --> 00:24:05,236 Speaker 1: the statute under which any of us could say to 429 00:24:05,356 --> 00:24:08,916 Speaker 1: any mailer, at least any commercial mailer, stop sending this 430 00:24:09,156 --> 00:24:12,276 Speaker 1: unwanted mail, and then they have to stop. And the 431 00:24:12,356 --> 00:24:15,316 Speaker 1: court said, look, there's no right to press even a 432 00:24:15,356 --> 00:24:19,196 Speaker 1: good idea on an unwilling listener. And I think that's 433 00:24:19,276 --> 00:24:22,116 Speaker 1: quite right, because if something's coming into my house and 434 00:24:22,236 --> 00:24:25,476 Speaker 1: coming to me, I should be entitled to say stop 435 00:24:25,556 --> 00:24:27,996 Speaker 1: talking to me. But let's say there was a similar 436 00:24:28,036 --> 00:24:30,876 Speaker 1: statute that allowed anybody who's offended by a billboard or 437 00:24:30,916 --> 00:24:33,996 Speaker 1: by a demonstration, or, there was a famous case involving 438 00:24:34,316 --> 00:24:37,516 Speaker 1: nudity on a drive-in theater screen, that they could 439 00:24:37,556 --> 00:24:41,396 Speaker 1: demand that it be taken down. That is a much greater, 440 00:24:41,556 --> 00:24:45,116 Speaker 1: and I think unconstitutional, restriction on speech, because that interferes 441 00:24:45,156 --> 00:24:48,236 Speaker 1: even with speech to willing listeners. 
So if a company 442 00:24:48,236 --> 00:24:51,516 Speaker 1: could be sued under Title VII because, and actually unfortunately 443 00:24:51,556 --> 00:24:53,396 Speaker 1: it can be, I think this is the situation where 444 00:24:53,516 --> 00:24:57,276 Speaker 1: hostile environment harassment law is impermissible, because somebody is wearing 445 00:24:57,316 --> 00:25:01,196 Speaker 1: a cap with a Confederate flag on it, or because 446 00:25:01,236 --> 00:25:04,956 Speaker 1: some people are talking about gender roles and are saying, 447 00:25:04,996 --> 00:25:07,356 Speaker 1: you know, I think that these jobs should be for 448 00:25:07,396 --> 00:25:10,236 Speaker 1: men and not for women, saying it in the lunchroom and they're overheard, 449 00:25:10,676 --> 00:25:14,116 Speaker 1: that I think is unconstitutional, because that interferes with speech 450 00:25:14,116 --> 00:25:16,836 Speaker 1: among willing listeners just because somebody who hears it is 451 00:25:16,876 --> 00:25:18,796 Speaker 1: going to be offended by it, and that I think 452 00:25:18,836 --> 00:25:21,596 Speaker 1: is impermissible. And I think that that tracks dividing lines 453 00:25:21,636 --> 00:25:23,916 Speaker 1: in a lot of First Amendment cases. Fighting words is 454 00:25:23,916 --> 00:25:27,076 Speaker 1: another example. There are various rationales for restricting fighting words, 455 00:25:27,116 --> 00:25:28,956 Speaker 1: but one of them is that they're not just likely 456 00:25:28,996 --> 00:25:31,956 Speaker 1: to cause a fight, but they're said to a person 457 00:25:32,196 --> 00:25:36,396 Speaker 1: who is being directly personally insulted, as opposed to burning 458 00:25:36,396 --> 00:25:38,436 Speaker 1: a flag, which might cause a fight, where the court 459 00:25:38,476 --> 00:25:41,556 Speaker 1: says that's protected. There, the important thing is that that 460 00:25:41,596 --> 00:25:45,956 Speaker 1: message may reach willing viewers as well as unwilling ones. 
Do 461 00:25:45,996 --> 00:25:49,236 Speaker 1: you think of yourself as a free speech absolutist? And 462 00:25:49,276 --> 00:25:52,316 Speaker 1: I'll ask you that because it seems, in terms of 463 00:25:52,356 --> 00:25:55,876 Speaker 1: the continuum of where people come down, that thinking that 464 00:25:56,436 --> 00:25:59,876 Speaker 1: workplace sexual harassment law as it currently exists is 465 00:25:59,956 --> 00:26:03,196 Speaker 1: unconstitutional would put you towards one end of the continuum, 466 00:26:03,236 --> 00:26:05,956 Speaker 1: at least. Well, I don't think there's ever been a 467 00:26:05,996 --> 00:26:08,676 Speaker 1: free speech absolutist. Some people have called themselves free speech 468 00:26:08,676 --> 00:26:11,076 Speaker 1: absolutists, Justice Black is an example, but even he 469 00:26:11,196 --> 00:26:14,796 Speaker 1: was willing to uphold restrictions on threats and on fighting words. 470 00:26:15,156 --> 00:26:17,436 Speaker 1: So I don't think it's possible to be an absolutist. 471 00:26:17,436 --> 00:26:20,076 Speaker 1: I don't think that threats of violence should be protected. 472 00:26:20,476 --> 00:26:23,236 Speaker 1: I don't think that if somebody wants to speak very 473 00:26:23,316 --> 00:26:25,396 Speaker 1: loudly in a residential area in the middle of the 474 00:26:25,516 --> 00:26:27,276 Speaker 1: night that that should be protected. But if no one 475 00:26:27,316 --> 00:26:29,956 Speaker 1: thinks those things, then maybe absolutist is the wrong word. 476 00:26:29,956 --> 00:26:32,116 Speaker 1: But do you think you're at one end of the continuum? 477 00:26:32,396 --> 00:26:34,596 Speaker 1: I don't know. I do think there are people who 478 00:26:34,596 --> 00:26:36,716 Speaker 1: are free speech maximalists, but I'm not even sure I'm 479 00:26:36,716 --> 00:26:39,836 Speaker 1: a free speech maximalist. 
I believe in strong protection for 480 00:26:39,876 --> 00:26:43,436 Speaker 1: free speech, especially when the government is restricting it based 481 00:26:43,516 --> 00:26:46,316 Speaker 1: on the content of the speech. I do think there 482 00:26:46,316 --> 00:26:49,116 Speaker 1: are exceptions to free speech protection. I think they're well 483 00:26:49,236 --> 00:26:52,116 Speaker 1: established and they're part of our law, but that they 484 00:26:52,116 --> 00:26:56,556 Speaker 1: should be read narrowly and kept within their boundaries, and 485 00:26:56,716 --> 00:26:59,756 Speaker 1: the slippery slope should be resisted. So in that respect, 486 00:26:59,796 --> 00:27:01,316 Speaker 1: you know, in a sense, I'm kind of a free 487 00:27:01,356 --> 00:27:05,076 Speaker 1: speech doctrinalist. I've read the cases, I've written about the cases. 488 00:27:05,276 --> 00:27:07,156 Speaker 1: I think there's a good deal of wisdom in the 489 00:27:07,396 --> 00:27:10,436 Speaker 1: court's cases, although it's not that I agree with everything. I 490 00:27:10,476 --> 00:27:12,956 Speaker 1: think there are particular rules that we have, but I 491 00:27:12,956 --> 00:27:16,716 Speaker 1: don't think we should be moving towards a more speech-restrictive, 492 00:27:17,036 --> 00:27:21,276 Speaker 1: generally speaking, model than we have today. Eugene, it's a 493 00:27:21,356 --> 00:27:23,436 Speaker 1: huge pleasure to talk to you about these things, and 494 00:27:23,516 --> 00:27:27,196 Speaker 1: it's a huge pleasure to disagree under conditions of rational 495 00:27:27,236 --> 00:27:29,436 Speaker 1: debate, the kind of ideal speech conditions that the 496 00:27:29,476 --> 00:27:31,876 Speaker 1: First Amendment is, at least in theory, designed for. 
497 00:27:31,876 --> 00:27:34,436 Speaker 1: And I suspect we'll keep on disagreeing on lots of issues, 498 00:27:34,476 --> 00:27:36,556 Speaker 1: but I hope we can keep on talking about them 499 00:27:36,596 --> 00:27:38,756 Speaker 1: as we disagree about them going forward. Thank you so 500 00:27:38,836 --> 00:27:40,476 Speaker 1: much for your time. Thank you very much, very kind 501 00:27:40,476 --> 00:27:42,036 Speaker 1: of you to say so, and to have me on. It 502 00:27:42,116 --> 00:27:43,996 Speaker 1: is always a pleasure to talk to you. And indeed, 503 00:27:44,396 --> 00:27:47,076 Speaker 1: that's, I think, why we became academics, right? So we 504 00:27:47,156 --> 00:27:51,236 Speaker 1: could talk to other people who know the field, and 505 00:27:51,356 --> 00:27:54,556 Speaker 1: we can express our views and sometimes agree, sometimes disagree, 506 00:27:54,836 --> 00:27:57,636 Speaker 1: and, we hope, learn from each other and come to 507 00:27:57,716 --> 00:28:01,116 Speaker 1: better views ourselves. It definitely wasn't for the faculty meetings. 508 00:28:01,836 --> 00:28:04,796 Speaker 1: Thank you very much, Eugene. Very much so, thank you. 509 00:28:11,196 --> 00:28:14,156 Speaker 1: My conversation with Eugene raised in my mind one of 510 00:28:14,156 --> 00:28:17,156 Speaker 1: the hardest problems, to me at least, in the freedom 511 00:28:17,156 --> 00:28:20,956 Speaker 1: of speech, and that is: should our speech be treated 512 00:28:21,116 --> 00:28:24,236 Speaker 1: with the same degree of freedom in environments like the 513 00:28:24,316 --> 00:28:29,436 Speaker 1: workplace or the university, where our values and goals may 514 00:28:29,436 --> 00:28:32,436 Speaker 1: be potentially a little bit different from the values and 515 00:28:32,516 --> 00:28:36,436 Speaker 1: goals we have in the naked public square? Eugene is 516 00:28:36,516 --> 00:28:40,876 Speaker 1: very concerned about slippery slope problems. 
He's worried that forms 517 00:28:40,916 --> 00:28:44,196 Speaker 1: of limitation on speech that we design for the workplace, 518 00:28:44,276 --> 00:28:46,596 Speaker 1: or that we design for the university, or for other 519 00:28:46,636 --> 00:28:48,916 Speaker 1: settings where we tend to think, or at least I 520 00:28:49,036 --> 00:28:52,556 Speaker 1: tend to think, that speech can rightfully be constrained, might, 521 00:28:52,716 --> 00:28:55,516 Speaker 1: in the long run, have the effect of undercutting our 522 00:28:55,556 --> 00:28:59,276 Speaker 1: commitment to free speech more generally. Line drawing is one 523 00:28:59,276 --> 00:29:02,396 Speaker 1: of the hardest tasks that the law faces. It's also 524 00:29:02,436 --> 00:29:06,436 Speaker 1: a task that the law has to engage in every day, 525 00:29:06,476 --> 00:29:08,956 Speaker 1: and of course there is also a slippery slope argument in 526 00:29:09,156 --> 00:29:12,916 Speaker 1: the opposite direction. If we insist on nearly absolute free 527 00:29:12,916 --> 00:29:16,396 Speaker 1: speech in every context, what will that mean for our 528 00:29:16,436 --> 00:29:22,316 Speaker 1: society's ability to shape productive, meaningful interactions and conversations in 529 00:29:22,396 --> 00:29:26,396 Speaker 1: places like the workplace and the university? In any case, 530 00:29:26,436 --> 00:29:29,556 Speaker 1: there's nothing like testing out one's ideas about free speech 531 00:29:29,796 --> 00:29:33,356 Speaker 1: against the strongest pro free speech position in order to 532 00:29:33,396 --> 00:29:37,676 Speaker 1: figure out where you believe lines can appropriately be drawn. 533 00:29:38,716 --> 00:29:41,236 Speaker 1: Until the next time I speak to you, be careful, 534 00:29:41,716 --> 00:29:45,796 Speaker 1: be safe, and be well. Deep Background is brought to 535 00:29:45,796 --> 00:29:49,236 Speaker 1: you by Pushkin Industries. 
Our producer is Lydia Jean Kott, 536 00:29:49,356 --> 00:29:53,516 Speaker 1: with mastering by Jason Gambrell and Martin Gonzalez. Our showrunner 537 00:29:53,556 --> 00:29:56,676 Speaker 1: is Sophie McKibben. Our theme music is composed by Luis 538 00:29:56,756 --> 00:30:01,076 Speaker 1: Guerra. Special thanks to the Pushkin brass: Malcolm Gladwell, Jacob Weisberg, 539 00:30:01,116 --> 00:30:04,596 Speaker 1: and Mia Lobel. I'm Noah Feldman. I also write a 540 00:30:04,676 --> 00:30:07,476 Speaker 1: regular column for Bloomberg Opinion, which you can find at 541 00:30:07,476 --> 00:30:13,076 Speaker 1: Bloomberg dot com slash Feldman. To discover Bloomberg's original slate of podcasts, 542 00:30:13,316 --> 00:30:17,796 Speaker 1: go to Bloomberg dot com slash podcasts. And one last thing: 543 00:30:18,116 --> 00:30:20,956 Speaker 1: I just wrote a book called The Arab Winter: A Tragedy. 544 00:30:21,356 --> 00:30:23,756 Speaker 1: I would be delighted if you checked it out. If 545 00:30:23,756 --> 00:30:26,276 Speaker 1: you liked what you heard today, please write a review 546 00:30:26,636 --> 00:30:28,916 Speaker 1: or tell a friend. You can always let me know 547 00:30:28,916 --> 00:30:32,036 Speaker 1: what you think on Twitter. My handle is Noah R Feldman. 548 00:30:32,636 --> 00:30:34,276 Speaker 1: This is Deep Background.