June Grasso: You're listening to Bloomberg Law with June Grasso from Bloomberg Radio. Elon Musk is close to acquiring Twitter. Many are wondering, and some are concerned, about what his absolutist view of the First Amendment will mean for the social media platform. Musk described his outlook during a TED interview this month.

Elon Musk: We want to be just very reluctant to delete things, and just be very cautious with permanent bans. You know, timeouts, I think, are better.

June Grasso: My guest is First Amendment law expert Eugene Volokh, a professor at UCLA Law School. Some people might confuse First Amendment rights with speech on social media platforms like Twitter. Can you explain free speech in the context of social media sites?

Eugene Volokh: Sure. We have this tendency to use "First Amendment" and "free speech" interchangeably, but while they're related, they're different. The First Amendment is a constraint on the government. Remember, the first word of the First Amendment is Congress: Congress shall make no law. That's been applied through the Fourteenth Amendment to the states, since the Fourteenth Amendment instructs that no state shall, and also to local governments and the like, which are agencies of the state. But basically, only the government is bound by the First Amendment. Only the government can violate the First Amendment. So, for example, if your employer fires you for your speech, or for your political activity, or even for your vote, that's not a First Amendment violation. Likewise, if, let's say, a private university expels a student because of the student's speech, that's not a First Amendment violation. Freedom of speech is a broader concept, potentially. So you might say, for example, that universities, especially secular universities that describe themselves as devoted to free inquiry, should comply with free speech principles. Maybe not in the classroom, where the professor gets to decide whom to call on and what issues to discuss, but, for example, a private university shouldn't expel a student because of the student's speech.
In fact, in California, there's a state statute that bans private universities, except religious ones, from imposing speech codes on their students. I think that statute is a free speech protection, even though it's not a First Amendment protection. Likewise, you might say maybe private employers shouldn't be able to fire their employees for their speech, especially maybe off-the-job speech, especially maybe speech about politics. Again, in my own California, private employers are not allowed to fire employees based on the employees' political activity, which is defined very broadly to cover advocacy not just of candidates but also of ideological causes. Quite a few other states also protect private employees against employers that way. You might say that's a free speech protection. Likewise, if you think back historically to, say, the Hollywood blacklist of the nineteen fifties, where people who were viewed as having been Communists, or as having been affiliated with Communist-related organizations, were blacklisted from jobs, that wasn't a First Amendment violation, but you might have said that it was bad for the freedom of speech, and maybe a violation of free speech principles. So likewise, people talk about free speech on social media platforms. Social media platforms are not the government. They don't violate the First Amendment if they ban people based on their speech, but you might argue that they ought to respect free speech, because that's what our American democracy requires. So I think it's an important question how much and what kinds of free speech ought to be allowed, and whether the law should require such protections or whether it just ought to be allowed as a matter of the platform's decision on these privately owned platforms.

June Grasso: So Musk, in trying to explain his position on freedom of speech, tweeted, quote, I simply mean that which matches the law. Is he showing naivete in thinking that it's a simple concept like that?
Eugene Volokh: Well, you said he tweeted that. Well, you can't capture even First Amendment rules in 280 characters or less, and you certainly can't capture the possible differences between free speech and First Amendment rules in 280 characters or less. Just to give an example, I think it's necessary for any platform to do something about spam. There may be certain kinds of things that are protected by the First Amendment, certain kinds of mass mailings and mass posts, where a platform might say, you know, they just degrade too much the users' ability to see the real material that they want to see, and therefore we need to block spam even if spam is not illegal, or at least even the kinds of spam that are not illegal. Likewise, you could imagine a platform, Twitter is not one of them, Twitter actually allows pornography, at least of certain kinds, but you can imagine a platform saying, you know, we're not going to allow pornography, because there's too much of a risk of people incidentally stumbling across it, and therefore we're just going to ban it. Likewise, Twitter, as I understand it, while it allows pornography, basically requires it to be labeled and kind of gives people warnings before showing it. Again, that's not a legal requirement. That's just Twitter's decision. So there are possible differences. This having been said, if you want to take a first-cut, admittedly oversimplified view, you might say, yes, our view should be that Twitter should, generally speaking, protect free speech, at least so long as it's not illegal speech. So, no, we won't host child pornography. No, we won't host things that are clearly conspiracy to commit crimes, or accounts that are being used for criminal conspiracy. Let's say we won't host copyright infringement, because that's illegal. But pretty much all other things, generally speaking, we're going to host. Maybe there'll be a few exceptions, but they really should be very narrow exceptions. It's a plausible way of thinking about this.
June Grasso: Musk is a self-described free speech absolutist, and in an interview at the TED conference a couple of weeks ago, he said, if in doubt, if it's a gray area, let the tweet exist. If there's a lot of controversy, you wouldn't necessarily want to promote that speech, but I'm very reluctant to delete things and cautious with permanent bans. Some civil rights advocates are saying that that kind of minimally moderated Twitter can pose dangers, or can harm women, minorities, and anyone out of favor with the establishment. What's your take on whether that kind of a minimally moderated Twitter poses any dangers?

Eugene Volokh: Well, of course free speech poses dangers. Free speech always does. That's one reason why governments do indeed try to restrict it. And there's nothing special about those dangers as to women, or as to minorities, or as to others. Just to give an example, there have been a lot of attacks on police officers recently, and some of them seem to be prompted by very harsh, kind of dehumanizing, criticisms of police officers. That's something that is a downside of speech. Likewise, for example, jihadist, pro-terrorist extremism is potentially harmful. Actually, it's especially harmful to Muslims. Muslims are the main victims of extremist Islam, but it's also potentially harmful to others. Of course speech is potentially harmful. But generally speaking, when it comes to the First Amendment, we take the view that on balance it's better for everyone, and historically this has been especially important for various out-of-favor minority groups, that speech be protected. Because when speech is restricted, that keeps people from being able to argue about what they think are genuine wrongs and to advocate for change, and giving the government the power to suppress speech the government thinks is harmful will also suppress a lot of speech that's valuable.
So one could argue, it's not necessarily obviously correct, but one could argue, that the same thing applies not just to the government but to immensely powerful platform operators like Facebook, Twitter, and the like. That perhaps, on balance, just as it's good to deny the government the power to say we're not going to allow this kind of speech because it's harmful, generally speaking it's also good for these platforms to deny themselves this power, to say, look, we don't trust ourselves to make decisions about which speech is so harmful that it should be prohibited, perhaps except for really extraordinary things like, for example, child pornography or the like. So it seems to me a perfectly plausible position. Now, of course, you could argue that, no, the government ought to be banning more speech. The government ought to ban, some people say, racist speech, or ought to ban anti-government speech, or anti-police speech, or whatever else. Lots of people have argued this in the past. Or you could say, no, no, the government shouldn't ban it, but instead of the government banning it, we should make sure that all of these influential platforms do what the government can't, which is suppress speech that we think is harmful in various ways. You could argue that. But again, I think it's at least a plausible position to say that just as we don't trust the government to make these decisions, we shouldn't trust Mark Zuckerberg to make these decisions, or even Elon Musk to make these decisions, on a case-by-case basis. That maybe it's better for Elon Musk to say, look, you know, I don't want to be the censor dictating what is too harmful to be allowed and what is not, and, again, excepting extraordinary circumstances, I'm going to allow everything to be present. So I think it's a perfectly plausible position.
It's not the only plausible position, but it seems to me one that makes a good deal of sense in light of the general assumptions that the courts and others have long held with regard to free speech broadly.

June Grasso: So, the former chair of the FCC, Tom Wheeler, said that Musk is taking actions that highlight the need for the creation of a new regulator that would oversee the technology industry. So for Congress to pass legislation, do you think that's a good idea or a bad idea?

Eugene Volokh: Well, it depends on what legislation, right? I'll give an example. Let's say somebody says, you know, yes, we think it's bad to have statements that tend to encourage the killing of police officers, so there ought to be a regulation that prohibits that. Or, we think it's bad to have statements that say insulting things about particular racial groups or particular religious groups. Well, that regulation would violate the First Amendment. We're not now in the category of free speech ethics; we're in the field of actual First Amendment law, and we know that that regulation would violate the First Amendment. But that proposal doesn't sound at all like what the FCC chair is talking about. If the proposal is, let's require platforms not to discriminate based on viewpoint, let's require them to host speech without regard to viewpoint, kind of like a phone company is required to basically host all paying customers and can't shut down your phone line because it doesn't like the viewpoints you're expressing using it, well, that's a different story. I'm not sure that would be unconstitutional, although one can argue whether it is or isn't, and whether it's a good idea or not. Or, some people are saying the regulation should be that platforms have the power to decide whether to block certain things, and some will say we'll block a lot of things, and some, like maybe Elon Musk's Twitter, will say we'll block very little, but what they should do is disclose it.
They should disclose to the public all their editing decisions. You know, that might be constitutional, and might even be a good idea, although there are possible interpretation problems as well. So the fact is that when it comes to speech, there are some kinds of permissible regulations. We haven't even touched on things like regulations with regard to libel, or regulations with regard to true threats of violence and the like. And there are lots of regulations that would be impermissible. So it's hard to talk about whether there should be regulation in the abstract. You need to know what exactly it is.

June Grasso: The biggest change that's expected is the replatforming of accounts that Twitter banned after they were used to harass, or incite violence, or spread misinformation. And he also said that he would be cautious with permanent bans and that timeouts would be better. I just want your take on that.

Eugene Volokh: Well, again, it all depends on exactly what we're talking about. So, for example, you said accounts that harass, that incite violence, or that spread misinformation. Those are very different things. Harass? I'm actually writing an article about this. It's a term that doesn't have a really clear legal definition, or rather, it has lots of different definitions in different contexts. Under some state laws, harassment means, among other things, physical misconduct, but also threats of violence. And, you know, threats of violence are constitutionally unprotected. Platforms don't have to take them down, but I think we can plausibly say that if something actually threatens violence, it's a crime, and the platforms ought to take it down. On the other hand, sometimes people will use harass to mean saying really mean things about someone, and doing so repeatedly, in a way that makes this person feel insulted and set upon. Well, all right, you could call that harassment, but you could also call it criticism; you could call it condemnation.
So if you're asking whether some particular account that was guilty of harassment ought to be banned, I'd need to know a lot more about what that harassment was. Incite violence? Well, if it means incitement in the First Amendment sense, speech that is intended to and likely to cause imminent illegal conduct, again, that's constitutionally unprotected, and maybe Twitter would remove it once it concludes that it's constitutionally unprotected. On the other hand, often people use incite loosely, basically just in the sense of tending to lead some people, almost always a tiny fraction of all readers, towards violence. And again you might ask, well, what do you do with your garden-variety account that harshly condemns the police, calls them an occupying army, says no justice, no peace, praises rioters and looters, and the like? Is that incitement of violence against the police, or incitement of rioting and looting? Well, you know, it could be called that in ordinary English. I'm not sure that that's an account that should be banned in those kinds of situations. And then you say misinformation. The problem is, misinformation according to whom? One of the problems is that a lot of times people have a dispute about what is accurate information and what is misinformation. Which were accurate allegations about Hunter Biden's laptop and which ones were not? Which were accurate allegations about whether President Trump colluded with the Russians in 2016, and which were not? So I'm not sure that a company such as Twitter should be making decisions that this account should be banned because it's misinformation, because it may just be that it's information that Twitter doesn't like, or that Twitter's owners don't like. Then there's a separate question: once you conclude something ought to be blocked in some measure, should it be a permanent block or should it be a temporary block? That's this timeout question.
That's also a separate issue, and it may depend a lot on the nature of the speech that's involved.

June Grasso: With Musk taking charge of Twitter and making changes, will it change how we define free speech?

Eugene Volokh: Well, first, a lot depends on what changes he actually makes. We now are hearing the kinds of changes he's interested in making, but of course, no battle plan survives contact with the enemy. We'll see in some months, or maybe even some years, what Twitter actually ends up doing. Second, of course, on one hand, free speech is a philosophical concept. People have been discussing it for hundreds of years, in some measure thousands of years, but at least the current debate about it dates back to the sixteen hundreds or seventeen hundreds in England; the current American debate dates back to that. So the policy that Twitter implements or Facebook implements is not something that, as a philosophical matter, should affect our definitions of free speech. But as a practical matter, I do think that a lot of times people kind of take their cues about the meaning of a concept based on what is actually allowed. So if indeed platforms routinely ban certain kinds of speech on the grounds that, well, that's not free speech, it's hate speech, which as a legal matter is a non sequitur, because so-called hate speech is generally protected free speech, but imagine enough platforms say that, imagine enough platforms say, well, that's not free speech, it's misinformation, then maybe indeed, over time, people will get used to it, and people will come to assume that, well, of course it's not free speech, because we see that it's already prohibited by these influential entities. So that's possible. It's also possible that platforms' decisions to try to censor things will backfire and will remind people that this is free speech, just free speech, that platforms are wrongly forbidding.
So it's really hard to tell how public attitudes towards these kinds of concepts will evolve as a result of private corporations' decisions.

June Grasso: Thanks, Eugene. That's Professor Eugene Volokh of UCLA Law School. Former President Donald Trump is appealing a New York judge's ruling holding him in contempt of court and imposing a ten-thousand-dollar fine each day for failing to comply with a subpoena in the Attorney General's civil fraud investigation. Of more than six million pages of corporate records handed over, only ten documents were Trump's. Joining me is Eric Larson, Bloomberg legal reporter. So, for those who may have been living under a rock, tell us what the AG is investigating.

Eric Larson: Sure. This is an investigation that started in two thousand nineteen, looking into the Trump Organization's use of asset valuations over the years for some of its biggest assets, and whether or not those valuations were manipulated in any way to give Trump better terms on things like bank loans or insurance or even tax refunds, things like that. And this investigation started after Trump's longtime lawyer Michael Cohen testified before Congress and alleged all kinds of wrongdoing at the Trump Organization. So the Attorney General, Letitia James, opened the investigation at that time and issued a bunch of subpoenas, and ultimately had to go to court to enforce those subpoenas, because the Trump Organization disagreed with their validity. So that's how this court case got started, and that investigation is obviously ongoing.

June Grasso: Have Trump and the Trump Organization turned over documents?

Eric Larson: Well, those are two very different questions, because the Trump Organization ultimately was ordered to comply with the subpoena and has been doing so. Obviously, there's still disagreement about that, but they have turned over more than six million pages of documents.
Dozens of employees and former employees, you know, have searched their records and been deposed under oath and things like that. But as for Trump, in response to the earlier subpoenas, the Trump Organization and everyone else had turned over some documents that were Trump's, essentially, but only ten. Of the six million pages, only ten documents were actually from Trump himself, who is, you know, the leader of the company. So in December, the Attorney General issued a subpoena to Trump himself for records and information in his possession, as well as for his testimony. You may recall that Trump, and also his son Don Jr. and Ivanka Trump, who had also received subpoenas, all challenged the subpoena for their testimony. They lost on that; they were ordered to be deposed. They've appealed, and that is happening separately. But what they didn't challenge was the subpoena for Trump's records. So it was understood and expected that the former president would be handing over some records of his own, in his possession, related to these various asset valuations, by March thirty-first. That was the deadline agreed to by the Attorney General and Trump, and it was ordered by the court. So March thirty-first came, and instead of complying in full as he was ordered to do, he instead filed sixteen pages of what the judge described as boilerplate objections to the subpoena for records that he had already agreed to comply with, and an affirmation by his lawyer stating that there were no records; they had looked, and there were not. So that's where we ended up with this contempt of court decision.

June Grasso: A lawyer for the AG flagged concerns about hard copy records that should be searched: in file cabinets on two floors of Trump Tower, a storage closet near Trump's office, and an off-site location. That seems awfully specific. How did they even know about those locations, and that there might be documents there?

Eric Larson: Well, they would have learned about this from their investigation so far.
You know, from all of their interviews with other employees, and just from looking at the documents that have been turned over, I'm sure they have a very good idea of how the records are stored. There are a lot of hard copy records that go into a series of filing cabinets, I think they said on the twenty-fifth and twenty-sixth floors of Trump Tower. There's also a storage closet outside of Trump's office where some records are stored, and then of course an off-site location, which is not unusual. But basically, what the judge determined was that if Trump is going to say there are no records, it needs to be done with more than just saying that. That, in the judge's view, clearly did not comply with his order to comply with the subpoena, so he did find him in contempt. The judge said that it just was not nearly enough to raise these objections at this point, that it was too late. He'd already agreed to comply with the subpoena, and then, instead of doing so, filed some objections that should have been filed earlier, and then just made sort of a blanket statement, through his lawyer, that there were no records around. So the judge said, you need to be much more specific. You need to tell me exactly where you searched, exactly what you searched, exactly when you searched it, and who searched it, and then maybe you'll be in compliance with the court order and you'll no longer be in contempt. But until then, he's in contempt of court and is accruing a ten-thousand-dollar-a-day fine.

June Grasso: So his lawyer said, President Trump does not email, he does not text message, and he has no work computer at home or anywhere else. So is she saying these documents just don't exist?

Eric Larson: Right.
His lawyer, Alina Habba, at the hearing on Monday where Trump was found in contempt, guaranteed to the judge verbally that she had personally looked in all of these places, and that she had actually flown down to Florida and met with Trump at Mar-a-Lago to interview her client about these records, to determine if there were any responsive records. And she guaranteed to the court that she had done all of these searches personally, that no responsive records were found, and that Trump said there were no records to be found. Obviously, that did not cut it with the judge. He said that saying this to me now in court is a lot different than explaining it in a sworn affidavit. And by the way, the judge said, why don't we just get a sworn affidavit from Trump himself? He didn't end up ordering anything like that, but he said in court, I just can't take your word for it. We need to have something much more detailed explaining why there are no records from Donald J. Trump responsive to all of these huge asset valuations, some of the key assets at the company that he's at the head of. So the judge wants something much more concrete.

June Grasso: Do you think that if the lawyer signed an affidavit swearing to what she said in court, that would suffice?

Eric Larson: That was my takeaway, that the judge suggested that that is something that would bring them into compliance. It's another question whether or not the Attorney General would accept that or believe that. They did indicate, during these hours of arguments in court, that they just sort of find it a little hard to believe that, as I mentioned, the guy running this company does not have any records that are responsive, compared to the six million pages that have been handed over by the company and everyone else. So, you know, Ms. Habba did make the point, of course, that Trump does not use a computer.
But nevertheless, the Attorney General believes that there may be responsive documents anyway, whether it's hard copies of calendars or other documents that he received and put away in a filing cabinet or something like that. His lawyer also pointed out that even the text messages of his children had already been turned over and searched, and that none of those text messages were to Donald Trump. She made that point to say, he doesn't even text with his own kids. But, as the judge said, put it in writing, put your name on it, and we'll see if that brings you into compliance.

June Grasso: So Trump is filing an appeal. What are his chances?

Eric Larson: Well, that appeal is just not surprising. The lawyer said that she would be filing an appeal as soon as the written order hit, and that happened yesterday afternoon. So I spoke with a gentleman who's not involved in the case, but who is a former prosecutor in California and a former state court judge, and he said that these kinds of contempt orders and fines are generally upheld on appeal, because the appellate courts give a lot of deference to judges to determine the best way to enforce compliance with their court orders. These trial judges are the ones who have all of the facts in their possession in all of these cases, and a contempt finding is very fact-specific, so the appellate courts leave it to the trial judges, for the most part, to determine when someone is in contempt and to issue fines. So I think it would be fairly hard, would be my guess, to get out of a contempt finding like this, given the way the judge spelled out exactly in his order how Trump came to not be in compliance with the order, and the amount of time that Trump had to raise his concerns earlier, instead of waiting until after the March thirty-first deadline. So we'll see where that goes.
But for right now, the contempt order is in place, and he's currently accruing this ten-thousand-dollar-a-day fine.

June Grasso: Like I said, the judge also ordered Cushman & Wakefield, a real estate services firm that was used by Trump, to comply with subpoenas. Have they been fighting subpoenas?

Eric Larson: You know, Cushman & Wakefield had been cooperating. They had received subpoenas, like a lot of other entities and businesses that had been involved with the Trump Organization over the years in the process of these huge appraisals, you know, law firms, engineers, architects, things like that. So Cushman & Wakefield was complying, but at one point, it seems, they decided that it was going too far, that the demands for documents were getting too broad for their liking, and they did finally go to court to try to quash the most recent subpoena. The judge denied that and ordered them to continue complying, and actually added Cushman & Wakefield as a respondent to the case, which the company had also tried to prevent from happening. So, you know, this company was involved with three of the main Trump Organization assets that are being scrutinized. There are a lot of assets being scrutinized; three of them in particular Cushman & Wakefield was involved in appraising, including the Trump golf club in Los Angeles, a property called Seven Springs in New York, and the Forty Wall Street skyscraper downtown. And the AG's lawyer pointed out at this court hearing that, you know, valuations had gone kind of all over the place very quickly for some of these properties, in a way that benefited the Trump Organization, and that they want to see more records about how they came to value these assets and why the valuations changed.

June Grasso: Any idea when the AG is going to wrap up this investigation? I know she said that she's uncovered significant evidence.
Eric Larson: Yes, we got a hint, certainly more than a hint, I guess, when the AG's lawyer said at the hearing on Monday that some sort of enforcement action was likely soon against the Trump Organization. So there is a statute of limitations concern, things like that. But the AG's lawyer even indicated that, after some enforcement action is taken, they might continue the investigation, and that they believe the nature of some of the potential offenses would be such that the offenses might have continued in a way that the statute of limitations wouldn't necessarily apply, in their view. So I think we could see something fairly soon. And as you mentioned, Tish James, as the Attorney General, has already said that she has found significant evidence of potentially misleading asset valuations. She gave that in a preliminary report while trying to enforce these subpoenas and demands for testimony. Because they were putting up such a fight, James had to go to court to explain, and really reiterate, why the investigation was so necessary, and she put out some preliminary findings that seem to be potentially pretty serious.

June Grasso: When you say enforcement actions, what does she mean? I thought she'd just bring a lawsuit against him.

Eric Larson: Well, I think those are one and the same. I assume that that is what he meant, but he just used the term enforcement actions. I think it's likely it could come in the form of a civil complaint accusing the company, and potentially individuals, of wrongdoing in relation to these asset valuations.

June Grasso: Well, more to come, as always, with these Trump investigations. Now I want to go to a completely different subject, the mask mandates. It was a shock to a lot of the country, and maybe the Biden administration as well, when last week a Florida judge struck down the travel mask mandate. And this wasn't because of a lawsuit filed by the governor or state officials.
It was because of a lawsuit filed by a former Wall Street banker who now lives in Idaho. Tell us what you found out.

Eric Larson: Yes, it was interesting. I can confirm that we, at least in the newsroom, were taken by surprise when this order hit, because it was not a case that we were, frankly, watching as closely as the one you mentioned earlier, where Ron DeSantis, the governor of Florida, was leading a multi-state lawsuit in Florida seeking to overturn the national mask mandate for public transportation. So when this hit, it was a surprise, and I had never actually heard of the organization that had filed the lawsuit. But as you said, it was filed by a former Wall Street banker. Her name is Leslie Manookian. She's fifty-eight years old. She lives in Sandpoint, Idaho, a ski resort town. But she used to work at Goldman Sachs in New York and London, and then went to a precursor company of AllianceBernstein in London as well. She was in London for ten years, and then, around two thousand three, she retired at a fairly young age and moved back to her native Idaho to raise her child. And that is where she got into natural medicine and became anti-vaccine, and, after the pandemic started, anti-mask even. So she founded a nonprofit organization called the Health Freedom Defense Fund, with the specific aim of suing over various mandates. She also has a lawsuit pending in federal court against the Biden administration's vaccine mandate for federal employees. Lots of other lawsuits have been filed over that as well; hers is just one of them. But at any rate, a Trump-appointed federal judge, as we've all heard by now, ruled in favor of Manookian's nonprofit and vacated the mask mandate nationwide.

June Grasso: Let me ask you this, and maybe you asked her this: for an organization based in Idaho to file a lawsuit in Florida, was she forum shopping for this particular judge, or for a conservative judge?
Eric Larson: Well, I'm sure that she would say no to that. I did speak with some experts who think that that was potentially something that was going on there. You know, even with all of those states joining together to file a lawsuit, they filed it in Florida. It could have been filed in any of the states, but they sued there. But you know, it's not difficult for a nonprofit based in any state to find some plaintiff to work with their lawyers to establish a right to sue in any other state. That's just, I suppose, the nature of the federal court system. But she definitely got lucky with that judge, I guess you could say.

June Grasso: Thanks so much, Eric. That's Bloomberg legal reporter Eric Larson. And that's it for this edition of The Bloomberg Law Show. Remember, you can always get the latest legal news on our Bloomberg Law podcast. You can find them on Apple Podcasts, Spotify, and at www.bloomberg.com/podcast/law. And remember to tune in to The Bloomberg Law Show every weeknight at ten p.m. Wall Street time. I'm June Grasso, and you're listening to Bloomberg.