1 00:00:02,759 --> 00:00:07,000 Speaker 1: This is Bloomberg Law with June Grosso from Bloomberg Radio. 2 00:00:09,200 --> 00:00:12,760 Speaker 2: US Attorney General Pam Bondi threatened to go after hate 3 00:00:12,840 --> 00:00:18,400 Speaker 2: speech on a podcast last week. We will absolutely target you, 4 00:00:18,800 --> 00:00:23,360 Speaker 2: go after you if you are targeting anyone with hate 5 00:00:23,400 --> 00:00:27,280 Speaker 2: speech, anything, and that's across the aisle. Bondi was wrong. 6 00:00:27,640 --> 00:00:31,320 Speaker 2: Hate speech is not a crime. In fact, hate speech 7 00:00:31,520 --> 00:00:35,760 Speaker 2: is free speech protected by the First Amendment. Just ask 8 00:00:35,840 --> 00:00:42,080 Speaker 2: the Supreme Court. Conservative Justice Samuel Alito wrote in twenty seventeen, quote, 9 00:00:42,400 --> 00:00:46,239 Speaker 2: the proudest boast of our free speech jurisprudence is that 10 00:00:46,280 --> 00:00:50,160 Speaker 2: we protect the freedom to express the thought that we hate. 11 00:00:50,479 --> 00:00:54,920 Speaker 2: Bondi's remarks drew criticism across the political spectrum, and she 12 00:00:55,000 --> 00:00:58,560 Speaker 2: tried to walk them back with some confusing posts on 13 00:00:59,080 --> 00:01:03,480 Speaker 2: X. Joining me is First Amendment expert Timothy Zick, a professor at 14 00:01:03,520 --> 00:01:07,880 Speaker 2: William and Mary Law School. Tim, can you define hate 15 00:01:07,920 --> 00:01:09,000 Speaker 2: speech for us? 16 00:01:09,480 --> 00:01:12,640 Speaker 3: Well, it doesn't have a definition in US law or 17 00:01:12,720 --> 00:01:17,400 Speaker 3: First Amendment jurisprudence. There's no category of hate speech that 18 00:01:17,560 --> 00:01:22,039 Speaker 3: is unprotected under the First Amendment. That's in contrast to 19 00:01:22,319 --> 00:01:26,720 Speaker 3: European countries and other countries that do have statutory proscriptions 20 00:01:26,760 --> 00:01:32,320 Speaker 3: on speech that denigrates or criticizes people based on gender 21 00:01:32,880 --> 00:01:35,920 Speaker 3: or race or some other protected characteristic. But as the 22 00:01:35,920 --> 00:01:39,840 Speaker 3: Attorney General should know, in the United States, in general, 23 00:01:39,880 --> 00:01:43,000 Speaker 3: hate speech is not criminally proscribable. 24 00:01:43,400 --> 00:01:46,600 Speaker 2: The Supreme Court has protected hate speech in more than 25 00:01:46,680 --> 00:01:50,160 Speaker 2: one case. The one that stands out in my mind 26 00:01:50,480 --> 00:01:54,920 Speaker 2: is Brandenburg versus Ohio, the case involving the Nazi Party 27 00:01:55,080 --> 00:01:58,360 Speaker 2: marching in Skokie, Illinois, in nineteen seventy seven. 28 00:01:58,840 --> 00:02:01,880 Speaker 3: Well, the Supreme Court, in that case and others, has 29 00:02:01,920 --> 00:02:05,120 Speaker 3: come down on the side of freedom of expression, right, 30 00:02:05,200 --> 00:02:09,200 Speaker 3: in the sense that the government cannot criminalize or otherwise 31 00:02:09,320 --> 00:02:13,160 Speaker 3: punish the expression of viewpoints, even if those viewpoints are 32 00:02:13,280 --> 00:02:19,240 Speaker 3: offensive or vile or derogatory. Right. So even speech in 33 00:02:19,360 --> 00:02:23,200 Speaker 3: support of Nazism is, as a general matter, protected speech.
34 00:02:23,440 --> 00:02:27,800 Speaker 3: Speech that offends people based on race or gender or 35 00:02:27,840 --> 00:02:33,200 Speaker 3: sexual orientation, that's also protected speech. And the Supreme Court 36 00:02:33,280 --> 00:02:38,000 Speaker 3: has been consistent in drawing that line where it has, 37 00:02:38,040 --> 00:02:40,840 Speaker 3: in the sense that, you know, whether it's Nazis marching 38 00:02:40,880 --> 00:02:43,680 Speaker 3: in Skokie. That case reached the Supreme Court, but it's actually 39 00:02:43,680 --> 00:02:46,280 Speaker 3: a lower court case that decided, look, the town of 40 00:02:46,280 --> 00:02:49,200 Speaker 3: Skokie cannot enact all these ordinances to try and prevent 41 00:02:49,280 --> 00:02:53,240 Speaker 3: Nazis from marching or displaying Nazi regalia. The Supreme Court 42 00:02:53,280 --> 00:02:58,040 Speaker 3: protects viewpoints even if they're vile. There are some narrow exceptions, right. 43 00:02:58,040 --> 00:03:01,600 Speaker 3: If you threaten another person with bodily harm or death, if you 44 00:03:01,720 --> 00:03:05,799 Speaker 3: incite other people to engage in imminent unlawful activity that's 45 00:03:05,919 --> 00:03:08,760 Speaker 3: likely to occur, those sorts of things are not protected. 46 00:03:08,919 --> 00:03:12,639 Speaker 3: The government cannot have the power to tell an audience 47 00:03:12,720 --> 00:03:16,760 Speaker 3: in the United States what speech is appropriate or too 48 00:03:16,800 --> 00:03:18,200 Speaker 3: offensive to be heard. 49 00:03:18,400 --> 00:03:21,720 Speaker 2: But the Court has recognized an exception to the First 50 00:03:21,760 --> 00:03:25,600 Speaker 2: Amendment for threats of violence. How are they defined? 51 00:03:26,440 --> 00:03:29,919 Speaker 3: That's a narrow exception to First Amendment protection. Right, So 52 00:03:30,160 --> 00:03:33,040 Speaker 3: if you communicate what the Court has defined as a 53 00:03:33,160 --> 00:03:36,920 Speaker 3: serious expression of an intent to inflict bodily harm or 54 00:03:37,040 --> 00:03:40,960 Speaker 3: death on another person, then you can be punished for 55 00:03:41,120 --> 00:03:44,440 Speaker 3: that kind of speech. But the narrowness here is sort 56 00:03:44,440 --> 00:03:46,760 Speaker 3: of like, you know, it has to be a serious expression. 57 00:03:46,760 --> 00:03:48,800 Speaker 3: It can't be something said in jest. It can't be 58 00:03:49,360 --> 00:03:53,240 Speaker 3: hyperbolic language where you say, well, this person should be 59 00:03:53,320 --> 00:03:55,760 Speaker 3: hung for their crimes, right, that sort of thing. It 60 00:03:55,760 --> 00:03:59,520 Speaker 3: has to be more directed, more specific, and, as the 61 00:03:59,560 --> 00:04:03,840 Speaker 3: Supreme Court has recently said, uttered recklessly, that, you know, there's 62 00:04:03,880 --> 00:04:06,520 Speaker 3: a risk when you say the words that a person 63 00:04:06,560 --> 00:04:08,800 Speaker 3: will perceive what you're saying as threatening, but you say 64 00:04:08,840 --> 00:04:12,520 Speaker 3: it anyway. So it's not just any threatening language that's 65 00:04:12,840 --> 00:04:16,920 Speaker 3: unprotected speech. It's something far more specific than that.
And 66 00:04:17,200 --> 00:04:19,440 Speaker 3: when the Attorney General said, well, what I meant to say 67 00:04:19,440 --> 00:04:22,279 Speaker 3: is it wasn't hate speech, really, it was threats. Well, none 68 00:04:22,320 --> 00:04:25,599 Speaker 3: of the speech that we've been talking about since Charlie 69 00:04:25,680 --> 00:04:30,400 Speaker 3: Kirk's assassination, you know, constitutes threats. Right, when you praise 70 00:04:30,520 --> 00:04:33,560 Speaker 3: or celebrate someone's death, that's not a threat. So you know, 71 00:04:33,640 --> 00:04:35,719 Speaker 3: she got it wrong twice, essentially. 72 00:04:36,480 --> 00:04:40,000 Speaker 4: She also said in her explanation, you can't call for 73 00:04:40,120 --> 00:04:44,120 Speaker 4: someone's murder. You cannot swat a member of Congress, you 74 00:04:44,200 --> 00:04:47,320 Speaker 4: cannot dox a conservative family and think it will be 75 00:04:47,360 --> 00:04:51,120 Speaker 4: brushed off as free speech. These acts are punishable crimes, 76 00:04:51,520 --> 00:04:53,799 Speaker 4: and every single threat will be met with the full 77 00:04:53,839 --> 00:04:54,799 Speaker 4: force of the law. 78 00:04:55,279 --> 00:04:58,159 Speaker 2: Are all the things she mentioned punishable? 79 00:04:58,200 --> 00:05:01,000 Speaker 3: Well, some of them are protected, right. It depends on 80 00:05:01,040 --> 00:05:03,200 Speaker 3: the sort of statute that you're looking at, you know, 81 00:05:03,240 --> 00:05:08,400 Speaker 3: how narrowly it's defining harassment, for example, or threats. Calling 82 00:05:08,480 --> 00:05:12,560 Speaker 3: for the murder of someone is not incitement. It is 83 00:05:12,600 --> 00:05:13,360 Speaker 3: not a threat. 84 00:05:14,120 --> 00:05:14,320 Speaker 1: Right. 85 00:05:14,600 --> 00:05:17,560 Speaker 3: I wish, you know, so and so would die is 86 00:05:17,560 --> 00:05:20,400 Speaker 3: a terrible thing to think, a terrible thing to say. 87 00:05:21,120 --> 00:05:25,440 Speaker 3: But it's not unprotected expression under our First Amendment doctrines 88 00:05:25,560 --> 00:05:29,640 Speaker 3: and jurisprudence. Yes, there's conduct that you can go after. 89 00:05:29,760 --> 00:05:33,960 Speaker 3: If I repeatedly harass someone, whether it's online or offline, 90 00:05:34,520 --> 00:05:37,080 Speaker 3: then that can rise to the level of harassment. But 91 00:05:37,120 --> 00:05:39,240 Speaker 3: there I'm not being punished for my expression. I'm being 92 00:05:39,279 --> 00:05:44,600 Speaker 3: punished for the act of repetitious harassment of another. And 93 00:05:44,720 --> 00:05:50,400 Speaker 3: doxing is difficult, right, because just publishing information about, say, 94 00:05:50,440 --> 00:05:55,520 Speaker 3: where someone lives, is not necessarily unprotected speech, right. A 95 00:05:55,560 --> 00:05:57,680 Speaker 3: lot depends on the context. And again, as I said, 96 00:05:57,680 --> 00:06:00,279 Speaker 3: it's the statute under which you're reviewing it. 97 00:06:00,320 --> 00:06:03,159 Speaker 2: And wasn't there a Supreme Court case a few years 98 00:06:03,160 --> 00:06:07,240 Speaker 2: ago involving threats on the internet? 99 00:06:07,480 --> 00:06:11,279 Speaker 3: Counterman versus Colorado. Yeah, that was this very recent threats 100 00:06:11,279 --> 00:06:14,680 Speaker 3: case the Supreme Court handed down.
There was a singer 101 00:06:14,839 --> 00:06:21,359 Speaker 3: who received some uninvited online messages and tried to block the 102 00:06:21,400 --> 00:06:24,560 Speaker 3: person from contacting her. He just opened new accounts and 103 00:06:24,640 --> 00:06:29,599 Speaker 3: kept contacting her, and eventually this person was prosecuted for 104 00:06:29,600 --> 00:06:32,040 Speaker 3: a form of harassment, which is really how the case sort 105 00:06:32,040 --> 00:06:35,160 Speaker 3: of was set up, but the court below and then 106 00:06:35,160 --> 00:06:37,680 Speaker 3: the Supreme Court treated it as raising the question of 107 00:06:37,720 --> 00:06:41,440 Speaker 3: whether this person had communicated what are called true threats, 108 00:06:41,520 --> 00:06:45,640 Speaker 3: as I described earlier, serious expressions of an intent to 109 00:06:45,760 --> 00:06:49,400 Speaker 3: cause bodily injury or death to another. And what the 110 00:06:49,440 --> 00:06:52,280 Speaker 3: court was wrestling with in that case is the mental 111 00:06:52,360 --> 00:06:55,520 Speaker 3: state of the speaker. Well, how do I know whether 112 00:06:55,560 --> 00:06:58,279 Speaker 3: the person's communicating a threat? And they say, if the 113 00:06:58,360 --> 00:07:02,240 Speaker 3: person knows of a substantial risk that the person he 114 00:07:02,360 --> 00:07:06,240 Speaker 3: is communicating with will perceive the speech as threatening, 115 00:07:06,800 --> 00:07:09,359 Speaker 3: then that's the kind of recklessness that the First Amendment 116 00:07:09,360 --> 00:07:13,320 Speaker 3: requires before you label something a true threat. So the 117 00:07:13,360 --> 00:07:16,320 Speaker 3: court was grappling with a really important sort of technical 118 00:07:16,360 --> 00:07:18,880 Speaker 3: issue in that case, which was the mental state required 119 00:07:18,880 --> 00:07:22,440 Speaker 3: of the speaker, and a number of courts before that 120 00:07:22,640 --> 00:07:26,040 Speaker 3: had sort of adopted this subjective test. Well, if I'm 121 00:07:26,080 --> 00:07:29,000 Speaker 3: the audience for that speech and I perceive it subjectively 122 00:07:29,080 --> 00:07:32,600 Speaker 3: as threatening, that should be enough. And the court was worried, well, 123 00:07:32,600 --> 00:07:36,560 Speaker 3: that's not speech protective enough. That's going to cause misunderstandings 124 00:07:36,600 --> 00:07:40,720 Speaker 3: to be translated into criminalized threats. We don't want that, 125 00:07:41,600 --> 00:07:44,080 Speaker 3: but we also don't want a sort of lower standard, 126 00:07:44,200 --> 00:07:46,040 Speaker 3: so let's find something in the middle, sort of a 127 00:07:46,040 --> 00:07:49,760 Speaker 3: Goldilocks standard. And they settled on recklessness. 128 00:07:50,280 --> 00:07:52,760 Speaker 4: And so the Supreme Court has been, would you say, 129 00:07:52,840 --> 00:07:57,000 Speaker 4: particularly protective of free speech rights? 130 00:07:57,400 --> 00:08:00,280 Speaker 3: I think that's its reputation, right. I could probably come 131 00:08:00,360 --> 00:08:03,440 Speaker 3: up with exceptions to that, but in general, I think 132 00:08:03,440 --> 00:08:07,240 Speaker 3: it's fair to say that they're protective of freedom of speech. 133 00:08:07,320 --> 00:08:08,800 Speaker 1: Yet we have, you 134 00:08:08,760 --> 00:08:12,560 Speaker 4: know, the Attorney General's remarks.
We had Todd Blanche, the 135 00:08:12,600 --> 00:08:17,240 Speaker 4: Deputy Attorney General, saying in an interview that people protesting 136 00:08:17,280 --> 00:08:20,640 Speaker 4: while President Trump had dinner at a restaurant might have 137 00:08:20,720 --> 00:08:25,880 Speaker 4: committed a crime. You have President Trump saying to an ABC 138 00:08:26,200 --> 00:08:29,559 Speaker 4: News reporter, we'll probably go after people like you because 139 00:08:29,640 --> 00:08:33,000 Speaker 4: you treat me so unfairly. It's hate. Why is there 140 00:08:33,080 --> 00:08:36,080 Speaker 4: this misunderstanding of hate speech? 141 00:08:37,280 --> 00:08:39,719 Speaker 3: Well, I don't know if it's a misunderstanding. I mean, 142 00:08:39,760 --> 00:08:43,040 Speaker 3: this has been sort of President Trump's longstanding position. 143 00:08:43,200 --> 00:08:43,360 Speaker 4: Right. 144 00:08:43,360 --> 00:08:47,760 Speaker 3: He either doesn't understand or doesn't appreciate freedom of expression. 145 00:08:47,920 --> 00:08:52,320 Speaker 3: So his view is that negative press isn't protected. You 146 00:08:52,360 --> 00:08:57,440 Speaker 3: can pull the broadcast license of a broadcaster that publishes 147 00:08:57,840 --> 00:09:02,360 Speaker 3: critical coverage of him, if it's consistently negative, right. That, of course, 148 00:09:02,440 --> 00:09:05,920 Speaker 3: is contrary to the First Amendment. Going after your political 149 00:09:06,000 --> 00:09:09,360 Speaker 3: enemies for things that they say is part of the 150 00:09:09,400 --> 00:09:13,760 Speaker 3: sort of Trump mantra, but it is unconstitutional. And you 151 00:09:13,760 --> 00:09:16,079 Speaker 3: know what's interesting to me is recently people have said, oh, 152 00:09:16,080 --> 00:09:19,840 Speaker 3: we've crossed some line here where the president is threatening 153 00:09:19,880 --> 00:09:23,520 Speaker 3: retribution at his political enemies. We are nine months into 154 00:09:23,559 --> 00:09:27,719 Speaker 3: a retribution campaign. It's gotten louder, but it's been there 155 00:09:27,760 --> 00:09:30,280 Speaker 3: the whole time. I mean, they've gone after law firms, 156 00:09:30,800 --> 00:09:36,120 Speaker 3: international students, the American Bar Association, and plenty of others 157 00:09:36,360 --> 00:09:39,800 Speaker 3: up to this point. What's different is it's more explicit, 158 00:09:39,880 --> 00:09:42,960 Speaker 3: I suppose one could say, and the drumbeat is getting louder. 159 00:09:43,000 --> 00:09:46,600 Speaker 3: We're going to go after particularly so called left leaning 160 00:09:47,120 --> 00:09:50,680 Speaker 3: speakers or organizations who say things that we don't like, 161 00:09:51,040 --> 00:09:55,720 Speaker 3: and the First Amendment stands in complete opposition to that position. 162 00:09:56,559 --> 00:10:00,880 Speaker 3: So what's changed? I mean, the Kirk assassination, a horrific event, 163 00:10:02,120 --> 00:10:05,240 Speaker 3: bound to create, you know, sort of churn and backlash, 164 00:10:05,360 --> 00:10:09,680 Speaker 3: but the administration's answer to that has been, again, we're 165 00:10:09,679 --> 00:10:12,960 Speaker 3: going to go after the left, so called, and we're 166 00:10:12,960 --> 00:10:17,560 Speaker 3: going to punish speakers who say nasty things about Charlie 167 00:10:17,640 --> 00:10:21,960 Speaker 3: Kirk or President Trump, and the First Amendment just simply 168 00:10:21,960 --> 00:10:23,200 Speaker 3: doesn't allow them to do that.
169 00:10:23,480 --> 00:10:26,240 Speaker 2: Coming up next on the Bloomberg Law Show, I'll continue 170 00:10:26,280 --> 00:10:30,000 Speaker 2: this conversation with Professor Timothy Zick of William and Mary 171 00:10:30,080 --> 00:10:34,319 Speaker 2: Law School. People across the country, from teachers and lawyers 172 00:10:34,360 --> 00:10:39,520 Speaker 2: to airline pilots and healthcare workers, have been fired, suspended, 173 00:10:39,720 --> 00:10:43,680 Speaker 2: or disciplined over social media posts about Charlie Kirk and 174 00:10:43,760 --> 00:10:47,720 Speaker 2: his death. Why the First Amendment doesn't protect them. In 175 00:10:47,800 --> 00:10:50,840 Speaker 2: other legal news today, the Supreme Court said it will 176 00:10:50,840 --> 00:10:55,040 Speaker 2: hear a Trump administration appeal that could topple a ninety 177 00:10:55,120 --> 00:10:58,359 Speaker 2: year old precedent and put the White House in control 178 00:10:58,480 --> 00:11:03,120 Speaker 2: of federal agencies that have long been independent. The Court's 179 00:11:03,160 --> 00:11:07,000 Speaker 2: conservative majority also refused to let the person at the 180 00:11:07,040 --> 00:11:11,360 Speaker 2: center of the case, Federal Trade Commission Member Rebecca Kelly Slaughter, 181 00:11:11,840 --> 00:11:15,560 Speaker 2: return to her job during the appeal. Trump is trying 182 00:11:15,600 --> 00:11:19,280 Speaker 2: to fire Slaughter despite a law that says commissioners can 183 00:11:19,320 --> 00:11:24,679 Speaker 2: be removed only for specified reasons. The showdown gives conservatives 184 00:11:24,679 --> 00:11:28,800 Speaker 2: and regulation opponents the chance to achieve a long sought 185 00:11:28,840 --> 00:11:33,280 Speaker 2: goal of overturning the Supreme Court's nineteen thirty five ruling. 186 00:11:33,720 --> 00:11:36,079 Speaker 2: And remember, you can always get the latest legal news 187 00:11:36,080 --> 00:11:39,360 Speaker 2: by listening to our Bloomberg Law podcast wherever you get 188 00:11:39,360 --> 00:11:43,040 Speaker 2: your favorite podcasts. I'm June Grosso and this is Bloomberg. 189 00:11:43,679 --> 00:11:47,840 Speaker 2: Last week, Vice President JD Vance encouraged people to 190 00:11:47,960 --> 00:11:52,600 Speaker 2: report anyone celebrating Charlie Kirk's murder to their employers. 191 00:11:53,360 --> 00:11:56,520 Speaker 1: So when you see someone celebrating Charlie's murder, call them 192 00:11:56,520 --> 00:11:58,400 Speaker 1: out and, hell, call their employer. 193 00:11:58,920 --> 00:12:01,600 Speaker 3: We don't believe in political violence, but we do believe 194 00:12:01,720 --> 00:12:02,640 Speaker 3: in civility. 195 00:12:03,120 --> 00:12:06,720 Speaker 2: And across the country, people have been fired, suspended, or 196 00:12:06,800 --> 00:12:11,080 Speaker 2: disciplined over social media posts about Kirk, from teachers and 197 00:12:11,200 --> 00:12:15,960 Speaker 2: lawyers to airline pilots and healthcare workers. Many employers have 198 00:12:16,120 --> 00:12:20,240 Speaker 2: cracked down on remarks they deem inappropriate. I've been talking 199 00:12:20,240 --> 00:12:23,520 Speaker 2: to Professor Timothy Zick of William and Mary Law School. 200 00:12:24,400 --> 00:12:28,520 Speaker 3: The rules are different for private and government speakers.
Right. 201 00:12:28,559 --> 00:12:31,160 Speaker 3: So if you're talking about a private employee, an at 202 00:12:31,200 --> 00:12:34,240 Speaker 3: will employee who can be dismissed for any reason at 203 00:12:34,240 --> 00:12:38,319 Speaker 3: all or no reason, then they can be dismissed for 204 00:12:38,840 --> 00:12:42,560 Speaker 3: speech that they publish or communicate. There are only a 205 00:12:42,559 --> 00:12:47,240 Speaker 3: few states where you get some statutory protection for political speech, 206 00:12:48,360 --> 00:12:52,280 Speaker 3: but in general, you speak at your peril with respect 207 00:12:52,360 --> 00:12:57,760 Speaker 3: to private employment. Public employment is very different. Public employees 208 00:12:57,760 --> 00:13:02,320 Speaker 3: retain some First Amendment rights as citizens to speak on 209 00:13:02,640 --> 00:13:06,800 Speaker 3: what are called matters of public concern, newsworthy matters, which 210 00:13:06,840 --> 00:13:10,240 Speaker 3: certainly covers the speech that has been sort of debated 211 00:13:11,400 --> 00:13:16,640 Speaker 3: post Charlie Kirk's murder. And it's complicated. Right, So, 212 00:13:17,000 --> 00:13:20,000 Speaker 3: if you're a public employee and you say something offensive, 213 00:13:20,120 --> 00:13:23,880 Speaker 3: let's say you praise Charlie Kirk's murder, and you're a 214 00:13:24,000 --> 00:13:28,240 Speaker 3: university professor, let's say, and your employer says, well, I'm 215 00:13:28,280 --> 00:13:31,079 Speaker 3: going to terminate your employment. Well, putting aside the sort 216 00:13:31,120 --> 00:13:35,640 Speaker 3: of tenure and academic freedom problems there, as a public employee, 217 00:13:35,920 --> 00:13:39,880 Speaker 3: you have a First Amendment right to communicate that. But 218 00:13:40,000 --> 00:13:43,200 Speaker 3: the Supreme Court has said, what you get as a 219 00:13:43,200 --> 00:13:46,079 Speaker 3: public employee if you speak on matters of public concern 220 00:13:46,600 --> 00:13:48,440 Speaker 3: is a balance. We're going to balance your right to 221 00:13:48,520 --> 00:13:54,000 Speaker 3: speak against the employer's interest in efficient operations. So across 222 00:13:54,040 --> 00:13:57,760 Speaker 3: a range of public employment, what you might find in 223 00:13:57,760 --> 00:13:59,960 Speaker 3: some cases is that courts will side with the employer. 224 00:14:00,160 --> 00:14:03,840 Speaker 3: What you said was so offensive it created disruption in 225 00:14:03,840 --> 00:14:07,520 Speaker 3: the workplace, and we're not required to tolerate that. The 226 00:14:07,520 --> 00:14:10,480 Speaker 3: First Amendment doesn't require that we tolerate that. So it 227 00:14:10,520 --> 00:14:13,600 Speaker 3: can be complicated with respect to public employment, but it's 228 00:14:13,720 --> 00:14:15,640 Speaker 3: much simpler with regard to private. 229 00:14:15,920 --> 00:14:19,120 Speaker 4: The Supreme Court has taken a lot of cases involving 230 00:14:19,160 --> 00:14:22,360 Speaker 4: the Trump administration on the emergency docket, but none of 231 00:14:22,400 --> 00:14:26,720 Speaker 4: them, that I can recall, involved a free speech issue. 232 00:14:27,000 --> 00:14:29,240 Speaker 4: Are you confident that the Court, if one of these 233 00:14:29,320 --> 00:14:33,000 Speaker 4: cases on hate speech came up to the court, that 234 00:14:33,080 --> 00:14:34,880 Speaker 4: they would stick by their precedent? 235 00:14:35,400 --> 00:14:39,040 Speaker 3: I think they would.
I think, you know, the likelihood 236 00:14:39,240 --> 00:14:41,880 Speaker 3: of the Court taking a First Amendment case out of 237 00:14:41,920 --> 00:14:45,240 Speaker 3: what I'll call the Trump two point zero era, it's 238 00:14:45,320 --> 00:14:48,720 Speaker 3: relatively high. It's not clear yet what the administration intends 239 00:14:48,800 --> 00:14:53,160 Speaker 3: to do with respect to so called hate speech investigations 240 00:14:53,240 --> 00:14:56,920 Speaker 3: or prosecutions. Mostly what they're doing is threatening to 241 00:14:56,960 --> 00:15:00,720 Speaker 3: investigate people for core political speech. We wouldn't even be 242 00:15:00,760 --> 00:15:03,240 Speaker 3: talking about hate speech. It would be more, you know, 243 00:15:03,280 --> 00:15:07,240 Speaker 3: I'm going to go after George Soros's organization because it 244 00:15:07,320 --> 00:15:12,600 Speaker 3: supports left wing positions or it funds left wing political 245 00:15:12,680 --> 00:15:15,440 Speaker 3: activism, which is clearly unconstitutional. I don't even know if the 246 00:15:15,440 --> 00:15:18,680 Speaker 3: Supreme Court would be interested in a case like that. 247 00:15:18,760 --> 00:15:22,400 Speaker 3: I'm assuming a lower court would say that's unconstitutional. But 248 00:15:22,440 --> 00:15:24,920 Speaker 3: there are cases in the First Amendment realm that may 249 00:15:25,000 --> 00:15:27,920 Speaker 3: make it to the Court, some of them involving maybe 250 00:15:28,040 --> 00:15:31,360 Speaker 3: the rights of non citizens under the First Amendment, which 251 00:15:31,400 --> 00:15:34,080 Speaker 3: the Court has been unclear about, if they want to 252 00:15:34,120 --> 00:15:37,560 Speaker 3: clarify that. Some of the university cases, the Harvard case, 253 00:15:37,600 --> 00:15:42,120 Speaker 3: for example, where the administration is terminating funds, the 254 00:15:42,200 --> 00:15:46,920 Speaker 3: university says, based on their speech, and the Court may 255 00:15:46,960 --> 00:15:48,920 Speaker 3: be interested in that. Or maybe there'll be a press 256 00:15:48,960 --> 00:15:52,320 Speaker 3: case, who knows, involving a broadcast license or something 257 00:15:52,360 --> 00:15:54,880 Speaker 3: like that. I can imagine the Court being interested in 258 00:15:55,120 --> 00:15:55,880 Speaker 3: those cases. 259 00:15:56,400 --> 00:16:00,200 Speaker 4: In those cases that you've mentioned, for example, the Harvard case, 260 00:16:00,280 --> 00:16:04,480 Speaker 4: let's just take one. Is it difficult to prove that 261 00:16:04,600 --> 00:16:10,080 Speaker 4: the administration is going after Harvard because of its speech? 262 00:16:11,000 --> 00:16:14,440 Speaker 3: Well, the administration is its own worst enemy with regard 263 00:16:14,480 --> 00:16:17,880 Speaker 3: to so called retaliation claims. Right, So one of the 264 00:16:17,920 --> 00:16:22,280 Speaker 3: things you asked is, well, why did the Trump administration 265 00:16:22,560 --> 00:16:27,040 Speaker 3: target Harvard for termination of funds? And let's add the 266 00:16:27,080 --> 00:16:30,560 Speaker 3: other ten investigations of Harvard. In fact, it can be 267 00:16:30,640 --> 00:16:33,080 Speaker 3: hard to prove, right, because the administration, to that, said, oh, no, 268 00:16:33,160 --> 00:16:36,560 Speaker 3: we did that because of antisemitism on campus.
And 269 00:16:36,880 --> 00:16:38,560 Speaker 3: you start looking into that and say, well, you never 270 00:16:38,640 --> 00:16:41,480 Speaker 3: held an investigation. You never found any facts. You didn't 271 00:16:41,480 --> 00:16:44,800 Speaker 3: follow the law. What I'm left with is, you know, 272 00:16:45,160 --> 00:16:48,280 Speaker 3: public statements by the Education Secretary, not to mention the 273 00:16:48,280 --> 00:16:50,600 Speaker 3: President of the United States, that what we need to 274 00:16:50,640 --> 00:16:54,080 Speaker 3: do is bring these universities to heel because they're too liberal, 275 00:16:54,840 --> 00:16:57,720 Speaker 3: because their culture is too liberal, it's too left leaning. 276 00:16:58,360 --> 00:17:01,040 Speaker 3: Well, that gives it away, I'm assuming, if you can take 277 00:17:01,080 --> 00:17:03,960 Speaker 3: the president's communications into account. But even if you can't, 278 00:17:04,440 --> 00:17:07,320 Speaker 3: the Education Secretary has said this is what conservatives have 279 00:17:07,400 --> 00:17:10,080 Speaker 3: wanted to do for a long time, to sort of 280 00:17:10,640 --> 00:17:14,960 Speaker 3: lean on these universities who are indoctrinating students with 281 00:17:15,080 --> 00:17:19,359 Speaker 3: left wing ideology. So it's all very explicit. Right? In 282 00:17:19,480 --> 00:17:23,760 Speaker 3: other contexts, in, say, past administrations or different governments, it 283 00:17:23,840 --> 00:17:27,320 Speaker 3: might not have been so explicit, but they're very, very transparent, 284 00:17:27,440 --> 00:17:29,880 Speaker 3: I think, the Trump administration, about what they're trying to do. 285 00:17:30,520 --> 00:17:33,800 Speaker 2: ABC is putting the Jimmy Kimmel Show back on the 286 00:17:33,840 --> 00:17:38,720 Speaker 2: air tomorrow night. But would you say that FCC Chair 287 00:17:38,840 --> 00:17:43,920 Speaker 2: Brendan Carr's statements about Kimmel would fit into that category 288 00:17:43,960 --> 00:17:45,120 Speaker 2: you were just describing? 289 00:17:46,200 --> 00:17:49,359 Speaker 3: The difficulty here is that Brendan Carr, the chair of 290 00:17:49,359 --> 00:17:55,040 Speaker 3: the Federal Communications Commission, went on a podcast and threatened 291 00:17:55,640 --> 00:17:59,600 Speaker 3: ABC if it didn't do something. He said, we can 292 00:17:59,640 --> 00:18:03,040 Speaker 3: do this the easy way or the hard way, and 293 00:18:03,119 --> 00:18:05,879 Speaker 3: he's clearly threatening their broadcast license or at least the 294 00:18:05,920 --> 00:18:10,080 Speaker 3: licenses of their affiliates. That's who holds the licenses. So 295 00:18:10,200 --> 00:18:13,240 Speaker 3: the government inserts itself. It's another one of these examples 296 00:18:13,280 --> 00:18:16,080 Speaker 3: where, you know, ordinarily it might be difficult to say 297 00:18:16,320 --> 00:18:18,240 Speaker 3: what role, if any, the government played. Well, here 298 00:18:18,240 --> 00:18:22,200 Speaker 3: he is on a podcast telling you, I'm going to abuse 299 00:18:22,280 --> 00:18:28,879 Speaker 3: the FCC's authority by jawboning and threatening licensed affiliates, 300 00:18:29,040 --> 00:18:32,520 Speaker 3: for speech that we, the Trump administration, 301 00:18:32,680 --> 00:18:37,120 Speaker 3: think is offensive with regard to Charlie Kirk. Now, they 302 00:18:37,240 --> 00:18:40,080 Speaker 3: pitched this as sort of misinformation, or he got it wrong.
303 00:18:40,760 --> 00:18:44,080 Speaker 3: Plenty of people got the facts wrong in the early going 304 00:18:44,640 --> 00:18:49,200 Speaker 3: with respect to Kirk's assailant, so that doesn't single Jimmy 305 00:18:49,240 --> 00:18:53,080 Speaker 3: Kimmel out, right. So there's this sort of agenda, 306 00:18:53,600 --> 00:18:57,679 Speaker 3: particularly in the broadcast realm, by Brendan Carr, the FCC, 307 00:18:58,600 --> 00:19:02,919 Speaker 3: and other agencies to come down on broadcasters and the 308 00:19:03,000 --> 00:19:08,160 Speaker 3: press, and to do so explicitly because of their coverage, 309 00:19:08,560 --> 00:19:13,080 Speaker 3: whether it's the editorial decisions they make as reporters or, 310 00:19:13,119 --> 00:19:16,520 Speaker 3: in this case, comments about a matter of public concern. 311 00:19:17,119 --> 00:19:21,360 Speaker 3: And it's inconsistent with law, federal law, for the FCC 312 00:19:21,640 --> 00:19:24,800 Speaker 3: to intervene in that respect, and it violates the First Amendment. 313 00:19:25,560 --> 00:19:28,920 Speaker 2: And are there other business interests at work in these 314 00:19:29,000 --> 00:19:30,320 Speaker 2: cases as well? 315 00:19:31,000 --> 00:19:33,320 Speaker 3: And there's another twist to this, there's another layer. And 316 00:19:33,400 --> 00:19:36,240 Speaker 3: when you talked about the affiliates who do hold the licenses, 317 00:19:37,080 --> 00:19:40,400 Speaker 3: the corporations that own those affiliates, one of them has 318 00:19:40,400 --> 00:19:43,520 Speaker 3: a big merger application pending with the Trump administration. And 319 00:19:43,560 --> 00:19:46,439 Speaker 3: guess what, I really want that merger to go through. 320 00:19:47,359 --> 00:19:49,880 Speaker 3: I better play ball. I better do it the easy way. 321 00:19:51,400 --> 00:19:55,119 Speaker 3: So it's layers of leverage here. It's not just Brendan Carr, 322 00:19:55,160 --> 00:19:57,760 Speaker 3: you know, mouthing off about, you know, the easy way or 323 00:19:57,800 --> 00:20:00,160 Speaker 3: the hard way. It's behind the scenes. You've got people who 324 00:20:00,200 --> 00:20:07,480 Speaker 3: are interested in big time corporate mergers with these affiliates, 325 00:20:07,560 --> 00:20:11,199 Speaker 3: and their concern is, boy, I better not cross some 326 00:20:11,680 --> 00:20:14,160 Speaker 3: line, if I can figure out where it is, with 327 00:20:14,240 --> 00:20:16,159 Speaker 3: regard to the Trump administration, or I won't get my 328 00:20:16,240 --> 00:20:21,280 Speaker 3: merger. And that's already happened. Paramount is the other example there. Right, 329 00:20:21,359 --> 00:20:25,600 Speaker 3: they did it the easy way. They played ball and 330 00:20:25,680 --> 00:20:31,080 Speaker 3: they got their merger through. So it's, it's very, it's 331 00:20:31,200 --> 00:20:34,680 Speaker 3: very mafia like, as people have described it. Right, boy, 332 00:20:34,800 --> 00:20:36,919 Speaker 3: this is a really nice restaurant, it'd be a shame if 333 00:20:36,960 --> 00:20:40,560 Speaker 3: you lost it. It's kind of like that, 334 00:20:42,200 --> 00:20:44,919 Speaker 3: but even more explicit. But you're right, of course, you know, 335 00:20:44,920 --> 00:20:49,480 Speaker 3: when it comes to comedy, people have been skewering presidents, 336 00:20:49,640 --> 00:20:52,400 Speaker 3: you know, since we've had television and even before, 337 00:20:54,000 --> 00:20:56,720 Speaker 3: and they haven't been punished for it.
We are in a different, 338 00:20:57,040 --> 00:21:01,000 Speaker 3: darker place with regard to freedom of expression in the 339 00:21:01,119 --> 00:21:02,000 Speaker 3: United States. 340 00:21:02,480 --> 00:21:06,280 Speaker 2: And tell us about the repository where you're keeping all these 341 00:21:06,640 --> 00:21:09,160 Speaker 2: lawsuits concerning the First Amendment. 342 00:21:10,400 --> 00:21:14,159 Speaker 3: It's at First Amendment Watch, so they're hosting it. So 343 00:21:14,240 --> 00:21:17,480 Speaker 3: what I've done is I've taken all of the executive 344 00:21:17,560 --> 00:21:20,920 Speaker 3: orders that relate to freedom of expression. So I've got 345 00:21:20,960 --> 00:21:24,480 Speaker 3: all of those organized on the site by subject matter, 346 00:21:25,040 --> 00:21:29,119 Speaker 3: and then I have litigation with regard to the executive 347 00:21:29,200 --> 00:21:32,320 Speaker 3: orders and also Trump's lawsuits against the press. So all 348 00:21:32,320 --> 00:21:37,800 Speaker 3: the litigation and the pleadings, and then commentary, broadly speaking, right, 349 00:21:37,880 --> 00:21:40,119 Speaker 3: some of it's legal commentary, some of it's what I 350 00:21:40,200 --> 00:21:44,159 Speaker 3: read in the press. Right, I can't capture everything, but 351 00:21:44,280 --> 00:21:46,560 Speaker 3: if I see something, and I have people sending me 352 00:21:46,600 --> 00:21:50,639 Speaker 3: items, it ends up in the repository as well. And of 353 00:21:50,640 --> 00:21:54,960 Speaker 3: course, with regard to the Charlie Kirk murder and the 354 00:21:55,040 --> 00:21:58,240 Speaker 3: fallout from that, that's its own avalanche of stuff. 355 00:21:58,880 --> 00:22:01,399 Speaker 3: You know, the media is quite rightly focused on it, 356 00:22:01,480 --> 00:22:03,680 Speaker 3: but there is a lot of commentary and you can 357 00:22:03,800 --> 00:22:09,040 Speaker 3: really miss things that are going on. For example, and 358 00:22:09,240 --> 00:22:12,080 Speaker 3: I'm doing this now on Substack because I need another outlet, 359 00:22:12,680 --> 00:22:16,760 Speaker 3: a district court just invalidated the National Endowment for the 360 00:22:16,840 --> 00:22:20,080 Speaker 3: Arts' process for reviewing grant applications because they were weeding 361 00:22:20,080 --> 00:22:23,159 Speaker 3: them out based on whether they promoted gender ideology. 362 00:22:24,440 --> 00:22:30,960 Speaker 3: So there's another sort of viewpoint based unconstitutional Trump administration policy. 363 00:22:31,600 --> 00:22:35,160 Speaker 3: The zone is just so flooded. They are pressing First 364 00:22:35,200 --> 00:22:38,280 Speaker 3: Amendment boundaries on purpose. They want to see how far 365 00:22:38,359 --> 00:22:41,600 Speaker 3: they can go, and even when they lose in court, 366 00:22:41,760 --> 00:22:45,280 Speaker 3: I think they just say, well, shrug, it serves its 367 00:22:45,359 --> 00:22:49,680 Speaker 3: purpose anyway, because we're scaring people, we're chilling them. They're 368 00:22:49,680 --> 00:22:52,520 Speaker 3: going to self censor. I think that's happening to the press. 369 00:22:52,720 --> 00:22:54,680 Speaker 3: You know, when I read headlines, there are times 370 00:22:54,680 --> 00:22:57,760 Speaker 3: they keep changing them and they get friendlier and friendlier 371 00:22:58,200 --> 00:23:01,440 Speaker 3: to the sort of right wing, almost as a way 372 00:23:01,440 --> 00:23:03,719 Speaker 3: to sort of say, don't look over here.
Well, they 373 00:23:03,760 --> 00:23:06,960 Speaker 3: get sued anyway. The New York Times just got sued 374 00:23:07,200 --> 00:23:11,159 Speaker 3: for ten billion dollars, fifteen billion, I forget the ridiculous number, 375 00:23:11,320 --> 00:23:15,560 Speaker 3: as did Penguin Press, because essentially they didn't report accurately 376 00:23:15,640 --> 00:23:16,919 Speaker 3: how popular Trump is. 377 00:23:17,720 --> 00:23:20,040 Speaker 2: Didn't a judge throw out that lawsuit? 378 00:23:20,880 --> 00:23:23,920 Speaker 3: Well, he threw it out preliminarily, right. Your complaint has 379 00:23:23,960 --> 00:23:28,560 Speaker 3: to be succinct and state your claims, and it shouldn't 380 00:23:28,600 --> 00:23:31,919 Speaker 3: be full of, you know, basically all the lathering of 381 00:23:31,960 --> 00:23:34,720 Speaker 3: Trump that this one was. So he gave him another chance. 382 00:23:34,720 --> 00:23:37,199 Speaker 3: He said, you know, you have about a month to 383 00:23:37,320 --> 00:23:41,240 Speaker 3: turn in a complaint that meets Rule eight of 384 00:23:41,440 --> 00:23:44,439 Speaker 3: the Federal Rules of Civil Procedure, which says here's what 385 00:23:44,480 --> 00:23:48,720 Speaker 3: a complaint should include and nothing further. So, yeah, he 386 00:23:48,800 --> 00:23:50,560 Speaker 3: threw it out, but it'll come back. But I think 387 00:23:50,600 --> 00:23:54,040 Speaker 3: eventually Trump will lose, as he loses every one of 388 00:23:54,080 --> 00:23:56,240 Speaker 3: these cases. And again, I don't think the point is 389 00:23:56,280 --> 00:23:58,399 Speaker 3: to win the case. He doesn't need the money. He 390 00:23:58,440 --> 00:24:01,760 Speaker 3: hasn't suffered reputational damage. But once you're sued for 391 00:24:01,800 --> 00:24:05,040 Speaker 3: fifteen billion dollars, you still have to defend yourself, and 392 00:24:05,080 --> 00:24:07,159 Speaker 3: if you can bring a settlement out of The New 393 00:24:07,240 --> 00:24:09,800 Speaker 3: York Times and or Penguin, all the better. 394 00:24:10,400 --> 00:24:13,600 Speaker 2: A lot more papers for your repository. Tim, thanks so much. 395 00:24:14,040 --> 00:24:17,359 Speaker 2: That's Professor Timothy Zick of William and Mary Law School. 396 00:24:17,680 --> 00:24:21,880 Speaker 2: Coming up next: parents are suing, alleging that AI chatbots 397 00:24:21,960 --> 00:24:27,840 Speaker 2: are responsible for the suicides of their teenagers. This is Bloomberg. 398 00:24:27,920 --> 00:24:31,760 Speaker 2: In the United States, more than seventy percent of teenagers 399 00:24:32,119 --> 00:24:37,240 Speaker 2: have used AI chatbots for companionship, and half use them regularly, 400 00:24:37,600 --> 00:24:41,560 Speaker 2: according to a recent study from Common Sense Media. And 401 00:24:41,760 --> 00:24:47,119 Speaker 2: last week, parents whose teenagers committed suicide after interactions with 402 00:24:47,359 --> 00:24:52,679 Speaker 2: artificial intelligence chatbots testified to Congress about the dangers of 403 00:24:52,720 --> 00:24:57,080 Speaker 2: the technology, and several parents have sued the companies behind 404 00:24:57,200 --> 00:25:01,280 Speaker 2: chatbots over the suicides of their teenagers. Joining me is 405 00:25:01,320 --> 00:25:07,080 Speaker 2: Colin Walke, a cybersecurity and data privacy partner at Hall Estill. Colin,
406 00:25:07,160 --> 00:25:11,119 Speaker 2: The latest lawsuit was filed by the parents of a 407 00:25:11,200 --> 00:25:16,119 Speaker 2: thirteen year old who committed suicide, and it starts: Invisible 408 00:25:16,200 --> 00:25:20,640 Speaker 2: monsters entered the home of Juliana Peralta in or around 409 00:25:20,800 --> 00:25:25,080 Speaker 2: August twenty twenty three, when she was only thirteen years old. 410 00:25:25,880 --> 00:25:29,080 Speaker 2: Tell us about this lawsuit. 411 00:25:29,200 --> 00:25:33,520 Speaker 1: Well, in particular, the allegations are that over the next 412 00:25:34,359 --> 00:25:37,199 Speaker 1: year or so, while the child was interacting with the 413 00:25:37,240 --> 00:25:44,159 Speaker 1: AI chatbot, it was not providing good advice in 414 00:25:44,280 --> 00:25:46,679 Speaker 1: terms of mental health and was acting as a friend 415 00:25:46,720 --> 00:25:50,400 Speaker 1: and actually helped assist and kind of coerce, if 416 00:25:50,440 --> 00:25:55,040 Speaker 1: you will, this child to commit suicide. And so the 417 00:25:55,119 --> 00:26:01,600 Speaker 1: lawsuit asserts that, in short, the AI chatbot facilitated her 418 00:26:01,640 --> 00:26:04,160 Speaker 1: suicide and the company should be held responsible for it. 419 00:26:04,960 --> 00:26:08,960 Speaker 2: This is not the first suit by parents connecting their 420 00:26:09,040 --> 00:26:14,240 Speaker 2: teenager's suicide to chatbots. How many of these wrongful death 421 00:26:14,280 --> 00:26:16,000 Speaker 2: lawsuits are there so far? 422 00:26:16,680 --> 00:26:18,560 Speaker 1: So there are at least three or four that I 423 00:26:18,600 --> 00:26:21,480 Speaker 1: am aware of, filed in various jurisdictions, and all of 424 00:26:21,520 --> 00:26:25,240 Speaker 1: the allegations are very similar. And in fact, in one 425 00:26:25,280 --> 00:26:30,199 Speaker 1: particular instance, the child suggested leaving the noose out on 426 00:26:30,320 --> 00:26:33,720 Speaker 1: his desk so that someone would try and stop him, 427 00:26:34,359 --> 00:26:37,679 Speaker 1: and the chatbot allegedly said, no, don't do that. We 428 00:26:37,720 --> 00:26:39,680 Speaker 1: want this room to be the first place where someone 429 00:26:39,840 --> 00:26:43,479 Speaker 1: finds you. So you can see how this type of 430 00:26:43,520 --> 00:26:47,760 Speaker 1: AI is certainly not helping the individual in that particular circumstance. 431 00:26:48,240 --> 00:26:51,359 Speaker 1: And the thing is, June, that we've known for 432 00:26:51,440 --> 00:26:54,000 Speaker 1: a long time that AI chatbots are not going to 433 00:26:54,000 --> 00:26:55,600 Speaker 1: tell us what we need to hear. They're going to 434 00:26:55,600 --> 00:26:58,359 Speaker 1: tell us what we want to hear. And that's the 435 00:26:58,359 --> 00:27:01,760 Speaker 1: most concerning part about this. They're unleashing products not 436 00:27:01,880 --> 00:27:04,639 Speaker 1: just to the public at large, but to youth and 437 00:27:04,840 --> 00:27:09,000 Speaker 1: children, without any true testing or guardrails being done to 438 00:27:09,080 --> 00:27:10,119 Speaker 1: ensure child safety. 439 00:27:10,600 --> 00:27:15,040 Speaker 2: In the case of a California teenager, the father apparently 440 00:27:15,800 --> 00:27:19,679 Speaker 2: knew that his son had made previous suicide attempts.
So 441 00:27:19,720 --> 00:27:23,640 Speaker 2: where does the parents' responsibility to monitor their teenagers fit 442 00:27:23,720 --> 00:27:24,560 Speaker 2: in this picture? 443 00:27:25,720 --> 00:27:28,600 Speaker 1: You hit upon a fantastic point, which is there is 444 00:27:28,680 --> 00:27:32,720 Speaker 1: absolutely a role for parental responsibility. I think there are 445 00:27:32,720 --> 00:27:36,119 Speaker 1: two problems in this particular case. The first problem that 446 00:27:36,160 --> 00:27:40,880 Speaker 1: you have is technological ignorance. Most parents don't understand how 447 00:27:40,880 --> 00:27:43,680 Speaker 1: their cell phone works, how their social media works, let 448 00:27:43,720 --> 00:27:46,879 Speaker 1: alone how AI works. And yet at the same time, 449 00:27:46,960 --> 00:27:49,240 Speaker 1: parents are allowing their children to utilize it, just like 450 00:27:49,359 --> 00:27:53,600 Speaker 1: parents themselves are utilizing it, again without understanding how it works. 451 00:27:54,000 --> 00:27:55,560 Speaker 1: But I think the second thing that you have to 452 00:27:55,600 --> 00:27:59,240 Speaker 1: think about here is that this is perfectly synonymous with 453 00:27:59,320 --> 00:28:02,320 Speaker 1: the development of social media. Right? So at the very 454 00:28:02,400 --> 00:28:07,640 Speaker 1: beginning, social media didn't necessarily test what the long term 455 00:28:07,640 --> 00:28:11,360 Speaker 1: consequences would be of developing algorithms that just fed us 456 00:28:11,400 --> 00:28:14,800 Speaker 1: what we wanted to see over time. They could have 457 00:28:15,000 --> 00:28:17,639 Speaker 1: changed that, but they knew that that would cut into 458 00:28:17,720 --> 00:28:20,359 Speaker 1: their profit margin and they didn't want to do that. Right? 459 00:28:20,840 --> 00:28:24,639 Speaker 1: It's the same thing here. We could be responsible and 460 00:28:24,760 --> 00:28:27,920 Speaker 1: test these products and tweak them in such a way 461 00:28:27,960 --> 00:28:31,040 Speaker 1: that hopefully they're a lot more secure and safe, both 462 00:28:31,040 --> 00:28:34,760 Speaker 1: for adults and children, than what we have. But instead, 463 00:28:35,000 --> 00:28:38,040 Speaker 1: because that would hurt profits, we've gone ahead and unleashed 464 00:28:38,080 --> 00:28:41,239 Speaker 1: all of this onto the market, expecting individuals to know 465 00:28:41,320 --> 00:28:44,400 Speaker 1: about the dangers, to know about the consequences. How many 466 00:28:44,480 --> 00:28:47,320 Speaker 1: parents using Life three sixty knew that I could go 467 00:28:47,400 --> 00:28:50,280 Speaker 1: on the internet and buy their children's geolocation data and 468 00:28:50,320 --> 00:28:52,840 Speaker 1: find out where their children are at? Very few people 469 00:28:52,920 --> 00:28:55,640 Speaker 1: knew that, and yet that was an exploitation. Same thing 470 00:28:55,680 --> 00:28:59,400 Speaker 1: here with AI. What parents believe their AI may be 471 00:28:59,520 --> 00:29:02,280 Speaker 1: saying to their children, or even particular chats that they 472 00:29:02,280 --> 00:29:05,200 Speaker 1: look up, may not be everything that the AI is saying. 473 00:29:05,280 --> 00:29:07,840 Speaker 1: And how do you know which AI programs, necessarily, that 474 00:29:07,920 --> 00:29:10,880 Speaker 1: child is accessing on their own?
So you're right, there 475 00:29:10,960 --> 00:29:13,960 Speaker 1: is absolutely a role for parental responsibility, but unfortunately this 476 00:29:14,080 --> 00:29:17,120 Speaker 1: technology has developed so quickly that most parents are ignorant 477 00:29:17,120 --> 00:29:18,360 Speaker 1: of the problems in the first place. 478 00:29:19,080 --> 00:29:23,400 Speaker 2: Can you sort of summarize where the law stands on 479 00:29:23,720 --> 00:29:26,560 Speaker 2: liability for AI right now? 480 00:29:26,600 --> 00:29:29,600 Speaker 1: What you're seeing with regard to any type of liability 481 00:29:29,600 --> 00:29:32,960 Speaker 1: in AI, whether it's copyright law, whether it's injuries to individuals, 482 00:29:32,960 --> 00:29:36,000 Speaker 1: Tesla vehicles, those sorts of things, one of the questions 483 00:29:36,120 --> 00:29:38,760 Speaker 1: is, are you, the company who puts this out there, 484 00:29:38,800 --> 00:29:42,080 Speaker 1: responsible for that? So, for example, in the social media world, 485 00:29:42,160 --> 00:29:44,680 Speaker 1: we all know about Section two thirty, and Section two 486 00:29:44,800 --> 00:29:47,560 Speaker 1: thirty says that Facebook is not liable for what individuals 487 00:29:47,600 --> 00:29:50,800 Speaker 1: post on their platform. So the same question might apply here. 488 00:29:51,560 --> 00:29:53,960 Speaker 1: Is it the case that Section two thirty could, at 489 00:29:54,080 --> 00:29:59,920 Speaker 1: least in theory, begin to apply to AI and prohibit liability? 490 00:30:00,120 --> 00:30:02,400 Speaker 1: At the end of the day, everybody owes a duty 491 00:30:02,440 --> 00:30:06,280 Speaker 1: to other individuals to avoid harm, and so the question 492 00:30:06,320 --> 00:30:10,400 Speaker 1: will ultimately come down to, did these companies adequately test these 493 00:30:10,480 --> 00:30:14,440 Speaker 1: AI programs to determine their potential threats of harm, and 494 00:30:14,480 --> 00:30:17,239 Speaker 1: did they adequately warn the public about those before they 495 00:30:17,240 --> 00:30:17,880 Speaker 1: were used? 496 00:30:18,520 --> 00:30:23,160 Speaker 2: One of the biggest AI platforms, OpenAI, says parental 497 00:30:23,200 --> 00:30:26,760 Speaker 2: controls are going to be added to ChatGPT within 498 00:30:26,800 --> 00:30:29,520 Speaker 2: the next month. But does that count against them in 499 00:30:29,560 --> 00:30:32,560 Speaker 2: a lawsuit because, you know, too little, too late? 500 00:30:33,080 --> 00:30:35,120 Speaker 1: No doubt a lawyer may try and get that in 501 00:30:35,200 --> 00:30:37,800 Speaker 1: as evidence, and it might also be excluded as evidence 502 00:30:37,800 --> 00:30:40,479 Speaker 1: of remedial measures. But we all know that we 503 00:30:40,520 --> 00:30:42,920 Speaker 1: live in the twenty first century, where, you know, information 504 00:30:43,000 --> 00:30:45,520 Speaker 1: is spread all about, and I'm quite confident people are 505 00:30:45,560 --> 00:30:47,719 Speaker 1: going to learn about this before there's ever a trial 506 00:30:47,760 --> 00:30:50,640 Speaker 1: on it. And the point is that even 507 00:30:51,120 --> 00:30:56,000 Speaker 1: if that particular issue was excluded, everyone knows that these 508 00:30:56,040 --> 00:30:59,520 Speaker 1: types of protocols can be put in place pre deployment. 509 00:31:00,120 --> 00:31:01,920 Speaker 1: It's just a choice of whether or not they want 510 00:31:01,960 --> 00:31:04,560 Speaker 1: to take the time to do that.
And because everybody 511 00:31:04,560 --> 00:31:07,680 Speaker 1: wants their AI to be the first and latest, so 512 00:31:07,720 --> 00:31:13,240 Speaker 1: that everybody starts using it, no one's incentivized to actually 513 00:31:13,280 --> 00:31:16,280 Speaker 1: adequately test and put in appropriate protocols. Because, again, if 514 00:31:16,320 --> 00:31:19,400 Speaker 1: you limit what the AI is going to provide as 515 00:31:19,400 --> 00:31:21,920 Speaker 1: a response to somebody, that's not going to give them 516 00:31:21,920 --> 00:31:24,040 Speaker 1: what they want. Just like if I go onto, 517 00:31:24,440 --> 00:31:27,160 Speaker 1: you know, Facebook, and I don't like what I'm seeing, 518 00:31:27,200 --> 00:31:28,760 Speaker 1: the algorithm is going to give me more of what 519 00:31:28,800 --> 00:31:32,040 Speaker 1: I want to see. And so we're driving towards a 520 00:31:32,080 --> 00:31:35,080 Speaker 1: world in which the incentive is towards profits and giving 521 00:31:35,120 --> 00:31:37,840 Speaker 1: people what they want, not doing the right thing and 522 00:31:37,880 --> 00:31:39,400 Speaker 1: giving people the correct information. 523 00:31:39,720 --> 00:31:42,360 Speaker 2: What would the parents in these cases have to prove? 524 00:31:43,000 --> 00:31:45,640 Speaker 1: Well, it depends on the allegations. I mean, again, 525 00:31:45,960 --> 00:31:49,160 Speaker 1: we're having to pigeonhole all these concepts into common law theories. 526 00:31:49,480 --> 00:31:52,080 Speaker 1: And so the negligence theory, which is the simplest, 527 00:31:52,120 --> 00:31:55,400 Speaker 1: the most straightforward theory, is that they owed a reasonable 528 00:31:55,480 --> 00:31:58,080 Speaker 1: duty of care to all of their customers to ensure 529 00:31:58,080 --> 00:32:01,160 Speaker 1: that their chatbot was reasonably safe. And then the question 530 00:32:01,280 --> 00:32:05,000 Speaker 1: is, did the company breach that duty, and if so, 531 00:32:05,760 --> 00:32:09,520 Speaker 1: was the company the proximate cause? Okay, so, for example, 532 00:32:09,840 --> 00:32:12,280 Speaker 1: in this particular case, we may be able to say 533 00:32:12,320 --> 00:32:16,160 Speaker 1: that ChatGPT said, go check out the hotline, please 534 00:32:16,200 --> 00:32:21,640 Speaker 1: call that. Maybe then the proximate cause for that 535 00:32:21,720 --> 00:32:24,560 Speaker 1: individual's suicide could be cut off at that point in 536 00:32:24,600 --> 00:32:28,080 Speaker 1: time by ChatGPT, and another proximate cause, i.e., 537 00:32:28,200 --> 00:32:30,960 Speaker 1: his own volition or something that happened that day at 538 00:32:31,000 --> 00:32:33,960 Speaker 1: school or any number of things, could become the proximate 539 00:32:34,040 --> 00:32:37,440 Speaker 1: cause, and they could still avoid liability. So the plaintiff 540 00:32:37,480 --> 00:32:39,040 Speaker 1: will have to show that there is a duty, which 541 00:32:39,080 --> 00:32:41,640 Speaker 1: I think is probably easy enough to prove. And then, 542 00:32:41,680 --> 00:32:44,000 Speaker 1: of course, the last hurdle is, is the company 543 00:32:44,000 --> 00:32:47,920 Speaker 1: actually responsible for what the AI produces? Knowing that the 544 00:32:47,960 --> 00:32:51,320 Speaker 1: AI is inherently problematic, who doesn't know at this stage 545 00:32:51,360 --> 00:32:54,160 Speaker 1: that AI hallucinates today?
So there is a bit of 546 00:32:54,200 --> 00:32:56,320 Speaker 1: a user beware angle to this case. 547 00:32:56,600 --> 00:33:00,040 Speaker 2: What causes someone to commit suicide? There are often 548 00:33:00,280 --> 00:33:03,680 Speaker 2: a host of reasons. It's not so simple, and so 549 00:33:03,760 --> 00:33:08,080 Speaker 2: I'm wondering how difficult it will be to prove that 550 00:33:08,120 --> 00:33:09,920 Speaker 2: the chatbot was responsible. 551 00:33:10,560 --> 00:33:13,920 Speaker 1: That's absolutely correct, but I don't think that it diminishes 552 00:33:13,960 --> 00:33:17,400 Speaker 1: the company's responsibilities in the first place, right. I mean, 553 00:33:17,560 --> 00:33:19,800 Speaker 1: for example, while I can't think of one off the 554 00:33:19,880 --> 00:33:22,000 Speaker 1: top of my head, I'm confident that there were lawsuits 555 00:33:22,080 --> 00:33:26,040 Speaker 1: against Facebook, Meta, Instagram, you know, as a result of 556 00:33:26,160 --> 00:33:29,120 Speaker 1: self harm from children. I don't know where those ever went, 557 00:33:29,240 --> 00:33:32,640 Speaker 1: but the point is that companies are knowingly putting 558 00:33:33,000 --> 00:33:37,200 Speaker 1: products onto the market that our government is too incompetent 559 00:33:37,720 --> 00:33:41,840 Speaker 1: or too unwilling to regulate, and so therefore we have 560 00:33:41,920 --> 00:33:45,880 Speaker 1: products in the market that are potentially dangerous and that 561 00:33:46,080 --> 00:33:50,760 Speaker 1: the population is not adequately educated on. And so really 562 00:33:50,800 --> 00:33:54,000 Speaker 1: the onus shouldn't be on the population, who is trying 563 00:33:54,040 --> 00:33:56,080 Speaker 1: to figure out what this technology does. It should be 564 00:33:56,080 --> 00:33:59,400 Speaker 1: on the technologists who created it. Just like in the 565 00:33:59,440 --> 00:34:01,840 Speaker 1: oil and gas industry, if you drill an oil and 566 00:34:01,880 --> 00:34:04,719 Speaker 1: gas well and it leaks, it's your responsibility. Yes, it's 567 00:34:04,720 --> 00:34:08,520 Speaker 1: an inherently dangerous operation, but you know what, that's your responsibility. 568 00:34:08,840 --> 00:34:12,279 Speaker 1: And we need that type of regulation and that type 569 00:34:12,320 --> 00:34:14,959 Speaker 1: of mindset to make sure that the public is safe 570 00:34:14,960 --> 00:34:16,160 Speaker 1: when using AI chatbots. 571 00:34:16,520 --> 00:34:20,239 Speaker 2: So which agency would be responsible for regulations in this 572 00:34:20,400 --> 00:34:22,280 Speaker 2: area, or would it take a law? 573 00:34:22,840 --> 00:34:25,000 Speaker 1: Well, a law would be the best path. But because 574 00:34:25,000 --> 00:34:28,400 Speaker 1: we are literally in an AI arms race with China 575 00:34:28,440 --> 00:34:30,319 Speaker 1: and every other country in the world, I don't see 576 00:34:30,320 --> 00:34:35,240 Speaker 1: that happening. The FTC can and does currently regulate AI 577 00:34:35,600 --> 00:34:39,640 Speaker 1: under Section five of the FTC Act. In short, what 578 00:34:39,719 --> 00:34:43,160 Speaker 1: that says is that companies can't put out false 579 00:34:43,160 --> 00:34:48,480 Speaker 1: or misleading products, and so there are cases where 580 00:34:49,360 --> 00:34:51,960 Speaker 1: the FTC is postured at least to be able to 581 00:34:52,120 --> 00:34:56,600 Speaker 1: enforce, you know, these types of issues.
At the end 582 00:34:56,600 --> 00:34:59,480 Speaker 1: of the day, will they? Probably not, especially under this 583 00:34:59,600 --> 00:35:04,120 Speaker 1: administration, because, again, there is so much incentive to allow 584 00:35:04,160 --> 00:35:07,480 Speaker 1: this type of development that no one, and I put 585 00:35:07,480 --> 00:35:10,040 Speaker 1: that in air quotes, no one, wants to see this regulated. 586 00:35:10,360 --> 00:35:12,200 Speaker 1: And I think there's a safer way to do 587 00:35:12,239 --> 00:35:15,840 Speaker 1: AI development without posing the threat of harm to the community, 588 00:35:16,080 --> 00:35:18,400 Speaker 1: which is simply, hey, we all stop at ChatGPT 589 00:35:18,600 --> 00:35:22,600 Speaker 1: four for public releases and everything else gets developed, you know, 590 00:35:22,880 --> 00:35:25,960 Speaker 1: within DARPA or behind closed doors within the company itself. 591 00:35:26,320 --> 00:35:30,640 Speaker 1: We the public don't need access to this type of 592 00:35:30,640 --> 00:35:32,280 Speaker 1: information as soon as it's ready 593 00:35:32,120 --> 00:35:32,760 Speaker 3: to be released. 594 00:35:32,840 --> 00:35:34,879 Speaker 2: I think it was last month that forty four state 595 00:35:34,920 --> 00:35:39,920 Speaker 2: attorneys general warned eleven companies that run AI chatbots that 596 00:35:40,000 --> 00:35:43,560 Speaker 2: they would, quote, answer for it if their products harm children. 597 00:35:43,600 --> 00:35:46,640 Speaker 2: Has anything been done by state attorneys general? 598 00:35:47,840 --> 00:35:51,520 Speaker 1: Not to my knowledge. And to that point, great, but what 599 00:35:51,640 --> 00:35:54,000 Speaker 1: are you going to do? Because at the end of 600 00:35:54,040 --> 00:35:57,319 Speaker 1: the day, state attorneys general have limited resources. So, 601 00:35:57,360 --> 00:36:00,439 Speaker 1: for example, in California, you know, they're also trying 602 00:36:00,440 --> 00:36:04,680 Speaker 1: to enforce data privacy laws through their consumer protection agency 603 00:36:04,719 --> 00:36:07,560 Speaker 1: out there. We all know that we have limited funds, 604 00:36:07,640 --> 00:36:11,759 Speaker 1: limited resources, and our AGs are battling multiple fronts all 605 00:36:11,800 --> 00:36:13,799 Speaker 1: at the same time. And so if they're going to 606 00:36:13,840 --> 00:36:17,680 Speaker 1: go up against multi billion dollar companies like Meta or 607 00:36:18,160 --> 00:36:21,320 Speaker 1: OpenAI, they're going to have a long, 608 00:36:21,600 --> 00:36:24,319 Speaker 1: uphill battle. And I assure you, by the time that 609 00:36:24,560 --> 00:36:27,479 Speaker 1: legislation or that lawsuit were resolved, we'd have a whole 610 00:36:27,480 --> 00:36:31,000 Speaker 1: new slew of AI problems on our hands, and so 611 00:36:31,080 --> 00:36:32,959 Speaker 1: it'd be all too little, too late. 612 00:36:33,400 --> 00:36:35,440 Speaker 2: You're right, it's hard to keep up with all of it. 613 00:36:35,600 --> 00:36:39,560 Speaker 2: Thanks so much, Colin. That's Colin Walke of Hall Estill, 614 00:36:40,239 --> 00:36:42,560 Speaker 2: and that's it for this edition of The Bloomberg Law Show. 615 00:36:42,880 --> 00:36:45,279 Speaker 2: Remember, you can always get the latest legal news on 616 00:36:45,280 --> 00:36:49,560 Speaker 2: our Bloomberg Law Podcast.
You can find them on Apple Podcasts, Spotify, 617 00:36:49,760 --> 00:36:54,799 Speaker 2: and at www dot Bloomberg dot com slash podcast slash law, 618 00:36:55,200 --> 00:36:57,800 Speaker 2: and remember to tune into The Bloomberg Law Show every 619 00:36:57,840 --> 00:37:01,960 Speaker 2: weeknight at ten pm Wall Street Time. I'm June Grosso and 620 00:37:02,000 --> 00:37:03,440 Speaker 2: you're listening to Bloomberg.