Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio.

Speaker 2: The Supreme Court ruled against a law banning conversion therapy for LGBTQ kids in Colorado. In an eight-to-one ruling, the Court sided with a Christian counselor who says she has a constitutional right to engage in talk therapy to try to change a child's sexual orientation or gender identity. The justices found the Colorado law infringes on her free speech rights by discriminating on the basis of viewpoint. Liberal Justice Ketanji Brown Jackson was the only dissenter, and she took the unusual step of reading a summary of her opinion from the bench to emphasize her disagreement with the rest of the justices. The ruling casts doubt on similar laws in more than half the states. My guest is Columbia Law School professor Suzanne Goldberg. She's the director of the Sexuality and Gender Law Clinic. Conversion therapy is rejected by every major medical and mental health organization, yet the Supreme Court, in this eight-to-one decision, finds that Colorado's ban is a violation of free speech rights. Can you explain their reasoning?
Speaker 3: First, I want to affirm what you just said: that every major medical association says this kind of therapy to change somebody's sexual orientation or gender identity is harmful, especially to young people. And so the court says, even though that may be the case, Colorado cannot regulate what talk therapists say to their young patients, because those therapists have a First Amendment free speech right. In other words, the government of Colorado cannot, through regulating the provision of medical care or healthcare, say you can't say these things, even though we know, and there's evidence, that they're harmful.

Speaker 2: The AG of Colorado defended the law, saying states have long regulated medical practices, including treatments carried out through speech, to protect patients from substandard care. So why doesn't this law fit under the state regulating mental health professionals?

Speaker 3: So the court says, this is speech, it's pure speech, and under our constitutional doctrine, whenever the government regulates speech, the Court will impose its very highest level of review, its most skeptical scrutiny.
And it says, in this case, the lower court made a mistake by applying a lower level of review and treating this as regular medical regulation. And Ketanji Brown Jackson, in her dissent, says, yes, this is actually medical regulation; this is totally within the bounds of what a state can do to protect patients against substandard care. So the two sides are really talking past each other. And one of the concerns is how this will interfere with the state's ability to protect patients from what healthcare providers have to say to them.

Speaker 2: I don't think it came as any surprise that the six conservatives voted as they did, but it was surprising to me that the two liberal justices, Sonia Sotomayor and Elena Kagan, voted against the law. How did they explain that?

Speaker 3: In the concurrence, they take the same position: that this is regulation of speech, and it's regulation of speech based on the viewpoint of the speaker. Meaning a speaker can say, I affirm that you're LGBT, whatever the young person is, but the speaker, the therapist, cannot say, I don't affirm that, and you should change.
And so that, Justice Kagan writes, is viewpoint discrimination. And the Court has long subjected viewpoint discrimination to its most skeptical kind of scrutiny. That said, there are many things that the state presumably can regulate and prohibit a medical provider from saying to a patient. One of those examples is, you know, a state should be able to prohibit a provider from encouraging a suicidal patient to take their own life. That is also speech. The Court doesn't wrestle with that, neither the majority nor the concurring opinion of Justice Kagan.

Speaker 2: Justice Jackson felt so strongly that she read a summary of her opinion from the bench to emphasize her objection here. But it's often the three liberal justices who are in the dissent in cases like this involving transgender rights. Why do you think Kagan and Sotomayor didn't join her?

Speaker 3: This is a difficult case in the sense that it is government regulation of speech. All nine of them agree on that. Justice Jackson says, yes, it's regulation of speech, but incidental to regulation of healthcare provision.
Speaker 3: Justices Kagan and Sotomayor stay with the majority, making this an eight-to-one ruling. Again, I think they try to pull back a little bit from some of the majority's full-on "we can never regulate speech," and they suggest that this is quite a narrow ruling. But I think we'll have to see. I mean, I do think there are always concerns, when allowing government to regulate speech, that the government will regulate too much and suppress speech. And that is clearly something that Justice Kagan is attuned to, as well as the majority, but so is Justice Jackson. They just see it differently.

Speaker 2: Did the majority deal with the fact that conversion therapy is opposed by every major medical organization, and studies have linked it to depression, post-traumatic stress, and higher rates of suicide?

Speaker 3: So the majority says, that's what these medical associations think now. But the government is not allowed to prescribe an orthodoxy of views.
And it's standard, again, in First Amendment law to say that the reason we are so protective of speech is to allow for the contestation of ideas, to allow that maybe at one point our views are mistaken and we want to change them. And they make the point that, well, you know, some years ago states might have said you cannot support a young person who says they're lesbian or gay or bisexual or transgender; you have to tell them to change. And so the majority says, you know, that could happen; we can't allow any of this. That said, there remains this question of whether governments can protect young patients who are facing a demonstrable, empirically validated risk if this kind of therapy is used on them. And so it seems to me the majority does not actually wrestle fully with that question.

Speaker 2: The court is sending the case back to the lower court to apply a stricter standard. But is it an almost foregone conclusion that the law won't pass a strict scrutiny test?

Speaker 3: I never like to say anything is absolutely foregone, but I don't think anybody expects this law to survive.
And of course, more than half the states in this country have this kind of law or similar laws. Some of them are worded differently, so they may be evaluated somewhat differently. And it's important to also know that virtually all of these laws were passed with bipartisan support. Right, the evidence isn't in question here from any major or non-fringe analyst of the data. What's also important to know is that this ruling, strong as it is in favor of the therapist who says she might want to provide this kind of therapy, does not say conversion therapy is good. It does not say conversion therapy is helpful to patients. It doesn't disagree with the medical associations saying conversion therapy is harmful. And so people, young people, who suffer as a result of conversion therapy can still file medical malpractice lawsuits and consumer fraud lawsuits and obtain a remedy. What states have been trying to do with these laws is prevent the harm in the first place, and the Supreme Court has just taken away states' capacity to do that in many respects.
Speaker 2: In this case, Colorado hasn't sought to enforce this law. And it sort of reminds me of the Supreme Court case where the web designer said she didn't want to design for same-sex couples, and yet no same-sex couple had ever asked her to do a web design. So why is the Supreme Court taking these cases where no one's been injured, no one's been harmed?

Speaker 3: There are at least two responses to your wise question. The first is that when the First Amendment is at issue, when people's free speech rights might be chilled, which was the argument here, the Court is more willing to take a case that is hypothetical. As you just said, right, Colorado hasn't enforced the law; she hasn't been in any trouble with the law. So this is really, you know, a step outside of the kind of hypothetical we might offer in a law school classroom. It's not a real case yet.
Speaker 3: But when somebody makes an argument that their speech is being chilled by the government regulation, that kind of argument is typically allowed to go forward, for good reason. Right, you know, government can chill speech, and we don't want people to have to wait to face criminal punishment or fines before they engage in their speech. There's a second point, though, to your question: why are we seeing these cases? Why are people bringing cases when they haven't faced the problem, when they haven't not only gotten in trouble, they haven't even, in the case of the web designer, offered the services? And neither did Ms. Chiles. And there I think we can see that this is part of a broader agenda on the part of legal organizations to find plaintiffs who are willing to make these claims, with a goal of pushing the law in a direction that really restricts the kind of legal protections for LGBT people.
Speaker 2: Yeah, the therapist was represented by the Alliance Defending Freedom, a Christian legal group that has been behind some of these high-profile cases, including overturning the constitutional right to abortion. Let's take a broad view for a minute. The Supreme Court has been consistently ruling against transgender rights in recent years. It upheld the Tennessee law barring gender-affirming care for transgender youth. It allowed Trump to ban transgender people from the military and to require new passports to reflect the sex on the holder's birth certificate. There was the decision allowing parents to opt out when LGBTQ storybooks were read in classrooms. And if the oral arguments are any indication, the Court is likely to uphold state laws banning transgender girls and women from competing on female athletic teams. It just seems relentless. It's one case after another.

Speaker 3: I think it certainly does have a relentless feel.
And when you couple this most recent series of rulings with the more than a thousand bills that have been introduced in state legislatures around the country to restrict transgender people in the daily activities of living, in identity documents like a driver's license that looks like you, or access to a bathroom, or the ability to get a passport that reflects identity, there are so many restrictions. So what can we say? Many of these cutbacks on protection for transgender people are coming in the guise of free speech, or of protecting the religious freedom of people who don't want to be around transgender people or don't want their kids to hear stories about transgender people, or lesbian and gay people in that case. Interestingly, the sports case is not a religious freedom case; it's not a First Amendment case. There the question is, what is the scope of sex discrimination law? Does it protect all athletes who are female to be able to participate in girls' and women's sports?
Speaker 3: Certainly, the majority appeared quite skeptical of the athletes' arguments in those cases, the transgender athletes' arguments in those cases, and the states' arguments. So we'll see, but it's a very challenging time. On the other hand, there are also a number of states that are trying to really protect and affirm their transgender youth, to make sure that they have healthy places to go to school, access to healthcare. But the concern is, of course, that the Trump administration right now is trying to cut that off as well.

Speaker 2: It's challenging, to say the least. Thanks so much, Suzanne. That's Professor Suzanne Goldberg of Columbia Law School. Coming up next, we'll take a look at the implications of the first verdict in a social media addiction trial. I'm June Grasso, and you're listening to Bloomberg. The jury's verdict in the first social media addiction trial has been called a landmark verdict, a game changer. A jury ordered Meta and Google to pay six million dollars to a twenty-year-old woman who said her addiction to social media caused her mental health struggles.
But it's just the first trial in a very long line of similar cases, plus the companies' appeals are ahead. The verdict may be a potential crack in the social media companies' shield from legal responsibility for what happens on their platforms, but it doesn't put them in the same category as big tobacco or opioid makers, at least not yet. My guest is an expert in internet law, Professor Eric Goldman, Associate Dean for Research at Santa Clara University School of Law. This verdict is being described as landmark. The plaintiffs' attorney called it a game changer, a turning point. What's your take on the import of this?

Speaker 1: The state court has set up three bellwether trials. The idea is to take the hundreds of cases that are currently filed and pick three that will give the parties better information about how juries are responding to the arguments.
Speaker 1: We got a verdict in the first of the three bellwether trials, but it's only one of the three, and in that respect, it's only one of the potentially hundreds of cases that are out there. So it's a little hard to put too much weight on a single data point about how the jurors are responding. It's not meant to be the answer to all the questions. Having said that, it does answer perhaps the most important question the plaintiffs needed to know, which is, are juries buying the basic arguments they're making? Do they generally believe social media services should be responsible for their users' harms? The jury answered yes. If that is the answer that will continue in the other trials, then that does change the state of play quite a bit.

Speaker 2: The plaintiffs didn't win over all the jurors. It was ten to two, with two jurors holding out for the companies on all the questions, and it took nine days to reach the verdict. And I assume that this is one of the plaintiffs' best cases, because they chose it to go first.
So does that leave room for different decisions in the other cases coming up?

Speaker 1: To clarify, I don't know that I would consider this to be the plaintiffs' best case. I don't think that's exactly how they selected the bellwether trials. It's more designed to be a representative case. But having said that, the fact that it was ten to two is, I think, actually quite significant. Despite many days of deliberation, there were two jurors who just didn't buy the plaintiffs' basic argument, who could not be swayed by the remainder of the majority. And one could imagine, with a different set of facts, a different presentation of the arguments by the litigants, and possibly a different jury composition, that a different jury might reach a different conclusion in another case. It's not like this is so obviously, unanimously a basis for imposing liability. So that's part of the whole point of doing the bellwether process: to have several trials and not rely on any single data point as dispositive.
But it also is a sign, combined with the New Mexico jury verdict that came out near the same time, that maybe juries will buy this, maybe not unanimously, but in general are sympathetic to the plaintiffs' argument. So I wouldn't put too much stock in the ten-to-two verdict, but it does suggest that this is not a definitive slam-dunk case for the plaintiffs.

Speaker 2: Also, punitive damages, as you know, are meant to punish the company, and we've seen crazy punitives in cases over the years, really high punitives. Here, three million dollars seems like a slap on the wrist to these companies. It won't even register on their balance sheet.

Speaker 1: It's a little hard to contextualize the number because, as you know, six million dollars is a rounding error for Google or Meta; that number itself doesn't matter to them. But the point is that this is the first of what could be potentially thousands of cases, and so we have to multiply the six-million-dollar number by potentially thousands of other claims, at which point that number becomes very large, a number that even Meta and Google will notice.
Speaker 1: So I don't feel like the six-million-dollar number is at all a low number. It may be less than the plaintiff requested in this case, but it's a signal of the industry's potential financial exposure. Six million times the potential number of victims equals a lot of money, maybe even more than the entire industry has.

Speaker 2: So the social media companies say they'll appeal, obviously. Do they have good grounds, what you'd consider good grounds, for appeal?

Speaker 1: I do think that there are some important issues that the appellate court will have to weigh in on. The trial courts, in the state court case as well as the federal case, have made a number of choices that were not obvious, that in some ways broke new ground, and as a result, I don't really treat them as the final word on the matter until we hear whether other judges, appellate judges, actually even agree with those decisions. So I think that there are several bases on which the defendants have good grounds for appeal. But that doesn't mean that the appeal is likely to succeed. I kind of rate it like fifty-fifty. I can't really predict.
The crystal ball is unclear. I think that there are good grounds for appeal. I think that the plaintiffs have some precedent to support them, even if I disagree with the conclusions they reach. And so, you know, this is exactly how the judicial system is supposed to work. Complex, tough questions get posed to the first level of review, then the second level of review, and almost certainly a third level of review. There will be a lot of eyeballs on these questions to answer the tough ones.

Speaker 2: Do you think one of the grounds for appeal will be based on Section two thirty, which generally immunizes online platforms from liability for user-generated content?

Speaker 1: I think that Section two thirty is one of the solid grounds for appeal in this case. Essentially, the plaintiffs have had to argue that they're not suing over the content that any individual victim was exposed to, but over the way in which that content was delivered, the various design features that helped get the content in front of the victim. To me, that line between the content and the method of presentation of the content is illusory.
They're all the 335 00:20:09,440 --> 00:20:11,840 Speaker 1: same thing in my mind. They're all part of the 336 00:20:11,920 --> 00:20:15,879 Speaker 1: general publication or editorial decisions that the service made about 337 00:20:15,920 --> 00:20:19,200 Speaker 1: how to best engage with its users. So from 338 00:20:19,200 --> 00:20:23,320 Speaker 1: my perspective, that effort to navigate around what is third-party 339 00:20:23,320 --> 00:20:26,240 Speaker 1: content and what is the service's first-party design 340 00:20:26,320 --> 00:20:30,800 Speaker 1: choices involves really sophisticated, nuanced arguments, but it's not clear to 341 00:20:30,800 --> 00:20:32,520 Speaker 1: me at all that the trial court got it right. 342 00:20:33,280 --> 00:20:35,600 Speaker 2: So do you think that this will lead social media 343 00:20:35,680 --> 00:20:39,240 Speaker 2: companies to change the way they operate? 344 00:20:40,320 --> 00:20:43,840 Speaker 1: I think that it would be remarkable if social media 345 00:20:44,000 --> 00:20:47,439 Speaker 1: doesn't change substantially over the next few years. If the 346 00:20:47,640 --> 00:20:50,600 Speaker 1: final outcome, a few years from now, is that 347 00:20:50,640 --> 00:20:53,320 Speaker 1: social media has retained the status quo it has today, 348 00:20:53,720 --> 00:20:56,119 Speaker 1: I think that would be remarkable. And it's not just 349 00:20:56,200 --> 00:20:59,960 Speaker 1: because of the litigation, though. If the plaintiffs win any 350 00:21:00,119 --> 00:21:04,080 Speaker 1: part of the litigation, almost certainly the court will order changes 351 00:21:04,200 --> 00:21:08,000 Speaker 1: to the services, or any settlement agreement would require changes 352 00:21:08,040 --> 00:21:11,720 Speaker 1: to the services.
But the reason why I'm so confident 353 00:21:11,760 --> 00:21:15,399 Speaker 1: that social media is under extraordinary pressure to change is because, 354 00:21:15,440 --> 00:21:19,320 Speaker 1: in addition to the litigation, state legislatures throughout the country 355 00:21:19,400 --> 00:21:24,600 Speaker 1: are passing laws that are requiring structural and tactical changes 356 00:21:24,680 --> 00:21:26,680 Speaker 1: to how the services operate. 357 00:21:27,280 --> 00:21:28,120 Speaker 4: And unless the. 358 00:21:28,080 --> 00:21:32,800 Speaker 1: services can also overturn all of those laws, those laws 359 00:21:32,800 --> 00:21:35,879 Speaker 1: are actually the governing rules that will dictate how social 360 00:21:35,920 --> 00:21:39,720 Speaker 1: media services operate. So as a practical matter, the litigation 361 00:21:39,960 --> 00:21:43,320 Speaker 1: is only one path towards change, the legislation 362 00:21:43,400 --> 00:21:46,040 Speaker 1: is another path towards change, and of course the services 363 00:21:46,160 --> 00:21:49,960 Speaker 1: retain the voluntary right to make additional changes.
I think 364 00:21:50,000 --> 00:21:51,760 Speaker 1: it would be foolhardy on the part of any of 365 00:21:51,800 --> 00:21:54,760 Speaker 1: your listeners to assume that the social media services they 366 00:21:54,800 --> 00:21:56,880 Speaker 1: have today, whether they love them or hate them, are 367 00:21:56,920 --> 00:21:58,520 Speaker 1: going to be the social media services they have 368 00:21:58,560 --> 00:22:01,280 Speaker 1: a few years from now. And we should talk about 369 00:22:01,320 --> 00:22:03,919 Speaker 1: what that means for all of us, because we're not 370 00:22:04,000 --> 00:22:07,639 Speaker 1: in the courtroom to express what we want from social media, 371 00:22:07,680 --> 00:22:10,639 Speaker 1: but our fates, our ability to use the services, are 372 00:22:10,680 --> 00:22:14,080 Speaker 1: being dictated right now in courts and in the legislatures 373 00:22:14,080 --> 00:22:15,080 Speaker 1: around the country. 374 00:22:15,280 --> 00:22:19,200 Speaker 2: Will it be the social media companies putting in safeguards, 375 00:22:19,560 --> 00:22:24,119 Speaker 2: age verification, parental controls, or will they have to change 376 00:22:24,520 --> 00:22:27,359 Speaker 2: what the jury found were addictive features? 377 00:22:28,280 --> 00:22:31,240 Speaker 1: All of the above, and possibly more. In other words, 378 00:22:31,520 --> 00:22:35,000 Speaker 1: the litigation puts in play a number of very specific 379 00:22:35,440 --> 00:22:39,320 Speaker 1: tactical choices that the services have made, things like autoplay 380 00:22:39,320 --> 00:22:45,440 Speaker 1: or infinite scrolling or algorithmic personalization. These are tactical 381 00:22:45,600 --> 00:22:48,720 Speaker 1: features that might need to be changed in order to 382 00:22:48,760 --> 00:22:52,639 Speaker 1: avoid liability going forward.
All of those features are also 383 00:22:52,680 --> 00:22:56,160 Speaker 1: being regulated by the legislatures, who are saying you cannot 384 00:22:56,200 --> 00:22:59,840 Speaker 1: have infinite scrolling, you cannot have autoplay, and so on. 385 00:23:00,240 --> 00:23:02,880 Speaker 1: So either way, one way or another, those services are 386 00:23:02,880 --> 00:23:05,400 Speaker 1: going to have to evaluate the functions that they have. 387 00:23:05,840 --> 00:23:09,359 Speaker 1: But the plaintiffs' basic argument doesn't rely on any specific 388 00:23:09,440 --> 00:23:12,760 Speaker 1: design feature. The basic argument is: the way in which 389 00:23:12,800 --> 00:23:16,440 Speaker 1: you've designed the service overall was intended to addict your 390 00:23:16,560 --> 00:23:20,080 Speaker 1: users and cause them harm. And to the extent that 391 00:23:20,080 --> 00:23:24,520 Speaker 1: that general statement becomes part of the law, whether through 392 00:23:24,640 --> 00:23:28,200 Speaker 1: legislation or litigation, we could get there either way, then 393 00:23:28,240 --> 00:23:31,520 Speaker 1: the services have to review everything that they do and 394 00:23:31,600 --> 00:23:35,159 Speaker 1: consider how it might impact potential victims. As a result, 395 00:23:35,640 --> 00:23:39,800 Speaker 1: there's no limit or boundary to what structural changes could 396 00:23:39,800 --> 00:23:43,879 Speaker 1: be forced through the legislation or the litigation. In either case, 397 00:23:44,280 --> 00:23:46,879 Speaker 1: everything is in play. And that's why, again, I'm so 398 00:23:47,400 --> 00:23:50,600 Speaker 1: concerned and/or confident that social media services will not 399 00:23:50,680 --> 00:23:52,600 Speaker 1: look the same a few years from now. 400 00:23:53,200 --> 00:23:57,240 Speaker 2: If the addictive features are changed, does that mean that 401 00:23:57,760 --> 00:24:02,080 Speaker 2: the value to advertisers changes as well?
402 00:24:02,200 --> 00:24:05,600 Speaker 1: The short answer is, we don't know how the changes to 403 00:24:05,640 --> 00:24:09,440 Speaker 1: any particular product features will affect the revenue or profits 404 00:24:09,440 --> 00:24:14,119 Speaker 1: of the services, and I think it's actually impossible to model. 405 00:24:14,600 --> 00:24:17,880 Speaker 1: There is some sort of secret sauce that drives user engagement 406 00:24:17,880 --> 00:24:21,120 Speaker 1: in social media, and it might be that small changes 407 00:24:21,160 --> 00:24:23,639 Speaker 1: to that secret sauce, or changes that don't relate to 408 00:24:23,640 --> 00:24:26,400 Speaker 1: the secret sauce, have no impact on the bottom line. 409 00:24:26,560 --> 00:24:29,359 Speaker 1: It is also possible that even the smallest change might 410 00:24:29,440 --> 00:24:32,720 Speaker 1: have a dramatic impact on the bottom line and really change 411 00:24:32,720 --> 00:24:37,040 Speaker 1: the value proposition for the services, how they structure their offerings, 412 00:24:37,040 --> 00:24:39,240 Speaker 1: and how they are able to profit from them. So 413 00:24:39,680 --> 00:24:42,359 Speaker 1: the short answer is, we don't really know how any of 414 00:24:42,400 --> 00:24:45,760 Speaker 1: the particular conversations that are taking place could impact the 415 00:24:45,800 --> 00:24:49,160 Speaker 1: bottom line. We do know that, in the end, if 416 00:24:49,320 --> 00:24:53,399 Speaker 1: the plaintiffs' lawyers or the legislators have absolute power to 417 00:24:53,480 --> 00:24:56,400 Speaker 1: dictate whether or not something is going to cause 418 00:24:56,480 --> 00:24:59,240 Speaker 1: victims harm, I don't know that there is a profitable 419 00:24:59,240 --> 00:25:01,600 Speaker 1: model at that point. In other words, at that point, 420 00:25:01,800 --> 00:25:06,160 Speaker 1: the services can no longer design their offerings for their customers.
421 00:25:06,400 --> 00:25:09,199 Speaker 1: Somebody else, external to the conversation, is going to come 422 00:25:09,200 --> 00:25:11,560 Speaker 1: in and dictate how that will work, and if so, 423 00:25:12,040 --> 00:25:15,960 Speaker 1: the niche becomes much less lucrative, maybe not even profitable. 424 00:25:16,640 --> 00:25:19,200 Speaker 2: Coming up next on the Bloomberg Law Show, I'll continue 425 00:25:19,240 --> 00:25:23,800 Speaker 2: this conversation with Professor Eric Goldman of Santa Clara University 426 00:25:24,040 --> 00:25:27,959 Speaker 2: School of Law. What about the impact of any changes 427 00:25:28,080 --> 00:25:31,639 Speaker 2: in social media on the communities that don't have a 428 00:25:31,720 --> 00:25:35,919 Speaker 2: voice at trial? I'm June Grosso and you're listening to Bloomberg. 429 00:25:38,320 --> 00:25:41,920 Speaker 2: I've been talking to Internet law expert Professor Eric Goldman 430 00:25:42,040 --> 00:25:46,040 Speaker 2: of Santa Clara University Law School about the implications of 431 00:25:46,080 --> 00:25:49,640 Speaker 2: the verdict against Meta and Google in the first social 432 00:25:49,680 --> 00:25:53,600 Speaker 2: media addiction trial. Eric, you've said that it's the Internet 433 00:25:53,720 --> 00:25:57,480 Speaker 2: that's on trial here, not social media. Explain what you 434 00:25:57,520 --> 00:25:58,120 Speaker 2: mean by that. 435 00:25:58,560 --> 00:26:03,080 Speaker 1: In addition to the lawsuits against social media services, the plaintiffs' 436 00:26:03,160 --> 00:26:07,040 Speaker 1: lawyers have taken the same basic legal paradigms that they've 437 00:26:07,040 --> 00:26:10,240 Speaker 1: advanced in those cases and advanced them against other parts 438 00:26:10,240 --> 00:26:13,440 Speaker 1: of the Internet.
I'll mention three, although it's not limited 439 00:26:13,480 --> 00:26:17,680 Speaker 1: to these three: generative AI model makers, video game makers, 440 00:26:17,720 --> 00:26:21,159 Speaker 1: and social gaming. In all three of those cases, the 441 00:26:21,200 --> 00:26:25,000 Speaker 1: plaintiffs are essentially arguing that the services are designed to 442 00:26:25,000 --> 00:26:29,160 Speaker 1: addict users, that addiction causes harm, and therefore the services 443 00:26:29,160 --> 00:26:32,080 Speaker 1: should be liable for the resulting harm, the same exact 444 00:26:32,080 --> 00:26:35,199 Speaker 1: set of arguments as in the social media addiction cases. So 445 00:26:35,520 --> 00:26:38,920 Speaker 1: if the arguments work in social media addiction, they will 446 00:26:38,960 --> 00:26:42,719 Speaker 1: be advanced and have a greater likelihood of 447 00:26:42,760 --> 00:26:46,520 Speaker 1: succeeding against these other major segments of the Internet. And 448 00:26:46,600 --> 00:26:49,679 Speaker 1: in fact, if the arguments work, there's no segment of 449 00:26:49,680 --> 00:26:53,040 Speaker 1: the Internet that couldn't potentially be susceptible to the exact 450 00:26:53,080 --> 00:26:56,639 Speaker 1: same arguments: that some function on the Internet is designed 451 00:26:56,640 --> 00:27:00,880 Speaker 1: to addict users, it causes harm, and they should be financially responsible. 452 00:27:01,280 --> 00:27:04,920 Speaker 1: So if the arguments work in social media addiction, they 453 00:27:05,000 --> 00:27:08,480 Speaker 1: will be potentially more successful everywhere else on the Internet, 454 00:27:08,520 --> 00:27:11,399 Speaker 1: and that could dramatically change not just social media but 455 00:27:11,480 --> 00:27:14,080 Speaker 1: the entire Internet. And that's why the stakes are so 456 00:27:14,240 --> 00:27:16,680 Speaker 1: high for these bellwether trials.
They're giving us a 457 00:27:16,720 --> 00:27:20,359 Speaker 1: prediction of how likely that reshaping of the Internet is. 458 00:27:20,840 --> 00:27:24,760 Speaker 2: Lexi Hazam, one of the lead attorneys representing plaintiffs and 459 00:27:24,920 --> 00:27:28,440 Speaker 2: school districts in similar cases, said, we have the wind 460 00:27:28,480 --> 00:27:30,960 Speaker 2: at our backs going into the next trials, and these 461 00:27:30,960 --> 00:27:33,840 Speaker 2: companies are under a lot of pressure. I mean, does 462 00:27:34,000 --> 00:27:35,800 Speaker 2: one trial affect the next trial? 463 00:27:36,640 --> 00:27:39,159 Speaker 1: It doesn't. And that's in fact the whole nature of 464 00:27:39,200 --> 00:27:42,199 Speaker 1: the bellwether trials: to do basically a statistical 465 00:27:42,359 --> 00:27:46,919 Speaker 1: sampling of the entire corpus of claims and try to 466 00:27:46,960 --> 00:27:49,640 Speaker 1: get a sense about how to value that entire corpus 467 00:27:49,920 --> 00:27:54,720 Speaker 1: based on some independently chosen data points. So the plaintiffs 468 00:27:54,760 --> 00:27:58,439 Speaker 1: are, I think, excited because their arguments worked, and that 469 00:27:58,480 --> 00:28:00,399 Speaker 1: gives them more confidence that the arguments are going to 470 00:28:00,440 --> 00:28:03,760 Speaker 1: work in the next case. But they're independent, and it's entirely 471 00:28:03,800 --> 00:28:07,760 Speaker 1: possible that the next trial will reach a completely different result, 472 00:28:07,880 --> 00:28:12,400 Speaker 1: maybe massive liability far beyond six million, or maybe zero dollars 473 00:28:12,520 --> 00:28:15,040 Speaker 1: or no liability at all. That's the whole point of 474 00:28:15,200 --> 00:28:18,600 Speaker 1: sampling the different cases.
Now, having said that, each side 475 00:28:18,640 --> 00:28:21,680 Speaker 1: has now heard the other side's best evidence, and they're 476 00:28:21,720 --> 00:28:25,280 Speaker 1: going to iterate. Both sides will change their messaging to 477 00:28:25,320 --> 00:28:27,919 Speaker 1: try to reach a different outcome or a better outcome 478 00:28:27,920 --> 00:28:32,119 Speaker 1: for them. So the plaintiffs' lawyers may be feeling confident 479 00:28:32,240 --> 00:28:35,960 Speaker 1: that, having seen the best evidence of the defendants, they 480 00:28:36,000 --> 00:28:38,920 Speaker 1: have a better sense about how they can overcome 481 00:28:38,960 --> 00:28:41,560 Speaker 1: it even more, and that may be giving them some confidence 482 00:28:41,600 --> 00:28:42,000 Speaker 1: as well. 483 00:28:42,160 --> 00:28:46,680 Speaker 2: The social media cases are being compared to the tobacco 484 00:28:46,760 --> 00:28:51,320 Speaker 2: litigation, with the global settlement, and the opioid litigation. Do 485 00:28:51,360 --> 00:28:53,680 Speaker 2: you see a direct comparison? 486 00:28:53,520 --> 00:28:57,440 Speaker 1: Yes and no. So let me reject the analogy, but 487 00:28:57,480 --> 00:28:59,479 Speaker 1: then I'll come back to how there's some kernel of 488 00:28:59,480 --> 00:29:03,200 Speaker 1: truth to it. So, in general, start with something like tobacco. 489 00:29:03,760 --> 00:29:07,880 Speaker 1: Tobacco has no known health benefits to its consumers. It's 490 00:29:07,960 --> 00:29:11,080 Speaker 1: only either neutral to their health or it's negative, whereas 491 00:29:11,120 --> 00:29:14,920 Speaker 1: with social media, we know that it has substantial benefits 492 00:29:14,960 --> 00:29:18,920 Speaker 1: to its users, in addition to some users deriving potentially 493 00:29:18,960 --> 00:29:22,600 Speaker 1: significant detriments from it.
And so trying to take the 494 00:29:22,800 --> 00:29:26,000 Speaker 1: regulation or the legal treatment of something that has no 495 00:29:26,160 --> 00:29:30,680 Speaker 1: known health benefits and apply it to something that has substantial benefits that 496 00:29:30,760 --> 00:29:33,680 Speaker 1: need to be accommodated and accounted for, I just think 497 00:29:33,720 --> 00:29:36,400 Speaker 1: it's like apples and oranges. I also think that it's 498 00:29:36,440 --> 00:29:40,440 Speaker 1: not an apt comparison because cigarettes are a physical 499 00:29:40,440 --> 00:29:43,840 Speaker 1: space problem that causes physical harm, causes health harm to 500 00:29:44,120 --> 00:29:49,000 Speaker 1: its consumers, whereas social media is an intangible venue for 501 00:29:49,160 --> 00:29:54,200 Speaker 1: publishing content. And trying to draw analogies from offline physical 502 00:29:54,240 --> 00:29:58,760 Speaker 1: space items to online intangible items, including the publication of 503 00:29:58,760 --> 00:30:02,120 Speaker 1: content, which gets special constitutional protection, I just think is 504 00:30:02,160 --> 00:30:04,880 Speaker 1: inapt. Now, the reason why the big tobacco analogy 505 00:30:04,960 --> 00:30:08,920 Speaker 1: might be relevant is first because both sides are approaching 506 00:30:08,960 --> 00:30:11,880 Speaker 1: this as if it is a litigation war. And normally, 507 00:30:11,880 --> 00:30:14,920 Speaker 1: when you think about people litigating against big giants like 508 00:30:15,040 --> 00:30:18,040 Speaker 1: Google or Meta, you just assume that the plaintiffs are 509 00:30:18,080 --> 00:30:21,280 Speaker 1: going to be outgunned financially or in terms of their 510 00:30:21,360 --> 00:30:23,760 Speaker 1: legal expertise. But that's not the case at all here.
511 00:30:23,960 --> 00:30:27,440 Speaker 1: You have an extraordinarily well-funded group of plaintiffs who 512 00:30:27,560 --> 00:30:31,520 Speaker 1: have spared no expense at trying to get to successful outcomes. 513 00:30:31,920 --> 00:30:34,080 Speaker 1: Just like in Big Tobacco, the amount of money on 514 00:30:34,160 --> 00:30:37,480 Speaker 1: both sides was extraordinary, and that's true here as well. 515 00:30:37,680 --> 00:30:39,920 Speaker 1: This case is going to be litigated to the very 516 00:30:40,160 --> 00:30:43,560 Speaker 1: nth degree by both sides. There's no imbalance in the 517 00:30:43,920 --> 00:30:47,080 Speaker 1: power relationship there. The other reason why it could be 518 00:30:47,400 --> 00:30:50,080 Speaker 1: like the Big Tobacco case is because it could change 519 00:30:50,080 --> 00:30:53,040 Speaker 1: the industry. It is possible that social media as an 520 00:30:53,120 --> 00:30:55,880 Speaker 1: industry will look different at the end of this litigation, 521 00:30:56,120 --> 00:30:58,520 Speaker 1: just like the tobacco industry looked different at the end 522 00:30:58,520 --> 00:31:01,520 Speaker 1: of the Big Tobacco litigation. So in that sense, the 523 00:31:01,600 --> 00:31:04,600 Speaker 1: stakes are super high, and I can see the analogy 524 00:31:04,640 --> 00:31:05,120 Speaker 1: on that front. 525 00:31:05,840 --> 00:31:09,440 Speaker 2: And you mentioned before that any changes in social media 526 00:31:09,960 --> 00:31:14,080 Speaker 2: will affect other communities who don't have a voice at trial. 527 00:31:14,200 --> 00:31:15,480 Speaker 2: Can you tell me more about that?
528 00:31:16,120 --> 00:31:19,880 Speaker 1: So the general presumption is that social media services are 529 00:31:19,960 --> 00:31:23,200 Speaker 1: just evil, that everyone who uses them hates them, doesn't 530 00:31:23,280 --> 00:31:26,000 Speaker 1: enjoy using them, wishes they could stop using them, 531 00:31:26,200 --> 00:31:30,000 Speaker 1: and that's just not true. There are so many communities 532 00:31:30,040 --> 00:31:34,960 Speaker 1: that derive really substantial, life-affirming benefits from social media. 533 00:31:35,000 --> 00:31:37,200 Speaker 1: I'm just going to mention a few. The first is 534 00:31:37,400 --> 00:31:41,360 Speaker 1: the LGBTQ community. Social media has proven to be a 535 00:31:41,440 --> 00:31:44,320 Speaker 1: lifeline for many members of that community who do not 536 00:31:44,560 --> 00:31:47,560 Speaker 1: have a good physical space network or who are afraid 537 00:31:47,600 --> 00:31:50,680 Speaker 1: to access it because of safety or other concerns, and 538 00:31:50,760 --> 00:31:54,320 Speaker 1: social media has created a space where the members of 539 00:31:54,360 --> 00:31:57,280 Speaker 1: that community can learn from each other, engage with each other, 540 00:31:57,640 --> 00:32:01,120 Speaker 1: and help them understand their lives and the needs that they have. 541 00:32:01,400 --> 00:32:05,080 Speaker 1: Taking away social media or restricting their access will materially 542 00:32:05,320 --> 00:32:10,960 Speaker 1: harm the LGBTQ community. Another community that derives substantial benefits 543 00:32:10,960 --> 00:32:15,320 Speaker 1: from social media is the neurodiverse community, who sometimes struggle 544 00:32:15,440 --> 00:32:19,040 Speaker 1: in communicating in physical space given the particular ways in 545 00:32:19,080 --> 00:32:21,959 Speaker 1: which their minds work.
Social media provides them a different 546 00:32:22,000 --> 00:32:24,840 Speaker 1: form of outlet, a different way of expressing themselves, and 547 00:32:24,920 --> 00:32:28,040 Speaker 1: taking away social media, or restricting their access to social media, will 548 00:32:28,080 --> 00:32:33,800 Speaker 1: materially harm the neurodiverse community, who cannot necessarily transfer all 549 00:32:33,840 --> 00:32:38,280 Speaker 1: of their social engagement to other forms of conversations. And 550 00:32:38,320 --> 00:32:41,840 Speaker 1: the third I'll mention are people who have rare diseases. 551 00:32:42,040 --> 00:32:44,040 Speaker 1: I'll give a personal example. My wife has 552 00:32:44,080 --> 00:32:47,160 Speaker 1: a lung cancer diagnosis with a specific mutation that 553 00:32:47,160 --> 00:32:49,920 Speaker 1: maybe about six thousand Americans have, 554 00:32:50,440 --> 00:32:53,040 Speaker 1: and that community is so spread out throughout the country 555 00:32:53,040 --> 00:32:57,880 Speaker 1: that it cannot form a geographically based conversation. The only 556 00:32:57,880 --> 00:32:59,880 Speaker 1: way that they can find each other and group together 557 00:33:00,480 --> 00:33:03,400 Speaker 1: is online through social media. And the group that my 558 00:33:03,520 --> 00:33:08,000 Speaker 1: wife's community has formed has meant material benefits for the 559 00:33:08,080 --> 00:33:11,240 Speaker 1: members of that community. It has helped them improve their 560 00:33:11,280 --> 00:33:14,360 Speaker 1: lives and in some cases lengthen their lives because of 561 00:33:14,360 --> 00:33:17,080 Speaker 1: the fact that they could converse and organize on social media. 562 00:33:17,200 --> 00:33:21,080 Speaker 1: So restricting or taking away that access will 563 00:33:21,120 --> 00:33:23,800 Speaker 1: be a material detriment to that entire community.
564 00:33:23,960 --> 00:33:26,640 Speaker 2: You talked about state laws, but what about Congress? Do 565 00:33:26,720 --> 00:33:30,960 Speaker 2: you think that these verdicts can move Congress to act? 566 00:33:31,280 --> 00:33:33,280 Speaker 1: It's a little hard to predict what Congress is going 567 00:33:33,360 --> 00:33:37,640 Speaker 1: to do. Congress is already dysfunctional. The twenty twenty six 568 00:33:37,720 --> 00:33:41,240 Speaker 1: midterm elections might make it even more difficult for things 569 00:33:41,280 --> 00:33:44,880 Speaker 1: to move in Congress. Having said that, the jury verdict 570 00:33:44,920 --> 00:33:48,240 Speaker 1: has given a lot of extra wind in the sails 571 00:33:48,320 --> 00:33:51,720 Speaker 1: of the regulators who believe that they have a responsibility 572 00:33:51,800 --> 00:33:55,920 Speaker 1: to protect users of social media from the efforts of 573 00:33:56,240 --> 00:33:59,600 Speaker 1: Google and Meta. So the jury verdicts are already being 574 00:33:59,640 --> 00:34:04,680 Speaker 1: pointed to as a motivation for moving additional regulation forward. 575 00:34:04,760 --> 00:34:06,800 Speaker 1: That's going to happen at both the state and the 576 00:34:06,800 --> 00:34:11,280 Speaker 1: federal level. Will it be enough to get something through Congress? 577 00:34:11,600 --> 00:34:14,520 Speaker 1: I don't know, but I should warn your listeners that 578 00:34:14,600 --> 00:34:17,640 Speaker 1: we should be nervous about whatever will make it through Congress. 579 00:34:17,760 --> 00:34:21,880 Speaker 1: We should not assume that Congress cares about actually benefiting 580 00:34:21,960 --> 00:34:25,360 Speaker 1: us as constituents in regulating social media, and we should 581 00:34:25,360 --> 00:34:28,719 Speaker 1: not assume that any outcome of that regulatory process will 582 00:34:28,719 --> 00:34:32,000 Speaker 1: actually in fact make our lives better.
And so I 583 00:34:32,040 --> 00:34:35,239 Speaker 1: know there's such an antipathy towards social media, such a 584 00:34:35,520 --> 00:34:39,239 Speaker 1: desire for some kind of regulatory magic wand that will 585 00:34:39,280 --> 00:34:41,879 Speaker 1: make life better, but that's not at all what we're 586 00:34:41,960 --> 00:34:44,400 Speaker 1: likely to get from Congress. And as a result, I 587 00:34:44,520 --> 00:34:47,840 Speaker 1: encourage your listeners to be vigilant here. Congress, if it acts, 588 00:34:47,880 --> 00:34:50,680 Speaker 1: might be doing something in the name of trying to 589 00:34:50,719 --> 00:34:53,560 Speaker 1: advance these interests, but actually with a very different agenda 590 00:34:53,640 --> 00:34:56,480 Speaker 1: that is not necessarily in the interests of its constituents. 591 00:34:57,120 --> 00:35:01,279 Speaker 2: We'll watch what, if anything, happens in Congress. Thanks so much, Eric. 592 00:35:01,800 --> 00:35:05,600 Speaker 2: That's Professor Eric Goldman of Santa Clara University Law School, 593 00:35:06,040 --> 00:35:08,360 Speaker 2: and that's it for this edition of The Bloomberg Law Show. 594 00:35:08,680 --> 00:35:11,040 Speaker 2: Remember you can always get the latest legal news on 595 00:35:11,080 --> 00:35:15,360 Speaker 2: our Bloomberg Law podcasts. You can find them on Apple Podcasts, Spotify, 596 00:35:15,560 --> 00:35:20,600 Speaker 2: and at www dot Bloomberg dot com slash podcast slash law, 597 00:35:21,000 --> 00:35:23,600 Speaker 2: and remember to tune into The Bloomberg Law Show every 598 00:35:23,640 --> 00:35:27,560 Speaker 2: weeknight at ten pm Wall Street Time. I'm June Grosso, 599 00:35:27,680 --> 00:35:29,279 Speaker 2: and you're listening to Bloomberg.