June Grosso: This is Bloomberg Law with June Grosso from Bloomberg Radio. It's been the subject of controversy for years: Section two thirty of the Communications Decency Act, a legal shield for social media platforms, and Congress has been debating whether it should be reformed or revoked. Repealing the law may be the one thing that President Joe Biden and former President Donald Trump agree on, of course, not for the same reasons.

President Biden (audio clip): We must hold social media platforms accountable for the national experiment they're conducting on our children for profit.

Former President Trump (audio clip): If Big Tech persists in coordination with the mainstream media, we must immediately strip them of their Section two thirty protection.

June Grosso: There's been no action in Congress in the face of partisan differences, and now the Supreme Court has decided to step into the middle of this politically fraught debate over whether some of the world's most powerful tech companies should continue to be protected or should be held accountable for third-party content. My guest is Eric Goldman, a professor at Santa Clara University School of Law and co-director of the school's High Tech Law Institute. So, Eric, tell us about Section two thirty.

Eric Goldman: Section two thirty says that websites aren't liable for third-party content. It's a really simple premise. The idea is that people who post content take responsibility for it, but the services that they used to post that content don't.

June Grosso: Both cases the court is going to consider involve terrorist attacks abroad, one in Paris and another in Istanbul. Tell us about the plaintiffs' arguments against Google and Twitter.

Eric Goldman: Well, really both of them involve pretty much the same set of facts. They involve terrorist attacks abroad that were allegedly related to social media, and the relationship can vary based on the facts, but the general gist is that the terrorist organizations recruited and radicalized readers online, and because of that, the services should now take responsibility for the actions that are done by these terrorist organizations or the people that they radicalized.
June Grosso: So are the plaintiffs in these cases complaining about the algorithm-generated recommendations?

Eric Goldman: Well, that's part of it, but you could look at it a little bit more broadly. I think the starting premise is that the terrorist organization should never be online in the first instance, and if they are online, then the social media services giving them that support now take responsibility for any of the consequences that flow from the visibility that they gain online. So it's really one of these situations where the social media service is just one of many possible contributors to the outcome, and we don't hold everyone who has that kind of tenuous connection to a terrorist attack responsible for the attack in the first instance.

June Grosso: Isn't it a leap for the plaintiffs to go from showing social media postings to proving that the platforms were responsible for international terrorism?

Eric Goldman: It is a leap, and it really gets to the core of the underlying social question here. Assume that a terrorist attack is done by an individual who has had many relationships in life. They have a landlord, or they have a homeowners association, they have a job, they took the bus to work, or they have an internet access provider who allowed them to connect to the internet. All of those people in theory are some very indirect contributor to the activities of this individual, but we don't hold them responsible. So we make a distinction in the law between what we call but-for causation, it could never have happened without this person doing what they did, and what we call proximate causation, the people who actually are close enough to the outcome that they could have changed the outcome. And it's that last piece, the idea that social media services are the proximate cause of a distant terrorist attack, that just doesn't really pass the sanity check. We look at them, we say that's too far, that doesn't make sense.
And a number of the cases related to the ones that are going to the Supreme Court have failed for that very reason. The courts said, we cannot hold social media services out as the cause of these unfortunate events.

June Grosso: So is the Supreme Court going to decide that question?

Eric Goldman: I don't think they're likely to address the causation piece, but it's impossible to ignore. It's the same instinct that you had when you asked the question. The Supreme Court justices are going to look at this case and say, wait, why are they the defendant? Why are we talking about the social media services when there's all these other people who are equally situated and no more responsible? However, the legal question in front of the court doesn't reach that causation question, so they may not talk about it. They may not even feel like they have the authority to do so.

June Grosso: So the Ninth Circuit, in the same ruling in which it absolved Google, basically, for the Paris attacks, said that Twitter, Google, and Facebook had to face claims that they played a role in the Istanbul attack. Explain the difference there.

Eric Goldman: So some of it's just based on the way in which the arguments are made. These cases have each had their own unique twist to them. In that particular case, the question is actually a technical statutory question. Congress enacted liability for people who might have played a role in contributing to terrorist attacks, and in that case the lower court said that the statute didn't apply. But the way that the case was framed for the Ninth Circuit, the Ninth Circuit said that statute could apply, and we need to go and ask more questions about it. And that's now what Twitter is appealing up to the Supreme Court, to ask the question whether or not the statute even reaches the activity. If it doesn't, then Twitter is not liable, because the statute never created liability.

June Grosso: Is it happenstance that both of these involve terrorist attacks on foreign soil?
Eric Goldman: No, I don't think it's happenstance, because, in fact, many of the cases that have been brought in this genre, and there are about twenty of them that have been filed across the country, have involved foreign terrorist activity; some of them have involved domestic activity. Ultimately, I don't think that it really matters from a legal standpoint. There are so many legal reasons why the services shouldn't be liable, regardless of where the terrorist activity took place.

June Grosso: So these cases are the first test of Section two thirty at the Supreme Court. What does it tell you that the Supreme Court agreed to hear them when, like so many other cases this term, it didn't have to, meaning there was no circuit split that it had to resolve?

Eric Goldman: Yeah. So one of the most common reasons the Supreme Court takes a case is because of circuit splits, where two federal courts are in disagreement with each other and they need the Supreme Court to weigh in and resolve the dispute. The problem, and this is what happened here, is that the Ninth Circuit Court of Appeals opinion had its own intrinsic split. There were three judges on the panel and they wrote three different opinions that were wildly different from each other. They were not in sync with each other. So though there wasn't a circuit split, there was an intra-panel split. Now, normally those get resolved by what's called an en banc procedure. The federal appeals court can say, we need to have more judges listen to this case so that we can figure out how to come up with a more harmonized resolution. The Ninth Circuit didn't do that. So because the Ninth Circuit opinion was so messy and the Ninth Circuit didn't clean it up, it created the possibility for the Supreme Court to say, there's a mess here that we need to resolve. Two of the opinions also basically said, we think Section two thirty is a problem, and so it created a flag for the Supreme Court to pay attention: there's a statutory problem here that needs attention.
Maybe you ought to take a look. So it was a combination of the messy opinion plus what the judges said that, I think, in fact drove the Supreme Court's interest.

June Grosso: Justice Clarence Thomas had already expressed interest and indicated that he's willing to change the law if Congress isn't.

Eric Goldman: Well, we have to assume that Justice Thomas was in favor of hearing this case, because he's basically begged plaintiffs to bring Section two thirty cases to him so he can find a way to try and eviscerate it. So we know that Justice Thomas is already coming in as an extreme Section two thirty skeptic. He's literally told us, when nobody asked him to.

June Grosso: Google's chief executive officer told lawmakers last year that revoking Section two thirty would mean that platforms would either over-filter content or not be able to filter content at all. Do you agree with that?

Eric Goldman: I do, and it's a very well known phenomenon with online content. It's something that I call the moderator's dilemma. The idea is that if you're liable for trying and failing, then either you don't try at all so that you can't fail, you just let everything go through, and therefore you haven't intervened at all; or you over-respond to make sure you don't fail, which is impossible, but it leads to lots of collateral damage as well. There is, of course, the third option, which Google is unlikely to take but many other services will, which is to exit the industry and say that it's not profitable to do nothing or to be perfect, and therefore we have to simply find another line of business.

June Grosso: Let's say Section two thirty is gone. What effect would that have on social media companies?

Eric Goldman: It's not just social media companies, it's the entire Internet. So much of the Internet is driven by user-generated content, us talking to each other, and Section two thirty is the legal foundation that enables those conversations to take place without the services being liable for facilitating or enabling those conversations.
So without Section two thirty, many of those conversations will simply stop. They won't be possible to do anymore, because the legal liability will overwhelm the benefit. Now, some of the services that exist today are big enough and powerful enough that they will either find a way to thread the legal needle and accept whatever collateral damage comes from that, or they will move towards professionally produced content. They'll stop letting users talk to each other. They'll pay some people who they trust to submit content that they will accept the legal risk for, and as a result, it becomes a lot more of the Internet being people talking to us, not us talking to each other.

June Grosso: As I understood it, this was about algorithm-generated recommendations. Could the court just eliminate those?

Eric Goldman: In theory, one solution is that the Supreme Court could say that, quote, algorithmic recommendations are excluded from Section two thirty, but Section two thirty otherwise remains intact. That would be a massive strategic loss for the Internet, and the reason why is because there's no principled way to distinguish between algorithmic recommendations and any other promotional or curatorial functions that Internet services perform. So basically, saying that algorithmic recommendations are outside of Section two thirty is to say there's no way to promote or encourage readers to look at particular types of content and still stay within Section two thirty. That would lead to an Internet that looks a lot more like Google Drive or Dropbox. The services could only provide dumb storage lockers and a URL that the users go out and promote, and that would be the only thing that would be covered by Section two thirty. Everything else would be gone, and I don't think we want an Internet full of Google Drives.

June Grosso: Besides Justice Thomas, are there other justices who have expressed displeasure with Section two thirty and might be willing to tamper with it?
Eric Goldman: You know, it's a little hard to read the judges nowadays, because every speech-related question is intrinsically linked with the culture wars that have roiled the Supreme Court. So it's unclear whether or not judges who in the past have stood for less government intervention into private activity still stand for that, or whether judges who believe that Internet services should be doing more to remove content will feel that way when it comes to the implications of that. So we're actually kind of in limbo with the other judges. We don't really know where they're likely to come out. And this is of substantial import, because not only are they going to wrestle with the questions in the Section two thirty context, but there will be another appeal, of laws coming from Florida and Texas, that will ask the judges to weigh in further on the ability of Internet services to moderate content from users. So they're going to be having to answer this question, and we don't know what they're going to answer, and their decisions are likely to shape the future of the Internet. So two thousand twenty three is going to be a very scary time for the future of the Internet, because the Supreme Court is going to decide it and we don't know what they're gonna say.

June Grosso: And Eric, the cases involving Texas's social media law and Florida's social media law, where there's a split in the circuits involving similar laws. Do you think the Court will take that?

Eric Goldman: I do think they're going to take the case, and the reason why, in part, is because of an opinion that came out on the Texas law on the Supreme Court's shadow docket, where three justices, led by Justice Alito, said that they think that these cases should be granted certiorari. You only need four votes, so if any one of the other six thinks that they should take this case, then the votes are there.
So I'm highly confident that the Supreme Court is going to take the case, and when they do, we'll have another battle royale over the future of the Internet.

June Grosso: Finally, Section two thirty: do you think it's something that Congress should take up and work on, or do you think it should just be left alone?

Eric Goldman: We benefit every day, hour by hour, minute by minute, from Section two thirty. The times that we're talking to each other online are some of the most important and common moments that we have using the Internet. So even small changes to Section two thirty could have dramatic effects on the way that we spend our time, the way that we enjoy our lives, and the things that we're able to do. So I think we actually have it pretty good in that respect, in the sense that there's a lot of things taking place today that are only possible because Section two thirty enables them. So if Congress wants to take a look at Section two thirty, that is their prerogative, and they have asked questions about algorithmic recommendations in some of their draft bills. However, the Supreme Court should not be the one that reshapes Section two thirty. If they think that Section two thirty is miscalibrated, they should tell Congress that, but they shouldn't go and change it. So the fact that the Supreme Court might change Section two thirty is what really panics me, because then it creates the possibility that unelected justices are making decisions that will affect our daily lives in ways that are really impossible to contemplate. So the stakes are so high.

June Grosso: So the Court is stepping into yet another controversial area this term, in addition to voting rights, affirmative action, gay rights, and the environment, to name just a few. It's going to be quite a term. Thanks so much, Eric. That's Professor Eric Goldman of Santa Clara University Law School. He's also co-director of the school's High Tech Law Institute. And that's it for this edition of the Bloomberg Law Show.
Remember, you can always get the latest legal news on our Bloomberg Law podcast. You can find them wherever you get your favorite podcasts. I'm June Grosso, and you're listening to Bloomberg.