1 00:00:02,759 --> 00:00:07,000 Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio. 2 00:00:08,640 --> 00:00:12,200 Speaker 2: It's been three years since a white supremacist opened fire 3 00:00:12,360 --> 00:00:16,200 Speaker 2: inside a grocery store in Buffalo and killed ten people. 4 00:00:16,520 --> 00:00:20,720 Speaker 3: This tragedy shocked us. It devastated us. It pushed us 5 00:00:21,400 --> 00:00:24,239 Speaker 3: beyond what we thought were our limits. But it 6 00:00:24,280 --> 00:00:27,320 Speaker 3: didn't break us. It didn't break us. Instead, it revealed 7 00:00:27,320 --> 00:00:29,680 Speaker 3: a strength that runs deep in the veins of the city, 8 00:00:30,080 --> 00:00:32,440 Speaker 3: the strength of a community that binds together and refuses 9 00:00:32,520 --> 00:00:35,000 Speaker 3: to be defined by acts of hate. 10 00:00:35,040 --> 00:00:38,479 Speaker 2: The eighteen-year-old shooter pleaded guilty to all state 11 00:00:38,640 --> 00:00:43,239 Speaker 2: charges and was sentenced to eleven concurrent life sentences. The 12 00:00:43,240 --> 00:00:47,920 Speaker 2: federal case against him is ongoing, but survivors and families 13 00:00:47,920 --> 00:00:52,360 Speaker 2: of the victims are seeking to hold social media companies responsible. 14 00:00:52,720 --> 00:00:57,160 Speaker 2: They're suing internet giants Google, Meta, Amazon, and Reddit for 15 00:00:57,280 --> 00:01:01,920 Speaker 2: publishing racist content that allegedly motivated the shooter to commit 16 00:01:01,960 --> 00:01:05,720 Speaker 2: a hate crime by targeting people in a historically black 17 00:01:05,800 --> 00:01:08,160 Speaker 2: neighborhood hundreds of miles from his home.
18 00:01:08,680 --> 00:01:13,960 Speaker 4: Our case is fundamentally about the isolation, obsession, lack of 19 00:01:14,000 --> 00:01:20,280 Speaker 4: impulse control, and desensitization resulting from the neurological and psychological 20 00:01:20,319 --> 00:01:24,160 Speaker 4: trauma inflicted on a teenager's brain by a highly addictive 21 00:01:24,200 --> 00:01:25,520 Speaker 4: social media product. 22 00:01:25,920 --> 00:01:28,640 Speaker 2: And today both sides were in a New York appellate 23 00:01:28,720 --> 00:01:32,679 Speaker 2: courtroom, where the social media companies are arguing that they 24 00:01:32,720 --> 00:01:36,240 Speaker 2: can't be held liable for the racist content on their sites. 25 00:01:36,680 --> 00:01:40,720 Speaker 2: The victims' families say the algorithms the Internet sites used 26 00:01:40,800 --> 00:01:45,440 Speaker 2: to deliver tailored content to users should be considered products 27 00:01:45,520 --> 00:01:48,960 Speaker 2: under the state's product liability law. Here are the attorneys 28 00:01:48,960 --> 00:01:50,880 Speaker 2: for the victims' families and Meta. 29 00:01:51,440 --> 00:01:55,320 Speaker 1: Any time a manufacturer places a dangerous product into the 30 00:01:55,320 --> 00:01:59,720 Speaker 1: stream of commerce that they reasonably know creates an unreasonable 31 00:01:59,760 --> 00:02:03,720 Speaker 1: risk of harm to the public, they are liable under 32 00:02:03,800 --> 00:02:09,080 Speaker 1: New York State products liability law. It's plain and dirt simple. 33 00:02:10,440 --> 00:02:13,760 Speaker 5: And there are two sort of foundational problems here that 34 00:02:13,840 --> 00:02:17,480 Speaker 5: get us outside of the realm of traditional product liability. 35 00:02:17,880 --> 00:02:23,480 Speaker 5: One is that communicating intangible information is not a product, 36 00:02:23,520 --> 00:02:27,360 Speaker 5: and the second is that services are not products.
And 37 00:02:27,480 --> 00:02:31,880 Speaker 5: here we have both of those things. Product liability law 38 00:02:32,000 --> 00:02:35,919 Speaker 5: is geared to the tangible world. That's the language of 39 00:02:36,000 --> 00:02:37,799 Speaker 5: the seminal Winter case. 40 00:02:38,400 --> 00:02:40,280 Speaker 2: It would be a first in New York if a 41 00:02:40,360 --> 00:02:43,880 Speaker 2: court found a non-tangible thing to be a product 42 00:02:43,919 --> 00:02:47,800 Speaker 2: for purposes of liability law. My guest is Eric Goldman, 43 00:02:47,919 --> 00:02:51,519 Speaker 2: a professor at Santa Clara University Law School and co-director 44 00:02:51,600 --> 00:02:54,720 Speaker 2: of the High Tech Law Institute. Eric, can you 45 00:02:54,760 --> 00:02:58,160 Speaker 2: tell us about the allegations of the plaintiffs in these cases? 46 00:02:58,800 --> 00:03:03,560 Speaker 6: The case involves the mass shooting in a Buffalo supermarket 47 00:03:03,600 --> 00:03:07,800 Speaker 6: by a white supremacist, and the basic gist of the 48 00:03:07,840 --> 00:03:13,400 Speaker 6: claims is that the shooter was radicalized on Facebook, and 49 00:03:13,480 --> 00:03:16,799 Speaker 6: so in order to fit that into legal doctrines, there's 50 00:03:16,800 --> 00:03:20,760 Speaker 6: a variety of different claims that the victims and families 51 00:03:20,760 --> 00:03:24,720 Speaker 6: have made in order to explain what's wrong with that, 52 00:03:25,360 --> 00:03:29,600 Speaker 6: why that actually creates legal rights for the victims. 53 00:03:29,639 --> 00:03:34,279 Speaker 6: Some of those include a claim that Facebook was negligent, 54 00:03:34,800 --> 00:03:38,640 Speaker 6: that it should have done something different to protect the victims, 55 00:03:38,800 --> 00:03:41,400 Speaker 6: and some of it is based on what we sometimes 56 00:03:41,400 --> 00:03:45,960 Speaker 6: call products liability.
That Facebook is a product that is 57 00:03:46,080 --> 00:03:49,360 Speaker 6: dangerous, and as a result, any harms caused by 58 00:03:49,360 --> 00:03:52,520 Speaker 6: a dangerous product can be attributed to the supplier of 59 00:03:52,560 --> 00:03:53,200 Speaker 6: that product. 60 00:03:53,880 --> 00:03:57,640 Speaker 2: Are the plaintiffs bringing it under the product liability law 61 00:03:58,400 --> 00:04:00,200 Speaker 2: in order to get around Section two thirty? 62 00:04:01,760 --> 00:04:05,560 Speaker 6: No. Section two thirty is a defense, whereas the plaintiffs 63 00:04:05,560 --> 00:04:07,840 Speaker 6: have to still figure out a claim that actually fits 64 00:04:07,920 --> 00:04:11,280 Speaker 6: their facts. However, they do want to pick claims that 65 00:04:11,360 --> 00:04:14,800 Speaker 6: are not preempted by Section two thirty, which says, in short, 66 00:04:14,840 --> 00:04:17,839 Speaker 6: that websites aren't liable for third party content. So the 67 00:04:17,920 --> 00:04:20,800 Speaker 6: idea is that the plaintiffs are arguing that they're not 68 00:04:20,960 --> 00:04:25,800 Speaker 6: suing to hold Facebook liable for the third party content. 69 00:04:25,920 --> 00:04:30,320 Speaker 6: They're suing based on Facebook's contribution to the radicalization of 70 00:04:30,360 --> 00:04:30,760 Speaker 6: the shooter. 71 00:04:31,520 --> 00:04:34,240 Speaker 2: So when you think of product liability, you think of 72 00:04:34,480 --> 00:04:39,440 Speaker 2: tangible objects. You know, a mechanical device that injures you, 73 00:04:40,040 --> 00:04:43,640 Speaker 2: something like that. Has a New York court ever found 74 00:04:44,160 --> 00:04:48,160 Speaker 2: a streaming service or social media to be a product 75 00:04:48,240 --> 00:04:50,279 Speaker 2: under the state's product liability law?
76 00:04:50,760 --> 00:04:53,440 Speaker 6: For decades, there have been legal theories that have held 77 00:04:53,480 --> 00:04:56,960 Speaker 6: that the suppliers of physical items that are at risk 78 00:04:57,040 --> 00:05:01,520 Speaker 6: of causing physical injury are liable for those physical injuries. 79 00:05:01,560 --> 00:05:04,920 Speaker 6: And one of the classic old cases involved exploding Coca-Cola 80 00:05:04,960 --> 00:05:08,279 Speaker 6: bottles, and if you didn't manufacture and fill the 81 00:05:08,320 --> 00:05:11,760 Speaker 6: Coca-Cola bottle properly, it could literally explode and create 82 00:05:11,880 --> 00:05:14,720 Speaker 6: shards of glass that were projectiles. And so the idea 83 00:05:14,760 --> 00:05:16,800 Speaker 6: is that we want manufacturers to be more careful in 84 00:05:16,880 --> 00:05:19,520 Speaker 6: the manufacture of an item like that because of the 85 00:05:19,520 --> 00:05:23,520 Speaker 6: physical risk it poses when that item is now in consumers' homes. 86 00:05:23,720 --> 00:05:29,000 Speaker 6: There's a very bright line in those doctrines between physical 87 00:05:29,040 --> 00:05:33,600 Speaker 6: items that are capable of causing physical injury and intangible 88 00:05:33,720 --> 00:05:37,920 Speaker 6: things or services, which do not pose the same risk 89 00:05:37,960 --> 00:05:42,080 Speaker 6: of sending glass shards as projectiles, because there's literally nothing 90 00:05:42,520 --> 00:05:46,839 Speaker 6: like that in their possession. So the law normally limits 91 00:05:46,960 --> 00:05:52,680 Speaker 6: products liability theories to actual physical products, not to services 92 00:05:53,279 --> 00:05:58,279 Speaker 6: like Internet services like Facebook. So it's a leap to 93 00:05:58,360 --> 00:06:01,720 Speaker 6: try and expand the theory to cover services.
94 00:06:01,880 --> 00:06:05,920 Speaker 6: This theory wasn't designed for that purpose, it's not optimized 95 00:06:06,160 --> 00:06:09,599 Speaker 6: to do so, and unlike the regulation of Coca-Cola 96 00:06:09,640 --> 00:06:14,320 Speaker 6: bottles exploding with glass shards, there are significant speech implications if 97 00:06:14,320 --> 00:06:17,920 Speaker 6: we tried to regulate the dissemination of information as if 98 00:06:17,960 --> 00:06:21,560 Speaker 6: it's an exploding Coke bottle. So there have been plaintiffs 99 00:06:21,600 --> 00:06:24,719 Speaker 6: across the country that have been trying out the theories 100 00:06:25,160 --> 00:06:30,640 Speaker 6: that internet services are subject to products liability doctrines, and 101 00:06:30,880 --> 00:06:34,719 Speaker 6: courts have split on that issue. Most courts have rejected it, 102 00:06:35,040 --> 00:06:39,360 Speaker 6: some have entertained it, and so it's a live frontier 103 00:06:39,640 --> 00:06:40,920 Speaker 6: in internet law today. 104 00:06:41,520 --> 00:06:44,279 Speaker 2: And have there been rulings by appellate courts on that? 105 00:06:44,760 --> 00:06:47,400 Speaker 6: Yes and no. There's been a variety of rulings in 106 00:06:47,440 --> 00:06:50,640 Speaker 6: different fact circumstances.
But there's a leading case in this 107 00:06:50,720 --> 00:06:53,359 Speaker 6: area that just came out in February from the Fourth 108 00:06:53,400 --> 00:06:57,200 Speaker 6: Circuit, a very similar case involving a shooter 109 00:06:57,560 --> 00:07:01,920 Speaker 6: who allegedly was radicalized on Facebook, and the plaintiffs argued 110 00:07:01,960 --> 00:07:04,840 Speaker 6: products liability as part of the reasons why Facebook should 111 00:07:04,839 --> 00:07:07,680 Speaker 6: be liable, and in that case the federal appellate court 112 00:07:07,839 --> 00:07:11,280 Speaker 6: shut all that down and said that Facebook qualified for 113 00:07:11,480 --> 00:07:14,600 Speaker 6: Section two thirty. Everything that the plaintiffs were arguing was 114 00:07:14,680 --> 00:07:17,480 Speaker 6: trying to hold Facebook liable for the third party content 115 00:07:17,480 --> 00:07:20,880 Speaker 6: available on the service, and therefore the plaintiffs lose. So the 116 00:07:20,920 --> 00:07:24,480 Speaker 6: plaintiffs in this case aren't directly bound by that ruling, 117 00:07:24,520 --> 00:07:27,640 Speaker 6: but clearly there's a strong message from that ruling that 118 00:07:27,680 --> 00:07:29,640 Speaker 6: the plaintiffs' arguments should not be successful. 119 00:07:30,360 --> 00:07:34,520 Speaker 2: Does that issue depend on the state's product liability law, 120 00:07:34,640 --> 00:07:37,440 Speaker 2: because those laws differ from state to state? 121 00:07:37,840 --> 00:07:40,160 Speaker 6: It can. We have to look at the terms of 122 00:07:40,160 --> 00:07:43,240 Speaker 6: the exact statute, what it says it covers. That might 123 00:07:43,360 --> 00:07:46,760 Speaker 6: very well restrict plaintiffs' claims, but not expand them.
In 124 00:07:46,800 --> 00:07:50,000 Speaker 6: other words, there's some limit to how far the theory 125 00:07:50,040 --> 00:07:52,440 Speaker 6: can go; at some point it has 126 00:07:52,480 --> 00:07:56,360 Speaker 6: to stop, either because of doctrines like Section two thirty, 127 00:07:56,440 --> 00:07:59,560 Speaker 6: which Congress enacted as the way to prevent that kind 128 00:07:59,600 --> 00:08:02,520 Speaker 6: of expansion, or because of the First Amendment, that the 129 00:08:02,600 --> 00:08:07,360 Speaker 6: Constitution does not permit a physically based legal doctrine to 130 00:08:07,440 --> 00:08:09,400 Speaker 6: extend to the dissemination of speech. 131 00:08:09,840 --> 00:08:13,080 Speaker 2: If the court allows the case to go forward, will 132 00:08:13,080 --> 00:08:16,720 Speaker 2: the plaintiffs have to prove that Facebook or the other 133 00:08:16,840 --> 00:08:20,760 Speaker 2: platforms led to the radicalization of the shooter? And how 134 00:08:20,800 --> 00:08:22,320 Speaker 2: difficult will that be? 135 00:08:22,800 --> 00:08:26,080 Speaker 6: Definitely, they'll have to show what we call causation 136 00:08:26,600 --> 00:08:29,480 Speaker 6: when it comes to negligence. I believe that that will 137 00:08:29,520 --> 00:08:33,120 Speaker 6: also be relevant in the context of strict products liability, 138 00:08:33,480 --> 00:08:36,720 Speaker 6: and it will be difficult. And causation was also an 139 00:08:36,760 --> 00:08:39,760 Speaker 6: issue in the federal appellate court ruling I mentioned from 140 00:08:39,800 --> 00:08:42,840 Speaker 6: the Fourth Circuit, and the court independently said there was 141 00:08:43,200 --> 00:08:46,360 Speaker 6: not sufficient causation in that case.
And when we think 142 00:08:46,400 --> 00:08:51,600 Speaker 6: about what causes someone to radicalize into a white supremacist, 143 00:08:51,679 --> 00:08:54,240 Speaker 6: and, even worse, what causes a white supremacist to then 144 00:08:54,320 --> 00:08:58,360 Speaker 6: decide to engage in mass murder, there are so many things 145 00:08:58,440 --> 00:09:01,520 Speaker 6: that contribute to that outcome, and it's very difficult to 146 00:09:01,520 --> 00:09:05,360 Speaker 6: pull out one piece and say that's the cause, they 147 00:09:05,360 --> 00:09:08,480 Speaker 6: should bear all the burden of responsibility. In fact, it 148 00:09:08,520 --> 00:09:11,480 Speaker 6: really is a whole-of-society failure to have people 149 00:09:11,520 --> 00:09:13,880 Speaker 6: who end up in that circumstance, and we really need 150 00:09:13,920 --> 00:09:16,480 Speaker 6: to look at our society across the board to figure 151 00:09:16,520 --> 00:09:18,760 Speaker 6: out what went wrong and what we ought to be 152 00:09:18,800 --> 00:09:21,319 Speaker 6: doing differently, and to try and pin it on any one 153 00:09:21,400 --> 00:09:24,680 Speaker 6: player in the ecosystem, to say Facebook was the reason 154 00:09:24,720 --> 00:09:27,560 Speaker 6: why this particular person pulled the trigger, I really think 155 00:09:27,720 --> 00:09:32,440 Speaker 6: kind of denies the reality of how we've all contributed 156 00:09:32,520 --> 00:09:35,800 Speaker 6: to a society that can lead to these outcomes. 157 00:09:35,920 --> 00:09:39,199 Speaker 2: Explain how Section two thirty would work here as a defense. 158 00:09:39,280 --> 00:09:44,120 Speaker 6: So the plaintiffs are arguing that Facebook radicalized the shooter, 159 00:09:44,480 --> 00:09:50,080 Speaker 6: and Facebook itself normally doesn't publish content that it authored.
160 00:09:50,640 --> 00:09:55,160 Speaker 6: Its primary way of engaging in content distribution is to 161 00:09:55,240 --> 00:09:58,319 Speaker 6: gather third party content, like the Facebook posts that we 162 00:09:58,360 --> 00:10:01,720 Speaker 6: all make as an everyday activity, and then share 163 00:10:01,760 --> 00:10:06,959 Speaker 6: that with an audience. And so the content that allegedly 164 00:10:07,080 --> 00:10:12,240 Speaker 6: radicalized the shooter didn't come from Facebook. Facebook, at most, was 165 00:10:12,520 --> 00:10:15,920 Speaker 6: the mechanism by which some author matched with the shooter. 166 00:10:16,600 --> 00:10:21,200 Speaker 6: And so it's impossible, really, to describe Facebook's role in 167 00:10:21,320 --> 00:10:24,760 Speaker 6: radicalizing the shooter without talking about the content that radicalized 168 00:10:24,800 --> 00:10:27,520 Speaker 6: the shooter, all of which came from a third party. 169 00:10:27,880 --> 00:10:30,800 Speaker 6: And if third party content can't create liability for Facebook, 170 00:10:30,840 --> 00:10:33,000 Speaker 6: Section two thirty will apply, you know. 171 00:10:33,160 --> 00:10:36,600 Speaker 2: Michael Bloomberg, the founder and majority owner of Bloomberg LP, 172 00:10:36,880 --> 00:10:39,960 Speaker 2: the parent company of Bloomberg Radio, is a donor to 173 00:10:40,000 --> 00:10:44,520 Speaker 2: groups that support gun control, including Everytown for Gun Safety, 174 00:10:44,800 --> 00:10:47,959 Speaker 2: which represented the plaintiffs in this case in the lower courts. 175 00:10:48,440 --> 00:10:51,480 Speaker 2: Coming up next on the Bloomberg Law Show, I'll continue 176 00:10:51,480 --> 00:10:55,280 Speaker 2: this conversation with Professor Eric Goldman of Santa Clara University 177 00:10:55,360 --> 00:10:58,920 Speaker 2: Law School. We'll talk more about the algorithms at issue 178 00:10:58,920 --> 00:11:03,080 Speaker 2: in the case.
Survivors and families of the victims of 179 00:11:03,120 --> 00:11:06,760 Speaker 2: the mass shooting in a Buffalo supermarket three years ago 180 00:11:07,080 --> 00:11:11,320 Speaker 2: are suing Internet giants Google, Meta, Amazon, and Reddit for 181 00:11:11,440 --> 00:11:16,079 Speaker 2: publishing racist content that allegedly motivated the shooter to commit 182 00:11:16,120 --> 00:11:20,400 Speaker 2: a hate crime targeting people in an historically black neighborhood 183 00:11:20,679 --> 00:11:23,960 Speaker 2: hundreds of miles from his home. The plaintiffs blame the 184 00:11:24,040 --> 00:11:27,040 Speaker 2: proprietary algorithms the companies use. 185 00:11:27,320 --> 00:11:31,880 Speaker 1: Putting people in groups. There's something called confirmation bias. And 186 00:11:31,960 --> 00:11:35,880 Speaker 1: when you put people in groups where they're getting these 187 00:11:37,600 --> 00:11:41,720 Speaker 1: white supremacist theories and these calls to violence and these 188 00:11:41,760 --> 00:11:44,240 Speaker 1: calls to make a race war and think that that's 189 00:11:44,280 --> 00:11:47,080 Speaker 1: the right thing. That's what happened to Payton Gendron, and 190 00:11:47,120 --> 00:11:48,760 Speaker 1: that's what this lawsuit is about. 191 00:11:49,080 --> 00:11:53,000 Speaker 2: But the Internet companies say they're protected from liability under 192 00:11:53,000 --> 00:11:57,240 Speaker 2: the law, which says no interactive computer service provider can 193 00:11:57,280 --> 00:12:00,719 Speaker 2: be treated as the publisher or speaker of any information 194 00:12:01,200 --> 00:12:05,280 Speaker 2: provided by another information content provider. I've been talking with 195 00:12:05,360 --> 00:12:09,920 Speaker 2: Professor Eric Goldman of Santa Clara University Law School. Eric, 196 00:12:09,960 --> 00:12:13,439 Speaker 2: tell us about the plaintiffs' claims about the algorithms.
197 00:12:14,120 --> 00:12:16,640 Speaker 6: So, in order for the shooter to get access to 198 00:12:16,720 --> 00:12:20,559 Speaker 6: the radicalizing content, somebody has to upload it, and then 199 00:12:20,640 --> 00:12:23,800 Speaker 6: Facebook has to make the match to present that to 200 00:12:23,880 --> 00:12:25,920 Speaker 6: the reader, the shooter in this case, and so 201 00:12:26,040 --> 00:12:29,400 Speaker 6: plaintiffs have often claimed that it was the algorithm Facebook 202 00:12:29,440 --> 00:12:31,959 Speaker 6: created that led to this radicalization. Now, there are a 203 00:12:32,000 --> 00:12:34,400 Speaker 6: couple of problems with that theory. One is it was 204 00:12:34,440 --> 00:12:38,200 Speaker 6: expressly rejected in the Fourth Circuit opinion that I mentioned earlier; 205 00:12:38,679 --> 00:12:41,320 Speaker 6: the federal appellate court said that's not a way of getting 206 00:12:41,360 --> 00:12:44,720 Speaker 6: around Section two thirty. The other piece, and the appellate 207 00:12:44,760 --> 00:12:48,040 Speaker 6: court mentioned this as well, is that in order for Facebook 208 00:12:48,080 --> 00:12:50,800 Speaker 6: to do anything differently to prevent that radicalization, it would 209 00:12:50,800 --> 00:12:53,760 Speaker 6: have to re-architect its entire service. In other words, 210 00:12:53,880 --> 00:12:57,040 Speaker 6: there's not a way of it just changing the algorithm 211 00:12:57,080 --> 00:13:00,520 Speaker 6: to eliminate the risk of radicalization without changing 212 00:13:00,600 --> 00:13:04,040 Speaker 6: every other element of its algorithm. So, in other words, 213 00:13:04,120 --> 00:13:07,480 Speaker 6: there's no easy fix here for Facebook. Facebook worked the 214 00:13:07,480 --> 00:13:10,200 Speaker 6: way it was designed to work. It makes matches between 215 00:13:10,200 --> 00:13:13,079 Speaker 6: people and content they might be interested in.
Unfortunately, some 216 00:13:13,120 --> 00:13:17,920 Speaker 6: people are interested in content that is antisocial. Facebook's algorithm 217 00:13:18,080 --> 00:13:19,959 Speaker 6: treats that the same as all other content. 218 00:13:20,400 --> 00:13:23,480 Speaker 2: The director of the Center for Democracy and Technology's Free 219 00:13:23,520 --> 00:13:27,280 Speaker 2: Expression Project, which filed an amicus brief in support 220 00:13:27,400 --> 00:13:31,200 Speaker 2: of the social media companies, said if the automatic ranking 221 00:13:31,240 --> 00:13:34,280 Speaker 2: and delivery of content is separated from the Section two 222 00:13:34,320 --> 00:13:38,200 Speaker 2: thirty liability shield, it will suddenly make many pieces of 223 00:13:38,280 --> 00:13:43,280 Speaker 2: content open to liability and incentivize platforms that use automated 224 00:13:43,360 --> 00:13:47,640 Speaker 2: ranking systems to deliver content to suppress or eliminate delivery 225 00:13:47,679 --> 00:13:50,600 Speaker 2: of content they're worried about. So basically, would there be 226 00:13:50,640 --> 00:13:52,720 Speaker 2: a chilling effect if that happens? 227 00:13:53,360 --> 00:13:55,679 Speaker 6: I think that's a riff on what I was just saying, 228 00:13:55,760 --> 00:13:57,800 Speaker 6: and maybe said better than I did. But let me 229 00:13:57,880 --> 00:14:01,680 Speaker 6: take another cut at it. So, if Facebook is liable 230 00:14:01,880 --> 00:14:07,400 Speaker 6: for radicalization content, and if, and I'm going to hypothesize 231 00:14:07,480 --> 00:14:10,800 Speaker 6: this is true, people might debate it, there is no way 232 00:14:10,840 --> 00:14:14,160 Speaker 6: for Facebook to know which content is radicalizing or not, 233 00:14:14,559 --> 00:14:17,960 Speaker 6: or to properly separate that from the pro-social content 234 00:14:18,000 --> 00:14:21,840 Speaker 6: that it would prefer to disseminate.
Then Facebook basically becomes 235 00:14:21,840 --> 00:14:25,360 Speaker 6: the financial guarantor of any harms that are caused by 236 00:14:25,400 --> 00:14:28,560 Speaker 6: people, that can be traced back to someone's consumption of content 237 00:14:28,600 --> 00:14:31,880 Speaker 6: on Facebook. We've seen so many cases in this genre. 238 00:14:32,000 --> 00:14:35,280 Speaker 6: This is not a unique case in Buffalo. Many times, 239 00:14:35,520 --> 00:14:39,760 Speaker 6: victims of shootings have claimed that the shooter was exposed 240 00:14:39,760 --> 00:14:41,960 Speaker 6: to content on social media, and social media should be 241 00:14:42,000 --> 00:14:45,080 Speaker 6: responsible for this, and that's just not tenable. Social media 242 00:14:45,320 --> 00:14:48,880 Speaker 6: cannot act as the financial guarantor of all the shootings 243 00:14:48,880 --> 00:14:51,160 Speaker 6: that take place in the world, so then they have 244 00:14:51,240 --> 00:14:53,640 Speaker 6: to do something different. If they become the financial guarantor, 245 00:14:53,720 --> 00:14:55,280 Speaker 6: they cannot exist in their current form. 246 00:14:55,840 --> 00:14:59,760 Speaker 2: The easiest path for the Appellate Division right now in 247 00:14:59,800 --> 00:15:03,520 Speaker 2: this case is to allow the case to go forward 248 00:15:03,560 --> 00:15:05,720 Speaker 2: and to say, you know, the facts have to be 249 00:15:05,760 --> 00:15:09,120 Speaker 2: developed before we can make a decision on whether or 250 00:15:09,160 --> 00:15:12,720 Speaker 2: not this is subject to the state's product liability law. 251 00:15:13,280 --> 00:15:15,960 Speaker 6: So that's essentially what the lower court did.
The lower 252 00:15:16,000 --> 00:15:20,640 Speaker 6: court basically said, I understand Facebook's defenses, and yet I 253 00:15:20,760 --> 00:15:23,440 Speaker 6: need to see more facts before I'm prepared to 254 00:15:23,440 --> 00:15:26,640 Speaker 6: make the conclusion that products liability applies, or even that 255 00:15:26,720 --> 00:15:30,000 Speaker 6: Section two thirty applies. So one possibility is that the appellate 256 00:15:30,120 --> 00:15:33,400 Speaker 6: court could say that the lower court was right. Let the 257 00:15:33,440 --> 00:15:36,840 Speaker 6: case continue to evolve in its ordinary manner, and once 258 00:15:36,880 --> 00:15:38,600 Speaker 6: we get more information, then we'll be able to figure 259 00:15:38,600 --> 00:15:42,040 Speaker 6: out what defenses actually apply in the circumstance. Most 260 00:15:42,080 --> 00:15:45,240 Speaker 6: cases involving Section two thirty don't do that. Most cases 261 00:15:45,280 --> 00:15:48,200 Speaker 6: involving Section two thirty end at the motion to dismiss, 262 00:15:48,280 --> 00:15:50,720 Speaker 6: because it is obvious on the face that there's nothing that 263 00:15:50,760 --> 00:15:55,640 Speaker 6: would be produced in discovery that would show that 264 00:15:55,680 --> 00:15:58,840 Speaker 6: Section two thirty didn't apply. And here I don't think 265 00:15:58,880 --> 00:16:03,680 Speaker 6: anyone's contending that Facebook provided or authored the radicalizing content. 266 00:16:03,920 --> 00:16:06,920 Speaker 6: The allegation is that it matched the shooter with third 267 00:16:07,000 --> 00:16:10,360 Speaker 6: party content. So everything that the court needs to know 268 00:16:10,400 --> 00:16:13,040 Speaker 6: about Section two thirty is, I think, already available 269 00:16:13,080 --> 00:16:16,000 Speaker 6: to it.
So punting on that question actually would be 270 00:16:16,040 --> 00:16:18,840 Speaker 6: a disservice to Facebook and, I think, to all other 271 00:16:18,960 --> 00:16:22,280 Speaker 6: social media that expect to be able to avoid 272 00:16:22,280 --> 00:16:24,560 Speaker 6: liability for things that they're not legally responsible for. 273 00:16:26,120 --> 00:16:29,320 Speaker 2: It's interesting that the plaintiffs here chose state court, 274 00:16:29,800 --> 00:16:32,640 Speaker 2: whereas in the other case you were discussing, the plaintiffs 275 00:16:32,720 --> 00:16:33,800 Speaker 2: chose federal court. 276 00:16:34,520 --> 00:16:36,480 Speaker 6: You know, plaintiffs get to pick the venue that they 277 00:16:36,520 --> 00:16:38,960 Speaker 6: think is the best place for them to achieve justice. 278 00:16:39,040 --> 00:16:42,240 Speaker 6: So it's completely logical that they might conclude that New 279 00:16:42,280 --> 00:16:44,360 Speaker 6: York state courts, as opposed to New York federal courts, 280 00:16:44,360 --> 00:16:47,480 Speaker 6: are a better venue for them. But I would hope 281 00:16:47,520 --> 00:16:49,400 Speaker 6: that the New York state courts will still take a 282 00:16:49,440 --> 00:16:51,560 Speaker 6: look at what's going on in the rest of the country, 283 00:16:51,600 --> 00:16:54,600 Speaker 6: because this is not a novel issue across the country, 284 00:16:54,760 --> 00:16:58,200 Speaker 6: and the case law is very clear that the New 285 00:16:58,280 --> 00:17:00,360 Speaker 6: York state trial court got it wrong. 286 00:17:00,640 --> 00:17:04,560 Speaker 2: How important is this case as a precedent, let's say, 287 00:17:04,640 --> 00:17:07,440 Speaker 2: for other cases around the country? Is it important, 288 00:17:07,560 --> 00:17:09,159 Speaker 2: or is it just, you know, another one of the 289 00:17:09,200 --> 00:17:11,800 Speaker 2: cases that you've been talking about?
Is there any import 290 00:17:11,840 --> 00:17:12,119 Speaker 2: to it? 291 00:17:12,560 --> 00:17:17,359 Speaker 6: So every case that involves a situation where social media 292 00:17:17,359 --> 00:17:23,040 Speaker 6: allegedly contributed to some offline harm is important, because if 293 00:17:23,160 --> 00:17:27,960 Speaker 6: those cases succeed, then all social media services face an 294 00:17:28,000 --> 00:17:31,679 Speaker 6: extreme amount of liability that may make them financially untenable. 295 00:17:32,080 --> 00:17:35,720 Speaker 6: So it's like the social media services have to bat 296 00:17:35,760 --> 00:17:38,680 Speaker 6: a thousand in these cases. If they only bat nine 297 00:17:38,720 --> 00:17:41,320 Speaker 6: ninety-nine and one case gets through, that one 298 00:17:41,400 --> 00:17:46,120 Speaker 6: case could potentially change the entire Internet ecosystem. So if 299 00:17:46,119 --> 00:17:49,960 Speaker 6: this case ends up like essentially all the others, and 300 00:17:50,000 --> 00:17:53,600 Speaker 6: the plaintiffs don't succeed, then this case becomes unimportant. It's 301 00:17:53,640 --> 00:17:56,520 Speaker 6: just one of the many. If, for whatever reason, this 302 00:17:56,600 --> 00:17:59,480 Speaker 6: case does succeed, then this could be the case that 303 00:17:59,520 --> 00:18:03,679 Speaker 6: reshapes the Internet. And the district court opinion in this 304 00:18:03,840 --> 00:18:07,879 Speaker 6: case was so troubling because it clearly deviated from 305 00:18:08,280 --> 00:18:11,959 Speaker 6: precedent around the country, and because it basically said, I 306 00:18:12,000 --> 00:18:16,280 Speaker 6: can't make any judgments, even though many people thought that 307 00:18:16,400 --> 00:18:18,040 Speaker 6: the court had all the facts it needed to make 308 00:18:18,080 --> 00:18:21,280 Speaker 6: a judgment.
And so if that's the case, it opens 309 00:18:21,320 --> 00:18:25,600 Speaker 6: up the door for unpredictable results, and that unpredictability is 310 00:18:25,600 --> 00:18:28,080 Speaker 6: actually its own form of danger to the Internet. 311 00:18:28,480 --> 00:18:29,520 Speaker 2: Any final thoughts? 312 00:18:30,400 --> 00:18:32,840 Speaker 6: You know, the Buffalo shooting was horrific. You know, all 313 00:18:32,880 --> 00:18:36,320 Speaker 6: these mass murders are just heartbreaking. You know, they're a 314 00:18:36,400 --> 00:18:39,800 Speaker 6: sign of pathology in our country. They just absolutely tear 315 00:18:39,880 --> 00:18:43,160 Speaker 6: me up. And the fact that the shooter was espousing 316 00:18:43,280 --> 00:18:48,199 Speaker 6: the great replacement theory is even more troubling, that this 317 00:18:48,400 --> 00:18:53,400 Speaker 6: theory continues to find an audience in our country when 318 00:18:53,600 --> 00:18:57,359 Speaker 6: it's both factually wrong and deeply corrosive. I 319 00:18:57,480 --> 00:19:02,280 Speaker 6: will add that many want to blame Facebook for the dissemination 320 00:19:02,320 --> 00:19:05,280 Speaker 6: of the great replacement theory, but many leading figures in 321 00:19:05,320 --> 00:19:09,679 Speaker 6: our country, whether that's media pundits or actual government officials, 322 00:19:10,000 --> 00:19:13,320 Speaker 6: have also dabbled in the great replacement theory, and so, 323 00:19:14,040 --> 00:19:18,240 Speaker 6: to me, the idea that Facebook radicalized this shooter misses 324 00:19:18,280 --> 00:19:19,520 Speaker 6: the whole ecosystem. 325 00:19:20,000 --> 00:19:22,880 Speaker 2: We'll see if the New York appellate court here deviates 326 00:19:22,920 --> 00:19:27,120 Speaker 2: from the way most other courts have ruled. Thanks so much, Eric. 327 00:19:27,640 --> 00:19:31,159 Speaker 2: That's Professor Eric Goldman of Santa Clara University School 328 00:19:31,200 --> 00:19:34,240 Speaker 2: of Law.
Coming up next on the Bloomberg Law Show. 329 00:19:34,800 --> 00:19:39,000 Speaker 2: The growing trend of presidents using temporary appointments for top 330 00:19:39,160 --> 00:19:44,479 Speaker 2: roles in government. President Trump has installed another loyalist in 331 00:19:44,480 --> 00:19:47,840 Speaker 2: one of the most powerful US attorney's offices in the country. 332 00:19:48,000 --> 00:19:51,680 Speaker 2: Former Fox News host Jeanine Pirro has taken over as 333 00:19:51,720 --> 00:19:55,880 Speaker 2: the interim US Attorney for DC from Ed Martin, who 334 00:19:56,040 --> 00:19:58,960 Speaker 2: was the interim US Attorney for close to one hundred 335 00:19:58,960 --> 00:20:02,920 Speaker 2: and twenty days. The interim nature of both appointments tests 336 00:20:03,000 --> 00:20:07,400 Speaker 2: the bounds of a federal statute governing temporary officials and 337 00:20:07,480 --> 00:20:12,120 Speaker 2: reflects a growing trend of presidents leaning on temporary appointments 338 00:20:12,320 --> 00:20:16,399 Speaker 2: for top roles in government. My guest is Anne Joseph O'Connell, 339 00:20:16,440 --> 00:20:21,080 Speaker 2: a professor at Stanford Law School who specializes in political appointments. 340 00:20:21,840 --> 00:20:24,520 Speaker 2: Will you explain the provisions that are at play when 341 00:20:24,560 --> 00:20:27,600 Speaker 2: a US attorney is given an interim appointment. 342 00:20:28,160 --> 00:20:31,720 Speaker 7: So, when there's not a Senate confirmed US attorney in 343 00:20:31,760 --> 00:20:36,200 Speaker 7: a particular district, there are two statutes that can provide 344 00:20:36,320 --> 00:20:40,520 Speaker 7: for temporary service.
So the first, what we call interim 345 00:20:40,800 --> 00:20:45,040 Speaker 7: US attorneys, is a specific provision in the United States 346 00:20:45,040 --> 00:20:48,880 Speaker 7: Code called Section five forty six, and that provision allows 347 00:20:48,960 --> 00:20:53,360 Speaker 7: for the Attorney General to choose a temporary interim US 348 00:20:53,440 --> 00:20:55,919 Speaker 7: attorney for the district in which there is not a 349 00:20:55,960 --> 00:21:00,280 Speaker 7: Senate confirmed person. There are no restrictions on who the 350 00:21:00,320 --> 00:21:02,760 Speaker 7: Attorney General can pick, so they don't have to pick 351 00:21:02,840 --> 00:21:06,600 Speaker 7: someone who's within the Department of Justice, unlike this other 352 00:21:06,720 --> 00:21:09,960 Speaker 7: statutory provision. And so the Attorney General can pick an 353 00:21:09,960 --> 00:21:13,280 Speaker 7: interim US attorney and that person can serve for one 354 00:21:13,359 --> 00:21:17,159 Speaker 7: hundred and twenty days under Section five forty six, and 355 00:21:17,200 --> 00:21:21,280 Speaker 7: then after that one hundred twenty days expires, the district 356 00:21:21,320 --> 00:21:26,960 Speaker 7: court may, right, may appoint a US attorney to serve 357 00:21:27,160 --> 00:21:30,000 Speaker 7: until there is a Senate confirmed one. 358 00:21:30,280 --> 00:21:33,000 Speaker 2: So what happened here is the term for Ed Martin 359 00:21:33,240 --> 00:21:35,960 Speaker 2: was about to run out, and before the district court 360 00:21:35,960 --> 00:21:41,920 Speaker 2: could appoint someone, President Trump named another interim US attorney, 361 00:21:42,160 --> 00:21:44,600 Speaker 2: Jeanine Pirro. I mean, is that according to the rules? 362 00:21:44,640 --> 00:21:48,040 Speaker 7: Well, it depends whether you're looking at the text 363 00:21:48,040 --> 00:21:50,960 Speaker 7: of the rule or the intent of the provision.
Under 364 00:21:51,119 --> 00:21:55,880 Speaker 7: the text, I think it's permitted. The provision, Section five 365 00:21:55,960 --> 00:22:00,520 Speaker 7: forty six, does not explicitly bar successive one hundred twenty 366 00:22:00,600 --> 00:22:03,919 Speaker 7: day appointments. And I should also note that even if 367 00:22:03,920 --> 00:22:08,600 Speaker 7: the district courts had chosen someone, the President in all 368 00:22:08,760 --> 00:22:12,280 Speaker 7: likelihood could have fired the district court pick and picked 369 00:22:12,400 --> 00:22:16,159 Speaker 7: another interim US attorney or turned to this other statute, the 370 00:22:16,160 --> 00:22:19,639 Speaker 7: Federal Vacancies Reform Act. So I think the text permits 371 00:22:20,200 --> 00:22:24,199 Speaker 7: successive appointments. There's no explicit bar on it. There's no 372 00:22:24,480 --> 00:22:28,640 Speaker 7: provision that provides for a penalty if you see these 373 00:22:28,720 --> 00:22:34,560 Speaker 7: successive appointments. And in the past, we have seen successive appointments. 374 00:22:34,640 --> 00:22:39,239 Speaker 7: Now that's the text. Congress reimposed time limits, so 375 00:22:39,440 --> 00:22:43,320 Speaker 7: put back the one hundred and twenty day constraint on 376 00:22:43,440 --> 00:22:46,280 Speaker 7: how long interim US attorneys can serve. They did that 377 00:22:46,359 --> 00:22:49,960 Speaker 7: in two thousand and seven after a US attorney scandal 378 00:22:50,200 --> 00:22:55,760 Speaker 7: involving Attorney General Alberto Gonzales, who had fired a number 379 00:22:55,880 --> 00:22:59,160 Speaker 7: of US attorneys. And the scandal was not whether the 380 00:22:59,200 --> 00:23:02,879 Speaker 7: president or the Attorney General could fire US attorneys. It 381 00:23:02,920 --> 00:23:06,000 Speaker 7: was about the reason that was given for those firings.
382 00:23:06,040 --> 00:23:08,080 Speaker 7: But at the time that it happened, there was no 383 00:23:08,320 --> 00:23:12,560 Speaker 7: limit on how long interim US attorneys could serve. And 384 00:23:13,000 --> 00:23:16,000 Speaker 7: this had occurred after September eleventh. There was a period 385 00:23:16,000 --> 00:23:18,840 Speaker 7: of years where there was kind of no time limit. 386 00:23:18,920 --> 00:23:21,240 Speaker 7: But then in two thousand and seven, Congress put back 387 00:23:21,280 --> 00:23:23,600 Speaker 7: this time limit, and I think the putting back of 388 00:23:23,640 --> 00:23:27,960 Speaker 7: the time limit suggests strongly that the intent of Congress 389 00:23:28,200 --> 00:23:33,840 Speaker 7: was not to have endless interim appointments of US attorneys. 390 00:23:33,200 --> 00:23:35,879 Speaker 2: And was that the intent of the Framers as well? 391 00:23:36,119 --> 00:23:39,760 Speaker 2: In other words, they wanted the Senate to advise and consent. 392 00:23:40,680 --> 00:23:42,000 Speaker 3: So we have 393 00:23:42,080 --> 00:23:46,639 Speaker 7: had Senate confirmed appointments since the start of the United 394 00:23:46,680 --> 00:23:50,560 Speaker 7: States Constitution under the Appointments Clause, and interestingly, we've had 395 00:23:51,040 --> 00:23:55,840 Speaker 7: statutes that provide for temporary appointments from about the same time. 396 00:23:56,080 --> 00:24:00,040 Speaker 7: The very first Vacancies Act, which applied generally, was in 397 00:24:00,040 --> 00:24:02,960 Speaker 7: seventeen ninety two, and there was another one in seventeen 398 00:24:03,040 --> 00:24:05,959 Speaker 7: ninety five. There it gets a little tricky about kind 399 00:24:06,000 --> 00:24:08,800 Speaker 7: of what were the time limits, because in the seventeen 400 00:24:08,840 --> 00:24:12,800 Speaker 7: ninety five Act, Congress imposed a six month time limit 401 00:24:13,080 --> 00:24:14,800 Speaker 7: on acting officials.
402 00:24:15,200 --> 00:24:19,480 Speaker 2: We hear a lot of actings. Have US presidents been using 403 00:24:19,600 --> 00:24:23,720 Speaker 2: interim positions more and more? 404 00:24:23,600 --> 00:24:29,200 Speaker 7: So I have done some research on recent administrations, looking at the very 405 00:24:29,400 --> 00:24:33,920 Speaker 7: highest level of positions, so the heads of agencies, cabinet 406 00:24:33,960 --> 00:24:39,640 Speaker 7: secretaries in particular. And definitely recent administrations, and that would 407 00:24:39,680 --> 00:24:44,359 Speaker 7: include President Obama, President Trump's first term, and President Biden, 408 00:24:44,520 --> 00:24:49,399 Speaker 7: have heavily relied on acting officials, much more, I would say, 409 00:24:49,880 --> 00:24:53,720 Speaker 7: than their predecessors in terms of modern government. 410 00:24:54,080 --> 00:24:56,760 Speaker 2: What's the reason for this? I know that President Trump 411 00:24:56,840 --> 00:24:59,040 Speaker 2: in the past has said, you know, he likes the 412 00:24:59,119 --> 00:25:00,320 Speaker 2: idea of it. 413 00:25:00,359 --> 00:25:01,119 Speaker 3: Is it because it's 414 00:25:00,960 --> 00:25:03,920 Speaker 2: difficult to get people through the Senate? I mean, what's 415 00:25:03,960 --> 00:25:05,760 Speaker 2: the reason, then, they're using it more? 416 00:25:06,240 --> 00:25:10,159 Speaker 7: I think there are several reasons. The first is it 417 00:25:10,240 --> 00:25:13,360 Speaker 7: takes time to nominate people. Though I will say that 418 00:25:13,400 --> 00:25:17,080 Speaker 7: President Trump in his second term has a much faster 419 00:25:17,280 --> 00:25:20,960 Speaker 7: nominations pace than he had in his first term. I 420 00:25:20,960 --> 00:25:25,919 Speaker 7: mean six or sevenfold more nominations at the one hundred day 421 00:25:26,000 --> 00:25:28,920 Speaker 7: mark in his second term than in his first term.
422 00:25:28,920 --> 00:25:32,040 Speaker 7: But nominations still take time, to find the people and 423 00:25:32,080 --> 00:25:35,239 Speaker 7: to formally nominate them. There's a vetting that occurs, and 424 00:25:35,280 --> 00:25:38,680 Speaker 7: I think for President Trump that vetting has really focused 425 00:25:38,720 --> 00:25:43,200 Speaker 7: on loyalty, whereas other administrations might be focusing on 426 00:25:43,240 --> 00:25:48,520 Speaker 7: ethical considerations, expertise, and the like. And then there's the 427 00:25:48,600 --> 00:25:52,800 Speaker 7: confirmation process. And so even as we now have, and 428 00:25:52,920 --> 00:25:56,359 Speaker 7: even as we did for President Biden, a Senate 429 00:25:56,400 --> 00:25:59,760 Speaker 7: controlled by the same party as the White House, it 430 00:25:59,760 --> 00:26:03,920 Speaker 7: still takes time because of individual senators. Even though there's no 431 00:26:03,960 --> 00:26:08,040 Speaker 7: longer a sixty vote threshold to move a nomination to 432 00:26:08,119 --> 00:26:11,120 Speaker 7: a confirmation vote, that went away in November twenty thirteen, 433 00:26:11,480 --> 00:26:16,600 Speaker 7: individual senators can still delay the confirmation process, right? And 434 00:26:16,800 --> 00:26:21,280 Speaker 7: we saw this with Senator Tuberville in the Biden administration. 435 00:26:21,760 --> 00:26:24,680 Speaker 7: We've seen this in other cases as well. Right, Democrats 436 00:26:24,680 --> 00:26:27,800 Speaker 7: do it to Republicans, Republicans do it to Democrats, and 437 00:26:27,840 --> 00:26:29,440 Speaker 7: that can slow it down. And so I think both 438 00:26:29,440 --> 00:26:33,879 Speaker 7: the nominations and confirmation process make acting officials attractive.
But 439 00:26:33,960 --> 00:26:36,720 Speaker 7: I would also say that what I think is a 440 00:26:36,840 --> 00:26:41,639 Speaker 7: large reason is that acting officials don't require Senate confirmation, 441 00:26:41,960 --> 00:26:45,080 Speaker 7: and so you can pick people to serve in a 442 00:26:45,119 --> 00:26:48,639 Speaker 7: temporary capacity that the Senate might not confirm. So in 443 00:26:48,680 --> 00:26:53,400 Speaker 7: President Trump's first term, the Republicans, as reported by the media, 444 00:26:53,600 --> 00:26:56,639 Speaker 7: did not want to confirm Ken Cuccinelli to a senior 445 00:26:56,680 --> 00:27:00,479 Speaker 7: post in the Department of Homeland Security, but through acting 446 00:27:00,640 --> 00:27:05,000 Speaker 7: appointments and delegations of authority, Mr. Cuccinelli was able to 447 00:27:05,080 --> 00:27:08,240 Speaker 7: serve in very high profile roles in DHS. And I 448 00:27:08,240 --> 00:27:12,560 Speaker 7: think a similar story, right, for Ed Martin, that Republicans 449 00:27:12,640 --> 00:27:17,200 Speaker 7: balked and he was able through the interim provision to serve, 450 00:27:17,240 --> 00:27:19,480 Speaker 7: whereas, you know, if he had just been put up 451 00:27:19,520 --> 00:27:21,879 Speaker 7: as a nominee, not as an interim, he might not 452 00:27:22,000 --> 00:27:22,840 Speaker 7: have gotten through. 453 00:27:23,080 --> 00:27:27,320 Speaker 2: Has President Trump pushed the limits of the Vacancies Act 454 00:27:27,440 --> 00:27:29,199 Speaker 2: further than other presidents? 455 00:27:30,080 --> 00:27:33,800 Speaker 7: I think the Vacancies Act is pretty capacious. It allows 456 00:27:33,840 --> 00:27:37,080 Speaker 7: the government to function. I think the Vacancies Act should 457 00:27:37,080 --> 00:27:40,760 Speaker 7: be reformed in certain ways because I think it allows 458 00:27:40,800 --> 00:27:44,360 Speaker 7: more than perhaps what we want for an accountable government.
459 00:27:44,960 --> 00:27:48,800 Speaker 7: But I think in this term, President Trump is really 460 00:27:48,920 --> 00:27:52,760 Speaker 7: pushing an argument that no previous president has pushed, which 461 00:27:52,800 --> 00:27:56,000 Speaker 7: is that he believes that if the Vacancies Act does 462 00:27:56,040 --> 00:28:01,679 Speaker 7: not apply, he has inherent Article Two authority to name acting officials. 463 00:28:01,760 --> 00:28:04,960 Speaker 7: And this has come up in the Inter-American Foundation 464 00:28:05,359 --> 00:28:08,399 Speaker 7: and the African Development Foundation. These are entities not covered 465 00:28:08,440 --> 00:28:12,679 Speaker 7: by the Vacancies Act, where President Trump has named acting 466 00:28:12,720 --> 00:28:18,480 Speaker 7: officials to Senate confirmed board positions relying on claimed inherent 467 00:28:18,600 --> 00:28:21,800 Speaker 7: Article Two authority. It seems with the firing of the 468 00:28:21,840 --> 00:28:25,080 Speaker 7: Librarian of Congress that if the Library of Congress is 469 00:28:25,119 --> 00:28:27,280 Speaker 7: not subject to the Vacancies Act, and I don't think 470 00:28:27,280 --> 00:28:29,960 Speaker 7: it is, President Trump, the White House, has indicated that 471 00:28:30,000 --> 00:28:33,679 Speaker 7: there is Article Two authority to name an acting librarian, 472 00:28:33,760 --> 00:28:36,919 Speaker 7: and I think this is an outrageous claim. We have 473 00:28:36,960 --> 00:28:39,640 Speaker 7: an Appointments Clause, we have a Recess Appointments Clause.
It 474 00:28:39,720 --> 00:28:43,200 Speaker 7: is true that the Vacancies Act and specific agency provisions 475 00:28:43,200 --> 00:28:46,640 Speaker 7: like Section five forty six for interim US attorneys do 476 00:28:46,920 --> 00:28:51,200 Speaker 7: allow vast use of temporary appointments, but outside of those, 477 00:28:51,640 --> 00:28:54,640 Speaker 7: presidents need to rely on the Appointments Clause or the 478 00:28:54,680 --> 00:28:55,440 Speaker 7: Recess Appointments Clause. 479 00:28:55,600 --> 00:28:59,160 Speaker 2: So some are saying that with the US attorney, for example, 480 00:28:59,480 --> 00:29:05,120 Speaker 2: with Pirro becoming the second interim US Attorney for DC, 481 00:29:05,880 --> 00:29:10,240 Speaker 2: that defendants might be able to challenge some of the, 482 00:29:10,480 --> 00:29:15,200 Speaker 2: you know, prosecutions, et cetera, under her because of this issue. 483 00:29:15,440 --> 00:29:19,479 Speaker 7: I think they would definitely have standing to sue. And 484 00:29:19,520 --> 00:29:22,880 Speaker 7: I think their argument would be that under five forty six, 485 00:29:23,080 --> 00:29:26,520 Speaker 7: successive one hundred and twenty day appointments are not allowed. 486 00:29:27,040 --> 00:29:30,120 Speaker 7: And I think that is a plausible argument. I just 487 00:29:30,200 --> 00:29:34,040 Speaker 7: don't think it's a winning argument. I think that courts 488 00:29:34,080 --> 00:29:36,800 Speaker 7: will look at the text. Right, as Justice Kagan says, 489 00:29:36,800 --> 00:29:39,719 Speaker 7: we're all textualists now. The courts will look at 490 00:29:39,760 --> 00:29:42,960 Speaker 7: the text and see that successive one hundred and twenty 491 00:29:43,040 --> 00:29:46,520 Speaker 7: day appointments are not barred. It's happened in the past 492 00:29:46,800 --> 00:29:49,920 Speaker 7: and been permitted. Now, that doesn't mean it shouldn't change.
I 493 00:29:49,960 --> 00:29:53,960 Speaker 7: mean Congress could amend five forty six to make it 494 00:29:54,080 --> 00:29:57,240 Speaker 7: clear that you get one one hundred twenty day appointment, 495 00:29:57,800 --> 00:30:00,440 Speaker 7: then the district court gets to choose. And then I 496 00:30:00,480 --> 00:30:03,720 Speaker 7: should note that there is an alternative to these successive 497 00:30:03,720 --> 00:30:07,000 Speaker 7: one hundred and twenty day appointments while a nomination might 498 00:30:07,040 --> 00:30:10,280 Speaker 7: be pending to a US attorney slot, which is that I 499 00:30:10,400 --> 00:30:13,680 Speaker 7: believe the Federal Vacancies Reform Act is available for 500 00:30:13,840 --> 00:30:19,160 Speaker 7: temporary appointments to these positions. Now, the constraint, why I 501 00:30:19,200 --> 00:30:22,240 Speaker 7: think the Trump administration doesn't want to turn to the 502 00:30:22,320 --> 00:30:25,600 Speaker 7: Vacancies Act, is that under the Vacancies Act, you can't 503 00:30:25,640 --> 00:30:29,160 Speaker 7: just choose anyone. You can't choose an outsider like the 504 00:30:29,280 --> 00:30:32,320 Speaker 7: current person. You can't choose an outsider to come in 505 00:30:32,520 --> 00:30:35,760 Speaker 7: as an acting US attorney. You either need to choose 506 00:30:35,760 --> 00:30:39,760 Speaker 7: someone who is the first assistant to the US Attorney, right, 507 00:30:39,800 --> 00:30:42,960 Speaker 7: which is typically a career person. You need to choose 508 00:30:42,960 --> 00:30:46,280 Speaker 7: someone who is Senate confirmed already to another position, so 509 00:30:46,320 --> 00:30:49,160 Speaker 7: they have already kind of gone through the Senate process, 510 00:30:49,560 --> 00:30:52,200 Speaker 7: or you have to choose someone who was in the 511 00:30:52,280 --> 00:30:55,360 Speaker 7: agency for at least ninety days in the year before 512 00:30:55,400 --> 00:30:56,040 Speaker 7: the vacancy.
513 00:30:56,400 --> 00:30:59,280 Speaker 2: So finally, you think there needs to be reform here? 514 00:30:59,680 --> 00:31:01,400 Speaker 1: I think the 515 00:31:01,320 --> 00:31:07,280 Speaker 7: laws governing temporary appointments are pretty broad, and likely 516 00:31:07,440 --> 00:31:11,840 Speaker 7: too broad. On the other hand, we want the government 517 00:31:12,000 --> 00:31:14,680 Speaker 7: to function. We don't want the government to come to 518 00:31:14,760 --> 00:31:19,880 Speaker 7: a standstill. And so finding the balance between getting Senate 519 00:31:19,880 --> 00:31:25,560 Speaker 7: confirmed appointments through a dysfunctional appointments process and having the 520 00:31:25,600 --> 00:31:30,160 Speaker 7: government function is tricky. And sometimes we're on the right 521 00:31:30,200 --> 00:31:33,040 Speaker 7: side of the balance, and sometimes we're on the wrong 522 00:31:33,240 --> 00:31:37,040 Speaker 7: side of the balance. And so I think that Congress 523 00:31:37,200 --> 00:31:40,400 Speaker 7: really has a role to play in constraining the use 524 00:31:40,440 --> 00:31:42,080 Speaker 7: of acting and interim appointments. 525 00:31:42,320 --> 00:31:44,520 Speaker 2: So we'll start counting now to see if Jeanine Pirro 526 00:31:44,840 --> 00:31:49,400 Speaker 2: is nominated to be the permanent US Attorney for DC 527 00:31:49,640 --> 00:31:53,400 Speaker 2: before the one hundred and twenty days expires. Thanks so 528 00:31:53,480 --> 00:31:57,120 Speaker 2: much for joining me today. That's Professor Anne Joseph O'Connell 529 00:31:57,200 --> 00:32:00,320 Speaker 2: of Stanford Law School, and that's it for this edition 530 00:32:00,360 --> 00:32:02,959 Speaker 2: of the Bloomberg Law Show. Remember, you can always get 531 00:32:03,000 --> 00:32:06,160 Speaker 2: the latest legal news on our Bloomberg Law podcasts.
You 532 00:32:06,160 --> 00:32:10,280 Speaker 2: can find them on Apple Podcasts, Spotify, and at www 533 00:32:10,400 --> 00:32:14,680 Speaker 2: dot bloomberg dot com, slash podcast slash Law, and remember 534 00:32:14,720 --> 00:32:17,680 Speaker 2: to tune into The Bloomberg Law Show every weeknight at 535 00:32:17,680 --> 00:32:21,160 Speaker 2: ten pm Wall Street Time. I'm June Grosso and you're 536 00:32:21,240 --> 00:32:22,480 Speaker 2: listening to Bloomberg