This is Bloomberg Law. A divided Supreme Court rejects a religious challenge. Tell us a little about the facts of the case. Interviews with prominent attorneys and Bloomberg legal experts. My guest is former federal prosecutor Jimmy Gurulé. Joining me is Bloomberg Law reporter Jordan Rubin. And analysis of important legal issues, cases and headlines. The Supreme Court takes on state secrets. Multiple lawsuits were filed against the emergency rule. Is this lawsuit for real? Bloomberg Law with June Grasso, from Bloomberg Radio.

Welcome to the Bloomberg Law show. I'm Lydia Wheeler, and I'm Kimberly Robinson. We're in for June Grasso. Coming up on the show, we'll discuss the U.S. Supreme Court's latest COVID action and the increasing use of the court's so-called shadow docket. But first, Bloomberg Law reporter Andrea Vittorio joins us to talk about the new extended reality experiences that technology companies are promising in the metaverse, and the privacy pitfalls that could come with collecting more data from users. Andrea, thanks for being here. Can you start off by explaining what extended reality is and what types of technology it typically includes?
Sure. So I asked this question myself in writing this story, because there are many different kinds of reality. There's sort of regular reality on one side and then virtual reality on the other, and that's where there's a whole virtual world. You have a virtual version of yourself, and you can participate in virtual activities like games, or events like concerts, or shopping. So that's sort of what we think of when we talk about the metaverse. There are also versions in between, where you can have holograms imposed on real life, or you can see digital characters, like in Pokémon Go, in your everyday activities. So the virtual realities that we are talking about can mean a lot of different things.

And so what are the digital experiences that extended reality companies are promising users? It doesn't seem like we're talking about just games here, right?

Right. There are a lot of potential applications. Gaming is definitely one of them. There's also learning: there are schools that are experimenting with virtual reality for students. There's also corporate training that can happen in virtual reality.
You can help firefighters or doctors learn their craft just by practicing in a virtual environment. And there are just a lot of different use cases that we're seeing explored right now, like theme parks, travel, shopping. There's a lot going on there in the metaverse.

So what kinds of data are companies collecting from people who use these sorts of extended reality devices? And is any of it sensitive information?

The data collected can depend on the device or how you're using it, but there are a lot of potential collection points. Like, when you wear a headset, it can gather information about how your head is moving and what you're looking at. You can sometimes hold devices in your hands that will track how your hands are moving, or what size they are, even. So these are all considered pretty personal pieces of information, because they really vary by person, and they can even amount to identifying a person if you have enough information about them and how they move and what they look like.
So privacy advocates aren't concerned just about the physical characteristics or traits that are being gathered about people as they use these devices. Right, in the story that you recently wrote, you refer to, you know, tracking these movements as a kind of thumbprint of your movements, which I thought was really interesting. And I'm wondering, do companies have to get permission from users before they can collect this kind of data? And if so, how do they usually do that?

Right. So permission is kind of an interesting concept. Like, when you're in a virtual world and you're interacting with different people or with places, there are different touch points where you would need permission. So there's sort of the base level of permission of using a device and creating accounts, but then when you play a game or do some other sort of activity in the virtual world, you're potentially interacting with another business, and so they would potentially have to ask for permission to gather information about you.
Or if you go and buy something in the metaverse, then maybe you have to agree to the privacy policy of the merchant selling you something. So permission to gather information, or to use information, could be many-layered in this virtual world.

Are there any laws in place, either at the state or the federal level, to protect this kind of digital data?

So far, we have been thinking about privacy laws in virtual worlds just sort of as applying existing laws to the space. So in Europe there's a rule called the General Data Protection Regulation that would probably apply these general privacy rights to different platforms regardless of where you are, so it would apply in the metaverse as well as on an ordinary website. But then in the US, it might kind of depend on different state laws, since there's no national privacy law here. So there are still a lot of questions around, like, how do these laws apply?
And especially, how do location-based laws apply when I might be in one physical place using a device, but I might be going somewhere else in the world in my device? Does that change whether the laws of that place would apply to me, or whether the laws of where I am physically apply to me? So there are still a lot of policy questions around how existing laws might fit, or whether we need new laws written specifically for this space.

Well, thanks so much. That's Bloomberg Law reporter Andrea Vittorio. You're listening to Bloomberg Law. Next up, we'll bring in University of Texas law professor Stephen Vladeck to discuss the U.S. Supreme Court's COVID rulings. I'm Kimberly Robinson, and I'm Lydia Wheeler. This is Bloomberg.

This is Bloomberg Law with June Grasso, from Bloomberg Radio. I'm Lydia Wheeler, and I'm Kimberly Robinson. We're in for June Grasso. We turn now from the metaverse to the COVID pandemic, which has upended life worldwide, including at the U.S. Supreme Court. To discuss how the justices have dealt with the health crisis,
we bring in University of Texas law professor Stephen Vladeck. Thanks so much for being here with us.

Oh, thanks for having me.

So, on Tuesday, Justice Sonia Sotomayor refused to block New York City's requirement that employees, here a New York City police detective, be vaccinated against COVID-19. There was no explanation from the court, but we've seen the justices block other vaccine mandates and allow other requirements to stay in place. Do you have an understanding about why the Court might have rejected this particular request?

Yeah. I mean, I think if you look at the overall body of work, and there's actually a fairly substantial number of cases where the justices have been asked to block various COVID restrictions, the cases that have succeeded, basically with I think one exception, invariably fall into one of two categories. Either they are religious liberty based challenges to COVID restrictions, whether it's a vaccine mandate or a limit on how many people can gather in the same place.
Or it is a challenge to a federal policy on grounds that the federal policy exceeds the statutory authority that the relevant agency, the Centers for Disease Control and Prevention, OSHA, right, had. And so with one exception, and the exception is the New York State eviction moratorium, in every single case where the Supreme Court has agreed to block a state or federal COVID policy, the grounds have been religious liberty or, you know, federal administrative law.

You mentioned those in-person gatherings in particular, religious gatherings, I should say, and I was curious: how did the justices analyze those specific requests?

Yeah, you know, I mean, as you guys know, part of the trick here is that very, very few of these rulings actually come with opinions for the court. But there were a series of cases, you know, starting in the summer of 2020 and really culminating the following summer, where religious groups and others challenged, you know, gathering restrictions, especially in New York and in California.
And, you know, this is I think one place where we saw the confirmation of Justice Barrett cause a huge shift. So in the summer of 2020, the court was actually denying these requests to block the gathering restrictions, in one or two cases, one from California, one from Nevada, by five-to-four votes, and it was Chief Justice Roberts joining the liberals in those cases. In one of those cases, the case called South Bay United Pentecostal Church, Chief Justice Roberts wrote separately to say, you know, I'm not unsympathetic to the claims that these plaintiffs are making, but this is, you know, not something we should be resolving on an emergency application. You know, things are changing on the ground, the policies are shifting, and so we should give at least some latitude, right, to the government decision makers. That shifts quickly when Justice Barrett is confirmed, so that, you know, within a month of her confirmation, in November, now it's five to four the other way in a pair of cases blocking, you know, New York's restrictions on religious gatherings.
And in the first of those cases, the case called Roman Catholic Diocese of Brooklyn, we've got a very short, unsigned opinion for the Court that says the problem with these restrictions is that they are treating, you know, religious worship more harshly than they're treating other forms of essential secular businesses. And that becomes the dominant problem that the justices find with a whole bunch of these, you know, state gathering restrictions during the pandemic, which I think is part of why we saw so much action through these cases only when the claims were about religious liberty, as opposed to due process or other constitutional rights.

So you mentioned that a lot of these cases, most of them in fact, came up through these emergency requests. I wanted to talk a little bit about the so-called shadow docket. Can you first explain to people what that even is?

Sure. So it's sort of a catchy term that Will Baude, a professor at the University of Chicago, coined to describe basically all of the traditionally boring stuff the Supreme Court does.
You know, we spend most of our time thinking about, like, the fifty-five to sixty big merits decisions the Court hands down each year, Dobbs, you know, the gun case, etcetera. And as you guys both know, I mean, the reality is that, at least by volume, the overwhelming majority of what the Supreme Court does is actually not those. It's these unsigned and usually unexplained orders. Most of these are anodyne, right? No one really gets exercised about extensions of time to file briefs. Even when the court is denying certiorari, that is to say, refusing to take up an appeal, you know, that doesn't often make headlines. But what we've seen in the last couple of years is more and more of these orders, especially when parties are asking for emergency relief, especially when a party is saying, I want to appeal a decision from a lower court, and while I'm appealing it, I want you to block this state policy. We're seeing the court not only sort of hear and take seriously more of those, but grant more of those requests.
And that's coming, you know, without oral argument in almost all cases, with limited briefing, oftentimes through orders that have no explanation, and if they have an explanation, it's a short explanation. As you guys know well, these orders can come at all times of day, or even in the middle of the night. And so I think, you know, the COVID pandemic was in some respects a flashpoint for how much the court does through these unsigned, unexplained orders, and really for how those orders can have massive real-world effects even when we have no idea why the court is doing what it's doing.

And Steve, you know, as you mentioned just now, it seems like you're a big critic of the shadow docket, because when these cases come before the court, we're not getting oral argument, we're not getting, you know, opinions. Can you put into perspective, though, what's the harm in the fact that we're not having, like, a fully briefed case and arguments before the court?

Yeah, I mean, I think there are a couple of harms.
I mean, to be sure, I don't think the shadow docket is per se a bad thing. Like, there are going to be emergencies, and the court has to have a way of dealing with them. I think the trouble comes when you have the court basically issuing an unsigned, unexplained order, let's say blocking a California COVID policy, for example, in a case where, first, the lower courts actually had detailed hearings and took significant evidence, and actually, you know, did a bunch of fact-finding to support their conclusions that what California was doing was above board. But second, where the court then turns around and says, hey, lower courts, you are bound to follow our unsigned, unexplained order in this case.
And so I think the problem the shadow docket creates is not just that it's a compressed opportunity for the court to do its job, but that it also deprives the court, in the typical case, of the ability to provide the kind of lengthy, principled rationale that, you know, guys, we may not agree with, but at least we understand, and that, you know, the relevant parties, the local and state governments, the lower courts, can figure out how to apply in future cases with, you know, marginally different facts. We don't have those in most of these orders, and I think that's part of why, you know, the proliferation of these decisions, especially in contexts in which the justices are treating them as creating precedents, is, I think, hard to defend.

Coming up next, we'll continue our conversation with University of Texas law professor Stephen Vladeck. I'm Lydia Wheeler, and I'm Kimberly Robinson. This is Bloomberg.

This is Bloomberg Law with June Grasso from Bloomberg Radio. I'm Lydia Wheeler, and I'm Kimberly Robinson. We're in for June Grasso.
We're back with Professor Stephen Vladeck of the University of Texas. When we left off, we were talking about the court's shadow docket and how most of the COVID cases came up through that procedure. Of course, there were two major exceptions, on two federal vaccine mandates, one that the Supreme Court upheld and another, broader mandate that the Supreme Court struck down. These cases started on the shadow docket themselves, though, right?

Yeah, and actually, I think we could even probably debate whether they were even exceptions. So, right, there were two sets of really high-profile challenges to vaccine mandates from the Biden administration. One was the OSHA emergency rule that would have required every large employer to impose a vaccination-or-testing requirement on their employees. And the other was a rule promulgated by the Centers for Medicare and Medicaid Services that basically required all health care facilities that receive federal Medicare or Medicaid funds, and that's a whole lot of them, to require their health care workers to be vaccinated.
So they both came to the Supreme Court through the shadow docket. The OSHA mandate was not blocked by the Sixth Circuit, and then a whole bunch of parties, fifteen different sets of applicants, asked the Supreme Court to issue an emergency stay of the OSHA mandate. The CMS mandate was blocked by two different district courts on a nationwide basis, and the Biden administration came to the Supreme Court asking for emergency relief in the form of stays of those injunctions. And so what the court did, I think, made it look like it was less shadowy: for the first time in a long while, near as we can tell, the full Court decided to hear oral argument on emergency applications, on the vaccine mandates, and they did, you know, in early January. And they turned around about a week later and handed down these, you know, unsigned per curiam opinions for the Court, where they blocked the OSHA rule and unblocked the CMS rule.
And, you know, I guess, guys, to me, those cases are a remarkable bellwether. Because, first, you know, the fact that the justices saw fit to hold argument I think was a bit of a concession that they realized that, like, you know, the normal shadow docket process was insufficient for cases of that magnitude. But second, you know, there's a line in the majority opinion in the OSHA case where, you know, whoever wrote it, we don't know who did, talks about the equities, and how, you know, the federal government said, if you block the OSHA mandate, all these bad things will happen. The challengers, including a bunch of red states, said, if you don't block the mandate, these bad things will happen. And then the Court says, it's not our job to balance those trade-offs. And I have to say, guys, that line, as someone who studies this, like, kind of set me spinning, because it's actually exactly the Court's job, in the context of these kinds of emergency applications, to balance those trade-offs.
And so I think, you know, in that respect, these cases were actually this perfect encapsulation of how the shadow docket has evolved, of how more and more high-profile disputes are being resolved through these expedited processes, and how, in that context, where the Court is supposed to be balancing the harms to each side, the Court is really increasingly just deciding what it thinks the right answer is on the merits.

I want to talk with you more about how the shadow docket has evolved, because I know you've been following a change in how parties and the justices use the shadow docket. So can you talk about those changes, particularly the changes under the Trump administration?
Yeah, 318 00:19:04,600 --> 00:19:06,800 Speaker 1: I mean, so you know, it used to be, as 319 00:19:06,880 --> 00:19:09,440 Speaker 1: I said, we've had the shadow docket forever, and 320 00:19:09,520 --> 00:19:13,840 Speaker 1: historically the body of cases that were the source 321 00:19:13,920 --> 00:19:16,880 Speaker 1: of the most shadow docket activity involved the death penalty, 322 00:19:17,000 --> 00:19:20,840 Speaker 1: where, you know, you'd oftentimes have last-minute applications from 323 00:19:21,240 --> 00:19:23,800 Speaker 1: death row inmates for stays of execution, or, if a 324 00:19:23,920 --> 00:19:26,440 Speaker 1: lower court had blocked an execution, a last-minute application from 325 00:19:26,440 --> 00:19:29,480 Speaker 1: a state to unblock the execution, and really these 326 00:19:29,520 --> 00:19:33,640 Speaker 1: numbered into the tens. Like, that was the majority of what 327 00:19:33,760 --> 00:19:37,439 Speaker 1: was interesting about emergency applications in the Supreme Court. The 328 00:19:37,480 --> 00:19:40,560 Speaker 1: shift in the Trump administration is a shift in just 329 00:19:40,720 --> 00:19:42,840 Speaker 1: the kinds of cases that are ending up on the 330 00:19:42,840 --> 00:19:46,520 Speaker 1: shadow docket, cases with not just massive implications for 331 00:19:46,600 --> 00:19:50,560 Speaker 1: one death row prisoner, but for state or federal policies. So, 332 00:19:50,640 --> 00:19:53,560 Speaker 1: just to take one data point, during the Bush 333 00:19:53,560 --> 00:19:57,720 Speaker 1: and Obama administrations, from two thousand one to two thousand seventeen, two pretty 334 00:19:57,760 --> 00:20:01,600 Speaker 1: different administrations, the federal government files a total of eight 335 00:20:01,760 --> 00:20:05,600 Speaker 1: emergency applications in the Supreme Court, so one every other 336 00:20:05,760 --> 00:20:09,320 Speaker 1: year on average.
During the Trump administration, so four years, 337 00:20:09,359 --> 00:20:12,520 Speaker 1: the Justice Department files forty-one applications. And I think, 338 00:20:12,520 --> 00:20:15,160 Speaker 1: you know, there's a longer conversation to sort 339 00:20:15,160 --> 00:20:16,840 Speaker 1: of be had about, you know, what 340 00:20:17,119 --> 00:20:21,080 Speaker 1: caused that uptick. Whatever caused it, right, what it means 341 00:20:21,200 --> 00:20:26,000 Speaker 1: is that there was a lot more nationwide policy 342 00:20:26,240 --> 00:20:29,320 Speaker 1: challenging going on on the shadow docket, where, you know, 343 00:20:29,359 --> 00:20:31,720 Speaker 1: it started as, like, the travel ban, then it turned 344 00:20:31,720 --> 00:20:34,800 Speaker 1: into a case about the transgender ban, you know, immigration, 345 00:20:35,400 --> 00:20:40,600 Speaker 1: environmental law. All of a sudden, right, every major contentious 346 00:20:40,760 --> 00:20:43,720 Speaker 1: challenge to state or federal policies is coming to the 347 00:20:43,720 --> 00:20:47,080 Speaker 1: shadow docket. And I think that's the shift that has 348 00:20:47,160 --> 00:20:48,600 Speaker 1: led to why this has had so much more of 349 00:20:48,600 --> 00:20:50,679 Speaker 1: an impact on all of us. Well, thank you so 350 00:20:50,760 --> 00:20:53,840 Speaker 1: much for that. That's University of Texas law professor Stephen Vladeck. 351 00:20:54,200 --> 00:20:56,919 Speaker 1: You're listening to Bloomberg Radio. Up next, we'll talk with 352 00:20:56,920 --> 00:21:00,400 Speaker 1: Bloomberg News reporter Jef Feeley about Elon Musk and his fight 353 00:21:00,440 --> 00:21:02,800 Speaker 1: to get out of his forty four billion dollar Twitter deal. 354 00:21:03,040 --> 00:21:07,160 Speaker 1: I'm Kimberly Robinson and I'm Lydia Wheeler. This is Bloomberg. 355 00:21:15,720 --> 00:21:20,520 Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio.
356 00:21:21,200 --> 00:21:24,840 Speaker 1: I'm Kimberly Robinson and I'm Lydia Wheeler. We're in for 357 00:21:24,960 --> 00:21:27,639 Speaker 1: June Grasso. Elon Musk has been engulfed in a 358 00:21:27,720 --> 00:21:30,600 Speaker 1: legal battle over his failed forty four billion dollar deal 359 00:21:30,680 --> 00:21:33,720 Speaker 1: to buy Twitter. He's now claiming a whistleblower's allegations 360 00:21:33,760 --> 00:21:35,760 Speaker 1: against the company should let him walk away from that 361 00:21:35,800 --> 00:21:38,760 Speaker 1: purchase agreement. Joining us now to talk about this litigation 362 00:21:38,760 --> 00:21:41,720 Speaker 1: is Bloomberg News reporter Jef Feeley. Jef, thanks for being here. 363 00:21:41,960 --> 00:21:44,840 Speaker 1: No problem. Can you start off by telling us a 364 00:21:44,880 --> 00:21:50,600 Speaker 1: bit of background about this litigation and how it came about? Yes, 365 00:21:51,080 --> 00:21:55,240 Speaker 1: it came about because Mr Musk, the world's richest man, 366 00:21:55,920 --> 00:21:59,600 Speaker 1: decided he had a yen to own Twitter. He is a 367 00:21:59,720 --> 00:22:03,960 Speaker 1: very frequent user of that platform, tweeting out the 368 00:22:04,080 --> 00:22:07,360 Speaker 1: various things that come to his mind, and he decided 369 00:22:07,440 --> 00:22:09,720 Speaker 1: that maybe it would be good for him to buy Twitter. 370 00:22:09,880 --> 00:22:14,199 Speaker 1: So back in April, he offered fifty four dollars and 371 00:22:14,240 --> 00:22:18,600 Speaker 1: twenty cents a share for the social media platform, waived 372 00:22:18,680 --> 00:22:24,800 Speaker 1: his due diligence on the deal, and signed.
When 373 00:22:24,800 --> 00:22:29,040 Speaker 1: the market turned a bit and Twitter's stock value fell, 374 00:22:29,520 --> 00:22:34,800 Speaker 1: Mr Musk was not happy, and he decided that maybe 375 00:22:34,840 --> 00:22:37,200 Speaker 1: he didn't want Twitter as much as he thought he did, 376 00:22:37,880 --> 00:22:43,000 Speaker 1: so he decided to cancel, or walk away from, the 377 00:22:43,720 --> 00:22:47,720 Speaker 1: forty four billion dollar deal. Well, the folks at Twitter understandably weren't 378 00:22:47,720 --> 00:22:51,960 Speaker 1: happy about that, and so they filed suit in Delaware 379 00:22:52,040 --> 00:22:54,840 Speaker 1: chancery court to force him to consummate the deal. The 380 00:22:54,880 --> 00:22:58,920 Speaker 1: reason they came to the First State is because, under 381 00:22:58,960 --> 00:23:03,399 Speaker 1: the merger agreement, that was where all legal disputes had 382 00:23:03,440 --> 00:23:06,760 Speaker 1: to be litigated. Delaware has a long tradition of its 383 00:23:06,800 --> 00:23:12,680 Speaker 1: business courts handling merger and acquisition disputes. So that sort 384 00:23:12,680 --> 00:23:15,480 Speaker 1: of gets us to where we are. And so we've 385 00:23:15,480 --> 00:23:19,840 Speaker 1: discussed before on this show the whistleblower 386 00:23:19,880 --> 00:23:23,240 Speaker 1: who recently came forward with allegations against Twitter. Can you 387 00:23:23,280 --> 00:23:25,120 Speaker 1: tell us a little bit more about that and what 388 00:23:25,160 --> 00:23:28,399 Speaker 1: it is that they're alleging is happening at the company? Sure. 389 00:23:28,520 --> 00:23:34,240 Speaker 1: The whistleblower's name is Peter Zatko. He's a well-known hacker 390 00:23:34,840 --> 00:23:40,480 Speaker 1: and computer security expert, and he was 391 00:23:40,520 --> 00:23:45,800 Speaker 1: Twitter's head of computer security for a while.
While at Twitter, 392 00:23:46,119 --> 00:23:50,560 Speaker 1: Mr Zatko says that he raised issues about the number 393 00:23:50,600 --> 00:23:53,880 Speaker 1: of spam and robot accounts that are embedded in the 394 00:23:53,880 --> 00:23:59,280 Speaker 1: company's customer base. These accounts are basically not ones 395 00:23:59,320 --> 00:24:03,000 Speaker 1: with humans behind them, but they are, you know, 396 00:24:03,520 --> 00:24:07,800 Speaker 1: put on there to increase people's audience, if you will, 397 00:24:08,600 --> 00:24:13,840 Speaker 1: and he claims that the folks at Twitter had 398 00:24:13,920 --> 00:24:17,000 Speaker 1: no idea how many of these accounts were, you know, 399 00:24:17,359 --> 00:24:20,640 Speaker 1: part of their customer base, and really didn't seem to care 400 00:24:20,880 --> 00:24:24,399 Speaker 1: to delve too deeply into it, because, you know, the 401 00:24:24,440 --> 00:24:27,840 Speaker 1: more customers they have, the more they can charge advertisers. 402 00:24:27,920 --> 00:24:32,680 Speaker 1: So if they didn't want 403 00:24:32,680 --> 00:24:34,919 Speaker 1: to dig into it, it's because they didn't want to 404 00:24:34,960 --> 00:24:40,000 Speaker 1: reduce their advertiser numbers. So Mr Zatko got fired 405 00:24:40,080 --> 00:24:43,520 Speaker 1: for performance issues a couple of months ago, and he 406 00:24:43,600 --> 00:24:48,000 Speaker 1: has now come forward and lodged a so-called whistleblower's 407 00:24:48,040 --> 00:24:55,000 Speaker 1: complaint with regulators and congresspeople. And those complaints include 408 00:24:55,920 --> 00:25:01,440 Speaker 1: lax computer security, a disregard for privacy issues, 409 00:25:01,680 --> 00:25:06,800 Speaker 1: and this whole issue about the spam and robot accounts.
410 00:25:06,840 --> 00:25:09,960 Speaker 1: Those accounts are important because Mr Musk has made them 411 00:25:10,000 --> 00:25:14,080 Speaker 1: the centerpiece of his legal arguments, saying that he's justified 412 00:25:14,119 --> 00:25:18,360 Speaker 1: in walking away from the deal. Isn't Elon Musk here 413 00:25:18,600 --> 00:25:22,320 Speaker 1: asking the court to let him amend his argument, 414 00:25:22,440 --> 00:25:25,320 Speaker 1: and is the judge in this case likely to allow 415 00:25:25,359 --> 00:25:30,040 Speaker 1: him to do that? He is asking to amend his counterclaims. 416 00:25:30,040 --> 00:25:33,280 Speaker 1: So Twitter has filed the actual suit. Mr Musk has 417 00:25:33,280 --> 00:25:38,240 Speaker 1: filed his defenses and counterclaims to those, and those counterclaims 418 00:25:38,240 --> 00:25:41,720 Speaker 1: focused solely on the bot issue at first. Well, now 419 00:25:41,760 --> 00:25:46,439 Speaker 1: that this gentleman has come forward talking about security issues 420 00:25:46,480 --> 00:25:50,800 Speaker 1: and privacy concerns, he wants to amend his counterclaims to 421 00:25:50,920 --> 00:25:55,800 Speaker 1: say that these are other legitimate bases for him 422 00:25:55,880 --> 00:26:00,479 Speaker 1: to get out of the deal. Under Delaware law, judges 423 00:26:00,600 --> 00:26:06,240 Speaker 1: have wide authority to allow amendments, and it's basically granted as 424 00:26:06,240 --> 00:26:08,760 Speaker 1: long as it does not prejudice the other side in 425 00:26:08,840 --> 00:26:12,400 Speaker 1: some way. So most people think that Judge McCormick will 426 00:26:12,440 --> 00:26:16,240 Speaker 1: allow some amendment to add these other issues to the case. 427 00:26:16,960 --> 00:26:19,320 Speaker 1: And do you think that these whistleblower allegations are 428 00:26:19,400 --> 00:26:23,800 Speaker 1: ultimately going to help Musk break the deal with Twitter?
Well, 429 00:26:23,840 --> 00:26:25,960 Speaker 1: we really don't know at this point. It's too early 430 00:26:26,000 --> 00:26:31,040 Speaker 1: to tell whether Mr Zatko's allegations have so-called meat 431 00:26:31,160 --> 00:26:35,760 Speaker 1: on the bones or are sour grapes from a fired employee. 432 00:26:36,240 --> 00:26:39,120 Speaker 1: People really need to dig into them to find out 433 00:26:39,160 --> 00:26:43,399 Speaker 1: whether there's substance or not. What has Twitter said in 434 00:26:43,520 --> 00:26:48,040 Speaker 1: response to Musk, you know, using the whistleblower complaints in 435 00:26:48,080 --> 00:26:51,959 Speaker 1: his defense? Well, obviously they 436 00:26:51,960 --> 00:26:54,560 Speaker 1: are the ones who let people know 437 00:26:54,640 --> 00:26:57,920 Speaker 1: that Mr Zatko was fired for performance issues. They also 438 00:26:57,960 --> 00:27:00,480 Speaker 1: have said that his complaint is, quote, riddled with 439 00:27:00,560 --> 00:27:06,199 Speaker 1: inaccuracies, close quote. So again, we're going to have 440 00:27:06,320 --> 00:27:11,120 Speaker 1: to wait and see whether the rubber meets the road 441 00:27:11,480 --> 00:27:15,320 Speaker 1: on these issues that he's raised about Twitter's operations. And 442 00:27:15,400 --> 00:27:18,440 Speaker 1: do we have any sense about how many Twitter users 443 00:27:18,640 --> 00:27:21,520 Speaker 1: are really bot accounts or are really real, or is 444 00:27:21,560 --> 00:27:25,560 Speaker 1: that just an unknown at this point? Apparently in the industry, 445 00:27:25,600 --> 00:27:28,159 Speaker 1: the social media industry, this is a problem. It's a 446 00:27:28,200 --> 00:27:33,320 Speaker 1: problem trying to figure out robots from humans.
Twitter 447 00:27:33,400 --> 00:27:36,679 Speaker 1: has more than two hundred thirty million customers, it 448 00:27:36,760 --> 00:27:41,399 Speaker 1: has said in its regulatory or securities filings with the SEC, 449 00:27:42,200 --> 00:27:46,159 Speaker 1: but it believes the bot and spam accounts are somewhere 450 00:27:46,200 --> 00:27:50,560 Speaker 1: around five percent of their customer base. Mr Musk 451 00:27:50,680 --> 00:27:54,520 Speaker 1: and his experts have done some preliminary analysis, and they 452 00:27:54,680 --> 00:27:57,600 Speaker 1: postulate that it could be as much as a third 453 00:27:57,600 --> 00:28:03,560 Speaker 1: of the two hundred thirty million plus customers who are not humans. 454 00:28:03,600 --> 00:28:08,000 Speaker 1: The reason that is important, again, is that you can 455 00:28:08,040 --> 00:28:14,240 Speaker 1: only really make money with advertising from humans with eyeballs 456 00:28:14,320 --> 00:28:18,520 Speaker 1: watching the ads. And, you know, the Twitter folks 457 00:28:19,280 --> 00:28:23,040 Speaker 1: have written their SEC disclosures in a masterful 458 00:28:23,080 --> 00:28:26,000 Speaker 1: way to hedge them. But if there turns out to 459 00:28:26,040 --> 00:28:29,880 Speaker 1: be a major discrepancy, you know, and there's many more 460 00:28:29,920 --> 00:28:33,000 Speaker 1: bots than five percent, that could be a problem for 461 00:28:33,080 --> 00:28:36,880 Speaker 1: the deal. So has this dispute hurt Twitter in any way? 462 00:28:36,920 --> 00:28:40,240 Speaker 1: I mean, are shares down? Shares are down. Shares 463 00:28:40,240 --> 00:28:43,240 Speaker 1: are down as part of an overall drop in the 464 00:28:43,320 --> 00:28:48,400 Speaker 1: tech sector as well, but shares are down. And I 465 00:28:48,440 --> 00:28:51,480 Speaker 1: do believe this whole fight has certainly had an effect 466 00:28:51,520 --> 00:28:55,240 Speaker 1: on Twitter.
They have said that the uncertainty has cast 467 00:28:55,240 --> 00:28:59,560 Speaker 1: a cloud over the shares. It's caused unrest among the 468 00:28:59,640 --> 00:29:04,720 Speaker 1: employees, there's been, you know, a significant brain drain, 469 00:29:05,640 --> 00:29:09,320 Speaker 1: and, you know, it's just not 470 00:29:09,360 --> 00:29:13,000 Speaker 1: pretty optics either about the way 471 00:29:13,040 --> 00:29:15,920 Speaker 1: Twitter operates and how it handles things, and this whole 472 00:29:15,920 --> 00:29:20,080 Speaker 1: stuff about the bots. It's just not pretty. Well, you 473 00:29:20,080 --> 00:29:22,880 Speaker 1: can understand why Twitter would want to get this trial over with. 474 00:29:22,920 --> 00:29:25,800 Speaker 1: So when is it slated for? And also I read 475 00:29:25,800 --> 00:29:28,640 Speaker 1: that Musk is trying to delay it. What's the strategy there? 476 00:29:29,720 --> 00:29:32,680 Speaker 1: So the trial right now is set for October seventeenth 477 00:29:33,040 --> 00:29:37,640 Speaker 1: in beautiful Wilmington, Delaware. And, of course, chancery 478 00:29:37,640 --> 00:29:41,160 Speaker 1: court, as a business court, is a non-jury court, 479 00:29:41,840 --> 00:29:46,040 Speaker 1: so Judge McCormick will hear it by herself and then 480 00:29:46,080 --> 00:29:51,959 Speaker 1: render a decision some months afterwards. Mr Musk originally 481 00:29:52,040 --> 00:29:54,400 Speaker 1: wanted to have the trial in February. He wanted to 482 00:29:54,440 --> 00:29:57,360 Speaker 1: have a nice long time for discovery and everything. But 483 00:29:57,440 --> 00:29:59,920 Speaker 1: the Twitter folks wanted to fast-track it. They 484 00:30:00,160 --> 00:30:02,480 Speaker 1: said the uncertainty was hurting the company, 485 00:30:02,920 --> 00:30:05,160 Speaker 1: so a quicker decision would be better for them.
So 486 00:30:06,000 --> 00:30:09,800 Speaker 1: Judge McCormick sort of split the baby. Twitter asked for September, 487 00:30:11,080 --> 00:30:14,200 Speaker 1: Musk asked for February. She set it for mid-October. 488 00:30:14,720 --> 00:30:17,840 Speaker 1: Now Mr Musk is saying, with the emergence of Mr Zatko, 489 00:30:17,920 --> 00:30:20,560 Speaker 1: the whistleblower, there's going to be some more time needed 490 00:30:20,800 --> 00:30:24,760 Speaker 1: to dig into his claims, analyze them, and figure out 491 00:30:24,760 --> 00:30:30,040 Speaker 1: the implications for the case. So they had originally asked 492 00:30:30,040 --> 00:30:34,040 Speaker 1: for November, in some court filings yesterday, and today our sources 493 00:30:34,080 --> 00:30:39,080 Speaker 1: are saying they're even thinking about asking for early December 494 00:30:39,160 --> 00:30:43,080 Speaker 1: for a trial date. Interesting. Can you tell us 495 00:30:43,080 --> 00:30:46,400 Speaker 1: who's been subpoenaed in this case and if those people 496 00:30:46,400 --> 00:30:50,440 Speaker 1: could end up as witnesses? Well, it would take a 497 00:30:50,480 --> 00:30:52,880 Speaker 1: couple hours to tell you all the subpoenas, because there's 498 00:30:52,880 --> 00:30:56,200 Speaker 1: been over a hundred of them. But we've had 499 00:30:56,240 --> 00:31:01,720 Speaker 1: some big names. Jack Dorsey, the former CEO of Twitter. He's likely, 500 00:31:01,760 --> 00:31:05,520 Speaker 1: I suspect he's likely to testify. We've had Larry Ellison, 501 00:31:05,640 --> 00:31:09,560 Speaker 1: the head of Oracle. He's an investor in the deal. 502 00:31:09,600 --> 00:31:13,120 Speaker 1: There have been investment vehicles tied to Marc Andreessen, 503 00:31:13,240 --> 00:31:16,720 Speaker 1: the very famous tech investor. That's a pretty good start.
504 00:31:17,120 --> 00:31:19,480 Speaker 1: And so a judge ordered Musk to disclose all of 505 00:31:19,560 --> 00:31:23,880 Speaker 1: his potential investors. How come, and could those people end 506 00:31:23,960 --> 00:31:31,200 Speaker 1: up being witnesses too? Well, I think the judge was 507 00:31:31,280 --> 00:31:36,880 Speaker 1: trying to get a sense of the universe of investors 508 00:31:36,880 --> 00:31:40,840 Speaker 1: and advisers to Mr Musk, and to get an idea 509 00:31:41,120 --> 00:31:44,320 Speaker 1: of whether or not there might be a second equity raise, 510 00:31:44,640 --> 00:31:48,760 Speaker 1: which is possible. You know, there's a bunch of 511 00:31:48,800 --> 00:31:53,920 Speaker 1: funds who have invested in the deal. I mean, 512 00:31:53,960 --> 00:31:56,239 Speaker 1: he did a first equity raise of over 513 00:31:56,320 --> 00:32:01,400 Speaker 1: seven billion dollars for the deal, and, you know, 514 00:32:01,520 --> 00:32:04,560 Speaker 1: I think there continue to be some conversations among folks. 515 00:32:04,720 --> 00:32:08,320 Speaker 1: So Twitter is entitled to know who they're talking to. 516 00:32:08,920 --> 00:32:11,920 Speaker 1: So that's why McCormick ordered them to give up both 517 00:32:12,120 --> 00:32:15,640 Speaker 1: the names of the investors and the potential investors. And 518 00:32:15,680 --> 00:32:18,080 Speaker 1: that does it for this episode of Bloomberg Law. I'm 519 00:32:18,160 --> 00:32:21,840 Speaker 1: Lydia Wheeler and I'm Kimberly Robinson. This is Bloomberg