Speaker 1: This is Bloomberg Law with June Grasso from Bloomberg Radio.

Speaker 1: There's been a lot of speculation that in a second term, Donald Trump could create a Supreme Court with a Trump-appointed majority that could serve for decades. After all, Trump appointed three Supreme Court justices in his first term that gave the Court a super conservative majority. Joining me is constitutional law expert David Super, a professor at Georgetown Law. David, there's a lot of speculation that one or two of the oldest justices, the super conservatives, seventy-six-year-old Clarence Thomas and seventy-four-year-old Samuel Alito, could retire and give Trump a chance to put younger conservatives on the court. Do you think that's likely?

Speaker 2: I'm not sure that it is. They both show every sign of enjoying their jobs. Progressives don't like how they're doing their jobs, but their opinions remain sharp. When they choose to speak from the bench, they make sense, so I can certainly imagine them feeling like they want to keep doing this.
Speaker 1: Well, Thomas. The New York Times reported that he once told a former clerk that he intended to stay on the bench until twenty thirty four: "The liberals made my life miserable for forty three years, and I'm going to make their lives miserable for forty three years." So we have some time yet before he considers retiring, I suppose. On the liberal side, there have also been renewed calls for seventy-year-old Justice Sonia Sotomayor to retire before January, so President Biden and the Democratic-controlled Senate can quickly confirm a successor. I don't know about quickly, but that whole scenario seems unlikely to me.

Speaker 2: Yeah, I can't imagine that she would do that, and I think it would be a foolish move to try. There was talk about Thurgood Marshall retiring after Jimmy Carter lost reelection, and I think cooler heads prevailed there and recognized that that couldn't be done in a snap. Now, I can't imagine that Senator Manchin or Senator Sinema would be okay with this sort of a process at this point, and without them you can't do anything.
Speaker 1: As far as future judicial appointments, there have been reports that Trump complained about his first-term appointees being too independent and not being loyal enough to him, and, you know, reportedly has fallen out with the Federalist Society, which basically chose his judicial appointments during his first term. So what do you expect from Trump as far as future judicial appointments?

Speaker 2: I think he's not especially focused personally on the courts. He's not a lawyer. He's never shown a lot of interest in it. So my suspicion is that he will go back to the Federalist Society, perhaps not give them as celebrated and formal a role in the process, but that we will basically be getting more FedSoc judges.

Speaker 1: And more ideologues. It seems like even the judges that he did appoint were ideologues more than respected jurists.

Speaker 2: He appointed a range: some of his appointments are splendid judges, some of them are skilled ideologues, and some of them are rather lost. I'm guessing the new round will include some in each of those piles.
Speaker 1: Now, the Biden administration has taken positions in certain cases before the Supreme Court, and we can anticipate that the Trump administration has opposing views and will want to back out of those positions. How often does it happen that a new administration changes positions in cases before the Court?

Speaker 2: It's become more and more frequent over time. Once upon a time, it was considered shocking enough that the Supreme Court would appoint a lawyer to argue the case as friend of the Court, representing the position of the prior administration. These days, there's less of that. I think it is understood by the justices that this is what happens in a society where views on law have become very polarized.

Speaker 1: So now, some areas where there might be a change. Transgender rights is probably one of them. Now, the transgender rights case arguments are scheduled for December fourth, so well before inauguration day. I mean, what happens in a case like that?

Speaker 2: Well, the Trump administration has a couple of options. One is that they can ask the court to reopen it to allow supplemental briefing.
Whether the Court would be inclined to slow itself down to allow that is debatable. Second, they can simply declare that their allegiances are on the other side and rely on the briefs that were opposing what the Biden administration advocated. They can also express the intent to take executive action to reverse federal regulations, but that can't be done instantly. The court could decide the case and let them take their own action in due course, or the court could suspend the case to give them time to act. Or the court could decide not to issue a ruling because it expects the issue to go away.

Speaker 1: The federal government already argued a case over ghost guns, to keep regulations for ghost guns on the books, so that's already been fully briefed and argued. Now it's anticipated that the Trump administration might oppose those regulations. What happens there? Do they just pull their regulations, or...

Speaker 2: Well, you can't just pull the regulation.
The Administrative Procedure Act and the Supreme Court have been very clear that you have to go through the same procedure to remove regulations that you did to create them. So it would take a certain amount of time, probably a year even if they move very quickly, to get rid of those regulations. This is a case where the conservative legal agenda has a somewhat ironic impact, in that a year ago, the administration changing its views on how the statute should be interpreted would be entitled to considerable weight. But today, because the Supreme Court got rid of Chevron, the administration is not entitled to any deference anyway. So one can well imagine the Court saying, we really don't care if you've changed your position.

Speaker 1: Really? Just ignoring Trump, the Trump administration. I mean, has the Court done that before?

Speaker 2: Well, we're very early in the post-Chevron world. There always was deference. The administration's position mattered a great deal.
But the Court went to great lengths last spring to tell us that Congress did not intend administration views to get any special weight in interpreting laws. And on that basis, one could well imagine the Court simply going ahead and deciding the case and saying that if the administration wants to go through the procedures to remove the regulations, they're free to do so. But what the administration might or might not try to do, or succeed in doing, in the future doesn't stop the Court from deciding the case in front of it.

Speaker 1: Yeah, I'm wondering. As you mentioned, we saw the Chevron doctrine being thrown out, and the Court's conservatives have been basically reining in federal agencies. Do you think that we'll still see that kind of aggressive attack on regulatory agencies by the Court in a Trump administration, which may be using agencies to effect some of these broad changes?

Speaker 2: I think different justices will have a different view.
Chief Justice Roberts has very little patience for sloppy, incompetent agency work, whichever ideology the agency has, and when presented with dishonest or incompetent agency actions in the first Trump administration, did not hesitate to strike them down. So I would expect he would continue to be inclined to look closely at agencies and whether they're dotting their i's and crossing their t's. Some of the other justices probably will feel that a lower level of scrutiny should be applied to deregulatory actions than was applied to regulatory actions.

Speaker 1: Some disputes coming up involve the EPA, and whether fights over certain clean air regulations can be heard by circuit courts. I mean, what happens there? Does the Trump administration just pull those cases, or how would they handle those?

Speaker 2: That's harder to know, because that's an issue of process that has implications that go far beyond the current or next administrations.
So one could imagine that the Trump administration, although its view on the ultimate merits may be different, might not feel that it's a high priority to expend its credibility with the Court by switching positions, and may see some benefit to staying out of it or not disrupting what the Biden administration did.

Speaker 1: Have you heard any rumors about who might be the Solicitor General?

Speaker 2: I have not.

Speaker 1: I have not either, and I can't believe that, because I've heard rumors about many other high-level positions that will have to be filled.

Speaker 2: I think that the new administration is going to want someone competent and credible, and so what you hear, at least what I hear, about various other positions are people who may not be especially skilled or especially respected, but are very close with Trump circles and with the former president. I would expect the new Solicitor General would be an extremely conservative lawyer, but not necessarily someone who Trump knows.

Speaker 1: It's always a pleasure to talk to you, David. Thanks so much for being on the show.
That's Professor David Super of Georgetown Law. Meta's Facebook is trying to get out of a shareholder lawsuit accusing it of fraud for misleading investors about a known risk from the massive Cambridge Analytica data breach in twenty fifteen. The justices seemed divided about how much information public companies must disclose about potential investment risks, including past events. It led to hypotheticals, from Chief Justice John Roberts's slip-and-fall case to Justice Samuel Alito's trash from space.

Speaker 3: For example, if you're leaving my house and I say you might slip on the steps, you wouldn't say, well, that's never happened before. Your inference would be that has happened, and that's why I'm giving you the warning. And if there was a fire and it was caused by the fact that the factory was hit by a piece of space junk that fell out of the sky, the fact that that happened...

Speaker 2: ...doesn't really tell you much more about the probability that you're going to have another fire based on objects falling out of space.
Speaker 1: Joining me is securities law expert James Park, a professor at UCLA Law School. Jim, tell us about the shareholders' lawsuit at the heart of this case.

Speaker 3: It's a securities class action brought under SEC Rule ten b five, which typically applies to misrepresentations, material misrepresentations, made by companies that relate to their securities. And so this is a class action lawsuit brought on behalf of investors in Facebook. Now it's Meta, of course, but Facebook, as with other public companies, issues periodic disclosures: the Form ten-K on a yearly basis, the ten-Qs. And these SEC disclosures typically have a lot of disclosures about risk, which are required by the SEC. And so the idea behind these risk disclosures is that you're warning investors: this is the risk of investing in Facebook stock. And one of those risk disclosures basically said that there could be a risk of unauthorized access to user data that could damage Facebook. That was essentially what the risk disclosure said.
And the reason why the plaintiffs are saying this is materially misleading is that at the time Facebook made this statement, it knew about a fairly large breach of user data by a political consulting firm called Cambridge Analytica. And so what the plaintiffs are arguing is that you said that unauthorized access could be a risk when you knew there was a very significant unauthorized access at the time you issued that risk disclosure. That's essentially what the securities class action is dependent on.

Speaker 1: Outside of the validity of the issues here, the FTC's investigation led to this record five billion dollar civil penalty against Facebook, and Facebook reached a two hundred and seventy five million dollar class action settlement with users over the privacy breach last year. Why do you think this hasn't been settled?

Speaker 3: It's a good question, and I think that maybe one reason is that Facebook is often facing securities class actions, and it does not want to be perceived as a company that settles lawsuits that arise out of disclosure violations.
They may believe they have a strong case with respect to this particular allegation. Their argument is that it's not a material misstatement because, you know, there's a risk of this happening. And the argument, as I understand it, is that, okay, they knew that there was a breach; they didn't know how big the impact would be on a huge company like Facebook. And, you know, I think that's essentially the argument: well, we knew that there had been a breach, but we didn't know it was going to result in this later penalty and all this reputational harm, and so our disclosure that there's this risk that unauthorized access could affect our market value was mostly accurate, according to Facebook. And so I think they believe they have a good case here.

Speaker 1: So let's talk about the argument. There were a lot of strange hypotheticals thrown around. What were the concerns of the justices?
Speaker 3: I think one concern is that companies issue a lot of risk disclosures about many different things, and some of those things actually happen, you know. And so you might say, okay, there's a risk of bad publicity, just generally media coverage, and, you know, at the same time you issue this, there may be a lot of various events that are putting you in a bad light. And so I think, you know, one of the concerns that the justices are grappling with is that if we say there's liability in this case, does that mean that for almost every type of risk disclosure that is being issued, a lawsuit could come up if you phrase things in the wrong way, if, you know, there's something that happens relating to that risk? And so there may be too much liability there, and, you know, the effect of that may be that companies will just issue very, very vague risk disclosures that don't say anything. They may not issue risk disclosures at all, right? If I make you liable for a misleading risk disclosure, then maybe I won't issue that risk disclosure at all in the future.
I think that's a concern, and I think they're also just concerned that the rule seems to be unclear here, and it's tough for companies to navigate, you know, what you have to disclose and when you have to disclose it. And, you know, there's a concern here that we don't want to create a general duty to update everything in these disclosures, because there's so much in any SEC disclosure that it may be impossible to, you know, constantly update every single aspect of it.

Speaker 1: Is the question how much to disclose of the past in the present, you know, if you don't think the past is going to affect the future? I mean, what's the exact issue?

Speaker 3: I think what Facebook would say is these are primarily forward-looking statements. They are predictions, and so they're not really meant to guarantee every single thing about past events. I think that they're saying that we're making no guarantee that there has been no unauthorized access to user accounts.
And in fact, I think they would say, of course, of course there have been some abuses of Facebook privacy policies, and everyone would know that at the time the risk disclosure had been made. And I think Facebook is making a fairly subtle argument here, which was to say that it's true that we knew that there was an unauthorized access by Cambridge Analytica. What we didn't know is that it would affect our business so much. We had no way of knowing that, and essentially we have no way of predicting the future. This risk disclosure, we disclosed this and we didn't know. We just didn't know that there was an event that would eventually cause significant economic damage to Facebook. We made no implicit representation about the past in this particular disclosure. It's more of a forward-looking disclosure.

Speaker 1: In this case, there was a hypothetical from Justice Kagan that a lot of the justices kept coming back to, and that is: a company discloses a risk of fire but doesn't disclose that there had recently been a fire that destroyed fifty percent of the plant.
Speaker 3: Yes, that's a great hypothetical, and I think it pushes back on Facebook's argument that these statements do not make implicit representations about past events, because implicit in the risk of a fire in a building is that the building is still there. I think that's one implicit recognition, and I think there's also a recognition that there's not a known high risk of the building burning down. And I think that investors will want to know if there's a very, very high risk that the building is so unsafe that we know it's very likely that it would burn down. What she's arguing here is that, you know, we should be requiring companies, when they make these risk disclosures, to also speak completely and disclose risks that they know about, that they can calculate with some reasonable assurance. And that's where Justice Kagan may come out at the end of the day.

Speaker 1: You mentioned before the regulations, and, well, Justice Kavanaugh said: why can't the SEC just write a reg? Why does the judiciary have to walk the plank on this and answer the question when the SEC could do it?
And the Chief Justice and Justice Amy Coney Barrett seem to agree with that. I mean, why can't the SEC write a regulation?

Speaker 3: They could, but it's difficult to write good regulations, and I think that the more general the regulation is, the easier it is for the SEC. That's one reason the SEC is not adding more specificity. It's also easier, though, I think, for public companies, right? The more specifically you write the regulations, the more, you know, the company is going to have to dig out all of this specific information, and that could be overly burdensome. And so sometimes corporations want fairly vague disclosure requirements that they have some discretion to comply with. And that's the whole idea behind principles-based regulation, which a lot of, you know, what you might think of as conservative individuals have advocated for SEC regulation.
326 00:20:56,600 --> 00:20:59,080 Speaker 3: But I think Justice Kavanaugh has a good point, which 327 00:20:59,119 --> 00:21:01,960 Speaker 3: is that, you know, if you are leaving this to 328 00:21:02,280 --> 00:21:06,359 Speaker 3: lawsuits to define, then you know courts, which are not 329 00:21:06,480 --> 00:21:09,320 Speaker 3: experts in these matters, have to make these very difficult, 330 00:21:09,680 --> 00:21:13,880 Speaker 3: complicated determinations of what should be disclosed. It also gives 331 00:21:14,240 --> 00:21:18,880 Speaker 3: a sense of regulation by enforcement, where lawsuits are being 332 00:21:19,000 --> 00:21:23,560 Speaker 3: used to regulate companies after the fact, and that's a 333 00:21:23,680 --> 00:21:28,200 Speaker 3: running theme about both lawsuits by private litigants as well 334 00:21:28,240 --> 00:21:32,840 Speaker 3: as lawsuits by the Securities and Exchange Commission. So he 335 00:21:33,000 --> 00:21:36,320 Speaker 3: asked a very good question, but I think that there 336 00:21:36,359 --> 00:21:39,920 Speaker 3: are some pressures that make it difficult for the SEC 337 00:21:40,040 --> 00:21:45,919 Speaker 3: to specify completely what the disclosures should look like. And 338 00:21:45,960 --> 00:21:48,719 Speaker 3: so what we have is a very imperfect system where, 339 00:21:49,359 --> 00:21:53,239 Speaker 3: you know, these issues get fleshed out in lawsuits, and 340 00:21:53,280 --> 00:21:56,680 Speaker 3: often the lawsuits are settled, so we never even actually 341 00:21:56,680 --> 00:22:01,040 Speaker 3: resolve what was required. So, you know, for the requirements of 342 00:22:01,119 --> 00:22:05,359 Speaker 3: disclosure for public companies, there's not clear guidance. It's a 343 00:22:05,359 --> 00:22:08,840 Speaker 3: lot of discretion, it's a lot of judgment that is 344 00:22:08,880 --> 00:22:13,480 Speaker 3: required in order to craft good disclosures.
And I think 345 00:22:13,520 --> 00:22:18,080 Speaker 3: there's, you know, some resentment by public corporations that you 346 00:22:18,200 --> 00:22:22,440 Speaker 3: have private enforcers who have a monetary incentive to sue 347 00:22:22,480 --> 00:22:25,800 Speaker 3: whenever they find something wrong in a company's disclosures. And, 348 00:22:26,119 --> 00:22:27,679 Speaker 3: you know, you have to pay a lot of money 349 00:22:27,840 --> 00:22:30,879 Speaker 3: sometimes to settle these lawsuits because you don't want to 350 00:22:30,960 --> 00:22:33,480 Speaker 3: risk them going to trial. Now, you know, one point 351 00:22:33,520 --> 00:22:35,359 Speaker 3: I would add, though, is that in addition to this 352 00:22:35,400 --> 00:22:39,359 Speaker 3: private class action, the SEC brought a case against Facebook 353 00:22:39,480 --> 00:22:44,000 Speaker 3: for, I believe, essentially the same disclosure, and they 354 00:22:44,200 --> 00:22:48,280 Speaker 3: alleged that it was materially misleading, and Facebook actually settled 355 00:22:48,280 --> 00:22:50,840 Speaker 3: that case for one hundred million dollars. So this is 356 00:22:50,880 --> 00:22:54,720 Speaker 3: not a lawsuit that was brought only by private plaintiffs. 357 00:22:54,720 --> 00:22:59,359 Speaker 3: The SEC also has been asserting and pushing this particular 358 00:22:59,440 --> 00:23:00,400 Speaker 3: theory as well. 359 00:23:00,440 --> 00:23:03,200 Speaker 1: Coming up next, how might the Justices rule here? 360 00:23:03,440 --> 00:23:07,280 Speaker 1: This is Bloomberg.
I've been talking to Professor James Park 361 00:23:07,359 --> 00:23:11,560 Speaker 1: of UCLA Law School about oral arguments before the Supreme 362 00:23:11,640 --> 00:23:14,920 Speaker 1: Court this week in Facebook's attempt to get out of 363 00:23:14,960 --> 00:23:19,520 Speaker 1: a shareholder lawsuit accusing it of fraud for misleading investors 364 00:23:19,840 --> 00:23:23,320 Speaker 1: about a known risk from the massive Cambridge Analytica data 365 00:23:23,320 --> 00:23:27,960 Speaker 1: breach in twenty fifteen. Something that Justice Gorsuch said resonated 366 00:23:28,000 --> 00:23:32,280 Speaker 1: with me. He suggested that reasonable investors were well aware 367 00:23:32,320 --> 00:23:35,840 Speaker 1: of the risk of data breaches at large companies, including 368 00:23:35,840 --> 00:23:36,880 Speaker 1: by foreign governments. 369 00:23:37,040 --> 00:23:40,040 Speaker 3: I think China probably has all of our FBI files, 370 00:23:40,160 --> 00:23:43,520 Speaker 3: you know, I mean, data breaches are part of our 371 00:23:43,600 --> 00:23:46,960 Speaker 3: lives these days. There are data breaches. But I think 372 00:23:47,000 --> 00:23:49,840 Speaker 3: what the plaintiffs would say is that the Cambridge Analytica 373 00:23:49,960 --> 00:23:55,120 Speaker 3: breach was an extraordinary one, involving millions of different customers, 374 00:23:55,320 --> 00:23:57,879 Speaker 3: and, you know, I believe the data was used for 375 00:23:57,960 --> 00:24:03,159 Speaker 3: political purposes. And, you know, not every data breach results 376 00:24:03,200 --> 00:24:06,440 Speaker 3: in a five billion dollar fine by the Federal Trade Commission.
377 00:24:06,840 --> 00:24:10,200 Speaker 3: And so, you know, certainly we're generally aware there could 378 00:24:10,280 --> 00:24:13,320 Speaker 3: be data breaches, but you could argue the investors assume 379 00:24:13,720 --> 00:24:17,080 Speaker 3: most companies are managing them pretty well and they're not 380 00:24:17,119 --> 00:24:21,640 Speaker 3: going to affect the stock price significantly, but this one did. Now, 381 00:24:21,840 --> 00:24:24,919 Speaker 3: what Facebook is saying, though, is that, you know, we 382 00:24:25,040 --> 00:24:27,159 Speaker 3: knew about the breach at the time we made this 383 00:24:27,320 --> 00:24:31,080 Speaker 3: risk disclosure, but it wasn't quite clear it was going 384 00:24:31,160 --> 00:24:35,720 Speaker 3: to have this big impact. That was still unclear, and, 385 00:24:36,240 --> 00:24:38,840 Speaker 3: you know, we warned about that risk, but we didn't know. 386 00:24:39,119 --> 00:24:42,240 Speaker 3: We didn't know that the stock price would be affected 387 00:24:42,280 --> 00:24:45,399 Speaker 3: as much as it ultimately was when the scandal broke 388 00:24:45,520 --> 00:24:50,679 Speaker 3: more widely and regulators reacted, and that, you know, it 389 00:24:50,720 --> 00:24:53,920 Speaker 3: became something that hurt Facebook's reputation. That aspect of it 390 00:24:54,200 --> 00:24:57,160 Speaker 3: we didn't know at the time the risk disclosure was issued. 391 00:24:57,960 --> 00:25:00,520 Speaker 1: Do you have any feel from the oral argument about 392 00:25:00,520 --> 00:25:01,440 Speaker 1: how they might rule? 393 00:25:03,119 --> 00:25:06,760 Speaker 3: I think it'll be a very fact specific decision.
I 394 00:25:06,800 --> 00:25:12,520 Speaker 3: think that the court understands that this is a really, 395 00:25:12,600 --> 00:25:17,879 Speaker 3: really hard issue and the Supreme Court is not qualified 396 00:25:17,920 --> 00:25:22,520 Speaker 3: to write very detailed disclosure rules, and so I think 397 00:25:22,560 --> 00:25:26,480 Speaker 3: they will try to limit the case to its facts. 398 00:25:26,840 --> 00:25:29,240 Speaker 3: It's a tough one. It's really tough to predict which 399 00:25:29,240 --> 00:25:32,719 Speaker 3: way they're going to go. I suspect maybe a majority 400 00:25:32,840 --> 00:25:36,600 Speaker 3: will side with Facebook. You know, that this is a 401 00:25:36,640 --> 00:25:40,679 Speaker 3: case where sure they knew there was a breach, but 402 00:25:40,760 --> 00:25:44,439 Speaker 3: they didn't know it would impact the company's business as 403 00:25:44,520 --> 00:25:48,359 Speaker 3: much as it did, and therefore the risk disclosure was 404 00:25:48,400 --> 00:25:51,359 Speaker 3: not materially misleading. Now, you know, if they had known 405 00:25:51,920 --> 00:25:55,440 Speaker 3: about the breach and they knew that it was as 406 00:25:55,480 --> 00:25:58,439 Speaker 3: impactful as it would be, then that would be a 407 00:25:58,480 --> 00:26:01,520 Speaker 3: different story. Now it's possible, though, some of the justices 408 00:26:01,520 --> 00:26:04,240 Speaker 3: will buy the plaintiffs' response. You know, the plaintiffs, basically, 409 00:26:04,280 --> 00:26:07,000 Speaker 3: their response is that, Okay, you must have known that, 410 00:26:07,480 --> 00:26:10,120 Speaker 3: given the size of the breach, that it was going 411 00:26:10,160 --> 00:26:13,000 Speaker 3: to really affect your reputation at the time you made 412 00:26:13,040 --> 00:26:17,240 Speaker 3: that statement, and therefore the risk disclosure was misleading.
What 413 00:26:17,320 --> 00:26:21,120 Speaker 3: you should have done when you wrote the disclosure was acknowledge, Hey, 414 00:26:21,160 --> 00:26:26,200 Speaker 3: we have this very, very unprecedented, significant breach of millions 415 00:26:26,200 --> 00:26:29,399 Speaker 3: and millions of users' data. That's the disclosure you 416 00:26:29,400 --> 00:26:32,359 Speaker 3: should have written. Now, that does seem to be kind 417 00:26:32,359 --> 00:26:35,600 Speaker 3: of second guessing of the disclosure with twenty twenty hindsight, 418 00:26:35,600 --> 00:26:40,320 Speaker 3: and Facebook would say it's not fair for every 419 00:26:40,359 --> 00:26:45,160 Speaker 3: single little disclosure in a one hundred page document to potentially 420 00:26:45,560 --> 00:26:48,040 Speaker 3: be second guessed by a plaintiff's attorney who has the 421 00:26:48,119 --> 00:26:50,919 Speaker 3: incentive to question it. But I think it's going to 422 00:26:50,920 --> 00:26:52,399 Speaker 3: be a close one. I think it could be a 423 00:26:52,440 --> 00:26:54,719 Speaker 3: close one, and I think there could be some disagreement. 424 00:26:54,760 --> 00:26:57,080 Speaker 3: But what I am confident of is it's going to 425 00:26:57,119 --> 00:27:00,280 Speaker 3: be a fairly fact specific decision. 426 00:27:00,280 --> 00:27:04,239 Speaker 1: A split between the conservatives and the liberals? 427 00:27:04,320 --> 00:27:08,840 Speaker 3: Somewhat, although with some of the conservatives I wasn't quite sure 428 00:27:09,320 --> 00:27:12,800 Speaker 3: which way they were going. And, you know, Justice Thomas 429 00:27:12,800 --> 00:27:15,920 Speaker 3: starts out and asks some questions that I thought were 430 00:27:16,000 --> 00:27:19,000 Speaker 3: leaning a little bit towards the plaintiffs.
So it is, 431 00:27:19,080 --> 00:27:22,280 Speaker 3: it is a little bit hard to say. You know, 432 00:27:22,520 --> 00:27:26,119 Speaker 3: Justice Kavanaugh's question is more just about why are we 433 00:27:26,160 --> 00:27:28,679 Speaker 3: even deciding this in the first place. I don't know 434 00:27:28,720 --> 00:27:32,280 Speaker 3: what that means for how he would actually believe the 435 00:27:32,400 --> 00:27:35,159 Speaker 3: case should come out. You know, I do think the 436 00:27:35,240 --> 00:27:40,199 Speaker 3: liberal justices probably seem to be mostly sympathetic to the question. 437 00:27:40,480 --> 00:27:43,040 Speaker 3: And, you know, Justice Alito has thought a lot about 438 00:27:43,160 --> 00:27:45,639 Speaker 3: these issues. He decided a few cases when he was 439 00:27:45,680 --> 00:27:50,879 Speaker 3: on the Third Circuit, including the Burlington Coat Factory decision, that 440 00:27:51,280 --> 00:27:55,600 Speaker 3: basically grappled with this duty of updating disclosures. So he's 441 00:27:55,600 --> 00:27:59,440 Speaker 3: thought about this for many years. I tend to believe Alito, 442 00:28:00,040 --> 00:28:03,480 Speaker 3: based on the Burlington Coat Factory decision, would be less 443 00:28:03,520 --> 00:28:07,119 Speaker 3: inclined to say that those disclosures were misleading. But I 444 00:28:07,160 --> 00:28:10,359 Speaker 3: could be wrong on that. It's hard for me to say, 445 00:28:10,880 --> 00:28:14,480 Speaker 3: but I think you could see some coalescing that might 446 00:28:14,680 --> 00:28:18,119 Speaker 3: break down along conservative and liberal sort of lines. 447 00:28:18,400 --> 00:28:22,199 Speaker 1: But you think that whatever the decision, it'll be fact specific, 448 00:28:22,280 --> 00:28:25,200 Speaker 1: so it won't have so much impact on other companies' 449 00:28:25,280 --> 00:28:27,320 Speaker 1: disclosures, or am I overstating that?
450 00:28:27,800 --> 00:28:31,560 Speaker 3: I think that's right, and even the defendant, Facebook, is 451 00:28:31,720 --> 00:28:36,000 Speaker 3: very careful to say that there will be some circumstances 452 00:28:36,800 --> 00:28:40,920 Speaker 3: where a risk disclosure is misleading, like when you know 453 00:28:41,120 --> 00:28:45,480 Speaker 3: for a certainty that the risk has actualized. My understanding 454 00:28:45,520 --> 00:28:49,400 Speaker 3: of Facebook's position is that would be materially misleading, and 455 00:28:49,440 --> 00:28:53,520 Speaker 3: so I think that is a situation you find quite 456 00:28:53,560 --> 00:28:56,680 Speaker 3: often these days. That's a theory that's asserted a lot 457 00:28:56,720 --> 00:29:00,680 Speaker 3: against companies. I think that they could write a decision 458 00:29:00,720 --> 00:29:05,760 Speaker 3: where you could say, in these circumstances, you knew that 459 00:29:05,920 --> 00:29:08,320 Speaker 3: part of the risk had happened, but not all of 460 00:29:08,360 --> 00:29:11,440 Speaker 3: the risk had happened. I think that might be what 461 00:29:11,520 --> 00:29:14,800 Speaker 3: the Supreme Court could do in this case if 462 00:29:14,800 --> 00:29:18,480 Speaker 3: it wanted to rule in favor of Facebook, and that 463 00:29:18,600 --> 00:29:21,760 Speaker 3: could affect other cases, though, because there 464 00:29:21,800 --> 00:29:25,160 Speaker 3: may be other cases that are like that, and also 465 00:29:25,280 --> 00:29:27,320 Speaker 3: there will often be the question of, well, does our 466 00:29:27,600 --> 00:29:31,040 Speaker 3: case fall into this fact pattern or is it something else?
467 00:29:31,280 --> 00:29:35,120 Speaker 3: So I think it could have a substantial effect on 468 00:29:35,640 --> 00:29:38,880 Speaker 3: the way these cases are litigated, but I don't think 469 00:29:38,920 --> 00:29:44,240 Speaker 3: it will completely immunize risk disclosures from scrutiny under Rule 470 00:29:44,280 --> 00:29:44,960 Speaker 3: 10b-5. 471 00:29:45,880 --> 00:29:48,520 Speaker 1: Switching topics for just a moment, I do have a 472 00:29:48,600 --> 00:29:54,320 Speaker 1: question for you about crypto and the SEC. The crypto industry, 473 00:29:54,600 --> 00:29:59,360 Speaker 1: you know, poured millions of dollars into presidential and congressional races, 474 00:29:59,360 --> 00:30:02,520 Speaker 1: and Donald Trump has said that on day one he's 475 00:30:02,560 --> 00:30:07,120 Speaker 1: going to fire Gary Gensler. You see, he has 476 00:30:07,160 --> 00:30:09,320 Speaker 1: a lot of firing to do on day one. So 477 00:30:09,560 --> 00:30:13,000 Speaker 1: do you think the crypto industry is going to benefit 478 00:30:13,600 --> 00:30:17,200 Speaker 1: from Gensler's departure and a Trump administration? 479 00:30:17,560 --> 00:30:20,480 Speaker 3: I think so. I'm not sure how much they will benefit. 480 00:30:20,640 --> 00:30:23,720 Speaker 3: It's not quite clear to me what the SEC will 481 00:30:23,760 --> 00:30:28,080 Speaker 3: do even under a Trump administration. I'm not sure whether 482 00:30:28,120 --> 00:30:34,760 Speaker 3: it's possible to write rules that would balance investor protection 483 00:30:35,040 --> 00:30:39,440 Speaker 3: and what the crypto industry is seeking. Is it possible 484 00:30:39,480 --> 00:30:41,760 Speaker 3: that the crypto industry could get some sort of broad 485 00:30:42,800 --> 00:30:47,920 Speaker 3: immunity from the requirements of the SEC? Possibly.
I think 486 00:30:47,960 --> 00:30:52,120 Speaker 3: that would probably have to go through Congress, and the 487 00:30:52,120 --> 00:30:55,560 Speaker 3: crypto industry has had a good amount of influence there. 488 00:30:55,640 --> 00:30:59,760 Speaker 3: But, you know, Congress is self interested, and if they 489 00:30:59,760 --> 00:31:05,040 Speaker 3: were to pass that sort of broad immunity from registration, 490 00:31:06,440 --> 00:31:09,640 Speaker 3: if the crypto industry collapses, if a lot of investors 491 00:31:09,640 --> 00:31:12,240 Speaker 3: lose a lot of money, they know exactly who to 492 00:31:12,320 --> 00:31:18,440 Speaker 3: point at, and so I'm not sure politically that Congress 493 00:31:18,480 --> 00:31:22,960 Speaker 3: will go in that direction. Could Trump appoint somebody who 494 00:31:24,360 --> 00:31:28,760 Speaker 3: will lessen the SEC's enforcement? I think that's probably the 495 00:31:28,800 --> 00:31:33,240 Speaker 3: most likely option there. It is difficult, though, I think, 496 00:31:33,360 --> 00:31:37,520 Speaker 3: for enforcement to justify completely dropping cases, you know, that 497 00:31:37,560 --> 00:31:41,120 Speaker 3: are still being litigated against companies 498 00:31:41,240 --> 00:31:45,600 Speaker 3: like Coinbase. And if it takes that somewhat drastic step, 499 00:31:46,520 --> 00:31:50,240 Speaker 3: then, you know, we have someone to blame if investors 500 00:31:50,240 --> 00:31:53,640 Speaker 3: lose a lot of money. So I'm not exactly sure 501 00:31:53,800 --> 00:31:58,360 Speaker 3: what's going to happen on day one or the days 502 00:31:58,400 --> 00:32:03,600 Speaker 3: afterwards with a Trump SEC, but certainly Trump will try 503 00:32:03,640 --> 00:32:06,520 Speaker 3: to do some things that at least appear to be 504 00:32:06,640 --> 00:32:11,080 Speaker 3: favorable to the crypto industry.
But I also think he's 505 00:32:11,080 --> 00:32:16,040 Speaker 3: going to engage in some self interested calculus, and he 506 00:32:16,760 --> 00:32:21,440 Speaker 3: understands that he could be blamed if there are significant losses. 507 00:32:21,760 --> 00:32:24,320 Speaker 3: I think he will balance those different things, and it'll 508 00:32:24,360 --> 00:32:27,239 Speaker 3: be interesting to see what happens on day one and 509 00:32:27,360 --> 00:32:28,480 Speaker 3: the days afterwards. 510 00:32:28,520 --> 00:32:30,840 Speaker 1: There will certainly be a lot of things to watch. 511 00:32:31,480 --> 00:32:33,680 Speaker 1: It's always great to have you on this show, Jim, 512 00:32:33,760 --> 00:32:37,640 Speaker 1: thanks so much. That's Professor James Park of UCLA Law School. 513 00:32:38,040 --> 00:32:40,680 Speaker 1: And that's it for this edition of the Bloomberg Law Podcast. 514 00:32:41,040 --> 00:32:43,400 Speaker 1: Remember you can always get the latest legal news by 515 00:32:43,440 --> 00:32:47,280 Speaker 1: subscribing and listening to the show on Apple Podcasts, Spotify, 516 00:32:47,560 --> 00:32:51,400 Speaker 1: and at Bloomberg dot com, Slash podcast, Slash Law. I'm 517 00:32:51,480 --> 00:32:53,920 Speaker 1: June Grosso, and this is Bloomberg