Speaker 1: This is Bloomberg Law with June Grasso, from Bloomberg Radio. Facebook could be looking at scrutiny from federal and state regulators, as well as lawsuits from consumers, after data on more than half a billion users became widely available online. Information said to be exposed includes phone numbers, Facebook IDs, full names, locations, birth dates, bios, and in some cases email addresses. Facebook says that the data, which re-emerged online over the weekend, is from an earlier, previously revealed flaw which Facebook fixed. My guest is Andrea Matwyshyn, a professor and Associate Dean of Innovation and Technology at Penn State Law. So was this a dump, a re-dump? What exactly was it?

Speaker 2: So that's an excellent question. And it's still early enough in the investigations around the forensics that will explain to us the events leading up to this most recent identification of large numbers of users' personally identifiable information being available in hacker forums.
Speaker 2: The forensics are really what we need to acquire here, because the extent to which this is part of a prior problem, the extent to which this is a new problem, whether things have in fact been corrected at this point, but also to what extent users were notified in accordance with data breach notification obligations: those are all questions that are going to be contingent on the specifics of those forensics. Additionally, because we have a complicated regulatory relationship between the FTC and Facebook, the date of knowledge by Facebook, but also what exactly they disclosed to the FTC in 2019 at the point of the second round of consent decrees, will be operative in whether the loss of control of the data will potentially give rise to a basis for a new FTC enforcement action under the 2012 consent decree.

Speaker 1: What will the FTC be investigating? And I assume they're going to investigate.

Speaker 2: Yes. So in 2012, the FTC and Facebook agreed to a set of terms that included ongoing self-supervision obligations and assessments going forward, due to some of Facebook's prior practices around privacy and data security.
Speaker 2: So the extent to which those promises that were voluntarily agreed to by Facebook in 2012 have been broken for a second time will be on the table for discussions, I expect, between the FTC and Facebook, once we have the forensics of the particular incident, or maybe more than one incident. We don't even know exactly whether it was one set of leaks or multiple scrapings. And it's bound up with some other questions, particularly around the phone numbers, because Facebook publicly acknowledged that they were using phone numbers that users provided for two-factor authentication purposes and security as a functionality enhancement to allow for user lookup of other users. And so there was considerable public discussion at the time around that choice by Facebook to repurpose information that for many users would have been provided with an expectation of a narrow, security-related use, but not necessarily a repurposing for helping other Facebook users find them.
Speaker 2: So there are a bundle of various practices that will potentially be implicated in these conversations, as well as the context of what was disclosed by Facebook to the FTC at the time of the consent decree.

Speaker 1: Does Facebook know what happened?

Speaker 2: I would hope so, but this is the first question. So the issue of the extent of forensic analysis internally, and what the company knew when; how much of it was a design choice; how much of it was a well-executed incident response from the point at which they found out about it; or how much of the conduct around this incident was a response that some regulators, for example the EU or Australian regulators, may deem not to reflect the expectations that they have for companies in possession of their residents' information. So the details and the forensics here are going to be dispositive. So that's a little bit of a wait-and-see right now. But EU regulators have already announced that they will be conducting further inquiries.
Speaker 2: I would expect the FTC to follow suit in the US, and it's possible that the SEC will take a look, depending on what the nature of the disclosures were by the company in the relevant 10-K statements, because there are Securities and Exchange Commission guidance documents around disclosure of security incidents, and obviously public companies have reporting duties around material risks to business and material litigation risks on an ongoing basis under the '34 Act.

Speaker 1: Facebook has made privacy settlements with the FTC twice, once in 2012 and once in 2019. In light of that, how much credibility do they have when they say they want to protect users' privacy?

Speaker 2: Critics of Facebook have certainly raised that point, that at this point in the history of the company, critics would argue, there have been so many sequential problems that there is a broader story potentially being told of a company that is very interested in public face and statements, but not necessarily interested in creating a culture of data protection and stewardship.
Speaker 2: And critics have been sounding these alarms since really the creation of the company, particularly circa 2007, when there started to be material modifications in the privacy defaults on Facebook. And so there's been a constant set of concerns raised by privacy advocates, and consumers as well, dating back to the early days of the company. So your point is well taken.

Speaker 1: Will state attorneys general look into this?

Speaker 2: It's entirely possible that state attorneys general will also inquire as to the specifics of these incidents. State attorneys general would have authority to potentially engage in state-level enforcement action under the mini-FTC Acts, meaning the state-specific unfair and deceptive trade practices statutes that are a matter of state law and generally tend to parallel the structure of the FTC Act Section 5 on the federal level. So it would be entirely unsurprising if some state regulators who already have Facebook in their crosshairs from previous incidents of suboptimal data stewardship, and the concerns over tech concentration and competition hindrance that are in the ether now, both on
the state and the federal level, as well as formal legal proceedings in some cases against large technology companies. That environment is one of greater regulatory and state attorneys general scrutiny. So I would not at all be surprised if the more tech-savvy, tech-engaged state attorneys general do indeed have some tough questions on this point.

Speaker 1: What about class action lawsuits? What kind of class action lawsuits could we see?

Speaker 2: So depending on the extent to which data breach notification statutes were complied with, or potentially not fully followed, there may be an individual-level cause of action in the states with more aggressively drafted data breach notification statutes, Massachusetts being one of them, California potentially giving rise to individual-level suits. At least some of these cases may end up being class actions in the jurisdictions that are more friendly to class actions.
Speaker 2: In particular, the nature of the data that was released will be relevant, because some of those data breach notification statutes are contingent, in the rights that they grant, on the nature of the information that was disclosed about consumers residing in their states. So this is another situation where the specifics of the Facebook response at the time that they found out, whenever that was, and the extent of the data loss of control, and the extent of the response, forensic analysis, and overall conduct around the incident response will be in play. So I would not be surprised to see class actions; they are becoming increasingly frequent in parallel situations to this one. It is also possible that, depending on the nature of the disclosure in the 10-K annual reports that I mentioned previously, where Facebook has an obligation under the '34 Act to file periodic reports with the SEC, if the disclosure did not extend with adequate specificity and notice, in the opinion of securities litigators, there is an active securities class action bar.
Speaker 2: It is possible that we may see class actions attempted based on the 10-K disclosures, or lack thereof, particularly if this does result in a new fine from the FTC, the EU, or another national regulator.

Speaker 1: Do the fines affect Facebook at all? They're massive fines, but they're just a drop in the bucket compared to what Facebook is worth and what it makes.

Speaker 2: This is a critique that has been the topic of discussion, certainly around the FTC's last fine, which was in the neighborhood of five billion dollars. The approach that the US regulators take is a more constrained one than European regulators. The amounts of any subsequent fines may be more aggressive under a GDPR-based approach, because GDPR authorizes fines that are contingent on corporate earnings. So this question of the proper construction of fines in a way to send a message to companies is one that has definitely been discussed.
Speaker 2: It's a fair critique: if you can, in essence, plan into your business model the amount of the fine, and the fine is materially less than the revenue generated in, say, a single quarter, then the business incentives, some would argue, are to simply view that fine as the cost of doing business. Particularly when you have waivers by enforcers of finding any personal responsibility on the part of officers and directors for oversight failures, you set up potentially a situation where that kind of cost-benefit calculation is more likely. And particularly if a company does tend to have a history of repeated kinds of problems, you may not be, some would argue, creating the incentive for an ethical internal self-evaluation as to whether the current management structures are optimally calibrated to identify these kinds of problems early enough in the process and correct them quickly enough. Some of the data sharing decisions that were made internally may have facilitated the aggregation and availability of data that then created a more attractive target for attackers.
Speaker 2: So, assuming that there was a malicious intrusion, which we don't know, then the way that you build and design your products makes them more or less attractive targets. And if there was no intrusion, but instead this was the aggressive use of an API, or an interface of other sorts that's designed to share information, for example allowing for data scraping, then the question again comes back to product design, and whether the threat modeling was done in a way that accurately modeled the legal risk down the road, in terms of regulatory action and loss of consumer trust arising from problems that may happen because of the design choices in the way that the product works.

Speaker 1: And is this data breach particularly problematic in the amount of information that was given out on a person? You know, for example, you mentioned the phone number that's related to two-factor authentication. In particular, could that disclosure of the phone number be problematic for consumers?

Speaker 2: Depending on whether the consumer volunteered the phone number for a limited purpose, or whether it was a generally shared phone number,
those specifics, I think, are relevant to regulators' determinations about business conduct. On the question of, say, a phone number being published: some consumers would certainly view it as at least a material inconvenience that their phone numbers are now available for public use, potentially leading some consumers to want to change their phone number. The consequences of sharing a phone number are potentially less direct in some ways than sharing, say, a birthday or other personally identifiable information that can't be changed. You can change your phone number, but you can't change your birthday. So the nature of the information that is included in these exposed databases will be relevant. The way that the information was shared originally by consumers will be relevant as a matter of privacy. The extent of security practices, the product design, and the data stewardship choices, as a matter of security, will be relevant to the security inquiries that regulators will undertake. So, you know, again, the specifics here of what exactly happened, and which sets of data and how they were integrated, will become very relevant.

Speaker 1: Thanks, Andrea.
Speaker 1: That's Professor Andrea Matwyshyn, Associate Dean of Innovation and Technology at Penn State Law. This week, the Supreme Court denied the US Solicitor General's request to argue in an upcoming case, something the Court has done just three times in the past two decades, but twice now in under a year. Joining me is Bloomberg Law Supreme Court reporter Kimberly Strawbridge Robinson. Explain the Solicitor General's role, and how the Solicitor General requests to argue in cases where the government is not a party.

Speaker 3: Well, the Solicitor General is the federal government's top lawyer at the U.S. Supreme Court, and while they do some work in the lower appellate courts as well, really their focus is on the Supreme Court, and they have a really unique place as a litigant. They're not only the most frequent litigant in the Supreme Court by far, but they also hold a special place of trust within the Supreme Court, and the office is sometimes known as the tenth Justice because of that special role.
Speaker 3: And so we can see that play out in many different ways that the Solicitor General interacts with the justices. But one that we noticed recently is when the Solicitor General requests to argue in a case in which it is not really a party, but in which there's some kind of federal interest. So this is when it requests to argue as a friend of the court rather than as a party.

Speaker 1: And do the justices always honor the Solicitor General's request to argue?

Speaker 3: Well, in modern history, yes. There's a forthcoming law review article that looked at a period of ten years, starting in, you know, the two-thousand-and-tens, that said that when other organizations requested argument time as a friend of the court, the court only granted it, you know, less than half of the time, fourteen out of forty-one times. But when it was the federal government asking, they granted it three hundred and eleven out of three hundred and twelve times, so basically every time.
Speaker 3: But that's, you know, something that's very rare, when they deny it, and so it really makes court watchers notice when the justices actually rebuff a solicitor general in this way.

Speaker 1: So the Solicitor General was dealt a rejection recently. Tell us about that.

Speaker 3: Well, recently the justices did just that. They told the Solicitor General, no, thank you, we don't want to give you precious argument time in a case about appellate costs. And it's notable because it is one of these rare times where they were turned away, but also because it's happened twice now in just under a year, something that, you know, if you look at three hundred and eleven out of three hundred and twelve times versus something happening twice in one year, you know, that's a noticeable uptick. Of course, it's too small of a sample size to really say that, you know, it's an increasing trend, but it's something to watch for sure.
Speaker 1: Is there something similar in the two cases that were denied?

Speaker 3: Well, you know, all we can really do is speculate. I've said before on the show that the Supreme Court, you know, doesn't really explain a lot; it's a very kind of secretive institution. And so it didn't tell us why it turned away the Solicitor General here, but we can guess. You know, normally in the cases where the Solicitor General is seeking to argue as a friend of the court, there's a pretty strong federal interest. And so there was a case earlier this term where we saw a regulated party challenging a state law, arguing that it was preempted by a federal law. And you can see, you know, the federal interest there in saying what the federal law means is pretty strong, and so the Solicitor General was allowed to argue in that case. In these other cases, the Solicitor General's stated interest is really one as kind of like a general litigant.
Speaker 3: So one was about jurisdiction in state courts, and this current latest one is about appellate costs, and so there really isn't an explanation about how the United States is situated any differently than any other litigant would be.

Speaker 1: So you talked to several experts. What did they say about this? Do they see it as the court sending a message to the SG?

Speaker 3: Well, again, it's too few instances right now to speak in any generalized terms, I think. You know, before we had this latest rebuff, there was some speculation that the justices are just going to do this every once in a while, as a token measure to remind the Solicitor General that they don't get to argue as a matter of course. But, you know, there is some idea that perhaps the justices are picking up this practice because the Solicitor General is really asking to argue in more cases than it ever has before. And so we see them, I think it was last term, they were in, you know, something like eighty percent of the cases.
Speaker 3: And, you know, that can really skew the policy arguments that are put in front of the justices, and ultimately the way that they come out. So that's right now a lot of speculation and something to watch. But it could be a signal that the justices are sending to the Solicitor General to kind of be more cautious about when you ask for time.

Speaker 1: Does giving the Solicitor General time cut into the time of the other parties who are arguing?

Speaker 3: It can, but it really depends, on a case-by-case basis. And so, you know, sometimes the Solicitor General will be given, you know, ten of the thirty minutes that the side they're arguing on has to make their argument. In a case the same day that they turned away the Solicitor General in that appellate costs case, they actually granted the Solicitor General ten extra minutes on top of the thirty minutes that the party they're supporting has. So it really depends. But yes, typically it does actually take away from the party's time.
313 00:22:51,520 --> 00:22:54,400 Speaker 1: And you know, that's pretty significant when you consider that 314 00:22:55,080 --> 00:22:58,080 Speaker 1: often the Solicitor General, they're coming down on the side 315 00:22:58,080 --> 00:23:01,040 Speaker 1: of that party, but they're making their own argument and 316 00:23:01,080 --> 00:23:04,400 Speaker 1: putting forth different ways that the justices should decide the case. 317 00:23:04,800 --> 00:23:08,680 Speaker 1: In your story, you talk about a case involving California's 318 00:23:08,760 --> 00:23:12,959 Speaker 1: rule requiring charities to disclose their biggest donors, and in 319 00:23:13,000 --> 00:23:16,800 Speaker 1: that case, the court is refusing to divide the argument 320 00:23:16,840 --> 00:23:21,480 Speaker 1: time among the petitioners. So these are actually two cases 321 00:23:21,600 --> 00:23:25,440 Speaker 1: that have the same issue, but they involve different parties. 322 00:23:25,520 --> 00:23:29,359 Speaker 1: And the Supreme Court has consolidated those cases for just 323 00:23:29,480 --> 00:23:32,320 Speaker 1: one hour of argument, since they involve, you know, the 324 00:23:32,400 --> 00:23:35,920 Speaker 1: same legal issues, and the parties had asked if they 325 00:23:35,920 --> 00:23:39,320 Speaker 1: could divide the time between, you know, both sets of 326 00:23:39,400 --> 00:23:43,919 Speaker 1: petitioners and then give, you know, additional time to the respondent. 327 00:23:44,400 --> 00:23:48,359 Speaker 1: But the Supreme Court notably said no, you two petitioners have 328 00:23:48,440 --> 00:23:52,760 Speaker 1: to decide on one attorney to represent both of your arguments. 329 00:23:52,960 --> 00:23:55,720 Speaker 1: And at the same time, it allowed the Solicitor General 330 00:23:55,840 --> 00:23:58,280 Speaker 1: to step in as a friend of the court.
331 00:23:58,400 --> 00:24:00,280 Speaker 1: So you can see how, you know, there are really 332 00:24:00,400 --> 00:24:04,879 Speaker 1: small differences between the arguments that the petitioners are making, 333 00:24:05,240 --> 00:24:08,639 Speaker 1: but that's pretty significant that the justices told them no, 334 00:24:09,160 --> 00:24:11,600 Speaker 1: they can't have separate time, but at the same time 335 00:24:11,680 --> 00:24:14,679 Speaker 1: gave extra time to the Solicitor General. And what is the 336 00:24:14,840 --> 00:24:19,040 Speaker 1: SG's record in arguments last term? Well, the Solicitor General 337 00:24:19,080 --> 00:24:21,920 Speaker 1: has a pretty good record as a friend of the court. 338 00:24:21,960 --> 00:24:24,760 Speaker 1: It has a pretty good record in general, even 339 00:24:24,760 --> 00:24:27,480 Speaker 1: including cases where it's a party. You know, the Supreme Court 340 00:24:27,640 --> 00:24:31,480 Speaker 1: usually takes cases to reverse them, so you don't see 341 00:24:31,480 --> 00:24:34,000 Speaker 1: a lot of repeat players coming up with a lot 342 00:24:34,080 --> 00:24:37,120 Speaker 1: of winning streaks at the court. But the Solicitor General's 343 00:24:37,160 --> 00:24:41,679 Speaker 1: office did prevail in, you know, more than half of the cases 344 00:24:41,720 --> 00:24:43,840 Speaker 1: in which it was a party. But when you look 345 00:24:43,840 --> 00:24:45,840 Speaker 1: at when it's a friend of the court, those numbers 346 00:24:45,880 --> 00:24:49,080 Speaker 1: are really skewed. Last term, it won twenty-two of 347 00:24:49,160 --> 00:24:52,280 Speaker 1: the cases where it weighed in as a friend of 348 00:24:52,280 --> 00:24:55,280 Speaker 1: the court, as opposed to as a party. So, you know, 349 00:24:55,440 --> 00:24:58,240 Speaker 1: that's pretty significant given that, you know, they're not really 350 00:24:58,240 --> 00:25:02,000 Speaker 1: officially a party in the case.
And the justices really 351 00:25:02,080 --> 00:25:05,399 Speaker 1: lean on what they have to say. There's an acting 352 00:25:05,440 --> 00:25:10,280 Speaker 1: solicitor general right now. Is there any word about when 353 00:25:10,720 --> 00:25:14,600 Speaker 1: or who Joe Biden may name as the next Solicitor General? 354 00:25:15,040 --> 00:25:18,920 Speaker 1: I have been asking sources, and I haven't heard any 355 00:25:19,000 --> 00:25:22,960 Speaker 1: word on who may be the next solicitor general. The 356 00:25:23,000 --> 00:25:25,720 Speaker 1: only thing that I really heard from the people that 357 00:25:25,760 --> 00:25:28,480 Speaker 1: I've talked to is that they think that the acting 358 00:25:28,640 --> 00:25:32,800 Speaker 1: Solicitor General, Elizabeth Prelogar, is doing a great job, 359 00:25:32,920 --> 00:25:35,960 Speaker 1: and they hope that the Biden administration might actually nominate 360 00:25:36,040 --> 00:25:38,840 Speaker 1: her to the top spot, something that's not really unheard of. 361 00:25:39,160 --> 00:25:42,680 Speaker 1: That happened with Trump's acting solicitor general. But no, 362 00:25:42,840 --> 00:25:47,200 Speaker 1: that's just, I think, speculation at this point, and the Biden administration is 363 00:25:47,320 --> 00:25:50,480 Speaker 1: keeping that SG position pretty quiet right now. 364 00:25:51,040 --> 00:25:55,240 Speaker 1: Let's turn to a Texas judge who has gotten a 365 00:25:55,280 --> 00:25:58,680 Speaker 1: lot of attention over the years, Judge Reed O'Connor. First 366 00:25:58,680 --> 00:26:01,680 Speaker 1: of all, tell us a little bit about him. So, 367 00:26:02,000 --> 00:26:06,719 Speaker 1: Judge O'Connor is really the go-to Republican-appointed judge 368 00:26:07,200 --> 00:26:13,000 Speaker 1: for states who want to challenge Democratic administration policies.
369 00:26:13,080 --> 00:26:17,320 Speaker 1: He's based in Texas, and you know, we've seen Texas 370 00:26:17,480 --> 00:26:21,000 Speaker 1: really lead a lot of red states, uh, red-state 371 00:26:21,040 --> 00:26:26,400 Speaker 1: coalitions challenging things like Obamacare, challenging immigration policies under Obama 372 00:26:26,480 --> 00:26:30,200 Speaker 1: and now under Biden. And he's really the judge, 373 00:26:30,520 --> 00:26:33,800 Speaker 1: and his court is really the court, where Texas has 374 00:26:33,800 --> 00:26:37,200 Speaker 1: filed those cases. Now, I want to be clear that 375 00:26:37,240 --> 00:26:40,399 Speaker 1: this is not something that only red states do. Of course, 376 00:26:40,400 --> 00:26:43,600 Speaker 1: blue states do it. It's not just, uh, you know, 377 00:26:44,080 --> 00:26:47,600 Speaker 1: political groups that do it. We see even international plaintiffs 378 00:26:47,920 --> 00:26:51,520 Speaker 1: seeking to, you know, find a plaintiff-friendly judge here 379 00:26:51,520 --> 00:26:54,080 Speaker 1: in the United States. So this idea, what we call 380 00:26:54,160 --> 00:26:57,720 Speaker 1: forum shopping, is not anything specific to this judge. 381 00:26:57,920 --> 00:27:00,720 Speaker 1: But this is the judge that's really the go-to judge 382 00:27:00,720 --> 00:27:05,320 Speaker 1: for challenging Democratic policies. And tell us about a 383 00:27:05,400 --> 00:27:09,120 Speaker 1: few of the cases that, um, he drew a lot 384 00:27:09,160 --> 00:27:13,560 Speaker 1: of attention for, particularly the Obamacare decision. That's right. So 385 00:27:13,680 --> 00:27:17,240 Speaker 1: the Obamacare case, which I'm sure your listeners are familiar with, 386 00:27:17,480 --> 00:27:19,879 Speaker 1: is one that's actually in front of the Supreme Court 387 00:27:20,440 --> 00:27:22,800 Speaker 1: right now.
It's the case that looks at, you know, 388 00:27:22,880 --> 00:27:27,399 Speaker 1: kind of a tweak that the Republican-led Congress made 389 00:27:27,440 --> 00:27:30,760 Speaker 1: to the Affordable Care Act, and the argument is that 390 00:27:30,960 --> 00:27:34,240 Speaker 1: with that tweak, it kind of makes the whole Affordable 391 00:27:34,280 --> 00:27:38,520 Speaker 1: Care Act fall apart. And this judge actually agreed with 392 00:27:38,560 --> 00:27:42,239 Speaker 1: that argument and struck down the entire Affordable Care Act. And 393 00:27:42,320 --> 00:27:47,080 Speaker 1: that's the decision that the justices are considering right now. Really, 394 00:27:47,320 --> 00:27:50,000 Speaker 1: that decision has gotten a lot of criticism, not just 395 00:27:50,040 --> 00:27:52,679 Speaker 1: from those on the left, but also those on the right. 396 00:27:52,840 --> 00:27:56,560 Speaker 1: Even people who filed in the original Affordable Care Act 397 00:27:56,640 --> 00:27:59,919 Speaker 1: case on the side of those challenging that act, 398 00:28:00,040 --> 00:28:03,000 Speaker 1: saying that it was unconstitutional, are saying that that ruling 399 00:28:03,240 --> 00:28:06,960 Speaker 1: just really doesn't follow the law and is a real outlier. 400 00:28:07,000 --> 00:28:10,080 Speaker 1: And I think most people expect, after oral arguments, that 401 00:28:10,160 --> 00:28:13,479 Speaker 1: the Supreme Court is going to reverse that decision. 402 00:28:13,520 --> 00:28:15,640 Speaker 1: So it's decisions like that, you know, where we see 403 00:28:15,680 --> 00:28:17,359 Speaker 1: a lot of criticism on both the right and the 404 00:28:17,440 --> 00:28:22,040 Speaker 1: left, that Judge O'Connor is known for. Explain this tweet, 405 00:28:22,359 --> 00:28:25,320 Speaker 1: and you know, the whole context of this tweet, how 406 00:28:25,359 --> 00:28:28,960 Speaker 1: it came about.
Well, you know, the Administrative Office of 407 00:28:29,000 --> 00:28:31,400 Speaker 1: the U.S. Courts, they are kind of the policymaking 408 00:28:31,440 --> 00:28:35,679 Speaker 1: arm of the judiciary. They often put out, you know, 409 00:28:35,800 --> 00:28:39,920 Speaker 1: historical information on their tweets and social media sites. They 410 00:28:39,960 --> 00:28:43,000 Speaker 1: put out a lot of educational information, and they just 411 00:28:43,000 --> 00:28:47,080 Speaker 1: posted small clips of Judge O'Connor saying that, you know, 412 00:28:47,480 --> 00:28:50,760 Speaker 1: a judge's role is simply to interpret the law, and 413 00:28:50,800 --> 00:28:54,120 Speaker 1: if they have any disagreements with public policy, you know, 414 00:28:54,400 --> 00:28:57,680 Speaker 1: that's not for a judge to say. It's something pretty innocuous, 415 00:28:57,720 --> 00:29:01,959 Speaker 1: and I think it's something that most people learn about, um, 416 00:29:02,000 --> 00:29:05,480 Speaker 1: you know, in elementary or middle school, the role of judges. 417 00:29:05,640 --> 00:29:08,480 Speaker 1: But it was not well received, given that it was 418 00:29:08,600 --> 00:29:12,520 Speaker 1: this particular judge making that statement. These tweets don't normally 419 00:29:12,600 --> 00:29:16,800 Speaker 1: get very many responses, but this did. It did, and 420 00:29:16,920 --> 00:29:20,880 Speaker 1: with a lot of mockery from academia and those who 421 00:29:20,960 --> 00:29:25,120 Speaker 1: practice in the federal courts, saying, regardless of whether or 422 00:29:25,120 --> 00:29:30,400 Speaker 1: not that's true for some judges.
That's not particularly credible 423 00:29:30,720 --> 00:29:33,680 Speaker 1: from Judge O'Connor. You know, whether that's fair or not, 424 00:29:34,080 --> 00:29:37,200 Speaker 1: or criticism that people can wield at other judges, I think, 425 00:29:37,560 --> 00:29:40,480 Speaker 1: you know, the point is taken that this particular judge 426 00:29:40,520 --> 00:29:43,960 Speaker 1: doesn't always follow that rule, and that's something we can 427 00:29:43,960 --> 00:29:46,880 Speaker 1: see not just in, you know, our own personal opinions, 428 00:29:46,920 --> 00:29:49,680 Speaker 1: but also the times that he's been reversed by the 429 00:29:49,680 --> 00:29:53,760 Speaker 1: Supreme Court and higher appellate courts. That's Bloomberg Law Supreme 430 00:29:53,760 --> 00:29:57,200 Speaker 1: Court reporter Kimberly Strawbridge Robinson, and that's it for this 431 00:29:57,320 --> 00:29:59,960 Speaker 1: edition of the Bloomberg Law Show. Remember, you can always get 432 00:30:00,080 --> 00:30:03,440 Speaker 1: the latest legal news by subscribing to our Bloomberg Law podcasts. 433 00:30:03,880 --> 00:30:06,640 Speaker 1: You can find them on Apple Podcasts, Spotify, and at 434 00:30:06,800 --> 00:30:11,880 Speaker 1: www dot bloomberg dot com slash podcast slash law. I'm 435 00:30:12,000 --> 00:30:15,000 Speaker 1: June Grosso. Thanks so much for listening, and please tune 436 00:30:15,000 --> 00:30:17,800 Speaker 1: into The Bloomberg Law Show every weeknight at ten p.m. 437 00:30:17,840 --> 00:30:19,880 Speaker 1: Eastern, right here on Bloomberg Radio.