1 00:00:02,759 --> 00:00:07,000 Speaker 1: This is Bloomberg Law with June Grosso from Bloomberg Radio. 2 00:00:08,600 --> 00:00:11,239 Speaker 2: We're going to work with President Trump to push back 3 00:00:11,320 --> 00:00:14,520 Speaker 2: on governments around the world. They're going after American companies 4 00:00:14,880 --> 00:00:16,279 Speaker 2: and pushing to censor more. 5 00:00:17,160 --> 00:00:21,280 Speaker 1: Meta CEO Mark Zuckerberg has been courting President Donald Trump 6 00:00:21,360 --> 00:00:27,400 Speaker 1: since his election, scrapping content moderation policies criticized by conservatives, 7 00:00:27,800 --> 00:00:32,680 Speaker 1: donating a million dollars to Trump's inauguration, promoting Republicans to 8 00:00:32,680 --> 00:00:36,559 Speaker 1: top roles in Meta, and visiting the White House several times. 9 00:00:37,080 --> 00:00:41,919 Speaker 1: Despite all that lobbying, the historic antitrust case against Meta 10 00:00:42,000 --> 00:00:45,440 Speaker 1: that was started in the first Trump administration went to 11 00:00:45,520 --> 00:00:49,600 Speaker 1: trial this week in DC. The Federal Trade Commission is 12 00:00:49,640 --> 00:00:53,720 Speaker 1: trying to force Meta to divest Instagram and WhatsApp, calling 13 00:00:53,800 --> 00:00:57,320 Speaker 1: Zuckerberg as its first witness. Joining me is antitrust 14 00:00:57,360 --> 00:01:01,840 Speaker 1: expert Harry First, a professor at NYU Law School. Harry, 15 00:01:01,880 --> 00:01:04,760 Speaker 1: what's the main issue the FTC has to prove here? 16 00:01:05,040 --> 00:01:07,480 Speaker 2: There are a few main issues. The first issue is 17 00:01:07,600 --> 00:01:11,160 Speaker 2: the definition of the market. So the question is who 18 00:01:11,240 --> 00:01:14,320 Speaker 2: are Facebook's competitors? That's sort of, in some ways, a 19 00:01:14,400 --> 00:01:18,640 Speaker 2: simple question. Who will consumers turn to? To what sellers?
If 20 00:01:18,720 --> 00:01:22,319 Speaker 2: the sellers they're looking at offer a bad deal, usually 21 00:01:22,319 --> 00:01:25,080 Speaker 2: it's raised prices, they switch to something else. We do 22 00:01:25,160 --> 00:01:27,759 Speaker 2: this all the time. So the question is who are 23 00:01:27,800 --> 00:01:32,080 Speaker 2: the rivals of Facebook and who do consumers switch to 24 00:01:32,680 --> 00:01:36,640 Speaker 2: if they wanted to switch. Market definition is always a critical issue 25 00:01:36,640 --> 00:01:39,600 Speaker 2: in antitrust cases, and the first issue to resolve. 26 00:01:40,120 --> 00:01:43,160 Speaker 2: The second issue is are they a monopolist in this market? 27 00:01:43,360 --> 00:01:46,000 Speaker 2: You know, do they have a large enough share of 28 00:01:46,040 --> 00:01:48,480 Speaker 2: the market. Is it hard to enter the market so 29 00:01:48,520 --> 00:01:50,960 Speaker 2: they can sort of have power to do what they want? 30 00:01:51,200 --> 00:01:54,800 Speaker 2: Do we see them doing the bad things that monopolists do, 31 00:01:55,160 --> 00:01:58,040 Speaker 2: which often means, you know, selling something at a high price. 32 00:01:58,280 --> 00:02:01,920 Speaker 2: Of course, Facebook says, hey, we give our thing away 33 00:02:01,960 --> 00:02:05,320 Speaker 2: for free. So what's their problem? And then the third 34 00:02:05,360 --> 00:02:08,720 Speaker 2: thing is, okay, if they are a monopoly with monopoly power, 35 00:02:09,080 --> 00:02:13,919 Speaker 2: have they engaged in anticompetitive conduct that excludes competitors 36 00:02:14,000 --> 00:02:17,359 Speaker 2: unreasonably from the market. And so that's the third thing, 37 00:02:17,400 --> 00:02:21,400 Speaker 2: and that's where the acquisitions of Instagram and WhatsApp come in.
38 00:02:22,120 --> 00:02:25,040 Speaker 1: Let's start with the market definition, which was one of 39 00:02:25,080 --> 00:02:29,120 Speaker 1: the key points Zuckerberg was questioned about. So tell us 40 00:02:29,160 --> 00:02:33,679 Speaker 1: how the FTC and Meta are viewing the market. 41 00:02:34,280 --> 00:02:39,000 Speaker 2: Facebook contends that the market, you know, whatever it was 42 00:02:39,120 --> 00:02:43,040 Speaker 2: maybe in twenty eleven, now has lots of competitors, 43 00:02:43,480 --> 00:02:48,520 Speaker 2: and Facebook is doing things similar to, particularly, TikTok. So 44 00:02:48,919 --> 00:02:52,239 Speaker 2: the Federal Trade Commission has called the market, we always 45 00:02:52,240 --> 00:02:56,480 Speaker 2: love these names in antitrust, the Personal Social Networking 46 00:02:56,600 --> 00:02:59,600 Speaker 2: Services market, and then they use an acronym, so you 47 00:02:59,600 --> 00:03:03,560 Speaker 2: think it's something special, the PSN market. And basically that's 48 00:03:03,600 --> 00:03:07,000 Speaker 2: friends and family. You know, where Facebook started out, putting 49 00:03:07,120 --> 00:03:11,720 Speaker 2: people together and creating the social network, as the name 50 00:03:11,760 --> 00:03:15,079 Speaker 2: of that movie was way back at the start. So 51 00:03:15,080 --> 00:03:17,959 Speaker 2: the Federal Trade Commission says, that's really the core of 52 00:03:18,000 --> 00:03:21,880 Speaker 2: their services, and it's not something that actually other platforms 53 00:03:21,919 --> 00:03:26,200 Speaker 2: except for Instagram offer. They don't really offer that sort 54 00:03:26,240 --> 00:03:29,359 Speaker 2: of friends and family network. You know, on TikTok you can follow people, 55 00:03:29,360 --> 00:03:31,520 Speaker 2: but you know, not for keeping up on what your 56 00:03:31,560 --> 00:03:34,480 Speaker 2: high school buddies are doing.
So the first question is 57 00:03:34,520 --> 00:03:39,000 Speaker 2: how broad is that market? So Facebook says, hey, you know, 58 00:03:39,160 --> 00:03:41,480 Speaker 2: lots of choices for consumers. They don't want to watch 59 00:03:41,600 --> 00:03:45,840 Speaker 2: reels on Facebook, they go to TikTok and vice versa, 60 00:03:45,920 --> 00:03:49,240 Speaker 2: so we have to include them in the market, says Meta. 61 00:03:49,280 --> 00:03:52,000 Speaker 2: And you know that's what a lot of the argument's 62 00:03:52,080 --> 00:03:55,440 Speaker 2: going to be, with data and consumer surveys and so forth. 63 00:03:55,800 --> 00:03:59,560 Speaker 2: And the analogy that the Commission has used and I 64 00:03:59,600 --> 00:04:03,200 Speaker 2: think will continue to use is supermarkets. So there was 65 00:04:03,240 --> 00:04:07,440 Speaker 2: a famous case involving Whole Foods and an acquisition that 66 00:04:07,480 --> 00:04:12,040 Speaker 2: Whole Foods did, and the Federal Trade Commission called the 67 00:04:12,240 --> 00:04:18,520 Speaker 2: market the PNOS market, get it, premium natural organic supermarkets. 68 00:04:18,560 --> 00:04:23,480 Speaker 2: And when Whole Foods made this acquisition of another supermarket 69 00:04:23,640 --> 00:04:26,760 Speaker 2: like that, they said, ah, that's the market. Other supermarkets 70 00:04:26,800 --> 00:04:29,280 Speaker 2: are not in this market, you know, like Stop and Shop 71 00:04:29,720 --> 00:04:33,040 Speaker 2: and so forth. And the court in which this case 72 00:04:33,120 --> 00:04:36,400 Speaker 2: is being tried, the DC Circuit, accepted that definition. And 73 00:04:36,440 --> 00:04:38,880 Speaker 2: the point is sort of a simple one. Yeah, they've 74 00:04:38,920 --> 00:04:41,039 Speaker 2: got lots of things in the supermarket, but you go 75 00:04:41,080 --> 00:04:44,719 Speaker 2: to certain supermarkets, sort of a core of users for 76 00:04:44,880 --> 00:04:47,480 Speaker 2: a core of things.
So you go to Whole Foods 77 00:04:47,480 --> 00:04:49,480 Speaker 2: because you like to pay high prices. Oh wait a minute, 78 00:04:49,480 --> 00:04:52,320 Speaker 2: that's wrong. You go to Whole Foods because you want 79 00:04:52,360 --> 00:04:56,680 Speaker 2: the organic, natural, premium stuff that they specialize in. And 80 00:04:56,720 --> 00:04:59,120 Speaker 2: then you buy milk. So you know, you go to 81 00:04:59,400 --> 00:05:02,960 Speaker 2: Facebook because you want to communicate with your friends and family, 82 00:05:03,480 --> 00:05:05,279 Speaker 2: and then you know, maybe you'll look at some reels 83 00:05:05,360 --> 00:05:07,480 Speaker 2: or maybe do some other things, or whatever other 84 00:05:07,520 --> 00:05:10,000 Speaker 2: features they've got. But the core is still the friends 85 00:05:10,000 --> 00:05:13,320 Speaker 2: and family. So that's going to be the legal and 86 00:05:13,400 --> 00:05:16,479 Speaker 2: sort of factual contention that the parties are going to 87 00:05:16,560 --> 00:05:20,400 Speaker 2: litigate over during the course of this trial. 88 00:05:20,279 --> 00:05:24,920 Speaker 1: And to the point of bad conduct or anticompetitive conduct, the 89 00:05:25,040 --> 00:05:30,040 Speaker 1: FTC showed Zuckerberg some smoking gun emails, including one from 90 00:05:30,080 --> 00:05:33,159 Speaker 1: twenty twelve where he described the Instagram deal as a 91 00:05:33,200 --> 00:05:35,480 Speaker 1: way to neutralize a competitor. 92 00:05:35,880 --> 00:05:37,960 Speaker 2: Yeah, that's a good one. Yeah. I could see the 93 00:05:38,040 --> 00:05:41,840 Speaker 2: lawyers looking at that going, oh my god, did you 94 00:05:41,880 --> 00:05:44,039 Speaker 2: write that. There's a good reason why that would be 95 00:05:44,080 --> 00:05:47,359 Speaker 2: called a smoking gun. You know, not every acquisition that 96 00:05:47,560 --> 00:05:51,680 Speaker 2: a major platform or a monopoly, let's say, makes is illegal.
97 00:05:51,720 --> 00:05:54,960 Speaker 2: It's not illegal for that reason. Firms make acquisitions all 98 00:05:55,000 --> 00:05:58,120 Speaker 2: the time. The ones that are illegal in this context 99 00:05:58,120 --> 00:06:01,080 Speaker 2: are those that are done to take out a threat 100 00:06:01,120 --> 00:06:05,120 Speaker 2: to the monopolist. As they said, a buy-or-bury strategy: 101 00:06:05,120 --> 00:06:07,839 Speaker 2: they would just buy up the competitors they were worried about. 102 00:06:08,200 --> 00:06:11,000 Speaker 2: So on its face it could be neutral. Well, they 103 00:06:11,080 --> 00:06:14,080 Speaker 2: made an acquisition, what the heck? And maybe they have 104 00:06:14,200 --> 00:06:16,960 Speaker 2: some alleged good reasons, but then you read the email 105 00:06:17,000 --> 00:06:19,760 Speaker 2: and Zuckerberg says, hey, we're worried about this company, we 106 00:06:19,880 --> 00:06:24,720 Speaker 2: better buy it. And their effort at incorporating photographs into 107 00:06:25,240 --> 00:06:28,039 Speaker 2: Facebook and making it the kind of thing that Instagram 108 00:06:28,120 --> 00:06:31,000 Speaker 2: was doing was not going well. So they were really 109 00:06:31,000 --> 00:06:34,760 Speaker 2: being challenged by Instagram, and you know, they responded to 110 00:06:34,800 --> 00:06:38,040 Speaker 2: that challenge not by making a better product, but by 111 00:06:38,080 --> 00:06:43,400 Speaker 2: buying their competitor. Classic move, but not a competitive move. 112 00:06:43,440 --> 00:06:47,000 Speaker 2: It's an anticompetitive move, so the government says, and 113 00:06:47,040 --> 00:06:49,320 Speaker 2: so Mark Zuckerberg apparently said.
114 00:06:50,040 --> 00:06:54,120 Speaker 1: At one point, Zuckerberg said that Facebook's feed has turned 115 00:06:54,160 --> 00:06:57,520 Speaker 1: away from family and friends and toward quote more of 116 00:06:57,560 --> 00:07:00,400 Speaker 1: a broad discovery entertainment space. 117 00:07:01,120 --> 00:07:05,640 Speaker 2: Right. Yeah, sure, we compete with television, yeah, and tennis 118 00:07:05,680 --> 00:07:08,000 Speaker 2: and the National Football League. They're not going to go that far, 119 00:07:08,440 --> 00:07:11,000 Speaker 2: but you know, the game, if you want to call 120 00:07:11,040 --> 00:07:14,560 Speaker 2: it that. But the idea is you broaden it: consumers 121 00:07:14,600 --> 00:07:17,760 Speaker 2: do have lots of choices, do different things. You know, 122 00:07:17,960 --> 00:07:22,560 Speaker 2: they don't only look at Facebook, and Facebook over time 123 00:07:22,640 --> 00:07:26,679 Speaker 2: has tried to bring more things within its ecosystem, within 124 00:07:26,720 --> 00:07:30,840 Speaker 2: the platform. So they've gone into virtual reality. That's why 125 00:07:31,200 --> 00:07:35,800 Speaker 2: Zuckerberg renamed the company Meta, because he wants the metaverse. 126 00:07:36,160 --> 00:07:39,200 Speaker 2: So, true, but that doesn't take away from the 127 00:07:39,240 --> 00:07:41,760 Speaker 2: fact that they still sell groceries. You know, they may 128 00:07:41,800 --> 00:07:44,680 Speaker 2: want to have espressos, you know, in the grocery store, 129 00:07:44,880 --> 00:07:47,840 Speaker 2: but they're not an espresso store. So you know, they 130 00:07:47,840 --> 00:07:50,920 Speaker 2: still sell their core function, and that's why people put 131 00:07:51,000 --> 00:07:54,280 Speaker 2: up with all the junk, pardon my French, that they 132 00:07:54,360 --> 00:07:56,800 Speaker 2: see with these ads constantly.
133 00:07:57,160 --> 00:08:00,920 Speaker 1: Can we read anything into the fact that Judge Boasberg 134 00:08:01,080 --> 00:08:06,480 Speaker 1: has sounded skeptical about the FTC's case. He dismissed the 135 00:08:06,560 --> 00:08:09,280 Speaker 1: initial case in twenty twenty one, and in November he 136 00:08:09,360 --> 00:08:14,000 Speaker 1: said the agency faces hard questions about whether its claims 137 00:08:14,040 --> 00:08:16,600 Speaker 1: can hold up in the crucible of trial. 138 00:08:17,080 --> 00:08:20,600 Speaker 2: Right, well, he's running the crucible like an Arthur Miller play. 139 00:08:21,040 --> 00:08:27,520 Speaker 2: So Judge Boasberg, just from reading his opinions, is sharp, 140 00:08:28,160 --> 00:08:31,680 Speaker 2: he's critical. He is not a pushover for the government, 141 00:08:31,840 --> 00:08:35,440 Speaker 2: but he's not a pushover for the defendants either. So 142 00:08:36,120 --> 00:08:38,959 Speaker 2: you know, I read this as saying he'll come out 143 00:08:39,000 --> 00:08:43,679 Speaker 2: with an opinion that will prove challenging for the losing 144 00:08:43,800 --> 00:08:47,840 Speaker 2: party on appeal, whoever that is. And I don't predict 145 00:08:47,840 --> 00:08:51,520 Speaker 2: that he's necessarily going to come out in favor of Facebook, frankly, 146 00:08:52,120 --> 00:08:54,320 Speaker 2: because he's been willing to accept, at least as a 147 00:08:54,360 --> 00:08:58,480 Speaker 2: legal matter, important arguments from the government and dismiss some 148 00:08:58,760 --> 00:09:03,160 Speaker 2: of Facebook's defenses. And you're not going to fool a judge. 149 00:09:03,640 --> 00:09:07,680 Speaker 1: If the judge does find against Meta, how likely 150 00:09:07,760 --> 00:09:11,040 Speaker 1: is an order to divest Instagram and WhatsApp? I mean, 151 00:09:11,040 --> 00:09:14,360 Speaker 1: a breakup of that size hasn't happened since AT&T 152 00:09:14,400 --> 00:09:15,640 Speaker 1: forty years ago.
153 00:09:16,360 --> 00:09:19,840 Speaker 2: Well, we haven't taken on companies of this size. We 154 00:09:19,880 --> 00:09:23,880 Speaker 2: did take on Microsoft, but the court didn't order structural relief. The 155 00:09:23,920 --> 00:09:27,160 Speaker 2: AT&T breakup was by agreement, so it's not 156 00:09:27,640 --> 00:09:31,680 Speaker 2: a decree entered by a court after losing a case. 157 00:09:32,160 --> 00:09:37,320 Speaker 2: So we don't do divestitures that often. You know, not never, 158 00:09:38,000 --> 00:09:40,760 Speaker 2: it does get done, but this is certainly of an 159 00:09:40,760 --> 00:09:44,560 Speaker 2: important magnitude, and you know, puts the court in a 160 00:09:44,760 --> 00:09:49,240 Speaker 2: difficult role of trying to separate companies that Facebook frankly 161 00:09:49,320 --> 00:09:52,920 Speaker 2: has done its best to smush together, to use the 162 00:09:53,040 --> 00:09:58,520 Speaker 2: technical term. So that's sort of an obvious remedy but 163 00:09:58,840 --> 00:10:02,440 Speaker 2: not necessarily going to be the one the court will accept. 164 00:10:02,760 --> 00:10:06,680 Speaker 2: But there are lots of steps before we would see 165 00:10:07,120 --> 00:10:09,240 Speaker 2: an order to separate those companies. 166 00:10:09,480 --> 00:10:12,640 Speaker 1: Mark Zuckerberg has been cozying up to Trump for a 167 00:10:12,720 --> 00:10:16,040 Speaker 1: while now. Could Trump bail Meta out if it does 168 00:10:16,120 --> 00:10:16,760 Speaker 1: lose the case. 169 00:10:17,240 --> 00:10:20,280 Speaker 2: I guess the answer is sure. At least at this point, 170 00:10:20,640 --> 00:10:23,480 Speaker 2: he has pretty much seized control, I don't know what 171 00:10:23,559 --> 00:10:25,680 Speaker 2: you want to call it, a hostile takeover of the 172 00:10:25,720 --> 00:10:29,720 Speaker 2: Federal Trade Commission.
He fired the two Democratic members of 173 00:10:29,760 --> 00:10:34,839 Speaker 2: the commission, what's supposed to be a balanced, bipartisan commission. He's fired 174 00:10:34,880 --> 00:10:38,320 Speaker 2: the two for no reason other than that they're Democrats. The 175 00:10:38,440 --> 00:10:42,120 Speaker 2: chair is slavish in his praise of the President. I 176 00:10:42,200 --> 00:10:45,280 Speaker 2: think that's a fair word. So yes, I think Trump 177 00:10:45,360 --> 00:10:48,319 Speaker 2: could very well order whatever he wanted to order, and 178 00:10:48,640 --> 00:10:51,280 Speaker 2: if the chairman or the other commissioners didn't want to 179 00:10:51,280 --> 00:10:53,680 Speaker 2: go along, he can just fire them. At least that's 180 00:10:53,679 --> 00:10:56,079 Speaker 2: how he sees the law. It may turn out that 181 00:10:56,080 --> 00:10:58,360 Speaker 2: that's not going to be the law. Maybe the Supreme 182 00:10:58,400 --> 00:11:01,760 Speaker 2: Court is going to not take that final step in 183 00:11:01,880 --> 00:11:06,640 Speaker 2: terms of ending the independence of regulatory agencies, but we'll 184 00:11:06,640 --> 00:11:09,760 Speaker 2: have to see. That's in litigation. But if he has control, 185 00:11:09,920 --> 00:11:12,720 Speaker 2: then he can do that. Even before, he did try, 186 00:11:12,760 --> 00:11:16,240 Speaker 2: in his first administration, to pressure the chair of 187 00:11:16,280 --> 00:11:19,840 Speaker 2: the Federal Trade Commission, who resisted things. But now the 188 00:11:19,880 --> 00:11:23,480 Speaker 2: ability and willingness of the chairman to resist whatever the 189 00:11:23,520 --> 00:11:26,680 Speaker 2: president wants is zero.
That said, I'm not quite sure 190 00:11:26,720 --> 00:11:30,520 Speaker 2: I know or understand, and maybe the President doesn't either, 191 00:11:30,800 --> 00:11:33,360 Speaker 2: what he might want and what he would pressure the 192 00:11:33,400 --> 00:11:36,680 Speaker 2: Federal Trade Commission to do. Obviously he's not pulled the 193 00:11:36,720 --> 00:11:39,800 Speaker 2: plug on this litigation, which he could have, so we'll 194 00:11:39,880 --> 00:11:41,520 Speaker 2: just have to see how it goes. 195 00:11:41,720 --> 00:11:45,160 Speaker 1: Always a pleasure, Harry. Thank you. That's Professor Harry First 196 00:11:45,200 --> 00:11:49,000 Speaker 1: of NYU Law School. Coming up next, how the blue 197 00:11:49,000 --> 00:11:52,800 Speaker 1: states could fight back when they're targeted. I'm June Grosso, 198 00:11:52,800 --> 00:11:57,760 Speaker 1: and you're listening to Bloomberg. In the past, President Trump 199 00:11:57,800 --> 00:12:01,600 Speaker 1: has even blamed the criminal case against him on blue 200 00:12:01,640 --> 00:12:03,520 Speaker 1: states being out to get him. 201 00:12:03,880 --> 00:12:06,200 Speaker 3: Can we fly over a Democrat state? 202 00:12:06,880 --> 00:12:10,000 Speaker 4: I get it, grand jury subpoena? 203 00:12:10,280 --> 00:12:11,120 Speaker 3: So what happened? 204 00:12:11,120 --> 00:12:11,600 Speaker 2: What do you do? 205 00:12:11,720 --> 00:12:11,880 Speaker 4: Sir? 206 00:12:12,320 --> 00:12:13,560 Speaker 1: They've investigated you. 207 00:12:13,559 --> 00:12:14,600 Speaker 3: You flew over the state. 208 00:12:14,640 --> 00:12:18,040 Speaker 1: They think something's up. So now is he targeting those 209 00:12:18,080 --> 00:12:22,240 Speaker 1: states for retribution?
On April first, the Trump administration announced 210 00:12:22,240 --> 00:12:25,520 Speaker 1: the closing of five of ten regional offices of the 211 00:12:25,559 --> 00:12:29,120 Speaker 1: Department of Health and Human Services, all in blue states. 212 00:12:29,800 --> 00:12:32,760 Speaker 1: On April eighth, an executive order was aimed at the 213 00:12:32,760 --> 00:12:36,760 Speaker 1: blue states of New York, Vermont, and California for state 214 00:12:36,960 --> 00:12:41,120 Speaker 1: overreach with respect to climate change policies. And six of 215 00:12:41,160 --> 00:12:44,440 Speaker 1: the seven universities that have lost funding on the grounds 216 00:12:44,440 --> 00:12:48,760 Speaker 1: that they didn't adequately address antisemitism on campus are 217 00:12:48,800 --> 00:12:52,040 Speaker 1: in blue states. The seventh is in a purple state. 218 00:12:52,440 --> 00:12:55,520 Speaker 1: But even if Trump is in fact targeting blue states, 219 00:12:55,800 --> 00:12:58,520 Speaker 1: what can they do about it? Joining me is retired 220 00:12:58,559 --> 00:13:02,920 Speaker 1: federal judge Nancy Gertner, a senior lecturer at Harvard Law School. 221 00:13:03,360 --> 00:13:06,120 Speaker 1: She's the co author of an article on Bloomberg Law 222 00:13:06,400 --> 00:13:10,840 Speaker 1: entitled Trump's Blue State Bias Could Rip the US Apart. 223 00:13:11,120 --> 00:13:15,000 Speaker 1: Judge, tell us how President Trump has been discriminating against 224 00:13:15,040 --> 00:13:16,080 Speaker 1: the blue states. 225 00:13:16,320 --> 00:13:20,560 Speaker 3: Well, the most recent one was when he closed five 226 00:13:20,960 --> 00:13:27,560 Speaker 3: of the ten HHS offices, and significantly the five were 227 00:13:27,600 --> 00:13:31,679 Speaker 3: all in blue states. In addition, he has suspended the 228 00:13:31,720 --> 00:13:37,040 Speaker 3: funding for various kinds of programs to hospitals.
But really 229 00:13:37,240 --> 00:13:41,160 Speaker 3: he's listening, we believe, only to the red states that 230 00:13:41,240 --> 00:13:45,000 Speaker 3: are begging for these to return. So you know, Katie 231 00:13:45,040 --> 00:13:48,520 Speaker 3: Britt from Alabama calls him to reinstate them, and we 232 00:13:48,600 --> 00:13:51,439 Speaker 3: believe that he will do that. So he's basically doling 233 00:13:51,559 --> 00:13:55,280 Speaker 3: out federal funds to advantage the red states over 234 00:13:55,320 --> 00:13:57,440 Speaker 3: the blue. And we think that that's going to be 235 00:13:57,440 --> 00:13:59,600 Speaker 3: a pattern going forward. You know, the pattern has not 236 00:13:59,640 --> 00:14:03,320 Speaker 3: been completed. We haven't seen all aspects of it, but 237 00:14:03,480 --> 00:14:06,400 Speaker 3: clearly this is what we have seen so far, particularly 238 00:14:06,440 --> 00:14:08,760 Speaker 3: the thing that happens with respect to HHS. I mean, 239 00:14:08,800 --> 00:14:12,640 Speaker 3: why do you shut down the regional offices that do 240 00:14:12,760 --> 00:14:18,760 Speaker 3: the most business, namely, you know, New York, Boston, Chicago, 241 00:14:19,000 --> 00:14:23,360 Speaker 3: San Francisco, Seattle. It's really pretty transparent what's going on. 242 00:14:23,840 --> 00:14:27,200 Speaker 1: And has the Supreme Court been clear that the Constitution 243 00:14:27,480 --> 00:14:31,160 Speaker 1: requires the federal government to treat all states the same. 244 00:14:31,600 --> 00:14:34,560 Speaker 3: Well, in a decision that I mostly disagreed with, which 245 00:14:34,600 --> 00:14:37,720 Speaker 3: is the Shelby County decision, that was dealing with the preclearance 246 00:14:37,800 --> 00:14:41,280 Speaker 3: requirement of the Voting Rights Act.
Preclearance was a requirement 247 00:14:41,440 --> 00:14:45,800 Speaker 3: that really reflected the discrimination against black people that had 248 00:14:45,800 --> 00:14:49,080 Speaker 3: occurred in certain states in the South, that was documented, 249 00:14:49,120 --> 00:14:52,760 Speaker 3: that was well known. The Supreme Court eliminated preclearance, which 250 00:14:52,960 --> 00:14:56,120 Speaker 3: basically was a situation in which the government would review 251 00:14:56,280 --> 00:15:00,160 Speaker 3: any changes in voting rights procedures in those states to 252 00:15:00,200 --> 00:15:03,600 Speaker 3: make certain that it didn't continue to disadvantage black people 253 00:15:03,680 --> 00:15:07,320 Speaker 3: or didn't re-disadvantage black people. And what happened was 254 00:15:07,480 --> 00:15:10,600 Speaker 3: the court eliminated that preclearance on the theory that there 255 00:15:10,600 --> 00:15:14,480 Speaker 3: was a requirement of equal treatment of all states. And 256 00:15:14,560 --> 00:15:17,720 Speaker 3: so if that is a principle reaffirmed only a 257 00:15:17,720 --> 00:15:20,960 Speaker 3: few years ago, then Trump simply does not have the 258 00:15:21,080 --> 00:15:25,160 Speaker 3: right to discriminate against blue states in favor of red. 259 00:15:25,200 --> 00:15:29,640 Speaker 3: He can't use the federal spending power to disadvantage blue 260 00:15:29,640 --> 00:15:31,160 Speaker 3: states in favor of red. 261 00:15:31,480 --> 00:15:34,680 Speaker 1: In the article, you talk about tax dollars, and a 262 00:15:34,720 --> 00:15:38,280 Speaker 1: recent report from the Rockefeller Institute of Government shows that 263 00:15:38,480 --> 00:15:42,400 Speaker 1: only thirteen states send more money to the federal government 264 00:15:42,720 --> 00:15:46,040 Speaker 1: than they receive, and ten of those are blue states.
265 00:15:46,960 --> 00:15:50,320 Speaker 3: Every state is an employer and is oftentimes the major 266 00:15:50,440 --> 00:15:53,800 Speaker 3: employer in any given state, and as with any employer, 267 00:15:53,840 --> 00:15:57,680 Speaker 3: the state has to withhold money for federal taxes. And 268 00:15:57,760 --> 00:15:59,920 Speaker 3: in the case of the blue states that we mention, 269 00:16:00,720 --> 00:16:03,040 Speaker 3: it is a substantial amount of money. In fact, as 270 00:16:03,080 --> 00:16:07,040 Speaker 3: we note in the article, the blue states are net donors. 271 00:16:07,080 --> 00:16:09,800 Speaker 3: In other words, they give more money to the federal 272 00:16:09,840 --> 00:16:12,480 Speaker 3: government than they get back in the form of services. Now, 273 00:16:12,880 --> 00:16:15,600 Speaker 3: it would be illegal, and I have to say that 274 00:16:15,920 --> 00:16:20,640 Speaker 3: quite candidly, for the blue states to withhold federal dollars. 275 00:16:20,680 --> 00:16:24,160 Speaker 3: In other words, the withholding that you do on your taxes, 276 00:16:24,520 --> 00:16:26,960 Speaker 3: you withhold for the purpose of turning it over to 277 00:16:27,040 --> 00:16:30,120 Speaker 3: the federal government at the appropriate time. It would be 278 00:16:30,200 --> 00:16:34,120 Speaker 3: illegal to withhold it, frankly, until the government makes the 279 00:16:34,200 --> 00:16:37,600 Speaker 3: allocations among the states equal. It would certainly be illegal. 280 00:16:37,840 --> 00:16:40,600 Speaker 3: But it is leverage. What we say in the piece 281 00:16:40,680 --> 00:16:43,760 Speaker 3: is that the government, if they continue to favor red 282 00:16:43,800 --> 00:16:48,400 Speaker 3: states over blue, will be acting illegally and unconstitutionally. And 283 00:16:48,440 --> 00:16:51,440 Speaker 3: so we speculate that this is something the blue states 284 00:16:51,440 --> 00:16:54,160 Speaker 3: could do, although it is illegal.
285 00:16:54,200 --> 00:16:57,480 Speaker 1: And California has a ballot issue where voters are already 286 00:16:57,680 --> 00:16:58,520 Speaker 1: considering this. 287 00:16:59,000 --> 00:17:03,320 Speaker 3: So the California ballot initiative goes even further. The ballot initiative 288 00:17:03,840 --> 00:17:07,840 Speaker 3: is to secede from the Union, which is extraordinary. It's 289 00:17:07,960 --> 00:17:11,160 Speaker 3: asking the voters to endorse the idea of secession. It's 290 00:17:11,200 --> 00:17:15,160 Speaker 3: not clear what legal authority that would have, but that's 291 00:17:15,200 --> 00:17:18,480 Speaker 3: certainly what California voters are indulging in. Again, you know, 292 00:17:18,600 --> 00:17:21,879 Speaker 3: both withholding federal tax dollars and, obviously, seceding from the 293 00:17:21,960 --> 00:17:26,000 Speaker 3: Union are patently illegal. But the notion here is the 294 00:17:26,000 --> 00:17:31,360 Speaker 3: government is behaving illegally were they to punish Massachusetts and California, 295 00:17:31,400 --> 00:17:34,480 Speaker 3: for example, in favor of the red states. The government 296 00:17:34,560 --> 00:17:38,760 Speaker 3: would be behaving blatantly illegally. There's no question. When the government 297 00:17:38,840 --> 00:17:42,639 Speaker 3: behaves illegally, citizens have to decide what to do. Citizens 298 00:17:42,640 --> 00:17:45,640 Speaker 3: have to decide what the appropriate response is. And make 299 00:17:45,720 --> 00:17:50,400 Speaker 3: no mistake, this government is behaving illegally on numbers of fronts, 300 00:17:50,480 --> 00:17:52,480 Speaker 3: as numbers of federal courts have found. 301 00:17:52,720 --> 00:17:55,160 Speaker 1: Have we reached a constitutional crisis yet?
302 00:17:55,400 --> 00:17:57,159 Speaker 3: I don't think there's any doubt that we are in 303 00:17:57,200 --> 00:18:00,239 Speaker 3: a constitutional crisis, whatever the name you put on it. 304 00:18:00,400 --> 00:18:02,320 Speaker 3: I don't think that there's any doubt. So far, there 305 00:18:02,359 --> 00:18:05,240 Speaker 3: isn't a situation, you know, with a Southern politician now 306 00:18:05,320 --> 00:18:09,159 Speaker 3: barring federal troops from coming into Mississippi or Georgia or 307 00:18:09,200 --> 00:18:12,080 Speaker 3: Alabama or anything like that. That's not what we have 308 00:18:12,200 --> 00:18:15,320 Speaker 3: seen yet. What we have seen, as with, for example, 309 00:18:15,680 --> 00:18:19,040 Speaker 3: the most recent case of Mr. Abrego Garcia, this is 310 00:18:19,080 --> 00:18:23,879 Speaker 3: an individual who was wrongly deported to El Salvador. What 311 00:18:23,920 --> 00:18:27,760 Speaker 3: we have seen is the government effectively ignoring a Supreme 312 00:18:27,840 --> 00:18:33,239 Speaker 3: Court decision. We've seen the government effectively ignoring decisions of 313 00:18:33,280 --> 00:18:37,439 Speaker 3: the lower courts. We've seen sophistry and outright lying. The 314 00:18:37,560 --> 00:18:40,280 Speaker 3: sophistry is describing the Supreme Court's decision in a way 315 00:18:40,320 --> 00:18:43,720 Speaker 3: which is patently false, patently false. You know, they're saying 316 00:18:43,760 --> 00:18:47,000 Speaker 3: things like the Supreme Court in dealing with the Abrego 317 00:18:47,040 --> 00:18:50,960 Speaker 3: Garcia case didn't really demand that the government do anything, 318 00:18:51,040 --> 00:18:55,200 Speaker 3: but only that they can passively make certain that if El 319 00:18:55,240 --> 00:18:58,120 Speaker 3: Salvador releases the man he could come into the country. 320 00:18:58,480 --> 00:19:01,679 Speaker 3: I mean, that's absolutely not what the Supreme Court said.
It's 321 00:19:01,359 --> 00:19:03,840 Speaker 3: not at all what the Supreme Court said. So that's 322 00:19:03,880 --> 00:19:07,760 Speaker 3: sort of really sophistry in dealing with Supreme Court decisions. 323 00:19:07,800 --> 00:19:11,560 Speaker 3: And then there's been outright ignoring and disobedience of them. 324 00:19:11,760 --> 00:19:15,600 Speaker 3: You know, in the cases involving the funding freezes on 325 00:19:15,760 --> 00:19:19,119 Speaker 3: government funds across the country, a court will require that 326 00:19:19,200 --> 00:19:23,159 Speaker 3: the funding be restored, and the government is just ignoring it, 327 00:19:23,359 --> 00:19:27,200 Speaker 3: just ignoring it, requiring, you know, courts to essentially get 328 00:19:27,200 --> 00:19:30,800 Speaker 3: to the point, to the moment, of contempt. So I 329 00:19:30,800 --> 00:19:33,480 Speaker 3: don't think there's any question we're in a constitutional crisis 330 00:19:33,520 --> 00:19:37,119 Speaker 3: when the government can lie as they did in the 331 00:19:37,160 --> 00:19:40,440 Speaker 3: Oval Office, in the meeting with El Salvador's Bukele, 332 00:19:41,040 --> 00:19:44,520 Speaker 3: about, again, this gentleman Abrego Garcia, when the government can 333 00:19:44,560 --> 00:19:47,800 Speaker 3: lie flat out that he had been found to be 334 00:19:47,920 --> 00:19:53,240 Speaker 3: an MS-13 member, not true, and when the government 335 00:19:53,320 --> 00:19:57,240 Speaker 3: can lie about its arrangements with Bukele, I don't 336 00:19:57,240 --> 00:19:59,880 Speaker 3: think there's any doubt that we're in a constitutional crisis. 337 00:20:00,000 --> 00:20:03,160 Speaker 1: And not only did the Attorney General misinterpret the Supreme 338 00:20:03,200 --> 00:20:07,760 Speaker 1: Court's decision, but Trump advisor Stephen Miller actually said that 339 00:20:07,840 --> 00:20:12,120 Speaker 1: the Supreme Court had ruled unanimously for the Trump administration.
340 00:20:12,680 --> 00:20:12,880 Speaker 4: Right. 341 00:20:13,119 --> 00:20:15,760 Speaker 3: Again, this is not even close to the truth. When the 342 00:20:15,840 --> 00:20:19,000 Speaker 3: Supreme Court says the government has to facilitate his return, 343 00:20:19,400 --> 00:20:22,000 Speaker 3: and clearly in the context of a decision that is 344 00:20:22,040 --> 00:20:25,880 Speaker 3: about how there was no right to deport him. When 345 00:20:25,880 --> 00:20:29,440 Speaker 3: they say it has to facilitate his return, that doesn't 346 00:20:29,440 --> 00:20:33,120 Speaker 3: mean, well, El Salvador, if you wish to send him, 347 00:20:33,600 --> 00:20:36,800 Speaker 3: we will allow him in. It's clear that it meant 348 00:20:36,880 --> 00:20:40,280 Speaker 3: more than that. It meant affirmative steps to right a wrong. 349 00:20:40,680 --> 00:20:44,000 Speaker 3: The government was responsible for a wrong and they had 350 00:20:44,040 --> 00:20:46,600 Speaker 3: to take affirmative steps to right it. I mean, also, 351 00:20:46,880 --> 00:20:49,879 Speaker 3: the notion that this is a foreign policy issue is 352 00:20:49,960 --> 00:20:53,679 Speaker 3: pure poppycock. This is a commercial issue. This is not 353 00:20:54,040 --> 00:20:58,320 Speaker 3: like returning Brittney Griner from Russia, which was a matter 354 00:20:58,359 --> 00:21:02,840 Speaker 3: of delicate diplomatic negotiation. This is about a commercial contract 355 00:21:02,960 --> 00:21:06,800 Speaker 3: between a two-bit dictator and the United States dealing 356 00:21:06,840 --> 00:21:10,960 Speaker 3: with a contract for six million dollars to receive deportees 357 00:21:11,040 --> 00:21:14,360 Speaker 3: from the United States. And clearly the United States has 358 00:21:14,680 --> 00:21:18,280 Speaker 3: rights under the contract and the United States has power.
359 00:21:18,840 --> 00:21:21,439 Speaker 3: If in the meeting, before the public meeting, Trump had 360 00:21:21,480 --> 00:21:25,080 Speaker 3: said to Bukele, return this guy, it would have been over. 361 00:21:25,760 --> 00:21:28,240 Speaker 3: It would have been over. And the fact that they 362 00:21:28,240 --> 00:21:31,800 Speaker 3: don't even think that this is a wrong that needs 363 00:21:31,840 --> 00:21:33,800 Speaker 3: to be righted is stunning. 364 00:21:34,240 --> 00:21:37,920 Speaker 1: The Fourth Circuit agrees with you. It denied an emergency 365 00:21:38,000 --> 00:21:42,160 Speaker 1: motion by the Trump administration to halt a federal judge's 366 00:21:42,280 --> 00:21:47,760 Speaker 1: effort to facilitate Garcia's return, saying the Justice Department's conduct 367 00:21:47,840 --> 00:21:51,480 Speaker 1: was shocking to the American sense of liberty. Is this the 368 00:21:51,520 --> 00:21:55,879 Speaker 1: most serious constitutional crisis we've faced since the Civil War? 369 00:21:56,359 --> 00:21:59,320 Speaker 3: I think that that's true. It's one thing to have, 370 00:21:59,640 --> 00:22:03,320 Speaker 3: you know, the pictures of the Southern states defying a 371 00:22:03,560 --> 00:22:09,720 Speaker 3: constitutional directive to desegregate, requiring troops to be sent to 372 00:22:09,880 --> 00:22:14,000 Speaker 3: Southern states to enforce the prohibition against segregation. That was 373 00:22:14,000 --> 00:22:19,280 Speaker 3: a confrontation between a unified federal government and particular states. 374 00:22:19,440 --> 00:22:23,639 Speaker 3: This is a confrontation between the federal government as a 375 00:22:23,680 --> 00:22:28,280 Speaker 3: whole and individuals, and a confrontation between the federal government as 376 00:22:28,280 --> 00:22:31,280 Speaker 3: a whole and the states. It's about to involve almost 377 00:22:31,320 --> 00:22:35,080 Speaker 3: every aspect of our lives.
The federal government is asserting 378 00:22:35,119 --> 00:22:38,840 Speaker 3: control over universities. The federal government is seeking to control 379 00:22:39,080 --> 00:22:42,160 Speaker 3: who is in the country and who is not, seeking 380 00:22:42,200 --> 00:22:45,760 Speaker 3: to control who is a citizen, and flouting the Constitution 381 00:22:45,880 --> 00:22:49,080 Speaker 3: in numbers of ways. So in both the scope of 382 00:22:49,119 --> 00:22:52,840 Speaker 3: the disobedience and the scope of the violations by the federal 383 00:22:52,880 --> 00:22:56,080 Speaker 3: government on every front, yes, I would call it 384 00:22:56,160 --> 00:22:58,880 Speaker 3: the most serious constitutional crisis since the Civil War. 385 00:22:59,440 --> 00:23:03,680 Speaker 1: And capitulation by some of the country's largest law firms? 386 00:23:04,040 --> 00:23:06,280 Speaker 3: Well, I think we're beginning to see something different. And 387 00:23:06,359 --> 00:23:09,280 Speaker 3: let me make a more general point. So, the strength 388 00:23:09,280 --> 00:23:12,440 Speaker 3: of the United States is not just in representative government, 389 00:23:12,560 --> 00:23:17,439 Speaker 3: not just having Congress and the president elected by the people, 390 00:23:18,119 --> 00:23:21,040 Speaker 3: not just having local governments. The strength of the United 391 00:23:21,080 --> 00:23:26,520 Speaker 3: States is also having civil society, namely organizations like universities 392 00:23:26,920 --> 00:23:30,080 Speaker 3: and law firms that are independent of the government. They 393 00:23:30,320 --> 00:23:34,639 Speaker 3: stand for principles that the government can't control. So what 394 00:23:34,760 --> 00:23:37,480 Speaker 3: Trump has done is he's taken over the government. We 395 00:23:37,560 --> 00:23:40,320 Speaker 3: elected him.
That was the product of the voters, namely 396 00:23:40,359 --> 00:23:42,760 Speaker 3: that he has the House of Representatives, the Senate, and 397 00:23:42,840 --> 00:23:45,879 Speaker 3: the White House. But the judges are supposed to be independent, 398 00:23:45,920 --> 00:23:48,800 Speaker 3: and the judges have been holding; the judges have reflected 399 00:23:48,840 --> 00:23:52,000 Speaker 3: that independence. What he is doing is trying to dismantle 400 00:23:52,040 --> 00:23:55,280 Speaker 3: civil society. And the way you dismantle civil society is 401 00:23:55,320 --> 00:23:58,600 Speaker 3: you dismantle the universities, and you sort of try to 402 00:23:58,680 --> 00:24:00,960 Speaker 3: co-opt the law firms. I think we're seeing a 403 00:24:01,080 --> 00:24:03,800 Speaker 3: change now. I think there was a certain kind of 404 00:24:03,840 --> 00:24:07,000 Speaker 3: shock at the beginning of this administration. People didn't quite 405 00:24:07,080 --> 00:24:09,640 Speaker 3: understand how far he was prepared to go. I think 406 00:24:09,640 --> 00:24:12,760 Speaker 3: that the firms that have sued the government for its 407 00:24:12,840 --> 00:24:15,320 Speaker 3: illegal actions, I think, are going to be the wave 408 00:24:15,400 --> 00:24:19,120 Speaker 3: of the future. And Harvard's standing up to Trump, I 409 00:24:19,160 --> 00:24:23,760 Speaker 3: think, will forecast other universities doing exactly the same. I 410 00:24:23,760 --> 00:24:25,919 Speaker 3: think there was a certain amount of shock. Surely he 411 00:24:25,960 --> 00:24:29,159 Speaker 3: didn't mean to go as far as he went, was 412 00:24:29,200 --> 00:24:32,560 Speaker 3: the sense. And the answer is yes, he has, and 413 00:24:32,600 --> 00:24:33,679 Speaker 3: indeed further. 414 00:24:33,720 --> 00:24:37,320 Speaker 1: Thanks for joining me. That's retired federal judge Nancy Gertner. 415 00:24:37,640 --> 00:24:40,440 Speaker 1: Coming up, can you keep your data private?
I'm June 416 00:24:40,480 --> 00:24:45,439 Speaker 1: Grosso and you're listening to Bloomberg Law. 417 00:24:46,480 --> 00:24:48,320 Speaker 4: Almost missed it. 418 00:24:48,440 --> 00:24:54,879 Speaker 1: That's for your wife. Yeah, I thought your wife liked roses. 419 00:24:56,359 --> 00:24:56,879 Speaker 4: Excuse me? 420 00:24:58,119 --> 00:24:59,480 Speaker 3: Do I know you? 421 00:24:59,520 --> 00:24:59,600 Speaker 1: No. 422 00:25:00,760 --> 00:25:03,920 Speaker 3: But I know you, Bill. A little relationship advice. 423 00:25:04,400 --> 00:25:06,000 Speaker 1: If you're going to step out on your wife, you 424 00:25:06,080 --> 00:25:10,800 Speaker 1: need to think it through. Don't do it here, because if 425 00:25:10,840 --> 00:25:11,719 Speaker 1: you do, you're fired. 426 00:25:12,119 --> 00:25:13,800 Speaker 3: I'm more like an independent contractor. 427 00:25:14,560 --> 00:25:18,639 Speaker 1: In the TV series Person of Interest, a computer program 428 00:25:18,720 --> 00:25:23,640 Speaker 1: known as the Machine monitors all electronic communications and surveillance 429 00:25:23,760 --> 00:25:27,560 Speaker 1: video feeds in order to find people planning to commit crimes. 430 00:25:28,160 --> 00:25:32,119 Speaker 1: The Machine knows all about you. The series is science fiction, 431 00:25:32,640 --> 00:25:36,760 Speaker 1: but is it that far from reality? Our online activities 432 00:25:36,760 --> 00:25:41,439 Speaker 1: and data are increasingly being tracked, collected, and analyzed, so 433 00:25:41,640 --> 00:25:47,840 Speaker 1: our digital data, including photos, documents, emails, finances, and even 434 00:25:47,880 --> 00:25:52,479 Speaker 1: health information, isn't really private anymore.
For example, according to 435 00:25:52,520 --> 00:25:57,480 Speaker 1: a class action lawsuit, Weight Watchers' websites allegedly shared users' 436 00:25:57,560 --> 00:26:02,400 Speaker 1: personal information, including health related data, with third party tracking 437 00:26:02,480 --> 00:26:07,200 Speaker 1: services like Google, Facebook, and Amplitude. Joining me is an 438 00:26:07,200 --> 00:26:12,080 Speaker 1: expert in cybersecurity and data privacy, Collin Walke, a partner 439 00:26:12,080 --> 00:26:16,119 Speaker 1: at Hall Estill. So Collin, what kind of information did Weight Watchers 440 00:26:16,119 --> 00:26:18,320 Speaker 1: allegedly give out, and how? 441 00:26:18,240 --> 00:26:22,359 Speaker 4: So Weight Watchers was able to give out this information utilizing 442 00:26:22,480 --> 00:26:27,160 Speaker 4: pixels that were on their website from companies like Facebook slash Meta and 443 00:26:27,480 --> 00:26:33,280 Speaker 4: Google, for analytics purposes. And their allegations are that essentially 444 00:26:33,320 --> 00:26:37,240 Speaker 4: every question that you answered on their website, up to 445 00:26:37,359 --> 00:26:40,520 Speaker 4: and including, you know, are you living with Type 2 diabetes? 446 00:26:40,840 --> 00:26:44,040 Speaker 4: Have you used any weight loss medication? You know, do 447 00:26:44,080 --> 00:26:47,199 Speaker 4: you have a history of cancer? Blood pressure? All of 448 00:26:47,240 --> 00:26:51,120 Speaker 4: these sorts of sensitive information, low testosterone, all of these 449 00:26:51,160 --> 00:26:53,040 Speaker 4: were questions that you were supposed to answer when you 450 00:26:53,080 --> 00:26:55,880 Speaker 4: went to the website that had these pixels on them, 451 00:26:56,200 --> 00:26:58,640 Speaker 4: and shared that information then with third parties. 452 00:26:58,760 --> 00:27:02,639 Speaker 1: Was the sharing with third parties deliberate, or was it 453 00:27:02,680 --> 00:27:03,320 Speaker 1: a mistake?
454 00:27:03,760 --> 00:27:05,960 Speaker 4: Well, it appears that it was deliberate. I mean, according 455 00:27:05,960 --> 00:27:10,919 Speaker 4: to the allegations, certainly Weight Watchers knew about utilizing pixels in 456 00:27:11,000 --> 00:27:14,880 Speaker 4: relation to healthcare information, because the Federal Trade Commission has 457 00:27:14,920 --> 00:27:17,600 Speaker 4: issued warnings to that effect, and in fact, Weight Watchers was 458 00:27:17,680 --> 00:27:21,199 Speaker 4: sued once before regarding these types of issues. And so 459 00:27:21,280 --> 00:27:25,080 Speaker 4: Weight Watchers knew that pixels were being utilized and continued 460 00:27:25,119 --> 00:27:27,679 Speaker 4: to do so in spite of warnings from the Federal 461 00:27:27,720 --> 00:27:31,800 Speaker 4: Trade Commission. So, for example, like Fitbit: it collects healthcare information, 462 00:27:31,880 --> 00:27:34,280 Speaker 4: but it's not a HIPAA-protected entity, and so they can 463 00:27:34,320 --> 00:27:37,480 Speaker 4: do whatever they want with that information. So consequently, the 464 00:27:37,720 --> 00:27:39,800 Speaker 4: FTC came out and said, no, you need to start 465 00:27:39,840 --> 00:27:43,280 Speaker 4: treating healthcare information wildly differently than any other type of 466 00:27:43,320 --> 00:27:47,200 Speaker 4: information and protect it better, i.e., don't use pixels on websites. 467 00:27:47,280 --> 00:27:50,240 Speaker 1: They had a privacy policy, right, and they pledged that 468 00:27:50,280 --> 00:27:53,120 Speaker 1: they wouldn't share the information without written consent. 469 00:27:53,920 --> 00:27:56,800 Speaker 4: That's correct, but you have to read between the lines 470 00:27:56,800 --> 00:27:59,960 Speaker 4: on their privacy policy, because they do note that they 471 00:28:00,119 --> 00:28:02,720 Speaker 4: share this information with third parties.
And that's part of 472 00:28:02,760 --> 00:28:05,120 Speaker 4: the problem with privacy policies. One, if you actually took 473 00:28:05,119 --> 00:28:06,760 Speaker 4: the time to read all of them, you know, you 474 00:28:06,760 --> 00:28:09,240 Speaker 4: would never actually utilize a website. You'd just be reading 475 00:28:09,280 --> 00:28:12,600 Speaker 4: all of these vague details. And then two, because there 476 00:28:12,640 --> 00:28:16,480 Speaker 4: aren't requirements on what privacy policies have to contain, with 477 00:28:16,520 --> 00:28:20,000 Speaker 4: certain nominal exceptions, they can get away with making broad 478 00:28:20,000 --> 00:28:22,240 Speaker 4: statements like, we don't sell or share your data with 479 00:28:22,280 --> 00:28:25,520 Speaker 4: third parties, and at the same time, they can still 480 00:28:25,560 --> 00:28:28,360 Speaker 4: do it, because they'll share that information for marketing purposes 481 00:28:28,440 --> 00:28:32,000 Speaker 4: or other sorts of quote unquote internal business purposes. And 482 00:28:32,040 --> 00:28:35,120 Speaker 4: so if you're utilizing the information for internal business purposes, 483 00:28:35,200 --> 00:28:38,280 Speaker 4: i.e., analytics, then you can get away with sharing 484 00:28:38,320 --> 00:28:40,560 Speaker 4: this type of information with third parties. So it's really 485 00:28:40,640 --> 00:28:41,720 Speaker 4: all about wordsmithing. 486 00:28:42,160 --> 00:28:46,080 Speaker 1: Weight Watchers is reportedly facing bankruptcy. If it does go 487 00:28:46,160 --> 00:28:48,920 Speaker 1: into bankruptcy, what happens with the data? 488 00:28:48,960 --> 00:28:50,760 Speaker 4: So that's all part of the questions, right?
So first 489 00:28:50,800 --> 00:28:53,680 Speaker 4: and foremost, this class action lawsuit filed against Weight 490 00:28:53,720 --> 00:28:56,160 Speaker 4: Watchers was filed on April ninth, so they're still at 491 00:28:56,200 --> 00:28:58,160 Speaker 4: the beginning stages, and in fact, I do not believe 492 00:28:58,160 --> 00:29:00,920 Speaker 4: that Weight Watchers has answered yet. If it's the case 493 00:29:00,920 --> 00:29:04,160 Speaker 4: that Weight Watchers is in fact going to file for bankruptcy, the 494 00:29:04,240 --> 00:29:06,400 Speaker 4: question becomes, is it going to be like a 495 00:29:06,440 --> 00:29:09,520 Speaker 4: 23andMe situation in which they're looking for buyers, 496 00:29:09,680 --> 00:29:12,840 Speaker 4: i.e., a Chapter seven or something along those lines, or 497 00:29:12,920 --> 00:29:15,040 Speaker 4: is it going to be something more like a Chapter 498 00:29:15,080 --> 00:29:18,760 Speaker 4: eleven, where it's a true restructuring, in which case Weight Watchers 499 00:29:18,800 --> 00:29:21,560 Speaker 4: wouldn't necessarily sell or give away this information to any 500 00:29:21,600 --> 00:29:24,720 Speaker 4: third parties, but would retain it for their future use, 501 00:29:24,840 --> 00:29:27,120 Speaker 4: you know, after they've addressed all of their debt issues. 502 00:29:27,320 --> 00:29:29,200 Speaker 4: But all of that said, it goes right back to 503 00:29:29,200 --> 00:29:31,560 Speaker 4: what we saw with 23andMe's bankruptcy filing, 504 00:29:31,600 --> 00:29:33,720 Speaker 4: which is, if it's the case that they go ahead 505 00:29:33,720 --> 00:29:36,560 Speaker 4: and liquidate the company, these are assets, which is part of 506 00:29:36,560 --> 00:29:38,840 Speaker 4: the reason why the plaintiffs sued for unjust enrichment.
507 00:29:39,400 --> 00:29:41,800 Speaker 4: And this is the type of private information that's going 508 00:29:41,840 --> 00:29:43,280 Speaker 4: to be sold in a bankruptcy proceeding. 509 00:29:43,480 --> 00:29:45,400 Speaker 1: So you can go on these sites and they give 510 00:29:45,440 --> 00:29:48,320 Speaker 1: you options about whether you want to share your information 511 00:29:48,640 --> 00:29:52,000 Speaker 1: or not. Does any of that matter in reality? 512 00:29:52,440 --> 00:29:55,280 Speaker 4: Well, so you've got to remember there's a slight difference 513 00:29:55,320 --> 00:30:00,240 Speaker 4: between, for example, accepting cookies or rejecting cookies, which 514 00:30:00,280 --> 00:30:02,880 Speaker 4: is one way that they can obtain your information. 515 00:30:03,360 --> 00:30:04,320 Speaker 2: But anytime you 516 00:30:04,280 --> 00:30:07,480 Speaker 4: fill out a form, that information is 517 00:30:07,480 --> 00:30:10,080 Speaker 4: going to be stored with that company. So even if 518 00:30:10,120 --> 00:30:12,880 Speaker 4: you opt out of all cookies, you might still be 519 00:30:13,120 --> 00:30:16,720 Speaker 4: consenting to them utilizing this information. Because, remember, all state 520 00:30:16,840 --> 00:30:19,840 Speaker 4: laws at this stage are opt-out, with the exception 521 00:30:19,880 --> 00:30:22,800 Speaker 4: of Colorado, which has a minor exception for sensitive information. 522 00:30:23,040 --> 00:30:25,840 Speaker 4: But otherwise, these are all opt-out statutes. And what 523 00:30:25,880 --> 00:30:28,680 Speaker 4: that means is, while you may decline cookies, once 524 00:30:28,680 --> 00:30:31,720 Speaker 4: you've submitted your information to that company, they can basically 525 00:30:31,760 --> 00:30:34,200 Speaker 4: do anything that they want with that information, up to 526 00:30:34,240 --> 00:30:36,000 Speaker 4: and including selling it to third parties.
527 00:30:36,200 --> 00:30:39,680 Speaker 1: So then as you respond to the questions or fill 528 00:30:39,720 --> 00:30:43,760 Speaker 1: out forms, you should just consider that information is gone, 529 00:30:43,920 --> 00:30:45,360 Speaker 1: whatever information you're giving them. 530 00:30:45,720 --> 00:30:49,120 Speaker 4: I don't think that most consumers realize how unprotected their 531 00:30:49,200 --> 00:30:51,800 Speaker 4: data is. Once you go on the Internet and you 532 00:30:51,840 --> 00:30:54,360 Speaker 4: interact with a third party, for all intents and purposes, 533 00:30:54,400 --> 00:30:57,240 Speaker 4: that information is then going to be retained by them 534 00:30:57,680 --> 00:31:00,400 Speaker 4: and they can pretty much do whatever they want with it. 535 00:31:00,760 --> 00:31:04,400 Speaker 4: And so the concept of privacy no longer exists, because 536 00:31:05,080 --> 00:31:09,680 Speaker 4: we have divided human beings from their data. The 537 00:31:09,720 --> 00:31:13,120 Speaker 4: reality is, human beings are data. What we do 538 00:31:13,200 --> 00:31:16,480 Speaker 4: on a daily basis is being collected, and so consequently 539 00:31:16,520 --> 00:31:18,480 Speaker 4: we need to make sure, or we need to try, 540 00:31:18,840 --> 00:31:23,280 Speaker 4: to obtain rights to our data, not just privacy laws 541 00:31:23,280 --> 00:31:26,720 Speaker 4: and those sorts of things, but actual rights. Because then, if, 542 00:31:26,760 --> 00:31:29,880 Speaker 4: for example, Weight Watchers says, we don't sell your information or 543 00:31:29,880 --> 00:31:33,160 Speaker 4: give it away to third parties, and Meta gets hacked 544 00:31:33,280 --> 00:31:34,920 Speaker 4: and all of a sudden you find out that they've 545 00:31:34,920 --> 00:31:38,640 Speaker 4: got Weight Watchers' information on their website.
You now have a claim, 546 00:31:39,080 --> 00:31:42,800 Speaker 4: plain and simple, because they stole your property and used 547 00:31:42,840 --> 00:31:45,640 Speaker 4: it without your consent. But right now, there are no 548 00:31:45,760 --> 00:31:47,800 Speaker 4: laws on the books, except for a few 549 00:31:47,840 --> 00:31:51,800 Speaker 4: minor exceptions, that say your data is yours. Otherwise, your 550 00:31:51,880 --> 00:31:52,720 Speaker 4: data belongs to whoever you 551 00:31:52,680 --> 00:31:55,400 Speaker 1: give it to. So then it would take a law 552 00:31:55,600 --> 00:31:56,320 Speaker 1: to change this? 553 00:31:57,000 --> 00:31:59,640 Speaker 4: That's correct. So for example, in Oklahoma, and this is 554 00:31:59,680 --> 00:32:01,640 Speaker 4: the only law that I'm aware of on the books, 555 00:32:01,640 --> 00:32:03,160 Speaker 4: but I would love to see it in other states. 556 00:32:03,280 --> 00:32:06,640 Speaker 4: In Oklahoma, there's a law that says if you upload 557 00:32:06,680 --> 00:32:10,360 Speaker 4: health care information to Oklahoma's Health Information Exchange, which is 558 00:32:10,400 --> 00:32:13,600 Speaker 4: a database that most states have, where if you get 559 00:32:13,640 --> 00:32:15,360 Speaker 4: injured in one part of the state, a doctor in 560 00:32:15,360 --> 00:32:17,320 Speaker 4: another part of the state can look up that information 561 00:32:17,360 --> 00:32:20,480 Speaker 4: in the Health Information Exchange. And the law says that 562 00:32:20,600 --> 00:32:23,120 Speaker 4: all of the information and data that you upload to 563 00:32:23,240 --> 00:32:27,800 Speaker 4: the Health Information Exchange, you quote retain a property right 564 00:32:28,400 --> 00:32:31,880 Speaker 4: in that healthcare data, but then give them a license 565 00:32:31,880 --> 00:32:36,200 Speaker 4: to use it.
So there is a statutory instantiation of 566 00:32:36,240 --> 00:32:39,840 Speaker 4: your property rights in your healthcare data. Now, how far 567 00:32:39,880 --> 00:32:42,480 Speaker 4: does that extend? Probably not much further than your healthcare data. 568 00:32:42,520 --> 00:32:44,040 Speaker 4: But you can see how we would be able to 569 00:32:44,080 --> 00:32:45,840 Speaker 4: pass a law that would cover any type of data 570 00:32:45,880 --> 00:32:47,320 Speaker 4: that you'd give over to third parties. 571 00:32:47,600 --> 00:32:50,400 Speaker 1: So a few months ago I got a notice that 572 00:32:50,440 --> 00:32:53,000 Speaker 1: a company had been hacked, and it was a company 573 00:32:53,040 --> 00:32:56,719 Speaker 1: that was somehow involved in processing claims for one of 574 00:32:56,720 --> 00:33:00,680 Speaker 1: my doctors. So you're giving information to your doctor and 575 00:33:00,760 --> 00:33:03,000 Speaker 1: somehow that information is getting hacked. 576 00:33:03,360 --> 00:33:06,520 Speaker 4: So that's another great example. So within HIPAA, there are 577 00:33:06,600 --> 00:33:09,880 Speaker 4: rules that permit third parties to utilize this data on 578 00:33:09,960 --> 00:33:13,200 Speaker 4: behalf of the covered entity. So for example, your doctor, he 579 00:33:13,360 --> 00:33:15,360 Speaker 4: gives this over to a third party so that they 580 00:33:15,360 --> 00:33:18,760 Speaker 4: can essentially manage it, but not utilize it without his permission. 581 00:33:18,960 --> 00:33:21,440 Speaker 4: Here's the problem: I would be shocked and 582 00:33:21,520 --> 00:33:26,360 Speaker 4: astounded if many, if any, healthcare entities actually then 583 00:33:26,400 --> 00:33:31,160 Speaker 4: go to their business associates and vet their cybersecurity protocols. Instead,
584 00:34:31,240 --> 00:34:34,880 Speaker 4: the business associate agreements typically say that, you know, you're 585 00:34:34,920 --> 00:34:37,080 Speaker 4: going to comply with HIPAA, and you're going to comply 586 00:34:37,160 --> 00:34:40,520 Speaker 4: with my instructions, but there's no verification, there's no auditing 587 00:34:40,880 --> 00:34:43,360 Speaker 4: that, in fact, that's what they're doing with that data. 588 00:34:43,920 --> 00:34:46,880 Speaker 4: And in the healthcare space, it is all over the place, 589 00:34:46,920 --> 00:34:51,120 Speaker 4: because you're sharing information with testing facilities, with third parties, 590 00:34:51,160 --> 00:34:54,160 Speaker 4: billing companies, so your data gets spread all over the 591 00:34:54,160 --> 00:34:56,480 Speaker 4: place and you're hoping and praying that somebody actually 592 00:34:56,560 --> 00:34:58,800 Speaker 4: complies with that business associate agreement. 593 00:34:58,560 --> 00:35:01,040 Speaker 1: And their solution is usually to give you something like 594 00:35:01,080 --> 00:35:04,240 Speaker 1: a year of free credit monitoring. But what can you do? 595 00:35:04,280 --> 00:35:07,280 Speaker 1: You really don't have a choice but to give information in 596 00:35:07,320 --> 00:35:08,680 Speaker 1: some of these situations. 597 00:35:09,080 --> 00:35:10,640 Speaker 4: That's a very good point. And not only that, but 598 00:35:10,680 --> 00:35:12,759 Speaker 4: what are you going to do about the data breaches themselves? 599 00:35:12,840 --> 00:35:16,439 Speaker 4: I mean, under HIPAA, if somebody has breached or accessed fewer 600 00:35:16,440 --> 00:35:20,480 Speaker 4: than five hundred healthcare records, the reporting entity 601 00:35:20,719 --> 00:35:23,480 Speaker 4: does not have to report that hack until February first 602 00:35:23,520 --> 00:35:26,560 Speaker 4: of the following year.
So you, as a consumer, how 603 00:34:26,560 --> 00:34:28,480 Speaker 4: are you going to protect yourself when you find out 604 00:34:28,520 --> 00:34:31,760 Speaker 4: about this hack a year later? You're not. And in fact, 605 00:34:31,800 --> 00:34:34,040 Speaker 4: I've been an advocate that we need to rethink our 606 00:34:34,480 --> 00:34:38,200 Speaker 4: breach notification laws altogether, because, just like you said, what 607 00:34:38,200 --> 00:34:39,640 Speaker 4: do you do when you get that piece of paper? 608 00:34:39,760 --> 00:34:42,320 Speaker 4: You throw it away or you call a credit agency 609 00:34:42,360 --> 00:34:44,480 Speaker 4: to monitor your credit. That's about all that you can 610 00:34:44,520 --> 00:34:46,439 Speaker 4: do at the moment. There needs to be more meat 611 00:34:46,520 --> 00:34:49,680 Speaker 4: on the bones when it comes to data breach notification requirements. 612 00:34:50,000 --> 00:34:53,040 Speaker 1: And what about all the information that the government has 613 00:34:53,120 --> 00:34:54,560 Speaker 1: about us? 614 00:34:54,640 --> 00:34:58,080 Speaker 4: And it's not just information that you've supplied through, for example, 615 00:34:58,120 --> 00:35:01,480 Speaker 4: Social Security information and the like, but it's all that 616 00:35:01,600 --> 00:35:04,880 Speaker 4: information plus whatever else that the government can aggregate from 617 00:35:04,960 --> 00:35:07,840 Speaker 4: data brokers, which they have full rein and free access to, 618 00:35:08,120 --> 00:35:10,360 Speaker 4: just by asking the data brokers to give them copies 619 00:35:10,400 --> 00:35:12,399 Speaker 4: of that information. And just think about it this way. 620 00:35:12,480 --> 00:35:14,200 Speaker 4: Every time that you go to the airport, you have 621 00:35:14,280 --> 00:35:17,680 Speaker 4: that biometric face-scan check when you go to check in.
622 00:35:17,880 --> 00:35:21,480 Speaker 4: That information is then stored and utilized so that, for example, 623 00:35:21,520 --> 00:35:23,320 Speaker 4: if you're in New York City and there's a street 624 00:35:23,320 --> 00:35:25,759 Speaker 4: camera that can pick you up, they can then reidentify 625 00:35:25,800 --> 00:35:28,439 Speaker 4: you fairly quickly. And not just that, but your gait. 626 00:35:28,680 --> 00:35:31,560 Speaker 4: All sorts of unique aspects about you are stored, not 627 00:35:31,640 --> 00:35:33,839 Speaker 4: just by third parties but by the government as well. 628 00:35:34,360 --> 00:35:36,319 Speaker 4: The thing is that I do not think that 629 00:35:36,360 --> 00:35:39,160 Speaker 4: people appreciate the surveillance state that we live in at 630 00:35:39,200 --> 00:35:41,960 Speaker 4: this particular moment. If we want to identify any particular 631 00:35:42,000 --> 00:35:44,399 Speaker 4: person in the United States right now, at this moment, 632 00:35:44,520 --> 00:35:46,960 Speaker 4: where they're at, if they have a cell phone, we 633 00:35:47,040 --> 00:35:48,120 Speaker 4: can do that fairly quickly. 634 00:35:48,200 --> 00:35:50,680 Speaker 1: So it's like the TV shows where they can track 635 00:35:50,800 --> 00:35:52,399 Speaker 1: people no matter where they are. 636 00:35:52,719 --> 00:35:56,040 Speaker 4: It is, it is. Unfortunately, it's a very scary situation 637 00:35:56,120 --> 00:35:58,040 Speaker 4: in which we're living. So I mean, for example, 638 00:35:58,360 --> 00:36:00,759 Speaker 4: you know, one of the questions on Weight Watchers was, 639 00:36:01,000 --> 00:36:03,360 Speaker 4: have you made yourself vomit within the past week?
640 00:36:03,480 --> 00:36:06,839 Speaker 4: You could imagine then that information being sold to some 641 00:36:06,920 --> 00:36:10,160 Speaker 4: diet pill company who then sends a targeted ad towards you, 642 00:36:10,280 --> 00:36:13,839 Speaker 4: and now you're getting, you know, fluff medication from some 643 00:36:13,920 --> 00:36:16,319 Speaker 4: third party that probably doesn't work, as opposed to from 644 00:36:16,320 --> 00:36:20,000 Speaker 4: your actual healthcare provider. And so it's really concerning how 645 00:36:20,000 --> 00:36:22,400 Speaker 4: we're giving away this information and how it can be utilized. 646 00:36:23,160 --> 00:36:24,920 Speaker 1: So what do you do when a website asks for 647 00:36:24,960 --> 00:36:25,680 Speaker 1: your information? 648 00:36:26,520 --> 00:36:29,360 Speaker 4: So again, it's all about risk, and so certain websites 649 00:36:29,400 --> 00:36:30,840 Speaker 4: I have to use. But I will tell you, I 650 00:36:30,840 --> 00:36:32,799 Speaker 4: don't use apps on my phone at all. I have 651 00:36:32,880 --> 00:36:35,520 Speaker 4: like five apps that didn't come pre-installed, and it's 652 00:36:35,560 --> 00:36:38,239 Speaker 4: because I'm more concerned about them tracking me on my 653 00:36:38,320 --> 00:36:41,000 Speaker 4: phone and where I'm at, more so than necessarily what 654 00:36:41,040 --> 00:36:43,600 Speaker 4: I'm uploading to the website. Because the reality is, we 655 00:36:43,640 --> 00:36:45,480 Speaker 4: live in a world in which we have to upload 656 00:36:45,520 --> 00:36:47,719 Speaker 4: information to websites every day, so it's a matter of 657 00:36:47,719 --> 00:36:48,480 Speaker 4: managing your risk. 658 00:36:49,200 --> 00:36:52,600 Speaker 1: So then when you check "don't follow me," uh huh, 659 00:36:52,640 --> 00:36:53,440 Speaker 1: that doesn't work.
660 00:36:53,840 --> 00:36:57,280 Speaker 4: Well, it can, I mean, it can to some degree, 661 00:36:57,400 --> 00:36:59,520 Speaker 4: but you just never know. I mean, there's a lot 662 00:36:59,560 --> 00:37:01,759 Speaker 4: of backdoors, and so I just prefer not to even 663 00:37:01,840 --> 00:37:02,160 Speaker 4: risk it. 664 00:37:02,400 --> 00:37:04,360 Speaker 1: So I guess the only solution is to go back 665 00:37:04,400 --> 00:37:05,840 Speaker 1: to an analog world. 666 00:37:06,360 --> 00:37:08,080 Speaker 4: I think that's the way people are going to realize 667 00:37:08,120 --> 00:37:09,000 Speaker 4: they need to be safe. 668 00:37:09,160 --> 00:37:11,960 Speaker 1: Yes, whenever I talk to you, Collin, I resolve to 669 00:37:12,040 --> 00:37:17,279 Speaker 1: be more careful about my personal information. Thanks for the warnings. 670 00:37:17,640 --> 00:37:20,600 Speaker 1: That's Collin Walke of Hall Estill, and that's it for 671 00:37:20,640 --> 00:37:23,279 Speaker 1: this edition of The Bloomberg Law Show. Remember, you can 672 00:37:23,320 --> 00:37:26,560 Speaker 1: always get the latest legal news on our Bloomberg Law Podcast. 673 00:37:26,840 --> 00:37:29,839 Speaker 1: You can find them on Apple Podcasts, Spotify, and at 674 00:37:30,000 --> 00:37:35,040 Speaker 1: www dot Bloomberg dot com slash podcast slash law, and 675 00:37:35,120 --> 00:37:38,200 Speaker 1: remember to tune into The Bloomberg Law Show every weeknight 676 00:37:38,280 --> 00:37:41,719 Speaker 1: at ten p.m. Wall Street time. I'm June Grosso, and 677 00:37:41,760 --> 00:37:43,239 Speaker 1: you're listening to Bloomberg