This is Bloomberg Law with June Grasso from Bloomberg Radio.

[Audio clip from The Dark Knight]
"Beautiful. Beautiful. Unethical. Dangerous. You've turned every cell phone in Gotham into a microphone."
"This is wrong."
"I've got to find this man, Lucius."
"At what cost?"
"This is an audio sample. If he talks within range of any phone in the city, you can triangulate his position."
"I'll help you this one time."

June Grasso: In The Dark Knight, Batman uses an unexplained technology to convert cell phones into a citywide tracking system to find the Joker. That technology may have seemed like science fiction when the movie was released more than a decade ago, but now it's real. Today, geofence warrants allow law enforcement to search Google's Sensorvault database for location records of all mobile phones within a virtual perimeter at specified times, raising privacy and Fourth Amendment issues. A robbery case in Virginia will be a high-stakes legal test of the dramatic spike in police use of geofence warrants to identify suspects: the first Fourth Amendment challenge to a geofence warrant in a federal court. Here to explain the legal challenges is Ari Ezra Waldman, a professor at Northeastern University Law School. Ari, new figures from Google show a tenfold increase in the use of these geofence warrants in the last three years. Explain what they are.

Ari Waldman: Geofence warrants are tools that the government uses to take advantage of the data that private companies gather about us, like where we are, where we use our phones, where we use Apple Pay, where we buy products, data that can essentially track individuals wherever they are. Think about it as kind of a souped-up version of getting cell site data. You know, when you drive down the New Jersey Turnpike, or you drive down a highway, your cell phone connects to different towers, and you can identify where a car has been or where it's going. A warrant to get geofence data is essentially: hey, company, you have gathered all this information about what we normally do every minute of the day. Give us that information so we can track where a potential suspect is.
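[Editor's note: To make the mechanism concrete, a geofence request is essentially a spatial and temporal filter over a provider's stored location records: return every device observed inside a defined perimeter during a defined time window. The sketch below illustrates only that filtering step. The record layout, field names, and function names are hypothetical, chosen for illustration; they are not Google's Sensorvault schema or API.]

from dataclasses import dataclass
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

@dataclass
class LocationRecord:
    # Hypothetical record layout; a real provider's data model would differ.
    device_id: str       # pseudonymous identifier held by the provider
    lat: float
    lon: float
    timestamp: datetime

def distance_m(lat1, lon1, lat2, lon2):
    # Great-circle (haversine) distance in meters between two lat/lon points.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def geofence_hits(records, center_lat, center_lon, radius_m, start, end):
    # Every device seen inside the circular "fence" at any point in the window,
    # not just a named suspect: this is what makes the request so broad.
    return {
        r.device_id
        for r in records
        if start <= r.timestamp <= end
        and distance_m(r.lat, r.lon, center_lat, center_lon) <= radius_m
    }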
June Grasso: But a judge has authorized the warrant based on a finding of probable cause.

Ari Waldman: Yeah, police get a warrant when an independent judge finds that there is probable cause that some criminal activity is happening. It's often relatively easy to get a warrant, but the idea is that at least the police have to justify to an independent arbiter, like a judge, that this information is necessary. So the fact that someone got a warrant, that's fine. But you have to ask yourself: when individuals give this information over to private parties, using tools like Google or an iPhone and so forth, tools where we may expect or know that information is being given to private parties, did we expect that such information would be so easily obtained by government authorities? I think that's a very different question.

June Grasso: So in this case, a man accused of a bank robbery is challenging the use of data from a geofence warrant as evidence in his indictment. And it's the first such federal court challenge.

Ari Waldman: It's the first case that I'm aware of. It's not the first case where location data is part of an indictment. A very important Supreme Court case, Carpenter v. United States, decided just a couple of years ago, was based on historical cell site location data. So police have been using location information from cell service companies and from Internet companies for some time. And Google said, I think it was last month, that geofence warrants now make up something like one quarter of all demands for information from the US government to these companies. What that means is, it's not that the government is newly using location data in criminal prosecutions. It's just that the technology we have is so much better: better in the sense that these companies know exactly where we are and have the ability to track us in real time. So of course the government is going to want to use it when trying to arrest suspects or crack down on crime.
June Grasso: What is the defendant's basic argument about the constitutionality of the warrant?

Ari Waldman: There are multiple arguments, but one of them is that the quantity and quality of the information that can be provided through a geofence warrant is just way too significant, way too much. Normally, when police get a warrant, it's: I need a warrant to search your home; I have a warrant to search your office. There is a finite amount of information there. If police get a warrant to search your office and there's a computer in it, normally they have to get a separate warrant to search the computer. But a geofence warrant is almost like a 24/7 dragnet about a person. It allows law enforcement not just to see the papers in the desk; it allows them to see pretty much everywhere you go and everything you do for however long the warrant lasts, and sometimes the warrant covers a significant period. And in Carpenter v. United States, the Supreme Court was very skeptical of that, very skeptical of these 24/7 dragnet surveillance tools, not just because there wasn't a warrant, but because of the quantity and the quality of the information.

June Grasso: So in defending the geofence warrant in this case, the Justice Department argues that Google users give up their expectation of privacy when they let the company collect data on their phone. Do you agree with that?

Ari Waldman: Well, that's what the government argued. But as I was saying earlier, there is a very big difference, first, between giving information to a private company in order to gain some benefit, like being able to find your way in a new city through Google Maps, and having the government be able to access that information. So that's one of the differences.
Second, I think it's just plain wrong, even if the government were not involved in the story, to say that we give up our privacy rights the moment we share information with a third party. If that were true, then the moment we walked out of our houses we would lose our privacy rights; the moment we spoke to another person we would lose our privacy rights. Now, of course, the US government, Facebook, Google, large tech companies make that exact argument in court all the time: our users have no privacy rights, individuals have no privacy rights, because they voluntarily shared information. It's a cynical ploy that is based on American law's tradition of seeing privacy in terms of control. It's a longstanding discourse about what privacy means: that privacy is the ability to control the dissemination of information about you. That definition undergirds so much of privacy law, whether it's the privacy we have against another private party, like a tech company or another person, or against the government. But the implication is that if privacy is about controlling the dissemination of your information, then once you share that information with a third party, you have lost control of it. So it's a weak form of privacy that the government and tech companies have been taking advantage of for so long. That's why we feel like we have no privacy, despite all these companies saying "we care about your privacy," and all these privacy wizards and cookie banners. At its heart, the law itself is undermining our ability to protect our privacy because of this individuated, control-based definition.

June Grasso: In another case, concerning a search for suspects in ten fires in the Chicago area, a federal judge approved a government request for location data from Google. What kind of test did that judge use?

Ari Waldman: Normally, warrants are issued where there's probable cause that criminal activity has occurred.
So in any given case, a judge will get evidence when the proper party applies for a warrant, and they'll look at it: okay, here's the information, this is the data that law enforcement wants in order to make an arrest or find more information; now let me determine whether there's probable cause based on the information before me. So that's a one-off decision. Judges, when deciding on warrants, don't make decisions about how one particular case, or one particular motion in front of them, affects or implicates the wider system of surveillance. It's a one-off, and the nature of a one-off decision is that it is made in a vacuum. It doesn't notice, or doesn't know, or doesn't really care, that the quality and quantity of information that a geofence warrant, or another broad warrant for data, will gather is so significant that everything about the person, not just their potential criminal activity, becomes knowable to the police. So it's very easy to approve one of these warrants, because the surveillance context in which it sits is not really part of the judge's determination. What's really part of the judge's determination is just: is there probable cause here, and does this warrant make sense? And it's a lot easier to make that choice than it is to say, well, this is part of a larger system of surveillance that is problematic under the Fourth Amendment.

June Grasso: Do you have any reason to think that the judge in this case is going to rule that the warrant was invalid?

Ari Waldman: Well, I can hope. There's a forthcoming paper in the Harvard Law Review by a professor at the University of Utah named Matthew Tokson, and he's done a really great job of taking every single post-Carpenter case and analyzing how courts, lower courts, appellate courts, district courts, state courts, have interpreted Carpenter.
And this is going to be another one of those data points, because here the judge has the opportunity to look at this surveillance, to look at what the police asked for and what they did, and to ask: has this surveillance gathered so much information (the so-called aggregation theory, or mosaic theory, of surveillance) that it left no privacy whatsoever? Was there more surveillance, over and above what would be necessary or expected for this particular criminal investigation? Now, it's entirely possible that the court will take Professor Tokson's advice and do a nuanced analysis. But more often than not, the federal courts, which definitely have a conservative tilt, side with law enforcement. So maybe that's just my cynicism, but I don't really expect much of this federal bench. The appellate courts and the Supreme Court, though, when it comes to these issues, don't always fall along traditional ideological lines. So it's an open question what might happen as more courts start to interpret or think about the constitutionality of these types of warrants.

June Grasso: Are legal experts in the privacy area paying attention to this case?

Ari Waldman: Sure. These kinds of cases always come across our desks, whether we're teaching about them, talking about them, or using them as examples in our research. The interesting thing about geofence warrants is that they implicate both sides: they're interesting to scholars of the Fourth Amendment, so government access to information, but also to people like me. I focus much more on the relationships between individuals and private companies, because the source of all this data is not the government gathering the information on its own. It implicates our expectations about what a private company like Google will do with our data, and what goes on when we make decisions about the tools we use, the apps we use, and so forth.
That really has nothing to do with the government, and yet our system allows the government great access to it. So I would expect that not just Professor Tokson but a lot of law professors, legal advocates, and privacy advocates are thinking about this. Albert Fox Cahn, a privacy advocate who runs a nonprofit in New York City, has written several poignant and cogent op-eds about geofence warrants showing how dangerous they are. So this is a hot topic right now, and hopefully judges are taking note.

June Grasso: You've written a book, Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power. So let me ask you: what can the average person do to protect their privacy? Because, using Google Maps as an example, it makes it so easy to get around anywhere in the world. It's hard to give that up for the concept of privacy.

Ari Waldman: Yeah, I get this question a lot, and you're right. I barely know how to get from place A to B, so I use Google Maps all the time. But I'll play a little bit of a law professor game here and say that is the wrong question. In fact, "what can the individual do" is exactly the question that companies want us to be asking ourselves, because it implies that protecting privacy is something that we can do. In the world of informational capitalism, in the world that privacy law and the law generally have set up, almost nothing an individual can do will protect their privacy. I'm sorry to say it, but there is nothing that we as individuals can do. Sure, you can use a VPN, a virtual private network. You can use Firefox instead of Google Chrome. You can say no to all personalization cookies. All of that is great, and anyone can do it. If you want to go through the work of doing all those toggles, your Internet experience is not going to change very much, and some of your privacy will be protected.
But all of that is marginal, because even if you do all those things, these companies are still collecting terabytes of data about you passively. Even if you don't accept cookies, you're on their website: they know where you are, they know what hardware you're using, they know where your cursor is, and they know your purchase history, because companies can buy it from data brokers. So it's the wrong question to ask what we need to do. The only way individuals will ever be empowered to protect their privacy is when the current political economy of big tech and the data economy is overturned, or rejected, or completely reformed, such that we actually do have those tools. So, although that's very cynical, what we really should be asking ourselves is what we can do to ensure structural reform. That means voting for political leaders who will actually pass honest reform, rather than people who only talk about big tech for conservative political gain. And it's not just the political leaders we vote for, but what happens in our own schools. Like, are we okay with our employers having a contract with Google that gives us Google Docs while also feeding Google data in the process, or would we be better served by a similar product that doesn't include the rampant data collection that Google Docs does? Those are the kinds of things we really need to be thinking about. We also need to create a larger movement for privacy reform, because without that, politicians are just going to take more big tech money and maintain the status quo with some changes around the edges.

June Grasso: The book is based on four years of research. Tell us what you did during those four years, who you talked to, how you got access.

Ari Waldman: Sure. The book Industry Unbound is based on nearly four years of fieldwork.
And it's not four years, five days a year. It's based on interviews with current and former employees of big tech companies, including engineers, privacy professionals, product managers, and lawyers, as well as interviews with law firm partners and associates, in addition to reviews of confidential documents that have been redacted. I also had the opportunity to be embedded, for a couple of days, in one case around a week, inside several medium-size and small tech companies, just to see how they run their meetings, to interview some of their people, and to talk about how they do privacy law on the ground. Getting access was really hard. I had a lot of companies just say, no, we're not interested in this. So I had to find other routes, including snowball sampling: finding a social network that gives me access to people who formerly worked at a company. In fact, Facebook just refused to reply. Google insisted that if I talked to anyone, it had to be on deep background and I couldn't quote anyone. And Facebook eventually did reply, to multiple emails, several years down the line, and said that if we talk to you, we require approval of all quotes from Facebook employees, as well as approval of the manuscript. And whereas that kind of access might be helpful for some researchers, it violates research ethics. So I ended up having to do a lot more work to get access from the back end, through former employees, or through current employees at conferences, and through something we call interface ethnography, where companies hold open events to talk about their products and you're able to ask questions.
Those kinds of tools, those kinds of methods, really help paint a picture that an official tour given by the company might not.

June Grasso: So correct me if I'm wrong, but you contend that the systems the tech companies have in place to comply with privacy laws are designed to intentionally undermine our privacy.

Ari Waldman: That's exactly right, and the argument is broader than that. The argument is that the systems in place co-opt both privacy law and the people who see themselves as privacy advocates working on the inside, on compliance or on design. Essentially, the organizational structures inside companies take engineers, privacy professionals, salespeople, designers, advocates, executives, whoever they are and whatever their reasons for coming to the company, even people who care deeply about privacy going in. The systems inside the company, the control over discourse, the control over the definitions of privacy, the control over how people do their work and how design functions, the relationships between designers and lawyers and between designers and privacy professionals, the organizational barriers that companies put in front of privacy professionals, and the way they habituate privacy workers into thinking about privacy in narrow ways: with all of these things, it almost doesn't matter whether someone is a privacy advocate. The system channels all of their work toward data-extractive ends. And when a system of law, privacy law like the GDPR in Europe and all of the proposals that have been introduced in the US Congress and the states, is based on third-party compliance, when it's based on internal structures to fill out forms, to keep records, to complete privacy impact assessments, then a system that co-opts those people and those tools is a system that does not protect privacy at all. So in that sense, yes, the systems these companies have created are built to purposely undermine our privacy.
June Grasso: All right. Often we get alerts from Google and other sites saying, we value your privacy, go through these steps to help protect your information. Should we even bother with that?

Ari Waldman: Well, part of what that does is create, or maintain, the myth that protecting your privacy is your individual responsibility. It's like: you have to do the work, you have to click these buttons. Setting aside the fact that a lot of these companies don't even provide you with the option of doing that, when Google sends you an email saying "we value your privacy," it's mostly just to notify you of changes to their privacy policy that you really have no option to opt out of. It's just: here's the deal, and this is what's happening. But that's part of a larger campaign whose purpose is to make you think that privacy is something we control. Every time you go onto your iPhone and toggle a button from green to gray or gray to green, you're turning something on, you're turning something off, you're controlling where that information goes. But that is more of a performance. Just the act of clicking okay, of toggling these buttons, habituates us into thinking that privacy is something we're in control of, something we're in charge of, when in fact the options they give us are so minimal, or so complex and so difficult to actually work through. Imagine trying to manage all the privacy toggles on every app on your phone right now, or on every website you've ever visited. That's impossible to do. Yet the companies frame it in this empowering language: we're putting you in control. That kind of control is anything but control.
June Grasso: Every time the government tries to enforce privacy, let's say through the settlement agreements they've made with Facebook, Facebook still violates privacy rights and then makes another settlement agreement with the FTC. Is anyone monitoring what these companies do?

Ari Waldman: Yeah. You're absolutely right to note that Facebook is a serial violator, not only of our privacy, but of its supposed consent decrees with the FTC. That serial violation just goes to show how much FTC agreements have been mere slaps on the wrist. The FTC is a weak agency when it comes down to it. It has better appointees now under the Biden administration who hope to empower it, but for the last thirty years most of the people who have sat on the FTC have seen part of their role not as protecting privacy, but as protecting privacy only so far as it doesn't harm innovation. A lot of FTC commissioners have said that part of their job is to maintain innovation, to listen to companies when they say, oh no, we can't do this, or don't require us to do this, because it will harm innovation. And that goes very far, and I quote some of these FTC commissioners and lobbyists in the book, it goes very far to explain why the FTC has been so weak. And these consent decrees allow the company to get off scot-free, because it's basically a settlement. There's no order; there are no requirements other than complete this audit or fill out this document. But one of the things I found during my research, and Megan Gray, who used to be at the Stanford Center for Internet and Society, also found this when she reviewed these audits.
What I found was that a lot of these audit reports going to the FTC were just based on executive attestation. That means someone the company hires comes in and asks, have you complied with section 3.5 of the FTC consent decree? And the answer is: according to executive attestation, as attached in Exhibit I, the company is in compliance with this section. And then when you go to Exhibit I, all it is is an executive signing a letter saying, we're in compliance. So that system of audits and documents, which is what Julie Cohen has called tools on the periphery of the regulatory state, isn't holding these companies' feet to the fire at all. It's performative. It's privacy law theater, and companies have a strong interest in maintaining that theater, in making it look as good as they can, and in co-opting their workers and ensuring they think it's real, when in fact it's really just a show for us and for regulators.

June Grasso: So finally, what do you hope readers will get from your book?

Ari Waldman: I want readers to realize that the system is stacked against them even more than they think. I want privacy professionals to be more introspective about how their work inside tech companies is actually perpetuating data extraction. I want designers to realize that their employers are manipulating them into not caring about privacy. And I want politicians to realize that this whole system depends on companies and their lobbyists influencing them to think about privacy in narrow ways. If there's one thing I'd like to see, it's a change in discourse. I want us all to stop thinking about privacy as being about control, individual control, or about choice. That's the fodder, that's the red meat, for these tech companies. It's much better for us to think about privacy in terms of civil rights, in terms of equality and justice, in terms of the obligations of trust.
As my colleagues Woody Hartzog and Neil Richards have argued, these are more robust concepts, more social concepts, that see us as part of a larger group, as opposed to thinking that privacy is something we do against others or against the world. As soon as we stop talking about privacy in terms of control and choice, and as soon as that discourse gains prominence, this house of cards that I talk about in the book is going to start falling apart. Then we can start passing better laws, laws that don't depend on the completely captured industry of most privacy professionals.

June Grasso: Thanks, Ari. That's Ari Ezra Waldman of Northeastern University Law School. His new book is Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power. I'm June Grasso, and you're listening to Bloomberg.