Jonathan Strickland: Hey there, everybody, this is Jonathan Strickland, host of Tech Stuff. Today we have something a little different for you. We have a series called First Contact with Laurie Segall, uh, former correspondent with CNN, and if you have not listened to that show, you definitely need to check it out. She's doing amazing work, having incredible conversations with thought leaders in the tech space and the social space, and this particular episode I think is really important and one that really needs to be heard by as many folks as possible. She sits down with ACLU president Susan Herman, and together they talk about the issue of racially biased face recognition technology. And you've heard me speak on episodes of Tech Stuff in the past about biased artificial intelligence, facial recognition, you know, image recognition, that sort of stuff, and how that is a real problem in tech and it's something that we absolutely have to get our heads wrapped around and address. So, without further ado, I'm going to have that episode play in lieu of a Tech Stuff episode for today, and I highly recommend you check out First Contact. It's a great show and one that I think really dovetails nicely with Tech Stuff. So if you enjoy Tech Stuff, I think you're really gonna like First Contact as well, and we will catch you next week with new episodes of Tech Stuff. Thanks.

First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.

Laurie Segall: There's a great quote on the ACLU website: "The fact that technology now allows an individual to carry such information in his hand does not make the information any less worthy of the protection for which the Founders fought."

Susan Herman: Exactly. I like to talk about, you know, one of the whole points of the Constitution adding the Fourth Amendment, which is the protection of privacy, is they wanted to protect what was in Benjamin Franklin's desk.
Nobody should know if he was writing some things that were anti-government, and we now have that on our cell phone. So of course that's where I think that a lot of the protection of civil liberties is: applying our fundamental principles in different circumstances.

Laurie Segall: We are in a moment of reckoning as we enter an age of ubiquitous surveillance, questionable data collection practices, even algorithms that discriminate. It's minorities, especially Black and brown communities, that are disproportionately affected. Over the last months, as the nation has grappled with a conversation around police brutality, we've seen Predator drones used for aerial surveillance at protests, facial recognition technology that wrongfully accused a Black man of a crime he didn't commit. And it wasn't a coincidence. Reports say the tech is a hundred times more likely to misidentify African American and Asian people. And as COVID-19 continues to spread, there are serious questions being raised about contact tracing apps and how the data collected could be misused. These issues raise ethical questions about technology and its impact on our civil liberties, equality, and the future of our country. For Susan Herman, it is an extraordinary time to be sitting in her seat as president of the ACLU. Over the years, the American Civil Liberties Union has filed lawsuits fighting for free speech, reproductive rights, and privacy. But as technology continues to muddy the waters, the tradeoffs become more complicated. Where do we draw the line between security and privacy, and how do we prevent technological innovation from outpacing the law? I'm Laurie Segall, and this is First Contact.

Susan, thank you for being virtually with me today.

Susan Herman: Thank you for inviting me, Laurie.

Laurie Segall: Yeah, you know, I always start out these interviews with our first contact. I talk to guests about how we met, and we don't really have a first contact.
We've never met in person, but we met on an email chain, because we were going to do an interview together for something else and it fell through. So I said, you've got to come on the podcast, because you are just sitting in such an extraordinary seat at such an extraordinary moment in time. So that's our first contact.

Susan Herman: Well, thanks. It just seems to me like our first contact was total serendipity.

Laurie Segall: Yeah, exactly. So, you know, to get started: you've been the president of the ACLU since two thousand and eight, and I said this before, but, you know, what an extraordinary time to be sitting in your seat. You know, how are you feeling?

Susan Herman: Oh my, it's just sort of overwhelming, you know. As president, I'm the chair of the board, so, you know, I'm not the one doing the day-to-day work as all of the members of our staff are. But to be a member of the ACLU staff right now is just, it's mind-boggling, because we had, you know, a lot of work that we were already doing before two thousand and sixteen, with all of the states making worse and worse laws about reproductive freedom and voting rights and immigrants' rights and, you know, all sorts of other things. Then came the election, and since then we have brought a hundred and seventy-three legal actions against the Trump administration, for things like family separations and the travel ban and prohibiting trans people from serving in the military. Then in March, COVID hit, and since then we've also brought over a hundred lawsuits, including about a hundred lawsuits just about people who are incarcerated in jails and prisons and ICE detention and who are just in a hotspot. You know, they have no control over whether they can social distance, and so we've been working very hard to get vulnerable people out of those terrible situations, out of basically death traps.
Plus, COVID also led to a number of states opportunistically restricting things like freedom of abortion, declaring abortion to be a non-essential procedure so people could just wait until the pandemic is over to get an abortion, right. And voting rights has also just been a really fraught area right now, because all of the restrictions on voting and the ways in which the vote was becoming distorted have just been magnified by all the difficulties of the pandemic. So there's a lot to talk about.

Laurie Segall: So I was gonna say, what I'm hearing from this is you're sleeping really well at night. You know, there's no work to do.

Susan Herman: Almost nothing to do. The staff are just sitting around polishing their nails.

Laurie Segall: Yeah. I mean, like, take me to March, like, coronavirus hits. You have been involved in some of these monumental cases that have just shaped society and our civil liberties, like, coronavirus hits. And now, you know, we have a little bit, I don't even think we have the luxury of perspective at this point, but we have a little bit more perspective. But take me to March. Like, in your role, at this extraordinary moment, like, what was going through your head? What were you concerned about at the time?

Susan Herman: Well, you know, one of the first concerns is just you have to close the office. So the first concern is how can people do all this remotely? It increases the work and makes it more difficult to do the work. So we just had to really make sure that our technology was up to doing things. So one thing that the ACLU did was to buy new laptops for some staff people who were going to be working, and you have to worry about how the technology is working, um, which has been a question for us every time something really big hits. When the travel ban hit, there were so many people wanting to donate to the ACLU
that the website crashed. So even things like that, you know, that's, you know, like number one of how do you handle this? We have been fortunate so far that the ACLU is so well managed and we had not spent every penny that all of our donors had given us up until that point, so we have not had to lay people off, which is very fortunate because, as you were saying, there's more than enough work to do. But yeah, that's the first concern, of just, you know, how do you keep the organization up to speed and ready to do, you know, what staff members now need to be doing: an incredible amount more work. But for some of them it's, well, they're juggling a toddler and a dog.

Laurie Segall: Yeah. Can you give me a run-through of some of the cases that you've been involved in? That, correct me if I'm wrong, you started out as an intern, right, and really just worked your way up. I mean, I can imagine you've been involved, and I know you've been involved, in some pretty extraordinary cases. To give listeners some context, can you explain some of the cases that kind of stick out to you?

Susan Herman: Well, I was an intern for the ACLU back, you know, in the nineteen seventies, you know, around the time when I was in law school. And just to make sure that everybody understands, I don't actually work at the ACLU. My day job is I'm a law professor, and I don't generally work on the cases. What I'm generally doing is helping to run the organization. But I'll tell you what I think would be an interesting start, um, the first ACLU case that I actually did work on, which was while I was a law student. And this was the case.
One of my connections with the ACLU originally was that one of my law professors in the first year was connected with the New York Civil Liberties Union, and he had some clients who came to him who were graduate students at Stony Brook on Long Island, and they had just discovered they were not allowed to live together. They had rented a house together, there were six of them, and they had just discovered they weren't allowed to live together because there was an ordinance in their village, a village called Belle Terre, that prohibited more than two persons unrelated by blood, marriage, or adoption from living together. So, you know, they were pretty shocked. And it turned out that, under the law as it was at the time, by the time they were talking about this they were liable for all sorts of criminal fines and punishment. It was really very heavy stuff. So I started working on that case with my law professor, and, um, we went to a federal judge to ask for a temporary restraining order, which means that, just until we had litigated whether or not it was a constitutional thing to do to tell people who they could and couldn't live with, the village should not be allowed to either kick them out of their house or to, you know, start locking them up because, you know, they owed too many fines for having been illegal residents. So, um, the judge ended up signing the order. And as he was signing the order: one of the ways, which was actually the original way, in which our clients had discovered that they were illegal residents was that they had applied for a residents-only beach permit and they were told they couldn't have one because they were illegal residents. So the judge we had, the district judge, who was a very nice man, looked at the order we had written out and he said, well, you know, it's the summer. Don't your clients want to go to the beach while the litigation is pending?
Do you mind if I write that in, that they have to be allowed to park in the parking lot of the beach? So we said, sure, you know, that's very nice. So he wrote that in. Then, as the junior member of the team, I was sent out to our clients to show them the order and explain to them what was going on, and they gave me a tour of, you know, what the village looked like and the residents-only beach. And the minute the wheels of their car hit the parking lot, this very large, fierce-looking man comes striding across and says, what are you doing here? You're not allowed to be in this parking lot. And they all look at me, and I'm thinking, what am I, I'm, like, you know, twenty-something, I'm not very tall, what am I supposed to do with this large man who doesn't want us there in his parking lot? And then I remembered that I had a federal court order right on my person. So I kind of drew myself up, and I showed him my federal court order, and I said, well, I'm with the New York Civil Liberties Union, kind of, and I have a federal court order saying that these people are allowed to be in this parking lot and go to the beach. And he melted. And that was, I think, one of the points at which I thought, wow, you know, this is really powerful stuff.

Laurie Segall: Yeah, you saw that.

Susan Herman: That was my first ACLU case.

Laurie Segall: Yeah, exactly, that's great. And I saw, I read that maybe your earliest memories, um, of speaking up to authority involved, I think, a dispute over a book at your school library.

Susan Herman: Yeah, that's right. Even before the Belle Terre case, my first civil liberties hero was my mother. So when I was in third grade, we were doing a school play about a story called Johnny Tremain, about a boy in the American Revolution, and I thought the play was interesting, and plays don't have that many words, and we were told that this was based on the book.
So I went to my school library, my public school library, and I asked to take out the book, and the librarian said, oh, you can't take out that book, dear, that's in the boys' section. And I was surprised to find this out. I'd been reading books in the girls' section, which were all collections of fairy tales and biographies of presidents' wives, but it had never occurred to me that I wasn't allowed to take out a book from the boys' section. So I went home and I told my mother about this, just thinking, you know, that's the way things are, and she just exploded, and she called the librarian the next day and said, how dare you tell my daughter, you know, what she's not allowed to read. So the librarian told me that from then on I could take out any book I wanted, and, you know, not long after that they changed the policy for everyone. So, you know, there was another example of how, you know, you can kind of speak up to authority when they kind of tell you who to be and prevent you from making your own choices.

Laurie Segall: Were you always like that?

Susan Herman: Well, you know, that's third grade, and I feel like, yes, I think for most of us our values are formed when we're pretty young. Yeah. So, you know, seeing my mother do that, I'm sure it would have had an impact on me.

Laurie Segall: Yeah, that's such a good story. And did you, I mean, did you always know you wanted to go into law?

Susan Herman: No, I actually really didn't, because, having grown up as a woman during that era, my father was a lawyer, and he always used to talk about the fact that law was really not a good profession for women. Why would you want to do that if you could be an English teacher and have the summer off to take care of your children? So it took me a while. I graduated from college and then spent a few years doing other things and then decided to go to law school.
Laurie Segall: Well, I mean, it's so interesting now to see where you're at, um, and seeing this moment; it does feel like a moment. And I was looking at something you said about, you know, this feels like a moment we can be optimistic, because so many Americans are beginning to really understand the scope and the depth of structural racism. It certainly feels, you know, I'm based in New York City, you can just feel it right on the streets with the protests, and you hear the sirens and the helicopters, you know, as we sit here. Um, and we hear, you know, your rich history in covering and caring about these issues. What is the challenge for you guys ahead?

Susan Herman: Well, you know, the challenge on that particular subject is that this is work that we had already been doing. One of our top priorities for the past several years has been trying to break our addiction to mass incarceration, which, as everybody is now really coming to terms with, is a system that has disproportionately affected people on the basis of race and income and disability, because so many of the people who are arrested are people who are mentally ill. And our feeling is that the system has been fundamentally broken and misguided for a long time. So part of what we're trying to do with this moment is to capitalize on the fact that people want to look at what the police do. We're trying to encourage people to look beyond the police. It's not just who are the police arresting and how are they treating the people they arrest. I think behind that is the question of what do we really want to treat as a crime. When you treat all sorts of very minor misconduct as a crime, you're really setting up a situation where there are going to be more contacts, and therefore potentially more arbitrary and discriminatory contacts. So, if you think about it, Eric Garner ended up dying because he was selling single cigarettes on which the tax had not been paid. George Floyd.
The basis for that encounter was that they thought he might be passing a counterfeit twenty-dollar bill. So I think that you have to look at why we are criminalizing some of the things we criminalize, especially if you're talking about people who are mentally ill and are having problems. Do we really want the police to be the people who are the first responders to people who are having a mental health crisis, or is there some more effective way to deal with that, that would avoid putting those people into the criminal justice system, which isn't really good for anyone, and to maybe recommit, reallocate some of the resources we're using on arresting people and locking them up to actually dealing with the mental health crises, you know, mental health treatment? So instead of viewing every dysfunction as a matter of policing, why don't we spend more time reinvesting to try to prevent more dysfunction? It's sort of like the old saying, you know, if you're a hammer, everything looks like a nail. Well, you know, not every problem in our society is a problem for the criminal justice system and an occasion to arrest people and lock them up. A lot of them really should be an occasion for thinking about public health treatment, or thinking about how we want to approach homelessness, and having a lot of much deeper thoughts about how you prevent dysfunction rather than answering everything with, you know, we're going to send in the police.

Laurie Segall: It certainly seems also, like, this moment, even coming out of the pandemic, I can only imagine the mental health crisis is going to be even worse.

Susan Herman: Yeah, that could well be, um, and I think the pandemic is also showing us. Somebody asked me the other day whether the protests over policing and police brutality are related to the pandemic, and I was in a webinar and one of the smart people in the room said, oh, no, no, they're two entirely different things. But I said, what do you mean?
The same people who are being disproportionately affected by policing and police brutality are the people who are being disproportionately affected by COVID. The statistics are that people of color are much more likely to die, and there are a lot of reasons for that, having to do with underlying health and having to do with the fact that minorities and people who are not affluent don't get to work from home. They don't get to work through Zoom. They are the people who are out there on the streets being the first responders, being the people who are picking up the garbage, being the people who are stocking the supermarket shelves. And I feel like the virus is really amplifying so many of the inequities we've had in our society. And I think especially, you know, I don't know what it's like for everyone else, but I live in Brooklyn, in New York City, and it really felt like a lot of the people who were out on the street, they were out on the street because they were upset about George Floyd. But I think it was more that they recognized that George Floyd was the tip of the iceberg, and that there was just a lot going on that they really just could not tolerate any longer.

Laurie Segall: More from Susan after the break. And make sure to subscribe to First Contact on Apple Podcasts or wherever you listen so you don't miss an episode.

Putting on the tech hat: you know, I think most people probably don't think of tech when they think of the ACLU, but there's quite a bit of litigation in regards to security and privacy issues around contact tracing, surveillance, algorithmic bias, and obviously the ACLU has a hand in checks and balances in a lot of the issues that are emerging from the pandemic. You know, what are some of the tech developments that you guys are most concerned about?

Susan Herman: Well, since you were mentioning COVID and the contact tracking and tracing, I'll start with that.
So the upshot is that we are neither for nor against contact tracing. Contact tracing is something that really will contribute to public health. Our concern is not to say, no, you can't do it, or, you just go right ahead and do whatever you want. What we're concerned about is to minimize the damage to privacy, the damage to equity. Again, uh, there are a lot of concerns that we have. The other thing that we're concerned about is discrimination, again, because there are ways in which the technology could also increase pre-existing social inequities. We think that people should not be coerced into participating in testing, we think it should be voluntary, and we also think that it should be nonpunitive, because if you start having the criminal justice system enforcing whether or not people are willing to use their phone to take a test or whatever it is, you're just creating more opportunities for police interactions that will at some point be arbitrary or discriminatory. So we don't want to see rules and regulations that are meant to be public health rules, even if they really are good public health rules, become occasions for filling up the jails with the people who aren't complying. Because we've already seen, there were some statistics in New York that when you asked the police to start enforcing who's wearing a mask and who's not wearing a mask, right away it was racially, truly disproportionate in terms of who they were questioning and who they weren't questioning. So I think there are just a lot of issues there which are very much, you know, your world, because they're very much ethical issues.

Laurie Segall: Yeah. You know, um, one of the cases that I'm fascinated by, um, and, you know, honestly, I felt like it was only a matter of time until we saw this headline, and then we saw the headline: you know, a man was arrested after an algorithm wrongfully identified him.
You know, I've been covering this for so many years: AI is biased. AI is trained on, you know, on data online, which can be very racist, you know. And I think for so many years we've been having this conversation, but the question of, okay, well, what happens when it gets into the hands of the police? What happens, you know, if it gets used for policing? And so I think it's such a fascinating case. And you guys at the ACLU filed an administrative complaint with Detroit's police department over what you guys are calling the country's first known wrongful arrest involving facial recognition technology. I mean, for context, a man was arrested because he was wrongfully identified by an algorithm. The police department thought he had robbed, I believe, like, stolen watches, and he was arrested. I mean, can you talk to me about the significance of this case? I can't help but put on my tech hat and scream, you guys, this is a really big deal.

Susan Herman: Yeah, it is a really big deal. And as you're saying, Laurie, we were aware of this problem for a long time, and we've been complaining. So going back for a minute, before getting to the case you're talking about, Robert Williams: uh, the National Institute of Standards and Technology says that African American and Asian people are up to a hundred times as likely to be misidentified by facial recognition. So, yeah, that's the background problem. And so we knew that, right? You know, we knew that before the case came up in Michigan. Um, and it's not the algorithm's fault, obviously; there's something that's being put into the algorithm that, you know, that has a bias. And people tend to think that algorithms are, you know, so neutral and that we can rely on algorithms. That's what I was saying about the contact tracking and tracing: you start relying on algorithms or apps that you think are neutral, and you really have to be very wary of that.
So again, before getting to the Robert Williams case: uh, an ACLU staffer at the ACLU of Northern California had the really interesting idea of trying out Amazon's facial recognition program, Rekognition with a K, because, yeah, they were just offering this to the police: whatever, this is great, it will help you identify and see if you have somebody who matches a mug shot. Well, what they tried to do, which I thought was very clever, was they tried to match mug shots against the members of Congress. They got the facial pictures of all the members of Congress, this was in July of two thousand and eighteen, and there were twenty-eight members of Congress who were misidentified as matching the mug shots. There were twenty-eight mistakes out of that. And not only that, but the false matches were disproportionately people of color. And one of the people who was identified as matching a mug shot, and therefore, you know, probably, you know, this criminal, was civil rights legend John Lewis, you know, the guy who was beaten up on the bridge in Selma to get us all voting rights. So, yeah, we know that almost forty percent of the false matches were of people of color, even though people of color made up only twenty percent of the members of Congress. So in some ways, you know, the Robert Williams case is completely predictable. We knew that. We allowed for that to happen. It might have already happened elsewhere, but, you know, subterraneanly, in a way where we didn't see the case. But what's amazing about the Robert Williams case is that it happened right there, you know, visible to everybody, where you can just see it. So what happened was that they told him that he was being arrested because the algorithm had said that he was a match for this mug shot, and they showed him the mug shot, and he said to them, do you guys think all Black people look alike? That looks nothing like me.
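To make the shape of that test concrete, here is a minimal sketch of the kind of tally it describes, under stated assumptions: face_similarity is a hypothetical stand-in for a real face-matching API (it returns pseudo-random scores, so this only exercises the bookkeeping, not the bias of any real matcher), the 0.80 threshold mirrors the default confidence setting widely reported for Rekognition at the time, and the roster and mug shots are fabricated.

```python
# Illustrative sketch only, not the ACLU's code. Every score at or above
# the threshold counts as a false match, because no legislator is actually
# in the mug-shot set, so any "hit" is by construction a misidentification.

from collections import Counter
import random

MATCH_THRESHOLD = 0.80  # assumed default confidence threshold


def face_similarity(photo_a: str, photo_b: str) -> float:
    """Stand-in scorer: deterministic pseudo-random similarity in [0, 1)."""
    return random.Random(f"{photo_a}|{photo_b}").random()


def false_match_counts(members, mugshots):
    """members: iterable of (photo_id, group) pairs; mugshots: photo ids.
    Returns per-group counts of members with at least one false match,
    plus per-group roster totals for comparison."""
    false_matches, totals = Counter(), Counter()
    for photo, group in members:
        totals[group] += 1
        if any(face_similarity(photo, m) >= MATCH_THRESHOLD for m in mugshots):
            false_matches[group] += 1
    return false_matches, totals


if __name__ == "__main__":
    # Fabricated roster: 535 members, about twenty percent people of color,
    # run against a small fabricated mug-shot set.
    members = [(f"member_{i}.jpg", "people of color" if i % 5 == 0 else "white")
               for i in range(535)]
    mugshots = [f"mugshot_{j}.jpg" for j in range(5)]
    matches, totals = false_match_counts(members, mugshots)
    for group in totals:
        print(f"{group}: {matches[group]} false matches out of {totals[group]}")
```

With a real matcher in place of the stub, the skew described above shows up by comparing each group's share of the false matches (nearly forty percent in that test) with its share of the roster (about twenty percent).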
So, you know, it was pretty clear that if you used your eyes and looked at the picture yourself, if you didn't trust the algorithm, if you looked at the picture and this man's face, they didn't look alike. But nevertheless, he spent thirty hours in jail under some pretty miserable conditions because the algorithm said it was a match. So I think that's really important. In some ways, the fact that you know a problem exists is not as inspiring, to make people want to do something about it, as when you see it. That's what happened with all the protests about George Floyd: people could watch that horrible video. They could see it. It was recorded on the video. And here we have an actual person, not just hypothetical statistics showing something, but an actual person who did get arrested and did have a miserable time. He was arrested in front of his family. It was really traumatizing. And again, the officers involved were trusting the science more than they were trusting their own eyes, when anybody could look and see he didn't look like the picture, right?

Laurie Segall: And, you know, he wrote an op-ed in the Washington Post, and he asked the question. He said, why is law enforcement even allowed to use this technology when it obviously doesn't work? So, I guess, asking a legal scholar the question: you know, police departments all around the country are using different variations of facial recognition software. So, you know, what regulations should we see as we enter this era of algorithmic discrimination?

Susan Herman: Yeah, that's a great question. And again, long before Robert Williams turned up, we've been urging police departments not to rely on the facial recognition technology, that it was just not reliable enough to, you know, hold people's fate in its hands. The algorithms don't have hands, but for people's fate to be dependent on this facial recognition technology which was being touted.
And again, it's great if a company is doing something to make money, but if wanting to make money is your only consideration, and if you're not considering whether you are unleashing something that is really going to be disruptive of people's lives unfairly, either because it's just going to be wrong or because it's going to be wrong in a racially skewed way, I think that's just really a problem. So, um, we've been urging police departments not to buy and use the technology. And I'm sure you know Amazon has withdrawn the facial recognition technology temporarily, and they're not sure whether or not they'll bring it back. So the probability of a wrongful arrest is one thing, but when you draw the camera back and look at all the technology in the bigger picture, in addition to facial recognition, one thing that police departments and different law enforcement agencies have been doing with facial recognition is to try to see who attends a demonstration, or see who's in the crowd. So it ties not just into, like, is somebody likely to be wrongly arrested like Robert Williams because there was a false match, but it starts becoming big surveillance too: an agency has the cameras on, and then they have the facial recognition, and they're purporting to identify all the people in that crowd, so that then they can track those people. They now know that you were at the George Floyd demonstration and that person was in the anti-war demonstration. And at that point the government starts having more and more information about all of us, to the point where it feels like, instead of us controlling the government, it's like the government controls us. So I think the facial recognition is only one part of the whole tendency of technology to amplify government power, to be kind of watching what we do.

Laurie Segall: Yeah, I mean, it's interesting to hear you say that. Um.
You know, that type of technology is just a part of it, especially when it comes to this moment where people are out protesting police brutality, when people are out fighting for their civil liberties. You know, there's all sorts of technology that's being built. There are cameras being built that can recognize people in real time that police are wearing. There's all sorts of technology; this is just the beginning of it. Um, I know you mentioned Amazon put a hold on their sales of Rekognition software. Microsoft said it's not going to sell face recognition software to police departments until there are federal regulations. I know IBM said that it was going to announce a ban on general-purpose facial recognition. Is that enough? Like, what is, I guess, you know, what is the government's role here? Like, what do you think should happen? Especially since this is just, as you say, one small part of a larger issue that we're facing as a society.

Susan Herman: I think that's right, and I think that there could be, you know, government regulation, but that's not going to happen unless the public wants to urge their representatives to start controlling this. And what we've seen is that an enlightened public can make something happen even without regulation, right? So, you know, it was that the public was becoming concerned, and that's the reason why Amazon acted to withdraw this: they started being concerned that their customers were not going to be happy with them. And I think at this point that's almost more effective than government regulation. And once you have that wake-up call, then you can start having serious debates. And I think those debates have to take place in many places. They should be taking place in legislatures, where people can talk about the trade-off between privacy and mass surveillance and whatever the government is trying to accomplish: why do they need this technology? Is it really worth it?
Are there 582 00:30:34,880 --> 00:30:37,800 Speaker 1: crimes that they wouldn't be solving without it, and 583 00:30:37,840 --> 00:30:41,240 Speaker 1: are they crimes that we're concerned about solving, or do 584 00:30:41,320 --> 00:30:44,560 Speaker 1: they fall into the category of, you know, is that 585 00:30:44,680 --> 00:30:46,640 Speaker 1: something that we don't think should be a crime at all? 586 00:30:46,920 --> 00:30:49,200 Speaker 1: People are generally unaware, in terms of what the police 587 00:30:49,200 --> 00:30:52,280 Speaker 1: do, that only four to five percent of all arrests 588 00:30:52,280 --> 00:30:55,840 Speaker 1: involve crimes of violence. So when people think about, we 589 00:30:55,880 --> 00:30:59,200 Speaker 1: want to enable law enforcement to be better at catching criminals, 590 00:30:59,400 --> 00:31:02,720 Speaker 1: or we're concerned about divesting or defunding the police because 591 00:31:02,720 --> 00:31:06,000 Speaker 1: who's going to protect us from physical harm? Almost none 592 00:31:06,120 --> 00:31:08,160 Speaker 1: of what the police and law enforcement do is about 593 00:31:08,160 --> 00:31:12,040 Speaker 1: physical harm. It's a tiny percentage. Everything else that they're 594 00:31:12,040 --> 00:31:14,840 Speaker 1: doing is about this whole array of all sorts of 595 00:31:14,840 --> 00:31:17,600 Speaker 1: other things that we criminalize. And I think that in 596 00:31:17,640 --> 00:31:22,360 Speaker 1: addition to having better conversations about is there a potential 597 00:31:22,480 --> 00:31:25,160 Speaker 1: for some of these technologies that the government is using 598 00:31:25,880 --> 00:31:30,200 Speaker 1: to create arbitrary or discriminatory enforcement, I think we need 599 00:31:30,240 --> 00:31:32,920 Speaker 1: to dig deeper behind that question, in the same way 600 00:31:32,960 --> 00:31:35,440 Speaker 1: that you need to dig deeper beyond the George Floyd 601 00:31:35,760 --> 00:31:39,040 Speaker 1: murder and ask if there's something systemically wrong here. 602 00:31:39,040 --> 00:31:41,560 Speaker 1: Do you need to rethink the whole question? So when 603 00:31:41,600 --> 00:31:43,440 Speaker 1: people say, well, you know, but we need the facial 604 00:31:43,480 --> 00:31:47,120 Speaker 1: recognition technology because it helps the police solve crimes, well, okay, 605 00:31:47,160 --> 00:31:49,640 Speaker 1: but, you know, what crimes, and what are the costs? 606 00:31:50,320 --> 00:31:52,760 Speaker 1: So I think once people are educated enough and once 607 00:31:52,760 --> 00:31:56,360 Speaker 1: they realize what the nature of the problem is, kind 608 00:31:56,360 --> 00:31:59,920 Speaker 1: of what's being unleashed, they can start really being ready 609 00:32:00,160 --> 00:32:02,560 Speaker 1: to have that broader conversation. And I think it should 610 00:32:02,560 --> 00:32:04,960 Speaker 1: take place in legislatures, but I think it also should 611 00:32:04,960 --> 00:32:10,360 Speaker 1: take place, and evidently is taking place, in boardrooms at Amazon, Facebook, 612 00:32:10,400 --> 00:32:13,600 Speaker 1: and Google and Microsoft. They should be talking, and they 613 00:32:13,640 --> 00:32:16,640 Speaker 1: do sometimes, if the people demand it.
And it also 614 00:34:16,720 --> 00:34:19,400 Speaker 1: has to take place just among people, you know, among, 615 00:34:19,520 --> 00:34:22,760 Speaker 1: you know, tech communities, and people just beginning to talk 616 00:34:22,760 --> 00:34:25,600 Speaker 1: about what are our responsibilities here. Is it okay for 617 00:34:25,720 --> 00:34:29,360 Speaker 1: us to create products just to make money if we 618 00:34:29,440 --> 00:34:31,719 Speaker 1: know that there are dangers, that the products are going 619 00:34:31,720 --> 00:34:34,920 Speaker 1: to be misused, or maybe aren't reliable enough, or that 620 00:34:35,000 --> 00:34:38,960 Speaker 1: they just feed into this enormous surveillance state? So let 621 00:34:38,960 --> 00:34:41,640 Speaker 1: me compare this to an earlier moment. After nine eleven, 622 00:34:42,360 --> 00:34:44,560 Speaker 1: we had a kind of similar phenomenon: in order 623 00:34:44,600 --> 00:34:48,360 Speaker 1: to deal with catching terrorists, we changed a lot of 624 00:34:48,440 --> 00:34:51,960 Speaker 1: laws that ended up really sacrificing a lot of privacy 625 00:34:51,960 --> 00:34:55,080 Speaker 1: and allowing a lot more government surveillance. And for a 626 00:34:55,160 --> 00:34:58,240 Speaker 1: number of years that went unchallenged, and people kept saying, oh, well, 627 00:34:58,280 --> 00:35:00,000 Speaker 1: you know, if that's what we need in order 628 00:35:00,000 --> 00:35:04,120 Speaker 1: to be safe, we're willing to give up a little privacy. So, 629 00:35:04,240 --> 00:35:06,240 Speaker 1: first of all, I think people didn't think about the 630 00:35:06,280 --> 00:35:08,440 Speaker 1: fact that they weren't giving up their own privacy, they 631 00:35:08,440 --> 00:35:11,280 Speaker 1: were giving up somebody else's. And second of all, people 632 00:35:11,320 --> 00:35:15,320 Speaker 1: didn't realize how extensive the surveillance really was until Edward Snowden. 633 00:35:16,080 --> 00:35:19,240 Speaker 1: So then, after Edward Snowden came along and people realized 634 00:35:19,280 --> 00:35:22,120 Speaker 1: how the government was just scooping up tons of information 635 00:35:22,160 --> 00:35:24,880 Speaker 1: about people and just keeping it in government databases, and 636 00:35:24,920 --> 00:35:30,160 Speaker 1: started realizing the horrifying potential of all that, what happened 637 00:35:30,200 --> 00:35:32,760 Speaker 1: was that Congress made a couple of little changes to 638 00:35:32,800 --> 00:35:36,080 Speaker 1: the law. But more important, Microsoft and Google and other 639 00:35:36,080 --> 00:35:39,880 Speaker 1: places started to realize that their customers were concerned, and 640 00:35:39,920 --> 00:35:42,560 Speaker 1: they started being a little less cooperative. At the beginning, 641 00:35:42,640 --> 00:35:45,520 Speaker 1: right after nine eleven, all of the telecoms, all these 642 00:35:45,560 --> 00:35:48,120 Speaker 1: companies, were just saying to the government, you want information? Here, 643 00:35:48,200 --> 00:35:50,440 Speaker 1: take it all. Your Verizons were saying, sure, you know, here 644 00:35:50,480 --> 00:35:52,360 Speaker 1: are all the records of all our customers. Take it all. 645 00:35:52,400 --> 00:35:56,600 Speaker 1: You're keeping us safe. And I think that, to me, 646 00:35:56,760 --> 00:35:59,719 Speaker 1: the most important thing is an informed public.
That if 647 00:33:59,760 --> 00:34:02,680 Speaker 1: people can examine for themselves whether they really think 648 00:34:02,720 --> 00:34:05,400 Speaker 1: that we're being kept safe by all of this, and 649 00:34:05,480 --> 00:34:08,440 Speaker 1: really examine both the costs and the benefits in 650 00:34:08,480 --> 00:34:11,719 Speaker 1: an educated way, I think we get much better discussions. 651 00:34:11,760 --> 00:34:14,279 Speaker 1: And I think not only do you have the possibility 652 00:34:14,280 --> 00:34:17,360 Speaker 1: of getting better legislation or regulation, you also have the 653 00:34:17,400 --> 00:34:20,960 Speaker 1: possibility that private companies, you know, the 654 00:34:21,000 --> 00:34:22,759 Speaker 1: tech companies, are not going to want to do it 655 00:34:22,800 --> 00:34:25,960 Speaker 1: anymore because their customers don't want them to. Yeah. I 656 00:34:26,000 --> 00:34:28,680 Speaker 1: mean, it's hard to have an informed public and to 657 00:34:28,800 --> 00:34:31,960 Speaker 1: have these discussions, even in this current environment, to some degree. 658 00:34:32,040 --> 00:34:34,960 Speaker 1: I mean, people I think are struggling with the idea 659 00:34:35,000 --> 00:34:38,160 Speaker 1: of truth. People are, um, you know. And I remember, 660 00:34:38,239 --> 00:34:40,160 Speaker 1: by the way, I remember the Snowden leaks, like, 661 00:34:40,160 --> 00:34:43,000 Speaker 1: I remember being in the newsroom covering technology and 662 00:34:43,040 --> 00:34:46,160 Speaker 1: thinking to myself, because I rode the tech bubble all 663 00:34:46,200 --> 00:34:49,840 Speaker 1: the way up, right, and thinking, this is an extraordinary moment, 664 00:34:49,920 --> 00:34:52,960 Speaker 1: because we saw that we'd been sharing all our data, 665 00:34:53,040 --> 00:34:55,319 Speaker 1: but we saw for the first time that, you know, 666 00:34:55,360 --> 00:34:58,080 Speaker 1: the government had a lot of access to things that 667 00:34:58,160 --> 00:35:01,040 Speaker 1: we had no idea they had access to. And I 668 00:35:01,040 --> 00:35:04,000 Speaker 1: think it was a fundamental shift, and the lens on 669 00:35:04,040 --> 00:35:08,240 Speaker 1: tech companies changed at that moment, and tech companies' behavior 670 00:35:08,280 --> 00:35:11,200 Speaker 1: has changed quite a bit after that. You know, I 671 00:35:11,280 --> 00:35:13,480 Speaker 1: wonder, this moment we're sitting in, where we're having these 672 00:35:13,480 --> 00:35:17,400 Speaker 1: debates about surveillance and privacy and whatnot: these are sticky 673 00:35:17,440 --> 00:35:20,600 Speaker 1: debates, and they're very politicized, as we're heading into an election, 674 00:35:20,880 --> 00:35:24,360 Speaker 1: as we have misinformation spreading online, as a lot of 675 00:35:24,360 --> 00:35:26,600 Speaker 1: people don't know what to believe and what not to believe, 676 00:35:26,640 --> 00:35:30,360 Speaker 1: as the media landscape has changed. It certainly seems 677 00:35:30,400 --> 00:35:33,480 Speaker 1: like a harder environment to even have some 678 00:35:33,560 --> 00:35:36,440 Speaker 1: of these conversations. Well, I think in some ways it's 679 00:35:36,480 --> 00:35:38,440 Speaker 1: harder.
I think the other thing that 680 00:35:38,560 --> 00:35:41,319 Speaker 1: is a catalyst for the discussions is realizing that there's 681 00:35:41,320 --> 00:35:43,879 Speaker 1: a dimension of race to all of this. I think, 682 00:35:43,880 --> 00:35:47,719 Speaker 1: in talking about artificial intelligence and facial recognition, not many 683 00:35:47,760 --> 00:35:50,920 Speaker 1: people saw that as an issue of structural racism. You know, 684 00:35:50,960 --> 00:35:53,160 Speaker 1: that there's something wrong with how we're putting together the 685 00:35:53,160 --> 00:35:55,480 Speaker 1: algorithms, and it ends up that John Lewis is going 686 00:35:55,520 --> 00:35:58,520 Speaker 1: to be misidentified as somebody who matches a mug shot 687 00:35:58,560 --> 00:36:01,120 Speaker 1: and that Robert Williams is going to be arrested. So 688 00:36:01,200 --> 00:36:04,000 Speaker 1: I think that the fact that we now know that 689 00:36:04,000 --> 00:36:08,520 Speaker 1: that is an additional concern enables us to have richer conversations. 690 00:36:08,920 --> 00:36:11,000 Speaker 1: So we're not only talking about is there a trade-off 691 00:36:11,040 --> 00:36:14,600 Speaker 1: between security and privacy. Plus, I think the other 692 00:36:14,680 --> 00:36:16,960 Speaker 1: thing that people are feeling much more open to is 693 00:36:17,000 --> 00:36:21,000 Speaker 1: to have that deeper conversation about what are our goals 694 00:36:21,000 --> 00:36:24,600 Speaker 1: here, and if we're enabling all this government surveillance in 695 00:36:24,680 --> 00:36:27,759 Speaker 1: order to help the government to catch criminals, well, you 696 00:36:27,800 --> 00:36:30,120 Speaker 1: know, what do we mean by criminals? What crimes are 697 00:36:30,120 --> 00:36:32,279 Speaker 1: they solving? And how are they using it? You know, how 698 00:36:32,320 --> 00:36:34,680 Speaker 1: is this actually being used, and in service of 699 00:36:34,719 --> 00:36:36,880 Speaker 1: what? So I feel like in some ways, you know, 700 00:36:37,239 --> 00:36:39,520 Speaker 1: with the election coming up, I think that gives people 701 00:36:40,600 --> 00:36:43,319 Speaker 1: more impetus to want to talk about these issues, because 702 00:36:43,320 --> 00:36:46,320 Speaker 1: the elections aren't only about the president. They're also about 703 00:36:46,520 --> 00:36:49,360 Speaker 1: local prosecutors and sheriffs and the people who make the 704 00:36:49,400 --> 00:36:53,759 Speaker 1: decisions about whether to buy surveillance equipment and what they're 705 00:36:53,760 --> 00:36:56,840 Speaker 1: gonna do with their authority over the criminal justice system. 706 00:36:57,320 --> 00:36:58,920 Speaker 1: So one thing the a c l U has been 707 00:36:58,960 --> 00:37:01,560 Speaker 1: doing, in addition to everything else, is we've been very 708 00:37:01,640 --> 00:37:05,520 Speaker 1: involved in elections of prosecutors, because that's a place where 709 00:37:05,520 --> 00:37:08,200 Speaker 1: people almost never used to pay attention. You know, who 710 00:37:08,239 --> 00:37:11,000 Speaker 1: were these people running? And maybe they would vote 711 00:37:11,000 --> 00:37:13,759 Speaker 1: for somebody without really knowing what they voted for. So 712 00:37:13,800 --> 00:37:15,960 Speaker 1: what we're urging, and I think this is very much 713 00:37:16,000 --> 00:37:19,360 Speaker 1: what we're talking about with having an educated public:
we're 714 00:37:19,440 --> 00:37:23,080 Speaker 1: urging people to go to elections, or to go to debates, 715 00:37:23,320 --> 00:37:26,200 Speaker 1: to go to campaign events, to attend, I guess on Zoom 716 00:37:26,239 --> 00:37:30,400 Speaker 1: these days, and ask the candidates questions: 717 00:37:30,719 --> 00:37:32,839 Speaker 1: what would be your policy about whether or not you're 718 00:37:32,840 --> 00:37:35,920 Speaker 1: going to accept military equipment from the federal government in 719 00:37:35,920 --> 00:37:38,880 Speaker 1: your police department? Are you going to buy tanks? Are 720 00:37:38,880 --> 00:37:41,120 Speaker 1: you going to buy, you know, these horrible weapons that 721 00:37:41,520 --> 00:37:44,279 Speaker 1: are used? Is that something you would do? Are you 722 00:37:44,320 --> 00:37:46,840 Speaker 1: going to buy, you know, facial recognition software? Is that 723 00:37:46,880 --> 00:37:48,799 Speaker 1: how you would use your power if we elect you? 724 00:37:49,480 --> 00:37:53,640 Speaker 1: Um, say to the prosecutors: would you support a reduction 725 00:37:53,680 --> 00:37:57,920 Speaker 1: in cash bail and increased alternatives to incarceration? 726 00:37:58,400 --> 00:38:01,000 Speaker 1: So that's a place where, without waiting for the government 727 00:38:01,040 --> 00:38:04,920 Speaker 1: to do something, we can ourselves affect what's happening in 728 00:38:04,920 --> 00:38:10,200 Speaker 1: our communities, by encouraging candidates to think about what positions 729 00:38:10,200 --> 00:38:12,400 Speaker 1: they're taking on these different issues and letting them know 730 00:38:12,520 --> 00:38:16,040 Speaker 1: that they're gonna lose votes. The 731 00:38:16,120 --> 00:38:18,560 Speaker 1: more people are educated, the more they can tell candidates 732 00:38:18,640 --> 00:38:21,040 Speaker 1: that they'll lose votes, and this is 733 00:38:21,080 --> 00:38:24,279 Speaker 1: something that's worked in some places to encourage candidates to 734 00:38:24,800 --> 00:38:28,040 Speaker 1: take a better position. Yeah. They might never 735 00:38:28,200 --> 00:38:30,160 Speaker 1: have thought of that, but you know, once they commit themselves, 736 00:38:30,160 --> 00:38:32,040 Speaker 1: you know that that's going to be better. So there 737 00:38:32,040 --> 00:38:33,919 Speaker 1: are all sorts of ways that we can affect things. 738 00:38:37,440 --> 00:38:39,920 Speaker 1: More from Susan after the break. And make sure you 739 00:38:39,960 --> 00:38:42,080 Speaker 1: sign up for our newsletter at dot dot dot media 740 00:38:42,200 --> 00:39:00,880 Speaker 1: dot com backslash newsletter. We'll be launching this summer. Before 741 00:39:00,880 --> 00:39:03,880 Speaker 1: I move on from specifically some of the tech issues, 742 00:39:04,120 --> 00:39:08,680 Speaker 1: I have to bring up Predator drones. Right, right. You know, 743 00:39:08,880 --> 00:39:11,480 Speaker 1: the U.S. Customs and Border Protection flew a 744 00:39:11,560 --> 00:39:14,960 Speaker 1: large Predator drone over the Minneapolis protests. You know, people 745 00:39:15,000 --> 00:39:17,640 Speaker 1: were protesting police brutality and the killing of George Floyd, 746 00:39:18,080 --> 00:39:20,759 Speaker 1: and for many reasons, it almost felt symbolic.
You know, 747 00:39:20,800 --> 00:39:25,520 Speaker 1: it was raising all these questions about aerial surveillance, about 748 00:39:25,760 --> 00:39:29,960 Speaker 1: what data was being collected, where was this going? What 749 00:39:30,160 --> 00:39:33,480 Speaker 1: is your take on this? Well, you know, as you're saying, Laurie, 750 00:39:33,520 --> 00:39:36,520 Speaker 1: it really magnifies the opportunity 751 00:39:36,560 --> 00:39:38,960 Speaker 1: to gather more information, because you don't even have to 752 00:39:38,960 --> 00:39:42,040 Speaker 1: have the helicopters or whatever. So that, 753 00:39:42,200 --> 00:39:44,560 Speaker 1: of course, is a concern: just how much information 754 00:39:44,680 --> 00:39:46,560 Speaker 1: is the government gathering, what are they going to do 755 00:39:46,600 --> 00:39:48,480 Speaker 1: with it, who's going to have access to it? Will 756 00:39:48,480 --> 00:39:50,560 Speaker 1: it ever be deleted, or will it just 757 00:39:50,640 --> 00:39:53,799 Speaker 1: stay there in the government databases forever? But I think 758 00:39:53,840 --> 00:39:56,200 Speaker 1: the other thing that the Predator drone brings to mind 759 00:39:56,680 --> 00:39:58,840 Speaker 1: is a question that people were also asking, which is 760 00:39:58,880 --> 00:40:02,200 Speaker 1: about the militarization of law enforcement. We've 761 00:40:02,280 --> 00:40:05,279 Speaker 1: had for years in this country the Posse Comitatus 762 00:40:05,360 --> 00:40:07,799 Speaker 1: Act, as it's called, which says you don't want 763 00:40:07,800 --> 00:40:12,600 Speaker 1: the military doing everyday law enforcement, because that's not 764 00:40:13,000 --> 00:40:15,640 Speaker 1: our country. We don't want the military to be quote 765 00:40:15,680 --> 00:40:19,400 Speaker 1: dominating the streets, and we don't want the people who 766 00:40:19,480 --> 00:40:21,960 Speaker 1: are out protesting to be considered the enemy of the 767 00:40:22,040 --> 00:40:25,160 Speaker 1: United States; they are people who are expressing their opinions. 768 00:40:25,760 --> 00:40:28,719 Speaker 1: And so the whole idea, you know, it's one thing, 769 00:40:29,120 --> 00:40:32,520 Speaker 1: it's bad enough, if the police helicopters are flying overhead 770 00:40:32,960 --> 00:40:34,719 Speaker 1: and trying to keep track of, you know, who's in 771 00:40:34,760 --> 00:40:37,200 Speaker 1: the crowd and what the crowd is doing. But once 772 00:40:37,239 --> 00:40:41,520 Speaker 1: you start adding an element of something like military helicopters 773 00:40:41,600 --> 00:40:45,480 Speaker 1: or military drones, things that feel like we 774 00:40:45,560 --> 00:40:48,280 Speaker 1: are being treated as the enemy of the government instead 775 00:40:48,280 --> 00:40:50,760 Speaker 1: of the people who are the government, who are supposed 776 00:40:50,760 --> 00:40:53,920 Speaker 1: to be controlling the government, I think that's just, 777 00:40:54,080 --> 00:40:58,640 Speaker 1: it's a very bad paradigm. You think it's a slippery slope? Well, 778 00:40:58,680 --> 00:41:01,080 Speaker 1: it's a slippery slope unless we stop the slipping. And 779 00:41:01,120 --> 00:41:04,279 Speaker 1: as we saw with Amazon and the facial recognition, 780 00:41:04,320 --> 00:41:06,600 Speaker 1: if people say, wait a minute, yeah, I think we 781 00:41:06,640 --> 00:41:08,879 Speaker 1: can stop that stuff.
But I think if people don't 782 00:41:08,880 --> 00:41:11,440 Speaker 1: pay attention, I think we have a very slippery slope. 783 00:41:11,840 --> 00:41:13,719 Speaker 1: And that's what I've been saying about most of the 784 00:41:13,719 --> 00:41:16,920 Speaker 1: issues we've talked about, starting with the contact tracing 785 00:41:17,480 --> 00:41:20,040 Speaker 1: and the surveillance and everything else. It seems to me 786 00:41:20,160 --> 00:41:23,359 Speaker 1: that what's really important is transparency, that we should know what 787 00:41:23,400 --> 00:41:27,040 Speaker 1: the government is doing, and accountability. Back on the issue 788 00:41:27,040 --> 00:41:29,279 Speaker 1: of contact tracing, one thing that the A C L U did, 789 00:41:29,480 --> 00:41:32,359 Speaker 1: together with the A C L U of Massachusetts, is we 790 00:41:32,440 --> 00:41:36,320 Speaker 1: filed a lawsuit, actually a records request, demanding 791 00:41:36,440 --> 00:41:40,600 Speaker 1: that the government, including the CDC, release information about the 792 00:41:40,640 --> 00:41:43,319 Speaker 1: possible uses of all the location data that they would 793 00:41:43,320 --> 00:41:47,359 Speaker 1: be collecting in connection with contact tracing. Because, you know, 794 00:41:47,640 --> 00:41:49,680 Speaker 1: if you don't know what they're doing, then you 795 00:41:49,719 --> 00:41:51,799 Speaker 1: can't have a discussion about what they should be doing. 796 00:41:52,239 --> 00:41:54,040 Speaker 1: And one reason why I was bringing up all the 797 00:41:54,040 --> 00:41:56,439 Speaker 1: post-nine eleven changes of law is that I think 798 00:41:56,480 --> 00:42:00,560 Speaker 1: that the whole idea that we can't know what the government 799 00:42:00,640 --> 00:42:02,719 Speaker 1: is doing, that the government has to act in secret in 800 00:42:02,800 --> 00:42:04,960 Speaker 1: order to keep us safe or else the enemy will 801 00:42:05,000 --> 00:42:06,880 Speaker 1: be able to know what they're doing and, you know, 802 00:42:07,000 --> 00:42:10,200 Speaker 1: work around it, but the government can know everything 803 00:42:10,200 --> 00:42:13,360 Speaker 1: that we're doing, I think that just has democracy backwards. 804 00:42:14,000 --> 00:42:15,919 Speaker 1: You know, we have to be able to know what's 805 00:42:15,920 --> 00:42:18,839 Speaker 1: happening inside the government. And that applies to why are 806 00:42:18,840 --> 00:42:20,719 Speaker 1: they sending the Predator drone? What are they going to 807 00:42:20,760 --> 00:42:22,799 Speaker 1: do with the information? What does this mean? Are they 808 00:42:22,800 --> 00:42:25,160 Speaker 1: going to do it again? And it also has to 809 00:42:25,200 --> 00:42:28,400 Speaker 1: do with the contact tracking and tracing. Once they 810 00:42:28,440 --> 00:42:31,040 Speaker 1: get that data, what happens to it? Are they going 811 00:42:31,120 --> 00:42:33,000 Speaker 1: to ever erase it? You know, who do they share it with? 812 00:42:33,040 --> 00:42:35,200 Speaker 1: What are they going to do with it? And I feel, 813 00:42:35,239 --> 00:42:37,879 Speaker 1: you know, those are really important issues in a democracy, 814 00:42:37,920 --> 00:42:39,719 Speaker 1: that we just have the right to know what the 815 00:42:39,760 --> 00:42:42,080 Speaker 1: government is doing so that we can talk about it.
816 00:42:42,440 --> 00:42:44,359 Speaker 1: And I feel like to sort of say, well, this 817 00:42:44,440 --> 00:42:46,680 Speaker 1: is what the government is doing and that's really bad 818 00:42:47,120 --> 00:42:49,800 Speaker 1: and that upsets me, I think that kind of misses 819 00:42:49,800 --> 00:42:52,239 Speaker 1: the point. If the government is doing something bad, then 820 00:42:52,320 --> 00:42:55,480 Speaker 1: it is the duty of every American to find out 821 00:42:55,480 --> 00:42:57,799 Speaker 1: what they're doing and to push back. And so at 822 00:42:57,840 --> 00:42:59,680 Speaker 1: the A C L U we have a program that 823 00:42:59,680 --> 00:43:03,399 Speaker 1: we call People Power. We first invented that and used 824 00:43:03,400 --> 00:43:07,120 Speaker 1: it to explain to cities and localities all over the 825 00:43:07,160 --> 00:43:11,200 Speaker 1: country how they could fight back against draconian immigration 826 00:43:11,320 --> 00:43:14,839 Speaker 1: rules by becoming quote sanctuary cities, what their rights 827 00:43:14,880 --> 00:43:18,080 Speaker 1: actually were. We then used it for voting rights. We're 828 00:43:18,120 --> 00:43:20,319 Speaker 1: about to use it some more for voting rights. But 829 00:43:20,440 --> 00:43:22,279 Speaker 1: what we have really urged, and I hope that, you know, 830 00:43:22,360 --> 00:43:23,920 Speaker 1: some of your listeners will go to the A C 831 00:43:24,040 --> 00:43:26,239 Speaker 1: L U website and see, is what People Power is 832 00:43:26,280 --> 00:43:28,080 Speaker 1: doing in addition to what the A C L U 833 00:43:28,280 --> 00:43:30,600 Speaker 1: is doing. Because what is the A C L U doing? 834 00:43:30,640 --> 00:43:32,759 Speaker 1: That's all the staffers at home trying to, you know, 835 00:43:32,800 --> 00:43:34,799 Speaker 1: work on their new laptops while they're trying to, you know, 836 00:43:34,880 --> 00:43:38,080 Speaker 1: keep their toddlers quiet. But People Power is about what 837 00:43:38,160 --> 00:43:40,919 Speaker 1: every single person can, and I think should, be doing. 838 00:43:41,280 --> 00:43:43,840 Speaker 1: You know, if people really educate themselves and think about 839 00:43:44,200 --> 00:43:46,880 Speaker 1: the ethical issues, the costs and benefits of all this 840 00:43:47,040 --> 00:43:50,600 Speaker 1: technology, in addition to a lot of other things going on, 841 00:43:50,680 --> 00:43:52,839 Speaker 1: I think we get a lot better results if people 842 00:43:52,880 --> 00:43:55,680 Speaker 1: pay attention. Yeah, I mean, it's interesting to watch the 843 00:43:55,719 --> 00:43:59,040 Speaker 1: A C L U take on issues like surveillance, facial recognition. 844 00:43:59,080 --> 00:44:00,600 Speaker 1: I know the A C L U filed a lawsuit 845 00:44:00,640 --> 00:44:04,440 Speaker 1: against Clearview AI, which was this very controversial company 846 00:44:04,480 --> 00:44:08,160 Speaker 1: that was using biometric data. I think facial recognition technology 847 00:44:08,160 --> 00:44:10,640 Speaker 1: helped them collect something like three billion faceprints, and 848 00:44:10,640 --> 00:44:15,320 Speaker 1: they were giving access to private companies, wealthy individuals, federal, 849 00:44:15,320 --> 00:44:18,520 Speaker 1: state and local law enforcement agencies.
And you know, coming 850 00:44:18,560 --> 00:44:22,120 Speaker 1: from the tech space, it certainly feels like with some of these stories, 851 00:44:22,200 --> 00:44:25,160 Speaker 1: you just don't know what these companies are doing until 852 00:44:25,200 --> 00:44:28,120 Speaker 1: you start, you know, peeling back the layers and seeing 853 00:44:28,160 --> 00:44:30,200 Speaker 1: that all the data went to here and here, and why 854 00:44:30,239 --> 00:44:33,279 Speaker 1: did it go there? And why wasn't this disclosed? And 855 00:44:33,280 --> 00:44:38,160 Speaker 1: oftentimes it takes a watchdog to really understand where 856 00:44:38,200 --> 00:44:40,879 Speaker 1: some of this can go wrong and how it's 857 00:44:40,920 --> 00:44:44,160 Speaker 1: being used in ways that can be dangerous. 858 00:44:44,200 --> 00:44:46,960 Speaker 1: Yeah, I think that's exactly right. And 859 00:44:46,960 --> 00:44:49,040 Speaker 1: that's why I was saying before that I am concerned, before 860 00:44:49,040 --> 00:44:51,800 Speaker 1: everybody jumps on the bandwagon about let's have more contact 861 00:44:51,840 --> 00:44:54,000 Speaker 1: tracing and, you know, everybody should just be 862 00:44:54,040 --> 00:44:57,040 Speaker 1: sharing all this information, I think we have to get 863 00:44:57,080 --> 00:45:00,479 Speaker 1: a watchdog. Yeah, you're not gonna have the watchdog 864 00:45:00,520 --> 00:45:03,160 Speaker 1: telling you things unless you build a watchdog into the system. 865 00:45:03,600 --> 00:45:05,600 Speaker 1: And if everything is just, you know, a company has 866 00:45:05,640 --> 00:45:07,719 Speaker 1: invented this and is selling it to the police, or 867 00:45:07,719 --> 00:45:09,600 Speaker 1: a company has invented this and now we're all 868 00:45:09,600 --> 00:45:11,680 Speaker 1: going to buy it, if you just leave out any 869 00:45:11,680 --> 00:45:15,840 Speaker 1: sort of oversight, then you really have a tremendous potential problem. 870 00:45:15,880 --> 00:45:17,880 Speaker 1: Are there any other examples of tech where we're not 871 00:45:17,960 --> 00:45:23,399 Speaker 1: thinking about the unintended consequences for our rights or privacy yet? Well, 872 00:45:23,440 --> 00:45:26,879 Speaker 1: you know, AI is really big altogether, across, as 873 00:45:26,880 --> 00:45:30,040 Speaker 1: you're saying, many different kinds of issues. Actually, 874 00:45:30,080 --> 00:45:32,760 Speaker 1: this is not tangential to your question, 875 00:45:32,800 --> 00:45:34,759 Speaker 1: but you were asking me before about cases that I 876 00:45:34,800 --> 00:45:36,560 Speaker 1: had worked on, and there was another case that I 877 00:45:36,600 --> 00:45:39,040 Speaker 1: worked on that was about tech, where I wrote the 878 00:45:39,280 --> 00:45:41,920 Speaker 1: A C L U's brief in the Supreme Court. It 879 00:45:42,040 --> 00:45:44,759 Speaker 1: was an amicus brief, it wasn't about our client, but 880 00:45:44,880 --> 00:45:47,680 Speaker 1: it was a case called Riley versus California.
And what 881 00:45:47,800 --> 00:45:51,239 Speaker 1: the police were saying there, most law enforcement agencies, the 882 00:45:51,320 --> 00:45:53,880 Speaker 1: federal government as well as the state of California and 883 00:45:53,920 --> 00:45:57,680 Speaker 1: many other jurisdictions, was that when you arrest somebody, the 884 00:45:57,719 --> 00:45:59,880 Speaker 1: police get to do what is called a search incident 885 00:46:00,000 --> 00:46:01,920 Speaker 1: to arrest, so they get to see what you have 886 00:46:01,960 --> 00:46:03,839 Speaker 1: in your pocket. Makes some sense, right? You know, if 887 00:46:03,840 --> 00:46:05,480 Speaker 1: you have a gun in your pocket, that's a problem, 888 00:46:05,600 --> 00:46:07,239 Speaker 1: or, you know, whatever. So they get to do a 889 00:46:07,239 --> 00:46:09,879 Speaker 1: search incident to arrest. And the law had been 890 00:46:10,200 --> 00:46:12,600 Speaker 1: that if they find something in your pocket, like, that's 891 00:46:12,680 --> 00:46:15,560 Speaker 1: a container, they can search inside the container to 892 00:46:15,560 --> 00:46:18,760 Speaker 1: see if there's anything in it that could be harmful. 893 00:46:18,840 --> 00:46:21,360 Speaker 1: And in fact, there was one situation where they opened 894 00:46:21,400 --> 00:46:24,040 Speaker 1: up a cigarette package that somebody had, and, you know, 895 00:46:24,080 --> 00:46:26,719 Speaker 1: they could find a razor blade, they could find a marijuana 896 00:46:26,800 --> 00:46:30,040 Speaker 1: cigarette, whatever. So that was the law: the Supreme Court said, yes, 897 00:46:30,040 --> 00:46:32,640 Speaker 1: you're allowed to search people and search the containers that 898 00:46:32,680 --> 00:46:35,480 Speaker 1: are on them. Well, what law enforcement said was, your 899 00:46:35,520 --> 00:46:38,600 Speaker 1: cell phone is a container. When we arrest you, we 900 00:46:38,640 --> 00:46:41,239 Speaker 1: can search your cell phone. It's a container. We have 901 00:46:41,280 --> 00:46:43,640 Speaker 1: the right to search it incident to arrest. And so we 902 00:46:43,719 --> 00:46:46,160 Speaker 1: wrote a brief saying no. You know, it's 903 00:46:46,160 --> 00:46:49,040 Speaker 1: a container, but it's a container that essentially is your home, 904 00:46:49,440 --> 00:46:53,520 Speaker 1: it's your library, it's your desk. So the justifications for allowing the police 905 00:46:53,560 --> 00:46:56,160 Speaker 1: to look in your cell phone were only 906 00:46:56,440 --> 00:47:00,560 Speaker 1: really very feeble and very unlikely scenarios, things that just wouldn't 907 00:47:00,600 --> 00:47:02,600 Speaker 1: happen too often, for what the need was. You know, 908 00:47:02,640 --> 00:47:04,560 Speaker 1: maybe you had some remote thing that would go off 909 00:47:04,560 --> 00:47:07,040 Speaker 1: and would blow something up. You know, oh, come on. Yeah, 910 00:47:07,080 --> 00:47:08,640 Speaker 1: there were other ways to deal with a 911 00:47:08,680 --> 00:47:11,440 Speaker 1: lot of that. And so the Supreme Court actually agreed 912 00:47:11,520 --> 00:47:13,480 Speaker 1: with that. They said, yeah, you know, this really 913 00:47:13,600 --> 00:47:17,400 Speaker 1: is just a technological way of finding out what's in 914 00:47:17,440 --> 00:47:20,680 Speaker 1: your papers and books and records. It used to 915 00:47:20,680 --> 00:47:22,120 Speaker 1: be they were in your desk, and now they're in 916 00:47:22,200 --> 00:47:24,839 Speaker 1: your cell phone.
So that, to me, is sort 917 00:47:24,880 --> 00:47:26,960 Speaker 1: of a whole thread of what we've been talking about: 918 00:47:27,000 --> 00:47:30,400 Speaker 1: the challenges to civil liberties are different, and in 919 00:47:30,440 --> 00:47:35,280 Speaker 1: some ways greater, when the technology builds up. Yeah, there's 920 00:47:35,400 --> 00:47:37,600 Speaker 1: a great quote on the a c l U website: 921 00:47:37,640 --> 00:47:40,120 Speaker 1: the fact that technology now allows an individual to carry 922 00:47:40,160 --> 00:47:42,640 Speaker 1: such information in his hand does not make the information 923 00:47:42,719 --> 00:47:45,720 Speaker 1: any less worthy of the protection for which the founders fought. 924 00:47:46,080 --> 00:47:49,880 Speaker 1: U.S. Supreme Court Chief Justice John Roberts. Exactly. 925 00:47:50,360 --> 00:47:52,239 Speaker 1: I like to talk about, you know, one of the 926 00:47:52,239 --> 00:47:55,239 Speaker 1: whole points of the Constitution adding the Fourth Amendment, which 927 00:47:55,280 --> 00:47:57,960 Speaker 1: is the protection of privacy, is they wanted to protect 928 00:47:57,960 --> 00:48:00,920 Speaker 1: what was in Benjamin Franklin's desk. You know, nobody should 929 00:48:00,920 --> 00:48:03,120 Speaker 1: know if he was writing some things that were anti-government, 930 00:48:03,360 --> 00:48:05,319 Speaker 1: and we now have that on our cell phone, so, 931 00:48:05,360 --> 00:48:07,759 Speaker 1: of course. But that's where I think that a lot 932 00:48:07,800 --> 00:48:12,080 Speaker 1: of the protection of civil liberties is applying our fundamental 933 00:48:12,120 --> 00:48:17,120 Speaker 1: principles in different circumstances. Taking a gigantic step back, what 934 00:48:17,200 --> 00:48:19,360 Speaker 1: do you think is the biggest threat to civil liberties 935 00:48:19,400 --> 00:48:22,839 Speaker 1: in the new world order? In the new world order? Well, 936 00:48:22,880 --> 00:48:24,640 Speaker 1: you know, it's hard to just select one. It's sort 937 00:48:24,640 --> 00:48:26,560 Speaker 1: of like Sophie's Choice, you know, which is your 938 00:48:26,600 --> 00:48:30,319 Speaker 1: favorite child? Right now, I think, of our very 939 00:48:30,400 --> 00:48:33,200 Speaker 1: top priorities, mass incarceration is a big one, 940 00:48:33,239 --> 00:48:36,319 Speaker 1: because so many people's lives are just being totally disrupted, 941 00:48:36,320 --> 00:48:39,040 Speaker 1: and their families', and often the question really has to be, 942 00:48:39,120 --> 00:48:41,640 Speaker 1: for what? One thing that we're hoping is that the 943 00:48:41,680 --> 00:48:44,880 Speaker 1: work we've been doing around trying to get vulnerable people 944 00:48:45,040 --> 00:48:47,680 Speaker 1: released from prison, so that they won't get the virus 945 00:48:47,719 --> 00:48:52,360 Speaker 1: and get seriously ill and possibly die, is helping; we're hoping that 946 00:48:52,440 --> 00:48:56,280 Speaker 1: once jurisdictions see that they were able to release thousands 947 00:48:56,360 --> 00:48:59,360 Speaker 1: of people from prisons and jails and that it's not 948 00:48:59,400 --> 00:49:01,680 Speaker 1: going to cause a spike in the crime rate, it 949 00:49:01,680 --> 00:49:04,200 Speaker 1: really is a pretty safe thing to do.
We're hoping that 950 00:49:04,200 --> 00:49:07,360 Speaker 1: that's going to stick, and that in the long run we'll be 951 00:49:07,400 --> 00:49:09,319 Speaker 1: able to rethink, well, did we really need to put 952 00:49:09,360 --> 00:49:11,399 Speaker 1: all those people in prison and jail to start with? 953 00:49:11,640 --> 00:49:13,920 Speaker 1: What are we doing with the criminal justice system? So 954 00:49:14,000 --> 00:49:16,160 Speaker 1: that's really big. But the other thing that I think 955 00:49:16,239 --> 00:49:19,120 Speaker 1: is really big right now is voting rights. I 956 00:49:19,160 --> 00:49:22,239 Speaker 1: alluded to this at the beginning of our conversation, but 957 00:49:22,760 --> 00:49:25,400 Speaker 1: the premise of democracy is that the people get to 958 00:49:25,440 --> 00:49:27,839 Speaker 1: decide who should be running the government and who 959 00:49:27,840 --> 00:49:30,239 Speaker 1: should be making the policy about all these things we're 960 00:49:30,239 --> 00:49:33,279 Speaker 1: talking about here. You know, what are the regulations 961 00:49:33,320 --> 00:49:36,960 Speaker 1: about technology? What are the regulations about your reproductive freedom? 962 00:49:37,520 --> 00:49:43,200 Speaker 1: Everything else: LGBT rights. And if the people's vote 963 00:49:43,239 --> 00:49:46,000 Speaker 1: is distorted, that's a real problem, that people can't vote. 964 00:49:46,320 --> 00:49:49,160 Speaker 1: So we have litigation going on right now in, I 965 00:49:49,200 --> 00:49:53,160 Speaker 1: think it's like, thirty different states, trying to get people 966 00:49:53,200 --> 00:49:56,359 Speaker 1: the opportunity to vote. So one of the things that 967 00:49:57,160 --> 00:50:01,160 Speaker 1: has happened, in addition to all the ways that incumbents 968 00:50:01,160 --> 00:50:03,480 Speaker 1: have been using to try to protect their own seats, 969 00:50:04,160 --> 00:50:06,560 Speaker 1: is that the virus has really made it dangerous for 970 00:50:06,600 --> 00:50:10,000 Speaker 1: people to vote in public places. So we saw the 971 00:50:10,080 --> 00:50:13,480 Speaker 1: election in Wisconsin, where people were just lined up for, 972 00:50:13,640 --> 00:50:16,000 Speaker 1: you know, tremendous distances, waiting for a really long 973 00:50:16,040 --> 00:50:19,080 Speaker 1: time to vote, because Wisconsin would not allow them to 974 00:50:19,200 --> 00:50:22,880 Speaker 1: submit absentee ballots. And in fact, a study showed afterwards 975 00:50:22,880 --> 00:50:25,760 Speaker 1: that at least seventeen people got the virus from voting. 976 00:50:26,560 --> 00:50:30,040 Speaker 1: Many, many polling places were closed because, first of all, 977 00:50:30,080 --> 00:50:32,799 Speaker 1: the poll workers are generally elderly people, and the 978 00:50:32,800 --> 00:50:35,279 Speaker 1: poll workers were not able and willing to man 979 00:50:35,400 --> 00:50:38,440 Speaker 1: the polling places. There are a number of states that 980 00:50:38,480 --> 00:50:41,239 Speaker 1: don't allow absentee ballots at all unless you have a 981 00:50:41,280 --> 00:50:44,839 Speaker 1: particular situation, like if you're disabled, and the states are saying, oh, well, 982 00:50:44,880 --> 00:50:46,879 Speaker 1: you know, the fear of the virus, of getting ill, that's 983 00:50:46,920 --> 00:50:50,120 Speaker 1: not a disability.
Or, before you get an absentee ballot, 984 00:50:50,160 --> 00:50:52,880 Speaker 1: you have to have it notarized, you have to have witnesses. Now, 985 00:50:53,160 --> 00:50:56,359 Speaker 1: how is all this going to happen? So it's very 986 00:50:56,400 --> 00:50:59,480 Speaker 1: concerning that people are going to have to choose between 987 00:50:59,520 --> 00:51:02,120 Speaker 1: their health and their right to vote, and we don't 988 00:51:02,120 --> 00:51:04,600 Speaker 1: think that that should happen. And that's something that has 989 00:51:04,640 --> 00:51:07,799 Speaker 1: to be attended to right now, because if states don't 990 00:51:07,840 --> 00:51:10,839 Speaker 1: come up with plans for trying to enable everyone who 991 00:51:10,920 --> 00:51:13,839 Speaker 1: wants to vote to be able to vote, and for 992 00:51:14,040 --> 00:51:17,279 Speaker 1: counting absentee ballots, and for administering this program, if you 993 00:51:17,280 --> 00:51:19,640 Speaker 1: don't come up right now with the plan and the resources, 994 00:51:19,880 --> 00:51:21,440 Speaker 1: a lot of people are going to be left out, 995 00:51:21,480 --> 00:51:23,520 Speaker 1: and they're going to find that either, you know, they 996 00:51:23,560 --> 00:51:25,919 Speaker 1: can't vote because they're afraid to go out to the polls, 997 00:51:26,400 --> 00:51:28,400 Speaker 1: or the vote is not going to be adequately counted. 998 00:51:28,880 --> 00:51:31,520 Speaker 1: So I think that right now, making democracy work is 999 00:51:31,560 --> 00:51:34,560 Speaker 1: really one of our top projects. What is the solution 1000 00:51:34,680 --> 00:51:37,239 Speaker 1: to some of these problems? What are your tangible solutions? 1001 00:51:37,560 --> 00:51:39,960 Speaker 1: Well, one tangible solution is that more states have to 1002 00:51:40,000 --> 00:51:43,319 Speaker 1: make absentee balloting available to people without having all these 1003 00:51:43,320 --> 00:51:48,279 Speaker 1: conditions and, you know, obstacles. The other solution is, 1004 00:51:48,560 --> 00:51:52,200 Speaker 1: you were talking before about truth. A lot of the 1005 00:51:52,200 --> 00:51:56,080 Speaker 1: reason that's given, the very thin veneer of justification that's 1006 00:51:56,120 --> 00:51:58,399 Speaker 1: given, for we don't want absentee ballots, or we need 1007 00:51:58,480 --> 00:52:02,560 Speaker 1: voter ID, people to carry government-approved voter 1008 00:52:02,680 --> 00:52:04,520 Speaker 1: ID, which means you have to go down to 1009 00:52:04,560 --> 00:52:06,960 Speaker 1: a governmental office live and get your voter ID 1010 00:52:07,120 --> 00:52:10,160 Speaker 1: and show it at the polls, the excuse for a 1011 00:52:10,200 --> 00:52:12,960 Speaker 1: lot of this is that there could be fraud. Well, 1012 00:52:12,960 --> 00:52:16,160 Speaker 1: studies have shown that there's virtually no voter fraud; 1013 00:52:16,400 --> 00:52:19,040 Speaker 1: it's really a unicorn. And again, I 1014 00:52:19,040 --> 00:52:21,640 Speaker 1: think if people understood that: it might sound good, but 1015 00:52:21,680 --> 00:52:24,560 Speaker 1: it's not true. I think truth is another thing that 1016 00:52:24,600 --> 00:52:26,719 Speaker 1: we're really fighting for these days. Can you listen to 1017 00:52:26,760 --> 00:52:29,120 Speaker 1: the evidence? Can you listen to the public health officials? 1018 00:52:29,120 --> 00:52:31,680 Speaker 1: Can you listen to what's real?
I know 1019 00:52:31,760 --> 00:52:34,480 Speaker 1: for a fact that tech companies are very concerned about 1020 00:52:34,560 --> 00:52:38,759 Speaker 1: voter suppression, you know, and misinformation spreading online, this idea 1021 00:52:38,800 --> 00:52:41,920 Speaker 1: of countering truth around a lot of these very important initiatives, 1022 00:52:41,960 --> 00:52:44,640 Speaker 1: whether it's absentee ballots, whether it's showing up to the polls, 1023 00:52:44,760 --> 00:52:47,680 Speaker 1: all that kind of thing. You know, I'd be curious 1024 00:52:47,719 --> 00:52:50,399 Speaker 1: to know your take. There's a current battle happening right now: 1025 00:52:50,400 --> 00:52:54,680 Speaker 1: you have seven fifty advertisers boycotting Facebook, asking for better 1026 00:52:54,719 --> 00:52:58,680 Speaker 1: policing of hateful content. Are social media companies doing enough 1027 00:52:58,719 --> 00:53:02,400 Speaker 1: to police harmful content, especially as we head into an 1028 00:53:02,440 --> 00:53:05,239 Speaker 1: election where voter suppression and the spread of misinformation will 1029 00:53:05,280 --> 00:53:08,719 Speaker 1: most certainly be a tactic used to manipulate voters? Well, 1030 00:53:08,800 --> 00:53:11,360 Speaker 1: let me actually break your question down into two different parts, 1031 00:53:11,360 --> 00:53:13,520 Speaker 1: because you started by talking about the concern about 1032 00:53:13,600 --> 00:53:16,560 Speaker 1: voter suppression. I think one thing that everybody should be 1033 00:53:16,600 --> 00:53:20,200 Speaker 1: doing is to increase awareness of what is a fair 1034 00:53:20,239 --> 00:53:24,000 Speaker 1: way to improve access to the ballot for everybody. And 1035 00:53:24,040 --> 00:53:26,160 Speaker 1: some of those things are tech solutions. We've had tech 1036 00:53:26,239 --> 00:53:29,640 Speaker 1: solutions for years that are available and not widely enough used: 1037 00:53:30,080 --> 00:53:33,560 Speaker 1: how to enable differently abled people to vote. Can 1038 00:53:33,600 --> 00:53:36,920 Speaker 1: blind people vote, do they have the technology? So there are 1039 00:53:36,960 --> 00:53:39,080 Speaker 1: a lot of places where we need the tech community, 1040 00:53:39,320 --> 00:53:42,480 Speaker 1: and we need everybody, to find out how you vote, 1041 00:53:42,560 --> 00:53:44,600 Speaker 1: to find out how voting can be made easier, and 1042 00:53:44,640 --> 00:53:47,480 Speaker 1: to let people know what the rules for voting are 1043 00:53:47,480 --> 00:53:49,040 Speaker 1: where they live. So one thing the a c l 1044 00:53:49,120 --> 00:53:51,080 Speaker 1: U is doing is we have on our website Know 1045 00:53:51,080 --> 00:53:54,000 Speaker 1: Your Rights, you know what your voting regulations are, 1046 00:53:54,560 --> 00:53:56,400 Speaker 1: and that's something that I think people really have to 1047 00:53:56,440 --> 00:53:59,359 Speaker 1: start thinking a lot about, and to let all 1048 00:53:59,400 --> 00:54:01,839 Speaker 1: their communities, all their friends and family, know about 1049 00:54:01,880 --> 00:54:04,360 Speaker 1: the importance of voting and what they have 1050 00:54:04,480 --> 00:54:06,400 Speaker 1: to do to vote, and to urge them to just 1051 00:54:06,440 --> 00:54:08,759 Speaker 1: get out and vote in whatever form that's going to take. 1052 00:54:09,200 --> 00:54:12,840 Speaker 1: So I think that's really important.
In terms of disinformation 1053 00:54:13,000 --> 00:54:17,759 Speaker 1: on social media, people talk about the First Amendment and 1054 00:54:17,800 --> 00:54:20,400 Speaker 1: whether, you know, there's a First Amendment problem with Facebook 1055 00:54:20,719 --> 00:54:23,479 Speaker 1: telling you what you can't do. Well, there isn't, because 1056 00:54:23,520 --> 00:54:26,680 Speaker 1: the First Amendment only applies to the government, so you 1057 00:54:26,719 --> 00:54:28,880 Speaker 1: don't have a First Amendment right to say whatever you 1058 00:54:28,920 --> 00:54:31,680 Speaker 1: want on Facebook. However, I have to say that, you 1059 00:54:31,719 --> 00:54:34,680 Speaker 1: know, we don't regard that issue as altogether a 1060 00:54:34,760 --> 00:54:37,640 Speaker 1: simple issue, that Facebook should be telling everybody what they 1061 00:54:37,719 --> 00:54:40,560 Speaker 1: can't say, because even though the First Amendment does not 1062 00:54:40,640 --> 00:54:44,080 Speaker 1: apply to private companies, there's still a tremendous value to 1063 00:54:44,120 --> 00:54:47,560 Speaker 1: free speech. And there are a number of examples, which, 1064 00:54:47,600 --> 00:54:50,760 Speaker 1: you know, we've come up with, of people who 1065 00:54:50,800 --> 00:54:53,960 Speaker 1: have had speech suppressed for bad reasons. I'll give you 1066 00:54:54,000 --> 00:54:57,120 Speaker 1: one example. There was an African American woman 1067 00:54:57,680 --> 00:54:59,960 Speaker 1: who posted something on Twitter, and she got all 1068 00:55:00,000 --> 00:55:04,359 Speaker 1: these horrible racist responses, and she posted a screenshot 1069 00:55:04,480 --> 00:55:06,560 Speaker 1: of the responses that she got to show people what 1070 00:55:06,680 --> 00:55:09,080 Speaker 1: she was up against, and Twitter took it down because 1071 00:55:09,080 --> 00:55:13,000 Speaker 1: it included racist words, which, you know, okay, 1072 00:55:13,080 --> 00:55:16,080 Speaker 1: kind of was the point. There was another: an 1073 00:55:16,360 --> 00:55:20,000 Speaker 1: A C L U lawyer wrote about a statue in Kansas that 1074 00:55:20,160 --> 00:55:23,799 Speaker 1: was a topless statue, a woman who was bare-breasted, 1075 00:55:24,080 --> 00:55:27,319 Speaker 1: and whatever the locality was in Kansas decided to 1076 00:55:27,360 --> 00:55:30,080 Speaker 1: take it down, because they considered that 1077 00:55:30,160 --> 00:55:32,400 Speaker 1: to be objectionable. So the A C L U lawyer 1078 00:55:32,400 --> 00:55:34,680 Speaker 1: who was challenging whether or not the, I think it 1079 00:55:34,760 --> 00:55:37,759 Speaker 1: was, city could take it down posted a 1080 00:55:37,760 --> 00:55:40,160 Speaker 1: picture of the statue, and it wasn't Twitter, 1081 00:55:40,239 --> 00:55:42,680 Speaker 1: I think it was Facebook, and that was taken down on 1082 00:55:42,719 --> 00:55:44,880 Speaker 1: the ground that it was obscene, so she couldn't post 1083 00:55:45,239 --> 00:55:47,520 Speaker 1: the picture of the thing she was writing about. So we 1084 00:55:47,640 --> 00:55:51,200 Speaker 1: think that social media control is really a two-edged sword. 1085 00:55:51,880 --> 00:55:54,040 Speaker 1: What I liked is, at one point Facebook had a 1086 00:55:54,040 --> 00:55:56,800 Speaker 1: protocol about what's true and what isn't true, 1087 00:55:57,280 --> 00:55:59,120 Speaker 1: and what they did was they gave you a flag.
1088 00:55:59,600 --> 00:56:02,760 Speaker 1: So if they were concerned that something that was said wasn't true, 1089 00:56:03,160 --> 00:56:05,839 Speaker 1: they would have a neutral fact checker check it, and 1090 00:56:05,880 --> 00:56:08,080 Speaker 1: then if it didn't hold up, they would put 1091 00:56:08,080 --> 00:56:10,520 Speaker 1: a little flag over it and say, this has been questioned, 1092 00:56:10,560 --> 00:56:12,319 Speaker 1: and you could click on the flag and you could 1093 00:56:12,360 --> 00:56:14,640 Speaker 1: see why it was questioned. But they didn't just take 1094 00:56:14,680 --> 00:56:17,440 Speaker 1: it down. So, you know, I agree that, you know, 1095 00:56:17,480 --> 00:56:20,880 Speaker 1: disinformation is a tremendous problem, but I think that the 1096 00:56:20,960 --> 00:56:23,560 Speaker 1: idea that the solution is to ask the tech companies to 1097 00:56:23,640 --> 00:56:26,399 Speaker 1: decide what we should and shouldn't see, yeah, I don't 1098 00:56:26,400 --> 00:56:29,360 Speaker 1: think that's so great either. And certainly they should not 1099 00:56:29,440 --> 00:56:32,400 Speaker 1: be doing it without a lot of transparency and accountability. 1100 00:56:32,440 --> 00:56:34,840 Speaker 1: If they're going to be taking things down, they should 1101 00:56:34,840 --> 00:56:38,239 Speaker 1: tell us what their protocols are, and, you know, there 1102 00:56:38,239 --> 00:56:42,040 Speaker 1: should be more public discussion about where the balance is there. Yeah, 1103 00:56:42,040 --> 00:56:44,239 Speaker 1: it certainly seems like the protocols change quite a bit, 1104 00:56:44,360 --> 00:56:47,239 Speaker 1: especially having covered tech for this many years. It 1105 00:56:47,239 --> 00:56:50,040 Speaker 1: certainly seems like Facebook changes it, Twitter changes it, and 1106 00:56:50,120 --> 00:56:52,879 Speaker 1: oftentimes it depends on public pressure. I'm curious to see 1107 00:56:52,920 --> 00:56:56,320 Speaker 1: what happens with all these advertisers boycotting. I think, personally, 1108 00:56:56,360 --> 00:56:58,120 Speaker 1: I have a feeling it won't impact the bottom line 1109 00:56:58,200 --> 00:57:00,879 Speaker 1: much and they'll go back to business as normal. But 1110 00:57:00,880 --> 00:57:03,239 Speaker 1: who knows. You know, I do know that Zuckerberg 1111 00:57:03,320 --> 00:57:06,839 Speaker 1: cares deeply about his employees, but they've 1112 00:57:06,840 --> 00:57:09,719 Speaker 1: been kind of up against, you know, public scrutiny for 1113 00:57:09,719 --> 00:57:12,240 Speaker 1: a very long time. But it certainly is interesting, 1114 00:57:12,360 --> 00:57:16,240 Speaker 1: especially when the stakes get higher and disinformation can go further, 1115 00:57:16,360 --> 00:57:19,520 Speaker 1: and especially as we get closer to an election, it 1116 00:57:19,600 --> 00:57:24,919 Speaker 1: certainly feels like everyone feels more triggered around it. Yeah, well, 1117 00:57:24,920 --> 00:57:27,080 Speaker 1: you know, one of the classic statements about the First 1118 00:57:27,120 --> 00:57:29,800 Speaker 1: Amendment is that in the marketplace of ideas, the best 1119 00:57:29,840 --> 00:57:33,760 Speaker 1: antidote to bad speech is more speech, right? So, you know, suppression: 1120 00:57:33,760 --> 00:57:35,920 Speaker 1: I think we always have to worry, every time somebody's 1121 00:57:35,960 --> 00:57:39,880 Speaker 1: censoring and suppressing, who are we giving that power to?
1122 00:57:40,320 --> 00:57:42,440 Speaker 1: You know, nearing a close, because we don't have you 1123 00:57:42,560 --> 00:57:44,840 Speaker 1: for too much longer: I saw that you gave a 1124 00:57:44,880 --> 00:57:49,080 Speaker 1: talk, um, A Democrat and a Republican Walk into a Bar, 1125 00:57:49,640 --> 00:57:52,080 Speaker 1: and you were saying that it seems like these days Democrats 1126 00:57:52,120 --> 00:57:54,840 Speaker 1: and Republicans can't really agree on anything, but we all 1127 00:57:54,880 --> 00:57:58,480 Speaker 1: need to agree on fundamental American principles like due process, 1128 00:57:58,520 --> 00:58:04,040 Speaker 1: equality and freedom of conscience. So is that possible? Do 1129 00:58:04,080 --> 00:58:06,000 Speaker 1: you believe, are you, are you an optimist? Do you 1130 00:58:06,040 --> 00:58:10,400 Speaker 1: believe that in this current environment, is that possible? Well, 1131 00:58:10,440 --> 00:58:12,560 Speaker 1: I think that's a great wrap-up question. 1132 00:58:13,160 --> 00:58:15,720 Speaker 1: So that speech I gave at the Central Arkansas Library, 1133 00:58:16,480 --> 00:58:18,720 Speaker 1: my chief point, as you're saying, is I think 1134 00:58:18,760 --> 00:58:23,680 Speaker 1: that people have to be able to agree on neutral principles. 1135 00:58:24,280 --> 00:58:27,440 Speaker 1: The Constitution was designed not to say what we're going 1136 00:58:27,480 --> 00:58:30,440 Speaker 1: to do about everything. It was designed to have everybody 1137 00:58:30,440 --> 00:58:33,960 Speaker 1: have a fair opportunity to be part of the process 1138 00:58:34,040 --> 00:58:36,360 Speaker 1: of deciding what we're going to do. So it sets 1139 00:58:36,400 --> 00:58:39,000 Speaker 1: up all these democratic structures where we get to vote 1140 00:58:39,040 --> 00:58:40,920 Speaker 1: for the people who are the policy makers, and we 1141 00:58:40,960 --> 00:58:44,120 Speaker 1: all get to decide. But the underlying 1142 00:58:44,160 --> 00:58:46,600 Speaker 1: principle is that everybody should have a fair chance, and, you know, 1143 00:58:46,920 --> 00:58:49,320 Speaker 1: the principles should be neutral. Everyone should get to vote. 1144 00:58:49,640 --> 00:58:51,960 Speaker 1: It's not like, you know, if you're a Democrat, your 1145 00:58:52,000 --> 00:58:54,400 Speaker 1: vote doesn't count in this area, and if you're a Republican, 1146 00:58:54,440 --> 00:58:56,600 Speaker 1: your vote doesn't count in that area; that's 1147 00:58:56,640 --> 00:59:01,280 Speaker 1: not fair. And the basic ideas of the freedom of speech, 1148 00:59:01,320 --> 00:59:04,840 Speaker 1: freedom of religion, they're all 1149 00:59:04,880 --> 00:59:07,680 Speaker 1: manifestations of the Golden Rule: that if I want the 1150 00:59:07,680 --> 00:59:10,120 Speaker 1: ability to just choose my own religion and decide what 1151 00:59:10,200 --> 00:59:12,520 Speaker 1: religion I'm going to practice, I have to respect your 1152 00:59:12,640 --> 00:59:14,960 Speaker 1: right to make a different choice and have your own religion, 1153 00:59:15,200 --> 00:59:17,480 Speaker 1: because that's the Golden Rule. If I want to say 1154 00:59:17,520 --> 00:59:20,120 Speaker 1: something that's unpopular, I have to respect your right to 1155 00:59:20,160 --> 00:59:22,640 Speaker 1: say something that's unpopular.
And if I want to be 1156 00:59:22,720 --> 00:59:25,200 Speaker 1: treated fairly and not locked away for, you know, doing 1157 00:59:25,240 --> 00:59:27,760 Speaker 1: something minor and never given a fair trial, I have 1158 00:59:27,840 --> 00:59:30,120 Speaker 1: to respect your right to have the same thing happen 1159 00:59:30,120 --> 00:59:33,320 Speaker 1: to you. All those fundamental principles are 1160 00:59:33,360 --> 00:59:35,880 Speaker 1: things that we really all should agree on. I think 1161 00:59:35,880 --> 00:59:39,120 Speaker 1: people get into arguing and assuming that they can never 1162 00:59:39,200 --> 00:59:42,840 Speaker 1: agree on the principles because they're differing on what they 1163 00:59:42,840 --> 00:59:45,440 Speaker 1: think the results should be. And I think part 1164 00:59:45,520 --> 00:59:47,760 Speaker 1: of the point of civil liberties is that it's all 1165 00:59:47,800 --> 00:59:50,120 Speaker 1: about process; it's not about results. The ACLU 1166 00:59:50,160 --> 00:59:53,240 Speaker 1: is nonpartisan. We don't try to get Republicans elected, 1167 00:59:53,240 --> 00:59:55,920 Speaker 1: we don't try to get Democrats elected. We don't favor 1168 00:59:56,000 --> 01:00:00,360 Speaker 1: or disfavor individual politicians or individual parties, but we 1169 01:00:00,480 --> 01:00:03,960 Speaker 1: favor the idea that there should be neutral principles that everybody 1170 01:00:04,000 --> 01:00:06,840 Speaker 1: can agree to, to say, okay, here's what's fair. And 1171 01:00:06,880 --> 01:00:09,640 Speaker 1: the analogy I used in that talk at the Central 1172 01:00:09,760 --> 01:00:13,560 Speaker 1: Arkansas Library: it was one of the nights during the 1173 01:00:13,640 --> 01:00:15,920 Speaker 1: World Series, but fortunately not a night where there was 1174 01:00:15,920 --> 01:00:19,040 Speaker 1: a game, so people were able to come. And I said, okay, 1175 01:00:19,200 --> 01:00:22,280 Speaker 1: so what happens before a baseball game is that everybody 1176 01:00:22,360 --> 01:00:25,720 Speaker 1: has agreed on the underlying rules, and everyone agrees that 1177 01:00:25,720 --> 01:00:29,480 Speaker 1: your umpires, your referees, in any sport, should be neutral. 1178 01:00:30,320 --> 01:00:32,640 Speaker 1: And you don't want somebody who's partisan. If they were 1179 01:00:32,680 --> 01:00:34,960 Speaker 1: favoring one team, you'd get rid of them. All 1180 01:00:35,000 --> 01:00:37,360 Speaker 1: sports fans could agree to that. You know, maybe there 1181 01:00:37,360 --> 01:00:39,880 Speaker 1: would be a few who would be just so, you know, Machiavellian 1182 01:00:39,920 --> 01:00:43,800 Speaker 1: that they would rather have the biased umpire who always 1183 01:00:43,880 --> 01:00:45,760 Speaker 1: rules for their side. But I think sports fans can 1184 01:00:45,800 --> 01:00:48,600 Speaker 1: agree on what you really want for a fair game: 1185 01:00:48,640 --> 01:00:50,600 Speaker 1: you want a fair game, you want everyone to agree 1186 01:00:50,640 --> 01:00:53,440 Speaker 1: on the principles beforehand. And I think that if we 1187 01:00:53,480 --> 01:00:56,000 Speaker 1: could sit down in small groups around the country and 1188 01:00:56,040 --> 01:00:59,800 Speaker 1: really talk about what the fundamental principles are, I am 1189 01:00:59,840 --> 01:01:01,880 Speaker 1: enough of a patriot to think we actually could 1190 01:01:01,880 --> 01:01:03,800 Speaker 1: agree about a lot.
And let me give you an 1191 01:01:03,840 --> 01:01:07,440 Speaker 1: example of why I think there's some basis for hope. 1192 01:01:07,960 --> 01:01:11,200 Speaker 1: Maybe not optimism, but certainly hope. We were talking about 1193 01:01:11,280 --> 01:01:14,280 Speaker 1: voting rights. So one of the major problems is gerrymandering, 1194 01:01:14,600 --> 01:01:17,760 Speaker 1: the way, when a party is in power, they try 1195 01:01:17,800 --> 01:01:20,760 Speaker 1: to distort all the districts and they try to stack 1196 01:01:20,840 --> 01:01:23,040 Speaker 1: the deck so that their party will remain in power. 1197 01:01:23,880 --> 01:01:26,280 Speaker 1: Or if the party in power in a particular state 1198 01:01:26,360 --> 01:01:28,720 Speaker 1: thinks it's to their advantage to not have that many 1199 01:01:28,720 --> 01:01:31,240 Speaker 1: people vote, they try to make it harder for new voters to register 1200 01:01:31,640 --> 01:01:36,160 Speaker 1: to vote, etcetera. Uh, the 1201 01:01:36,160 --> 01:01:38,560 Speaker 1: ACLU and a number of other 1202 01:01:38,680 --> 01:01:41,440 Speaker 1: organizations working in coalition with us have had a fair 1203 01:01:41,480 --> 01:01:45,200 Speaker 1: amount of success doing ballot initiatives, going to the people 1204 01:01:45,280 --> 01:01:48,720 Speaker 1: of the state in states like Michigan and Nevada and 1205 01:01:48,800 --> 01:01:51,880 Speaker 1: Missouri and Florida, where we were part of getting 1206 01:01:51,920 --> 01:01:54,880 Speaker 1: Amendment 4 passed, which gave the vote back to 1207 01:01:54,960 --> 01:01:57,080 Speaker 1: people who had been convicted of a felony at some 1208 01:01:57,160 --> 01:02:00,160 Speaker 1: point. And when you ask 1209 01:02:00,240 --> 01:02:02,280 Speaker 1: the people of the state, you can get a majority, 1210 01:02:02,440 --> 01:02:04,760 Speaker 1: sometimes a supermajority, of people who say no, we 1211 01:02:04,840 --> 01:02:07,720 Speaker 1: want the rules to be fair. Who doesn't want the 1212 01:02:07,800 --> 01:02:10,960 Speaker 1: rules to be fair? Legislators who are 1213 01:02:11,000 --> 01:02:13,160 Speaker 1: incumbents and who want to keep their seats even if 1214 01:02:13,200 --> 01:02:16,080 Speaker 1: it takes unfair procedures to do it. So that's a 1215 01:02:16,480 --> 01:02:19,280 Speaker 1: real problem we have right now: the incumbents, 1216 01:02:19,320 --> 01:02:21,800 Speaker 1: the people who are trying to maintain power and not 1217 01:02:21,920 --> 01:02:25,160 Speaker 1: allow any sort of regime change, are pulling all the levers. 1218 01:02:25,680 --> 01:02:27,920 Speaker 1: But I think the chief grounds for 1219 01:02:27,960 --> 01:02:30,840 Speaker 1: optimism is that when you go to the American people 1220 01:02:30,920 --> 01:02:33,200 Speaker 1: themselves and say, well, do you want a fair system, 1221 01:02:33,320 --> 01:02:34,720 Speaker 1: or do you want a system where you think your 1222 01:02:34,800 --> 01:02:37,560 Speaker 1: side is more likely to win? When you talk to them 1223 01:02:37,600 --> 01:02:39,200 Speaker 1: about that, I think that you're going to get 1224 01:02:39,200 --> 01:02:40,880 Speaker 1: them to say they would really like to see a 1225 01:02:40,920 --> 01:02:45,040 Speaker 1: fair system, and that is the promise of America.
1226 01:02:45,280 --> 01:02:47,720 Speaker 1: Last question: you have taught at Brooklyn Law School since... 1227 01:02:48,840 --> 01:02:51,000 Speaker 1: what is the lesson your students will take from this 1228 01:02:51,080 --> 01:02:55,880 Speaker 1: moment in history? Well, I know there are lots of lessons, 1229 01:02:55,920 --> 01:02:58,680 Speaker 1: but if you could extract it, what is the lesson 1230 01:02:58,720 --> 01:03:01,919 Speaker 1: your students will take from this moment in history? Well, 1231 01:03:01,960 --> 01:03:05,720 Speaker 1: you know, in an individual setting, one thing I'm doing 1232 01:03:05,760 --> 01:03:07,800 Speaker 1: for the fall is I am preparing a course that 1233 01:03:07,800 --> 01:03:11,240 Speaker 1: I'm calling COVID nineteen and the Constitution. So what we're 1234 01:03:11,240 --> 01:03:12,920 Speaker 1: gonna do in this seminar is we're going to be 1235 01:03:12,960 --> 01:03:15,320 Speaker 1: looking at the ways in which the Constitution has been 1236 01:03:15,400 --> 01:03:17,440 Speaker 1: challenged and see, you know, how well it holds up. 1237 01:03:17,440 --> 01:03:19,760 Speaker 1: What does the Constitution have to say about whether you 1238 01:03:19,760 --> 01:03:22,800 Speaker 1: can quarantine people, and whether you can allow people to 1239 01:03:22,840 --> 01:03:24,560 Speaker 1: be at a religious assembly but not to go to 1240 01:03:24,600 --> 01:03:27,560 Speaker 1: a protest, etcetera, etcetera. So I think there are a 1241 01:03:27,560 --> 01:03:29,600 Speaker 1: lot of interesting things there which I think are very 1242 01:03:29,640 --> 01:03:32,919 Speaker 1: much of this particular moment. But big picture, what I would 1243 01:03:32,920 --> 01:03:36,960 Speaker 1: like the students to take away, the constitutional law students 1244 01:03:37,040 --> 01:03:39,680 Speaker 1: especially, is essentially what I just said to you: that 1245 01:03:39,760 --> 01:03:43,400 Speaker 1: the Constitution is about process. It's not about results. It's 1246 01:03:43,400 --> 01:03:45,720 Speaker 1: not about, you know, you're a Republican and you're a Democrat, 1247 01:03:45,760 --> 01:03:48,320 Speaker 1: and we have two different countries depending on what your 1248 01:03:48,320 --> 01:03:51,440 Speaker 1: party is. I think that we have one country and 1249 01:03:51,480 --> 01:03:54,400 Speaker 1: it's all about a neutral process, for very good reasons, 1250 01:03:54,480 --> 01:03:56,439 Speaker 1: and I would like people to think more about that. 1251 01:03:56,840 --> 01:03:59,840 Speaker 1: After my speech at the Central Arkansas Library, I had 1252 01:04:00,600 --> 01:04:02,760 Speaker 1: two examples of people who talked to me. One guy 1253 01:04:02,800 --> 01:04:05,400 Speaker 1: came up to me and said, I'm the Republican who 1254 01:04:05,440 --> 01:04:09,960 Speaker 1: walked into that bar, and he said, you know, you're 1255 01:04:09,960 --> 01:04:11,960 Speaker 1: making a lot of sense to me. And then there 1256 01:04:12,000 --> 01:04:14,240 Speaker 1: was another guy who talked to me who was a Democrat. 1257 01:04:14,280 --> 01:04:16,360 Speaker 1: He said, you know, I never really thought about that, 1258 01:04:16,360 --> 01:04:18,600 Speaker 1: that maybe it's not right if we're only trying to win. 1259 01:04:18,800 --> 01:04:20,680 Speaker 1: I never thought about, you know, that's not what 1260 01:04:20,720 --> 01:04:23,760 Speaker 1: we do in sports.
And that's what I'd like people 1261 01:04:23,800 --> 01:04:25,440 Speaker 1: to think about. You know, do you really want to 1262 01:04:25,480 --> 01:04:27,320 Speaker 1: do things that are only about how you think it's 1263 01:04:27,320 --> 01:04:29,920 Speaker 1: going to come out, and cheat and destroy the system 1264 01:04:29,960 --> 01:04:31,720 Speaker 1: and, you know, put a thumb on the scale and, 1265 01:04:32,040 --> 01:04:34,800 Speaker 1: you know, stack the deck in order to make things 1266 01:04:34,840 --> 01:04:37,680 Speaker 1: come out to what your preferred result is in the 1267 01:04:37,720 --> 01:04:40,480 Speaker 1: short run? Or, long term, is that just a 1268 01:04:40,520 --> 01:04:44,120 Speaker 1: really bad idea because it's just totally inconsistent, you know 1269 01:04:44,160 --> 01:04:46,960 Speaker 1: (we've just come from the Fourth of July), totally inconsistent 1270 01:04:47,000 --> 01:04:50,320 Speaker 1: with the premises on which we would like to believe 1271 01:04:50,360 --> 01:04:53,920 Speaker 1: our country was founded? Does technology throw a wrench in 1272 01:04:53,920 --> 01:04:56,720 Speaker 1: the system? I mean, it does. It does create lots 1273 01:04:56,720 --> 01:05:00,160 Speaker 1: of things you can't control, and it's 1274 01:05:00,000 --> 01:05:02,560 Speaker 1: always, you know, a new environment. So, you know, 1275 01:05:02,640 --> 01:05:05,960 Speaker 1: a different kind of example: we were talking about technology and surveillance, 1276 01:05:06,000 --> 01:05:08,520 Speaker 1: where of course technology has enabled a whole lot of 1277 01:05:08,520 --> 01:05:11,560 Speaker 1: surveillance that we then have to deal with. But technology 1278 01:05:11,640 --> 01:05:14,520 Speaker 1: also enabled a whole lot of new marketplaces of ideas. 1279 01:05:15,200 --> 01:05:16,600 Speaker 1: So the ACLU did a lot of 1280 01:05:16,640 --> 01:05:19,720 Speaker 1: litigation a few decades ago on applying First 1281 01:05:19,720 --> 01:05:23,160 Speaker 1: Amendment principles to the Internet, right, you know, because the 1282 01:05:23,160 --> 01:05:26,320 Speaker 1: government wanted to censor what was on the Internet because, you know, 1283 01:05:26,360 --> 01:05:28,960 Speaker 1: a child might see it. Yeah. And so, you know, with 1284 01:05:29,040 --> 01:05:33,360 Speaker 1: every new generation of technology, there are new challenges about 1285 01:05:33,400 --> 01:05:36,880 Speaker 1: how you apply our principles like privacy and free speech, 1286 01:05:37,200 --> 01:05:41,240 Speaker 1: et cetera, but the principles remain the same. 1287 01:05:53,760 --> 01:05:56,160 Speaker 1: I hope everyone is doing well in these strange and 1288 01:05:56,240 --> 01:06:00,440 Speaker 1: surreal times and adjusting to the new normal. Most important, 1289 01:06:00,560 --> 01:06:04,400 Speaker 1: I hope you're staying healthy and somewhat sane. Follow along 1290 01:06:04,440 --> 01:06:07,160 Speaker 1: on our social media. I'm at Laurie Siegel on Twitter 1291 01:06:07,240 --> 01:06:10,320 Speaker 1: and Instagram, and the show is at First Contact Podcasts 1292 01:06:10,320 --> 01:06:13,439 Speaker 1: on Instagram. On Twitter, we're at First Contact Pod, 1293 01:06:13,960 --> 01:06:16,040 Speaker 1: and for even more from dot dot dot, sign up 1294 01:06:16,040 --> 01:06:18,520 Speaker 1: for our newsletter at dot dot dot media dot com 1295 01:06:18,520 --> 01:06:21,480 Speaker 1: backslash newsletter.
And if you like what you heard, leave 1296 01:06:21,520 --> 01:06:24,280 Speaker 1: us a review on Apple Podcasts or wherever you listen. 1297 01:06:24,440 --> 01:06:30,480 Speaker 1: We really appreciate it. First Contact is a production of 1298 01:06:30,520 --> 01:06:33,520 Speaker 1: dot dot dot Media, executive produced by Laurie Siegel and 1299 01:06:33,600 --> 01:06:38,200 Speaker 1: Derek Dodge. This episode was produced and edited by Sabine 1300 01:06:38,280 --> 01:06:41,520 Speaker 1: Jansen and Jack Reagan. The original theme music is by 1301 01:06:41,600 --> 01:06:54,160 Speaker 1: Zander Sing. First Contact with Lori Siegel is a production 1302 01:06:54,160 --> 01:06:56,360 Speaker 1: of dot dot dot Media and I Heart Radio