1 00:00:13,360 --> 00:00:17,680 Speaker 1: December of twenty twenty, during the pandemic, on a cold winter morning, 2 00:00:18,079 --> 00:00:21,560 Speaker 1: a security guard on a train platform on the outskirts 3 00:00:21,560 --> 00:00:27,200 Speaker 1: of Saint Louis was working his job and suddenly saw 4 00:00:27,320 --> 00:00:31,040 Speaker 1: a man in a security booth who wasn't supposed to 5 00:00:31,080 --> 00:00:33,960 Speaker 1: be there. So he approached this man and told him 6 00:00:34,000 --> 00:00:39,080 Speaker 1: to leave, and then a verbal altercation ensued and another 7 00:00:39,159 --> 00:00:41,919 Speaker 1: man walked up behind him, so he was surrounded by 8 00:00:41,920 --> 00:00:44,480 Speaker 1: these two guys who were getting more and more agitated 9 00:00:44,560 --> 00:00:47,800 Speaker 1: with him. The next thing he knew, he was assaulted 10 00:00:47,880 --> 00:00:51,159 Speaker 1: by these two guys, struck in the head, and when 11 00:00:51,159 --> 00:00:53,680 Speaker 1: he was on the ground, they continued to beat him 12 00:00:53,880 --> 00:00:56,040 Speaker 1: and hit him in the head and hit him in his chest 13 00:00:56,080 --> 00:01:00,360 Speaker 1: and his ribs. The guys ran off, and Michael, 14 00:01:00,400 --> 00:01:04,880 Speaker 1: the victim, suffered a concussion that day and he couldn't remember anything. 15 00:01:05,440 --> 00:01:08,760 Speaker 1: So what the police were really left with to solve 16 00:01:08,760 --> 00:01:11,840 Speaker 1: this case is the same thing that they're left with 17 00:01:12,120 --> 00:01:15,080 Speaker 1: in a lot of cases where there are no witnesses. 18 00:01:15,280 --> 00:01:19,759 Speaker 2: A random crime, a victim with no memory, and nobody 19 00:01:19,760 --> 00:01:21,040 Speaker 2: seems to have seen anything. 20 00:01:22,040 --> 00:01:24,959 Speaker 1: A video camera was the only witness to this crime. 21 00:01:25,480 --> 00:01:30,560 Speaker 1: This kind of fuzzy, blurry surveillance video captured a still 22 00:01:31,160 --> 00:01:35,000 Speaker 1: of these perpetrators, and that was basically the only thing 23 00:01:35,040 --> 00:01:36,280 Speaker 1: that they had to go on. 24 00:01:36,880 --> 00:01:40,600 Speaker 2: Douglas MacMillan is a reporter for the Washington Post. So Doug, 25 00:01:40,920 --> 00:01:43,280 Speaker 2: first off, for real, thanks for being down to talk 26 00:01:43,319 --> 00:01:46,199 Speaker 2: with me about all this. Of course, Doug doesn't usually 27 00:01:46,240 --> 00:01:49,320 Speaker 2: cover cops or the justice system. His usual beat at 28 00:01:49,320 --> 00:01:53,000 Speaker 2: the Post is corporate accountability and technology. But there was 29 00:01:53,000 --> 00:01:55,760 Speaker 2: something about this case in Saint Louis that caught his attention. 30 00:01:56,520 --> 00:02:00,480 Speaker 1: There's a lot of crimes, hundreds or thousands of unsolved 31 00:02:00,800 --> 00:02:05,480 Speaker 1: cases out there where the only evidence was a camera, 32 00:02:05,640 --> 00:02:08,839 Speaker 1: The only evidence was a photo of the perpetrator. 33 00:02:09,280 --> 00:02:14,200 Speaker 1: And the potential of this technology is we have this 34 00:02:14,240 --> 00:02:18,200 Speaker 1: seemingly magical tool to help solve those crimes. 35 00:02:18,600 --> 00:02:21,920 Speaker 2: The technology he's talking about here is AI facial recognition.
36 00:02:22,600 --> 00:02:25,880 Speaker 2: Police around the country are increasingly using this on cases 37 00:02:25,960 --> 00:02:28,720 Speaker 2: that they can't find a witness for, and this was 38 00:02:28,760 --> 00:02:32,600 Speaker 2: one of those cases. The investigation into who had assaulted 39 00:02:32,600 --> 00:02:35,079 Speaker 2: Michael Feldman had pretty much gone cold. 40 00:02:35,520 --> 00:02:41,840 Speaker 1: Until about eight months after the incident. We believe the 41 00:02:41,880 --> 00:02:45,360 Speaker 1: police officers decided, oh hey, we have these grainy 42 00:02:45,400 --> 00:02:50,200 Speaker 1: surveillance images, let's try to run them through the facial 43 00:02:50,200 --> 00:02:52,680 Speaker 1: recognition system and see if that will give us any matches. 44 00:02:53,160 --> 00:02:57,320 Speaker 1: And so they did that and they got some matches back. 45 00:02:57,560 --> 00:03:00,200 Speaker 1: We believe it was maybe between five and ten 46 00:03:00,200 --> 00:03:04,919 Speaker 1: different possible matches. And the one they chose, the one 47 00:03:04,919 --> 00:03:08,560 Speaker 1: that they thought looked the most like the perpetrator of 48 00:03:08,560 --> 00:03:10,520 Speaker 1: the crime, or one of the two perpetrators of the crime, 49 00:03:11,200 --> 00:03:14,359 Speaker 1: was this gentleman named Christopher Gatlin. And then they kind 50 00:03:14,360 --> 00:03:16,200 Speaker 1: of were off and running in this case. They had 51 00:03:16,200 --> 00:03:21,240 Speaker 1: a suspect. And this was the only thing that police 52 00:03:21,320 --> 00:03:26,240 Speaker 1: had to arrest this man. Taking away his freedom, you know, 53 00:03:26,480 --> 00:03:29,160 Speaker 1: came down to this AI program. 54 00:03:29,520 --> 00:03:32,400 Speaker 2: I mean, because the thing here that of course we're 55 00:03:32,440 --> 00:03:35,800 Speaker 2: sort of circling around, is that that's not the guy. 56 00:03:36,080 --> 00:03:37,280 Speaker 2: Christopher Gatlin is innocent. 57 00:03:46,280 --> 00:03:47,240 Speaker 3: I'm afraid. 58 00:03:51,000 --> 00:03:57,400 Speaker 2: From Kaleidoscope and iHeart Podcasts, this is Kill Switch. I'm 59 00:03:57,440 --> 00:03:58,120 Speaker 2: Dexter Thomas. 60 00:03:58,920 --> 00:04:36,200 Speaker 3: I'm sorry, I'm sorry. Good bye. 61 00:04:40,080 --> 00:04:45,080 Speaker 1: So they went out investigating Christopher Gatlin and the first 62 00:04:45,120 --> 00:04:49,600 Speaker 1: thing they did was print out a picture of his mugshot, 63 00:04:50,120 --> 00:04:53,880 Speaker 1: and they took the picture of Chris Gatlin to the 64 00:04:54,279 --> 00:04:56,960 Speaker 1: apartment building where Michael Feldman lives and they conducted a 65 00:04:56,960 --> 00:05:01,440 Speaker 1: photo lineup with him, and they basically sat Michael Feldman 66 00:05:01,520 --> 00:05:04,200 Speaker 1: down and gave him the photos, put them on the 67 00:05:04,240 --> 00:05:06,600 Speaker 1: table in front of him, and instructed him to kind 68 00:05:06,600 --> 00:05:08,919 Speaker 1: of spend time looking at each one and going through 69 00:05:09,360 --> 00:05:12,799 Speaker 1: these photos. 70 00:05:12,600 --> 00:05:15,839 Speaker 4: Take a look through all of them and see if anything 71 00:05:15,600 --> 00:05:16,159 Speaker 5: rings a bell.
72 00:05:17,160 --> 00:05:20,320 Speaker 2: Eight months after the assault, the cops visited the victim, 73 00:05:20,440 --> 00:05:23,760 Speaker 2: Michael Feldman's house, and they brought some pictures with them. 74 00:05:24,200 --> 00:05:27,400 Speaker 2: One of those pictures was of Chris Gatlin. And we 75 00:05:27,520 --> 00:05:31,040 Speaker 2: know exactly what happened next because it's all on record. 76 00:05:31,839 --> 00:05:35,159 Speaker 1: So we got the body camera video of this 77 00:05:35,760 --> 00:05:40,520 Speaker 1: photo lineup, and the officers who conducted this photo lineup 78 00:05:40,760 --> 00:05:44,400 Speaker 1: did not follow the proper procedures for basically getting a 79 00:05:44,520 --> 00:05:48,480 Speaker 1: fair and impartial identification from this witness. And you can 80 00:05:48,520 --> 00:05:52,839 Speaker 1: see throughout the process when Michael Feldman's going through these photos, 81 00:05:53,200 --> 00:05:55,839 Speaker 1: these police officers, who are really supposed to be just 82 00:05:55,839 --> 00:05:58,560 Speaker 1: standing off to the side and not saying anything, are 83 00:05:58,680 --> 00:06:02,040 Speaker 1: kind of coaching him and giving him kind of little 84 00:06:02,120 --> 00:06:06,000 Speaker 1: clues throughout the process on which guy they want him 85 00:06:06,040 --> 00:06:06,400 Speaker 1: to pick. 86 00:06:06,600 --> 00:06:10,760 Speaker 4: Okay, let's think about before the incident. He went up 87 00:06:10,760 --> 00:06:14,640 Speaker 4: and he talked to him a little bit, right? Let's 88 00:06:14,640 --> 00:06:17,000 Speaker 4: start to focus on things before you actually got into any 89 00:06:17,080 --> 00:06:20,560 Speaker 4: altercation, maybe even before you actually initially went 90 00:06:20,520 --> 00:06:22,880 Speaker 3: to talk to him. Just take a minute, kind 91 00:06:22,720 --> 00:06:23,479 Speaker 4: of ponder, think. 92 00:06:24,360 --> 00:06:26,480 Speaker 1: And at one point he actually picks up the picture 93 00:06:26,480 --> 00:06:29,760 Speaker 1: of Chris Gatlin and puts it aside, and he keeps 94 00:06:29,760 --> 00:06:34,799 Speaker 1: on looking through these photos, and at one point he says, 95 00:06:34,839 --> 00:06:38,600 Speaker 1: I want to say it's him, pointing to a different guy. 96 00:06:38,720 --> 00:06:43,080 Speaker 3: I want to say... I remember... my memory is... 97 00:06:44,160 --> 00:06:47,360 Speaker 3: I want to say it's him. 98 00:06:47,400 --> 00:06:50,400 Speaker 1: When he does that, one of the detectives steps forward 99 00:06:50,400 --> 00:06:53,200 Speaker 1: and says, okay. Instead of, you know, accepting that as 100 00:06:53,200 --> 00:06:57,279 Speaker 1: an identification, the detective encourages him to continue going and 101 00:06:57,320 --> 00:07:01,039 Speaker 1: continue thinking about the interaction. Think about, you know, was 102 00:07:01,040 --> 00:07:03,520 Speaker 1: the guy wearing anything? Was he wearing a hat? Did 103 00:07:03,560 --> 00:07:07,480 Speaker 1: you recognize anything about his face, his eyes, the clothing 104 00:07:07,560 --> 00:07:08,480 Speaker 1: he had? 105 00:07:10,200 --> 00:07:11,200 Speaker 3: I don't remember really. 106 00:07:11,240 --> 00:07:13,800 Speaker 1: I thought he had like a hat on or something, or 107 00:07:14,200 --> 00:07:17,720 Speaker 4: stocking, something on. So let's picture these 108 00:07:17,760 --> 00:07:20,400 Speaker 4: two guys wearing that, you know, a stocking cap or something.
109 00:07:20,400 --> 00:07:22,000 Speaker 4: If you need to use your hands, if you've got 110 00:07:22,120 --> 00:07:23,920 Speaker 4: to put hands on the papers. 111 00:07:23,560 --> 00:07:24,080 Speaker 3: That's okay. 112 00:07:24,440 --> 00:07:26,320 Speaker 1: And so you know, he kind of like, you know, 113 00:07:26,520 --> 00:07:30,320 Speaker 1: prods Michael into thinking that he didn't pick the right 114 00:07:30,360 --> 00:07:33,360 Speaker 1: person and then going back and kind of changing his mind. 115 00:07:33,360 --> 00:07:35,720 Speaker 1: And eventually he does go back and he pulls the 116 00:07:35,760 --> 00:07:37,920 Speaker 1: picture of Chris Gatlin out and he points to that 117 00:07:37,960 --> 00:07:41,440 Speaker 1: and he says it was him. 118 00:07:42,560 --> 00:07:42,960 Speaker 3: All right. 119 00:07:43,200 --> 00:07:45,559 Speaker 4: I just remember him getting really angry quick. 120 00:07:46,200 --> 00:07:52,000 Speaker 2: Just the mannerisms, I'm just trying to... and the eyes probably. 121 00:07:53,440 --> 00:07:54,960 Speaker 4: How pissed off he got. 122 00:07:55,920 --> 00:07:56,960 Speaker 3: As not to be in that. 123 00:07:57,200 --> 00:07:58,360 Speaker 1: You couldn't stand there. 124 00:07:59,520 --> 00:07:59,920 Speaker 3: I think. 125 00:08:01,600 --> 00:08:01,960 Speaker 5: Okay. 126 00:08:03,480 --> 00:08:06,240 Speaker 2: So police take that identification and they're able to get 127 00:08:06,280 --> 00:08:09,560 Speaker 2: a warrant to arrest Chris Gatlin. They have no other 128 00:08:09,640 --> 00:08:13,120 Speaker 2: evidence other than that an algorithm picked him out. He 129 00:08:13,240 --> 00:08:18,440 Speaker 2: spent seventeen months in jail. So, Doug, I've read about 130 00:08:18,480 --> 00:08:21,960 Speaker 2: Chris Gatlin's case, but you've met him in person. What's 131 00:08:21,960 --> 00:08:22,280 Speaker 2: he like? 132 00:08:22,680 --> 00:08:26,000 Speaker 1: He's a very sweet, charming guy, father of four kids. 133 00:08:26,520 --> 00:08:29,880 Speaker 1: He did not have a criminal record. So the reason 134 00:08:29,920 --> 00:08:34,679 Speaker 1: that he was in this system: the facial recognition 135 00:08:35,240 --> 00:08:38,640 Speaker 1: database in Saint Louis has over two hundred and fifty 136 00:08:38,679 --> 00:08:44,800 Speaker 1: thousand mug shots of people, including people who were pulled 137 00:08:44,800 --> 00:08:48,240 Speaker 1: over for traffic stops and arrested for speeding or arrested 138 00:08:48,240 --> 00:08:51,840 Speaker 1: for very minor things. Saint Louis has a long history 139 00:08:52,040 --> 00:08:57,800 Speaker 1: of over-policing, especially Black, minority, low-income populations, handing 140 00:08:57,800 --> 00:09:00,240 Speaker 1: out tickets to people just so that police can meet 141 00:09:00,280 --> 00:09:04,000 Speaker 1: their quotas. So Chris, he had had a few different 142 00:09:04,160 --> 00:09:08,040 Speaker 1: traffic infractions, and he had one burglary charge from a 143 00:09:08,040 --> 00:09:12,120 Speaker 1: few years earlier. It was ultimately dropped by prosecutors. But 144 00:09:12,160 --> 00:09:13,559 Speaker 1: that's the reason that he came up, and that's the 145 00:09:13,559 --> 00:09:14,800 Speaker 1: reason he was in this system.
146 00:09:15,120 --> 00:09:19,160 Speaker 2: It's incredible that you could be in a system that 147 00:09:19,240 --> 00:09:21,720 Speaker 2: would lead you to get arrested based off of a 148 00:09:21,760 --> 00:09:25,559 Speaker 2: traffic ticket. I have a completely clean record, with the 149 00:09:25,600 --> 00:09:30,679 Speaker 2: one exception that a literal decade ago, I was speeding 150 00:09:30,760 --> 00:09:32,880 Speaker 2: to work on the freeway. I was late for work. 151 00:09:32,920 --> 00:09:35,520 Speaker 2: I got pulled over. Now, I didn't get arrested, but 152 00:09:35,600 --> 00:09:37,880 Speaker 2: had somebody decided to go a little harder on me? Sure, 153 00:09:37,920 --> 00:09:42,680 Speaker 2: I suppose they could have. And I'm just imagining myself 154 00:09:42,800 --> 00:09:46,480 Speaker 2: being at home one day and police coming to my 155 00:09:46,559 --> 00:09:51,839 Speaker 2: door and saying, Hey, we're arresting you because we think 156 00:09:51,880 --> 00:09:53,520 Speaker 2: you did something that I've never heard of in a 157 00:09:53,520 --> 00:09:54,480 Speaker 2: place that I don't go. 158 00:09:54,240 --> 00:09:56,000 Speaker 1: Because you were in this database. 159 00:09:56,200 --> 00:09:58,760 Speaker 2: Yeah, because I was five miles over the speed limit 160 00:09:58,800 --> 00:10:03,240 Speaker 2: one day going to work. Yeah, so maybe this is 161 00:10:03,240 --> 00:10:06,480 Speaker 2: a good time to say that all facial recognition technology 162 00:10:06,640 --> 00:10:10,040 Speaker 2: isn't the same, or maybe more importantly, that it all 163 00:10:10,080 --> 00:10:13,000 Speaker 2: doesn't come from the same place. The program that the 164 00:10:13,040 --> 00:10:16,160 Speaker 2: Saint Louis cops use searches the internal databases of the 165 00:10:16,160 --> 00:10:20,440 Speaker 2: police department. But there's another very popular tool. It's able 166 00:10:20,480 --> 00:10:24,440 Speaker 2: to pull data straight from the Internet. It's called Clearview AI. 167 00:10:25,720 --> 00:10:33,440 Speaker 1: Clearview scrapes together images from Facebook, from LinkedIn, from Venmo accounts, 168 00:10:33,480 --> 00:10:36,720 Speaker 1: from news articles, from all of the public web sources 169 00:10:37,040 --> 00:10:39,320 Speaker 1: where it can find images with people's faces. 170 00:10:39,800 --> 00:10:44,480 Speaker 2: Clearview is wild, and they've got billions of photos in there. 171 00:10:44,840 --> 00:10:47,439 Speaker 1: Yeah, they say they have billions of images. And there's 172 00:10:47,480 --> 00:10:50,320 Speaker 1: a question about whether it's legal for them to scrape 173 00:10:50,480 --> 00:10:52,679 Speaker 1: these images and put them in a database, because they 174 00:10:52,679 --> 00:10:55,600 Speaker 1: do not have, they never really got, permission from anybody 175 00:10:56,000 --> 00:10:57,080 Speaker 1: to get these images. 176 00:10:57,559 --> 00:11:01,920 Speaker 2: Clearview AI's website boasts that they've scraped over fifty billion images. 177 00:11:02,320 --> 00:11:04,920 Speaker 2: And again this is just from the general Internet. So 178 00:11:05,040 --> 00:11:07,680 Speaker 2: when the police run a search, you could be in there, 179 00:11:08,040 --> 00:11:10,440 Speaker 2: even if you've never had a run-in with law enforcement. 180 00:11:12,400 --> 00:11:15,320 Speaker 2: Do we even know how many police departments are using this?
181 00:11:16,000 --> 00:11:19,800 Speaker 1: No, it's very hard to tell because they're not required 182 00:11:20,360 --> 00:11:23,080 Speaker 1: to disclose this anyway. In most places, they're not required 183 00:11:23,679 --> 00:11:26,760 Speaker 1: to disclose that they're using these tools. So the best 184 00:11:26,800 --> 00:11:31,079 Speaker 1: proxy for that number is what the companies who make 185 00:11:31,120 --> 00:11:35,280 Speaker 1: this software have announced. Clearview AI has said that it 186 00:11:35,360 --> 00:11:38,600 Speaker 1: has thousands. I think they have said over three thousand 187 00:11:38,920 --> 00:11:43,760 Speaker 1: police customers. I did a pretty exhaustive public record search 188 00:11:44,120 --> 00:11:47,800 Speaker 1: for police departments that are using these tools. We only 189 00:11:47,840 --> 00:11:50,960 Speaker 1: came up with a list of about seventy that we 190 00:11:50,960 --> 00:11:53,960 Speaker 1: could verify that are using it in some way, and 191 00:11:54,080 --> 00:11:58,400 Speaker 1: of those, we sought public records on individual cases, and 192 00:11:58,440 --> 00:12:02,880 Speaker 1: we could only get individual case records from something like thirty 193 00:12:02,960 --> 00:12:05,160 Speaker 1: or forty of them. Getting to the bottom of, you know, 194 00:12:05,160 --> 00:12:07,760 Speaker 1: which police departments are using this, how they're using it 195 00:12:07,800 --> 00:12:10,080 Speaker 1: is very difficult because for the most part, there's no 196 00:12:10,360 --> 00:12:13,720 Speaker 1: requirement for them to make public how and when they're 197 00:12:13,800 --> 00:12:18,320 Speaker 1: using the tools, and that is allowing police to mostly 198 00:12:18,400 --> 00:12:19,840 Speaker 1: use these tools in secret. 199 00:12:20,920 --> 00:12:24,840 Speaker 2: But sometimes it's not the machines making the mistakes, it's 200 00:12:24,880 --> 00:12:39,960 Speaker 2: the cops. More on that after the break. It's not 201 00:12:40,000 --> 00:12:42,480 Speaker 2: too hard to understand why a police department would be 202 00:12:42,520 --> 00:12:46,560 Speaker 2: interested in facial recognition tools. It seems pretty logical. I mean, 203 00:12:46,600 --> 00:12:48,800 Speaker 2: you're having a hard time cracking a case and 204 00:12:48,840 --> 00:12:51,400 Speaker 2: you either don't have any eyewitnesses, or you can't rely 205 00:12:51,440 --> 00:12:54,600 Speaker 2: on the witnesses you have, or maybe, like with Michael Feldman, 206 00:12:54,840 --> 00:12:57,520 Speaker 2: the victim suffered an injury that affected their memory of 207 00:12:57,559 --> 00:13:00,200 Speaker 2: the event, or maybe too much time has passed for 208 00:13:00,280 --> 00:13:04,080 Speaker 2: somebody to really remember the details. Eyewitness accounts are full 209 00:13:04,120 --> 00:13:07,840 Speaker 2: of problems for law enforcement because human memory is unreliable 210 00:13:08,040 --> 00:13:11,480 Speaker 2: and it can be influenced by people's biases. So there's 211 00:13:11,520 --> 00:13:15,080 Speaker 2: this potential magical solution. You input a face into 212 00:13:15,080 --> 00:13:18,120 Speaker 2: the system, it searches a database and spits out a match. 213 00:13:18,520 --> 00:13:21,440 Speaker 2: All you gotta do is go find that person and boom. 214 00:13:21,640 --> 00:13:22,319 Speaker 2: Case closed.
215 00:13:23,040 --> 00:13:27,240 Speaker 1: So we know that there are at least eight Americans 216 00:13:27,760 --> 00:13:32,160 Speaker 1: who have been wrongly arrested due to police misuse of 217 00:13:32,240 --> 00:13:33,160 Speaker 1: facial recognition. 218 00:13:34,240 --> 00:13:37,400 Speaker 2: And just to be clear here, when he says wrongly arrested, 219 00:13:37,559 --> 00:13:40,080 Speaker 2: that's not just his opinion. What he means is that 220 00:13:40,120 --> 00:13:43,400 Speaker 2: the police admitted afterwards that they got the wrong person. 221 00:13:43,880 --> 00:13:47,080 Speaker 1: Interestingly, one of those cases was this man named Jason Vernau, 222 00:13:47,200 --> 00:13:50,120 Speaker 1: who is based in Miami. In that case, the computer 223 00:13:50,920 --> 00:13:54,280 Speaker 1: got the right person, but the police had fed the 224 00:13:54,360 --> 00:13:57,800 Speaker 1: computer an image of the wrong person. So they fed 225 00:13:57,840 --> 00:14:00,000 Speaker 1: it an image of who they thought was the perpetrator, 226 00:14:00,240 --> 00:14:02,880 Speaker 1: somebody in a bank committing fraud. But it turns 227 00:14:02,920 --> 00:14:06,080 Speaker 1: out they were mistaken and the surveillance image that they 228 00:14:06,160 --> 00:14:08,360 Speaker 1: put in the system was just the wrong person to 229 00:14:08,400 --> 00:14:08,880 Speaker 1: begin with. 230 00:14:09,040 --> 00:14:11,160 Speaker 2: So that was the one time they got it right. 231 00:14:11,640 --> 00:14:15,120 Speaker 1: Well, yeah, so the computer actually picked the right person, 232 00:14:15,520 --> 00:14:18,160 Speaker 1: but the police were relying on this image they got 233 00:14:18,200 --> 00:14:20,600 Speaker 1: from the bank and they just pulled an image from 234 00:14:20,600 --> 00:14:21,280 Speaker 1: the wrong time. 235 00:14:22,240 --> 00:14:25,160 Speaker 2: So the bank pulled an image from their security cameras 236 00:14:25,400 --> 00:14:28,000 Speaker 2: and sent it to the police, and this image was 237 00:14:28,120 --> 00:14:31,400 Speaker 2: Jason Vernau, but it was from a completely different time 238 00:14:31,400 --> 00:14:36,080 Speaker 2: of the day. The facial recognition software had correctly identified Jason, 239 00:14:36,600 --> 00:14:39,600 Speaker 2: but the police failed to make some obvious checks. 240 00:14:40,120 --> 00:14:44,200 Speaker 1: What this case shows was that the police put the 241 00:14:44,240 --> 00:14:47,360 Speaker 1: image in their system and it popped up Jason Vernau 242 00:14:47,520 --> 00:14:49,920 Speaker 1: and they went out and arrested the guy, rather than 243 00:14:50,520 --> 00:14:53,400 Speaker 1: taking a second and saying, okay, we have a suspect. 244 00:14:53,760 --> 00:14:56,440 Speaker 1: Let's see if, like, can we prove that Jason Vernau 245 00:14:56,480 --> 00:14:58,880 Speaker 1: was tied to this crime. And in the case 246 00:14:58,920 --> 00:15:02,320 Speaker 1: of a financial crime, there is a lot of potential evidence there. 247 00:15:02,360 --> 00:15:06,040 Speaker 1: There's a check that he supposedly cashed, a faulty check. Well, 248 00:15:06,200 --> 00:15:09,160 Speaker 1: let's see if his signature matches this check. There are witnesses, there's 249 00:15:09,160 --> 00:15:10,960 Speaker 1: a bank teller who took it, let's see if the bank 250 00:15:11,000 --> 00:15:13,400 Speaker 1: teller can confirm that was him. They didn't do any 251 00:15:13,440 --> 00:15:17,000 Speaker 1: of that.
They apparently didn't do any real investigative steps. 252 00:15:17,320 --> 00:15:19,080 Speaker 1: They just took the word of the machine and went 253 00:15:19,120 --> 00:15:24,440 Speaker 1: out and arrested him. It speaks to kind of this phenomenon 254 00:15:24,960 --> 00:15:27,400 Speaker 1: that seems to be happening with police, where they are 255 00:15:27,600 --> 00:15:32,120 Speaker 1: kind of imbuing this software 256 00:15:32,200 --> 00:15:35,160 Speaker 1: with this magical ability to lead them to the right suspect. 257 00:15:36,200 --> 00:15:40,720 Speaker 1: Some researchers call this automation bias, that when a computer 258 00:15:40,840 --> 00:15:44,200 Speaker 1: is telling you an answer, you're more likely to believe 259 00:15:44,200 --> 00:15:47,040 Speaker 1: that that is the correct answer, even at the expense 260 00:15:47,240 --> 00:15:51,320 Speaker 1: of your kind of typical due diligence, your typical, you know, 261 00:15:51,520 --> 00:15:54,680 Speaker 1: just rational brain and just normal steps that you should 262 00:15:54,720 --> 00:15:57,200 Speaker 1: think through and follow before you go out and arrest 263 00:15:57,200 --> 00:15:59,320 Speaker 1: somebody and take away somebody's freedom. 264 00:16:00,080 --> 00:16:02,960 Speaker 2: Automation bias is just one kind of bias here, though. 265 00:16:04,080 --> 00:16:07,640 Speaker 2: I mean, there was something unique about this particular person though, right, 266 00:16:07,640 --> 00:16:11,600 Speaker 2: compared to the other cases. Yeah, he was white, and 267 00:16:11,640 --> 00:16:14,400 Speaker 2: that's the one that the computer got right. Yeah. 268 00:16:14,680 --> 00:16:15,000 Speaker 3: Yeah. 269 00:16:15,920 --> 00:16:23,400 Speaker 2: Facial recognition has been shown to often get certain 270 00:16:23,400 --> 00:16:24,480 Speaker 2: people wrong. Right? 271 00:16:24,600 --> 00:16:27,600 Speaker 1: Yeah. There's two things going on here. One, facial recognition 272 00:16:27,880 --> 00:16:32,280 Speaker 1: has struggled with darker complexions, and part of this is 273 00:16:32,600 --> 00:16:36,760 Speaker 1: due to how these programs are trained. So when the 274 00:16:36,760 --> 00:16:39,720 Speaker 1: first facial recognition algorithms were trained, they were trained on 275 00:16:39,840 --> 00:16:44,680 Speaker 1: lighter skin tones. There's a federal research lab, it's called 276 00:16:44,760 --> 00:16:49,720 Speaker 1: NIST, that does testing into how well facial recognition 277 00:16:49,760 --> 00:16:53,720 Speaker 1: performs in laboratory settings, and they found in twenty nineteen 278 00:16:53,800 --> 00:16:59,040 Speaker 1: that certain demographic groups, including African Americans, in certain facial 279 00:16:59,040 --> 00:17:02,160 Speaker 1: recognition algorithms could be up to one hundred times more 280 00:17:02,240 --> 00:17:05,919 Speaker 1: likely to be mismatched than lighter skin tones. That was 281 00:17:05,920 --> 00:17:09,480 Speaker 1: about six years ago, and the industry says, and Clearview 282 00:17:09,560 --> 00:17:12,240 Speaker 1: and other companies say, that their algorithms have improved 283 00:17:12,320 --> 00:17:14,959 Speaker 1: dramatically over that time, and there's some evidence that they 284 00:17:15,000 --> 00:17:20,240 Speaker 1: have gotten better.
However, we ultimately don't know how accurate 285 00:17:20,280 --> 00:17:23,560 Speaker 1: and how reliable this software is in the way that 286 00:17:23,840 --> 00:17:27,720 Speaker 1: law enforcement uses it, because it's never been tested in 287 00:17:27,760 --> 00:17:29,880 Speaker 1: the way that law enforcement uses it. The way that 288 00:17:29,880 --> 00:17:34,320 Speaker 1: that federal lab tests things, it's looking at basically perfect lighting, 289 00:17:34,600 --> 00:17:38,760 Speaker 1: a perfectly framed profile photo in perfect conditions. How does 290 00:17:38,760 --> 00:17:42,360 Speaker 1: that algorithm do in those settings? Unlike how police use 291 00:17:42,400 --> 00:17:45,280 Speaker 1: these tools, which is, we're going to use this grainy 292 00:17:45,320 --> 00:17:49,520 Speaker 1: surveillance image shot at a weird angle from above, usually 293 00:17:50,080 --> 00:17:53,359 Speaker 1: in very poor lighting, and usually the face is 294 00:17:53,400 --> 00:17:56,040 Speaker 1: partly obscured. Or it could be, in the case in 295 00:17:56,040 --> 00:17:58,639 Speaker 1: Saint Louis, the guy was actually wearing a COVID mask 296 00:17:58,720 --> 00:18:00,840 Speaker 1: that was covering part of his face, and he was wearing 297 00:18:00,880 --> 00:18:03,159 Speaker 1: a hood that covered another part of his face. So 298 00:18:03,480 --> 00:18:06,120 Speaker 1: this is the typical setting where police are using these tools. 299 00:18:06,880 --> 00:18:11,280 Speaker 2: There's a case where a cartoon came up as a 300 00:18:11,320 --> 00:18:13,240 Speaker 2: match at one point, right? 301 00:18:13,600 --> 00:18:16,880 Speaker 1: Yeah. So because Clearview AI just has these, I guess, 302 00:18:16,920 --> 00:18:19,679 Speaker 1: web crawlers that are just scraping the Internet, 303 00:18:19,920 --> 00:18:22,760 Speaker 1: the things that come up in the Clearview results sometimes 304 00:18:23,000 --> 00:18:26,640 Speaker 1: are bizarre. So in this one case in Ohio, two 305 00:18:26,680 --> 00:18:28,399 Speaker 1: of the search results that I think were in the 306 00:18:28,400 --> 00:18:32,320 Speaker 1: top ten of the search results... One was Michael Jordan, just 307 00:18:32,960 --> 00:18:35,800 Speaker 1: a picture of Michael Jordan. And, by the way, 308 00:18:35,800 --> 00:18:38,680 Speaker 1: I reached out to Michael Jordan's rep when I published 309 00:18:38,720 --> 00:18:41,040 Speaker 1: this in a story, and she didn't have any comment 310 00:18:41,080 --> 00:18:44,160 Speaker 1: on whether he might have been implicated in a crime in Ohio. 311 00:18:45,920 --> 00:18:48,560 Speaker 1: And then the other one was, yeah, just an 312 00:18:48,640 --> 00:18:51,439 Speaker 1: illustrated cartoon image of a Black man. So it 313 00:18:51,520 --> 00:18:54,119 Speaker 1: just kind of makes you scratch 314 00:18:54,160 --> 00:18:56,000 Speaker 1: your head as to, you know, what is this tool 315 00:18:56,040 --> 00:18:59,320 Speaker 1: that police are relying on if it's feeding them, you know, 316 00:18:59,400 --> 00:19:01,760 Speaker 1: garbage as results. 317 00:19:01,600 --> 00:19:04,280 Speaker 2: And giving them Michael Jordan committed this crime in the 318 00:19:04,320 --> 00:19:04,919 Speaker 2: top ten. 319 00:19:04,840 --> 00:19:06,280 Speaker 1: Right, which, right. Yeah.
320 00:19:06,040 --> 00:19:09,960 Speaker 2: Again, a big allegedly here, but I feel pretty sure 321 00:19:09,960 --> 00:19:12,560 Speaker 2: they'd have a tough time connecting him to the scene 322 00:19:12,560 --> 00:19:14,840 Speaker 2: of whatever that particular crime was. 323 00:19:15,200 --> 00:19:18,400 Speaker 1: Yeah. And if this is such an advanced algorithm, why 324 00:19:18,440 --> 00:19:20,720 Speaker 1: can't their computer figure out that's Michael Jordan and just 325 00:19:20,960 --> 00:19:23,160 Speaker 1: take him out? And why can't a computer figure out 326 00:19:23,200 --> 00:19:25,919 Speaker 1: that that's a cartoon Black man and not a real person? 327 00:19:26,320 --> 00:19:28,600 Speaker 1: I just can't... Like, a lot of the people who 328 00:19:28,640 --> 00:19:34,520 Speaker 1: put forward facial recognition as this science, they 329 00:19:34,560 --> 00:19:36,679 Speaker 1: have sort of cast it in the kind of the 330 00:19:36,720 --> 00:19:40,119 Speaker 1: cloak of, this is a scientific tool the police are 331 00:19:40,160 --> 00:19:42,840 Speaker 1: now using. A lot of those arguments kind of break 332 00:19:42,880 --> 00:19:45,000 Speaker 1: down when you see stuff like this, when you see, oh, well, 333 00:19:45,280 --> 00:19:49,199 Speaker 1: your scientific, highly advanced tool, you know, brought back a 334 00:19:49,200 --> 00:19:51,240 Speaker 1: picture of, you know, a cartoon Black man. How did the 335 00:19:51,320 --> 00:19:52,520 Speaker 1: science arrive at that? 336 00:19:56,000 --> 00:20:00,280 Speaker 2: Facial recognition as a technology is probably a whole other episode. 337 00:20:00,720 --> 00:20:03,480 Speaker 2: But if you've ever used Face ID to unlock your iPhone, 338 00:20:03,560 --> 00:20:06,800 Speaker 2: you've already used a version of facial recognition on yourself. 339 00:20:07,400 --> 00:20:09,760 Speaker 2: In some ways, the technology isn't all that different from 340 00:20:09,800 --> 00:20:12,600 Speaker 2: how it started in the nineteen sixties, when researchers were 341 00:20:12,640 --> 00:20:15,399 Speaker 2: trying to figure out how to identify facial features like 342 00:20:15,480 --> 00:20:17,639 Speaker 2: the bridge of the nose, the edge of the nose, 343 00:20:17,880 --> 00:20:21,520 Speaker 2: or the eyes and measure the distance between them. They'd 344 00:20:21,520 --> 00:20:24,679 Speaker 2: store all those measurements as data for a single face. 345 00:20:25,359 --> 00:20:28,399 Speaker 2: Back in the sixties, this stuff was pretty rudimentary, but 346 00:20:28,720 --> 00:20:31,159 Speaker 2: AI's made it more accurate because you can train the 347 00:20:31,200 --> 00:20:35,120 Speaker 2: systems on more faces, which means theoretically it's more likely 348 00:20:35,160 --> 00:20:38,720 Speaker 2: to tell your face from someone else's. But you've probably 349 00:20:38,720 --> 00:20:41,400 Speaker 2: had your Face ID fail on you before, and that's 350 00:20:41,480 --> 00:20:44,480 Speaker 2: under good conditions. It's just trying to match your face 351 00:20:44,640 --> 00:20:48,000 Speaker 2: with, well, your face. If you add a few million 352 00:20:48,080 --> 00:20:51,720 Speaker 2: other faces into the mix, the potential for mistakes multiplies 353 00:20:52,119 --> 00:20:56,080 Speaker 2: by a lot. But even with those issues, police are 354 00:20:56,119 --> 00:21:01,080 Speaker 2: increasingly leaning on this technology.
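To make that sixties-era idea concrete, here is a minimal sketch, not from the episode and not any vendor's actual code: represent each face as a vector of distances between landmarks, then treat whichever face in a database has the closest vector as the candidate match. The landmark names, coordinates, tiny database, and threshold-free "nearest wins" logic below are illustrative assumptions only.

```python
# Illustrative sketch of the classic distance-measurement idea behind facial
# recognition. Not any real system's code; landmarks and values are made up.
import math

def face_vector(landmarks):
    """Turn a dict of (x, y) landmark points into a list of pairwise distances."""
    names = sorted(landmarks)
    dists = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            (x1, y1), (x2, y2) = landmarks[a], landmarks[b]
            dists.append(math.hypot(x2 - x1, y2 - y1))
    return dists

def difference(v1, v2):
    """Euclidean distance between two face vectors: smaller means more alike."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))

# Hypothetical probe image and a tiny "database" of two known faces.
probe = face_vector({"left_eye": (30, 40), "right_eye": (70, 41), "nose_tip": (50, 65)})
database = {
    "person_a": face_vector({"left_eye": (31, 40), "right_eye": (69, 42), "nose_tip": (50, 66)}),
    "person_b": face_vector({"left_eye": (25, 38), "right_eye": (78, 39), "nose_tip": (52, 70)}),
}

# The closest vector is returned as the "match" -- but a low score is only a
# lead, not proof of identity, which is the gap the episode keeps returning to.
best = min(database, key=lambda name: difference(probe, database[name]))
print(best, difference(probe, database[best]))
```

Modern systems swap these handcrafted distances for embeddings learned by a neural network, but the basic move is the same: the nearest vector wins, and a "match" is only as good as the images behind it, which is why a hit is supposed to be treated as a lead rather than as evidence.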
It seems like police are 355 00:21:01,280 --> 00:21:06,480 Speaker 2: presenting the fact that facial recognition software has returned a match, 356 00:21:07,000 --> 00:21:11,879 Speaker 2: whether it's true or false, as evidence. But that seems, 357 00:21:12,119 --> 00:21:13,480 Speaker 2: it seems backwards, right? 358 00:21:13,880 --> 00:21:17,200 Speaker 1: Yeah. Well, not only that, but it's on the one hand, 359 00:21:17,320 --> 00:21:21,600 Speaker 1: they actually acknowledge that facial recognition is not enough to 360 00:21:21,640 --> 00:21:25,400 Speaker 1: make a case. They say this in their police rules 361 00:21:25,600 --> 00:21:28,160 Speaker 1: and their public statements over and over and over. If 362 00:21:28,160 --> 00:21:30,800 Speaker 1: you hear police talk about facial recognition, they say to 363 00:21:30,840 --> 00:21:34,040 Speaker 1: the public, this is only an investigative tool. We are 364 00:21:34,080 --> 00:21:36,199 Speaker 1: only going to use this to find investigative leads that 365 00:21:36,240 --> 00:21:38,960 Speaker 1: we then go out and corroborate. And in some places 366 00:21:39,040 --> 00:21:42,560 Speaker 1: that's actually the law. There are six states where police 367 00:21:42,600 --> 00:21:45,880 Speaker 1: are actually required to corroborate any leads that they get from facial recognition. 368 00:21:46,280 --> 00:21:48,840 Speaker 1: What we found in looking at cases all around the 369 00:21:48,840 --> 00:21:51,480 Speaker 1: country was that the police were not doing that. Sometimes 370 00:21:51,480 --> 00:21:54,919 Speaker 1: they do, but often they will just take it at 371 00:21:54,960 --> 00:21:58,080 Speaker 1: face value that this software hit is enough, and then 372 00:21:58,119 --> 00:21:59,800 Speaker 1: they will use that to go out and arrest the 373 00:21:59,800 --> 00:22:03,520 Speaker 1: person. I encountered this really surprising thing, which is I 374 00:22:03,560 --> 00:22:06,960 Speaker 1: talked to a number of prosecutors and police who told 375 00:22:07,000 --> 00:22:10,919 Speaker 1: me that, well, yeah, we did corroboration. As soon as 376 00:22:10,920 --> 00:22:13,560 Speaker 1: we got the name from the software, we went and 377 00:22:13,600 --> 00:22:16,880 Speaker 1: we looked at that person's other photos and it visually 378 00:22:16,960 --> 00:22:19,000 Speaker 1: looked like the same person. And they said that that 379 00:22:19,119 --> 00:22:21,920 Speaker 1: is corroboration, visual corroboration of a match. 380 00:22:22,640 --> 00:22:27,359 Speaker 2: That sounds like you're corroborating evidence on your own, like 381 00:22:27,400 --> 00:22:28,920 Speaker 2: you're becoming a witness at that point, as a police officer. 382 00:22:28,960 --> 00:22:31,639 Speaker 1: This thing comes up over and 383 00:22:31,640 --> 00:22:36,240 Speaker 1: over, is that many people look alike, and humans are 384 00:22:36,480 --> 00:22:41,200 Speaker 1: very bad at distinguishing between two people who are similar. 385 00:22:42,960 --> 00:22:45,560 Speaker 2: You would think that this is where technology could help, 386 00:22:46,000 --> 00:22:49,800 Speaker 2: but AI isn't much better at telling people apart. The 387 00:22:49,840 --> 00:22:52,399 Speaker 2: first known case of a wrongful arrest made by AI 388 00:22:52,880 --> 00:22:56,439 Speaker 2: was Robert Williams in Detroit in twenty twenty. Doug was 389 00:22:56,480 --> 00:23:01,959 Speaker 2: able to get the police interrogation video.
Williams's case is 390 00:23:02,040 --> 00:23:05,800 Speaker 2: really interesting because we have the interrogation video of this, 391 00:23:05,880 --> 00:23:10,119 Speaker 2: and I was just watching this and it's incredible. So 392 00:23:10,520 --> 00:23:12,520 Speaker 2: this is, I know you've seen this video, so 393 00:23:12,520 --> 00:23:16,359 Speaker 2: this is where, you know, these two detectives are 394 00:23:16,760 --> 00:23:20,280 Speaker 2: sitting in this interrogation room. Rob Williams is sitting across 395 00:23:20,400 --> 00:23:23,760 Speaker 2: the table from them, and he's clearly confused. And they 396 00:23:23,840 --> 00:23:27,960 Speaker 2: bring out these papers and I'll play it from here. 397 00:23:29,880 --> 00:23:34,440 Speaker 5: December twenty sixth, around one pm? Where were you? 398 00:23:37,359 --> 00:23:38,359 Speaker 3: Home? Home? 399 00:23:39,080 --> 00:23:43,560 Speaker 5: Can you do me a favor? Is that you? 400 00:23:43,560 --> 00:23:45,320 Speaker 3: You know? No? 401 00:23:45,960 --> 00:23:53,560 Speaker 5: Not even close, is it? No, I'm pissed. 402 00:23:53,720 --> 00:23:54,200 Speaker 3: Keep going. 403 00:23:55,040 --> 00:24:01,679 Speaker 5: Is that you? No, that's not you at all? Not 404 00:24:01,880 --> 00:24:09,199 Speaker 5: there either. You can't... y'all can't tell that. I'm one hundred... 405 00:24:15,359 --> 00:24:20,159 Speaker 5: So on one of the pictures, we actually got facial recognition. 406 00:24:20,280 --> 00:24:23,320 Speaker 5: I heard that it was probably facial recognition. 407 00:24:23,400 --> 00:24:24,080 Speaker 3: That is not me. 408 00:24:24,520 --> 00:24:28,840 Speaker 2: That's me on my ID, right? You're smiling and 409 00:24:28,880 --> 00:24:33,359 Speaker 2: I'm smiling too, as bleak as this is. But there's 410 00:24:33,400 --> 00:24:37,400 Speaker 2: something almost funny in this exchange. 411 00:24:37,480 --> 00:24:40,000 Speaker 1: Well, because they present the pictures as if it's 412 00:24:40,080 --> 00:24:43,840 Speaker 1: evidence in their favor, and then as soon as he 413 00:24:43,920 --> 00:24:48,040 Speaker 1: sees them and reacts to them, suddenly it becomes evidence 414 00:24:48,080 --> 00:24:50,239 Speaker 1: in his favor, because he's like, are you looking at 415 00:24:50,240 --> 00:24:52,320 Speaker 1: the same picture I'm looking at? Because it's clearly not me. 416 00:24:53,000 --> 00:24:54,639 Speaker 1: And it's funny. You can kind of hear in the 417 00:24:54,680 --> 00:25:00,199 Speaker 1: officer's voice, well, well, we have facial recognition, as 418 00:25:00,200 --> 00:25:02,680 Speaker 1: if, you know, as if that somehow, you know, bolsters 419 00:25:02,720 --> 00:25:05,640 Speaker 1: his case that this is accurate, as if 420 00:25:05,800 --> 00:25:08,159 Speaker 1: that's going to change his mind that the picture that 421 00:25:08,200 --> 00:25:10,800 Speaker 1: he's looking at is actually him when he knows it's 422 00:25:10,800 --> 00:25:15,400 Speaker 1: not him. As if the phrase facial recognition is itself 423 00:25:15,600 --> 00:25:19,320 Speaker 1: kind of evidence of the police being correct. 424 00:25:19,520 --> 00:25:22,320 Speaker 2: You know, he holds a picture up to his face, like, look, 425 00:25:23,400 --> 00:25:25,720 Speaker 2: of course, this isn't me. What are you talking about? 426 00:25:26,520 --> 00:25:29,520 Speaker 2: Forget the software, look at me.
The thing that hits 427 00:25:29,520 --> 00:25:32,360 Speaker 2: you about this interaction is that when the cop finally 428 00:25:32,400 --> 00:25:34,600 Speaker 2: looks at the picture, you could hear the room go 429 00:25:34,720 --> 00:25:38,120 Speaker 2: quiet for a second, and nobody's arguing with Robert Williams, 430 00:25:38,480 --> 00:25:41,280 Speaker 2: and it kind of sounds like he's agreeing, like it's 431 00:25:41,320 --> 00:25:44,119 Speaker 2: the first time he's actually looking at it, like he 432 00:25:44,200 --> 00:25:47,080 Speaker 2: had so much confidence in the machine that he didn't 433 00:25:47,119 --> 00:25:48,600 Speaker 2: bother to look with his own eyes. 434 00:25:49,240 --> 00:25:52,359 Speaker 1: They by and large are not telling even the people 435 00:25:52,359 --> 00:25:55,639 Speaker 1: that they identify and arrest using these tools that they 436 00:25:55,760 --> 00:26:01,000 Speaker 1: used them, and that's leading to people either finding out 437 00:26:01,080 --> 00:26:05,000 Speaker 1: kind of in offhand ways in interrogation that, oh, well, 438 00:26:05,400 --> 00:26:07,399 Speaker 1: you know, the computer picked you. Well, what do you 439 00:26:07,400 --> 00:26:09,200 Speaker 1: mean by the computer? A lot of times, they're probably 440 00:26:09,200 --> 00:26:12,560 Speaker 1: not finding out at all, which is concerning because one 441 00:26:12,560 --> 00:26:15,880 Speaker 1: of the pillars of our courts and our justice system 442 00:26:16,400 --> 00:26:19,280 Speaker 1: is that you need to be able to face your accuser. 443 00:26:20,000 --> 00:26:25,000 Speaker 1: And so if your accuser is this algorithm, this computer program, 444 00:26:25,600 --> 00:26:28,280 Speaker 1: but you're not even being told that it was used, 445 00:26:28,600 --> 00:26:31,479 Speaker 1: let alone given any of the details about how it works. 446 00:26:31,640 --> 00:26:34,280 Speaker 1: So all of these things are being kind of kept 447 00:26:34,400 --> 00:26:37,520 Speaker 1: vague or sometimes completely kept hidden from the people that 448 00:26:37,560 --> 00:26:40,760 Speaker 1: these tools are being used to investigate and ultimately arrest. 449 00:26:41,440 --> 00:26:43,560 Speaker 2: So what does this mean for the future of policing, 450 00:26:44,080 --> 00:26:47,080 Speaker 2: and how should we be using these technologies, if at all? 451 00:26:47,800 --> 00:27:07,040 Speaker 2: That's after the break. People are being investigated and arrested 452 00:27:07,320 --> 00:27:09,960 Speaker 2: and not being told that police are using these facial 453 00:27:10,000 --> 00:27:13,160 Speaker 2: recognition tools to find them. You don't need a law 454 00:27:13,200 --> 00:27:16,160 Speaker 2: degree to feel that something about that is just kind 455 00:27:16,200 --> 00:27:19,800 Speaker 2: of off, and you wouldn't be wrong. Here in the US, 456 00:27:19,960 --> 00:27:22,720 Speaker 2: there's a specific set of rules that might apply here, 457 00:27:25,440 --> 00:27:29,240 Speaker 2: the Brady rules. I want to talk about that. So 458 00:27:29,400 --> 00:27:33,639 Speaker 2: can we talk about how Brady rules can potentially 459 00:27:33,840 --> 00:27:38,320 Speaker 2: figure into an arrest that is made with AI facial 460 00:27:38,359 --> 00:27:40,840 Speaker 2: recognition as a part of it, or as the sole 461 00:27:40,920 --> 00:27:41,440 Speaker 2: part of it?
462 00:27:41,840 --> 00:27:47,800 Speaker 1: Yeah. So basically, when you prosecute somebody, the prosecution is 463 00:27:47,960 --> 00:27:52,800 Speaker 1: required by the courts and required by the Constitution to 464 00:27:52,920 --> 00:27:56,639 Speaker 1: share any evidence that speaks to the guilt or the 465 00:27:56,640 --> 00:28:00,399 Speaker 1: innocence of the person that they're prosecuting, so they're required to 466 00:28:00,440 --> 00:28:03,720 Speaker 1: share that with the defense. There's a question now the 467 00:28:03,800 --> 00:28:08,120 Speaker 1: courts are grappling with, which is, somebody's identification by 468 00:28:08,160 --> 00:28:11,160 Speaker 1: an AI software, should that be brought into the courtroom? 469 00:28:11,640 --> 00:28:14,879 Speaker 1: And should the defendant be given a chance 470 00:28:15,440 --> 00:28:19,159 Speaker 1: to know everything about that software? And right now this 471 00:28:19,280 --> 00:28:21,840 Speaker 1: is playing out in courtrooms across the country. 472 00:28:21,560 --> 00:28:25,680 Speaker 2: Basically putting the AI, in a manner of speaking, putting 473 00:28:25,680 --> 00:28:29,520 Speaker 2: the AI on the witness stand and saying, how reliable 474 00:28:29,520 --> 00:28:30,920 Speaker 2: are you really? Yeah. 475 00:28:31,480 --> 00:28:34,280 Speaker 1: A very good indicator of how that's going to go 476 00:28:34,560 --> 00:28:40,840 Speaker 1: is the companies themselves have disclaimers saying this does not 477 00:28:40,920 --> 00:28:43,520 Speaker 1: hold up in court. Clearview AI is the one that 478 00:28:43,560 --> 00:28:46,280 Speaker 1: I'm very familiar with. They have language in all of 479 00:28:46,320 --> 00:28:49,680 Speaker 1: their contracts with police departments saying this is not admissible 480 00:28:49,720 --> 00:28:53,040 Speaker 1: evidence in court. And also basically, please don't ask us 481 00:28:53,080 --> 00:28:55,280 Speaker 1: to come into court, because nobody from Clearview AI is 482 00:28:55,320 --> 00:28:57,840 Speaker 1: going to come and sit in court and defend this software. 483 00:28:58,200 --> 00:29:00,760 Speaker 1: The very companies who make it and advertise it and 484 00:29:00,800 --> 00:29:04,080 Speaker 1: market it as this great, amazing technology tool, they would 485 00:29:04,120 --> 00:29:05,320 Speaker 1: not even stand behind it. 486 00:29:05,760 --> 00:29:09,480 Speaker 2: In your research, in your reporting, you've seen what look 487 00:29:09,560 --> 00:29:12,480 Speaker 2: like, and what defendants are saying are, wrongful arrests based on AI. 488 00:29:13,640 --> 00:29:18,080 Speaker 2: For the police who made these arrests or who made 489 00:29:18,120 --> 00:29:22,640 Speaker 2: those decisions, have there been any repercussions, any discipline, any 490 00:29:22,680 --> 00:29:25,800 Speaker 2: guidance for them that you've seen? 491 00:29:27,000 --> 00:29:29,880 Speaker 1: I mean, the answer is no, as far as I 492 00:29:29,920 --> 00:29:35,920 Speaker 1: can tell, the individual officers have not publicly faced any punishment, 493 00:29:35,960 --> 00:29:38,560 Speaker 1: although you know, you never know what happens behind closed doors. 494 00:29:39,000 --> 00:29:43,040 Speaker 1: Typically police unions forbid police from publicly talking about punishment 495 00:29:43,840 --> 00:29:47,840 Speaker 1: of individuals.
Now, we've seen settlements paid out of three 496 00:29:47,920 --> 00:29:49,680 Speaker 1: hundred thousand dollars to a few of these people who 497 00:29:49,680 --> 00:29:51,800 Speaker 1: were wrongfully arrested. Is this going to change the 498 00:29:51,840 --> 00:29:55,520 Speaker 1: behavior of these individuals? There are only six states that have 499 00:29:55,800 --> 00:30:01,520 Speaker 1: laws mandating specific disclosures and mandating specific things about how 500 00:30:01,560 --> 00:30:04,880 Speaker 1: these tools cannot be used. So, you know, in the vast 501 00:30:04,880 --> 00:30:07,480 Speaker 1: majority of the country, police are kind of still just 502 00:30:07,560 --> 00:30:09,240 Speaker 1: figuring it out. It's really early days. 503 00:30:10,120 --> 00:30:12,840 Speaker 2: So just to back up here, we've been talking about 504 00:30:12,840 --> 00:30:17,120 Speaker 2: the problems with this technology, but what about the upsides? And 505 00:30:17,160 --> 00:30:20,800 Speaker 2: it's interesting, because if you ask people how facial recognition 506 00:30:20,880 --> 00:30:23,880 Speaker 2: could be used to catch criminals, there aren't really any 507 00:30:23,960 --> 00:30:27,480 Speaker 2: high-profile cases we could point to. And before you 508 00:30:27,520 --> 00:30:31,280 Speaker 2: say January sixth, well, that's complicated. 509 00:30:31,920 --> 00:30:34,840 Speaker 1: A lot of people credit facial recognition with helping to 510 00:30:34,920 --> 00:30:39,000 Speaker 1: identify these people and to bring them to justice. 511 00:30:39,520 --> 00:30:43,160 Speaker 1: And yes, facial recognition did play a role. However, on 512 00:30:43,280 --> 00:30:45,760 Speaker 1: that day, there was a lot of evidence being collected 513 00:30:45,800 --> 00:30:48,400 Speaker 1: through social media posts that these people were making on 514 00:30:48,440 --> 00:30:50,760 Speaker 1: their own, videos that these people were taking and posting. 515 00:30:51,040 --> 00:30:54,120 Speaker 1: I think the federal investigators even used high-tech 516 00:30:54,200 --> 00:30:59,000 Speaker 1: methods to grab the cellular location data of everybody who 517 00:30:59,040 --> 00:31:00,760 Speaker 1: was in the Capitol that day, and I think they 518 00:31:00,800 --> 00:31:02,360 Speaker 1: were able to identify a lot of people that way. 519 00:31:02,480 --> 00:31:05,280 Speaker 1: So facial recognition in that case was one of many 520 00:31:05,320 --> 00:31:07,880 Speaker 1: tools that were used, which is sort of how it's 521 00:31:08,040 --> 00:31:10,120 Speaker 1: designed to be used, you know, not as kind 522 00:31:10,160 --> 00:31:14,640 Speaker 1: of the sole investigative lead, but as one of several. So, 523 00:31:14,800 --> 00:31:17,320 Speaker 1: you know, other than that, there aren't that many well-known 524 00:31:17,400 --> 00:31:19,920 Speaker 1: cases out there that we can point to of, 525 00:31:20,040 --> 00:31:23,840 Speaker 1: you know, this was the thing that brought down, you know, 526 00:31:23,880 --> 00:31:25,520 Speaker 1: a murder case that we've been trying to solve for 527 00:31:25,600 --> 00:31:27,480 Speaker 1: many years. But I bet we will start to hear 528 00:31:27,520 --> 00:31:29,800 Speaker 1: about some of that, and I would love to kind 529 00:31:29,840 --> 00:31:32,320 Speaker 1: of have more of those case studies we can pick apart. 530 00:31:32,640 --> 00:31:34,880 Speaker 1: I think the public deserves to know about both.
The 531 00:31:34,920 --> 00:31:37,920 Speaker 1: public should also be aware of the good and the 532 00:31:37,960 --> 00:31:41,600 Speaker 1: benefit, because as we kind of make laws or think 533 00:31:41,600 --> 00:31:44,400 Speaker 1: about regulating how to rein this in, it's going to 534 00:31:44,400 --> 00:31:47,880 Speaker 1: be important to understand that balance and to get it right. 535 00:31:48,360 --> 00:31:52,120 Speaker 1: I think that there's a danger in rushing to a 536 00:31:52,160 --> 00:31:55,840 Speaker 1: blanket ban on this technology. 537 00:31:56,840 --> 00:31:59,360 Speaker 2: I want to play something for you really quick. This 538 00:31:59,400 --> 00:32:02,680 Speaker 2: is a really, I think, powerful moment from your podcast, 539 00:32:03,680 --> 00:32:05,800 Speaker 2: and I was hoping I could have you speak 540 00:32:05,840 --> 00:32:06,840 Speaker 2: to this a little bit here. 541 00:32:07,760 --> 00:32:11,520 Speaker 5: They lean on the technology because I think we are 542 00:32:11,600 --> 00:32:13,800 Speaker 5: taught that computers don't make mistakes. 543 00:32:13,920 --> 00:32:14,480 Speaker 4: Humans do. 544 00:32:15,680 --> 00:32:18,040 Speaker 2: Now I'm sitting there like, how can you do this? 545 00:32:18,640 --> 00:32:19,640 Speaker 3: You can't do this! 546 00:32:19,960 --> 00:32:22,440 Speaker 1: Like, no way, how can you just put these charges 547 00:32:22,480 --> 00:32:22,800 Speaker 1: on them? 548 00:32:22,960 --> 00:32:24,400 Speaker 3: And I'm telling you that that's not me. 549 00:32:25,000 --> 00:32:27,160 Speaker 2: I know I wasn't in this. So how do I 550 00:32:27,280 --> 00:32:28,120 Speaker 2: beat a machine? 551 00:32:29,760 --> 00:32:29,840 Speaker 3: So? 552 00:32:30,520 --> 00:32:32,440 Speaker 2: I mean, I guess my question here is, 553 00:32:34,000 --> 00:32:36,720 Speaker 2: these, I'm just gonna call them AI arrests, how do 554 00:32:36,800 --> 00:32:40,720 Speaker 2: these arrests affect the people who are being accused? 555 00:32:41,320 --> 00:32:44,600 Speaker 1: That was a clip from our episode of Post Reports, 556 00:32:44,600 --> 00:32:49,240 Speaker 1: and we heard these patterns of experience which were very notable. 557 00:32:49,280 --> 00:32:52,120 Speaker 1: So, a few things. Wrongful arrests are not a new thing. 558 00:32:52,160 --> 00:32:55,080 Speaker 1: The technology did not bring about wrongful arrests, but, 559 00:32:55,120 --> 00:32:59,360 Speaker 1: you know, usually there's some other incidental relationship 560 00:32:59,520 --> 00:33:03,160 Speaker 1: somebody has to a crime before they get into 561 00:33:03,160 --> 00:33:06,719 Speaker 1: a wrongful arrest. In these cases, the AI is plucking 562 00:33:06,760 --> 00:33:09,040 Speaker 1: you out of thin air in some cases. And in 563 00:33:09,120 --> 00:33:11,920 Speaker 1: one case, this man, Quran Reid, he had never actually 564 00:33:11,920 --> 00:33:13,800 Speaker 1: even been to the state of Louisiana, and he 565 00:33:13,840 --> 00:33:17,560 Speaker 1: was a resident of Atlanta, so he's literally been plucked from 566 00:33:17,600 --> 00:33:19,920 Speaker 1: the other side of the country and brought into this 567 00:33:20,000 --> 00:33:23,719 Speaker 1: crime that he had literally no proximity to whatsoever.
So, 568 00:33:24,000 --> 00:33:26,720 Speaker 1: what does that do to the human brain when 569 00:33:26,720 --> 00:33:30,320 Speaker 1: you are just suddenly, poof, inside of a criminal investigation 570 00:33:30,480 --> 00:33:34,120 Speaker 1: that you have no connection to whatsoever? It's like, baseline 571 00:33:34,280 --> 00:33:35,920 Speaker 1: is a word that came up over and over. A 572 00:33:35,960 --> 00:33:38,000 Speaker 1: lot of people just kind of got stuck on that idea, 573 00:33:38,040 --> 00:33:40,240 Speaker 1: even when they sat in jail for one or two 574 00:33:40,360 --> 00:33:44,479 Speaker 1: or three weeks or months, or, in the case of Chris Gatlin, 575 00:33:44,480 --> 00:33:47,000 Speaker 1: who was in jail for seventeen months. Some of them 576 00:33:47,040 --> 00:33:49,640 Speaker 1: had this feeling of, and I think you just heard 577 00:33:49,680 --> 00:33:52,120 Speaker 1: it in the clip, if the computer is telling you 578 00:33:52,360 --> 00:33:54,160 Speaker 1: that I did it, then how am I going to 579 00:33:54,160 --> 00:33:57,560 Speaker 1: convince you otherwise? How am I going to beat a computer? 580 00:33:57,840 --> 00:33:59,320 Speaker 1: Because, you know, it goes back to this thing that 581 00:33:59,320 --> 00:34:04,239 Speaker 1: we were talking about before, this automation bias. You know, 582 00:34:04,280 --> 00:34:08,960 Speaker 1: he'd never heard of that term, but he instinctively realized 583 00:34:09,000 --> 00:34:12,200 Speaker 1: that he was going against a power that was much 584 00:34:12,239 --> 00:34:14,760 Speaker 1: greater than him, and in some ways probably much greater 585 00:34:14,840 --> 00:34:17,319 Speaker 1: than if a witness had picked you for a crime. 586 00:34:17,360 --> 00:34:20,240 Speaker 1: It's like, a computer picked you for a crime 587 00:34:20,320 --> 00:34:22,960 Speaker 1: and put your name on the line and put your 588 00:34:23,040 --> 00:34:24,120 Speaker 1: name into this investigation. 589 00:34:24,280 --> 00:34:27,319 Speaker 2: So it's our belief in the computer, that's what you're 590 00:34:27,400 --> 00:34:27,960 Speaker 2: up against. 591 00:34:28,200 --> 00:34:30,000 Speaker 1: You know, many of the people that we talked to, 592 00:34:30,080 --> 00:34:32,520 Speaker 1: and the cases that are public, many of them feel 593 00:34:32,560 --> 00:34:36,520 Speaker 1: like they kind of were the lucky ones. And maybe 594 00:34:36,560 --> 00:34:39,640 Speaker 1: they got out because one of them had a mole 595 00:34:39,800 --> 00:34:42,920 Speaker 1: on his face and his lawyer discovered that the guy 596 00:34:42,719 --> 00:34:46,040 Speaker 1: in the surveillance image didn't have a mole. Or there 597 00:34:46,080 --> 00:34:48,960 Speaker 1: was one woman, Porcha Woodruff, who was eight months pregnant 598 00:34:49,000 --> 00:34:51,399 Speaker 1: at the time she was arrested, and you know, there 599 00:34:51,440 --> 00:34:53,960 Speaker 1: was nothing in the interviews with the victims that said 600 00:34:54,000 --> 00:34:56,239 Speaker 1: there's a pregnant woman involved. So that's ultimately one of 601 00:34:56,280 --> 00:34:59,240 Speaker 1: the reasons that she got out. But are there many 602 00:34:59,320 --> 00:35:02,000 Speaker 1: other people who didn't get lucky and might be 603 00:35:02,000 --> 00:35:05,600 Speaker 1: behind bars still because facial recognition, you know, wrongfully 604 00:35:05,640 --> 00:35:06,160 Speaker 1: put them there?
605 00:35:06,640 --> 00:35:08,680 Speaker 2: I mean, when you put it like that, you start 606 00:35:08,760 --> 00:35:11,640 Speaker 2: to make me think that maybe we're not hearing horror stories. 607 00:35:12,480 --> 00:35:13,960 Speaker 2: Those are the success stories. 608 00:35:14,680 --> 00:35:18,120 Speaker 1: This was pretty horrible for these people, and not just for them. 609 00:35:18,480 --> 00:35:20,720 Speaker 1: In some of the cases, you know, their children, 610 00:35:20,800 --> 00:35:23,760 Speaker 1: their young children, watched them get arrested on their front lawns, 611 00:35:24,400 --> 00:35:25,320 Speaker 1: and that's trauma. 612 00:35:25,560 --> 00:35:25,840 Speaker 3: Yeah. 613 00:35:26,120 --> 00:35:27,480 Speaker 1: One of the men that we've talked 614 00:35:27,480 --> 00:35:29,880 Speaker 1: about before, Robert Williams, who was arrested in Detroit. You know, 615 00:35:29,960 --> 00:35:32,279 Speaker 1: his kids watched him get arrested on the front lawn 616 00:35:32,560 --> 00:35:36,320 Speaker 1: and he says, to this day, they still talk about 617 00:35:36,360 --> 00:35:39,120 Speaker 1: that incident. One of his daughters, and this is kind 618 00:35:39,120 --> 00:35:41,200 Speaker 1: of heartbreaking, every once in a 619 00:35:41,200 --> 00:35:44,520 Speaker 1: while comes up to him and says, Daddy, I think 620 00:35:44,560 --> 00:35:47,279 Speaker 1: I've solved it. I found the real man who went 621 00:35:47,320 --> 00:35:49,920 Speaker 1: and stole those watches, and she pointed to like a 622 00:35:49,960 --> 00:35:52,560 Speaker 1: cartoon character as the one that was 623 00:35:52,600 --> 00:35:54,680 Speaker 1: the real guy who stole the watches and not Daddy. 624 00:35:54,840 --> 00:35:55,600 Speaker 2: Oh my god. 625 00:35:55,760 --> 00:35:57,919 Speaker 1: So these, I mean, these experiences are now baked into 626 00:35:57,960 --> 00:36:00,080 Speaker 1: their lives and their childhoods. And, you know, 627 00:36:00,360 --> 00:36:02,719 Speaker 1: I'm a parent of small kids, and that's a very 628 00:36:02,719 --> 00:36:06,200 Speaker 1: sad thing. And yes, they were lucky, probably luckier than 629 00:36:06,280 --> 00:36:09,760 Speaker 1: other people whose stories we don't know about, but also 630 00:36:09,840 --> 00:36:12,440 Speaker 1: these are horrible experiences. 631 00:36:14,120 --> 00:36:16,480 Speaker 2: I'm not sure what y'all think about this. I'm still 632 00:36:16,480 --> 00:36:19,040 Speaker 2: trying to figure that out myself, and I know a 633 00:36:19,080 --> 00:36:22,800 Speaker 2: lot of what we've heard sounds pretty terrifying. So for 634 00:36:22,880 --> 00:36:25,080 Speaker 2: one of my last questions, I asked Doug what he 635 00:36:25,200 --> 00:36:27,080 Speaker 2: thought the next few years were going to look like 636 00:36:27,160 --> 00:36:30,759 Speaker 2: with this technology, and he answered in a way that 637 00:36:30,800 --> 00:36:31,920 Speaker 2: I didn't really expect.
638 00:36:33,560 --> 00:36:38,000 Speaker 1: We're still at a point of fascination with this technology, 639 00:36:38,719 --> 00:36:42,480 Speaker 1: and in places around the country, maybe not in the 640 00:36:42,480 --> 00:36:44,440 Speaker 1: big cities, where, I mean, you do see kind of 641 00:36:44,440 --> 00:36:49,880 Speaker 1: a skepticism of tech tools and concern about privacy and surveillance, 642 00:36:50,280 --> 00:36:53,000 Speaker 1: but in many of the places in this country, you know, 643 00:36:53,000 --> 00:36:56,279 Speaker 1: in smaller towns around the country, police are adopting these 644 00:36:56,320 --> 00:36:59,319 Speaker 1: things and are excited and there's a fascination, and they're 645 00:36:59,320 --> 00:37:03,000 Speaker 1: being sold to the public as this kind of magical tool, 646 00:37:03,320 --> 00:37:08,320 Speaker 1: like in Evansville, Indiana, in Florence, Kentucky, in Saint John's County, Florida, 647 00:37:08,560 --> 00:37:12,600 Speaker 1: these smaller town places where, you know, the local population 648 00:37:12,719 --> 00:37:15,480 Speaker 1: may not know that the police have started to like 649 00:37:15,600 --> 00:37:19,320 Speaker 1: rely on these tools quite a bit. So a simple 650 00:37:19,360 --> 00:37:21,640 Speaker 1: goal for my reporting has just been to kind 651 00:37:21,680 --> 00:37:24,480 Speaker 1: of encourage people to ask questions, to go to 652 00:37:24,520 --> 00:37:27,160 Speaker 1: your city council meeting. You'll actually learn a lot. You 653 00:37:27,200 --> 00:37:28,880 Speaker 1: don't have to sit on the sidelines of this discussion 654 00:37:28,880 --> 00:37:31,600 Speaker 1: because you're part of it. You are a person that 655 00:37:31,719 --> 00:37:34,200 Speaker 1: is in front of these surveillance cameras too. So now 656 00:37:34,280 --> 00:37:37,560 Speaker 1: there's a risk of you, of everybody in this country, 657 00:37:37,600 --> 00:37:40,359 Speaker 1: getting pulled into an investigation due to, you know, a 658 00:37:40,360 --> 00:37:43,440 Speaker 1: misuse of this technology. So everybody should think 659 00:37:43,440 --> 00:37:46,480 Speaker 1: of themselves as being part of this discussion about whether 660 00:37:46,560 --> 00:37:48,480 Speaker 1: and how we use this technology going forward. 661 00:38:00,120 --> 00:38:03,040 Speaker 2: Thank you so much for listening to our first episode 662 00:38:03,200 --> 00:38:07,040 Speaker 2: of the new kill Switch, and a very special thanks 663 00:38:07,080 --> 00:38:10,240 Speaker 2: to our friends over at the Washington Post podcast Post Reports, 664 00:38:10,440 --> 00:38:12,640 Speaker 2: who helped us with this episode. You can check out 665 00:38:12,640 --> 00:38:15,440 Speaker 2: more of Doug's reporting over there, and stay with us. We 666 00:38:15,520 --> 00:38:18,200 Speaker 2: have so much more ahead. This show is all about 667 00:38:18,239 --> 00:38:22,040 Speaker 2: what's happening right now between humans, between machines, how we're 668 00:38:22,080 --> 00:38:24,759 Speaker 2: being shaped by our own creations, and a whole lot 669 00:38:24,800 --> 00:38:27,360 Speaker 2: beyond that.
So let us know what you think, and 670 00:38:27,640 --> 00:38:29,520 Speaker 2: even if there's something you want us to cover, you 671 00:38:29,520 --> 00:38:32,960 Speaker 2: can hit us up at kill Switch at Kaleidoscope dot NYC, 672 00:38:33,560 --> 00:38:35,840 Speaker 2: or you could hit me at dex Digi on the 673 00:38:35,920 --> 00:38:39,239 Speaker 2: Gram or Blue Sky if that's more your thing. Kill 674 00:38:39,239 --> 00:38:42,719 Speaker 2: Switch is hosted by me, Dexter Thomas. It's produced by 675 00:38:42,719 --> 00:38:47,160 Speaker 2: Shena Ozaki, Daryl Potts, and Kate Osborne. Our theme song 676 00:38:47,280 --> 00:38:50,920 Speaker 2: is by Kyle Murdoch, who also mixed the show. From Kaleidoscope, 677 00:38:50,960 --> 00:38:54,920 Speaker 2: our executive producers are Oz Woloshyn, Mangesh Hattikudur, and 678 00:38:55,120 --> 00:38:59,400 Speaker 2: Kate Osborne. From iHeart, our executive producers are Katrina Norvell 679 00:38:59,600 --> 00:39:02,920 Speaker 2: and Nick Etur. That's it for me, catch you on the 680 00:39:03,000 --> 00:39:11,480 Speaker 2: next one.