Hey, welcome to Stuff to Blow Your Mind, a production of iHeartRadio's HowStuffWorks. Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and we're back with part three of our journey through facial recognition. In the first episode, we talked a lot about the current tech landscape, a company focusing on facial recognition, and some issues with that. In the last episode we focused primarily on the biological domain of facial recognition, and now we're bringing it back to technology to finish up today. So one of the things we were looking at coming into today's episode was an article I thought was really good in Wired magazine, from this month, January of 2020, by Shaun Raviv, called "The Secret History of Facial Recognition." And I will say I was surprised to find out how far back facial recognition projects go. This goes into the nineteen sixties. Yeah, this is a really good read, this article. It's extremely well written. I would say it has a very narrative flow, beginning with this scene in which an elderly researcher, who at this point I believe is confined to a wheelchair, is instructing his son to unlock some old, rotting files from the sixties and burn them in front of him in a garbage can in the garage or something. Yeah, and you can tell there are things like classified information, top secret or what have you, on the documents. So it's a wonderful, ominous start to this article, which of course deals predominantly with the origins of facial recognition and touches on other at times inspiring and at other times creepy and devastating scientific programs that were in full swing during the sixties.
Well, yeah, one of the things that really comes home in this article is that the creepiness of facial recognition technology is not new; it's sort of been there since the very beginning. It's not one of these things where, as even the author here mentions, like social media, it seems great at first, and it's not until it's, quote, "in the wild" that we begin to realize, oh yeah, this is civilization-wrecking awfulness and not just a fun way to share photos. No, at the time there was a realization that this was potentially problematic. Yeah. And so it might not come as a big surprise, especially given the story we mentioned about burning documents, that some of the earliest funding for facial recognition research clearly came from the CIA and from front companies set up to funnel CIA money. Yeah, this was super interesting, the CIA funding through these various phony companies. And due to the CIA funding, some of the stuff was secret. Some of the image material has only come out due to Freedom of Information Act filings, and a lot of the work was just never published. To drive home the creepiness, though, one of the companies involved, Panoramic, was also tied to other programs, including Project MK-Ultra. It was one of eighty organizations that worked on Project MK-Ultra, in particular, quote, subprojects 93 and 94, on the study of bacterial and fungal toxins and the "remote directional control of activities of selected species of animals." Animal control. Yeah, I was like, what if I want to sic tigers on you from the other side of the world? Yeah.
MK-Ultra, just to remind everybody, was a CIA project that explored the potential for mind control using psychedelics and other tactics, basically looking at ways to take these mind-expanding agents and use them to break down the human psyche and then build something back up that could be tightly controlled. Now, the evidence today is that the mind-control experiments of MK-Ultra didn't really work, but they were really good at destroying the human mind. I mean, the project was basically responsible for psychological torture. Just a horrible program and a real blight on the scientific history of the United States. Not the only blight, but I think an appalling one. They were not good. They did not figure out how to rebuild, say, an ideal sleeper agent out of the psychological destruction that they wrought. Yeah. So a lot of this article focuses on one main figure named Woody Bledsoe, who was a founder and leader at this company, Panoramic Research, Incorporated, which you mentioned earlier, and which got a lot of business from CIA and CIA front-organization funding in the sixties to study things like facial recognition. But if you put yourself back in the context of the nineteen sixties, I think one thing that's kind of funny is that people then might not yet have realized how difficult a project recognizing a face would be for a machine, because we had tons of sci-fi going back decades before that where, of course, robots, computers, whatever, just recognize people easily. Yeah, and I think that's mainly because we generally just had a rough idea that a robot can do everything a person can do. A robot is a mechanical person, and you didn't have to think too hard about all the complexities involved there.
I mean, even the best example of nineteen sixties science fiction, and really one of the twentieth and twenty-first centuries' best examples of science fiction, 2001: A Space Odyssey, does reference facial recognition capabilities for HAL, but I wouldn't say it really goes in depth about what that means. But HAL is able to recognize faces, and can even recognize faces in a sketch as opposed to a video feed or a photo. Yeah, and I think if you're not well versed in the computer technology world, it might not be immediately apparent what's so difficult about making a computer recognize a face. But our facial recognition systems, the things going on in our brains, have amazing capabilities, and they're analog, right. A face has all kinds of variables that move around all the time. It can be extremely difficult to reduce a face to a set of numerical values, which is what you need to do in order to have a computer recognize a face. Yeah, as I think we've explored in the previous two episodes, there's a lot going on in facial recognition. There are a number of challenges to it, and it is still not a perfected technology by any means. Yeah, that's true. And so I think it's reasonable to think of Woody Bledsoe and his colleagues as legitimate AI pioneers, even with their work in the nineteen sixties here. Oh, absolutely. Bledsoe was working with a number of highly talented individuals, sometimes on projects involving atomic weaponry, for instance, before he became more focused on AI. Another individual he worked with on facial recognition was Helen Chan Wolf. Wolf was involved in the development of Shakey, a robot that DARPA describes as, quote, "the first mobile robot with enough artificial intelligence to navigate on its own through a set of rooms."
I think somebody at the company also worked on a robot called Mobot, which mowed lawns in a random and unattended pattern. I'm not joking, by the way; that is a thing Raviv mentioned. Yeah, that makes me think of the old Gumby short where the Gumby family has robots doing lawn care and home repair, and they just go berserk and the Gumby family has to hunt them down. I don't think I know that one. Oh, it's good. There was an MST3K riff of it five or eight years back. That sounds horrible. It is. It's horrifying. At the end there's a robot's head on the wall. It's terrifying material. But if you're trying to think, okay, in the sixties, how would you even begin to get a computer to recognize a face across different images, one of the problems, of course, was the lack of existing digitized imagery at the time. Right. I mean, we live in a world where digital imagery is ubiquitous. It was not at all back then; there was almost none of it. Yeah. Today, when you read about facial recognition research, they're often working off of digital databases containing hundreds if not thousands of photographs. And at the time, yeah, how many digital photographs were there in the world? Right. So they actually had to have some kind of digitization process. They had to be able to take a photo and turn it into numbers that could then be interpreted by the machine to try to recognize a person, or, you know, say whether that's the same person or not. And so a lot of these methods involved, number one, recognition rules based on explicit measures, not machine learning. They didn't have the machine learning methods we have today back then. They would have to have explicit rules, like the distance between selected features of the face.
So in this image, what's the distance between the left eyebrow and the right ear, or the left eye and the right corner of the mouth, or something. And furthermore, to get those measures, they often had to resort to what they called a man-machine approach, which paired human work with the computer: initial measurements were made by humans, turned into numerical values, and then the computer would match based on those values, which is quite laborious. And the man-machine approach was apparently necessary to input the measures really until the seventies, when, as Raviv writes, quote, "In 1973, a Japanese computer scientist named Takeo Kanade made a major leap in facial recognition technology. Using what was then a very rare commodity, a database of 850 digitized photographs, taken from the 1970 World's Fair in Suita, Japan, Kanade developed a program that could extract facial features such as the nose, mouth, and eyes without human input." And so by doing that, Kanade was able to finally eliminate the "man" part of the man-machine approach, the human measuring and input of the values. But throughout this period there were still huge problems with machine recognition of faces. They would sometimes get to the point where the system developed by Panoramic would be more efficient than humans at matching faces under ideal conditions. So if you could get all the faces oriented the same way, looking right into the camera and all that, the machine would be better than humans at matching photos. But if you just pollute the imagery a little bit and make people turn their heads or change the lighting, et cetera, any kind of problems like that, suddenly the machine loses all its advantages and the humans are better again.
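To make that measurement-based idea a little more concrete, here is a minimal sketch in Python of the general approach as just described: hand-marked landmark coordinates get reduced to a vector of pairwise distances, and two photos are compared by how far apart those vectors are. The landmark names, coordinate values, normalization by eye-to-eye distance, and match threshold are all illustrative assumptions, not a reconstruction of Panoramic's or Kanade's actual programs.

import math

# Hypothetical hand-marked landmark coordinates (in pixels) for two photos.
# In the 1960s man-machine approach, a human operator supplied measurements
# like these; the machine only did the arithmetic and the matching.
photo_a = {"left_eye": (112, 140), "right_eye": (178, 142),
           "nose_tip": (146, 190), "mouth_left": (120, 228),
           "mouth_right": (170, 230)}
photo_b = {"left_eye": (98, 151), "right_eye": (161, 150),
           "nose_tip": (130, 198), "mouth_left": (105, 236),
           "mouth_right": (152, 239)}

def feature_vector(landmarks):
    # Reduce the face to numbers: every pairwise distance between landmarks,
    # normalized by the eye-to-eye distance so overall image scale drops out.
    names = sorted(landmarks)
    eye_dist = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    return [math.dist(landmarks[a], landmarks[b]) / eye_dist
            for i, a in enumerate(names) for b in names[i + 1:]]

def difference(v1, v2):
    # Euclidean distance between the two feature vectors; smaller means the
    # two photos are more likely to show the same person.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(v1, v2)))

score = difference(feature_vector(photo_a), feature_vector(photo_b))
THRESHOLD = 0.15  # illustrative cutoff, not taken from the original research
print(f"difference={score:.3f} -> {'match' if score < THRESHOLD else 'no match'}")

A toy sketch like this also hints at why the early systems fell apart outside ideal conditions: turn the head or change the lighting and the marked points shift, so the distance vector no longer lines up with the one on file.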
Right. And some of the other problems that are still issues today were problems at the time, like depending too much on, say, an all-white male database, or something to that effect, where you just don't have a broad enough sample of human appearances to really have a robust facial recognition system. Right. If it's not trained on humans generally, it's not going to work on humans generally. And so, yeah, the racial bias problems that show up in existing facial recognition technologies today were basically there right from the beginning. But towards the end of his article, Raviv writes, quote, "Only in the past 10 years or so has facial recognition started to become capable of dealing with real-world imperfection, says Anil K. Jain, a computer scientist at Michigan State University and co-editor of the Handbook of Face Recognition. Nearly all of the obstacles that Woody encountered, in fact, have fallen away." And one of the big ones they point to is not just the fact that you can get digital images now, but that you can get so many of them that it provides these vast databases for machine learning, data sets for training neural networks and so on. That's right, you can basically just crawl any given social media site. We mentioned a few already, and we'll mention a few different ones as we proceed here. Well, maybe we should pivot to talk about the ways facial recognition technology is being used today. There was one thing I was looking at, a piece in The New York Times, an opinion piece by a guy named Bruce Schneier, who is a fellow at the Harvard Kennedy School, that was making a point about facial recognition that I'm not sure I fully agree with the framing of.
But in writing this article, he articulates something that I think is important and clarifying for us to remember as we go on. So, to first acknowledge his main point, he writes, quote, "Facial recognition bans are the wrong way to fight against modern surveillance. Focusing on one particular identification method misconstrues the nature of the surveillance society we're in the process of building." So he's arguing that facial recognition is just one technology among many, and banning it doesn't necessarily stop all of these other surveillance technologies from doing effectively the same thing. And we can talk later about why I'm probably not really convinced by this argument. But the point that I think is clarifying is that he says what we call facial recognition is not just one act; it's at least three different acts, each coming with its own challenges. Quote, "In all cases, modern mass surveillance has three broad components: identification, correlation, and discrimination." So, of course, identification is just the first step: it recognizes who you are. Correlation then associates that ID with other information about you, and then finally discrimination treats you differently because of that ID or because of the associated information. Now, this in and of itself is, of course, something humans are perfectly capable of carrying out, all three of these tasks, without the aid of machines. But what we're talking about here is that it would be automated. It would be something happening by default to everybody, as opposed to something that might take place, perhaps even laboriously, in specific scenarios, such as once you've been flagged at a security checkpoint or something like that, and they ask you to pull out your ID and then look you up, to correlate you with other information, and then maybe treat you differently based on whatever they find out.
Whereas we're dealing with a scenario here where this would all be happening in real time, perhaps with very little delay. And I think most important would be the scale and pervasiveness: it would be happening everywhere, all the time. We know from experience how quickly new digital technology pervades all spaces. So basically, one of his arguments is that these are three broad components that it might be easier to regulate individually, as opposed to saying, let's not do facial recognition; instead, maybe we go after each of these three things. I think that's not necessarily a bad idea, to think about a framework for overall regulation of surveillance and preservation of privacy. I mean, ultimately, I think I agree that it's important to better understand and regulate the entire process, recognizing the three different components of identification, correlation, and discrimination individually. I just don't think it makes a lot of sense to frame this as an argument against banning or regulating facial recognition, because to me, that sounds kind of like saying, well, international treaties banning the use of the smallpox virus as a weapon of war miss the point that we need to rethink our entire concept of war and defense and regulate international conflict in a more comprehensive way. I mean, yes, that would be true. But if you don't know when and whether that full, comprehensive thing is going to be accomplished, and you do currently have a consensus to ban the use of germ warfare, why wouldn't you do it? Absolutely. Here's another quote from that article, though, that I was really taken by, quote: "The point is that it doesn't matter which technology is used to identify people. That there currently is no comprehensive database of heartbeats or gaits doesn't make the technologies that gather them any less effective. And most of the time it doesn't matter if identification isn't tied to a real name.
What's important is that we can be consistently identified over time." And then on top of that, once your real name is attached to that information, there's no turning back, right? The system is building a picture of you, move by move, purchase by purchase, search by search, over the course of years, if not decades. Yeah. And of course, because it's the Internet, what kind of system are we talking about? I would argue that it is a dystopia a little bit less like Nineteen Eighty-Four and more like Terry Gilliam's Brazil, where there's no one person in charge that you can rebel against. No one person seems to understand how it all works or call all the shots. It is a terrible will emanating from the void, expressed through millions of antlike functionaries, each doing its bidding without being bidden. I mean, isn't the oppression scarier when there's no boss in charge of it all that you can rebel against? Absolutely. And especially when it's a case like facial recognition, where with a lot of the advancements that have been made, and more specifically a lot of the cases where it has been or is being rolled out, it's often not a situation where people are getting to vote on it or even necessarily having any kind of really broad discussion about it before it takes place. It just sort of happens, and then here we are. So not only is there no key individual in place that you can blame and rebel against, there's not even necessarily a set point in time where you could say, we have to go back and change this, you know, get in a time machine and try to prevent facial recognition from being rolled out. Where do you go? Who do you try to stop? Who do you try to speak to, except the whole world? Another thing I would say, going back to my germ warfare analogy:
I think it's possible that he's not quite right that the different methods of ID'ing you and tracking you are indistinguishable from each other. I think it's possible that facial recognition is an especially insidious and corrupting type of automated identification compared to some other methods, like IDs or credit card numbers or MAC addresses on your phone, because our lives are built around faces; our social existence and our cultures are built around interfacing with faces. It seems like somehow a more dangerous well of information to poison in the culture than saying, well, your phone knows where you are. I mean, you could potentially smash your phone with a hammer. Yeah. And again, I think I've mentioned this already: so many of the things involving facial recognition don't seem threatening on the surface. It seems like, well, we wanted machines to do what people could do, so we've created a way for them to do it. That's what we always wanted. How is that terrifying? And I think part of it is, haven't you always wanted a boss who can watch you every minute of the day? Well, I mean, yes, when you get to specific examples of it. But I think one of the keys is that when we're talking about automating this sort of thing, the machine version of it, we're talking about a version that exists on a scale and a scope that is beyond human. And that's the thing: we're not talking about a human capability anymore. We're talking about an inhuman capability, the same way that we've talked about consciousness before, and we sort of posed the question, well, is part of consciousness the limitation of what we can focus our attention on? Is part of the key to being human the limitations on what our brains can do, on what our senses can do?
And we're talking about models that are not limited by those senses or those computational resources. Ah, beautifully put. And I think you're exactly right. I mean, if you could be conscious of everything, I'm not sure you would be conscious anymore. I'm not sure you'd be a person. Yeah. You know, this is something that far more specialized people than myself probably have better arguments about, but yeah, to what extent would a super-consciousness not be a consciousness? It would be beyond what we think of as a consciousness. Okay, we need to take a quick break, but when we come back, we will discuss how facial recognition technology is already being used in several countries around the world. All right, we're back. So let's look at what is just going to be kind of a snapshot of a few different stories covering facial recognition as it is being rolled out right now, as of this recording at the tail end of January. So for the first couple of sources we're looking at, there's an opinion piece by Frederike Kaltheuner in The Guardian titled "Facial recognition cameras will put us all in an identity parade," and this piece also refers in places to reporting from the same month and publication by Vikram Dodd, a police and crime correspondent for The Guardian. So basically, London's Metropolitan Police announced their intention to launch live facial recognition cameras in London, a city already known for mass surveillance via its CCTV system. It's a move condemned by civil liberties groups as, quote, a "breathtaking assault on human rights." The police, however, claimed that 80 percent of people surveyed back the move, and that the cameras would only be used to catch violent criminals and find missing people. They also stressed that it will be properly posted and rolled out with clarity and transparency, and only following outreach in affected communities.
Now, we've kind of talked about this already. If you stand by the notion that, okay, I trust the government as it exists now, I trust whatever model it will take in the future, I trust the keepers of this information and my personal information, and I don't have anything to hide anyway, then okay, I guess you can easily get on board with something like that. But a lot of folks have issues with this rollout in particular, as well as with facial recognition technology in general. For starters, this comes as the European Union is considering a temporary ban on facial recognition, and of course it's also occurring as the UK continues to extract itself from the EU. Also, the only independent review of the Met's facial recognition public trials, by one Pete Fussey of the University of Essex, found that it was verifiably accurate in just nineteen percent of cases, as opposed to the, I think it was something like a seventy percent, success rate that the Met police were claiming. Yeah, I mean, in the first episode we talked about how the company Clearview AI claimed that they found correct matches up to seventy percent of the time, but we've definitely read reviews that placed the correct numbers way, way lower. Right. And the other thing is, there's no opt-out here. Even though they're talking about targeted uses of it, the technology is not targeted. All faces that are at all scannable are scanned, and therefore everyone is in a virtual criminal lineup whenever they're in the sights of the camera. This is a quote from that opinion piece, quote: "Given that the system inevitably processes the biometric data of everyone, live facial recognition has the potential to fundamentally change the power relationship between people and the police, and even alter the very meaning of public space."
So the standard criticisms come up as well, including misidentification, and not only misidentification but automated misidentification, which the author charges shifts the burden of proof onto the falsely recognized individual, because there's no human accuser. Right, it's just that the machine said you did this, and it's up to you to prove that you are not that face, that that face is not yours. And I want to stress that this is problematic even if the success rate were not so low, because obviously the technology is improving, and that is not the answer; that is part of the problem. No, I'm more scared when the success rate gets verifiably higher, because then the trust in it just keeps going up and up, and the fewer cases where people still get misidentified, which is always going to continue happening to some degree, are going to be more and more, you know, terrifying. Those people are going to be more and more under the knife. Right. I mean, again, it changes what a public space is. A public space then means I can't go to the park without being in a virtual crime lineup; I am always going to be profiled as a potential criminal. So yeah, it's a good piece, worth reading. Kaltheuner sums this up by saying, look, the stakes are very high with this kind of invasive technology, and while these rollouts are not without their critics, obviously we've been talking about the various criticisms, they argue that we still haven't really seen them subjected to true public debate. The technology is happening faster than the awareness of it is. Exactly. Yeah.
Yeah. And I would point out another thing about it, which is just that, okay, if you suddenly put a pause on this kind of technology and say we can't use it, and then you put it under review, and you review and review and review, and you're sure in the end that, okay, the benefits definitely outweigh the harms, then you could still release it in the future. But I think you can't really go the other way. Once you live in the facial recognition world, that animal is not getting back in the cage. Right. Yeah, once you've eroded the norms of privacy, how do you go back? You quickly enter an age where people who don't want to be part of the database are going to be seen as the outsiders, the fringe cases, the people who live out in a hut in the woods. And perhaps that would also be because they would be effectively ostracized from so many of these systems. Yeah. And again, we're mainly talking about state uses or police uses or whatever, but there's a whole other world to consider of distributed public use of facial recognition systems, which could be hugely socially important and could lead to new types of social ostracism and stuff. Yeah, or situations where, oh well, I'm opting out of facial recognition technology; oops, looks like that's going to make it really difficult for me to pay my bills, because I need my face for identification. We'll get to some examples of that currently in use here in a bit. Or how about, and this might sound petty, but imagine a world where any time somebody invites you to do something and you say no, I can't, they can also very likely find out wherever you were at the time you couldn't do the thing they invited you to do.
Yeah, I was just having a conversation with my wife about smartphone technology, about how I had recently been to a concert venue, and I'm not going to name the organization that was handling the concert, but you had to have a mobile entry ticket. As far as I know, and I could be wrong on this, my understanding was you had to have a smartphone version of your ticket to get in, which just kind of blew me away. I was like, what if you don't have a smartphone? Not everyone has one. Does this mean only smartphone people are getting into this thing? And my wife had a similar situation with a parking deck where you had to have an app on your phone to get in. So you can easily imagine this kind of scenario extrapolated to facial recognition: oh, you're not opting in on facial recognition? Well, I'm sorry, you can't come to this concert, because you have to have a face scan to get in, and your face will of course be scanned while you're in the venue by the security systems. But the UK, of course, is by far not the only place that's already trying to roll out some kind of public form of facial recognition technology. I've read about uses in China. I've read about uses in Russia. Yeah, even in Russia, though, civil rights activists are criticizing facial recognition technology as a threat to privacy and human rights, specifically for the technology's ability to, say, identify individuals at protests, store them in a database, and then track them. Right, you could easily be put on the government's undesirables list. Here's a quote from Natalia Zviagina, Amnesty International's Russia director, quoted in the January BBC article "Russia's use of facial recognition challenged in court."
Quote: "Facial recognition technology is by nature deeply intrusive, as it enables the widespread and bulk monitoring, collection, storage, and analysis of sensitive personal data without individualized reasonable suspicion." Again, everybody's in the criminal lineup. Yeah, you don't have to do anything in particular to arouse suspicion of yourself; just being in public, that's enough. According to that BBC article, Moscow has about a hundred and sixty thousand CCTV cameras in operation in the city, and this month they expanded the number using facial recognition, with no explanation of how privacy and human rights would be ensured, or, I mean, whether they would be ensured at all, because it seems like the basic system would be anathema to that. The BBC article also pointed out that the Moscow Department of Information Technology has reportedly signed a deal with the Russian firm NtechLab to provide the needed technology, and this firm had previously rolled out the FindFace app, which used data from a website that is often referred to as Russia's Facebook. It is not actually a Facebook website, but it's often held up as the equivalent. Yeah, there are a lot of these other countries' Facebooks. And then there's China to consider. So China has also rolled out facial recognition, and there are a number of different uses and implementations that you can find all over the place: in train stations, airports, stores, hotels, gated communities, and more. I think I had seen allegations that it had been used in attacks on the Uighur communities. Basically, yes, that's a big one, the alleged tracking of ethnic minorities in China using these systems.
But you 562 00:31:13,080 --> 00:31:15,640 Speaker 1: also find it in other places, like, it's used 563 00:31:15,680 --> 00:31:20,760 Speaker 1: for things that are almost comical but also troubling by 564 00:31:20,800 --> 00:31:24,520 Speaker 1: just how tedious and small stakes they seem, such as, 565 00:31:25,320 --> 00:31:27,600 Speaker 1: you know, using it to catch toilet tissue thieves at 566 00:31:27,680 --> 00:31:31,200 Speaker 1: bathrooms. Um, some of what I was reading about 567 00:31:31,200 --> 00:31:33,640 Speaker 1: it came from a New York Times article by Amy 568 00:31:33,800 --> 00:31:36,400 Speaker 1: Qin, that's Q-I-N, in case 569 00:31:36,440 --> 00:31:38,360 Speaker 1: you want to look it up. I'm 570 00:31:38,400 --> 00:31:40,400 Speaker 1: not gonna mention the title of the article because it 571 00:31:40,440 --> 00:31:42,200 Speaker 1: will spoil the fun of what I'm about to share. 572 00:31:42,960 --> 00:31:46,400 Speaker 1: But anyway, in this she described China as, 573 00:31:46,480 --> 00:31:51,040 Speaker 1: quote, a country accustomed to surveillance, which I think is 574 00:31:51,080 --> 00:31:53,360 Speaker 1: an interesting way of looking at it, because I 575 00:31:53,400 --> 00:31:55,920 Speaker 1: wonder if China's case in some way 576 00:31:55,960 --> 00:31:58,800 Speaker 1: provides a model of where some of these other countries 577 00:31:58,840 --> 00:32:01,320 Speaker 1: we've mentioned could be headed very soon. Like, 578 00:32:01,480 --> 00:32:06,080 Speaker 1: given a culture that is very 579 00:32:06,120 --> 00:32:12,760 Speaker 1: broadly predisposed to being okay with surveillance, 580 00:32:13,040 --> 00:32:15,200 Speaker 1: it might provide a future look at where we're 581 00:32:15,200 --> 00:32:18,000 Speaker 1: going in these other countries. For example, you 582 00:32:18,040 --> 00:32:20,200 Speaker 1: see quite a bit of facial recognition used to do 583 00:32:20,280 --> 00:32:23,400 Speaker 1: stuff like open your phone or make a payment, something 584 00:32:23,440 --> 00:32:26,080 Speaker 1: that the South China Morning Post points out has been 585 00:32:26,120 --> 00:32:29,760 Speaker 1: disrupted by the current coronavirus outbreak. So in the wake 586 00:32:29,800 --> 00:32:33,040 Speaker 1: of this current global health threat, which has impacted China 587 00:32:33,120 --> 00:32:36,680 Speaker 1: the most thus far, lots of people are wearing surgical masks, 588 00:32:37,320 --> 00:32:39,760 Speaker 1: and they were already in many cases wearing these masks 589 00:32:39,760 --> 00:32:43,719 Speaker 1: in high numbers due to pollution concerns. But the software 590 00:32:43,800 --> 00:32:47,000 Speaker 1: that is used in many of these instances is apparently 591 00:32:47,000 --> 00:32:50,600 Speaker 1: struggling to deal with such little facial information, requiring other 592 00:32:50,680 --> 00:32:55,040 Speaker 1: biometric data in order to still ID an individual. 593 00:32:55,960 --> 00:32:59,320 Speaker 1: So Qin's article goes into this, about how people 594 00:32:59,320 --> 00:33:02,800 Speaker 1: basically have to make a choice between being able 595 00:33:02,840 --> 00:33:06,880 Speaker 1: to easily make payments at stores using facial recognition or 596 00:33:07,080 --> 00:33:11,240 Speaker 1: feeling like they are adequately protected against a dangerous illness.
597 00:33:11,320 --> 00:33:13,520 Speaker 1: This is a great example of something that I was 598 00:33:13,520 --> 00:33:15,960 Speaker 1: searching for an example of later in the episode. So 599 00:33:16,040 --> 00:33:18,720 Speaker 1: we'll have to remember this later. Now, I mentioned something 600 00:33:18,960 --> 00:33:21,320 Speaker 1: kind of fun but also still scary, in that it 601 00:33:21,440 --> 00:33:24,840 Speaker 1: seems small stakes. And that is the central 602 00:33:25,000 --> 00:33:28,200 Speaker 1: point of Qin's recent article, and that's the use of 603 00:33:28,240 --> 00:33:32,880 Speaker 1: facial recognition technology to root out quote uncivilized behavior 604 00:33:33,160 --> 00:33:37,120 Speaker 1: in Anhui Province in eastern China. So using facial 605 00:33:37,120 --> 00:33:41,160 Speaker 1: recognition data, she writes, the Urban Management 606 00:33:41,200 --> 00:33:46,680 Speaker 1: Department of Suzhou published surveillance photos of individuals engaged in 607 00:33:47,160 --> 00:33:51,360 Speaker 1: said uncivilized behavior with partial exposure of their names. So 608 00:33:51,400 --> 00:33:54,880 Speaker 1: like a public shaming or public outing, kind of like saying, 609 00:33:54,920 --> 00:33:57,520 Speaker 1: look at these people, her name is, you know, 610 00:33:57,560 --> 00:34:00,560 Speaker 1: Amy or whatever, here they are engaged in 611 00:34:00,920 --> 00:34:03,480 Speaker 1: uncivilized behavior, they should not do this. What was the 612 00:34:03,520 --> 00:34:06,200 Speaker 1: uncivilized behavior? I know, you would think, okay, is 613 00:34:06,200 --> 00:34:11,320 Speaker 1: it defacing, you know, a public place? Stealing the toilet 614 00:34:11,320 --> 00:34:14,239 Speaker 1: paper even, as low stakes as that seems? No, it's 615 00:34:14,239 --> 00:34:17,120 Speaker 1: even lower stakes. It's the wearing of full-length pajamas 616 00:34:17,160 --> 00:34:21,239 Speaker 1: in public. What? Yeah. So, I mean, I've seen people 617 00:34:21,280 --> 00:34:25,239 Speaker 1: wear pajamas in public. I am currently wearing clothes as 618 00:34:25,239 --> 00:34:27,839 Speaker 1: I record this. I'm wearing clothes that I also sleep in. 619 00:34:28,320 --> 00:34:30,759 Speaker 1: So I think the last time you came to my 620 00:34:30,800 --> 00:34:32,640 Speaker 1: house to let me borrow a book, I answered the 621 00:34:32,680 --> 00:34:35,160 Speaker 1: door in pajamas and didn't realize until you were there. 622 00:34:35,480 --> 00:34:38,000 Speaker 1: I'm sorry, man. No, no, you don't have 623 00:34:38,080 --> 00:34:41,200 Speaker 1: to apologize, because this is the universal truth. Qin also 624 00:34:42,120 --> 00:34:45,879 Speaker 1: describes it thus in her article: pajamas are comfortable. They're 625 00:34:46,000 --> 00:34:49,840 Speaker 1: very comfortable, and public pajama wearing is common in China, 626 00:34:49,920 --> 00:34:53,759 Speaker 1: especially among older women and especially in cold months and regions. 627 00:34:54,520 --> 00:34:56,680 Speaker 1: And to add another level to the injustice, it is the 628 00:34:56,719 --> 00:34:59,360 Speaker 1: sort of thing that is praised as fashionable when celebrities 629 00:34:59,400 --> 00:35:01,799 Speaker 1: do it, but gets this sort of backlash when it's 630 00:35:01,840 --> 00:35:04,640 Speaker 1: common people.
And it's been a point of contention 631 00:35:05,360 --> 00:35:07,680 Speaker 1: for a little while here, with government officials trying 632 00:35:07,719 --> 00:35:10,640 Speaker 1: to ban it in some places, but the people pushing 633 00:35:10,680 --> 00:35:12,400 Speaker 1: back and saying, no, you know, this is a 634 00:35:12,440 --> 00:35:15,040 Speaker 1: bridge too far. I want to wear my comfy pajamas, 635 00:35:15,040 --> 00:35:16,799 Speaker 1: and if I want to wear them outside of my home, 636 00:35:16,840 --> 00:35:19,879 Speaker 1: I will do so. It's not uncivilized to do this. 637 00:35:20,320 --> 00:35:23,080 Speaker 1: And in this case, it's a rare 638 00:35:23,080 --> 00:35:25,960 Speaker 1: example of people in China pushing back against facial 639 00:35:26,000 --> 00:35:30,600 Speaker 1: recognition technology in a place, again, where the technology has 640 00:35:30,600 --> 00:35:35,320 Speaker 1: already been highly established and is recognized and appreciated 641 00:35:35,680 --> 00:35:38,279 Speaker 1: for the things that it makes easier in life, such 642 00:35:38,320 --> 00:35:41,799 Speaker 1: as making payments when you're purchasing something at the store. Yeah, 643 00:35:41,840 --> 00:35:45,160 Speaker 1: I mean, the process there is so clear suddenly, how, 644 00:35:45,239 --> 00:35:48,319 Speaker 1: like, you are lured in with convenience, you know, you're 645 00:35:48,400 --> 00:35:52,160 Speaker 1: lured in with immediate concrete benefits, and then there are 646 00:35:52,200 --> 00:35:57,000 Speaker 1: these consequences that just come later. Right. Yeah, and again, 647 00:35:57,000 --> 00:35:59,879 Speaker 1: it also shows you what happens when 648 00:36:01,160 --> 00:36:03,839 Speaker 1: the government or the police, you know, change their mind 649 00:36:03,840 --> 00:36:06,720 Speaker 1: about how specific they need to be with the laws 650 00:36:06,760 --> 00:36:09,440 Speaker 1: that they readily enforce. You know, it's 651 00:36:09,480 --> 00:36:11,960 Speaker 1: a similar case to what we often encounter 652 00:36:12,000 --> 00:36:14,759 Speaker 1: with, say, traffic cameras. Like, how are we supposed to 653 00:36:14,800 --> 00:36:17,440 Speaker 1: feel about it? Like, yes, we don't want people just, you know, 654 00:36:17,600 --> 00:36:19,919 Speaker 1: going through traffic running every red light. There needs 655 00:36:19,920 --> 00:36:22,759 Speaker 1: to be order, and I guess, you know, there 656 00:36:22,760 --> 00:36:24,839 Speaker 1: needs to be this idea that, yes, 657 00:36:24,880 --> 00:36:27,000 Speaker 1: if you break these rules, you will, you know, you 658 00:36:27,040 --> 00:36:29,520 Speaker 1: could be pulled over, you get a ticket, there could 659 00:36:29,520 --> 00:36:31,480 Speaker 1: be some sort of a punishment in place. But when 660 00:36:31,480 --> 00:36:35,359 Speaker 1: you have this situation where any transgression, you know, 661 00:36:35,440 --> 00:36:37,840 Speaker 1: even the smallest thing, can be 662 00:36:37,920 --> 00:36:41,600 Speaker 1: met with an automated fine, you know, that gets 663 00:36:41,600 --> 00:36:45,439 Speaker 1: into an area that feels more like dystopia than order. Yeah, 664 00:36:45,480 --> 00:36:47,279 Speaker 1: it certainly can.
I mean, again, it's one of those 665 00:36:47,280 --> 00:36:49,560 Speaker 1: things where it starts to get harder to argue with. 666 00:36:49,600 --> 00:36:52,640 Speaker 1: Like, how do you argue with the machine? The camera 667 00:36:52,719 --> 00:36:55,080 Speaker 1: said you were speeding and you're like, I wasn't speeding. 668 00:36:55,400 --> 00:36:57,759 Speaker 1: What do you do? Yeah. So anyway, these examples, I think 669 00:36:57,800 --> 00:37:00,480 Speaker 1: they just help 670 00:37:00,480 --> 00:37:03,160 Speaker 1: provide a more nuanced idea of, like, what's 671 00:37:03,160 --> 00:37:06,160 Speaker 1: going on in the world right now with facial recognition, 672 00:37:06,200 --> 00:37:09,880 Speaker 1: and what the many pain points are, and 673 00:37:09,920 --> 00:37:12,919 Speaker 1: where people are fighting back against it. Yeah. 674 00:37:12,960 --> 00:37:15,840 Speaker 1: So I think now we better understand how the 675 00:37:15,880 --> 00:37:18,480 Speaker 1: technology is being deployed. Of course, in the first episode 676 00:37:18,840 --> 00:37:21,040 Speaker 1: we talked about how it's being deployed even in the 677 00:37:21,120 --> 00:37:24,640 Speaker 1: United States. The question is, how should we react? Like, 678 00:37:24,680 --> 00:37:27,480 Speaker 1: what can we do? And I think maybe it 679 00:37:27,480 --> 00:37:31,520 Speaker 1: would be best to first talk about individual countermeasures and 680 00:37:31,520 --> 00:37:34,600 Speaker 1: then come back to broader action after that. Alright, so 681 00:37:34,680 --> 00:37:37,880 Speaker 1: individual countermeasures, something that an individual person can do 682 00:37:38,080 --> 00:37:42,479 Speaker 1: in the face of facial recognition technology. Yes. So a 683 00:37:42,520 --> 00:37:45,920 Speaker 1: bunch of the existing knowledge about how to confuse and 684 00:37:45,960 --> 00:37:49,480 Speaker 1: confound facial recognition tech was summarized in a really good 685 00:37:49,560 --> 00:37:52,160 Speaker 1: article I was reading in Wired by Elise Thomas 686 00:37:52,239 --> 00:37:55,800 Speaker 1: from February of twenty nineteen called How to Hack Your Face 687 00:37:55,880 --> 00:37:59,880 Speaker 1: and Dodge the Rise of Facial Recognition Tech. And unfortunately, 688 00:38:00,080 --> 00:38:02,319 Speaker 1: Thomas points out that the best way to foil a 689 00:38:02,360 --> 00:38:06,320 Speaker 1: facial recognition system is to know what method of facial 690 00:38:06,360 --> 00:38:10,520 Speaker 1: recognition is being used. Thomas quotes a privacy advocate named 691 00:38:10,560 --> 00:38:13,239 Speaker 1: Lily Ryan, who says, quote, you really need to know 692 00:38:13,360 --> 00:38:16,600 Speaker 1: what's under the hood to know what's most likely to work. 693 00:38:16,920 --> 00:38:19,160 Speaker 1: And it can be very hard for the average person 694 00:38:19,200 --> 00:38:22,000 Speaker 1: to know what kind of facial recognition is being used 695 00:38:22,040 --> 00:38:25,120 Speaker 1: on them at any particular time. So it's worth pausing 696 00:38:25,160 --> 00:38:28,920 Speaker 1: to look at different kinds of rebellion against the recognizing machine. 697 00:38:29,560 --> 00:38:31,480 Speaker 1: I think the first one is the simpler one.
That 698 00:38:31,680 --> 00:38:35,759 Speaker 1: first would be an anonymity defense, some method of 699 00:38:35,800 --> 00:38:40,760 Speaker 1: making your actual identity unrecognizable and presenting as an unknown, 700 00:38:40,880 --> 00:38:45,600 Speaker 1: unscannable person. This would essentially be trying to become faceless. Yeah, exactly. 701 00:38:46,040 --> 00:38:48,680 Speaker 1: The second path would go beyond that into what are 702 00:38:48,960 --> 00:38:53,400 Speaker 1: called presentation attacks in some of the literature. For example, 703 00:38:53,480 --> 00:38:56,960 Speaker 1: there's an article that's linked by Thomas, by Raghavendra 704 00:38:57,080 --> 00:39:02,240 Speaker 1: Ramachandra and Christoph Busch, called Presentation Attack Detection Methods 705 00:39:02,280 --> 00:39:07,160 Speaker 1: for Face Recognition Systems: A Comprehensive Survey, published in 706 00:39:07,280 --> 00:39:10,120 Speaker 1: ACM Computing Surveys. And so this is a 707 00:39:10,160 --> 00:39:13,840 Speaker 1: survey of known presentation attacks, also known as direct attacks 708 00:39:13,960 --> 00:39:17,440 Speaker 1: or spoof attacks. The authors write, quote, the goal of 709 00:39:17,440 --> 00:39:21,440 Speaker 1: a presentation attack is to subvert the face recognition system 710 00:39:21,480 --> 00:39:26,040 Speaker 1: by presenting a facial biometric artifact. So if you go 711 00:39:26,200 --> 00:39:28,759 Speaker 1: in front of a facial recognition scanner and you hold 712 00:39:28,840 --> 00:39:31,520 Speaker 1: up a picture of Nicolas Cage, or you wear a 713 00:39:31,640 --> 00:39:35,840 Speaker 1: Richard Nixon mask or something, you are conducting a presentation attack. 714 00:39:36,160 --> 00:39:39,400 Speaker 1: You're not just trying not to be recognized as yourself, 715 00:39:39,680 --> 00:39:43,960 Speaker 1: but actively trying to be recognized as someone else. Common 716 00:39:43,960 --> 00:39:47,320 Speaker 1: methods here would include, like, presenting a photo of someone 717 00:39:47,600 --> 00:39:50,239 Speaker 1: to a scanner, which has actually worked in a lot 718 00:39:50,280 --> 00:39:53,840 Speaker 1: of cases, playing a video of the target face, 719 00:39:54,000 --> 00:39:57,040 Speaker 1: or wearing a three D mask of somebody else's head. 720 00:39:57,840 --> 00:40:00,360 Speaker 1: All of these methods have had some success, but 721 00:40:00,440 --> 00:40:05,040 Speaker 1: the researchers here describe presentation attack detection algorithms, or 722 00:40:05,200 --> 00:40:09,959 Speaker 1: PAD algorithms, that are countermeasures against the countermeasures. Now, 723 00:40:10,360 --> 00:40:12,680 Speaker 1: a question you might be wondering is, like, well, why 724 00:40:12,760 --> 00:40:14,960 Speaker 1: would you want to present as someone else instead of 725 00:40:15,000 --> 00:40:17,879 Speaker 1: just being anonymous? Well, I mean, there are all 726 00:40:17,880 --> 00:40:19,880 Speaker 1: sorts of reasons for that. I mean, there's certainly nefarious 727 00:40:19,880 --> 00:40:21,839 Speaker 1: reasons for that. If you get into a situation where, 728 00:40:21,920 --> 00:40:25,680 Speaker 1: say, facial recognition is required to enter a gated community, well, then 729 00:40:25,719 --> 00:40:28,520 Speaker 1: if you wanted to break into said gated community, it would, 730 00:40:28,840 --> 00:40:31,520 Speaker 1: it would behoove you to have another person's face to wear.
731 00:40:31,600 --> 00:40:34,160 Speaker 1: Perhaps, you know, printed out. Exactly right, that could 732 00:40:34,160 --> 00:40:36,480 Speaker 1: be the direct reason. Maybe you want access to a 733 00:40:36,520 --> 00:40:41,240 Speaker 1: location or a device, and access is granted to specific 734 00:40:41,280 --> 00:40:45,279 Speaker 1: people based on facial recognition, so you use an authorized 735 00:40:45,320 --> 00:40:48,960 Speaker 1: person's face in order to get in. But I can 736 00:40:49,000 --> 00:40:54,120 Speaker 1: also see another idea, which is perhaps rampant presentation attacks 737 00:40:54,760 --> 00:40:58,560 Speaker 1: could be an effective method for fighting the facial recognition 738 00:40:58,600 --> 00:41:01,640 Speaker 1: reign of terror. Because it would not just deny these 739 00:41:01,760 --> 00:41:04,759 Speaker 1: data-collecting systems the data they want. 740 00:41:05,080 --> 00:41:07,960 Speaker 1: It would go further and gum up the databases with 741 00:41:08,040 --> 00:41:12,560 Speaker 1: lots of confusing, incorrect information, which might in fact make 742 00:41:12,600 --> 00:41:17,279 Speaker 1: them less useful overall. Yeah. I mean, another application here 743 00:41:17,320 --> 00:41:20,920 Speaker 1: that is awful to think about is the use of, 744 00:41:21,920 --> 00:41:24,320 Speaker 1: you know, we were talking about sort of the automated guilt machine 745 00:41:24,520 --> 00:41:28,359 Speaker 1: that could exist with facial recognition technology. Say you had 746 00:41:28,400 --> 00:41:31,279 Speaker 1: it in for somebody, you know, you're mad at, you know, 747 00:41:31,840 --> 00:41:34,799 Speaker 1: a coworker or a schoolmate. Then you go and you 748 00:41:34,880 --> 00:41:39,000 Speaker 1: do something illegal with a mask of that person's face. 749 00:41:39,400 --> 00:41:41,600 Speaker 1: You know, not enough to, say, send them away forever, 750 00:41:41,680 --> 00:41:45,399 Speaker 1: but enough to, you know, cause 751 00:41:45,440 --> 00:41:47,000 Speaker 1: them a lot of grief in the short term, at 752 00:41:47,040 --> 00:41:49,040 Speaker 1: the very least. I mean, you don't know how good 753 00:41:49,120 --> 00:41:51,359 Speaker 1: their defense would be. I mean, maybe it would 754 00:41:51,400 --> 00:41:53,359 Speaker 1: send them away forever. I mean, I don't know how 755 00:41:53,440 --> 00:41:55,840 Speaker 1: much faith the criminal justice system is going to end 756 00:41:55,920 --> 00:41:58,720 Speaker 1: up putting in the verdicts of these machines. I wouldn't 757 00:41:58,760 --> 00:42:01,440 Speaker 1: be surprised if it's too much. We've seen that before. 758 00:42:01,480 --> 00:42:05,120 Speaker 1: We've certainly seen models of that in some of the 759 00:42:05,200 --> 00:42:09,480 Speaker 1: past episodes where we've discussed forensic science. Yeah, exactly. Yeah, 760 00:42:09,640 --> 00:42:11,600 Speaker 1: there are a lot of methods of forensic science that 761 00:42:11,680 --> 00:42:17,200 Speaker 1: have been vastly overestimated in their reliability.
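To make it a little more concrete what a presentation attack is exploiting, here is a minimal sketch of the matching step such attacks target. This is not code from Thomas's article or from the Ramachandra and Busch survey; it assumes the open source Python face_recognition library, and the file names are hypothetical placeholders. The point is that a bare matcher only compares face embeddings, so without a presentation attack detection step in front of it, a good printed photo of an enrolled face can score as a match just like the live person.

import face_recognition

# "Enroll" an authorized person from a reference photo (hypothetical file name).
enrolled_image = face_recognition.load_image_file("authorized_person.jpg")
enrolled_encoding = face_recognition.face_encodings(enrolled_image)[0]

# What the camera sees at the gate: possibly a live face, possibly a printed
# photo or mask of the authorized person held up to the lens (hypothetical file).
probe_image = face_recognition.load_image_file("camera_frame.jpg")

for probe_encoding in face_recognition.face_encodings(probe_image):
    # compare_faces simply thresholds the distance between 128-dimensional
    # embeddings; nothing here checks whether the face is alive or flat paper.
    is_match = face_recognition.compare_faces(
        [enrolled_encoding], probe_encoding, tolerance=0.6
    )[0]
    distance = face_recognition.face_distance([enrolled_encoding], probe_encoding)[0]
    print(f"match={is_match}, distance={distance:.3f}")

This is the gap the PAD algorithms mentioned above are meant to close: they sit in front of a matcher like this and try to decide whether the thing being presented is a live face at all.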
Now, finally, 762 00:42:17,239 --> 00:42:19,560 Speaker 1: another reason I was thinking it might be useful to 763 00:42:19,719 --> 00:42:22,960 Speaker 1: present as another human instead of just trying to make 764 00:42:23,000 --> 00:42:26,680 Speaker 1: yourself anonymous is it's not hard at all to imagine 765 00:42:26,719 --> 00:42:31,680 Speaker 1: scenarios where being unscannable or being anonymous will itself be 766 00:42:31,760 --> 00:42:36,759 Speaker 1: a problem, will restrict your rights, make you a target, etcetera. Like, 767 00:42:36,840 --> 00:42:41,359 Speaker 1: the anonymity could attract attention rather than discouraging it. So 768 00:42:41,440 --> 00:42:45,000 Speaker 1: the alternative would be to appear as a real scannable person, 769 00:42:45,120 --> 00:42:48,880 Speaker 1: but not yourself. Right. Wearing a faceless mask in public 770 00:42:49,200 --> 00:42:53,319 Speaker 1: is gonna draw more attention than looking like someone else. Yes, 771 00:42:53,520 --> 00:42:55,640 Speaker 1: even if it wasn't, like, that great a mask, 772 00:42:55,760 --> 00:42:58,120 Speaker 1: you know, it would 773 00:42:58,160 --> 00:43:01,640 Speaker 1: still potentially draw less attention. Now, back to Thomas's article. 774 00:43:01,680 --> 00:43:04,000 Speaker 1: Thomas writes that, of course, you know, the simplest method 775 00:43:04,000 --> 00:43:07,719 Speaker 1: of fighting facial recognition is what would normally be called occlusion, 776 00:43:07,960 --> 00:43:11,239 Speaker 1: hiding all or part of your face. But again, this 777 00:43:11,320 --> 00:43:14,200 Speaker 1: is more difficult and more complicated than it sounds. So 778 00:43:14,560 --> 00:43:16,279 Speaker 1: let's say you want to walk around in public with 779 00:43:16,280 --> 00:43:19,319 Speaker 1: your face completely hidden behind a cloth or a zip 780 00:43:19,400 --> 00:43:21,799 Speaker 1: up hood or something. You know, there 781 00:43:21,840 --> 00:43:26,120 Speaker 1: are actually people selling basically backwards hoodies, you know, front 782 00:43:26,200 --> 00:43:29,520 Speaker 1: zip hoodies that cover your entire head. First of all, 783 00:43:29,680 --> 00:43:32,000 Speaker 1: is that legal? In a lot of places, no. In 784 00:43:32,040 --> 00:43:34,560 Speaker 1: a lot of places in, for example, Europe and Canada 785 00:43:34,600 --> 00:43:37,520 Speaker 1: and the US, it's illegal to cover your entire face 786 00:43:37,560 --> 00:43:40,600 Speaker 1: in public. But even if these laws were changed, or 787 00:43:40,640 --> 00:43:42,759 Speaker 1: you're in a place where that's not illegal, is this 788 00:43:42,880 --> 00:43:46,040 Speaker 1: socially feasible? We'll come back to that in a minute. 789 00:43:46,440 --> 00:43:49,560 Speaker 1: But okay, let's say you decided it is not practical 790 00:43:49,719 --> 00:43:53,080 Speaker 1: to completely occlude your entire face. What if you just 791 00:43:53,160 --> 00:43:57,480 Speaker 1: cover part of your face? Unfortunately, Thomas writes that a 792 00:43:57,480 --> 00:44:00,640 Speaker 1: lot of facial recognition software is good enough now to 793 00:44:00,760 --> 00:44:04,040 Speaker 1: make partial covering of the face ineffective as a defense. 794 00:44:04,560 --> 00:44:08,000 Speaker 1: Quote.
For example, a balaclava, which leaves the 795 00:44:08,000 --> 00:44:11,600 Speaker 1: most important facial features exposed, the eyes, the mouth, the 796 00:44:11,680 --> 00:44:14,520 Speaker 1: nose, may not actually do much to prevent a person 797 00:44:14,560 --> 00:44:17,920 Speaker 1: from being identified. Researchers have found that by using a 798 00:44:18,000 --> 00:44:21,759 Speaker 1: deep learning framework trained on fourteen key facial points, they 799 00:44:21,800 --> 00:44:25,640 Speaker 1: were able to accurately identify partially occluded faces most of 800 00:44:25,719 --> 00:44:30,640 Speaker 1: the time. This includes wearing glasses, scarves, hats, or fake beards. Yeah. 801 00:44:30,680 --> 00:44:32,520 Speaker 1: I mean, when you're getting down to things like 802 00:44:32,600 --> 00:44:35,319 Speaker 1: the measurement between your eyes, you know, stuff like that, 803 00:44:35,400 --> 00:44:37,759 Speaker 1: I mean, it's probably going to be visible in 804 00:44:37,840 --> 00:44:42,520 Speaker 1: most of these facial occlusion methods, and it's not 805 00:44:42,640 --> 00:44:45,239 Speaker 1: something you can easily mess with, you know. I mean, 806 00:44:46,080 --> 00:44:50,560 Speaker 1: short of, like, massive facial injury, I can't think 807 00:44:50,560 --> 00:44:53,440 Speaker 1: of anything much that's going to alter that measurement. Yeah, 808 00:44:53,480 --> 00:44:56,040 Speaker 1: and there are apparently multiple measures like that of a face. 809 00:44:56,120 --> 00:44:58,040 Speaker 1: You know, as long as you want to have your 810 00:44:58,080 --> 00:45:01,319 Speaker 1: eyes and your mouth and stuff exposed, there are 811 00:45:01,360 --> 00:45:03,920 Speaker 1: probably going to be systems, especially in the near future, 812 00:45:04,360 --> 00:45:07,960 Speaker 1: that will be fairly accurate in identifying you. Anyway. Now, another thing, 813 00:45:08,000 --> 00:45:10,480 Speaker 1: as I mentioned a minute ago, there's some evidence that 814 00:45:10,600 --> 00:45:14,000 Speaker 1: three D printed masks based on other people's faces can 815 00:45:14,040 --> 00:45:18,080 Speaker 1: be pretty effective, but they might not remain effective for long. Remember, 816 00:45:18,080 --> 00:45:21,400 Speaker 1: of course, we've got these PAD algorithms, the presentation 817 00:45:21,440 --> 00:45:25,759 Speaker 1: attack detection, and they're developing quickly. And also, I mean, 818 00:45:25,880 --> 00:45:28,719 Speaker 1: is that anything close to practical for regular people? Like, 819 00:45:28,800 --> 00:45:30,840 Speaker 1: it seems more like an option that might be available 820 00:45:30,880 --> 00:45:34,080 Speaker 1: to a professional spy, but not to just somebody trying 821 00:45:34,120 --> 00:45:36,880 Speaker 1: to live their life. Now, there's another interesting method that 822 00:45:37,120 --> 00:45:40,640 Speaker 1: Elise Thomas mentions in her article, which is confusing 823 00:45:40,680 --> 00:45:43,440 Speaker 1: the computer into believing it is not looking at a 824 00:45:43,480 --> 00:45:47,880 Speaker 1: face at all, attacking not the facial recognition stage, but 825 00:45:47,960 --> 00:45:51,480 Speaker 1: the facial detection stage. I hadn't so much thought about that, 826 00:45:51,520 --> 00:45:54,200 Speaker 1: and I think that's a really good point.
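Before getting to that detection-stage idea, here is a rough illustration of the landmark point from the balaclava discussion just above. The study quoted used its own deep learning framework trained on fourteen key facial points; this sketch is only an analogy using dlib's standard 68-point landmark predictor, which requires downloading the shape_predictor_68_face_landmarks.dat model file, and the image path is a hypothetical placeholder.

import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

image = cv2.imread("partially_occluded_face.jpg")  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

for rect in detector(gray):
    shape = predictor(gray, rect)
    # Landmark indices 36-47 cover the eyes, 27-35 the nose, and 48-67 the
    # mouth, which are exactly the features a scarf or balaclava leaves exposed.
    points = [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]
    print(f"found a face with {len(points)} landmark points despite the covering")

Because the exposed landmarks are usually enough to measure from, partial occlusion tends to fail as a defense, which is what makes attacking the detection stage instead, as discussed next, such a different approach.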
One solution 827 00:45:54,239 --> 00:45:57,440 Speaker 1: along these lines is widely known as CV dazzle, 828 00:45:57,520 --> 00:46:00,960 Speaker 1: which stands for Computer Vision Dazzle. Dazzle kind of in 829 00:46:01,000 --> 00:46:03,800 Speaker 1: the same sense it's used in, like, a military context. 830 00:46:03,800 --> 00:46:05,760 Speaker 1: You would put dazzle on the side of a ship 831 00:46:06,239 --> 00:46:09,800 Speaker 1: in a context of warfare, which are like different lines 832 00:46:09,880 --> 00:46:12,319 Speaker 1: going in different ways that apparently make it harder to 833 00:46:12,400 --> 00:46:15,200 Speaker 1: look at the ship and determine its speed and its bearing, 834 00:46:15,680 --> 00:46:20,280 Speaker 1: like some of the more lightning-bolty camouflage designs 835 00:46:20,280 --> 00:46:23,640 Speaker 1: one sees. Not necessarily bedazzled, which is the 836 00:46:23,640 --> 00:46:25,640 Speaker 1: first thing that came to my mind, like, just bedazzle 837 00:46:25,719 --> 00:46:29,080 Speaker 1: your face. Well, it's funny, because that does kind 838 00:46:29,120 --> 00:46:31,320 Speaker 1: of come in. But yeah, I think it's based 839 00:46:31,360 --> 00:46:34,080 Speaker 1: on the idea of a visual dazzle, and so what 840 00:46:34,160 --> 00:46:37,279 Speaker 1: it involves for people is altering the appearance of your 841 00:46:37,320 --> 00:46:41,799 Speaker 1: head to stop facial detection algorithms from flagging it as 842 00:46:41,800 --> 00:46:44,200 Speaker 1: a face. And the most common ways to do this 843 00:46:44,239 --> 00:46:48,160 Speaker 1: are with makeup, with hair styling and coloring, with facial 844 00:46:48,200 --> 00:46:51,319 Speaker 1: accessories like hair clips and stick-on rhinestones. So there 845 00:46:51,400 --> 00:46:54,719 Speaker 1: is some bedazzling going on. Robert, I included a few 846 00:46:54,719 --> 00:46:57,279 Speaker 1: examples here for you to look at. These are from 847 00:46:57,280 --> 00:47:01,400 Speaker 1: a website, cvdazzle dot com, which offers some explicit 848 00:47:01,520 --> 00:47:04,439 Speaker 1: style guides that people can use. Oh wow, these are great. 849 00:47:04,480 --> 00:47:07,640 Speaker 1: I mean, these look like futuristic hair and makeup designs 850 00:47:07,640 --> 00:47:10,200 Speaker 1: that you might see in, like, Blade Runner or something. Yes. 851 00:47:10,880 --> 00:47:14,759 Speaker 1: Now, again, here, clearly, the purpose of these designs 852 00:47:15,200 --> 00:47:17,840 Speaker 1: is not just to make your face look different. It 853 00:47:18,000 --> 00:47:20,960 Speaker 1: is to try to make your face look like something 854 00:47:21,000 --> 00:47:23,439 Speaker 1: other than a face, which I can see 855 00:47:23,520 --> 00:47:27,000 Speaker 1: proving potentially difficult, coming back again to that blog 856 00:47:27,040 --> 00:47:29,560 Speaker 1: post that we discussed in one of the earlier episodes 857 00:47:29,840 --> 00:47:34,840 Speaker 1: about trying to fool the Skype facial 858 00:47:34,880 --> 00:47:38,239 Speaker 1: recognition software. Yeah, the background blurring thing. Yeah, yeah. So, 859 00:47:38,400 --> 00:47:41,440 Speaker 1: like, you know, even the stuffed giraffe was getting recognized 860 00:47:41,480 --> 00:47:45,040 Speaker 1: as mostly a face. So it's a 861 00:47:45,080 --> 00:47:48,640 Speaker 1: more difficult challenge than one might think.
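For anyone who wants to picture the distinction being drawn here, this is a minimal sketch of the detection stage on its own. It is not anything the CV dazzle project itself ships; it just assumes OpenCV and its bundled Haar cascade face detector, and the image path is a hypothetical placeholder. CV dazzle styling succeeds when a detector like this stops reporting any face at all, which, as the stuffed giraffe anecdote suggests, is harder than it sounds.

import cv2

# OpenCV ships a pretrained Haar cascade frontal-face detector with the library.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("dazzle_test_selfie.jpg")  # hypothetical test photo
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# The detection stage only asks "is there a face here?", not "whose face is it?"
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"faces detected: {len(faces)}")  # zero would mean the detector was fooled

A detector trained differently, or one working from infrared depth data rather than visible light, may of course not be thrown off by the same pattern, which comes up again a little further on.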
But again, 862 00:47:48,680 --> 00:47:50,840 Speaker 1: these visual examples from cvdazzle dot com I 863 00:47:50,880 --> 00:47:53,560 Speaker 1: think will be very informative. If you can't quite picture it, 864 00:47:53,600 --> 00:47:56,239 Speaker 1: look these up. Yeah, totally. Now, a lot of 865 00:47:56,280 --> 00:48:00,080 Speaker 1: them involve things like sort of different streaks of 866 00:48:00,200 --> 00:48:03,360 Speaker 1: light and dark in the hair and in lines across 867 00:48:03,440 --> 00:48:07,279 Speaker 1: the face in makeup, hair partially covering the face in 868 00:48:07,320 --> 00:48:10,200 Speaker 1: a lot of cases, things stuck on the face that 869 00:48:10,280 --> 00:48:13,200 Speaker 1: kind of make the shape look different. Thomas 870 00:48:13,280 --> 00:48:16,280 Speaker 1: quotes Christoph Busch, one of the authors of that study 871 00:48:16,320 --> 00:48:19,400 Speaker 1: we mentioned earlier, and he says, quote, from an academic 872 00:48:19,440 --> 00:48:23,840 Speaker 1: research perspective, the makeup attack is gaining more attention. However, 873 00:48:24,280 --> 00:48:27,719 Speaker 1: this kind of attack demands good makeup skills in order 874 00:48:27,760 --> 00:48:30,080 Speaker 1: to be successful. And that's a really good point. In 875 00:48:30,160 --> 00:48:33,080 Speaker 1: other words, it's not good enough just to do some 876 00:48:33,160 --> 00:48:36,640 Speaker 1: unusual things to your hair and put different colored makeup 877 00:48:36,680 --> 00:48:39,759 Speaker 1: on at random. The face design needs to be specially 878 00:48:39,840 --> 00:48:43,600 Speaker 1: tailored to obscure and break up specific lines and 879 00:48:43,800 --> 00:48:47,040 Speaker 1: shapes and points on the face in order to make 880 00:48:47,080 --> 00:48:50,120 Speaker 1: it sabotage the detection stage. So if you want 881 00:48:50,120 --> 00:48:53,040 Speaker 1: to know what these specific techniques are, you can find 882 00:48:53,080 --> 00:48:56,279 Speaker 1: guides online from people who study the issue. Now, I 883 00:48:56,560 --> 00:48:59,200 Speaker 1: think this is really cool, and, you know, being 884 00:48:59,280 --> 00:49:02,440 Speaker 1: somebody who's into, I guess, weirdness myself, like, I like 885 00:49:02,600 --> 00:49:05,520 Speaker 1: these styles. Like, if I saw somebody wearing these styles, 886 00:49:05,560 --> 00:49:08,239 Speaker 1: I would think it was cool. But I think 887 00:49:08,280 --> 00:49:10,719 Speaker 1: we should be real and say these styles are, in 888 00:49:10,920 --> 00:49:14,960 Speaker 1: daily life, simply not socially feasible for a lot of people, 889 00:49:15,280 --> 00:49:17,960 Speaker 1: maybe most people. Well, for one thing, socially, they're going 890 00:49:18,000 --> 00:49:20,399 Speaker 1: to have the opposite effect. Instead, you're going to draw 891 00:49:20,440 --> 00:49:23,719 Speaker 1: attention to yourself from humans, even if you are avoiding 892 00:49:23,840 --> 00:49:26,879 Speaker 1: the attention of the machine. Yeah, that's exactly right. And 893 00:49:26,960 --> 00:49:29,480 Speaker 1: in addition to that, a lot of people's friends, families, 894 00:49:29,600 --> 00:49:33,600 Speaker 1: especially workplaces, will probably not be okay with them styling 895 00:49:33,640 --> 00:49:36,680 Speaker 1: their hair and makeup deliberately to make their face look 896 00:49:36,680 --> 00:49:39,799 Speaker 1: as unlike a face as possible.
I'm okay with it 897 00:49:39,840 --> 00:49:42,520 Speaker 1: around here. Yeah, me too. I don't know if the 898 00:49:42,520 --> 00:49:44,680 Speaker 1: bosses would be okay with it. Right. You can easily imagine, like, 899 00:49:44,680 --> 00:49:49,319 Speaker 1: a very conservative or a governmental kind of employer who, 900 00:49:49,360 --> 00:49:52,120 Speaker 1: you know, is gonna have a very firm idea 901 00:49:52,120 --> 00:49:54,520 Speaker 1: of what your hair and your makeup should consist of. 902 00:49:54,840 --> 00:49:57,279 Speaker 1: And again, if you're not allowed to wear, say, pajamas 903 00:49:57,680 --> 00:50:00,799 Speaker 1: in public, I can't imagine this would fly either. And 904 00:50:00,840 --> 00:50:04,480 Speaker 1: there's another complication actually that makes this even more difficult. 905 00:50:04,880 --> 00:50:09,320 Speaker 1: So the CV dazzle method is only effective at fighting 906 00:50:09,360 --> 00:50:13,600 Speaker 1: face detection technologies that rely on visible light, and not 907 00:50:13,840 --> 00:50:17,880 Speaker 1: all do. Thomas cites, for example, Apple's Face ID, 908 00:50:18,080 --> 00:50:22,080 Speaker 1: which actually uses infrared light. The system detects 909 00:50:22,080 --> 00:50:25,520 Speaker 1: sort of more about the underlying contours and, like, bone 910 00:50:25,680 --> 00:50:28,960 Speaker 1: structure of your face, and it is not easily thrown 911 00:50:28,960 --> 00:50:32,120 Speaker 1: off by unusual patterns of light and dark colors. 912 00:50:32,120 --> 00:50:36,399 Speaker 1: CV dazzle wouldn't necessarily affect a system like that, 913 00:50:36,440 --> 00:50:39,120 Speaker 1: though there are other methods you could maybe use. Apparently, 914 00:50:39,160 --> 00:50:41,840 Speaker 1: you might be able to protect your face from infrared 915 00:50:41,880 --> 00:50:45,400 Speaker 1: detection by, for instance, wearing a hat that projects infrared 916 00:50:45,520 --> 00:50:48,839 Speaker 1: light on your face in weird patterns, as demonstrated by 917 00:50:48,840 --> 00:50:52,880 Speaker 1: one study from Chinese and American researchers. But again, like, 918 00:50:53,120 --> 00:50:55,759 Speaker 1: is that realistic, that people would be able to 919 00:50:55,800 --> 00:50:59,640 Speaker 1: do that? Thomas mentions another method that I like: overwhelming 920 00:50:59,719 --> 00:51:03,360 Speaker 1: the system, kind of like a visual denial 921 00:51:03,440 --> 00:51:06,880 Speaker 1: of service attack on facial recognition. The solution here is 922 00:51:06,880 --> 00:51:10,879 Speaker 1: pretty simple. You cover yourself in lots of images of faces, 923 00:51:11,280 --> 00:51:15,160 Speaker 1: shirt, scarf, earrings, all with pictures of faces on them. 924 00:51:15,719 --> 00:51:18,480 Speaker 1: Would this always work? Probably not, but it will be 925 00:51:18,520 --> 00:51:21,879 Speaker 1: effective with some systems. And she ends her article by 926 00:51:21,920 --> 00:51:25,880 Speaker 1: mentioning again, like, major problems with existing facial recognition technology 927 00:51:25,880 --> 00:51:29,280 Speaker 1: that we've already alluded to, like huge numbers of false matches, 928 00:51:29,520 --> 00:51:32,439 Speaker 1: errors that skew along race and gender lines, all kinds 929 00:51:32,440 --> 00:51:35,560 Speaker 1: of problems like that.
She ends up saying, quote, the 930 00:51:35,640 --> 00:51:39,560 Speaker 1: real solution to issues around facial recognition is the tech community 931 00:51:39,600 --> 00:51:42,600 Speaker 1: working with other industries and sectors to strike an appropriate 932 00:51:42,600 --> 00:51:47,160 Speaker 1: balance between security and privacy in public spaces. Which, 933 00:51:47,600 --> 00:51:49,680 Speaker 1: that may be an answer, but I mean, I wonder, 934 00:51:49,840 --> 00:51:52,120 Speaker 1: will that be good enough? In the first episode of 935 00:51:52,160 --> 00:51:55,799 Speaker 1: the series, we discussed several figures who have called for 936 00:51:55,880 --> 00:52:01,480 Speaker 1: either strong regulations or even outright bans on face recognition technology. 937 00:52:01,560 --> 00:52:03,319 Speaker 1: So I think next we should look at that as 938 00:52:03,320 --> 00:52:05,360 Speaker 1: a solution, maybe after we come back from a break. 939 00:52:05,800 --> 00:52:11,719 Speaker 1: All right, we'll be right back. Thank you, thank you. Alright, we're back. 940 00:52:11,760 --> 00:52:14,719 Speaker 1: So we were just talking about the individual approaches 941 00:52:14,760 --> 00:52:17,440 Speaker 1: to fighting facial recognition, like things you could do to 942 00:52:17,680 --> 00:52:23,080 Speaker 1: disrupt your own image and/or potentially fool a facial recognition device. 943 00:52:23,680 --> 00:52:26,680 Speaker 1: But now we're getting more into bans and regulations, the 944 00:52:26,719 --> 00:52:30,960 Speaker 1: broader governmental legislative moves that could be made to keep 945 00:52:31,000 --> 00:52:33,239 Speaker 1: this kind of technology from getting out of control. Yeah. 946 00:52:33,280 --> 00:52:35,000 Speaker 1: And there was one article I was reading that I 947 00:52:35,000 --> 00:52:37,560 Speaker 1: thought was pretty straightforward and made a very good case. 948 00:52:37,600 --> 00:52:41,520 Speaker 1: It was by Evan Selinger and Woodrow Hartzog, published 949 00:52:41,560 --> 00:52:44,920 Speaker 1: in The New York Times on October seventeenth, twenty nineteen. It 950 00:52:45,000 --> 00:52:48,400 Speaker 1: was called, What Happens When Employers Can Read Your Facial Expressions? 951 00:52:48,560 --> 00:52:51,880 Speaker 1: That's a great question. So Selinger is 952 00:52:51,920 --> 00:52:55,520 Speaker 1: a professor of philosophy at the Rochester Institute of Technology, 953 00:52:55,719 --> 00:52:58,040 Speaker 1: and Hartzog is a professor of law and computer 954 00:52:58,080 --> 00:53:01,520 Speaker 1: science at Northeastern University. And they are responding to the 955 00:53:01,520 --> 00:53:03,960 Speaker 1: fact that, of course, many are calling for a 956 00:53:04,000 --> 00:53:07,160 Speaker 1: ban on this technology. And this is one of 957 00:53:07,200 --> 00:53:10,680 Speaker 1: those rare cases left in US politics, at 958 00:53:10,760 --> 00:53:15,600 Speaker 1: least, where there is actually some bipartisan agreement. Apparently, some 959 00:53:15,719 --> 00:53:19,720 Speaker 1: right wing politicians like Jim Jordan have expressed concern. Speaking 960 00:53:19,760 --> 00:53:22,919 Speaker 1: to NPR in July twenty nineteen, he said he thought 961 00:53:22,920 --> 00:53:25,240 Speaker 1: it was time for a time out on this technology 962 00:53:25,280 --> 00:53:27,600 Speaker 1: and that we needed to put safeguards in place before 963 00:53:27,680 --> 00:53:30,799 Speaker 1: we went forward with developing it.
Meanwhile, you've got left 964 00:53:30,800 --> 00:53:34,840 Speaker 1: wing leaders Alexandria Ocasio-Cortez and Bernie Sanders, who have called 965 00:53:34,960 --> 00:53:38,160 Speaker 1: for regulation or bans on facial recognition. I 966 00:53:38,160 --> 00:53:40,640 Speaker 1: think Sanders announced the ban as part of a 967 00:53:40,640 --> 00:53:44,879 Speaker 1: criminal justice reform agenda for his presidential campaign. So all 968 00:53:44,880 --> 00:53:47,880 Speaker 1: over the map, people are throwing up flags and saying 969 00:53:47,920 --> 00:53:50,400 Speaker 1: we should stop, this is scary, we need to 970 00:53:50,440 --> 00:53:53,040 Speaker 1: do something about it. And that at least is reassuring. 971 00:53:53,239 --> 00:53:56,160 Speaker 1: Let's just hope that we can maintain that, you know, 972 00:53:56,160 --> 00:53:59,080 Speaker 1: that there remains bipartisan support, that it doesn't wind up 973 00:53:59,080 --> 00:54:02,000 Speaker 1: politicized one way or the other. But certainly, the 974 00:54:02,040 --> 00:54:05,719 Speaker 1: idea of your face not becoming part of a massive database, 975 00:54:05,800 --> 00:54:08,680 Speaker 1: the idea of not being in a perpetual police lineup, 976 00:54:09,000 --> 00:54:12,440 Speaker 1: I feel like that is going to strike a 977 00:54:12,520 --> 00:54:17,640 Speaker 1: chord with most demographics in America. Well, 978 00:54:17,680 --> 00:54:19,399 Speaker 1: I think it's one of those weird things where there 979 00:54:19,520 --> 00:54:22,400 Speaker 1: is some bipartisan support, but there's also just not nearly 980 00:54:22,480 --> 00:54:26,440 Speaker 1: enough awareness. So, like, a few people on different parts 981 00:54:26,440 --> 00:54:28,840 Speaker 1: of the political spectrum are all sort of in agreement 982 00:54:28,880 --> 00:54:30,920 Speaker 1: about this, like, wait a minute, we need to do something, 983 00:54:31,200 --> 00:54:34,239 Speaker 1: but it's not a lot of people overall. Right, and 984 00:54:34,280 --> 00:54:36,319 Speaker 1: if the only thing you've really heard 985 00:54:36,360 --> 00:54:39,360 Speaker 1: has been, like, a pitch from, say, a company that 986 00:54:39,480 --> 00:54:42,880 Speaker 1: is specializing in this, perhaps with a law enforcement focus, 987 00:54:42,920 --> 00:54:44,480 Speaker 1: you just might think, oh, well, that 988 00:54:44,520 --> 00:54:49,040 Speaker 1: sounds fine, you know, find missing people and stop criminals. Sure, 989 00:54:49,040 --> 00:54:50,759 Speaker 1: I'll sign off on that, those are good things. These 990 00:54:50,800 --> 00:54:53,080 Speaker 1: are good things, but it's not the complete picture. Right. 991 00:54:53,480 --> 00:54:55,799 Speaker 1: So the authors here also note that many 992 00:54:55,880 --> 00:55:00,320 Speaker 1: local governments have already formally restricted their government agencies, including police, 993 00:55:00,400 --> 00:55:03,040 Speaker 1: from using it. So nothing really strong has happened at 994 00:55:03,040 --> 00:55:08,320 Speaker 1: the national level, but, like, San Francisco, Oakland, Berkeley, Somerville, Massachusetts, 995 00:55:08,320 --> 00:55:10,960 Speaker 1: and some other local areas have put a ban 996 00:55:11,080 --> 00:55:14,040 Speaker 1: in place.
So the authors of this piece, Hartzog and 997 00:55:14,120 --> 00:55:17,360 Speaker 1: Selinger, here argue that it is not enough just to 998 00:55:17,560 --> 00:55:21,239 Speaker 1: limit the ways in which facial recognition is used, or 999 00:55:21,280 --> 00:55:24,719 Speaker 1: to, say, restrict government agencies such as police departments from 1000 00:55:24,719 --> 00:55:28,799 Speaker 1: buying these tools. They say that, unfortunately, the only way 1001 00:55:28,840 --> 00:55:32,279 Speaker 1: to actually protect ourselves is to enact a complete and 1002 00:55:32,320 --> 00:55:36,320 Speaker 1: total ban on the technology. Quote, we must ban facial 1003 00:55:36,360 --> 00:55:40,360 Speaker 1: recognition in both public and private sectors before we grow 1004 00:55:40,480 --> 00:55:43,840 Speaker 1: so dependent on it that we accept its inevitable harms 1005 00:55:43,880 --> 00:55:48,759 Speaker 1: as necessary for progress. Perhaps over time, appropriate policies can 1006 00:55:48,800 --> 00:55:52,440 Speaker 1: be enacted that justify lifting a ban, but we doubt it. 1007 00:55:53,040 --> 00:55:55,120 Speaker 1: So I think they make a pretty good case, and 1008 00:55:55,120 --> 00:55:57,080 Speaker 1: we'll get to a little bit more about it 1009 00:55:57,120 --> 00:55:59,200 Speaker 1: in a second. But, you know, you might be wondering, 1010 00:55:59,239 --> 00:56:02,800 Speaker 1: like, if there is actually some bipartisan agreement that facial 1011 00:56:02,880 --> 00:56:06,040 Speaker 1: recognition could be devastating to our basic liberties, like, what's 1012 00:56:06,080 --> 00:56:09,239 Speaker 1: the hold up? You know, what's the problem? And so, 1013 00:56:09,320 --> 00:56:12,080 Speaker 1: of course, the authors note that in general, the United 1014 00:56:12,160 --> 00:56:15,799 Speaker 1: States is very reticent to enact bans on technology, 1015 00:56:15,840 --> 00:56:18,600 Speaker 1: with one of the few counterexamples being various 1016 00:56:18,600 --> 00:56:21,279 Speaker 1: types of malware. And I think that's good, because I 1017 00:56:21,320 --> 00:56:24,440 Speaker 1: think it makes sense to think of facial recognition technology 1018 00:56:24,480 --> 00:56:28,360 Speaker 1: as a form of cultural malware: it seeps in, makes copies 1019 00:56:28,400 --> 00:56:30,799 Speaker 1: of itself. You know, you get it as a byproduct 1020 00:56:30,800 --> 00:56:34,600 Speaker 1: of something you wanted to download. But of course there's 1021 00:56:34,600 --> 00:56:37,480 Speaker 1: another thing. The authors don't really speculate about these motives, 1022 00:56:37,520 --> 00:56:40,040 Speaker 1: but obviously one big hurdle is that there's just a 1023 00:56:40,080 --> 00:56:42,279 Speaker 1: lot of money to be made in this sector, and 1024 00:56:42,360 --> 00:56:46,960 Speaker 1: even worse, there are significant sunk costs. Like, many extremely 1025 00:56:47,040 --> 00:56:52,200 Speaker 1: talented people and powerful institutions have already devoted significant resources 1026 00:56:52,680 --> 00:56:57,040 Speaker 1: and time to improving facial recognition software. And we know 1027 00:56:57,120 --> 00:57:00,799 Speaker 1: that humans do not like abandoning sunk costs. Once you've 1028 00:57:00,800 --> 00:57:05,000 Speaker 1: already invested in something, you're kind of psychologically stuck 1029 00:57:05,040 --> 00:57:06,879 Speaker 1: with it.
You know, do you want to throw all 1030 00:57:06,920 --> 00:57:09,640 Speaker 1: that work in the trash now? Right? Right? 1031 00:57:09,880 --> 00:57:11,799 Speaker 1: You know, companies that have created this kind 1032 00:57:11,840 --> 00:57:14,359 Speaker 1: of technology are looking for ways to expand, new 1033 00:57:14,440 --> 00:57:17,160 Speaker 1: markets that they can get into, new uses. 1034 00:57:17,200 --> 00:57:19,760 Speaker 1: And certainly, when you have uses that in and of 1035 00:57:19,800 --> 00:57:26,600 Speaker 1: themselves are advantageous, things like making payments, personal security, 1036 00:57:26,680 --> 00:57:28,800 Speaker 1: and just, you know, broadly speaking, the 1037 00:57:28,920 --> 00:57:33,200 Speaker 1: idea of finding missing persons and catching criminals. Yeah, I 1038 00:57:33,200 --> 00:57:36,880 Speaker 1: mean, all those things in isolation, they 1039 00:57:37,000 --> 00:57:40,240 Speaker 1: make a good case, right. And that's actually their first 1040 00:57:40,280 --> 00:57:42,960 Speaker 1: main argument. So the authors of this piece outline 1041 00:57:43,040 --> 00:57:45,800 Speaker 1: three major arguments that are used by the advocates of 1042 00:57:45,840 --> 00:57:49,680 Speaker 1: facial recognition, and the first one is exactly that: well, 1043 00:57:50,160 --> 00:57:52,920 Speaker 1: there might be some harms, but they argue the potential 1044 00:57:52,960 --> 00:57:56,040 Speaker 1: benefits outweigh the harms. You know, again, think about 1045 00:57:56,040 --> 00:57:58,400 Speaker 1: the benefits to law enforcement alone. Think of all the 1046 00:57:58,480 --> 00:58:00,560 Speaker 1: violent criminals that could be caught. Think of all the 1047 00:58:00,600 --> 00:58:03,400 Speaker 1: missing persons that could be found. And on top 1048 00:58:03,440 --> 00:58:06,840 Speaker 1: of that, think about some of the obvious consumer demand 1049 00:58:07,000 --> 00:58:09,560 Speaker 1: that there would be for public versions of the software. 1050 00:58:09,920 --> 00:58:11,720 Speaker 1: I mean, it's the kind of thing that, like, we 1051 00:58:11,800 --> 00:58:14,520 Speaker 1: would be terrified about the thought of people using it 1052 00:58:14,560 --> 00:58:17,520 Speaker 1: on us, but might be really excited to use it 1053 00:58:17,520 --> 00:58:20,520 Speaker 1: on others. You know, it's just like this 1054 00:58:20,680 --> 00:58:24,400 Speaker 1: basic failure to, like, reverse the situation and apply the 1055 00:58:24,440 --> 00:58:26,920 Speaker 1: same rules you would want applied to yourself to other people. 1056 00:58:27,320 --> 00:58:30,680 Speaker 1: So under this mindset, instead of it being banned, the people 1057 00:58:30,680 --> 00:58:32,720 Speaker 1: who say, you know, the benefits outweigh the harms, they 1058 00:58:32,720 --> 00:58:36,680 Speaker 1: would probably say, well, facial recognition should be lightly regulated. 1059 00:58:36,760 --> 00:58:40,040 Speaker 1: You know, maybe we could require transparency and make sure 1060 00:58:40,080 --> 00:58:43,280 Speaker 1: consumers are aware when face data is being harvested for 1061 00:58:43,320 --> 00:58:46,800 Speaker 1: recognition purposes, you know, you can't do it surreptitiously. 1062 00:58:46,800 --> 00:58:50,200 Speaker 1: And the authors here disagree. They write, quote, notice and 1063 00:58:50,320 --> 00:58:54,760 Speaker 1: choice has been an abysmal failure.
Social media companies, airlines, 1064 00:58:54,760 --> 00:58:58,320 Speaker 1: and retailers overhype the short term benefits of facial recognition 1065 00:58:58,560 --> 00:59:02,600 Speaker 1: while using unreadable privacy policies and vague disclaimers that make 1066 00:59:02,640 --> 00:59:06,960 Speaker 1: it hard to understand how the technology endangers users' privacy 1067 00:59:06,960 --> 00:59:10,600 Speaker 1: and freedom. So, you know, like, does anybody actually 1068 00:59:10,640 --> 00:59:14,480 Speaker 1: read the end user license agreement? Of course not. No, 1069 00:59:14,800 --> 00:59:17,920 Speaker 1: I mean, like, even if we did, would some vague 1070 00:59:18,040 --> 00:59:21,600 Speaker 1: legal phrasing about ownership of face data cause us to 1071 00:59:21,720 --> 00:59:26,200 Speaker 1: actually forego participation in technological trends that everybody around us 1072 00:59:26,280 --> 00:59:29,360 Speaker 1: is adopting? I mean, again, of course not. Like, even 1073 00:59:29,400 --> 00:59:33,160 Speaker 1: if the harms vastly outweigh the benefits, the benefits are 1074 00:59:33,200 --> 00:59:37,640 Speaker 1: immediate and concrete, and the harms are long term and abstract. 1075 00:59:38,000 --> 00:59:41,000 Speaker 1: Exactly the kinds of cases where we are so bad 1076 00:59:41,080 --> 00:59:44,000 Speaker 1: at, like, making informed decisions. Like, would you like a 1077 00:59:44,040 --> 00:59:46,640 Speaker 1: slice of pizza right now? Just be aware that, you 1078 00:59:46,680 --> 00:59:49,560 Speaker 1: know, it may compromise the integrity of your identity in 1079 00:59:49,600 --> 00:59:53,080 Speaker 1: some way that's difficult to picture. Now, the next counterargument 1080 00:59:53,200 --> 00:59:56,440 Speaker 1: they explore is the idea that strong fears 1081 00:59:56,480 --> 01:00:01,439 Speaker 1: about new technologies are overreactions, and we've, you know, 1082 01:00:01,440 --> 01:00:03,600 Speaker 1: looked at lots of ways that this can absolutely happen. 1083 01:00:03,680 --> 01:00:05,880 Speaker 1: We've discussed it on this show and also a lot 1084 01:00:05,920 --> 01:00:08,400 Speaker 1: on Invention. Think about the panic about the eraser. Just 1085 01:00:08,440 --> 01:00:11,280 Speaker 1: remember that? Yes, yes, definitely, the idea that, oh, erasers 1086 01:00:11,360 --> 01:00:14,360 Speaker 1: exist, all my writings will be erased. 1087 01:00:14,800 --> 01:00:17,680 Speaker 1: Or remember some of the panics that came with the 1088 01:00:17,880 --> 01:00:21,040 Speaker 1: advent of photography. Right, yeah, and just the general idea 1089 01:00:21,040 --> 01:00:23,680 Speaker 1: of future shock, you know, the idea that 1090 01:00:23,800 --> 01:00:27,920 Speaker 1: rapidly advancing technology is overwhelming, and, yeah, that 1091 01:00:28,000 --> 01:00:30,919 Speaker 1: is a subject unto itself. Yeah. So this argument would 1092 01:00:30,960 --> 01:00:33,720 Speaker 1: be that things sometimes just seem scary and provoke shock 1093 01:00:34,040 --> 01:00:36,400 Speaker 1: because they're new, but once we get used to it, 1094 01:00:36,400 --> 01:00:39,480 Speaker 1: it's great. The authors disagree. They do not think 1095 01:00:39,520 --> 01:00:42,200 Speaker 1: this is the case with facial recognition. They argue that 1096 01:00:42,200 --> 01:00:46,200 Speaker 1: the backlash is not just hyperventilating about something that's unfamiliar.
1097 01:00:46,560 --> 01:00:50,280 Speaker 1: In very concrete ways, facial recognition does have a unique 1098 01:00:50,400 --> 01:00:54,600 Speaker 1: power to create a world with pervasive automated surveillance in 1099 01:00:54,640 --> 01:00:59,400 Speaker 1: a way that's disempowering to individuals in almost uncountable ways. 1100 01:01:00,160 --> 01:01:03,480 Speaker 1: They write, quote, big companies, government agencies, and even your 1101 01:01:03,520 --> 01:01:06,360 Speaker 1: next door neighbors will seek to deploy it in more places. 1102 01:01:06,640 --> 01:01:08,920 Speaker 1: They'll want to identify and track you. They'll want to 1103 01:01:08,960 --> 01:01:12,640 Speaker 1: categorize your emotions and identity. They will want to infer 1104 01:01:12,720 --> 01:01:15,680 Speaker 1: where you might shop, protest, or work, and use that 1105 01:01:15,760 --> 01:01:19,520 Speaker 1: information to control and manipulate you, to deprive you of opportunities. 1106 01:01:19,880 --> 01:01:22,720 Speaker 1: It's likely that the technology will be used to police 1107 01:01:22,760 --> 01:01:26,800 Speaker 1: social norms. People who skip church or jaywalk will be 1108 01:01:26,880 --> 01:01:31,120 Speaker 1: noticed and potentially ostracized. And you'd better start practicing your 1109 01:01:31,120 --> 01:01:34,840 Speaker 1: most convincing facial expressions, otherwise during your next job interview 1110 01:01:35,040 --> 01:01:38,680 Speaker 1: a computer could code you as a liar or a malcontent. Now, 1111 01:01:38,720 --> 01:01:40,720 Speaker 1: remember in the last episode when we talked about, like, 1112 01:01:40,800 --> 01:01:44,040 Speaker 1: these services that are being sold as, like, identifying people's 1113 01:01:44,080 --> 01:01:47,480 Speaker 1: emotions through facial recognition, which, there is a major study 1114 01:01:47,560 --> 01:01:49,960 Speaker 1: that really undercut that and said these things are not 1115 01:01:50,160 --> 01:01:53,000 Speaker 1: very accurate, but that doesn't mean they're not going to 1116 01:01:53,080 --> 01:01:57,000 Speaker 1: be used. Right, and in terms of policing social norms, 1117 01:01:57,280 --> 01:02:01,040 Speaker 1: again, go back to the pajamas, because as slightly 1118 01:02:01,120 --> 01:02:03,800 Speaker 1: hilarious as the pajama case is, it is also frightening, because 1119 01:02:03,800 --> 01:02:07,840 Speaker 1: it is a firm example of facial recognition technology being 1120 01:02:07,920 --> 01:02:11,080 Speaker 1: used to police social norms. Yes. And then the 1121 01:02:11,160 --> 01:02:14,240 Speaker 1: third counterargument they look at is basically the argument 1122 01:02:14,240 --> 01:02:18,240 Speaker 1: that Bruce Schneier was making when we referenced his article earlier. 1123 01:02:18,280 --> 01:02:22,360 Speaker 1: It's that facial recognition technology is just one branch of 1124 01:02:22,400 --> 01:02:26,320 Speaker 1: a broader privacy and civil liberty debate, and we 1125 01:02:26,320 --> 01:02:28,920 Speaker 1: need to focus on all surveillance, not just this 1126 01:02:28,960 --> 01:02:32,600 Speaker 1: particular technology. So how about, you know, things that identify 1127 01:02:32,680 --> 01:02:35,240 Speaker 1: you by your gait, the way you move, or how about 1128 01:02:35,480 --> 01:02:38,760 Speaker 1: retinal scanning or brain scanning or anything like that.
And 1129 01:02:38,800 --> 01:02:40,680 Speaker 1: I'd say about this one, well, you know, it's true 1130 01:02:40,800 --> 01:02:43,600 Speaker 1: that other technologies could represent the kind of threat to 1131 01:02:43,640 --> 01:02:46,920 Speaker 1: privacy and freedom that facial recognition presents. And some of 1132 01:02:46,920 --> 01:02:49,080 Speaker 1: those are scary and awful too. You know, we're not 1133 01:02:49,120 --> 01:02:53,439 Speaker 1: saying, like, don't ban gait recognition scanning, or, 1134 01:02:53,680 --> 01:02:58,160 Speaker 1: you know, don't scan my brain while I'm at a restaurant. Yeah, exactly. 1135 01:02:58,160 --> 01:03:00,480 Speaker 1: I mean, I would say facial recognition is getting special 1136 01:03:00,480 --> 01:03:05,240 Speaker 1: attention because it's already here; people are selling these programs. 1137 01:03:05,240 --> 01:03:08,600 Speaker 1: And another thing is that, you know, different technologies are 1138 01:03:08,640 --> 01:03:12,520 Speaker 1: slightly different, and they require different regulatory schemes. You know, 1139 01:03:12,560 --> 01:03:17,120 Speaker 1: the authors here point out that the law singles out automobiles, spyware, 1140 01:03:17,240 --> 01:03:19,760 Speaker 1: medical devices, and a bunch of other different kinds of 1141 01:03:19,760 --> 01:03:22,760 Speaker 1: technologies with their own laws and rules. They're not all 1142 01:03:22,800 --> 01:03:26,520 Speaker 1: covered under one type of law. We've got individual agencies 1143 01:03:26,520 --> 01:03:30,600 Speaker 1: for, like, airplanes and for phones and stuff. But then 1144 01:03:30,680 --> 01:03:34,919 Speaker 1: another thing is that faces are central to our identities. 1145 01:03:35,200 --> 01:03:37,280 Speaker 1: In the first episode, you know, we joked about going 1146 01:03:37,360 --> 01:03:42,200 Speaker 1: around constantly changing masks, but is this socially feasible? I mean, 1147 01:03:42,320 --> 01:03:45,400 Speaker 1: be realistic. Like, we need to see each other's faces 1148 01:03:45,440 --> 01:03:48,760 Speaker 1: in order to see each other. Seeing faces is the 1149 01:03:48,800 --> 01:03:51,400 Speaker 1: soul of human life. Yeah. One of 1150 01:03:51,440 --> 01:03:53,960 Speaker 1: the whole aspects of the whole bummer situation that we've 1151 01:03:54,000 --> 01:03:58,120 Speaker 1: discussed involving social media and just electronic culture in general 1152 01:03:58,160 --> 01:04:00,600 Speaker 1: is that we don't see each other's faces, not 1153 01:04:00,680 --> 01:04:04,280 Speaker 1: the living face, not the face that shows expression and humanity. 1154 01:04:04,800 --> 01:04:08,120 Speaker 1: We just see the manufactured faces. And we don't 1155 01:04:08,120 --> 01:04:11,160 Speaker 1: want to live in a world of physically manufactured faces 1156 01:04:11,280 --> 01:04:13,960 Speaker 1: because we've created a technology and allowed it to 1157 01:04:13,960 --> 01:04:16,360 Speaker 1: get out of hand. Yeah.
And so I think 1158 01:04:16,400 --> 01:04:18,760 Speaker 1: often with issues like this, something you run 1159 01:04:18,800 --> 01:04:21,120 Speaker 1: into with a lot of these counterarguments 1160 01:04:21,160 --> 01:04:24,840 Speaker 1: in favor of facial recognition is that they 1161 01:04:24,960 --> 01:04:27,240 Speaker 1: argue in ways that fall into a trap of talking 1162 01:04:27,280 --> 01:04:31,439 Speaker 1: about just what's possible or what's technically true, rather than 1163 01:04:31,480 --> 01:04:35,240 Speaker 1: what's realistic and what's practical. Like, could you protect people 1164 01:04:35,360 --> 01:04:37,520 Speaker 1: with opt-in methods where they have to sign a 1165 01:04:37,680 --> 01:04:41,840 Speaker 1: EULA disclosing that they've surrendered their privacy? I mean, in theory, yes, 1166 01:04:41,880 --> 01:04:44,360 Speaker 1: but in practice we just know this doesn't work. 1167 01:04:44,400 --> 01:04:46,880 Speaker 1: You know, we all just click I agree. And again, 1168 01:04:46,920 --> 01:04:49,800 Speaker 1: the benefits are immediate and concrete, and the harms are 1169 01:04:49,920 --> 01:04:53,720 Speaker 1: long term and abstract. Yeah, I want to see what 1170 01:04:53,760 --> 01:04:55,520 Speaker 1: I'll look like as an old man on my iPhone 1171 01:04:55,640 --> 01:04:58,480 Speaker 1: right now. I don't care where this software, where this 1172 01:04:58,520 --> 01:05:00,880 Speaker 1: app is coming from, and where this facial data 1173 01:05:00,960 --> 01:05:03,600 Speaker 1: might or might not be going. Yeah, and obviously 1174 01:05:03,640 --> 01:05:05,760 Speaker 1: I'd advise people not to do that. But if you 1175 01:05:05,840 --> 01:05:08,440 Speaker 1: react to that with the mentality of, well, you signed 1176 01:05:08,480 --> 01:05:11,360 Speaker 1: the contract, you've got nothing to complain about, that just 1177 01:05:11,480 --> 01:05:15,560 Speaker 1: doesn't take the experience of human life seriously. Right, and 1178 01:05:15,640 --> 01:05:18,160 Speaker 1: it's one of these cases which, to 1179 01:05:18,200 --> 01:05:22,240 Speaker 1: summon the words of William Gibson, of technology making 1180 01:05:22,800 --> 01:05:25,640 Speaker 1: pacts with the devil possible, things that were only the 1181 01:05:25,960 --> 01:05:30,040 Speaker 1: wild imaginings of people in the past we bring 1182 01:05:30,080 --> 01:05:34,240 Speaker 1: to life with technology, making AI demons, making it possible 1183 01:05:34,320 --> 01:05:38,240 Speaker 1: to sell your face to some faceless entity. 1184 01:05:38,480 --> 01:05:40,640 Speaker 1: I think that's exactly right. I mean, this is like, 1185 01:05:41,120 --> 01:05:43,760 Speaker 1: this is somehow a case of a horrible fantasy 1186 01:05:43,840 --> 01:05:47,560 Speaker 1: becoming reality. And, you know, this could be 1187 01:05:47,600 --> 01:05:48,840 Speaker 1: the case with a lot of things. So it's not 1188 01:05:48,880 --> 01:05:51,640 Speaker 1: just the EULA agreements; it's about people 1189 01:05:51,880 --> 01:05:56,280 Speaker 1: arguing about what's technically possible without acknowledging what's realistic. 1190 01:05:56,400 --> 01:05:58,320 Speaker 1: What are people really going to do with their lives?
1191 01:05:58,600 --> 01:06:00,840 Speaker 1: Same thing as, you know, you could argue, well, if 1192 01:06:00,880 --> 01:06:02,800 Speaker 1: you don't like it, you could go around with masks 1193 01:06:02,840 --> 01:06:05,160 Speaker 1: on or something in order to opt out. I mean, 1194 01:06:05,280 --> 01:06:08,880 Speaker 1: you could try that. But are people actually practically gonna 1195 01:06:08,920 --> 01:06:11,200 Speaker 1: want to live that way? Yeah. I mean, not only 1196 01:06:11,240 --> 01:06:15,560 Speaker 1: living with a mask, but living among the masks, because, God, 1197 01:06:15,600 --> 01:06:18,200 Speaker 1: we've talked about this before, just the psychology of 1198 01:06:18,280 --> 01:06:20,440 Speaker 1: masks and what happens when you wear one, and the 1199 01:06:20,480 --> 01:06:23,360 Speaker 1: groupthink of mask wearing, and generally the idea 1200 01:06:23,400 --> 01:06:26,200 Speaker 1: of encountering a bunch of people wearing masks, or the 1201 01:06:26,240 --> 01:06:29,400 Speaker 1: same mask, is a frightening proposition, and there are 1202 01:06:29,560 --> 01:06:34,960 Speaker 1: countless historical examples of that. Yeah. So the authors here conclude, quote, 1203 01:06:35,160 --> 01:06:38,960 Speaker 1: we support a wide-ranging ban on this powerful technology, 1204 01:06:39,000 --> 01:06:42,800 Speaker 1: but even limited prohibitions on its use in police body cams, 1205 01:06:42,920 --> 01:06:46,000 Speaker 1: DMV databases, public housing, and schools would be 1206 01:06:46,040 --> 01:06:49,320 Speaker 1: an important start. And they say the public is ready 1207 01:06:49,360 --> 01:06:52,520 Speaker 1: for this, and the actions by San Francisco, Somerville, Berkeley, 1208 01:06:52,520 --> 01:06:55,120 Speaker 1: and Oakland show it. Our society does not have to 1209 01:06:55,160 --> 01:06:59,040 Speaker 1: allow the spread of new technology that endangers our privacy. 1210 01:06:59,240 --> 01:07:01,480 Speaker 1: And I guess, just speaking for myself, I 1211 01:07:01,840 --> 01:07:04,600 Speaker 1: am highly convinced by this point of view. I mean, like, 1212 01:07:04,720 --> 01:07:08,200 Speaker 1: I think, if you ultimately in the future were really 1213 01:07:08,280 --> 01:07:10,880 Speaker 1: confident that you wanted to change your mind and move 1214 01:07:10,960 --> 01:07:14,200 Speaker 1: forward with facial recognition technology, you could lift a ban. 1215 01:07:14,800 --> 01:07:17,640 Speaker 1: But, like, you can't undo it once the technology is 1216 01:07:17,680 --> 01:07:21,680 Speaker 1: out there, and it's coming really fast. Absolutely, yeah. 1217 01:07:21,680 --> 01:07:23,560 Speaker 1: I mean, on the other side, yes, you 1218 01:07:23,560 --> 01:07:26,040 Speaker 1: can think of various situations where you put bans in 1219 01:07:26,120 --> 01:07:29,480 Speaker 1: place and then something bad happens, something summons up a 1220 01:07:29,480 --> 01:07:32,200 Speaker 1: great deal of fear in a nation and allows us 1221 01:07:32,200 --> 01:07:35,200 Speaker 1: to slide back into this situation and say, give up 1222 01:07:35,480 --> 01:07:38,000 Speaker 1: various privacies and rights in the name of feeling a 1223 01:07:38,040 --> 01:07:41,440 Speaker 1: little less afraid.
But that doesn't mean it's not worth 1224 01:07:41,560 --> 01:07:45,280 Speaker 1: fighting for now, you know, before the fear, because imagine 1225 01:07:45,320 --> 01:07:49,000 Speaker 1: how much further we would sink into the fear, 1226 01:07:49,040 --> 01:07:51,640 Speaker 1: you know, if we already had all these technologies in place, 1227 01:07:51,720 --> 01:07:54,880 Speaker 1: eroding and taking away our freedoms. This is a topic 1228 01:07:54,920 --> 01:07:57,000 Speaker 1: I think is worth spreading the word about. I mean, like, 1229 01:07:57,080 --> 01:07:58,960 Speaker 1: this is something a lot of people just probably aren't 1230 01:07:58,960 --> 01:08:02,160 Speaker 1: thinking about at all, and it's coming really fast. So 1231 01:08:03,040 --> 01:08:06,960 Speaker 1: it's worth raising the awareness. Yeah, yeah, worth 1232 01:08:07,640 --> 01:08:11,680 Speaker 1: recording three episodes that I must admit are not fun episodes. 1233 01:08:13,040 --> 01:08:14,800 Speaker 1: The middle one was a lot of fun, I thought. Okay, 1234 01:08:14,800 --> 01:08:16,320 Speaker 1: the middle one was more fun because 1235 01:08:16,320 --> 01:08:20,120 Speaker 1: we were just talking about the basic biological facial recognition, and 1236 01:08:20,280 --> 01:08:23,720 Speaker 1: parts of this one were at least entertaining; that 1237 01:08:23,760 --> 01:08:26,439 Speaker 1: Wired article was a very fun but also, you know, 1238 01:08:26,640 --> 01:08:30,840 Speaker 1: disturbing read. But this is a troubling topic. 1239 01:08:30,920 --> 01:08:33,960 Speaker 1: I am more troubled for having researched it 1240 01:08:34,000 --> 01:08:38,200 Speaker 1: and recorded it. But I think we should be troubled 1241 01:08:38,240 --> 01:08:40,880 Speaker 1: by things like this, and it 1242 01:08:40,920 --> 01:08:44,160 Speaker 1: gives us the space from which to act, 1243 01:08:44,200 --> 01:08:47,000 Speaker 1: you know, hopefully just by spreading the word about it 1244 01:08:47,040 --> 01:08:48,960 Speaker 1: and, you know, getting to a place where 1245 01:08:49,000 --> 01:08:51,439 Speaker 1: we have some of these protections in place for our 1246 01:08:51,479 --> 01:08:55,439 Speaker 1: own faces. That's right. Don't build the nightmare mask land. Right, 1247 01:08:55,560 --> 01:08:58,160 Speaker 1: save your face. Do not go gentle into that good 1248 01:08:58,240 --> 01:09:02,320 Speaker 1: night. Rage, rage against the scanning of the face. Yes. 1249 01:09:04,320 --> 01:09:06,800 Speaker 1: I will stress, though, this was the third of a 1250 01:09:06,880 --> 01:09:08,600 Speaker 1: three-part series, so if you didn't listen to the 1251 01:09:08,600 --> 01:09:10,640 Speaker 1: other two, go back and listen to them, because it 1252 01:09:10,640 --> 01:09:12,839 Speaker 1: will help fill in all the pieces for you. Totally. 1253 01:09:12,960 --> 01:09:14,920 Speaker 1: But also it's worth pointing out that this is such 1254 01:09:14,960 --> 01:09:17,360 Speaker 1: a bleeding edge issue that even though you'll be listening 1255 01:09:17,400 --> 01:09:20,719 Speaker 1: to this episode mere days after we recorded it, 1256 01:09:20,760 --> 01:09:22,839 Speaker 1: there's just gonna be more and more coverage.
1257 01:09:22,880 --> 01:09:27,000 Speaker 1: There's gonna be, you know, new movements happening, new studies, 1258 01:09:27,479 --> 01:09:30,680 Speaker 1: sadly new rollouts that are going to meet with controversy. 1259 01:09:30,800 --> 01:09:33,640 Speaker 1: So just keep that in mind, especially if you 1260 01:09:33,720 --> 01:09:36,559 Speaker 1: end up coming back to this episode some months from now. 1261 01:09:36,680 --> 01:09:39,040 Speaker 1: We're in the frenzy period right now. All right, in 1262 01:09:39,040 --> 01:09:40,719 Speaker 1: the meantime, if you want to check out other episodes 1263 01:09:40,720 --> 01:09:42,400 Speaker 1: of Stuff to Blow Your Mind, you can find us 1264 01:09:42,400 --> 01:09:44,840 Speaker 1: wherever you get your podcasts. If you want to just 1265 01:09:45,080 --> 01:09:47,040 Speaker 1: jump on over to the iHeart listing for the show, 1266 01:09:47,040 --> 01:09:48,519 Speaker 1: go to stuff to Blow your Mind dot com and 1267 01:09:48,520 --> 01:09:51,120 Speaker 1: you will be redirected. But wherever you get the show, 1268 01:09:51,280 --> 01:09:54,320 Speaker 1: make sure that you rate, review, and spread the word, 1269 01:09:54,400 --> 01:09:56,960 Speaker 1: because that's how we keep the show alive. Huge thanks 1270 01:09:57,000 --> 01:09:59,960 Speaker 1: as always to our excellent audio producer Seth Nicholas Johnson. 1271 01:10:00,200 --> 01:10:01,640 Speaker 1: If you'd like to get in touch with us with 1272 01:10:01,680 --> 01:10:04,160 Speaker 1: feedback on this episode or any other, to suggest a topic 1273 01:10:04,240 --> 01:10:06,320 Speaker 1: for the future, or just to say hello, you can 1274 01:10:06,360 --> 01:10:09,200 Speaker 1: email us at contact at stuff to Blow your Mind 1275 01:10:09,360 --> 01:10:18,719 Speaker 1: dot com. Stuff to Blow Your Mind is a production 1276 01:10:18,760 --> 01:10:21,240 Speaker 1: of iHeart Radio's How Stuff Works. For more podcasts from 1277 01:10:21,280 --> 01:10:24,120 Speaker 1: iHeart Radio, visit the iHeart Radio app, Apple Podcasts, 1278 01:10:24,200 --> 01:10:31,880 Speaker 1: or wherever you listen to your favorite shows.