1 00:00:00,720 --> 00:00:03,240 Speaker 1: Hey everybody, it's me Josh, and I'm here to tell 2 00:00:03,240 --> 00:00:06,240 Speaker 1: you it's official. We're going to be in Vancouver, BC, 3 00:00:06,720 --> 00:00:10,880 Speaker 1: and Portland, Oregon this March. On March twenty nine, we'll 4 00:00:10,920 --> 00:00:13,200 Speaker 1: be at the Chan Centre in Vancouver and on March 5 00:00:13,840 --> 00:00:16,920 Speaker 1: we'll be at the Arlene Schnitzer Concert Hall in Portland. 6 00:00:17,200 --> 00:00:20,920 Speaker 1: So come see us. Tickets go on sale this Friday. 7 00:00:21,040 --> 00:00:23,919 Speaker 1: Go to S Y S K live dot com for 8 00:00:24,000 --> 00:00:26,759 Speaker 1: ticket links and info and everything you need, and we'll 9 00:00:26,760 --> 00:00:30,600 Speaker 1: see you guys in March. Welcome to Stuff You Should Know, 10 00:00:30,840 --> 00:00:39,280 Speaker 1: a production of iHeartRadio's How Stuff Works. Hey, 11 00:00:39,440 --> 00:00:45,080 Speaker 1: welcome to the podcast. I'm Josh Clark. There's Charles "Scooter 12 00:00:45,200 --> 00:00:52,000 Speaker 1: Computer" Bryant and Jerry "Matthew Broderick in War Games" 13 00:00:52,440 --> 00:00:58,320 Speaker 1: Rowland, and this is Stuff You Should Know. That's good. Thanks. What's 14 00:00:58,360 --> 00:01:04,600 Speaker 1: your name? Josh? Mm hmm? Okay, Josh "Ally Sheedy in 15 00:01:04,680 --> 00:01:08,360 Speaker 1: War Games" Clark. I wonder what Ally Sheedy is doing 16 00:01:08,480 --> 00:01:11,760 Speaker 1: right now, man. How cute was she in that movie? Like, 17 00:01:11,800 --> 00:01:14,959 Speaker 1: I think they designed that movie for, like, every thirteen 18 00:01:15,040 --> 00:01:16,959 Speaker 1: year old boy in America to fall in love with Ally 19 00:01:16,959 --> 00:01:21,800 Speaker 1: Sheedy. I think you're talking about, um, Short Circuit. I 20 00:01:21,840 --> 00:01:24,959 Speaker 1: never saw that, believe it or not. What, with Johnny Five? I mean, 21 00:01:25,000 --> 00:01:27,880 Speaker 1: I know the movie. You gotta see it. Really? It's 22 00:01:27,920 --> 00:01:35,160 Speaker 1: pretty awful, especially with, um, oh, what's his name? Who 23 00:01:35,240 --> 00:01:38,039 Speaker 1: is the sleazeball from Fast Times at Ridgemont High? 24 00:01:38,040 --> 00:01:44,720 Speaker 1: Who, the ticket scalper? Yes. He plays an Indian, 25 00:01:45,520 --> 00:01:52,920 Speaker 1: like Asian Indian character, like full-on brownface and everything. 26 00:01:53,360 --> 00:01:57,200 Speaker 1: It's really bad. The movie is bad enough, but then 27 00:01:57,240 --> 00:01:59,600 Speaker 1: now when you go back and see that, you're like, 28 00:01:59,680 --> 00:02:02,400 Speaker 1: I can't believe this. I can't believe it. I think 29 00:02:02,440 --> 00:02:06,440 Speaker 1: he's Italian. Oh easily, maybe Jewish, or maybe just a 30 00:02:06,480 --> 00:02:09,080 Speaker 1: straight up white guy. He's definitely not Asian Indian. No, 31 00:02:09,280 --> 00:02:12,720 Speaker 1: he's not. But anyway, go see Short Circuit. Okay, see 32 00:02:12,720 --> 00:02:14,919 Speaker 1: what you think. Ally Sheedy just keeps looking at 33 00:02:14,919 --> 00:02:18,120 Speaker 1: the camera going, I'm so sorry. That was a big 34 00:02:18,160 --> 00:02:20,760 Speaker 1: hit though. She didn't have anything to be sorry about. Um, so 35 00:02:20,880 --> 00:02:25,480 Speaker 1: I guess War Games is kind of in your wheelhouse. Yeah, 36 00:02:25,800 --> 00:02:28,680 Speaker 1: it was a little old for me.
Yeah, because I 37 00:02:28,720 --> 00:02:32,480 Speaker 1: saw that when I was like twelve, bish uh, and 38 00:02:32,600 --> 00:02:35,280 Speaker 1: you would have been I don't know how how much 39 00:02:35,320 --> 00:02:38,359 Speaker 1: younger are you. I would have been seven seven. Yeah, 40 00:02:38,440 --> 00:02:40,000 Speaker 1: that's a little young for War Games, I would think. 41 00:02:40,040 --> 00:02:42,560 Speaker 1: I mean, I still watched your wherever, but I was like, yeah, 42 00:02:42,600 --> 00:02:45,160 Speaker 1: that's hilarious. The computer is called Whopper. Yeah, I mean 43 00:02:45,200 --> 00:02:47,799 Speaker 1: it was right in my wheelhouse, right. I remember at 44 00:02:47,800 --> 00:02:51,280 Speaker 1: the end of war Games, Uh, they lock in, you know, 45 00:02:51,320 --> 00:02:55,360 Speaker 1: they're decoding the the code one number and letter at 46 00:02:55,400 --> 00:02:58,200 Speaker 1: a time. Very suspenseful and yeah, very suspenseful. And it 47 00:02:58,200 --> 00:03:01,080 Speaker 1: finally locks in and me and my friends memerized it 48 00:03:01,120 --> 00:03:03,040 Speaker 1: so we could go home and plug it into an 49 00:03:03,040 --> 00:03:08,840 Speaker 1: Apple too to see what happened. Nothing, Okay, how do 50 00:03:08,840 --> 00:03:11,440 Speaker 1: you plug in a number anyway? What does that even mean? Oh? Yeah, 51 00:03:11,520 --> 00:03:15,400 Speaker 1: that's true, you know. Yeah. I remember a very rudimentary 52 00:03:16,840 --> 00:03:21,520 Speaker 1: program you could run where you could type in like 53 00:03:21,560 --> 00:03:24,280 Speaker 1: four lines of whatever I don't even know if you 54 00:03:24,320 --> 00:03:27,639 Speaker 1: call it code with a phrase, and it would run 55 00:03:27,680 --> 00:03:30,560 Speaker 1: the phrase like a thousand times all over your screen 56 00:03:30,560 --> 00:03:33,280 Speaker 1: and a big scroll. And then that just thought that 57 00:03:33,360 --> 00:03:36,360 Speaker 1: was the coolest thing ever. I feel like, I remember 58 00:03:36,440 --> 00:03:39,000 Speaker 1: what you're talking just like five lines was like the 59 00:03:39,040 --> 00:03:42,160 Speaker 1: only part I remember is twenty go to ten and 60 00:03:42,280 --> 00:03:47,160 Speaker 1: ten was the phrase I think something like I don't remember. 61 00:03:47,440 --> 00:03:49,560 Speaker 1: Then I was like, man, let's just play Castle Wolfenstein. 62 00:03:49,920 --> 00:03:52,320 Speaker 1: That was a good one. I remember, did Oregon Trail. 63 00:03:52,680 --> 00:03:55,040 Speaker 1: I never did as well. I was Castle Wolfenstein, but 64 00:03:55,160 --> 00:03:57,600 Speaker 1: like wolf and signed on the PC. Oh yeah, like 65 00:03:57,840 --> 00:04:02,320 Speaker 1: move left arrow, right arrow, shoot uh shoot dash, which 66 00:04:02,320 --> 00:04:04,560 Speaker 1: is some sort of a bullet. Yeah. It was fun. 67 00:04:04,720 --> 00:04:06,480 Speaker 1: That was fun, and I thought it was just like 68 00:04:06,520 --> 00:04:10,480 Speaker 1: the height of technological gaming. It was it was at 69 00:04:10,480 --> 00:04:13,160 Speaker 1: the time. But now, Chuck, we've reached the height of 70 00:04:13,200 --> 00:04:17,360 Speaker 1: technology which is being tracked everywhere you go, look at 71 00:04:17,400 --> 00:04:20,680 Speaker 1: you all the time by whoever wants to do that. 
72 00:04:20,760 --> 00:04:22,960 Speaker 1: I'm gonna change your name to Josh smooth up a 73 00:04:23,040 --> 00:04:27,400 Speaker 1: Raider Clark for that transition. Very nice. Yeah, Like I 74 00:04:27,440 --> 00:04:31,800 Speaker 1: like Ali Shedy Clark, Josh, Ali Shedy and Clark. Yeah, 75 00:04:31,880 --> 00:04:35,000 Speaker 1: this is a good one. Uh did you put this together? 76 00:04:35,040 --> 00:04:37,440 Speaker 1: It was this you and this is Dave. Dave Brews 77 00:04:37,880 --> 00:04:40,279 Speaker 1: good stuff in high Dave. We finally got to meet 78 00:04:40,560 --> 00:04:45,080 Speaker 1: Dave and his family, lovely family that we cursed awfully 79 00:04:45,120 --> 00:04:48,560 Speaker 1: in front of in Seattle. I felt terrible, thanks for 80 00:04:48,600 --> 00:04:51,800 Speaker 1: adding yourself to the max and he was like, you 81 00:04:51,839 --> 00:04:55,640 Speaker 1: know whatever, he was fine. His kids were adorable, they 82 00:04:55,640 --> 00:04:57,680 Speaker 1: were they were great. They couldn't look at me though 83 00:04:58,040 --> 00:05:00,800 Speaker 1: that really they were probably just him did by your present, 84 00:05:00,839 --> 00:05:03,880 Speaker 1: No No, it was because I cursed so badly. So 85 00:05:03,960 --> 00:05:08,279 Speaker 1: this is good stuff. Though. Facial recognition technology that they've 86 00:05:08,320 --> 00:05:12,560 Speaker 1: been kind of at since the nineteen fifties, which they 87 00:05:12,640 --> 00:05:15,360 Speaker 1: rolled out as a test in two thousand two at 88 00:05:15,360 --> 00:05:18,600 Speaker 1: the Super Bowl in New Orleans, did not go that well. No, 89 00:05:19,320 --> 00:05:21,640 Speaker 1: it was a little clunky back then, but it's gotten 90 00:05:21,640 --> 00:05:24,080 Speaker 1: a lot better since then. Let me explain why. For 91 00:05:24,080 --> 00:05:25,880 Speaker 1: anyone who's listened to the End of the World with 92 00:05:25,960 --> 00:05:31,080 Speaker 1: Josh Clark the AI episode in particular, everything associated with 93 00:05:31,240 --> 00:05:35,680 Speaker 1: artificial intelligence got way better starting around two thousand seven, 94 00:05:36,080 --> 00:05:41,320 Speaker 1: when neural nets became a viable form of machine learning. 95 00:05:41,640 --> 00:05:46,640 Speaker 1: Because you don't have to train a computer what constitutes 96 00:05:46,640 --> 00:05:48,880 Speaker 1: a human face and what to look for. You just 97 00:05:49,040 --> 00:05:51,400 Speaker 1: feeded a bunch of pictures of faces and say these 98 00:05:51,440 --> 00:05:54,840 Speaker 1: are human faces, learn what a human faces and they 99 00:05:54,880 --> 00:05:57,840 Speaker 1: trained themselves. And so around about two thousand seven, two 100 00:05:57,839 --> 00:06:00,680 Speaker 1: thousand eight, two thousand nine, everything had to do with 101 00:06:00,760 --> 00:06:04,719 Speaker 1: machine learning got way smarter because we started using neural 102 00:06:04,800 --> 00:06:08,719 Speaker 1: nets and facial recognition software is no exception. Yeah, and 103 00:06:08,760 --> 00:06:10,920 Speaker 1: there were a few things that kind of converged all 104 00:06:10,960 --> 00:06:14,120 Speaker 1: at the same time or around the same time. Uh, 105 00:06:14,240 --> 00:06:17,159 Speaker 1: social meds kind of coming on the scene right in 106 00:06:17,160 --> 00:06:22,280 Speaker 1: that wheelhouse was a big deal. Um, Facebook, this is staggering. 
107 00:06:22,880 --> 00:06:28,320 Speaker 1: Facebook just by itself processes three hundred and fifty million 108 00:06:29,120 --> 00:06:34,400 Speaker 1: new photos through its, uh, facial recognition software every day, 109 00:06:34,560 --> 00:06:39,359 Speaker 1: and every time one comes through, Mark Zuckerberg goes like... 110 00:06:39,440 --> 00:06:41,640 Speaker 1: You think it's neat when you put 111 00:06:41,640 --> 00:06:43,640 Speaker 1: a picture up and it says like, would you like 112 00:06:43,680 --> 00:06:46,920 Speaker 1: to tag Emily, your wife, because that's her? And you think, oh, well, 113 00:06:46,920 --> 00:06:49,880 Speaker 1: that's super easy, thanks Facebook. But then you don't think, like, 114 00:06:49,880 --> 00:06:51,840 Speaker 1: wait a minute, all right, how do they know that's 115 00:06:51,920 --> 00:06:56,719 Speaker 1: my wife? And you know, it's like with everything else. Uh, 116 00:06:56,800 --> 00:06:59,680 Speaker 1: there's privacy people that were like, whoa, do you guys 117 00:06:59,680 --> 00:07:03,279 Speaker 1: realize what's going on? And then the sheep, they're 118 00:07:03,320 --> 00:07:05,919 Speaker 1: like, huh? No, they're like, no, it's great. Like, 119 00:07:05,960 --> 00:07:08,279 Speaker 1: I don't have to go in and, like, click 120 00:07:08,360 --> 00:07:12,360 Speaker 1: two links or two buttons to tag. It's way easier. 121 00:07:12,560 --> 00:07:15,400 Speaker 1: So that was one thing. There's way more photos out 122 00:07:15,440 --> 00:07:19,400 Speaker 1: there for those machines to learn on, like good high 123 00:07:19,480 --> 00:07:24,920 Speaker 1: quality photos, right, three hundred and fifty million a day just on Facebook alone, um, 124 00:07:25,080 --> 00:07:27,520 Speaker 1: which means the machines were getting smarter. They were getting 125 00:07:27,560 --> 00:07:32,680 Speaker 1: better and better at training themselves. And then lastly, um, 126 00:07:32,760 --> 00:07:37,000 Speaker 1: that has led to a ubiquity in facial recognition, 127 00:07:37,440 --> 00:07:41,000 Speaker 1: that the better the machines have gotten, the easier it 128 00:07:41,040 --> 00:07:43,280 Speaker 1: has been to put together data sets for them to 129 00:07:43,320 --> 00:07:46,920 Speaker 1: train on, which is lots and lots of pictures of people. Um, 130 00:07:46,960 --> 00:07:49,840 Speaker 1: and the cheaper the technology has gotten, which means more 131 00:07:49,880 --> 00:07:54,440 Speaker 1: people are now using facial recognition than ever. Yeah. 132 00:07:54,480 --> 00:07:59,320 Speaker 1: Amazon has a service called Rekognition, with a K, which 133 00:07:59,360 --> 00:08:02,080 Speaker 1: is not a good look. No, it looks very German. 134 00:08:02,240 --> 00:08:06,360 Speaker 1: There's something about replacing a C with a K. It 135 00:08:06,480 --> 00:08:09,280 Speaker 1: just looks creepy. Like when you spell America with a K, 136 00:08:10,280 --> 00:08:15,240 Speaker 1: it means something. It means like bad America. Yet they 137 00:08:15,240 --> 00:08:20,240 Speaker 1: went right full steam ahead and called it Rekognition. You 138 00:08:20,240 --> 00:08:21,720 Speaker 1: have to say it like that, you do, I think, 139 00:08:21,720 --> 00:08:23,960 Speaker 1: and you have to be like squeezing the air out 140 00:08:24,000 --> 00:08:27,560 Speaker 1: of a syringe while you're saying it too. Um, so 141 00:08:27,600 --> 00:08:29,960 Speaker 1: they have this.
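For anyone curious what "feed it a bunch of labeled pictures and it trains itself" looks like in practice, here is a toy sketch in PyTorch. It only illustrates the training-loop idea the hosts describe: the tiny network, the random stand-in images, and the face-versus-not-face labels are all invented for the example, and real systems like Facebook's or Amazon's are vastly larger.

```python
# Toy sketch of "show it labeled faces and let it train itself".
# The images are random tensors standing in for real photos, and the two
# classes are just "face" (1) vs "not a face" (0) -- purely illustrative.
import torch
import torch.nn as nn

class TinyFaceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # face / not-face

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = TinyFaceNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in "dataset": 64 random 64x64 RGB images with made-up labels.
images = torch.randn(64, 3, 64, 64)
labels = torch.randint(0, 2, (64,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)  # how wrong were the guesses?
    loss.backward()                        # nudge the weights...
    optimizer.step()                       # ...so the next guess is less wrong
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Nobody hand-codes "what a face is" here; the labels plus the loss do the teaching, which is the shift that made everything smarter after neural nets took over around 2007.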
I didn't even know about this, but 142 00:08:30,160 --> 00:08:34,880 Speaker 1: it's ubiquitous and it's not super expensive, and that means 143 00:08:34,920 --> 00:08:38,960 Speaker 1: that the law enforcement agencies, they don't have to, 144 00:08:39,040 --> 00:08:41,560 Speaker 1: like, create their own; they can just say, well, let's just 145 00:08:41,559 --> 00:08:45,640 Speaker 1: sign up for Rekognition. Right, exactly. Because it's there, 146 00:08:45,720 --> 00:08:48,839 Speaker 1: and because it's relatively cheap, you can just get 147 00:08:48,880 --> 00:08:52,200 Speaker 1: a subscription, not just law enforcement agencies. If you have 148 00:08:52,280 --> 00:08:56,079 Speaker 1: a photo sharing app or whatever and you want some 149 00:08:56,160 --> 00:09:00,880 Speaker 1: facial recognition technology, you just contract with Amazon, and 150 00:09:00,960 --> 00:09:04,000 Speaker 1: Amazon goes, here you go, here's our data, 151 00:09:04,040 --> 00:09:06,520 Speaker 1: or, um, like, our code, and you can put 152 00:09:06,559 --> 00:09:10,200 Speaker 1: it onto your platform. Anybody can use it. Um, so 153 00:09:10,280 --> 00:09:12,760 Speaker 1: it is kind of everywhere, and that makes a lot 154 00:09:12,800 --> 00:09:18,120 Speaker 1: of people, including me, very very nervous. Because, as this 155 00:09:18,200 --> 00:09:22,880 Speaker 1: guy Woodrow Hartzog — if he's not Werner Herzog's brother, 156 00:09:23,640 --> 00:09:28,040 Speaker 1: I'll be disappointed. But a Woodrow and a 157 00:09:28,120 --> 00:09:31,800 Speaker 1: Werner, come on, you know? Anyway. Um, Woodrow Hart- 158 00:09:31,880 --> 00:09:38,319 Speaker 1: zog is a professor of computer science and, I believe, privacy, 159 00:09:38,840 --> 00:09:43,960 Speaker 1: civil liberty. Um, he basically says, look, there is no 160 00:09:44,120 --> 00:09:47,720 Speaker 1: way we're going to reap the benefits of facial recognition 161 00:09:48,080 --> 00:09:55,520 Speaker 1: without ultimately sliding irreversibly into a dystopian surveillance state. 162 00:09:55,800 --> 00:09:58,320 Speaker 1: It's happening right now, and if we don't do something 163 00:09:58,360 --> 00:10:01,640 Speaker 1: about it, it's never going to change back. We're about 164 00:10:01,679 --> 00:10:04,559 Speaker 1: to fully give up our privacy, because it's one thing 165 00:10:04,600 --> 00:10:07,240 Speaker 1: to have your phone tracked; you can give up your phone, 166 00:10:07,840 --> 00:10:10,920 Speaker 1: get yourself a burner phone, like you're Jesse Pinkman or 167 00:10:10,920 --> 00:10:13,520 Speaker 1: something like that, and then you just throw that phone away. 168 00:10:13,559 --> 00:10:16,319 Speaker 1: You can't be tracked anymore. You can't get a burner face. 169 00:10:16,840 --> 00:10:19,320 Speaker 1: And if that does become a thing down the line, 170 00:10:19,600 --> 00:10:22,040 Speaker 1: it'll be very, very expensive. So the average person can't 171 00:10:22,040 --> 00:10:26,200 Speaker 1: get a burner face. We'll be tracked by our face 172 00:10:26,760 --> 00:10:28,880 Speaker 1: everywhere we go.
And as we add more and more 173 00:10:29,120 --> 00:10:34,880 Speaker 1: cameras and this technology becomes cheaper and cheaper, it's we 174 00:10:34,920 --> 00:10:36,640 Speaker 1: will be living in a world where there will be 175 00:10:36,760 --> 00:10:39,920 Speaker 1: zero privacy and will be monitored and tracked because it 176 00:10:39,920 --> 00:10:42,679 Speaker 1: will be so easy, and it will be sold to 177 00:10:42,760 --> 00:10:45,160 Speaker 1: us like it's being sold to us now that it's 178 00:10:45,200 --> 00:10:48,600 Speaker 1: a law enforcement tool to get the bad guys. But 179 00:10:48,720 --> 00:10:51,760 Speaker 1: it's eventually going to extend to include everybody. But what 180 00:10:51,840 --> 00:10:54,240 Speaker 1: do you have to worry about. You're an upstanding citizen. 181 00:10:54,320 --> 00:10:57,440 Speaker 1: It doesn't matter if you're tracked. That's not true. That's 182 00:10:57,520 --> 00:11:00,680 Speaker 1: just not the case, everybody, It's not case. All right, 183 00:11:00,679 --> 00:11:05,559 Speaker 1: we're gonna call that soap soapbox soliloquy number one of 184 00:11:06,160 --> 00:11:10,240 Speaker 1: what I guarantee will be probably three or four. Uh, 185 00:11:10,360 --> 00:11:12,400 Speaker 1: let's talk of a little bit about how it works. 186 00:11:12,559 --> 00:11:16,280 Speaker 1: It is biometric authentication. Um. It's like a fingerprint or 187 00:11:16,320 --> 00:11:19,840 Speaker 1: a retina scan. And basically what it does is, uh, 188 00:11:20,240 --> 00:11:25,200 Speaker 1: it is precise measurements of a face to calculate every 189 00:11:25,240 --> 00:11:29,400 Speaker 1: person's very unique visual geometry, like how far apart your 190 00:11:29,400 --> 00:11:32,600 Speaker 1: eyes are, how how far apart your pupils are from 191 00:11:32,600 --> 00:11:36,079 Speaker 1: your nostrils. Yeah, your facial geometry, how your face is 192 00:11:36,120 --> 00:11:38,440 Speaker 1: all set up? I think. Yeah, it's even gotten into 193 00:11:38,440 --> 00:11:43,000 Speaker 1: things like facial hair, skin tone, um, skin texture. Yeah, 194 00:11:43,040 --> 00:11:45,240 Speaker 1: I'm sure it'll get just more and more specific. Yeah. 195 00:11:45,280 --> 00:11:48,040 Speaker 1: And then you know, because they're getting the machines are 196 00:11:48,040 --> 00:11:50,199 Speaker 1: getting better and better and easier and easier to train 197 00:11:50,280 --> 00:11:52,520 Speaker 1: on this stuff, you can just add more and more 198 00:11:52,600 --> 00:11:56,760 Speaker 1: data to it, and the the the recognition will just 199 00:11:56,840 --> 00:12:02,839 Speaker 1: become increasingly um um good. Yeah. And if you want 200 00:12:02,840 --> 00:12:06,600 Speaker 1: to throw off facial recognition software and freak out every 201 00:12:06,679 --> 00:12:10,320 Speaker 1: human you meet to shave your eyebrows, oh yeah, that 202 00:12:10,360 --> 00:12:13,080 Speaker 1: would be a little freaky. Have you ever seen that. 203 00:12:13,400 --> 00:12:15,360 Speaker 1: You've seen it before, I'm sure in movies and stuff. 204 00:12:15,400 --> 00:12:17,440 Speaker 1: It's an interesting thing. I remember a kid in industrial 205 00:12:17,520 --> 00:12:19,440 Speaker 1: arts class did that one year. 
He was like a 206 00:12:19,480 --> 00:12:21,839 Speaker 1: little you know, kind of a ninth grade burnout, and 207 00:12:21,840 --> 00:12:24,480 Speaker 1: he just showed up one day with no eyebrows, like 208 00:12:24,720 --> 00:12:27,559 Speaker 1: you would I think like not having a nose would 209 00:12:27,559 --> 00:12:32,000 Speaker 1: be more easily accepted. There's something just uncanny when someone 210 00:12:32,040 --> 00:12:34,920 Speaker 1: shaves their eyebrows, Like one day they have them, the 211 00:12:34,920 --> 00:12:39,920 Speaker 1: next day they don't. Was it like immediately recognizable what 212 00:12:39,960 --> 00:12:41,880 Speaker 1: the thing was, or was it like that's the thing 213 00:12:42,760 --> 00:12:46,199 Speaker 1: off today? Whereas if you, you know, you came in 214 00:12:46,240 --> 00:12:48,079 Speaker 1: the next day without a nose, the first thing you 215 00:12:48,080 --> 00:12:49,600 Speaker 1: would say is what happened to your nose? What happened 216 00:12:49,600 --> 00:12:51,880 Speaker 1: to your nose? Todd? Yeah, and Todd would be like, 217 00:12:52,200 --> 00:12:56,560 Speaker 1: I can't rent of it. I fell off. So those 218 00:12:56,600 --> 00:12:58,880 Speaker 1: measurements we were talking about. What happens then is they 219 00:12:58,920 --> 00:13:05,080 Speaker 1: compare that just like a fingerprint, with a database of images. 220 00:13:05,200 --> 00:13:08,480 Speaker 1: And depending on what this is for, it could be 221 00:13:08,520 --> 00:13:11,080 Speaker 1: like just within your company, or it could be the 222 00:13:11,120 --> 00:13:14,760 Speaker 1: FBI's database of mugshots, or it could be the d 223 00:13:14,840 --> 00:13:17,959 Speaker 1: m VS database of driver's license photos yeh, which we'll 224 00:13:17,960 --> 00:13:21,000 Speaker 1: get into. Yeah. What's interesting is each stage of the way, 225 00:13:21,040 --> 00:13:24,760 Speaker 1: there's a different algorithm that does you know, each increasingly 226 00:13:24,840 --> 00:13:29,760 Speaker 1: sophisticated step, until you finally have basically like all of 227 00:13:29,800 --> 00:13:32,880 Speaker 1: the different data points for that, you know, what makes 228 00:13:32,960 --> 00:13:35,880 Speaker 1: up that facial geometry, and then you can compare it 229 00:13:35,920 --> 00:13:38,320 Speaker 1: to all the other data points. We think of like 230 00:13:38,360 --> 00:13:43,199 Speaker 1: a computer running like a you know, a picture, You've 231 00:13:43,240 --> 00:13:46,000 Speaker 1: got your input picture and then running you know, all 232 00:13:46,040 --> 00:13:48,200 Speaker 1: the pictures next to it. That's not what it's doing. 233 00:13:48,280 --> 00:13:51,880 Speaker 1: It's it's running the numbers. Basically, it's doing computer stuff. Yeah. 234 00:13:51,880 --> 00:13:55,360 Speaker 1: I love that first step, which is you have to 235 00:13:55,400 --> 00:13:59,840 Speaker 1: teach the computer what a face is. So, I mean, 236 00:14:00,040 --> 00:14:02,960 Speaker 1: seems silly, but of course that's what it is. Well, yeah, 237 00:14:02,960 --> 00:14:04,640 Speaker 1: because I mean if you show it a picture of 238 00:14:04,840 --> 00:14:07,839 Speaker 1: a person standing next to a fire hydra shaming on 239 00:14:07,880 --> 00:14:10,679 Speaker 1: the fire Hydran, Yeah, and say hello, handsome, So this 240 00:14:10,720 --> 00:14:13,199 Speaker 1: is what a human face looks like. 
Yeah, or no, 241 00:14:13,400 --> 00:14:16,600 Speaker 1: that's a butt, And then it starts, you know, closer, closer. 242 00:14:16,640 --> 00:14:18,280 Speaker 1: All right, now that's a face. You know what a 243 00:14:18,320 --> 00:14:21,200 Speaker 1: face is. Now move on to step two, which is 244 00:14:21,560 --> 00:14:24,080 Speaker 1: stop screwing around. Yeah, so now you know what a 245 00:14:24,120 --> 00:14:26,040 Speaker 1: face is. You've got to normalize it for the photo, 246 00:14:26,120 --> 00:14:29,400 Speaker 1: which means there are not that many. Well, let's you 247 00:14:29,440 --> 00:14:34,600 Speaker 1: have to put it in the dockers. That normalizes, Um, 248 00:14:34,680 --> 00:14:37,000 Speaker 1: you isolate that face, and then you have to make 249 00:14:37,040 --> 00:14:41,680 Speaker 1: sure that it's normalized as far as looking at the camera. 250 00:14:42,280 --> 00:14:44,160 Speaker 1: So if it's a if you get a photo of 251 00:14:44,240 --> 00:14:47,440 Speaker 1: someone from a CCTV, let's say, and it's sort of 252 00:14:47,440 --> 00:14:50,600 Speaker 1: a three quarter, they have the ability to make it 253 00:14:50,680 --> 00:14:52,480 Speaker 1: as if it is looking straight at you. Yeah, the 254 00:14:52,520 --> 00:14:56,080 Speaker 1: computer can pretty accurately predict what the rest of the 255 00:14:56,120 --> 00:15:00,720 Speaker 1: face looks like face you know it on, I guess 256 00:15:00,720 --> 00:15:05,520 Speaker 1: face on um. And when it normalizes it like that, 257 00:15:05,560 --> 00:15:08,600 Speaker 1: it makes it much easier to compare to other pictures because, 258 00:15:09,000 --> 00:15:11,680 Speaker 1: as we'll see, most of the pictures or most of 259 00:15:11,680 --> 00:15:14,160 Speaker 1: the data points that it's comparing it to, are taken 260 00:15:14,200 --> 00:15:17,440 Speaker 1: from databases of pictures that have been taking of people 261 00:15:17,640 --> 00:15:20,040 Speaker 1: face on. Right. So that's why it wants to go 262 00:15:20,080 --> 00:15:24,040 Speaker 1: to driver's license. Yeah, well just spoiled it. But yes, 263 00:15:24,200 --> 00:15:26,720 Speaker 1: we already said that. Oh we did, Yeah, I did, Okay, 264 00:15:27,120 --> 00:15:32,440 Speaker 1: I missed it, I know. So uh from there, do 265 00:15:32,440 --> 00:15:38,280 Speaker 1: you have more algorithms still that isolate parts of the face. Um. 266 00:15:38,360 --> 00:15:40,160 Speaker 1: And this is where my old theory that like, there 267 00:15:40,200 --> 00:15:43,400 Speaker 1: are only so many sort of facial combinations, so that's 268 00:15:43,400 --> 00:15:45,960 Speaker 1: why you have doppelgangers. We got to do an episode 269 00:15:45,960 --> 00:15:47,840 Speaker 1: on doppel Gan. There's only so many things you can 270 00:15:47,880 --> 00:15:49,920 Speaker 1: do with two eyes to eyebrows and nose in a 271 00:15:49,960 --> 00:15:53,880 Speaker 1: mouth and cheekbones and in chin. Well, okay, what else? 272 00:15:54,560 --> 00:15:57,200 Speaker 1: I mean, there's not a whole lot. There's lips, lips, sure, 273 00:15:57,800 --> 00:16:01,720 Speaker 1: what about um uh uh uh, that's about it. These 274 00:16:01,720 --> 00:16:04,840 Speaker 1: are called elevens, the ridges between your eyebrows. Well, if 275 00:16:04,840 --> 00:16:07,320 Speaker 1: you want to get super specific, but that's what I'm saying. 276 00:16:07,320 --> 00:16:09,840 Speaker 1: I think they're getting more and more specific. Oh yeah, yeah. 
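To make the "measure the facial geometry, then compare it to a database" idea concrete, here is a minimal sketch. The landmark coordinates and the enrolled people below are invented stand-ins, and real systems compare learned embeddings rather than a handful of hand-picked distances, but the measure-then-match shape is the same.

```python
# Toy sketch: turn a few "nodal points" into a geometry vector, then find
# the closest match in a small database. The landmark coordinates and the
# enrolled "people" are made up, not output from a real face detector.
import numpy as np

def geometry_vector(landmarks):
    """Pairwise distances between landmark points (eyes, nose, mouth...)."""
    pts = np.asarray(landmarks, dtype=float)
    dists = [np.linalg.norm(pts[i] - pts[j])
             for i in range(len(pts)) for j in range(i + 1, len(pts))]
    v = np.array(dists)
    return v / v.max()  # normalize so overall face size doesn't matter

# Hypothetical enrolled database: name -> landmarks from a reference photo.
database = {
    "person_a": [(30, 40), (70, 40), (50, 60), (50, 80)],
    "person_b": [(28, 42), (72, 42), (50, 65), (50, 85)],
}
enrolled = {name: geometry_vector(lm) for name, lm in database.items()}

# A probe face from, say, a CCTV still (again, made-up numbers).
probe = geometry_vector([(29, 41), (71, 41), (50, 61), (50, 81)])

for name, vec in enrolled.items():
    similarity = 1.0 / (1.0 + np.linalg.norm(vec - probe))  # 1.0 = identical
    print(f"{name}: similarity {similarity:.2f}")
```

A production system does the same dance with a learned vector of hundreds of numbers per face instead of these handmade distances, which is why it is "running the numbers" rather than laying pictures side by side.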
277 00:16:09,880 --> 00:16:12,840 Speaker 1: But my whole point is, and we'll learn and here 278 00:16:12,880 --> 00:16:16,920 Speaker 1: and facial recognition. They do use Apple gangers, but put 279 00:16:16,920 --> 00:16:19,960 Speaker 1: a pin in that so they recognize all these features 280 00:16:20,480 --> 00:16:23,200 Speaker 1: and then each feature becomes what's called a nodal point 281 00:16:24,320 --> 00:16:28,600 Speaker 1: or no Dowell point. I think, nodal nodal, I think. 282 00:16:28,840 --> 00:16:31,240 Speaker 1: And this is where you're gonna get your super exact 283 00:16:32,080 --> 00:16:36,040 Speaker 1: uh angles and distances between all these parts as a 284 00:16:36,120 --> 00:16:41,320 Speaker 1: flat two dimensional thing. Which my question was because below here, 285 00:16:41,400 --> 00:16:43,680 Speaker 1: you know, it talks about Apple and their iPhone have 286 00:16:43,720 --> 00:16:47,720 Speaker 1: a three D facial recognition. Is that is two dimensional 287 00:16:47,760 --> 00:16:50,360 Speaker 1: superior to three D? I don't know, or is it 288 00:16:50,400 --> 00:16:53,360 Speaker 1: just because that's what all the pictures are in the databasis, 289 00:16:53,360 --> 00:16:55,000 Speaker 1: so that's what they do. I don't know. All I 290 00:16:55,040 --> 00:16:57,440 Speaker 1: know is my phone usually unlocks when I look at it. 291 00:16:59,560 --> 00:17:00,960 Speaker 1: You know what? I Hey, that's having to take off 292 00:17:01,000 --> 00:17:06,000 Speaker 1: my sunglasses. Worst so I found I've got some wayfarers 293 00:17:06,480 --> 00:17:09,560 Speaker 1: that I don't have to take, but my aviators I 294 00:17:09,560 --> 00:17:11,920 Speaker 1: do have to take off. Interesting do they keep trying 295 00:17:11,920 --> 00:17:16,399 Speaker 1: to make you into mav when you have on the 296 00:17:16,400 --> 00:17:20,399 Speaker 1: the go ahead? What is that? That was Tom Cruise 297 00:17:20,640 --> 00:17:26,760 Speaker 1: laughing and chewing gum. Okay, wow thanks, I feel like okay, 298 00:17:26,880 --> 00:17:29,159 Speaker 1: um we we we gotta keep going because I was 299 00:17:29,200 --> 00:17:34,680 Speaker 1: about to take a break unnecessarily. So um when the 300 00:17:34,800 --> 00:17:36,840 Speaker 1: when the computer is running through the pictures, it just 301 00:17:36,840 --> 00:17:40,000 Speaker 1: sis soon goes like no, no, no, no, no, millions 302 00:17:40,000 --> 00:17:42,720 Speaker 1: and millions of times and then finally goes yes. But 303 00:17:42,800 --> 00:17:46,280 Speaker 1: when it says yes and it spits out another picture, 304 00:17:46,480 --> 00:17:49,080 Speaker 1: it's not like this is that person? No you want 305 00:17:49,119 --> 00:17:51,639 Speaker 1: it to be. Because we all watch n c I S, 306 00:17:51,840 --> 00:17:54,760 Speaker 1: we all watch c S I, we all watch um 307 00:17:55,880 --> 00:17:59,880 Speaker 1: Law and Order, we all watch um uh party down 308 00:18:00,400 --> 00:18:07,239 Speaker 1: uh Andy Griffith, that Mattlock, the whole CHAMAI um, so 309 00:18:07,320 --> 00:18:09,159 Speaker 1: we wanted to just spit out and be like, here's 310 00:18:09,160 --> 00:18:12,720 Speaker 1: your here's your your person of interest, right. But what 311 00:18:12,760 --> 00:18:16,320 Speaker 1: it's really doing is it's it's producing a similarity score 312 00:18:17,520 --> 00:18:21,439 Speaker 1: that is probabilistic. 
It's saying there's a certain percent 313 00:18:21,600 --> 00:18:24,200 Speaker 1: chance that this is the same person as the picture, 314 00:18:25,040 --> 00:18:27,239 Speaker 1: or the person in the picture, that you uploaded. It's 315 00:18:27,240 --> 00:18:30,200 Speaker 1: a bit of a guess. It is, so a sophisticated 316 00:18:30,200 --> 00:18:33,880 Speaker 1: guess it is, and the better computers get at this, 317 00:18:35,320 --> 00:18:38,000 Speaker 1: the likelier it is that if they say there's 318 00:18:38,040 --> 00:18:41,159 Speaker 1: a high-percentage chance it's the same person, it's the 319 00:18:41,200 --> 00:18:45,159 Speaker 1: same person. But, as we'll see, it's up to 320 00:18:45,200 --> 00:18:50,200 Speaker 1: the human user to determine what is an acceptable threshold 321 00:18:50,359 --> 00:18:55,600 Speaker 1: of confidence. Frankly, 322 00:18:55,600 --> 00:18:58,240 Speaker 1: it really should be 99 percent or higher for the 323 00:18:58,280 --> 00:19:02,600 Speaker 1: confidence setting, for the confidence level. Isn't that what Amazon's 324 00:19:02,680 --> 00:19:07,560 Speaker 1: Rekognition says the threshold should be? I'm glad you said that, man, 325 00:19:07,640 --> 00:19:10,240 Speaker 1: because it really is creepy and I couldn't put my 326 00:19:10,280 --> 00:19:13,280 Speaker 1: finger on it. And it's exactly — I mean, I knew 327 00:19:13,280 --> 00:19:15,360 Speaker 1: the K looked weird or whatever, but it hadn't hit 328 00:19:15,440 --> 00:19:18,000 Speaker 1: me just how creepy it is and just how 329 00:19:18,080 --> 00:19:23,439 Speaker 1: off the mark, or potentially on the mark, that name is. Like, oh, like, 330 00:19:23,480 --> 00:19:26,200 Speaker 1: if my name was spelled C-H-U-K, am I 331 00:19:26,200 --> 00:19:29,080 Speaker 1: sinister? A little bit, you'd be more sinister. Yeah, I 332 00:19:29,119 --> 00:19:32,320 Speaker 1: don't think you could ever be truly sinister. I appreciate that. Alright, 333 00:19:32,400 --> 00:19:35,119 Speaker 1: let's take a break. I'm gonna go work on sinistering 334 00:19:35,200 --> 00:19:37,919 Speaker 1: up a bit, and we'll talk a little bit more 335 00:19:37,960 --> 00:20:04,840 Speaker 1: about some of the uses of FR right after this. So, 336 00:20:04,880 --> 00:20:07,840 Speaker 1: as with all technology, it has to be abbreviated into 337 00:20:07,920 --> 00:20:10,800 Speaker 1: two letters, the second of which is R. Do they 338 00:20:10,840 --> 00:20:14,359 Speaker 1: call it FR? I've seen it. Yes. I was 339 00:20:14,400 --> 00:20:17,280 Speaker 1: just being silly, but it doesn't surprise me. Nope. So in 340 00:20:17,800 --> 00:20:22,400 Speaker 1: FR, facial recognition technology, there are some 341 00:20:22,400 --> 00:20:25,159 Speaker 1: beneficial uses for it. Yeah. Like we said, getting 342 00:20:25,280 --> 00:20:30,119 Speaker 1: to tag people is chief among them. For 343 00:20:30,440 --> 00:20:33,600 Speaker 1: people like you and me, that's the pinnacle as it stands. 344 00:20:34,040 --> 00:20:37,040 Speaker 1: You don't have to tag people yourself. Facebook does it 345 00:20:37,119 --> 00:20:40,760 Speaker 1: for you. That's what we're trading everything for. I gotta 346 00:20:40,800 --> 00:20:45,440 Speaker 1: calm down. Okay. There are some other, like genuinely beneficial 347 00:20:45,520 --> 00:20:50,200 Speaker 1: uses too.
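The similarity score and confidence threshold described just before the break boil down to a filtering step that the human operator controls. A minimal sketch, with made-up candidate names and scores:

```python
# Toy sketch of the threshold decision: the matcher returns candidates with
# confidence scores, and the human operator decides the cutoff. The names
# and scores below are invented for illustration.
candidates = [
    ("suspect_001", 0.997),
    ("suspect_042", 0.912),
    ("suspect_318", 0.805),
]

def matches_above(candidates, threshold):
    return [(name, score) for name, score in candidates if score >= threshold]

# A strict 99 percent threshold (the setting Amazon recommends for law
# enforcement use of Rekognition) returns one candidate; a loose 80 percent
# threshold returns all three, two of whom may just be look-alikes.
print(matches_above(candidates, 0.99))   # [('suspect_001', 0.997)]
print(matches_above(candidates, 0.80))   # all three
```

Raising the threshold trades missed matches for fewer innocent look-alikes being flagged; lowering it does the reverse, which is why the choice of cutoff is a policy decision as much as a technical one.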
There's a nonprofit company called Thorn that scans 348 00:20:50,480 --> 00:20:56,879 Speaker 1: missing person's pictures against um pictures of children in child 349 00:20:56,920 --> 00:21:01,960 Speaker 1: porn videos, um or suspected human and trafficking to to 350 00:21:02,080 --> 00:21:04,879 Speaker 1: get matches, and apparently they've rescued a hundred kids so 351 00:21:04,960 --> 00:21:08,560 Speaker 1: far from using that technology. There's a pretty beneficial use 352 00:21:08,600 --> 00:21:13,000 Speaker 1: of facial recognition software. Uh dating apps. Let's say you 353 00:21:13,119 --> 00:21:15,959 Speaker 1: want to you can get pretty specific on what kind 354 00:21:16,000 --> 00:21:20,919 Speaker 1: of face you find attractive, which is interesting. But you 355 00:21:20,960 --> 00:21:24,840 Speaker 1: can say I really think, um, I like guys with 356 00:21:24,920 --> 00:21:28,040 Speaker 1: high cheekbones, and but no, you would go find mall lips. 357 00:21:28,280 --> 00:21:32,120 Speaker 1: It would be more like, um, somebody could be like, oh, 358 00:21:32,160 --> 00:21:34,679 Speaker 1: I really find Christian Bale attractive, and they get a 359 00:21:34,800 --> 00:21:37,600 Speaker 1: picture of Christian Bale into this dating app and I 360 00:21:37,640 --> 00:21:39,680 Speaker 1: would come up. But I wouldn't because I wouldn't be 361 00:21:39,720 --> 00:21:41,720 Speaker 1: in the dating app because I'm happily married. Do you 362 00:21:41,720 --> 00:21:44,879 Speaker 1: think you look like Christian Bale? I'm told that a lot. Really, Yeah, 363 00:21:45,080 --> 00:21:46,760 Speaker 1: that's weird. Don't think you look anything like it. I 364 00:21:46,800 --> 00:21:50,960 Speaker 1: don't either, but people say interesting. I'm I don't know 365 00:21:51,000 --> 00:21:52,520 Speaker 1: what I would do if I was dating now. I 366 00:21:52,520 --> 00:21:54,320 Speaker 1: guess I would just go to a service and say 367 00:21:54,440 --> 00:21:57,920 Speaker 1: and aliety type sure for more games era, But they'd 368 00:21:57,920 --> 00:22:00,399 Speaker 1: be like, okay, sir, you would just upload the picture. 369 00:22:00,440 --> 00:22:02,720 Speaker 1: You don't have to come into the office, which is 370 00:22:02,760 --> 00:22:05,000 Speaker 1: really not even open to the public, and just tell 371 00:22:05,080 --> 00:22:07,919 Speaker 1: us you're interested in Ali sheety type like a weirdo. 372 00:22:08,160 --> 00:22:10,239 Speaker 1: You mean, dating apps don't have offices where they just 373 00:22:10,600 --> 00:22:13,920 Speaker 1: feel complaints and interested parties. You sit down and they 374 00:22:13,960 --> 00:22:17,879 Speaker 1: they videotape you with the VHS camera, put you on 375 00:22:17,960 --> 00:22:19,560 Speaker 1: with some other guys on the tape. That's how they 376 00:22:19,640 --> 00:22:21,679 Speaker 1: used to do it. Oh yeah, that was one of 377 00:22:21,720 --> 00:22:26,720 Speaker 1: the subplots of Singles, the Cameron Crow movie. Oh yeah, 378 00:22:26,880 --> 00:22:29,520 Speaker 1: it was Expect the Best was the name of the 379 00:22:29,600 --> 00:22:32,720 Speaker 1: dating service, and you would make a videotape and like 380 00:22:32,720 --> 00:22:35,200 Speaker 1: a watch video tapes of people you know, saying who 381 00:22:35,240 --> 00:22:37,840 Speaker 1: they are. How do you remember that? I was a 382 00:22:37,840 --> 00:22:40,200 Speaker 1: big singles fan. That's not a bunch I got you. 
Yeah, 383 00:22:40,280 --> 00:22:44,480 Speaker 1: Expect the Best. You me pack of cigarettes and some coffee. 384 00:22:45,040 --> 00:22:49,440 Speaker 1: We don't need anything else becausezoom tight. So what else here? 385 00:22:49,560 --> 00:22:53,480 Speaker 1: This was? UM Taylor Swift and and her security team 386 00:22:53,480 --> 00:22:56,680 Speaker 1: on tour used it to scan the audience to see 387 00:22:56,680 --> 00:23:00,240 Speaker 1: if any of the creeps who have harassed and doctor 388 00:23:00,359 --> 00:23:03,399 Speaker 1: were in the audience. That's super beneficial. No one should 389 00:23:03,400 --> 00:23:06,119 Speaker 1: have to go through that. UM also, cops use it 390 00:23:06,320 --> 00:23:10,960 Speaker 1: in myriad ways, but in particular especially beneficial or um 391 00:23:11,040 --> 00:23:14,680 Speaker 1: when they use facial recognition to identify people who can't 392 00:23:14,720 --> 00:23:18,840 Speaker 1: identify themselves. Somebody in the midst of a psychotic break, 393 00:23:18,880 --> 00:23:25,159 Speaker 1: perhaps somebody wasted on troom's um, somebody you're not Jesus Christ, 394 00:23:25,359 --> 00:23:30,959 Speaker 1: who who has uh amnesia, Our friend Benjamin Kyle, who 395 00:23:31,040 --> 00:23:34,359 Speaker 1: apparently knows who he is now but he's decided not 396 00:23:34,400 --> 00:23:37,480 Speaker 1: to disclose it publicly. Remember the guy he was found 397 00:23:37,480 --> 00:23:41,399 Speaker 1: behind a burger king near a dumpster, had zero recollection 398 00:23:41,480 --> 00:23:43,719 Speaker 1: of who he was or how he got there, and 399 00:23:43,800 --> 00:23:49,800 Speaker 1: like there was this international publicity publicizing like who he 400 00:23:49,880 --> 00:23:52,800 Speaker 1: wasn't that he couldn't remember who he was, and somebody 401 00:23:52,840 --> 00:23:55,600 Speaker 1: finally came forward and identified him. So now he knows 402 00:23:55,600 --> 00:23:57,920 Speaker 1: who he was, but he went like a decade without knowing. 403 00:23:59,160 --> 00:24:01,680 Speaker 1: By the way, when I said, sorry, you're not Jesus Christ, 404 00:24:01,680 --> 00:24:03,520 Speaker 1: I was making fun of the guy on mushrooms, not 405 00:24:03,920 --> 00:24:06,480 Speaker 1: someone in the midst of a psychotic break. I just 406 00:24:06,520 --> 00:24:08,680 Speaker 1: want to be very specific. I think that was very 407 00:24:08,680 --> 00:24:13,600 Speaker 1: clear a right to make sure everybody knows that. So uh. 408 00:24:13,640 --> 00:24:15,479 Speaker 1: Those are some of the good ways that it can 409 00:24:15,520 --> 00:24:19,240 Speaker 1: be used UM. Now let's talk about all the bad ways. Yeah, 410 00:24:19,280 --> 00:24:21,120 Speaker 1: I mean when you're talking about the government, you're talking 411 00:24:21,119 --> 00:24:27,040 Speaker 1: about law enforcement, when you're talking about things like what's 412 00:24:27,080 --> 00:24:32,320 Speaker 1: going on allegedly in China with CCTVs everywhere trained to 413 00:24:32,840 --> 00:24:39,080 Speaker 1: UM to single out ethnic minorities and religious groups just 414 00:24:39,560 --> 00:24:43,320 Speaker 1: walking down the street going about their day. Yeah, that's 415 00:24:43,400 --> 00:24:46,760 Speaker 1: it gets into much different territory than tagging people in 416 00:24:47,040 --> 00:24:50,400 Speaker 1: dating apps. 
Yeah, it's pretty difficult to attend your religious 417 00:24:50,400 --> 00:24:53,440 Speaker 1: service if you're not allowed to attend your religious service 418 00:24:53,440 --> 00:24:56,680 Speaker 1: and you're being tracked everywhere you go. Yeah, that's why 419 00:24:56,720 --> 00:25:00,280 Speaker 1: places like and this is the most predictable thing in 420 00:25:00,320 --> 00:25:05,080 Speaker 1: the world San Francisco, Oakland and Berkeley and then Somerville, Maine. 421 00:25:05,880 --> 00:25:08,159 Speaker 1: I knew the Manners would be in there. They're not 422 00:25:08,200 --> 00:25:13,280 Speaker 1: into this, that's right. They have banned law enforcement UM 423 00:25:13,320 --> 00:25:18,280 Speaker 1: from using facial recognition altogether in California UM as a state, 424 00:25:19,000 --> 00:25:21,520 Speaker 1: and the ACE has put a three or moratorium on 425 00:25:21,560 --> 00:25:24,399 Speaker 1: the use of it UM body cams, which and the 426 00:25:24,400 --> 00:25:26,639 Speaker 1: a c l U is basically I know this is 427 00:25:26,720 --> 00:25:28,439 Speaker 1: jumping ahead, but they're at the point where they're like, 428 00:25:28,960 --> 00:25:31,240 Speaker 1: we need to tap the brakes here for a few 429 00:25:31,320 --> 00:25:34,879 Speaker 1: years and like because there's no legislation about this yet 430 00:25:35,200 --> 00:25:38,520 Speaker 1: and it's just going full steam ahead. Yeah, I really 431 00:25:38,560 --> 00:25:41,720 Speaker 1: I don't want to like like run past that there 432 00:25:41,840 --> 00:25:46,359 Speaker 1: is aside from Berkeley, San Francisco, who's the other one, 433 00:25:46,440 --> 00:25:53,159 Speaker 1: Oakland and Somerville, Maine. UM, there are no laws, state, local, 434 00:25:53,240 --> 00:25:57,639 Speaker 1: or federal governing the use of facial recognition technology by 435 00:25:57,720 --> 00:26:00,840 Speaker 1: law enforcement. It's just happening very fast. Whatever they want 436 00:26:00,880 --> 00:26:04,560 Speaker 1: to do, they can do UM and in some cases 437 00:26:04,960 --> 00:26:08,160 Speaker 1: they do all sorts of stuff with it. They will 438 00:26:08,240 --> 00:26:12,320 Speaker 1: use it, like the NYPD very famously used UM what 439 00:26:12,400 --> 00:26:14,880 Speaker 1: you were talking about with doppelgangers. There was a guy 440 00:26:14,920 --> 00:26:18,080 Speaker 1: who was caught stealing beer at a CVS. Not even 441 00:26:18,080 --> 00:26:22,200 Speaker 1: a duane read a CVS. UM, and they said, but 442 00:26:22,320 --> 00:26:24,600 Speaker 1: this guy looks a lot like Woody Harrelson. We don't 443 00:26:24,600 --> 00:26:26,399 Speaker 1: have a good we don't have a good shot of 444 00:26:26,480 --> 00:26:32,879 Speaker 1: him to use in official recognitions. So they went and 445 00:26:32,920 --> 00:26:35,440 Speaker 1: got a pick of Woody Harrelson, and there they came 446 00:26:35,520 --> 00:26:37,679 Speaker 1: up with a match, and they think it was the 447 00:26:37,720 --> 00:26:42,280 Speaker 1: guy on video and CVS and so UM. The Georgetown 448 00:26:42,440 --> 00:26:46,000 Speaker 1: School of Law UH produced a study called Garbage in, 449 00:26:46,040 --> 00:26:49,560 Speaker 1: Garbage Out, and they were basically like, that's not okay, 450 00:26:49,640 --> 00:26:52,440 Speaker 1: you really shouldn't be doing that. But that's the level 451 00:26:52,480 --> 00:26:55,520 Speaker 1: of legality as it stands right now. 
There's it's just 452 00:26:55,640 --> 00:27:00,480 Speaker 1: open season, um, and uh, it's just basically whatever you 453 00:27:00,480 --> 00:27:02,199 Speaker 1: want to do, you can do as far as facial 454 00:27:02,200 --> 00:27:05,240 Speaker 1: recognition is concerned. In that story in particular, it's like 455 00:27:05,240 --> 00:27:07,960 Speaker 1: some people are like, awesome, the system works. Sure. Other 456 00:27:07,960 --> 00:27:10,280 Speaker 1: people are like, what about poor Woody Harrelson. He was 457 00:27:10,320 --> 00:27:13,640 Speaker 1: really in danger right then of being implicated in this 458 00:27:13,680 --> 00:27:17,639 Speaker 1: beer stealing scheme from CVS and what he said? What dude, 459 00:27:18,800 --> 00:27:23,359 Speaker 1: I love that guy man. True Detective the first season, Yeah, 460 00:27:23,520 --> 00:27:27,359 Speaker 1: first four episodes just amazing. That's called using a probe 461 00:27:27,359 --> 00:27:30,560 Speaker 1: photo when you use when you say, hey, that looks 462 00:27:30,600 --> 00:27:32,960 Speaker 1: like someone. They also did the same with one of 463 00:27:32,960 --> 00:27:35,399 Speaker 1: the New York Knicks. Apparently I could not, for the 464 00:27:35,440 --> 00:27:38,040 Speaker 1: life of me find out who. Yeah, it's like he's 465 00:27:38,080 --> 00:27:41,400 Speaker 1: being protected or something that maybe no one said who 466 00:27:41,400 --> 00:27:44,119 Speaker 1: it was. Um, a couple of numbers for you, though. Uh. 467 00:27:44,160 --> 00:27:49,080 Speaker 1: The FBI receives about fifty thousand facial recognition search submissions 468 00:27:49,080 --> 00:27:51,960 Speaker 1: a month for their database. So that's the other thing. 469 00:27:52,160 --> 00:27:55,520 Speaker 1: If you don't have even the money for a subscription 470 00:27:55,560 --> 00:27:58,760 Speaker 1: to Amazon Recognition, or you don't have an I T 471 00:27:58,960 --> 00:28:02,000 Speaker 1: person who's capable of assembling and putting it, you know, 472 00:28:02,119 --> 00:28:05,480 Speaker 1: using it UM. You can just submit these requests to 473 00:28:05,520 --> 00:28:08,760 Speaker 1: the FBI. So there's a lot of different avenues you 474 00:28:08,800 --> 00:28:11,920 Speaker 1: could take his law enforcement to to UM to use 475 00:28:12,000 --> 00:28:16,200 Speaker 1: facial recognition technology to catch suspected criminals. Yeah. I was 476 00:28:16,200 --> 00:28:18,680 Speaker 1: about to say bad guys, but who knows. We'll see 477 00:28:19,000 --> 00:28:22,320 Speaker 1: it's not always the case. So here's some more numbers though, 478 00:28:23,680 --> 00:28:28,280 Speaker 1: because you know it needs to be regulated. But when 479 00:28:28,280 --> 00:28:30,520 Speaker 1: it works, it really works. Yeah, it really does though. 480 00:28:30,520 --> 00:28:32,920 Speaker 1: It is the thing. Yeah, there was one department where 481 00:28:32,960 --> 00:28:34,959 Speaker 1: they said it lowered the average time required for an 482 00:28:34,960 --> 00:28:37,920 Speaker 1: officer to identify a subject from an image from thirty 483 00:28:38,000 --> 00:28:41,600 Speaker 1: days to three minutes, which kind of brings home the 484 00:28:41,640 --> 00:28:48,280 Speaker 1: point there's another number in here. This interesting, but uh, 485 00:28:48,440 --> 00:28:50,840 Speaker 1: it brings on the point that like, this is something 486 00:28:50,840 --> 00:28:54,920 Speaker 1: that human policemen were doing. 
Officers were doing with their 487 00:28:54,960 --> 00:28:59,080 Speaker 1: eyeballs by flipping through books, yes, for thirty days straight, 488 00:28:59,440 --> 00:29:01,960 Speaker 1: saying like it doesn't look like this person. This is 489 00:29:02,000 --> 00:29:04,800 Speaker 1: like a chance to really speed up that process and 490 00:29:04,840 --> 00:29:09,120 Speaker 1: to spend more time in theory catching bad guys. Yes, 491 00:29:09,400 --> 00:29:12,080 Speaker 1: I'm not arguing for it. I'm just saying they were 492 00:29:12,120 --> 00:29:16,440 Speaker 1: doing this anyway, just through manpower. Right. I think the 493 00:29:16,560 --> 00:29:22,560 Speaker 1: thing is is anytime you add artificial intelligence automatically makes 494 00:29:22,640 --> 00:29:27,400 Speaker 1: the user of the artificial intelligence aside unfair unfairly advantaged. 495 00:29:27,960 --> 00:29:30,520 Speaker 1: It's not like the criminals are able to use AI 496 00:29:30,600 --> 00:29:33,960 Speaker 1: to steal beer from cvs more effectively, but the cops 497 00:29:34,000 --> 00:29:36,960 Speaker 1: are using AI to catch them stealing beer more effectively. 498 00:29:37,360 --> 00:29:40,160 Speaker 1: And it's kind of like, yes, it makes sense to 499 00:29:40,320 --> 00:29:45,640 Speaker 1: catch like child pornographers and um human traffickers and rapists 500 00:29:45,680 --> 00:29:49,280 Speaker 1: and murderers and violent criminals with this stuff, but using 501 00:29:49,320 --> 00:29:51,479 Speaker 1: that kind of technology catch somebody who stole beer from 502 00:29:51,480 --> 00:29:53,840 Speaker 1: a CVS. If that's when it starts to feel like 503 00:29:54,840 --> 00:29:58,680 Speaker 1: what kind of society are we moving towards? You know? Well, 504 00:29:58,720 --> 00:30:00,959 Speaker 1: I think someone not. And let me keep going here 505 00:30:00,960 --> 00:30:02,600 Speaker 1: for a second, because I don't want people be like, 506 00:30:02,600 --> 00:30:04,440 Speaker 1: what do you in favor of the guy stealing beer 507 00:30:04,480 --> 00:30:06,920 Speaker 1: from CVS? No, I'm not. I think you're a scumbag 508 00:30:06,960 --> 00:30:10,480 Speaker 1: if you steal beer from CVS. But I also think 509 00:30:10,520 --> 00:30:13,840 Speaker 1: that it's overkilled to use facial recognition technology to catch 510 00:30:13,920 --> 00:30:19,280 Speaker 1: that person. Use old fashioned police tactics or don't catch them. Yes, 511 00:30:19,400 --> 00:30:21,320 Speaker 1: it's just kind of the fairness of the old western 512 00:30:21,360 --> 00:30:23,160 Speaker 1: New York City. I think I might be on the 513 00:30:23,200 --> 00:30:26,040 Speaker 1: other side because I don't think we need to set 514 00:30:26,080 --> 00:30:31,200 Speaker 1: a fair playing ground between criminals and cops and saying 515 00:30:31,280 --> 00:30:34,000 Speaker 1: like it's unfair that cops can use this stuff and 516 00:30:34,080 --> 00:30:37,160 Speaker 1: criminals are just out there not able to use these 517 00:30:37,160 --> 00:30:41,080 Speaker 1: same techniques. Okay, So my the fairness thing doesn't just 518 00:30:41,360 --> 00:30:43,920 Speaker 1: end at the law and order thing, right like, Like, 519 00:30:44,000 --> 00:30:47,600 Speaker 1: it's not just with cops using it. They they have 520 00:30:47,720 --> 00:30:50,720 Speaker 1: this huge advantage. I totally get how people would be like, no, 521 00:30:50,840 --> 00:30:53,840 Speaker 1: give the cops that huge advantage. 
I don't have an 522 00:30:53,920 --> 00:30:57,000 Speaker 1: issue with that in and of itself. I think my 523 00:30:57,080 --> 00:31:00,760 Speaker 1: issue comes a step or two down the road, sure, 524 00:31:00,760 --> 00:31:05,520 Speaker 1: where the government or the cops, acting on behalf of 525 00:31:05,520 --> 00:31:09,080 Speaker 1: the government, use that against everyday citizens who have no 526 00:31:09,200 --> 00:31:15,280 Speaker 1: recourse whatsoever. That that that that lopsidedness that's so evident 527 00:31:15,360 --> 00:31:18,840 Speaker 1: when you're using AI to catch somebody's stealing beer from 528 00:31:18,840 --> 00:31:22,040 Speaker 1: the CBS. It's really easy to kind of follow that 529 00:31:22,120 --> 00:31:25,040 Speaker 1: a little further across to the horizon and see just 530 00:31:25,120 --> 00:31:28,720 Speaker 1: how unfair life could be and how oppressive that could 531 00:31:28,720 --> 00:31:32,040 Speaker 1: be using that technology. I think that's ultimately what I'm saying. 532 00:31:32,640 --> 00:31:36,520 Speaker 1: I hopefully dug myself out of that hole by now so. 533 00:31:36,960 --> 00:31:40,680 Speaker 1: Uh And and this gets into some of the controversies 534 00:31:40,720 --> 00:31:43,800 Speaker 1: and the arguments if you're if you're scanning mug shots 535 00:31:44,760 --> 00:31:50,000 Speaker 1: uh for rapists and arsonists and murderers and violent criminals, 536 00:31:50,640 --> 00:31:53,400 Speaker 1: and you're catching people, You're not gonna find a lot 537 00:31:53,440 --> 00:31:56,520 Speaker 1: of people that say, well, that's not fair, go back 538 00:31:56,560 --> 00:31:58,480 Speaker 1: and use take a month to look through a mug 539 00:31:58,520 --> 00:32:02,360 Speaker 1: shot book instead, and waste a bunch of time and 540 00:32:02,440 --> 00:32:05,800 Speaker 1: don't be efficient. So I think most people would say 541 00:32:06,240 --> 00:32:07,960 Speaker 1: if you're looking at mug shots, although we should point 542 00:32:07,960 --> 00:32:10,120 Speaker 1: out that a mug shot doesn't mean that just means 543 00:32:10,160 --> 00:32:12,320 Speaker 1: you're arrested. That doesn't mean you were guilty of anything. 544 00:32:13,200 --> 00:32:17,360 Speaker 1: Um So, there are plenty of opportunities for false positives 545 00:32:17,400 --> 00:32:21,560 Speaker 1: and people being put in jail that shouldn't be. But 546 00:32:21,560 --> 00:32:23,200 Speaker 1: but there's not a lot of people who are like, no, 547 00:32:23,200 --> 00:32:27,320 Speaker 1: not use mugshot databases right Exactly, If you're scanning driver's 548 00:32:27,360 --> 00:32:31,640 Speaker 1: license databases or other just general public databases, that's when 549 00:32:31,640 --> 00:32:35,320 Speaker 1: it gets super tricky because we can't avoid the fact 550 00:32:35,400 --> 00:32:37,680 Speaker 1: that what that means is in in the Center on 551 00:32:37,760 --> 00:32:42,120 Speaker 1: Privacy and Technology, Um kind of stated very plainly what 552 00:32:42,160 --> 00:32:46,400 Speaker 1: that means is everyone is in a perpetual lineup. Essentially, 553 00:32:47,000 --> 00:32:49,120 Speaker 1: if you have a driver's license, you're part of a 554 00:32:49,120 --> 00:32:52,000 Speaker 1: police lineup. Yeah, whether you like it or not, whether 555 00:32:52,040 --> 00:32:54,840 Speaker 1: you know it or not. 
And if that computer says 556 00:32:55,080 --> 00:32:58,520 Speaker 1: here's the guy, it's Chuck Bryant — uh, they will say, oh, 557 00:32:58,560 --> 00:33:00,320 Speaker 1: he doesn't strike me as very sinister, and the 558 00:33:00,320 --> 00:33:03,520 Speaker 1: computer would be like, trust me, this is the guy, uh, 559 00:33:03,600 --> 00:33:07,520 Speaker 1: with like an eighty-something percent confidence interval. Um, Chuck, 560 00:33:07,560 --> 00:33:10,240 Speaker 1: suddenly you're going to get visited by the cops, and 561 00:33:10,400 --> 00:33:12,640 Speaker 1: maybe you'll even get arrested because you were a little 562 00:33:12,720 --> 00:33:14,600 Speaker 1: cagey when they talked to you and you set off 563 00:33:14,680 --> 00:33:17,719 Speaker 1: their cop radar or whatever. And then the next thing 564 00:33:17,760 --> 00:33:20,440 Speaker 1: you know, you're in court being charged with a crime 565 00:33:20,480 --> 00:33:23,600 Speaker 1: that you didn't commit, because the computer implicated you and 566 00:33:23,640 --> 00:33:26,200 Speaker 1: the cops thought that you were acting cagey. And let's 567 00:33:26,200 --> 00:33:29,520 Speaker 1: say that you were a very, very poor person and 568 00:33:29,560 --> 00:33:31,560 Speaker 1: you don't have any money to mount a decent defense. 569 00:33:31,880 --> 00:33:34,400 Speaker 1: The best you can afford is a free public defender 570 00:33:34,600 --> 00:33:37,120 Speaker 1: who has fifty other cases and is not really paying very 571 00:33:37,200 --> 00:33:40,160 Speaker 1: much attention to you, and you're in jail now because 572 00:33:40,160 --> 00:33:43,760 Speaker 1: you got convicted wrongly, because you were put in a lineup 573 00:33:44,040 --> 00:33:47,040 Speaker 1: just because you had a driver's license. Yeah. I think 574 00:33:47,080 --> 00:33:51,120 Speaker 1: for me, um, and this is totally my privilege coming 575 00:33:51,120 --> 00:33:53,760 Speaker 1: through as well, like, I'd want to see some numbers. 576 00:33:54,400 --> 00:33:58,520 Speaker 1: If for every ten thousand arrests and convictions 577 00:33:58,600 --> 00:34:01,160 Speaker 1: of a real criminal, a rapist or a murderer, 578 00:34:01,520 --> 00:34:05,120 Speaker 1: there's three people that get falsely identified and have 579 00:34:05,200 --> 00:34:07,160 Speaker 1: to go through the system and may or may 580 00:34:07,200 --> 00:34:10,040 Speaker 1: not be acquitted, I'd want to see those numbers. But 581 00:34:10,080 --> 00:34:13,960 Speaker 1: again, that's coming from a privileged position as someone who 582 00:34:13,960 --> 00:34:18,360 Speaker 1: could afford a legal defense, who is, uh, white. Yeah, exactly. 583 00:34:18,520 --> 00:34:21,759 Speaker 1: That's another one too, is that people of color, uh, 584 00:34:22,040 --> 00:34:26,719 Speaker 1: bear an inordinate burden, a disproportionate burden, when it comes to 585 00:34:26,760 --> 00:34:29,640 Speaker 1: facial recognition technologies, as we'll see. Well, I mean you might 586 00:34:29,680 --> 00:34:31,920 Speaker 1: as well go ahead and talk about that. It's, um — 587 00:34:31,960 --> 00:34:35,040 Speaker 1: I think from the beginning, even with social media, there 588 00:34:35,040 --> 00:34:40,960 Speaker 1: were certain facial recognition, early facial recognition technologies that admitted, like, 589 00:34:41,160 --> 00:34:46,320 Speaker 1: we're not as good at, uh, seeing or recognizing faces 590 00:34:46,400 --> 00:34:49,120 Speaker 1: with darker skin. It's just not that good.
Yeah, I 591 00:34:49,120 --> 00:34:52,239 Speaker 1: think something like twelve percent of darker skinned men were misidentified, 592 00:34:52,760 --> 00:34:57,600 Speaker 1: and thirty-five percent of darker skinned women, compared to one percent and 593 00:34:57,680 --> 00:35:00,880 Speaker 1: seven percent of lighter skinned men and women. And 594 00:35:00,960 --> 00:35:04,240 Speaker 1: they say it's because of the data sets that these 595 00:35:04,640 --> 00:35:08,560 Speaker 1: machines have been trained on, which is not, it's not purposeful, 596 00:35:08,680 --> 00:35:10,440 Speaker 1: but it makes sense if you live in, like, a 597 00:35:10,600 --> 00:35:14,880 Speaker 1: society where generally, like, the white people are in power, and 598 00:35:14,960 --> 00:35:17,960 Speaker 1: it's like whiteness is the most celebrated part of the 599 00:35:18,080 --> 00:35:20,480 Speaker 1: society or whatever. That's why you're going to have more 600 00:35:20,520 --> 00:35:24,120 Speaker 1: pictures of them. And when you feed just a bunch of 601 00:35:24,160 --> 00:35:27,080 Speaker 1: pictures from your society into a machine and say, learn 602 00:35:27,120 --> 00:35:29,279 Speaker 1: what faces are, it's going to go, oh, white men, 603 00:35:29,400 --> 00:35:31,960 Speaker 1: I got you. Well, there just are more white people, 604 00:35:32,160 --> 00:35:36,279 Speaker 1: numbers wise. That probably has something to do with it, right? Um, yes, 605 00:35:36,360 --> 00:35:38,480 Speaker 1: that's a really, that is an excellent point as well, 606 00:35:38,640 --> 00:35:41,840 Speaker 1: for sure. But the fact of the matter is the 607 00:35:41,960 --> 00:35:45,960 Speaker 1: data sets that the machines are learning on are largely 608 00:35:46,120 --> 00:35:50,239 Speaker 1: white and largely male, and so they're just not as 609 00:35:50,280 --> 00:35:55,080 Speaker 1: good at recognizing the differences in faces among, um, people 610 00:35:55,120 --> 00:35:59,359 Speaker 1: who aren't white males. Uh, let's read these quotes. There's 611 00:35:59,400 --> 00:36:02,080 Speaker 1: a couple of good quotes here. The first one is 612 00:36:02,120 --> 00:36:07,719 Speaker 1: from Woodrow Hartzog. I was going to read it 613 00:36:07,760 --> 00:36:10,000 Speaker 1: as Werner. I don't know if I can. I should 614 00:36:10,040 --> 00:36:12,560 Speaker 1: get Nolan here. He does a good Werner. The most 615 00:36:12,680 --> 00:36:18,040 Speaker 1: uniquely dangerous surveillance mechanism ever invented. It's an irresistible tool 616 00:36:18,080 --> 00:36:23,440 Speaker 1: for oppression that's perfectly suited for governments to display unprecedented 617 00:36:23,440 --> 00:36:29,439 Speaker 1: authoritari, I'm sorry, authoritarian control, and an all-out 618 00:36:29,480 --> 00:36:36,080 Speaker 1: privacy-eviscerating machine. I just realized it's Hartzog, so 619 00:36:36,200 --> 00:36:39,600 Speaker 1: it's spelled differently. It's H A R T zog; 620 00:36:39,640 --> 00:36:42,560 Speaker 1: Herzog is just H E R Z O G. I'm 621 00:36:42,600 --> 00:36:45,840 Speaker 1: glad that we didn't figure that out beforehand, though. You 622 00:36:45,880 --> 00:36:47,440 Speaker 1: want to take the other one, though? I also, I 623 00:36:47,480 --> 00:36:50,080 Speaker 1: have to say, I detected a hint of Michael Caine 624 00:36:50,080 --> 00:36:52,600 Speaker 1: in there too. There might have been a little bit. 625 00:36:52,680 --> 00:36:54,880 Speaker 1: It's hard to get Michael Caine out of my system.
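For anyone who wants to see the arithmetic behind the last couple of minutes laid out, here is a minimal back-of-the-envelope sketch in Python. Every figure in it comes from the conversation itself: Chuck's purely hypothetical one-real-hit-per-ten-thousand-searches scenario with three false identifications, and the misidentification rates just cited from the MIT-style study. None of it is measured from a real facial recognition system; the group labels and counts are only there to show scale.

# Back-of-the-envelope arithmetic only; every figure is hypothetical or quoted
# from the discussion above, not a measurement of any real system.

searches_per_true_hit = 10_000   # Chuck's hypothetical: one real criminal caught per 10,000 searches
false_ids_per_true_hit = 3       # his hypothetical: three people falsely identified along the way

print(f"Hypothetical trade-off: {false_ids_per_true_hit} wrongful identifications "
      f"per {searches_per_true_hit} searches that produce one real hit.")

# Misidentification rates as cited in the episode (MIT Media Lab-style figures).
misidentification_rate = {
    "darker skinned men": 0.12,
    "darker skinned women": 0.35,
    "lighter skinned men": 0.01,
    "lighter skinned women": 0.07,
}

# Expected wrongful matches if 1,000 people from each group are run through such a system.
people_scanned = 1_000
for group, rate in misidentification_rate.items():
    print(f"{group}: roughly {rate * people_scanned:.0f} misidentified out of {people_scanned}")

Run as-is, it just prints expected counts, which is the point: error rates that differ by an order of magnitude between groups turn into very different numbers of innocent people flagged, even when everyone is scanned the same way.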
626 00:36:55,000 --> 00:36:58,799 Speaker 1: What's the other one? Oh, from Microsoft president Brad Smith? Yeah. So, 627 00:36:58,920 --> 00:37:02,400 Speaker 1: Brad Smith says that when combined with ubiquitous cameras and 628 00:37:02,520 --> 00:37:05,520 Speaker 1: massive computing power and storage in the cloud, a government 629 00:37:05,600 --> 00:37:10,120 Speaker 1: could use facial recognition technology to enable continuous surveillance of 630 00:37:10,200 --> 00:37:14,280 Speaker 1: specific individuals, like they're supposedly doing in China, as an aside. 631 00:37:14,360 --> 00:37:18,040 Speaker 1: It could follow anyone anywhere, or for that matter, everyone everywhere, 632 00:37:18,200 --> 00:37:21,000 Speaker 1: at any time, or even all the time. And 633 00:37:21,080 --> 00:37:23,560 Speaker 1: this wasn't a sales pitch. He was speaking out 634 00:37:23,640 --> 00:37:26,319 Speaker 1: against this to Congress, saying, like, guys, we 635 00:37:26,360 --> 00:37:28,239 Speaker 1: have to do something about this, because this is the 636 00:37:29,200 --> 00:37:34,400 Speaker 1: path we're heading down. And that's why, uh, Seth 637 00:37:34,440 --> 00:37:38,360 Speaker 1: Abramowitz changed his name to Brad Smith. It sounds like 638 00:37:38,400 --> 00:37:43,080 Speaker 1: a total, like, I-just-want-to-blend-in name. Um. 639 00:37:43,160 --> 00:37:47,320 Speaker 1: So you've got scanning against mug shots, scanning against driver's licenses, 640 00:37:47,440 --> 00:37:49,640 Speaker 1: and then, um, there's a new one that just came out. 641 00:37:49,680 --> 00:37:53,400 Speaker 1: The New York Times just released this exposé on January eighteenth, 642 00:37:53,480 --> 00:37:56,479 Speaker 1: just a few days ago, um, on a company called 643 00:37:56,480 --> 00:38:02,200 Speaker 1: Clearview AI. And apparently even among, um, Silicon Valley 644 00:38:02,280 --> 00:38:06,399 Speaker 1: there has been this longstanding kind of unspoken thing where, 645 00:38:06,920 --> 00:38:10,600 Speaker 1: let's steer clear of this facial recognition technology because it's 646 00:38:10,640 --> 00:38:14,560 Speaker 1: such a tool of oppression, potentially. And Clearview AI said, hey, 647 00:38:14,560 --> 00:38:16,800 Speaker 1: we're not from Silicon Valley, we're just going to 648 00:38:16,920 --> 00:38:19,880 Speaker 1: do our own thing. So now there's this 649 00:38:20,000 --> 00:38:23,839 Speaker 1: tool that's available to law enforcement agencies, and they're using it. 650 00:38:23,880 --> 00:38:26,280 Speaker 1: Remember that one guy who had a quote 651 00:38:26,320 --> 00:38:29,240 Speaker 1: saying that, um, it went from thirty days to three minutes? 652 00:38:29,760 --> 00:38:33,040 Speaker 1: They were almost certainly using Clearview AI. And the 653 00:38:33,120 --> 00:38:36,120 Speaker 1: reason Clearview AI has such an advantage is because they've 654 00:38:36,200 --> 00:38:39,680 Speaker 1: gone to this place that everyone else said was off limits, 655 00:38:39,880 --> 00:38:43,360 Speaker 1: which is scraping social media.
So rather than the forty 656 00:38:43,400 --> 00:38:47,920 Speaker 1: one million driver's license and mug shot pictures that are 657 00:38:47,920 --> 00:38:51,960 Speaker 1: available in the FBI's database, Clearview AI is this 658 00:38:52,040 --> 00:38:54,080 Speaker 1: app that you can subscribe to for a year for 659 00:38:54,160 --> 00:38:56,840 Speaker 1: like two thousand to ten thousand dollars, and they have 660 00:38:57,239 --> 00:39:02,160 Speaker 1: three billion pictures, including links to the social accounts of 661 00:39:02,200 --> 00:39:04,919 Speaker 1: the people whose pictures come up, so that you can 662 00:39:04,920 --> 00:39:07,560 Speaker 1: not only see who it is, you can find out 663 00:39:07,560 --> 00:39:11,839 Speaker 1: where they're at right then. And it's a hugely invasive thing. 664 00:39:11,920 --> 00:39:16,120 Speaker 1: And there's no legislation on this whatsoever. And it's only 665 00:39:16,160 --> 00:39:18,880 Speaker 1: just recently come out that this company even exists, 666 00:39:18,960 --> 00:39:21,040 Speaker 1: or that this app exists, and that law enforcement is 667 00:39:21,120 --> 00:39:24,600 Speaker 1: using this stuff, because again, there's basically no laws saying 668 00:39:24,640 --> 00:39:27,279 Speaker 1: you can do this, you can't do that. Um, and 669 00:39:27,320 --> 00:39:31,160 Speaker 1: again, Woodrow Hartzog has basically said, there's no way we're 670 00:39:31,160 --> 00:39:35,200 Speaker 1: going to realize the benefits of this without the incredibly 671 00:39:35,640 --> 00:39:40,040 Speaker 1: disproportionate drawbacks, and, um, he just calls for an all-out 672 00:39:40,080 --> 00:39:42,600 Speaker 1: ban on the technology. He's basically saying it's not worth it. 673 00:39:42,960 --> 00:39:45,120 Speaker 1: All right, let's take another break. Oh my gosh, we 674 00:39:45,160 --> 00:39:48,120 Speaker 1: haven't taken our second break yet. Uh, and we'll be 675 00:39:48,200 --> 00:39:51,480 Speaker 1: right back to talk about the rest of this stuff 676 00:39:52,040 --> 00:40:17,759 Speaker 1: right after this. I think we should talk a little bit, 677 00:40:18,040 --> 00:40:22,120 Speaker 1: like, we've talked about the false positives, um, and I 678 00:40:22,160 --> 00:40:27,760 Speaker 1: think with Amazon, their contention is that what you're talking 679 00:40:27,760 --> 00:40:29,600 Speaker 1: about with, like, the studies out of M I T 680 00:40:29,920 --> 00:40:33,000 Speaker 1: that said, um, that there are too many false positives, 681 00:40:33,239 --> 00:40:35,439 Speaker 1: they're saying, wait a minute, you're talking about 682 00:40:35,440 --> 00:40:39,720 Speaker 1: facial analysis, not facial recognition, and those are two different things. 683 00:40:40,040 --> 00:40:41,920 Speaker 1: I did not understand this at all. I went and 684 00:40:41,960 --> 00:40:43,640 Speaker 1: looked it up and I just don't fully get it. 685 00:40:43,719 --> 00:40:46,600 Speaker 1: It sounds like some tap dancing to me. I 686 00:40:46,680 --> 00:40:49,399 Speaker 1: looked at it and there's, like, not a distinction between those 687 00:40:49,440 --> 00:40:53,040 Speaker 1: two aside from this quote. Yeah, it's basically the same thing. 688 00:40:53,719 --> 00:40:56,000 Speaker 1: And also it doesn't even make sense as a defense. 689 00:40:56,040 --> 00:41:00,880 Speaker 1: So basically what they're saying is that, um, they 690 00:41:00,920 --> 00:41:03,479 Speaker 1: were being called out by M I T's Media Lab.
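As an aside, here is a rough conceptual sketch, in Python, of what a scraped-photo search tool of the kind just described does mechanically: every collected photo gets boiled down to a numeric faceprint (an embedding), and a probe photo is matched by nearest-neighbor search over those vectors, with each hit carrying a link back to the profile it was scraped from. The names, vectors, and URLs below are made up, and this is emphatically not Clearview AI's actual code, data, or algorithm, just the general shape of the technique.

import numpy as np

rng = np.random.default_rng(0)

def fake_faceprint(dim=128):
    # Stand-in for the embedding a face recognition model would produce for one photo.
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

# A tiny stand-in "database" of scraped photos, each with a link back to a fictional profile.
database = [
    {"name": "Person A", "profile": "https://example.com/a", "vec": fake_faceprint()},
    {"name": "Person B", "profile": "https://example.com/b", "vec": fake_faceprint()},
    {"name": "Person C", "profile": "https://example.com/c", "vec": fake_faceprint()},
]

# Probe photo: for the demo, a noisy copy of Person B's faceprint.
probe = database[1]["vec"] + rng.normal(scale=0.1, size=128)
probe /= np.linalg.norm(probe)

# Cosine similarity against every stored vector; the highest score "wins".
scores = sorted(
    ((float(entry["vec"] @ probe), entry) for entry in database),
    key=lambda pair: pair[0],
    reverse=True,
)
best_score, best_entry = scores[0]
print(f"Best match: {best_entry['name']} ({best_entry['profile']}), similarity {best_score:.2f}")

Scale is the only real difference between this toy and the capability being described: swap three entries for three billion and add an index that can search them quickly.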
691 00:41:03,560 --> 00:41:06,239 Speaker 1: They did a two thousand eighteen study. That's the one 692 00:41:06,280 --> 00:41:08,520 Speaker 1: that found that there's, like, a twelve and thirty-five 693 00:41:08,920 --> 00:41:16,080 Speaker 1: percent misidentification rate among darker skinned men and women. Women, I think, yeah. Um, 694 00:41:16,160 --> 00:41:19,640 Speaker 1: and Amazon said, no, no, no, uh, you guys are 695 00:41:19,760 --> 00:41:23,640 Speaker 1: using facial analysis, not facial recognition. It's like, no, 696 00:41:23,880 --> 00:41:26,360 Speaker 1: that's not the case at all. They're doing facial recognition. 697 00:41:26,400 --> 00:41:27,960 Speaker 1: All right. I'm glad it wasn't just me, because, you 698 00:41:27,960 --> 00:41:30,480 Speaker 1: see, I wrote "I don't get it" next to this. 699 00:41:30,600 --> 00:41:33,320 Speaker 1: It was a bad, a bad jam, 700 00:41:33,360 --> 00:41:36,239 Speaker 1: I guess. But I think their point was, well, you're 701 00:41:36,239 --> 00:41:39,560 Speaker 1: trying to tell the gender of somebody, and if you're 702 00:41:39,560 --> 00:41:43,759 Speaker 1: doing binary gender stuff, like you're trying to say this 703 00:41:43,800 --> 00:41:47,239 Speaker 1: is male or female, you can't really use facial recognition 704 00:41:47,280 --> 00:41:51,319 Speaker 1: for that, especially among darker skinned people. And they said 705 00:41:51,360 --> 00:41:54,080 Speaker 1: that you shouldn't use that, especially in cases of people's 706 00:41:54,080 --> 00:41:58,680 Speaker 1: civil liberties or whatever. But it still remains the case 707 00:41:58,800 --> 00:42:03,640 Speaker 1: that if you are a darker skinned person and you're 708 00:42:03,760 --> 00:42:07,839 Speaker 1: being looked at by a police department that has its 709 00:42:07,920 --> 00:42:12,160 Speaker 1: threshold for a confidence level set low, yeah, there's a 710 00:42:12,280 --> 00:42:14,520 Speaker 1: chance that a false positive is going to be put 711 00:42:14,560 --> 00:42:17,160 Speaker 1: out there, right. And that can be trouble for 712 00:42:17,200 --> 00:42:19,440 Speaker 1: you if you don't have the money to mount a defense. 713 00:42:19,840 --> 00:42:21,600 Speaker 1: And even if you do have the money, you shouldn't 714 00:42:21,600 --> 00:42:23,759 Speaker 1: have to mount a defense, to spend money on that, 715 00:42:23,840 --> 00:42:26,480 Speaker 1: to be acquitted of a crime, just because the computer 716 00:42:26,560 --> 00:42:29,840 Speaker 1: is not as good at, um, distinguishing Black people 717 00:42:30,160 --> 00:42:33,080 Speaker 1: as it is white people. Yeah. And, you 718 00:42:33,080 --> 00:42:35,160 Speaker 1: know, when it comes to where this is going to 719 00:42:35,280 --> 00:42:38,360 Speaker 1: end up legally, uh, you might want to look at 720 00:42:38,400 --> 00:42:41,839 Speaker 1: the Fourth Amendment. Um, it gets really dicey on how 721 00:42:41,880 --> 00:42:45,480 Speaker 1: you interpret the Constitution when you talk about illegal search 722 00:42:45,520 --> 00:42:50,760 Speaker 1: and seizure. Is this a search or a seizure? Probably not, um, 723 00:42:50,840 --> 00:42:53,640 Speaker 1: because it depends on what we're talking about with the 724 00:42:53,680 --> 00:42:58,120 Speaker 1: Supreme Court. Um, you've probably been stopped, uh, 725 00:42:58,120 --> 00:43:01,560 Speaker 1: at a D U I checkpoint, and that's stopping everybody. That's 726 00:43:01,560 --> 00:43:03,560 Speaker 1: sort of the same thing.
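Picking up the confidence-threshold point for a second, here is a tiny Python simulation of why the cutoff a department chooses matters. The faceprints are random vectors and the cutoff values are arbitrary stand-ins, not anything a real vendor uses, so the specific counts are meaningless; the only thing the sketch shows is that the looser the threshold, the more completely unrelated people clear it.

import numpy as np

rng = np.random.default_rng(1)
dim = 16  # toy-sized faceprints; real embeddings are larger, but the trade-off looks the same

# The suspect's photo, as a unit vector.
probe = rng.normal(size=dim)
probe /= np.linalg.norm(probe)

# 100,000 unrelated people, also as random unit vectors.
strangers = rng.normal(size=(100_000, dim))
strangers /= np.linalg.norm(strangers, axis=1, keepdims=True)

# Cosine similarity of the probe against every stranger.
similarities = strangers @ probe

for cutoff in (0.9, 0.8, 0.6):  # stand-ins for stricter vs. looser "confidence" settings
    flagged = int((similarities >= cutoff).sum())
    print(f"cutoff {cutoff:.1f}: {flagged} unrelated people flagged as possible matches")

That is the whole mechanism behind the worry: the agency running the search picks the cutoff, and every notch it drops pulls more innocent faces into the candidate list it gets back.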
It's like, if you're in 727 00:43:03,560 --> 00:43:05,279 Speaker 1: a car, we're gonna stop you and check you out, 728 00:43:05,719 --> 00:43:11,000 Speaker 1: because the public has said, you know, 729 00:43:11,120 --> 00:43:14,920 Speaker 1: that's okay, it's reasonable, it's not super invasive. And if 730 00:43:14,920 --> 00:43:17,319 Speaker 1: you're stopping drunk drivers, it's just putting someone out for 731 00:43:17,400 --> 00:43:19,840 Speaker 1: a few minutes. Yeah, the court said, if it's minimally 732 00:43:19,880 --> 00:43:23,480 Speaker 1: invasive, and the public good, or the potential for public good, 733 00:43:23,520 --> 00:43:26,000 Speaker 1: which in this case is getting drunk drivers off the 734 00:43:26,080 --> 00:43:29,560 Speaker 1: road, is high enough, then it's okay to basically 735 00:43:29,560 --> 00:43:32,480 Speaker 1: search everybody without probable cause. Yeah. Same with T S 736 00:43:32,520 --> 00:43:36,120 Speaker 1: A checkpoints. When it comes to official rulings, obviously we 737 00:43:36,120 --> 00:43:38,640 Speaker 1: don't have one on facial recognition yet. But if you 738 00:43:38,680 --> 00:43:42,080 Speaker 1: look at Carpenter v. United States, the court ruled five to 739 00:43:42,160 --> 00:43:45,520 Speaker 1: four that police violated, uh, the Fourth Amendment rights of a 740 00:43:45,560 --> 00:43:48,600 Speaker 1: man when they asked T-Mobile for his cell phone location data 741 00:43:48,680 --> 00:43:53,640 Speaker 1: without a warrant. Um, so hopefully this 742 00:43:53,719 --> 00:43:56,279 Speaker 1: nuance will prevail, and it looks like 743 00:43:56,280 --> 00:43:59,080 Speaker 1: it probably won't just be some blanket ruling that says, yep, 744 00:43:59,160 --> 00:44:01,239 Speaker 1: you can use it for whatever you want. Right, if 745 00:44:01,280 --> 00:44:03,000 Speaker 1: it even gets to that point, and if the court 746 00:44:03,040 --> 00:44:06,040 Speaker 1: hears it, which it probably would. So, the other thing that, 747 00:44:06,200 --> 00:44:11,239 Speaker 1: um, has become worrisome for people, though, is that 748 00:44:11,280 --> 00:44:16,560 Speaker 1: our society is becoming increasingly surveilled. Right, 749 00:44:17,000 --> 00:44:22,120 Speaker 1: like Ring, the Ring doorbell. They market to law enforcement, 750 00:44:22,160 --> 00:44:24,680 Speaker 1: basically saying, like, these people 751 00:44:24,719 --> 00:44:26,919 Speaker 1: will pay to have video cameras put on their house 752 00:44:26,960 --> 00:44:29,560 Speaker 1: and you can go get these videos. On neighborhood apps, all 753 00:44:29,560 --> 00:44:31,480 Speaker 1: the time, people are like, my car got broken into, who 754 00:44:31,480 --> 00:44:33,560 Speaker 1: can help me out with their cameras? So it's being 755 00:44:33,600 --> 00:44:36,080 Speaker 1: marketed to law enforcement.
Your TV has a camera and 756 00:44:36,200 --> 00:44:38,600 Speaker 1: your smart speaker has a microphone. So the 757 00:44:38,719 --> 00:44:42,960 Speaker 1: more that we are, um, surveilled, and the more ubiquitous 758 00:44:43,080 --> 00:44:47,120 Speaker 1: facial recognition technology gets, the easier it will be to not 759 00:44:47,400 --> 00:44:50,759 Speaker 1: just scan a picture of somebody stealing beer from a 760 00:44:50,920 --> 00:44:57,960 Speaker 1: CVS against the mug shot database, but to say, this 761 00:44:58,040 --> 00:45:01,080 Speaker 1: person right here that you're looking at, that the 762 00:45:01,120 --> 00:45:05,160 Speaker 1: camera is following, that's, um, that's, uh, that's 763 00:45:05,239 --> 00:45:09,000 Speaker 1: Chuck Bryant right there. And everywhere you walk there's 764 00:45:09,040 --> 00:45:11,759 Speaker 1: a little icon next to your head: Chuck Bryant. You know, 765 00:45:11,800 --> 00:45:14,240 Speaker 1: if you click it, it'll show your Facebook page 766 00:45:14,320 --> 00:45:16,480 Speaker 1: or a map to your house or whatever they want 767 00:45:16,480 --> 00:45:18,640 Speaker 1: to know, your police record, it doesn't matter. And 768 00:45:18,800 --> 00:45:22,960 Speaker 1: this is what we're increasingly getting closer to. And some 769 00:45:23,000 --> 00:45:25,959 Speaker 1: people say this is what they're already doing in China. Yeah, 770 00:45:26,360 --> 00:45:28,799 Speaker 1: and London, they were one of the first 771 00:45:28,840 --> 00:45:32,520 Speaker 1: on the CCTV train. Yes, but they use humans, which 772 00:45:32,560 --> 00:45:37,040 Speaker 1: is fair, right, for recognizing faces. Yeah, they have people, 773 00:45:37,080 --> 00:45:40,760 Speaker 1: like, actually looking at the individual monitors, looking 774 00:45:40,800 --> 00:45:43,600 Speaker 1: for crime. The idea of this 775 00:45:43,640 --> 00:45:46,800 Speaker 1: is, it's just tracking people who are doing nothing wrong. Yeah, 776 00:45:46,840 --> 00:45:48,640 Speaker 1: but there are plenty of people on the other side, 777 00:45:48,680 --> 00:45:51,160 Speaker 1: we should point out, that are like, you know what, 778 00:45:51,239 --> 00:45:53,600 Speaker 1: if you're catching bad guys, that's great. If you're a 779 00:45:53,600 --> 00:45:55,320 Speaker 1: good guy, you've got nothing to hide, so you shouldn't 780 00:45:55,320 --> 00:45:57,759 Speaker 1: sweat it. Yeah. I can never remember the name of 781 00:45:57,760 --> 00:46:00,840 Speaker 1: the article. I'll try to find it, but there's a 782 00:46:01,880 --> 00:46:03,360 Speaker 1: man, I wish I could remember off the top of 783 00:46:03,400 --> 00:46:06,040 Speaker 1: my head. But there's this amazing article from a 784 00:46:06,080 --> 00:46:09,560 Speaker 1: few years back, um, that basically says, like, that's 785 00:46:09,640 --> 00:46:13,359 Speaker 1: a terrible argument, that even if you have 786 00:46:13,520 --> 00:46:18,000 Speaker 1: nothing to hide, you still, um, are a human being.
787 00:46:18,400 --> 00:46:21,160 Speaker 1: And if 788 00:46:21,200 --> 00:46:25,120 Speaker 1: somebody wanted to put together, like, a dossier on the 789 00:46:25,760 --> 00:46:29,160 Speaker 1: embarrassing things that you've done or said or thought or whatever, 790 00:46:29,640 --> 00:46:32,080 Speaker 1: um, and put it all together and condensed it, 791 00:46:32,200 --> 00:46:35,239 Speaker 1: you could make anybody look bad. No one should want 792 00:46:35,280 --> 00:46:38,520 Speaker 1: to live in a situation where, like, that could conceivably 793 00:46:38,560 --> 00:46:42,960 Speaker 1: happen, in a police state. Yeah, police state, good stuff. 794 00:46:44,480 --> 00:46:47,200 Speaker 1: I guess we'll see how it pans out. I'm not 795 00:46:47,280 --> 00:46:51,240 Speaker 1: saying police states are good stuff. Police state, it's good stuff. Yeah, 796 00:46:51,520 --> 00:46:54,920 Speaker 1: we'll see what happens. Write in, Woodrow Hartzog, and let 797 00:46:55,000 --> 00:46:57,520 Speaker 1: us know what to do. If you want to know 798 00:46:57,560 --> 00:47:02,880 Speaker 1: more about facial recognition technology, you can go onto the 799 00:47:02,880 --> 00:47:05,880 Speaker 1: internet and start reading stuff about it. Definitely read the 800 00:47:05,920 --> 00:47:10,680 Speaker 1: New York Times exposé about, um, Clearview AI that came 801 00:47:10,680 --> 00:47:14,160 Speaker 1: out in January. Yes. Okay, since I said that, it's time 802 00:47:14,160 --> 00:47:16,560 Speaker 1: for a listener mail. All right, no, it's not. You 803 00:47:16,560 --> 00:47:18,319 Speaker 1: know what it's time for. Oh yeah, I know. It's 804 00:47:18,360 --> 00:47:20,360 Speaker 1: time for, you ready? Yeah, you say it. It's time 805 00:47:20,440 --> 00:47:33,600 Speaker 1: for administrative details. All right. This is part two. This 806 00:47:33,640 --> 00:47:35,799 Speaker 1: is where we thank people on the show that have 807 00:47:35,880 --> 00:47:41,640 Speaker 1: sent us kindnesses via snail mail. Siggy, S I 808 00:47:41,719 --> 00:47:44,560 Speaker 1: G G I, sent me some hand-knit 809 00:47:44,640 --> 00:47:46,600 Speaker 1: socks, not you for some reason. I don't know why. 810 00:47:46,920 --> 00:47:49,040 Speaker 1: I got some socks too. Oh really? Yeah, I didn't 811 00:47:49,080 --> 00:47:51,160 Speaker 1: know who they were from, so they may be from Siggy. 812 00:47:51,320 --> 00:47:53,960 Speaker 1: I think probably what it was is you left them 813 00:47:54,320 --> 00:47:58,120 Speaker 1: on my desk, and I thank you for it. Chuck, 814 00:47:58,520 --> 00:48:01,240 Speaker 1: do another one while I'm pulling up my list. 815 00:48:01,560 --> 00:48:05,360 Speaker 1: Julie Shoop made us t-shirts. Shoop. This is good stuff: 816 00:48:05,400 --> 00:48:09,560 Speaker 1: faux band-name tour shirts. Uh, super fun. Thanks a 817 00:48:09,600 --> 00:48:12,719 Speaker 1: lot, Julie. Very cool. Uh, you're still working, so 818 00:48:12,719 --> 00:48:16,920 Speaker 1: I'm gonna keep going. Thalia Dawes is our pal from Australia, 819 00:48:17,000 --> 00:48:20,520 Speaker 1: sent my daughter a couple of books. She's a very 820 00:48:20,520 --> 00:48:23,560 Speaker 1: lovely lady who has a very adorable and whip-smart 821 00:48:23,640 --> 00:48:26,960 Speaker 1: daughter about the same age, who listens to our show. 822 00:48:27,600 --> 00:48:30,080 Speaker 1: And, um, I was just like, man, I wish she 823 00:48:30,160 --> 00:48:32,160 Speaker 1: lived here.
We could do a play date. They both 824 00:48:32,160 --> 00:48:35,760 Speaker 1: seem like lovely humans. There's such a thing as planes. Yeah, 825 00:48:35,920 --> 00:48:39,040 Speaker 1: go to Australia for a play date. Um. So at our 826 00:48:39,200 --> 00:48:42,600 Speaker 1: Portland, Maine show, Chuck, we had, like, a lot of, um, 827 00:48:42,640 --> 00:48:46,759 Speaker 1: we got a lot of neat gifts. Jim Diefenbacher made 828 00:48:46,840 --> 00:48:51,520 Speaker 1: us amazing cross-hatch portraits. Yeah, those were 829 00:48:51,560 --> 00:48:54,640 Speaker 1: great, of us, like of a photo we took, I 830 00:48:54,680 --> 00:48:57,480 Speaker 1: think, on like our West Coast tour from two. It 831 00:48:57,520 --> 00:49:00,560 Speaker 1: brought back some memories. I saw that. It's just really great stuff. 832 00:49:00,560 --> 00:49:03,720 Speaker 1: And you can see Jim's work at Jim Dieffenbacher dot com, 833 00:49:03,800 --> 00:49:06,719 Speaker 1: J I M D I E F F E N 834 00:49:06,800 --> 00:49:09,359 Speaker 1: B A C H E R dot com, and they 835 00:49:09,360 --> 00:49:12,480 Speaker 1: were framed and everything. Yeah, very sweet stuff, Jim. We 836 00:49:12,600 --> 00:49:17,919 Speaker 1: got some home-tapped maple syrup from Andy Huntsberger from 837 00:49:19,480 --> 00:49:24,479 Speaker 1: Elgin, I A. Okay, what's I A? Is that Iowa? Yeah? Yeah, okay. Yeah, 838 00:49:24,600 --> 00:49:27,400 Speaker 1: I was about to say the wrong state. What were 839 00:49:27,440 --> 00:49:29,279 Speaker 1: you gonna say? You know, I think I was going to 840 00:49:29,320 --> 00:49:32,200 Speaker 1: say Illinoia. If you ever see Gary Gulman's bit on 841 00:49:32,280 --> 00:49:35,719 Speaker 1: abbreviating the states, dude, just look it up. One 842 00:49:35,719 --> 00:49:38,520 Speaker 1: of the great comedy bits I've ever seen. It's hysterical. 843 00:49:39,880 --> 00:49:43,560 Speaker 1: Let's see. Uh, oh. Also at the Portland show, we 844 00:49:43,640 --> 00:49:47,399 Speaker 1: got a letter from Togue Braun from Downeast Dayboat. 845 00:49:47,480 --> 00:49:53,240 Speaker 1: From Lloyd Braun? Togue Braun. Um, and Downeast Dayboat's 846 00:49:53,320 --> 00:49:56,600 Speaker 1: mission is to bring sustainable, delicious scallops from Maine to 847 00:49:56,680 --> 00:49:59,880 Speaker 1: the world, and she said that scallops have varietals like 848 00:50:00,000 --> 00:50:02,319 Speaker 1: oysters, and that Maine has the best. So check out 849 00:50:02,440 --> 00:50:07,279 Speaker 1: Downeast Dayboat dot com. Togue Braun, feel free 850 00:50:07,320 --> 00:50:09,640 Speaker 1: to send us some scallops, as long as they've been 851 00:50:09,680 --> 00:50:14,520 Speaker 1: appropriately refrigerated the entire time. I got another children's book, 852 00:50:14,640 --> 00:50:18,040 Speaker 1: Are You a Good Egg?, and that was from Peter Deutschel, 853 00:50:18,520 --> 00:50:21,360 Speaker 1: along with some Stuff You Should Know coasters. Yeah, yeah, 854 00:50:21,400 --> 00:50:24,240 Speaker 1: thanks again, Peter. I think we thanked him last episode 855 00:50:24,239 --> 00:50:27,560 Speaker 1: for the coasters too. I didn't know about the children's book. 856 00:50:27,600 --> 00:50:31,120 Speaker 1: Then Sarah Law, who is an S Y S K 857 00:50:31,280 --> 00:50:34,400 Speaker 1: Army member.
Um, she came to the Toronto show and 858 00:50:34,400 --> 00:50:37,960 Speaker 1: she brought us a bunch of, um, Canadian goodies, everything 859 00:50:38,080 --> 00:50:42,720 Speaker 1: from Japanese cheesecakes and tarts from Uncle Tetsu's. So good. 860 00:50:43,200 --> 00:50:46,400 Speaker 1: Um, and, uh, I think some other stuff too, like 861 00:50:46,760 --> 00:50:50,319 Speaker 1: Coffee Crisps, which are my favorite. Um, yeah, so thanks 862 00:50:50,320 --> 00:50:53,280 Speaker 1: a lot, Sarah, as always. Isn't everything from Japan awesome? 863 00:50:54,000 --> 00:50:57,399 Speaker 1: It just, it's really good. They don't necessarily invent much. 864 00:50:57,480 --> 00:51:00,080 Speaker 1: They just take other people's inventions and perfect them. And 865 00:51:00,120 --> 00:51:01,640 Speaker 1: it seems like they take a lot of pride in, 866 00:51:01,760 --> 00:51:04,920 Speaker 1: like, doing things right. I think you could say that, 867 00:51:04,960 --> 00:51:12,040 Speaker 1: probably, because we got from Matt an assortment of food 868 00:51:12,160 --> 00:51:17,080 Speaker 1: things from Japan, and that came in today, including our 869 00:51:17,120 --> 00:51:21,920 Speaker 1: beloved Kewpie mayo. I love that stuff. It's been too long, man. 870 00:51:22,160 --> 00:51:25,400 Speaker 1: God bless you. Let's see, Leah Harrison gave us some 871 00:51:25,480 --> 00:51:30,080 Speaker 1: amazing goodies too, including Coffee Crisp and Canadian Smarties, which 872 00:51:30,120 --> 00:51:33,280 Speaker 1: are way better than American Smarties because they involve chocolate, 873 00:51:33,360 --> 00:51:37,640 Speaker 1: and Super Smarties. A student named Maria Styling wrote us 874 00:51:37,640 --> 00:51:40,759 Speaker 1: a letter for an honors English project, because she had 875 00:51:40,800 --> 00:51:43,880 Speaker 1: to write to someone who inspired her, and she asked this, 876 00:51:43,960 --> 00:51:45,440 Speaker 1: I told her we'd answer: how do we choose 877 00:51:45,440 --> 00:51:48,799 Speaker 1: a topic? Maria, how we choose a topic, it's 878 00:51:48,800 --> 00:51:52,080 Speaker 1: pretty low-fi. We just send each other one each 879 00:51:52,080 --> 00:51:55,280 Speaker 1: week on whatever happens to grab our fancy. We're always 880 00:51:55,520 --> 00:52:00,080 Speaker 1: looking around our world, uh, and thinking, I wonder about that. 881 00:52:00,080 --> 00:52:01,759 Speaker 1: That's as easy as it gets. And we'll 882 00:52:01,760 --> 00:52:04,840 Speaker 1: just send each other an email, and times out of 883 00:52:04,840 --> 00:52:08,719 Speaker 1: a hundred we'll say, great, let's do it. Yeah. Mmkay, 884 00:52:09,560 --> 00:52:13,160 Speaker 1: let's see, um, oh. Michael C. Learner, who's an attorney 885 00:52:13,160 --> 00:52:16,520 Speaker 1: at law in Reno, sent us a letter about getting 886 00:52:16,520 --> 00:52:20,080 Speaker 1: the word out about the National Consumer Law Center, for 887 00:52:20,200 --> 00:52:22,480 Speaker 1: which Learner does a lot of pro bono work, for 888 00:52:22,520 --> 00:52:25,759 Speaker 1: people who are poor and getting screwed over because of debt, 889 00:52:25,840 --> 00:52:28,480 Speaker 1: as he put it. So he pointed to the National 890 00:52:28,480 --> 00:52:32,880 Speaker 1: Consumer Law Center and the Practicing Law Institute's Consumer Financial Services 891 00:52:32,960 --> 00:52:36,000 Speaker 1: Answer Book.
So if you are in debt and you're 892 00:52:36,040 --> 00:52:38,960 Speaker 1: getting pushed around, go check those things out, says Michael C. Learner. 893 00:52:39,120 --> 00:52:41,680 Speaker 1: Good stuff. Van Ostro, and we gotta thank him again, 894 00:52:42,200 --> 00:52:45,520 Speaker 1: our buddy from Washington, sent us a book by his 895 00:52:45,600 --> 00:52:49,560 Speaker 1: friend Andy Robbins called Field Guide to the North American Jackalope. 896 00:52:49,640 --> 00:52:53,680 Speaker 1: Pretty awesome. That's pretty fun. Paul Speth from Mars Community 897 00:52:53,760 --> 00:52:56,080 Speaker 1: Brewing Company in Chicago gave us a bunch of beer at 898 00:52:56,080 --> 00:52:59,520 Speaker 1: the Chicago show. Thank you for that. Um, I got 899 00:52:59,520 --> 00:53:01,359 Speaker 1: one more. Okay, I'll go ahead and finish up and then 900 00:53:01,360 --> 00:53:03,560 Speaker 1: you can round us out. Man, I have a whole page 901 00:53:03,640 --> 00:53:06,920 Speaker 1: left. All right. Robert Highland from Wham-O, this just 902 00:53:06,960 --> 00:53:09,839 Speaker 1: came in today. Okay. He works for Wham-O. He sent 903 00:53:09,960 --> 00:53:13,839 Speaker 1: us each their seventieth anniversary Super Book. Oh wow, thanks 904 00:53:13,840 --> 00:53:15,880 Speaker 1: a lot. Like, you guys talk a lot about Wham-O products. 905 00:53:15,920 --> 00:53:18,799 Speaker 1: Does it bounce? I have not dropped it on the floor. Yeah, 906 00:53:18,920 --> 00:53:21,160 Speaker 1: let's find out. Give it a try. I'm gonna do 907 00:53:21,200 --> 00:53:23,160 Speaker 1: a couple more and then, well, maybe we'll split these 908 00:53:23,239 --> 00:53:25,840 Speaker 1: up, because they're for both of us, for another episode. 909 00:53:26,000 --> 00:53:27,720 Speaker 1: It's up to you. You can blaze through them too. 910 00:53:28,520 --> 00:53:32,880 Speaker 1: No, there's too many. Um, so let's see, the Crown 911 00:53:32,960 --> 00:53:36,920 Speaker 1: Royal people again, for hooking us up. Very sweet. They've 912 00:53:37,120 --> 00:53:40,319 Speaker 1: hooked us up many, many times. Um, and they gave 913 00:53:40,400 --> 00:53:43,399 Speaker 1: us a nice congratulations because we got the Best Curiosity 914 00:53:43,440 --> 00:53:46,919 Speaker 1: Award from the iHeart Podcast Awards last year. Oh yeah, 915 00:53:47,000 --> 00:53:49,360 Speaker 1: that's how old this one is. Mick Sullivan gave us 916 00:53:49,360 --> 00:53:51,680 Speaker 1: a copy of his book The Meat Shower, which is 917 00:53:51,680 --> 00:53:54,560 Speaker 1: amazingly illustrated. You can check it out on The Past 918 00:53:54,640 --> 00:53:58,279 Speaker 1: and the Curious dot com. Yeah, that just sounds really 919 00:53:58,280 --> 00:54:02,439 Speaker 1: great. It really does. Let's see, um, and we'll round 920 00:54:02,480 --> 00:54:05,279 Speaker 1: everything out with Danielle Dixon, who is a real-life 921 00:54:05,280 --> 00:54:08,520 Speaker 1: marine biologist, Chuck, at the University of Delaware, and she 922 00:54:08,680 --> 00:54:10,960 Speaker 1: sent us a couple of copies of her kids' books, 923 00:54:11,000 --> 00:54:15,719 Speaker 1: Sea Stories, children's books based on real science. You can 924 00:54:15,800 --> 00:54:18,279 Speaker 1: check it out at S E A S T O 925 00:54:18,640 --> 00:54:22,160 Speaker 1: R Y books dot com. All right, you're gonna save the rest? 926 00:54:22,239 --> 00:54:25,360 Speaker 1: I'm gonna save the rest and we'll split them up. All right.
Thanks, 927 00:54:25,360 --> 00:54:28,560 Speaker 1: everybody who sent us stuff, and thank you also 928 00:54:28,680 --> 00:54:30,839 Speaker 1: just for saying hi, to anyone who does. You can 929 00:54:30,880 --> 00:54:33,520 Speaker 1: say hi to us by sending us an email. Wrap 930 00:54:33,560 --> 00:54:35,440 Speaker 1: it up, spank it on the bottom, and send it off to 931 00:54:35,520 --> 00:54:43,000 Speaker 1: Stuff Podcast at iHeartRadio dot com. Stuff You 932 00:54:43,000 --> 00:54:45,759 Speaker 1: Should Know is a production of iHeartRadio's How Stuff Works. 933 00:54:45,800 --> 00:54:48,040 Speaker 1: For more podcasts from iHeartRadio, visit the iHeartRadio 934 00:54:48,080 --> 00:54:50,600 Speaker 1: app, Apple Podcasts, or wherever you listen to your 935 00:54:50,640 --> 00:54:51,320 Speaker 1: favorite shows.