Welcome to Stuff to Blow Your Mind, a production of iHeartRadio's How Stuff Works.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb.

And I'm Joe McCormick, and I want to start off today by proposing a scenario for you to imagine. What if your body included a personal search bar? I know that sounds weird and might be hard to picture, but just try to imagine that your physical body has a digital interface that anybody within a hundred feet of you can access. What if your physical body included a searchable database of pretty much everything you ever did or posted on the internet? So whether you're out at a bar with your friends, or sitting on the subway on your way to work, or sitting in your car in traffic, or taking part in a protest march, or working out at the gym, or on a date, anybody who could see you would instantly have the ability to look up your personal information. They could find your name, your resume, your contact info, your workplace, your home address; maybe all the publicly available photos of you out there on Instagram or wherever; maybe everything you've ever posted on Twitter or Facebook or any other social media; and, more broadly, basically everything you've done on the internet: purchasing history, search history, as well as your location history anywhere you've physically taken your phone with GPS enabled. I think most of us would probably recoil in horror at the idea that we would ever lose the ability to be anonymous in a public place. But perhaps the horrifying part of imagining this scenario is that what I'm describing is not only fairly plausible; in some preliminary ways it is already the case, at least in principle. All the foundations and support structures of this horrible hypothetical world are laid, and really all that's left to do is tighten the screws.
And one thing we know about this world is that there's no shortage of would-be screw tighteners out there.

Nope. Especially if you can make some money by tightening screws, which you very often can. So, yeah, we should already know that there is very little privacy in the modern technology sphere. Our phones, our social media accounts, our advertiser IDs, which are used to track us across the internet: these sources are already used to create profiles in which the disparate types of our personal identifying information get correlated with each other and used to serve us ads or manipulate us on social media. But the leap into physical space, where all of our information is easily linked to our physical body wherever we are, whether we like it or not, is the frontier that's currently being pushed, and at a very rapid pace. Now there are multiple ways to make this link, of course, to connect our digital profiles and all the associated data with our physical bodies. A very simple one would be the tracking devices that pretty much all of us carry at all times: the MAC addresses on our phones and mobile devices. But one of the most powerful developing techniques is for the technosphere to recognize you in physical space the same way your friends and family do: by your face. Now, one thing to keep in mind is that just because, say, the thermostat in your house can recognize your face, that is not necessarily, in and of itself, a bad thing. In some cases that could be very helpful, or it could even be seen as a way to safeguard the temperature of your home, some sort of security feature. And as we proceed through these episodes, we're going to try to keep that in mind. We're going to try not to color the technology as inherently vile or inherently prone to misuse.
But the story of technology is that it is both light and dark.

Well, yeah, and I think there is a big difference between something being inherently vile versus prone to misuse. There are a lot of things that are created with perfectly good intentions in mind. You can think of lots of great reasons to do most of the worst technological stuff imaginable. We've talked before on the show, and we make no secret of our distaste for a lot of things about social media, but of course you can see the good side of something like Facebook.

Yeah, or, very broadly, you can think of something like banking. Banking in many respects allows humans to do amazing things, things they wouldn't otherwise be able to do: to buy a property, to buy a vehicle, to start a business, and so forth. But at the same time, very terrible things have been done, and are still being done, under the broad tent of banking.

Yeah, and we can help better protect ourselves from those outcomes by better understanding banking so that we can regulate it properly. That doesn't always get done, but at least in theory that is the way to protect yourself. And so today we're going to be taking that same point of view to a series of episodes about facial recognition science and technology. This is a subject I've wanted to talk about for a while, because obviously it's an issue of increasing significance. But actually, just after we landed on this topic, I came across a brand-new story in The New York Times that serves as a really good anchor on why this issue is incredibly relevant today. The article is called "The Secretive Company That Might End Privacy as We Know It," by Kashmir Hill, published in The New York Times this past January.

Yeah.
It concerns a small technology company called Clearview AI, which, just from the name, sounds like it could be fine, right? Sounds very transparent, doesn't it, Clearview AI? But of course this is a facial-recognition-based artificial intelligence company.

So what is Clearview, in their own words? What do they say about themselves?

To read from their marketing materials: "Clearview is a new research tool used by law enforcement agencies to identify perpetrators and victims of crimes. Clearview's technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists, and sex traffickers. It is also used to help exonerate the innocent and identify the victims of crimes including child sex abuse and financial fraud." Now, on the surface of things, that sounds absolutely airtight, right? It describes a technology that is used by the appropriate agencies to protect the innocent and to go after the guilty.

But then again, that pitch can be used to sell a lot of things in the world.

Of course. So what they advertise is that this app helps law enforcement identify perpetrators and protect victims of crime, and in some cases that may very well be true. Obviously, it would be pointless to deny that facial recognition technology, the ability to take a picture of somebody's face and then find out tons of stuff about who they are and how you can find them, would in many cases be useful and beneficial to law enforcement. But it is also, of course, just so easy to see how a tool like this could be terrible, both in its successes and in its failures.
So, first of all, it could fail in catastrophic ways, say with false matches when police are looking for a perpetrator, though of course that's something that can happen with human witnesses too. But then it could also be used effectively, if it correctly identifies people, to amazingly insidious ends. So how does it work? It actually sounds pretty simple in terms of its user interface. Specifically, what this tool does is match an input photo of a face against a huge database of existing photos scraped from the internet, and then it provides links to the places where those images were originally found. So, a very simple example: I take a photo of you, I feed it into the tool, and it comes back with other photos of you and links to the places where those photos were found, maybe your Facebook page, a YouTube video you're in, and so forth. So when it works, it provides a direct link between your anonymous face in a picture and the digital locations where all of your personal information may be logged online. Now, we're not going to look too far under the hood of exactly how the underlying technology of this sort of thing works. There are a number of ways that facial recognition algorithms can be built, but a very common approach uses neural networks.
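Before getting to the neural network itself, here is a minimal sketch of the surrounding photo-in, links-out workflow, assuming the common embedding-plus-nearest-neighbor design; the `embed` function below is a stand-in for a trained face-embedding network, and the database and URLs are invented for illustration, not anything Clearview has disclosed.

```python
import numpy as np

# Hypothetical stand-in for a trained face-embedding network. Real
# systems use a deep model that maps a face crop to a vector where
# photos of the same person land close together.
def embed(face_pixels: np.ndarray) -> np.ndarray:
    v = np.resize(face_pixels.astype(float).ravel(), 128)
    return v / (np.linalg.norm(v) + 1e-9)  # unit length, so dot product = cosine similarity

def search(query_face, database, top_k=5):
    """Rank scraped photos by similarity to the query face and return
    the pages they came from: photo in, list of source links out."""
    q = embed(query_face)
    scored = [(float(q @ emb), url) for url, emb in database]
    return sorted(scored, reverse=True)[:top_k]

# Each database entry pairs a scraped photo's embedding with the page
# it was found on (placeholder URLs for this sketch).
rng = np.random.default_rng(1)
database = [(f"https://example.com/profile/{i}", embed(rng.random((32, 32))))
            for i in range(1000)]

print(search(rng.random((32, 32)), database))
```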
Yes. And for a nice, succinct description of how this works, I'd like to refer to Max Tegmark's most recent book, Life 3.0, which deals at length with AI and the potential threats posed by AI in the future. It's a wonderful book, and in this one section he's summarizing how this kind of facial recognition works. He writes that such a network has been trained to output numbers representing the probability that the image depicts various people. Each artificial neuron, depicted as a circle in his illustration, computes a weighted sum of the numbers sent to it via connections, the lines in the image, from the layer above, applies a simple function, and passes the result downward, with each subsequent layer computing higher-level features. A typical face recognition network contains hundreds of thousands of neurons; his figure shows merely a handful for clarity. So in the visual representation Tegmark includes, you see all these circles and interconnected lines representing how the neural network functions. Applied to facial features, it starts with general features, sort of blurry shapes, then moves to more specific features, then ties those features together, eventually arriving at an output probability for actual facial matches.

Yeah. And as with many other neural networks trained on large data sets to match values together, or to produce an output y given an input x, the way these things are trained is that you feed them a lot of examples of the kind of output you want, and slowly they refine their own internal rules, the rules applied at each of these layers of neurons to manipulate numbers and values as they pass through the network, in order to give an output that closely matches whatever you've trained it to come up with.
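As a toy-scale sketch of that picture, with invented layer sizes and random, untrained weights (a real system's weights come from training on example faces):

```python
import numpy as np

def layer(x, weights, biases):
    # Each neuron: a weighted sum of its inputs plus a bias, then a
    # simple nonlinearity (ReLU here), passed on to the next layer.
    return np.maximum(0.0, weights @ x + biases)

def softmax(scores):
    # Turn the last layer's raw scores into probabilities that the
    # input image depicts each known person.
    e = np.exp(scores - scores.max())
    return e / e.sum()

rng = np.random.default_rng(0)
x = rng.random(64)  # flattened face-image features

# Toy sizes; a typical face network has hundreds of thousands of neurons.
w1, b1 = rng.normal(size=(16, 64)), np.zeros(16)  # blurry, low-level features
w2, b2 = rng.normal(size=(8, 16)), np.zeros(8)    # more specific features
w3, b3 = rng.normal(size=(3, 8)), np.zeros(3)     # scores for 3 candidate people

probs = softmax(w3 @ layer(layer(x, w1, b1), w2, b2) + b3)
print(probs)  # one probability per candidate person, summing to 1
```

With random weights the printed probabilities are meaningless; training is the process of adjusting w1 through w3, over many labeled examples, until they aren't.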
But that means you can potentially train an effective neural network without yourself really understanding very well exactly what's happening at each layer throughout the network. Now, I think it is possible to get in there and try to dig into it and see what's going on, if you've really got the time and expertise, but it can be relatively opaque as far as computer programs go. It doesn't necessarily work like a normal computer program, which has lines of code that any programmer who knows the language can read through to easily figure out what's going on. But to come back to the Kashmir Hill article and Clearview AI: one thing that's important to point out is that this is by no means the first facial recognition app or tool, nor the first used by law enforcement. Its particular value, the thing it's doing that's somewhat new, lies in its database of images, which, again, have been scraped from organic sources like Facebook and YouTube. Previously, law enforcement facial recognition matching programs were often weaker and limited to smaller databases of government photos, say mug shots or driver's licenses. And of course there's the potential that smaller training or matching material will make any machine learning process weaker at coming up with the results you want.

Yeah. All this ties in with what I often think of as the tide pool illusion of the internet: that feeling a lot of us had, and I think still have sometimes, but especially early on, that we were engaging in something segmented from the general population. But the thing about tide pools, of course, is that eventually the tide rolls in, and then you realize that you're actually connected to the wider internet. So not just your friends or your family or your fandom, but also law enforcement, criminals, politics, all just churning around in the same grim ocean of numbing obscenity.

I think that's a really excellent metaphor.
Yeah, somehow the internet was very easily able to create a sense of isolated, walled-off gardens that we were living in, which were, the whole time, totally public: early days of various social networking sites, fan forums, all that kind of thing, whatever it was that gave people a sense that they were in a little private space, their little corner, their little room. But of course it's the internet. What's happening there is public, and the consumers of what's happening there may be completely invisible to you.

Right. So in this case, with the previous models of facial identification, the data sets they were depending on were basically tide pools: here's the tide pool of mug shots, here's the driver's license tide pool, and that's what we're feeding on. But then Clearview comes around, and this is a company that is saying, well, let's just use the whole ocean. What's stopping us from using the whole ocean? So this is a company using the assets of various social media, and in general visual media, companies on the web to do the sorts of things those companies have been loath to do, or at least have been publicly opposed to doing. Because technically, as pointed out in Hill's New York Times article, there is an argument that what this company, and any company engaging in this kind of broad sampling, is doing may violate the terms of service of these various websites.

Sure, yeah, automatically scraping imagery and data from Facebook, say. I think there was at least the allegation that that could be a violation of Facebook's terms of service, but it didn't really seem to bother the Clearview people.

Right. And I think when Hill reached out to a Facebook representative, they said, well, we may look into that. So it's kind of an open question.
But a lot of this also comes down to something we discussed in our look at Jaron Lanier's Ten Arguments for Deleting Your Social Media Accounts Right Now, because it concerns your data, data that you have in all likelihood given away to companies like Facebook, Twitter, and others simply to be a part of the interconnectedness they sold us. Now, a lot of the time when we're discussing such data, we're discussing behavioral information: your likes, your dislikes, your arousal patterns concerning posts and advertisements. But in addition to this, you also sold the devil your face.

Yeah, I mean, the devil in this case is simply promising not to do anything unbecoming with your face. But Hell is highly populated. Even if you have good reason to trust this particular devil, to which you've already entrusted your face, and perhaps the faces of your family members, your loved ones, deceased loved ones, your children, there are countless others that will make no such promises.

They'll steal your face right off your head.

Yeah, and they may have little concern for the values that were in place during the initial purchase.

It's something I think about coming up again and again with sharing data on the internet. So you share your data with a company, and maybe you trust that company today to protect your data. But say that company hangs onto your data for a while and then gets new management that you don't trust as much; well, they've got it, and you can't get it back. Or maybe they have a security breach and somebody steals your data from them. You would have trusted the company, maybe, but now somebody else has got it. And you can see how, in a world like that, it could start to feel hopeless or pointless or futile, at least for people who, say, are in a position to make money off of not being very careful about people's privacy.
You know, it's like, what's the point? Everything eventually gets out there anyway. And this kind of point of view was sort of articulated by some of the people quoted in Hill's article. For example, there's a figure named David Scalzo, an early investor in Clearview AI, and Scalzo is quoted in the article saying: "I've come to the conclusion that because information constantly increases, there's never going to be privacy. Laws have to determine what's legal, but you can't ban technology. Sure, that might lead to a dystopian future or something, but you can't ban it."

So this is at once one of those statements that seems very pragmatic but also entirely self-serving. Because, true, the story of technology is that its advance cannot really be stopped; we have to think ahead as best we can and prepare our laws and our moral code for dealing with emerging technologies. We've talked about this before, for instance, as far as genetic technology is concerned. But this particular quote also sounds a lot like: hey, it's going to happen either way, so I might as well be the person to make some money off of it.

I totally agree. I mean, I agree that it is difficult to stop technological progress, that if one group of people isn't working on it, maybe a less ethical group of people might be, somewhere. But that's not an excuse to be the person who creates the synthetic supervirus, who, like, genetically engineers the Captain Trips flu or whatever. Also, you could use this logic about almost any bad thing. It's kind of like saying: yeah, there's no way to totally eliminate pollution, some people are always going to find a way to pollute and harm the environment, so you might as well just go hog wild and dump it all. So, it's true that you can't stop everything harmful to the environment with regulation, but you can really slow it down.
You can present major obstacles to the worst types of offenses. And likewise, I think it would be very difficult to completely stop the advancing capabilities of AI, including facial recognition, but you can certainly slow it. You can certainly limit its potentially harmful uses by banning those uses and punishing offenders. Now, on the other hand, you could think: well, yeah, you could do that, but this would be so helpful to law enforcement in some cases.

You know, so would the ability to search any house you wanted without a warrant.

Right, right. I mean, this is the same argument that has often been part of the reasoning for enhanced interrogation and torture: well, it can help us get the bad guys, it can help us in this situation. And in all of these arguments there is also the idea that, well, if you have nothing to hide, if you are truly a good and supportive member of society, then what do you have to worry about anyway? But it kind of comes down to the data issue. Sure, you trust the person who has your data now, but do you trust the person who will have your data tomorrow? You trust the government of today, but governments change.

Yeah. And nobody actually, in practice, believes this what-do-you-have-to-hide argument; it's just something you would say. I mean, if anybody ever says that, just immediately demand they give you their email password. Like, yeah, just let me read all your email. I mean, what's the problem, inspector? I think you'll find everything in order. But yeah, obviously, society has often decided to regulate police power in ways that are truly inconvenient to law enforcement, because we decide that in some cases there are types of privacy and other civil liberties that matter more than prosecuting offenses at maximum efficiency.

Yeah. Alright, well, on that note, we're going to take a quick break.

Okay, we're back.
So again, we're in the middle of this first episode of our exploration of facial recognition science and technology, and we've been talking about this New York Times article that just came out last week by Kashmir Hill, about a facial recognition technology company called Clearview AI. Now, broadly speaking, the two big advantages of Clearview AI are these. First of all, it pulls from an extensive database of images: we're talking three billion photos in its database, versus the four hundred and eleven million searchable through the FBI's database, these stats according to Clearview marketing materials reviewed by Kashmir Hill for that New York Times article. And secondly, it boasts a facial recognition engine, built up from academic work by others on artificial intelligence, image recognition, and machine learning, robust enough that it does not require high-quality or complete facial images to produce matches. So I guess the ideal example would be: you could have somebody going into a bank and robbing it with their face partially covered, and this would potentially be able to match that partial face to a full face in, say, a Facebook profile, at least according to what the company and some of its satisfied customers in law enforcement have been alleging.

Yeah. One example of a successful match that Hill mentions is matching an individual to a face reflected in a mirror in someone else's gym photo.

What?

Yeah. And, details of the presumed guilt or innocence of that particular individual aside, I think this is notable in that gyms are often considered to be photo-taboo places, certainly as far as other people working out in the gym go. I'm not an expert on gym etiquette, but it is my understanding that you shouldn't even accidentally photograph someone else at the gym. But obviously it does happen.
You just should not have your phone out snapping pics at the gym, unless it's your private gym and you're the only person there.

Yeah, or unless it catches bad guys, because what do you have to hide? That being said, the people at the company do admit that, of course, it still has flaws; there are still things it can't do.

Yeah. For instance, a lot of it leans on eye-level photos, the kind of photo you see in, say, a LinkedIn profile, as opposed to the sort of ceiling-level security camera footage that is often involved in these scenarios.

Right, right. So it's running into the same kinds of problems, which we can talk about later in the episode, that human beings sometimes have with less familiar faces. I mean, this is a known thing about human face perception and facial recognition within the brain: we are much better at recognizing very familiar faces under unfavorable conditions, like a partial face, a face at a weird angle, a face in bad lighting. We can do that a lot better if it's a familiar face than if it's a relatively unfamiliar face.

Right. I mean, even things like our face in a mirror versus our face in a photo can be distorting. Or, more directly, I find that someone else's face reflected in a mirror will throw me off sometimes; I'm definitely not used to seeing that.

Do you ever do the inversion test? This is another weird quirk of the brain: trying to see if you recognize photos of people's heads upside down.

I know we've talked about that before on the podcast, but I haven't really put it to the test in my own life.

This is just a total side note, but there's a very funny thing known as the Thatcher effect.
It has to do with the fact that if you look at somebody's head upside down, but with their eyes right side up, a lot of times people don't even notice anything's wrong with the photo. So the head is upside down, but the eyebrows are over the eyes, because the eyes are still in the correct orientation. But then if you flip that whole thing, so the head is right side up but the eyes are upside down, it looks unbelievably grotesque. You will burst out, you'll make noise when you see it. Look it up: the Thatcher effect. But anyway, back to the story about Clearview. So the company claims its product finds matches for an input photo up to seventy percent of the time, and of course Hill notes in the article that we can't be sure how often false matches turn up. She quotes Clare Garvie, a researcher at Georgetown University's Center on Privacy and Technology, who says: "We have no data to suggest this tool is accurate. The larger the database, the larger the risk of misidentification because of the doppelganger effect. They're talking about a massive database of random people they've found on the internet."

The doppelganger effect being not that vengeful German spirits are actually invading the database, but that the larger the pool of people, the more of them are going to look very much like others. There's going to be more similarity within an increasing pool of individuals. Right.
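To put rough numbers on that worry, here is a back-of-the-envelope sketch; the per-comparison false match rate is invented for illustration and is not a measured figure for Clearview or any real system.

```python
# Illustrative only: suppose a matcher falsely matches a random pair of
# different people one time in a billion comparisons (invented rate).
false_match_rate = 1e-9

for n_faces in (411e6, 3e9):  # FBI-scale vs. Clearview-scale databases
    expected_lookalikes = n_faces * false_match_rate
    p_at_least_one = 1 - (1 - false_match_rate) ** n_faces
    print(f"{n_faces:.0e} faces: ~{expected_lookalikes:.1f} expected false "
          f"hits per search, P(at least one) = {p_at_least_one:.2f}")
```

Holding per-photo accuracy fixed, a search of the bigger pool simply surfaces more doppelgangers: under these made-up numbers, roughly 0.4 expected false hits per search at the smaller scale versus 3 at the larger.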
But at the same time, anecdotal reports from a number of law enforcement officers have claimed that this tool was effective at identifying real perpetrators from photos alone. And there have been plenty of other examples in recent years of supposedly effective facial recognition technology provided by other companies being used to allegedly capture perpetrators of crimes committed in public places: in New York, in the UK, and certainly in countries with a very strong surveillance state, like China. And we can come back to more about that in later episodes, I think. But as a personal anecdote in this reported story, Hill at one point has the company's founder use the app on a picture of her, and she says the tool "returned numerous results dating back a decade, including photos of myself that I had never seen before. When I used my hand to cover my nose and the bottom of my face, the app still returned seven correct matches for me." So I think we can assume that failures, including both false negatives and false positives, are surely occurring at some rate, but it's also clear that this thing at least works some of the time.

Yeah, and that's enough to help it get picked up by law enforcement. Also, it helps that there was, it seems, a pretty sizable outreach campaign from the company to market the technology to law enforcement.

Yes. And we should say, we're not going to hash out everything they get into in the article, but the company has arrived at law enforcement as their primary customers. Before that, they tried to market it in all kinds of ways, including for personal use, to private security outfits, for commercial use, and even for political opposition research and the like.
Yeah. But you can definitely see the advantage to law enforcement here, because a detective, for instance, has at their disposal a limited number of talents and tools that are useful in attempting to solve a case, and adding this to the toolkit is a no-brainer. Because, larger issues of the stability of the platform aside, and the potential legal issues we were discussing earlier, this would be something you could use in concert with other techniques. You could say: all right, this face seems to match up with this individual, and we also know that this individual was in the right vicinity at the time; then you could lean on your other detective tools to actually make the case. That's not to say there's no potential for misuse here, but I'm just saying you can definitely see the appeal, and how, if everything were working perfectly, it would be an effective law enforcement tool.

Yeah. And however effective it actually is, it's clear that this and similar tools are increasingly popular with law enforcement in countries all over the world. If you're one of those people who feels like pumping the brakes on this kind of technology, what could actually be done about it? Well, Hill quotes Al Gidari, a privacy professor at Stanford Law School, who says bluntly: "Absent a very strong federal privacy law, we're all screwed." And he's not alone. There are plenty of privacy experts today advocating the point of view that facial recognition technology, or at least some specific uses of it, isn't just something that maybe we should be a little concerned about; it's something that needs to be banned outright. For example, Hill also quotes Woodrow Hartzog, a professor of law and computer science at Northeastern University.
And Hartzog says: "We've relied on industry efforts to self-police and not embrace such a risky technology, but now those dams are breaking because there is so much money on the table. I don't see a future where we harness the benefits of face recognition technology without the crippling abuse of the surveillance that comes with it. The only way to stop it is to ban it." So whether we should do that, or if so, what form that ban should take, or whatever the best way to address it is, I think it is at least clear that this is a very pressing, time-sensitive issue of urgent public concern right now.

Yeah. Because, again, as the author points out, I don't think any of us wants to live in a world where any stranger can surreptitiously take a photo of our face, run a face search, and get all this data on us. I don't want us to build that kind of world for our children, who, more than any of us, never had a chance to opt out of this face trade.

And that's the pressing thing to think about, because you're not thinking about that when you share images of your child on Facebook or Instagram or whatever it happens to be. You just want to celebrate that this person exists at all, but you're laying the groundwork from, like, age zero onward. This is their digital history.

We were talking before we came into the studio about what you could do if you wanted to do something about this. You can't un-post photos of yourself and data that has already been scraped. But I wonder if maybe you could try to gum up the works by constantly polluting the internet with false pictures of you that are not you, sort of deepfaking yourself enough that you've obscured the visual record of yourself.
Maybe, yeah. I guess it would then depend on all those new images being taken in and causing enough confusion in the identification of you. But I don't know. Or perhaps you could alter your facial appearance with enough regularity that there is no concise version of you, or at least make it so the AI would have to work a lot harder, would have to hold a broader definition of what you look like, to the point that maybe it enhances the doppelganger effect. Like, I'm thinking, each day you inject a different portion of your face with collagen or something.

Or maybe not collagen, maybe just saline. Well, I mean, this sounds ridiculous, but does this lead to a future where everybody starts walking around with a broad array of interchanging masks?

Yeah, and then you have enhanced laws against the wearing of masks. I mean, masks are outlawed in a lot of places and at a lot of events for a reason. You're not supposed to drive a car wearing a mask.

This does remind me: there's an excellent show on Hulu titled Future Man. It's a comedy, a satire with a lot of nostalgia for various sci-fi franchises. But there is a scene in one of the episodes where an individual knows that a facial recognition system is looking for him, so he gets himself beat up first, so that his face is all swollen and distorted and the system cannot make a match, and he's able to sneak past the guards.

I don't think that's a sustainable strategy.

It's not. And more to the point, we should not even have to entertain that possibility to hold on to our sense of privacy.

Yeah.
Now, the fact that we're 598 00:33:34,920 --> 00:33:36,720 Speaker 1: talking about this story in the New York Times about 599 00:33:36,720 --> 00:33:39,960 Speaker 1: Clearview is just a result of timing. It's 600 00:33:40,040 --> 00:33:43,760 Speaker 1: like, this one specific company is not the entirety of 601 00:33:43,880 --> 00:33:47,280 Speaker 1: the end-of-privacy problem, nor of the facial recognition 602 00:33:47,280 --> 00:33:51,480 Speaker 1: technology landscape in particular. Another company could do the same thing. 603 00:33:51,560 --> 00:33:55,120 Speaker 1: Other copycats, I'm sure, are already getting in there. 604 00:33:55,120 --> 00:33:58,360 Speaker 1: It's just one high-profile example of the potential 605 00:33:58,480 --> 00:34:01,280 Speaker 1: already being put to use that's getting a lot 606 00:34:01,280 --> 00:34:03,600 Speaker 1: of attention in the past couple of weeks. Yeah, in 607 00:34:03,680 --> 00:34:05,400 Speaker 1: part, too. You have to read the full article for 608 00:34:05,400 --> 00:34:07,480 Speaker 1: the details, but it's also like there are key individuals 609 00:34:07,520 --> 00:34:10,799 Speaker 1: that are notable that are tied into its funding. Yes, there's that, 610 00:34:10,920 --> 00:34:13,600 Speaker 1: and of course there's the ominous way it ends, 611 00:34:13,640 --> 00:34:16,319 Speaker 1: which is the idea that it will soon probably be 612 00:34:16,440 --> 00:34:18,719 Speaker 1: rolled out not just to law enforcement, but to be 613 00:34:18,800 --> 00:34:21,759 Speaker 1: a publicly usable app, you know, which I guess is 614 00:34:21,800 --> 00:34:23,800 Speaker 1: sort of the scenario we were describing at the 615 00:34:23,800 --> 00:34:27,360 Speaker 1: beginning of the episode, just having a publicly available personal 616 00:34:27,400 --> 00:34:30,080 Speaker 1: search bar tool. Yeah. Well, mark me down for 617 00:34:30,120 --> 00:34:33,040 Speaker 1: being against that. Yes. Alright, time to take a quick break, 618 00:34:33,080 --> 00:34:38,560 Speaker 1: but we'll be right back with more. Alright, 619 00:34:38,560 --> 00:34:41,080 Speaker 1: we're back. So, of course, I also want to make 620 00:34:41,120 --> 00:34:46,000 Speaker 1: the case that facial recognition technology is not necessarily always 621 00:34:46,080 --> 00:34:48,960 Speaker 1: a dangerous or scary or ominous thing. I mean, I 622 00:34:49,040 --> 00:34:50,919 Speaker 1: think there are some uses of it that one could 623 00:34:50,960 --> 00:34:54,680 Speaker 1: quite easily find benevolent or even delightful. And one of 624 00:34:54,680 --> 00:34:57,600 Speaker 1: the ways I was thinking about this was many of 625 00:34:57,640 --> 00:35:01,120 Speaker 1: its developing uses in non-human animals. Not all, because 626 00:35:01,160 --> 00:35:03,360 Speaker 1: some of its uses in non-human animals are also, 627 00:35:03,600 --> 00:35:06,040 Speaker 1: like, kind of horrifying, but some of its uses in 628 00:35:06,120 --> 00:35:08,440 Speaker 1: non-human animals are pretty great.
I was reading an 629 00:35:08,520 --> 00:35:12,880 Speaker 1: article in New York Magazine from October called Here's a 630 00:35:12,920 --> 00:35:16,560 Speaker 1: List of Every Animal Humans Currently Monitor Using Facial Recognition 631 00:35:16,600 --> 00:35:21,480 Speaker 1: Technology, by Mac Dagaron. A complete list is probably 632 00:35:21,520 --> 00:35:23,440 Speaker 1: wildly out of date at this point, because this was back in October, 633 00:35:24,320 --> 00:35:27,479 Speaker 1: but a few of the entries include things like: there's 634 00:35:27,520 --> 00:35:31,960 Speaker 1: a Norwegian fish farming company called Cermaq Group AS that 635 00:35:32,000 --> 00:35:36,560 Speaker 1: commissioned a system for facial recognition of salmon, which would 636 00:35:36,719 --> 00:35:40,080 Speaker 1: use distinctive patterns of spots around the eyes, mouth, and 637 00:35:40,160 --> 00:35:46,040 Speaker 1: gills of individual salmon to build individual digital medical records 638 00:35:46,080 --> 00:35:49,120 Speaker 1: associated with each fish. And this would be primarily for the 639 00:35:49,160 --> 00:35:53,280 Speaker 1: purpose of fighting epidemics of parasitic sea lice. Okay, 640 00:35:53,280 --> 00:35:56,759 Speaker 1: so not merely presenting it with the dish 641 00:35:56,760 --> 00:35:59,399 Speaker 1: when you order it at a restaurant, but: I will take 642 00:35:59,480 --> 00:36:03,600 Speaker 1: the baked salmon, and please include its complete medical history. Yes, 643 00:36:03,840 --> 00:36:07,279 Speaker 1: this salmon was named Jeffrey. Here's his Facebook profile, you 644 00:36:07,320 --> 00:36:10,879 Speaker 1: know, on salmon Facebook. Oh, but that comes back. 645 00:36:11,360 --> 00:36:14,600 Speaker 1: Of course, facial recognition technology is being deployed to keep 646 00:36:14,600 --> 00:36:17,480 Speaker 1: track of individual livestock like cows and 647 00:36:17,560 --> 00:36:21,120 Speaker 1: chickens, for maybe medical reasons, or, 648 00:36:21,200 --> 00:36:24,280 Speaker 1: in the case of cows, reasons having to do with tracking, 649 00:36:24,360 --> 00:36:27,160 Speaker 1: like, periods of peak milk output and stuff like that. 650 00:36:27,680 --> 00:36:30,520 Speaker 1: But there are also stories about conservation efforts to non- 651 00:36:30,560 --> 00:36:35,799 Speaker 1: invasively monitor wild populations of vulnerable animals by way of 652 00:36:35,840 --> 00:36:38,920 Speaker 1: facial recognition, which, if that works, that sounds awesome. Like, 653 00:36:39,200 --> 00:36:43,120 Speaker 1: I was looking at an article in Scientific American that described 654 00:36:43,160 --> 00:36:47,000 Speaker 1: efforts to use facial recognition to track wild lions through 655 00:36:47,040 --> 00:36:51,840 Speaker 1: a platform called the Lion Identification Network of Collaborators, or LINC. 656 00:36:52,400 --> 00:36:54,560 Speaker 1: That's interesting. It reminds me of, I want to say, 657 00:36:54,600 --> 00:36:57,160 Speaker 1: like, a decade ago, maybe a little further 658 00:36:57,200 --> 00:37:00,959 Speaker 1: back in time.
Uh, there was a piece I read 659 00:37:01,040 --> 00:37:05,279 Speaker 1: about tracking whale sharks, and whale sharks all have, you know, 660 00:37:05,320 --> 00:37:08,239 Speaker 1: distinctive patterns on, you know, sort of the 661 00:37:08,239 --> 00:37:11,560 Speaker 1: top of their heads, that area, and, uh, you know, 662 00:37:11,600 --> 00:37:15,440 Speaker 1: it doesn't mean much to human eyes. But 663 00:37:15,960 --> 00:37:18,840 Speaker 1: I think at the time they were utilizing NASA technology 664 00:37:18,840 --> 00:37:21,680 Speaker 1: that was aimed at making sense of the stars, 665 00:37:22,320 --> 00:37:26,320 Speaker 1: like astronomical, um, computation systems, to then make sense 666 00:37:26,360 --> 00:37:30,319 Speaker 1: of and sort of track, um, identify, at any rate, these 667 00:37:30,400 --> 00:37:33,000 Speaker 1: various whale sharks. But this sort of 668 00:37:33,000 --> 00:37:34,960 Speaker 1: thing would be, I think, an even better method of 669 00:37:35,040 --> 00:37:37,400 Speaker 1: doing that, because you're probably dealing 670 00:37:37,440 --> 00:37:39,480 Speaker 1: with creatures, in all these cases, that, uh, you know, 671 00:37:39,520 --> 00:37:43,120 Speaker 1: definitely are not all identical. There are differences, 672 00:37:43,120 --> 00:37:44,839 Speaker 1: but we just may not have the eye for it, 673 00:37:45,040 --> 00:37:48,279 Speaker 1: whereas technology can be used to, say... Be the 674 00:37:48,320 --> 00:37:51,880 Speaker 1: sharper eye for chicken identity. Exactly. And, you know, it 675 00:37:51,960 --> 00:37:55,440 Speaker 1: has the advantage of not having to physically tag the 676 00:37:55,480 --> 00:37:58,279 Speaker 1: animal in some way, which can be difficult to do, 677 00:37:58,600 --> 00:38:01,000 Speaker 1: or it can be harmful to the animal, right, or 678 00:38:01,040 --> 00:38:04,480 Speaker 1: even dangerous to the individuals doing the tagging. Yeah. 679 00:38:04,840 --> 00:38:07,760 Speaker 1: Um, and so apparently, so you've got this one with lions, 680 00:38:07,800 --> 00:38:10,000 Speaker 1: the LINC project, but there are similar things that have 681 00:38:10,080 --> 00:38:14,200 Speaker 1: been attempted with tigers, elephants, even whales. You mentioned whale sharks, 682 00:38:14,239 --> 00:38:17,360 Speaker 1: but with, like, actual mammal whales, with a project that 683 00:38:17,480 --> 00:38:21,120 Speaker 1: an article in The Atlantic called Facebook for whales. I 684 00:38:21,160 --> 00:38:23,520 Speaker 1: hope it's not as addictive for whales as it is 685 00:38:23,520 --> 00:38:26,479 Speaker 1: for humans. That was a joke, but you didn't laugh. 686 00:38:26,600 --> 00:38:33,360 Speaker 1: That's okay. But similar technologies have been proposed and tested 687 00:38:33,400 --> 00:38:36,520 Speaker 1: to help link people with lost pets, including cats and dogs. 688 00:38:36,560 --> 00:38:38,520 Speaker 1: That seems like a great use of this. Yeah, like, 689 00:38:38,640 --> 00:38:42,440 Speaker 1: you know, certainly, I'm on enough social media boards where 690 00:38:42,520 --> 00:38:45,839 Speaker 1: there are pet owners, and occasionally a pet will go missing.
And then there's 691 00:38:45,880 --> 00:38:48,439 Speaker 1: this whole back and forth where, like, oh, 692 00:38:48,440 --> 00:38:51,439 Speaker 1: my orange cat is missing and he looks like this, 693 00:38:51,560 --> 00:38:53,239 Speaker 1: and then somebody will be like, well, does he look 694 00:38:53,280 --> 00:38:55,800 Speaker 1: like this? I think I saw him in the backyard. 695 00:38:55,840 --> 00:38:57,120 Speaker 1: And someone else is like, oh, I think I saw 696 00:38:57,200 --> 00:39:00,880 Speaker 1: him over here across town. And nobody can be sure, right, 697 00:39:00,960 --> 00:39:03,640 Speaker 1: because it's hard to get close to a stray cat 698 00:39:03,680 --> 00:39:06,440 Speaker 1: in some cases, or an escaped cat, or a 699 00:39:06,480 --> 00:39:09,319 Speaker 1: feral cat that is then misidentified as a lost cat. 700 00:39:10,080 --> 00:39:11,799 Speaker 1: But if you had the ability, if you had some 701 00:39:11,840 --> 00:39:15,719 Speaker 1: sort of app infrastructure where, your cat's missing, fine, 702 00:39:15,760 --> 00:39:18,560 Speaker 1: you upload photos of them to this database, and then when someone 703 00:39:18,640 --> 00:39:20,480 Speaker 1: finds a cat, they just take a picture of it 704 00:39:20,600 --> 00:39:22,520 Speaker 1: and it tells you if that cat is missing? Like, 705 00:39:22,560 --> 00:39:24,319 Speaker 1: that would be great. That would cut 706 00:39:24,320 --> 00:39:26,560 Speaker 1: out a lot of the anguish and the 707 00:39:26,920 --> 00:39:29,880 Speaker 1: work that goes with having a runaway pet. I agree, 708 00:39:30,080 --> 00:39:32,600 Speaker 1: that sounds great. And maybe I'm suffering from a lack 709 00:39:32,640 --> 00:39:36,000 Speaker 1: of imagination, but I'm thinking, in cases like that, 710 00:39:36,000 --> 00:39:39,040 Speaker 1: that's a case where I think the risks to 711 00:39:39,080 --> 00:39:42,359 Speaker 1: the cat's privacy would be far outweighed by the benefits 712 00:39:42,400 --> 00:39:45,440 Speaker 1: of people finding their lost animals, because we know how 713 00:39:45,480 --> 00:39:47,640 Speaker 1: cats are. They don't give a damn about privacy. 714 00:39:47,719 --> 00:39:49,759 Speaker 1: Yeah, they have a whole different set of 715 00:39:50,640 --> 00:39:55,440 Speaker 1: values. Now, another sort of quirk of timing 716 00:39:55,480 --> 00:39:58,520 Speaker 1: here, as we were putting this episode together, in fact, 717 00:39:58,520 --> 00:40:00,880 Speaker 1: as we were sort of finishing our notes for this episode 718 00:40:01,040 --> 00:40:04,719 Speaker 1: this morning, I actually read a new blog post 719 00:40:05,200 --> 00:40:09,240 Speaker 1: titled Depth of Field Fails by Janelle Shane at 720 00:40:09,320 --> 00:40:12,279 Speaker 1: AI Weirdness dot com. Oh, we've talked about her blog on 721 00:40:12,320 --> 00:40:14,799 Speaker 1: the show before, because it came up in the pair 722 00:40:14,800 --> 00:40:17,919 Speaker 1: of episodes we did called Flatus Ex Machina, which 723 00:40:17,960 --> 00:40:21,440 Speaker 1: was about why it's so funny when machines fail.
Yes, yeah, yeah. 724 00:40:21,560 --> 00:40:24,080 Speaker 1: And I imagine a lot of you have encountered, uh, 725 00:40:24,360 --> 00:40:27,479 Speaker 1: you know, the various scenarios she's run, with, like, AIs 726 00:40:27,600 --> 00:40:30,960 Speaker 1: coming up with names for Halloween costumes, or names 727 00:40:30,960 --> 00:40:33,520 Speaker 1: for... I think at one point even, like, she 728 00:40:33,520 --> 00:40:35,479 Speaker 1: was using it to come up with not only names 729 00:40:35,480 --> 00:40:40,439 Speaker 1: but actual recipes for cocktails. Yes, recipes for foods, also 730 00:40:40,840 --> 00:40:43,760 Speaker 1: names for dishes. We talked about D and D character 731 00:40:43,840 --> 00:40:48,040 Speaker 1: biographies being generated, and spells, names for spells. 732 00:40:48,280 --> 00:40:52,640 Speaker 1: Remember Song of the Darn Daving Fire? Yeah, there's even 733 00:40:52,640 --> 00:40:56,480 Speaker 1: one about cat names. But in this particular post, um, 734 00:40:57,200 --> 00:41:00,839 Speaker 1: Shane, the research scientist, tests out the facial recognition 735 00:41:00,920 --> 00:41:04,359 Speaker 1: AI that is employed by Skype for its Blur My 736 00:41:04,440 --> 00:41:08,640 Speaker 1: Background for All Calls feature. Now, the curious thing about 737 00:41:08,640 --> 00:41:11,360 Speaker 1: this is, I've been using Skype, we've been using Skype 738 00:41:11,360 --> 00:41:14,239 Speaker 1: here on the show for years for interviews, and I 739 00:41:14,239 --> 00:41:16,480 Speaker 1: guess I just don't dig in deeply enough or 740 00:41:16,520 --> 00:41:19,400 Speaker 1: read emails, because I didn't realize this was a feature 741 00:41:19,480 --> 00:41:22,720 Speaker 1: at all until today. I think we always let somebody 742 00:41:22,760 --> 00:41:24,920 Speaker 1: smarter than us figure out how to use Skype, and 743 00:41:24,960 --> 00:41:27,279 Speaker 1: then we'll just do the talking. But 744 00:41:27,360 --> 00:41:30,279 Speaker 1: it totally makes sense as a feature, because perhaps you 745 00:41:30,400 --> 00:41:33,080 Speaker 1: do have an ideal environment set up for a call, 746 00:41:33,080 --> 00:41:36,680 Speaker 1: with a business-friendly background behind you, but maybe you don't. 747 00:41:36,719 --> 00:41:39,160 Speaker 1: Maybe you have a fridge with a bunch of notes 748 00:41:39,160 --> 00:41:41,239 Speaker 1: stuck to it with magnets, and maybe some of those 749 00:41:41,239 --> 00:41:44,120 Speaker 1: are, like, bills, or they have some 750 00:41:44,239 --> 00:41:47,080 Speaker 1: data on there that you wouldn't even want there 751 00:41:47,120 --> 00:41:49,799 Speaker 1: to be a chance somebody might be able to decipher. 752 00:41:50,000 --> 00:41:52,399 Speaker 1: Or a bookshelf full of occult tomes that you don't 753 00:41:52,400 --> 00:41:55,120 Speaker 1: want people to know you've been researching. That's a 754 00:41:55,160 --> 00:41:57,920 Speaker 1: good one. Or perhaps you're at work and there's a 755 00:41:57,960 --> 00:42:01,279 Speaker 1: marker board full of, you know, data back there; 756 00:42:01,320 --> 00:42:03,600 Speaker 1: might be something you don't want out. Or perhaps you 757 00:42:03,640 --> 00:42:05,960 Speaker 1: have some distracting art up there on the wall, and 758 00:42:06,000 --> 00:42:08,919 Speaker 1: you don't want to compromise the interior of your own 759 00:42:08,960 --> 00:42:11,840 Speaker 1: home just so that you can do a Skype call.
You 760 00:42:11,840 --> 00:42:13,920 Speaker 1: don't want to have to, like, take things off the 761 00:42:13,920 --> 00:42:15,839 Speaker 1: wall in order to do this. Or you're in one 762 00:42:15,840 --> 00:42:21,000 Speaker 1: of those weird Roman mass toilets, whatever the case may be. 763 00:42:21,520 --> 00:42:24,520 Speaker 1: The AI, then, can auto-blur all of that out 764 00:42:24,800 --> 00:42:27,360 Speaker 1: for you. But to do so, it has to be 765 00:42:27,400 --> 00:42:29,560 Speaker 1: able to tell the difference between the face of the 766 00:42:29,640 --> 00:42:33,520 Speaker 1: caller and mere objects in the background. And 767 00:42:33,560 --> 00:42:35,880 Speaker 1: in this case, the AI's definition of a face is 768 00:42:35,920 --> 00:42:40,320 Speaker 1: pretty broad. As Shane discovers, it will allow ancient Egyptian 769 00:42:40,360 --> 00:42:44,600 Speaker 1: illustrations to come through unblurred, to be the face. 770 00:42:44,880 --> 00:42:46,600 Speaker 1: To be the face. So it's like, okay, this call 771 00:42:46,680 --> 00:42:51,680 Speaker 1: is being made by this individual from ancient Egypt, Isis. 772 00:42:52,320 --> 00:42:57,640 Speaker 1: But also increasingly abstract depictions of a human face. 773 00:42:57,680 --> 00:43:00,080 Speaker 1: So she showed it various works of art, and some 774 00:43:00,120 --> 00:43:02,080 Speaker 1: of them were really abstract, and it was like, all right, 775 00:43:02,120 --> 00:43:04,320 Speaker 1: that's a face, sure, that will work. Just like Munch, 776 00:43:04,440 --> 00:43:10,000 Speaker 1: The Scream. Yes, yeah, exactly. Um, also stuffed giraffes. It 777 00:43:10,080 --> 00:43:14,000 Speaker 1: had a problem with, like, the horns of the giraffe, 778 00:43:14,280 --> 00:43:17,640 Speaker 1: but not so much the face of the stuffed giraffe. 779 00:43:17,680 --> 00:43:22,320 Speaker 1: It gets a little confused with life-size plastic skeletons, however, 780 00:43:22,440 --> 00:43:25,560 Speaker 1: and also cats can throw it off as well. But 781 00:43:26,320 --> 00:43:28,840 Speaker 1: do check out that blog post. It's 782 00:43:28,880 --> 00:43:32,160 Speaker 1: amusing and also insightful. But all this is certainly, I 783 00:43:32,160 --> 00:43:35,399 Speaker 1: think, an example of facial recognition AI doing something that's 784 00:43:35,440 --> 00:43:39,719 Speaker 1: not only helpful but could actually help you with your privacy. Now, 785 00:43:39,760 --> 00:43:41,839 Speaker 1: one thing I think that would be different there is 786 00:43:41,880 --> 00:43:45,120 Speaker 1: that that's facial recognition in the sense of recognizing a 787 00:43:45,440 --> 00:43:49,600 Speaker 1: face as opposed to a background, versus recognizing whose face 788 00:43:49,680 --> 00:43:52,200 Speaker 1: a picture is of. Right. But then again, you could 789 00:43:52,200 --> 00:43:55,080 Speaker 1: easily imagine that being an upgrade, or being a 790 00:43:55,080 --> 00:43:58,160 Speaker 1: situation where, if you had a more robust, um, facial 791 00:43:58,200 --> 00:44:01,920 Speaker 1: recognition, um, AI that was then used on some sort 792 00:44:01,960 --> 00:44:04,560 Speaker 1: of Skype-like system, you might actually go ahead and have 793 00:44:04,600 --> 00:44:08,240 Speaker 1: a feature where the caller's face was logged, and therefore 794 00:44:08,400 --> 00:44:10,759 Speaker 1: it would blur out any face that was not the 795 00:44:10,800 --> 00:44:14,520 Speaker 1: authorized user's face.
So that way, if, you know, there 796 00:44:14,560 --> 00:44:17,320 Speaker 1: are other employees walking by in the background getting coffee, 797 00:44:17,440 --> 00:44:19,279 Speaker 1: they're not going to show up on your call. If 798 00:44:19,320 --> 00:44:21,880 Speaker 1: your, you know, significant other walks by in the background, 799 00:44:21,960 --> 00:44:23,920 Speaker 1: they're not going to show up on the call, etcetera. 800 00:44:24,000 --> 00:44:26,040 Speaker 1: So even with this, I mean, when I'm in these 801 00:44:26,160 --> 00:44:29,200 Speaker 1: kinds of scenarios in my head, I'm always wondering if 802 00:44:29,280 --> 00:44:32,960 Speaker 1: there are freaky applications that I'm just not being imaginative 803 00:44:33,080 --> 00:44:35,640 Speaker 1: enough to get to yet. Well, let me put 804 00:44:35,640 --> 00:44:38,880 Speaker 1: on my Black Mirror neural lace cap for a second 805 00:44:39,200 --> 00:44:42,839 Speaker 1: and think. Um, how about a simple case where law 806 00:44:42,920 --> 00:44:46,960 Speaker 1: enforcement wants to access an unblurred background? Now, I'm not 807 00:44:46,960 --> 00:44:49,640 Speaker 1: sure to what extent that's even possible with this technology, 808 00:44:49,680 --> 00:44:52,239 Speaker 1: but what if, say, you know, a government agency made 809 00:44:52,239 --> 00:44:55,440 Speaker 1: a claim for a need to override the auto-blurring 810 00:44:55,480 --> 00:44:58,440 Speaker 1: features utilized by others, so they would just have blanket 811 00:44:58,920 --> 00:45:02,000 Speaker 1: power to do this? So you think you're blurring your background, 812 00:45:02,040 --> 00:45:04,839 Speaker 1: but actually you're... Yeah, nobody can see it, not if you're 813 00:45:04,880 --> 00:45:08,239 Speaker 1: on the phone with, you know, with someone 814 00:45:08,280 --> 00:45:11,040 Speaker 1: who's actually a government employee, or whoever happens to have 815 00:45:11,080 --> 00:45:13,480 Speaker 1: the magic key in this scenario. Oh, I guess it's 816 00:45:13,520 --> 00:45:15,200 Speaker 1: kind of like how you think you can have your 817 00:45:15,200 --> 00:45:17,600 Speaker 1: phone turned off, or you think you can have GPS 818 00:45:17,680 --> 00:45:20,879 Speaker 1: turned off, but in fact it is still tracking your location. Yeah, yeah, 819 00:45:20,880 --> 00:45:23,480 Speaker 1: that sort of thing. And again, I don't have a 820 00:45:23,480 --> 00:45:25,879 Speaker 1: detailed enough knowledge of this particular software. I'm 821 00:45:26,600 --> 00:45:29,879 Speaker 1: not applying this directly to the Skype scenario here, 822 00:45:29,880 --> 00:45:33,080 Speaker 1: just sort of thinking in general. Um, now, in 823 00:45:33,120 --> 00:45:36,279 Speaker 1: this particular case, I'm also assuming that the broad definition 824 00:45:36,320 --> 00:45:38,480 Speaker 1: of a face is in place at least in part 825 00:45:38,560 --> 00:45:41,880 Speaker 1: to avoid situations where a human being's face is blurred 826 00:45:41,880 --> 00:45:46,120 Speaker 1: because the AI can't handle, say, facial disfigurement. Because I 827 00:45:46,120 --> 00:45:48,720 Speaker 1: think we can understand why we wouldn't want an AI 828 00:45:48,880 --> 00:45:53,399 Speaker 1: like this to lean heavily on norms, to promote ideals 829 00:45:53,440 --> 00:45:57,040 Speaker 1: about who does and who doesn't have a face.
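The whitelist-blur idea the hosts imagine here is easy to make concrete. What follows is a minimal sketch under stated assumptions, not Skype's actual pipeline: it uses OpenCV's stock Haar-cascade detector and the LBPH recognizer from the opencv-contrib-python package as stand-ins for a production face matcher, and the match threshold is a guess.

```python
# Minimal, hypothetical sketch: blur everything except faces that match
# an enrolled caller. Requires opencv-contrib-python; illustrative only.
import cv2
import numpy as np

# Stock Haar-cascade face detector that ships with OpenCV.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def enroll(sample_paths, size=(100, 100)):
    """Train a simple LBPH matcher on a few photos of the authorized caller."""
    samples = [cv2.resize(cv2.imread(p, cv2.IMREAD_GRAYSCALE), size)
               for p in sample_paths]
    recognizer = cv2.face.LBPHFaceRecognizer_create()
    recognizer.train(samples, np.zeros(len(samples), dtype=np.int32))
    return recognizer

def privacy_filter(frame, recognizer, max_distance=60.0, size=(100, 100)):
    """Return the frame fully blurred except where an enrolled face appears."""
    out = cv2.GaussianBlur(frame, (51, 51), 0)            # blur everything...
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        crop = cv2.resize(gray[y:y + h, x:x + w], size)
        _, distance = recognizer.predict(crop)            # lower = closer match
        if distance < max_distance:                       # threshold is a guess
            out[y:y + h, x:x + w] = frame[y:y + h, x:x + w]  # ...unblur matches
    return out
```

A real product would more likely segment the whole person rather than unblur rectangular face boxes, but the box version keeps the sketch short, and it makes plain why Shane's probing of what the model counts as "a face" matters: everything hinges on that detector.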
Sure, though, 830 00:45:57,200 --> 00:46:00,319 Speaker 1: in considering this with face recognition, we get 831 00:46:00,360 --> 00:46:04,080 Speaker 1: into some interesting and, you know, at times disturbing territory. 832 00:46:04,719 --> 00:46:08,480 Speaker 1: Matthew Gault had an article in Vice last February titled 833 00:46:08,480 --> 00:46:13,279 Speaker 1: Facial Recognition Software Regularly Misgenders Trans People, detailing how 834 00:46:13,320 --> 00:46:17,000 Speaker 1: these systems were simply not built with trans or non- 835 00:46:17,000 --> 00:46:20,839 Speaker 1: binary people in mind, and can, quote, continue to reinforce 836 00:46:21,120 --> 00:46:24,000 Speaker 1: existing biases. Oh yeah, I mean, as with a lot 837 00:46:24,000 --> 00:46:27,640 Speaker 1: of things, I think sometimes there is an illusion that 838 00:46:28,320 --> 00:46:33,000 Speaker 1: machines somehow will be free from applying biases to humans 839 00:46:33,000 --> 00:46:35,560 Speaker 1: that humans apply to each other. But I think 840 00:46:35,600 --> 00:46:38,239 Speaker 1: we've got ample evidence now that that is not 841 00:46:38,320 --> 00:46:42,480 Speaker 1: the case, that human biases get quite easily mapped onto 842 00:46:42,719 --> 00:46:46,840 Speaker 1: artificial intelligence through assumptions used in the creation 843 00:46:46,880 --> 00:46:49,640 Speaker 1: of these algorithms, or through the data sets they're trained on. 844 00:46:50,080 --> 00:46:52,280 Speaker 1: Right. And as Gault explores in the article, 845 00:46:52,320 --> 00:46:54,120 Speaker 1: part of it, too, is just, like, who's building 846 00:46:54,480 --> 00:46:57,040 Speaker 1: these programs. You know, it's being built by programmers 847 00:46:57,040 --> 00:47:00,520 Speaker 1: and engineers, people that may just not 848 00:47:00,680 --> 00:47:02,880 Speaker 1: have ever really given serious study to some of, 849 00:47:03,000 --> 00:47:05,800 Speaker 1: like, say, the gender issues that are, you know, inherent 850 00:47:05,840 --> 00:47:09,200 Speaker 1: to the problem. He also mentions how past databases have, 851 00:47:09,320 --> 00:47:13,640 Speaker 1: for instance, misidentified black people in criminal databases, and even, 852 00:47:13,680 --> 00:47:16,880 Speaker 1: in some cases, how they failed to see black people 853 00:47:16,960 --> 00:47:19,719 Speaker 1: at all. Yeah, like, say, if they are trained primarily 854 00:47:19,800 --> 00:47:24,160 Speaker 1: on data sets with lighter-skinned faces. In fact, just 855 00:47:24,320 --> 00:47:27,680 Speaker 1: last December, December of twenty nineteen, the National Institute of 856 00:47:27,719 --> 00:47:31,000 Speaker 1: Standards and Technology in the US tested a hundred and 857 00:47:31,000 --> 00:47:35,120 Speaker 1: eighty-nine facial recognition algorithms from ninety-nine developers, which 858 00:47:35,480 --> 00:47:39,000 Speaker 1: includes, like, some big-name developers, and found that they 859 00:47:39,000 --> 00:47:42,400 Speaker 1: were far less accurate at identifying African American and Asian 860 00:47:42,440 --> 00:47:47,120 Speaker 1: faces compared to Caucasian faces, and African American females were 861 00:47:47,120 --> 00:47:50,440 Speaker 1: even more likely to be misidentified. Now, this was 862 00:47:50,480 --> 00:47:54,200 Speaker 1: reported in various places.
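For a concrete sense of what "less accurate for some groups" means, here is a toy sketch of the kind of per-group tally an audit like that might compute. The trial records, group labels, and threshold below are invented for illustration; NIST's actual methodology is far more involved.

```python
# Toy illustration only: tally per-demographic error rates for a face matcher.
# All data here is made up; it is not NIST's protocol or results.
from collections import defaultdict

# Each trial: (demographic group, whether the two photos show the same person,
# and the similarity score the algorithm under test returned).
trials = [
    ("group_a", True, 0.91), ("group_a", False, 0.12),
    ("group_b", True, 0.55), ("group_b", False, 0.64),
    # ...a real evaluation uses millions of image pairs...
]

def error_rates(trials, threshold=0.6):
    tally = defaultdict(lambda: {"genuine": 0, "fnm": 0, "impostor": 0, "fm": 0})
    for group, same_person, score in trials:
        t = tally[group]
        if same_person:
            t["genuine"] += 1
            if score < threshold:      # failed to match a true pair
                t["fnm"] += 1
        else:
            t["impostor"] += 1
            if score >= threshold:     # falsely matched two different people
                t["fm"] += 1
    return {group: {
                "false_non_match_rate": t["fnm"] / max(t["genuine"], 1),
                "false_match_rate": t["fm"] / max(t["impostor"], 1)}
            for group, t in tally.items()}

# A markedly higher false_match_rate for one group than another is the kind
# of disparity the December 2019 NIST study reported.
print(error_rates(trials))
```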
Uh, but the article I was 863 00:47:54,239 --> 00:47:58,440 Speaker 1: reading about it was a BBC News tech article, Facial Recognition Fails 864 00:47:58,480 --> 00:48:01,760 Speaker 1: on Race, Government Study Says. Yeah, I've read about several 865 00:48:01,800 --> 00:48:04,480 Speaker 1: cases like this. I mean, I think it's just 866 00:48:04,719 --> 00:48:08,280 Speaker 1: so important for people, especially people working in the technology space, 867 00:48:08,320 --> 00:48:10,680 Speaker 1: to remember: you know, don't fall for the myth that 868 00:48:10,800 --> 00:48:13,720 Speaker 1: it's unbiased just because it's a machine and not a person. 869 00:48:13,840 --> 00:48:17,919 Speaker 1: People's biases end up in the machine. The rules come 870 00:48:18,000 --> 00:48:20,360 Speaker 1: from us. Okay, well, I think we're gonna have to 871 00:48:20,400 --> 00:48:23,399 Speaker 1: call the first episode right there. But when we come 872 00:48:23,400 --> 00:48:25,920 Speaker 1: back in the next episode in this series, 873 00:48:25,920 --> 00:48:28,719 Speaker 1: we're gonna be definitely talking about facial recognition in the 874 00:48:28,800 --> 00:48:32,239 Speaker 1: organic brain, and we'll be moving on more to the 875 00:48:32,320 --> 00:48:36,080 Speaker 1: history of technological facial recognition. We'll get to talk about Greebles. 876 00:48:36,120 --> 00:48:39,439 Speaker 1: We love Greebles. Greebles were new to me, but there's 877 00:48:39,440 --> 00:48:41,799 Speaker 1: a whole world. We could do a whole podcast on Greebles. 878 00:48:41,840 --> 00:48:43,319 Speaker 1: You might think they were new to you; they weren't 879 00:48:43,320 --> 00:48:45,080 Speaker 1: new to you. We've talked about Greebles. We have talked 880 00:48:45,080 --> 00:48:50,160 Speaker 1: about Greebles. Greebles have the most delectable spikes. Okay, if you're curious, 881 00:48:50,160 --> 00:48:51,879 Speaker 1: you'll have to come back next time to find out. 882 00:48:52,000 --> 00:48:54,600 Speaker 1: Come back for the Greebles. In the meantime, if you 883 00:48:54,600 --> 00:48:56,359 Speaker 1: want to check out other episodes of Stuff to Blow 884 00:48:56,360 --> 00:48:59,760 Speaker 1: Your Mind, you can find us anywhere you find podcasts. 885 00:49:00,200 --> 00:49:01,759 Speaker 1: If you just want a handy way to check 886 00:49:01,840 --> 00:49:03,799 Speaker 1: us out, go to stuff to blow your mind dot 887 00:49:03,840 --> 00:49:06,080 Speaker 1: com, and that'll shoot you over to the iHeart 888 00:49:07,239 --> 00:49:10,000 Speaker 1: listing for our program. But wherever you get the show, 889 00:49:10,120 --> 00:49:12,520 Speaker 1: just make sure that you subscribe, make sure that you 890 00:49:12,640 --> 00:49:15,200 Speaker 1: rate and review. These are great ways to help us 891 00:49:15,239 --> 00:49:17,680 Speaker 1: out, and just tell a friend about the show. That 892 00:49:17,800 --> 00:49:21,600 Speaker 1: also helps. And don't forget our other podcast, Invention. Invention 893 00:49:21,800 --> 00:49:24,840 Speaker 1: is a journey through human techno-history. Oh yeah, I 894 00:49:24,840 --> 00:49:26,520 Speaker 1: feel like we've been doing most of our 895 00:49:26,560 --> 00:49:30,440 Speaker 1: technology stuff on Invention these days, and so 896 00:49:30,680 --> 00:49:32,680 Speaker 1: I'm glad to be getting back into the techno space 897 00:49:32,719 --> 00:49:35,359 Speaker 1: a little bit on Stuff to Blow Your Mind today.
Yeah, absolutely, 898 00:49:35,480 --> 00:49:37,439 Speaker 1: even if it is for a kind of dystopian sci- 899 00:49:37,480 --> 00:49:41,160 Speaker 1: fi topic like this. Anyway, huge thanks as always to 900 00:49:41,200 --> 00:49:44,359 Speaker 1: our excellent audio producer, Seth Nicholas Johnson. If you would 901 00:49:44,400 --> 00:49:46,160 Speaker 1: like to get in touch with us with feedback on 902 00:49:46,200 --> 00:49:48,600 Speaker 1: this episode or any other, to suggest a topic for 903 00:49:48,640 --> 00:49:50,920 Speaker 1: the future, or just to say hello, you can email 904 00:49:51,000 --> 00:50:01,799 Speaker 1: us at contact at stuff to blow your mind dot com. 905 00:50:01,880 --> 00:50:03,720 Speaker 1: Stuff to Blow Your Mind is a production of iHeart 906 00:50:03,760 --> 00:50:06,439 Speaker 1: Radio's How Stuff Works. For more podcasts from iHeart Radio, 907 00:50:06,600 --> 00:50:09,160 Speaker 1: visit the iHeart Radio app, Apple Podcasts, or wherever 908 00:50:09,200 --> 00:50:15,680 Speaker 1: you listen to your favorite shows.