Sleepwalkers is a production of iHeartRadio and Unusual Productions.

Every single thing that you do on a social network or on a website is essentially recorded. How many pages you visited, what did you click on, how did you get to that website? What page did you leave on? How many photos have you ever uploaded? Where were those photos uploaded? How many places have you checked into? Who have you tagged? What photographs have you been tagged in, and with whom? Where were those photographs taken? Who's in your friendship circle? Who did you go to school with? There are algorithms and machine learning technologies that connect all of that data together and start to find patterns.

That's Lisa Talia Moretti speaking. She's a digital sociologist and tech ethics advocate with a big focus on data.

I think some of the data that's quite concerning is facial recognition technology, where machines are being fed huge amounts of images, pictures of people's faces, and that can come from dating websites, from photographs of you on Facebook or Instagram or anywhere on the internet. You don't even have to upload it yourself. It could be somebody else that's uploaded it for you.

With all that data out in the wild, all it takes is for somebody to suck it up, and they can start connecting the dots.

So there were researchers last year from Stanford University who, without the users' permission, scraped thirty thousand photographs from a dating website that was public, right? So they took it that they could just use those photographs for research. Because it was a dating site, people were also asked, as data points, um, what is your sexual orientation?
So gay, straight, bi, right? And so they had all of that data connected, so they could then connect faces to sexual orientation, and essentially they built an algorithm that they said could detect, just by somebody's face, if they were gay or straight or bi. The algorithm worked: it could predict sexual orientation from photographs with 91 percent accuracy for men and 83 percent accuracy for women, based on just five photographs.
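To make the mechanics concrete, here is a minimal, purely illustrative sketch of how such a classifier gets built. The embeddings and labels below are random stand-ins (nothing here reproduces the actual Stanford model or its data); the point is only that once faces are reduced to numbers and paired with labels, ordinary off-the-shelf classification does the rest.

```python
# A minimal, purely illustrative sketch of the pipeline described above:
# reduce each photo to a feature vector, pair it with a self-reported label,
# and fit an off-the-shelf classifier. The "embeddings" here are random
# stand-ins for the output of a real face-embedding model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_profiles = 30_000                      # roughly the scale of the scraped set
X = rng.normal(size=(n_profiles, 128))   # stand-in face embeddings
w = rng.normal(size=128)                 # hidden structure the model can find
y = (X @ w + rng.normal(size=n_profiles) > 0).astype(int)  # synthetic labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")

# In the study, averaging predictions over five photos of the same person
# raised accuracy further; with embeddings, that's just a mean of probabilities.
```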
And Stanford isn't the only place using facial recognition to categorize people. Governments are too, all around the world.

This is Sleepwalkers. Welcome. I'm Oz Woloshyn. In the last episode, we looked at what happens when artificial intelligence digests huge data sets to find patterns and make predictions, like: what's the most typical movie synopsis? Or: is that mole cancerous? But advances in machine learning are also making us more and more legible. In today's episode, we ask what happens when we become the data set, and the power to predict is turned on us. Here's Lisa again.

The extra terrifying thing is: who creates this kind of technology? Like, who does it serve? Why do we need this? If you take this type of technology, feed it to a citywide CCTV surveillance system, and say that you go to a place like Saudi Arabia, where being gay is considered a crime, and suddenly you're just, what, pulling people off the street and arresting them, because you're gay, because the computer said so. So, like, now you're going to prison.

This may sound like a terrifying Minority Report style future, but actually it's here today. Hi, Kara. Hi, Oz. So Lisa was talking about what happens when governments start to connect the dots of all this data, but it's already happening with private enterprise. For example, insurance companies can now see someone who joins a Facebook group about a genetic mutation and use that data to guess that that person may have the genetic mutation and the condition that's associated, and then the computer says: let's raise their premium, or let's deny them care. And so we can use this proxy data to do something The New York Times has recently called proxy discrimination. And I think something that's even more widely applicable is this definition of surveillance capitalism, which is essentially that data is not just data anymore, it's money. Companies can use data to make predictions about future behavior, and that can make them profit. Right, that's surveillance capitalism: using information about joining a Facebook group to make an insurance decision. And it's, I don't know, it's a little bit scary.
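The Facebook-group example can be made concrete with a few lines of synthetic code. Nothing below is a real insurer's model; it just shows how a classifier that is never shown a diagnosis can still price it in, through a correlated proxy like group membership.

```python
# Illustrative sketch of "proxy discrimination": the model never sees the
# sensitive attribute (a health condition), but a correlated feature --
# membership in a support group -- lets it reconstruct the attribute anyway.
# All data is synthetic; the rates are invented for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000

has_condition = rng.random(n) < 0.05                 # hidden sensitive attribute
# People with the condition are far more likely to join the support group.
joined_group = np.where(has_condition,
                        rng.random(n) < 0.60,
                        rng.random(n) < 0.01)

# An insurer's model trained to predict "costly claim" using only the proxy.
costly_claim = has_condition & (rng.random(n) < 0.5)
model = LogisticRegression().fit(joined_group.reshape(-1, 1), costly_claim)

p_member = model.predict_proba([[1]])[0, 1]
p_non = model.predict_proba([[0]])[0, 1]
print(f"predicted claim risk if in group:     {p_member:.3f}")
print(f"predicted claim risk if not in group: {p_non:.3f}")
# The gap between those two numbers is the premium surcharge described above,
# driven entirely by a proxy the person volunteered for free.
```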
In the US, it's all about capitalism. But in other countries, surveillance is used for other purposes. For example, in China, it's about social control. In China, this same massive ingestion of data and statistical modeling is used for governance. So let's take a closer look at China and how they're using technology to amplify the power of the state.

They have this notion of a social credit score, which is how good of a citizen you are according to the government. They have literally hundreds of millions of cameras around, and they can basically do things like: you've been out at the bar, you know, until two the last couple of nights. That's not really what a good, upstanding citizen would do. So your social credit score can get dinged, because you've been spending too much time at night at a bar.

That's Dr. Alex Kilpatrick. He and Mary Haskett co-founded Blink Identity, a facial recognition company that can recognize a user's face in, yes, the blink of an eye: about 0.4 seconds. Here's Blink co-founder Mary on Chinese surveillance.

They have cameras, so you're recorded jaywalking, and so your score goes down, you know, and you automatically get a ticket, which again doesn't sound like that big a deal if you really believe in law and order. Except if your score isn't high enough, you can't buy a plane ticket, you have to travel by bus, you know, you can't live in certain areas. And you can obviously see how this could be abused. I mean, it doesn't take much of an imagination.

Other factors that can bring down your social credit score include what you buy at the store, your online browsing, and even having a friend with a low score. This use of generalized surveillance can keep a whole population in check, which is more or less the explicit goal of the Communist Party of China. This can be stifling for the average Han Chinese citizen. For minorities, it can be much, much worse.

Right. You know, more specifically and more dauntingly, it can be used for internment, you know, in the case of the Uyghur minority in China: using data that is being created by this minority just to communicate, you know, one person communicating with another person, those who are connected, that both are Uyghurs, right, five Uyghurs are gathering in the same place. Now that we know that, you know, what are we going to do with that data? Oh, we're going to send these people to re-education camps.

And as technology improves, so does the state's ability to project power.

China today is cleaning the floor with the Americans on voice and facial recognition technology.

That's Ian Bremmer, an expert on global political risk and the founder of the Eurasia Group.

The Chinese have much more data. You also have a government that is consolidating the data and allocating it for different types of purposes. And you have no presumption of privacy whatsoever.

With no presumption of privacy, the amount of data you can collect from your citizens grows exponentially, and that actually gives you a huge technological advantage. This is how Kai-Fu Lee explains it.

What makes an AI algorithm work better is how much data you use to train it. And that's the beauty of deep learning. You just keep throwing data at it, and it just performs better.
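That claim, that data volume drives performance, can be seen in a toy learning curve: hold the model fixed, grow the training set, and watch held-out accuracy climb. The sketch below uses synthetic data, and the numbers are illustrative, not a benchmark of any real system.

```python
# A toy learning curve for Kai-Fu Lee's point: the same small neural network,
# trained on ever-larger samples of the same task, keeps getting better.
# Synthetic data; illustrative numbers only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)

def make_data(n):
    X = rng.normal(size=(n, 20))
    y = ((X[:, 0] * X[:, 1] + np.sin(X[:, 2])) > 0).astype(int)  # nonlinear rule
    return X, y

X_test, y_test = make_data(5_000)
for n_train in [100, 1_000, 10_000]:
    X, y = make_data(n_train)
    clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500,
                        random_state=0).fit(X, y)
    print(f"{n_train:>6} examples -> test accuracy {clf.score(X_test, y_test):.2f}")
```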
And Kai-Fu understands this world better than most. His fund, Sinovation Ventures, has invested in Megvii, a facial recognition company valued at four billion dollars. Kai-Fu also ran Google China, so he knows the landscape.

China simply has more data than the US, due to not only the large number of users, but also the depth in which Chinese users use the internet: for ordering food, for shared bicycles, for mobile payments. So the AI will actually just perform better, because it's trained on more data.

Some of that data is taken from citizens using surveillance, but according to Kai-Fu, much is freely given.

I think the Chinese culture and the Chinese people are more pragmatic, so that if the software delivers pragmatic value, they ask fewer questions. For example, you know, we've funded an app that loans money. It asks you a couple of questions and takes data from your phone, at the same level as Facebook would take data from your phone, and it zaps the money to you instantly, if it decides to lend money to you. I think in the US, people might question: do I really want to give my data to a lending application? And is it appropriate to consider the make of my phone as a part of that decision to give me money, or my zip code? Because that might reflect certain things about me.

The breadth and depth of data in China, both from a larger population and a much deeper integration with technology, gives China a serious competitive advantage.

Where the Americans have much better scientists, the Chinese were able to buy a lot of science. Now, when you talk to specialists in this field, they will tell you that in many parts of AI, great data and okay scientists will frequently beat great scientists and okay data.

Kara, the scary thing is that China is using that technological superiority to build a very different kind of state, you know, one in which the price of dissent is intolerably high. You know, so according to Ian, public demonstrations have fallen markedly. Right, because if you know you're being watched, you're probably less likely to commit public displays of civil disobedience. Right. We talked about the Uyghurs earlier. In Kashgar, which is a Muslim city in China,
Uyghurs have to register to go into the mosque, and once they're inside, they face a bank of cameras, like many, many surveillance cameras. And go figure, Muslims stopped going to mosque voluntarily. Well, because going to mosque is an act of civil disobedience where they are. Even if it's not explicitly stated, it's heavily implied. Right. I mean, I think about it for myself. Like, if I were living in an area in the United States where going to temple was going to land me in an internment camp, I would not be going voluntarily if I knew there were security cameras all over my temple. Absolutely. And the crazy thing is, you wouldn't even have to know if those security cameras work. It's like on the roads in England: we have a lot of speed cameras, and no one knows whether they're actually doing anything, apart from once every five years you get a ticket. But it still slows people down. The panopticon. Even if they can't do that, people think they can do that, right? I mean, you don't need a hundred percent certainty. You just need a government that is starting to get that capacity and make it known, and have a few people that are sort of strung up as examples, and suddenly everyone's scared.

And this isn't only happening in China. According to Ian, it was a key part of Bashar al-Assad's strategy in the Syrian Civil War.

Assad got some help from the Russians, who gave them a couple of hundred computer scientists to go in, work with the Syrian military, and identify, on social media and on text messaging, who were those Syrian citizens that were nodes of dissent. And within six months, no more moderate opposition in Syria. They specifically were looking into individual Syrian citizens that were saying things about the regime that were untoward, that were connected to influencers that were helping to organize protests. And suddenly, you know, a bunch of those people were rounded up, and some were never heard from again.
And as I mentioned in terms of China, you don't have to do that with many people before people start ratting out their friends, being scared of talking to anyone, not going out. The system worked.

We may feel comfortably far from the battlefields of Syria here in the US, and from the overwhelming number of surveillance cameras on practically every street corner in China. But the more effective these technologies are, the more likely they are to be adopted by others.

Now, in other countries, you're going to have a confluence of both them liking the model and the Chinese directly exporting it. Who are those countries? Well, look at One Belt, One Road, the, you know, trillion-plus-dollar investments that the Chinese are making all over the world: in Pakistan, Southeast Asia, you know, Cambodia, Laos, a whole bunch of countries. And when you look at those countries, you see that the Chinese are providing the money, and there's conditionality in return. Some of that conditionality is: use Chinese standards for technology. That's in many of these contracts.

And with the spread of Chinese standards of technology comes the spread of Chinese-style surveillance, which could ultimately make the whole world trend more authoritarian.

So as that happens, these governments are going to say: aha, we get the money from China, we use their technology. We're stuck with their system, but we can use it to ensure that our people stay in power.

Again, it's easy to let all of this feel comfortably far away, but remember, the internet doesn't have borders, so you don't have to be in China for the Chinese state to access your data. Yeah, I don't know how many people know this, but Grindr is actually owned by a Chinese company. Grindr, the dating app? Yes. And actually, there have been articles about the fact that the US government is trying to force China's hand so that we can buy it back, because there's so much user data that this company now owns.
Well, that Grindr user data is basically being seen by the US government as a strategic asset. It is a strategic asset. I mean, if you think about it, if somebody is on a military base or in a barracks and trying to connect with someone on Grindr, they're turning their location services data on, because they want to see people in their area. And if they're turning that location services data on, they're basically making themselves vulnerable to the company that owns the data, because it's basically saying: here I am, here I am. And not only that: here I am, here I am, and I'm gay. And that can lead to some possibilities of blackmail. Even today, exactly. There was an article in The Interface, written by this guy Casey Newton: chat history, photos, videos, real-time location, all of that is connected to a user's email address, and that means that the user's identity can be very easily learned. That's pretty scary.
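The location exposure described here is a well-documented class of attack: researchers have repeatedly shown that proximity apps which report your distance to other users can be trilaterated. A minimal geometric sketch (flat-plane approximation, invented coordinates) shows why three distance readings are enough to pin someone down.

```python
# Illustrative sketch of why "here I am" data is so sensitive. If an app
# reports your distance to other users, an attacker who reads that distance
# from three chosen vantage points can trilaterate you exactly.
# All coordinates here are made up; distances are in arbitrary units.
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Intersect three distance circles (flat-earth approximation)."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    A = 2 * (bx - ax); B = 2 * (by - ay)
    C = r1**2 - r2**2 - ax**2 + bx**2 - ay**2 + by**2
    D = 2 * (cx - bx); E = 2 * (cy - by)
    F = r2**2 - r3**2 - bx**2 + cx**2 - by**2 + cy**2
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return x, y

target = (3.2, 1.7)  # the user's true (hidden) position
dist = lambda p: math.hypot(p[0] - target[0], p[1] - target[1])

# Attacker spoofs three positions and records the app's reported distances.
vantage = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
r = [dist(v) for v in vantage]

print(trilaterate(vantage[0], r[0], vantage[1], r[1], vantage[2], r[2]))
# -> (3.2, 1.7): three distance readings recover the exact location.
```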
And even if you don't use Grindr, you might use Clash of Clans or Fortnite, which are very popular gaming apps also owned by the Chinese. Now, we keep saying "the Chinese." To be clear, these apps aren't owned by the Chinese government; they're owned by Chinese companies. What the actions of the US government imply, by trying to force this company to sell Grindr back, is that they don't believe in the distinction. And do you think the US government would really be working that hard to get back a gay dating app if they didn't think there was a murky separation between the government and companies in China? Right. So every time we give data away, I mean, we're aware that it opens us up to targeted ads on Facebook; we talked about those with Gillian. We're not aware that that data may end up in the hands of a potentially hostile foreign government. So once again: sleepwalking. We've been talking about how foreign governments are using AI, but when we come back, we'll look at how the police and courts are using it at home, in America.

Kara, it's easy to look at China and to see the big bad wolf. They're using surveillance technology for the wholesale suppression of an ethnic minority. They have a social credit score that can limit access to opportunities and even travel. But algorithms also determine outcomes here in the US. Exactly. If you think about it, we do use social ratings. If you use Uber and have a rating lower than four out of five stars, you can't get a car, right? And you can't get a loan if you have a low FICO credit score. And our criminal justice system also uses algorithmic ratings to decide people's fate.

When I got arrested at sixteen, I was in high school, at John F. Kennedy High School.

That's Glenn Rodriguez. When he was a baby, Glenn's mother was murdered, and when he was three, his father committed suicide. From then on, Glenn was raised by his grandmother, and he searched for belonging.

The kid who wants to feel accepted wants to feel a part of something, right? Whatever the group was up for, I was down. And if they were going one step forward, I would take two. We pretty much planned a robbery at a car dealership in Queens, and we entered the premises. We took three cars. There was a twenty-five-year-old man in there, and he initially pulled a gun, and so I had a gun, and I shot him.

Glenn was arrested and convicted of second-degree murder. He was sentenced to twenty-five years in jail, and he was still a high schooler.

You feel powerless, you feel hopeless, especially at that age. So the way I saw it was: this is my life. You know, I'm probably gonna die in jail, and so whatever it is that I have to do, I need to survive. One of the things that I learned very quickly is that in prison, one of the only things that is respected is violence, and so in order for you to survive in there, you have to be violent, because otherwise you become prey.

In time, Glenn established his reputation and started to feel safer.
With that security, and getting older, his thinking began to change.

And it wasn't until later, like my mid-twenties, when I started saying: you know what, I need to reverse this trend if I want to have any chance at parole.

Glenn had to reverse thirteen years of behavior. To survive in prison, he had learned to behave one way, but to get out, he had to behave another.

I availed myself of the Puppies Behind Bars program, so I was training service dogs for wounded war veterans for five years. That was an amazing experience, right? Because throughout incarceration, it's almost like you build a wall around yourself. With the dogs, you can't fake it. If you're trying to, like, teach them a command, sometimes you may have to be silly. Which, guess what, in prison, being silly is not acceptable; that's perceived as a weakness. But with that program, oftentimes you had to resort to being silly and throwing yourself on the floor and giggling loud and making all kinds of crazy sounds to try to get the dog's attention, right? To a very large extent, I believe that that program kind of helped me regain my humanity.

As well as helping Glenn personally, taking part in prison programs for the public good is looked upon favorably by parole boards.

So everything I did, I wanted to document, to kind of showcase: this is what I've done, this is who I am today.

As part of the process, there's also the COMPAS risk assessment. COMPAS stands for Correctional Offender Management Profiling for Alternative Sanctions. It's an algorithm that claims to be able to predict how likely a defendant is to commit another crime, based on a list of a hundred and thirty-seven questions. Since being developed, it's estimated COMPAS has been used to assess more than one million defendants, including Glenn.

You meet with this person a few months before your scheduled parole board date, and they ask you a series of questions.
And so when he got to the disciplinary history section of the COMPAS risk assessment, there was a list of offenses, right, for him to check off, yes or no, for the past twenty-four months. And it was all no. And anyone who has any experience with prison would tell you that that is almost impossible to do, right? Because misbehavior reports can be for something as simple as having too many pillows, something as simple as your pants hanging off your butt, your sneaker is untied. It takes a lot of energy to dodge a misbehavior report during the course of a year, let alone ten. And in my case, it had been eleven.

And then I heard him read the question, and he says: does this person appear to have notable disciplinary issues? And he says: yes. And I was like, hold up, wait a second, did I just hear you right? Because I just heard you say that I have notable disciplinary issues. Do you realize that I haven't had a misbehavior report in over a decade, right? And his answer was: well, I was told that if there's any instance of misbehavior at any point, I have to check yes for this answer. So I was like, okay. So at that point, there was nothing I could do.

I'm appearing before the parole board panel. I presented to them a portfolio that was approximately one hundred pages, had letters of support. Now the COMPAS is saying that I'm a disciplinary issue, and so I shouldn't be released. And I was denied, because of the fact that I scored high on COMPAS. They played it safe and kept me in. It may have been less than five minutes, the hearing. I waited twenty-six years to sit in front of a panel of three people, for less than five minutes. No one wants to be the one to go against COMPAS, and next thing you know, something goes wrong, and now your job is on the line, because you departed from COMPAS, which is taken as factual and scientific.
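COMPAS's internals are proprietary, so we can't show the real formula. But a deliberately invented toy score illustrates the failure Glenn ran into: when the instrument only accepts a lifetime yes/no, eleven clean years barely move the number.

```python
# NOT the real COMPAS model -- an invented toy score showing how a lifetime
# yes/no question ("any disciplinary record, ever?") erases the difference
# between an infraction last month and a spotless decade.
from dataclasses import dataclass

@dataclass
class Person:
    years_since_last_infraction: float
    any_infraction_ever: bool          # the only thing the form records

def toy_risk_score(p: Person) -> float:
    """Hypothetical score in [0, 10]; all weights are made up."""
    score = 2.0
    if p.any_infraction_ever:
        score += 5.0                   # one binary flag dominates the score...
    score -= min(p.years_since_last_infraction, 3) * 0.5
    return max(score, 0.0)             # ...while time served clean is capped

recent = Person(years_since_last_infraction=0.2, any_infraction_ever=True)
decade = Person(years_since_last_infraction=11.0, any_infraction_ever=True)
print(toy_risk_score(recent))          # 6.9
print(toy_risk_score(decade))          # 5.5 -- a clean decade barely moves it
```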
In time, Glenn went before another parole board, and this time they freed him, against the recommendation of COMPAS. And now Glenn has built a life for himself, working with at-risk teenagers at the Center for Community Alternatives. But he's still being affected by the algorithm.

COMPAS does not end upon your release, because the same COMPAS risk assessment that's considered for your release determines how you're going to be supervised upon release. There's a number of restrictions that I have. I have a curfew. I'm still haunted by COMPAS.

Despite turning his life around, COMPAS is still limiting Glenn's freedom, and that should haunt all of us. According to ProPublica, COMPAS inaccurately labels black defendants as likely to reoffend twice as often as white defendants. Algorithmic discrimination isn't government policy here in the US, like it is against the Uyghurs in China, but it still exists.

There's this issue where you can have computer scientists building a more accurate algorithm, but on account of dubious input factors like gender or race or religion, you've created something that's unconstitutional.

That's Jason Tashea. He introduced us to Glenn, and he's the founder of Justice Codes and a legal affairs writer for the American Bar Association Journal.

There's this predisposition to believe that math doesn't carry all of the biases that humans do, that it's an objective science. I think we need to dispel that idea.

Jason is describing the very human habit of taking computer output as gospel truth. It's called automation bias, and it's why parole boards often don't feel comfortable overriding algorithms like COMPAS, and why some people follow their GPS even when it has them driving into the ocean.

This idea that somehow, because math is an underlying force to these tools, makes them more objective or beyond certain types of scrutiny, is wrong.
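ProPublica's analysis boils down to a disaggregated error-rate check, which is easy to express in code. The sketch below uses synthetic labels shaped only to mirror the published two-to-one disparity; it is not the real COMPAS data.

```python
# ProPublica's headline finding was about error rates by group: among people
# who did NOT go on to reoffend, black defendants were roughly twice as likely
# to have been labeled high-risk. The check itself is a few lines.
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
group = rng.choice(["black", "white"], size=n)
reoffended = rng.random(n) < 0.35

# Synthetic risk labels with a group-dependent false-positive rate.
labeled_high_risk = np.where(
    group == "black",
    reoffended | (rng.random(n) < 0.40),                             # FPR ~ 0.40
    (reoffended & (rng.random(n) < 0.80)) | (rng.random(n) < 0.20),  # FPR ~ 0.20
)

for g in ["black", "white"]:
    mask = (group == g) & ~reoffended        # people who did not reoffend...
    fpr = labeled_high_risk[mask].mean()     # ...but were labeled high-risk
    print(f"{g:>5}: false positive rate = {fpr:.2f}")
```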
Computer algorithms are being used to determine human fate today, whether it's COMPAS in the US or the social credit score in China. So we have to scrutinize them, and understand that their output is not necessarily neutral.

The foundational principle of AI is using historical data to predict what will happen next, and that in itself is a challenge to our culture, because the American dream is built on the idea that we have a capacity to change: that we can move from rags to riches, from the penitentiary to the boardroom. And it's not just an American narrative. It's Scrooge's change of heart, delivering the turkey to the Cratchits on Christmas Day. It's Saint Paul's conversion on the road to Damascus. It's at the very heart of Western culture. But algorithms like COMPAS aren't built to see the potential in people. They're designed to calculate risk based on past actions. And COMPAS isn't the only example of algorithms being used in our criminal justice system. When we come back, we go right inside the NYPD to understand how new technology is powering policing.

It's a freezing cold day in New York City when we arrive at NYPD headquarters, and before we even get into the main building, Julian and I have to pass through airport-style security and, naturally, give up some data, including submitting a selfie at a kiosk. Right now, our society is holding big conversations about body cameras, police accountability, and government monitoring, so we had to ask: how does one of the most recognizable police forces in the world handle our data?

My name is Benjamin Singleton, director of analytics at the NYPD. So I probably spend half my day writing code and half my day in meetings. The police department collects records as a regular course of our business. We respond to 911 calls, we take crime reports, we make arrests, we issue moving summonses when you, you know, speed in the city. These are examples of the kind of data that we collect.
You know, I think there's probably some sentiment that there are back doors into various systems, um, but the NYPD is governed by the same legal processes as any other, um, law enforcement agency. If we want data from an outside company or vendor, we get a search warrant from a judge or, through a DA's office, issue a subpoena, and that's how we collect our data.

There are cameras throughout the subways and at turnstiles, E-ZPass readers on the roads, and more in New York. So what might the NYPD know about me?

If you haven't, sort of, stood in front of a police officer who's taking a report by hand, we probably don't have records on you. That being said, we do collect data through sensors like license plate readers, um, and we do have data-sharing agreements with some other criminal justice agencies, like corrections, like the courts, and so there's obviously opportunities for that kind of data to enter our realm. But one thing that's built into every single NYPD application is an auditing track. So any time you look at any piece of information, no matter what system you're in, that's being audited. And so we have a very large internal affairs bureau, and people have gotten in trouble before for misuse of computer systems, and so I think that that's an important check.
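What Ben is describing is a standard engineering pattern: an append-only audit trail wrapped around every data access. A generic sketch, with invented names and nothing drawn from actual NYPD systems, looks something like this:

```python
# A hedged sketch of an "auditing track": every lookup, in any system, leaves
# a record of who looked at what, and when. Generic pattern, not NYPD code;
# the file name and fields are invented for illustration.
import functools
import getpass
import json
import time

AUDIT_LOG = "access_audit.jsonl"   # hypothetical append-only audit trail

def audited(fn):
    """Record every call to a data-access function before returning results."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        entry = {
            "ts": time.time(),
            "user": getpass.getuser(),
            "action": fn.__name__,
            "args": [repr(a) for a in args],
        }
        with open(AUDIT_LOG, "a") as f:    # log first, so failures still leave a trace
            f.write(json.dumps(entry) + "\n")
        return fn(*args, **kwargs)
    return wrapper

@audited
def lookup_plate(plate_number: str) -> dict:
    # Stand-in for a real records query.
    return {"plate": plate_number, "hits": []}

lookup_plate("ABC1234")   # the query works, and internal affairs can see that it ran
```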
That's reassuring to hear. But why collect so much data in the first place?

I think that the next frontier of machine learning in policing is bringing decisions and information into the hands of cops who need to make decisions quickly. We recently rolled out tens of thousands of mobile phones to all of our cops, and putting a computer in their hands has really changed the way that they police. When you have more information, you can make better decisions. So we could be responding to a job at a specific location, in a building, and we know what's happened at that building before. We responded to 911 calls there last week, in apartment 4C, and in that interaction, it led to some sort of altercation, or we found out that that person we interacted with had some sort of issue. Well, the cop who's working today might not be the same cop who was working a week ago. And so how do I convey that information? Maybe in a phone, as a pop-up, as a notification that tells you: take extra time, take caution, um, this sort of incident happened.

Using data to give officers context is hard to argue against if it can lead to safer interactions for everyone. But of course, what many people find more concerning is ambient surveillance: surveillance that happens all the time. And despite much pressure, the NYPD has yet to release an explicit facial recognition policy. And where do the efforts stand on facial recognition technology?

Our Facial Identification Section, which sits under the Detective Bureau, is a group of trained detectives, investigators. They use a tool, an algorithm, that compares faces that we might get from a surveillance photo. They run that algorithm, get potential matches, and then conduct an investigation. But it's not as simple as, you know, a facial recognition hit occurs, and that's suddenly license to make an arrest. It doesn't generate probable cause for us. We still require much more evidence in order to make a determination that that hit is truly viable and something we can act on.
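Here is a generic sketch of what "run that algorithm, get potential matches" typically means in face identification systems: compare an embedding of the probe photo against a gallery of enrolled embeddings, and hand a ranked candidate list to a human investigator. All vectors are random stand-ins and the threshold is invented; this is the common technique, not the NYPD's actual tool.

```python
# Generic face-identification sketch: embed the probe photo, score it against
# a gallery by cosine similarity, and surface candidates above a threshold
# for human review. Random stand-in vectors; invented operating point.
import numpy as np

rng = np.random.default_rng(4)
gallery = rng.normal(size=(10_000, 512))           # enrolled face embeddings
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

probe = gallery[42] + 0.02 * rng.normal(size=512)  # noisy photo of person #42
probe /= np.linalg.norm(probe)

similarity = gallery @ probe                       # cosine similarity to everyone
top = np.argsort(similarity)[::-1][:5]             # five best potential matches

THRESHOLD = 0.6                                    # a tuning choice, not a legal standard
for idx in top:
    if similarity[idx] >= THRESHOLD:
        print(f"candidate {idx}: similarity {similarity[idx]:.2f} -> investigate")
# A hit above threshold starts an investigation; it is not probable cause.
```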
But are there cases where that technology has been used as part of an arrest or prosecution?

In the absence of an explicit policy, Ben wasn't able to answer the question live in the room, but we did get a statement from the NYPD: "The NYPD has moved deliberately and responsibly in the use of facial recognition software. There is no NYPD case where an arrest or prosecution was brought on the basis of facial recognition. The NYPD uses it on a case-by-case basis, and the case must always be supported by further investigation before any arrest is made. The NYPD has absolutely no interest in wholesale surveillance, which would be an enormous and entirely pointless task."

We have little choice but to trust. But, that said, Ben did speak convincingly about how the NYPD actually uses technology to police themselves.

There's also statistical tools around fairness that can actually measure whether an algorithm is fair, whether it's causing bias, et cetera. And so we're very interested in utilizing these metrics, and we fully embrace them. We want to get better, and we're taking a conservative approach, because we know how high-stakes this is.

The stakes are high, and the path is murky. I didn't know what to expect at the NYPD. Would they optimize purely for reducing crime, or would they take a broader view of justice? Personally, I found Ben reassuring, but the potential for abuse remains. So how do we, here in America, guard against that abuse? Well, let's return to Mary Haskett, who founded Blink Identity with Alex Kilpatrick.

Anytime you're using face rec without consent, it's going to get abused, because why wouldn't it? And here's the problem: I don't think it's appropriate to ask a police department to just voluntarily not use a tool that's awesome for them. I mean, you need to have a different level. You need to have your government, federal, state, some governing body, saying: sorry, this is not appropriate. This is violating people's rights.

The difference between what is happening in China and in the US is not technological; it's cultural and political. Edward Snowden had a phrase for this: turnkey tyranny, meaning that the technical infrastructure of mass surveillance already exists, and that we're only protected by our values and our laws.

And thinking of China, I think there are some profoundly creepy things that we are right on the edge of starting to see. There's cameras everywhere.
If you add face recognition, it's not just, oh, they saw my face, they saw that I went to Starbucks. It's where you were, every day, every time, all of your history, and all that gets saved. It's my pattern of where I go when I'm outdoors, forever. Five years ago, I would have said that could never happen here.

Part of the reason Mary cares so much about privacy is that she knows how quickly facial recognition is spreading. In fact, in twenty eighteen, Blink Identity raised money from Live Nation, Ticketmaster's parent company, to allow future concertgoers to use their faces instead of their tickets.

We wanted to be a case study of how to do this in a way that preserves individual privacy and respects the individual, and maybe that will help set a precedent, and maybe some of these other objectionable use cases just won't be able to take off.

Facial recognition and other AI technologies are being developed all over the world, and we can't trust everyone to be as conscientious as Mary and Alex. In America, the liberty we take for granted is hard-won and fragile, and cases like Glenn's show what can happen when algorithms are blindly trusted to determine outcomes. So much hangs in the balance right now about our technological future, and the decisions we take will affect our lives profoundly, and echo through the lives of our children too.

I mentioned Charles Dickens's Christmas Carol earlier. To me, one of the most powerful scenes in the book is Scrooge seeing, for the first time, the chains he has made for himself through his own decisions. Nowadays, we would call those decisions and those chains longitudinal data, and they'd be very hard to get rid of. They're the record that Glenn couldn't shake, that might deny a Chinese citizen a plane ticket, or deny you health insurance because of your social media activity.

But data can also set us free. In the next episode, we investigate what's possible when our data is used to help us:
from a dying man brought back to his youth, to movies and music that read our bodies while they play, and what happens when Alexa becomes part of the family.

How long did the [unintelligible] live? Hmm, I don't know that one. I am still learning more about dinosaurs. Ask me some dinosaur trivia!

Sleepwalkers is a production of iHeartRadio and Unusual Productions. For the latest AI news, live interviews, and behind-the-scenes footage, find us on Instagram at @sleepwalkerspodcast, or at sleepwalkerspodcast.com. Special thanks this episode to Laurie Arlam and Lucy Brady. Sleepwalkers is hosted by me, Oz Woloshyn, and co-hosted by me, Kara Price, and produced by Julian Weller with help from Jacopo Penzo and Taylor Chicoine. Mixing by Tristan McNeil and Julian Weller. Our story editor is Matthew Riddle. Recording assistance this episode from Dina Bridgett, Rachel London, and Phil Bodger. Sleepwalkers is executive produced by me, Oz Woloshyn, and Mangesh Hattikudur. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.