1 00:00:14,280 --> 00:00:17,239 Speaker 1: Welcome to Tech Stuff. I'm Oz Woloshyn here with Cara 2 00:00:17,360 --> 00:00:20,360 Speaker 1: Price. Hi Cara. So we've talked on the 3 00:00:20,400 --> 00:00:23,720 Speaker 1: show a fair bit about facial recognition technology and the 4 00:00:23,800 --> 00:00:28,320 Speaker 1: ubiquity of data scraping by private companies. It does sometimes 5 00:00:28,360 --> 00:00:30,840 Speaker 1: feel a bit like we are living in a kind 6 00:00:30,840 --> 00:00:34,959 Speaker 1: of science fiction movie about the population of people who 7 00:00:35,000 --> 00:00:36,479 Speaker 1: are being technologically surveilled. 8 00:00:37,280 --> 00:00:39,120 Speaker 2: I think the movie is called Minority Report, and we 9 00:00:39,200 --> 00:00:40,960 Speaker 2: live in it right now. 10 00:00:41,000 --> 00:00:43,479 Speaker 1: And over the past few months, it's become impossible to 11 00:00:43,520 --> 00:00:47,879 Speaker 1: ignore how government agencies are partnering with private companies to 12 00:00:48,000 --> 00:00:51,240 Speaker 1: help in the efforts to do mass deportations. 13 00:00:51,720 --> 00:00:57,279 Speaker 2: Yeah, it's very disconcerting to read about how government agencies 14 00:00:57,320 --> 00:01:01,480 Speaker 2: are using facial recognition technology to detain deportees. 15 00:01:01,960 --> 00:01:04,360 Speaker 1: Yeah, and it's not just facial recognition. We'll get into 16 00:01:04,400 --> 00:01:07,160 Speaker 1: all the different technologies that are being used today with 17 00:01:07,280 --> 00:01:11,240 Speaker 1: our friend Joseph Cox from 404 Media. The 18 00:01:11,240 --> 00:01:14,679 Speaker 1: amount of data that Immigration and Customs Enforcement can scrape 19 00:01:14,760 --> 00:01:18,399 Speaker 1: and utilize is pretty mind-blowing. And on the other 20 00:01:18,440 --> 00:01:21,000 Speaker 1: side of it, people have developed products to help track 21 00:01:21,319 --> 00:01:24,960 Speaker 1: ICE activity, but those products and apps have been removed 22 00:01:25,280 --> 00:01:27,480 Speaker 1: from the Apple App Store, and also I think from 23 00:01:27,520 --> 00:01:30,560 Speaker 1: the Android app stores, which limits how much people are 24 00:01:30,600 --> 00:01:31,560 Speaker 1: able to use these tools. 25 00:01:31,760 --> 00:01:35,120 Speaker 2: You know, how private tech companies are responding to ICE 26 00:01:35,319 --> 00:01:39,160 Speaker 2: does put a spotlight on the relationship between Silicon Valley 27 00:01:39,200 --> 00:01:41,360 Speaker 2: and this presidential administration. 28 00:01:42,080 --> 00:01:46,360 Speaker 1: Joseph said it's really astonishing what a different environment it 29 00:01:46,440 --> 00:01:49,840 Speaker 1: is now versus the first Trump administration in that respect, 30 00:01:50,000 --> 00:01:52,640 Speaker 1: which we'll get into. But without further ado, here's my 31 00:01:52,720 --> 00:01:56,400 Speaker 1: conversation with Joseph Cox. Thanks for joining us today, 32 00:01:56,240 --> 00:01:58,360 Speaker 3: Joseph. Yeah, absolutely, thank you for having me. 33 00:01:58,640 --> 00:02:02,400 Speaker 1: So I've got two big questions: what's going on with 34 00:02:02,480 --> 00:02:05,680 Speaker 1: ICE and technology, and how worried should we all be 35 00:02:06,320 --> 00:02:07,120 Speaker 1: about what's going on?
36 00:02:08,000 --> 00:02:14,120 Speaker 3: So ICE is buying essentially every surveillance technology available under 37 00:02:14,120 --> 00:02:18,239 Speaker 3: the sun. A very brief overview is facial recognition technology, 38 00:02:18,360 --> 00:02:23,680 Speaker 3: social media surveillance, phone location data, spyware for hacking into 39 00:02:23,720 --> 00:02:27,560 Speaker 3: phones remotely, fake cell phone towers as well. If you 40 00:02:27,639 --> 00:02:30,400 Speaker 3: can imagine it, and if a law enforcement agency has 41 00:02:30,440 --> 00:02:34,240 Speaker 3: had it before in the US, ICE has almost certainly at 42 00:02:34,360 --> 00:02:37,520 Speaker 3: this point purchased it or adapted it, or is at 43 00:02:37,560 --> 00:02:41,360 Speaker 3: least thinking about buying it. As for your second question, 44 00:02:42,000 --> 00:02:46,320 Speaker 3: we actually just published an article today about ICE and 45 00:02:46,360 --> 00:02:50,600 Speaker 3: Customs and Border Protection officials using a facial recognition app 46 00:02:51,120 --> 00:02:55,120 Speaker 3: in the field on people it stops, to verify their citizenship. 47 00:02:55,639 --> 00:02:59,120 Speaker 3: And a lawmaker who's also been following the issue told 48 00:02:59,160 --> 00:03:03,000 Speaker 3: me that ICE believes a result from this app can 49 00:03:03,080 --> 00:03:07,040 Speaker 3: override a birth certificate in determining whether someone is a 50 00:03:07,120 --> 00:03:10,360 Speaker 3: citizen of the United States or not. That is obviously 51 00:03:10,400 --> 00:03:13,720 Speaker 3: incredibly alarming: you're going to take the results from 52 00:03:13,720 --> 00:03:16,320 Speaker 3: a piece of technology, a notoriously 53 00:03:16,360 --> 00:03:21,360 Speaker 3: inaccurate technology sometimes, and you're going to 54 00:03:21,400 --> 00:03:25,480 Speaker 3: trust that over the piece of paper that says that 55 00:03:25,560 --> 00:03:27,720 Speaker 3: somebody is a US citizen. 56 00:03:28,320 --> 00:03:31,360 Speaker 1: What is the reliability of facial recognition software today? 57 00:03:31,840 --> 00:03:35,160 Speaker 3: So I don't know about this specific technology, it's called 58 00:03:35,640 --> 00:03:40,880 Speaker 3: Mobile Fortify, but broadly, over the years, as companies and 59 00:03:40,920 --> 00:03:45,120 Speaker 3: governments have been deploying and developing facial recognition technology, we've 60 00:03:45,120 --> 00:03:49,000 Speaker 3: seen there are racial biases, often because the data sets 61 00:03:49,000 --> 00:03:51,160 Speaker 3: of all of these photos of faces used to train 62 00:03:51,520 --> 00:03:55,360 Speaker 3: the AI or the machine learning algorithms don't have 63 00:03:55,400 --> 00:03:57,800 Speaker 3: as many black people or brown people in them, so 64 00:03:57,800 --> 00:04:01,640 Speaker 3: they can actually be very inaccurate at correctly identifying people. 65 00:04:01,880 --> 00:04:04,080 Speaker 3: We've seen a ton of coverage in the New York Times, 66 00:04:04,080 --> 00:04:07,600 Speaker 3: for example, where people have been arrested by mistake because 67 00:04:07,640 --> 00:04:10,400 Speaker 3: they believe one black man is actually another one based 68 00:04:10,440 --> 00:04:13,360 Speaker 3: on the facial recognition technology. Again, I don't know the 69 00:04:13,360 --> 00:04:17,200 Speaker 3: accuracy of this specific tool that ICE is using.
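To make that bias mechanism concrete, here is a minimal sketch of how an audit can measure a matcher's false match rate separately for each demographic group. The embeddings, the similarity threshold, and the matcher itself are hypothetical stand-ins, not anything from Mobile Fortify or any real system.

```python
# Minimal sketch: measuring demographic disparity in a face matcher.
# All data and the threshold are hypothetical stand-ins; real audits
# (e.g. NIST's vendor tests) are far more rigorous than this.
from itertools import combinations

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def false_match_rate(embeddings, threshold):
    """embeddings: list of (person_id, vector) pairs for ONE demographic
    group. Counts how often two DIFFERENT people score as a 'match'."""
    impostor_pairs = [
        (vec_a, vec_b)
        for (id_a, vec_a), (id_b, vec_b) in combinations(embeddings, 2)
        if id_a != id_b  # different people: any match here is a false match
    ]
    if not impostor_pairs:
        return 0.0
    false_matches = sum(
        1 for vec_a, vec_b in impostor_pairs
        if cosine_similarity(vec_a, vec_b) >= threshold
    )
    return false_matches / len(impostor_pairs)

# If a group is under-represented in training, its embeddings tend to
# crowd together, so at the SAME threshold its false match rate is higher:
# false_match_rate(group_a_embeddings, 0.8) vs. false_match_rate(group_b_embeddings, 0.8)
```

The point of the sketch is the comparison: a single global threshold can produce very different error rates for different groups, which is exactly the failure mode behind the mistaken arrests described here.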
They 70 00:04:17,240 --> 00:04:21,800 Speaker 3: all do vary, but this bias, and especially racial bias, 71 00:04:21,839 --> 00:04:26,040 Speaker 3: has been a longstanding criticism of facial recognition over the 72 00:04:26,120 --> 00:04:28,160 Speaker 3: past five, six, even ten years. 73 00:04:28,320 --> 00:04:30,920 Speaker 1: I want to get into the specific tools, Joseph, 74 00:04:31,000 --> 00:04:34,120 Speaker 1: but before we get there: I actually several years ago 75 00:04:34,200 --> 00:04:36,800 Speaker 1: hosted a podcast on a very different topic, which was 76 00:04:36,839 --> 00:04:39,520 Speaker 1: called Forgotten: Women of Juarez, and it was about the 77 00:04:40,120 --> 00:04:44,240 Speaker 1: femicides and murders of women on the Texas-Mexico border. 78 00:04:44,560 --> 00:04:47,000 Speaker 1: And one of the interesting things that came out of 79 00:04:47,360 --> 00:04:51,080 Speaker 1: spending time there was how everyone on the US side 80 00:04:51,120 --> 00:04:56,240 Speaker 1: of the border talks about the twenty five miles inland 81 00:04:56,279 --> 00:04:59,680 Speaker 1: from the border as this kind of zone where the 82 00:04:59,760 --> 00:05:03,480 Speaker 1: laws that govern law enforcement don't really apply. There's all 83 00:05:03,520 --> 00:05:06,880 Speaker 1: these kind of special exceptions for border enforcement and stuff, 84 00:05:07,200 --> 00:05:09,760 Speaker 1: and that therefore that area became a kind of early 85 00:05:09,880 --> 00:05:15,599 Speaker 1: testing ground both for actions that might be considered extrajudicial, 86 00:05:16,120 --> 00:05:20,480 Speaker 1: but also for technologies that the wider population wouldn't accept 87 00:05:20,600 --> 00:05:25,200 Speaker 1: if they were deployed on the wider population. Are we 88 00:05:25,320 --> 00:05:29,120 Speaker 1: now seeing the metastasis of the border area, and the 89 00:05:29,120 --> 00:05:33,320 Speaker 1: technology used to surveil it and control it, across the 90 00:05:33,320 --> 00:05:33,960 Speaker 1: whole country? 91 00:05:34,560 --> 00:05:37,880 Speaker 3: Yeah, I think absolutely. I think a concrete example would 92 00:05:37,960 --> 00:05:42,640 Speaker 3: be the use of surveillance drones. Many people will associate 93 00:05:42,680 --> 00:05:47,240 Speaker 3: these very large Predator drones with the US war on terror, 94 00:05:47,279 --> 00:05:50,599 Speaker 3: firing Hellfire missiles in Iraq, Afghanistan, Yemen, that 95 00:05:50,680 --> 00:05:53,960 Speaker 3: sort of thing. Those same drones, albeit unarmed, have 96 00:05:54,080 --> 00:05:57,200 Speaker 3: been surveilling the US-Mexico border and the Canadian border 97 00:05:57,240 --> 00:05:59,960 Speaker 3: as well for years and years and years. What happens 98 00:06:00,200 --> 00:06:04,440 Speaker 3: now more regularly and more frequently is that DHS will 99 00:06:04,440 --> 00:06:07,800 Speaker 3: fly those same high-powered surveillance drones, again ones that were 100 00:06:07,839 --> 00:06:11,200 Speaker 3: designed for war zones, above US cities. The most recent 101 00:06:11,240 --> 00:06:15,600 Speaker 3: example of that was the LA anti-ICE protests, and 102 00:06:15,760 --> 00:06:18,880 Speaker 3: DHS flew those same drones above the city. They even 103 00:06:18,920 --> 00:06:22,479 Speaker 3: published some of the footage in social media posts.
And 104 00:06:22,560 --> 00:06:26,400 Speaker 3: we see it in all sorts of different technology as well, 105 00:06:26,600 --> 00:06:30,159 Speaker 3: maybe downloading the contents of your phone, which is very 106 00:06:30,240 --> 00:06:33,400 Speaker 3: easy at a border, where people have fewer rights, obviously, 107 00:06:33,440 --> 00:06:37,200 Speaker 3: because they're crossing a border or entering or leaving. We're 108 00:06:37,279 --> 00:06:40,760 Speaker 3: absolutely going to see that sort of thing proliferating. And 109 00:06:40,800 --> 00:06:43,600 Speaker 3: that's one of the main worries from the surveillance experts 110 00:06:43,640 --> 00:06:46,160 Speaker 3: I speak to, this trickle-down effect of, well, 111 00:06:46,200 --> 00:06:49,640 Speaker 3: this technology starts in an immigration context, then it goes 112 00:06:49,680 --> 00:06:53,359 Speaker 3: to domestic immigration enforcement, and then it ends up with 113 00:06:53,400 --> 00:06:55,320 Speaker 3: local cops or a local sheriff's office. 114 00:06:55,800 --> 00:06:59,520 Speaker 1: Or you hear Trump talking about having the military practice 115 00:06:59,600 --> 00:07:04,080 Speaker 1: ops in sanctuary cities. It makes me think of Anduril, 116 00:07:04,360 --> 00:07:07,560 Speaker 1: the weapons manufacturer who are doing all these drones and 117 00:07:07,600 --> 00:07:10,880 Speaker 1: autonomous weapons that are being exported to Taiwan in preparation 118 00:07:11,000 --> 00:07:14,400 Speaker 1: for a potential conflict there. But I believe that Anduril, 119 00:07:14,480 --> 00:07:17,080 Speaker 1: back in twenty sixteen, developed a lot of these technologies, 120 00:07:17,120 --> 00:07:20,920 Speaker 1: these weapons of war, again starting off with border contracts. 121 00:07:20,960 --> 00:07:25,520 Speaker 1: So this intersection of the tech, the military, and using 122 00:07:25,600 --> 00:07:29,240 Speaker 1: military methods to control domestic populations, I mean, as just 123 00:07:29,720 --> 00:07:31,720 Speaker 1: a set of ideas, it's quite terrifying. 124 00:07:32,080 --> 00:07:37,640 Speaker 3: Yeah, Anduril runs these AI-powered towers all across the 125 00:07:37,760 --> 00:07:41,160 Speaker 3: US-Mexico border and the US-Canadian border as well. 126 00:07:41,280 --> 00:07:45,280 Speaker 3: They're capable of all sorts of things like AI object recognition, 127 00:07:45,720 --> 00:07:50,480 Speaker 3: tracking people, tracking vehicles, alerting DHS officials, all of that 128 00:07:50,520 --> 00:07:53,440 Speaker 3: sort of thing. And there is just a big shift 129 00:07:53,600 --> 00:07:57,560 Speaker 3: now in Silicon Valley, in the tech companies, where they 130 00:07:57,600 --> 00:08:01,640 Speaker 3: are more explicitly going to work with the government. Whereas 131 00:08:01,640 --> 00:08:07,680 Speaker 3: before, Facebook, Instagram, Meta, Twitter, Uber, whoever, they were 132 00:08:07,760 --> 00:08:10,640 Speaker 3: very much separate from the government: we're making consumer goods, 133 00:08:10,680 --> 00:08:14,640 Speaker 3: we're making an app for whatever reason. Tech companies today 134 00:08:15,440 --> 00:08:19,000 Speaker 3: are not only leaning in, they're very proud to work 135 00:08:19,000 --> 00:08:20,960 Speaker 3: with the government.
And Anduril has been a leader in that, 136 00:08:21,040 --> 00:08:24,480 Speaker 3: and obviously another example would be Palantir, where 137 00:08:24,800 --> 00:08:28,880 Speaker 3: these companies say, our purpose is to protect 138 00:08:29,040 --> 00:08:32,760 Speaker 3: the United States. Now, of course, how people understand what 139 00:08:32,840 --> 00:08:35,880 Speaker 3: protecting the United States might actually be is obviously going 140 00:08:35,960 --> 00:08:38,920 Speaker 3: to vary, especially when we talk about surveillance technology and 141 00:08:38,960 --> 00:08:41,240 Speaker 3: how that's being used and who it's being used against. 142 00:08:42,080 --> 00:08:46,440 Speaker 1: You mentioned facial recognition. Clearview AI always seems to come 143 00:08:46,520 --> 00:08:49,160 Speaker 1: up sort of as the big bad wolf of this 144 00:08:49,760 --> 00:08:53,320 Speaker 1: topic area. Can you explain a bit about how that 145 00:08:53,440 --> 00:08:56,640 Speaker 1: technology or that company is being used by, or partnered with, 146 00:08:56,760 --> 00:08:57,280 Speaker 1: the government? 147 00:08:57,960 --> 00:09:01,319 Speaker 3: Yes. So we just revealed that ICE paid Clearview AI 148 00:09:01,559 --> 00:09:06,080 Speaker 3: another five million dollars for access to its technology. ICE 149 00:09:06,080 --> 00:09:09,040 Speaker 3: has been contracting with Clearview for years. It's something of 150 00:09:09,120 --> 00:09:12,480 Speaker 3: the established player in the government or law enforcement facial 151 00:09:12,520 --> 00:09:17,640 Speaker 3: recognition space. What Clearview did several years ago was basically 152 00:09:17,720 --> 00:09:22,880 Speaker 3: scrape a ton of photos of people's faces from the 153 00:09:23,040 --> 00:09:26,320 Speaker 3: World Wide Web, you know, profiles, social 154 00:09:26,320 --> 00:09:29,560 Speaker 3: media profiles. And they made this massive database, which is 155 00:09:29,600 --> 00:09:32,880 Speaker 3: now something like ten billion images, and they allowed people 156 00:09:33,200 --> 00:09:35,520 Speaker 3: to upload a photo, and it will then show you 157 00:09:35,760 --> 00:09:39,040 Speaker 3: similar photos, probably of the same person, and you can 158 00:09:39,080 --> 00:09:41,760 Speaker 3: then figure out who this person is. And when it 159 00:09:41,800 --> 00:09:44,120 Speaker 3: was revealed by The New York Times several years ago, 160 00:09:44,559 --> 00:09:48,560 Speaker 3: that was really crossing the line. Like, Facebook could have 161 00:09:48,600 --> 00:09:52,160 Speaker 3: done this, and although they had facial recognition technology, it 162 00:09:52,200 --> 00:09:54,240 Speaker 3: was to tag people, you know, when you tag your 163 00:09:54,240 --> 00:09:58,160 Speaker 3: friends in your photos. It wasn't to unmask strangers. Google 164 00:09:58,200 --> 00:10:00,439 Speaker 3: could have done it. They said, this is too dangerous, 165 00:10:00,480 --> 00:10:04,760 Speaker 3: we don't want to. Clearview crossed that line. It was incredibly 166 00:10:04,800 --> 00:10:08,040 Speaker 3: controversial at the time, still controversial now, but it's almost 167 00:10:08,080 --> 00:10:12,520 Speaker 3: normal now. There are Clearview contracts every single day essentially, 168 00:10:12,600 --> 00:10:16,160 Speaker 3: where local cops are already using that, sheriff's offices are 169 00:10:16,240 --> 00:10:18,880 Speaker 3: using that, and ICE is absolutely using it as well.
170 00:10:19,080 --> 00:10:22,560 Speaker 3: With this latest contract, ICE spent this five million dollars 171 00:10:22,640 --> 00:10:25,480 Speaker 3: in part because it believes it can use facial recognition 172 00:10:25,520 --> 00:10:29,199 Speaker 3: technology to unmask the people that it believes are, quote, 173 00:10:29,200 --> 00:10:30,760 Speaker 3: assaulting officers. 174 00:10:31,640 --> 00:10:36,120 Speaker 1: So that's facial recognition. There's also location tracking. Talk about Flock. 175 00:10:36,720 --> 00:10:40,240 Speaker 3: Flock is an interesting company. A lot of people may 176 00:10:40,280 --> 00:10:43,000 Speaker 3: have actually heard about it because it has proliferated around 177 00:10:43,040 --> 00:10:46,480 Speaker 3: local communities. And what they do is they sell these 178 00:10:46,559 --> 00:10:51,640 Speaker 3: cameras to police departments or homeowner associations or neighborhoods or 179 00:10:51,679 --> 00:10:56,040 Speaker 3: city governments or whatever. And these cameras, they sit stationary 180 00:10:56,240 --> 00:11:00,160 Speaker 3: at the roadside and they continuously scan every vehicle that 181 00:11:00,200 --> 00:11:02,360 Speaker 3: goes by. That will be the model of the car, 182 00:11:02,679 --> 00:11:06,240 Speaker 3: the brand, the color, and most importantly the license plate. 183 00:11:06,640 --> 00:11:10,680 Speaker 3: And this creates this vast surveillance network that law enforcement 184 00:11:10,720 --> 00:11:15,040 Speaker 3: officers can tap into without a warrant. Typically you enter 185 00:11:15,720 --> 00:11:18,920 Speaker 3: XYZ license plate, it shows you all the locations this 186 00:11:19,040 --> 00:11:20,520 Speaker 3: vehicle has been, and... 187 00:11:20,480 --> 00:11:22,000 Speaker 1: Do you have to be law enforcement, or can anyone 188 00:11:22,080 --> 00:11:22,320 Speaker 1: use this? 189 00:11:22,559 --> 00:11:25,920 Speaker 3: If you're a homeowners association, it's going to be the authorized 190 00:11:26,000 --> 00:11:30,200 Speaker 3: user of that particular homeowners association or that neighborhood or something. 191 00:11:30,200 --> 00:11:32,480 Speaker 3: It's similar to a Ring camera in a way, but 192 00:11:32,520 --> 00:11:33,920 Speaker 3: you're not going to be able to access all of 193 00:11:33,920 --> 00:11:34,240 Speaker 3: the data. 194 00:11:34,280 --> 00:11:36,760 Speaker 1: So I couldn't, if I was a stalker and knew my 195 00:11:36,960 --> 00:11:39,920 Speaker 1: ex's number plate, track them around the country 196 00:11:39,960 --> 00:11:42,240 Speaker 1: as an individual person. But as a member of law enforcement, 197 00:11:42,280 --> 00:11:42,600 Speaker 1: I could. 198 00:11:43,120 --> 00:11:46,160 Speaker 3: Yes. Essentially, yes. And law enforcement have access to this 199 00:11:46,240 --> 00:11:49,640 Speaker 3: sort of additional capability, which is the national lookup tool, 200 00:11:49,920 --> 00:11:53,480 Speaker 3: where a cop, say in Texas, can search cameras in 201 00:11:53,600 --> 00:11:57,400 Speaker 3: Illinois or California. And that's not even a hypothetical example. 202 00:11:57,440 --> 00:12:00,920 Speaker 3: We actually reported that a law enforcement office in Texas 203 00:12:01,280 --> 00:12:04,679 Speaker 3: searched for a woman who self-administered an abortion, and 204 00:12:04,720 --> 00:12:09,200 Speaker 3: it searched cameras all over the country. And crucially, we 205 00:12:09,360 --> 00:12:13,160 Speaker 3: found cops were performing lookups for ICE in this tool.
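To make that lookup mechanism concrete, here is a minimal sketch of a plate query across pooled camera reads, using a throwaway SQLite table. Flock's actual system and schema are not public, so every table name, field, and row here is hypothetical.

```python
# Minimal sketch of a license-plate-reader lookup over pooled camera
# reads. Schema and data are hypothetical, not Flock's real system.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE plate_reads (
        plate     TEXT,  -- normalized license plate
        camera_id TEXT,  -- which stationary camera saw it
        city      TEXT,
        state     TEXT,
        seen_at   TEXT   -- ISO 8601 timestamp
    )
""")
conn.executemany(
    "INSERT INTO plate_reads VALUES (?, ?, ?, ?, ?)",
    [
        ("XYZ1234", "cam-41", "Dallas", "TX", "2025-05-01T08:12:00"),
        ("XYZ1234", "cam-07", "Chicago", "IL", "2025-05-03T19:45:00"),
        ("ABC9876", "cam-41", "Dallas", "TX", "2025-05-01T08:13:00"),
    ],
)

def lookup(plate):
    """Every time and place this plate was seen, newest first. Pooling
    reads from many agencies into one query is the 'national lookup'."""
    return conn.execute(
        "SELECT seen_at, city, state, camera_id FROM plate_reads "
        "WHERE plate = ? ORDER BY seen_at DESC",
        (plate,),
    ).fetchall()

for row in lookup("XYZ1234"):
    print(row)  # e.g. ('2025-05-03T19:45:00', 'Chicago', 'IL', 'cam-07')
```

Even at this toy scale the privacy problem is visible: one query over pooled reads reconstructs a vehicle's movement history, with no warrant step anywhere in the code path.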
206 00:12:13,200 --> 00:12:15,680 Speaker 3: We got these spreadsheets that said all the reasons why, 207 00:12:15,960 --> 00:12:18,240 Speaker 3: and you scroll through them and some just say ICE, 208 00:12:18,440 --> 00:12:22,520 Speaker 3: some say immigration or immigration enforcement. Immigration protest was another 209 00:12:22,559 --> 00:12:27,040 Speaker 3: one as well. And that was already really interesting, 210 00:12:27,160 --> 00:12:30,640 Speaker 3: and Flock made serious changes after we revealed that. But 211 00:12:31,080 --> 00:12:33,559 Speaker 3: we then also learned that ICE actually had direct access 212 00:12:33,600 --> 00:12:35,520 Speaker 3: as well. They didn't even have to go through the cops, 213 00:12:35,559 --> 00:12:39,040 Speaker 3: and that's apparently stopped now. But ICE were getting access 214 00:12:39,040 --> 00:12:42,199 Speaker 3: to that system through a side door via local police, 215 00:12:42,240 --> 00:12:45,080 Speaker 3: but then also just direct access as well. 216 00:12:44,960 --> 00:12:49,320 Speaker 1: So you've got facial recognition, license plate tracking, and then we have 217 00:12:49,480 --> 00:12:52,360 Speaker 1: this sort of system that sits even above, which is 218 00:12:52,400 --> 00:12:56,199 Speaker 1: the investigative case management system. You mentioned Palantir earlier. Is 219 00:12:56,200 --> 00:12:57,280 Speaker 1: this a Palantir product? 220 00:12:57,960 --> 00:13:03,360 Speaker 3: So it started as a DHS project, and it's basically a 221 00:13:03,400 --> 00:13:06,719 Speaker 3: centralized database, and you look into it and you can 222 00:13:07,000 --> 00:13:10,319 Speaker 3: look up a person, that is, their name, Social Security number, 223 00:13:10,600 --> 00:13:14,480 Speaker 3: maybe identifying features like tattoos or scars, that sort of thing. 224 00:13:14,880 --> 00:13:18,920 Speaker 3: Palantir then started work on that, and this was years 225 00:13:18,920 --> 00:13:20,520 Speaker 3: and years ago. They've been doing this for something like 226 00:13:20,600 --> 00:13:22,920 Speaker 3: ten years at this point. But then with the second 227 00:13:22,920 --> 00:13:28,000 Speaker 3: Trump administration, something changed, and it became clear that ICE, 228 00:13:28,040 --> 00:13:31,480 Speaker 3: and DHS more broadly, needed this tool to be a 229 00:13:31,559 --> 00:13:34,440 Speaker 3: bit more powerful, which is where they brought in Palantir 230 00:13:34,520 --> 00:13:37,439 Speaker 3: to actually take over the development, not just the maintenance, 231 00:13:37,480 --> 00:13:42,240 Speaker 3: of this project. And that's how we come to ImmigrationOS, which, 232 00:13:42,480 --> 00:13:45,840 Speaker 3: according to material I got leaked from inside Palantir, is 233 00:13:45,920 --> 00:13:48,880 Speaker 3: designed to make it much easier to find the physical 234 00:13:48,960 --> 00:13:53,040 Speaker 3: location of aliens, of course the US government's term for 235 00:13:53,240 --> 00:13:58,440 Speaker 3: immigrants, essentially. We don't know all of the different data 236 00:13:58,440 --> 00:14:02,720 Speaker 3: that's going into this system, but Palantir, the 237 00:14:02,840 --> 00:14:06,400 Speaker 3: entire purpose of their existence, is to match together all 238 00:14:06,400 --> 00:14:08,520 Speaker 3: of these different sorts of data from all over the 239 00:14:08,520 --> 00:14:12,319 Speaker 3: place and make it intelligible and readable for government customers.
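As an illustration of that kind of matching, here is a minimal sketch of record linkage: chaining joins across unrelated datasets on shared identifiers to assemble one profile. Nothing here reflects Palantir's actual code, and every dataset and field is hypothetical.

```python
# Minimal sketch of record linkage: joining rows from separate,
# hypothetical datasets on shared identifiers to build one profile.
dmv_records = [
    {"ssn": "123-45-6789", "name": "J. Doe", "plate": "XYZ1234"},
]
lease_records = [
    {"ssn": "123-45-6789", "address": "742 Example Ave"},
]
utility_records = [
    {"address": "742 Example Ave", "service_active": True},
]

def build_profile(ssn):
    """Chain joins: SSN -> DMV row -> lease address -> utility status.
    Each dataset alone says little, but linked together they point to
    where a person currently is."""
    profile = {"ssn": ssn}
    for row in dmv_records:
        if row["ssn"] == ssn:
            profile["name"] = row["name"]
            profile["plate"] = row["plate"]
    for row in lease_records:
        if row["ssn"] == ssn:
            profile["address"] = row["address"]
    for row in utility_records:
        if row["address"] == profile.get("address"):
            profile["address_active"] = row["service_active"]
    return profile

print(build_profile("123-45-6789"))
# {'ssn': '123-45-6789', 'name': 'J. Doe', 'plate': 'XYZ1234',
#  'address': '742 Example Ave', 'address_active': True}
```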
240 00:14:12,520 --> 00:14:15,480 Speaker 3: So we can assume that ImmigrationOS is probably bringing 241 00:14:15,559 --> 00:14:18,240 Speaker 3: data from all over the place into an interface that 242 00:14:18,480 --> 00:14:21,600 Speaker 3: ICE or DHS can use to, I mean, essentially track 243 00:14:21,640 --> 00:14:23,320 Speaker 3: down people, at least according to some of the material. 244 00:14:23,440 --> 00:14:28,120 Speaker 1: It's interesting. One of my dear friends and board 245 00:14:28,120 --> 00:14:32,080 Speaker 1: members at Kaleidoscope is a man called Stuart Karle, who's 246 00:14:32,120 --> 00:14:34,600 Speaker 1: a First Amendment lawyer. He was actually the general counsel 247 00:14:34,680 --> 00:14:36,720 Speaker 1: for the Wall Street Journal for many years, and now 248 00:14:36,760 --> 00:14:42,239 Speaker 1: he teaches at Columbia Journalism School, and he advised students 249 00:14:43,120 --> 00:14:47,040 Speaker 1: who are not US citizens basically to avoid posting on social media 250 00:14:47,160 --> 00:14:50,640 Speaker 1: about political matters, at least until after they graduate. 251 00:14:50,920 --> 00:14:53,360 Speaker 1: I think his point was, get the degree that you've 252 00:14:53,400 --> 00:14:55,560 Speaker 1: come for, and then decide where you want to draw 253 00:14:55,600 --> 00:14:57,720 Speaker 1: your line. He got a lot of blowback for that, 254 00:14:57,840 --> 00:15:00,720 Speaker 1: and there was debate and a New York Times story about it. 255 00:15:00,120 --> 00:15:04,200 Speaker 1: But this idea of second-guessing what you post on 256 00:15:04,240 --> 00:15:07,440 Speaker 1: social media in the current political environment, I mean, how 257 00:15:07,480 --> 00:15:12,680 Speaker 1: programmatic is social media scraping and analysis versus opportunistic, when they 258 00:15:12,720 --> 00:15:14,440 Speaker 1: already want to build a case against somebody? 259 00:15:14,960 --> 00:15:17,760 Speaker 3: Yeah, they do it in several different ways. But broadly, 260 00:15:18,320 --> 00:15:22,600 Speaker 3: you have right-wing agitators who are constantly going through 261 00:15:22,640 --> 00:15:26,480 Speaker 3: social media, Twitter, Instagram, whatever. They see something they don't like, 262 00:15:26,880 --> 00:15:30,720 Speaker 3: they flag it to the administration. And we have several 263 00:15:31,000 --> 00:15:35,320 Speaker 3: very high-profile right-wing social media personalities who are 264 00:15:35,360 --> 00:15:37,640 Speaker 3: clearly doing that, and they're admitting as much on their 265 00:15:37,680 --> 00:15:41,840 Speaker 3: own social media. The other side are these technological tools, 266 00:15:42,320 --> 00:15:45,240 Speaker 3: which ICE has, Customs and Border Protection has, and the 267 00:15:45,280 --> 00:15:48,560 Speaker 3: State Department I believe has as well, and these are, 268 00:15:48,800 --> 00:15:52,840 Speaker 3: broadly speaking, AI-powered. They say they can make decisions 269 00:15:52,960 --> 00:15:56,360 Speaker 3: or inferences based on somebody's profile, what they've posted, the 270 00:15:56,440 --> 00:15:58,400 Speaker 3: location they've done it from, and all of that sort 271 00:15:58,400 --> 00:16:02,840 Speaker 3: of thing. Again, AI is already a black box, and we're also trying 272 00:16:02,840 --> 00:16:05,320 Speaker 3: to look at that black box from the outside, so 273 00:16:05,360 --> 00:16:07,960 Speaker 3: it's doubly opaque about what is actually going on.
But 274 00:16:08,040 --> 00:16:10,640 Speaker 3: what we do know is that these government agencies 275 00:16:10,640 --> 00:16:14,400 Speaker 3: spend millions and millions of dollars on social media surveillance tools. 276 00:16:14,440 --> 00:16:17,280 Speaker 3: We just don't know exactly how it goes from post 277 00:16:17,600 --> 00:16:20,400 Speaker 3: to visa being taken away. But the threat is absolutely 278 00:16:20,400 --> 00:16:22,920 Speaker 3: there, that if you say something that the administration does 279 00:16:23,000 --> 00:16:26,600 Speaker 3: not agree with, there's every chance that some sort of 280 00:16:26,640 --> 00:16:30,840 Speaker 3: pretense will be used to revoke your immigration status. 281 00:16:31,160 --> 00:16:36,200 Speaker 1: You know, as this sort of ICE crisis roils the country, 282 00:16:36,240 --> 00:16:39,320 Speaker 1: there are certain courts that are fighting back. I mean, 283 00:16:39,360 --> 00:16:45,080 Speaker 1: recently in Chicago, one of the judges basically hauled in 284 00:16:45,160 --> 00:16:49,640 Speaker 1: the local head of Border Patrol and said, you know, 285 00:16:49,680 --> 00:16:51,400 Speaker 1: you have to come and report to me every single 286 00:16:51,480 --> 00:16:54,160 Speaker 1: day at six pm on what you're doing. You can't 287 00:16:54,200 --> 00:16:56,840 Speaker 1: throw tear gas into crowds where children are about to 288 00:16:56,920 --> 00:16:58,640 Speaker 1: arrive without warning, etc. 289 00:16:58,880 --> 00:16:59,040 Speaker 3: Etc. 290 00:16:59,400 --> 00:17:03,320 Speaker 1: And essentially kind of insisted on some kind of role 291 00:17:03,560 --> 00:17:08,639 Speaker 1: for the judiciary in managing the behavior, or the illegal behavior, 292 00:17:08,720 --> 00:17:12,040 Speaker 1: or the potentially illegal behavior, of some officials. How's that 293 00:17:12,080 --> 00:17:14,160 Speaker 1: playing out in the tech realm? 294 00:17:14,200 --> 00:17:17,720 Speaker 3: That's a very good question, actually. Yeah. And with the court stuff, obviously, 295 00:17:17,760 --> 00:17:20,960 Speaker 3: that is the American system working as intended, where you 296 00:17:20,960 --> 00:17:24,600 Speaker 3: have all of these different branches of government and all 297 00:17:24,640 --> 00:17:26,560 Speaker 3: of these other things, like the courts, and they provide 298 00:17:26,560 --> 00:17:31,800 Speaker 3: these checks and balances. That doesn't really exist for the 299 00:17:31,840 --> 00:17:37,480 Speaker 3: private surveillance tech and social media industry. We have had impact 300 00:17:37,720 --> 00:17:41,600 Speaker 3: through our reporting. Again, our reporting on Flock ended up 301 00:17:41,640 --> 00:17:46,080 Speaker 3: with the company removing California, Illinois, and I think a 302 00:17:46,119 --> 00:17:49,440 Speaker 3: couple more states from its national lookup tool, meaning ICE 303 00:17:49,480 --> 00:17:52,160 Speaker 3: can no longer access those through the police, and then 304 00:17:52,280 --> 00:17:56,040 Speaker 3: Flock also stopped its more direct sharing with ICE and 305 00:17:56,040 --> 00:17:59,119 Speaker 3: Customs and Border Protection as well. There has been some 306 00:17:59,240 --> 00:18:04,200 Speaker 3: pressure from Democrats on House committees looking into the facial 307 00:18:04,240 --> 00:18:07,560 Speaker 3: recognition side as well after our reporting.
That being said, 308 00:18:08,119 --> 00:18:12,000 Speaker 3: accountability looks very, very different in the second Trump administration, 309 00:18:12,400 --> 00:18:16,200 Speaker 3: where you have masked agents, you have officials physically assaulting 310 00:18:16,240 --> 00:18:18,840 Speaker 3: people in the corridors of courthouses in New York, 311 00:18:18,880 --> 00:18:22,159 Speaker 3: and then two days later they're back at work. Accountability 312 00:18:22,160 --> 00:18:25,679 Speaker 3: looks very, very different here. And for the tech stuff, 313 00:18:26,480 --> 00:18:31,440 Speaker 3: the accountability comes from the outside. It's not really from government, 314 00:18:31,480 --> 00:18:33,760 Speaker 3: because of course the government wants to work with these companies. 315 00:18:34,000 --> 00:18:36,920 Speaker 3: It can come from inside, where people are clearly annoyed 316 00:18:36,920 --> 00:18:40,240 Speaker 3: inside Palantir or Flock, and then they'll provide information to me. 317 00:18:41,160 --> 00:18:44,440 Speaker 3: But it looks very, very different to, as you say, 318 00:18:44,520 --> 00:18:46,960 Speaker 3: taking a senior official in front of a judge and 319 00:18:47,000 --> 00:18:49,720 Speaker 3: having him report every day. It's just a completely different environment. 320 00:18:56,119 --> 00:18:59,879 Speaker 1: After the break: the people fighting back against the ICE 321 00:19:00,200 --> 00:19:10,639 Speaker 1: surveillance state. Stay with us. There is also resistance tech. 322 00:19:11,320 --> 00:19:14,360 Speaker 1: I'm very curious to hear about ICEBlock. I mean, 323 00:19:14,720 --> 00:19:18,040 Speaker 1: our producer Eliza made an interesting point that, you know, 324 00:19:18,080 --> 00:19:22,480 Speaker 1: after the California fires, there was this great interest in 325 00:19:22,920 --> 00:19:26,680 Speaker 1: funding fire-watching apps and early warning systems, and there's 326 00:19:26,680 --> 00:19:31,400 Speaker 1: also been an explosion of apps around helping communities avoid ICE 327 00:19:31,480 --> 00:19:34,199 Speaker 1: roundups and stuff that, to be fair, 328 00:19:34,200 --> 00:19:37,159 Speaker 1: didn't receive the same funding attention. Talk a little bit 329 00:19:37,160 --> 00:19:40,760 Speaker 1: about these apps and the resistance to them. 330 00:19:41,320 --> 00:19:44,399 Speaker 3: Yeah. So ICEBlock is the most prominent one. It 331 00:19:44,440 --> 00:19:47,399 Speaker 3: was launched earlier this year. People only really started to 332 00:19:47,440 --> 00:19:51,240 Speaker 3: pay attention in around June, when CNN covered the app. 333 00:19:51,680 --> 00:19:54,800 Speaker 3: And all ICEBlock does is it allows people to 334 00:19:54,960 --> 00:19:58,680 Speaker 3: report sightings of ICE officials in their local proximity. It 335 00:19:58,800 --> 00:20:02,040 Speaker 3: uses the location data off your iPhone. It was only 336 00:20:02,119 --> 00:20:04,800 Speaker 3: on iPhone, it wasn't on Android, for privacy reasons, is 337 00:20:04,840 --> 00:20:07,920 Speaker 3: what the developer told me. And if you see an 338 00:20:07,920 --> 00:20:10,280 Speaker 3: ICE official or some sort of ICE raid within five 339 00:20:10,320 --> 00:20:12,720 Speaker 3: miles of you or whatever, you can report it, and 340 00:20:12,760 --> 00:20:16,240 Speaker 3: then other people around the area will receive a push alert.
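As an illustration of the mechanism just described, here is a minimal sketch of a proximity alert: a sighting is reported at a coordinate, and users within a five-mile radius are selected for a push notification, using the haversine great-circle distance. ICEBlock's actual implementation is not public, so the data and function names here are hypothetical.

```python
# Minimal sketch of a radius-based alert like the one described.
# Generic pattern with made-up coordinates; not ICEBlock's actual code.
import math

EARTH_RADIUS_MILES = 3959.0

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lambda = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lambda / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

def users_to_alert(sighting, users, radius_miles=5.0):
    """Return ids of users whose last known location is within the radius."""
    lat, lon = sighting
    return [
        user_id
        for user_id, (u_lat, u_lon) in users.items()
        if haversine_miles(lat, lon, u_lat, u_lon) <= radius_miles
    ]

# Hypothetical data: a sighting in downtown LA and three nearby users.
sighting = (34.0522, -118.2437)
users = {
    "u1": (34.0407, -118.2468),  # ~1 mile away  -> alerted
    "u2": (34.1478, -118.1445),  # ~9 miles away -> not alerted
    "u3": (33.7701, -118.1937),  # Long Beach    -> not alerted
}
print(users_to_alert(sighting, users))  # ['u1']
```

A production system would look up nearby users with a geospatial index rather than scanning everyone, but the distance test is the core of the feature.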
341 00:20:16,280 --> 00:20:22,080 Speaker 3: It's really similar to Ring, the Neighbors app, or even 342 00:20:22,320 --> 00:20:24,520 Speaker 3: Citizen or something like that, or the fire-watching 343 00:20:24,560 --> 00:20:28,320 Speaker 3: apps you mentioned as well. As time goes on, the DOJ 344 00:20:28,480 --> 00:20:31,520 Speaker 3: does not like that CNN has reported this and amplified it. 345 00:20:31,840 --> 00:20:34,400 Speaker 3: The DOJ even says it's trying to find a way 346 00:20:34,440 --> 00:20:37,600 Speaker 3: to charge CNN. Obviously that didn't come to anything, at 347 00:20:37,680 --> 00:20:41,080 Speaker 3: least yet. But fast forward to September, and there is 348 00:20:41,160 --> 00:20:44,440 Speaker 3: a horrible shooting at an ICE facility, where a gunman 349 00:20:44,840 --> 00:20:48,600 Speaker 3: shoots at the facility and he hits three detainees. I 350 00:20:48,640 --> 00:20:53,000 Speaker 3: believe two died and one survived. When authorities looked into 351 00:20:53,040 --> 00:20:56,520 Speaker 3: that person, they looked at his phone and they found 352 00:20:56,520 --> 00:21:01,400 Speaker 3: that that person was searching for ICE-spotting apps, explicitly 353 00:21:01,840 --> 00:21:06,320 Speaker 3: ICEBlock. The DOJ and Pam Bondi, the Attorney General, then 354 00:21:06,520 --> 00:21:10,200 Speaker 3: use that as a justification to go to Apple and say, look, 355 00:21:10,240 --> 00:21:13,600 Speaker 3: you have to remove this app, this is encouraging violence 356 00:21:14,040 --> 00:21:17,640 Speaker 3: against ICE officials. Putting aside the fact that obviously ICE 357 00:21:17,640 --> 00:21:19,360 Speaker 3: officials are going to be at an ICE facility, I don't 358 00:21:19,359 --> 00:21:22,520 Speaker 3: think you need an app to determine that. But Apple 359 00:21:22,880 --> 00:21:26,040 Speaker 3: gives in, and it removes not just that app but 360 00:21:26,080 --> 00:21:29,520 Speaker 3: a host of other ones at the explicit request of 361 00:21:29,960 --> 00:21:32,919 Speaker 3: the US Department of Justice. Now, Apple removes and Google 362 00:21:32,920 --> 00:21:36,879 Speaker 3: removes apps every single day for doing sketchy or weird 363 00:21:36,960 --> 00:21:42,359 Speaker 3: or misleading stuff. It is an entirely different thing for 364 00:21:42,400 --> 00:21:45,520 Speaker 3: the US government to go and say, remove this app, 365 00:21:45,760 --> 00:21:49,680 Speaker 3: especially when that app was actually facilitating First Amendment-protected 366 00:21:49,720 --> 00:21:52,879 Speaker 3: speech as well. It is not illegal to report where 367 00:21:53,359 --> 00:21:56,160 Speaker 3: law enforcement are. That's a core tenet of the First Amendment. 368 00:21:56,880 --> 00:21:58,560 Speaker 1: Yeah, it's kind of extraordinary. I wonder how people will 369 00:21:58,680 --> 00:22:03,679 Speaker 1: look back on this time. I mean, obviously, people who 370 00:22:03,720 --> 00:22:07,160 Speaker 1: are rich and powerful like getting richer and more powerful. 371 00:22:07,280 --> 00:22:12,880 Speaker 1: But the capitulation every step of the way of private companies, 372 00:22:13,240 --> 00:22:18,400 Speaker 1: especially tech companies, to these requests and mandates, it's quite extraordinary. 373 00:22:18,440 --> 00:22:22,000 Speaker 1: I wonder how it will be judged by history. 374 00:22:22,760 --> 00:22:25,720 Speaker 3: Yeah.
I remember in the first Trump administration, I feel 375 00:22:25,720 --> 00:22:29,800 Speaker 3: like there was so much more pushback from tech companies, 376 00:22:29,880 --> 00:22:32,600 Speaker 3: you know, even surveillance companies as well. I think Amazon 377 00:22:32,680 --> 00:22:37,600 Speaker 3: stopped selling its facial recognition technology to law enforcement agencies 378 00:22:37,600 --> 00:22:41,080 Speaker 3: for a period of time as well. That has completely 379 00:22:41,160 --> 00:22:45,200 Speaker 3: gone now. It is absolute capitulation to the government. It 380 00:22:45,320 --> 00:22:48,720 Speaker 3: is doing whatever they want. It is not really pushing back. 381 00:22:48,760 --> 00:22:51,120 Speaker 3: I mean, Apple removes that app, and a few 382 00:22:51,119 --> 00:22:53,399 Speaker 3: weeks earlier, or whatever it was, Tim Cook was giving 383 00:22:53,680 --> 00:22:57,680 Speaker 3: Trump a golden statue of the Apple logo or whatever. 384 00:22:58,080 --> 00:23:02,080 Speaker 3: Another example is that DHS right now is using a 385 00:23:02,080 --> 00:23:05,880 Speaker 3: lot of memes from the Halo video game franchise which 386 00:23:05,920 --> 00:23:11,359 Speaker 3: are dehumanizing immigrants and basically comparing them to, like, an 387 00:23:11,400 --> 00:23:15,840 Speaker 3: alien parasite. Microsoft hasn't responded to that. Microsoft hasn't, you know, 388 00:23:16,480 --> 00:23:19,240 Speaker 3: asked them to take it down. It's an entirely different 389 00:23:19,320 --> 00:23:22,520 Speaker 3: environment now than it was in the first Trump administration 390 00:23:22,560 --> 00:23:25,360 Speaker 3: when it comes to tech, and even just business, right, 391 00:23:25,480 --> 00:23:26,840 Speaker 3: and private companies. 392 00:23:27,480 --> 00:23:30,360 Speaker 1: So who's fighting back? I mean, there's Joshua Aaron, who 393 00:23:30,480 --> 00:23:33,560 Speaker 1: founded the ICEBlock app. Where is the resistance to this? 394 00:23:33,640 --> 00:23:36,320 Speaker 1: And also, what can normal people do in their daily 395 00:23:36,359 --> 00:23:39,600 Speaker 1: lives to protect themselves from the reach of some of 396 00:23:39,600 --> 00:23:41,920 Speaker 1: these technologies and keep themselves safe? 397 00:23:42,240 --> 00:23:44,400 Speaker 3: Yeah, I mean, there's a lot of different things people 398 00:23:44,440 --> 00:23:47,680 Speaker 3: are doing. There are other apps similar to ICEBlock. 399 00:23:47,760 --> 00:23:50,680 Speaker 3: There was one called Eyes Up, which was just cataloging 400 00:23:51,280 --> 00:23:54,280 Speaker 3: videos of ICE abuses. It wasn't even tracking the physical 401 00:23:54,320 --> 00:23:58,200 Speaker 3: location of people. That developer, he only gave me 402 00:23:58,359 --> 00:24:01,679 Speaker 3: his first name, Mark, but he's doing work in his 403 00:24:01,720 --> 00:24:02,080 Speaker 3: own way 404 00:24:02,080 --> 00:24:02,359 Speaker 3: there. 405 00:24:02,560 --> 00:24:05,440 Speaker 3: You even have states, like New York State is basically 406 00:24:05,440 --> 00:24:07,919 Speaker 3: building its equivalent of that, which is, hey, if you 407 00:24:08,040 --> 00:24:12,600 Speaker 3: have videos of abuses by ICE, send them to us, 408 00:24:12,840 --> 00:24:15,960 Speaker 3: like there will be consequences for this later. So people 409 00:24:16,040 --> 00:24:19,000 Speaker 3: right now are thinking about, well, how do we collect 410 00:24:19,080 --> 00:24:24,000 Speaker 3: evidence for accountability later?
Like, it can't happen now. There's 411 00:24:24,080 --> 00:24:26,920 Speaker 3: no real way to hold an ICE agent who maybe 412 00:24:27,000 --> 00:24:30,719 Speaker 3: detains a US citizen unlawfully to account, because they're masked, 413 00:24:30,760 --> 00:24:33,600 Speaker 3: and there's this complete culture of, we're just going to 414 00:24:33,640 --> 00:24:36,119 Speaker 3: do whatever we want. But people are preparing for that. 415 00:24:36,440 --> 00:24:38,119 Speaker 3: As for the second part of your question, and what 416 00:24:38,160 --> 00:24:41,199 Speaker 3: people can do to protect their privacy, I mean, it 417 00:24:41,240 --> 00:24:45,280 Speaker 3: really depends on the technology, but I would sort of 418 00:24:45,280 --> 00:24:47,119 Speaker 3: suggest some of the normal stuff when it comes to 419 00:24:47,160 --> 00:24:50,560 Speaker 3: tracking mobile phones, which is: keep the phone fully up 420 00:24:50,560 --> 00:24:56,360 Speaker 3: to date, uninstall sketchy apps, don't give location permissions. Of course, 421 00:24:56,440 --> 00:24:58,600 Speaker 3: there's lots more you could do, and it's going to 422 00:24:58,640 --> 00:25:01,040 Speaker 3: depend upon your individual use case. Are you a 423 00:25:01,160 --> 00:25:04,640 Speaker 3: lawyer defending immigrants, are you a journalist, are you an 424 00:25:04,640 --> 00:25:07,800 Speaker 3: activist, whatever. So people should do their own research. But 425 00:25:07,920 --> 00:25:10,480 Speaker 3: there are things that people can do if they think 426 00:25:10,480 --> 00:25:11,119 Speaker 3: it's appropriate. 427 00:25:13,480 --> 00:25:15,200 Speaker 1: Joseph, thank you. Thank you so much. 428 00:25:41,760 --> 00:25:43,560 Speaker 2: That's it for this week for Tech Stuff. 429 00:25:43,560 --> 00:25:46,439 Speaker 1: I'm Cara Price and I'm Oz Woloshyn. This episode was 430 00:25:46,440 --> 00:25:50,320 Speaker 1: produced by Eliza Dennis, Tyler Hill, and Melissa Slaughter. It 431 00:25:50,400 --> 00:25:53,720 Speaker 1: was executive produced by me, Cara Price, Julia Nutter, and 432 00:25:53,800 --> 00:25:57,920 Speaker 1: Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. 433 00:25:58,640 --> 00:26:02,400 Speaker 1: Jack Insley mixed this episode. Kyle Murdoch wrote our theme song. 434 00:26:02,720 --> 00:26:04,920 Speaker 2: Join us on Friday for The Week in Tech, where 435 00:26:04,960 --> 00:26:06,560 Speaker 2: we'll run through the headlines 436 00:26:06,119 --> 00:26:08,879 Speaker 1: you need to follow. And please do rate and review 437 00:26:08,920 --> 00:26:11,240 Speaker 1: the show, and reach out to us at tech stuff 438 00:26:11,280 --> 00:26:14,160 Speaker 1: podcast at gmail dot com. We want to hear from you.