Speaker 1: Cool Zone Media.

Speaker 2: Hey everybody, Robert Evans here. Welcome back to It Could Happen Here, a podcast about things falling apart and coping with dystopia. And one of the first signs of our dystopia coming to be was the establishment of the TSA, and the Department of Homeland Security more broadly. Obviously, DHS is much more problematic than just the TSA. We did a couple of episodes on them with Behind the Bastards back in the day. But you know, the TSA came to be right after nine eleven, and both its establishment, you know, the seeming necessity of it, and the kind of impositions into personal privacy that it made commonplace were both harbingers of the very fucked up era we find ourselves in now. And the TSA is an interesting law enforcement agency to me from the perspective of a normal person.
I think they're kind of the least objectionable of our federal law enforcement agencies, right? They at least, I should say, of all of the cops that we have in this country, they're the ones you're least likely to have a serious problem with, right? Like, they're not real cops in the way that most cops are. They don't, like, ticket or arrest you generally, unless you're one of the startling number of Americans who gets caught with a loaded handgun trying to go through airport security. Mostly, and I've flown way more often than I want to remember, mostly my experience with TSA agents is they check your ID, you know, they stare at an X-ray machine where your shit goes through it. Sometimes they alert and swab some stuff. But you know, for me, it's usually not that big a deal. For most people I know, it's not that big a deal. Obviously, you know, the degree to which it's a problem is going to vary widely depending on whether or not you're a white dude. But that said, still less potential for things going horribly, violently wrong than with a lot of police interactions.
So I'll give them that. And it's interesting to me that, kind of given this fact, the TSA is so hated. Not by, I think, most Americans. I think we're all kind of frustrated by them. We know, you know, they're not great at their jobs. They get caught in tests letting shit through all the time. It's kind of a pain in the butt. But there's a chunk of Americans who fucking hate the TSA, and they hate it because of how invasive it is. And it's a little weird if you're a regular person, because going through airport security is still less invasive than, like, applying for an apartment, which a lot of people do more regularly than they fly, or taking a trip to the DMV, which again a lot of people do more regularly than they fly. But people, you know, with money, upper middle class and rich people, that's where you really get most of the hate for the TSA.
Now, obviously there's some from principled libertarians, and I tend to think they have a point there, but there's a lot of people who really hate the TSA specifically because it's kind of the only law enforcement friction they deal with on a day-to-day basis. You know, they live in a neighborhood where they're not getting pulled over. You know, the cops' job is not to fuck with the people who have money. So the only time they're going to get patted down and deal with that thing that is a pretty common experience for Americans in a lot of cities is when they go through the airport, when they fly. And they also fly often because they have more money. So I find myself in this interesting position when reporting on the TSA of: there's real abuses there, there's a lot of real threats there. The fact that they do get so much up in our business and that we've normalized it is an issue. And at the same time, like, I always know when I do something like this, the people who get angriest about whatever I write about the TSA are going to be the worst pieces of shit in the country.
So it's a fun balancing act. Now, obviously, as I'm trying not to gloss over, there are some really good reasons to be pissed at the TSA, like this twenty fifteen story from Denver of several agents who were caught running a groping scam. Basically, one female crew member would point out the men that she found attractive, and a colleague would single that person out for a pat down, and they'd basically say, like, oh, it's alerting on something around your groin or inner thighs, so that, like, she could fondle them, essentially. Now, these people got fired. The TSA went after them when they got caught, but who knows how many people they groped in the interim period. Video in twenty twenty three caught TSA agents at Miami International Airport stealing from passenger bags in the security lines. There's, like, footage of it. Obviously, the people who do this are going to be very, very stupid, because, like, you know, as a TSA agent, there's cameras all over the place. You're in the fucking TSA. And the video is just this guy, like, reaching his hand into a bag, pulling out cash.
It's not hard to find headlines that are similar, though, going back about as long as the TSA has existed. What interests me more are the massive violations of privacy, and the role the TSA, and their normalization of those invasions of privacy, has had in the expansion of the surveillance state. So this year at CES twenty twenty four, when we found out the TSA had a booth and they were there to talk about their new facial recognition scanners, Garrison and I had to go check it out, and the interview that we conducted is going to kind of be the heart of this episode. But I wanted to get over a little bit more of a preamble first. So the TSA started testing facial recognition scanners on a voluntary basis at sixteen domestic airports in, I think, twenty twenty two. They expanded it to twenty five airports or so last year, twenty twenty three. They are in twenty seven now, according to what we were told by a representative, and the goal is for this technology to go nationwide. Obviously, not everyone is thrilled with that idea.
And I'm going to quote from a June twenty twenty three article on CBS News. Five US senators sent a letter demanding the TSA halt the program. "You don't have to compromise people's biometric security in order to provide physical security at airports," said Senator Ed Markey. Pekoske, who is the TSA administrator, says he agrees with the senators and that he wants to protect privacy for every passenger: "I want to deploy technology that's accurate and doesn't disadvantage anybody." Privacy advocates worry about the lack of regulations around facial recognition and its tendency to be less accurate with people of color. Most images are deleted after use, but some information is encrypted and retained for up to twenty four months as part of the ongoing review of how the technology performs. What's left out of that article is that the TSA is also allowed to maintain biometric data taken from non-citizens, people entering the country from foreign countries, migrants and the like, and they're able to keep that and share it.
It's kind of unclear the extent of that, but they're not bound by the same rules with those people that they are for citizens. And there are other issues as well, as we'll get into. So Garrison and I were excited to have a chance to chat with a TSA representative. This guy was less excited to see us, and we will get into that story. But before we do, it's time for an ad break.

And we're back. So Garrison and I show up at the CES booth, and it's kind of a small one. You might imagine it's about the size of, like, three normal office cubicles. Maybe there's a couple of tables. There's, like, a little podium thing in the front that's got their logo on it. They have some stickers, including one that's, like, "peanut butter is a liquid," and it's a cartoon of peanut butter, which, I was informed by one of the TSA people when I commented on it, that yes, peanut butter is a liquid. Which is one of those things that's both absurd and also, like, well, actually, I don't know how else you'd categorize peanut butter if you're the TSA.
So I guess it's something I can't have much of an issue with. A cream? But is a cream different from a liquid? I don't know. That's for the philosophers to decide. So Garrison and I come up to this booth and there's a guy standing behind the podium. Now, the way stuff works at CES is you have generally a mix of actual officers from the company, sometimes it'll be like a CEO or an executive in the case of a smaller company, other times it'll be regular employees or, like, engineers and stuff who can answer technical questions, and then a bunch of... Most of the people that you talk to are, like, PR reps. I don't know who the guy was that was at the booth when we first showed up, because as soon as we said we wanted to talk about their facial recognition cameras, he saw we were media, and he instantly did the PR equivalent of throwing his buddy on a grenade. He, like, backed off. He was like, let me get something for you. He pulled his coworker over and then he fucking vanished. And I'm going to play you a little bit of audio of that.
Speaker 3: We're interested in what you have here in terms of facial recognition.

Speaker 4: It's the CAT-2 right over there.

Speaker 3: Yeah, yeah, absolutely. What are you saying? Because I know right now, like, if you've got PreCheck or, oh gosh, what's the other one, the independent company...

Speaker 4: That we're live on a live interview?

Speaker 3: Now, I thought so. I mean, he came up to ask to talk to a TSA...

Speaker 4: Okay, sure then. Yeah, sure, hang on just a second.

Speaker 2: So the guy we found ourselves in front of was one Carter Langston, who is actually the press secretary of the TSA, and by God, I don't know if I've ever seen a man less happy to see me. Eventually we started talking, and I have to give Carter credit for professionalism. His eyes said, "I despise you both on principle, and I am enraged to be doing this," but his voice remained calm and even, and his responses were, to be quite honest, pretty polished.
Speaker 3: I'm kind of interested in sort of how you see this altering the way we do air travel over the next five, ten years, right? Because obviously, right now people are using facial recognition if they have PreCheck or they have, I'm spacing the name, but you know, the independent company that does get you past the line and stuff. Like, they do, like, facial recognition when you are in the airport. Is this something that you see as coming more broadly to, like, everybody going through security in the future?

Speaker 4: Eventually, eventually.

Speaker 5: So right now it's at twenty seven participating airports, and it's not at every single, we call them Travel Document Checker podiums. It's not at every single Travel Document Checker podium, TDC, for sure, but it is growing and becoming... We're deploying more and more as funding becomes available. We're using facial recognition to identify passengers. It's a significant improvement over the
previous way we were identifying passengers, just with the human interaction: looking at an ID credential, and looking at it based on what that individual knows about the fifty states and territories and what their credentials look like. The technology actually takes that over and does a much better job of validating the authenticity of that ID. And then the facial recognition component, with a still image, taking a picture of the passenger standing in front of the Travel Document Checker podium and then matching that picture of that person standing there live against the credential photo, and making the match that way. So we know that the credential is valid, it's true, it's accurate. We know that the person standing there is also the person on the credential. We can verify the boarding status and the screening status of that individual and can provide them with where they should go next for their screening, because the officer is able to discern, based on all of the information provided at the back end of the monitor that they reviewed.
They can see all of those items have been checked, there's a boarding status, and there's a screening status, and then just tells the passenger where to go to follow up for screening.

Speaker 3: So I'm interested in how someone, how someone becomes basically enrolled in this, right? Because my assumption is, at this point, just the picture you get when you're getting your driver's license or your state ID from the DMV is not sufficient for a facial recognition system, right? Simply having the picture in the government space. You need to have somebody, like, get their face scanned and their irises scanned or something like that in order to have them in the system.

Speaker 4: Right, it's not in the system at all.

Speaker 5: So with the way we're rolling these out at airports and checkpoint lines is, once a passenger has been identified and goes into screening, all of the information that was captured is gone. We don't store any of the pictures. Participation right now, it's completely voluntary.

Speaker 3: Uh...
Speaker 5: There's signage right there at the checker podium, uh, to indicate that passengers can opt out. They don't have to participate in it at all. All that happens at that point is the same officer will, uh, will turn basically over to the alternative process: review the ID, review the boarding pass, and allow the passenger to continue. Too easy. So they don't lose their place in line, and they don't, they're not delayed in any way from getting screened.

Speaker 3: But in terms of, like, the people who choose to use it, right? So if you, if you're in this system, is it literally just comparing your face to the face on the ID? You're not, like, enrolled separately the way you are if you are in, like, PreCheck or somewhere?

Speaker 4: Okay. No, it's not.

Speaker 5: There's not a database associated with that, and so no, that's, that's not our use. Now, some of the airlines have partnered with us, they saw benefit in it, and they're using similar technology as a backdrop for their frequent flyer miles program participants.
So that is, there's a database associated with that, obviously, and so those passengers, of course, have an entirely different experience. But the way that we're using it at the checkpoint, as I just said, is for identity verification.

Speaker 2: Goddamn. You caught that, right? How he says participation is voluntary "right now"? Well, I hadn't come across this information at the time, but afterwards I read a fascinating Washington Post article from last summer about Oregon Senator Jeff Merkley. Jeff Merkley says that when he was trying to make a flight at Reagan National Airport, he was told that if he didn't verify his ID via face scanner, he would face a significant delay. Quote from the article: "There was no delay, the spokeswoman said; the senator showed his photo ID to the TSA agent and cleared security." So basically, he was lied to. Somebody lied and said, you're going to, if you don't want to be, like, delayed and maybe miss your flight, you have to submit to a face scan, which is one of the things that privacy advocates were worried about from the beginning.
But you know what privacy advocates aren't worried about? The products and services that support this podcast.

And we're back. So I think, when it gets right down to it, the silliest part of all of this to me is that the TSA isn't even claiming they need to run faces through, like, some futuristic database of terrorists, right? Like, they want to scan our faces so they know if this, like, bad guy they're tracking is in the airport in disguise using a fake passport or something. That's not actually what it does. Facial recognition the TSA is using right now, at least, just takes the place of the TSA guy you hand your ID to before you go put your shit in bins, right? Like, you know, you go up, and before you can go take your stuff out of your bags and put it in those bins, you hand a guy your license, sometimes your license and boarding pass. He looks at it and your face. If you're wearing a mask, he tells you to take it down for a second, and then he lets you go in. Right? That's what the facial recognition scanners are actually doing here.
That Washington Post article also cites an anti-facial recognition activist, Tawana Petty, who says that she was told by a TSA agent at the same airport, Reagan, that undergoing facial recognition scanning was not optional. So people are already being told this is a requirement. And obviously, as a spoiler for where this goes, the bigger, deeper question is, like, how long is that going to be the case?

Speaker 4: Right?

Speaker 2: So, you know, that's kind of the big concern, right? Is that they're saying it's optional now, it obviously won't be forever, and some people are just going to be told they don't have a choice. Like, that's kind of bullying, strong-arming people, threatening that they'll miss their flight if they don't submit to it, which makes me extra suspicious about their data retention, right? And that is the question we asked as the interview went on: where is the TSA's... but where is the TSA's biometric data, or really, where is passenger biometric data, actually going to go once the TSA has it?

Speaker 6: In terms of the, just, information storage, is there...
I know I've been seeing more of these, like, signs the more that I travel. I do a decent amount of traveling. For these, for these sorts of systems. And I'm curious how this works for non-US citizens, if... because I know there's, there's certain, at least in some of the technology that's being used by Customs and Border Patrol, they do store images captured of non-US citizens for a certain time period.

Speaker 2: They do take pictures.

Speaker 6: ...of US citizens when entering the country, in lots of, lots of airports. Are, are these two systems interacting at all? Or is the TSA system and the Customs, the Customs and Border Patrol system, more separated?

Speaker 5: Well, first of all, I can't speak for Customs and Border Protection, but I can tell you that we're, we have two very different use cases. So their use case is very much oriented in the customs arena, and then ours is, as I just mentioned, at the checkpoint, and solely for the identity verification. And if an international passenger comes in with a credential that identifies them,
349 00:19:27,160 --> 00:19:32,480 Speaker 5: Then the unit would obviously accept that credential. 350 00:19:33,040 --> 00:19:35,439 Speaker 4: It's a photo, it's a photo credential. 351 00:19:35,920 --> 00:19:39,679 Speaker 5: So again, all that the system would do is validate 352 00:19:40,320 --> 00:19:43,200 Speaker 5: that that person on that credential is also the same 353 00:19:43,240 --> 00:19:46,479 Speaker 5: person that's standing right there in front of the travel 354 00:19:46,520 --> 00:19:47,760 Speaker 5: document checker. 355 00:19:48,000 --> 00:19:50,840 Speaker 2: Okay, most of you are probably aware of this, but 356 00:19:50,880 --> 00:19:53,600 Speaker 2: the TSA actually does not have a good record of 357 00:19:53,640 --> 00:19:56,480 Speaker 2: protecting private data. Now this is not Old Man Robert 358 00:19:56,560 --> 00:19:59,920 Speaker 2: being a libertarian, it's just documented history. The TSA initially 359 00:20:00,359 --> 00:20:03,320 Speaker 2: claimed their full body scanners, which took what are essentially 360 00:20:03,400 --> 00:20:07,200 Speaker 2: naked pictures of passengers, never stored photos and couldn't transmit them. 361 00:20:07,440 --> 00:20:09,639 Speaker 2: But in twenty ten this was revealed to be a 362 00:20:09,880 --> 00:20:13,240 Speaker 2: lie when we gained access to documents that included technical 363 00:20:13,240 --> 00:20:17,840 Speaker 2: specifications and vendor contracts which indicated the TSA required vendors 364 00:20:17,880 --> 00:20:20,800 Speaker 2: of these scanners to provide equipment that can store and 365 00:20:20,840 --> 00:20:23,399 Speaker 2: send images of screened passengers. Now this was supposed to 366 00:20:23,400 --> 00:20:25,879 Speaker 2: only be in testing mode, but if it can store 367 00:20:26,000 --> 00:20:29,359 Speaker 2: and send images of screened passengers, it can store and 368 00:20:29,400 --> 00:20:33,240 Speaker 2: send images of screened passengers.
In twenty twelve, a former 369 00:20:33,280 --> 00:20:36,760 Speaker 2: TSA agent accused his coworkers of saving nude body images 370 00:20:36,800 --> 00:20:39,639 Speaker 2: of passengers from the body scanner and making fun of 371 00:20:39,680 --> 00:20:42,440 Speaker 2: them in back rooms. He said that safeguards were put 372 00:20:42,480 --> 00:20:44,880 Speaker 2: in place to ensure the agents manning the scanners never 373 00:20:44,920 --> 00:20:47,600 Speaker 2: saw the people they were scanning outside of the scanner, 374 00:20:48,160 --> 00:20:52,240 Speaker 2: but that these policies were frequently violated. Basically, every privacy 375 00:20:52,280 --> 00:20:55,520 Speaker 2: policy they had was frequently violated by agents so that 376 00:20:55,560 --> 00:20:58,280 Speaker 2: they could make fun of people's dicks. Right, that's the story. 377 00:20:58,800 --> 00:21:02,000 Speaker 2: The TSA retired its old scanners the next year, replacing 378 00:21:02,040 --> 00:21:04,840 Speaker 2: them with a device that showed less detail and particularly 379 00:21:04,880 --> 00:21:08,359 Speaker 2: provided agents with less clear looks at people's dongs. In 380 00:21:08,400 --> 00:21:11,680 Speaker 2: twenty twenty one, a TSA agent in Minneapolis was accused 381 00:21:11,680 --> 00:21:14,280 Speaker 2: by airport police of taking dozens of photos of young 382 00:21:14,280 --> 00:21:17,720 Speaker 2: women going through flight screening. The TSA's record here, both 383 00:21:17,760 --> 00:21:20,199 Speaker 2: in terms of the agency itself and in terms of 384 00:21:20,200 --> 00:21:24,280 Speaker 2: its employees, is certainly not worse than numerous police departments 385 00:21:24,400 --> 00:21:27,000 Speaker 2: right or the FBI. This is something to keep in mind. 
386 00:21:27,359 --> 00:21:30,840 Speaker 2: As frustrating as all this stuff is, literally every local 387 00:21:30,960 --> 00:21:35,720 Speaker 2: and city law enforcement agency has worse cases, and by God, 388 00:21:35,720 --> 00:21:38,280 Speaker 2: so do the feds. You could make a case that, as 389 00:21:38,280 --> 00:21:40,280 Speaker 2: frustrating as a lot of this is, the TSA is 390 00:21:40,359 --> 00:21:42,399 Speaker 2: less of a threat to privacy than most other federal 391 00:21:42,480 --> 00:21:45,760 Speaker 2: law enforcement agencies. But that's beside the point. For one thing, 392 00:21:46,240 --> 00:21:50,200 Speaker 2: normalizing facial recognition technology in the airports is a step 393 00:21:50,240 --> 00:21:54,400 Speaker 2: towards normalizing it everywhere. The data that is gathered will 394 00:21:54,440 --> 00:21:58,600 Speaker 2: not always be deleted, and more to the point, there's 395 00:21:58,640 --> 00:22:00,359 Speaker 2: no way to know that the system isn't going to 396 00:22:00,400 --> 00:22:03,760 Speaker 2: expand in directions that we all find deeply uncomfortable as 397 00:22:03,800 --> 00:22:05,520 Speaker 2: it goes on. That's why you kind of have to 398 00:22:05,640 --> 00:22:08,240 Speaker 2: nip this stuff in the bud, especially since they're not 399 00:22:08,359 --> 00:22:12,200 Speaker 2: really promising extra security here. When you look at the 400 00:22:12,640 --> 00:22:14,480 Speaker 2: scandals of the TSA, a lot of it has to 401 00:22:14,520 --> 00:22:17,040 Speaker 2: do with the fact that they'll be getting tested by 402 00:22:17,040 --> 00:22:19,120 Speaker 2: some other law enforcement agency to see if they can 403 00:22:19,320 --> 00:22:22,400 Speaker 2: sneak shit through, right, and the TSA will let a bunch 404 00:22:22,400 --> 00:22:24,600 Speaker 2: of guns or a fake bomb or whatever through because 405 00:22:24,640 --> 00:22:28,040 Speaker 2: people aren't paying attention.
Facial scanners aren't going to catch that, 406 00:22:28,080 --> 00:22:29,840 Speaker 2: and it's kind of unclear to me what they are 407 00:22:29,880 --> 00:22:33,720 Speaker 2: going to catch. My other bigger issue is that even 408 00:22:33,760 --> 00:22:36,960 Speaker 2: though they say they're going to throw away biometric data, 409 00:22:37,000 --> 00:22:38,840 Speaker 2: they're not going to keep it more than twenty four 410 00:22:38,880 --> 00:22:42,480 Speaker 2: hours outside of special situations, which they do kind of 411 00:22:42,560 --> 00:22:44,920 Speaker 2: leave themselves an out there. The fact that they say 412 00:22:44,920 --> 00:22:46,919 Speaker 2: they're deleting this stuff doesn't mean they are going to 413 00:22:46,960 --> 00:22:49,880 Speaker 2: delete that stuff. And I brought up this troubling history 414 00:22:49,920 --> 00:22:53,840 Speaker 2: of lack of respect for privacy, violations of privacy by 415 00:22:53,880 --> 00:22:56,639 Speaker 2: TSA agents in the past within the context of this 416 00:22:56,720 --> 00:23:00,280 Speaker 2: new system. And I want to play Carter's answer for you. 417 00:23:00,840 --> 00:23:04,440 Speaker 3: I am curious. You know there were, a couple, five 418 00:23:04,520 --> 00:23:07,680 Speaker 3: or six years ago, a couple of cases, stories that blew 419 00:23:07,760 --> 00:23:11,800 Speaker 3: up, of pictures, images of passengers who were on the 420 00:23:11,840 --> 00:23:15,960 Speaker 3: body scanners being shared, right? Like there were 421 00:23:16,080 --> 00:23:19,760 Speaker 3: a couple of scandals about that. Had that influenced your 422 00:23:19,800 --> 00:23:22,960 Speaker 3: attitudes on the data retention policy that should exist for 423 00:23:23,520 --> 00:23:24,640 Speaker 3: the facial recognition of.
424 00:23:24,600 --> 00:23:31,120 Speaker 5: This. So first, I'll tell you that we follow the National 425 00:23:31,160 --> 00:23:37,120 Speaker 5: Institute of, National Institute of Standards and Technology, their guidelines 426 00:23:37,320 --> 00:23:42,359 Speaker 5: and standards to a T. Not only that, we publish 427 00:23:42,560 --> 00:23:51,720 Speaker 5: online our privacy impact assessments, so there, we're very 428 00:23:52,040 --> 00:23:56,760 Speaker 5: transparent in our use of this technology, how we're using it, 429 00:23:56,880 --> 00:24:01,000 Speaker 5: what we're using it for. And again, it's completely voluntary. 430 00:24:01,600 --> 00:24:06,200 Speaker 5: Nothing is stored, and it's simply. 431 00:24:09,040 --> 00:24:14,080 Speaker 4: Which is really the linchpin of transportation security. Yeah, I mean 432 00:24:14,240 --> 00:24:16,639 Speaker 4: we've got to know who that was, and who. 433 00:24:16,520 --> 00:24:19,679 Speaker 5: We're verifying that whoever's going into the secure area of an 434 00:24:19,680 --> 00:24:23,080 Speaker 5: airport is in fact the person that. 435 00:24:24,560 --> 00:24:24,960 Speaker 4: They say they are. 436 00:24:26,200 --> 00:24:29,320 Speaker 2: So yeah, that's more or less how the conversation ended. 437 00:24:29,720 --> 00:24:32,480 Speaker 2: Carter was very happy to see us go, and I 438 00:24:32,520 --> 00:24:37,160 Speaker 2: don't think Garrison or I were particularly surprised by anything 439 00:24:37,200 --> 00:24:39,800 Speaker 2: we heard, but I did find it interesting that he 440 00:24:39,960 --> 00:24:42,639 Speaker 2: kind of confirmed the goal is eventually for this to 441 00:24:42,720 --> 00:24:45,119 Speaker 2: not just be everywhere, but be something that you can't 442 00:24:45,200 --> 00:24:49,040 Speaker 2: opt out of.
And I do partly wonder how much 443 00:24:49,160 --> 00:24:52,800 Speaker 2: of that is them looking for a way to get 444 00:24:52,840 --> 00:24:55,040 Speaker 2: more data on people, maybe even to share with other 445 00:24:55,119 --> 00:24:58,320 Speaker 2: law enforcement agencies, and how much of that is kind 446 00:24:58,320 --> 00:25:01,240 Speaker 2: of the same reason as a lot of, you know, 447 00:25:01,560 --> 00:25:05,560 Speaker 2: AI style technology. And kind of facial recognition does sort 448 00:25:05,600 --> 00:25:07,359 Speaker 2: of fall under that umbrella. If you're going to have 449 00:25:07,400 --> 00:25:09,320 Speaker 2: a general intelligence, one thing it has to be able 450 00:25:09,320 --> 00:25:11,639 Speaker 2: to do is recognize people's faces. So it is a 451 00:25:11,680 --> 00:25:14,320 Speaker 2: piece of that. And I think that, just like a 452 00:25:14,359 --> 00:25:17,200 Speaker 2: lot of other applications of that kind of technology, it'll 453 00:25:17,200 --> 00:25:21,240 Speaker 2: inevitably be used to cut workforces. That's kind of probably the 454 00:25:21,320 --> 00:25:24,440 Speaker 2: chief thing the TSA is looking to use it to do. Right, 455 00:25:24,880 --> 00:25:26,840 Speaker 2: if you can replace the guy who has to look 456 00:25:26,880 --> 00:25:29,040 Speaker 2: at your ID, or at least most of them, with 457 00:25:29,080 --> 00:25:32,040 Speaker 2: facial recognition scanners that do the same thing, then you 458 00:25:32,080 --> 00:25:34,840 Speaker 2: can save on your budget, right? Now, the downside of that, 459 00:25:34,920 --> 00:25:37,240 Speaker 2: maybe, is to us. The upside is it could be faster. 460 00:25:37,359 --> 00:25:39,720 Speaker 2: The downside, of course, is there's a really good chance 461 00:25:39,720 --> 00:25:42,320 Speaker 2: it won't be. The robot will be even more racist 462 00:25:42,320 --> 00:25:44,840 Speaker 2: than a TSA agent might be.
You know, it's one 463 00:25:44,920 --> 00:25:47,080 Speaker 2: less chance to deal with a human with whom you 464 00:25:47,160 --> 00:25:50,160 Speaker 2: might be able to talk something through. Anyway, we'll 465 00:25:50,160 --> 00:25:52,840 Speaker 2: see where this goes, but for today, this has been 466 00:25:52,880 --> 00:25:55,760 Speaker 2: It Could Happen Here, and I have been Robert Evans. 467 00:26:01,480 --> 00:26:03,840 Speaker 1: It Could Happen Here is a production of Cool Zone Media. 468 00:26:04,040 --> 00:26:06,720 Speaker 1: For more podcasts from Cool Zone Media, visit our website 469 00:26:06,760 --> 00:26:09,880 Speaker 1: coolzonemedia dot com, or check us out on the iHeartRadio app, 470 00:26:09,920 --> 00:26:13,080 Speaker 1: Apple Podcasts, or wherever you listen to podcasts. You can 471 00:26:13,119 --> 00:26:15,800 Speaker 1: find sources for It Could Happen Here, updated monthly, at 472 00:26:15,840 --> 00:26:19,080 Speaker 1: coolzonemedia dot com slash sources. Thanks for listening.