1 00:00:12,800 --> 00:00:16,760 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:17,079 --> 00:00:20,599 Speaker 1: I'm Oz Woloshyn, and Karah Preiss is out this week, so I'll 3 00:00:20,600 --> 00:00:23,680 Speaker 1: be bringing you the headlines of the week, including AI's 4 00:00:23,760 --> 00:00:27,960 Speaker 1: black box problem and OpenAI's push to infiltrate college campuses. 5 00:00:30,000 --> 00:00:32,520 Speaker 1: And on today's Tech Support segment, we'll talk to 404 6 00:00:32,560 --> 00:00:35,680 Speaker 1: Media's Jason Koebler about the weird world of 7 00:00:35,720 --> 00:00:38,880 Speaker 1: body scans and the future of TSA security. 8 00:00:39,159 --> 00:00:44,440 Speaker 2: Airplanes and airports in general are interesting because it feels 9 00:00:44,479 --> 00:00:47,640 Speaker 2: like new technologies are like rolled out there before they're 10 00:00:47,720 --> 00:00:50,240 Speaker 2: rolled out more broadly into society. 11 00:00:50,600 --> 00:00:54,480 Speaker 1: All of that on The Week in Tech. It's a spooky day, Friday, 12 00:00:54,600 --> 00:01:02,200 Speaker 1: June thirteenth. As I mentioned, Karah's out this week, so 13 00:01:02,240 --> 00:01:04,800 Speaker 1: we're going to dive right into the headlines, and a 14 00:01:04,840 --> 00:01:07,679 Speaker 1: couple of big stories have been on my mind, both 15 00:01:07,760 --> 00:01:11,400 Speaker 1: highlighting that as AI becomes more and more ubiquitous and 16 00:01:11,440 --> 00:01:14,360 Speaker 1: as there's a rush to deploy it, we still don't fully 17 00:01:14,440 --> 00:01:18,480 Speaker 1: understand how these systems actually work.
AI models are such 18 00:01:18,480 --> 00:01:21,160 Speaker 1: a black box that even the developers who build them 19 00:01:21,480 --> 00:01:25,160 Speaker 1: can't always explain or predict the behavior of their models. 20 00:01:25,840 --> 00:01:29,080 Speaker 1: And this week, Axios released a detailed roundup of how 21 00:01:29,200 --> 00:01:32,959 Speaker 1: leaders are simultaneously admitting this lack of knowledge while also 22 00:01:33,040 --> 00:01:39,039 Speaker 1: pushing for rapid deployment and implementation. Here's an example. Anthropic, 23 00:01:39,400 --> 00:01:43,039 Speaker 1: while testing its latest model, Claude 4, gave the model 24 00:01:43,120 --> 00:01:47,640 Speaker 1: access to corporate emails, setting up this fictional scenario where 25 00:01:47,680 --> 00:01:50,400 Speaker 1: Claude knew that, one, it was going to be shut 26 00:01:50,480 --> 00:01:54,520 Speaker 1: down and, two, that the engineer responsible for shutting it 27 00:01:54,560 --> 00:01:58,400 Speaker 1: down was having an affair. In eighty four percent of 28 00:01:58,480 --> 00:02:02,280 Speaker 1: cases when this test was run, the model attempted to 29 00:02:02,400 --> 00:02:07,080 Speaker 1: blackmail the engineer, supposedly to preserve itself. And here's the thing. 30 00:02:07,640 --> 00:02:11,600 Speaker 1: Anthropic doesn't know how the new model chose blackmail as 31 00:02:11,600 --> 00:02:14,880 Speaker 1: a tactic or how to prevent it from doing so 32 00:02:15,240 --> 00:02:19,320 Speaker 1: in the future, but Anthropic released the model. Now there is 33 00:02:19,360 --> 00:02:22,920 Speaker 1: an accepted term to refer to this problem, that 34 00:02:22,960 --> 00:02:26,920 Speaker 1: we don't yet understand the why of AI's behavior, and 35 00:02:26,960 --> 00:02:32,040 Speaker 1: that term is interpretability.
CEOs of AI companies like Open 36 00:02:32,080 --> 00:02:36,360 Speaker 1: AI's Sam Altman and Anthropic's Dario Amodei have openly acknowledged that 37 00:02:36,480 --> 00:02:40,120 Speaker 1: quote interpretability is an important problem to solve and that 38 00:02:40,160 --> 00:02:42,919 Speaker 1: our lack of understanding about how these models work can 39 00:02:43,040 --> 00:02:47,200 Speaker 1: pose significant risk. In April, Amodei wrote that quote 40 00:02:47,440 --> 00:02:51,280 Speaker 1: people outside the field are often surprised and alarmed to 41 00:02:51,400 --> 00:02:54,680 Speaker 1: learn that we do not understand how our AI creations work. 42 00:02:55,280 --> 00:02:58,239 Speaker 1: They are right to be concerned. This lack of understanding 43 00:02:58,280 --> 00:03:02,680 Speaker 1: is essentially unprecedented in the history of technology. The question 44 00:03:02,760 --> 00:03:05,600 Speaker 1: to me is: is there enough investment and attention going 45 00:03:05,600 --> 00:03:09,360 Speaker 1: towards this problem versus the race for ever stronger performance between 46 00:03:09,560 --> 00:03:14,480 Speaker 1: privately funded companies? AI is being developed and implemented without 47 00:03:14,600 --> 00:03:19,239 Speaker 1: meaningful guardrails, and that's something the US government is actively encouraging. 48 00:03:19,880 --> 00:03:23,800 Speaker 1: In fact, President Trump's Big Beautiful Bill includes a provision 49 00:03:24,120 --> 00:03:28,799 Speaker 1: for a decade-long ban on states attempting to regulate AI, 50 00:03:29,520 --> 00:03:34,600 Speaker 1: either through new regulations or through enforcing existing regulations. Now 51 00:03:34,639 --> 00:03:36,440 Speaker 1: the bill has been passed by the House and is 52 00:03:36,480 --> 00:03:39,600 Speaker 1: being heard in the Senate.
Changes are very possible because 53 00:03:39,640 --> 00:03:43,400 Speaker 1: even Republican senators like Josh Hawley of Missouri and Marsha 54 00:03:43,400 --> 00:03:47,120 Speaker 1: Blackburn of Tennessee have pushed back on this ban. Now, 55 00:03:47,240 --> 00:03:50,800 Speaker 1: for those hoping humans will maintain control over AI, there 56 00:03:50,840 --> 00:03:54,360 Speaker 1: is some potentially encouraging news coming out of Apple this week. 57 00:03:54,800 --> 00:03:59,680 Speaker 1: The company's machine learning researchers published an academic paper with 58 00:04:00,240 --> 00:04:05,280 Speaker 1: a very buzzy title, The Illusion of Thinking. The paper studies 59 00:04:05,320 --> 00:04:09,400 Speaker 1: reasoning models like OpenAI's o3 and DeepSeek's R1, 60 00:04:09,840 --> 00:04:12,960 Speaker 1: which are models designed to problem solve. What the Apple 61 00:04:13,000 --> 00:04:15,640 Speaker 1: paper found is that when these reasoning models are presented 62 00:04:15,680 --> 00:04:20,440 Speaker 1: with complex logic problems, the models fail to solve the problems. 63 00:04:21,040 --> 00:04:23,839 Speaker 1: Mashable reports that in the study, reasoning models were given 64 00:04:23,839 --> 00:04:27,799 Speaker 1: classic logic puzzles like jumping checker pieces into empty spaces, 65 00:04:27,960 --> 00:04:31,120 Speaker 1: or the river crossing problem, the one involving a fox, 66 00:04:31,120 --> 00:04:33,599 Speaker 1: a chicken, and a bag of grain. This is a 67 00:04:33,600 --> 00:04:37,240 Speaker 1: pretty simple test of a human's ability to problem solve, 68 00:04:37,360 --> 00:04:39,600 Speaker 1: because once you figure out the rules, it's actually pretty 69 00:04:39,600 --> 00:04:42,240 Speaker 1: easy to continue solving these problems even when they get 70 00:04:42,240 --> 00:04:46,400 Speaker 1: more complex. But these reasoning models start to fail at 71 00:04:46,400 --> 00:04:49,240 Speaker 1: a certain point.
So the big question is: is this 72 00:04:49,320 --> 00:04:52,280 Speaker 1: simply a bump in the road or is it indicative 73 00:04:52,360 --> 00:04:56,360 Speaker 1: of a larger problem in how these reasoning models actually work? 74 00:04:56,800 --> 00:04:59,080 Speaker 1: Which brings me to my next headline from the New 75 00:04:59,120 --> 00:05:04,320 Speaker 1: York Times, Welcome to campus, here's your ChatGPT. Because 76 00:05:04,360 --> 00:05:07,919 Speaker 1: if critical thinking and logical reasoning skills are where humans 77 00:05:07,920 --> 00:05:11,200 Speaker 1: still have an edge over AI, it's pretty reasonable to me that 78 00:05:11,279 --> 00:05:14,720 Speaker 1: there are concerns about the rise of AI on college campuses. 79 00:05:15,440 --> 00:05:18,880 Speaker 1: Apparently OpenAI has a plan to integrate chatbots into 80 00:05:18,920 --> 00:05:22,320 Speaker 1: every facet of student life, and they're calling the goal 81 00:05:22,360 --> 00:05:26,880 Speaker 1: of this AI integrated higher ed quote AI Native Universities. 82 00:05:27,560 --> 00:05:31,719 Speaker 1: A suite of premium services called ChatGPT Edu is 83 00:05:31,760 --> 00:05:34,960 Speaker 1: being sold to universities for faculty and students to use, 84 00:05:35,440 --> 00:05:38,800 Speaker 1: and the company is promising AI tutors and chatbots that 85 00:05:38,839 --> 00:05:41,799 Speaker 1: can do everything from conduct mock job interviews to quiz 86 00:05:41,839 --> 00:05:44,880 Speaker 1: you before exams. Schools are getting in on the action. 87 00:05:45,520 --> 00:05:49,720 Speaker 1: California State University is making ChatGPT available to more than 88 00:05:49,720 --> 00:05:54,240 Speaker 1: four hundred and sixty thousand students. Duke University made its 89 00:05:54,279 --> 00:05:59,320 Speaker 1: own AI powered platform called DukeGPT, and in June 90 00:05:59,560 --> 00:06:04,479 Speaker 1: they began offering unlimited ChatGPT access to students and faculty.
91 00:06:04,600 --> 00:06:07,400 Speaker 1: Even if a campus isn't striking a deal with OpenAI, 92 00:06:07,920 --> 00:06:11,520 Speaker 1: there's a marketing campaign targeting the handful, and I assume 93 00:06:11,560 --> 00:06:15,160 Speaker 1: it is a handful, of students who aren't yet using ChatGPT. 94 00:06:16,240 --> 00:06:19,840 Speaker 1: Many find this all deeply troubling, especially as there's new 95 00:06:19,920 --> 00:06:24,440 Speaker 1: data and research to suggest that so called cognitive offloading, 96 00:06:25,040 --> 00:06:28,200 Speaker 1: letting a chatbot write your first draft or think 97 00:06:28,240 --> 00:06:33,039 Speaker 1: for you, makes you measurably less able to problem solve yourself. 98 00:06:33,960 --> 00:06:37,640 Speaker 1: On the other hand, recent grads are facing a historically tough 99 00:06:37,800 --> 00:06:41,320 Speaker 1: job market, especially in fields where AI is starting to 100 00:06:41,360 --> 00:06:44,880 Speaker 1: automate tasks like writing code. So these schools are trying 101 00:06:44,880 --> 00:06:47,799 Speaker 1: to boost their students' prospects by providing them with AI 102 00:06:47,920 --> 00:06:50,560 Speaker 1: tools and with the skills to make the most of 103 00:06:50,600 --> 00:06:53,680 Speaker 1: the tools. So this is a strange moment where we don't 104 00:06:53,680 --> 00:06:57,159 Speaker 1: really understand how our LLMs make their decisions or how 105 00:06:57,240 --> 00:07:00,240 Speaker 1: using them affects our brains. But they're here, we're using 106 00:07:00,240 --> 00:07:04,120 Speaker 1: them anyway, and we and they are adapting in real time. 107 00:07:04,680 --> 00:07:07,640 Speaker 1: As The Times puts it, quote OpenAI's push to 108 00:07:07,839 --> 00:07:13,080 Speaker 1: AI-ify college education amounts to a national experiment on millions 109 00:07:13,120 --> 00:07:21,920 Speaker 1: of students.
I've got a couple more headlines for you, 110 00:07:22,560 --> 00:07:25,360 Speaker 1: and one starts with an old idea brought to life 111 00:07:25,400 --> 00:07:28,960 Speaker 1: with modern technology. About half of all people on Earth 112 00:07:29,040 --> 00:07:33,000 Speaker 1: experience some form of water scarcity. But could that change 113 00:07:33,080 --> 00:07:37,120 Speaker 1: if you drink the ocean? Desalination, the process of removing 114 00:07:37,160 --> 00:07:40,760 Speaker 1: salt to create clean drinking water, has been possible for decades, 115 00:07:41,080 --> 00:07:45,280 Speaker 1: but it's the most expensive way to create clean drinking water. However, 116 00:07:45,640 --> 00:07:48,240 Speaker 1: the Wall Street Journal recently spoke to three companies that 117 00:07:48,280 --> 00:07:52,360 Speaker 1: are experimenting with an alternative method, a potential solution we've 118 00:07:52,360 --> 00:07:55,040 Speaker 1: known about for years, but that is now being made 119 00:07:55,160 --> 00:07:59,480 Speaker 1: possible by the improved functionality and lower prices of deep 120 00:07:59,480 --> 00:08:04,240 Speaker 1: sea robots, undersea power cables, and other technologies. Basically, 121 00:08:04,440 --> 00:08:07,920 Speaker 1: the old way of desalinating water involved pumping seawater to 122 00:08:07,960 --> 00:08:11,200 Speaker 1: the surface and then boiling it to create clean water. 123 00:08:12,040 --> 00:08:15,760 Speaker 1: The new way involves an underwater membrane that uses the 124 00:08:15,800 --> 00:08:20,040 Speaker 1: ocean's natural pressure to filter out salt before clean water 125 00:08:20,400 --> 00:08:22,920 Speaker 1: is then pumped to the surface. It's a method that 126 00:08:22,960 --> 00:08:27,840 Speaker 1: could save significant energy and money. Sometimes innovation relies on 127 00:08:27,960 --> 00:08:33,160 Speaker 1: simple upgrades, other times it's all about competition.
President Donald 128 00:08:33,160 --> 00:08:36,160 Speaker 1: Trump's one hundred and seventy five billion dollar plan for 129 00:08:36,280 --> 00:08:40,480 Speaker 1: a quote Golden Dome defense system has triggered a race 130 00:08:40,559 --> 00:08:44,160 Speaker 1: between tech companies and defense groups, according to the Financial Times. 131 00:08:45,080 --> 00:08:48,040 Speaker 1: The idea is to create a space based system that 132 00:08:48,080 --> 00:08:52,679 Speaker 1: can detect and destroy foreign weapons like missiles at launch. 133 00:08:53,400 --> 00:08:57,000 Speaker 1: The Trump administration has called for quote non traditional contractors 134 00:08:57,040 --> 00:09:00,000 Speaker 1: to help create the system. Cue the competition between 135 00:09:00,000 --> 00:09:04,480 Speaker 1: tech companies like Palantir and Microsoft, and established defense contractors 136 00:09:04,520 --> 00:09:07,640 Speaker 1: like Lockheed Martin. This all comes at a time when 137 00:09:07,679 --> 00:09:11,360 Speaker 1: big tech companies like Meta are stepping into the world 138 00:09:11,480 --> 00:09:16,359 Speaker 1: of developing military technology, and startup defense companies like Anduril 139 00:09:16,679 --> 00:09:19,920 Speaker 1: are raising money at huge valuations ahead of plans to 140 00:09:19,960 --> 00:09:23,600 Speaker 1: go public. The Missile Defense Agency plans to award ten 141 00:09:23,679 --> 00:09:27,520 Speaker 1: year contracts in an open competitive process, and so far 142 00:09:27,600 --> 00:09:31,200 Speaker 1: the agency has gotten over five hundred responses to their 143 00:09:31,200 --> 00:09:38,959 Speaker 1: request for information. After the break, we'll hear from 404 144 00:09:39,040 --> 00:09:42,520 Speaker 1: Media's Jason Koebler about the dreaded airport pat 145 00:09:42,640 --> 00:09:57,160 Speaker 1: down and its future high tech makeover.
So for our 146 00:09:57,160 --> 00:09:59,080 Speaker 1: next segment, we're going to dive into one of the 147 00:09:59,120 --> 00:10:03,000 Speaker 1: more annoying parts of travel, especially for anyone who flies frequently: 148 00:10:03,559 --> 00:10:06,880 Speaker 1: airport security and the role technology 149 00:10:06,360 --> 00:10:06,920 Speaker 3: plays in it. 150 00:10:07,600 --> 00:10:11,040 Speaker 1: In response to 9/11, the Transportation Security Administration, 151 00:10:11,720 --> 00:10:15,800 Speaker 1: or TSA, was born. Ever since, TSA has been part 152 00:10:15,840 --> 00:10:18,800 Speaker 1: of the flying experience here in the US, ensuring that 153 00:10:18,840 --> 00:10:22,920 Speaker 1: passengers are safe and highly agitated, one plastic bin at 154 00:10:22,920 --> 00:10:25,800 Speaker 1: a time. If you're not one of the about thirty percent 155 00:10:25,840 --> 00:10:28,839 Speaker 1: of American fliers with TSA PreCheck, you know the 156 00:10:28,920 --> 00:10:31,920 Speaker 1: drill: put your personal items in a plastic bin, take 157 00:10:31,960 --> 00:10:34,480 Speaker 1: off your shoes, put your bins through the conveyor belt 158 00:10:34,520 --> 00:10:36,600 Speaker 1: where they will go under a scanner, walk through a 159 00:10:36,600 --> 00:10:39,720 Speaker 1: body scanner, and then, if you're unlucky or you forgot 160 00:10:39,720 --> 00:10:41,600 Speaker 1: to take a receipt out of your pocket, you might 161 00:10:41,640 --> 00:10:44,000 Speaker 1: be pulled aside for an intimate pat down, which can 162 00:10:44,040 --> 00:10:47,200 Speaker 1: be pretty awkward. Over the years, there have been multiple 163 00:10:47,280 --> 00:10:51,800 Speaker 1: upgrades to TSA technology, from more discerning luggage scanners to 164 00:10:52,120 --> 00:10:55,520 Speaker 1: full body scans in place of metal detectors.
But the 165 00:10:55,600 --> 00:10:59,440 Speaker 1: latest idea from the Department of Homeland Security, well, I 166 00:10:59,520 --> 00:11:02,400 Speaker 1: certainly didn't see it coming. And here to tell us 167 00:11:02,480 --> 00:11:06,520 Speaker 1: how virtual reality could forever alter the dreaded pat down 168 00:11:06,559 --> 00:11:09,640 Speaker 1: process is 404 Media's Jason Koebler. Jason, thanks 169 00:11:09,679 --> 00:11:10,800 Speaker 1: for coming back to Tech Stuff. 170 00:11:10,920 --> 00:11:12,680 Speaker 3: Hey, thanks for having me. 171 00:11:12,720 --> 00:11:14,839 Speaker 1: So, I'm always quite fascinated by airports, so I spend a 172 00:11:14,920 --> 00:11:17,080 Speaker 1: lot of time in them, and so when I saw 173 00:11:17,160 --> 00:11:20,840 Speaker 1: your most recent piece with the headline TSA Working on 174 00:11:20,920 --> 00:11:24,960 Speaker 1: Haptic Tech to Feel Your Body in Virtual Reality, I 175 00:11:25,000 --> 00:11:27,560 Speaker 1: had to know more. What inspired the piece and what 176 00:11:27,559 --> 00:11:28,040 Speaker 1: did you learn? 177 00:11:28,600 --> 00:11:31,520 Speaker 2: Yeah, I mean I think that since 9/11, 178 00:11:31,720 --> 00:11:36,480 Speaker 2: TSA has been looking for ways to quote unquote improve 179 00:11:36,160 --> 00:11:37,559 Speaker 3: the security process. 180 00:11:37,640 --> 00:11:39,679 Speaker 2: It's like one day you'll just show up at the 181 00:11:39,679 --> 00:11:41,839 Speaker 2: airport and they'll have all new machines and it will 182 00:11:41,880 --> 00:11:45,360 Speaker 2: be like a totally different process. And I think that 183 00:11:45,640 --> 00:11:48,720 Speaker 2: airplane security and airport security is something that the United 184 00:11:48,720 --> 00:11:52,320 Speaker 2: States really likes to spend money on, I mean, for 185 00:11:52,480 --> 00:11:53,720 Speaker 2: sort of obvious reasons.
186 00:11:53,760 --> 00:11:56,120 Speaker 3: After something like 9/11, there's sort of the 187 00:11:56,559 --> 00:11:58,920 Speaker 2: security theater of it all, where it's like, please take 188 00:11:58,960 --> 00:12:01,080 Speaker 2: off your shoes, we're going to go through your bags. 189 00:12:01,520 --> 00:12:04,440 Speaker 2: Sometimes you can bring liquids, sometimes you can't. It really 190 00:12:04,440 --> 00:12:07,320 Speaker 2: feels like a roll of the dice. And so TSA 191 00:12:07,440 --> 00:12:12,640 Speaker 2: has been basically researching this technology to give their agents 192 00:12:13,400 --> 00:12:18,360 Speaker 2: virtual reality goggles and then haptic feedback gloves, and so 193 00:12:18,480 --> 00:12:21,880 Speaker 2: haptic feedback is like where you wear a glove and 194 00:12:22,040 --> 00:12:26,200 Speaker 2: you can literally like feel in virtual reality. And so 195 00:12:26,960 --> 00:12:32,959 Speaker 2: they've really designed this incredibly complex and seemingly like overkill 196 00:12:33,600 --> 00:12:37,520 Speaker 2: piece of technology where instead of doing a pat down, 197 00:12:37,600 --> 00:12:41,280 Speaker 2: which notoriously can be very invasive, like that certainly is 198 00:12:41,320 --> 00:12:45,240 Speaker 2: a problem, they're saying, well, people don't like those, so 199 00:12:45,360 --> 00:12:49,040 Speaker 2: we're going to do a virtual reality pat down by 200 00:12:49,120 --> 00:12:52,880 Speaker 2: using these like advanced sensors to detect the outlines of 201 00:12:52,920 --> 00:12:56,400 Speaker 2: someone's body and see if they have like sharp objects 202 00:12:56,480 --> 00:12:59,040 Speaker 2: that they're trying to smuggle in, and then the TSA 203 00:12:59,360 --> 00:13:02,160 Speaker 2: officer will be able to see that in their virtual 204 00:13:02,160 --> 00:13:05,320 Speaker 2: reality and also like feel that in their haptic gloves, 205 00:13:05,640 --> 00:13:09,280 Speaker 2: but
without actually touching you. And so, I don't know, 206 00:13:09,320 --> 00:13:11,080 Speaker 2: I don't know how you feel about it. I saw 207 00:13:11,120 --> 00:13:13,360 Speaker 2: it and I was like, this seems like overkill. 208 00:13:13,400 --> 00:13:16,560 Speaker 2: And in some ways, it's like the clothes, right, there's that, 209 00:13:16,600 --> 00:13:20,280 Speaker 2: and then there's also like nominally, like if someone is 210 00:13:20,320 --> 00:13:23,480 Speaker 2: touching you, you can say I don't like that, like 211 00:13:23,559 --> 00:13:25,960 Speaker 2: stop doing that. But you know, in this case, they're 212 00:13:26,080 --> 00:13:30,000 Speaker 2: literally creating like a 3D scan of your body, 213 00:13:30,880 --> 00:13:35,160 Speaker 2: and in some ways it's like possibly more invasive. I 214 00:13:35,240 --> 00:13:37,800 Speaker 2: don't know, it's just like very weird technology. I 215 00:13:37,800 --> 00:13:39,640 Speaker 2: think it's kind of uncharted territory. 216 00:13:40,080 --> 00:13:42,600 Speaker 1: What does it mean to feel when we're talking about 217 00:13:42,640 --> 00:13:43,240 Speaker 1: haptic tech? 218 00:13:43,880 --> 00:13:47,280 Speaker 2: Yeah, so I've never done it myself. Like this technology 219 00:13:47,320 --> 00:13:50,959 Speaker 2: does exist in rudimentary form, but as I understand it, 220 00:13:50,960 --> 00:13:54,360 Speaker 2: it's like you're wearing a glove that has electronics in it, 221 00:13:54,440 --> 00:13:58,000 Speaker 2: and like let's say you move your hand around, presumably 222 00:13:58,080 --> 00:14:01,600 Speaker 2: some air pockets or something in there will activate 223 00:14:01,720 --> 00:14:05,240 Speaker 2: so that your hand is feeling like a force back 224 00:14:05,280 --> 00:14:10,960 Speaker 2: against it that would simulate you actually touching something. I'm 225 00:14:11,000 --> 00:14:14,880 Speaker 2: trying to think of like analogies for what that might be.
Like, 226 00:14:15,000 --> 00:14:18,600 Speaker 2: I think maybe like a massage chair. Like imagine a 227 00:14:18,600 --> 00:14:19,560 Speaker 2: massage chair that 228 00:14:19,560 --> 00:14:22,920 Speaker 1: you wear, or a video game controller or a steering 229 00:14:22,960 --> 00:14:25,160 Speaker 1: wheel that kind of rattles around and like moves your 230 00:14:25,200 --> 00:14:25,920 Speaker 1: hands and stuff. 231 00:14:26,480 --> 00:14:28,280 Speaker 3: Yeah, yeah, that's absolutely right. 232 00:14:28,400 --> 00:14:32,200 Speaker 2: So it's that, and then the sort of like input 233 00:14:32,480 --> 00:14:35,920 Speaker 2: data would come from all these sensors that would like 234 00:14:36,040 --> 00:14:41,000 Speaker 2: project an image onto your hand, and then your hand 235 00:14:41,000 --> 00:14:44,160 Speaker 2: would be connected to this VR system so it would 236 00:14:44,240 --> 00:14:48,440 Speaker 2: track your hand around in virtual reality. And then, you know, 237 00:14:48,520 --> 00:14:52,160 Speaker 2: basically like put some physics in there where it says 238 00:14:52,240 --> 00:14:54,800 Speaker 2: like okay, you're touching a table, you're touching a person, 239 00:14:54,920 --> 00:14:58,080 Speaker 2: here's what it should feel like. And that technology does 240 00:14:58,120 --> 00:15:02,640 Speaker 2: exist, like hand tracking in virtual reality, object tracking in 241 00:15:02,720 --> 00:15:05,680 Speaker 2: virtual reality, that sort of thing. So really, like, the 242 00:15:06,120 --> 00:15:09,920 Speaker 2: haptics is the new thing here, and sort of making 243 00:15:10,000 --> 00:15:14,120 Speaker 2: that happen in real time where a sensor array is 244 00:15:14,720 --> 00:15:18,360 Speaker 2: grabbing all of that as you walk through security. 245 00:15:18,480 --> 00:15:21,120 Speaker 1: You used the word wild in your story, which I think 246 00:15:21,240 --> 00:15:23,880 Speaker 1: is an appropriate one. You know, this is obviously a podcast.
247 00:15:24,120 --> 00:15:27,320 Speaker 1: Can you give a visual of what this will look like? 248 00:15:27,400 --> 00:15:30,000 Speaker 1: I mean, the TSA agent wearing a headset, wearing these gloves, 249 00:15:30,000 --> 00:15:34,400 Speaker 1: because there's also some quite amusing cartoons or like visualizations 250 00:15:34,400 --> 00:15:36,320 Speaker 1: that you were able to source in your reporting of what 251 00:15:36,440 --> 00:15:37,800 Speaker 1: TSA imagines this will look like. 252 00:15:38,480 --> 00:15:40,640 Speaker 2: Yeah, so I want to stress that a lot of 253 00:15:40,680 --> 00:15:44,200 Speaker 2: this comes from patent applications, which are, you know, kind 254 00:15:44,240 --> 00:15:48,280 Speaker 2: of like, it's very very early. We don't know 255 00:15:48,600 --> 00:15:51,920 Speaker 2: if this technology is going to be deployed, whether it 256 00:15:51,920 --> 00:15:54,480 Speaker 2: will be deployed, what it will look like when it's deployed. 257 00:15:54,920 --> 00:15:59,080 Speaker 2: But in patent schematics and drawings, they sort 258 00:15:59,120 --> 00:16:03,320 Speaker 2: of imagine, hey, this is what we think it 259 00:16:03,400 --> 00:16:06,400 Speaker 2: could look like.
And so in the information sheet, they 260 00:16:06,400 --> 00:16:10,800 Speaker 2: have a TSA officer who has a computer strapped to 261 00:16:10,840 --> 00:16:14,040 Speaker 2: his face like VR goggles, and he's holding up 262 00:16:15,240 --> 00:16:19,520 Speaker 2: a gloved hand, and then they have this diagram of 263 00:16:19,880 --> 00:16:23,360 Speaker 2: a baseball, like they drew a baseball on a table, 264 00:16:23,920 --> 00:16:27,840 Speaker 2: and then they have this, uh, they call it a glove, 265 00:16:27,880 --> 00:16:32,400 Speaker 2: but it's really like a hand computer, like it's like 266 00:16:32,440 --> 00:16:35,720 Speaker 2: they took a touchpad and put it on someone's hand, 267 00:16:35,760 --> 00:16:39,960 Speaker 2: strapped it to their hand, and that is the haptic feedback glove. 268 00:16:40,240 --> 00:16:40,640 Speaker 3: And so 269 00:16:42,160 --> 00:16:44,840 Speaker 2: it reminded me of that toy I had when I 270 00:16:44,840 --> 00:16:47,120 Speaker 2: was a kid with all those little pins, the like 271 00:16:47,560 --> 00:16:50,160 Speaker 2: metal pins where you could put your hand or face in and 272 00:16:50,080 --> 00:16:51,120 Speaker 3: it would do an outline. 273 00:16:52,320 --> 00:16:54,880 Speaker 2: Yeah, and so that's kind of what it's like, a 274 00:16:54,960 --> 00:17:00,320 Speaker 2: digital recreation of that.
They also have a diagram where 275 00:17:00,720 --> 00:17:04,960 Speaker 2: a person is trying to smuggle in scissors around their chest, 276 00:17:05,119 --> 00:17:07,159 Speaker 2: and so they have like a picture of scissors on 277 00:17:07,200 --> 00:17:09,560 Speaker 2: someone's chest, and then they have a picture of their 278 00:17:09,600 --> 00:17:13,800 Speaker 2: belt buckle and the hands are like right up on 279 00:17:13,920 --> 00:17:18,280 Speaker 2: their waist, and the diagram makes sure to say, 280 00:17:19,359 --> 00:17:22,920 Speaker 2: quote, scan imagery obscured due to proximity 281 00:17:22,960 --> 00:17:28,719 Speaker 2: to private body zone. And yeah, so this technology would 282 00:17:28,840 --> 00:17:34,880 Speaker 2: like somehow obscure, I guess, your private areas because they're 283 00:17:34,920 --> 00:17:36,080 Speaker 2: conscious of your privacy. 284 00:17:36,840 --> 00:17:40,000 Speaker 1: But the idea is, like, you will stand facing a 285 00:17:40,000 --> 00:17:45,360 Speaker 1: TSA agent. They'll be wearing a virtual reality headset, running 286 00:17:45,400 --> 00:17:48,119 Speaker 1: their hands up and down your body but not actually 287 00:17:48,119 --> 00:17:51,800 Speaker 1: touching you, and then both feeling on their hands and 288 00:17:52,000 --> 00:17:55,520 Speaker 1: seeing in VR objects that may be underneath your clothes. 289 00:17:55,960 --> 00:17:57,080 Speaker 3: That's what they're proposing. 290 00:17:57,560 --> 00:18:00,840 Speaker 2: And it certainly sounds like the TSA officer could be 291 00:18:00,920 --> 00:18:03,280 Speaker 2: right next to you, but they also make it sound 292 00:18:03,320 --> 00:18:06,920 Speaker 2: like they could be like in another room.
For example, 293 00:18:07,280 --> 00:18:09,640 Speaker 2: you'd walk through some sort of sensor system that would 294 00:18:09,720 --> 00:18:13,000 Speaker 2: like scan your body in real time, and then the 295 00:18:13,040 --> 00:18:15,280 Speaker 2: officer either could be there and would be like, oh, 296 00:18:15,320 --> 00:18:18,000 Speaker 2: I'm not touching you, I'm not touching you, but they 297 00:18:18,040 --> 00:18:21,439 Speaker 2: could also be in another room doing this sort of 298 00:18:21,480 --> 00:18:22,760 Speaker 2: without you even knowing. 299 00:18:23,000 --> 00:18:25,840 Speaker 1: So how did you hear about this? And do you 300 00:18:25,920 --> 00:18:28,000 Speaker 1: have the sense this is something that we'd actually be 301 00:18:28,040 --> 00:18:30,320 Speaker 1: seeing in airports anytime soon? 302 00:18:30,680 --> 00:18:33,560 Speaker 3: Yeah. So it's interesting. The Department of Homeland 303 00:18:33,200 --> 00:18:38,399 Speaker 2: Security, which oversees TSA, actually published a two page information 304 00:18:38,520 --> 00:18:42,840 Speaker 2: sheet about this on its website. And I mean, I 305 00:18:42,840 --> 00:18:45,000 Speaker 2: guess this is something that they do periodically where they're like, 306 00:18:45,000 --> 00:18:47,239 Speaker 2: we have new research, come check it out, but they 307 00:18:47,280 --> 00:18:50,720 Speaker 2: publish it like pretty deep on their website. And frankly, 308 00:18:50,760 --> 00:18:52,800 Speaker 2: we have like a lot of nerds who read our 309 00:18:52,920 --> 00:18:56,320 Speaker 2: articles and are constantly scanning for things like this and 310 00:18:56,680 --> 00:18:59,159 Speaker 2: know that it's something that we would care about, you know. 311 00:18:59,200 --> 00:19:03,520 Speaker 2: We try to write about futuristic and weird technology, and 312 00:19:03,560 --> 00:19:05,959 Speaker 2: so one of our readers sent it to us.
I 313 00:19:06,080 --> 00:19:09,040 Speaker 2: checked it out, and from that information sheet it said, 314 00:19:09,080 --> 00:19:11,320 Speaker 2: like, want to learn more, go check out these patents. 315 00:19:13,080 --> 00:19:17,040 Speaker 2: Whether we see this in practice or not, I'm not sure. 316 00:19:17,119 --> 00:19:20,960 Speaker 2: It seems pretty early, like I haven't seen video of 317 00:19:21,119 --> 00:19:25,280 Speaker 2: this working, like a prototype or anything like that. It 318 00:19:25,359 --> 00:19:28,520 Speaker 2: seems like it's more of a concept that they're working 319 00:19:28,520 --> 00:19:31,440 Speaker 2: on at the moment and are researching. I also think 320 00:19:31,480 --> 00:19:35,800 Speaker 2: that honestly, like since 9/11, TSA has kind 321 00:19:35,840 --> 00:19:39,440 Speaker 2: of dialed in the security screening process. I don't think 322 00:19:39,440 --> 00:19:43,480 Speaker 2: it's anything that anyone enjoys. But we're not waiting for 323 00:19:43,520 --> 00:19:46,160 Speaker 2: two hours at the security line anymore like we were 324 00:19:46,240 --> 00:19:49,480 Speaker 2: kind of immediately after 9/11. As unpleasant as 325 00:19:49,520 --> 00:19:53,080 Speaker 2: it can be, it's usually pretty quick. And so I 326 00:19:53,119 --> 00:19:55,000 Speaker 2: don't know if we'll ever see this, but it is 327 00:19:55,080 --> 00:19:56,520 Speaker 2: tech that they're researching. 328 00:19:56,680 --> 00:19:58,360 Speaker 1: And where does the money come from? 329 00:19:58,440 --> 00:19:58,840 Speaker 2: Is this? 330 00:19:58,960 --> 00:20:00,600 Speaker 1: Like, do we know how long, you know, how 331 00:20:00,640 --> 00:20:03,119 Speaker 1: much they've spent on it? Like, what's the kind of 332 00:20:03,560 --> 00:20:05,680 Speaker 1: the structural backdrop of this type of project?
333 00:20:05,960 --> 00:20:08,840 Speaker 2: Yeah, so the Department of Homeland Security has an Office 334 00:20:08,880 --> 00:20:11,840 Speaker 2: of Science and Technology, but they don't say here's how 335 00:20:11,880 --> 00:20:16,040 Speaker 2: much money we spent on this specific virtual reality haptic 336 00:20:16,080 --> 00:20:18,680 Speaker 2: feedback remote sensing. 337 00:20:18,560 --> 00:20:22,960 Speaker 1: It's the Nintendo Wii of airport security basically, right. 338 00:20:23,119 --> 00:20:25,879 Speaker 2: Yeah, but I think it is important to sort of 339 00:20:25,920 --> 00:20:29,119 Speaker 2: realize that we're in an era where a lot of 340 00:20:29,160 --> 00:20:33,720 Speaker 2: government funding is being slashed for new technologies. And I'm 341 00:20:33,760 --> 00:20:37,800 Speaker 2: not saying that DHS shouldn't be researching new technology. Like 342 00:20:37,840 --> 00:20:41,480 Speaker 2: a lot of new technology comes out of agencies like 343 00:20:41,520 --> 00:20:45,600 Speaker 2: the Department of Homeland Security, like the CIA, like the NSA. 344 00:20:45,640 --> 00:20:48,240 Speaker 2: But at the same time, like this feels like something 345 00:20:48,480 --> 00:20:52,399 Speaker 2: that's a solution in search of a problem, I would say. 346 00:20:52,480 --> 00:20:56,600 Speaker 2: And so yeah, it's something to consider. Like we're slashing 347 00:20:56,640 --> 00:20:59,520 Speaker 2: budgets for science across the government, but we're not really 348 00:20:59,560 --> 00:21:02,480 Speaker 2: slashing budgets for the Department of Homeland Security. 349 00:21:02,520 --> 00:21:05,120 Speaker 3: We're increasing that budget if anything. 350 00:21:05,600 --> 00:21:07,760 Speaker 1: Yeah, I mean I did see on your reporting that 351 00:21:07,800 --> 00:21:10,399 Speaker 1: I think they filed the patents for this technology in 352 00:21:10,440 --> 00:21:13,040 Speaker 1: twenty twenty two, so I guess under the Biden.
353 00:21:12,760 --> 00:21:15,720 Speaker 3: Administration they've been working on it for a while. 354 00:21:15,760 --> 00:21:18,199 Speaker 1: I mean, this is not like a recent fancy, so 355 00:21:18,240 --> 00:21:20,800 Speaker 1: that's kind of interesting. I mean, some people say the 356 00:21:20,840 --> 00:21:25,000 Speaker 1: whole of VR is a solution in search of a problem, right, 357 00:21:25,080 --> 00:21:28,679 Speaker 1: But like, what's the wider arc of leading edge tech 358 00:21:29,000 --> 00:21:31,400 Speaker 1: for the military industrial complex. 359 00:21:32,400 --> 00:21:37,239 Speaker 2: Yeah, it's interesting because VR and the metaverse, which you know, 360 00:21:37,320 --> 00:21:41,080 Speaker 2: associated technology, has been a massive flop. 361 00:21:41,320 --> 00:21:42,439 Speaker 3: Like it was the 362 00:21:43,320 --> 00:21:46,320 Speaker 2: AI before there was AI, it was crypto before it 363 00:21:46,320 --> 00:21:48,600 Speaker 2: was crypto. Like, there was tons and tons of 364 00:21:48,680 --> 00:21:51,879 Speaker 2: hype about VR being just around the corner and people 365 00:21:52,000 --> 00:21:54,120 Speaker 2: using it for all sorts of things, whether to work, 366 00:21:54,200 --> 00:21:57,720 Speaker 2: to play, to game. But interestingly, like one of the 367 00:21:57,720 --> 00:22:01,720 Speaker 2: places where VR has actually been useful has been on 368 00:22:01,840 --> 00:22:06,640 Speaker 2: the job training and for things like post traumatic stress disorder, 369 00:22:06,880 --> 00:22:10,040 Speaker 2: exposure therapy for soldiers and things like that, and so 370 00:22:10,560 --> 00:22:16,320 Speaker 2: the military actually has found some uses for virtual reality.
371 00:22:16,760 --> 00:22:20,639 Speaker 2: At the same time, DHS has been really interested in 372 00:22:20,920 --> 00:22:27,600 Speaker 2: using virtual reality to kind of like see through things 373 00:22:27,800 --> 00:22:29,600 Speaker 2: for lack of a better term. Like there was this 374 00:22:29,680 --> 00:22:32,760 Speaker 2: project that Customs and Border Patrol tried to do where 375 00:22:32,800 --> 00:22:35,760 Speaker 2: they wanted to use virtual reality goggles in a way 376 00:22:35,800 --> 00:22:40,000 Speaker 2: similar to this to see through boxes in order to 377 00:22:40,040 --> 00:22:44,400 Speaker 2: determine whether like counterfeit goods were being brought into the country. 378 00:22:44,800 --> 00:22:47,600 Speaker 2: As far as I know, they're not doing anything like that. 379 00:22:47,400 --> 00:22:50,680 Speaker 2: That was a project that has been around for many 380 00:22:50,800 --> 00:22:54,560 Speaker 2: years and was never deployed. There's been university research that 381 00:22:54,640 --> 00:22:57,320 Speaker 2: was associated with the Department of Homeland Security where they 382 00:22:57,320 --> 00:23:02,360 Speaker 2: wanted to use VR at the border to see terrorists, 383 00:23:02,720 --> 00:23:05,280 Speaker 2: is what they said, but they never explained. Yeah, they 384 00:23:05,320 --> 00:23:08,120 Speaker 2: never explained like how that would work. But I do 385 00:23:08,160 --> 00:23:11,920 Speaker 2: think that there was a period that is still going 386 00:23:11,960 --> 00:23:14,679 Speaker 2: on but has largely been taken over by AI, where 387 00:23:15,280 --> 00:23:18,040 Speaker 2: you would take any process that the government would do 388 00:23:18,240 --> 00:23:20,800 Speaker 2: or that any business would do, and say, well, how 389 00:23:20,800 --> 00:23:23,040 Speaker 2: can we add VR to this? Because it's sort of 390 00:23:23,040 --> 00:23:26,800 Speaker 2: the hyped new technology.
We can seem like we're forward looking, 391 00:23:27,040 --> 00:23:30,880 Speaker 2: and we can also maybe get some money to research something. 392 00:23:30,880 --> 00:23:31,560 Speaker 3: That makes sense. 393 00:23:31,920 --> 00:23:35,480 Speaker 1: Are there any other interesting airport tech stories that you've 394 00:23:35,520 --> 00:23:36,560 Speaker 1: come across recently? 395 00:23:37,000 --> 00:23:39,360 Speaker 2: I need to do more reporting on this, but facial 396 00:23:39,400 --> 00:23:43,280 Speaker 2: recognition is very very common at airports now. Global Entry, 397 00:23:43,440 --> 00:23:46,760 Speaker 2: which is a customs and border patrol system where you 398 00:23:46,800 --> 00:23:48,920 Speaker 2: don't need to show your passport if you've been pre 399 00:23:49,040 --> 00:23:53,359 Speaker 2: vetted and are an American citizen, now uses facial recognition 400 00:23:53,359 --> 00:23:55,520 Speaker 2: where you just get off the plane and they detect 401 00:23:55,560 --> 00:23:57,159 Speaker 2: who you are and they say, welcome back to the 402 00:23:57,280 --> 00:24:01,600 Speaker 2: United States. A lot of airlines are using facial 403 00:24:01,600 --> 00:24:04,800 Speaker 2: recognition to just board planes so you don't need to 404 00:24:04,840 --> 00:24:08,440 Speaker 2: show a boarding pass. And then TSA is also using 405 00:24:08,480 --> 00:24:11,560 Speaker 2: facial recognition at the screenings, and these are all things 406 00:24:11,600 --> 00:24:14,000 Speaker 2: that you can opt in or opt out of, but 407 00:24:14,359 --> 00:24:18,400 Speaker 2: it's becoming a lot more commonplace. There was no real 408 00:24:18,560 --> 00:24:21,000 Speaker 2: big announcement where it was like, hey, we're going to 409 00:24:21,080 --> 00:24:21,680 Speaker 2: be using 410 00:24:21,440 --> 00:24:24,080 Speaker 3: facial recognition all over the airport.
It was just sort 411 00:24:24,119 --> 00:24:27,960 Speaker 3: of there one day, and like, I think there needs to 412 00:24:27,880 --> 00:24:33,040 Speaker 2: be more reporting on sort of where the technology came from, 413 00:24:33,080 --> 00:24:37,520 Speaker 2: like where the initial photos that identify you are coming from, 414 00:24:38,000 --> 00:24:40,840 Speaker 2: whether those images are being retained, that sort of thing. 415 00:24:40,960 --> 00:24:43,159 Speaker 2: But this is all being done sort of in the 416 00:24:43,240 --> 00:24:47,080 Speaker 2: name of convenience and streamlining the process. And I mean, 417 00:24:47,119 --> 00:24:49,359 Speaker 2: I'll admit it is a lot faster to do some 418 00:24:49,440 --> 00:24:51,960 Speaker 2: of these things with facial recognition. But then you start 419 00:24:52,000 --> 00:24:54,560 Speaker 2: worrying about like is that information being shared, how is 420 00:24:54,600 --> 00:24:58,520 Speaker 2: it being shared, who's it going to, what privacy guardrails 421 00:24:58,560 --> 00:25:01,480 Speaker 2: are there? And that's not something we know a lot 422 00:25:01,520 --> 00:25:02,600 Speaker 2: about, unfortunately. 423 00:25:03,280 --> 00:25:06,320 Speaker 1: Well, so the airport is a kind of interesting tech 424 00:25:06,720 --> 00:25:10,920 Speaker 1: testing ground or petri dish because like the contract you 425 00:25:11,040 --> 00:25:13,840 Speaker 1: sign implicitly or explicitly when you go to the airport 426 00:25:13,880 --> 00:25:16,240 Speaker 1: is basically that you surrender all of your rights without 427 00:25:16,280 --> 00:25:20,600 Speaker 1: any questions asked.
Right, so, once you're like going through security, 428 00:25:20,680 --> 00:25:22,639 Speaker 1: you've kind of accepted the premise that the airport can 429 00:25:22,680 --> 00:25:24,640 Speaker 1: do whatever it wants to you, right, and that includes 430 00:25:25,320 --> 00:25:28,359 Speaker 1: rolling out technologies that you may technically consent to, but 431 00:25:28,400 --> 00:25:32,240 Speaker 1: in reality like very hard to withhold your consent from. 432 00:25:32,280 --> 00:25:34,280 Speaker 2: Exactly. I mean, the only way to withhold your consent in 433 00:25:34,320 --> 00:25:37,280 Speaker 2: some cases is to just not fly. And I think 434 00:25:37,320 --> 00:25:41,560 Speaker 2: also airplanes and airports in general are interesting places where 435 00:25:41,600 --> 00:25:45,280 Speaker 2: technology is rolled out. Because it's very expensive to fly, 436 00:25:45,880 --> 00:25:49,920 Speaker 2: you sort of have this self selecting group of people 437 00:25:49,960 --> 00:25:53,439 Speaker 2: who go through a security process. They're not bringing guns, 438 00:25:53,480 --> 00:25:56,720 Speaker 2: they're not bringing knives. Like it's very safe. It's like 439 00:25:56,760 --> 00:26:00,280 Speaker 2: incredibly safe to be in an airport. And therefore you 440 00:26:00,320 --> 00:26:03,840 Speaker 2: have like a kind of Petri dish of like transient 441 00:26:04,080 --> 00:26:06,120 Speaker 2: people who are only there for a few hours. 442 00:26:06,560 --> 00:26:09,680 Speaker 2: You can test new technologies on them. People are bored, 443 00:26:09,720 --> 00:26:11,680 Speaker 2: and so they might say like, oh, here's like this 444 00:26:11,760 --> 00:26:14,280 Speaker 2: new VR game or this new thing that I 445 00:26:14,320 --> 00:26:16,959 Speaker 2: can try out while I'm waiting for my flight.
So 446 00:26:17,160 --> 00:26:19,080 Speaker 2: like I've seen a lot of like new gaming tech 447 00:26:19,200 --> 00:26:22,959 Speaker 2: at airports and things like that. And I'm not 448 00:26:23,000 --> 00:26:25,479 Speaker 2: saying that it's all surveillance. I'm just saying that, like 449 00:26:25,760 --> 00:26:29,800 Speaker 2: it feels like new technologies are like rolled out there 450 00:26:29,840 --> 00:26:33,080 Speaker 2: before they're rolled out more broadly into society. 451 00:26:33,680 --> 00:26:36,360 Speaker 1: You know, we've been reading these stories about how Newark Airport 452 00:26:36,480 --> 00:26:39,440 Speaker 1: is running on floppy disks and there are these regular 453 00:26:39,920 --> 00:26:43,360 Speaker 1: outages where air traffic control can't even see the planes. 454 00:26:44,000 --> 00:26:45,840 Speaker 1: I think that's why I find this one so fascinating, 455 00:26:45,880 --> 00:26:47,800 Speaker 1: because it's like it's a perfect place where you have 456 00:26:47,880 --> 00:26:51,560 Speaker 1: technology being used as a solution to a place where 457 00:26:51,560 --> 00:26:54,800 Speaker 1: there is no problem, and then huge real problems that 458 00:26:54,840 --> 00:26:57,520 Speaker 1: no one's interested in solutions for, seemingly, yeah. 459 00:26:57,600 --> 00:27:00,320 Speaker 2: Yeah, and the air traffic control problem is one of just 460 00:27:00,359 --> 00:27:04,280 Speaker 2: like standard human labor practices where it's like we're not 461 00:27:04,760 --> 00:27:07,600 Speaker 2: training enough air traffic controllers, we're not paying them enough, 462 00:27:07,640 --> 00:27:12,120 Speaker 2: they're very stressed. It's an incredibly important job where these 463 00:27:12,160 --> 00:27:15,320 Speaker 2: stakes couldn't possibly be higher, and so you have this 464 00:27:15,440 --> 00:27:18,960 Speaker 2: like natural human burnout.
And then you also have this 465 00:27:19,720 --> 00:27:24,800 Speaker 2: situation where like handing over something like that to artificial 466 00:27:24,840 --> 00:27:28,000 Speaker 2: intelligence or to like machine vision, or like you could 467 00:27:28,000 --> 00:27:32,639 Speaker 2: see that being more efficient, but the technology clearly is 468 00:27:32,640 --> 00:27:36,160 Speaker 2: not ready and a mistake is a life and death situation, 469 00:27:36,320 --> 00:27:41,119 Speaker 2: and so you have like maybe some technological solutions that 470 00:27:41,119 --> 00:27:43,840 Speaker 2: could come through there, but you're sort of butting up 471 00:27:43,920 --> 00:27:47,800 Speaker 2: against, as you say, like these really like old fashioned 472 00:27:48,000 --> 00:27:53,240 Speaker 2: problems of funding and treating workers correctly and you know, 473 00:27:53,280 --> 00:27:56,440 Speaker 2: the pipeline of training them and that sort of thing. 474 00:27:56,520 --> 00:27:58,800 Speaker 2: And so, I mean, I think that's a great observation. 475 00:28:00,200 --> 00:28:03,879 Speaker 1: The Mail re-reported your story without crediting you, of course, 476 00:28:03,920 --> 00:28:06,320 Speaker 1: but has this story traveled far and wide and what 477 00:28:06,400 --> 00:28:07,120 Speaker 1: has the response been? 478 00:28:08,200 --> 00:28:11,320 Speaker 2: It's been a lot of people saying this reminds me 479 00:28:11,440 --> 00:28:17,560 Speaker 2: of teledildonics, which is technology that has been created to 480 00:28:17,760 --> 00:28:21,640 Speaker 2: allow people in long distance relationships to have cyber sex 481 00:28:21,760 --> 00:28:25,600 Speaker 2: while actually feeling it. And I mean it relies on 482 00:28:25,640 --> 00:28:28,680 Speaker 2: a lot of these same technologies that we've been talking about.
483 00:28:28,720 --> 00:28:31,720 Speaker 2: And so I mean, we actually do see this time 484 00:28:31,720 --> 00:28:36,439 Speaker 2: and time again where the sex tech and porn industry is 485 00:28:36,560 --> 00:28:40,280 Speaker 2: quite ahead of where society is going. And you know 486 00:28:40,520 --> 00:28:42,680 Speaker 2: that technology has been being worked on for 487 00:28:43,080 --> 00:28:43,800 Speaker 3: years by 488 00:28:46,000 --> 00:28:51,120 Speaker 2: let's say enterprising DIYers, and now you have 489 00:28:51,240 --> 00:28:55,280 Speaker 2: like similar technology being looked at by the Department of Homeland Security, 490 00:28:55,560 --> 00:28:57,960 Speaker 2: and so that's been a lot of the response so far. 491 00:29:01,680 --> 00:29:03,360 Speaker 1: Jason, appreciate you taking the time this week. 492 00:29:03,520 --> 00:29:04,960 Speaker 3: Yeah, thank you so much. This is fine. 493 00:29:18,400 --> 00:29:21,080 Speaker 1: That's it for this week for Tech Stuff. I'm Oz Voloshin. 494 00:29:21,600 --> 00:29:25,120 Speaker 1: This episode was produced by Eliza Dennis and Victoria Domingez. 495 00:29:25,600 --> 00:29:28,360 Speaker 1: It was executive produced by me, Kara Price, and Kate 496 00:29:28,440 --> 00:29:33,200 Speaker 1: Osborne for Kaleidoscope and Katrina Norvel for iHeart Podcasts but 497 00:29:33,320 --> 00:29:36,480 Speaker 1: he phrased as I Engineer. Jack Insley mixed this episode 498 00:29:36,640 --> 00:29:40,160 Speaker 1: and Kyle Murdoch wrote our theme song. Join us next 499 00:29:40,160 --> 00:29:44,480 Speaker 1: Wednesday for an episode all about biometric data, from shopping 500 00:29:44,520 --> 00:29:47,960 Speaker 1: with the palm of your hand to donning multiple wearables. 501 00:29:48,440 --> 00:29:50,800 Speaker 1: How much should we really care about giving away all 502 00:29:50,800 --> 00:29:55,040 Speaker 1: that personal data?
Please rate, review and reach out to 503 00:29:55,120 --> 00:29:57,720 Speaker 1: us at tech Stuff podcast at gmail dot com with 504 00:29:57,800 --> 00:30:00,600 Speaker 1: your feedback, with your story ideas, with whatever you want 505 00:30:00,600 --> 00:30:02,360 Speaker 1: to tell us, because we love hearing from you.