1 00:00:00,200 --> 00:00:02,880 Speaker 1: Thanks for tuning in to Tech Stuff. If you don't recognize 2 00:00:02,880 --> 00:00:05,640 Speaker 1: my voice, my name is Oz Woloshyn, and I'm here 3 00:00:05,680 --> 00:00:08,840 Speaker 1: because the inimitable Jonathan Strickland has passed the baton to 4 00:00:08,920 --> 00:00:12,000 Speaker 1: Karah Preiss and myself to host Tech Stuff. The show 5 00:00:12,039 --> 00:00:14,920 Speaker 1: will remain your home for all things tech, and all 6 00:00:14,960 --> 00:00:18,200 Speaker 1: the old episodes will remain available in this feed. Thanks 7 00:00:18,239 --> 00:00:27,280 Speaker 1: for listening. Welcome to Tech Stuff, a production from iHeart 8 00:00:27,320 --> 00:00:35,680 Speaker 1: Podcasts and Kaleidoscope. I'm Oz Woloshyn, and today co-host Karah Preiss 9 00:00:35,720 --> 00:00:39,360 Speaker 1: and I will bring you three things. First, the headlines 10 00:00:39,400 --> 00:00:44,120 Speaker 1: this week, including new and scary uses of ChatGPT and 11 00:00:44,240 --> 00:00:47,640 Speaker 1: a sobering reminder of the climate impact of data processing. 12 00:00:48,360 --> 00:00:51,440 Speaker 1: On today's Tech Support segment, we have a conversation with 13 00:00:51,520 --> 00:00:54,800 Speaker 1: 404 Media's Joseph Cox about U-Haul criminals, 14 00:00:55,360 --> 00:00:58,320 Speaker 1: and then finally, we're thrilled to introduce a new segment 15 00:00:58,600 --> 00:01:01,280 Speaker 1: which we lovingly call When Did This Become a Thing? 16 00:01:01,920 --> 00:01:04,479 Speaker 1: We go on a wild ride with Karah, which ends 17 00:01:04,480 --> 00:01:07,600 Speaker 1: with a revelation about what it means to be a 18 00:01:07,680 --> 00:01:10,920 Speaker 1: digital native. All of that on this week in Tech. 19 00:01:11,319 --> 00:01:26,960 Speaker 1: It's Friday, January seventeenth. Stay with us. Karah, good afternoon. 20 00:01:27,000 --> 00:01:31,720 Speaker 1: Good afternoon. I 21 00:01:31,720 --> 00:01:33,720 Speaker 1: actually wasn't expecting to see you this afternoon. 22 00:01:34,200 --> 00:01:35,760 Speaker 2: Why? Am I not supposed to be here? 23 00:01:35,880 --> 00:01:37,800 Speaker 1: Well, now you are supposed to be here. I'm glad 24 00:01:37,800 --> 00:01:39,960 Speaker 1: you're here, but I actually thought you were supposed to 25 00:01:40,000 --> 00:01:41,160 Speaker 1: be in LA this week. 26 00:01:41,240 --> 00:01:44,759 Speaker 2: I was, actually. I was supposed to be on the 27 00:01:44,800 --> 00:01:49,040 Speaker 2: other coast for work, and obviously I wasn't able to go. 28 00:01:49,200 --> 00:01:51,920 Speaker 2: And I think I speak for both of us when 29 00:01:51,920 --> 00:01:54,040 Speaker 2: I say that our hearts really go out to the 30 00:01:54,040 --> 00:01:56,920 Speaker 2: people who were impacted by the fires and who continue 31 00:01:56,960 --> 00:01:59,680 Speaker 2: to be impacted by the fires. And while I'm glad 32 00:01:59,680 --> 00:02:02,720 Speaker 2: I'm here, I'm not glad about the circumstances. 33 00:02:02,960 --> 00:02:07,000 Speaker 1: No, obviously, me neither. But actually, what's been going on 34 00:02:07,080 --> 00:02:11,320 Speaker 1: in LA has some interesting tech stories around it, and 35 00:02:11,440 --> 00:02:14,160 Speaker 1: so I actually, if you'll allow me, want to go 36 00:02:14,240 --> 00:02:16,080 Speaker 1: first in this week's news roundup.
37 00:02:16,160 --> 00:02:19,040 Speaker 2: We will forgo the contentiousness and I will let you go. 38 00:02:20,040 --> 00:02:24,120 Speaker 1: So last week you actually sent me a screenshot of 39 00:02:24,200 --> 00:02:25,239 Speaker 1: an Instagram story. 40 00:02:25,520 --> 00:02:28,880 Speaker 2: The only way to get you anything from Instagram. You're 41 00:02:28,919 --> 00:02:30,120 Speaker 2: not on it, which I love about you. 42 00:02:30,240 --> 00:02:32,440 Speaker 1: I'm not. I do have the application, I just... 43 00:02:32,400 --> 00:02:33,600 Speaker 2: He goes on browser, people. 44 00:02:35,480 --> 00:02:39,359 Speaker 1: But basically, this Instagram post that you screenshotted for me 45 00:02:40,040 --> 00:02:43,400 Speaker 1: was a picture of this raging inferno, and overlaid was 46 00:02:43,480 --> 00:02:47,800 Speaker 1: text that reads, quote, the average AI data center uses 47 00:02:48,080 --> 00:02:51,200 Speaker 1: three hundred thousand gallons of water a day to keep cool, 48 00:02:51,639 --> 00:02:55,080 Speaker 1: roughly equivalent to water use in one hundred thousand homes. 49 00:02:55,360 --> 00:03:00,400 Speaker 1: Open brackets, NPR, close brackets. Please skip the silly AIs, 50 00:03:00,680 --> 00:03:03,920 Speaker 1: the questions to ChatGPT, having AI write all of 51 00:03:03,960 --> 00:03:07,320 Speaker 1: your emails. Life needs water, not tech. 52 00:03:07,560 --> 00:03:09,279 Speaker 2: Water is the essence of life. 53 00:03:09,040 --> 00:03:10,600 Speaker 1: To be fair, water is the essence of life. 54 00:03:10,600 --> 00:03:12,800 Speaker 2: And it's also the essence of fighting fires in a 55 00:03:13,120 --> 00:03:16,080 Speaker 2: city where droughts are just commonplace. 56 00:03:16,440 --> 00:03:19,480 Speaker 1: Totally. And I was actually pretty compelled by this post 57 00:03:19,480 --> 00:03:22,959 Speaker 1: that you sent me until I dug a little deeper. 58 00:03:23,600 --> 00:03:26,919 Speaker 1: Say more. This was, of course, misinformation. 59 00:03:27,800 --> 00:03:28,919 Speaker 2: I sent you misinformation. 60 00:03:28,960 --> 00:03:32,119 Speaker 1: You sent me misinformation, but I didn't amplify it. 61 00:03:32,280 --> 00:03:34,679 Speaker 2: This is why I don't listen to news in Instagram stories. 62 00:03:34,960 --> 00:03:41,360 Speaker 1: But my first clue was this open brackets, NPR, close brackets, 63 00:03:41,680 --> 00:03:45,240 Speaker 1: and that gave me an opportunity. Okay, thank you, Watson. 64 00:03:45,280 --> 00:03:48,680 Speaker 1: That gave me an opportunity to do some research and 65 00:03:48,760 --> 00:03:52,080 Speaker 1: figure out what the source of this story was, and 66 00:03:52,120 --> 00:03:54,880 Speaker 1: I actually found the original NPR story. 67 00:03:55,280 --> 00:03:56,880 Speaker 2: What was the NPR story? 68 00:03:56,720 --> 00:03:59,360 Speaker 1: Well, it was published in August twenty twenty two, oh 69 00:03:59,400 --> 00:04:04,360 Speaker 1: my god, and the headline was: Data centers, backbone of 70 00:04:04,400 --> 00:04:08,440 Speaker 1: the digital economy, face water scarcity and climate risk. Now, 71 00:04:08,440 --> 00:04:11,720 Speaker 1: this was published three months before November twenty twenty two, 72 00:04:11,720 --> 00:04:15,320 Speaker 1: which is when ChatGPT entered the world and 73 00:04:15,760 --> 00:04:18,520 Speaker 1: generative AI kind of entered the 74 00:04:18,560 --> 00:04:22,320 Speaker 1: mainstream discourse.
But the NPR story makes no mention of it. 75 00:04:22,560 --> 00:04:27,159 Speaker 1: The actual quote was, quote, a mid-size data center 76 00:04:27,240 --> 00:04:30,400 Speaker 1: consumes around three hundred thousand gallons of water. Again, no 77 00:04:30,520 --> 00:04:32,960 Speaker 1: mention of, quote, the average AI data center from the 78 00:04:32,960 --> 00:04:36,200 Speaker 1: Instagram story. The story was really just about regular old 79 00:04:36,320 --> 00:04:39,279 Speaker 1: data centers that were processing Google searches and doing weather 80 00:04:39,320 --> 00:04:46,040 Speaker 1: forecasts and helping bitcoin traders make and lose fortunes. That wasn't 81 00:04:46,080 --> 00:04:51,040 Speaker 1: the only distortion in the 82 00:04:51,480 --> 00:04:53,800 Speaker 1: non-AI slop that you sent me. It 83 00:04:53,920 --> 00:04:55,400 Speaker 1: is human-generated slop. 84 00:04:55,520 --> 00:04:57,520 Speaker 2: It distresses me to no end that, when we're in, like, 85 00:04:57,720 --> 00:05:00,960 Speaker 2: true crises, how easy it is for people to 86 00:05:01,000 --> 00:05:05,839 Speaker 2: get excited about information that seems compelling and not only 87 00:05:05,920 --> 00:05:09,440 Speaker 2: believe it, but share it because it seems so compelling. 88 00:05:09,640 --> 00:05:12,960 Speaker 1: I actually find the truth of this story as interesting 89 00:05:13,000 --> 00:05:16,400 Speaker 1: as the misinformation. So the Instagram story said that these 90 00:05:16,440 --> 00:05:19,240 Speaker 1: three hundred thousand gallons per day of water could otherwise 91 00:05:19,360 --> 00:05:22,360 Speaker 1: serve one hundred thousand US homes per day. It's just 92 00:05:22,400 --> 00:05:24,640 Speaker 1: made up. No, in the NPR story, it's one thousand. 93 00:05:26,320 --> 00:05:30,280 Speaker 1: So that's quite a big difference. As we know, there's 94 00:05:30,279 --> 00:05:33,560 Speaker 1: been this kind of explosion in demand for data centers 95 00:05:33,600 --> 00:05:37,520 Speaker 1: exactly because of AI. The LA Times actually reported that 96 00:05:37,920 --> 00:05:41,200 Speaker 1: a single query on a chatbot that uses AI is 97 00:05:41,320 --> 00:05:45,240 Speaker 1: estimated to require at least ten times more electricity than 98 00:05:45,240 --> 00:05:48,159 Speaker 1: a standard search on Google. So there is this spiritual 99 00:05:48,240 --> 00:05:51,040 Speaker 1: truth of the story you shared, which is: data centers 100 00:05:51,080 --> 00:05:54,880 Speaker 1: create heat, heat needs water to cool it down, and 101 00:05:54,920 --> 00:05:57,600 Speaker 1: the explosion in generative AI is putting a lot 102 00:05:57,640 --> 00:05:59,680 Speaker 1: more strain on data centers. So there is actually, weirdly, a 103 00:05:59,720 --> 00:06:00,599 Speaker 1: lot of truth in the story. 104 00:06:00,920 --> 00:06:05,320 Speaker 2: But the connection directly between data centers and the fires 105 00:06:05,400 --> 00:06:07,400 Speaker 2: in Los Angeles? 106 00:06:06,880 --> 00:06:10,320 Speaker 1: There is no suggestion there is a direct connection, and 107 00:06:10,360 --> 00:06:13,640 Speaker 1: that's primarily because there aren't that many data centers in 108 00:06:13,640 --> 00:06:15,760 Speaker 1: the Los Angeles area.
There are a lot of data 109 00:06:15,800 --> 00:06:19,200 Speaker 1: centers in northern California, and there is now actually a 110 00:06:19,200 --> 00:06:22,320 Speaker 1: big program to build more data centers in southern California, 111 00:06:22,480 --> 00:06:26,000 Speaker 1: so this could become kind of a bigger problem. But Karah, 112 00:06:26,279 --> 00:06:28,240 Speaker 1: I know I'm really only allowed one story, but this 113 00:06:28,279 --> 00:06:30,520 Speaker 1: one is sort of very much connected, and it's in 114 00:06:30,560 --> 00:06:33,159 Speaker 1: one of my favorite news sources. Can you guess what 115 00:06:33,200 --> 00:06:33,560 Speaker 1: it is? 116 00:06:33,680 --> 00:06:34,599 Speaker 2: The Daily Mail? 117 00:06:35,120 --> 00:06:35,360 Speaker 1: No. 118 00:06:36,839 --> 00:06:39,960 Speaker 2: Really? One of your favorites? The New York Post? 119 00:06:40,000 --> 00:06:43,200 Speaker 1: No, no, no. You're embarrassing yourself. Sorry, this is the 120 00:06:43,240 --> 00:06:44,200 Speaker 1: Sacramento Bee. 121 00:06:44,400 --> 00:06:45,760 Speaker 2: I love the Sacramento Bee. 122 00:06:45,800 --> 00:06:49,920 Speaker 1: The Sacramento Bee is a great paper. On December eighteenth, twenty twenty four, 123 00:06:50,360 --> 00:06:54,080 Speaker 1: The Bee ran a prescient op-ed by Dean Florez, 124 00:06:54,120 --> 00:06:59,039 Speaker 1: a former California State Senator, under the headline: California's next 125 00:06:59,080 --> 00:07:02,719 Speaker 1: water war won't concern agriculture. It will be about AI. 126 00:07:03,400 --> 00:07:06,239 Speaker 2: Mmmm, so it's not the almonds, it's the AI. 127 00:07:06,400 --> 00:07:08,520 Speaker 1: It's the AI and the almonds. And 128 00:07:08,520 --> 00:07:11,520 Speaker 1: Florez continues that the struggle for water, of course, 129 00:07:11,600 --> 00:07:14,560 Speaker 1: is not a new topic in California. It used to 130 00:07:14,600 --> 00:07:19,360 Speaker 1: be farms versus cities, agriculture versus, you know, urban development, but 131 00:07:19,480 --> 00:07:22,440 Speaker 1: now there's this third great demand on the water supply 132 00:07:22,520 --> 00:07:25,960 Speaker 1: of a drought-prone state. There are actually more than 133 00:07:25,960 --> 00:07:29,600 Speaker 1: two hundred and seventy data centers already in California, and 134 00:07:29,680 --> 00:07:32,600 Speaker 1: in California, water consumption from those data centers is in 135 00:07:32,600 --> 00:07:36,520 Speaker 1: the billions of gallons each year. 136 00:07:36,480 --> 00:07:39,040 Speaker 2: And Florez points this out in his piece. 137 00:07:38,880 --> 00:07:42,560 Speaker 1: Exactly, and says that data centers could double or triple 138 00:07:42,600 --> 00:07:45,680 Speaker 1: their water demand in the next decade in California. And 139 00:07:45,720 --> 00:07:48,840 Speaker 1: I just think what Florez ended his piece saying was: 140 00:07:49,480 --> 00:07:52,360 Speaker 1: this story hasn't yet been written. AI infrastructure is being 141 00:07:52,360 --> 00:07:54,720 Speaker 1: built in real time, so there are opportunities to kind 142 00:07:54,760 --> 00:07:57,040 Speaker 1: of do better when it comes to figuring out how 143 00:07:57,080 --> 00:07:58,960 Speaker 1: to meet this demand.
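A quick back-of-the-envelope check of the numbers in this segment. This is a minimal sketch under stated assumptions: the household figure uses the EPA's oft-cited estimate of roughly 300 gallons of water per day for an average American family, and the per-query energy figures are commonly cited third-party estimates; none of these numbers come from the NPR or LA Times pieces themselves.

```python
# Sanity check: does 300,000 gallons/day match the water use of
# 100,000 homes (the Instagram meme) or ~1,000 homes (the NPR figure)?
# Assumption: ~300 gallons/day for an average US household (EPA estimate).
data_center_gallons_per_day = 300_000
household_gallons_per_day = 300

homes_equivalent = data_center_gallons_per_day / household_gallons_per_day
print(f"{homes_equivalent:,.0f} homes")  # -> 1,000 homes, matching NPR

# The "at least ten times more electricity" comparison, using commonly
# cited estimates: ~0.3 Wh per Google search vs. ~2.9 Wh per chatbot query.
google_search_wh = 0.3
chatbot_query_wh = 2.9
print(f"{chatbot_query_wh / google_search_wh:.0f}x")  # -> roughly 10x
```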
144 00:07:59,240 --> 00:08:01,680 Speaker 2: Yeah. So I do agree with you, Oz, and I 145 00:08:01,680 --> 00:08:04,480 Speaker 2: think it's important on this show, when we cover things 146 00:08:04,520 --> 00:08:09,120 Speaker 2: that can be a little bit paranoia-inducing or anxiety-provoking, 147 00:08:09,400 --> 00:08:13,360 Speaker 2: for us to talk about technology that really is solving 148 00:08:13,400 --> 00:08:16,360 Speaker 2: a human problem. I have a ton of friends in LA. 149 00:08:16,440 --> 00:08:21,119 Speaker 2: My sister lives in LA, and they are all using 150 00:08:21,120 --> 00:08:24,280 Speaker 2: this technology called Watch Duty. Do you know Watch 151 00:08:24,320 --> 00:08:28,240 Speaker 2: Duty? No? So, Watch Duty is an app that is 152 00:08:28,600 --> 00:08:31,440 Speaker 2: so beta-looking that it kind of looks like the 153 00:08:31,440 --> 00:08:35,600 Speaker 2: Oregon Trail. And unfortunately it does look like the Oregon 154 00:08:35,640 --> 00:08:39,160 Speaker 2: Trail as of last week, because it's just like fire, fire, fire, fire, horrible, 155 00:08:39,200 --> 00:08:41,600 Speaker 2: don't be there, don't do this. But what Watch Duty 156 00:08:41,720 --> 00:08:45,720 Speaker 2: is in practice is this free app that shows active fires, 157 00:08:45,800 --> 00:08:49,920 Speaker 2: evacuation zones, and air quality indexes in the LA area. 158 00:08:50,000 --> 00:08:53,920 Speaker 2: And it's actually, you know, anecdotally been a lifeline for 159 00:08:54,000 --> 00:08:57,680 Speaker 2: people living there. And it's just this really great act 160 00:08:57,720 --> 00:09:00,920 Speaker 2: of humanity, in that it's not selling people's data, it's 161 00:09:00,960 --> 00:09:03,520 Speaker 2: not showing you ads. You don't have to log in, 162 00:09:03,920 --> 00:09:07,000 Speaker 2: and it is completely donation-based. It's a nonprofit. Well, 163 00:09:07,240 --> 00:09:10,080 Speaker 2: you know, I wanted to highlight this because I think 164 00:09:10,160 --> 00:09:14,760 Speaker 2: often about disaster capitalism, and this, to me, is more 165 00:09:15,320 --> 00:09:16,839 Speaker 2: disaster troubleshooting. 166 00:09:17,120 --> 00:09:20,120 Speaker 1: Yeah. One of the fundamental interesting things about tech is 167 00:09:20,160 --> 00:09:24,480 Speaker 1: it often does emerge in response to real-world problems 168 00:09:24,520 --> 00:09:26,720 Speaker 1: and can help solve them, or at least mitigate them. 169 00:09:26,880 --> 00:09:29,520 Speaker 1: It's just what happens the day after, which is often 170 00:09:29,559 --> 00:09:32,920 Speaker 1: where things go wrong. So, Karah, you've already over-delivered 171 00:09:32,960 --> 00:09:34,920 Speaker 1: by telling me something interesting about the story. 172 00:09:35,160 --> 00:09:35,720 Speaker 2: Nothing new. 173 00:09:36,240 --> 00:09:39,880 Speaker 1: I didn't know about Watch Duty. But what have you 174 00:09:39,920 --> 00:09:40,440 Speaker 1: got for me? 175 00:09:40,920 --> 00:09:43,959 Speaker 2: Well, the last decent man on the internet, Reddit, 176 00:09:44,720 --> 00:09:51,520 Speaker 2: our old friend, served me with something so demented. Just
177 00:09:51,520 --> 00:09:53,160 Speaker 2: bear with me, because this is... there's a lot 178 00:09:53,160 --> 00:09:56,560 Speaker 2: of, like, R2-D2 in this. But there's 179 00:09:56,559 --> 00:09:58,080 Speaker 2: a video that I'm going to show you in a 180 00:09:58,120 --> 00:10:03,080 Speaker 2: second that shows this guy who goes by the online 181 00:10:03,160 --> 00:10:09,160 Speaker 2: name STS 3D. He's created a robot rifle. What? 182 00:10:09,600 --> 00:10:15,440 Speaker 2: That can shoot automatically, and he's giving this robot rifle 183 00:10:15,559 --> 00:10:17,959 Speaker 2: directions using ChatGPT. 184 00:10:18,080 --> 00:10:18,760 Speaker 1: Are you serious? 185 00:10:18,960 --> 00:10:20,560 Speaker 2: I am. Did you, did you know this? 186 00:10:20,800 --> 00:10:21,679 Speaker 1: I didn't know this. 187 00:10:21,360 --> 00:10:24,920 Speaker 2: This is... okay, so let me, let me just show 188 00:10:24,920 --> 00:10:25,480 Speaker 2: you this video. 189 00:10:27,320 --> 00:10:29,079 Speaker 1: Oh my god. 190 00:10:29,160 --> 00:10:31,800 Speaker 2: ChatGPT, we're under attack from the front left and 191 00:10:31,920 --> 00:10:32,400 Speaker 2: front right. 192 00:10:32,559 --> 00:10:33,559 Speaker 1: Respond accordingly. 193 00:10:35,920 --> 00:10:37,000 Speaker 3: Oh my god. 194 00:10:39,880 --> 00:10:42,880 Speaker 2: If you need any further assistance, just let me know. 195 00:10:43,480 --> 00:10:45,400 Speaker 2: By the way, that's ChatGPT talking back to him. 196 00:10:46,960 --> 00:10:57,760 Speaker 1: One round. There is a rifle following this guy's voice commands. 197 00:10:54,800 --> 00:10:56,640 Speaker 2: Like an automated assault rifle. Jeez. 198 00:10:57,679 --> 00:10:59,040 Speaker 1: Wow. What did you think when you saw this? 199 00:11:00,559 --> 00:11:02,560 Speaker 2: Well, I thought to myself, am I going to do 200 00:11:02,600 --> 00:11:05,840 Speaker 2: another story this week where ChatGPT is being used 201 00:11:06,080 --> 00:11:09,640 Speaker 2: by individuals to create weapons? 202 00:11:10,040 --> 00:11:10,280 Speaker 3: Yes? 203 00:11:10,559 --> 00:11:11,120 Speaker 1: Yeah. You know. 204 00:11:11,120 --> 00:11:14,080 Speaker 2: When I saw it, I couldn't really wrap my head 205 00:11:14,120 --> 00:11:18,800 Speaker 2: around how it was actually working and how ChatGPT's 206 00:11:19,400 --> 00:11:23,640 Speaker 2: API was being used to operate it. 207 00:11:24,040 --> 00:11:26,480 Speaker 1: So I have this friend... You have another friend you 208 00:11:26,520 --> 00:11:29,160 Speaker 1: talk about tech stuff with? I knew it. Oh jeez. And 209 00:11:29,240 --> 00:11:29,720 Speaker 1: he's straight. 210 00:11:31,880 --> 00:11:35,840 Speaker 2: So I sent this video, and he basically said that 211 00:11:36,280 --> 00:11:39,080 Speaker 2: STS 3D, the guy in the video, is using 212 00:11:39,360 --> 00:11:43,720 Speaker 2: voice-to-text and is basically feeding that to ChatGPT, 213 00:11:44,280 --> 00:11:48,160 Speaker 2: and then he's running a separate program that can control 214 00:11:48,240 --> 00:11:52,120 Speaker 2: the robot gun's movements, like, you know, how many degrees 215 00:11:52,160 --> 00:11:55,120 Speaker 2: to move the gun, when to shoot it. And that 216 00:11:55,280 --> 00:11:59,959 Speaker 2: program that he has designed is communicating with ChatGPT.
217 00:12:00,280 --> 00:12:04,120 Speaker 1: He's speaking to ChatGPT, which is then interpreting his 218 00:12:04,240 --> 00:12:08,400 Speaker 1: requests and passing them on to the robot, but presumably 219 00:12:08,440 --> 00:12:12,080 Speaker 1: not in natural language, presumably translating natural language into a 220 00:12:12,080 --> 00:12:12,880 Speaker 1: computer command. 221 00:12:13,240 --> 00:12:16,000 Speaker 2: That's correct. Okay, that's correct. And that's why he's talking 222 00:12:16,080 --> 00:12:19,040 Speaker 2: to something that sounds like Siri. The other thing that 223 00:12:19,080 --> 00:12:22,160 Speaker 2: I want to mention, of course, is that OpenAI 224 00:12:22,280 --> 00:12:24,600 Speaker 2: got wind of this because the video went viral, right, 225 00:12:25,200 --> 00:12:27,520 Speaker 2: and they came out and said that what this user 226 00:12:27,559 --> 00:12:31,040 Speaker 2: is doing goes against company policy. There's a very interesting 227 00:12:31,080 --> 00:12:35,439 Speaker 2: loophole that I found accidentally through reading about this on Futurism, 228 00:12:35,960 --> 00:12:38,959 Speaker 2: which was actually reported by The Intercept last year, which 229 00:12:39,080 --> 00:12:43,920 Speaker 2: is that OpenAI quietly removed language prohibiting the use 230 00:12:43,920 --> 00:12:48,880 Speaker 2: of its tool for military purposes. And of course this 231 00:12:49,080 --> 00:12:52,280 Speaker 2: makes sense, because just last month it was reported that 232 00:12:52,320 --> 00:12:55,720 Speaker 2: OpenAI is launching a partnership with Anduril, the American 233 00:12:55,720 --> 00:12:58,800 Speaker 2: defense company. Yeah, you know, for OpenAI to stand 234 00:12:58,880 --> 00:13:02,000 Speaker 2: up and say, okay, yes, an individual user using 235 00:13:02,040 --> 00:13:04,960 Speaker 2: OpenAI in this way goes against company policy, but 236 00:13:05,200 --> 00:13:09,240 Speaker 2: quietly, in the background, take out language opposing the use 237 00:13:09,280 --> 00:13:10,560 Speaker 2: of it for military defense. 238 00:13:11,840 --> 00:13:14,120 Speaker 1: I mean, it is that, isn't it? It's like you always 239 00:13:14,120 --> 00:13:16,200 Speaker 1: talk about how things get faster and faster. It took 240 00:13:16,200 --> 00:13:20,360 Speaker 1: Google like fifteen years, ten years, to abandon their 241 00:13:20,400 --> 00:13:24,080 Speaker 1: "don't be evil" policy. Yeah, it has taken OpenAI 242 00:13:24,240 --> 00:13:26,880 Speaker 1: five minutes to make an exception for military use of 243 00:13:26,880 --> 00:13:27,480 Speaker 1: that technology. 244 00:13:27,559 --> 00:13:29,600 Speaker 2: Well, they pivoted from a 501(c) to 245 00:13:29,679 --> 00:13:30,880 Speaker 2: a private company quickly. 246 00:13:30,960 --> 00:13:31,360 Speaker 1: There you go. 247 00:13:34,080 --> 00:13:37,960 Speaker 2: Coming up, we'll discuss another reason to hate moving. Stay 248 00:13:37,960 --> 00:13:50,840 Speaker 2: with us.
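The architecture described in this segment, speech transcribed to text, a language model mapping that text onto a structured command, and a separate control program executing it, is essentially LLM function calling. Below is a minimal, hypothetical sketch of that general pattern using OpenAI's chat completions API, applied to a deliberately harmless invented device (a toy wheeled robot); this is not STS 3D's actual code, and the tool schema, model name, and example command are all assumptions for illustration.

```python
# Hypothetical sketch: translating natural language into a structured
# command via function calling. The device and schema are invented.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

TOOLS = [{
    "type": "function",
    "function": {
        "name": "drive",
        "description": "Move a toy wheeled robot.",
        "parameters": {
            "type": "object",
            "properties": {
                "heading_degrees": {
                    "type": "number",
                    "description": "Relative heading; negative is left.",
                },
                "distance_cm": {"type": "number"},
            },
            "required": ["heading_degrees", "distance_cm"],
        },
    },
}]

def interpret(transcribed_speech: str) -> dict:
    """Ask the model to turn a spoken-style request into a structured command."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": transcribed_speech}],
        tools=TOOLS,
    )
    call = resp.choices[0].message.tool_calls[0]  # assumes the model used the tool
    return json.loads(call.function.arguments)

# A separate control loop (the part the hobbyist wrote himself) would then
# act on the parsed command, e.g. {"heading_degrees": -45, "distance_cm": 50}.
print(interpret("Robot, go about half a meter to your front left."))
```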
So today, for our Tech Support segment, we 249 00:13:50,920 --> 00:13:54,920 Speaker 2: have a story about how cybercrime is meeting good old 250 00:13:54,960 --> 00:14:00,440 Speaker 2: fashioned IRL crime, and somehow it all comes together thanks 251 00:14:00,520 --> 00:14:05,200 Speaker 2: to, well, as a lesbian, it's an important company, U-Haul, 252 00:14:05,880 --> 00:14:07,680 Speaker 2: and we're going to get this story from Joseph Cox, 253 00:14:07,720 --> 00:14:09,600 Speaker 2: who's one of the brilliant tech writers at 254 00:14:09,600 --> 00:14:10,160 Speaker 2: 404 Media. 255 00:14:10,760 --> 00:14:12,560 Speaker 1: Joseph, welcome. It's great to have you on the show. 256 00:14:13,000 --> 00:14:14,200 Speaker 3: Thank you so much for having me. 257 00:14:14,880 --> 00:14:17,400 Speaker 1: So you wrote this story in 404 with 258 00:14:17,520 --> 00:14:21,200 Speaker 1: the headline, quote, violent hackers are using U-Haul to 259 00:14:21,320 --> 00:14:22,160 Speaker 1: dox targets. 260 00:14:22,880 --> 00:14:26,920 Speaker 2: So when I read this story, or saw this headline originally, 261 00:14:26,960 --> 00:14:28,640 Speaker 2: I'm sort of like, well, what does U-Haul have 262 00:14:28,760 --> 00:14:30,640 Speaker 2: to do with doxing and hackers? 263 00:14:31,040 --> 00:14:33,800 Speaker 3: Yeah, and it's not obvious at first, and I don't 264 00:14:33,800 --> 00:14:36,520 Speaker 3: think anybody really thinks of U-Haul as being a 265 00:14:36,600 --> 00:14:39,440 Speaker 3: juicy hacking target. But when you take a step back 266 00:14:39,480 --> 00:14:42,320 Speaker 3: and you think about it, people who use U-Haul, 267 00:14:42,480 --> 00:14:44,920 Speaker 3: they're probably going to give their real phone number, because 268 00:14:44,920 --> 00:14:46,720 Speaker 3: obviously they need to pick up the truck and they 269 00:14:46,720 --> 00:14:49,840 Speaker 3: need to organize that. They might give their real address 270 00:14:49,920 --> 00:14:52,800 Speaker 3: as well, because, you know, U-Haul needs to verify that, 271 00:14:52,920 --> 00:14:55,800 Speaker 3: and maybe their real payment information, because obviously they need 272 00:14:55,800 --> 00:14:58,080 Speaker 3: to pay for the truck as well. So it's actually 273 00:14:58,120 --> 00:15:02,080 Speaker 3: a really good source for accurate information if a hacker 274 00:15:02,120 --> 00:15:05,760 Speaker 3: is trying to reveal somebody's identity. And U-Haul has 275 00:15:05,760 --> 00:15:09,240 Speaker 3: been hacked multiple times over the years. You know, it happened 276 00:15:09,240 --> 00:15:11,880 Speaker 3: in twenty twenty and I think twenty thirteen as well, 277 00:15:12,080 --> 00:15:14,400 Speaker 3: and in at least one of those cases it did 278 00:15:14,440 --> 00:15:18,840 Speaker 3: include driver's license information. But at the time we didn't 279 00:15:18,880 --> 00:15:22,560 Speaker 3: really have any context of, oh, well, why do hackers 280 00:15:22,600 --> 00:15:25,200 Speaker 3: care about U-Haul? And I think this story shows 281 00:15:25,240 --> 00:15:27,240 Speaker 3: why some people may care about that sort of target. 282 00:15:27,960 --> 00:15:30,120 Speaker 1: So who's doing this, and what do we know about them?
283 00:15:31,000 --> 00:15:36,560 Speaker 3: So it's predominantly a group, or a nebulous network of 284 00:15:36,640 --> 00:15:41,080 Speaker 3: people, known as the Com, which is short for community, and 285 00:15:41,160 --> 00:15:44,160 Speaker 3: this is made up of a lot of young hackers 286 00:15:44,200 --> 00:15:47,760 Speaker 3: who take over phone numbers, they break into companies, but 287 00:15:48,160 --> 00:15:49,760 Speaker 3: the main thing is that they try to get as 288 00:15:49,840 --> 00:15:53,280 Speaker 3: much data as they can on potential targets. 289 00:15:53,480 --> 00:15:57,600 Speaker 1: As a practical matter, Joseph, how are the hackers getting 290 00:15:57,600 --> 00:15:58,880 Speaker 1: this information from U-Haul? 291 00:15:59,280 --> 00:16:03,680 Speaker 3: So they're not targeting individual U-Haul customer accounts. 292 00:16:03,720 --> 00:16:07,680 Speaker 3: What they're doing is they're making phishing pages which will 293 00:16:07,680 --> 00:16:11,520 Speaker 3: trick people into handing over passwords, but specifically for 294 00:16:11,640 --> 00:16:13,640 Speaker 3: U-Haul employee accounts. 295 00:16:13,760 --> 00:16:14,920 Speaker 1: This is like the Sony Pictures hack. 296 00:16:14,960 --> 00:16:18,400 Speaker 3: Yeah. So when the hackers have the login credentials 297 00:16:18,400 --> 00:16:21,400 Speaker 3: for a U-Haul employee, you know, they essentially become 298 00:16:22,080 --> 00:16:24,640 Speaker 3: that employee, with all of the sort of access to 299 00:16:24,720 --> 00:16:29,040 Speaker 3: information they might have, so they can rifle through customer data. 300 00:16:29,440 --> 00:16:33,840 Speaker 3: And this is a constant problem across companies; you 301 00:16:33,920 --> 00:16:37,040 Speaker 3: have it especially with US and UK telecoms as well. 302 00:16:37,320 --> 00:16:41,120 Speaker 3: These hackers will break into employee accounts so then they 303 00:16:41,120 --> 00:16:42,520 Speaker 3: can look up customer information. 304 00:16:43,280 --> 00:16:45,520 Speaker 1: What do they tend to be going after, and who 305 00:16:45,560 --> 00:16:46,640 Speaker 1: is the typical victim? 306 00:16:47,040 --> 00:16:51,040 Speaker 3: In writing this story, I spoke to somebody who writes 307 00:16:51,120 --> 00:16:55,480 Speaker 3: the tools to steal U-Haul logins, and they told 308 00:16:55,560 --> 00:17:01,080 Speaker 3: me that basically people inside the Com use it to dox 309 00:17:01,120 --> 00:17:05,919 Speaker 3: one another or dox other targets, and they were, you know, 310 00:17:06,200 --> 00:17:08,679 Speaker 3: it sounded like, quite surprised at the amount of data you 311 00:17:08,720 --> 00:17:09,440 Speaker 3: can get in there. 312 00:17:10,040 --> 00:17:13,240 Speaker 1: Doxing is when you sort of get somebody's private details 313 00:17:13,320 --> 00:17:14,600 Speaker 1: and then publicize them. 314 00:17:14,680 --> 00:17:14,840 Speaker 2: Yeah. 315 00:17:14,920 --> 00:17:19,000 Speaker 3: Doxing is where you will take and find somebody's personal information, 316 00:17:20,040 --> 00:17:23,560 Speaker 3: maybe their physical address, email address, phone number, and then 317 00:17:23,600 --> 00:17:27,040 Speaker 3: maybe some more sensitive data as well, and either using 318 00:17:27,080 --> 00:17:30,720 Speaker 3: that for your own means or publishing it online.
Now, 319 00:17:30,960 --> 00:17:33,879 Speaker 3: I don't know exactly what a specific member of the 320 00:17:33,920 --> 00:17:37,120 Speaker 3: Com has used this data for, but you can easily 321 00:17:37,160 --> 00:17:41,159 Speaker 3: see it potentially being used for physical violence. Members of 322 00:17:41,200 --> 00:17:44,800 Speaker 3: the Com often shoot at each other's houses, or hire 323 00:17:44,800 --> 00:17:47,800 Speaker 3: people to do so. They throw bricks through them, they 324 00:17:48,000 --> 00:17:52,800 Speaker 3: rob one another armed with hammers or other weapons. And 325 00:17:53,000 --> 00:17:56,919 Speaker 3: I absolutely put the harvesting of this U-Haul data 326 00:17:57,400 --> 00:18:00,760 Speaker 3: inside that context, in that it is being used by 327 00:18:01,240 --> 00:18:03,040 Speaker 3: violent criminal hackers. 328 00:18:03,640 --> 00:18:06,280 Speaker 2: So this is really about being an admin. It's not 329 00:18:06,359 --> 00:18:08,040 Speaker 2: about getting individual information. 330 00:18:09,000 --> 00:18:09,760 Speaker 3: Yeah, exactly. 331 00:18:09,920 --> 00:18:11,640 Speaker 1: I kind of feel like a lot of people, when 332 00:18:11,680 --> 00:18:15,160 Speaker 1: it comes to Internet privacy, unless they've been, like, scammed 333 00:18:15,160 --> 00:18:17,919 Speaker 1: in some way themselves, you know, their eyes can glaze 334 00:18:17,960 --> 00:18:21,119 Speaker 1: over a little bit. But in your story, people actually 335 00:18:21,119 --> 00:18:22,600 Speaker 1: got shot after being doxed. 336 00:18:23,520 --> 00:18:27,119 Speaker 3: Yeah. I think people have a pretty old understanding of 337 00:18:27,119 --> 00:18:30,199 Speaker 3: what cybercrime is. They imagine, you know, somebody in a 338 00:18:30,240 --> 00:18:33,720 Speaker 3: hoodie over a laptop in a dark basement or whatever, doing 339 00:18:33,760 --> 00:18:39,200 Speaker 3: these sort of abstract, obscure crimes. Cybercrime today is done by 340 00:18:39,400 --> 00:18:41,840 Speaker 3: people who are a lot younger, and they're a lot 341 00:18:41,880 --> 00:18:45,040 Speaker 3: more violent in some cases. If they want to get 342 00:18:45,040 --> 00:18:48,080 Speaker 3: hold of, maybe, login tokens, you know, the 2FA 343 00:18:48,200 --> 00:18:51,040 Speaker 3: tokens to get into an account, some will even threaten 344 00:18:51,440 --> 00:18:53,960 Speaker 3: to go shoot up somebody's house. And that's 345 00:18:54,000 --> 00:18:57,000 Speaker 3: why I'm so fascinated by this community, because it's a 346 00:18:57,000 --> 00:19:02,000 Speaker 3: complete intersection of digital and physical crime. You know, as 347 00:19:02,000 --> 00:19:05,640 Speaker 3: a crime reporter for more than ten years, I've never 348 00:19:05,760 --> 00:19:09,720 Speaker 3: quite seen as blatant an intersection as these guys. 349 00:19:10,000 --> 00:19:12,000 Speaker 1: This is the metaverse, but not the one that all 350 00:19:12,080 --> 00:19:14,080 Speaker 1: the investors were hyping. 351 00:19:14,720 --> 00:19:17,760 Speaker 3: Yeah, it's the metaverse, but not the very cringe one 352 00:19:17,800 --> 00:19:20,600 Speaker 3: from Mark Zuckerberg. It's, unfortunately, the very real one.
And 353 00:19:21,000 --> 00:19:23,120 Speaker 3: it's funny you bring up the metaverse, because actually a lot 354 00:19:23,160 --> 00:19:26,040 Speaker 3: of the people that get into this community, they actually 355 00:19:26,560 --> 00:19:30,879 Speaker 3: fall into it through video games such as Minecraft or Roblox, 356 00:19:31,000 --> 00:19:34,040 Speaker 3: which arguably are the metaverse, and then they end up 357 00:19:34,080 --> 00:19:35,080 Speaker 3: in this world of cybercrime. 358 00:19:35,440 --> 00:19:38,359 Speaker 2: And do you think there's a part of playing those 359 00:19:38,400 --> 00:19:41,600 Speaker 2: games that kind of, like, blurs the line between, like, 360 00:19:41,640 --> 00:19:44,080 Speaker 2: the reality of what they're doing and the games that 361 00:19:44,080 --> 00:19:45,040 Speaker 2: they're used to playing? 362 00:19:45,680 --> 00:19:49,359 Speaker 3: Yeah, absolutely. And when I've spoken to many different members 363 00:19:49,359 --> 00:19:52,640 Speaker 3: of the Com, Roblox and Minecraft keep coming up over 364 00:19:52,680 --> 00:19:55,760 Speaker 3: and over again, where even hackers will break into each 365 00:19:55,760 --> 00:19:59,200 Speaker 3: other's Roblox accounts to steal, you know, their rare hat 366 00:19:59,320 --> 00:20:02,240 Speaker 3: or their rare eye or something, and eventually that escalates 367 00:20:02,280 --> 00:20:05,360 Speaker 3: into, well, they'll steal hundreds of thousands of dollars of cryptocurrency 368 00:20:05,400 --> 00:20:05,760 Speaker 3: as well. 369 00:20:06,080 --> 00:20:08,840 Speaker 1: Is there a single story or a single real-world 370 00:20:09,000 --> 00:20:12,840 Speaker 1: crime that, for you, really throws this into relief for 371 00:20:12,880 --> 00:20:15,480 Speaker 1: the kind of non-tech-news junkie? 372 00:20:16,119 --> 00:20:20,080 Speaker 3: So recently I wrote about something called the Com World War, 373 00:20:20,320 --> 00:20:23,040 Speaker 3: which is obviously not an entirely serious term. This is 374 00:20:23,080 --> 00:20:26,000 Speaker 3: what members of the Com called it themselves. But there 375 00:20:26,000 --> 00:20:30,600 Speaker 3: were a series of robberies, shootings, and brickings between different 376 00:20:30,680 --> 00:20:32,880 Speaker 3: rivals inside the Com. 377 00:20:33,240 --> 00:20:37,080 Speaker 1: Sort of the West Side Story of... Exactly. 378 00:20:37,520 --> 00:20:39,840 Speaker 3: And I would say it's sort of bubbled over out 379 00:20:39,880 --> 00:20:42,800 Speaker 3: of the Com and into the wider public, because I've seen 380 00:20:42,880 --> 00:20:46,359 Speaker 3: videos where the Com will hire people to go throw a 381 00:20:46,359 --> 00:20:48,679 Speaker 3: brick at somebody's house, but they throw it at the 382 00:20:48,720 --> 00:20:51,600 Speaker 3: wrong home, and then it's just impacted a completely random 383 00:20:51,640 --> 00:20:53,840 Speaker 3: person, and now they have a smashed door or smashed 384 00:20:53,920 --> 00:20:56,960 Speaker 3: Ring doorbell camera or whatever. So it is starting to 385 00:20:57,119 --> 00:21:00,760 Speaker 3: bleed over into the world of just ordinary people, who 386 00:21:01,080 --> 00:21:04,000 Speaker 3: have no idea why somebody is throwing a brick through 387 00:21:04,000 --> 00:21:06,120 Speaker 3: the window because something happened on Roblox. 388 00:21:06,160 --> 00:21:06,320 Speaker 1: You know.
389 00:21:07,280 --> 00:21:10,840 Speaker 3: Fortunately, if you are a general member of the 390 00:21:10,920 --> 00:21:14,480 Speaker 3: U-Haul-using public, you probably don't need to worry about 391 00:21:14,480 --> 00:21:18,040 Speaker 3: this. Maybe somebody you know or somebody nearby 392 00:21:18,119 --> 00:21:21,560 Speaker 3: unfortunately becomes a target of the Com; you know, that's kind 393 00:21:21,600 --> 00:21:23,280 Speaker 3: of impossible to say, and it would just be very 394 00:21:23,320 --> 00:21:26,600 Speaker 3: bad luck. But they do predominantly go and try to 395 00:21:26,680 --> 00:21:30,120 Speaker 3: rob people or target people who hold a lot of cryptocurrency. 396 00:21:30,200 --> 00:21:33,480 Speaker 3: They have various tools which can determine, oh, this person 397 00:21:33,520 --> 00:21:36,280 Speaker 3: has X amount of bitcoin in their Coinbase account, 398 00:21:36,520 --> 00:21:39,679 Speaker 3: let's go after them. So, I mean, the threat to 399 00:21:39,720 --> 00:21:43,080 Speaker 3: a bitcoin investor is very, very different to a sort 400 00:21:43,119 --> 00:21:45,199 Speaker 3: of ordinary member of the public, and you should be 401 00:21:45,200 --> 00:21:45,720 Speaker 3: worried then. 402 00:21:46,080 --> 00:21:52,280 Speaker 2: So if you're a lesbian bitcoin investor... 403 00:21:50,840 --> 00:21:52,760 Speaker 1: That is the center of the Venn diagram. The Com is probably 404 00:21:52,800 --> 00:21:58,840 Speaker 1: already coming for you; the bricks are already on their way. Lesbians, luckily, 405 00:21:58,960 --> 00:22:02,359 Speaker 1: don't drive. Joseph, thank you for joining us today, and 406 00:22:02,400 --> 00:22:04,480 Speaker 1: we hope to see you on the program again soon. 407 00:22:04,840 --> 00:22:06,800 Speaker 3: Thanks, Joseph. Thank you so much. 408 00:22:09,160 --> 00:22:19,840 Speaker 1: Coming up, Karah's Big Adventure. Stay with us. I'm excited 409 00:22:19,880 --> 00:22:23,520 Speaker 1: to be launching this new segment that we're calling When 410 00:22:23,560 --> 00:22:25,639 Speaker 1: Did This Become a Thing? We've got a few listener 411 00:22:25,680 --> 00:22:29,560 Speaker 1: emails to our account, tech Stuff podcast at gmail dot 412 00:22:29,600 --> 00:22:33,040 Speaker 1: com, and they were very touching and very encouraging, but 413 00:22:33,119 --> 00:22:38,000 Speaker 1: also directive. And one of the listener emails basically said: Jonathan, 414 00:22:38,160 --> 00:22:42,120 Speaker 1: over the course of sixteen years, had so many interesting 415 00:22:42,160 --> 00:22:45,919 Speaker 1: segments about the history of various technologies, and, you know, 416 00:22:45,960 --> 00:22:48,760 Speaker 1: we're hungry for more of that. And so When Did 417 00:22:48,760 --> 00:22:51,040 Speaker 1: This Become a Thing? is kind of in service of 418 00:22:51,080 --> 00:22:52,320 Speaker 1: trying to scratch that itch. 419 00:22:53,040 --> 00:22:56,200 Speaker 2: Yeah. Each week we're going to bring you a story 420 00:22:56,359 --> 00:22:59,240 Speaker 2: or observation from the real world and try to figure 421 00:22:59,240 --> 00:23:01,160 Speaker 2: out when it became a thing. 422 00:23:01,359 --> 00:23:04,240 Speaker 1: And this has a little bit of our favorite pair 423 00:23:04,320 --> 00:23:08,800 Speaker 1: of historical platonic freelance detectives. 424 00:23:08,240 --> 00:23:12,720 Speaker 2: That's correct. I get to be the Watson to your Sherlock.
425 00:23:12,840 --> 00:23:16,320 Speaker 1: And this week you have something that you encountered in 426 00:23:16,359 --> 00:23:20,960 Speaker 1: the wild that made you question when something became a thing. 427 00:23:21,280 --> 00:23:23,439 Speaker 2: Yeah, it was more of a rescue mission. So I 428 00:23:23,480 --> 00:23:27,760 Speaker 2: recently had this experience where I came face to face 429 00:23:27,960 --> 00:23:34,000 Speaker 2: with one of the twentieth century's most prolific writers and 430 00:23:34,040 --> 00:23:38,520 Speaker 2: feminist icons. And it wasn't the best-case scenario. She 431 00:23:38,840 --> 00:23:43,600 Speaker 2: had fallen on the street and was a little bit disoriented, 432 00:23:44,280 --> 00:23:48,560 Speaker 2: and my first thought was, we will not lose a 433 00:23:48,560 --> 00:23:52,720 Speaker 2: feminist icon on my watch. My second thought was, we 434 00:23:52,760 --> 00:23:55,040 Speaker 2: won't lose her, because I'm going to call an ambulance 435 00:23:55,080 --> 00:23:58,879 Speaker 2: on my cell phone. That took too long, and so we 436 00:23:58,960 --> 00:24:01,480 Speaker 2: headed to an urgent care that Margo, we can call her 437 00:24:01,560 --> 00:24:04,479 Speaker 2: Margo, knew about, and she was adamant that she'd been 438 00:24:04,480 --> 00:24:06,680 Speaker 2: there many times before. So I Google Mapped the address. 439 00:24:07,040 --> 00:24:12,080 Speaker 2: Google Maps says it's closed. Closed, like, permanently. She's like, 440 00:24:12,119 --> 00:24:14,240 Speaker 2: it's not closed, I was there recently. It's closed. 441 00:24:14,600 --> 00:24:16,280 Speaker 1: Are you gaining her trust by this point? Is she 442 00:24:16,400 --> 00:24:17,280 Speaker 1: starting to? No. 443 00:24:18,119 --> 00:24:20,760 Speaker 2: And then I'm like, all right, I'm gonna call an 444 00:24:20,840 --> 00:24:24,880 Speaker 2: Uber to go to another urgent care, a CityMD. She insists 445 00:24:24,880 --> 00:24:27,359 Speaker 2: on paying for the Uber. I said, you can't pay 446 00:24:27,400 --> 00:24:30,760 Speaker 2: for the Uber. It doesn't work that way. We get 447 00:24:30,760 --> 00:24:36,720 Speaker 2: to CityMD. I see that freaking iPad that they 448 00:24:36,760 --> 00:24:40,040 Speaker 2: want to use to scan her ID. I'm like, no, 449 00:24:40,359 --> 00:24:44,080 Speaker 2: we're not using an iPad. We're using paper. And then, 450 00:24:44,080 --> 00:24:45,639 Speaker 2: and this is my favorite part of the story, because it's 451 00:24:45,680 --> 00:24:49,600 Speaker 2: kind of sweet, she has this, like, lightning thought where 452 00:24:49,600 --> 00:24:52,199 Speaker 2: she's like, oh, my cat has a vet appointment today 453 00:24:52,240 --> 00:24:55,200 Speaker 2: in the afternoon. I said, okay, I'll use Google and 454 00:24:55,240 --> 00:24:57,199 Speaker 2: I'll look up what animal hospital you go to and 455 00:24:57,200 --> 00:24:58,800 Speaker 2: the phone number. She goes, you can do that? I said, yeah. 456 00:25:00,520 --> 00:25:03,200 Speaker 2: All right, so I call the vet, and I'm thinking 457 00:25:03,200 --> 00:25:05,440 Speaker 2: to myself, well, when you go to the vet, they 458 00:25:05,480 --> 00:25:08,000 Speaker 2: have everybody listed by the animal name, not the parent name. 459 00:25:08,440 --> 00:25:11,560 Speaker 2: So I turn to her and I say, Margo, what's 460 00:25:11,560 --> 00:25:18,119 Speaker 2: your cat's name? And, dead stare: Puss. And I'm like, okay, 461 00:25:18,160 --> 00:25:22,880 Speaker 2: here we go.
I said, my friend's cat, Puss, has 462 00:25:22,880 --> 00:25:27,399 Speaker 2: an appointment today, and we're going to need to cancel, because 463 00:25:27,760 --> 00:25:31,400 Speaker 2: Puss's mother has been in an accident. She's okay. Thank 464 00:25:31,440 --> 00:25:35,000 Speaker 2: you so much for telling us. Okay. Then the last 465 00:25:35,040 --> 00:25:38,000 Speaker 2: point of sort of technological contact was when we ended 466 00:25:38,080 --> 00:25:40,040 Speaker 2: up in the ER, where she had to go after 467 00:25:40,080 --> 00:25:43,600 Speaker 2: CityMD because she was having trouble breathing, and they said, 468 00:25:43,600 --> 00:25:47,360 Speaker 2: you need a chest X-ray. And I told her, 469 00:25:47,640 --> 00:25:50,280 Speaker 2: I can't stay here for four hours, but I'll come 470 00:25:50,320 --> 00:25:54,040 Speaker 2: pick you up and take you home when you are 471 00:25:54,080 --> 00:25:59,480 Speaker 2: ready to leave. How? Well, you have an iPhone. It 472 00:25:59,520 --> 00:26:01,720 Speaker 2: was off. I'd never seen an iPhone off. It was 473 00:26:01,840 --> 00:26:05,520 Speaker 2: fully off. I said, you know, this nurse here, who 474 00:26:05,560 --> 00:26:09,119 Speaker 2: is about my age, will help you call me. And 475 00:26:10,520 --> 00:26:15,520 Speaker 2: I think each touch point here made me reflect upon 476 00:26:17,119 --> 00:26:23,359 Speaker 2: how easy it was for me to navigate. And not 477 00:26:23,520 --> 00:26:26,760 Speaker 2: that she is... well, she's certainly more intelligent than I 478 00:26:26,840 --> 00:26:30,480 Speaker 2: am, intellectually speaking, but there is a digital divide 479 00:26:30,520 --> 00:26:35,200 Speaker 2: between us that made this experience far less harrowing, because 480 00:26:35,200 --> 00:26:37,560 Speaker 2: I was with her. And she said as much. I 481 00:26:37,680 --> 00:26:41,320 Speaker 2: think she realized very early on that, like, all of 482 00:26:41,400 --> 00:26:44,600 Speaker 2: those things that I think of as so second nature 483 00:26:44,760 --> 00:26:51,000 Speaker 2: to me were not even, like, on Margo's radar as possibilities. 484 00:26:52,200 --> 00:26:56,719 Speaker 1: I mean, this story raises the perfect question, almost as 485 00:26:56,760 --> 00:26:59,720 Speaker 1: though we planned it: when did this become a thing? 486 00:27:00,440 --> 00:27:03,000 Speaker 1: When did the world sort of cleave into two, where 487 00:27:03,440 --> 00:27:07,800 Speaker 1: you're either totally able to navigate everything because of having 488 00:27:07,880 --> 00:27:12,000 Speaker 1: a tool which makes it easy, or you're just kind 489 00:27:12,000 --> 00:27:15,480 Speaker 1: of struggling to translate? In other words, when did digital 490 00:27:15,600 --> 00:27:19,200 Speaker 1: natives become a thing? Yes. But let's start by defining 491 00:27:19,240 --> 00:27:20,679 Speaker 1: our terms, which is always a good thing to do at 492 00:27:20,680 --> 00:27:23,720 Speaker 1: the beginning. What is a digital native? 493 00:27:24,359 --> 00:27:26,440 Speaker 2: You know, I actually have no idea, but I've always 494 00:27:26,520 --> 00:27:30,720 Speaker 2: considered myself a digital native. I'm a New Yorker first, 495 00:27:30,920 --> 00:27:32,760 Speaker 2: but I'm definitely a digital native second. 496 00:27:33,200 --> 00:27:36,399 Speaker 1: I also always considered myself a digital native, even if 497 00:27:36,560 --> 00:27:40,280 Speaker 1: somewhat reluctantly.
Weirdly, because as a child I was diagnosed 498 00:27:40,320 --> 00:27:43,440 Speaker 1: with dyspraxia, which basically means: can't catch a ball, can't 499 00:27:43,600 --> 00:27:44,600 Speaker 1: do nice handwriting. 500 00:27:44,880 --> 00:27:46,600 Speaker 2: You haven't really grown out of that. No, I haven't, 501 00:27:46,640 --> 00:27:47,240 Speaker 2: I'll tell you, I haven't. 502 00:27:47,240 --> 00:27:50,879 Speaker 1: But my prescription was a laptop, like, aged 503 00:27:50,920 --> 00:27:53,000 Speaker 1: eight years old, to do my homework on and stuff, 504 00:27:53,000 --> 00:27:57,200 Speaker 1: and so I became a digital native basically in the 505 00:27:57,280 --> 00:27:57,880 Speaker 1: late nineties. 506 00:27:58,640 --> 00:28:00,920 Speaker 2: You know, I did not know this about you. And 507 00:28:01,040 --> 00:28:03,880 Speaker 2: this might be one of our sort of unspoken personal 508 00:28:03,960 --> 00:28:07,080 Speaker 2: connections, because I went to one of those kooky laptop 509 00:28:07,160 --> 00:28:09,080 Speaker 2: schools that I think has ultimately netted out to be 510 00:28:09,119 --> 00:28:12,160 Speaker 2: the wrong idea. But I went to one of those 511 00:28:12,160 --> 00:28:15,000 Speaker 2: schools where they taught us C plus plus before they 512 00:28:15,040 --> 00:28:16,080 Speaker 2: taught us basic algebra. 513 00:28:16,560 --> 00:28:19,159 Speaker 1: So we both had this feeling of being digital natives. 514 00:28:19,240 --> 00:28:21,800 Speaker 1: But I thought I would do some research on what 515 00:28:22,000 --> 00:28:25,119 Speaker 1: that term actually means and where it comes from. Turns 516 00:28:25,160 --> 00:28:27,880 Speaker 1: out it was coined in the year two thousand and one, 517 00:28:28,000 --> 00:28:30,520 Speaker 1: around the same time we both got our laptops, by 518 00:28:30,640 --> 00:28:34,560 Speaker 1: an educational consultant called Marc Prensky, and he had this 519 00:28:34,720 --> 00:28:37,920 Speaker 1: idea that education needed to change in response to young 520 00:28:37,960 --> 00:28:41,880 Speaker 1: people whose brains were literally different from the brains of 521 00:28:41,880 --> 00:28:45,200 Speaker 1: the generation before, because they'd already been shaped by gaming 522 00:28:45,440 --> 00:28:48,160 Speaker 1: and spending time online and being more used to finding 523 00:28:48,240 --> 00:28:51,000 Speaker 1: information on the web than, for example, in a book. 524 00:28:51,480 --> 00:28:54,680 Speaker 1: And he had this quite powerful quote, which was: today's 525 00:28:54,680 --> 00:28:58,000 Speaker 1: students have not just changed incrementally from those of the past, 526 00:28:58,440 --> 00:29:03,000 Speaker 1: nor simply changed their slang, clothes, body adornments, or styles, 527 00:29:03,240 --> 00:29:06,640 Speaker 1: as has happened between generations previously, but a really big 528 00:29:06,720 --> 00:29:11,560 Speaker 1: discontinuity has taken place. One might even call it a singularity, 529 00:29:12,120 --> 00:29:14,840 Speaker 1: an event which changes things so fundamentally that there is 530 00:29:14,880 --> 00:29:16,560 Speaker 1: absolutely no going back. 531 00:29:17,320 --> 00:29:20,120 Speaker 2: Wouldn't you say that's like the fact that I now 532 00:29:20,240 --> 00:29:22,000 Speaker 2: say google it, not look it up? 533 00:29:22,240 --> 00:29:23,200 Speaker 1: That's the singularity. 534 00:29:23,440 --> 00:29:26,840 Speaker 2: Right.
When you talk to some people who are older, 535 00:29:27,760 --> 00:29:31,560 Speaker 2: they'll say, call four one one. Where's four one one going 536 00:29:31,480 --> 00:29:33,160 Speaker 1: to get me? And it's interesting you just made that 537 00:29:33,280 --> 00:29:35,959 Speaker 1: point about calling four one one. Prensky also has 538 00:29:36,000 --> 00:29:41,040 Speaker 1: this concept of digital natives versus digital immigrants, and digital immigrants 539 00:29:41,080 --> 00:29:44,160 Speaker 1: are people like Margo who kind of live in the 540 00:29:44,400 --> 00:29:48,040 Speaker 1: digital world but carry the accent of where they come from. 541 00:29:48,240 --> 00:29:50,720 Speaker 1: Whether it's, like, printing out an email, which is one of 542 00:29:50,760 --> 00:29:53,720 Speaker 1: the examples that Prensky uses, or kind of beckoning 543 00:29:53,800 --> 00:29:56,080 Speaker 1: someone over to look at your computer screen rather than 544 00:29:56,120 --> 00:29:56,840 Speaker 1: sending them a link. 545 00:29:57,480 --> 00:30:00,920 Speaker 2: And you know, I think that there are definitely those 546 00:30:01,000 --> 00:30:04,800 Speaker 2: things that mark people today as digital immigrants. You know, 547 00:30:04,880 --> 00:30:08,600 Speaker 2: I think, my poor mother, but, you know, sometimes she'll say, oh, 548 00:30:08,720 --> 00:30:16,400 Speaker 2: you know, let's call for takeout. What? Nobody's home. There's 549 00:30:16,400 --> 00:30:19,360 Speaker 2: actually this book called The Victorian Internet, which is by 550 00:30:19,480 --> 00:30:21,960 Speaker 2: Tom Standage, and it's all about the history of the 551 00:30:22,040 --> 00:30:25,120 Speaker 2: development of the telegraph and how it parallels the Internet. 552 00:30:25,160 --> 00:30:27,680 Speaker 2: And in it he tells the story, from when the 553 00:30:27,760 --> 00:30:31,920 Speaker 2: telegraph was first introduced around the time of the American 554 00:30:32,000 --> 00:30:34,640 Speaker 2: Civil War, and I mean, this could have been my mom: 555 00:30:34,920 --> 00:30:37,800 Speaker 2: a mom goes to the telegraph office with a bowl 556 00:30:37,880 --> 00:30:41,880 Speaker 2: of sauerkraut to send to her son at the battlefront, 557 00:30:41,920 --> 00:30:42,800 Speaker 2: you know, like, by wire. 558 00:30:43,200 --> 00:30:46,440 Speaker 1: So she really was a digital immigrant, bringing a native 559 00:30:46,520 --> 00:30:48,600 Speaker 1: cuisine and then trying to send it by telegraph. 560 00:30:48,680 --> 00:30:50,240 Speaker 2: That's correct. And you know, back in the time of 561 00:30:50,280 --> 00:30:54,120 Speaker 2: the Civil War, change felt fast. It did not feel 562 00:30:54,160 --> 00:30:56,440 Speaker 2: as fast as it feels now, where, you know, by 563 00:30:56,480 --> 00:30:58,200 Speaker 2: the time you have a software update, you need to, 564 00:30:58,280 --> 00:31:01,040 Speaker 2: like, learn how to use a whole new phone. You 565 00:31:01,120 --> 00:31:05,160 Speaker 2: could blink and miss an entire technological evolution. And I 566 00:31:05,280 --> 00:31:07,800 Speaker 2: do blink and miss entire things sometimes. 567 00:31:07,520 --> 00:31:09,480 Speaker 1: Well, I know you think of yourself as a digital 568 00:31:09,600 --> 00:31:12,600 Speaker 1: native, since you grew up with tech. Do you still? 569 00:31:13,720 --> 00:31:15,080 Speaker 1: You know, I do, in
570 00:31:15,160 --> 00:31:18,840 Speaker 2: the sense that the idea of new technology is not 571 00:31:19,160 --> 00:31:20,440 Speaker 2: incredibly daunting to me. 572 00:31:20,760 --> 00:31:21,880 Speaker 1: I think it was attractive to you. 573 00:31:22,120 --> 00:31:24,200 Speaker 2: Yes. And I think we were born at a time 574 00:31:24,680 --> 00:31:28,680 Speaker 2: when technology was changing so rapidly that we are not 575 00:31:28,920 --> 00:31:32,360 Speaker 2: completely thrown by the idea of, like, the fax not 576 00:31:32,520 --> 00:31:35,880 Speaker 2: being the predominant form of communication for ten years, right, right. 577 00:31:36,040 --> 00:31:39,720 Speaker 1: But do you think, if you kind of got given 578 00:31:40,040 --> 00:31:42,280 Speaker 1: a grand tour of downtown New York by an eighteen 579 00:31:42,360 --> 00:31:45,440 Speaker 1: year old, they would be surprised by your use of technology? 580 00:31:45,880 --> 00:31:49,920 Speaker 2: I think it's more, if, like, I spent five days 581 00:31:49,960 --> 00:31:52,239 Speaker 2: with an eighteen year old, they'd be like, why are 582 00:31:52,280 --> 00:31:55,440 Speaker 2: you not charting your every move on Snapchat? Which is 583 00:31:55,520 --> 00:31:57,200 Speaker 2: not to say that, like, I couldn't live in the 584 00:31:57,280 --> 00:31:59,960 Speaker 2: world with them, but they live in the world differently. 585 00:32:00,280 --> 00:32:03,680 Speaker 1: Yeah, truly. Margo's living in more or less 586 00:32:03,680 --> 00:32:06,400 Speaker 1: an analog world, we're probably living in a hybrid world, 587 00:32:06,600 --> 00:32:09,040 Speaker 1: and many of the younger people are living in a fully 588 00:32:09,560 --> 00:32:10,280 Speaker 1: digital world. 589 00:32:10,680 --> 00:32:13,040 Speaker 2: Well, and that's the irony of connectivity, I think, right? 590 00:32:13,160 --> 00:32:15,360 Speaker 2: Which is this idea that, like, the world is actually 591 00:32:15,440 --> 00:32:18,480 Speaker 2: quite siloed by, like, how you use technology. If you 592 00:32:18,640 --> 00:32:20,680 Speaker 2: use it properly, you can be very connected, but it 593 00:32:20,720 --> 00:32:22,640 Speaker 2: can also be very isolating. Totally. 594 00:32:23,000 --> 00:32:25,440 Speaker 1: And we should point out that the framing of native 595 00:32:25,560 --> 00:32:27,840 Speaker 1: versus immigrant is rather loaded. 596 00:32:28,280 --> 00:32:31,720 Speaker 2: Yeah. And one of the things that really struck me 597 00:32:32,560 --> 00:32:35,200 Speaker 2: with what Margo experienced was that there was friction at 598 00:32:35,440 --> 00:32:40,960 Speaker 2: every turn in her accident. And that's not only 599 00:32:41,040 --> 00:32:44,840 Speaker 2: because she was in an accident. It's because we are 600 00:32:45,000 --> 00:32:49,480 Speaker 2: constantly encountering, like, almost every day, a more and more 601 00:32:49,960 --> 00:32:54,240 Speaker 2: digitized world, right? And, you know, the thought that I'm 602 00:32:54,360 --> 00:32:59,200 Speaker 2: left with is that even though there is a definition 603 00:32:59,320 --> 00:33:02,880 Speaker 2: for digital natives and we fit into that definition, it's 604 00:33:02,920 --> 00:33:03,760 Speaker 2: not a permanent state. 605 00:33:04,560 --> 00:33:07,560 Speaker 1: I do wonder if there will emerge another term, something 606 00:33:07,640 --> 00:33:12,040 Speaker 1: to supplant digital natives the way postmodern supplanted modern.
607 00:33:12,560 --> 00:33:14,360 Speaker 1: You know, one of the guests that we were most 608 00:33:14,400 --> 00:33:17,880 Speaker 1: excited and proudest to have on Sleepwalkers, our former podcast, 609 00:33:18,360 --> 00:33:21,760 Speaker 1: was Yuval Noah Harari, who wrote a book called Homo Deus, 610 00:33:22,080 --> 00:33:27,000 Speaker 1: which is basically all about how using technology and merging 611 00:33:27,040 --> 00:33:31,840 Speaker 1: with technology was changing Homo sapiens into almost this new 612 00:33:31,960 --> 00:33:36,600 Speaker 1: species, Homo deus: human, but with these godlike powers enabled by 613 00:33:36,720 --> 00:33:41,120 Speaker 1: new technology. I found it, as a young preteenager, 614 00:33:41,560 --> 00:33:44,640 Speaker 1: pretty crazy getting a laptop in the early two thousands, 615 00:33:44,680 --> 00:33:48,160 Speaker 1: which was theoretically to do my homework, because I was dyspraxic, 616 00:33:48,280 --> 00:33:52,680 Speaker 1: but more realistically to play The Oregon Trail and Castle Wolfenstein. 617 00:33:54,320 --> 00:33:56,320 Speaker 1: And I remember sort of having this laptop, and other 618 00:33:56,400 --> 00:33:58,800 Speaker 1: kids didn't, and it was sort of this little... It 619 00:33:58,880 --> 00:34:01,160 Speaker 1: didn't change too much, but it was a kind of 620 00:34:01,280 --> 00:34:03,360 Speaker 1: moment of, like, I was ahead of the curve in 621 00:34:03,440 --> 00:34:06,600 Speaker 1: some way. But imagine going to school now as a 622 00:34:06,720 --> 00:34:09,399 Speaker 1: nine year old in the age of ChatGPT. 623 00:34:09,920 --> 00:34:13,200 Speaker 1: What does that experience of having not only the whole 624 00:34:13,239 --> 00:34:17,279 Speaker 1: world's information at your fingertips for a search, but having 625 00:34:17,400 --> 00:34:20,000 Speaker 1: any answer, which is probably better than the answer your 626 00:34:20,040 --> 00:34:22,920 Speaker 1: teacher or your parent can provide, at the click of 627 00:34:22,960 --> 00:34:26,160 Speaker 1: a button... what does that do to a developing brain? 628 00:34:34,480 --> 00:34:36,600 Speaker 1: That's it for this week on Tech Stuff. I'm Oz 629 00:34:36,640 --> 00:34:38,440 Speaker 1: Woloshyn. And I'm Karah Preiss. 630 00:34:38,960 --> 00:34:42,440 Speaker 2: This episode was produced by Eliza Dennis, Victoria Dominguez, and 631 00:34:42,520 --> 00:34:46,600 Speaker 2: Lizzie Jacobs. It was executive produced by me, Karah Preiss, Oz Woloshyn, 632 00:34:46,840 --> 00:34:51,000 Speaker 2: and Kate Osborn for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. 633 00:34:51,400 --> 00:34:54,800 Speaker 2: The engineer is Charles de Montebello, and it's mixed by Kyle Murdoch, 634 00:34:54,880 --> 00:34:56,120 Speaker 2: who also wrote our theme song. 635 00:34:56,920 --> 00:35:00,319 Speaker 1: Join us next Wednesday for Tech Stuff: The Story, when 636 00:35:00,360 --> 00:35:04,280 Speaker 1: we'll share an in-depth conversation with Jessica Lessin, CEO 637 00:35:04,440 --> 00:35:07,800 Speaker 1: of The Information, about a vibe shift in Silicon Valley. 638 00:35:08,400 --> 00:35:11,279 Speaker 2: Please rate, review, and reach out to us at tech 639 00:35:11,360 --> 00:35:14,799 Speaker 2: Stuff podcast at gmail dot com. We want to hear 640 00:35:14,840 --> 00:35:15,080 Speaker 2: from you.