1 00:00:02,120 --> 00:00:05,840 Speaker 1: Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. 2 00:00:06,200 --> 00:00:08,920 Speaker 1: I'm Oz Woloshyn, and today Kara Price and I will bring 3 00:00:08,960 --> 00:00:12,959 Speaker 1: you the headlines this week, well, tariffs, obviously, but also 4 00:00:13,119 --> 00:00:16,159 Speaker 1: a dating game. Then on Tech Support, we'll talk to 5 00:00:16,200 --> 00:00:19,520 Speaker 1: The Wall Street Journal's Family and Tech columnist Julie Jargon 6 00:00:19,920 --> 00:00:22,919 Speaker 1: about a mother's worst fear, a cry for help over 7 00:00:22,960 --> 00:00:26,360 Speaker 1: the phone that sounded like her youngest daughter, all of 8 00:00:26,360 --> 00:00:40,320 Speaker 1: that on the Weekend Tech. It's Friday, April eleventh. Kara, hello. Hello, hello, Oz. 9 00:00:40,880 --> 00:00:43,040 Speaker 2: I think I'm going to start this thing called Kara's 10 00:00:43,080 --> 00:00:45,400 Speaker 2: Hats of the Week. And just for people who can't 11 00:00:45,440 --> 00:00:48,239 Speaker 2: see me, I'm wearing a hat today that says, and 12 00:00:48,280 --> 00:00:50,080 Speaker 2: this is from a show called Summer Heatsie, if you 13 00:00:50,120 --> 00:00:53,640 Speaker 2: ever watched Summer Heatsie, "I'm a naughty girl with a bad habit." 14 00:00:53,920 --> 00:00:55,639 Speaker 1: What is the bad habit in the 15 00:00:55,600 --> 00:00:59,200 Speaker 2: show? It's for drugs. If I had preempted today's top story, 16 00:00:59,200 --> 00:01:01,720 Speaker 2: it would have said, "with a bad habit for tariffs." 17 00:01:01,800 --> 00:01:05,600 Speaker 1: Uh huh. That would have been fast fashion. Indeed, that's 18 00:01:05,600 --> 00:01:08,480 Speaker 1: a good one. But today's news is all about, quote, 19 00:01:08,720 --> 00:01:10,479 Speaker 1: "the most beautiful word in the dictionary."
20 00:01:10,560 --> 00:01:13,080 Speaker 2: Well, I don't know anything about the dictionary, but I 21 00:01:13,240 --> 00:01:16,480 Speaker 2: looked up the word tariff with ChatGPT. 22 00:01:16,640 --> 00:01:18,959 Speaker 1: The dictionary of twenty twenty five, correct. 23 00:01:18,600 --> 00:01:22,040 Speaker 2: Correct, and the thesaurus, and also what we'll eventually write 24 00:01:22,120 --> 00:01:26,960 Speaker 2: someone's wedding vows. I asked ChatGPT what a tariff is, 25 00:01:27,360 --> 00:01:30,479 Speaker 2: and she responded, because she's a she, yea, in my book. 26 00:01:30,520 --> 00:01:33,280 Speaker 2: In your book, a tariff is a tax or fee 27 00:01:33,480 --> 00:01:37,000 Speaker 2: that a government imposes on imported or exported goods. But 28 00:01:37,160 --> 00:01:41,559 Speaker 2: most importantly, ChatGPT says tariffs can impact the price 29 00:01:41,560 --> 00:01:45,039 Speaker 2: of goods, trade relationships, and the global economy. 30 00:01:45,280 --> 00:01:49,960 Speaker 1: Oh, my prophetic soul. Can and do. 31 00:01:50,680 --> 00:01:53,800 Speaker 2: ChatGPT was onto something there, because Trump's tariff announcement 32 00:01:54,040 --> 00:01:57,120 Speaker 2: definitely shook up the global economy and trade relationships, and 33 00:01:57,200 --> 00:02:00,000 Speaker 2: everything is still evolving. I mean, there's basically a new 34 00:02:00,240 --> 00:02:03,960 Speaker 2: update every hour. Just this week, tariffs on about ninety 35 00:02:04,000 --> 00:02:06,960 Speaker 2: countries went into effect. Then Trump issued a pause on 36 00:02:07,000 --> 00:02:09,400 Speaker 2: most of them. But the country that's been pummeled the 37 00:02:09,400 --> 00:02:12,840 Speaker 2: most is China. As of Thursday midday, Trump has increased 38 00:02:12,880 --> 00:02:15,760 Speaker 2: tariffs on the country's imports by one hundred and twenty- 39 00:02:15,760 --> 00:02:19,120 Speaker 2: five percent.
So with everything going on, the technology industry 40 00:02:19,160 --> 00:02:20,680 Speaker 2: has been feeling some effects. 41 00:02:21,040 --> 00:02:23,359 Speaker 1: There was a headline in the Washington Post earlier this week 42 00:02:23,440 --> 00:02:27,079 Speaker 1: that was delicious in its understated irony: "Big Tech bet 43 00:02:27,120 --> 00:02:29,440 Speaker 1: on Trump. It's still waiting for the payoff." 44 00:02:29,760 --> 00:02:31,800 Speaker 2: I'm just thinking about when we first started this version 45 00:02:31,800 --> 00:02:33,600 Speaker 2: of Tech Stuff. All of those tech bros were at 46 00:02:33,600 --> 00:02:36,359 Speaker 2: the inauguration, and now they're all scrambling to rethink their 47 00:02:36,360 --> 00:02:37,160 Speaker 2: supply chains. 48 00:02:37,320 --> 00:02:40,320 Speaker 1: But it's also about the threat of reciprocal tariffs. And 49 00:02:40,360 --> 00:02:43,800 Speaker 1: in that Washington Post story, the writers quip that those tech 50 00:02:43,840 --> 00:02:47,000 Speaker 1: giants are in another front row, lol, not the front 51 00:02:47,080 --> 00:02:50,840 Speaker 1: row at the inauguration, as targets for US trade partners looking 52 00:02:50,840 --> 00:02:53,480 Speaker 1: for ways to strike back at the US economy. 53 00:02:53,919 --> 00:02:56,600 Speaker 2: Yeah. I heard at one point the EU is considering 54 00:02:56,639 --> 00:03:01,000 Speaker 2: tariffs on digital products like Netflix subscriptions and Google Cloud storage, 55 00:03:01,000 --> 00:03:05,079 Speaker 2: which I honestly didn't even know was possible. But another 56 00:03:05,120 --> 00:03:07,480 Speaker 2: area of concern for the tech industry is how these 57 00:03:07,520 --> 00:03:11,400 Speaker 2: tariffs will affect semiconductors, because they really do power the 58 00:03:11,440 --> 00:03:16,120 Speaker 2: modern world. Everything from consumer tech, data centers, even cars, 59 00:03:16,240 --> 00:03:17,880 Speaker 2: they all use semiconductors.
60 00:03:18,080 --> 00:03:21,560 Speaker 1: And here's where the kind of ironies continue to abound, because, 61 00:03:21,560 --> 00:03:25,000 Speaker 1: of course, you know, President Trump has made AI supremacy 62 00:03:25,080 --> 00:03:28,720 Speaker 1: a key element of his policy for this term, and 63 00:03:28,880 --> 00:03:34,480 Speaker 1: so technically the tariffs announced an exemption for semiconductors, but 64 00:03:34,960 --> 00:03:37,560 Speaker 1: as it turns out, many of the semiconductors imported are 65 00:03:37,600 --> 00:03:41,360 Speaker 1: actually bundled into other products like GPU chips and servers 66 00:03:41,560 --> 00:03:44,680 Speaker 1: to train AI models. That's per Wired, who have a 67 00:03:44,720 --> 00:03:47,920 Speaker 1: story under the headline "Trump's Tariffs Are Threatening the US 68 00:03:48,000 --> 00:03:51,640 Speaker 1: Semiconductor Revival." Wired also points out that all of the 69 00:03:51,680 --> 00:03:56,160 Speaker 1: machinery and the underlying materials to manufacture semiconductors here in 70 00:03:56,200 --> 00:03:59,600 Speaker 1: the US will become far more expensive with these tariffs, 71 00:03:59,720 --> 00:04:01,840 Speaker 1: making it less attractive to manufacture domestically. 72 00:04:02,080 --> 00:04:04,280 Speaker 2: Yeah, you know this is a little bit heady, but 73 00:04:04,400 --> 00:04:07,080 Speaker 2: needless to say, it is a consumer tech story. You know, 74 00:04:07,160 --> 00:04:10,040 Speaker 2: there are many articles circulating about how these tariffs could 75 00:04:10,120 --> 00:04:13,320 Speaker 2: affect the price of something as ubiquitous as the iPhone. 76 00:04:13,360 --> 00:04:15,840 Speaker 1: Have you been stocking up for your eBay sales? 77 00:04:15,600 --> 00:04:18,560 Speaker 2: Of my iPhone? Oh, to sell iPhones? Oh yeah, it's 78 00:04:18,600 --> 00:04:20,640 Speaker 2: not a bad idea. Actually, I hadn't thought about it, 79 00:04:21,000 --> 00:04:22,120 Speaker 2: you know.
And just to give a shout out to 80 00:04:22,160 --> 00:04:24,080 Speaker 2: 404 Media, who we love, you know, they 81 00:04:24,160 --> 00:04:28,440 Speaker 2: pointed out that on Apple's own supply chain website, the 82 00:04:28,520 --> 00:04:32,480 Speaker 2: big beautiful bold text that overlays the video of people 83 00:04:32,520 --> 00:04:37,760 Speaker 2: making iPhones in a factory says, "Designed by Apple in California, 84 00:04:37,800 --> 00:04:42,600 Speaker 2: made by people everywhere." And it's true, you know, it's 85 00:04:42,640 --> 00:04:46,720 Speaker 2: a device that exemplifies globalization. The materials for the batteries come 86 00:04:46,760 --> 00:04:50,280 Speaker 2: from one country, the display from another. Almost every part 87 00:04:50,279 --> 00:04:52,760 Speaker 2: of the iPhone comes from a different country, and then 88 00:04:52,760 --> 00:04:56,440 Speaker 2: they are predominantly assembled in China. So if things go 89 00:04:56,560 --> 00:04:59,400 Speaker 2: the way they're going and tariffs on Chinese imports remain, 90 00:04:59,720 --> 00:05:03,480 Speaker 2: these phones could get a lot more expensive. And I'm 91 00:05:03,520 --> 00:05:05,159 Speaker 2: going to get the burner phone that I plan on 92 00:05:05,200 --> 00:05:06,479 Speaker 2: getting this summer anyway. 93 00:05:06,200 --> 00:05:08,400 Speaker 1: Okay, good. Yeah. Well, this will be another inducement for 94 00:05:08,480 --> 00:05:11,719 Speaker 1: us to get dumb phones, but that might be 95 00:05:11,800 --> 00:05:13,719 Speaker 1: enough for us on tariffs this week. I feel this 96 00:05:13,720 --> 00:05:15,839 Speaker 1: is probably something we're going to be coming back to again 97 00:05:15,880 --> 00:05:17,279 Speaker 1: and again. So, time for a game.
98 00:05:17,520 --> 00:05:20,279 Speaker 2: I have been sitting on my hands for this entire 99 00:05:20,360 --> 00:05:22,960 Speaker 2: show as we talk about tariffs, to play a game 100 00:05:23,040 --> 00:05:25,640 Speaker 2: with you that came out last week for April Fools'. 101 00:05:26,080 --> 00:05:28,520 Speaker 2: I will catch people up a little bit. Last week, 102 00:05:28,600 --> 00:05:32,160 Speaker 2: Tinder launched an in-app game called The Game Game. 103 00:05:32,279 --> 00:05:34,840 Speaker 1: The Game Game. It sounds a bit like that seventies 104 00:05:34,880 --> 00:05:37,960 Speaker 1: dating show, The Dating Game, with a sprinkle of my 105 00:05:38,080 --> 00:05:39,520 Speaker 1: hero Neil Strauss. 106 00:05:39,680 --> 00:05:41,960 Speaker 2: I think they were definitely going for seventies dating show. 107 00:05:41,960 --> 00:05:43,640 Speaker 2: I don't think they were going for a sprinkle of 108 00:05:43,680 --> 00:05:47,280 Speaker 2: Neil Strauss. But that's your drama. So basically, OpenAI's 109 00:05:47,360 --> 00:05:51,080 Speaker 2: ChatGPT-4o and Tinder partnered to create this 110 00:05:51,120 --> 00:05:54,960 Speaker 2: thing called The Game Game, which allows real Tinder users 111 00:05:55,480 --> 00:05:59,080 Speaker 2: to enter pretend scenarios and talk to AI characters. 112 00:05:59,080 --> 00:06:02,640 Speaker 1: This was a meet cute between ChatGPT and Tinder, 113 00:06:03,360 --> 00:06:04,359 Speaker 1: good one, if you will. 114 00:06:04,440 --> 00:06:07,320 Speaker 2: It was a meet cute between Altman and Tinder. Absolutely, 115 00:06:07,720 --> 00:06:10,680 Speaker 2: but there's also a competitive part of the game, which 116 00:06:10,680 --> 00:06:13,120 Speaker 2: is that as you talk into your phone, you try 117 00:06:13,160 --> 00:06:15,520 Speaker 2: your best to flirt with the AI, and you get 118 00:06:15,560 --> 00:06:19,960 Speaker 2: points for how suave or empathetic or interesting your responses are.
119 00:06:20,120 --> 00:06:23,480 Speaker 2: And here's how it works. You enter a preferred scenario, 120 00:06:23,880 --> 00:06:28,080 Speaker 2: like, I'm on a train and my shoe's untied, and 121 00:06:28,120 --> 00:06:31,800 Speaker 2: a man says, "Miss, your shoe's untied," and I say, "Sir, 122 00:06:32,480 --> 00:06:34,440 Speaker 2: I'm into women." That would be the end of that. 123 00:06:34,640 --> 00:06:36,880 Speaker 2: But no, if I were straight, he would say, "Ma'am, 124 00:06:36,920 --> 00:06:39,320 Speaker 2: your shoe's untied," and I'd look up and it would 125 00:06:39,320 --> 00:06:46,600 Speaker 2: get... No. But our producer Tory actually played it, and 126 00:06:46,720 --> 00:06:50,280 Speaker 2: she talked to a character named Nathan, who was interested 127 00:06:50,279 --> 00:06:53,920 Speaker 2: in technology and had a Southern accent. But after their 128 00:06:53,960 --> 00:06:58,080 Speaker 2: conversation ended, Tinder told Tory that her replies were charming, 129 00:06:58,440 --> 00:07:01,200 Speaker 2: but that her conversation could have flowed a little better, 130 00:07:01,400 --> 00:07:04,760 Speaker 2: as she jumped between topics too quickly. Now, as a producer, 131 00:07:04,800 --> 00:07:05,640 Speaker 2: that's a great quality. 132 00:07:05,920 --> 00:07:08,440 Speaker 1: Yeah, exactly, short attention span. I think Tory may have 133 00:07:08,440 --> 00:07:11,680 Speaker 1: had it right and the app wrong, but 134 00:07:12,160 --> 00:07:14,280 Speaker 1: I would love to hear you try it. 135 00:07:15,240 --> 00:07:19,040 Speaker 2: My phone is ready and Tinder is downloaded. So let's 136 00:07:21,120 --> 00:07:27,320 Speaker 2: play The Game Game. So what happens first is the prompt 137 00:07:27,360 --> 00:07:30,720 Speaker 2: comes up in the game. It's in the Tinder app. 138 00:07:30,920 --> 00:07:34,880 Speaker 1: Well, I mean, let's not pass over this soundtrack, 139 00:07:35,000 --> 00:07:35,800 Speaker 1: I'm loving it.
140 00:07:35,800 --> 00:07:36,520 Speaker 2: It's fantastic. 141 00:07:36,440 --> 00:07:38,240 Speaker 1: This is like old New York. 142 00:07:39,240 --> 00:07:45,239 Speaker 2: We're going to Cafe Carlyle for a drink and a smoke. Okay, 143 00:07:45,240 --> 00:07:47,360 Speaker 2: I'm done, but let me read you the prompt so 144 00:07:47,400 --> 00:07:48,160 Speaker 2: we can start the game. 145 00:07:48,240 --> 00:07:48,560 Speaker 1: Okay. 146 00:07:50,120 --> 00:07:51,840 Speaker 2: And by the way, it lets you re-roll, so 147 00:07:51,880 --> 00:07:53,960 Speaker 2: if you don't like the person you're introduced to, it 148 00:07:53,960 --> 00:07:55,960 Speaker 2: gives you another prompt, which I don't think should happen. 149 00:07:56,040 --> 00:07:59,200 Speaker 2: I think you should happen. But that's how Tinder works, 150 00:07:59,200 --> 00:08:01,520 Speaker 2: that's how ChatGPT-4o works. "At a 151 00:08:01,520 --> 00:08:05,960 Speaker 2: bustling dog park, your curious dog eagerly runs up to Penelope, 152 00:08:06,320 --> 00:08:10,200 Speaker 2: a customer service representative, and starts playing with her dog." 153 00:08:10,560 --> 00:08:13,880 Speaker 1: How does your dog know what Penelope's job is? Oh, 154 00:08:13,920 --> 00:08:14,720 Speaker 1: you find that out later. 155 00:08:14,960 --> 00:08:18,320 Speaker 2: Hello! And my dog's like, whoa, she's in customer service, 156 00:08:18,360 --> 00:08:22,360 Speaker 2: you should chase her down. No. And so my dog 157 00:08:22,600 --> 00:08:25,559 Speaker 2: goes up to Penelope's dog and starts playing with her dog. 158 00:08:26,000 --> 00:08:30,960 Speaker 2: "This unexpected encounter creates the perfect excuse to strike up 159 00:08:31,000 --> 00:08:35,000 Speaker 2: a conversation with the one and only Penelope from Customer Service." 160 00:08:35,400 --> 00:08:39,480 Speaker 2: Let's go. I'm not re-rolling, I'm taking my first card.
161 00:08:40,840 --> 00:08:44,080 Speaker 3: Well, hey there, looks like your dog's got quite a 162 00:08:44,120 --> 00:08:47,000 Speaker 3: bit of energy, don't they? My pup here is pretty 163 00:08:47,000 --> 00:08:49,920 Speaker 3: friendly too. Seems they're getting along just fine. 164 00:08:50,200 --> 00:08:58,360 Speaker 2: What's their name? My dog's name is Astrid. Astrid. 165 00:08:58,480 --> 00:09:02,400 Speaker 3: Huh, that's a fun name. Seems like Astrid's 166 00:09:02,400 --> 00:09:05,800 Speaker 3: a real social butterfly. Does she usually make friends this quick? 167 00:09:06,280 --> 00:09:09,000 Speaker 2: Honestly? No, your dog is the first dog she's ever 168 00:09:09,040 --> 00:09:14,280 Speaker 2: taken any interest in. Well, ain't that something special? 169 00:09:14,679 --> 00:09:16,280 Speaker 3: Looks like they found the money today. 170 00:09:16,720 --> 00:09:17,880 Speaker 1: Must be a good match. 171 00:09:18,440 --> 00:09:21,319 Speaker 3: Maybe Astrid knows a kindred spirit when she sees one. 172 00:09:21,920 --> 00:09:24,800 Speaker 2: So how often do you come to the park? I've 173 00:09:24,840 --> 00:09:27,439 Speaker 2: tried to go to the dog park more now because 174 00:09:27,600 --> 00:09:30,600 Speaker 2: Astrid's gotten kind of crazy inside. So have I. I mean, 175 00:09:30,600 --> 00:09:33,080 Speaker 2: post-COVID it's really hard to get outside even now. 176 00:09:34,640 --> 00:09:37,120 Speaker 2: Why, I hear you. It's been tough all around. 177 00:09:37,679 --> 00:09:40,440 Speaker 3: The fresh air and space can do wonders for both 178 00:09:40,480 --> 00:09:43,600 Speaker 3: you and Astrid. Charleston's got a lot of beauty if 179 00:09:43,600 --> 00:09:44,800 Speaker 3: you take time to notice it. 180 00:09:45,440 --> 00:09:45,599 Speaker 1: You know. 181 00:09:45,720 --> 00:09:48,800 Speaker 2: Penel... Sorry, Penelope, I'm so sorry to cut you off.
182 00:09:49,640 --> 00:09:53,200 Speaker 3: No worries at all, I understand. If you want 183 00:09:53,200 --> 00:09:55,679 Speaker 3: to chat about the park or just need a friendly ear, 184 00:09:56,040 --> 00:09:56,480 Speaker 3: I'm here. 185 00:09:57,040 --> 00:10:01,360 Speaker 2: Take care of yourself and Astrid, Penelope. Would you be 186 00:10:01,400 --> 00:10:08,480 Speaker 2: interested in either giving me or getting... Did she go away? 187 00:10:07,400 --> 00:10:17,920 Speaker 2: Oh, you're still here, Penelope? Sorry. So, Penelope, I want 188 00:10:18,000 --> 00:10:19,720 Speaker 2: another go. But he just, like, girl. 189 00:10:19,600 --> 00:10:22,400 Speaker 1: You have... no, no. I like the way you dropped 190 00:10:22,400 --> 00:10:23,200 Speaker 1: your voice a little bit. 191 00:10:24,120 --> 00:10:26,040 Speaker 2: Wait, do you want to try it? We have 192 00:10:26,120 --> 00:10:28,040 Speaker 2: things to do on the show, but that's really 193 00:10:28,080 --> 00:10:31,920 Speaker 2: how it goes. ChatGPT is kind of incredible. So, guys, 194 00:10:32,760 --> 00:10:34,880 Speaker 2: that is The Game Game. 195 00:10:35,080 --> 00:10:37,800 Speaker 1: I love The Game Game. I'm also curious as to 196 00:10:37,880 --> 00:10:41,679 Speaker 1: why both your and Tory's matches had Southern accents. 197 00:10:42,200 --> 00:10:46,600 Speaker 1: Is this a subtle kind of white loticification of society? 198 00:10:46,120 --> 00:10:49,120 Speaker 2: Where ChatGPT-4o is just like, you know what, 199 00:10:49,160 --> 00:10:52,160 Speaker 2: they're getting Southern girls and that's about it. Or Southern boys, 200 00:10:52,160 --> 00:10:54,640 Speaker 2: in the case of Tory; Nathan was Southern. I live 201 00:10:54,679 --> 00:10:56,200 Speaker 2: in New York, and one of the things that I 202 00:10:56,240 --> 00:11:00,679 Speaker 2: saw come up when I was talking to Penelope was 203 00:11:01,520 --> 00:11:04,440 Speaker 2: "plus plus empathetic."
Now, if I was talking to a New Yorker, 204 00:11:04,440 --> 00:11:06,640 Speaker 2: I would have been like, yo, girl, what's up. I'm 205 00:11:06,640 --> 00:11:09,160 Speaker 2: talking to Penelope, I'm like, well, girl, would you like 206 00:11:09,240 --> 00:11:10,880 Speaker 2: to meet me at the park again? 207 00:11:12,200 --> 00:11:15,040 Speaker 1: I thought that your lying about how Astrid had never 208 00:11:15,120 --> 00:11:17,719 Speaker 1: approached any other dogs was... I mean, that was... you 209 00:11:17,760 --> 00:11:20,760 Speaker 1: gotta make them feel special. 210 00:11:21,400 --> 00:11:25,000 Speaker 2: I hope this podcast never comes out. I love that 211 00:11:25,160 --> 00:11:27,200 Speaker 2: more than anything. I will be playing that all day, 212 00:11:27,360 --> 00:11:28,800 Speaker 2: and I think I will, by the end of it, 213 00:11:28,840 --> 00:11:32,880 Speaker 2: have a Southern accent. Just to get a little bit 214 00:11:32,920 --> 00:11:34,839 Speaker 2: more serious about this story: you know, the Washington Post 215 00:11:35,040 --> 00:11:37,080 Speaker 2: reached out to the vice president of Product Growth and 216 00:11:37,120 --> 00:11:40,200 Speaker 2: Revenue at Tinder, and she said that the game is 217 00:11:40,280 --> 00:11:43,040 Speaker 2: meant to be silly and that the company quote "leaned 218 00:11:43,080 --> 00:11:47,120 Speaker 2: into the campiness," apparently. She went on to call 219 00:11:47,240 --> 00:11:50,440 Speaker 2: Gen Z a socially anxious generation, and while the game 220 00:11:50,520 --> 00:11:53,319 Speaker 2: might be cringe, it's a generation that might look past 221 00:11:53,400 --> 00:11:56,400 Speaker 2: that if it indeed leads to a real connection. 222 00:11:56,960 --> 00:11:58,800 Speaker 1: I have to say, I mean, it was definitely fun 223 00:11:58,960 --> 00:12:02,680 Speaker 1: watching you play.
I have never before myself, and I 224 00:12:02,679 --> 00:12:04,520 Speaker 1: didn't this time either, but I've never had a conversation 225 00:12:04,640 --> 00:12:06,920 Speaker 1: directly using my voice with an AI before. Was that... 226 00:12:06,960 --> 00:12:08,040 Speaker 1: was that a first for you, or... 227 00:12:08,880 --> 00:12:11,400 Speaker 2: Only when I tried to scam my cousin, actually. 228 00:12:11,480 --> 00:12:11,880 Speaker 1: Yeah, yeah. 229 00:12:12,040 --> 00:12:14,600 Speaker 2: So crazy is the pressure cooker that that just created 230 00:12:14,679 --> 00:12:16,840 Speaker 2: for me. It felt like there was literally a gun to 231 00:12:16,880 --> 00:12:18,359 Speaker 2: my head that was like, flirt. 232 00:12:19,360 --> 00:12:21,880 Speaker 1: That's what it felt like. It's getting hot in here. 233 00:12:21,920 --> 00:12:24,520 Speaker 1: So we're going to take a quick break, and when we 234 00:12:24,559 --> 00:12:38,320 Speaker 1: come back, some more headlines. Now to pivot back to 235 00:12:38,360 --> 00:12:41,640 Speaker 1: the headlines. We've got a few more for today, continuing 236 00:12:41,679 --> 00:12:45,600 Speaker 1: the theme of sex, death, and money. Well, no death, thankfully, 237 00:12:46,000 --> 00:12:48,520 Speaker 1: but we've had sex in the form of flirting. Now 238 00:12:48,559 --> 00:12:50,040 Speaker 1: for money: taxes.
I sound 244 00:13:11,280 --> 00:13:14,160 Speaker 2: sad because I am is planning to stage a hackathon 245 00:13:14,200 --> 00:13:18,280 Speaker 2: event with the best engineers at the Internal Revenue Service. 246 00:13:18,600 --> 00:13:22,000 Speaker 2: According to Wired, DOE is planning to host dozens of 247 00:13:22,040 --> 00:13:24,679 Speaker 2: them in DC to build a mega API. 248 00:13:24,880 --> 00:13:28,400 Speaker 1: A mega API, that's actually what I read process. 249 00:13:28,679 --> 00:13:31,840 Speaker 2: It is a MEGAAPI essentially, which would make it easier 250 00:13:31,880 --> 00:13:36,199 Speaker 2: to access taxpayer data across different applications and cloud platforms. 251 00:13:36,240 --> 00:13:38,600 Speaker 1: We don't yet have a lot of details on the hackathon, 252 00:13:38,760 --> 00:13:41,040 Speaker 1: but I do hope they keep it tight because the 253 00:13:41,080 --> 00:13:45,200 Speaker 1: idea of highly sensitive tax data moving freely between what 254 00:13:45,240 --> 00:13:48,719 Speaker 1: maybe third party applications is a little frightening. There is 255 00:13:48,760 --> 00:13:52,200 Speaker 1: a broader controversy roiling the IRS. Several officials, including the 256 00:13:52,240 --> 00:13:56,360 Speaker 1: acting Commissioner, are quitting over the Trump administration's insistence that 257 00:13:56,480 --> 00:14:01,880 Speaker 1: the agency disclosed taxpayer information to Immigration and Customs Enforcement. 258 00:14:02,160 --> 00:14:05,719 Speaker 1: The IRIS has typically kept taxpay information confidential, even from 259 00:14:05,840 --> 00:14:10,920 Speaker 1: other government agencies, and that includes information submitted by undocumented immigrants. 
260 00:14:11,360 --> 00:14:13,960 Speaker 1: But in a new agreement, which appeared redacted in a 261 00:14:14,000 --> 00:14:17,480 Speaker 1: court filing, ICE officials can now ask the IRS for 262 00:14:17,520 --> 00:14:21,120 Speaker 1: information about people they're investigating or who've been ordered to 263 00:14:21,200 --> 00:14:22,280 Speaker 1: leave the US. 264 00:14:22,360 --> 00:14:24,760 Speaker 2: And in a story that takes us elsewhere, into a 265 00:14:24,800 --> 00:14:27,360 Speaker 2: topic I am personally obsessed with, which is right to 266 00:14:27,400 --> 00:14:28,280 Speaker 2: repair laws. 267 00:14:28,480 --> 00:14:29,200 Speaker 1: What does that mean? 268 00:14:29,600 --> 00:14:32,160 Speaker 2: Right to repair laws are basically laws that say 269 00:14:32,160 --> 00:14:35,640 Speaker 2: that companies have to provide information to people who buy 270 00:14:35,680 --> 00:14:38,800 Speaker 2: things that teaches them how to repair them, so, one, 271 00:14:38,880 --> 00:14:41,320 Speaker 2: you're not just buying new things every time they come out, 272 00:14:41,360 --> 00:14:44,640 Speaker 2: and two, you're able to actually know how to, 273 00:14:44,800 --> 00:14:47,360 Speaker 2: for example, repair a tractor. I was drawn to this 274 00:14:47,400 --> 00:14:50,840 Speaker 2: headline from The Verge with the perfect subhead, quote, "India's 275 00:14:50,920 --> 00:14:54,280 Speaker 2: repair culture gives new life to dead tech." 276 00:14:54,440 --> 00:14:56,400 Speaker 1: So we had sex, some money, and we do indeed have... 277 00:14:57,920 --> 00:15:00,640 Speaker 2: Oh, dead technology, which is about the least sexy thing 278 00:15:00,640 --> 00:15:04,800 Speaker 2: on the planet. There's a rise of Frankenstein laptops in India. Now, 279 00:15:04,800 --> 00:15:06,800 Speaker 2: when I say Frankenstein laptops, what do you think?
280 00:15:07,320 --> 00:15:11,320 Speaker 1: Uh, gosh, I guess I think about laptops assembled from 281 00:15:11,320 --> 00:15:12,040 Speaker 1: all different parts. 282 00:15:12,400 --> 00:15:14,320 Speaker 2: I thought you were gonna say laptops with two bolts 283 00:15:14,360 --> 00:15:18,000 Speaker 2: on the side of it. But yes, they're basically resurrected 284 00:15:18,040 --> 00:15:21,080 Speaker 2: computers made with parts from trashed older laptops and other 285 00:15:21,160 --> 00:15:24,480 Speaker 2: e-waste, e-waste meaning trash that is of the electronic 286 00:15:24,520 --> 00:15:27,600 Speaker 2: variety, at a fraction of the price. These laptops are 287 00:15:27,600 --> 00:15:31,400 Speaker 2: a good option for students, freelancers, or really anyone who 288 00:15:31,480 --> 00:15:34,000 Speaker 2: needs to be a part of India's growing digital economy 289 00:15:34,120 --> 00:15:36,640 Speaker 2: but may not be able to afford to participate otherwise. 290 00:15:36,680 --> 00:15:38,960 Speaker 1: Yeah, I think I read that you can basically get 291 00:15:39,080 --> 00:15:41,800 Speaker 1: a functional laptop, one of these 292 00:15:41,800 --> 00:15:45,280 Speaker 1: Frankenstein laptops, for around one hundred US dollars, which is 293 00:15:45,320 --> 00:15:47,880 Speaker 1: like an eighth of the price of any decent new laptop. 294 00:15:47,920 --> 00:15:49,120 Speaker 1: So it's a pretty cool story. 295 00:15:49,440 --> 00:15:52,160 Speaker 2: But these Frankenstein tinkerers don't have it so easy.
There 296 00:15:52,200 --> 00:15:56,080 Speaker 2: are actually many global tech giants who restrict access to 297 00:15:56,120 --> 00:15:59,600 Speaker 2: spare parts or use proprietary hardware, which means people are 298 00:15:59,600 --> 00:16:03,440 Speaker 2: going through piles of sometimes toxic trash to get the parts, 299 00:16:03,640 --> 00:16:06,360 Speaker 2: and India's government is beginning to discuss right to repair 300 00:16:06,440 --> 00:16:09,120 Speaker 2: laws to address this, but progress has been slow. 301 00:16:09,840 --> 00:16:12,200 Speaker 1: The final story for this week is about a question I 302 00:16:12,320 --> 00:16:16,160 Speaker 1: find quite fascinating: what will be the iPhone of AI? 303 00:16:16,720 --> 00:16:19,000 Speaker 1: Will there be a kind of AI product that becomes 304 00:16:19,040 --> 00:16:21,880 Speaker 1: so ubiquitous that we forget what life was like before 305 00:16:21,880 --> 00:16:26,440 Speaker 1: it existed? Well, the iPhone designer himself, Jony Ive, or 306 00:16:26,640 --> 00:16:29,960 Speaker 1: Sir Jony Ive, is working on it. Over a year ago, 307 00:16:30,000 --> 00:16:33,120 Speaker 1: he and Sam Altman, the CEO of OpenAI, began 308 00:16:33,160 --> 00:16:36,600 Speaker 1: discussing a device that might bring to life voice-enabled 309 00:16:36,800 --> 00:16:41,280 Speaker 1: AI assistants, partly inspired by Altman's well-documented fascination with 310 00:16:41,360 --> 00:16:44,920 Speaker 1: the movie Her. So Altman and Ive have this startup together, 311 00:16:45,080 --> 00:16:48,600 Speaker 1: io Products, that's raised hundreds of millions of dollars and is 312 00:16:48,640 --> 00:16:51,880 Speaker 1: working on some device concepts, including a quote "phone without 313 00:16:51,880 --> 00:16:55,280 Speaker 1: a screen," although some sources insist that it's in fact 314 00:16:55,480 --> 00:17:00,240 Speaker 1: not a phone, so the mystery remains.
But this ties into 315 00:17:00,240 --> 00:17:02,480 Speaker 1: a story in The Information, which is also reporting that 316 00:17:02,560 --> 00:17:07,760 Speaker 1: OpenAI executives are considering acquiring the startup io Products. This 317 00:17:07,800 --> 00:17:09,960 Speaker 1: would be a move that could potentially bring the AI 318 00:17:10,080 --> 00:17:13,960 Speaker 1: giant into more direct competition with Apple. It's not clear 319 00:17:14,080 --> 00:17:17,040 Speaker 1: where the negotiation is at the moment, but another of 320 00:17:17,080 --> 00:17:22,440 Speaker 1: these types of xAI-X deals is perhaps brewing, although while 321 00:17:22,440 --> 00:17:24,520 Speaker 1: Altman worked closely with Ive on the project, 322 00:17:24,960 --> 00:17:26,760 Speaker 1: it's not clear what his economic stake in it is. 323 00:17:26,800 --> 00:17:29,879 Speaker 2: Maybe if the new phone is not a phone, it 324 00:17:29,920 --> 00:17:31,960 Speaker 2: begs the question of how the next thing that we cover 325 00:17:32,200 --> 00:17:35,960 Speaker 2: is actually going to happen in a no-phone phone universe. 326 00:17:36,800 --> 00:17:40,159 Speaker 1: You're right, and our next segment is all about scammers, 327 00:17:40,280 --> 00:17:44,720 Speaker 1: and specifically scam callers, who famously use phones, and the 328 00:17:44,800 --> 00:17:48,400 Speaker 1: tech they're using to be more convincing and successful than ever. 329 00:17:48,960 --> 00:17:51,080 Speaker 2: Yeah. And one of the things that I can't stop 330 00:17:51,160 --> 00:17:53,120 Speaker 2: talking about on the show and talked a lot about 331 00:17:53,119 --> 00:17:58,000 Speaker 2: on Sleepwalkers is how much technological progress and innovation happens 332 00:17:58,040 --> 00:18:01,040 Speaker 2: in the sort of seedier parts of society.
And then 333 00:18:01,080 --> 00:18:05,520 Speaker 2: it's after everyone hears these sensational stories about criminal ingenuity 334 00:18:05,680 --> 00:18:08,320 Speaker 2: that the tech is more widely adopted by the general public. 335 00:18:08,600 --> 00:18:11,480 Speaker 2: But it's actually the illicit use that forges the way. 336 00:18:11,600 --> 00:18:13,640 Speaker 1: Yeah. I remember, back in twenty nineteen, when we first 337 00:18:13,680 --> 00:18:16,239 Speaker 1: started covering this stuff together, there was a study that 338 00:18:16,280 --> 00:18:19,080 Speaker 1: revealed that more than ninety five percent of all deep 339 00:18:19,119 --> 00:18:22,400 Speaker 1: fake videos on the Internet were non consensual porn. 340 00:18:22,600 --> 00:18:25,000 Speaker 2: Well, I actually didn't even know that three D printing 341 00:18:25,200 --> 00:18:28,159 Speaker 2: was a consumer tech until I heard that blueprints for 342 00:18:28,240 --> 00:18:31,640 Speaker 2: three D printed ghost guns were circulating on the internet. 343 00:18:32,119 --> 00:18:34,400 Speaker 1: We actually ran an experiment together a few years ago 344 00:18:34,600 --> 00:18:38,040 Speaker 1: to create a deep fake of your voice and scam 345 00:18:38,119 --> 00:18:40,439 Speaker 1: your cousin, and it took us about a week to 346 00:18:40,480 --> 00:18:42,680 Speaker 1: make that clone with the help of a company called 347 00:18:42,760 --> 00:18:46,240 Speaker 1: Lyrebird that was subsequently acquired by Descript, the software 348 00:18:46,280 --> 00:18:49,159 Speaker 1: that we use every week to make our podcast. We 349 00:18:49,200 --> 00:18:51,200 Speaker 1: didn't actually get to the scamming part, but we did 350 00:18:51,240 --> 00:18:54,600 Speaker 1: briefly trick Kara's cousin. That was back in twenty nineteen.
351 00:18:55,119 --> 00:18:57,640 Speaker 1: Since then, the state of the art and the kind 352 00:18:57,680 --> 00:19:01,840 Speaker 1: of social risks have really advanced. And here to tell us 353 00:19:01,880 --> 00:19:04,720 Speaker 1: more is Julie Jargon, the family and tech columnist at 354 00:19:04,760 --> 00:19:06,960 Speaker 1: the Wall Street Journal. Julie, welcome to Tech Stuff. 355 00:19:07,440 --> 00:19:08,280 Speaker 4: Thank you for having me. 356 00:19:08,640 --> 00:19:12,359 Speaker 2: So, just to begin, your article tells the story of 357 00:19:12,400 --> 00:19:14,520 Speaker 2: a woman who gets a terrifying call. Can you tell 358 00:19:14,600 --> 00:19:18,760 Speaker 2: us a little bit more about what happened in this exchange? 359 00:19:18,960 --> 00:19:21,479 Speaker 4: Yeah, absolutely. So there was a woman in Colorado by 360 00:19:21,480 --> 00:19:23,720 Speaker 4: the name of Linda Rohan, and she was just at 361 00:19:23,760 --> 00:19:26,640 Speaker 4: home one night making herself dinner, and her phone rang, 362 00:19:26,760 --> 00:19:29,199 Speaker 4: her cell phone, and the caller ID showed that the 363 00:19:29,240 --> 00:19:30,960 Speaker 4: call was from a local number, so she thought it 364 00:19:31,040 --> 00:19:33,840 Speaker 4: might be someone she should talk to, so she picked 365 00:19:33,880 --> 00:19:36,280 Speaker 4: it up and immediately heard a voice of a young 366 00:19:36,280 --> 00:19:39,840 Speaker 4: woman that she thought sounded exactly like the youngest of 367 00:19:39,840 --> 00:19:44,280 Speaker 4: her three adult daughters, a panicked, you know, message: Mom, 368 00:19:44,320 --> 00:19:46,879 Speaker 4: I'm okay, but something awful has happened, and she's sobbing 369 00:19:46,920 --> 00:19:50,080 Speaker 4: and saying she needs help. And that immediately put this 370 00:19:50,119 --> 00:19:53,760 Speaker 4: woman on high alert.
And then apparently a man took 371 00:19:53,800 --> 00:19:57,320 Speaker 4: the phone and mentioned the name of her daughter, 372 00:19:57,600 --> 00:20:00,000 Speaker 4: you know, by name, and said that she had witnessed 373 00:20:00,200 --> 00:20:02,919 Speaker 4: this drug deal and she screamed and it scared the 374 00:20:02,920 --> 00:20:05,439 Speaker 4: buyers away, and so now he was out all this money, 375 00:20:05,480 --> 00:20:08,560 Speaker 4: and he had pulled this girl into his van and 376 00:20:08,720 --> 00:20:10,280 Speaker 4: now was demanding money. 377 00:20:10,480 --> 00:20:13,520 Speaker 2: We know that this wasn't Linda's real daughter. Where was 378 00:20:13,600 --> 00:20:15,800 Speaker 2: Linda's daughter actually during this? 379 00:20:15,880 --> 00:20:18,600 Speaker 4: She was in her apartment the whole time, safe at home. 380 00:20:18,800 --> 00:20:21,240 Speaker 2: And can you talk a little bit about how this happened? 381 00:20:21,640 --> 00:20:23,960 Speaker 4: I think what happens with these kinds of callers is 382 00:20:24,040 --> 00:20:28,560 Speaker 4: they operate on fear and a sense of urgency. And 383 00:20:28,880 --> 00:20:33,199 Speaker 4: this scammer had an elaborate story that he kind of 384 00:20:33,640 --> 00:20:37,040 Speaker 4: kept this woman in through this whole time. He told her 385 00:20:37,080 --> 00:20:39,560 Speaker 4: that he needed money in order to free her daughter. 386 00:20:40,040 --> 00:20:43,520 Speaker 4: He told her to go to Walmart and wire money. 387 00:20:43,560 --> 00:20:46,000 Speaker 4: And so she gets in her car and finds the 388 00:20:46,040 --> 00:20:49,520 Speaker 4: nearest Walmart, and he timed how long it took her 389 00:20:49,560 --> 00:20:51,800 Speaker 4: to get there.
When she went to the Walmart, he 390 00:20:51,880 --> 00:20:53,600 Speaker 4: wanted to be on speaker the whole time, so he 391 00:20:53,680 --> 00:20:56,840 Speaker 4: had her conceal her phone in her shirt so he 392 00:20:56,880 --> 00:21:00,200 Speaker 4: could hear the conversation. And you know, I think he'd 393 00:21:00,200 --> 00:21:02,119 Speaker 4: made some kind of threats to her and, you know, 394 00:21:02,240 --> 00:21:05,119 Speaker 4: her daughter. And when she got to the Walmart, she 395 00:21:05,160 --> 00:21:07,119 Speaker 4: couldn't do the wire transfer because she didn't have a 396 00:21:07,119 --> 00:21:09,240 Speaker 4: debit card. So he told her to go home and 397 00:21:09,480 --> 00:21:12,520 Speaker 4: do it online. And he said, you've got sixteen minutes. 398 00:21:12,840 --> 00:21:15,080 Speaker 4: If you stop anywhere, I'm going to know, because he 399 00:21:15,160 --> 00:21:16,800 Speaker 4: knew how long it had taken her to drive there 400 00:21:16,800 --> 00:21:18,040 Speaker 4: in the first place. And he kept her on the 401 00:21:18,080 --> 00:21:20,720 Speaker 4: phone this whole time, talking to her, trying to keep 402 00:21:20,720 --> 00:21:24,359 Speaker 4: her calm, asking her questions. This whole scenario played out 403 00:21:24,480 --> 00:21:27,800 Speaker 4: for a long time, and she made not one, but 404 00:21:27,880 --> 00:21:33,520 Speaker 4: two money transfers online in order to obtain her daughter's freedom. 405 00:21:34,080 --> 00:21:37,680 Speaker 4: And once it was finally over, she called her daughter 406 00:21:38,000 --> 00:21:40,480 Speaker 4: and found that her daughter was safe in her apartment. 407 00:21:40,800 --> 00:21:43,520 Speaker 1: It was such a striking story because it had this 408 00:21:43,640 --> 00:21:46,600 Speaker 1: kind of cinematic quality. I mean, it's actually like a movie.
409 00:21:46,680 --> 00:21:49,880 Speaker 1: The guy is playing a version of her daughter's voice, 410 00:21:50,320 --> 00:21:53,320 Speaker 1: making her literally drive from A to B, having her 411 00:21:53,440 --> 00:21:57,200 Speaker 1: conceal a phone in her clothes. I mean, there's fifteen, 412 00:21:57,280 --> 00:21:59,840 Speaker 1: twenty, thirty, forty-five minutes, all the while she thinks 413 00:21:59,840 --> 00:22:04,160 Speaker 1: that her daughter has been abducted by a drug dealer. 414 00:22:04,280 --> 00:22:05,800 Speaker 1: I mean, what did that do to the mother? And 415 00:22:05,800 --> 00:22:07,199 Speaker 1: when you were interviewing her, what did she 416 00:22:07,560 --> 00:22:08,760 Speaker 1: reflect about the experience? 417 00:22:09,480 --> 00:22:11,480 Speaker 4: Yeah, she described it as something that she can still 418 00:22:11,640 --> 00:22:15,560 Speaker 4: feel viscerally. She retold the story to me three 419 00:22:15,600 --> 00:22:18,120 Speaker 4: times over the course of a few different conversations with her, 420 00:22:18,280 --> 00:22:20,480 Speaker 4: as I went through the story again and again with her, 421 00:22:20,920 --> 00:22:22,840 Speaker 4: and I could tell each time I talked to her 422 00:22:22,840 --> 00:22:24,560 Speaker 4: that she felt really nervous and worked up about it, 423 00:22:24,560 --> 00:22:26,520 Speaker 4: even though she knows it was all a scam, even 424 00:22:26,560 --> 00:22:28,880 Speaker 4: though she knows that her daughter was never in any 425 00:22:28,920 --> 00:22:33,000 Speaker 4: actual danger, but this whole ordeal was so terrifying to her.
426 00:22:33,600 --> 00:22:35,880 Speaker 4: And then that's of course why these scammers are so effective, 427 00:22:36,200 --> 00:22:38,920 Speaker 4: that they prey on the fear of people thinking that 428 00:22:38,960 --> 00:22:41,000 Speaker 4: they have a loved one, especially a child, who might 429 00:22:41,000 --> 00:22:43,520 Speaker 4: be in some sort of danger. So, even though she's 430 00:22:43,720 --> 00:22:46,680 Speaker 4: now more than a month removed from the situation, still 431 00:22:46,680 --> 00:22:50,760 Speaker 4: in the retelling she feels very, like, physically nervous and scared. 432 00:22:50,960 --> 00:22:53,119 Speaker 1: Well, that's not surprising, because the tension was kind of 433 00:22:53,160 --> 00:22:55,440 Speaker 1: ratcheting up. And then just when she thought that she'd 434 00:22:55,440 --> 00:22:57,920 Speaker 1: made the payment and everything was okay, there was a kind 435 00:22:57,920 --> 00:23:00,159 Speaker 1: of another turn of the screw, right, right. 436 00:23:00,240 --> 00:23:01,679 Speaker 4: She thought it was kind of over. She'd made one 437 00:23:01,720 --> 00:23:05,119 Speaker 4: transfer of a thousand dollars, and then there was a 438 00:23:05,119 --> 00:23:08,720 Speaker 4: commotion and the man on the phone came back and said, well, 439 00:23:08,760 --> 00:23:10,439 Speaker 4: you know, my boss is angry that it took so 440 00:23:10,560 --> 00:23:13,919 Speaker 4: long to transfer this money, so we need more. My 441 00:23:13,960 --> 00:23:17,119 Speaker 4: boss is mad and he thinks he could sell your 442 00:23:17,200 --> 00:23:19,520 Speaker 4: daughter for thirty thousand dollars. And then at that point 443 00:23:19,560 --> 00:23:22,040 Speaker 4: she hears her daughter in the background screaming like no, no, 444 00:23:22,160 --> 00:23:25,960 Speaker 4: you know, please help me, and this woman Linda wanted 445 00:23:26,000 --> 00:23:28,280 Speaker 4: to talk to her daughter.
She pleaded with this man 446 00:23:28,640 --> 00:23:30,639 Speaker 4: to let her talk to her daughter again, and he 447 00:23:30,680 --> 00:23:32,480 Speaker 4: said no, but, you know, we can end this now 448 00:23:32,520 --> 00:23:35,679 Speaker 4: if you send another thousand dollars. So then she wired 449 00:23:36,240 --> 00:23:40,720 Speaker 4: another thousand dollars through a different wire service, and at 450 00:23:40,720 --> 00:23:42,359 Speaker 4: that point it was finally over. 451 00:23:43,200 --> 00:23:46,800 Speaker 1: It's this kind of incredible intersection of both a new technology, 452 00:23:46,920 --> 00:23:50,119 Speaker 1: like the ubiquity of deep fake voices, and a 453 00:23:50,160 --> 00:23:53,440 Speaker 1: tremendously sophisticated psychological hack, right, I mean it has both 454 00:23:53,480 --> 00:23:54,640 Speaker 1: elements, exactly. 455 00:23:54,640 --> 00:23:56,600 Speaker 4: And these kinds of imposter scams have been going on 456 00:23:56,640 --> 00:23:58,679 Speaker 4: for a long time. I mean years ago, we'd be 457 00:23:58,760 --> 00:24:01,679 Speaker 4: hearing about grandparents getting calls from someone that was 458 00:24:01,760 --> 00:24:03,959 Speaker 4: claiming to be their grandson. But they usually didn't have 459 00:24:04,000 --> 00:24:06,639 Speaker 4: a name, they didn't have, you know, the voice was 460 00:24:06,680 --> 00:24:09,600 Speaker 4: like just any young man, you know. And so it's 461 00:24:09,720 --> 00:24:14,040 Speaker 4: kind of using the same type of social engineering, but 462 00:24:14,200 --> 00:24:16,960 Speaker 4: ramped up in a more technological way that makes it 463 00:24:17,040 --> 00:24:18,080 Speaker 4: all the more believable. 464 00:24:24,359 --> 00:24:26,680 Speaker 2: When we come back, we'll hear about the way generative 465 00:24:26,720 --> 00:24:44,240 Speaker 2: AI makes imposter scams so convincing.
Welcome back. So, Julie, 466 00:24:44,320 --> 00:24:47,119 Speaker 2: I have a lot of friends whose grandparents this has 467 00:24:47,160 --> 00:24:50,600 Speaker 2: happened to, and it preys on this sort of psychology 468 00:24:50,600 --> 00:24:54,000 Speaker 2: of oh my god, my grandchild is in trouble, let 469 00:24:54,080 --> 00:24:57,439 Speaker 2: me help them, without really thinking about how possible it 470 00:24:57,480 --> 00:24:59,920 Speaker 2: is that this is actually going on. This, to your point, 471 00:25:00,119 --> 00:25:03,679 Speaker 2: is like an extremely ratcheted up version of this, and 472 00:25:03,720 --> 00:25:08,320 Speaker 2: it begs the question how exactly does something like this work? 473 00:25:08,640 --> 00:25:11,720 Speaker 2: How has it gotten so much more advanced? And I 474 00:25:11,760 --> 00:25:14,520 Speaker 2: think most importantly for this show, Tech Stuff, is like, 475 00:25:14,920 --> 00:25:18,639 Speaker 2: how were these people able to replicate Linda's daughter's voice? 476 00:25:19,680 --> 00:25:21,600 Speaker 4: Well, what we don't know here is whether they in 477 00:25:21,680 --> 00:25:25,840 Speaker 4: fact cloned her voice from some publicly available audio, you know, 478 00:25:25,880 --> 00:25:29,000 Speaker 4: whether her daughter had a YouTube video out there or some 479 00:25:29,119 --> 00:25:31,639 Speaker 4: other type of audio or video that they could have 480 00:25:32,400 --> 00:25:34,679 Speaker 4: grabbed her voice from. She's twenty six, so chances are 481 00:25:34,720 --> 00:25:37,480 Speaker 4: she could have. I didn't find any social media 482 00:25:37,520 --> 00:25:39,800 Speaker 4: accounts that I could access for her. But there are 483 00:25:39,800 --> 00:25:43,080 Speaker 4: other ways that you can approximate the sound of someone's voice.
484 00:25:43,200 --> 00:25:45,520 Speaker 4: There are a bunch of apps that are free or 485 00:25:45,720 --> 00:25:49,000 Speaker 4: very inexpensive on the different app stores that allow you 486 00:25:49,080 --> 00:25:51,920 Speaker 4: to change your voice. A fifty-year-old man could change 487 00:25:51,920 --> 00:25:54,879 Speaker 4: his voice to sound like a twenty-year-old woman, 488 00:25:55,119 --> 00:25:57,120 Speaker 4: you know, and you can change the dialect or the accent, 489 00:25:57,600 --> 00:26:00,639 Speaker 4: and those can be pretty convincing. And the experts I 490 00:26:00,680 --> 00:26:04,040 Speaker 4: talked to, both psychologists and cybersecurity experts, said that, you 491 00:26:04,040 --> 00:26:07,400 Speaker 4: know, when you're in this moment of fear, and you've 492 00:26:07,440 --> 00:26:10,439 Speaker 4: already gotten this idea in your mind that your daughter 493 00:26:10,600 --> 00:26:12,960 Speaker 4: is calling you, the first thing they're saying is Mom, 494 00:26:13,600 --> 00:26:16,880 Speaker 4: your mind immediately switches to one of your children. And 495 00:26:17,000 --> 00:26:21,000 Speaker 4: so if they're able to approximate a voice of a 496 00:26:21,160 --> 00:26:25,240 Speaker 4: twenty-year-old woman, then your mind might immediately think 497 00:26:25,280 --> 00:26:28,040 Speaker 4: that that is your daughter, when it may not be 498 00:26:28,720 --> 00:26:30,800 Speaker 4: her actual voice or a clone of her voice. So in 499 00:26:30,840 --> 00:26:34,239 Speaker 4: this case, we'll never know how they either got a 500 00:26:34,280 --> 00:26:36,640 Speaker 4: clone of her voice or whether they used some sort 501 00:26:36,680 --> 00:26:39,119 Speaker 4: of generative AI to create a voice that sounded like 502 00:26:39,160 --> 00:26:41,399 Speaker 4: it could be her daughter.
And then a couple of 503 00:26:41,400 --> 00:26:44,320 Speaker 4: the tip-offs here are that she wasn't able to 504 00:26:44,400 --> 00:26:47,560 Speaker 4: interact with the daughter. There were just these clips of 505 00:26:47,640 --> 00:26:51,120 Speaker 4: sound playing. She didn't have a conversation. 506 00:26:51,800 --> 00:26:53,879 Speaker 2: This is exactly what we did with my cousin, and 507 00:26:53,920 --> 00:26:55,800 Speaker 2: what she had said to me that was so interesting 508 00:26:55,840 --> 00:26:58,840 Speaker 2: about this is the thing that tricked her was not 509 00:26:58,920 --> 00:27:01,679 Speaker 2: that I had this, like, incredible deep fake, but it 510 00:27:01,720 --> 00:27:04,800 Speaker 2: was the context. So she didn't really question the fact 511 00:27:04,920 --> 00:27:07,199 Speaker 2: that it was me, not because it really sounded like me, 512 00:27:07,560 --> 00:27:10,240 Speaker 2: but because of the context of our conversation. She called 513 00:27:10,240 --> 00:27:11,919 Speaker 2: me and I picked up, so why shouldn't she think 514 00:27:11,960 --> 00:27:14,160 Speaker 2: it's me, right, exactly. 515 00:27:14,359 --> 00:27:16,560 Speaker 1: So what is the scale of this problem? 516 00:27:16,920 --> 00:27:20,719 Speaker 4: It's really huge. The Federal Trade Commission said that the 517 00:27:21,280 --> 00:27:26,640 Speaker 4: number one category of fraud last year was imposter scams. 518 00:27:26,840 --> 00:27:29,359 Speaker 4: So that doesn't mean that they're all AI-generated, but 519 00:27:29,960 --> 00:27:35,160 Speaker 4: scams in which people are calling, texting, emailing, whatever, impersonating 520 00:27:35,359 --> 00:27:38,840 Speaker 4: someone that someone knows with some sort of story and 521 00:27:38,880 --> 00:27:39,919 Speaker 4: a request for money.
522 00:27:40,760 --> 00:27:44,600 Speaker 2: This is something that is incredibly advanced for people who 523 00:27:44,640 --> 00:27:47,520 Speaker 2: we'd often call petty criminals. Does that mean that the 524 00:27:47,560 --> 00:27:51,920 Speaker 2: technology has become so ubiquitous that it's very accessible to 525 00:27:51,920 --> 00:27:54,359 Speaker 2: people we would call petty criminals? It's no longer the 526 00:27:54,359 --> 00:27:55,919 Speaker 2: thing of like, oh, I'm going to get on the 527 00:27:55,960 --> 00:27:58,600 Speaker 2: subway and pickpocket someone for the amount of money that 528 00:27:58,640 --> 00:28:00,320 Speaker 2: you might be able to get for this kind of scam. 529 00:28:00,400 --> 00:28:02,439 Speaker 2: So I mean, not that I think you have a 530 00:28:02,480 --> 00:28:06,000 Speaker 2: criminal mind, Julie, but I'm wondering, from your perspective as 531 00:28:06,000 --> 00:28:08,360 Speaker 2: someone who's now reported on this, is this the kind 532 00:28:08,400 --> 00:28:12,240 Speaker 2: of crime that people who are looking to scam people 533 00:28:12,240 --> 00:28:15,000 Speaker 2: are engaging in? Is it because it's so easy? 534 00:28:15,080 --> 00:28:17,720 Speaker 4: Yeah, it has become a lot easier because of the 535 00:28:17,760 --> 00:28:21,040 Speaker 4: ubiquity of these tools that can do voice clones or 536 00:28:21,600 --> 00:28:25,000 Speaker 4: AI-generated voice approximations of people. All you have to 537 00:28:25,000 --> 00:28:28,879 Speaker 4: do is google it and you'll find dozens of online 538 00:28:28,880 --> 00:28:32,280 Speaker 4: tools that are either free or very, very inexpensive. Or 539 00:28:32,359 --> 00:28:35,200 Speaker 4: go on the app store and download a voice-changing app.
540 00:28:35,240 --> 00:28:39,440 Speaker 4: So it's widely accessible, it's inexpensive, and all it takes 541 00:28:39,520 --> 00:28:42,920 Speaker 4: is one person who sends two thousand dollars, and however 542 00:28:42,960 --> 00:28:45,680 Speaker 4: long this scenario went on, maybe thirty, forty-five minutes 543 00:28:45,760 --> 00:28:48,840 Speaker 4: or whatever, you know, they got two thousand dollars. So 544 00:28:49,240 --> 00:28:52,120 Speaker 4: you get a few victims over the course of some 545 00:28:52,360 --> 00:28:56,360 Speaker 4: period of time, and the payout can be pretty sizable. 546 00:28:57,080 --> 00:29:00,160 Speaker 1: Yeah, I think, I mean, there's a financial cost, to 547 00:29:00,200 --> 00:29:02,840 Speaker 1: your point about, you know, the aftermath for Linda. There's 548 00:29:02,840 --> 00:29:06,280 Speaker 1: also this tremendous emotional cost, I mean, the trauma of it. 549 00:29:06,400 --> 00:29:09,360 Speaker 1: I saw a documentary the other day produced by Bloomberg 550 00:29:09,600 --> 00:29:13,960 Speaker 1: about young teens who are basically the victims of sextortion scams. 551 00:29:14,040 --> 00:29:16,160 Speaker 1: So somebody pretends to be somebody in their community and 552 00:29:16,200 --> 00:29:18,960 Speaker 1: gets them to send nude photos, and maybe they're looking 553 00:29:19,000 --> 00:29:21,320 Speaker 1: to get I think hundreds of dollars, but in some 554 00:29:21,360 --> 00:29:24,520 Speaker 1: cases this pushes the teens to suicide. And so it's 555 00:29:24,600 --> 00:29:26,479 Speaker 1: not like, yes, you have a sense of violation if 556 00:29:26,480 --> 00:29:28,880 Speaker 1: you get pickpocketed on the subway, but this goes 557 00:29:28,920 --> 00:29:32,440 Speaker 1: to the core of your deepest fears, and I guess 558 00:29:32,480 --> 00:29:34,280 Speaker 1: maybe that's one of the reasons why your story went 559 00:29:34,440 --> 00:29:36,760 Speaker 1: so viral.
But what can we do and what can 560 00:29:36,800 --> 00:29:38,840 Speaker 1: listeners do? What can readers do? What is the way 561 00:29:38,840 --> 00:29:41,120 Speaker 1: to make ourselves a bit more robust in the face 562 00:29:41,160 --> 00:29:41,360 Speaker 1: of this? 563 00:29:41,640 --> 00:29:43,440 Speaker 4: Well, I do have a column coming out this weekend 564 00:29:43,480 --> 00:29:45,040 Speaker 4: that will have tips, so I don't want to preempt 565 00:29:45,120 --> 00:29:49,240 Speaker 4: that, but there are things you can do. 566 00:29:49,240 --> 00:29:51,920 Speaker 1: So stay tuned, read all about it. Yeah, yeah, read all about 567 00:29:51,640 --> 00:29:53,720 Speaker 4: it when it comes out. But yeah, I mean, I 568 00:29:53,720 --> 00:29:56,360 Speaker 4: think just awareness, for one, is a major thing. And 569 00:29:56,400 --> 00:29:59,680 Speaker 4: I think that's why so many people responded to this, 570 00:29:59,720 --> 00:30:04,080 Speaker 4: because you could talk to anybody and someone knows someone to 571 00:30:04,120 --> 00:30:07,000 Speaker 4: whom this has happened, or something, you know, very similar, 572 00:30:07,520 --> 00:30:10,600 Speaker 4: and that shows the scale of the problem. And I 573 00:30:10,640 --> 00:30:13,680 Speaker 4: think what's unfortunate is that people who are victimized by 574 00:30:13,680 --> 00:30:16,360 Speaker 4: these scams feel an incredible sense of shame and embarrassment 575 00:30:16,520 --> 00:30:19,040 Speaker 4: about it.
You know, after their mind has calmed down, 576 00:30:19,120 --> 00:30:21,240 Speaker 4: they can easily go back and see the red flags, 577 00:30:21,240 --> 00:30:24,040 Speaker 4: and they can, you know... Even when Linda was experiencing 578 00:30:24,040 --> 00:30:26,920 Speaker 4: this from the beginning, it crossed her mind this could 579 00:30:26,960 --> 00:30:29,280 Speaker 4: be a scam, but she felt like the stakes were 580 00:30:29,320 --> 00:30:32,360 Speaker 4: too high to just hang up and call her daughter 581 00:30:32,400 --> 00:30:34,080 Speaker 4: at that point, because she thought, you know, there was 582 00:30:34,080 --> 00:30:36,560 Speaker 4: that one kernel of like what if, what if my 583 00:30:36,640 --> 00:30:38,880 Speaker 4: daughter really has witnessed the drug deal and is in 584 00:30:38,920 --> 00:30:41,520 Speaker 4: the back of this person's van, and now her life 585 00:30:41,600 --> 00:30:42,280 Speaker 4: is in my hands. 586 00:30:42,400 --> 00:30:44,200 Speaker 2: And you don't want to be the mother that avoided 587 00:30:44,200 --> 00:30:47,400 Speaker 2: this because you think that you're being sort of techno savvy, right? 588 00:30:47,440 --> 00:30:49,120 Speaker 2: And all of a sudden, yeah. And that's when it 589 00:30:49,160 --> 00:30:52,600 Speaker 2: really speaks to the sort of core emotional piece of 590 00:30:52,640 --> 00:30:55,320 Speaker 2: these kinds of scams. To Oz's point, it's not just pickpocketing. 591 00:30:55,640 --> 00:30:57,320 Speaker 2: You know, pickpocketing, you can say, oh, I should have 592 00:30:57,400 --> 00:30:59,880 Speaker 2: closed up my jacket better. But this is something that 593 00:31:00,120 --> 00:31:03,000 Speaker 2: is just so much more complicated than that. And also 594 00:31:03,080 --> 00:31:05,720 Speaker 2: I think has dual use.
I just wanted to bring 595 00:31:05,760 --> 00:31:08,840 Speaker 2: that up quickly, like a lot of these technologies are 596 00:31:08,880 --> 00:31:12,120 Speaker 2: not just created for bad, right? So it's not something 597 00:31:12,160 --> 00:31:14,920 Speaker 2: that can just be kind of wiped out. Deep fake 598 00:31:14,960 --> 00:31:19,880 Speaker 2: technology also has some really interesting applications that I think 599 00:31:19,960 --> 00:31:23,080 Speaker 2: we all benefit from now. So it becomes that sort 600 00:31:23,120 --> 00:31:26,040 Speaker 2: of complicated intersection of, like, some people are using this 601 00:31:26,480 --> 00:31:29,800 Speaker 2: to take advantage of people and scam them, and other 602 00:31:29,800 --> 00:31:34,080 Speaker 2: people are using it to make some really interesting practical applications. 603 00:31:34,120 --> 00:31:36,800 Speaker 2: So I don't know, I'm curious to see your column, 604 00:31:36,800 --> 00:31:41,080 Speaker 2: but it's definitely less simple than just get rid of 605 00:31:41,120 --> 00:31:41,920 Speaker 2: this technology. 606 00:31:42,120 --> 00:31:44,280 Speaker 4: Well, it's not going away, that's for sure. There are 607 00:31:44,680 --> 00:31:48,479 Speaker 4: obviously good uses of generative AI and it's definitely here 608 00:31:48,520 --> 00:31:51,160 Speaker 4: to stay. And my worry is, as it gets better 609 00:31:51,200 --> 00:31:54,040 Speaker 4: and better, especially with video, I just wonder at some 610 00:31:54,160 --> 00:31:58,040 Speaker 4: point will people be able to receive FaceTime calls, 611 00:31:58,120 --> 00:32:02,520 Speaker 4: right, video calls, where they feel like they're seeing and interacting 612 00:32:02,600 --> 00:32:05,920 Speaker 4: with someone who looks just like their child? Now, how 613 00:32:07,040 --> 00:32:08,720 Speaker 4: do you then tell that that's not real? 614 00:32:09,880 --> 00:32:11,400 Speaker 1: Julie, thank you so much for your time today.
615 00:32:11,560 --> 00:32:14,240 Speaker 2: Thank you, Julie. We'll look forward to that column this Saturday. 616 00:32:14,480 --> 00:32:15,560 Speaker 4: Yeah, thank you for having me. 617 00:32:22,680 --> 00:32:24,280 Speaker 2: That's it for this week for Tech Stuff. 618 00:32:24,560 --> 00:32:27,800 Speaker 1: I'm Kara Price, and I'm Oz Woloshyn. This episode was 619 00:32:27,800 --> 00:32:32,240 Speaker 1: produced by Eliza Dennis, Victoria Dominguez, and Adriana Tapia. It 620 00:32:32,280 --> 00:32:35,480 Speaker 1: was executive produced by me, Kara Price, and Kate Osborne 621 00:32:35,560 --> 00:32:40,520 Speaker 1: for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer is Bihit 622 00:32:40,560 --> 00:32:43,880 Speaker 1: Fraser, and Kyle Murdock mixed this episode, and he also 623 00:32:43,880 --> 00:32:44,640 Speaker 1: wrote our theme song. 624 00:32:44,880 --> 00:32:47,600 Speaker 2: Join us next Wednesday for Tech Stuff: The Story, when we 625 00:32:47,640 --> 00:32:50,880 Speaker 2: will share an in-depth conversation with Jen Statsky, creator and 626 00:32:50,920 --> 00:32:54,000 Speaker 2: writer of the hit HBO Max show Hacks. We'll chat 627 00:32:54,000 --> 00:32:56,280 Speaker 2: about if AI is coming for her job and what 628 00:32:56,320 --> 00:32:58,800 Speaker 2: it's like to make TV, knowing you're likely battling for 629 00:32:58,840 --> 00:32:59,760 Speaker 2: attention with a 630 00:32:59,680 --> 00:33:03,440 Speaker 1: second screen. Please rate, review, and reach out to us at 631 00:33:03,480 --> 00:33:06,720 Speaker 1: tech Stuff podcast at gmail dot com. We love hearing 632 00:33:06,760 --> 00:33:07,120 Speaker 1: from you.