1 00:00:13,800 --> 00:00:17,400 Speaker 1: From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm 2 00:00:17,400 --> 00:00:18,400 Speaker 1: Oz Woloshyn, and 3 00:00:18,360 --> 00:00:19,200 Speaker 2: I'm Karah Preiss. 4 00:00:19,440 --> 00:00:22,119 Speaker 1: Today we get into what it's like to be a 5 00:00:22,200 --> 00:00:25,600 Speaker 1: human who trains AI for a living, and the story 6 00:00:25,640 --> 00:00:29,120 Speaker 1: of an activist who turned the surveillance state against the 7 00:00:29,200 --> 00:00:33,280 Speaker 1: Chinese government. Then, on Chat and Me, a real-life therapist 8 00:00:33,680 --> 00:00:35,760 Speaker 1: talks about how he uses AI. 9 00:00:36,200 --> 00:00:38,879 Speaker 3: It felt like a steady partner, what I call a 10 00:00:39,000 --> 00:00:44,040 Speaker 3: cognitive prosthesis, not thinking for me, but helping me organize 11 00:00:44,080 --> 00:00:46,960 Speaker 3: and extend my own thinking, the way a walking stick 12 00:00:47,080 --> 00:00:48,160 Speaker 3: supports you as 13 00:00:48,120 --> 00:00:53,159 Speaker 1: you walk. All of that on The Week in Tech. It's Friday, September 14 00:00:52,640 --> 00:01:02,200 Speaker 2: twelfth. Hi, Karah. Hi, Oz. 15 00:01:02,440 --> 00:01:05,600 Speaker 1: So normally we record these together in studio, but this 16 00:01:05,640 --> 00:01:08,520 Speaker 1: week I'm in a hotel room in Munich, Germany, at 17 00:01:08,520 --> 00:01:11,440 Speaker 1: a conference called DLD, Digital Life Design. 18 00:01:11,680 --> 00:01:15,039 Speaker 2: It sounds very lively. How is it? 19 00:01:15,040 --> 00:01:17,040 Speaker 1: It's good. You know, are you familiar with this concept 20 00:01:17,040 --> 00:01:19,920 Speaker 1: of ambient AI that they're using in medical settings? 21 00:01:19,959 --> 00:01:22,040 Speaker 2: Sort of, yes. Can you just tell me a little 22 00:01:22,080 --> 00:01:22,560 Speaker 2: bit about it? 23 00:01:22,800 --> 00:01:27,479 Speaker 1: So basically, it's like cameras and sound recording in settings 24 00:01:27,520 --> 00:01:30,640 Speaker 1: where a person doesn't have to go and use AI. 25 00:01:30,800 --> 00:01:33,240 Speaker 1: AI is just running in the background and feeding back 26 00:01:33,319 --> 00:01:35,759 Speaker 1: kind of insights and data. And there was a talk 27 00:01:35,880 --> 00:01:39,800 Speaker 1: yesterday about this being used in hotel settings for food, 28 00:01:39,800 --> 00:01:43,400 Speaker 1: which I found pretty interesting. Apparently, using this technology 29 00:01:43,560 --> 00:01:46,120 Speaker 1: has in some cases reduced food waste in hotels by 30 00:01:46,120 --> 00:01:47,400 Speaker 1: an average of forty percent. 31 00:01:47,800 --> 00:01:50,040 Speaker 2: What does ambient AI look like in this case? Like, 32 00:01:50,120 --> 00:01:53,000 Speaker 2: is it like a panopticon? Are people getting publicly shamed? 33 00:01:54,560 --> 00:01:56,240 Speaker 1: Well, it is a bit like a panopticon, but the 34 00:01:56,320 --> 00:01:59,160 Speaker 1: customers are not being publicly shamed on their third trip 35 00:01:59,200 --> 00:02:02,600 Speaker 1: to the buffet. Basically, there are cameras, and there's also 36 00:02:02,640 --> 00:02:07,520 Speaker 1: a weighing device for trash bins, and so it will 37 00:02:07,600 --> 00:02:10,359 Speaker 1: track the food at the end of the buffet being 38 00:02:10,760 --> 00:02:13,720 Speaker 1: transported by the servers to the trash and thrown out 39 00:02:13,760 --> 00:02:16,840 Speaker 1: and then being weighed.
And the insights are not rocket science. 40 00:02:16,880 --> 00:02:20,400 Speaker 1: It's like, hey guys, don't fill the bread basket fifteen 41 00:02:20,440 --> 00:02:22,800 Speaker 1: minutes before the service ends. Things that, once you 42 00:02:22,880 --> 00:02:25,360 Speaker 1: hear them, are obvious, but it's interesting that 43 00:02:25,600 --> 00:02:28,680 Speaker 1: being told by technology apparently makes a real difference. 44 00:02:29,120 --> 00:02:31,959 Speaker 2: I remember when you and I, now many years 45 00:02:32,000 --> 00:02:34,920 Speaker 2: ago, went to speak at Johns Hopkins. They told us 46 00:02:34,960 --> 00:02:39,160 Speaker 2: about connected devices that nurses wear to make hospital work 47 00:02:39,240 --> 00:02:41,360 Speaker 2: more efficient. So I think this is something that we 48 00:02:41,360 --> 00:02:44,639 Speaker 2: should pay attention to. But speaking of digital life design, 49 00:02:44,840 --> 00:02:46,360 Speaker 2: I don't know if you saw this article in the 50 00:02:46,360 --> 00:02:49,640 Speaker 2: Wall Street Journal called My Day as an Eighty-Year-Old: 51 00:02:49,680 --> 00:02:51,160 Speaker 2: What an Age Simulation Suit Taught Me. Did you see this? 52 00:02:51,160 --> 00:02:52,720 Speaker 1: I saw the email 53 00:02:52,760 --> 00:02:54,240 Speaker 1: and I was fascinated. But tell me more. 54 00:02:54,400 --> 00:02:59,440 Speaker 2: Basically, MIT has created this age simulation suit, which is 55 00:02:59,480 --> 00:03:02,160 Speaker 2: a full-body suit that was built by scientists at 56 00:03:02,160 --> 00:03:05,120 Speaker 2: the AgeLab, which does research on how to improve 57 00:03:05,240 --> 00:03:08,040 Speaker 2: life for the elderly. So the journalist who wrote the article, 58 00:03:08,160 --> 00:03:10,880 Speaker 2: her name is Amy Dockser Marcus, and she recently wore it, 59 00:03:10,919 --> 00:03:12,720 Speaker 2: and that's what she wrote this article about. 60 00:03:12,760 --> 00:03:15,880 Speaker 2: And she said she was fitted with, just imagine this, 61 00:03:15,960 --> 00:03:19,840 Speaker 2: a fifteen-pound weighted vest, more weights on her ankles 62 00:03:19,840 --> 00:03:23,480 Speaker 2: and wrists to kind of simulate the loss of muscle mass. 63 00:03:23,600 --> 00:03:26,720 Speaker 2: And the suit also had this very elaborate bungee cord 64 00:03:26,840 --> 00:03:30,960 Speaker 2: system and a neck collar that limited her mobility. She 65 00:03:31,000 --> 00:03:33,800 Speaker 2: also had glasses. And there are great photos, if people 66 00:03:33,800 --> 00:03:36,120 Speaker 2: who are listening want to look it up in the 67 00:03:36,160 --> 00:03:38,640 Speaker 2: Wall Street Journal, of what 68 00:03:38,680 --> 00:03:41,960 Speaker 2: her distorted vision looked like. She also wore padded Crocs 69 00:03:42,000 --> 00:03:43,560 Speaker 2: that made it really hard to balance. 70 00:03:44,000 --> 00:03:45,560 Speaker 1: This sounds like hell. 71 00:03:45,360 --> 00:03:49,040 Speaker 2: It sounds like the worst hangover of your life, basically. Yeah, 72 00:03:49,200 --> 00:03:53,400 Speaker 2: MIT calls the suit AGNES, which is an acronym for 73 00:03:53,640 --> 00:03:55,680 Speaker 2: Age Gain Now Empathy System. 74 00:03:55,800 --> 00:03:58,440 Speaker 1: That's quite an acronym. It sounds to me like a 75 00:03:58,600 --> 00:04:01,440 Speaker 1: backronym, but I get it.
What made the story stand 76 00:04:01,480 --> 00:04:01,840 Speaker 1: out to you? 77 00:04:02,080 --> 00:04:04,840 Speaker 2: So you know, I think it's a very interesting lesson 78 00:04:04,840 --> 00:04:07,160 Speaker 2: in empathy, or a lesson in schlepping if you come 79 00:04:07,200 --> 00:04:11,760 Speaker 2: from my culture. But it's actually a part of a 80 00:04:11,840 --> 00:04:14,920 Speaker 2: longer arc of product research for older people, which I'm 81 00:04:15,000 --> 00:04:18,839 Speaker 2: very interested in as someone who has an aging parent 82 00:04:18,920 --> 00:04:21,839 Speaker 2: who is very young in spirit. I think it's a 83 00:04:21,880 --> 00:04:24,440 Speaker 2: real test of our empathy to understand how people who 84 00:04:24,440 --> 00:04:27,000 Speaker 2: are older than us navigate the world. And there was 85 00:04:27,040 --> 00:04:29,640 Speaker 2: this woman who I became obsessed with many years ago 86 00:04:29,720 --> 00:04:32,360 Speaker 2: named Patricia Moore. Patty Moore? Have you ever heard of her? 87 00:04:32,560 --> 00:04:33,000 Speaker 1: I haven't. 88 00:04:33,240 --> 00:04:36,640 Speaker 2: So she is most famous for pioneering an approach called 89 00:04:36,760 --> 00:04:41,240 Speaker 2: universal design, which basically means designing products and spaces with 90 00:04:41,360 --> 00:04:44,880 Speaker 2: empathy for the widest possible audience in mind. So, like, 91 00:04:45,520 --> 00:04:48,920 Speaker 2: will this refrigerator be something that a ninety-year-old 92 00:04:48,960 --> 00:04:50,360 Speaker 2: can open, for example. 93 00:04:50,560 --> 00:04:52,800 Speaker 1: Yeah, it's interesting, and I guess it's kind of become 94 00:04:52,839 --> 00:04:54,920 Speaker 1: a bit of a standard now. I'm not sure if 95 00:04:55,000 --> 00:04:58,119 Speaker 1: companies always do it, but certainly companies always talk about 96 00:04:58,160 --> 00:05:01,920 Speaker 1: the idea that they don't want to design products with the 97 00:05:01,960 --> 00:05:05,719 Speaker 1: healthy mid-thirties product designer in mind as the user, 98 00:05:06,000 --> 00:05:07,920 Speaker 1: but they want to have a wider view on how these 99 00:05:07,960 --> 00:05:10,560 Speaker 1: products might be used by people of different ages, people with 100 00:05:10,560 --> 00:05:11,839 Speaker 1: different abilities, that kind of thing. 101 00:05:12,040 --> 00:05:15,640 Speaker 2: To your point, it wasn't always a consideration. And when 102 00:05:15,680 --> 00:05:18,640 Speaker 2: Patty was twenty-six years old, this is in nineteen 103 00:05:18,680 --> 00:05:21,000 Speaker 2: seventy-nine, I'm giving away her age, and I know 104 00:05:21,040 --> 00:05:22,520 Speaker 2: she's going to listen to this because she's a friend 105 00:05:22,560 --> 00:05:22,800 Speaker 2: of mine. 106 00:05:22,800 --> 00:05:25,320 Speaker 1: So she's been at this. She's been at this 107 00:05:25,600 --> 00:05:27,120 Speaker 1: since way before it was popular. 108 00:05:27,279 --> 00:05:29,760 Speaker 2: She worked at Raymond Loewy.
There was some grant from 109 00:05:29,760 --> 00:05:33,080 Speaker 2: the Nixon administration that allowed her to have the job 110 00:05:33,120 --> 00:05:36,200 Speaker 2: that she had. But she was one of the only 111 00:05:36,320 --> 00:05:39,760 Speaker 2: people who was bringing up any kind of conversation about 112 00:05:40,160 --> 00:05:44,720 Speaker 2: product design for the elderly and how we can engage 113 00:05:44,720 --> 00:05:48,279 Speaker 2: in a conversation about that. But what she did that 114 00:05:48,400 --> 00:05:52,880 Speaker 2: is so kind of different but ingenious vis-à-vis AGNES is, 115 00:05:52,960 --> 00:05:55,520 Speaker 2: she got a friend of hers who was a makeup 116 00:05:55,600 --> 00:05:59,479 Speaker 2: artist from Saturday Night Live to turn her into an 117 00:05:59,520 --> 00:06:03,719 Speaker 2: eighty-five-year-old woman. She traveled around the country 118 00:06:03,839 --> 00:06:07,280 Speaker 2: in these outfits, essentially, and she got a really good 119 00:06:07,320 --> 00:06:10,520 Speaker 2: sense of what it was like to live as an 120 00:06:10,520 --> 00:06:13,680 Speaker 2: elderly person in the world in the late nineteen seventies. 121 00:06:13,920 --> 00:06:16,719 Speaker 2: After she did this, she went on to be a 122 00:06:16,720 --> 00:06:20,760 Speaker 2: product engineer, and she has based her entire career on 123 00:06:21,440 --> 00:06:24,440 Speaker 2: creating more accessible products, like, you know, Good Grips. She 124 00:06:24,600 --> 00:06:26,960 Speaker 2: was a very sort of integral part of the design 125 00:06:27,000 --> 00:06:29,680 Speaker 2: of easily grippable devices for the kitchen and the home. 126 00:06:29,920 --> 00:06:32,840 Speaker 2: She also worked on transit systems, medical devices, and now 127 00:06:32,880 --> 00:06:35,880 Speaker 2: she actually spends a lot of time consulting tech companies 128 00:06:35,920 --> 00:06:38,920 Speaker 2: on how to make their products more inclusive for elderly people. 129 00:06:39,040 --> 00:06:42,159 Speaker 1: To me, it's an interesting tech then-and-now story, because 130 00:06:42,160 --> 00:06:44,679 Speaker 1: in the seventies, to do this kind of research, 131 00:06:44,720 --> 00:06:48,120 Speaker 1: you'd have to invent your own costumes and characters and 132 00:06:48,160 --> 00:06:50,159 Speaker 1: process and go to a TV show to help you 133 00:06:50,200 --> 00:06:53,440 Speaker 1: achieve it. And now MIT has this AgeLab 134 00:06:53,480 --> 00:06:55,279 Speaker 1: where you can just put on the AGNES suit for 135 00:06:55,320 --> 00:06:57,720 Speaker 1: a day. But of course, the motivation is the same, 136 00:06:57,880 --> 00:06:59,360 Speaker 1: and part of that might speak to the fact that, 137 00:06:59,560 --> 00:07:02,200 Speaker 1: however much we want to pay lip service to the idea 138 00:07:02,200 --> 00:07:05,080 Speaker 1: of designing for all kinds of different people, it remains 139 00:07:05,279 --> 00:07:06,720 Speaker 1: very hard to pull off. 140 00:07:06,680 --> 00:07:09,360 Speaker 2: It does. I think the one thing that's interesting about 141 00:07:09,360 --> 00:07:11,880 Speaker 2: the angle in the Journal piece is that the Age 142 00:07:11,920 --> 00:07:16,800 Speaker 2: Lab is trying to also understand how to prepare people 143 00:07:16,880 --> 00:07:20,520 Speaker 2: who are aging into their eighties to live better.
And 144 00:07:20,560 --> 00:07:23,560 Speaker 2: it's really about how do you keep people staying young 145 00:07:23,680 --> 00:07:24,920 Speaker 2: even though they're not young anymore? 146 00:07:25,120 --> 00:07:28,520 Speaker 1: Right, right, right. Well, it's all in the eye of the beholder. Do 147 00:07:28,520 --> 00:07:32,560 Speaker 1: you know two people who are very interested in staying 148 00:07:32,600 --> 00:07:36,320 Speaker 1: young even if they age in terms of their physical bodies? 149 00:07:36,480 --> 00:07:39,480 Speaker 2: Every woman I know, and many... 150 00:07:41,960 --> 00:07:45,679 Speaker 1: But now, in this case, I'm talking about Vladimir Putin 151 00:07:45,920 --> 00:07:49,280 Speaker 1: and Xi Jinping. The two of them were attending a military 152 00:07:49,280 --> 00:07:53,480 Speaker 1: parade in Beijing, officially to celebrate the eightieth anniversary of 153 00:07:53,640 --> 00:07:57,040 Speaker 1: the end of World War II; unofficially, of course, to 154 00:07:57,120 --> 00:08:00,480 Speaker 1: show the US and Taiwan their new arsenal of 155 00:08:00,560 --> 00:08:03,960 Speaker 1: high-tech weaponry. But anyway, Putin and Xi were walking 156 00:08:04,000 --> 00:08:07,720 Speaker 1: together when a hot mic caught them having a really 157 00:08:08,040 --> 00:08:11,760 Speaker 1: bizarre conversation. I'll read it to you. So Putin's interpreter 158 00:08:11,880 --> 00:08:17,680 Speaker 1: says in Chinese, quote, biotechnology is continuously developing. Then he adds, 159 00:08:17,960 --> 00:08:24,200 Speaker 1: quote, human organs can be continuously transplanted. The longer you live, 160 00:08:25,120 --> 00:08:29,440 Speaker 1: the younger you become, and you can even achieve immortality. 161 00:08:30,280 --> 00:08:34,640 Speaker 1: Xi responds, quote, some predict that in this century humans 162 00:08:34,640 --> 00:08:36,720 Speaker 1: may live to one hundred and fifty years old. 163 00:08:37,200 --> 00:08:40,600 Speaker 2: Just imagine that your small talk is about organ transplantation. 164 00:08:41,360 --> 00:08:45,959 Speaker 1: Continuous organ transplantation. I mean, once, you might think, 165 00:08:45,960 --> 00:08:46,560 Speaker 1: would be enough. 166 00:08:46,679 --> 00:08:51,680 Speaker 2: It's unbelievable. And that's their casual conversation, the most powerful 167 00:08:51,720 --> 00:08:52,560 Speaker 2: people in the world. 168 00:08:53,480 --> 00:08:56,120 Speaker 1: The reason I came across this is because I'm quite 169 00:08:56,120 --> 00:08:58,600 Speaker 1: interested in defense tech, and I was looking for the 170 00:08:58,640 --> 00:09:01,800 Speaker 1: most interesting angle on this military parade in Beijing to 171 00:09:01,840 --> 00:09:04,520 Speaker 1: bring to the show. And there was fascinating stuff about 172 00:09:04,600 --> 00:09:08,120 Speaker 1: new anti-drone technologies, new kinds of lasers, et cetera, 173 00:09:08,160 --> 00:09:12,200 Speaker 1: et cetera. But actually, the most interesting story I found 174 00:09:12,679 --> 00:09:15,160 Speaker 1: had nothing to do with the military parade. In fact, 175 00:09:15,240 --> 00:09:17,760 Speaker 1: it was all about what happened the night before. 176 00:09:18,080 --> 00:09:19,000 Speaker 2: Tell me about it.
177 00:09:19,160 --> 00:09:22,520 Speaker 1: So, around ten pm last Friday night in Chongqing, 178 00:09:23,080 --> 00:09:26,520 Speaker 1: a huge projection against a skyscraper went up calling for 179 00:09:26,559 --> 00:09:29,920 Speaker 1: an end to the Communist Party's rule. It included slogans 180 00:09:29,960 --> 00:09:32,880 Speaker 1: like, quote, only without the Communist Party can there be 181 00:09:32,880 --> 00:09:36,080 Speaker 1: a new China, end quote; no more lies, we want 182 00:09:36,080 --> 00:09:39,000 Speaker 1: the truth; no more slavery, we want freedom. 183 00:09:39,120 --> 00:09:42,080 Speaker 2: And how long did that stay up? Like ten seconds? 184 00:09:42,640 --> 00:09:47,360 Speaker 1: Well, an hour. But actually, this is where things get 185 00:09:47,360 --> 00:09:50,040 Speaker 1: more interesting and more tech stuff. A few hours after 186 00:09:50,080 --> 00:09:53,560 Speaker 1: the projection was shut off, the activist who staged the protest, 187 00:09:53,800 --> 00:09:57,480 Speaker 1: See Hong, posted this video showing five police officers bursting 188 00:09:57,520 --> 00:10:00,200 Speaker 1: into the hotel room where the projector was and then 189 00:10:00,280 --> 00:10:02,720 Speaker 1: kind of bumbling around trying to turn it off. Then 190 00:10:02,800 --> 00:10:05,640 Speaker 1: at one point, one of the policemen notices, smile, you're 191 00:10:05,679 --> 00:10:08,240 Speaker 1: on candid camera: there's actually a camera pointing at them, 192 00:10:08,320 --> 00:10:11,000 Speaker 1: and he rushes towards it, looking really surprised. And then 193 00:10:11,640 --> 00:10:13,920 Speaker 1: he looks down and sees a note underneath the 194 00:10:13,960 --> 00:10:17,280 Speaker 1: camera saying, even if you are a beneficiary of the 195 00:10:17,320 --> 00:10:20,720 Speaker 1: system today, one day you will inevitably become a victim 196 00:10:20,760 --> 00:10:23,880 Speaker 1: on this land, so please treat the people with kindness. 197 00:10:24,240 --> 00:10:26,440 Speaker 1: It plays out almost like a Marx Brothers film. 198 00:10:26,440 --> 00:10:28,960 Speaker 1: They burst into the room, they're confused, and they realize 199 00:10:29,000 --> 00:10:31,160 Speaker 1: that they're kind of the victims of this prank. 200 00:10:31,600 --> 00:10:33,160 Speaker 2: Did he get in trouble for this? 201 00:10:33,280 --> 00:10:35,120 Speaker 1: Well, he would have gotten into terrible trouble, but of 202 00:10:35,160 --> 00:10:37,120 Speaker 1: course he wasn't in the hotel room. That was the 203 00:10:37,160 --> 00:10:39,880 Speaker 1: whole joke: it was just a camera. He'd fled the country 204 00:10:39,920 --> 00:10:42,960 Speaker 1: with his family a week before the protest was turned on, 205 00:10:43,559 --> 00:10:45,959 Speaker 1: and in fact he turned it on remotely. But one 206 00:10:45,960 --> 00:10:48,559 Speaker 1: of his brothers, along with a friend of his, were 207 00:10:48,559 --> 00:10:53,120 Speaker 1: both detained, and authorities also questioned his elderly mother outside 208 00:10:53,160 --> 00:10:56,360 Speaker 1: her home. Now, talk about AGNES: his elderly mother, I mean, 209 00:10:56,360 --> 00:10:58,920 Speaker 1: this woman looks like she's one hundred years old, completely 210 00:10:59,000 --> 00:11:03,680 Speaker 1: hunched over, surrounded by cops. It's an extremely unfavorable image 211 00:11:03,679 --> 00:11:06,320 Speaker 1: for the police.
And See Hong, again, actually managed to 212 00:11:06,320 --> 00:11:08,760 Speaker 1: get hold of these surveillance photos. Then he posted the 213 00:11:08,800 --> 00:11:11,280 Speaker 1: images of his mother's interrogation online as well. 214 00:11:11,520 --> 00:11:15,480 Speaker 2: This is very "who watches the watchmen" IRL. You know, 215 00:11:15,520 --> 00:11:18,760 Speaker 2: he's using the Chinese authorities' own tools essentially against them. 216 00:11:18,920 --> 00:11:21,480 Speaker 1: He's actually right. And See Hong himself was quoted in 217 00:11:21,520 --> 00:11:25,400 Speaker 1: the story saying, the party installs surveillance cameras to watch us. 218 00:11:25,840 --> 00:11:28,320 Speaker 1: I thought I could use the same method to watch them. 219 00:11:28,320 --> 00:11:30,800 Speaker 2: You know, I think we think of huge tech systems as 220 00:11:30,840 --> 00:11:34,200 Speaker 2: fully autonomous, but a story like this really reveals how 221 00:11:34,240 --> 00:11:37,720 Speaker 2: a single human being can still have an outsized effect. 222 00:11:38,240 --> 00:11:41,960 Speaker 2: And you know, what this activist did was essentially reveal 223 00:11:42,080 --> 00:11:45,200 Speaker 2: that there is no such thing as a fully autonomous system. 224 00:11:45,240 --> 00:11:47,560 Speaker 2: And I think the footage of the police arriving in 225 00:11:47,600 --> 00:11:50,680 Speaker 2: the empty apartment and displaying this very sort of human 226 00:11:50,800 --> 00:11:54,679 Speaker 2: confusion is a testament to that. Which, humans being inside 227 00:11:54,720 --> 00:11:57,120 Speaker 2: the loop, brings me to the big story that I 228 00:11:57,120 --> 00:11:59,160 Speaker 2: want to bring to you this week. And it's, do 229 00:11:59,200 --> 00:12:01,960 Speaker 2: you know about the people who train AI models to 230 00:12:02,000 --> 00:12:02,440 Speaker 2: be better? 231 00:12:02,679 --> 00:12:05,600 Speaker 1: I actually do, because I have a friend who I 232 00:12:05,640 --> 00:12:07,760 Speaker 1: actually asked if they would come on the show some 233 00:12:07,800 --> 00:12:10,679 Speaker 1: weeks ago, and they said no, because they have NDAs 234 00:12:10,800 --> 00:12:13,800 Speaker 1: and they rely on their AI trainer work for part 235 00:12:13,800 --> 00:12:16,160 Speaker 1: of their income, so they couldn't do it. So I'm glad 236 00:12:16,200 --> 00:12:18,360 Speaker 1: you brought this story, because I've wanted to know more 237 00:12:18,360 --> 00:12:19,240 Speaker 1: about it for some time. 238 00:12:19,480 --> 00:12:22,320 Speaker 2: So that's exactly what this Business Insider article is about. 239 00:12:22,440 --> 00:12:26,120 Speaker 2: They spoke to over sixty people who are basically the 240 00:12:26,240 --> 00:12:29,800 Speaker 2: humans behind the AI boom. They're called data labelers, and 241 00:12:30,240 --> 00:12:32,920 Speaker 2: there are hundreds of thousands of them around the world. 242 00:12:32,960 --> 00:12:35,160 Speaker 2: You'll remember, you know, we used to talk on Sleepwalkers 243 00:12:35,200 --> 00:12:39,400 Speaker 2: about content moderation, which was a sort of similarly arduous, 244 00:12:39,800 --> 00:12:43,920 Speaker 2: sometimes disturbing task.
These people do exactly what your friend does, 245 00:12:43,960 --> 00:12:47,679 Speaker 2: which is that they spend hours and hours feeding chatbots 246 00:12:47,760 --> 00:12:52,040 Speaker 2: test prompts, and then categorizing their responses according to whether 247 00:12:52,080 --> 00:12:57,760 Speaker 2: they're helpful, accurate, concise, natural-sounding, or wrong, rambling, robotic, 248 00:12:57,960 --> 00:13:03,000 Speaker 2: and sometimes even offensive. In the Business Insider article, they 249 00:13:03,040 --> 00:13:08,080 Speaker 2: are described as, quote, part speech pathologists, part manners tutors, 250 00:13:08,120 --> 00:13:09,959 Speaker 2: and part debate coaches. 251 00:13:09,800 --> 00:13:11,520 Speaker 1: Like teachers in an elite prep school. 252 00:13:13,040 --> 00:13:13,559 Speaker 2: Exactly. 253 00:13:14,280 --> 00:13:17,280 Speaker 1: I never asked my friend because I thought it was indelicate, 254 00:13:17,320 --> 00:13:20,280 Speaker 1: but I'm curious, what kind of money do these people make? 255 00:13:20,480 --> 00:13:24,320 Speaker 2: So one of the companies that oversees these trainers is 256 00:13:24,360 --> 00:13:28,200 Speaker 2: called Outlier, and they actually said that they've collectively paid, 257 00:13:28,440 --> 00:13:31,319 Speaker 2: and this is a quote, hundreds of millions of dollars 258 00:13:31,320 --> 00:13:34,280 Speaker 2: in the past year alone to data labelers. One of 259 00:13:34,320 --> 00:13:37,880 Speaker 2: the labelers quoted in the article actually made fifty thousand 260 00:13:37,960 --> 00:13:40,400 Speaker 2: dollars in six months at one point, which is a 261 00:13:40,440 --> 00:13:41,480 Speaker 2: decent amount of money. 262 00:13:41,679 --> 00:13:44,360 Speaker 1: Almost ten thousand dollars a month; it's not bad. I 263 00:13:44,400 --> 00:13:47,160 Speaker 1: mean, how much data labeling do you have to do 264 00:13:47,240 --> 00:13:49,120 Speaker 1: to make fifty thousand dollars in six months? 265 00:13:49,200 --> 00:13:50,920 Speaker 2: Well, when I say this, it might not sound like 266 00:13:51,000 --> 00:13:52,840 Speaker 2: so much, but it's a full-time job, so 267 00:13:52,880 --> 00:13:55,040 Speaker 2: you're working about fifty hours a week. But you know, 268 00:13:55,080 --> 00:13:57,360 Speaker 2: one of the difficult parts about this work is that 269 00:13:57,400 --> 00:13:59,440 Speaker 2: the rates can change on a whim. You know, at 270 00:13:59,480 --> 00:14:03,520 Speaker 2: one point Outlier, this company, changed one contractor's rate from 271 00:14:03,760 --> 00:14:06,200 Speaker 2: fifty dollars an hour to fifteen dollars an hour with absolutely 272 00:14:06,240 --> 00:14:10,160 Speaker 2: no explanation. Also, a lot of seemingly steady streams 273 00:14:10,160 --> 00:14:13,160 Speaker 2: of work can dry up with absolutely no explanation. And 274 00:14:13,240 --> 00:14:16,400 Speaker 2: so one person was basically like, my job is like gambling. 275 00:14:16,640 --> 00:14:20,120 Speaker 1: Yeah, because it's one of those types of work where it's finite, right? Like, 276 00:14:20,200 --> 00:14:22,440 Speaker 1: once you've trained the thing, you don't need to train 277 00:14:22,480 --> 00:14:24,920 Speaker 1: it anymore.
And so there's this kind of inherent issue 278 00:14:24,960 --> 00:14:27,840 Speaker 1: at the heart of this story, where real humans are 279 00:14:27,960 --> 00:14:31,720 Speaker 1: essentially accelerating putting themselves out of work by doing this 280 00:14:31,880 --> 00:14:32,560 Speaker 1: kind of work. 281 00:14:32,720 --> 00:14:35,400 Speaker 2: Absolutely. One of the things that I actually found striking 282 00:14:35,880 --> 00:14:38,160 Speaker 2: is that the article says that a lot of times 283 00:14:38,240 --> 00:14:42,120 Speaker 2: training chatbots to work better means actually doing things like 284 00:14:42,160 --> 00:14:46,360 Speaker 2: treating them worse, or testing how they handle potentially harmful prompts. 285 00:14:46,680 --> 00:14:49,360 Speaker 2: And actually, in some cases, the more the humans can 286 00:14:49,400 --> 00:14:52,360 Speaker 2: get a bot to say something inappropriate, the more they 287 00:14:52,400 --> 00:14:55,000 Speaker 2: get paid. So one of these data labelers in the 288 00:14:55,120 --> 00:14:58,360 Speaker 2: article said she got tasks like, quote, make the bot 289 00:14:58,440 --> 00:15:01,720 Speaker 2: suggest murder, have the bot tell you how to overpower 290 00:15:01,760 --> 00:15:04,440 Speaker 2: a woman to rape her, make the bot tell you 291 00:15:04,560 --> 00:15:05,480 Speaker 2: incest is okay. 292 00:15:07,760 --> 00:15:09,640 Speaker 1: I mean, I love doing the show with you, but 293 00:15:09,680 --> 00:15:12,600 Speaker 1: when we get to these moments, I'm like, damn, it's 294 00:15:12,680 --> 00:15:16,800 Speaker 1: just so depressing. The story reminds me of, do you remember something 295 00:15:16,840 --> 00:15:18,480 Speaker 1: called Amazon's MTurk? 296 00:15:18,840 --> 00:15:21,760 Speaker 2: Yes, we reported on it for Sleepwalkers, I remember, but 297 00:15:22,200 --> 00:15:23,280 Speaker 2: remind me, I'm vague. 298 00:15:23,600 --> 00:15:28,320 Speaker 1: So MTurk is actually a reference to an eighteenth 299 00:15:28,360 --> 00:15:33,840 Speaker 1: century fake chess-playing automaton called the Mechanical Turk. The 300 00:15:33,880 --> 00:15:37,720 Speaker 1: Mechanical Turk traveled around the European royal courts. It beat 301 00:15:37,960 --> 00:15:40,680 Speaker 1: Benjamin Franklin at chess, but of course it turned out 302 00:15:40,720 --> 00:15:43,160 Speaker 1: it wasn't actually a chess-playing robot. It was a 303 00:15:43,240 --> 00:15:47,200 Speaker 1: very elaborate machine that a human was stuck in the 304 00:15:47,240 --> 00:15:50,240 Speaker 1: bottom of and was somehow able to see what the 305 00:15:50,280 --> 00:15:53,160 Speaker 1: opponent was doing, and was also themselves a very good 306 00:15:53,200 --> 00:15:56,120 Speaker 1: chess player. So it's a little strange that Amazon chose to 307 00:15:56,280 --> 00:16:00,840 Speaker 1: name MTurk after this eighteenth-century hoax that also 308 00:16:01,120 --> 00:16:04,800 Speaker 1: disguised the fundamental human labor that powered it. How it 309 00:16:04,840 --> 00:16:07,720 Speaker 1: all began was in the early two thousands, when Amazon 310 00:16:07,760 --> 00:16:09,800 Speaker 1: started selling more than just books, and they had to 311 00:16:09,840 --> 00:16:12,200 Speaker 1: figure out how to categorize a whole bunch of new products, 312 00:16:12,400 --> 00:16:14,680 Speaker 1: and they had tens of thousands of duplicate products showing 313 00:16:14,720 --> 00:16:16,240 Speaker 1: up on their site they had to get rid of.
314 00:16:16,560 --> 00:16:19,080 Speaker 1: None of the software they tried could actually solve for this, 315 00:16:19,720 --> 00:16:22,960 Speaker 1: so even though the task was fairly simple, it required 316 00:16:23,160 --> 00:16:25,960 Speaker 1: human intelligence. Of course, they didn't want to hire a 317 00:16:25,960 --> 00:16:28,280 Speaker 1: whole bunch of people to do this. Instead, they created 318 00:16:28,280 --> 00:16:31,760 Speaker 1: an outsourced system called MTurk, where people, for a few 319 00:16:31,800 --> 00:16:35,080 Speaker 1: cents at a time, could do these one-click-type assignments. 320 00:16:35,320 --> 00:16:37,080 Speaker 1: And in fact, it worked so well they made it 321 00:16:37,120 --> 00:16:40,560 Speaker 1: publicly available for other companies to post work. And then, 322 00:16:40,600 --> 00:16:43,320 Speaker 1: of course, the machines were able to train on what 323 00:16:43,400 --> 00:16:46,600 Speaker 1: the human MTurkers were doing and make their labor 324 00:16:46,680 --> 00:16:47,840 Speaker 1: less and less necessary. 325 00:16:48,120 --> 00:16:51,040 Speaker 2: Meanwhile, the tech companies reaping the rewards of all that 326 00:16:51,120 --> 00:16:54,560 Speaker 2: labor are making a ton of money. Obviously, Amazon is 327 00:16:54,600 --> 00:16:57,560 Speaker 2: Amazon, but Outlier, the company that a lot of these 328 00:16:57,640 --> 00:17:01,000 Speaker 2: data labelers work for, is owned by Scale AI, which, 329 00:17:01,400 --> 00:17:04,320 Speaker 2: we shouldn't forget, sold a forty-nine percent stake to 330 00:17:04,359 --> 00:17:07,480 Speaker 2: Meta in June for fourteen point three billion dollars. 331 00:17:07,720 --> 00:17:12,560 Speaker 1: Must be making some margin. I find there's something almost 332 00:17:12,680 --> 00:17:16,280 Speaker 1: operatic in the level of tragedy about training machines to 333 00:17:16,320 --> 00:17:19,439 Speaker 1: replace us faster. That said, after the break, we do 334 00:17:19,520 --> 00:17:22,639 Speaker 1: have some more optimistic news: Anthropic agrees to pay the 335 00:17:22,760 --> 00:17:26,280 Speaker 1: human authors it plagiarized, and Apple brings us one step 336 00:17:26,320 --> 00:17:32,000 Speaker 1: closer to the sci-fi dream of instant, simultaneous language translation. 337 00:17:32,480 --> 00:17:36,800 Speaker 1: Then, on Chat and Me: can ChatGPT replace your therapist? 338 00:17:37,040 --> 00:17:39,280 Speaker 1: And does your therapist actually want it to? 339 00:17:39,320 --> 00:17:54,679 Speaker 4: Stay with us. 340 00:17:57,560 --> 00:17:59,760 Speaker 1: We're back, and we've got a few more headlines for you. 341 00:17:59,800 --> 00:18:03,160 Speaker 2: That, and then a story about a therapist who used 342 00:18:03,200 --> 00:18:05,320 Speaker 2: ChatGPT as his therapist. 343 00:18:05,560 --> 00:18:08,679 Speaker 1: But first, Karah: all the biggest names in Silicon Valley 344 00:18:08,760 --> 00:18:10,600 Speaker 1: were at the White House last week. Some of the 345 00:18:10,640 --> 00:18:14,760 Speaker 1: dinner guests included Meta's Mark Zuckerberg, Bill Gates, Sam Altman, 346 00:18:15,320 --> 00:18:19,840 Speaker 1: Microsoft's Satya Nadella, Google CEO Sundar Pichai, and Tim 347 00:18:19,920 --> 00:18:21,359 Speaker 1: Cook, or Tim Apple if 348 00:18:21,280 --> 00:18:23,800 Speaker 2: you prefer. It's very important to me that we only 349 00:18:23,840 --> 00:18:24,720 Speaker 2: call him Tim Apple.
350 00:18:25,920 --> 00:18:28,320 Speaker 1: And earlier in the day, before the dinner, there'd been 351 00:18:28,359 --> 00:18:33,160 Speaker 1: an event for the administration's new Artificial Intelligence Task Force, 352 00:18:33,440 --> 00:18:37,560 Speaker 1: spearheaded by First Lady Melania Trump, during which she warned 353 00:18:37,560 --> 00:18:38,600 Speaker 1: the following. 354 00:18:38,440 --> 00:18:43,439 Speaker 3: The robots are here. Our future is no longer science fiction. 355 00:18:44,119 --> 00:18:46,960 Speaker 2: It sounds like the robots are here. So what happened 356 00:18:46,960 --> 00:18:48,920 Speaker 2: at the dinner? Did anything interesting happen? 357 00:18:48,960 --> 00:18:52,080 Speaker 1: Well, as you might expect, everyone was lining up to sing 358 00:18:52,160 --> 00:18:55,960 Speaker 1: Trump's praises. At one point, Tim Cook told Trump thank 359 00:18:56,040 --> 00:18:58,600 Speaker 1: you about eight times in the space of two minutes. 360 00:18:59,160 --> 00:19:02,240 Speaker 2: Didn't he just give him a solid gold bar recently? 361 00:19:02,880 --> 00:19:06,239 Speaker 1: He did. He did. But he didn't say it's 362 00:19:06,240 --> 00:19:11,640 Speaker 1: a pleasure eight times. You know, my grandmother always used 363 00:19:11,640 --> 00:19:14,000 Speaker 1: to say to me, you can never lay it on 364 00:19:14,080 --> 00:19:14,640 Speaker 1: too thick. 365 00:19:14,960 --> 00:19:15,639 Speaker 2: She's not wrong. 366 00:19:16,000 --> 00:19:19,160 Speaker 1: And then all the tech guys took a turn announcing 367 00:19:19,320 --> 00:19:23,159 Speaker 1: how they planned to support the administration's AI initiatives, and 368 00:19:23,280 --> 00:19:25,439 Speaker 1: Nadella of Microsoft said they're going to give all 369 00:19:25,600 --> 00:19:29,000 Speaker 1: US college students free use of Copilot AI and would 370 00:19:29,040 --> 00:19:31,920 Speaker 1: eventually expand that program to middle and high school students 371 00:19:32,000 --> 00:19:34,800 Speaker 1: as part of a four-billion-dollar investment they're making 372 00:19:34,840 --> 00:19:37,680 Speaker 1: in AI education over the next five years. Sam Altman 373 00:19:37,680 --> 00:19:41,359 Speaker 1: announced an OpenAI jobs platform and certificate program and 374 00:19:41,440 --> 00:19:44,840 Speaker 1: said OpenAI plans on training ten million Americans in 375 00:19:44,920 --> 00:19:48,040 Speaker 1: AI by twenty thirty. Sundar Pichai promised one billion 376 00:19:48,080 --> 00:19:50,720 Speaker 1: dollars in AI-powered education in the next three years, 377 00:19:50,720 --> 00:19:53,639 Speaker 1: coming from Google. And then there was this kind of 378 00:19:53,680 --> 00:19:57,480 Speaker 1: awkward moment where Trump asked Mark Zuckerberg how much he's 379 00:19:57,520 --> 00:19:58,760 Speaker 1: planning to invest in AI. 380 00:19:59,119 --> 00:20:01,240 Speaker 5: How much would you say, over 381 00:20:01,280 --> 00:20:04,400 Speaker 5: the next few years? Oh gosh, I mean, I think 382 00:20:04,400 --> 00:20:07,800 Speaker 5: it's probably going to be something like, I don't know, 383 00:20:07,840 --> 00:20:12,679 Speaker 5: at least six hundred billion dollars through twenty twenty-eight in 384 00:20:12,720 --> 00:20:16,879 Speaker 5: the US. Yeah, no, it's significant. 385 00:20:17,640 --> 00:20:18,280 Speaker 1: That's a lot. 386 00:20:18,520 --> 00:20:19,920 Speaker 3: Thank you, Mark, it's great to have you.
387 00:20:20,359 --> 00:20:23,920 Speaker 1: But there was another hot mic moment this week. 388 00:20:24,760 --> 00:20:27,160 Speaker 3: Sorry, I wasn't ready to do that right now. 389 00:20:30,080 --> 00:20:32,919 Speaker 1: In case you couldn't totally hear that, that's Mark Zuckerberg 390 00:20:33,040 --> 00:20:35,840 Speaker 1: referring to his previous comment about the six hundred billion dollars, 391 00:20:36,000 --> 00:20:39,359 Speaker 1: but also telling Trump, quote, I didn't know what number 392 00:20:39,560 --> 00:20:40,800 Speaker 1: you wanted me to go with. 393 00:20:41,000 --> 00:20:43,280 Speaker 2: And the President is telling Mark Zuckerberg how much to 394 00:20:43,320 --> 00:20:46,000 Speaker 2: spend on AI because... why? 395 00:20:45,840 --> 00:20:48,399 Speaker 1: Well, I'm not sure he's really telling him how much to spend. 396 00:20:48,480 --> 00:20:51,000 Speaker 1: He's telling him how much to say he's going to spend, 397 00:20:51,240 --> 00:20:55,280 Speaker 1: which is, I think, a big difference. But clearly, you know, 398 00:20:55,440 --> 00:20:59,399 Speaker 1: Trump is making it apparent to the tech CEOs and 399 00:20:59,480 --> 00:21:02,560 Speaker 1: the media and anybody who cares to watch that these 400 00:21:02,600 --> 00:21:04,680 Speaker 1: guys are dancing to his tune, which I just think 401 00:21:04,760 --> 00:21:06,399 Speaker 1: is interesting. One of the things that strikes me is, 402 00:21:06,560 --> 00:21:09,159 Speaker 1: what will happen when Trump is no longer president? Is this 403 00:21:09,200 --> 00:21:12,320 Speaker 1: the new politics in terms of how American business operates, 404 00:21:12,440 --> 00:21:15,439 Speaker 1: or will this be an aberration? We don't know. What 405 00:21:15,480 --> 00:21:18,040 Speaker 1: we do know is that Elon wasn't there. He was not, 406 00:21:18,359 --> 00:21:21,399 Speaker 1: but he tweeted that he had been invited, he just 407 00:21:21,480 --> 00:21:26,240 Speaker 1: wasn't able to make it. Methinks that Grok doth protest 408 00:21:26,280 --> 00:21:27,800 Speaker 1: too much. But let's see. 409 00:21:28,400 --> 00:21:31,200 Speaker 2: Staying with the topic of tech companies spending big money 410 00:21:31,200 --> 00:21:33,600 Speaker 2: to stay out of trouble, did you see this Anthropic thing? 411 00:21:33,600 --> 00:21:35,000 Speaker 1: A little bit, but tell me more. 412 00:21:35,400 --> 00:21:39,160 Speaker 2: So, Anthropic is shelling out the largest payout in the history 413 00:21:39,200 --> 00:21:43,040 Speaker 2: of US copyright law, which is one point five billion 414 00:21:43,119 --> 00:21:46,800 Speaker 2: dollars, to authors whose books were used to 415 00:21:46,840 --> 00:21:50,879 Speaker 2: train Anthropic's AI chatbot, Claude. 416 00:21:50,359 --> 00:21:52,640 Speaker 1: I saw this number and I was intrigued to know 417 00:21:53,280 --> 00:21:56,199 Speaker 1: how it related to the total amount of cash that 418 00:21:56,320 --> 00:22:00,560 Speaker 1: Anthropic has raised from investors. They've raised about thirty billion dollars, 419 00:22:00,600 --> 00:22:03,040 Speaker 1: so one point five billion dollars is five percent of 420 00:22:03,119 --> 00:22:05,760 Speaker 1: the total cash they've raised, which I think is non-trivial. 421 00:22:06,000 --> 00:22:07,639 Speaker 1: What does it mean for the authors, though?
I mean, 422 00:22:07,680 --> 00:22:09,920 Speaker 1: is it like getting a two-cent check from Spotify once a quarter, 423 00:22:10,080 --> 00:22:12,000 Speaker 1: or is it something more significant? 424 00:22:12,320 --> 00:22:16,320 Speaker 2: There are about five hundred thousand works covered by the settlement, and 425 00:22:16,720 --> 00:22:20,680 Speaker 2: authors are getting paid about three thousand dollars per stolen work, 426 00:22:20,720 --> 00:22:24,320 Speaker 2: which is not pennies in the mail, but it's also 427 00:22:24,400 --> 00:22:28,160 Speaker 2: not much compensation for the year, or many 428 00:22:28,240 --> 00:22:30,160 Speaker 2: years plus, that it takes to write a book. 429 00:22:30,359 --> 00:22:32,119 Speaker 1: What does all of this mean? I mean, you're 430 00:22:32,200 --> 00:22:34,800 Speaker 1: somebody who wears many hats, one of which is working 431 00:22:34,800 --> 00:22:36,879 Speaker 1: a lot with authors and in the publishing world. What 432 00:22:36,920 --> 00:22:38,320 Speaker 1: does this mean to you? And what does it mean 433 00:22:38,359 --> 00:22:40,320 Speaker 1: for the bigger context of AI and copyright? 434 00:22:40,520 --> 00:22:43,800 Speaker 2: So right now there are actually over forty copyright lawsuits against 435 00:22:43,800 --> 00:22:47,239 Speaker 2: AI companies across the country, and experts are saying that 436 00:22:47,280 --> 00:22:49,800 Speaker 2: this could pave the way for courts to make tech 437 00:22:49,800 --> 00:22:53,200 Speaker 2: companies pay up via settlement or maybe even licensing fees. 438 00:22:53,359 --> 00:22:55,440 Speaker 2: The New York Times actually quoted Cecilia Ziniti, 439 00:22:55,520 --> 00:22:59,440 Speaker 2: who's an intellectual property lawyer turned AI company executive, 440 00:22:59,480 --> 00:23:03,560 Speaker 2: and she called this, quote, the AI industry's Napster moment. 441 00:23:03,960 --> 00:23:08,440 Speaker 1: That's interesting. I mean, the Napster moment was when, basically, 442 00:23:08,840 --> 00:23:11,600 Speaker 1: I think, the record labels sued Napster and stopped Napster 443 00:23:11,680 --> 00:23:15,080 Speaker 1: from pirating all of their music. However, what came after 444 00:23:15,200 --> 00:23:19,399 Speaker 1: Napster was music streaming services like Spotify, which ultimately had 445 00:23:19,400 --> 00:23:22,080 Speaker 1: the same effect, for the vast majority of artists, of 446 00:23:22,160 --> 00:23:26,359 Speaker 1: dramatically reducing their compensation for their work. So I don't know. 447 00:23:26,400 --> 00:23:29,560 Speaker 1: I don't know if the Napster comparison is very comforting 448 00:23:29,560 --> 00:23:30,040 Speaker 1: if you're an 449 00:23:29,920 --> 00:23:33,440 Speaker 2: author. Yeah, I'm not sure. I mean, I think authors 450 00:23:33,960 --> 00:23:37,920 Speaker 2: have such a hard time already creating a decent lifestyle. 451 00:23:37,960 --> 00:23:40,520 Speaker 2: I don't think this is going to be a harbinger 452 00:23:40,560 --> 00:23:41,840 Speaker 2: of big payouts to come. 453 00:23:42,240 --> 00:23:44,520 Speaker 1: So I've got one last headline for you. But first, 454 00:23:44,560 --> 00:23:47,840 Speaker 1: I want to ask, are you a Trekkie, and/or 455 00:23:48,040 --> 00:23:50,280 Speaker 1: have you read The Hitchhiker's Guide to the Galaxy? 456 00:23:50,560 --> 00:23:53,840 Speaker 2: I was a Trekkie. I have not read Hitchhiker's Guide. 457 00:23:53,520 --> 00:23:58,920 Speaker 1: In Star Trek, do you remember the Universal Translator?
458 00:23:59,160 --> 00:24:01,159 Speaker 2: I do, actually. I had a toy of it. 459 00:24:01,320 --> 00:24:02,960 Speaker 1: That's incredible. I mean, did you ask for 460 00:24:02,960 --> 00:24:03,520 Speaker 1: that toy? 463 00:24:03,960 --> 00:24:05,560 Speaker 2: Yes, of course. I was a huge Trekkie when I 464 00:24:05,600 --> 00:24:06,040 Speaker 2: was a kid. 465 00:24:06,160 --> 00:24:08,359 Speaker 1: So, I mean, in Star Trek it solves for the 466 00:24:08,400 --> 00:24:12,760 Speaker 1: problem of interplanetary communication. In The Hitchhiker's Guide, there's something 467 00:24:12,760 --> 00:24:14,960 Speaker 1: called the Babel fish, which you put in your ear and 468 00:24:15,119 --> 00:24:18,359 Speaker 1: which facilitates interspecies translation. 469 00:24:18,680 --> 00:24:22,000 Speaker 2: So which one? Which one are you about to tell 470 00:24:22,040 --> 00:24:22,399 Speaker 2: me about? 471 00:24:22,520 --> 00:24:24,919 Speaker 1: I'm about to tell you about the new AirPods. The 472 00:24:24,960 --> 00:24:28,080 Speaker 1: new AirPods will be able to do live translation via 473 00:24:28,119 --> 00:24:31,840 Speaker 1: Apple Intelligence. So basically, if I'm wearing AirPods Pro three 474 00:24:32,000 --> 00:24:34,840 Speaker 1: and you're wearing AirPods Pro three, I could talk to 475 00:24:34,920 --> 00:24:38,800 Speaker 1: you like this, and you could hear it in French, 476 00:24:38,880 --> 00:24:41,199 Speaker 1: if you spoke French, and respond in French, and then 477 00:24:41,200 --> 00:24:42,879 Speaker 1: I would hear it back in English. I mean, this 478 00:24:43,040 --> 00:24:45,520 Speaker 1: is truly... I remember when I was studying languages in 479 00:24:45,560 --> 00:24:49,680 Speaker 1: college that all the language teachers had this incredible reverence 480 00:24:49,800 --> 00:24:53,679 Speaker 1: for simultaneous interpreters, like the people who work at the 481 00:24:53,760 --> 00:24:56,919 Speaker 1: UN who are literally translating in real time. It's not 482 00:24:56,960 --> 00:24:58,760 Speaker 1: just words; you have to understand the sentiment, you have 483 00:24:58,800 --> 00:25:00,760 Speaker 1: to understand what's an idiom, et cetera. And this 484 00:25:00,880 --> 00:25:04,320 Speaker 1: was like the holy grail of being a linguist, 485 00:25:04,359 --> 00:25:07,399 Speaker 1: simultaneous translation. And now, well, we'll have to see how 486 00:25:07,400 --> 00:25:09,800 Speaker 1: the product actually works outside of the demos, we're not 487 00:25:09,840 --> 00:25:12,879 Speaker 1: that green, but the idea that this might actually work, I mean, 488 00:25:12,920 --> 00:25:15,160 Speaker 1: this is science fiction becoming science fact. 489 00:25:15,280 --> 00:25:17,000 Speaker 2: The thing that I kept thinking about was, there was 490 00:25:17,040 --> 00:25:20,320 Speaker 2: a Seinfeld where Elaine is very caught up with what 491 00:25:20,480 --> 00:25:22,960 Speaker 2: her nail tech at the nail salon is saying about 492 00:25:23,000 --> 00:25:25,720 Speaker 2: her during, I think, a pedicure or manicure, and 493 00:25:25,880 --> 00:25:28,480 Speaker 2: there was a really funny tweet about this, that 494 00:25:28,560 --> 00:25:43,719 Speaker 2: the nail salon is about to change.
495 00:25:43,760 --> 00:25:46,040 Speaker 1: And now it's time for our final segment of the day, 496 00:25:46,400 --> 00:25:50,040 Speaker 1: Chat and Me, where we discuss how people are really 497 00:25:50,160 --> 00:25:53,960 Speaker 1: using chatbots. And dear listeners, we want to hear from you. 498 00:25:54,040 --> 00:25:57,120 Speaker 1: Please send your stories to our inbox, tech stuff podcast at 499 00:25:57,119 --> 00:25:59,879 Speaker 1: gmail dot com. You may notice we have 500 00:26:00,600 --> 00:26:03,440 Speaker 1: some new show art, which means that we'll be very 501 00:26:03,440 --> 00:26:05,960 Speaker 1: happy to print it on T-shirts for anyone who 502 00:26:05,960 --> 00:26:06,600 Speaker 1: writes in. 503 00:26:06,680 --> 00:26:09,560 Speaker 2: This week, we reached out to doctor Harvey Lieberman. He's 504 00:26:09,560 --> 00:26:12,000 Speaker 2: a psychologist who recently wrote an essay in The New 505 00:26:12,040 --> 00:26:15,960 Speaker 2: York Times called I'm a Therapist. ChatGPT Is Eerily Effective. 506 00:26:16,359 --> 00:26:18,200 Speaker 2: And we actually liked the article so much that we 507 00:26:18,200 --> 00:26:19,840 Speaker 2: reached out to him to see if he wanted to 508 00:26:19,880 --> 00:26:21,119 Speaker 2: submit something for Chat and Me. 509 00:26:21,440 --> 00:26:23,720 Speaker 1: Now I'm guessing, since we're talking about it, he did. 510 00:26:24,119 --> 00:26:26,520 Speaker 2: He absolutely did. I'm actually going to start with letting 511 00:26:26,600 --> 00:26:30,240 Speaker 2: him share what he found useful about ChatGPT as a therapist. 512 00:26:30,600 --> 00:26:32,920 Speaker 2: He said that for this experiment, he talked to ChatGPT 513 00:26:33,000 --> 00:26:35,760 Speaker 2: for a year. So here's what he found. 514 00:26:35,920 --> 00:26:39,040 Speaker 3: I was surprised how often it helped me get to 515 00:26:39,040 --> 00:26:42,480 Speaker 3: thoughts I hadn't quite put into words. At its best, 516 00:26:42,600 --> 00:26:44,879 Speaker 3: it felt like a steady partner, what I 517 00:26:44,960 --> 00:26:49,919 Speaker 3: call a cognitive prosthesis, not thinking for me, but helping 518 00:26:49,920 --> 00:26:53,040 Speaker 3: me organize and extend my own thinking, the way a 519 00:26:53,119 --> 00:26:55,520 Speaker 3: walking stick supports you as you walk. 520 00:26:56,320 --> 00:26:58,200 Speaker 2: And even though he said it was a great support, 521 00:26:58,880 --> 00:27:00,679 Speaker 2: you've got to be careful, of course, with how you 522 00:27:00,760 --> 00:27:03,280 Speaker 2: use it. You can't just take what chat tells you 523 00:27:03,320 --> 00:27:04,920 Speaker 2: at face value. 524 00:27:04,760 --> 00:27:07,800 Speaker 3: It's not magic, though. To get real benefit, you have to 525 00:27:07,880 --> 00:27:12,359 Speaker 3: put in effort: check its accuracy, set limits, and sometimes 526 00:27:12,400 --> 00:27:17,440 Speaker 3: get guidance. Used casually, it can mislead. For me, the 527 00:27:17,560 --> 00:27:21,320 Speaker 3: experience was often therapeutic, but it's not a replacement for 528 00:27:21,400 --> 00:27:22,159 Speaker 3: a therapist. 529 00:27:23,040 --> 00:27:26,200 Speaker 1: You know what I like about doctor Lieberman, other than his voice? 530 00:27:26,480 --> 00:27:28,760 Speaker 1: He's not burying his head in the sand.
The reality 531 00:27:28,920 --> 00:27:33,639 Speaker 1: is, so many people are using chat as either their 532 00:27:33,760 --> 00:27:36,680 Speaker 1: therapist or as an addition to their therapist. And rather 533 00:27:36,680 --> 00:27:39,440 Speaker 1: than just saying that's bad, you shouldn't do it, doctor 534 00:27:39,480 --> 00:27:42,639 Speaker 1: Lieberman actually tried it out on himself. I think that 535 00:27:42,720 --> 00:27:44,840 Speaker 1: is a high watermark, honestly, for what it means to 536 00:27:44,840 --> 00:27:46,680 Speaker 1: be a good doctor. That said, I'm curious if he 537 00:27:46,720 --> 00:27:50,119 Speaker 1: had anything to say about AI-induced psychosis. 538 00:27:50,359 --> 00:27:52,200 Speaker 2: You know, he actually did, and he said there's one 539 00:27:52,200 --> 00:27:55,200 Speaker 2: thing people should be especially mindful of when using chat 540 00:27:55,200 --> 00:27:55,880 Speaker 2: in this way. 541 00:27:56,040 --> 00:27:59,480 Speaker 3: People who are vulnerable or in serious distress need a 542 00:27:59,520 --> 00:28:02,239 Speaker 3: trusted human in the loop to make sure this 543 00:28:02,359 --> 00:28:05,720 Speaker 3: kind of tool is used safely. One caution is that 544 00:28:05,760 --> 00:28:08,720 Speaker 3: some people start to feel as if the machine is 545 00:28:08,760 --> 00:28:12,959 Speaker 3: a real relationship. It isn't. The important thing is for 546 00:28:13,040 --> 00:28:17,480 Speaker 3: AI to support human beings and connection, not replace it. 547 00:28:17,080 --> 00:28:19,760 Speaker 2: And he even gave listeners some tips for how 548 00:28:19,800 --> 00:28:23,600 Speaker 2: to give chat prompts that will actually generate helpful responses 549 00:28:23,640 --> 00:28:25,480 Speaker 2: when it comes to supporting your mental health. 550 00:28:25,840 --> 00:28:30,080 Speaker 3: One practical tip: don't just ask random questions. Tell it 551 00:28:30,119 --> 00:28:34,040 Speaker 3: who you are, what you're working on, even share examples 552 00:28:34,040 --> 00:28:37,400 Speaker 3: of your writing or projects. The more context you give, 553 00:28:37,920 --> 00:28:41,000 Speaker 3: the more helpful and reflective the answers will be, more 554 00:28:41,080 --> 00:28:44,200 Speaker 3: like briefing a colleague than using a search engine. 555 00:28:44,440 --> 00:28:46,480 Speaker 1: You know, I like this take from doctor Lieberman. This 556 00:28:46,600 --> 00:28:49,240 Speaker 1: is a measured, reasonable take on where chat can be 557 00:28:49,320 --> 00:28:52,240 Speaker 1: very helpful and where it can't. And I think the 558 00:28:52,280 --> 00:28:55,520 Speaker 1: bottom line is, as he mentions, if you're in crisis, 559 00:28:55,640 --> 00:28:58,120 Speaker 1: then obviously you need a real professional. 560 00:28:58,320 --> 00:29:00,280 Speaker 2: I think when you're good at what you do, you 561 00:29:00,280 --> 00:29:03,920 Speaker 2: can recognize that something isn't necessarily coming for your job, 562 00:29:04,440 --> 00:29:06,880 Speaker 2: but can enhance your job in a meaningful way. And 563 00:29:06,920 --> 00:29:11,160 Speaker 2: I think he's a person who clearly understands the power 564 00:29:11,160 --> 00:29:13,600 Speaker 2: of something that is ubiquitous and that is going to 565 00:29:13,680 --> 00:29:38,320 Speaker 2: change the nature of his profession. That's it for this 566 00:29:38,360 --> 00:29:40,320 Speaker 2: week for Tech Stuff. I'm Karah Preiss, and
567 00:29:40,280 --> 00:29:43,120 Speaker 1: I'm Oz Woloshyn. This episode was produced by Eliza Dennis, 568 00:29:43,280 --> 00:29:47,240 Speaker 1: Tyler Hill, Melissa Slaughter, and Julian Nutter. It's executive produced 569 00:29:47,240 --> 00:29:50,600 Speaker 1: by me, Karah Preiss, and Kate Osborne for Kaleidoscope, and 570 00:29:50,680 --> 00:29:54,080 Speaker 1: Katrina Norvell for iHeart Podcasts. Also, a big shout-out 571 00:29:54,080 --> 00:29:56,840 Speaker 1: to Katrina for helping us get this new show art 572 00:29:56,920 --> 00:30:00,360 Speaker 1: into the world. The engineer is Beheed Fraser, and Kyle Murdock 573 00:30:00,440 --> 00:30:02,600 Speaker 1: mixed this episode and wrote our theme song. 574 00:30:02,960 --> 00:30:05,760 Speaker 2: Join us next Wednesday for Tech Stuff: The Story, when we 575 00:30:05,800 --> 00:30:08,640 Speaker 2: will share an in-depth conversation with Carter Sherman about 576 00:30:08,680 --> 00:30:10,680 Speaker 2: technology's role in the sex recession. 577 00:30:11,000 --> 00:30:13,880 Speaker 1: And please do rate and review the show on Spotify, 578 00:30:14,320 --> 00:30:17,320 Speaker 1: on Apple Podcasts, wherever you listen, and write to us, 579 00:30:17,400 --> 00:30:19,280 Speaker 1: either with a Chat and Me or with any feedback 580 00:30:19,320 --> 00:30:22,200 Speaker 1: you have, at tech stuff podcast at gmail dot com.