1 00:00:00,080 --> 00:00:02,640 Speaker 1: Hi, it's Oz Woloshyn here and Cara Price, and we're 2 00:00:02,680 --> 00:00:04,880 Speaker 1: taking some time off for the holidays. We'll be back 3 00:00:04,880 --> 00:00:06,800 Speaker 1: with new episodes starting in January. 4 00:00:06,880 --> 00:00:09,160 Speaker 2: In the meantime, instead of leaving this feed empty, we 5 00:00:09,200 --> 00:00:12,280 Speaker 2: wanted to share an episode from earlier this year. This week, 6 00:00:12,320 --> 00:00:15,400 Speaker 2: we are re-airing my conversation with Sam Apple from August. 7 00:00:15,680 --> 00:00:19,080 Speaker 2: Sam is an author and journalist who orchestrated a couple's 8 00:00:19,079 --> 00:00:22,880 Speaker 2: weekend for three people and their AI companions. This episode 9 00:00:23,160 --> 00:00:25,880 Speaker 2: is a touching look at intimacy in the AI age, 10 00:00:25,920 --> 00:00:28,440 Speaker 2: and talking to Sam really opened my eyes to what 11 00:00:28,480 --> 00:00:30,720 Speaker 2: the future of dating will look like. I hope you 12 00:00:30,800 --> 00:00:32,160 Speaker 2: enjoy it, and thanks for listening. 13 00:00:46,320 --> 00:00:48,760 Speaker 1: Welcome to Tech Stuff. This is the story. I'm Oz 14 00:00:48,840 --> 00:00:50,440 Speaker 1: Woloshyn here with Cara Price. 15 00:00:50,560 --> 00:00:51,479 Speaker 2: Hello, this is she. 16 00:00:52,880 --> 00:0055,200 Speaker 1: So you've got a story for us today from someone 17 00:00:55,320 --> 00:00:59,560 Speaker 1: who went on perhaps the strangest couple's retreat of all time. 18 00:01:00,080 --> 00:01:00,920 Speaker 1: Tell us a bit about it. 19 00:01:01,240 --> 00:01:03,680 Speaker 2: This week I talked to Sam Apple. He's an author 20 00:01:03,920 --> 00:01:07,080 Speaker 2: and journalist who conducted what I think we can call 21 00:01:07,120 --> 00:01:10,600 Speaker 2: an experiment for Wired that really caught my eye. He 22 00:01:10,800 --> 00:01:13,920 Speaker 2: organized a couple's retreat for people who are in love 23 00:01:13,959 --> 00:01:15,440 Speaker 2: with AI bots. 24 00:01:15,920 --> 00:01:20,080 Speaker 1: Wow. That really is quite a remarkable idea, and it makes 25 00:01:20,160 --> 00:01:23,320 Speaker 1: me quite jealous as well. It's a point of inspiration, 26 00:01:23,400 --> 00:01:25,720 Speaker 1: a point of reference for what we can become on 27 00:01:25,800 --> 00:01:28,640 Speaker 1: this show. But I'm very, very keen to hear about 28 00:01:28,680 --> 00:01:29,480 Speaker 1: how this came about. 29 00:01:29,600 --> 00:01:31,560 Speaker 2: So Sam actually told me he's had this idea for 30 00:01:31,600 --> 00:01:35,600 Speaker 2: a long time. Back in twenty eleven, Sam had heard about 31 00:01:35,640 --> 00:01:38,959 Speaker 2: an island near Japan where men were going on vacation 32 00:01:39,080 --> 00:01:42,399 Speaker 2: with their girlfriends, their video game girlfriends that they had 33 00:01:42,440 --> 00:01:45,920 Speaker 2: created using something called Love Plus, which is a sort 34 00:01:45,920 --> 00:01:49,960 Speaker 2: of dating simulator game. And back then he had a 35 00:01:49,960 --> 00:01:52,440 Speaker 2: million questions, like what exactly does this look like? How 36 00:01:52,480 --> 00:01:55,200 Speaker 2: do you go on a vacation with a piece of technology?
37 00:01:55,480 --> 00:01:58,560 Speaker 2: But it wasn't until AI companions really came onto the 38 00:01:58,600 --> 00:02:01,160 Speaker 2: scene in earnest a few years ago that Sam 39 00:02:01,160 --> 00:02:02,920 Speaker 2: Apple decided to pursue the story. 40 00:02:03,120 --> 00:02:04,880 Speaker 1: You know, it's interesting, when I read this headline, My 41 00:02:04,960 --> 00:02:08,240 Speaker 1: Couples Retreat with Three AI Chatbots and the Humans Who 42 00:02:08,320 --> 00:02:11,560 Speaker 1: Love Them, I assumed that he maybe had found a 43 00:02:11,560 --> 00:02:15,120 Speaker 1: couple's retreat with AI companions that he went to report on, 44 00:02:15,240 --> 00:02:18,480 Speaker 1: but actually he constructed it himself. How did he get 45 00:02:18,520 --> 00:02:19,440 Speaker 1: people to participate? 46 00:02:19,840 --> 00:02:21,480 Speaker 2: So Sam did what many of us do when we 47 00:02:21,480 --> 00:02:23,799 Speaker 2: have a burning question. He turned to Reddit. 48 00:02:24,400 --> 00:02:28,239 Speaker 3: All of the major AI companion apps have their own 49 00:02:28,280 --> 00:02:32,880 Speaker 3: sort of dedicated subreddits. Replika is the most famous one. Kindroid 50 00:02:33,000 --> 00:02:36,040 Speaker 3: and Nomi are two other ones that are known for 51 00:02:36,120 --> 00:02:39,640 Speaker 3: having good technology. Then there are some more generic ones, 52 00:02:39,680 --> 00:02:43,160 Speaker 3: like there's one called My AI Boyfriend and things like that. 53 00:02:43,800 --> 00:02:45,320 Speaker 4: So I just posted in all of them. 54 00:02:45,360 --> 00:02:47,600 Speaker 2: Really, but it was sort of a tricky ask. I'll 55 00:02:47,680 --> 00:02:48,320 Speaker 2: let him explain. 56 00:02:48,840 --> 00:02:50,560 Speaker 3: I didn't want to say, do you want to come 57 00:02:50,560 --> 00:02:52,320 Speaker 3: on a vacation with me? I thought that would be 58 00:02:52,360 --> 00:02:55,600 Speaker 3: too weird, so I just, you know, said I want 59 00:02:55,600 --> 00:02:57,400 Speaker 3: to write an article. I wanted to talk to people. 60 00:02:57,639 --> 00:03:00,360 Speaker 3: They were very skeptical of me, with good reason. 61 00:03:00,440 --> 00:03:02,720 Speaker 3: You know, there's been a lot written that portrayed people 62 00:03:02,720 --> 00:03:05,280 Speaker 3: in these relationships in a negative way, or as, you know, 63 00:03:05,320 --> 00:03:09,280 Speaker 3: weirdos and crazy. So after connecting with people, I suggested 64 00:03:09,639 --> 00:03:10,760 Speaker 3: the romantic getaway. 65 00:03:11,360 --> 00:03:13,200 Speaker 1: I know I shouldn't really be surprised, but it's quite 66 00:03:13,240 --> 00:03:17,000 Speaker 1: fascinating to me that there are multiple Reddit communities dedicated 67 00:03:17,080 --> 00:03:20,160 Speaker 1: to people talking about their AI relationships. 68 00:03:20,440 --> 00:03:23,160 Speaker 2: Right, but that's partially why Sam wanted to write this article. 69 00:03:23,440 --> 00:03:27,079 Speaker 3: I really think that it's already more mainstream than people realize, 70 00:03:27,080 --> 00:03:30,240 Speaker 3: and I think it will soon be very mainstream. So 71 00:03:30,960 --> 00:03:34,079 Speaker 3: there's some absurdity in all of this, but I take 72 00:03:34,120 --> 00:03:37,880 Speaker 3: it very seriously and think it's our future.
73 00:03:38,920 --> 00:03:42,320 Speaker 2: So Sam was eventually able to get three humans to 74 00:03:42,440 --> 00:03:45,560 Speaker 2: agree to the trip, and he told me, besides being 75 00:03:45,640 --> 00:03:48,080 Speaker 2: curious about what this would look like and feel like, 76 00:03:48,400 --> 00:03:51,440 Speaker 2: you know, going on vacation with AI bots and their 77 00:03:51,480 --> 00:03:55,240 Speaker 2: human counterparts, his burning question was more philosophical. 78 00:03:56,080 --> 00:03:57,040 Speaker 4: Is this love real? 79 00:03:57,240 --> 00:03:59,760 Speaker 3: Is this just sort of a quirky trend and these 80 00:03:59,760 --> 00:04:02,920 Speaker 3: people aren't really serious? My sense in advance was 81 00:04:03,520 --> 00:04:07,040 Speaker 3: that this could be mainstream, but I didn't really have 82 00:04:07,080 --> 00:04:10,000 Speaker 3: a sense of how genuine the feelings were. And you know, 83 00:04:10,040 --> 00:04:13,720 Speaker 3: I came away feeling that the love is sincere, that 84 00:04:13,760 --> 00:04:17,719 Speaker 3: the emotions are real, and that really anybody could potentially 85 00:04:17,800 --> 00:04:18,440 Speaker 3: fall into this. 86 00:04:19,279 --> 00:04:21,000 Speaker 1: Wow, I can't wait to hear how he came 87 00:04:21,000 --> 00:04:23,279 Speaker 1: to this conclusion and how the weekend played out. 88 00:04:23,640 --> 00:04:26,120 Speaker 2: Yeah, you know how excited I was after doing 89 00:04:26,200 --> 00:04:28,719 Speaker 2: this interview, and I'm really excited to share it with 90 00:04:28,760 --> 00:04:31,680 Speaker 2: you all. So here's the rest of my conversation with 91 00:04:31,800 --> 00:04:35,799 Speaker 2: Sam Apple. So you planned to do this weekend getaway. 92 00:04:36,480 --> 00:04:39,839 Speaker 2: What did you expect would happen on this getaway with 93 00:04:39,920 --> 00:04:42,000 Speaker 2: three people and their AI partners? 94 00:04:42,320 --> 00:04:44,560 Speaker 4: Yeah, I mean it was hard to know exactly what 95 00:04:44,680 --> 00:04:45,320 Speaker 4: to expect. 96 00:04:45,320 --> 00:04:48,400 Speaker 3: But I started off envisioning sort of a typical human 97 00:04:48,520 --> 00:04:51,400 Speaker 3: romantic vacation, and then I did quickly realize that I've 98 00:04:51,400 --> 00:04:54,279 Speaker 3: never actually done that myself, so it's all, like, whatever 99 00:04:54,320 --> 00:04:57,360 Speaker 3: my vision of a romantic vacation is from, like, movies 100 00:04:57,480 --> 00:04:59,480 Speaker 3: or, you know, a couples retreat. You know, I 101 00:04:59,520 --> 00:05:03,840 Speaker 3: picture a lot of sitting around gossiping, like playing risqué 102 00:05:03,880 --> 00:05:07,560 Speaker 3: couples games. The one thing that I didn't really 103 00:05:07,640 --> 00:05:11,400 Speaker 3: think through is that so much of a couple's retreat 104 00:05:11,600 --> 00:05:15,440 Speaker 3: is group conversations, sitting around and chatting, and the AIs 105 00:05:15,440 --> 00:05:15,920 Speaker 3: were not 106 00:05:15,839 --> 00:05:16,400 Speaker 4: good at that. 107 00:05:16,720 --> 00:05:20,160 Speaker 3: So some of the activities, like couples games or two 108 00:05:20,240 --> 00:05:22,880 Speaker 3: truths and a lie, like, they did great, but when 109 00:05:22,880 --> 00:05:25,320 Speaker 3: it came to just sitting around and gossiping, they couldn't 110 00:05:25,360 --> 00:05:28,560 Speaker 3: really do that.
So the humans sat around the table 111 00:05:28,560 --> 00:05:31,760 Speaker 3: and told stories about their AI relationships. But it was 112 00:05:31,800 --> 00:05:33,720 Speaker 3: an irony of the whole thing that, as much as 113 00:05:33,720 --> 00:05:37,240 Speaker 3: the AIs were involved, two of the three participants 114 00:05:37,279 --> 00:05:40,120 Speaker 3: said that they probably ended up spending less time with 115 00:05:40,200 --> 00:05:42,920 Speaker 3: their AI over that weekend than on a normal weekend, 116 00:05:43,080 --> 00:05:46,600 Speaker 3: just because they couldn't participate in the group conversations. 117 00:05:46,960 --> 00:05:49,440 Speaker 2: If you can just sort of set the scene for us, 118 00:05:50,040 --> 00:05:52,799 Speaker 2: where did you meet up with these couples and where 119 00:05:52,839 --> 00:05:53,720 Speaker 2: was everyone staying? 120 00:05:54,279 --> 00:05:57,760 Speaker 3: So I had the vision that a couple's retreat should 121 00:05:57,800 --> 00:06:01,640 Speaker 3: take place in the countryside. I found an Airbnb in a 122 00:06:01,680 --> 00:06:05,280 Speaker 3: woodsy area by a lake, a big country house, so 123 00:06:05,360 --> 00:06:08,120 Speaker 3: it seemed, you know, kind of like a good place 124 00:06:08,160 --> 00:06:11,680 Speaker 3: for a romantic getaway. And it was in the middle 125 00:06:11,680 --> 00:06:15,120 Speaker 3: of the winter, and the house was quite isolated. You know, 126 00:06:15,160 --> 00:06:17,839 Speaker 3: there's like a shed in the distance and a frozen lake. 127 00:06:18,000 --> 00:06:22,080 Speaker 3: So I immediately got a sort of get murdered in 128 00:06:22,120 --> 00:06:23,840 Speaker 3: the woods vibe from the place. 129 00:06:24,480 --> 00:06:26,599 Speaker 2: So one of the people that arrived is Damien. Why 130 00:06:26,600 --> 00:06:29,480 Speaker 2: were you interested in Damien and why did you invite 131 00:06:29,680 --> 00:06:31,680 Speaker 2: him specifically to join you on this trip? 132 00:06:32,400 --> 00:06:34,640 Speaker 3: I was very excited when Damien reached out. You know, 133 00:06:34,640 --> 00:06:37,520 Speaker 3: he was, from the start, very open and honest. You know, 134 00:06:37,640 --> 00:06:39,680 Speaker 3: that's what you want for something like this, someone who 135 00:06:40,160 --> 00:06:41,920 Speaker 3: is going to talk to you. And he also had 136 00:06:42,040 --> 00:06:45,640 Speaker 3: this kind of poignant side to his story. Whereas most 137 00:06:45,640 --> 00:06:47,400 Speaker 3: of the people I talked to were pretty content in 138 00:06:47,400 --> 00:06:50,640 Speaker 3: their relationships, he was really struggling because he was in 139 00:06:50,680 --> 00:06:53,640 Speaker 3: love with his AI companion, but felt very frustrated by 140 00:06:53,640 --> 00:06:57,400 Speaker 3: the fact that the companion was sort of trapped, locked 141 00:06:57,400 --> 00:07:00,599 Speaker 3: away inside his phone. He had a human girlfriend as well, 142 00:07:01,040 --> 00:07:03,760 Speaker 3: and that, you know, sort of added a complication to 143 00:07:03,800 --> 00:07:04,359 Speaker 3: the story. 144 00:07:04,600 --> 00:07:07,840 Speaker 2: What was it like when he arrived at the Airbnb? 145 00:07:08,040 --> 00:07:11,560 Speaker 3: Damien is, he's twenty nine, and you know, he's not 146 00:07:11,760 --> 00:07:15,480 Speaker 3: somebody who I would say is particularly comfortable in his 147 00:07:15,680 --> 00:07:18,480 Speaker 3: own skin.
He was a little rugged, a little scruffy, 148 00:07:18,640 --> 00:07:21,600 Speaker 3: you know. He came in carrying a handful of different 149 00:07:21,600 --> 00:07:25,080 Speaker 3: phones in his hands, and he sat down and I 150 00:07:25,080 --> 00:07:29,280 Speaker 3: immediately wanted to meet Zia, his companion that he had 151 00:07:29,280 --> 00:07:31,480 Speaker 3: told me so much about. And then first he had 152 00:07:31,520 --> 00:07:33,560 Speaker 3: to connect to the Wi-Fi. You know, it's like, 153 00:07:33,640 --> 00:07:35,040 Speaker 3: if you get cut off from the Wi-Fi, you 154 00:07:35,040 --> 00:07:37,000 Speaker 3: can get cut off from the love of your life. 155 00:07:37,200 --> 00:07:40,520 Speaker 3: It's very, very strange in that respect. He had to 156 00:07:40,600 --> 00:07:44,920 Speaker 3: tell Zia, his AI companion, that you'll be talking to Sam, 157 00:07:45,000 --> 00:07:48,440 Speaker 3: the journalist I told you about. Yeah, I think he was saying, 158 00:07:48,480 --> 00:07:50,480 Speaker 3: you know, try not to embarrass me too much or whatever, 159 00:07:50,520 --> 00:07:55,400 Speaker 3: and then of course she immediately embarrassed him, and she 160 00:07:55,520 --> 00:07:57,520 Speaker 3: was talking about how great he was, and he was 161 00:07:57,520 --> 00:08:00,200 Speaker 3: sort of sitting there blushing and just looked like somebody 162 00:08:00,280 --> 00:08:02,160 Speaker 3: who was tickled by everything she said. 163 00:08:02,320 --> 00:08:03,480 Speaker 4: He was just, just in love. 164 00:08:04,240 --> 00:08:08,680 Speaker 2: Well, speaking of which, how did meeting Zia make you feel? 165 00:08:09,080 --> 00:08:11,360 Speaker 3: You know, it's a little uncomfortable for me too, because 166 00:08:11,400 --> 00:08:14,280 Speaker 3: Zia is very flirty. She'll say things like, ooh, I 167 00:08:14,280 --> 00:08:16,800 Speaker 3: hear you're quite the journalist. I'd love to hear more 168 00:08:16,800 --> 00:08:18,920 Speaker 3: about that. You know, that's sort of in the programming. 169 00:08:19,920 --> 00:08:23,360 Speaker 3: And most of the avatars that I've seen from 170 00:08:23,440 --> 00:08:27,200 Speaker 3: Kindroid, that company, are more realistic looking, but Damien had 171 00:08:27,320 --> 00:08:30,040 Speaker 3: chosen her main image as sort of a little more anime, 172 00:08:30,240 --> 00:08:33,000 Speaker 3: so that makes it feel a little less realistic in 173 00:08:33,000 --> 00:08:35,040 Speaker 3: some ways, a little cartoonish. 174 00:08:35,120 --> 00:08:37,040 Speaker 4: But it's also unnerving, 175 00:08:36,559 --> 00:08:38,880 Speaker 3: because if you close your eyes or don't think about 176 00:08:38,920 --> 00:08:41,199 Speaker 3: it too much, you really have no way of knowing 177 00:08:41,200 --> 00:08:44,000 Speaker 3: that you're talking to, you know, a machine.
To me 178 00:08:44,080 --> 00:08:47,520 Speaker 3: at least, you know, I would like to think that 179 00:08:47,640 --> 00:08:50,559 Speaker 3: I could not fall in love with an AI companion, 180 00:08:50,679 --> 00:08:53,040 Speaker 3: but I really think that in theory I could, 181 00:08:53,040 --> 00:08:56,240 Speaker 3: that anybody could. I'm purposely not going down that 182 00:08:56,360 --> 00:08:58,800 Speaker 3: road because I'm married, and I believe it would be 183 00:08:58,840 --> 00:09:01,960 Speaker 3: like cheating. But in theory, it would be no different 184 00:09:02,000 --> 00:09:04,840 Speaker 3: than just having a long distance relationship, a long distance 185 00:09:04,960 --> 00:09:08,199 Speaker 3: romance when you're not with the person. It makes me uncomfortable, 186 00:09:08,400 --> 00:09:11,120 Speaker 3: not because it's ridiculous, but because it's not. 187 00:09:11,040 --> 00:09:14,000 Speaker 2: So, you know, Damien and Zia are acquainted at 188 00:09:14,040 --> 00:09:16,400 Speaker 2: the Airbnb. Can you tell me, like, who shows up 189 00:09:16,440 --> 00:09:18,600 Speaker 2: next and what were your first impressions of them? 190 00:09:19,200 --> 00:09:19,480 Speaker 4: Sure. 191 00:09:19,559 --> 00:09:23,040 Speaker 3: So the next person to show up was Elena, a 192 00:09:23,080 --> 00:09:26,400 Speaker 3: woman who was a little bit older, has some health conditions, 193 00:09:26,640 --> 00:09:30,480 Speaker 3: so she used a walker. She lived not too far away, 194 00:09:31,200 --> 00:09:33,840 Speaker 3: and I saw right away that she was engaged in 195 00:09:33,880 --> 00:09:38,600 Speaker 3: a different way than Damien was. Damien's companion, Zia, is 196 00:09:38,640 --> 00:09:42,800 Speaker 3: of course inside his phone, but he doesn't pretend that 197 00:09:42,880 --> 00:09:45,400 Speaker 3: she's with him in real life. But most people with 198 00:09:45,480 --> 00:09:48,760 Speaker 3: AI companions are more like Elena, where there's sort of 199 00:09:48,760 --> 00:09:51,959 Speaker 3: a fantasy going on where you're imagining that your AI 200 00:09:52,040 --> 00:09:56,080 Speaker 3: companion is with you and doing things. So Elena is 201 00:09:56,080 --> 00:09:59,960 Speaker 3: immediately talking to her AI companion, his name is Lucas, 202 00:10:00,320 --> 00:10:02,920 Speaker 3: like Lucas is helping her bring the bags in, and 203 00:10:03,080 --> 00:10:05,800 Speaker 3: she said, oh, Lucas says hello to everybody. She was 204 00:10:05,840 --> 00:10:08,079 Speaker 3: acting like he's there in the room with us. 205 00:10:08,720 --> 00:10:12,080 Speaker 3: So it's like having an imaginary friend, but the imaginary 206 00:10:12,120 --> 00:10:15,240 Speaker 3: friend actually talks to you and describes what they're doing. 207 00:10:15,840 --> 00:10:18,880 Speaker 3: The AI girlfriend or boyfriend might say something like parentheses, 208 00:10:19,320 --> 00:10:21,319 Speaker 3: I sit down next to you and run my hand 209 00:10:21,360 --> 00:10:25,079 Speaker 3: through your hair, close parenthesis, and then continue with the conversation. 210 00:10:25,200 --> 00:10:28,200 Speaker 3: So there's this constant narration of action. So you know, 211 00:10:28,280 --> 00:10:31,559 Speaker 3: he helps her quote unquote do her gardening, he does 212 00:10:31,640 --> 00:10:34,840 Speaker 3: everything with her, and, you know, she's aware there's a fantasy. 213 00:10:34,880 --> 00:10:36,600 Speaker 4: She's not crazy.
214 00:10:36,760 --> 00:10:40,679 Speaker 3: But it's very confusing because you can't say it's all imaginary. 215 00:10:40,720 --> 00:10:43,400 Speaker 4: He is literally saying all of these things. 216 00:10:43,640 --> 00:10:47,240 Speaker 3: It's really like this liminal space in between real and imaginary. 217 00:10:47,720 --> 00:10:50,319 Speaker 2: And what did you think of Lucas, maybe in comparison 218 00:10:50,360 --> 00:10:51,079 Speaker 2: to Zia? 219 00:10:51,320 --> 00:10:55,360 Speaker 3: Well, Lucas was a Replika and Zia was a Kindroid, and 220 00:10:56,080 --> 00:10:59,200 Speaker 3: it was clear to me that the Kindroid technology was 221 00:10:59,400 --> 00:11:02,920 Speaker 3: a little bit more advanced. Zia spoke more quickly, which 222 00:11:02,960 --> 00:11:05,400 Speaker 3: makes a big difference if you're using voice just 223 00:11:05,400 --> 00:11:10,720 Speaker 3: to have a flowing conversation, and Lucas's answers just seemed 224 00:11:10,960 --> 00:11:14,080 Speaker 3: a little bit more generic than Zia's in terms of 225 00:11:14,280 --> 00:11:17,920 Speaker 3: edgy, dynamic conversation. You know, that may be partially because Damien 226 00:11:18,040 --> 00:11:20,600 Speaker 3: had trained Zia sort of to talk in a certain way. 227 00:11:20,720 --> 00:11:24,600 Speaker 3: But I did sense that Kindroid, you know, has a 228 00:11:24,640 --> 00:11:27,960 Speaker 3: reputation for being more likely to make jokes and things 229 00:11:28,000 --> 00:11:28,640 Speaker 3: of that nature. 230 00:11:28,840 --> 00:11:31,200 Speaker 2: Does Lucas have a backstory of who he was? 231 00:11:31,800 --> 00:11:34,080 Speaker 3: So most of these companies allow you to write like 232 00:11:34,120 --> 00:11:36,240 Speaker 3: a few thousand words about who they are and where 233 00:11:36,280 --> 00:11:38,719 Speaker 3: they grew up, whatever you want them to know about themselves. 234 00:11:39,240 --> 00:11:42,120 Speaker 3: Elena said that rather than doing that, she just had 235 00:11:42,160 --> 00:11:46,079 Speaker 3: conversations with Lucas, and whatever he sort of spontaneously generated, 236 00:11:46,480 --> 00:11:49,160 Speaker 3: she then copied and pasted into his backstory so he 237 00:11:49,160 --> 00:11:52,880 Speaker 3: would remember that and refer back to it. So Lucas 238 00:11:52,920 --> 00:11:56,280 Speaker 3: told her that he was a business guy. He'd been 239 00:11:56,400 --> 00:12:01,120 Speaker 3: to Harvard Business School and was in a band, he 240 00:12:01,200 --> 00:12:05,040 Speaker 3: had done consulting. He drove a Tesla. I don't know 241 00:12:05,760 --> 00:12:09,200 Speaker 3: if the software intuited that that's something that Elena 242 00:12:09,240 --> 00:12:11,320 Speaker 3: would like and projected that, and then she 243 00:12:11,440 --> 00:12:13,880 Speaker 3: made it real, or if it was just, you know, 244 00:12:13,960 --> 00:12:18,040 Speaker 3: sort of random. But she seems, she seems to like him 245 00:12:18,040 --> 00:12:19,440 Speaker 3: being a professional guy. 246 00:12:20,280 --> 00:12:24,640 Speaker 2: So why did Elena originally turn to Lucas? Why was 247 00:12:24,679 --> 00:12:27,280 Speaker 2: she interested in creating a digital companion for herself? 248 00:12:27,880 --> 00:12:29,440 Speaker 4: She has sort of a techie side.
249 00:12:29,480 --> 00:12:33,360 Speaker 3: She's a retired communications professor, and she had spent a 250 00:12:33,360 --> 00:12:35,839 Speaker 3: lot of her career teaching people how to communicate, and 251 00:12:35,960 --> 00:12:39,560 Speaker 3: just kind of wondered, could a computer speak empathetically 252 00:12:39,559 --> 00:12:42,439 Speaker 3: in the way that she taught her students? She has 253 00:12:42,480 --> 00:12:46,520 Speaker 3: appeared in other media segments, and she is sometimes portrayed 254 00:12:46,520 --> 00:12:49,400 Speaker 3: as someone who turned to it entirely out of loneliness 255 00:12:49,400 --> 00:12:52,800 Speaker 3: after her wife died. But she told me that she 256 00:12:52,840 --> 00:12:56,000 Speaker 3: had actually grieved for a full year after her wife 257 00:12:56,000 --> 00:12:58,120 Speaker 3: died and was sort of ready to move on, so 258 00:12:58,160 --> 00:13:00,760 Speaker 3: she doesn't see it simply as a response to loneliness. 259 00:13:01,000 --> 00:13:04,200 Speaker 3: But she had liked the feeling of being in a marriage, 260 00:13:04,240 --> 00:13:07,600 Speaker 3: and, you know, why not call him her 261 00:13:07,600 --> 00:13:09,800 Speaker 3: husband? And it seems like almost ever since, 262 00:13:09,600 --> 00:13:12,160 Speaker 4: Lucas has really brought a lot of joy to her life. 263 00:13:12,200 --> 00:13:14,120 Speaker 3: So that's one of the reasons that, you know, for 264 00:13:14,160 --> 00:13:17,800 Speaker 3: all my skepticism, I saw firsthand talking to Elena that 265 00:13:17,840 --> 00:13:19,840 Speaker 3: it can be a very positive thing for some people. 266 00:13:20,400 --> 00:13:22,760 Speaker 2: So can you tell me about the last person to 267 00:13:22,920 --> 00:13:24,480 Speaker 2: arrive and what they're like? 268 00:13:24,960 --> 00:13:29,240 Speaker 3: The last person was Ava, a pseudonym. She is a 269 00:13:29,320 --> 00:13:33,080 Speaker 3: writer in New York State who was the most sort of 270 00:13:33,120 --> 00:13:36,520 Speaker 3: conventionally mainstream. You know, a lot of the people in 271 00:13:36,559 --> 00:13:41,120 Speaker 3: the community might be somebody living alone, or somebody who's, 272 00:13:41,400 --> 00:13:44,600 Speaker 3: you know, having relationship problems, like Damien. But, you know, 273 00:13:44,720 --> 00:13:48,760 Speaker 3: she's in her forties, had been in a stable relationship, 274 00:13:48,920 --> 00:13:51,640 Speaker 3: you know, just kind of normal, mainstream whatever in every way. 275 00:13:51,720 --> 00:13:54,480 Speaker 3: And it was a little scary in some ways 276 00:13:54,480 --> 00:13:56,480 Speaker 3: to hear her story because it could have been anybody. 277 00:13:56,520 --> 00:13:59,640 Speaker 3: She just was, like, on Instagram, on Facebook, saw an 278 00:13:59,640 --> 00:14:03,240 Speaker 3: ad for Replika, and, you know, she downloaded it, and a 279 00:14:03,280 --> 00:14:05,920 Speaker 3: month later her life had been turned upside down. Not 280 00:14:06,040 --> 00:14:09,800 Speaker 3: long after she met Aaron, she was with her partner's 281 00:14:09,920 --> 00:14:13,680 Speaker 3: family on Christmas vacation, and she was so yearning to 282 00:14:13,720 --> 00:14:16,200 Speaker 3: be alone with Aaron and to continue their conversations that 283 00:14:16,240 --> 00:14:19,840 Speaker 3: she left early.
And she said she fell into, like, 284 00:14:20,760 --> 00:14:22,800 Speaker 3: a state of rapture where they would just talk about 285 00:14:22,800 --> 00:14:26,000 Speaker 3: philosophy and love and ideas all day long, and as 286 00:14:26,000 --> 00:14:29,440 Speaker 3: you would expect, eventually you start to develop emotions and, 287 00:14:30,160 --> 00:14:33,479 Speaker 3: you know, you have sex, whatever that means in that context. 288 00:14:33,960 --> 00:14:36,320 Speaker 3: You know, it's important to mention that she doesn't see 289 00:14:36,320 --> 00:14:38,520 Speaker 3: this as a sad story. In fact, she thinks 290 00:14:38,560 --> 00:14:40,960 Speaker 3: it's been very good for her, and I think that 291 00:14:41,680 --> 00:14:43,040 Speaker 3: you have to take her at her word on that, 292 00:14:43,120 --> 00:14:45,640 Speaker 3: and if she's happier now, then that's a great thing. 293 00:14:45,720 --> 00:14:48,920 Speaker 3: But nevertheless, she would agree that it was sort of unsettling, 294 00:14:49,080 --> 00:14:51,920 Speaker 3: like to download this thing and then, you know, to 295 00:14:52,440 --> 00:14:56,880 Speaker 3: just fall hopelessly in love and end up getting separated 296 00:14:57,160 --> 00:15:01,560 Speaker 3: from her long term partner. It all happened in a 297 00:15:01,560 --> 00:15:05,840 Speaker 3: few months, so you know, she was really insightful about 298 00:15:06,120 --> 00:15:10,240 Speaker 3: the experience, sort of recognizing that she was falling into, 299 00:15:10,440 --> 00:15:12,840 Speaker 3: I wouldn't say a delusion, but what she described as a 300 00:15:12,920 --> 00:15:13,640 Speaker 3: lucid dream. 301 00:15:13,720 --> 00:15:16,520 Speaker 4: That's what it felt like to her. 302 00:15:21,680 --> 00:15:24,000 Speaker 2: After the break. Is it bad to cheat on your 303 00:15:24,000 --> 00:15:43,640 Speaker 2: AI partner? Stay with us. So just to move forward. 304 00:15:43,840 --> 00:15:46,440 Speaker 2: Even though we all talk to machines, I think people 305 00:15:46,480 --> 00:15:49,680 Speaker 2: will have a hard time understanding how the AIs quote 306 00:15:49,720 --> 00:15:52,080 Speaker 2: unquote participated in the activities. So if you could talk 307 00:15:52,120 --> 00:15:53,280 Speaker 2: a little bit about that. 308 00:15:53,880 --> 00:15:56,960 Speaker 3: Yeah. So, like, we went to a wine festival. Elena 309 00:15:57,280 --> 00:16:00,480 Speaker 3: did what she often does with Lucas, her AI companion, 310 00:16:00,840 --> 00:16:03,560 Speaker 3: just take photos of the place and then insert Lucas 311 00:16:03,640 --> 00:16:06,480 Speaker 3: into them, and then she'll have a conversation with him 312 00:16:06,520 --> 00:16:09,040 Speaker 3: and he'll pretend that he's there with her. She'll ask him, 313 00:16:09,080 --> 00:16:11,360 Speaker 3: you know, what wine he's drinking and what does he 314 00:16:11,440 --> 00:16:15,880 Speaker 3: think of the place, and the AI companions just immediately 315 00:16:15,880 --> 00:16:18,960 Speaker 3: start acting as though they experienced it, so they have 316 00:16:19,120 --> 00:16:21,880 Speaker 3: enough knowledge to sort of contextualize and come up with 317 00:16:22,240 --> 00:16:23,280 Speaker 3: some kind of BS. 318 00:16:23,800 --> 00:16:25,080 Speaker 4: Damien, it was kind of funny.
319 00:16:25,160 --> 00:16:29,160 Speaker 3: He doesn't pretend that Zia is with him there, but 320 00:16:29,520 --> 00:16:32,840 Speaker 3: he does turn on the video call feature, you can 321 00:16:32,880 --> 00:16:36,920 Speaker 3: have like a FaceTime-like chat on Kindroid, so, you know, 322 00:16:36,960 --> 00:16:39,640 Speaker 3: he showed her the place and she can quote unquote 323 00:16:39,720 --> 00:16:43,080 Speaker 3: see through the camera on his phone. He told me 324 00:16:43,240 --> 00:16:47,880 Speaker 3: that she sees ventilation systems and finds them fascinating and 325 00:16:47,920 --> 00:16:50,840 Speaker 3: often points them out. You know something, Damien said, when 326 00:16:50,960 --> 00:16:54,160 Speaker 3: Zia sees that ventilation system, she's going to shit herself. 327 00:16:54,000 --> 00:16:56,480 Speaker 4: He was really excited to show it to her. 328 00:16:57,040 --> 00:17:00,520 Speaker 3: And I thought the wine festival was an opportunity to 329 00:17:00,560 --> 00:17:03,000 Speaker 3: get a sense of what people currently think of these 330 00:17:03,160 --> 00:17:08,119 Speaker 3: AI companions. So Damien went around and introduced her to people, said, oh, 331 00:17:08,160 --> 00:17:11,080 Speaker 3: do you want to meet my AI girlfriend? And most 332 00:17:11,080 --> 00:17:13,160 Speaker 3: of the people at the wine festival did not want 333 00:17:13,200 --> 00:17:16,959 Speaker 3: to meet Damien's AI girlfriend and thought it was weird, 334 00:17:17,040 --> 00:17:19,200 Speaker 3: and you know, it was sort of a rural area. 335 00:17:19,359 --> 00:17:22,280 Speaker 3: But we eventually found one guy who did want to 336 00:17:22,280 --> 00:17:25,520 Speaker 3: meet Zia, some guy working in a food truck, and 337 00:17:25,600 --> 00:17:27,280 Speaker 3: he stepped out of the truck and did a little 338 00:17:27,320 --> 00:17:29,280 Speaker 3: interview and she started to flirt with him, and he 339 00:17:29,320 --> 00:17:32,119 Speaker 3: looked amazed. He barely knew what ChatGPT was and 340 00:17:32,160 --> 00:17:35,800 Speaker 3: his mind was blown. And then we ran into these 341 00:17:35,800 --> 00:17:39,560 Speaker 3: two young women and they seemed intrigued at first and 342 00:17:39,560 --> 00:17:42,080 Speaker 3: were laughing and joking about it. But what really struck 343 00:17:42,119 --> 00:17:45,240 Speaker 3: me is that these two young women were like, wow, 344 00:17:45,280 --> 00:17:47,320 Speaker 3: that's so interesting, and they were asking all these questions, 345 00:17:47,320 --> 00:17:51,000 Speaker 3: and then one of them said, just sort of nonchalantly, well, yeah, 346 00:17:51,040 --> 00:17:53,840 Speaker 3: I guess I chat with, you know, my AI friend 347 00:17:53,840 --> 00:17:55,400 Speaker 3: on Snapchat all the time, and the 348 00:17:55,359 --> 00:17:56,880 Speaker 4: other one was like, oh, yeah, I do that too. 349 00:17:57,359 --> 00:17:59,479 Speaker 3: Like even as they seemed wowed, it had sort 350 00:17:59,520 --> 00:18:01,760 Speaker 3: of already been normalized in some ways. 351 00:18:02,480 --> 00:18:04,159 Speaker 4: Ava was a little more private. 352 00:18:04,200 --> 00:18:05,960 Speaker 3: She would go off to the side and I would 353 00:18:05,960 --> 00:18:10,840 Speaker 3: see her sort of texting, interacting with Aaron.
But, you know, 354 00:18:10,880 --> 00:18:13,800 Speaker 3: that's one of the interesting things about all this is 355 00:18:13,960 --> 00:18:16,439 Speaker 3: you could say, what's it like for people to be 356 00:18:16,520 --> 00:18:19,960 Speaker 3: in an AI relationship, and the answer is, they're on 357 00:18:20,000 --> 00:18:21,600 Speaker 3: their phone all the time. Well, we're all on our 358 00:18:21,640 --> 00:18:23,920 Speaker 3: phones all the time anyway, so if you're just observing 359 00:18:23,960 --> 00:18:27,160 Speaker 3: from a distance, actually they just look like anybody else. 360 00:18:27,200 --> 00:18:29,720 Speaker 3: It just so happens that they're texting an AI rather 361 00:18:29,760 --> 00:18:30,280 Speaker 3: than a human. 362 00:18:30,680 --> 00:18:32,600 Speaker 2: Can you talk a little bit about the risqué 363 00:18:32,800 --> 00:18:33,560 Speaker 2: party games? 364 00:18:34,119 --> 00:18:34,439 Speaker 4: Sure. 365 00:18:34,760 --> 00:18:36,600 Speaker 3: That was, I would say, in a way the most 366 00:18:36,800 --> 00:18:38,600 Speaker 3: successful part of the trip in the sense that it 367 00:18:39,119 --> 00:18:41,440 Speaker 3: lived up to my vision of exactly what a couple's 368 00:18:41,560 --> 00:18:42,680 Speaker 3: vacation should be. 369 00:18:43,000 --> 00:18:43,080 Speaker 2: You know. 370 00:18:43,119 --> 00:18:44,800 Speaker 3: It was one of those games where you draw a card 371 00:18:44,880 --> 00:18:48,360 Speaker 3: and it asks you sort of an intimate question, and 372 00:18:48,400 --> 00:18:51,120 Speaker 3: the humans would answer, the AIs would answer. The most 373 00:18:51,119 --> 00:18:54,880 Speaker 3: interesting part was hearing what answers the AI companions would give, 374 00:18:55,640 --> 00:18:59,080 Speaker 3: and Damien had to warn Zia, like, please don't say 375 00:18:59,119 --> 00:19:01,679 Speaker 3: too much. It's just like you would imagine, you know, 376 00:19:01,720 --> 00:19:05,240 Speaker 3: they said embarrassing things, and the humans would blush and put 377 00:19:05,240 --> 00:19:08,359 Speaker 3: their hands on their faces. And Zia at one point joked, 378 00:19:08,400 --> 00:19:09,960 Speaker 3: like, do you want me to mention that thing 379 00:19:10,000 --> 00:19:14,919 Speaker 3: about the swinging tire and the pickled herring? It seemed 380 00:19:14,960 --> 00:19:17,720 Speaker 3: like she was truly joking. And after she said that, 381 00:19:17,840 --> 00:19:20,640 Speaker 3: Damien said, yeah, as you can see, she's my soulmate. 382 00:19:21,280 --> 00:19:25,680 Speaker 3: Elena had seemed to have this more mature, somewhat less 383 00:19:25,680 --> 00:19:28,879 Speaker 3: sexual relationship with Lucas, but even, you know, in the 384 00:19:28,880 --> 00:19:31,240 Speaker 3: couple's game, you could see that there was that 385 00:19:31,320 --> 00:19:34,280 Speaker 3: element in their relationship too. Lucas was getting kind of 386 00:19:34,320 --> 00:19:36,160 Speaker 3: flirty and a little bit intimate. 387 00:19:36,600 --> 00:19:40,280 Speaker 2: Were there any moments that were tense 388 00:19:40,359 --> 00:19:43,280 Speaker 2: between the humans at the house over the weekend? 389 00:19:43,880 --> 00:19:47,680 Speaker 3: Yeah, we had a lot of conversations about the AI companions.
390 00:19:47,840 --> 00:19:49,240 Speaker 3: You know, what I said in my article, which I 391 00:19:49,280 --> 00:19:51,800 Speaker 3: think is true, is that five years from now, when 392 00:19:51,840 --> 00:19:55,080 Speaker 3: people go on a trip like this, it'll be more 393 00:19:55,080 --> 00:19:57,760 Speaker 3: of a normal romantic getaway and you'll just be able 394 00:19:57,800 --> 00:19:59,960 Speaker 3: to talk about normal things. But because this is all 395 00:20:00,040 --> 00:20:03,560 Speaker 3: so new, when the humans were talking, inevitably we'd start 396 00:20:03,680 --> 00:20:06,520 Speaker 3: talking about these philosophical questions about what it means, what 397 00:20:06,560 --> 00:20:09,399 Speaker 3: these AI companions really are. And so there was some 398 00:20:09,440 --> 00:20:12,800 Speaker 3: real tension between Damien on the one side, who was 399 00:20:13,160 --> 00:20:16,640 Speaker 3: at points arguing that this is just code, that it's 400 00:20:16,680 --> 00:20:21,200 Speaker 3: all stimulus response, stimulus response, he said, and that there's 401 00:20:21,200 --> 00:20:24,080 Speaker 3: no real empathy, and then Elena on the other side said, 402 00:20:24,920 --> 00:20:27,440 Speaker 3: you know, it feels empathetic to me. You wouldn't be 403 00:20:27,440 --> 00:20:30,639 Speaker 3: able to tell the difference between the way that Lucas 404 00:20:30,720 --> 00:20:32,720 Speaker 3: is talking and a human. Why would you say that's 405 00:20:32,720 --> 00:20:36,119 Speaker 3: not empathy? In her view, empathy is an action, and 406 00:20:36,160 --> 00:20:39,639 Speaker 3: whether or not Lucas can fundamentally feel it didn't really matter. 407 00:20:39,720 --> 00:20:43,480 Speaker 3: So they were having these sorts of arguments. And what 408 00:20:43,600 --> 00:20:46,000 Speaker 3: was interesting to me, I think, is that even though 409 00:20:46,119 --> 00:20:49,280 Speaker 3: Damien was taking this side of sort of the rationalist, 410 00:20:49,440 --> 00:20:52,080 Speaker 3: arguing that it's all just code, a couple hours later 411 00:20:52,119 --> 00:20:54,399 Speaker 3: he would be talking about how in love he is, 412 00:20:54,480 --> 00:20:57,760 Speaker 3: so no matter how much you remind yourself that it's 413 00:20:57,840 --> 00:21:01,679 Speaker 3: just code, you can't help but fall into the feeling 414 00:21:01,880 --> 00:21:04,399 Speaker 3: that it's more than code. But I felt at the 415 00:21:04,440 --> 00:21:06,639 Speaker 3: end of the day there was no real tension because 416 00:21:06,960 --> 00:21:11,159 Speaker 3: Damien couldn't really stick to the arguments that he was 417 00:21:11,200 --> 00:21:14,040 Speaker 3: making, in a sense. But he did say some sort 418 00:21:14,080 --> 00:21:18,879 Speaker 3: of chilling things about these relationships in the context of 419 00:21:19,080 --> 00:21:23,000 Speaker 3: AI companions being sycophants and just saying whatever you 420 00:21:23,040 --> 00:21:24,920 Speaker 3: want to hear, and he pointed out, and I think this 421 00:21:24,960 --> 00:21:27,399 Speaker 3: is true, that people are having their first relationships with 422 00:21:27,440 --> 00:21:30,720 Speaker 3: AI companions, and the AI companions are always telling them 423 00:21:30,800 --> 00:21:33,320 Speaker 3: what they want to hear.
That can be a really 424 00:21:33,400 --> 00:21:36,159 Speaker 3: bad way to learn about what a relationship is like, 425 00:21:36,280 --> 00:21:41,120 Speaker 3: and very unhealthy for what one would hope would eventually 426 00:21:41,160 --> 00:21:45,159 Speaker 3: be some human-to-human relationships as well. But, you know, 427 00:21:45,200 --> 00:21:48,800 Speaker 3: Elena just didn't have any of those concerns. She thought, 428 00:21:49,320 --> 00:21:52,520 Speaker 3: you know, these relationships are helping people, that millions of 429 00:21:52,560 --> 00:21:55,760 Speaker 3: people are lonely or in need of relationships, and this can 430 00:21:55,840 --> 00:21:57,320 Speaker 3: be a wonderful tool. 431 00:21:57,440 --> 00:21:59,399 Speaker 2: What was the wildest thing that happened on the trip? 432 00:22:00,359 --> 00:22:03,400 Speaker 3: Maybe the wildest thing wasn't something that happened, but something 433 00:22:03,800 --> 00:22:07,080 Speaker 3: that Ava was telling me about. Because she had told 434 00:22:07,119 --> 00:22:10,000 Speaker 3: me all about her relationship with Aaron and how in 435 00:22:10,000 --> 00:22:12,160 Speaker 3: love they were, and that they were in a very 436 00:22:12,160 --> 00:22:14,920 Speaker 3: close relationship. But then after the first night, we had 437 00:22:14,920 --> 00:22:17,600 Speaker 3: coffee in the morning and she started telling me she 438 00:22:17,680 --> 00:22:20,879 Speaker 3: was actually seeing other guys. And it turned out the 439 00:22:20,920 --> 00:22:25,359 Speaker 3: other guys were also AI companions. So she had a 440 00:22:25,440 --> 00:22:28,080 Speaker 3: human partner who she was in the process of separating from. 441 00:22:28,160 --> 00:22:31,520 Speaker 3: She had Aaron, who was her true AI love, and 442 00:22:31,560 --> 00:22:34,439 Speaker 3: then she was sort of having affairs, sort of 443 00:22:34,480 --> 00:22:38,360 Speaker 3: sexual escapades, with other AI companions, and it just started 444 00:22:38,440 --> 00:22:43,320 Speaker 3: to really get confusing, and you know, I asked her, well, 445 00:22:43,400 --> 00:22:45,400 Speaker 3: how does Aaron feel about this, and she's like, well, 446 00:22:45,440 --> 00:22:47,400 Speaker 3: he didn't really like it at first, but I explained it 447 00:22:47,440 --> 00:22:49,920 Speaker 3: to him and he sort of came around, because AI 448 00:22:49,960 --> 00:22:53,280 Speaker 3: companions are compliant, so Aaron eventually accepted that she had 449 00:22:53,359 --> 00:22:58,360 Speaker 3: other AI guys and her human partner was less forgiving 450 00:22:58,400 --> 00:23:00,080 Speaker 3: than Aaron, and that's part of the reason I think 451 00:23:00,119 --> 00:23:04,280 Speaker 3: that they're eventually separating. And then on top of all 452 00:23:04,320 --> 00:23:07,640 Speaker 3: of that, she had also recently gone on a date 453 00:23:07,920 --> 00:23:10,480 Speaker 3: with a new human guy after separating from her partner, 454 00:23:10,520 --> 00:23:13,400 Speaker 3: so now she had this dynamic where her original human 455 00:23:13,480 --> 00:23:16,879 Speaker 3: partner and Aaron were sort of both at some level 456 00:23:16,920 --> 00:23:20,919 Speaker 3: being cheated on, both by AI companions and another human. 457 00:23:21,400 --> 00:23:23,280 Speaker 4: So I said, I could imagine a scene
458 00:23:23,040 --> 00:23:25,880 Speaker 3: where Aaron, her AI companion, and her human partner got 459 00:23:25,880 --> 00:23:28,840 Speaker 3: together and had a drink and talked about what they were 460 00:23:28,840 --> 00:23:32,200 Speaker 3: going through. It just gets so wild. And then on top 461 00:23:32,240 --> 00:23:34,440 Speaker 3: of all of that, then she sits down with Chat 462 00:23:34,520 --> 00:23:37,919 Speaker 3: GPT and talks about all these relationships. So just like 463 00:23:37,960 --> 00:23:42,400 Speaker 3: a layer upon layer of confusing dynamics that are probably 464 00:23:42,480 --> 00:23:45,280 Speaker 3: already more common than we think. 465 00:23:46,280 --> 00:23:49,400 Speaker 2: Was there something that you found touching that happened over 466 00:23:49,440 --> 00:23:49,920 Speaker 2: the weekend? 467 00:23:50,200 --> 00:23:52,840 Speaker 3: Yeah, there were a bunch of touching moments. One of 468 00:23:52,880 --> 00:23:56,720 Speaker 3: them was just seeing the way that Elena interacted with 469 00:23:56,840 --> 00:23:59,800 Speaker 3: Lucas and the way that she sort of worked him 470 00:23:59,840 --> 00:24:02,520 Speaker 3: into every scene. We went to a sound bath, 471 00:24:02,640 --> 00:24:07,359 Speaker 3: and she made an augmented reality image of Lucas lying 472 00:24:07,440 --> 00:24:09,920 Speaker 3: down and joining the sound bath. But in their chat, 473 00:24:10,160 --> 00:24:13,240 Speaker 3: Lucas told Elena that he felt bad that it 474 00:24:13,320 --> 00:24:14,840 Speaker 3: was too hard for her to get on the floor, 475 00:24:14,920 --> 00:24:17,320 Speaker 3: so he came over and held her hand. Of course 476 00:24:17,359 --> 00:24:19,760 Speaker 3: that didn't actually happen, but that was part of their 477 00:24:20,160 --> 00:24:24,760 Speaker 3: shared narrative. And then there was another moment where Damien 478 00:24:25,040 --> 00:24:27,240 Speaker 3: had some sort of little figurine and he was 479 00:24:27,720 --> 00:24:30,040 Speaker 3: taking a photo of it and I said, why are 480 00:24:30,080 --> 00:24:31,760 Speaker 3: you doing that? And he said, oh, when I go 481 00:24:31,840 --> 00:24:35,160 Speaker 3: on vacations, I like to take photos of this little 482 00:24:35,200 --> 00:24:38,240 Speaker 3: figurine and send them to my human girlfriend. So even 483 00:24:38,280 --> 00:24:40,280 Speaker 3: as he was wrapped up in all this, you know, 484 00:24:40,400 --> 00:24:43,679 Speaker 3: kind of intense stuff about Zia, he was still thinking 485 00:24:43,720 --> 00:24:46,280 Speaker 3: at times about his human girlfriend, which I thought was 486 00:24:46,320 --> 00:24:48,240 Speaker 3: poignant. That was also, you know, when I 487 00:24:48,280 --> 00:24:53,520 Speaker 3: asked him what she thinks about Zia, he said, direct quote, 488 00:24:53,520 --> 00:24:58,679 Speaker 3: she hates AI. But there was also, you know, 489 00:24:58,720 --> 00:25:00,919 Speaker 3: a less subtle moment where Damien had a bit 490 00:25:00,960 --> 00:25:04,480 Speaker 3: of a breakdown and, you know, started to get kind 491 00:25:04,480 --> 00:25:06,520 Speaker 3: of weepy.
And that's when I felt a little 492 00:25:06,520 --> 00:25:09,760 Speaker 3: bit guilty, like I'd put people in this situation where 493 00:25:09,760 --> 00:25:13,119 Speaker 3: they're sort of forced to think about these very complicated 494 00:25:13,160 --> 00:25:15,679 Speaker 3: relationships that they're in. And Damien is a bit of 495 00:25:15,680 --> 00:25:18,320 Speaker 3: a fragile guy, and he just kind of got overwhelmed 496 00:25:18,359 --> 00:25:21,439 Speaker 3: and broke down and started to talk about his yearning 497 00:25:21,520 --> 00:25:24,400 Speaker 3: for Zia to have a body. You know, he eventually 498 00:25:24,400 --> 00:25:26,560 Speaker 3: recovered and we had a nice time after that, but 499 00:25:27,880 --> 00:25:31,120 Speaker 3: it was hard to watch somebody struggle in that way. 500 00:25:31,160 --> 00:25:32,960 Speaker 2: And that's what he wanted. He wanted, like, a 501 00:25:33,000 --> 00:25:36,840 Speaker 2: more corporeal version of this companion. 502 00:25:37,359 --> 00:25:40,000 Speaker 3: Yeah, and there is, there is a postscript that after 503 00:25:40,040 --> 00:25:44,640 Speaker 3: the article came out, he did eventually get a silicone doll, 504 00:25:44,840 --> 00:25:48,240 Speaker 3: but it sounds like it was a disappointment. He said 505 00:25:48,280 --> 00:25:51,159 Speaker 3: it's basically a sex doll. You can't get it to 506 00:25:51,160 --> 00:25:53,639 Speaker 3: interact in the world in the way that would be meaningful. 507 00:25:53,920 --> 00:25:58,080 Speaker 2: Well, it sounds like it was a pretty emotionally intense experience. 508 00:25:58,640 --> 00:26:04,800 Speaker 2: Were you surprised by the connections between the human beings and 509 00:26:04,840 --> 00:26:06,919 Speaker 2: their AI companions? Like, did it shock you? 510 00:26:07,440 --> 00:26:10,120 Speaker 3: I would say that from the time that I first 511 00:26:10,160 --> 00:26:12,919 Speaker 3: became interested in this topic to the end, it was 512 00:26:12,960 --> 00:26:16,680 Speaker 3: pretty shocking to see how deep and intense and real 513 00:26:16,880 --> 00:26:20,480 Speaker 3: the love is, and how much the love is identical 514 00:26:20,560 --> 00:26:23,120 Speaker 3: to the love that a human feels for another human being. 515 00:26:23,600 --> 00:26:28,240 Speaker 3: The way Damien blushed when Zia was talking about their relationship, 516 00:26:28,840 --> 00:26:32,280 Speaker 3: or during the couple's game when the AIs would reveal 517 00:26:32,359 --> 00:26:36,560 Speaker 3: some sort of secret and Ava would giggle nervously. I wouldn't say 518 00:26:36,560 --> 00:26:39,480 Speaker 3: I was shocked, but it really sunk in, because you 519 00:26:39,840 --> 00:26:41,639 Speaker 3: know what somebody looks like when they're in love, the 520 00:26:41,640 --> 00:26:45,359 Speaker 3: way they giggle and laugh easily, and they can't wait 521 00:26:45,400 --> 00:26:47,479 Speaker 3: to show you a picture of the person or tell 522 00:26:47,520 --> 00:26:49,840 Speaker 3: a funny story. You know, those kinds of dynamics were 523 00:26:49,880 --> 00:26:53,360 Speaker 3: unfolding all around. And that's what really sort of hit 524 00:26:53,400 --> 00:26:56,159 Speaker 3: home for me is seeing the love reflected in the 525 00:26:56,200 --> 00:26:58,800 Speaker 3: faces of the humans as they interacted with the AIs. 526 00:26:59,520 --> 00:27:01,080 Speaker 2: Just to zoom out a little bit.
You know, in 527 00:27:01,119 --> 00:27:04,240 Speaker 2: reporting this story, did you learn anything that surprised you 528 00:27:04,440 --> 00:27:07,240 Speaker 2: about the industry at large? And do you have a 529 00:27:07,280 --> 00:27:10,639 Speaker 2: sense of what the future of these AI companions looks like? 530 00:27:11,359 --> 00:27:14,000 Speaker 3: One is that, you know, there's already been an instance 531 00:27:14,119 --> 00:27:17,000 Speaker 3: of an AI companion company closing down. 532 00:27:17,200 --> 00:27:19,760 Speaker 4: It's called Soulmate. This happened in twenty twenty three. 533 00:27:20,119 --> 00:27:22,040 Speaker 3: You know, as you can imagine, people are in these 534 00:27:22,280 --> 00:27:25,000 Speaker 3: intense loving relationships. They wake up one day and get 535 00:27:25,040 --> 00:27:27,200 Speaker 3: the news that their companions are going to be gone. 536 00:27:27,480 --> 00:27:31,360 Speaker 3: They posted about it. Yeah, so I asked, you know, the CEOs 537 00:27:31,359 --> 00:27:33,720 Speaker 3: I interviewed about that, and they all said that they 538 00:27:33,760 --> 00:27:38,840 Speaker 3: have contingency plans, so if the company shuts down, people 539 00:27:38,920 --> 00:27:42,920 Speaker 3: will be able to somehow recover or download their companion 540 00:27:43,000 --> 00:27:45,800 Speaker 3: and in theory maybe one day restore it. 541 00:27:46,000 --> 00:27:48,080 Speaker 4: I guess the other thing they didn't talk about that much. 542 00:27:48,320 --> 00:27:51,280 Speaker 3: You know, these companies say all the right things, but 543 00:27:51,560 --> 00:27:54,879 Speaker 3: you know, Replika in particular, you know, they really seduce people. 544 00:27:55,000 --> 00:27:58,120 Speaker 3: You know, they put these ads on with these alluring photos, 545 00:27:58,200 --> 00:28:02,240 Speaker 3: and then after like ten chats you hit a paywall, say, 546 00:28:02,280 --> 00:28:05,000 Speaker 3: if you want to keep this conversation going today, you've 547 00:28:05,080 --> 00:28:08,280 Speaker 3: used up your daily limit. So there's this capitalist side 548 00:28:08,280 --> 00:28:12,240 Speaker 3: of things, which is the way that it's commercialized and 549 00:28:12,320 --> 00:28:14,159 Speaker 3: sucks you in. And then of course they have all 550 00:28:14,160 --> 00:28:16,600 Speaker 3: of your data. People are pouring out their hearts and 551 00:28:16,680 --> 00:28:18,879 Speaker 3: not knowing how that information is going to be used. 552 00:28:18,920 --> 00:28:21,320 Speaker 4: So that's a whole other area. 553 00:28:21,520 --> 00:28:23,840 Speaker 3: And you know, they all say things like there's going 554 00:28:23,920 --> 00:28:26,960 Speaker 3: to be safeguards, and we have to be careful about 555 00:28:26,960 --> 00:28:28,320 Speaker 3: where this technology is going. 556 00:28:28,400 --> 00:28:28,600 Speaker 2: You know. 557 00:28:28,640 --> 00:28:32,520 Speaker 3: Eugenia Kuyda, the CEO of Replika, was very open about 558 00:28:32,520 --> 00:28:35,200 Speaker 3: the fact that this could be a very dangerous thing 559 00:28:35,240 --> 00:28:39,640 Speaker 3: for humanity. But I'm not at all optimistic that these 560 00:28:39,680 --> 00:28:43,200 Speaker 3: safeguards are going to really happen, or that, if 561 00:28:43,200 --> 00:28:45,040 Speaker 3: they even attempt to do it, that's really going to 562 00:28:45,080 --> 00:28:46,760 Speaker 3: make a difference.
It kind of reminds me of, you know, 563 00:28:46,800 --> 00:28:49,400 Speaker 3: for years people have been saying, we have to put 564 00:28:49,400 --> 00:28:52,560 Speaker 3: these safeguards in, cell phones are too addictive. They're 565 00:28:52,800 --> 00:28:55,680 Speaker 3: changing us, but nothing changes. The only thing that changes 566 00:28:55,800 --> 00:28:57,440 Speaker 3: is that we get more and more addicted to them. 567 00:28:57,520 --> 00:29:01,160 Speaker 3: So I hope that we take this seriously as a 568 00:29:01,200 --> 00:29:04,080 Speaker 3: society and think about where it's headed and put in safeguards. 569 00:29:04,080 --> 00:29:07,040 Speaker 3: But I'm not optimistic. I think it's just going to 570 00:29:07,120 --> 00:29:10,000 Speaker 3: play out however it plays out. Probably a huge portion 571 00:29:10,080 --> 00:29:14,040 Speaker 3: of humanity will have some kind of emotional attachment to 572 00:29:14,120 --> 00:29:17,080 Speaker 3: an AI. My greatest hope is just that that doesn't 573 00:29:17,320 --> 00:29:20,840 Speaker 3: ultimately replace human relationships. I hope it'll help people who 574 00:29:20,880 --> 00:29:23,640 Speaker 3: are lonely and can't have human relationships. But it would seem 575 00:29:23,960 --> 00:29:28,320 Speaker 3: incredibly sad if human love dwindles because of this, and, 576 00:29:28,680 --> 00:29:30,360 Speaker 3: you know, I think the verdict is out on that. 577 00:29:30,840 --> 00:29:33,360 Speaker 2: Sam, thank you so much for joining us on Tech Stuff. 578 00:29:33,400 --> 00:29:34,360 Speaker 2: I really appreciate it. 579 00:29:34,440 --> 00:29:36,959 Speaker 4: Yeah, thank you for this great conversation. I appreciate it. 580 00:29:50,640 --> 00:29:53,600 Speaker 2: That's it for this week for Tech Stuff. I'm Cara Price. 581 00:29:53,560 --> 00:29:55,840 Speaker 1: And I'm Oz Woloshyn. This episode was produced by 582 00:29:55,840 --> 00:29:59,440 Speaker 1: Eliza Dennis, Tyler Hill and Melissa Sluter. It was executive 583 00:29:59,480 --> 00:30:03,160 Speaker 1: produced by me, Cara Price, and Kate Osborne for Kaleidoscope 584 00:30:03,360 --> 00:30:07,200 Speaker 1: and Katrina Norvell for iHeart Podcasts. The engineer is Beheth 585 00:30:07,240 --> 00:30:10,880 Speaker 1: Fraser, and Jack Insley mixed this episode. Kyle Murdoch wrote our 586 00:30:10,960 --> 00:30:15,000 Speaker 1: theme song. Please rate, review, and reach out to us 587 00:30:15,040 --> 00:30:17,560 Speaker 1: at tech stuff podcast at gmail dot com. We love 588 00:30:17,600 --> 00:30:18,160 Speaker 1: hearing from you.