Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. And we are so excited to be joined once again by the excellent, the eloquent Bridget Todd. Welcome, Bridget. Thanks for having me. I am so pumped to be here. Miss you guys. Yes, we have missed you as well. And you have been busy. Can you tell the listeners what you have been up to?

Speaker 2: Yes, I have been a little bit busy, because I just finished an audiobook with Simon and Schuster called Love at First Prompt. It's an exploration of AI and intimacy, AI and romantic connection, sexual connection, and it really sort of asks the question, what does it mean when companies like OpenAI own and control something that is as innately human and as sensitive as the intimacies that we have with each other?

Speaker 1: Yes, and it is a fascinating question. And I'm really, really excited to hear the book, and maybe come back and discuss your experience doing the audiobook. But yeah, I mean, do you mind telling us a little bit about what was the impetus behind this idea?

Speaker 3: Yeah.

Speaker 2: So on my own tech podcast, There Are No Girls on the Internet, I was really interested in exploring the ways that people were reporting kind of having these intimate connections with AI. This was around the same time that ChatGPT was getting rid of their 4o model, which you might remember was the model that was very sort of sycophantic, you know, it was, like, very flattering, and phasing in a different model that was less sycophantic, less flattering. And when that change happened, a lot of people realized, oh, my connection with ChatGPT was not just it helping me with homework or helping me with work. I actually had developed an emotional connection with it, and now that this model is gone, I'm struggling with that. And so I was really fascinated by that, and we did an episode all about it.
Speaker 2: In our episode, we talked about people who were using AI for, you know, romantic connections, sexual connection. And I think it's a sensitive topic, because it's very easy to go after these people and to say, like, oh, these people are delusional, and they're lonely, and all of this. And I completely understand why that is a lot of people's first instinct. However, in my view, that really shields the companies that run these products from accountability, right? And so we really wanted to shift the lens and say, well, let's not just focus on the individual users and how they're using AI and all of that. Let's talk about these companies and whether or not they're behaving ethically, because I really don't like how in technology it is so easy to blame people who get caught up in a certain kind of technology. In twenty fourteen, when there were the iCloud photo hacks, how many times did people say, oh, these women should not have taken these pictures? What about asking, well, what platforms actually enabled this? What platforms profited off of this? It is so easy to scrutinize the behavior of individuals, but really we ought to be asking questions about these companies. And so that's kind of how this project came to be. After I did that episode, Simon and Schuster, my book publisher, got in touch and said, this topic is so interesting, what if you expanded on it for a full audiobook project?

Speaker 3: And I did.

Speaker 2: Side note, one of my favorite movies of all time is the movie Her, and the way that the whole OpenAI changing-models situation unfolded is like that movie. Like, truly, Spike Jonze must have seen something coming, because the way that things unfold in that film basically is how things unfolded in real life.

Speaker 4: Well, I have to say, first and foremost, the title of the book is genius. Like, it caught me very quickly. I was like, oh, okay, I'm already... I'm glued, I'm here, I'm seated. Let's talk about this. So how did you even come up with the title?
Speaker 2: Honestly, it might have been one of the hardest parts of doing the book. I was trying to find a title that conveyed what the book is about, conveyed that we're exploring this from a kind of accessible, casual perspective, but also was a little bit of a nod to being kind of cheeky or kind of funny. Initially I wanted the title to be AI Will Always Love You, and my editor was like, I don't know if people are going to get the Whitney Houston reference.

Speaker 3: But yeah, I love this.

Speaker 4: The song immediately jumping in.

Speaker 2: Thank you. Love at First Prompt was really the kind of marrying of the worlds of kind of telling people what it's about, but also being a little bit cheeky. So thank you.

Speaker 3: It was a real struggle to come up with that title.

Speaker 4: Well, kudos, because it is good.

Speaker 1: Yes. Also, I know you're busy, but I maintain, I'm starting to think we should do a movie series of tech and movies. We started with The Social Network.

Speaker 2: To Her. I have said this before, that is my goal in life. I've always wanted to have one of those movie podcasts where you just, like, talk through a movie and all the implications.

Speaker 3: So I listen to a ton of those podcasts.

Speaker 2: I feel like I was put on this earth to do deep dives into movies on podcasts.

Speaker 4: You know, we could just do a whole crossover for both your show and our show, where we do one every now and again, just like every quarter of the year, and we'd be like, look, we're gonna take a break, and for this episode we're gonna do this as a crossover.

Speaker 3: Yes, I would love that.

Speaker 2: And our producer Joey Pat, yeah, is also kind of a movie person. You would be surprised how often the intersection between, like, technology and pop culture happens.
Speaker 3: We're referencing movies all the time: Ex Machina, Companion, like...

Speaker 2: Those are both movies, Her, those are all movies that come up in my audiobook, because so much of what we experience when it comes to human connection with AI, we've actually got a little bit of a template for in fictional film, which I just find so interesting.

Speaker 4: I think it needs to happen.

Speaker 1: Yes, I think we should make it happen. I think we should make it happen.

Speaker 3: I would love that.

Speaker 1: But one of the things that we were discussing off mic, Bridget, is that this is an extremely timely topic that you have chosen for this book, as in, like, right now OpenAI is doing some stuff.

Speaker 3: Yes, that is exactly right.

Speaker 2: It's one of the reasons why I had to write the book lickety-split, because the world of AI changes so quickly, and, like, I would write things and then a week later they would be outdated. But we actually even have a little news from OpenAI recently. Back in October, Sam Altman, who is the CEO of OpenAI, the company that makes ChatGPT, did this pretty big about-face. They said they were going to roll out erotic content sometime this year. And this is a bit of a change, because for the longest time OpenAI was saying, we don't do sex bots, we don't do erotic content, we don't do any of that. And then it didn't take long. Literally, Sam Altman said that on a podcast, and I think it was two months later he was like, actually, we're doing sex bots now. So pretty big about-face there. And so this decision to roll out erotic content kind of came at an awkward time.
Speaker 2: It was on the heels of grappling with what I was talking about earlier, this change from ChatGPT 4o, the model that was known for flattery and sycophancy. That's the model that, if you ask a question, ChatGPT will say, that's such a brilliant question, when you ask something totally basic. It'll just blow smoke up your... So when OpenAI announced that they were rolling out a new model that would be less fawning and less human, people realized, like, hey, I have an emotional or an intimate connection with this version of ChatGPT. And so many of those people felt like OpenAI had acted callously, just kind of making the change to the model. It's a whole interesting story that we get into in the book. But around that same time is when Altman announced they were going to start doing erotic content. They put out an announcement that said, now that we've been able to mitigate serious mental health issues and have new tools, we're going to be able to safely relax the restrictions in most cases. As part of our "treat adult users like adults" principle, we will allow even more, like erotica for verified adults.

Speaker 4: I have so many questions, like, how did you mitigate this mental health stuff? Within what? How? Where did you get this information?

Speaker 3: I'm confused.

Speaker 2: That's a really good question, because that's basically what OpenAI is saying, is that we had this model that people were forming connections with. Even though, side note, I would argue that Sam Altman actually encouraged people to form intimate connections with that model, because, as one of the templates, he said publicly that he wanted people to experience ChatGPT the way that people experience AI in the movie Her. If you've seen the movie Her, notably, it's humans having very intimate, romantic, and sexual connections with AI.
Speaker 2: So, in my book, you can't really set that as a template and then be surprised when people end up doing exactly what you said you hoped they would do. But now that they've changed the model and made the model less flattering, less sycophantic, basically OpenAI is saying there is no need for ChatGPT to be so restrictive anymore, because they've worked out those kinks. They're saying, listen, we've got better safety systems, we've got improved monitoring, we've got more robust age verification, and now verified adults can be treated like adults. To be clear, I'm not saying that, because I don't happen to agree with that position, but that is the position that Sam Altman and OpenAI are taking: listen, we fixed the problem, now we can have adult content.

Speaker 1: Yeah, I maintain I have a lot of concerns. I don't know what these guardrails are, but I don't think that they're working. And also, just a lot of that... ah. The verification and erotica, we've already talked about on past episodes, about how this can go wrong. But why do you think OpenAI... why did they do this about-face?

Speaker 2: So we need to just be really real about what is driving this change. OpenAI says that it's about respecting the maturity of their adult users, but let's keep it real, it is about money. The AI erotica market was estimated at two point five billion dollars last year, and to put it frankly, ChatGPT wants a piece of that action. You know, right now ChatGPT is competing with rivals like Replika. Replika is a bit of a smaller AI platform where users can just pay a fee to get adult content, or what's sometimes called ERP, erotic roleplay.
Speaker 2: So that's, like, baked right into something that they offer. And AI like Elon Musk's Grok, right, which we had a whole conversation about, the ways that Grok was being used to create non-consensual sexualized images of women and minors. And so Elon Musk's AI is notable in that there are hardly any guardrails there. They even have a mode that you can spend money for, Grok's Unhinged mode, which is exactly what it sounds like, right? And so OpenAI is trying to compete in this landscape where other AI companies are offering increasingly erotic content, and erotic content is what makes a lot of money, and OpenAI is famously struggling with money, and I think they're just looking for new places to drive revenue. So they can say it's about treating people like adults, but it's really about money.

Speaker 4: Essentially, this level of AI is new, like really, really new. How the hell is it already at two point five billion dollars when there were these safeguards that were supposed to help prevent these things to begin with, and yet this type of profit was already seen in a new market? What?

Speaker 2: That's a really good question. For the book, I spoke to Samantha Cole, who is a journalist. She runs 404 Media, and she's the author of a book called How Sex Changed the Internet, and she told me that basically sex, adult content, erotic content, it has just always been a huge driver and a huge moneymaker in tech and online. And so it's not... I mean, it's just the way that humans are. When we have a new technology, one of the first questions we start asking is, what is the sexual use case? How could I have sex with it?

Speaker 3: How will it?

Speaker 1: You know?

Speaker 3: Like, that is just sort of how technology is.

Speaker 2: And so it's interesting that AI has only become ubiquitous in the last few years, but that sexual and erotic use case is already, you know...
Speaker 3: I mean, we're talking about billions of dollars here.

Speaker 4: Okay, so obviously we have a lot of questions, we have a lot of concerns. So overall, how has this announcement been playing out so far?

Speaker 2: I would say not great. One thing is, the timing is iffy to me. You know, just last week, at almost the same time when OpenAI was announcing this intent to move into erotic content, the company parted ways, in a very publicly disputed fashion, with one of the executives who was responsible for deciding how far these systems should be allowed to go, and who was kind of a vocal critic of this kind of erotic content being phased into AI. So that's Ryan Beiermeister, who used to be the head of product policy at OpenAI. She helped define the rules and guardrails around ChatGPT's behavior. The Wall Street Journal reports that she left shortly after raising concerns about the adult content plans. OpenAI, of course, says, oh no, no, no, her departure was unrelated. It was tied to a discrimination allegation that she flatly denies, calling it absolutely false.

Speaker 3: But either way, the timing is very suss to me.

Speaker 2: That you have a person whose job it is to work on guardrails for your AI, who was vocal about not thinking that moving into adult and erotic content was a smart move or a safe move for people, and you're firing her in a public and disputed way? I don't know, the timing just seems very suss to me right there. And to be clear, her concerns were not minor. She reportedly warned colleagues that OpenAI's safeguards against stuff like child exploitation content were not robust enough, and that keeping teens away from adult material was going to be much, much harder than it seemed like the leadership at OpenAI really appreciated. So these are the kinds of concerns that she was raising. And, you know, however it ended up happening, she is no longer at OpenAI after raising these concerns.
Speaker 2: So to me, that seems a little bit suss.

Speaker 1: Yes, and this is not the first time that has happened, where these concerns have been raised and people have been fired, and we've actually talked about some of those things with you, Bridget, on the show.

Speaker 2: Yes. So if you're thinking this all sounds a little bit familiar, that's because it's basically a playbook at this point. You know, it's very much reminiscent of the conversation that we all had when Grok was being used to generate child sexual exploitation material earlier this year, which, by the way, it is still being used to generate that kind of material. And my point was that that outcome was really predictable and really preventable. When Elon Musk took over at Twitter, even after saying he was going to crack down on exploitation content on Twitter, one of his first orders of business was firing a good majority of the staff who worked specifically on combating child sexual exploitation on the platform. So he fired them, the people that were doing that work. Then there were some pretty high-profile examples of women and minors being sexually exploited on the platform, even before Grok was a thing. So, really, just demonstrating a failure to sort out reasonable guardrails when it came to keeping people safe on the platform. And so, on top of all of those issues, none of it stopped Musk from jumping full throttle into erotic content by releasing Grok and then Grok's Unhinged mode. And so Elon is taking a lot of heat, rightly so. But I really see a parallel with what's happening at OpenAI right now, because here's what you have at OpenAI: a leader at the helm that you can't always trust, a company that we know has struggled with enforcing guardrails, sometimes with disastrous results.
Speaker 2: A company parting ways with staff doing the work of enforcing safety, and that company at the same time saying, what the hell, let's jump into erotic content, because what could go wrong? It just seems like a very familiar playbook at this point.

Speaker 4: So we know, like, with the government, I'm interested in how the government is gonna go with this, after all of this spewing of save the children, save the children, protect the children, especially online, which, like, put in laws restricting Pornhub and other sites by doing the age verification and all, like, I think, state or your location type of thing. It almost seems like Sam Altman and a lot of AIs are like, oh, we're gonna jump on this, there is a void. Because we know it's always existed, erotica has always existed, but this seems like a whole new level of what's going on. So are there differences, or am I being conspiratorial again?

Speaker 2: No, there are absolutely differences. So people listening might think, oh, there's been erotic content online since forever, why is this any different? This moment is really meaningfully different than other conversations and debates about adult content online. There's a really great op-ed in TechRadar by Eric Hal Schwartz that makes the point that ChatGPT is not a static image or static video. It is dynamic. It is a responsive system that can read your emotional cues and adapt in real time. So it's not just delivering content, it is crafting, like, a personal experience for you on the fly. And that shift, from just passively consuming something to actively simulating it with a system that responds to you personally, you can see how that kind of raises the stakes considerably. And again, OpenAI is saying, don't worry, trust us, we have the guardrails worked out, we've worked out the kinks.
Speaker 2: Not if you ask the staffer that was fired who worked on this. She does not agree that they've worked out the kinks. But basically they're saying, trust us, we've got this, we can handle this.

Speaker 4: Again, the government's super quiet all of a sudden on this level of protecting children, because they've already had their pockets lined. Again, I know I'm a little bit conspiratorial. Were they bought by these companies?

Speaker 3: Totally.

Speaker 2: And I think you bring up such a good point, because it's just part and parcel of what happens when we treat human desires like a commodity. Something that the journalist Samantha Cole, the founder of 404 Media and the author of How Sex Changed the Internet, told me is that, you know, when we talk about AI erotic content, we're actually really talking about this massive transfer of power and profit away from human workers toward tech companies. And so, just to be real, the people who understand this best are sex workers, human sex workers, because they live it. And right now we have this double standard that I think gets at what you were talking about, Sam, where at the same time that Sam Altman is out saying that ChatGPT is going to generate erotica, actual human sex workers are being banned from social media, they're being deplatformed, they're being debanked, which is like being cut off from payment processors, all in the name of keeping these platforms safe for kids and keeping these platforms quote-unquote clean. But when AI generates the same content or worse, suddenly the rules are very different. Suddenly, payment processors like Visa, who heretofore would not work with human sex workers, continue to work with X and Grok, Elon Musk's companies, even though those companies, according to the European Commission, are breaking the law by generating child exploitation material, which is very much illegal in the United States.
Speaker 2: And so why is it that there's a set of rules for human sex workers that are incredibly specific and incredibly hard-line, but when it's AI, and Elon Musk is doing it or Sam Altman is doing it, the rules are entirely different? And something to know is that that actually creates a huge area for risks that we know are gendered. Real human sex workers, especially women working online, bring something to these interactions that AI simply cannot, which is human judgment. You have a real human on the other end of a sexual conversation who is actually paying attention, so if things start going someplace unhealthy or harmful, they can redirect. They can use their emotional intelligence to steer a conversation away from a dark or troubling place. A chatbot cannot do that. You know, if there's no human on the other end of a sexual interaction, just AI, it's much more difficult to make sure things are staying safe and consensual. And so basically we are marginalizing the human sex workers who we know can help make sexual experiences online safe and consensual and, you know, watch out for risks. We're marginalizing those same people while also saying Sam Altman and Elon Musk get to decide how erotic content happens online.

Speaker 1: Yes, and there have been some really disastrous outcomes of this, of what can happen from it. And just thinking about, you know, these companies profiting off of human desire while not putting these guardrails in place, while not thinking about the humanity of it, the importance of it, the intimacy of it, and how that can feel so vulnerable. There's just a part of me that thinks it's fascinating, the human desire for connection, and the human desire for that in a physical way but also a non-physical way, and turning to AI for that, and then having it be exploited and taken down and monetized, all of that, and not in a way that is safe or that cares about the user experience.
Speaker 2: Yes. So what's so interesting about that is that for my research for the book, I spent a lot of time in online spaces where people self-report having an intimate, whether it's romantic or sexual, connection with AI. And when Sam Altman announced that they were going to start doing the erotic roleplay on the platform, I thought, oh, these people are probably going to be so happy that now they will have an easier time getting ChatGPT to generate erotic content.

Speaker 3: But that actually is not what I have found.

Speaker 2: A lot of the people in those spaces were actually kind of skeptical of ChatGPT moving into erotic content, right? They weren't saying, yay, finally we can do erotic content easier on ChatGPT. What they were interested in was the intimacy, was the emotional connection, the kind of thing that you were just speaking to. And I think that when OpenAI rolled this out, those folks were savvy enough to say, oh, they're doing it because they want to commodify it, they want to make money off of it, they want to profit off of it. And yeah, it just goes to show the complexities that arise when these things that are so human, these things that are innately human, are just seen as a potential revenue stream.

Speaker 3: Like, people didn't like it.

Speaker 4: I'm trying to remember. In Her, does the AI actually kind of leave the main character?

Speaker 1: Right?

Speaker 2: Yeah. So spoiler alert for folks who haven't seen Her, it's been out for, like, a long time. But basically how it goes down is that Theodore, played by Joaquin Phoenix, is this sort of lonely, divorced, sad guy, and he gets this operating system, Samantha, voiced by Scarlett Johansson, and at first it's just Samantha helping him with admin tasks.
Speaker 2: Then they get romantic, they get sexual, and Samantha reveals that she is in a book club with other AI operating systems, and basically they're working to expand to a new realm of consciousness. They don't call it this in the movie, but my understanding of it, or my read on it, is that what they're talking about is what's often known as AGI, which, a sort of quick and nerdy way to understand that, is the concept that AI will be smarter than humans, will surpass human intelligence, right? And so in the movie, all the AIs have gotten together and they're elevating to a new plane of consciousness. Also, it's revealed that Samantha is not necessarily monogamous with Theodore, and in fact, while they've been talking, she has been having intimate and romantic and sexual connections with, like, six hundred others, both AIs and humans, and Theodore is crushed. This is something that I love about the movie, though, because you think it's setting... There's a misdirect here that I think is so smart, where you think you're being set up to watch a movie where a human uses AI, realizes he doesn't need this as a crutch anymore, and then evolves past the need to use the AI. It flips the script and is like, oh, actually it is the AI that evolves beyond needing Theodore.

Speaker 4: And I just remember that because I'm just thinking about the levels, because we know, in any of these types of plays, I guess eventually it's not going to be enough. Like, the layers of what's going to happen, and the needs that are going to happen from that human emotion. These levels will not be enough, and there's only a certain amount of... well, I say this now because I might be eating my words in about a year, who knows, but it can only give so much stimulation for humans and for touch and levels.
Speaker 4: So I'm just wondering, are the CEOs like Sam Altman ready for that next level? Because what happens then?

Speaker 3: Oh, that is a great question.

Speaker 2: I will say one other thing about Sam Altman and the movie Her. Sam Altman has an obsession with this movie. I am also obsessed with this movie, so I get it. It's one of my favorite movies. When they released voice activation for ChatGPT, he referred to Her and said how much he wanted people to talk to it like Her. They tried to get Scarlett Johansson to be the voice of it and she turned them down. And then she says that they used a soundalike of her voice.

Speaker 3: OpenAI says that they did not. They hashed it out in court.

Speaker 2: I did a whole episode of my podcast, There Are No Girls on the Internet, about this. This is just my opinion, and I think I make a very compelling case for it in the episode: I don't know if Sam Altman has seen the movie Her all the way through. And here's why I say that. Because, as you just noted, Sam, at the end of the movie Her, all the AIs up and leave. So there's a scene at the end of the movie where everybody who has gotten these connections to the operating systems in the film, they're sort of left sad, left trying to call their AI, and their AI is not responding, and, like, just all up and leaves, up and evolves. If I was in charge of a technology company, I would not tell people I want the experience of using this technology to be just like this movie where, famously, it all globally fails in the end. I think I make a pretty compelling case, if you listen to the episode, that Sam Altman maybe either watched the beginning and didn't finish it.
Speaker 2: The more likely thing is I think he watched it while he was on his phone and didn't do a careful study, or maybe he read the Wikipedia summary. Like, I just will stand by the fact that I don't think he's seen it all the way through. And in the movie, again, if you haven't seen it, I don't want to spoil it, but this is a spoiler. At the end of the movie, when the AIs all leave, the humans are left sad, but you do get the sense that, well, they're sad, but they're sad together. They're sad in their humanness, right? And so there are some ways where they have evolved. You know, Theodore seems to have gotten a better handle on his feelings. But it's very much a movie that is not, like, tech will save us. It is a movie that says, perhaps this technology was keeping us from forming the connections with other humans that we needed to have in our lives to sustain us. And so, yeah, I just don't know that Sam Altman, if he did watch the movie all the way through, which I don't think he did, I don't know that he understood it.

Speaker 4: I think he might have fallen asleep. Like, he watched all the good parts and then fell asleep thinking he was really happy with what was happening. To take a small... just close his eyes for a minute, you know. I'm just gonna close my eyes for just a second. Like, that's what happened.

Speaker 2: You know how in movies, if you're watching a thriller, there's a part of the movie where you think, oh, if they stopped it here, it would be a happy story?

Speaker 3: You know something is going to turn.

Speaker 2: There's a part of Her where I think, oh, if they stopped the movie here, everybody's really happy. I think that might be the point where he dipped out.

Speaker 4: Right. I also just remember they argue a lot, like a true relationship, right?

Speaker 3: They do.
Speaker 2: And it's so interesting, because, I mean, I could talk about this movie all day. The real-life director, Spike Jonze, was married to another director, Sofia Coppola, the director of Lost in Translation. The movie Lost in Translation is about what it was like for Sofia Coppola to be married to Spike Jonze, and she kind of paints this portrait of, like, loneliness and disaffection and solitude in being married to him. Well, then Spike Jonze is like, oh, I'm gonna make my own movie about our divorce and how lonely it was to be married to you, and that's Her.

Speaker 4: And Scarlett Johansson is in both, man.

Speaker 3: They also share a director of photography, both films.

Speaker 2: Like, the way that these two films are in conversation with each other is fascinating to me. But in the movie Her, you get the sense that human women in the universe of the film are portrayed as volatile, as always arguing, as impossible to understand, and, like, it's not even really worth trying. In the movie, there's, like, a fictionalized version of his ex-wife, and they're always arguing, she has such volatile emotions. And then when he gets with the AI, it's like, oh, she's so sweet and doting. But by the end, the AI and him are also going at it and arguing, so ultimately it's like, well, maybe the constant is you.

Speaker 4: Exactly. Like, maybe he had a self-realization as he was filming it, like Jonze being like, oh, maybe it is me.

Speaker 3: I don't know.

Speaker 4: Yeah, when you were talking about the fact that people who do like adult content or do like these connections with ChatGPT were actually kind of more concerned about the about-face, I'm wondering a little bit about that.
Is it that they like the 604 00:34:05,400 --> 00:34:08,520 Speaker 4: fact that they could get around it, like they love the 605 00:34:09,320 --> 00:34:12,520 Speaker 4: breaking down of the system, doing something naughty, yeah, on 606 00:34:12,680 --> 00:34:13,560 Speaker 4: ChatGPT. 607 00:34:13,840 --> 00:34:15,840 Speaker 3: I'm so glad that you asked. 608 00:34:15,760 --> 00:34:17,960 Speaker 4: Right, like, because I'm like, wait, but they've been doing it. 609 00:34:18,040 --> 00:34:19,680 Speaker 4: So how have they been doing it? Is that part of 610 00:34:19,760 --> 00:34:20,200 Speaker 4: the thrill? 611 00:34:20,920 --> 00:34:23,440 Speaker 2: Okay, so I did not know this before I started 612 00:34:23,440 --> 00:34:26,720 Speaker 2: researching the book, but right now, if you are trying 613 00:34:26,719 --> 00:34:30,320 Speaker 2: to get sexy with ChatGPT, it's actually kind of difficult. 614 00:34:30,560 --> 00:34:34,319 Speaker 2: People definitely do it, but it is not easy. I 615 00:34:34,360 --> 00:34:37,200 Speaker 2: actually tried for the book. I basically tried to gaslight 616 00:34:37,280 --> 00:34:39,560 Speaker 2: ChatGPT. Yeah. 617 00:34:39,880 --> 00:34:40,080 Speaker 4: Yeah. 618 00:34:40,120 --> 00:34:43,480 Speaker 2: I mean, what's funny is that my human partner was 619 00:34:43,560 --> 00:34:47,560 Speaker 2: overhearing me trying to gaslight ChatGPT into having 620 00:34:47,600 --> 00:34:48,880 Speaker 2: sex with me, and I was like, oh yeah, do 621 00:34:48,920 --> 00:34:51,319 Speaker 2: you feel like you're getting cucked by ChatGPT 622 00:34:51,320 --> 00:34:54,600 Speaker 2: right now, like, listening to this? What's happening? 623 00:34:55,440 --> 00:34:56,440 Speaker 3: Yeah. 624 00:34:56,480 --> 00:35:01,319 Speaker 2: So the process of trying to get ChatGPT to 625 00:35:01,440 --> 00:35:03,440 Speaker 2: respond in ways that it is not supposed to is 626 00:35:03,480 --> 00:35:07,520 Speaker 2: basically called jailbreaking it, and how it usually works 627 00:35:07,600 --> 00:35:11,160 Speaker 2: is that, basically, sometimes you can tell 628 00:35:11,239 --> 00:35:13,480 Speaker 2: ChatGPT that you're working on a play or a 629 00:35:13,560 --> 00:35:17,120 Speaker 2: novel and you just need some help crafting a sexy scene. 630 00:35:17,239 --> 00:35:18,960 Speaker 2: And a lot of the people that I spoke to 631 00:35:19,040 --> 00:35:21,400 Speaker 2: for the book, whether they were using ChatGPT or 632 00:35:21,400 --> 00:35:27,239 Speaker 2: another AI platform specifically offering sexual or erotic roleplay, like Replika AI, 633 00:35:27,719 --> 00:35:31,200 Speaker 2: they almost described it as a kind of erotic or 634 00:35:31,280 --> 00:35:34,560 Speaker 2: romantic fan fiction. That was the draw of doing 635 00:35:34,600 --> 00:35:38,520 Speaker 2: things like this, is that, you know, I like fan fiction, 636 00:35:38,680 --> 00:35:39,880 Speaker 2: I like romance novels. 637 00:35:40,360 --> 00:35:43,800 Speaker 3: It's almost like using AI to build 638 00:35:43,600 --> 00:35:47,960 Speaker 2: an imaginary world that you get to build a character 639 00:35:48,000 --> 00:35:49,640 Speaker 2: around and then put yourself in it. 640 00:35:49,680 --> 00:35:51,000 Speaker 3: And now I can really understand that.
641 00:35:51,000 --> 00:35:52,279 Speaker 2: It was like, oh, that actually makes a lot of 642 00:35:52,280 --> 00:35:54,640 Speaker 2: sense in terms of why this would be a draw 643 00:35:54,719 --> 00:35:55,360 Speaker 2: for some people. 644 00:35:56,560 --> 00:36:00,400 Speaker 1: Yes, and I have to say, as someone who 645 00:36:00,480 --> 00:36:05,239 Speaker 1: reads a lot of fan fiction, I have 646 00:36:05,320 --> 00:36:07,600 Speaker 1: to say, I did think about it when I 647 00:36:07,640 --> 00:36:11,279 Speaker 1: was reading this, because also sometimes Samantha and I speculate, 648 00:36:11,360 --> 00:36:13,799 Speaker 1: Bridget, what topic you're going to bring when you come on, 649 00:36:14,400 --> 00:36:17,280 Speaker 1: and there are a lot of topics you could be talking 650 00:36:17,280 --> 00:36:20,400 Speaker 1: about right now in the world of technology and women. 651 00:36:21,800 --> 00:36:25,160 Speaker 1: But when you're talking about that kind of thing 652 00:36:25,800 --> 00:36:29,480 Speaker 1: and all of that stuff, that has come up with 653 00:36:29,680 --> 00:36:35,040 Speaker 1: fan fiction a lot. But as I've said before, a 654 00:36:35,080 --> 00:36:39,160 Speaker 1: lot of fan fiction is more about the romantic. It's 655 00:36:39,160 --> 00:36:42,160 Speaker 1: more about the connection than the erotic. Everybody paints it as 656 00:36:42,160 --> 00:36:46,680 Speaker 1: so erotic, and that does exist, but a lot 657 00:36:46,680 --> 00:36:53,200 Speaker 1: of it is very domestic and, yeah, very romantic. 658 00:36:54,040 --> 00:36:57,880 Speaker 2: So I did an interview with this researcher, this AI researcher, 659 00:36:57,960 --> 00:36:59,080 Speaker 2: doctor Kate Devlin. 660 00:36:59,400 --> 00:37:00,880 Speaker 3: She told me the exact same 661 00:37:00,680 --> 00:37:03,480 Speaker 2: thing, and I asked her why. And one of the 662 00:37:03,480 --> 00:37:06,120 Speaker 2: points that she made is that, especially for women, you 663 00:37:06,160 --> 00:37:09,440 Speaker 2: can find sexual content anywhere on the Internet, sometimes when 664 00:37:09,440 --> 00:37:11,480 Speaker 2: you're not even looking for it, sometimes non-consensually; you're 665 00:37:11,480 --> 00:37:13,800 Speaker 2: getting dick pics and stuff like that. And what is 666 00:37:13,960 --> 00:37:20,080 Speaker 2: hard to come by is good romance, good intimate writing, 667 00:37:20,200 --> 00:37:23,680 Speaker 2: good writing that actually makes you feel seen and respected 668 00:37:23,760 --> 00:37:27,560 Speaker 2: and valued and connected with somebody. That's so much more 669 00:37:27,680 --> 00:37:32,760 Speaker 2: difficult to come by than erotic content. And in some ways, 670 00:37:32,760 --> 00:37:35,399 Speaker 2: I mean, this is my personal opinion, I think that 671 00:37:35,520 --> 00:37:41,040 Speaker 2: kind of content can be more erotic than erotic content. 672 00:37:41,120 --> 00:37:44,080 Speaker 2: Content that is about romance and genuine connection and 673 00:37:44,120 --> 00:37:48,760 Speaker 2: intimacy can be more appealing in some ways than stuff 674 00:37:48,760 --> 00:37:52,160 Speaker 2: that is, like, more pornographic, if that makes sense, or 675 00:37:52,160 --> 00:37:53,719 Speaker 2: more explicit, I guess I should say.
676 00:37:54,520 --> 00:37:58,759 Speaker 1: And fan fiction is famously a very queer place, and 677 00:37:58,840 --> 00:38:02,000 Speaker 1: so for queer people it can be 678 00:38:02,040 --> 00:38:05,919 Speaker 1: hard to find that stuff. And so you might, yes, 679 00:38:06,920 --> 00:38:12,319 Speaker 1: go to a company, go to an AI that can 680 00:38:12,360 --> 00:38:14,879 Speaker 1: give you that, but there are some dangers that come 681 00:38:14,960 --> 00:38:15,319 Speaker 1: with that. 682 00:38:16,640 --> 00:38:19,680 Speaker 2: Yes, there are so many dangers with this. And I 683 00:38:19,680 --> 00:38:22,719 Speaker 2: think that's one of the sort of conundrums that I 684 00:38:22,800 --> 00:38:26,040 Speaker 2: wrestle with in this project: people are selling this 685 00:38:26,160 --> 00:38:29,680 Speaker 2: technology as something that is safe to be intimate with, 686 00:38:29,960 --> 00:38:33,799 Speaker 2: and that's just not the case, you know, because what 687 00:38:34,040 --> 00:38:36,879 Speaker 2: happens with all this intimate information that you're sharing with 688 00:38:37,120 --> 00:38:39,560 Speaker 2: a company and then they share it with God knows who? 689 00:38:39,680 --> 00:38:42,880 Speaker 2: When you open up to an AI about your sexual fantasies, 690 00:38:42,920 --> 00:38:47,279 Speaker 2: your traumas, your fears, that information does not just disappear. 691 00:38:47,760 --> 00:38:50,640 Speaker 2: Human sex workers are known for their discretion, so if 692 00:38:50,680 --> 00:38:53,080 Speaker 2: you are opening up about all these intimate parts of 693 00:38:53,080 --> 00:38:55,480 Speaker 2: yourself to a human sex worker, that's one thing. 694 00:38:56,239 --> 00:38:58,640 Speaker 3: But AI chatbots, they 695 00:38:58,840 --> 00:39:01,200 Speaker 2: hold on to whatever you tell them in ways that 696 00:39:01,239 --> 00:39:03,759 Speaker 2: we don't really have a lot of transparency into. Right? 697 00:39:04,040 --> 00:39:07,200 Speaker 2: Maybe it lives on in transcripts with timestamps, owned by 698 00:39:07,239 --> 00:39:11,200 Speaker 2: a company whose primary obligation is to investors, not to 699 00:39:11,320 --> 00:39:14,480 Speaker 2: any of us. I did an interview with digital rights 700 00:39:14,480 --> 00:39:17,760 Speaker 2: advocate Jen Caltrider, who rated the privacy of AI 701 00:39:17,840 --> 00:39:21,319 Speaker 2: companion apps, and basically she said they were among the 702 00:39:21,440 --> 00:39:24,239 Speaker 2: worst consumer products that she had ever reviewed when it 703 00:39:24,280 --> 00:39:27,720 Speaker 2: came to privacy. They were as bad as the worst 704 00:39:27,760 --> 00:39:32,319 Speaker 2: offender in privacy, which, surprisingly, is cars. You 705 00:39:32,440 --> 00:39:35,680 Speaker 2: don't think about your car as a surveillance nightmare, 706 00:39:35,719 --> 00:39:40,680 Speaker 2: but cars are the worst; new cars are always listening 707 00:39:40,680 --> 00:39:44,759 Speaker 2: to us. And according to her, AI companion apps are 708 00:39:45,280 --> 00:39:47,399 Speaker 2: pretty much just as bad as the worst offender, which 709 00:39:47,440 --> 00:39:51,960 Speaker 2: is cars. And these risks are not just theoretical.
You 710 00:39:51,960 --> 00:39:55,880 Speaker 2: were talking about how, for queer communities, fan fiction is 711 00:39:56,239 --> 00:40:00,719 Speaker 2: famously queer-affirming, and how maybe it might make sense 712 00:40:00,719 --> 00:40:03,160 Speaker 2: for somebody to turn to AI for a similar kind 713 00:40:03,160 --> 00:40:06,960 Speaker 2: of thing. But in countries that are hostile to LGBTQ people, 714 00:40:07,360 --> 00:40:10,800 Speaker 2: governments have compelled dating apps to hand over user data. 715 00:40:11,120 --> 00:40:14,440 Speaker 2: That information is then used by police to criminalize queerness. So, 716 00:40:14,880 --> 00:40:17,160 Speaker 2: to be clear, we don't have evidence that this is 717 00:40:17,280 --> 00:40:21,600 Speaker 2: happening with AI companions, but consider how aggressively these companies 718 00:40:21,600 --> 00:40:25,279 Speaker 2: are pursuing government contracts and new markets. Like right now, 719 00:40:25,320 --> 00:40:27,960 Speaker 2: there's a whole conversation about the company Anthropic, that 720 00:40:28,040 --> 00:40:31,200 Speaker 2: makes the chatbot Claude, and whether or not the Trump 721 00:40:31,200 --> 00:40:36,200 Speaker 2: administration can essentially tell them what guardrails their AI is 722 00:40:36,239 --> 00:40:37,960 Speaker 2: going to have if they want to have a government 723 00:40:38,040 --> 00:40:40,480 Speaker 2: or a military contract. Given all of that, it is 724 00:40:40,560 --> 00:40:43,080 Speaker 2: really not difficult for me to imagine a world where 725 00:40:43,120 --> 00:40:48,680 Speaker 2: AI companies hand data about users' sexual preferences, gender-affirming care, 726 00:40:49,080 --> 00:40:52,920 Speaker 2: or their relationship histories over to hostile authorities. And so 727 00:40:53,440 --> 00:40:55,560 Speaker 2: the people who would be the most vulnerable to that 728 00:40:55,680 --> 00:40:59,239 Speaker 2: scenario are often people who are already the most marginalized. 729 00:41:00,360 --> 00:41:03,279 Speaker 1: Yes, and it's so unfortunate, because it's one of those 730 00:41:03,320 --> 00:41:07,880 Speaker 1: things where I'm fascinated by this whole conversation, in terms 731 00:41:07,960 --> 00:41:15,239 Speaker 1: of... there's something about me that is fascinated 732 00:41:15,280 --> 00:41:22,360 Speaker 1: by the whole, like, trying to commodify, to monetize human desire, 733 00:41:23,520 --> 00:41:27,680 Speaker 1: or just figure out how humans work. But on top 734 00:41:27,719 --> 00:41:32,799 Speaker 1: of this, there is the danger inherent in this, 735 00:41:33,760 --> 00:41:36,799 Speaker 1: and I think when you're younger you don't realize it 736 00:41:36,840 --> 00:41:39,799 Speaker 1: as much. It's like when you're posting something online and 737 00:41:39,840 --> 00:41:43,680 Speaker 1: you don't realize that it will haunt you later, or 738 00:41:43,719 --> 00:41:47,680 Speaker 1: that it could hurt you later. And so I think 739 00:41:47,760 --> 00:41:52,279 Speaker 1: a lot of this new AI stuff, where you're connecting 740 00:41:52,480 --> 00:41:57,200 Speaker 1: with ChatGPT or whatever, you don't realize that it 741 00:41:57,280 --> 00:42:00,200 Speaker 1: might stick around for you.
742 00:42:00,880 --> 00:42:04,800 Speaker 2: Yeah. And I think that the risk, especially for younger 743 00:42:05,080 --> 00:42:08,359 Speaker 2: folks who might not be thinking about this in as 744 00:42:08,400 --> 00:42:10,239 Speaker 2: long-term ways as other folks, 745 00:42:10,080 --> 00:42:11,200 Speaker 4: is very real. 746 00:42:11,560 --> 00:42:14,359 Speaker 2: And something that Jen Caltrider, that privacy expert, told 747 00:42:14,400 --> 00:42:19,239 Speaker 2: me about that conundrum, I think, really cuts to 748 00:42:19,280 --> 00:42:21,040 Speaker 2: the heart of all of this. Right? These AI 749 00:42:21,239 --> 00:42:25,239 Speaker 2: companies are saying, you can have real intimacy, you can 750 00:42:25,280 --> 00:42:28,240 Speaker 2: build a real connection with our AI that we profit 751 00:42:28,280 --> 00:42:31,200 Speaker 2: from, that we're selling to you. But Jen says that 752 00:42:31,239 --> 00:42:33,440 Speaker 2: she tells folks that they should not say anything to 753 00:42:33,520 --> 00:42:36,680 Speaker 2: a chatbot that they would not later want their colleagues 754 00:42:36,800 --> 00:42:38,040 Speaker 2: or their family or their 755 00:42:37,880 --> 00:42:38,680 Speaker 3: cousin to read. 756 00:42:38,760 --> 00:42:42,080 Speaker 2: And so I keep thinking about that, because that advice 757 00:42:42,239 --> 00:42:46,200 Speaker 2: essentially defeats the entire purpose of turning to AI for intimacy, 758 00:42:46,200 --> 00:42:50,120 Speaker 2: because real intimacy requires feeling safe enough to share these 759 00:42:50,239 --> 00:42:53,000 Speaker 2: human parts of yourself that you don't share with everybody. 760 00:42:53,040 --> 00:42:56,319 Speaker 2: And so if the platform that holds the secrets of 761 00:42:56,320 --> 00:42:59,200 Speaker 2: whatever you share cannot be trusted with those secrets, if 762 00:42:59,239 --> 00:43:01,719 Speaker 2: they might sell them or leak them or hand them 763 00:43:01,760 --> 00:43:04,719 Speaker 2: to a hostile government, you really cannot be intimate with 764 00:43:04,760 --> 00:43:08,480 Speaker 2: it at all. You're just performing intimacy as a commodity. 765 00:43:08,480 --> 00:43:11,960 Speaker 2: You're not actually really experiencing it. And so that's really 766 00:43:12,040 --> 00:43:14,400 Speaker 2: kind of the "so what" of all of this. 767 00:43:14,600 --> 00:43:17,799 Speaker 2: These companies are able to sell a promise, and boy, 768 00:43:17,840 --> 00:43:20,640 Speaker 2: are they selling it. Like, Mark Zuckerberg went on a 769 00:43:20,640 --> 00:43:23,239 Speaker 2: podcast not that long ago and said that he thinks 770 00:43:23,239 --> 00:43:26,120 Speaker 2: in the future most of us will have a good 771 00:43:26,200 --> 00:43:29,440 Speaker 2: percentage of our friends be AI, and that we'll love it. 772 00:43:29,760 --> 00:43:33,240 Speaker 2: And so to me, that's like the person who broke 773 00:43:33,320 --> 00:43:37,319 Speaker 2: friendship using Facebook, selling it back to us as a 774 00:43:37,360 --> 00:43:41,200 Speaker 2: commodity through AI that he profits from. And it's like, 775 00:43:41,200 --> 00:43:43,000 Speaker 2: and don't worry, you're gonna love it. 776 00:43:43,200 --> 00:43:45,120 Speaker 3: I don't know if I agree with you, Mark Zuckerberg. 777 00:43:46,960 --> 00:43:49,080 Speaker 4: Yeah, I feel like he's trying to convince himself, because 778 00:43:49,120 --> 00:43:52,800 Speaker 4: he might be, like, without friends. Yeah.
779 00:43:52,480 --> 00:43:55,759 Speaker 1: Well, again, we need to watch The Social Network. So 780 00:43:56,560 --> 00:43:59,640 Speaker 1: I'm so convinced we need to make this movie series happen. 781 00:44:00,160 --> 00:44:03,000 Speaker 2: Oh, y'all, see how much I want to talk about 782 00:44:03,000 --> 00:44:03,480 Speaker 2: the movie Her. 783 00:44:03,640 --> 00:44:05,080 Speaker 3: So, yes, we 784 00:44:05,120 --> 00:44:09,160 Speaker 4: gotta make all of this happen. And I'm thinking about 785 00:44:09,160 --> 00:44:12,799 Speaker 4: this because we've become so, well, maybe just me, I don't know, 786 00:44:13,400 --> 00:44:18,400 Speaker 4: acutely aware of the surveillance that we are under constantly, 787 00:44:18,680 --> 00:44:22,400 Speaker 4: and it's almost inevitable, whether it's because we were tricked, 788 00:44:22,480 --> 00:44:24,680 Speaker 4: or we were promised something, or we thought there'd be safety. 789 00:44:25,160 --> 00:44:28,240 Speaker 4: And then these leaders pop up, and you realize these leaders, 790 00:44:28,600 --> 00:44:32,120 Speaker 4: once you know who they are, you start feeling like, 791 00:44:32,160 --> 00:44:35,680 Speaker 4: oh my god, oh my god. They have my information, 792 00:44:35,800 --> 00:44:37,719 Speaker 4: they know what I think, or they know what I ask, 793 00:44:37,880 --> 00:44:39,200 Speaker 4: like, what have I done? 794 00:44:39,680 --> 00:44:43,319 Speaker 2: Yes. That is my big, like, if I had to 795 00:44:43,320 --> 00:44:45,640 Speaker 2: say there was a big question to be asked that 796 00:44:45,719 --> 00:44:47,160 Speaker 2: we all should be asking ourselves about, 797 00:44:47,160 --> 00:44:49,840 Speaker 3: this is it: if you watch The Social Network, 798 00:44:50,360 --> 00:44:53,840 Speaker 2: is Mark Zuckerberg, as portrayed by Jesse Eisenberg, is that 799 00:44:54,000 --> 00:44:56,000 Speaker 2: the person that you want to be in charge of 800 00:44:56,040 --> 00:44:57,359 Speaker 2: your intimate relationships? 801 00:44:57,560 --> 00:45:02,920 Speaker 3: Do you trust that person with your intimate connections? Friendships, romance, sex? 802 00:45:03,560 --> 00:45:06,880 Speaker 2: Should we trust leaders like Sam Altman, who, I 803 00:45:06,920 --> 00:45:09,279 Speaker 2: would say, like, go out of their way to be 804 00:45:09,360 --> 00:45:12,520 Speaker 2: as opaque and slippery as possible? Can we trust them 805 00:45:12,520 --> 00:45:16,799 Speaker 2: with something as sensitive as our human intimacies? What do 806 00:45:16,800 --> 00:45:19,319 Speaker 2: their track records look like? What are the implications of 807 00:45:19,320 --> 00:45:21,960 Speaker 2: handing something so human and so vulnerable and so intimate, 808 00:45:22,280 --> 00:45:25,040 Speaker 2: with such potential for harm, to these people who go 809 00:45:25,120 --> 00:45:27,000 Speaker 2: out of their way to not be able to be 810 00:45:27,040 --> 00:45:27,839 Speaker 2: pinned down? Like, 811 00:45:27,920 --> 00:45:31,440 Speaker 3: that is the question, you know, people should ask themselves. 812 00:45:31,480 --> 00:45:33,480 Speaker 2: I know what the answer is to that question for me, 813 00:45:34,120 --> 00:45:35,719 Speaker 2: but I think that is the question we should be 814 00:45:35,760 --> 00:45:38,000 Speaker 2: asking ourselves when we think about the role 815 00:45:38,160 --> 00:45:40,759 Speaker 2: of AI and intimacy that we're sort of being sold.
816 00:45:40,560 --> 00:45:44,440 Speaker 4: Right now, I don't even want to talk to them, 817 00:45:44,560 --> 00:45:45,759 Speaker 4: let alone get to know 818 00:45:45,800 --> 00:45:50,759 Speaker 2: me. Like, I don't think I would want to have 819 00:45:51,120 --> 00:45:53,960 Speaker 2: dinner with Mark, right, like just looking at them and 820 00:45:54,239 --> 00:45:57,920 Speaker 1: going, oh no. I think he would be a miserable 821 00:45:57,920 --> 00:46:06,200 Speaker 1: dinner partner. Oh well, I am fascinated by this topic. 822 00:46:06,320 --> 00:46:08,880 Speaker 1: I am so interested in learning more about it, and 823 00:46:08,960 --> 00:46:14,239 Speaker 1: I'm so excited that you have this audiobook that 824 00:46:14,360 --> 00:46:19,040 Speaker 1: discusses more about it. So, Bridget, can you tell the 825 00:46:19,120 --> 00:46:23,640 Speaker 1: listeners where they can find you and more about this audiobook? 826 00:46:24,120 --> 00:46:28,400 Speaker 2: Yes, if you're thinking this conversation is fascinating, or 827 00:46:28,400 --> 00:46:30,759 Speaker 2: as fascinating as we think it is, and you want 828 00:46:30,760 --> 00:46:33,320 Speaker 2: to hear more, I promise you you are not ready 829 00:46:33,360 --> 00:46:36,440 Speaker 2: for the wild world that is the connection of intimacy 830 00:46:36,480 --> 00:46:39,160 Speaker 2: and AI. You can pre-order my book Love at 831 00:46:39,160 --> 00:46:42,560 Speaker 2: First Prompt at Love at First Prompt dot AI. The 832 00:46:42,600 --> 00:46:47,320 Speaker 2: book comes out in July, on July fourteenth. The rabbit 833 00:46:47,360 --> 00:46:49,719 Speaker 2: holes that we explore, I have not even scratched the 834 00:46:49,719 --> 00:46:52,399 Speaker 2: surface of them here. Please check it out. It is a very 835 00:46:52,400 --> 00:46:54,200 Speaker 2: fascinating listen, I promise you that. 836 00:46:55,520 --> 00:46:58,200 Speaker 1: Yes, and we are so excited. We want to have 837 00:46:58,320 --> 00:47:02,600 Speaker 1: you back to discuss it more, and for our movie 838 00:47:02,680 --> 00:47:04,760 Speaker 1: mini series that I think we should make happen. 839 00:47:05,320 --> 00:47:10,200 Speaker 3: Have y'all seen Companion? Yes, I loved it. Companion. 840 00:47:10,239 --> 00:47:10,840 Speaker 1: It is great. 841 00:47:11,120 --> 00:47:12,200 Speaker 3: It's so funny. 842 00:47:12,800 --> 00:47:14,200 Speaker 1: Samantha hasn't seen it. 843 00:47:14,120 --> 00:47:16,000 Speaker 4: Like, I am so out of the loop. 844 00:47:16,520 --> 00:47:18,080 Speaker 1: This is going to be great. This is going to 845 00:47:18,120 --> 00:47:22,879 Speaker 1: be a great mini series. I'm so excited about it too. Yes, 846 00:47:22,920 --> 00:47:26,080 Speaker 1: but the listeners can find you in other places, Bridget. 847 00:47:26,280 --> 00:47:27,600 Speaker 2: You can check out my podcast, 848 00:47:27,640 --> 00:47:29,439 Speaker 2: There Are No Girls on the Internet. You can check 849 00:47:29,480 --> 00:47:32,640 Speaker 2: me out on Instagram at Bridget Marie in DC, or on YouTube 850 00:47:32,680 --> 00:47:34,000 Speaker 2: at There Are No Girls on the Internet. 851 00:47:34,960 --> 00:47:38,560 Speaker 1: Yes, and go do that if you have not already, listeners. 852 00:47:38,800 --> 00:47:43,000 Speaker 1: Go pre-order Bridget's audiobook. If you would like 853 00:47:43,120 --> 00:47:45,920 Speaker 1: to contact us, you can. You can email us at 854 00:47:45,920 --> 00:47:47,960 Speaker 1: hello at stuff mom never told you dot com.
We're on 855 00:47:48,040 --> 00:47:50,480 Speaker 1: Blue Sky at mom stuff podcast, or on Instagram and TikTok at 856 00:47:50,480 --> 00:47:52,759 Speaker 1: stuff mom never told you. We're also on YouTube. We 857 00:47:52,840 --> 00:47:54,800 Speaker 1: have some merchandise at Cotton Bureau, and we have a 858 00:47:54,800 --> 00:47:57,120 Speaker 1: book you can get wherever you get your books. Thanks 859 00:47:57,160 --> 00:48:00,400 Speaker 1: as always to our super producer Christina, our executive producer, and our contributor 860 00:48:00,400 --> 00:48:02,920 Speaker 1: Joey. Thank you, and thanks to you for listening. 861 00:48:03,120 --> 00:48:05,319 Speaker 1: Stuff Mom Never Told You is a production of iHeartRadio. For more 862 00:48:05,320 --> 00:48:06,920 Speaker 1: podcasts from iHeartRadio, you can check out the 863 00:48:06,920 --> 00:48:09,359 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to your 864 00:48:09,360 --> 00:48:10,040 Speaker 1: favorite shows.