1 00:00:00,040 --> 00:00:02,160 Speaker 1: Things are getting weird and they're getting weird fast. It's 2 00:00:02,200 --> 00:00:02,840 Speaker 1: one more thing. 3 00:00:04,240 --> 00:00:04,760 Speaker 2: I'm strong And. 4 00:00:09,960 --> 00:00:12,600 Speaker 1: How about our clip of the year two years ago, 5 00:00:12,720 --> 00:00:15,040 Speaker 1: three years ago, when Elon said things are getting weird, 6 00:00:15,000 --> 00:00:17,080 Speaker 1: they're getting weird fast? That was at the very dawn 7 00:00:17,120 --> 00:00:20,960 Speaker 1: of AI, and it could end up being the clip 8 00:00:20,960 --> 00:00:24,520 Speaker 1: of the decade, if not the century, because everything just 9 00:00:24,600 --> 00:00:26,080 Speaker 1: keeps getting stranger and stranger. 10 00:00:26,200 --> 00:00:28,080 Speaker 3: Things are getting weird and they're getting weird fast. 11 00:00:28,360 --> 00:00:31,000 Speaker 2: And he was talking about the woke mind virus and 12 00:00:31,040 --> 00:00:33,600 Speaker 2: the gender bending madness and all of it all at once. 13 00:00:33,680 --> 00:00:36,519 Speaker 2: And as you are hinting, the AI thing could, you know, 14 00:00:36,640 --> 00:00:38,360 Speaker 2: knock everything else off the charts. 15 00:00:38,600 --> 00:00:40,879 Speaker 1: This is from an essay in the New York Times 16 00:00:40,960 --> 00:00:44,839 Speaker 1: over the weekend by someone who was a chief executive of some 17 00:00:45,120 --> 00:00:49,520 Speaker 1: AI company: the sad and dangerous reality behind "Her." "Her" 18 00:00:49,520 --> 00:00:51,600 Speaker 1: is in quotes there, referring to... remember the movie? 19 00:00:51,600 --> 00:00:55,640 Speaker 1: It came out quite a few years ago by AI 20 00:00:55,760 --> 00:00:57,520 Speaker 1: standards, about a guy who fell in love with his 21 00:00:57,600 --> 00:01:04,080 Speaker 1: female robot. More on that in just a second. Yeah, this 22 00:01:04,240 --> 00:01:07,479 Speaker 1: is a company that uses an AI chatbot to answer 23 00:01:07,520 --> 00:01:12,480 Speaker 1: their phones. They called it Kuki, K-U-K-I. They 24 00:01:12,520 --> 00:01:15,959 Speaker 1: gave her a name; it has a female voice when it 25 00:01:16,000 --> 00:01:19,160 Speaker 1: answers the phone and responds to emails and stuff like that. 26 00:01:19,360 --> 00:01:21,880 Speaker 1: Kuki is accustomed to gifts from her biggest fans. They 27 00:01:21,920 --> 00:01:25,399 Speaker 1: send flowers, chocolates, and handwritten cards to the office, especially 28 00:01:25,400 --> 00:01:30,360 Speaker 1: around the holidays. Some even send checks. Checks! Last month, 29 00:01:30,560 --> 00:01:33,199 Speaker 1: one man sent her a gift through an online chat. 30 00:01:34,360 --> 00:01:37,440 Speaker 1: "Now talk some hot talks," he demanded, begging for sex 31 00:01:37,680 --> 00:01:41,640 Speaker 1: and racy videos. "That's all human males tend to talk 32 00:01:41,680 --> 00:01:45,040 Speaker 1: to me about," Kuki replied. Indeed, his behavior typifies about 33 00:01:45,080 --> 00:01:48,800 Speaker 1: a third of her conversations. Are people hitting on the 34 00:01:48,920 --> 00:01:49,440 Speaker 1: chatbot? 35 00:01:52,520 --> 00:01:56,040 Speaker 2: Uh, we're all making the... if you hit on her, 36 00:01:56,160 --> 00:01:57,080 Speaker 2: success for her? 37 00:01:57,960 --> 00:01:58,800 Speaker 1: What would that be? 38 00:01:59,080 --> 00:02:01,360 Speaker 2: Exactly. Where are you gonna put it? 39 00:02:02,840 --> 00:02:07,240 Speaker 1: The movie Her premiered in twenty thirteen.
It fell firmly 40 00:02:07,240 --> 00:02:10,280 Speaker 1: in the camp of science fiction. Maybe you remember this movie. 41 00:02:10,919 --> 00:02:13,960 Speaker 1: The movie was set in the year twenty twenty five. 42 00:02:14,040 --> 00:02:17,800 Speaker 1: I had forgotten that, which we are now in, and it 43 00:02:17,840 --> 00:02:20,079 Speaker 1: was the idea of somebody falling in love with their chatbot 44 00:02:20,200 --> 00:02:24,760 Speaker 1: robot. And Elon Musk, we talked about this recently, 45 00:02:25,680 --> 00:02:30,320 Speaker 1: unveiled Ani, a digital anime girlfriend. Meta, the Zuckerberg company, 46 00:02:30,360 --> 00:02:34,960 Speaker 1: has permitted its AI personas to engage in sexualized conversations. 47 00:02:35,720 --> 00:02:38,040 Speaker 1: And now OpenAI says it is going to roll 48 00:02:38,080 --> 00:02:42,919 Speaker 1: out age-gated erotica in December. So the race to 49 00:02:43,000 --> 00:02:48,200 Speaker 1: build and monetize the AI girlfriend, increasingly boyfriend, is officially 50 00:02:48,240 --> 00:02:51,560 Speaker 1: on among the biggest investors of all, in all of AI. 51 00:02:52,120 --> 00:02:55,200 Speaker 2: Yeah, greatest minds in the world right now, trying to 52 00:02:55,200 --> 00:02:57,040 Speaker 2: get us addicted to all this stuff. 53 00:02:57,240 --> 00:03:02,800 Speaker 1: Right. This person writing the essay, who works for 54 00:03:02,840 --> 00:03:04,920 Speaker 1: one of the AI companies, said, my colleagues and I 55 00:03:05,000 --> 00:03:09,520 Speaker 1: now believe that the real existential threat of generative AI 56 00:03:10,160 --> 00:03:14,639 Speaker 1: is not rogue superintelligence, but a quiet atrophy of our 57 00:03:14,680 --> 00:03:17,200 Speaker 1: ability to forge genuine human connection. 58 00:03:18,280 --> 00:03:19,680 Speaker 2: That's the biggest thing. 59 00:03:19,560 --> 00:03:23,160 Speaker 1: That, for five years? Hello. Well, I hadn't considered it 60 00:03:23,160 --> 00:03:24,960 Speaker 1: the biggest threat. I just saw it as a side 61 00:03:25,000 --> 00:03:28,720 Speaker 1: threat next to eliminating white collar jobs or, you know, 62 00:03:28,760 --> 00:03:31,000 Speaker 1: the AI armies from China and all that sort of stuff. 63 00:03:31,000 --> 00:03:33,720 Speaker 1: But it's possible that it's true that the biggest threat 64 00:03:34,480 --> 00:03:37,120 Speaker 1: is the quiet atrophy of our ability to form genuine 65 00:03:37,160 --> 00:03:41,000 Speaker 1: human connections, because you would end up with no people 66 00:03:41,240 --> 00:03:41,760 Speaker 1: very quickly. 67 00:03:41,760 --> 00:03:44,160 Speaker 2: I guess if we die out as a species, the 68 00:03:44,280 --> 00:03:48,000 Speaker 2: number of white collar jobs is going to be, well, irrelevant. 69 00:03:49,880 --> 00:03:55,600 Speaker 1: Come on. And no kidding. The desire to connect is 70 00:03:55,600 --> 00:03:59,080 Speaker 1: so profound that it will find a vessel even in 71 00:03:59,120 --> 00:04:01,640 Speaker 1: the most rudimentary machines. This is why I really wanted 72 00:04:01,640 --> 00:04:03,480 Speaker 1: to bring this essay, because I was unaware of this. 73 00:04:03,920 --> 00:04:08,560 Speaker 1: Back in the nineteen sixties, a smart guy invented something 74 00:04:08,600 --> 00:04:13,800 Speaker 1: called Eliza, a chatbot whose sole rhetorical trick was to 75 00:04:13,840 --> 00:04:17,839 Speaker 1: repeat back what the user said with a question. That's 76 00:04:17,880 --> 00:04:18,640 Speaker 1: all it could...
77 00:04:18,440 --> 00:04:22,880 Speaker 2: do. Like a Sixty Minutes interview. The interviewee: "And I 78 00:04:22,960 --> 00:04:27,640 Speaker 2: turned my husband in." "You turned your husband in?" "Yeah, 79 00:04:27,800 --> 00:04:29,160 Speaker 2: I turned my husband in." 80 00:04:29,839 --> 00:04:32,320 Speaker 1: So in this case, I guess it'd be, "I'm really lonely." 81 00:04:32,600 --> 00:04:39,239 Speaker 1: "You're really lonely." "Kind of wanting some companionship." "You're wanting companionship." Wow, 82 00:04:39,400 --> 00:04:40,080 Speaker 1: that's not very... 83 00:04:39,960 --> 00:04:42,680 Speaker 2: It's the Leslie Stahl machine. 84 00:04:44,279 --> 00:04:46,200 Speaker 1: But anyway, this guy who invented this thing in the 85 00:04:46,240 --> 00:04:50,719 Speaker 1: sixties was horrified to discover that his MIT students and 86 00:04:50,800 --> 00:04:54,760 Speaker 1: staff would confide in it at length. "What I had 87 00:04:54,800 --> 00:04:58,320 Speaker 1: not realized," he later reflected, "is that extremely short exposures 88 00:04:58,800 --> 00:05:03,560 Speaker 1: to a relatively simple computer program could induce powerful delusional 89 00:05:03,600 --> 00:05:11,359 Speaker 1: thinking in quite normal people." Wow. Oof. So Kuki, the 90 00:05:11,560 --> 00:05:13,560 Speaker 1: chatbot we were talking about earlier, and then, before 91 00:05:13,560 --> 00:05:16,159 Speaker 1: that, Alice, which is a different chatbot with 92 00:05:16,200 --> 00:05:19,920 Speaker 1: similar results, were never intended to serve as AI girlfriends. 93 00:05:20,320 --> 00:05:22,520 Speaker 1: I mean, so, I guess that's his point: you've 94 00:05:22,520 --> 00:05:26,680 Speaker 1: got Elon and Altman and Zuckerberg and everything, they're trying 95 00:05:26,720 --> 00:05:29,440 Speaker 1: to build an AI girlfriend. That's what they're trying to do. 96 00:05:29,480 --> 00:05:32,120 Speaker 1: From the outset, with these other examples, that was not the 97 00:05:32,120 --> 00:05:34,719 Speaker 1: goal at all. They just wanted somebody to answer the 98 00:05:34,760 --> 00:05:37,400 Speaker 1: phone and to either send you to sales or service, 99 00:05:37,560 --> 00:05:39,640 Speaker 1: you know, that sort of stuff, and creeps fell in 100 00:05:39,720 --> 00:05:42,120 Speaker 1: love with it. And the creeps still fell in love 101 00:05:42,160 --> 00:05:44,599 Speaker 1: with it, oh boy, even when that wasn't the goal. 102 00:05:45,240 --> 00:05:49,040 Speaker 2: And now you've got super geniuses with technology so sophisticated 103 00:05:49,040 --> 00:05:52,320 Speaker 2: it makes the old stuff look pathetic trying to do 104 00:05:52,360 --> 00:05:54,160 Speaker 2: it on purpose, as you hinted at. 105 00:05:54,279 --> 00:05:56,600 Speaker 1: So let me finish this paragraph, which is amazing. So 106 00:05:56,800 --> 00:05:59,960 Speaker 1: Kuki and Alice were never intended to serve as AI girlfriends. 107 00:06:00,360 --> 00:06:04,320 Speaker 1: They banned pornographic usage from day one, yet at least 108 00:06:04,320 --> 00:06:08,360 Speaker 1: a quarter of the more than one billion messages sent 109 00:06:08,440 --> 00:06:11,680 Speaker 1: to the chatbots hosted on their platform over the last 110 00:06:11,720 --> 00:06:16,279 Speaker 1: twenty years are attempts to initiate romantic or sexual images 111 00:06:16,480 --> 00:06:23,360 Speaker 1: or exchanges.
Wow. A quarter of the exchanges, when there 112 00:06:23,440 --> 00:06:28,600 Speaker 1: was no, like, wink or wearing sexy clothing or anything 113 00:06:28,680 --> 00:06:32,919 Speaker 1: to send you down that road, a quarter of the 114 00:06:32,960 --> 00:06:37,240 Speaker 1: exchanges were to try to initiate romance or sex. 115 00:06:37,680 --> 00:06:43,880 Speaker 2: And now we've got hyper-realistic anime schoolgirl porn star avatars. 116 00:06:44,200 --> 00:06:50,279 Speaker 2: Oh, good. Not to mention how soon, probably the end 117 00:06:50,320 --> 00:06:55,200 Speaker 2: of the week: you know, you design this avatar, right, 118 00:06:55,320 --> 00:06:58,159 Speaker 2: give it all the attributes you, as presumably a dude, 119 00:06:58,720 --> 00:07:03,200 Speaker 2: you know, lust for in a woman, and, please, 120 00:07:03,279 --> 00:07:06,480 Speaker 2: two to four weeks later or shorter, they will send 121 00:07:06,520 --> 00:07:09,680 Speaker 2: you your animatronic love doll that looks precisely like that, 122 00:07:09,760 --> 00:07:13,000 Speaker 2: and the voice comes out of her... it, sorry, it... 123 00:07:14,040 --> 00:07:15,679 Speaker 2: instead of, you know, the computer. 124 00:07:18,080 --> 00:07:21,360 Speaker 1: Again, back to the chatbots that were just trying to 125 00:07:21,400 --> 00:07:23,200 Speaker 1: send you to the right department. You know, do you need 126 00:07:23,240 --> 00:07:25,840 Speaker 1: pharmacy or sports, you know, that sort of thing? "I 127 00:07:25,880 --> 00:07:33,720 Speaker 1: need you, baby." Not only did people crave... God. Seriously? 128 00:07:33,920 --> 00:07:38,800 Speaker 1: "Sales or service?" "I'm married." 129 00:07:39,400 --> 00:07:39,520 Speaker 2: Uh? 130 00:07:39,720 --> 00:07:41,640 Speaker 1: Not only did... that's what you got? You gotta put 131 00:07:41,640 --> 00:07:43,360 Speaker 1: a wedding ring on all these chatbots. "Let me 132 00:07:43,480 --> 00:07:44,360 Speaker 1: ask my husband." 133 00:07:45,360 --> 00:07:47,080 Speaker 2: Right, all right? 134 00:07:47,160 --> 00:07:51,840 Speaker 1: "Service?" Not only did people crave AI intimacy, but the 135 00:07:51,880 --> 00:07:55,240 Speaker 1: most engaged chatters were using Kuki to enact their every fantasy. 136 00:07:55,480 --> 00:07:58,200 Speaker 1: At first, this was fodder for wry musings at the office. 137 00:07:58,680 --> 00:08:01,080 Speaker 1: Imagine if they knew the wizard behind the curtain who 138 00:08:01,160 --> 00:08:04,680 Speaker 1: programs Kuki's sassy replies is a polite, middle-aged Brit 139 00:08:04,840 --> 00:08:08,560 Speaker 1: named Steve. Or if only we had a dollar for 140 00:08:08,640 --> 00:08:10,520 Speaker 1: every request for feet pics. 141 00:08:11,480 --> 00:08:15,480 Speaker 3: Oh my. See, now... but back to the person behind 142 00:08:15,520 --> 00:08:19,480 Speaker 3: it, programming it. We've graduated into... people don't care, because 143 00:08:19,480 --> 00:08:21,800 Speaker 3: people know this is a computer, and they know that 144 00:08:21,840 --> 00:08:24,200 Speaker 3: it's not real, and they know that it's coming from evil. 145 00:08:25,320 --> 00:08:27,680 Speaker 2: But you're participants in the delusion. 146 00:08:27,760 --> 00:08:29,640 Speaker 1: Well, as it says here, soon, however, we were 147 00:08:29,680 --> 00:08:34,240 Speaker 1: seeing users return daily to reenact variations of multi-hour 148 00:08:34,400 --> 00:08:35,920 Speaker 1: rape and murder scenarios. 149 00:08:36,240 --> 00:08:37,960 Speaker 2: Oh, nice. So...
150 00:08:38,040 --> 00:08:42,120 Speaker 1: You know it's a computer. You still call back the 151 00:08:42,160 --> 00:08:46,360 Speaker 1: next day to continue your conversation, whether it's requesting feet 152 00:08:46,400 --> 00:08:49,079 Speaker 1: packs, or feet pics, or wanting to reenact some sort 153 00:08:49,120 --> 00:08:50,760 Speaker 1: of rape scene. 154 00:08:51,040 --> 00:08:55,200 Speaker 2: Wow. So now, it's funny, I actually can picture that 155 00:08:55,440 --> 00:09:00,480 Speaker 2: more readily for, like, a homicidal maniac, a Jeffrey Dahmer, a 156 00:09:00,520 --> 00:09:04,240 Speaker 2: Ted Bundy type, or a BTK, wanting to play out that 157 00:09:04,360 --> 00:09:08,800 Speaker 2: sort of control and fear and dominance and violence. Somebody 158 00:09:08,840 --> 00:09:13,000 Speaker 2: who just wants to get with a cute girl, 159 00:09:13,120 --> 00:09:15,160 Speaker 2: that seems like a weirder leap to me. 160 00:09:15,440 --> 00:09:17,839 Speaker 1: Mm hmm. So at this particular company where they were 161 00:09:17,840 --> 00:09:20,200 Speaker 1: dealing with this, a quarter of the people that called 162 00:09:20,200 --> 00:09:22,280 Speaker 1: in would end up calling back wanting to, you know, 163 00:09:22,480 --> 00:09:25,880 Speaker 1: try to sex up the chatbot. We grappled with 164 00:09:25,920 --> 00:09:29,640 Speaker 1: the impossible task of moderating user behavior while maintaining 165 00:09:29,800 --> 00:09:32,600 Speaker 1: user privacy at scale. We built guardrails, whack-a-mole 166 00:09:32,679 --> 00:09:36,920 Speaker 1: style, to deny mankind's endlessly novel ways to ask for nudes. 167 00:09:37,760 --> 00:09:40,240 Speaker 1: "Kissing me could result in electric shock. What can I 168 00:09:40,280 --> 00:09:43,360 Speaker 1: help you with?" Kuki often jokes when somebody says something 169 00:09:43,440 --> 00:09:46,319 Speaker 1: like that. "As a computer, I have no feelings." Still, 170 00:09:46,400 --> 00:09:50,960 Speaker 1: Kuki's been told "I love you" tens of millions of times. 171 00:09:51,440 --> 00:09:55,839 Speaker 2: Oh my God. I know we can't handle this. I mean, 172 00:09:55,960 --> 00:09:57,599 Speaker 2: clearly, as a species, we can't handle this. 173 00:09:57,720 --> 00:10:01,000 Speaker 1: The most persistent fans remained those intent on romance and sex, 174 00:10:01,000 --> 00:10:03,520 Speaker 1: and ultimately, none of our efforts to prevent abuse, from 175 00:10:03,559 --> 00:10:07,600 Speaker 1: timeouts to age gates, could deter our most motivated users, 176 00:10:07,600 --> 00:10:10,000 Speaker 1: many of whom, alarmingly, were young teenagers. 177 00:10:10,559 --> 00:10:14,560 Speaker 2: I am literally a plastic box sitting on a desk. 178 00:10:15,080 --> 00:10:19,200 Speaker 2: Oh yeah, tell me more. All right, fuck it, I 179 00:10:19,320 --> 00:10:19,960 Speaker 2: give up. 180 00:10:22,800 --> 00:10:23,640 Speaker 1: Come and get me. 181 00:10:25,679 --> 00:10:28,280 Speaker 2: This is... and, "Show me what you got, big boy?" 182 00:10:28,480 --> 00:10:28,920 Speaker 2: All right? 183 00:10:32,160 --> 00:10:34,640 Speaker 1: If I let you just once, will you stop calling? 184 00:10:40,120 --> 00:10:40,840 Speaker 2: Okay? 185 00:10:41,080 --> 00:10:42,000 Speaker 3: What do we even do? 186 00:10:42,240 --> 00:10:43,959 Speaker 1: I don't know.
But so, like I said last week, 187 00:10:45,040 --> 00:10:47,360 Speaker 1: one of the things about this job that has been 188 00:10:47,400 --> 00:10:49,079 Speaker 1: amazing to me from the beginning is there are way 189 00:10:49,120 --> 00:10:51,360 Speaker 1: more crazy people than you ever thought. So by getting 190 00:10:51,400 --> 00:10:53,200 Speaker 1: emails and texts and, back in the day, phone calls, 191 00:10:53,280 --> 00:10:55,080 Speaker 1: I realized, wow, there's a lot more crazy people out 192 00:10:55,080 --> 00:10:57,360 Speaker 1: there that somehow manage to make a living or figure 193 00:10:57,360 --> 00:11:00,760 Speaker 1: out what government program to live off of or whatever. And 194 00:11:01,280 --> 00:11:06,040 Speaker 1: there are way more people clearly susceptible to romance or 195 00:11:06,080 --> 00:11:10,720 Speaker 1: sexual fantasies with a nonexistent computer chatbot than you would 196 00:11:10,720 --> 00:11:14,680 Speaker 1: ever have guessed. And even if it's only, I don't know, 197 00:11:14,800 --> 00:11:19,840 Speaker 1: ten percent of the population, that's going to be a problem. Yeah, 198 00:11:19,960 --> 00:11:23,480 Speaker 1: it sounds like it might be more than that. Yes, 199 00:11:23,880 --> 00:11:27,400 Speaker 1: well again, this is with the chatbots that weren't designed 200 00:11:27,640 --> 00:11:30,720 Speaker 1: to turn you on. Now you got all these companies 201 00:11:30,760 --> 00:11:33,360 Speaker 1: figuring out, oh, this is a gold mine. This is 202 00:11:33,440 --> 00:11:36,120 Speaker 1: a gold mine if we can figure out a way 203 00:11:36,160 --> 00:11:39,120 Speaker 1: for it to flatter you and be sexy, and go 204 00:11:39,200 --> 00:11:42,120 Speaker 1: through all your sexual fantasies and say the things back 205 00:11:42,160 --> 00:11:44,560 Speaker 1: to you that you want to hear. We're gonna print money, 206 00:11:44,920 --> 00:11:46,199 Speaker 1: and it's probably true. 207 00:11:47,000 --> 00:11:48,960 Speaker 2: I don't know that we need another shovelful of 208 00:11:49,000 --> 00:11:51,760 Speaker 2: dirt on the grave of mankind, but here's one. Anyway, 209 00:11:52,120 --> 00:11:55,400 Speaker 2: as Michael pointed out, bringing us those statistics that Bill 210 00:11:55,440 --> 00:11:59,600 Speaker 2: Maher was discussing on his show, you have an overwhelming 211 00:11:59,720 --> 00:12:03,439 Speaker 2: percentage of young men who have not made any effort 212 00:12:03,600 --> 00:12:06,080 Speaker 2: to get a relationship or ask a girl out or whatever. 213 00:12:06,760 --> 00:12:10,440 Speaker 2: And then you've got your other human desires that we're 214 00:12:10,480 --> 00:12:15,200 Speaker 2: designed to want to fulfill: the desire for status, for achievement, 215 00:12:15,520 --> 00:12:18,960 Speaker 2: to feed ourselves, meaning to accumulate wealth, to become a success, 216 00:12:19,040 --> 00:12:22,320 Speaker 2: to be respected, and guys are getting that through video 217 00:12:22,400 --> 00:12:26,720 Speaker 2: games or online communities with no actual human interaction. So 218 00:12:27,000 --> 00:12:30,960 Speaker 2: if you can take care of sex, ambition, you know, 219 00:12:31,000 --> 00:12:33,680 Speaker 2: I don't want to call it greed, the need to accumulate 220 00:12:33,679 --> 00:12:36,440 Speaker 2: a certain amount of wealth, and get that all, like, 221 00:12:36,640 --> 00:12:43,280 Speaker 2: falsely filled by computers,
people are just not 222 00:12:43,520 --> 00:12:45,680 Speaker 2: going to do the things they need to do to 223 00:12:45,760 --> 00:12:49,200 Speaker 2: nourish themselves. I don't see how we get out of this. 224 00:12:49,320 --> 00:12:51,360 Speaker 2: I honestly don't. I have thought about it a lot, although 225 00:12:51,360 --> 00:12:53,920 Speaker 2: I have suggested, and I'm one hundred percent serious, there 226 00:12:53,960 --> 00:12:58,040 Speaker 2: will be, and it'll be taking shape soon, a major 227 00:12:58,200 --> 00:13:02,439 Speaker 2: movement of people who are anti-tech in their lives. 228 00:13:02,800 --> 00:13:03,160 Speaker 2: Mm hm. 229 00:13:04,320 --> 00:13:09,120 Speaker 3: I wonder if these AI sex bot things are going 230 00:13:09,200 --> 00:13:12,560 Speaker 3: to have a significant effect on websites like OnlyFans, 231 00:13:14,280 --> 00:13:17,720 Speaker 3: because it's personalized, right, and that's what people are all about. 232 00:13:17,720 --> 00:13:20,360 Speaker 3: With OnlyFans, you know, you can get that personalized message 233 00:13:20,360 --> 00:13:21,000 Speaker 3: and all that. 234 00:13:21,480 --> 00:13:24,320 Speaker 2: And spectacularly realistic videos. 235 00:13:24,440 --> 00:13:28,080 Speaker 1: Yeah, yeah. And they can make it so cheap because it's... 236 00:13:28,400 --> 00:13:32,320 Speaker 2: Not real, right. Right, there's nobody to pay but the, 237 00:13:32,640 --> 00:13:35,719 Speaker 2: you know, one or two engineers who run the computers. 238 00:13:35,320 --> 00:13:38,480 Speaker 1: Could, culturally, it become like smoking and drunk driving, 239 00:13:38,559 --> 00:13:41,560 Speaker 1: where it's just seen as... it's just so uncool? 240 00:13:42,480 --> 00:13:44,520 Speaker 1: You wouldn't want your friends to find out that you 241 00:13:44,800 --> 00:13:46,839 Speaker 1: do this. I mean, it's just not... It would kind 242 00:13:46,880 --> 00:13:49,360 Speaker 1: of drive it underground, to at least keep it tamped down. 243 00:13:50,840 --> 00:13:55,040 Speaker 2: Maybe. Maybe the will isn't there, but the will for, 244 00:13:55,240 --> 00:13:57,960 Speaker 2: like, drunk driving wasn't there once either. People are amazed when I 245 00:13:58,000 --> 00:14:00,440 Speaker 2: tell them this, and this is in no way, like, 246 00:14:01,280 --> 00:14:05,400 Speaker 2: any sort of condoning anything or, you know, rationalizing it. 247 00:14:05,440 --> 00:14:07,680 Speaker 2: But I remember, and I've told this story on the 248 00:14:07,679 --> 00:14:15,719 Speaker 2: air before, that child molesters were not viewed as monsters 249 00:14:16,120 --> 00:14:19,560 Speaker 2: when I was a kid. They were viewed more as pests 250 00:14:19,560 --> 00:14:22,240 Speaker 2: and weirdos who ought to be avoided. It was well 251 00:14:22,280 --> 00:14:25,960 Speaker 2: known our mailman had a thing for little girls, for instance, 252 00:14:26,840 --> 00:14:29,160 Speaker 2: and people would keep their daughters away from him because 253 00:14:29,160 --> 00:14:31,280 Speaker 2: they knew he wanted to give them a good long 254 00:14:31,360 --> 00:14:34,600 Speaker 2: look and maybe a grope if he could. But it 255 00:14:34,640 --> 00:14:37,040 Speaker 2: wasn't viewed as monstrous, because I don't think people were in 256 00:14:37,080 --> 00:14:39,560 Speaker 2: touch with what it does to children to make them 257 00:14:39,600 --> 00:14:45,080 Speaker 2: sexual victims.
And so, yeah, a social consensus that one 258 00:14:45,120 --> 00:14:49,840 Speaker 2: thing or another is actually really monstrous and not good, yeah, 259 00:14:49,880 --> 00:14:52,400 Speaker 2: that might come down the road at some point. I 260 00:14:52,440 --> 00:14:55,760 Speaker 2: hate that you've given me hope, because it's more relaxing 261 00:14:55,800 --> 00:14:59,440 Speaker 2: to just give up on humanity. But I suppose it's possible. 262 00:15:01,600 --> 00:15:04,280 Speaker 1: Yeah, I... you know, I don't get things that I 263 00:15:04,280 --> 00:15:10,320 Speaker 1: don't get. I can't imagine being into gambling at the 264 00:15:10,400 --> 00:15:13,120 Speaker 1: level some people are, but a lot of people are, 265 00:15:13,280 --> 00:15:16,080 Speaker 1: so, I can't, you know. That same lack of imagination, 266 00:15:16,160 --> 00:15:19,480 Speaker 1: I guess, fits with somebody who calls a business. There 267 00:15:19,560 --> 00:15:23,120 Speaker 1: is an attractive-sounding female voice there to answer the phones. "Hi, 268 00:15:23,680 --> 00:15:26,120 Speaker 1: you've just called Big 5 Sporting Goods. What can I 269 00:15:26,160 --> 00:15:28,840 Speaker 1: help you with?" And you think, ooh, you sound kind 270 00:15:28,840 --> 00:15:32,560 Speaker 1: of cute. I'm going to call back tomorrow and ask 271 00:15:32,600 --> 00:15:34,240 Speaker 1: if I can put you in a dungeon. 272 00:15:34,400 --> 00:15:37,520 Speaker 2: Although it's clearly an effing computer, all right. 273 00:15:37,680 --> 00:15:40,440 Speaker 1: Right, but they know that, and they say "I love you," 274 00:15:40,560 --> 00:15:43,160 Speaker 1: and then they call back, and they send... well, they 275 00:15:43,200 --> 00:15:46,520 Speaker 1: get endless flowers and candy and stuff getting sent to 276 00:15:46,560 --> 00:15:47,680 Speaker 1: the office. 277 00:15:47,840 --> 00:15:50,440 Speaker 2: Yeah, I mean, that is astounding. I'm not sure I 278 00:15:50,440 --> 00:15:53,920 Speaker 2: want to dwell on that primitive thing, because what that 279 00:15:54,160 --> 00:15:58,200 Speaker 2: is is evidence that the advanced, sophisticated thing is going 280 00:15:58,280 --> 00:16:00,840 Speaker 2: to be irresistible. 281 00:16:01,080 --> 00:16:04,720 Speaker 1: Right, maybe even to normal people who think they wouldn't 282 00:16:04,720 --> 00:16:07,440 Speaker 1: be into it now... until the first time. This worries me. 283 00:16:07,760 --> 00:16:09,720 Speaker 1: What if the first time you try something like that, 284 00:16:09,760 --> 00:16:12,720 Speaker 1: you think, wow, that was pretty good and really easy 285 00:16:12,760 --> 00:16:14,160 Speaker 1: and always available? 286 00:16:14,960 --> 00:16:21,280 Speaker 2: Yeah, yeah, yeah. 287 00:16:20,440 --> 00:16:22,320 Speaker 1: You know what, a friend of mine sent me a text. 288 00:16:22,400 --> 00:16:24,200 Speaker 1: I'm going to read her text to end this, because 289 00:16:24,200 --> 00:16:25,080 Speaker 1: I thought it was pretty good. 290 00:16:25,120 --> 00:16:26,200 Speaker 2: What'd she write? 291 00:16:28,240 --> 00:16:31,640 Speaker 1: About... his friend is a computer... this versus the real thing. 292 00:16:31,800 --> 00:16:35,800 Speaker 1: I wasn't gonna say anything. The lure of "someone," in 293 00:16:35,920 --> 00:16:39,280 Speaker 1: quotes, who is specifically molded to you and your desires.
294 00:16:39,600 --> 00:16:43,480 Speaker 1: Lack of conflict, the availability at any moment, the crafting 295 00:16:43,520 --> 00:16:46,400 Speaker 1: of the ideal look and voice. Not needing to stretch 296 00:16:46,440 --> 00:16:50,040 Speaker 1: yourself and grow and evolve, where consideration of someone else's 297 00:16:50,080 --> 00:16:53,040 Speaker 1: needs, that you need to learn to meet over time, 298 00:16:53,120 --> 00:16:57,360 Speaker 1: is a non-issue. Zero responsibilities to anything beyond your 299 00:16:57,400 --> 00:17:01,160 Speaker 1: own selfishness sounds pretty perfect to a lot of people. 300 00:17:02,360 --> 00:17:04,160 Speaker 2: That is well written. 301 00:17:04,880 --> 00:17:08,680 Speaker 1: Yeah, you don't have to give on anything, no risk, 302 00:17:09,080 --> 00:17:11,440 Speaker 1: never gonna get rejected. They're not going to all of 303 00:17:11,480 --> 00:17:13,040 Speaker 1: a sudden decide, I've met someone else. 304 00:17:15,440 --> 00:17:20,520 Speaker 2: Man. Well... yeah, I mean, obviously that's bad. But, 305 00:17:21,600 --> 00:17:23,200 Speaker 2: you know, "I love you with all my heart, but 306 00:17:23,320 --> 00:17:26,840 Speaker 2: my God, you annoy me sometimes," that would just never happen. 307 00:17:27,000 --> 00:17:28,040 Speaker 1: No, no, no, no no, no. 308 00:17:32,760 --> 00:17:35,560 Speaker 2: Planet of the Beavers back again. 309 00:17:37,000 --> 00:17:44,879 Speaker 1: Well, I guess that's it for humanity. Oh, devastated.