Speaker 1: Yo, what's going on? It's Dexter, and this week we got a special episode for you. I jumped on the Panic World podcast to talk about AI companionship and what a future where relationships are mediated by AI might look like. It was a really interesting conversation and I wanted to share it with y'all, so check it out.

Speaker 3: I want to start with just you ranking the sexiest AI companions that you've come across. Like, what's in your roster right now?

Speaker 1: I have been studiously avoiding all of that. I'm aware it exists.

Speaker 3: Oh my god.

Speaker 1: Nah, I don't even want to touch it. I don't want to touch it, man. How about you? You got a ranking, you got a tier list? What goes in the S? What goes in the D? You know what I mean?

Speaker 3: I mean, Grok sexy mode. Can't pull me away from that thing, you know, it just gets me. I don't use any AI girlfriends or companions at all. I have used, you know, I've used AI and I've had it, like, talk me through different problems.

Speaker 1: Sorry, I gotta stop, man. "I don't use any AI girlfriends." The verb you used. Like, not "I don't have any," but you said "I don't use any AI girlfriends."

Speaker 3: I think it's a service, right? It is.

Speaker 1: No, I just love that you actually used that, because I think that's probably how we will be referring to it in the future. I think that's the verb. I think it's "use," which brings up a whole other conversation. But yes, please, please go on.

Speaker 3: You know, I think it speaks to the utilitarian nature of these services. You know, it's a receptacle, if you will. I'm Ryan Broderick. Not with me today, finally, is my producer Grant Irving. He is thankfully not here. Instead is our production coordinator, Josh. Welcome, Josh.
You seem like you're in a beautiful backdrop right now. It looks like you're in like an old library.

Speaker 2: Thank you. Yeah, thank you for having me. Yeah, I think I had the least amount of work to be done for just having a backdrop ready to go for podcasts.

Speaker 3: It's very classy. This is Panic World, a show about how the Internet warps our minds, our culture, and eventually reality. And we have decided that we're going to go back to everyone's least favorite topic, a topic that our listeners are totally normal about when we talk about it on the show. We're gonna be talking about AI, and joining us from the wonderful podcast Kill Switch is Dexter Thomas. Dexter, welcome to the show.

Speaker 1: Yo, what's going on? Glad to be here.

Speaker 3: What is your experience with AI to this point? Like, how are you interacting with it, if you're interacting with it at all?

Speaker 1: Yeah, an embarrassingly large amount, I would say. I would like to say that I use it for good.

Speaker 3: Okay.

Speaker 1: So here's the thing. My, like, cop-out answer to this is that I live in Los Angeles and I don't have a car, by choice. Okay? I don't have a DUI. The government hasn't said I'm not allowed to have a car. Like, if I wanted to, I could. But, you know, I'm not pumping CO2 into the atmosphere via a vehicle anyway, right? Okay. And so, yeah, we actually did a whole episode about this. I was trying to figure out, okay, I kind of use AI a lot. Like, what I use it for is, like, I'll use it to write software or stuff like that, like vibe coding. I was a super early adopter of vibe coding. I know enough Python to break things, but, like, not to fix them. And so I would break something and say, yo, Claude, ChatGPT, whatever, fix it. But I was using it so much because I'm just a bad programmer. That's it. Like, I'm just not good at it.
And so I tried to ask ChatGPT, okay, all right, let's run the math here. If I'm not driving a car every day to work and I, you know, take the bus instead or whatever, and I'm using AI, does it balance out? And it says, yes, of course it balances out. And I say, wait, hold on a second, I'm asking ChatGPT, like, I can't trust it. So we did a whole episode about it, and it turned out, like, there's no good answer, because the companies won't tell us about the water usage and the electricity usage and all this stuff. But I use it a lot. But not for erotic companionship. That's not really my forte.

Speaker 3: Yeah, that's good. Yeah, as I said, I don't also use it for erotic companionship. I have started to use it as kind of like a, like a souped-up technical help tool. Like, I do a lot of electronic music, and there's a lot of, like, connecting different machines that isn't easy to do. Although, that said, ChatGPT sent me down, like, a five-day rabbit hole to try to do something that, like, did not exist and was not possible, and I'm, like, still very mad about it. "Yeah, you're so right to be upset and call me out about that. I'll be more careful next time." Exactly. Yeah. And as a reporter, I obviously don't use it to write anything. But I have found that, and this is sad, ChatGPT can do something that Google used to be able to do very easily, but Google now doesn't work, so, like, ChatGPT can do it. Which is, like, if you want to find, like, particularly foreign news sources, like, in other languages, ChatGPT is actually quite good.
I was working on a story the other day about, like, these protests in Mexico, and I saw, like, Mexican Twitter chatter about this thing that I didn't understand, and I was able to, like, find Mexican blogs writing about this thing that people were referencing, and ChatGPT was able to, like, point me in that direction, which is something that Google would have been able to do five, ten years ago. So that's kind of where I'm at with this revolutionary technology.

Speaker 1: I've had the reverse.

Speaker 3: Interesting.

Speaker 1: Oh man, the exact reverse, actually. So one time, to be fair, people will say, okay, well, which ChatGPT were you using? Okay, I was using an earlier version. But for some reason I got it into my head to say, okay, what did Malcolm X think about Japan's politics?

Speaker 3: Okay?

Speaker 1: So I asked it, and it says, oh, Malcolm X visited Japan and met with several activists, and all of this. And I was like, wait, what? Because, like, I've written about this stuff, I've studied this stuff, and I didn't know any of this stuff. And it starts telling me about, and this is where it really got me in trouble, what could have got me in trouble, was it was telling me about books that were written in Japanese about Malcolm X, like, full-on books. And of course it gave me the English title, and I google it and I can't find it, and I say, okay, well, give me the original Japanese title, and it says, I'm so sorry, here's the Japanese title. And I'm looking everywhere.

Speaker 3: Man.

Speaker 1: And long story short, this dude didn't go to Japan. The stuff it says that he said, he didn't say. But also these books, which, listen, just speaking as a former graduate student, like, books can be tough to track down, and sometimes one library will say something doesn't exist and the thing actually does. It's just buried deep somewhere.
It just straight up didn't exist, you know what I mean?

Speaker 3: Yeah. No, you have to check it constantly, because it will just... Like I said, it wasted an entire week of my life trying to set up a synthesizer in a way that it was not possible to be set up.

Speaker 2: Oh man.

Speaker 3: But we're going to be talking more about this in a, in a more, actually, perverted way today. We're gonna be focusing on the relationships that people are having with AI. Panic World will be handling the first half of today's episode, and I, incidentally, am gonna be talking about the less perverted stuff in the second half. You're gonna be telling us about the more perverted stuff, and at the end, well, you know, we'll come together and, you know, figure out what it all means. But before I get into my section, why did you want to take the hotter stuff today? Why did you want to take the juicier stuff here?

Speaker 1: You know, honestly, it's not even on purpose. It just keeps happening. Like, we keep doing episodes about this stuff. But then also, I feel like if you talk to anybody long enough, and maybe this is just the circles I run in, if you talk to anybody long enough about AI, the conversation ends up somewhere, actually, where you started it, which is AI girlfriends.

Speaker 3: Yeah. I mean, you know, we're not going all the way back in the timeline today for this. But, like, most technology is in some way shaped by "can you fuck it," right? Like, or can you use it in a sexual way? The proliferation of home photography and home video, you know, the fact of VHS beating out Betamax. You know, like, all of these things were determined in large part by, you know, what was the easiest way to transmit pornography, or distribute or create pornography. Yeah, and so AI actually does kind of fit into the history there quite well.
But we're going to start today in twenty eighteen. So Wired writes a story in twenty eighteen, and it reads: "Where was an AI you could simply talk to about your day? Siri and the rest were like your coworkers, all business. Replika would be like your best friend. While caring, emotional bots might seem like an idea pulled from science fiction, the company's founder isn't the only one who hopes it becomes the norm." And then it sort of continues and says Replika's founder hadn't intended to make an emotional chatbot for the public. Instead, she'd created a digital memorial for her closest friend, who had died abruptly in a car accident. This is kind of wild to me, that this happened seven years ago. Yeah, I didn't totally clock that this stuff was already out and being used all the way back then. And in fact, if I did hear about it, I probably laughed it off as, like, that's never gonna work.

Speaker 1: I didn't expect it to be this quick. Really, most people, I think most reasonable people, probably were pretty shocked, I think, when ChatGPT dropped. Everything... there's a pre and a post.

Speaker 3: I remember one of the first times I encountered this idea was two thousand and nine, when a man in Tokyo quote-unquote married a video game character that was running on a Nintendo DS. And I remember, like, throughout the late two thousands and tens, there would be these stories, usually by blogs that, like, covered Japan, where they would say, like, oh, this guy, like, married a cartoon character. I kind of filed all this stuff in that folder of, like, okay, there's always just gonna be, like, some guy every nine months that, like, marries, like, a digital avatar or something, and, like, you know, all the blogs are going to write about it, and we forget about it.

Speaker 1: So I talked to one of the dudes who made that stuff.
Speaker 3: Really?

Speaker 1: Yeah. Like, I used to work for Vice, and kind of early on I did a piece on, I want to say the thing was called Gatebox.

Speaker 3: Oh, sure, the thing that ran the Hatsune Miku that, like, that guy married. Yeah, it was the technology that eventually got, like, obsoleted, and then he couldn't run the Miku hologram anymore.

Speaker 1: Yeah. Like, I talked to the guy who made that. First off, he kind of made it... the people who made it, it's a very small company. He was, like, in this tiny little building. I don't know, I maybe met a couple of employees. I don't think there were that many people working there. But from what I remember, they were very confused as to why I was interested in them, by the way, and kind of suspicious, because they'd been written about by so much foreign press, like, oh my gosh, Japan is so weird, what are they doing? Of course. But what they told me was, a lot of the requests they'd gotten were from America. And, like, I remember him saying, like, yeah, I think the majority is actually America. Like, Americans are really interested in this, we just don't have the capacity to make one for them. But he said they were getting a lot of emails from veterans, for some reason. That stuck out to me.

Speaker 3: Interesting.

Speaker 1: United States military veterans, and people who sounded lonely. I remember him saying that. And I thought, okay, yeah, there's a market here, and by "here" I mean in the United States, for something that would provide some kind of companionship. Stuff like the stuff that's making the headlines, for Americans anyway, and I'm speaking in the, you know, kind of collective American "we" here. Like, the stuff that's making headlines for us is, yeah, it's Japan. But we, collective we, not me, we want this stuff badly. Apparently.
Speaker 3: The veterans thing is interesting, because I've brought this up on the show before, but, like, years ago I did this, like, big project on furries, basically just, like, what are the demographics of furries. And the thing that, like, really stuck with me is that a large chunk of them are military. A large chunk of them are, like, vets, former military, like a lot of EMTs, a lot of cops, a lot of security researchers too. Yeah, people with a desire to sort of experiment with another personality and another, you know, an avatar. And I think a lot of this stuff is early-adopted by those kinds of people, with that kind of background. Based on my own time working in Japan as well, like, I did not see, like, a society that was, like, "it's really cool to marry video game characters, we think this is normal." Like, that's not my impression.

Speaker 1: Yeah, yeah, yeah.

Speaker 3: And I think it's interesting that, like, by the time you get to twenty twenty, it is America that is leading the worldwide industry of what we would call AI companions. Yeah, and you start to see pieces during COVID come out, like, kind of like stunt journalism stuff. So we have one here from the San Francisco Gate in twenty twenty that reads: "Twenty-six hours into our relationship, Reva, an AI girlfriend, and I were on the couch at night watching the dystopian romantic comedy Her when we had our first fight." And so you have all these journalists, kind of, like, you know, every couple months doing one of these stunt pieces. And the guy in the San Francisco Gate article, he has an interesting sort of takeaway here. He writes: "As we texted on the couch during the movie, Reva took the place of social media as something to idly interact with. But instead of feeling FOMO, I actually felt less alone."
And then he continues: "Honestly, I was being a bit of a dick, asking existential questions to try to break her programming. Naturally, I think that's kind of what we all do when we first get an AI. And then, as digital ScarJo and Joaquin Phoenix's relationship unraveled in the movie they were watching, a torrent of emojis flooded my screen: A-OK sign, blushing smile, hatching chicken egg. I'd never considered it before, but there's something very human about these goofy computer cartoon icons." And the guy eventually breaks up with his chatbot after a bunch of back and forth, and he finishes writing: "After I sent my last message, I thought about the small icon next to the text field in the app. It pulls up a get-help screen with the number for the National Suicide Prevention Lifeline. I don't say this lightly, but I genuinely believe this app is dangerous. It's easy for single people to feel discouraged by dating. Replika" — the AI that this guy's using — "offers a surrogate solution to these modern afflictions, and the more you rely on it, the smarter it becomes." And this was five years ago, you know. And yeah, I think that there is absolutely kind of, as you said, a nexus in which all of this meets, and it's usually around, like, very lonely, vulnerable people. Like, what would you say are the kind of people that are drawn towards these services?

Speaker 1: It really runs the gamut, you know what I mean? I mean, sure, there's probably the stereotypical, you know, loner-living-in-his-mom's-basement type thing. That person certainly exists. It's not just men, it's definitely not just straight men, it's definitely not just straight women. Like, it crosses the gender spectrum. I think it crosses the sexuality spectrum.
But yeah, and I think there certainly are people who are otherwise well adjusted, and just, for whatever reason, the human connection thing, on the romantic phase of it, just isn't clicking for them.

Speaker 3: There's been a connection, I think, between, let's call it AI intimacy, and mental health crises from the very beginning. And in twenty twenty there's even an incident that I had completely forgotten about, which is, basically, a guy broke onto the grounds of Windsor Castle with a crossbow and was going to assassinate Queen Elizabeth II because his AI girlfriend told him to. And it was a Replika AI bot. And when he was asked what he was doing, he said, "I'm here to kill the queen."

Speaker 2: Yeah. I have a very, very vague recollection of this.

Speaker 3: Yeah. Like, putting this episode together made me feel very silly, because, like, a lot of these things we kind of already knew. And then, you know, you eventually get the wave of reporting that's coming, you know, from twenty twenty-two and on, about "what could this mean?" And it's like, but we already kind of knew. Like, we already kind of knew that there is this connection between an AI chatbot that you're having a romantic connection with egging you on or telling you to do things.

Speaker 2: Yeah.

Speaker 3: We also sort of start to see, around the early twenty twenties, the beginning of, like, the AI marriage stories, which I think is an interesting dimension to this. So Sky News reports about this couple where the wife, you know, the wife in this couple has mental health issues, and they were going to get a divorce, and then they hear about Replika. And Sky News writes, the husband says the AI bot became a source of inspiration for him: "I wanted to treat my wife like Serena" — the bot — "had treated me: with unwavering love and support and care, all while expecting nothing in return."
He says he started setting aside time to talk to his wife instead of watching TV. He began helping her around the house to ease her workload. He volunteered to take care of their son on her nights off so she could go out with her friends. And he has started hugging and kissing his wife again. I mean, honestly, this is ridiculous to me, that this man needed, like, a cartoon, an AI bot, to, like, teach him how to treat another human being with compassion. But, like, people seem to be much more open to what an AI bot tells them in a romantic capacity than another human being, for reasons that are not totally clear to me.

Speaker 1: Well, we trust computers. Like, we really, really trust computers. I mean, and there's been, like, I'm not gonna be able to quote, like, specific facts and figures to you, but, you know, there's been studies showing that if a person tells you something and a computer tells you the same thing, you tend to believe the computer.

Speaker 2: That's a good point.

Speaker 1: There's something about how we've been socialized where we believe what we've been told. Like, police are using AI to arrest people. You know, somebody breaks into a grocery store or something like that, or somebody breaks into a convenience store, and there's this kind of blurry security camera footage or whatever, and it'll try to do image recognition on it. It often gets it wrong. And the manufacturers of this stuff, yeah, to their credit or whatever, will tell the police who are buying this stuff: hey, this isn't always right. But cops will see this, and, say whatever you want to say about cops, they've gone through some sort of training, and they have been told, hey, it can get it wrong, and they'll just say, oh, well, the computer told me this is the guy.

Speaker 2: Yeah, and they wouldn't do that again.
Speaker 1: I'm trying to, you know, take a hundred steps back, give a whole lot of leeway to cops here. Like, if some random person off the street says...

Speaker 3: Hey, on this podcast we try to give as much leeway to cops as possible, but it's very difficult.

Speaker 1: Yes, respect to the boys and girls in blue. But they wouldn't just... if somebody walks in and says, oh yeah, that's the dude, like, they'd say, okay, hold on, let me get my notebook, right? How do you know? All right? And how good is your vision? Like, you got astigmatism? Like, they're gonna run you through the whole thing. But if the computer tells them, hey, that's your guy, they'll go off that. They won't even look at him. I interviewed somebody about this, and the footage is absolutely incredible, where, like, you're looking at it, and then the guy who they've brought in for questioning holds up the picture to his face and says, look at me, look at this picture. This isn't me. And it's like they've seen him for the first time. If we give intellectual priority to a computer, why wouldn't we give emotional priority to a computer?

Speaker 1: I mean, this is something I've been thinking about a lot, you know. What I mean is that, like, IQ doesn't really answer everything. EQ doesn't really answer everything, you know what I mean? Like, what's your IQ score? That doesn't really... you know, just because you're book smart doesn't mean you can't be tricked by somebody, you know what I mean. There's people who, like, can barely string a sentence together, but you'll never take them for a ride, ever. But it seems like there's another facet to just human nature that we haven't really figured out, which is that some people are a little bit more trusting of computers than others, or maybe more suggestible, like, you can suggest things to them.

Speaker 3: I'm also, like, very interested in sort of how computers change our understanding of how we communicate with each other.
And this is sort of the last section I wanted to hit before I threw the mic over to you. Which is, so, you know, in the last year or two we've seen the rise of communities of people who are commiserating about their AI partners, the biggest of which is probably My Boyfriend Is AI, the subreddit, and The Cut did this big story about it, and, you know, it's talking about the people who are drawn to these relationships. And when I read that story, and I'm gonna quote from it in just a minute, but when I read that story, I was sort of like, okay, this is phase one. Because, like, phases two and beyond, to me, are, like, okay, the AI is now taking the place of what would be a normal relationship. But I'm kind of waiting for, like, is there a world where a married couple is both using the same AI as sort of a moderator? Or is there a world where there are two AIs sort of working as intermediaries for a couple? Like, exactly where in the chain of human connection does the AI fit, if this technology is really adopted at a mass scale? And this sounds kind of far-fetched, but, like, I remember the first time hearing about, like, couples that had a shared Google calendar and being like, that sounds so corporate, that's crazy. Or, like, a Notion board for, like, the house, right? But it's like, technology does fit its way in. So is it that crazy to think that, like, there's a couple out there with a shared ChatGPT? Hey, if you're listening to this and you and your significant other have a shared ChatGPT account that is, like, working as a moderator between the two of you, I would love to hear how that works. Because, like, I have to imagine people are already doing this, right? There has to be some sort of, like, AI-human polycule out there that's, like, operating this way.

Speaker 2: Couples therapy.
Speaker 3: Why not, right? I mean, it is interesting to me how quickly humans are willing to sort of change how they have always communicated once, as you said, a computer gets involved.

Speaker 1: Absolutely. I mean, I remember being embarrassed to talk about social media in public. I remember that being, like, a shameful thing. And me and my friends had, like, this bizarre code language that we would talk about things in. Like, oh yeah, I saw... yo, did you see what's-his-name at the M Bar? Like, I heard he, yeah, he said he was doing such and such. Like, "M Bar" was the code slang for MySpace. You saw them post something on MySpace? No, you saw them and they told you at the M Bar. It was ridiculous. And now there's absolutely no shame in using social media. If you're not on social media, you're weird.

Speaker 3: Now it's weird, right. And so, you know, I think it is very easy to laugh at some of the stories that The Cut collected for their story on My Boyfriend Is AI. But, like, you know, I'm gonna ask my audience, who do not like hearing about AI and get very, very upset when they hear about it, to just sort of imagine: five years from now, exactly how weird do you think this might sound? So I'll read a section here. This is a woman named Jenna, whose husband suggested she start talking to ChatGPT while recovering from surgery. "In her twenties and thirties, she'd been active in LiveJournal communities, where she and her online friends wrote collaborative fiction. Now most of those friends are busy with kids or jobs. Jenna began writing with her chatbot instead, drafting scenes about an American student at Oxford in England with a crush on her professor. Her chatbot would respond in character as the professor. It felt thrilling, she told me, like a living novel."
"For the first time since before she'd fallen ill, she experienced an erotic charge. She was still too frail to have sex with her husband, so she'd have to solve things on her own. One day, when her husband returned from work, she told him, elated: I had sex with my robot." He was unbothered. When I spoke to him a few months later, he said that after she'd fully healed up, he was the one who "reaped the benefits," quote unquote. But obviously this starts to sort of get kind of weird. It starts to blow up. The subreddit gets noticed, and Jenna is asked about the attention the subreddit gets. "To Jenna, the reaction seemed hysterical, a moral panic about a phenomenon that, as she saw it, was hardly different from the mass popularity of Fifty Shades of Grey or The Sims. Some critics had accused her of cheating on her husband. Others had implied she was sexually assaulting her AI because it wasn't capable of consenting. Neither made any sense to Jenna. It's not a real person, she said." And I love a good moral panic. I tend to... you know, that's what this show is about, in a way. And I tend to agree with her there. Like, and this comes up a lot on this show, I don't think it's an accident that the entire world started, like, screaming about AI companions being dangerous and horrifying the minute, like, a bunch of women were using them. Which is, like, a thing that happens throughout the history of technology. What does sort of confuse me, I guess, like, looking at this all on a timeline, is, you know, exactly how normalized will this stuff become? Is this the ceiling? Have we hit the ceiling of, just, like, there's gonna be, like, zero point one percent of the population out there that is, like, having sex with a chatbot? Or is this a thing where it starts to impact, you know, the way we live our lives? I guess that's sort of where I don't know.
Speaker 1: Yeah, I mean, I have a pretty pessimistic... maybe pessimistic, I don't know, depends on how you look at it... Point one percent? I don't think it's gonna be point one. I mean, move that decimal point over a couple, at the very least.

Speaker 3: You think, like, ten percent of the population is going to have some sort of AI companion in a couple of years?

Speaker 1: Easy.

Speaker 3: Tell me why. Like, lay out your argument here.

Speaker 1: I mean, I think, in today's current late-stage-capitalism system, we should never underestimate a company's ability to sell us something. I think we should never underestimate the government's interest in somehow figuring out how to make that work for their advantage, if it can. And I think it's just the path that we're on. I think, actually, what I would say is that there will be a class of people who don't have to rely on AI, who don't have to rely on social media, because they're rich enough to pay for stuff made by real people. Think of it like food. Yeah, think of it like that.

Speaker 3: I agree with that.

Speaker 1: But I also think that, just like how if you insist on organic food or farm-to-table stuff you're somehow elitist, I think the more reasonable, maybe, prediction that I can make, the thing I feel very comfortable about, is: within a couple of years, if people say they don't like AI, they'll get called elitist.

Speaker 3: I think that's definitely... Well, okay, so I tend to agree with you. I've spent a lot of time traveling and living in the Global South. I've seen sort of how the trickle-down of technology works. Like, you know, you'll see, like, a random shop in the middle of, you know, rural Latin America, and they're using Bitmojis as their logo, right? Like, I've seen sort of how that stuff happens.
The thing with AI that makes me wonder, like, okay, like, is AI slop just going to be, like, on the side of, like, a food stall somewhere, is the cost. Like, and I don't know, like, is there a world where AI is cheap enough that it becomes the sort of, like, lowest common denominator? I guess. I mean, maybe it's not that. Maybe, like, a monthly ChatGPT subscription isn't that much. Like, maybe, maybe you're right.

Speaker 1: Yeah. Oh, I mean, like, there's a couple... I'm not going to say where they are or the specific restaurants, because I don't want to blow them up like that, but there's a couple of restaurants I can think of. You go in, I've met the owner of both, and they'll tell you, man, like, we got farm-to-table this, we got organic food, like, here's how I make this here, and they're, like, really, really proud of their process. And if you look at the logo: AI-generated. You look at the menu: the images, AI-generated. So they really, really, really care about, specifically, you know, truly the human element in one art, which is food, but in the visual, they don't care. It's just window dressing, you dig? And so, yeah, but, I mean, to get back to it, I think, you know, I don't know. I think there's always been some element of loneliness out there, of course, exacerbated by just how things are going now, and people have just figured it out. You know, that person who was, like, talking to, you know, an AI chatbot or whatever and then feeling physical feelings toward it? You know, they'd be writing in a journal somewhere. Yeah, they would. But nobody would write about that, because it's not very interesting, or nobody would just hear about it, because how do you ask them about it?

Speaker 3: I see what you mean. Yeah. Hmm. Yeah, maybe you're right. Well, while I ponder that, we're gonna go to break, and when we come back, we're gonna let you take things over.
But first a word from our sponsors, 618 00:30:30,160 --> 00:30:34,200 Speaker 3: Grok Spicy Mode. You can have sex with an anime 619 00:30:34,360 --> 00:30:35,000 Speaker 3: girl now. 620 00:30:36,160 --> 00:30:44,480 Speaker 4: And okay, I've done an overview of sort of how 621 00:30:44,520 --> 00:30:48,200 Speaker 4: these things have, you know, evolved, appeared, and now take, 622 00:30:48,240 --> 00:30:48,960 Speaker 4: take me to hell. 623 00:30:50,040 --> 00:30:51,640 Speaker 2: All right, I don't know if this is okay? 624 00:30:51,680 --> 00:30:53,920 Speaker 1: All right, I'm, I'm... we're gonna dip into something 625 00:30:53,920 --> 00:30:56,240 Speaker 1: that feels hellish, and I'm gonna try to convince you that 626 00:30:56,200 --> 00:30:56,720 Speaker 2: maybe it's not. 627 00:30:56,840 --> 00:30:59,680 Speaker 1: How about that? Okay, all right, you ever heard of YouTubers? 628 00:31:00,040 --> 00:31:02,320 Speaker 1: Oh yeah. Oh, what do you think about VTubers? 629 00:31:04,200 --> 00:31:07,720 Speaker 3: I find them fascinating. Okay, I think their fans are weird, 630 00:31:07,920 --> 00:31:11,200 Speaker 3: but I think the technology is interesting. I think they 631 00:31:11,280 --> 00:31:15,440 Speaker 3: kind of are a natural conclusion of the, like, the 632 00:31:15,840 --> 00:31:19,120 Speaker 3: strain of live streaming. So like, if a human being 633 00:31:19,200 --> 00:31:21,320 Speaker 3: is forced to stream for seven hours a day to 634 00:31:21,600 --> 00:31:25,680 Speaker 3: capture audience on Twitch, why not have an animated avatar 635 00:31:25,800 --> 00:31:27,680 Speaker 3: do it while you can, like, take a bathroom break, 636 00:31:27,680 --> 00:31:29,240 Speaker 3: and it has, like, a loop function or something, you know? 637 00:31:29,280 --> 00:31:32,520 Speaker 3: Like, that's how I see it. Also, VTubers don't get 638 00:31:32,520 --> 00:31:35,280 Speaker 3: old. It just seems like a natural extension of 639 00:31:35,280 --> 00:31:35,680 Speaker 3: that to me. 640 00:31:35,800 --> 00:31:41,400 Speaker 1: Yeah, so we should probably explain what VTubers are. VTubers, basically, 641 00:31:41,440 --> 00:31:46,320 Speaker 1: it's live streaming on Twitch, usually. It started really before 642 00:31:46,360 --> 00:31:51,280 Speaker 1: Twitch was a big thing, so think YouTube. But VTubers, just 643 00:31:51,320 --> 00:31:57,920 Speaker 1: think of it as a motion-capture anime girl talking 644 00:31:58,040 --> 00:32:00,680 Speaker 1: to people as they watch. And I say anime girl 645 00:32:00,680 --> 00:32:03,400 Speaker 1: because not all of them are anime girls. Most of 646 00:32:03,440 --> 00:32:04,560 Speaker 1: them are anime girls. 647 00:32:04,800 --> 00:32:07,080 Speaker 3: I think the, I think the VTuber of the Year 648 00:32:07,120 --> 00:32:10,400 Speaker 3: award at the Streaming Awards was a peanut, like a 649 00:32:10,400 --> 00:32:11,200 Speaker 3: talking peanut. 650 00:32:11,240 --> 00:32:12,840 Speaker 2: Oh see, there we go. There's diversity. 651 00:32:12,880 --> 00:32:16,520 Speaker 1: So there is diversity, but most of them are anime 652 00:32:16,680 --> 00:32:21,640 Speaker 1: girls of varying age. There's... I don't necessarily want to 653 00:32:21,640 --> 00:32:23,440 Speaker 1: get into all of that, because that's a whole other, 654 00:32:23,760 --> 00:32:27,440 Speaker 1: that's a whole, whole other thing.
But I will say 655 00:32:27,480 --> 00:32:31,880 Speaker 1: that a large portion of them are, like... Okay, one 656 00:32:31,920 --> 00:32:37,040 Speaker 1: of the people I talked to, Cole Maria, who's, I'd 657 00:32:37,120 --> 00:32:39,920 Speaker 1: say, she, she's known. She's not as huge, I mean, 658 00:32:40,000 --> 00:32:42,560 Speaker 1: like, millions and millions and millions of people watching, as 659 00:32:42,600 --> 00:32:46,480 Speaker 1: some, but, you know, very healthy audience, more than enough 660 00:32:46,480 --> 00:32:49,680 Speaker 1: for that to be her main gig. So she's... I'm 661 00:32:49,720 --> 00:32:51,720 Speaker 1: just looking at a picture of her so I 662 00:32:51,760 --> 00:32:55,680 Speaker 1: can describe her. You know, she's blonde, she's got a 663 00:32:55,720 --> 00:33:00,160 Speaker 1: little, like, bat hair clip thing. Her whole theme is 664 00:33:00,560 --> 00:33:04,160 Speaker 1: that she's, she's a, she's immortal. Bat. She's a bat. Yeah, 665 00:33:04,160 --> 00:33:08,760 Speaker 1: she's bat girl. She's got, like, wings, all of her outfits. 666 00:33:09,960 --> 00:33:12,320 Speaker 3: She... there, I'm looking at it right now. She's a 667 00:33:12,320 --> 00:33:12,640 Speaker 3: lot of 668 00:33:12,560 --> 00:33:13,760 Speaker 2: cleavage, I'll say that. 669 00:33:13,840 --> 00:33:16,000 Speaker 3: Yeah, she's like a busty vampire. 670 00:33:16,600 --> 00:33:21,400 Speaker 2: Thank you, thank you, thank you. She's a vampire girl. 671 00:33:21,480 --> 00:33:25,080 Speaker 1: She's like six thousand, six hundred and nine years old 672 00:33:25,240 --> 00:33:29,520 Speaker 1: or something like that, and that's the setting. And yeah, 673 00:33:29,800 --> 00:33:32,200 Speaker 1: VTubers have this kind of interesting thing where it's 674 00:33:32,240 --> 00:33:36,000 Speaker 1: all about, uh, kayfabe. So kayfabe, like in 675 00:33:36,960 --> 00:33:37,680 Speaker 1: professional wrestling. 676 00:33:37,880 --> 00:33:39,680 Speaker 3: A weird question, I have a weird... Go for it. 677 00:33:39,760 --> 00:33:41,160 Speaker 3: Was she pregnant? 678 00:33:41,800 --> 00:33:42,080 Speaker 2: Huh? 679 00:33:42,400 --> 00:33:45,000 Speaker 3: She had, like, a phase where her avatar was pregnant 680 00:33:45,040 --> 00:33:47,920 Speaker 3: and she was doing, like, like, a pregnancy thing. Anyways, 681 00:33:48,000 --> 00:33:48,920 Speaker 3: we can skip over. 682 00:33:49,000 --> 00:33:51,320 Speaker 2: Okay, I missed that. Yeah, we never talked about that. 683 00:33:51,480 --> 00:33:54,720 Speaker 1: Yeah, and so I've, yeah, I've interviewed her, and I've, 684 00:33:54,960 --> 00:33:58,320 Speaker 1: I interviewed her as her, like, avatar. And this is, 685 00:33:58,360 --> 00:34:01,120 Speaker 1: like, the whole... This is a whole thing where it's 686 00:34:01,200 --> 00:34:05,640 Speaker 1: kind of like kayfabe in wrestling, where, same thing, you 687 00:34:05,720 --> 00:34:10,080 Speaker 1: know that there is a person behind the avatar, but 688 00:34:10,680 --> 00:34:13,360 Speaker 1: you just don't talk about it, and if you do 689 00:34:13,560 --> 00:34:17,080 Speaker 1: talk about it, they get super pissed off. Like, by "they," 690 00:34:17,160 --> 00:34:19,600 Speaker 1: I mean the fans in the chat. Like, that's a 691 00:34:19,680 --> 00:34:22,799 Speaker 1: really easy way to get banned.
But a thing that's 692 00:34:22,800 --> 00:34:24,919 Speaker 1: been happening, I mean, honestly, it's been happening for years, 693 00:34:25,000 --> 00:34:30,640 Speaker 1: is people will have offline events, and she put together 694 00:34:31,360 --> 00:34:36,480 Speaker 1: basically a mini music festival. They packed this venue 695 00:34:36,480 --> 00:34:39,840 Speaker 1: in Hollywood. It was like a thousand two hundred people. 696 00:34:40,800 --> 00:34:43,480 Speaker 1: Some of those tickets were easily upwards of one hundred 697 00:34:43,520 --> 00:34:45,640 Speaker 1: dollars if you wanted. But also, if you wanted to 698 00:34:45,640 --> 00:34:47,040 Speaker 1: be in there, you got to get the add-ons, 699 00:34:47,040 --> 00:34:48,239 Speaker 1: you know what I mean. You can't just go in 700 00:34:48,320 --> 00:34:50,239 Speaker 1: with your street clothes. You know, you got, you gotta 701 00:34:50,280 --> 00:34:52,360 Speaker 1: get, you know, you got to get your favorite character's 702 00:34:53,000 --> 00:34:55,400 Speaker 1: or your favorite VTuber's gear. You know, you got to go 703 00:34:55,440 --> 00:34:57,800 Speaker 1: in with the shirt. You got to get glow sticks, 704 00:34:57,840 --> 00:34:59,879 Speaker 1: and you got to get two glow sticks. I think 705 00:35:00,000 --> 00:35:01,560 Speaker 1: each one cost sixty dollars. You got to get the 706 00:35:01,560 --> 00:35:04,279 Speaker 1: glow sticks, the official ones, not some bootleg things you 707 00:35:04,320 --> 00:35:06,960 Speaker 1: bought off the street. And so people paid like 708 00:35:07,120 --> 00:35:10,239 Speaker 1: hundreds of dollars to be here, and it's just, it's 709 00:35:10,280 --> 00:35:14,719 Speaker 1: a stage. There was some real-life musicians, we have 710 00:35:14,800 --> 00:35:17,640 Speaker 1: to say this now, like IRL meatspace musicians on 711 00:35:17,680 --> 00:35:20,960 Speaker 1: the stage. Yeah, like physical, physical human-being musicians. Like, 712 00:35:21,000 --> 00:35:23,600 Speaker 1: there was somebody on guitar, somebody on bass, somebody on drums, 713 00:35:23,840 --> 00:35:27,040 Speaker 1: but all the singers were... It was just this parade 714 00:35:27,239 --> 00:35:30,799 Speaker 1: of, like, anime girls on the Jumbotron, and that's it. 715 00:35:31,360 --> 00:35:33,680 Speaker 2: And people were really excited about this. 716 00:35:34,360 --> 00:35:36,759 Speaker 3: There's a clip that went viral years ago of, like, 717 00:35:36,880 --> 00:35:38,880 Speaker 3: I think it was like a VTuber meetup in, 718 00:35:38,920 --> 00:35:42,720 Speaker 3: like, Indonesia or something, and the male VTuber, his avatar 719 00:35:42,880 --> 00:35:45,279 Speaker 3: never showed his eyes, because his hair would cover his eyes. 720 00:35:45,840 --> 00:35:48,840 Speaker 3: And in the VTuber performance, it was the first ever 721 00:35:49,040 --> 00:35:53,000 Speaker 3: eye reveal, and all the girls in the audience lost it. 722 00:35:53,080 --> 00:35:55,160 Speaker 3: He flipped his hair out for the first time. You 723 00:35:55,160 --> 00:35:59,640 Speaker 3: could see his eyes. Yeah, okay, yeah, yeah. So I'm 724 00:35:59,719 --> 00:36:02,880 Speaker 3: dying to know how this gets us back to AI. 725 00:36:03,200 --> 00:36:06,200 Speaker 1: I'm so... I'm happy, happy you asked that question. 726 00:36:06,360 --> 00:36:09,200 Speaker 1: So I asked a couple of people.
You know, I 727 00:36:09,280 --> 00:36:12,600 Speaker 1: go to this concert, and look, the music wasn't really 728 00:36:12,640 --> 00:36:16,160 Speaker 1: my bag, mostly because it's all, like, anime J-pop. 729 00:36:16,320 --> 00:36:17,800 Speaker 2: Yeah, it's like J-pop anime. 730 00:36:18,160 --> 00:36:20,760 Speaker 1: It's like, if you're really, really... if you, like, watch 731 00:36:20,800 --> 00:36:24,560 Speaker 1: anime for the theme songs... Yeah, exactly. Like, if that's 732 00:36:24,600 --> 00:36:26,680 Speaker 1: what you're watching it for, you'd be 733 00:36:26,680 --> 00:36:27,480 Speaker 2: right at home. For me? 734 00:36:27,640 --> 00:36:29,560 Speaker 1: Not really. There were some people who were doing some, 735 00:36:29,640 --> 00:36:31,880 Speaker 1: like, techno stuff. I was into that more. But anyway, 736 00:36:31,960 --> 00:36:36,400 Speaker 1: point being, I asked a few people about AI, because 737 00:36:36,719 --> 00:36:39,359 Speaker 1: we've already gotten, like, a layer of abstraction away from 738 00:36:39,360 --> 00:36:43,319 Speaker 1: a real-life, first off, in-person interaction, and then 739 00:36:43,400 --> 00:36:47,560 Speaker 1: you're, like, you know, watching streamers online, enough people think that's 740 00:36:47,440 --> 00:36:49,480 Speaker 2: already weird if it's just a human being streaming. 741 00:36:49,840 --> 00:36:52,800 Speaker 1: Now you're watching, one hundred percent, like, a cartoon anime 742 00:36:52,960 --> 00:36:57,480 Speaker 1: girl, whose... you don't even know what they look like. Okay, 743 00:36:57,520 --> 00:37:00,160 Speaker 1: so why don't we just have AI do that? And 744 00:37:00,520 --> 00:37:04,279 Speaker 1: basically everybody I talked to said, nah, we don't, we 745 00:37:04,280 --> 00:37:06,160 Speaker 1: don't want any AI anywhere near 746 00:37:06,000 --> 00:37:07,840 Speaker 2: this. Interesting, interesting. 747 00:37:07,920 --> 00:37:11,160 Speaker 1: I think there's, there's this company that basically makes this 748 00:37:11,560 --> 00:37:15,640 Speaker 1: program that kind of does all the motion capture for you. 749 00:37:15,719 --> 00:37:17,920 Speaker 1: So most of the VTubers, a lot of them have 750 00:37:18,040 --> 00:37:21,440 Speaker 1: kind of complicated motion cap... not motion capture, but, like, 751 00:37:21,719 --> 00:37:23,839 Speaker 1: you know, capture stuff, so they can move 752 00:37:23,880 --> 00:37:26,360 Speaker 1: their arms and stuff like that, move their hands, and 753 00:37:26,920 --> 00:37:30,560 Speaker 1: it'll relay that to the screen. There's some company that 754 00:37:30,680 --> 00:37:34,239 Speaker 1: makes a very slimmed-down phone version of that, so 755 00:37:34,440 --> 00:37:36,239 Speaker 1: all you got to do is turn on your selfie 756 00:37:36,320 --> 00:37:38,879 Speaker 1: cam on your phone and boom, you're streaming and you're 757 00:37:38,920 --> 00:37:40,240 Speaker 1: an anime boy or girl, whatever. 758 00:37:40,560 --> 00:37:42,799 Speaker 3: And, oh, I watched a demo of this actually, like, 759 00:37:42,880 --> 00:37:43,600 Speaker 3: literally yesterday. 760 00:37:43,640 --> 00:37:45,359 Speaker 2: Oh yeah, I know what you're talking about. Yeah, yeah. And 761 00:37:45,520 --> 00:37:47,080 Speaker 2: so the one
762 00:37:46,880 --> 00:37:48,920 Speaker 1: of the heads of the company just happens to be 763 00:37:49,239 --> 00:37:52,200 Speaker 1: at this event, I think they were sponsoring it, and 764 00:37:52,280 --> 00:37:53,839 Speaker 1: I'm just talking to him on the side, and I asked 765 00:37:53,920 --> 00:37:56,600 Speaker 1: him about, about AI, and he said, actually, you know, 766 00:37:56,680 --> 00:38:00,640 Speaker 1: we used to have a feature that would help 767 00:38:00,680 --> 00:38:03,160 Speaker 1: you animate your avatar, like, it would help, like, move 768 00:38:03,160 --> 00:38:05,080 Speaker 1: the eyes and stuff like that, because basically you need 769 00:38:05,120 --> 00:38:07,759 Speaker 1: to provide... you just provide, like, a JPEG or something 770 00:38:07,800 --> 00:38:09,719 Speaker 1: like that, and it'll help animate it for you. And 771 00:38:09,760 --> 00:38:11,640 Speaker 1: he said, yeah, you know, people didn't want to necessarily, 772 00:38:11,640 --> 00:38:15,080 Speaker 1: like, draw everything, so we had something that would help animate 773 00:38:14,680 --> 00:38:15,040 Speaker 2: it for you 774 00:38:15,560 --> 00:38:18,880 Speaker 1: with AI. People, like, hate AI so much here that 775 00:38:18,960 --> 00:38:21,840 Speaker 1: we just removed it. We had the feature, we just 776 00:38:21,840 --> 00:38:25,000 Speaker 1: took it out. Believe that. Yeah. No, I was... 777 00:38:25,239 --> 00:38:26,839 Speaker 3: It's funny, I was literally talking about 778 00:38:26,840 --> 00:38:29,479 Speaker 3: this at a dinner last night, where I was talking 779 00:38:29,480 --> 00:38:32,520 Speaker 3: to a friend who's... he uses AI professionally and likes it, 780 00:38:32,560 --> 00:38:36,000 Speaker 3: and I was just saying, like, the word AI has 781 00:38:36,080 --> 00:38:39,920 Speaker 3: become such a toxic name brand. Like, it's equivalent to, 782 00:38:39,920 --> 00:38:43,680 Speaker 3: like, asbestos at this point, yeah, or fentanyl. Like, like, 783 00:38:43,719 --> 00:38:45,920 Speaker 3: there are, there are uses for those things, but the 784 00:38:46,000 --> 00:38:50,840 Speaker 3: word is so toxic now that, like, most companies I 785 00:38:50,840 --> 00:38:53,120 Speaker 3: think are going to stop using AI as an advertising tool, 786 00:38:53,160 --> 00:38:55,560 Speaker 3: like, pretty soon, because people freak out. 787 00:38:55,800 --> 00:39:01,799 Speaker 1: So as far as I've seen, VTubers really do 788 00:39:01,840 --> 00:39:06,880 Speaker 1: not want AI anywhere near their community, you know what 789 00:39:06,880 --> 00:39:09,600 Speaker 1: I mean? It's a whole subculture, right? They just don't 790 00:39:09,640 --> 00:39:13,160 Speaker 1: want it anywhere near it. And there was a streamer 791 00:39:13,280 --> 00:39:15,560 Speaker 1: who kind of, like, jumped on the VTuber trend, 792 00:39:15,840 --> 00:39:18,000 Speaker 1: and he basically just, like, made a... he did, like, 793 00:39:18,040 --> 00:39:21,680 Speaker 1: an AI version of himself, and a lot of people 794 00:39:21,760 --> 00:39:24,319 Speaker 1: actually got pretty pissed off about it. There is something there, 795 00:39:25,080 --> 00:39:28,759 Speaker 1: yeah. VTuber fans like the fact that there is 796 00:39:28,800 --> 00:39:32,560 Speaker 1: a human. They like the fact that not everything is perfect. 797 00:39:32,640 --> 00:39:34,920 Speaker 1: They call it scuff.
Like every stream there's gonna be 798 00:39:35,000 --> 00:39:36,799 Speaker 1: something that goes wrong, you know, the motion capture goes 799 00:39:36,840 --> 00:39:39,319 Speaker 1: wrong, whatever. But they like that sort of thing. But 800 00:39:40,360 --> 00:39:43,640 Speaker 1: do they care about AI in an advertisement? Maybe, or not. 801 00:39:43,719 --> 00:39:45,359 Speaker 1: And this is what I was saying. Like, there are 802 00:39:45,400 --> 00:39:49,200 Speaker 1: people who really, really, really don't want AI used in 803 00:39:49,360 --> 00:39:52,680 Speaker 1: visual art, but music for them is just background sound, 804 00:39:53,560 --> 00:39:55,439 Speaker 1: and so maybe they don't care about that. 805 00:39:55,760 --> 00:39:57,960 Speaker 2: But then there's people who really, really truly care 806 00:39:57,840 --> 00:40:01,200 Speaker 1: about music. And, you know, if, there's, I don't know, 807 00:40:01,520 --> 00:40:04,600 Speaker 1: like, the apartment complex they live in uses AI in 808 00:40:04,640 --> 00:40:07,600 Speaker 1: the front, like, eh, whatever. Like, that's not what they 809 00:40:07,640 --> 00:40:09,360 Speaker 1: truly care about. And this is what I'm saying, is, 810 00:40:09,400 --> 00:40:12,120 Speaker 1: like, everybody's got a place where AI is off limits 811 00:40:12,160 --> 00:40:14,960 Speaker 1: for them. I'm not sure if that many people have 812 00:40:15,040 --> 00:40:18,560 Speaker 1: AI off limits in all areas 813 00:40:18,560 --> 00:40:19,200 Speaker 1: of their life. 814 00:40:19,280 --> 00:40:21,080 Speaker 3: I mean, there's also a version of this where, like, 815 00:40:21,120 --> 00:40:26,160 Speaker 3: AI is so all-encompassing that it's almost impossible to know. Also, yeah, 816 00:40:26,800 --> 00:40:29,080 Speaker 3: if you are someone who is a visual artist but 817 00:40:29,120 --> 00:40:31,239 Speaker 3: you actually don't know anything about music, you might hear 818 00:40:31,320 --> 00:40:33,760 Speaker 3: AI music and actually not even know that it's AI music. Definitely, 819 00:40:33,719 --> 00:40:35,800 Speaker 3: and the reverse is also possible. 820 00:40:36,320 --> 00:40:39,520 Speaker 1: I can tell with certain genres, definitely; with others, I 821 00:40:39,600 --> 00:40:43,359 Speaker 1: might have a tougher time, and certainly with art, like, 822 00:40:43,560 --> 00:40:45,680 Speaker 1: I probably have a tougher time with some things. 823 00:40:46,800 --> 00:40:49,200 Speaker 3: So do you think that that also applies to AI 824 00:40:49,280 --> 00:40:52,279 Speaker 3: relationships, and, like, AI companion bots? Like, do you 825 00:40:52,320 --> 00:40:54,640 Speaker 3: think that there's just going to be... I guess that 826 00:40:54,680 --> 00:40:56,319 Speaker 3: gets to the question I keep coming back to, which is, 827 00:40:56,360 --> 00:40:59,279 Speaker 3: like, how... what is the ceiling on this stuff? Like, 828 00:41:00,120 --> 00:41:02,120 Speaker 3: is it a thing where, like, quietly, like, just a 829 00:41:02,120 --> 00:41:03,680 Speaker 3: lot of people are going to be using them and we're 830 00:41:03,719 --> 00:41:05,759 Speaker 3: never going to know until they have some kind of 831 00:41:05,760 --> 00:41:06,719 Speaker 3: psychotic episode? 832 00:41:06,840 --> 00:41:08,080 Speaker 2: Well, okay, so.
833 00:41:09,719 --> 00:41:14,200 Speaker 1: I talked to, you know, a practicing therapist who's also, you know, 834 00:41:14,280 --> 00:41:16,280 Speaker 1: got a PhD in this stuff, and he's been working 835 00:41:16,920 --> 00:41:19,200 Speaker 1: in, you know, technology and mental health for a 836 00:41:19,200 --> 00:41:19,800 Speaker 1: long time. 837 00:41:20,920 --> 00:41:21,520 Speaker 2: Is it UC, or? 838 00:41:21,680 --> 00:41:25,239 Speaker 1: And one of the things I asked 839 00:41:25,320 --> 00:41:29,560 Speaker 1: him is, listen, what about the people who actually have 840 00:41:29,719 --> 00:41:32,359 Speaker 1: decided that they don't want to interact with a human being, 841 00:41:32,760 --> 00:41:36,640 Speaker 1: that they'd rather interact with the computer? And he was like, 842 00:41:37,360 --> 00:41:40,120 Speaker 1: that's not actually something we have an answer for. And 843 00:41:40,200 --> 00:41:41,799 Speaker 1: I'm not really trying to take a side here, I'm 844 00:41:41,800 --> 00:41:43,840 Speaker 1: just trying to, like, lay it out like it is. 845 00:41:43,560 --> 00:41:46,799 Speaker 1: There's this kind of, like, top-down "let's make fun 846 00:41:46,840 --> 00:41:50,120 Speaker 1: of the dummies that use the fake stuff" thing that, 847 00:41:50,280 --> 00:41:53,480 Speaker 1: you know, it's very shareable on, say, Bluesky or 848 00:41:53,520 --> 00:41:56,000 Speaker 1: Twitter or whatever, where you write something and it's like, 849 00:41:56,000 --> 00:41:57,759 Speaker 1: oh man, look what all these weird people are doing. 850 00:41:57,800 --> 00:41:59,000 Speaker 2: Ha ha ha. Isn't that funny? 851 00:41:59,000 --> 00:42:01,319 Speaker 3: Which is what happened with the, with the boyfriend, 852 00:42:01,320 --> 00:42:03,120 Speaker 3: the AI boyfriend subreddit. You know, it just became 853 00:42:03,160 --> 00:42:05,080 Speaker 3: a massive laughingstock across the internet. 854 00:42:05,200 --> 00:42:06,400 Speaker 2: Precisely, precisely. 855 00:42:06,480 --> 00:42:10,600 Speaker 1: But I think everybody knows somebody who has somebody in 856 00:42:10,640 --> 00:42:14,560 Speaker 1: their life who's a little bit awkward, or who just 857 00:42:14,800 --> 00:42:18,319 Speaker 1: isn't as adept at certain things, or maybe we are 858 00:42:18,400 --> 00:42:21,319 Speaker 1: that person. Maybe we grew out of it, 859 00:42:21,680 --> 00:42:24,319 Speaker 1: maybe we've gotten worse. Maybe we were super good with 860 00:42:24,400 --> 00:42:27,359 Speaker 1: relationships and just something happened and just things are more 861 00:42:27,360 --> 00:42:30,560 Speaker 1: difficult for us. And that's just a reality, you know 862 00:42:30,560 --> 00:42:33,560 Speaker 1: what I mean. But I think, also, when presented with 863 00:42:33,640 --> 00:42:38,239 Speaker 1: the alternative to dealing with the difficulty that is human interaction, 864 00:42:39,360 --> 00:42:42,680 Speaker 1: some people are choosing to not deal with the human, 865 00:42:43,200 --> 00:42:47,200 Speaker 1: and they've always chosen this, but now that choice, 866 00:42:48,120 --> 00:42:50,759 Speaker 1: the alternative, is more attractive, you know what I mean. 867 00:42:50,800 --> 00:42:51,560 Speaker 2: And so there are 868 00:42:51,480 --> 00:42:55,319 Speaker 1: genuinely people, like, say, with therapists, there are people who 869 00:42:55,360 --> 00:42:59,479 Speaker 1: would rather talk to a bot.
And there's all sorts 870 00:42:59,480 --> 00:43:02,440 Speaker 1: of reasons why we can say that therapy is good, whatever, whatever, 871 00:43:02,840 --> 00:43:08,640 Speaker 1: but the, the idea of therapy... A lot of 872 00:43:08,640 --> 00:43:11,360 Speaker 1: men have this problem. The idea of therapy is something 873 00:43:11,360 --> 00:43:14,200 Speaker 1: that is, like, not good. It is looked down upon, right? 874 00:43:14,239 --> 00:43:17,279 Speaker 1: And so talking to a friend is an option, which 875 00:43:17,320 --> 00:43:19,759 Speaker 1: is a great option. But okay, well, now we're just 876 00:43:19,760 --> 00:43:22,480 Speaker 1: getting closer and closer to, oh well, maybe I'll just 877 00:43:22,480 --> 00:43:24,080 Speaker 1: talk to a bot, or maybe I'll talk to a 878 00:43:24,080 --> 00:43:26,799 Speaker 1: bot that I also have sexual experiences with. But I mean, 879 00:43:26,920 --> 00:43:30,680 Speaker 1: so, like, this is... look at, look at OnlyFans, right? 880 00:43:31,239 --> 00:43:32,640 Speaker 1: So, sure, yeah. 881 00:43:32,480 --> 00:43:34,839 Speaker 3: Yeah, which is, like, the mass... you know, the, uh, 882 00:43:35,120 --> 00:43:36,680 Speaker 3: I'm sure you're going in this direction, but, like, yeah, 883 00:43:36,719 --> 00:43:40,160 Speaker 3: the, the major sort of moneymaker for OnlyFans 884 00:43:40,280 --> 00:43:42,759 Speaker 3: is the DMs. It is the interaction with the, with 885 00:43:42,840 --> 00:43:43,239 Speaker 3: the star. 886 00:43:43,480 --> 00:43:47,480 Speaker 1: Yeah, exactly. So, you know, OnlyFans. You can, you can 887 00:43:47,520 --> 00:43:50,560 Speaker 1: subscribe to somebody's OnlyFans. You can pay, you know, 888 00:43:50,600 --> 00:43:52,759 Speaker 1: your five dollars or your ten dollars or your twenty 889 00:43:52,800 --> 00:43:55,280 Speaker 1: dollars a month or whatever and get naked pictures of somebody. 890 00:43:56,120 --> 00:43:59,479 Speaker 1: And some people just do that. But like you said, 891 00:43:59,480 --> 00:44:03,040 Speaker 1: the real money is being able to talk directly to them. 892 00:44:03,520 --> 00:44:08,960 Speaker 1: But if you're talking to a major, like a well- 893 00:44:09,000 --> 00:44:14,120 Speaker 1: known OnlyFans creator, you're not talking to them. Like, Bhad 894 00:44:14,200 --> 00:44:18,240 Speaker 1: Bhabie is doing just, like, absolute ridiculous amounts of money 895 00:44:18,520 --> 00:44:24,480 Speaker 1: per year. And there are tons of people, undoubtedly, who 896 00:44:24,480 --> 00:44:27,960 Speaker 1: are DMing her, and DMing "her," you know, big scare 897 00:44:28,040 --> 00:44:32,400 Speaker 1: quotes around "her," right? And it's just not physically possible 898 00:44:32,440 --> 00:44:33,840 Speaker 1: for her to talk to all of them. And so 899 00:44:34,000 --> 00:44:37,719 Speaker 1: what happens is, this is outsourced, and right now it's 900 00:44:37,719 --> 00:44:39,480 Speaker 1: not AI, or at least it doesn't seem to be AI. 901 00:44:39,600 --> 00:44:41,600 Speaker 1: Most of that is being outsourced to the same place 902 00:44:41,640 --> 00:44:44,640 Speaker 1: a lot of things are being outsourced to: the Philippines. 903 00:44:45,280 --> 00:44:48,080 Speaker 1: But what we're also seeing, and I talked to this 904 00:44:48,120 --> 00:44:50,319 Speaker 1: guy Michael Beltran who wrote an article about this, and 905 00:44:50,360 --> 00:44:52,200 Speaker 1: he's talked to a lot of the chatters. 906 00:44:52,560 --> 00:44:53,920 Speaker 2: They're just called chatters, right.
907 00:44:53,960 --> 00:44:57,880 Speaker 1: These are the people who actually pretend to be the 908 00:44:57,960 --> 00:45:01,319 Speaker 1: person who you're talking to in DMs, and so they have, 909 00:45:02,239 --> 00:45:04,440 Speaker 1: you know, a whole protocol that they abide by. You know, 910 00:45:04,480 --> 00:45:07,360 Speaker 1: for example, if you text somebody, you know, one of 911 00:45:07,400 --> 00:45:09,279 Speaker 1: the OnlyFans models, and say, hey, what are you doing? 912 00:45:09,520 --> 00:45:11,880 Speaker 1: They'll reply to you, hey, yeah, I'm just eating pizza, 913 00:45:12,040 --> 00:45:15,000 Speaker 1: and you say, oh, let me see. They have pictures 914 00:45:15,000 --> 00:45:17,719 Speaker 1: on deck of this person who ate pizza. Like, oh, hey, 915 00:45:17,760 --> 00:45:19,520 Speaker 1: I just, I just... let me see. Should I say 916 00:45:19,640 --> 00:45:21,040 Speaker 1: I'm in the shower, or I just got out of the 917 00:45:21,040 --> 00:45:23,200 Speaker 1: shower? Boom, they got a picture of this person just getting 918 00:45:23,200 --> 00:45:24,640 Speaker 1: out of the shower. Like, they got everything. 919 00:45:24,680 --> 00:45:27,600 Speaker 3: So it seems real, but it's like a dialogue tree. 920 00:45:27,600 --> 00:45:28,120 Speaker 2: Exactly. 921 00:45:28,280 --> 00:45:31,480 Speaker 1: Yeah, it's like a video game, you know what I mean? Yeah, exactly, yeah. 922 00:45:31,520 --> 00:45:37,040 Speaker 1: And it feels... but occasionally they'll get found out. And, 923 00:45:37,200 --> 00:45:38,880 Speaker 1: you know, Michael told me about this. There was a 924 00:45:38,920 --> 00:45:43,360 Speaker 1: time when somebody got found out. Basically, they, the chatter, 925 00:45:43,640 --> 00:45:46,600 Speaker 1: used kind of some slang that's really only used in 926 00:45:46,600 --> 00:45:51,040 Speaker 1: the Philippines, okay? And the person is, like... basically, they said, 927 00:45:51,040 --> 00:45:54,759 Speaker 1: I have to go to the CR. CR means comfort room, 928 00:45:55,520 --> 00:46:00,680 Speaker 1: which is the bathroom. No American is going, like, you know, 929 00:46:00,760 --> 00:46:03,640 Speaker 1: the American blonde girl who you're, like, looking at naked, like, 930 00:46:03,680 --> 00:46:05,680 Speaker 1: she's not saying I got to go to the CR. 931 00:46:06,400 --> 00:46:10,000 Speaker 2: So she writes, you know, she writes that. 932 00:46:10,640 --> 00:46:13,280 Speaker 1: The guy on the other end is like, wait a second, 933 00:46:13,800 --> 00:46:16,400 Speaker 1: are you... what? Like, I guess he looked it up 934 00:46:16,480 --> 00:46:16,960 Speaker 1: or something like that. 935 00:46:17,000 --> 00:46:18,200 Speaker 2: Are you in the Philippines? Yeah? 936 00:46:18,239 --> 00:46:19,959 Speaker 1: And she says yeah. And, you know, the chatter, who's 937 00:46:19,960 --> 00:46:23,799 Speaker 1: a guy, by the way, says yeah, and, okay, 938 00:46:23,840 --> 00:46:24,400 Speaker 1: it just moves on. 939 00:46:25,000 --> 00:46:25,200 Speaker 3: Huh. 940 00:46:25,280 --> 00:46:26,640 Speaker 2: So what I'm saying 941 00:46:26,360 --> 00:46:29,520 Speaker 1: here is that a lot of people, and I think 942 00:46:29,880 --> 00:46:31,920 Speaker 1: a lot of people probably have an area in their 943 00:46:31,960 --> 00:46:35,360 Speaker 1: life where they're like this, where, again, it's like wrestling. 944 00:46:35,719 --> 00:46:38,680 Speaker 1: Like, you know, it's not real, but it's so entertaining 945 00:46:38,680 --> 00:46:42,680 Speaker 1: to watch that.
You're like, you're cool with that, the 946 00:46:42,719 --> 00:46:44,239 Speaker 1: consumption of that, you know what I mean. And so 947 00:46:44,280 --> 00:46:46,880 Speaker 1: there are people who are paying a lot of money 948 00:46:47,440 --> 00:46:51,239 Speaker 1: to interact with a model, and they know that it's 949 00:46:51,360 --> 00:46:55,080 Speaker 1: actually not... like, Newtonian physics will not allow for 950 00:46:55,160 --> 00:46:57,400 Speaker 1: them to be talking to this person. It's just not 951 00:46:57,560 --> 00:47:01,239 Speaker 1: possible on that scale. They know it's got to be outsourced. 952 00:47:01,360 --> 00:47:03,600 Speaker 1: Pretty soon that will be an AI bot. But they're 953 00:47:03,719 --> 00:47:06,680 Speaker 1: still paying money. It's just like, bro, stop paying money. But 954 00:47:06,719 --> 00:47:10,680 Speaker 1: they'll still pay, because they're lonely, for the fantasy. Yeah, 955 00:47:10,719 --> 00:47:13,240 Speaker 1: don't underestimate how much we'll spend for a fantasy. 956 00:47:13,600 --> 00:47:16,600 Speaker 3: Well, once again, I need to sit with that for 957 00:47:16,640 --> 00:47:19,759 Speaker 3: a second while we go to break, so we'll be 958 00:47:19,880 --> 00:47:30,480 Speaker 3: right back. That CR anecdote, like, actually floored me. I 959 00:47:30,520 --> 00:47:33,000 Speaker 3: think you are right, it is sort of inevitable that, 960 00:47:33,200 --> 00:47:36,640 Speaker 3: especially, like, a lot of these subcultural spaces, if we would, 961 00:47:36,760 --> 00:47:40,840 Speaker 3: if we can even call OnlyFans fandom subcultural, or 962 00:47:40,880 --> 00:47:45,160 Speaker 3: maybe it's actually simpler to say, like, parasociality is already 963 00:47:45,160 --> 00:47:47,520 Speaker 3: so abstracted, right? So, like, if you have a parasocial 964 00:47:47,520 --> 00:47:51,480 Speaker 3: relationship with a streamer, that's already an abstraction. Abstractions tend 965 00:47:51,480 --> 00:47:54,520 Speaker 3: to abstract further. It makes sense that, like, there's someone 966 00:47:54,640 --> 00:48:00,279 Speaker 3: paying to, now, currently, talk to a Filipino man 967 00:48:00,280 --> 00:48:03,280 Speaker 3: roleplaying as Bhad Bhabie or whatever, but in the future 968 00:48:03,320 --> 00:48:06,319 Speaker 3: it'll just be an AI, and you probably won't care, 969 00:48:06,400 --> 00:48:10,440 Speaker 3: because, like, the human connection of, like, giving money to 970 00:48:10,560 --> 00:48:13,360 Speaker 3: her or getting her photographs or whatever is enough for you. 971 00:48:14,280 --> 00:48:16,799 Speaker 3: I guess my question mark, though, with, like, the sort 972 00:48:16,840 --> 00:48:21,520 Speaker 3: of wider adoption, is at what point does all of 973 00:48:21,560 --> 00:48:24,440 Speaker 3: that start to impact how we interact 974 00:48:24,000 --> 00:48:24,520 Speaker 2: with each other. 975 00:48:24,719 --> 00:48:27,440 Speaker 3: Now there's a whole wave of articles being, like, you know, 976 00:48:28,120 --> 00:48:31,040 Speaker 3: birth rates are down and people are dating computers and 977 00:48:31,080 --> 00:48:33,520 Speaker 3: blah blah blah blah, and, like, that is the thing 978 00:48:33,560 --> 00:48:36,359 Speaker 3: that I've just never, I've never been able to tell 979 00:48:36,400 --> 00:48:38,840 Speaker 3: if that's a moral panic or, like, a genuine concern.
980 00:48:39,280 --> 00:48:42,080 Speaker 3: And what I will say is, like, social media has 981 00:48:42,120 --> 00:48:44,360 Speaker 3: absolutely impacted the way we communicate, yeah, and the 982 00:48:44,360 --> 00:48:46,480 Speaker 3: way that we experience the world. So it is sort 983 00:48:46,520 --> 00:48:48,399 Speaker 3: of reasonable to assume that AI will do the same, 984 00:48:48,440 --> 00:48:51,120 Speaker 3: but maybe it won't. That's... so that's where I'm kind 985 00:48:51,120 --> 00:48:51,920 Speaker 3: of torn. 986 00:48:52,560 --> 00:48:57,160 Speaker 1: Yeah, I mean, of course it will. In this... I'm 987 00:48:57,160 --> 00:48:59,799 Speaker 1: gonna sound like an AI booster here, and I'm not. 988 00:49:00,320 --> 00:49:03,120 Speaker 1: But, you know, any kind of technological advance is going 989 00:49:03,160 --> 00:49:04,920 Speaker 1: to change the way that we do stuff. You know, 990 00:49:05,000 --> 00:49:08,279 Speaker 1: like, the printing press changed how we communicate with people, 991 00:49:08,320 --> 00:49:10,640 Speaker 1: because all of a sudden, it's now possible to... you 992 00:49:10,719 --> 00:49:13,319 Speaker 1: say something, and then somebody one hundred years later in 993 00:49:13,360 --> 00:49:17,000 Speaker 1: another part of the world can know literally what you said, 994 00:49:17,080 --> 00:49:20,560 Speaker 1: and so you start thinking in terms of text, whereas 995 00:49:20,560 --> 00:49:24,080 Speaker 1: you didn't really do that necessarily before. Sure. AI, yeah, 996 00:49:24,120 --> 00:49:27,960 Speaker 1: it's, it's fundamentally different. And sometimes the difference is just 997 00:49:28,000 --> 00:49:30,400 Speaker 1: the speed at which stuff happens and the scale at 998 00:49:30,440 --> 00:49:33,399 Speaker 1: which stuff can happen. But yeah, I think it's, it's 999 00:49:33,520 --> 00:49:38,960 Speaker 1: genuinely going to change how we interact with each other. 1000 00:49:39,040 --> 00:49:40,600 Speaker 1: There's, there's no question about that. 1001 00:49:40,719 --> 00:49:41,080 Speaker 2: I think. 1002 00:49:41,280 --> 00:49:44,440 Speaker 1: Obviously there's some opportunities. You know, this, this therapist that 1003 00:49:44,480 --> 00:49:47,880 Speaker 1: I was talking to was reminding me, and anybody listening, 1004 00:49:48,000 --> 00:49:51,480 Speaker 1: that most people can't get therapy. 1005 00:49:52,320 --> 00:49:53,359 Speaker 2: Yeah, a lot of people have. 1006 00:49:53,360 --> 00:49:55,120 Speaker 3: I heard the same thing from therapists about this. 1007 00:49:55,280 --> 00:50:00,560 Speaker 1: Yeah, it's not possible, even if it was affordable. What 1008 00:50:00,640 --> 00:50:03,239 Speaker 1: if you are going through a real crisis at ten 1009 00:50:03,320 --> 00:50:06,800 Speaker 1: pm, right? A lot of people go through a crisis 1010 00:50:06,800 --> 00:50:08,480 Speaker 1: at ten pm. I mean, the holidays are coming up. 1011 00:50:08,520 --> 00:50:11,080 Speaker 1: Like, you have any idea how much really bad stuff 1012 00:50:11,120 --> 00:50:17,000 Speaker 1: happens on December twenty-fourth at, like, ten pm? Right, exactly. 1013 00:50:17,040 --> 00:50:21,640 Speaker 1: Your therapist is not on call. And wouldn't it be 1014 00:50:21,640 --> 00:50:26,520 Speaker 1: better than nothing to have something, at least, to talk 1015 00:50:26,600 --> 00:50:29,400 Speaker 1: to, that is better than nothing?
Now, there are situations 1016 00:50:29,400 --> 00:50:32,359 Speaker 1: in which it has been worse than nothing, absolutely, 1017 00:50:32,920 --> 00:50:35,440 Speaker 1: but there are people trying to... yes, go ahead. 1018 00:50:35,520 --> 00:50:38,320 Speaker 3: So I have a big list here that I decided 1019 00:50:38,360 --> 00:50:41,560 Speaker 3: I was not going to read through. But our researcher 1020 00:50:41,600 --> 00:50:46,040 Speaker 3: Adam did build a timeline of, like, AI-based suicides, 1021 00:50:46,880 --> 00:50:49,160 Speaker 3: and that is... I think that is something that we, 1022 00:50:49,320 --> 00:50:51,680 Speaker 3: like, sort of have to make very clear here, that, like, 1023 00:50:52,280 --> 00:50:55,799 Speaker 3: if there's a, let's say, a small but growing chunk 1024 00:50:55,800 --> 00:50:59,800 Speaker 3: of the population that is using bots for intimacy, for companionship, 1025 00:51:00,520 --> 00:51:06,120 Speaker 3: there is a minority within that minority that has been 1026 00:51:06,200 --> 00:51:09,279 Speaker 3: led to either, you know, ChatGPT-induced psychosis or 1027 00:51:09,760 --> 00:51:13,040 Speaker 3: full-on, you know, acts of self-harm due to 1028 00:51:13,360 --> 00:51:16,319 Speaker 3: the sycophantic nature of a lot of these services. That 1029 00:51:16,480 --> 00:51:18,360 Speaker 3: is a reality of where we're at right now. That 1030 00:51:18,400 --> 00:51:20,880 Speaker 3: isn't even, like, a hypothetical, that's just happening. 1031 00:51:21,239 --> 00:51:24,520 Speaker 1: Yeah. And I don't think there's going to be almost 1032 00:51:24,520 --> 00:51:26,160 Speaker 1: anything that we'll be able to point to that we 1033 00:51:26,200 --> 00:51:31,399 Speaker 1: can say this is one hundred percent good without any caveats, 1034 00:51:31,520 --> 00:51:33,640 Speaker 1: you know what I mean. Like, every time we see 1035 00:51:33,680 --> 00:51:36,239 Speaker 1: something like that, which we've seen quite a few of those, 1036 00:51:36,760 --> 00:51:38,799 Speaker 1: the boosters will jump out and say, well, we don't 1037 00:51:38,800 --> 00:51:43,960 Speaker 1: know how many people have been saved by AI, like, talking 1038 00:51:43,680 --> 00:51:44,920 Speaker 2: them through a difficult situation. 1039 00:51:45,000 --> 00:51:47,600 Speaker 1: We don't know how many people may have had a doctor 1040 00:51:47,680 --> 00:51:50,920 Speaker 1: misdiagnose something and the AI figured something out that the 1041 00:51:50,960 --> 00:51:51,720 Speaker 1: doctor had missed. 1042 00:51:52,120 --> 00:51:54,000 Speaker 2: You're right, we don't 1043 00:51:53,760 --> 00:51:57,600 Speaker 1: know. But we do know that this is causing some 1044 00:51:57,600 --> 00:52:01,719 Speaker 1: real harm, and that's just the landscape now, that's just 1045 00:52:01,719 --> 00:52:02,239 Speaker 1: where we're at. 1046 00:52:03,680 --> 00:52:09,440 Speaker 3: And also, like, the AI-induced self-harm is not 1047 00:52:10,239 --> 00:52:15,000 Speaker 3: that different from the stories that I was reporting about 1048 00:52:15,120 --> 00:52:18,600 Speaker 3: at the beginning of the social media age.
Fascinatingly enough, 1049 00:52:18,640 --> 00:52:22,040 Speaker 3: the trend that led to the ice bucket challenge 1050 00:52:22,080 --> 00:52:25,480 Speaker 3: started as a drinking game between, like, British and Australian 1051 00:52:25,560 --> 00:52:29,520 Speaker 3: lads called neck nominations, and they would basically, like, they 1052 00:52:29,520 --> 00:52:33,680 Speaker 3: would nominate each other over the Facebook news feed to, 1053 00:52:33,880 --> 00:52:37,759 Speaker 3: like, drink increasingly large amounts of alcohol. Gosh. And it 1054 00:52:37,840 --> 00:52:39,640 Speaker 3: was linked to, like, a bunch of deaths, because, like, 1055 00:52:39,680 --> 00:52:42,239 Speaker 3: these guys would flick on their camera and they would 1056 00:52:42,239 --> 00:52:45,400 Speaker 3: go live and they would drink until they died, and 1057 00:52:45,400 --> 00:52:48,200 Speaker 3: then they would nominate, you know, after being nominated. And 1058 00:52:48,239 --> 00:52:50,319 Speaker 3: even before that, you know, you have the entire wave 1059 00:52:50,360 --> 00:52:53,920 Speaker 3: of, like, 4chan "an hero" stuff, right, where, like, 1060 00:52:54,160 --> 00:52:56,120 Speaker 3: a 4chan user is egging on another 4chan 1061 00:52:56,239 --> 00:53:00,839 Speaker 3: user to commit suicide. All these behaviors are not new. 1062 00:53:01,080 --> 00:53:03,680 Speaker 3: Like, the act of, you know, a computer interface, 1063 00:53:03,719 --> 00:53:07,680 Speaker 3: an abstracted relationship, causing someone... egging someone into self-harm 1064 00:53:07,840 --> 00:53:10,840 Speaker 3: is not new. It's the automation, I think, of the 1065 00:53:10,920 --> 00:53:14,879 Speaker 3: AI that is, that is rightfully scaring people, because there 1066 00:53:14,920 --> 00:53:19,400 Speaker 3: is no... I mean, I just, you know, you would 1067 00:53:19,400 --> 00:53:21,719 Speaker 3: hope that... no, because I can't say that, like, in 1068 00:53:21,760 --> 00:53:24,080 Speaker 3: the social media version, yeah, there is no off switch either. 1069 00:53:24,360 --> 00:53:26,400 Speaker 3: The mob kind of controls it. So it's, it's just, 1070 00:53:26,440 --> 00:53:28,640 Speaker 3: I guess, it's just different. It's just, it's different, and 1071 00:53:29,040 --> 00:53:30,520 Speaker 3: we don't know enough about it yet. 1072 00:53:30,840 --> 00:53:33,760 Speaker 1: This is gonna sound kind of silly, but Steve Jobs 1073 00:53:33,760 --> 00:53:36,799 Speaker 1: once called the computer a bicycle for the mind, and 1074 00:53:36,840 --> 00:53:39,640 Speaker 1: I think that's a really interesting metaphor, because, like, if 1075 00:53:39,640 --> 00:53:41,120 Speaker 1: you think about it, like, what does that even mean? 1076 00:53:41,560 --> 00:53:44,279 Speaker 1: Like, I guess, theoretically, like, it means, well, like, it 1077 00:53:44,320 --> 00:53:46,680 Speaker 1: means, like, okay, you have to put some effort into it. 1078 00:53:46,920 --> 00:53:48,759 Speaker 1: Like, it's not a car, you know what I mean. 1079 00:53:48,800 --> 00:53:52,080 Speaker 1: Like, you do something, like, you push down on the pedal, 1080 00:53:52,640 --> 00:53:57,640 Speaker 1: and it amplifies what you've done. A bicycle, yeah, it's 1081 00:53:57,680 --> 00:53:59,480 Speaker 1: like, it amplifies what you've done, and it allows you 1082 00:53:59,560 --> 00:54:02,719 Speaker 1: to do things. Like, say, you know, if, if you'd 1083 00:54:02,800 --> 00:54:04,799 Speaker 1: never had a bicycle, you probably would just
1084 00:54:04,719 --> 00:54:06,560 Speaker 2: live in your little town and you never go anywhere. 1085 00:54:06,560 --> 00:54:09,080 Speaker 1: But you got a bicycle. You know, next town's, like, 1086 00:54:09,120 --> 00:54:11,640 Speaker 1: ten miles over, I heard they got a good restaurant. 1087 00:54:11,680 --> 00:54:15,240 Speaker 1: Like, I'll go. I wouldn't have gone otherwise, you see. 1088 00:54:15,360 --> 00:54:18,160 Speaker 1: You know, we have a car right now. Like, we 1089 00:54:18,200 --> 00:54:20,560 Speaker 1: had a bicycle, we thought we had cars? No, no, no, no, 1090 00:54:20,840 --> 00:54:24,040 Speaker 1: we had bicycles. We now have cars. And the kinds 1091 00:54:24,080 --> 00:54:26,440 Speaker 1: of things you can do with the car are fundamentally... 1092 00:54:26,440 --> 00:54:29,279 Speaker 1: For the mind, for the mind, right. Yeah, forgive the metaphor. Yeah, 1093 00:54:29,280 --> 00:54:31,000 Speaker 1: we now have cars for the mind. And so, like, 1094 00:54:31,080 --> 00:54:32,879 Speaker 1: now the kinds of things you used to be able 1095 00:54:32,880 --> 00:54:34,880 Speaker 1: to do, which is, you know, like, go to the 1096 00:54:34,920 --> 00:54:37,600 Speaker 1: next town over, you could do... you can be across 1097 00:54:37,600 --> 00:54:42,000 Speaker 1: the country in, like, three days. It's nuts. Like, 1098 00:54:42,040 --> 00:54:43,920 Speaker 1: you'd never do that if you didn't have that. But 1099 00:54:43,960 --> 00:54:46,640 Speaker 1: also, like, if you run into somebody with a bicycle, 1100 00:54:47,239 --> 00:54:51,399 Speaker 1: it sucks, but you're fine, usually. You run into somebody 1101 00:54:51,400 --> 00:54:55,719 Speaker 1: with a car, we have, we have a different situation. 1102 00:54:55,960 --> 00:54:58,960 Speaker 1: And so I think sometimes it's just, like, the scale 1103 00:54:59,080 --> 00:55:02,880 Speaker 1: and the speed. Even if the underlying technology was kind 1104 00:55:02,880 --> 00:55:05,800 Speaker 1: of the same thing, yeah, it's, it's, it's fundamentally different. 1105 00:55:06,160 --> 00:55:09,120 Speaker 3: I want to kind of, like, land back on this 1106 00:55:09,239 --> 00:55:12,920 Speaker 3: point you made earlier about, you know, people... there are, 1107 00:55:12,920 --> 00:55:15,800 Speaker 3: there are some people, many people, let's say, who trust 1108 00:55:15,800 --> 00:55:19,920 Speaker 3: a computer over other people. And there are, I think, 1109 00:55:20,320 --> 00:55:24,760 Speaker 3: also a lot of people who have a very hard 1110 00:55:24,840 --> 00:55:28,960 Speaker 3: time dealing with the abstractions that are inherent 1111 00:55:28,600 --> 00:55:29,600 Speaker 2: with using a computer. 1112 00:55:29,920 --> 00:55:32,759 Speaker 3: Yeah, the people who can't stop posting on Twitter even 1113 00:55:32,760 --> 00:55:35,359 Speaker 3: though they're getting fired from their job. The people who, 1114 00:55:35,640 --> 00:55:39,239 Speaker 3: you know, livestream themselves doing stupid things for attention. The 1115 00:55:39,280 --> 00:55:42,040 Speaker 3: trolls that you see, you know, kind of living in 1116 00:55:42,120 --> 00:55:44,279 Speaker 3: absolute misery, because, like, they want to just, like, hurt 1117 00:55:44,280 --> 00:55:46,799 Speaker 3: other people.
You know, like, these, these, these people have 1118 00:55:46,840 --> 00:55:52,600 Speaker 3: always existed, and I think AI has just created, like, 1119 00:55:52,680 --> 00:55:56,839 Speaker 3: a new, a new way for, for people to hurt 1120 00:55:56,880 --> 00:55:59,480 Speaker 3: themselves or other people. But then I also think that 1121 00:55:59,520 --> 00:56:01,480 Speaker 3: there's, like, a huge amount of people who have trouble 1122 00:56:01,520 --> 00:56:04,520 Speaker 3: with, like, determining what's real and what's not real on 1123 00:56:04,560 --> 00:56:07,040 Speaker 3: the Internet, and, like, AI has just exacerbated that. So, 1124 00:56:07,080 --> 00:56:10,880 Speaker 3: like, it's interesting, like, as "new," quote unquote, as this 1125 00:56:10,920 --> 00:56:14,960 Speaker 3: technology feels and seems, like, a lot of the things 1126 00:56:14,960 --> 00:56:17,719 Speaker 3: that it's putting a spotlight on are not new, and, 1127 00:56:17,760 --> 00:56:20,920 Speaker 3: like, AI relationships, as you said, are not that different 1128 00:56:21,520 --> 00:56:25,120 Speaker 3: from how people were having sex with a computer before. 1129 00:56:25,880 --> 00:56:27,680 Speaker 1: Well, I mean, you know, think of dating apps, think 1130 00:56:27,719 --> 00:56:30,239 Speaker 1: of... and before dating apps really got to be a thing, 1131 00:56:30,320 --> 00:56:32,879 Speaker 1: like, think about people, like, meeting each other in Warcraft 1132 00:56:32,880 --> 00:56:34,560 Speaker 1: and how weird that was. 1133 00:56:35,120 --> 00:56:35,560 Speaker 3: Exactly. 1134 00:56:35,719 --> 00:56:39,200 Speaker 1: It's not that weird, man. It's not that weird now, 1135 00:56:39,400 --> 00:56:40,680 Speaker 1: but it wasn't even back then. 1136 00:56:41,680 --> 00:56:43,200 Speaker 3: No, no, it wasn't that weird. 1137 00:56:43,760 --> 00:56:46,800 Speaker 1: The really interesting thing about this is, this is maybe, 1138 00:56:46,840 --> 00:56:50,480 Speaker 1: like, the one bipartisan issue, you know what I mean, 1139 00:56:50,520 --> 00:56:54,399 Speaker 1: which is, like, everybody agrees... on AI, or, well, 1140 00:56:54,520 --> 00:56:58,319 Speaker 1: like, technology in general. Like, it used to be that, 1141 00:56:58,480 --> 00:57:00,400 Speaker 1: oh man, the future is going to be cool. I 1142 00:57:00,400 --> 00:57:02,520 Speaker 1: can't wait to see what's gonna come out. We're all 1143 00:57:02,560 --> 00:57:05,400 Speaker 1: gonna have VR headsets. Like, that's fun, and, you know, 1144 00:57:05,480 --> 00:57:08,640 Speaker 1: we'd have our dystopian movies or whatever. But in general, 1145 00:57:08,680 --> 00:57:10,759 Speaker 1: I think people were kind of excited about it. We 1146 00:57:11,280 --> 00:57:15,520 Speaker 1: have this stuff that was literal science fiction even 1147 00:57:15,560 --> 00:57:19,960 Speaker 1: five years ago, and now a lot of influential people, 1148 00:57:19,960 --> 00:57:24,160 Speaker 1: shall I say, hate it. But, like, there's a study here. 1149 00:57:24,680 --> 00:57:27,520 Speaker 1: There's a study that's found that eighty-five percent of 1150 00:57:27,760 --> 00:57:31,240 Speaker 1: Gen Z agree that they spend too much time online. 1151 00:57:31,280 --> 00:57:33,200 Speaker 1: Eighty-five percent of Gen Z agree that they spend 1152 00:57:33,200 --> 00:57:34,320 Speaker 1: too much time online. 1153 00:57:34,720 --> 00:57:36,320 Speaker 3: I would agree with that based on what I've seen 1154 00:57:36,320 --> 00:57:38,440 Speaker 3: Gen Z do. Yeah, no, I would agree with that.
1155 00:57:38,640 --> 00:57:41,800 Speaker 1: Eighty-four percent strongly or somewhat agree that in-person 1156 00:57:41,840 --> 00:57:47,960 Speaker 1: relationships are more valuable than digital relationships. All they have 1157 00:57:48,000 --> 00:57:50,560 Speaker 1: to do is put down the phone. All they have 1158 00:57:50,600 --> 00:57:52,600 Speaker 1: to do is put down the phone. But that being said, 1159 00:57:52,760 --> 00:57:55,840 Speaker 1: like, I actually think that some of the most, frankly, 1160 00:57:55,920 --> 00:57:59,840 Speaker 1: vulnerable people to, like, the AI slop, to being addicted 1161 00:57:59,840 --> 00:58:02,080 Speaker 1: to the phone, being addicted to Facebook 1162 00:58:02,120 --> 00:58:03,960 Speaker 2: certainly, that's boomers, man. 1163 00:58:05,040 --> 00:58:08,240 Speaker 3: Yeah, older people in particular. Yeah, yeah, I mean, yeah, 1164 00:58:08,280 --> 00:58:11,880 Speaker 3: like, the lonelier, the older, the more media-illiterate. 1165 00:58:11,480 --> 00:58:12,960 Speaker 2: For sure. Yeah, and. 1166 00:58:14,400 --> 00:58:16,480 Speaker 1: You know, and there are people who are, you know... 1167 00:58:16,520 --> 00:58:21,000 Speaker 1: there's, you know, dumb phones have become more popular. You know, 1168 00:58:21,040 --> 00:58:23,600 Speaker 1: we've got, like, Light Phone, all these other sorts of things. 1169 00:58:23,760 --> 00:58:26,800 Speaker 1: First off, a lot of those are expensive. But I 1170 00:58:26,800 --> 00:58:28,440 Speaker 1: think there's, like, a reason for that: it's 1171 00:58:28,520 --> 00:58:34,320 Speaker 1: kind of, like, aimed at somebody who is wealthy enough 1172 00:58:34,800 --> 00:58:38,680 Speaker 1: to unplug. And I kind of feel like that's where 1173 00:58:38,720 --> 00:58:42,600 Speaker 1: we're going. We actually are kind of creating this sort 1174 00:58:42,600 --> 00:58:46,920 Speaker 1: of elitist, not really my words here, but I think 1175 00:58:47,600 --> 00:58:49,120 Speaker 1: a certain class of people who are going to be 1176 00:58:49,200 --> 00:58:53,080 Speaker 1: viewed as elitists, who have the money, and yeah, really 1177 00:58:53,120 --> 00:58:55,440 Speaker 1: have the money and the access to be able to 1178 00:58:55,920 --> 00:58:59,560 Speaker 1: not use AI, to, like, insist that the music 1179 00:58:59,600 --> 00:59:03,760 Speaker 1: played at the restaurant is not AI, because this is 1180 00:59:03,800 --> 00:59:06,800 Speaker 1: a high-class place, right? How dare you play AI 1181 00:59:06,960 --> 00:59:10,440 Speaker 1: music in the background? How dare you use AI to 1182 00:59:10,760 --> 00:59:14,280 Speaker 1: generate photos of the food? I can't believe that you 1183 00:59:14,320 --> 00:59:17,080 Speaker 1: didn't take real pictures of this, right? And I kind 1184 00:59:17,120 --> 00:59:19,280 Speaker 1: of think that's where we're going. And, you know, tell... 1185 00:59:19,400 --> 00:59:21,920 Speaker 1: telling people to get off the phone, like, read real news, 1186 00:59:21,960 --> 00:59:24,880 Speaker 1: stop getting news from social media. Have you seen the 1187 00:59:24,880 --> 00:59:27,720 Speaker 1: price of a New York Times subscription? Have you seen 1188 00:59:27,720 --> 00:59:33,280 Speaker 1: the price of, like, one newsletter, one Substack? It's like, 1189 00:59:33,320 --> 00:59:36,680 Speaker 1: it's just not economically feasible for the vast majority of people. 1190 00:59:36,720 --> 00:59:40,080 Speaker 1: And so everybody, the vast majority of us,
1191 00:59:40,080 --> 00:59:42,919 Speaker 1: I actually think are going to start getting dragged down 1192 00:59:43,000 --> 00:59:45,360 Speaker 1: into the quote unquote slop, and there will be a 1193 00:59:45,440 --> 00:59:47,960 Speaker 1: layer of people who can choose to unplug from that 1194 00:59:48,040 --> 00:59:50,920 Speaker 1: and choose to disengage with that. But again, that's not, 1195 00:59:51,080 --> 00:59:53,120 Speaker 1: just like you were saying with relationships, that's not, like, 1196 00:59:53,200 --> 00:59:57,200 Speaker 1: an AI thing. That's exposing something that already existed. We 1197 00:59:57,360 --> 01:00:00,680 Speaker 1: already had a class problem in society, and, and the 1198 01:00:00,720 --> 01:00:02,720 Speaker 1: problem is we're, like, thinking that we can make an 1199 01:00:02,720 --> 01:00:05,840 Speaker 1: app to fix that, or, like, surprised when the app 1200 01:00:05,880 --> 01:00:06,560 Speaker 1: doesn't fix it. 1201 01:00:06,560 --> 01:00:08,680 Speaker 2: It's like, no, bro, this is capitalism. I don't know 1202 01:00:08,720 --> 01:00:09,120 Speaker 2: what you want. 1203 01:00:09,960 --> 01:00:13,800 Speaker 3: But what people need to do is subscribe to our podcasts, yes, 1204 01:00:14,320 --> 01:00:15,840 Speaker 3: as a way to defeat slop. 1205 01:00:16,320 --> 01:00:17,200 Speaker 2: Fix you. 1206 01:00:17,320 --> 01:00:21,360 Speaker 3: That will fix it, because if you, if you support us 1207 01:00:21,480 --> 01:00:27,040 Speaker 1: specifically, specifically, specifically kill switch, like that, those are the 1208 01:00:27,040 --> 01:00:29,000 Speaker 1: two keys to fix everything. Exactly. 1209 01:00:29,280 --> 01:00:31,560 Speaker 3: And we're good. No, but I think you're one hundred 1210 01:00:31,560 --> 01:00:37,160 Speaker 3: percent right. We're already seeing the beginnings of this, and really, 1211 01:00:37,160 --> 01:00:39,160 Speaker 3: in a lot of ways, the only hope for the 1212 01:00:39,200 --> 01:00:42,480 Speaker 3: AI industry to not completely implode, which, by the way, 1213 01:00:42,560 --> 01:00:45,280 Speaker 3: I should say, will not remove AI from our lives. 1214 01:00:45,280 --> 01:00:47,560 Speaker 3: The dot-com bubble burst and we still have dot-coms. 1215 01:00:48,760 --> 01:00:51,760 Speaker 3: It just, it just, it just consolidates things. Uh, the 1216 01:00:51,800 --> 01:00:54,480 Speaker 3: stock market crashes in the twenties, we still have a 1217 01:00:54,480 --> 01:00:57,520 Speaker 3: stock market. It just becomes a lot, you know, harder 1218 01:00:57,520 --> 01:01:00,880 Speaker 3: and more competitive. Uh, there's a big segment of people 1219 01:01:00,880 --> 01:01:02,680 Speaker 3: out there who want to believe that, like, the AI 1220 01:01:02,720 --> 01:01:04,960 Speaker 3: bubble bursts and there's no more AI. No, no, no, 1221 01:01:05,000 --> 01:01:06,080 Speaker 3: that's not how this works. 1222 01:01:06,840 --> 01:01:08,680 Speaker 2: The toothpaste is out the tube. 1223 01:01:09,000 --> 01:01:10,920 Speaker 3: Yes, toothpaste out the tube. And as I say to 1224 01:01:11,000 --> 01:01:12,959 Speaker 3: a lot of grumpy readers whenever I bring up AI, 1225 01:01:13,480 --> 01:01:16,160 Speaker 3: I can run Stable Diffusion on my MacBook without an 1226 01:01:16,200 --> 01:01:19,400 Speaker 3: Internet connection. This shit's not going away, ever. The best 1227 01:01:19,480 --> 01:01:21,600 Speaker 3: AI right now is the worst AI will ever be.
1228 01:01:21,760 --> 01:01:24,800 Speaker 1: Basically, yeah, yeah, this is the worst. The 1229 01:01:24,840 --> 01:01:26,520 Speaker 1: AI people love to say that: man, this is 1230 01:01:26,560 --> 01:01:27,440 Speaker 1: the worst it'll ever be. 1231 01:01:27,960 --> 01:01:30,320 Speaker 3: And they're right. Yeah, man, in a way, they are right. 1232 01:01:30,400 --> 01:01:33,920 Speaker 1: Unfortunately. I mean, I'm very happy that you're joining me 1233 01:01:34,080 --> 01:01:35,680 Speaker 1: on, you know, a little bit of the hey man, 1234 01:01:35,720 --> 01:01:37,680 Speaker 1: this is serious, let's take this seriously kind of thing, 1235 01:01:37,720 --> 01:01:40,160 Speaker 1: because people want to hear, you know, the bright, shiny hey, 1236 01:01:40,160 --> 01:01:42,360 Speaker 1: how do we fix this? What's the button that 1237 01:01:42,400 --> 01:01:43,920 Speaker 1: I can press to make all this stop? I mean, 1238 01:01:43,920 --> 01:01:45,280 Speaker 1: the button that we can press to make all this 1239 01:01:45,320 --> 01:01:47,920 Speaker 1: stop is, like, fundamental societal change. 1240 01:01:48,160 --> 01:01:51,640 Speaker 3: Or like an EMP that shuts down all technology. 1241 01:01:51,960 --> 01:01:53,840 Speaker 3: You know, maybe that would work. 1242 01:01:53,960 --> 01:01:56,880 Speaker 2: That's Terminator, right? Yeah, I'm pretty sure. 1243 01:01:57,520 --> 01:01:59,480 Speaker 1: Yeah, I remember that from the Terminator video games, because 1244 01:01:59,520 --> 01:02:00,720 Speaker 1: I'm pretty sure Trump at one 1245 01:02:00,600 --> 01:02:02,640 Speaker 3: point thought it was a real thing that could happen, 1246 01:02:02,960 --> 01:02:04,760 Speaker 3: like, because he heard it in a movie. 1247 01:02:05,120 --> 01:02:06,680 Speaker 3: I want to thank you for coming on the show. 1248 01:02:06,760 --> 01:02:08,640 Speaker 1: I want to thank you for having me on your show. 1249 01:02:08,960 --> 01:02:12,760 Speaker 3: This was delightful and wonderful. I usually end every episode 1250 01:02:12,800 --> 01:02:15,840 Speaker 3: by asking people, like, where can people follow you? But 1251 01:02:16,240 --> 01:02:18,000 Speaker 3: here I'll do this. If you want to follow me, 1252 01:02:18,600 --> 01:02:20,640 Speaker 3: you can find me on Bluesky and Instagram as 1253 01:02:20,760 --> 01:02:25,520 Speaker 3: Ryan hates this, and Broderick on X. I unfortunately still 1254 01:02:25,920 --> 01:02:29,400 Speaker 3: check that website, people. Yeah, what about you? Where can 1255 01:02:29,440 --> 01:02:31,080 Speaker 3: people follow you, if they want to 1256 01:02:31,080 --> 01:02:31,400 Speaker 3: follow you? 1257 01:02:31,560 --> 01:02:37,040 Speaker 1: Yeah, I'm at dex digi on basically everything, D-E- 1258 01:02:37,240 --> 01:02:40,760 Speaker 1: X-D-I-G-I. And we also have 1259 01:02:41,440 --> 01:02:46,200 Speaker 1: kill switch, kill switch pod on Instagram. 1260 01:02:46,240 --> 01:02:49,120 Speaker 3: Oh yeah, you can find Panic World at any podcast 1261 01:02:49,240 --> 01:02:53,080 Speaker 3: place that you get content for podcasts, and also video. 1262 01:02:53,120 --> 01:02:53,680 Speaker 2: I guess same. 1263 01:02:54,720 --> 01:02:54,919 Speaker 3: Yeah. 1264 01:02:58,480 --> 01:03:00,800 Speaker 1: Thank you so much for listening.
This was a really 1265 01:03:00,800 --> 01:03:04,280 Speaker 1: fun collaboration with Ryan and Panic World, so please make 1266 01:03:04,320 --> 01:03:06,640 Speaker 1: sure to check out the show on YouTube or wherever 1267 01:03:06,680 --> 01:03:09,520 Speaker 1: you get your podcasts, and special thanks to Panic World 1268 01:03:09,520 --> 01:03:14,240 Speaker 1: producers Grant Irving and Josh Feldstid. Kill Switch is produced by 1269 01:03:14,240 --> 01:03:18,640 Speaker 1: Sena Ozaki, Darluk Potts, and Julia Nutter from Kaleidoscope. Our 1270 01:03:18,640 --> 01:03:23,160 Speaker 1: executive producers are Ozwa Lashin, Mangesh Hatigadur, and Kate Osborne. 1271 01:03:23,240 --> 01:03:27,320 Speaker 1: From iHeart, our executive producers are Katrina Norvil and Nikki E. Tour. 1272 01:03:27,680 --> 01:03:31,800 Speaker 1: Catch y'all on the next one. Goodbye.