1 00:00:02,520 --> 00:00:08,240 Speaker 1: Media, welcome back to Behind the Bastards. I don't 2 00:00:08,280 --> 00:00:09,880 Speaker 1: know why I did that, kind of like it was 3 00:00:09,880 --> 00:00:12,480 Speaker 1: like I was doing a Halloween opening. Didn't work at all, 4 00:00:12,640 --> 00:00:16,919 Speaker 1: horrible idea. But I'm very happy to announce our guest 5 00:00:17,000 --> 00:00:20,840 Speaker 1: today, back after a long hiatus from this show, but 6 00:00:20,880 --> 00:00:23,119 Speaker 1: not from our hearts, Ify Nwadiwe. 7 00:00:24,920 --> 00:00:27,200 Speaker 2: How's it going? Good to be back. 8 00:00:27,440 --> 00:00:29,560 Speaker 1: Yeah? Oh so happy to have you back. Ify, 9 00:00:30,000 --> 00:00:34,199 Speaker 1: you are a significant part of Dropout, a wonderful 10 00:00:34,280 --> 00:00:37,960 Speaker 1: channel on YouTube that is the survivor of CollegeHumor, 11 00:00:38,040 --> 00:00:41,040 Speaker 1: which is kind of like wrecked in the 12 00:00:41,040 --> 00:00:42,200 Speaker 1: internet comedy world. 13 00:00:42,600 --> 00:00:45,760 Speaker 3: Yeah, and you guys have been doing some really cool 14 00:00:45,800 --> 00:00:48,760 Speaker 3: stuff lately, like some of the funnest, like most interesting 15 00:00:48,880 --> 00:00:53,479 Speaker 3: like quasi game show comedy bits like I've watched hours 16 00:00:53,479 --> 00:00:53,640 Speaker 3: of it. 17 00:00:53,720 --> 00:00:56,360 Speaker 2: Yeah, yeah, I was, you know, I was talking to 18 00:00:57,520 --> 00:01:01,880 Speaker 2: Sam and Dave's the kind of like higher ups there, 19 00:01:01,920 --> 00:01:05,119 Speaker 2: which is, you know, which is a testament to how 20 00:01:05,160 --> 00:01:07,880 Speaker 2: cool the company is, because no, I haven't spoke to 21 00:01:07,959 --> 00:01:10,679 Speaker 2: the higher ups at ICC or any of the other companies, 22 00:01:11,040 --> 00:01:13,200 Speaker 2: but I was like, oh yeah, this is like where 23 00:01:13,880 --> 00:01:18,000 Speaker 2: like panel comedy lives and now After Midnight's back. But 24 00:01:18,040 --> 00:01:20,880 Speaker 2: I feel like Americans run away from panel comedy shows 25 00:01:21,160 --> 00:01:24,280 Speaker 2: and I like an excuse to riff and goof with friends. 26 00:01:24,800 --> 00:01:27,240 Speaker 1: Yes, I think, and I think that YouTube and the 27 00:01:27,319 --> 00:01:29,240 Speaker 1: kind of what people get out of a lot of 28 00:01:29,280 --> 00:01:33,000 Speaker 1: like streaming stuff too, like not like streaming TV, but like, 29 00:01:33,160 --> 00:01:36,600 Speaker 1: you know, streamers is that goofing with friends thing. Like 30 00:01:36,720 --> 00:01:38,560 Speaker 1: I watch Red Letter Media stuff for a lot of 31 00:01:38,560 --> 00:01:41,480 Speaker 1: the same reason, and I feel like there's a way 32 00:01:41,480 --> 00:01:43,680 Speaker 1: to save that stuff. It's just not putting it on 33 00:01:43,760 --> 00:01:46,200 Speaker 1: television at eleven thirty at night necessarily. 34 00:01:47,800 --> 00:01:50,000 Speaker 2: Yeah, yeah, yeah, people should be a little more lucid 35 00:01:50,640 --> 00:01:54,160 Speaker 2: when they're taking in that kind of media. But yeah, 36 00:01:54,200 --> 00:01:57,800 Speaker 2: where else are you gonna watch? You know, a block 37 00:01:58,840 --> 00:02:01,520 Speaker 2: trying to think of the name of the show it was.
38 00:02:01,560 --> 00:02:03,480 Speaker 2: It was the one with the guy he has the 39 00:02:03,960 --> 00:02:06,560 Speaker 2: it's not a top hat, it's like a fedora almost, 40 00:02:07,120 --> 00:02:10,440 Speaker 2: and he's a detective and all dads love it. 41 00:02:10,760 --> 00:02:17,880 Speaker 1: Bosch. Yes, oh god, I don't even use the tools 42 00:02:17,960 --> 00:02:24,400 Speaker 1: that Bosch makes. They love Bosch, the Bosch heads. Yeah, 43 00:02:24,639 --> 00:02:27,000 Speaker 1: I would laugh. But I just got into Reacher, which 44 00:02:27,040 --> 00:02:29,720 Speaker 1: is like it's, you know, it's the stupidest thing on TV, 45 00:02:29,880 --> 00:02:32,200 Speaker 1: but I love it. It's like, it's like I used 46 00:02:32,200 --> 00:02:34,840 Speaker 1: to watch a lot of Walker, Texas Ranger. It's again the 47 00:02:34,880 --> 00:02:38,200 Speaker 1: inheritor of that, but not racist, or as racist, I 48 00:02:38,240 --> 00:02:39,960 Speaker 1: guess. Can't exactly say not at all. 49 00:02:40,200 --> 00:02:42,120 Speaker 2: Look, that's why I'm not even going to judge, because 50 00:02:42,160 --> 00:02:44,640 Speaker 2: I remember one time, me and Em, we were up 51 00:02:44,760 --> 00:02:47,919 Speaker 2: in the cabin in Wrightwood and we needed to 52 00:02:48,000 --> 00:02:50,480 Speaker 2: kill time and we were like, let's watch this nine 53 00:02:50,520 --> 00:02:53,119 Speaker 2: one one show whatever, just see and we were 54 00:02:53,160 --> 00:02:56,400 Speaker 2: like seven episodes in, locked in. We're like, hold on, this 55 00:02:56,480 --> 00:02:59,440 Speaker 2: is this? This kind of goes. Angela Bassett knows what 56 00:02:59,520 --> 00:02:59,919 Speaker 2: she's doing. 57 00:03:00,160 --> 00:03:02,880 Speaker 1: Mm hm oh yeah, no, she's she's a pro. And 58 00:03:02,919 --> 00:03:05,520 Speaker 1: you know, you can't get work like that or work 59 00:03:05,639 --> 00:03:08,519 Speaker 1: like you and your colleagues do at Dropout without 60 00:03:08,600 --> 00:03:11,919 Speaker 1: human beings actually doing things. Which is why today we're 61 00:03:11,919 --> 00:03:14,520 Speaker 1: here to talk about AI, and particularly we're going to 62 00:03:14,560 --> 00:03:17,519 Speaker 1: talk about how a lot of the conversation and a 63 00:03:17,520 --> 00:03:20,880 Speaker 1: lot of the fandom around AI has turned into more 64 00:03:20,960 --> 00:03:24,160 Speaker 1: or less a cult. So that's that's that's the premise of 65 00:03:24,200 --> 00:03:28,000 Speaker 1: this episode. Ify, I want to start by going back 66 00:03:28,000 --> 00:03:30,480 Speaker 1: to a place I was about a week ago as 67 00:03:30,480 --> 00:03:34,760 Speaker 1: we record this, the Consumer Electronics Show. Every year they've 68 00:03:34,800 --> 00:03:37,720 Speaker 1: been holding it for a long time now, and it's 69 00:03:37,800 --> 00:03:40,360 Speaker 1: kind of how the tech industry talks about itself and 70 00:03:40,400 --> 00:03:42,840 Speaker 1: what it has planned for the future. A lot of 71 00:03:42,840 --> 00:03:44,880 Speaker 1: it's hype, you know, they're kind of talking up what's 72 00:03:44,880 --> 00:03:47,280 Speaker 1: coming out that they're hoping will get buzz, will make money, 73 00:03:47,480 --> 00:03:49,240 Speaker 1: but you also get an idea of like what do 74 00:03:49,280 --> 00:03:51,520 Speaker 1: you think we want and what are you trying to 75 00:03:51,520 --> 00:03:54,160 Speaker 1: get us excited about?
And I think the most revealing 76 00:03:54,240 --> 00:03:57,040 Speaker 1: product that I saw this year was the Rabbit R 77 00:03:57,080 --> 00:04:00,480 Speaker 1: one, and it's a little square shaped gadget with a screen, 78 00:04:00,560 --> 00:04:03,760 Speaker 1: it's got a little camera that can swivel, and it's 79 00:04:03,800 --> 00:04:06,960 Speaker 1: an AI that basically you talk to it like you 80 00:04:07,000 --> 00:04:09,880 Speaker 1: would an Alexa, but it can use your apps and 81 00:04:09,960 --> 00:04:12,520 Speaker 1: it's supposed to reduce friction in your life by basically 82 00:04:12,640 --> 00:04:16,800 Speaker 1: routing every move you make online through this machine intelligence. 83 00:04:17,000 --> 00:04:18,720 Speaker 1: So you tell it what you want to do and 84 00:04:18,760 --> 00:04:22,000 Speaker 1: it does it instead of like you using like physically 85 00:04:22,120 --> 00:04:24,600 Speaker 1: using your smartphone as much. You still have to click 86 00:04:24,640 --> 00:04:26,960 Speaker 1: it sometimes. And I want to play you a clip 87 00:04:27,000 --> 00:04:31,040 Speaker 1: of this where this is the CEO Jesse Lyu's keynote speech. 88 00:04:31,720 --> 00:04:35,599 Speaker 4: Our smartphone has become the best device to kill time 89 00:04:35,800 --> 00:04:38,920 Speaker 4: instead of saving it. It's just harder for them to 90 00:04:38,960 --> 00:04:44,240 Speaker 4: do things. Many people before us have tried to build 91 00:04:44,279 --> 00:04:48,200 Speaker 4: a simpler and more intuitive computer with AI. A decade ago, 92 00:04:48,760 --> 00:04:55,200 Speaker 4: companies like Apple, Microsoft, and Amazon made Siri, Cortana and Alexa. 93 00:04:55,480 --> 00:05:00,159 Speaker 4: With these smart speakers, often they either don't know what 94 00:05:00,160 --> 00:05:03,960 Speaker 4: you're talking about or failed to accomplish the task we 95 00:05:04,160 --> 00:05:10,000 Speaker 4: asked for. Recent achievements in large language models, however, or LLMs, 96 00:05:10,279 --> 00:05:14,160 Speaker 4: a type of AI technology, have made it much easier 97 00:05:14,320 --> 00:05:19,640 Speaker 4: for machines to understand you. The popularity of LLM chatbots 98 00:05:20,000 --> 00:05:23,640 Speaker 4: over the past years has shown that the natural language 99 00:05:23,680 --> 00:05:26,400 Speaker 4: based experience is the path forward. 100 00:05:27,360 --> 00:05:31,040 Speaker 1: Now. I don't know that I entirely agree with that, 101 00:05:31,080 --> 00:05:33,520 Speaker 1: because I think the biggest influence that these chatbots have 102 00:05:33,520 --> 00:05:35,600 Speaker 1: had on me is that whenever I try to deal 103 00:05:35,640 --> 00:05:37,719 Speaker 1: with like an airline or something, I get stuck on 104 00:05:37,839 --> 00:05:40,640 Speaker 1: ChatGPT and it's a pain in the ass to 105 00:05:40,760 --> 00:05:41,400 Speaker 1: do anything. 106 00:05:42,880 --> 00:05:46,119 Speaker 2: What's so funny about the rise in AI right now 107 00:05:46,440 --> 00:05:49,559 Speaker 2: is like, if we really think about it, and god 108 00:05:49,680 --> 00:05:52,839 Speaker 2: damn it, Robert, you threaded the needle right there. When 109 00:05:52,880 --> 00:05:56,240 Speaker 2: you really think about it, all AI is is just 110 00:05:56,400 --> 00:06:01,960 Speaker 2: the evolution of the shittiest part of calling.
Yes, yes, all, 111 00:06:02,480 --> 00:06:04,760 Speaker 2: the one thing that we do as soon as we 112 00:06:04,800 --> 00:06:07,000 Speaker 2: call is like zero zero zero, let me get straight 113 00:06:07,040 --> 00:06:07,679 Speaker 2: to a human. 114 00:06:07,560 --> 00:06:08,560 Speaker 1: Let me talk to a human. 115 00:06:09,160 --> 00:06:11,720 Speaker 2: And the and these these like eggheads were like, what 116 00:06:11,760 --> 00:06:14,560 Speaker 2: if we did more of that? What if we remove 117 00:06:14,800 --> 00:06:22,000 Speaker 2: your solace from this sad eyes of your life? 118 00:06:22,440 --> 00:06:24,960 Speaker 1: And it's it's the kind of thing he seems so 119 00:06:25,600 --> 00:06:28,159 Speaker 1: off from my experience where he's like the problem with 120 00:06:28,279 --> 00:06:30,480 Speaker 1: phones is that it's too hard to do things. No, 121 00:06:30,720 --> 00:06:32,360 Speaker 1: it's too easy for me to order a bunch of 122 00:06:32,440 --> 00:06:34,960 Speaker 1: junk food and have a stranger deliver it. That's been 123 00:06:35,000 --> 00:06:38,159 Speaker 1: a problem for me, right, Oh yeah, it's too easy 124 00:06:38,200 --> 00:06:40,800 Speaker 1: for me to waste six hours on Twitter, Like that's 125 00:06:40,800 --> 00:06:41,760 Speaker 1: what the issue is. 126 00:06:42,040 --> 00:06:47,080 Speaker 2: Do you know how much the easiness of everything has 127 00:06:47,120 --> 00:06:49,760 Speaker 2: made me wake up the next morning with cold food 128 00:06:49,800 --> 00:06:51,320 Speaker 2: on my porch that I never touched. 129 00:06:51,520 --> 00:06:54,720 Speaker 1: Yes, because it's too easy, It was too easy. 130 00:06:54,960 --> 00:06:58,880 Speaker 2: Sometimes we need those those those like bumps in the 131 00:06:58,960 --> 00:07:04,000 Speaker 2: road to keep drunk, stoned Ify from ordering a like 132 00:07:04,200 --> 00:07:07,360 Speaker 2: triple double burger that I'm hoping gets to me 133 00:07:07,480 --> 00:07:09,080 Speaker 2: before I fall asleep, it never does. 134 00:07:09,320 --> 00:07:11,720 Speaker 1: I've had so many problems with my phone, but 135 00:07:11,760 --> 00:07:13,160 Speaker 1: I don't think I've ever had the problem of like 136 00:07:13,200 --> 00:07:16,880 Speaker 1: this thing's just too hard to use. Yeah, that's just 137 00:07:16,960 --> 00:07:20,560 Speaker 1: not a problem I know that anyone has. But but 138 00:07:20,720 --> 00:07:24,360 Speaker 1: Jesse complains that, you know, there's too much friction with 139 00:07:24,400 --> 00:07:28,080 Speaker 1: smartphones, and his device, the Rabbit, is going to let you, 140 00:07:28,120 --> 00:07:29,679 Speaker 1: Like you can just tell it book me a flight 141 00:07:29,800 --> 00:07:32,600 Speaker 1: or a hotel on Expedia and you don't even have 142 00:07:32,640 --> 00:07:35,000 Speaker 1: to know, like it'll just pick a hotel for you 143 00:07:35,080 --> 00:07:37,160 Speaker 1: a lot of the time, or like what flight you 144 00:07:37,160 --> 00:07:40,560 Speaker 1: know it thinks is the most efficient. Jesse's goal is 145 00:07:40,600 --> 00:07:44,240 Speaker 1: to basically create AI agents for customers, which like live 146 00:07:44,240 --> 00:07:46,280 Speaker 1: in this little device you wear and act as you 147 00:07:46,360 --> 00:07:50,240 Speaker 1: online to handle tasks you normally use your phone to do.
So, 148 00:07:50,240 --> 00:07:51,840 Speaker 1: you can tell your Rabbit to book you an Uber, 149 00:07:52,000 --> 00:07:53,160 Speaker 1: you can have it book you a flight, or you 150 00:07:53,200 --> 00:07:55,560 Speaker 1: can have it plan your trip to a foreign country. 151 00:07:56,080 --> 00:08:01,120 Speaker 1: No, that sounds really bad. It sounds so fucking bad. 152 00:08:01,320 --> 00:08:03,760 Speaker 5: Rabbit's like, hey, how would you like a middle 153 00:08:03,960 --> 00:08:04,320 Speaker 5: seat on Spirit. 154 00:08:06,200 --> 00:08:10,920 Speaker 1: Efficient as hell. Yeah, Spirit's so cheap. Now you can 155 00:08:11,320 --> 00:08:14,040 Speaker 1: you can direct it more. But then that just seems like, well, yeah, 156 00:08:14,080 --> 00:08:16,120 Speaker 1: that's what I'm already doing on my computer. Why is 157 00:08:16,160 --> 00:08:18,760 Speaker 1: it easier just to like work through a vocal chatbot 158 00:08:18,760 --> 00:08:21,800 Speaker 1: that might not understand me, or at least will be 159 00:08:21,840 --> 00:08:24,360 Speaker 1: as much friction as like, Yeah, when I touch my 160 00:08:24,400 --> 00:08:26,200 Speaker 1: phone and it hits the wrong thing, right, I just 161 00:08:26,200 --> 00:08:28,840 Speaker 1: don't see that I'm saving much here. No one also 162 00:08:28,880 --> 00:08:31,920 Speaker 1: seems to know how Rabbit's going to integrate with all 163 00:08:31,960 --> 00:08:34,920 Speaker 1: these apps, because that means their device has to have 164 00:08:35,000 --> 00:08:37,120 Speaker 1: access to them for you, and that's kind of a 165 00:08:37,120 --> 00:08:40,120 Speaker 1: big ask for all of these different companies. That said, 166 00:08:40,200 --> 00:08:41,920 Speaker 1: and no one knows, by the way, how secure it's 167 00:08:41,960 --> 00:08:45,000 Speaker 1: going to be. But no one at CES was listening either, 168 00:08:45,040 --> 00:08:47,480 Speaker 1: because the first ten thousand pre order models that opened 169 00:08:47,480 --> 00:08:51,079 Speaker 1: at CES sold out instantly. What does that mean? That 170 00:08:51,120 --> 00:08:53,400 Speaker 1: doesn't mean a lot of normal people are going to 171 00:08:53,440 --> 00:08:55,880 Speaker 1: buy it. It means a lot of tech freaks wanted 172 00:08:55,920 --> 00:08:58,079 Speaker 1: this thing. Yeah, that is the thing. 173 00:08:58,000 --> 00:09:00,959 Speaker 2: Too, is like, yeah, if you're at CES, you're already 174 00:09:01,160 --> 00:09:03,840 Speaker 2: you're already drinking the kool aid. Yes, I you know, 175 00:09:03,920 --> 00:09:07,079 Speaker 2: And I'm very much It's so funny because I'm very 176 00:09:07,120 --> 00:09:09,480 Speaker 2: much that guy, or I was very much that guy 177 00:09:09,760 --> 00:09:13,080 Speaker 2: who wanted to be on that cusp of technology. And 178 00:09:13,120 --> 00:09:15,760 Speaker 2: I feel like, you know, in my early twenties, it 179 00:09:15,840 --> 00:09:17,960 Speaker 2: seems so cool because you're like, yeah, I want to 180 00:09:17,960 --> 00:09:20,920 Speaker 2: be Iron Man, I want to just have full control.
181 00:09:21,240 --> 00:09:24,080 Speaker 2: And then you kind of get to the point where 182 00:09:24,080 --> 00:09:27,840 Speaker 2: you start if you have enough self reflection, you notice 183 00:09:27,880 --> 00:09:31,160 Speaker 2: that you're just kind of you're using technology to do 184 00:09:31,320 --> 00:09:34,400 Speaker 2: things that you could just do much simpler and cheaper, Like, 185 00:09:34,440 --> 00:09:38,400 Speaker 2: for example, like when you get these apps and then 186 00:09:38,440 --> 00:09:41,280 Speaker 2: I'm like sitting in a photo editor for like fifty 187 00:09:41,360 --> 00:09:43,400 Speaker 2: minutes just to make it look like it was shot 188 00:09:43,440 --> 00:09:44,760 Speaker 2: on film, And I was like, I can buy a 189 00:09:44,800 --> 00:09:50,000 Speaker 2: disposable camera like that exists. And what's great is when 190 00:09:50,040 --> 00:09:52,920 Speaker 2: you now when you develop them, they put it, they 191 00:09:53,000 --> 00:09:55,160 Speaker 2: email it to you so it can go on Instagram. 192 00:09:55,200 --> 00:09:58,480 Speaker 2: It's going to take longer, but arguably you might get 193 00:09:58,520 --> 00:10:01,760 Speaker 2: some post flip clarity that pic that didn't look good 194 00:10:01,920 --> 00:10:05,679 Speaker 2: isn't now on the internet, or you know, you can 195 00:10:05,920 --> 00:10:08,840 Speaker 2: like spread it out. But yeah, I am getting very... 196 00:10:08,640 --> 00:10:12,079 Speaker 1: You are, like, you are so much more thoughtful 197 00:10:12,240 --> 00:10:15,120 Speaker 1: in that than like anybody involved in this product has 198 00:10:15,120 --> 00:10:18,320 Speaker 1: been over the course of their entire life. Now, a 199 00:10:18,320 --> 00:10:21,640 Speaker 1: couple of skeptics who have given reviews have already noted problems. 200 00:10:21,720 --> 00:10:24,320 Speaker 1: Richard Lawler of The Verge was like, this thing is 201 00:10:24,320 --> 00:10:26,360 Speaker 1: not built for left handed people to use, Like they 202 00:10:26,440 --> 00:10:29,600 Speaker 1: forgot that left handed people existed, and so they designed 203 00:10:29,640 --> 00:10:32,880 Speaker 1: it in a way that's specifically a pain in the ass. Which, yes, 204 00:10:34,360 --> 00:10:36,559 Speaker 1: there's also they brag they have this camera that can 205 00:10:36,640 --> 00:10:38,640 Speaker 1: like move on its own, so it can cover stuff 206 00:10:38,760 --> 00:10:41,080 Speaker 1: in front of it or behind it, And a commenter 207 00:10:41,200 --> 00:10:44,120 Speaker 1: on Lawler's article was like, it's a pretty fundamental design 208 00:10:44,480 --> 00:10:46,680 Speaker 1: principle that you don't add moving parts if you don't 209 00:10:46,679 --> 00:10:49,000 Speaker 1: need them. And there's plenty of space in this for 210 00:10:49,080 --> 00:10:50,880 Speaker 1: a camera in the front and the back, which is 211 00:10:50,920 --> 00:10:53,040 Speaker 1: one less point of failure, one less thing for shit 212 00:10:53,080 --> 00:10:55,120 Speaker 1: to get gunked up in. This is actually bad design. 213 00:10:55,120 --> 00:10:57,520 Speaker 1: They're bragging about this, but it's a bad idea. 214 00:10:57,800 --> 00:10:58,240 Speaker 2: Yeah. 215 00:10:58,320 --> 00:11:00,560 Speaker 1: Yeah, there's a couple of other issues in there. You know, 216 00:11:00,600 --> 00:11:01,719 Speaker 1: we'll see it looks like it's going to be a 217 00:11:01,760 --> 00:11:04,080 Speaker 1: lot thicker than a smartphone.
I just don't know the 218 00:11:04,120 --> 00:11:07,160 Speaker 1: degree to which a regular it's the same Google Glass issue, Right, 219 00:11:07,559 --> 00:11:09,080 Speaker 1: do you want a second thing that you have to 220 00:11:09,160 --> 00:11:11,760 Speaker 1: keep on you like alongside your you know. 221 00:11:12,040 --> 00:11:14,000 Speaker 2: That was just going to be my question. So I'm glad. So 222 00:11:14,200 --> 00:11:17,520 Speaker 2: is this something that should be replacing your phone 223 00:11:17,600 --> 00:11:20,600 Speaker 2: or is this another thing that you are supposed to 224 00:11:20,640 --> 00:11:21,679 Speaker 2: be holding on to. 225 00:11:22,280 --> 00:11:25,040 Speaker 1: I think the goal is for it to eventually replace it, 226 00:11:25,080 --> 00:11:27,360 Speaker 1: but at this point you will still need to have both, 227 00:11:28,000 --> 00:11:30,920 Speaker 1: so like carry another thing, you know. 228 00:11:31,400 --> 00:11:33,480 Speaker 2: You know, walking around like a drug dealer. You have 229 00:11:33,520 --> 00:11:35,200 Speaker 2: two different ones. It's kind of a bit. 230 00:11:35,280 --> 00:11:37,760 Speaker 1: It's a big device and it's it's not tiny and 231 00:11:37,800 --> 00:11:39,679 Speaker 1: it has a big screen. So it's just like, well, 232 00:11:39,720 --> 00:11:42,480 Speaker 1: you just made a different kind of smartphone. Again, what 233 00:11:42,600 --> 00:11:46,920 Speaker 1: am I gaining? Yeah? What he's holding in this video? Correct? Yeah? 234 00:11:46,960 --> 00:11:50,000 Speaker 2: Yeah, yeah, yeah, okay, well and what more 235 00:11:50,200 --> 00:11:53,960 Speaker 2: is, like, that type of tech person, it's definitely 236 00:11:54,000 --> 00:11:56,680 Speaker 2: the person who carries a power bank with them. So 237 00:11:56,720 --> 00:12:00,720 Speaker 2: that is a third thing, and that's me. So that's three 238 00:12:00,840 --> 00:12:04,080 Speaker 2: things you're rocking with. I know I have so many 239 00:12:04,120 --> 00:12:08,760 Speaker 2: power banks, and ADHD has blocked every single 240 00:12:08,800 --> 00:12:10,520 Speaker 2: one of them from being useful because, you know, I 241 00:12:10,559 --> 00:12:13,560 Speaker 2: forget to charge them or I forget to bring them. 242 00:12:14,400 --> 00:12:17,760 Speaker 1: It also just doesn't look comfortable. It's bigger than his hand. 243 00:12:18,320 --> 00:12:20,640 Speaker 1: Oh yes, he's not a teenager. We'll get a better 244 00:12:20,679 --> 00:12:22,880 Speaker 1: look at it in a second, because I want to show 245 00:12:22,880 --> 00:12:24,880 Speaker 1: you when he's talking he's trying to make the case 246 00:12:24,920 --> 00:12:26,959 Speaker 1: for this and one of like, he spends a significant 247 00:12:27,040 --> 00:12:29,559 Speaker 1: portion of his not very long keynote on him Rick 248 00:12:29,679 --> 00:12:32,880 Speaker 1: rolling himself in the most... Ify, you've gotta fucking watch this. 249 00:12:33,040 --> 00:12:35,439 Speaker 1: It is so painful.
255 00:13:00,160 --> 00:13:10,640 Speaker 6: Let me take a look. I'm never gonna get you off. 256 00:13:12,520 --> 00:13:19,160 Speaker 1: Enjoy. What, am I getting Rickrolled in my own keynote? 257 00:13:20,240 --> 00:13:27,720 Speaker 2: The next one okay, tech keynote speakers need to take improv 258 00:13:27,360 --> 00:13:33,120 Speaker 1: classes. One, like, and I don't recommend that to normal people. 259 00:13:33,640 --> 00:13:36,839 Speaker 2: Because like you're you're like, no-selling your own joke, 260 00:13:37,559 --> 00:13:41,959 Speaker 2: just like what you're not even laughing at it? Why 261 00:13:42,000 --> 00:13:42,800 Speaker 2: am I gonna laugh? 262 00:13:43,120 --> 00:13:46,080 Speaker 1: I will say this that was scripted almost exactly the 263 00:13:46,080 --> 00:13:52,559 Speaker 1: way Tim Robinson would have written it. Oh man, But 264 00:13:52,640 --> 00:13:55,320 Speaker 1: I bet Sam Richardson could have delivered that bit better. 265 00:14:00,080 --> 00:14:02,560 Speaker 1: So that is very funny. But I find this next 266 00:14:02,600 --> 00:14:05,880 Speaker 1: clip more disturbing because it shows this kind of desire 267 00:14:05,960 --> 00:14:08,280 Speaker 1: that the people that are the early adopters here have 268 00:14:08,720 --> 00:14:11,080 Speaker 1: not just for more convenience, but to hand over like 269 00:14:11,320 --> 00:14:14,640 Speaker 1: the power to choose to a robot that's basically just 270 00:14:14,760 --> 00:14:18,760 Speaker 1: pulling the first advertised result from Google Like it's kind 271 00:14:18,760 --> 00:14:19,360 Speaker 1: of messed up. 272 00:14:20,080 --> 00:14:22,920 Speaker 4: Ooh, I can also use R1 to order food. 273 00:14:24,280 --> 00:14:27,120 Speaker 4: Get me a twelve inch pizza from Pizza Hut, delivered 274 00:14:27,160 --> 00:14:29,440 Speaker 4: here. The most ordered option on the app is fine. 275 00:14:29,280 --> 00:14:32,440 Speaker 6: Ordering a twelve inch pizza from Pizza Hut. 276 00:14:33,160 --> 00:14:35,680 Speaker 6: Since you mentioned that the most ordered option is fine, 277 00:14:35,760 --> 00:14:41,880 Speaker 6: I will select that for you. I just created an 278 00:14:41,960 --> 00:14:44,680 Speaker 6: order for a twelve inch pizza. It's going to be 279 00:14:44,720 --> 00:14:47,360 Speaker 6: hand tossed with a classic marinara sauce and topped with 280 00:14:47,440 --> 00:14:50,560 Speaker 6: regular cheese. Please confirm your order. 281 00:14:51,880 --> 00:14:53,720 Speaker 4: That sounds really good. I'll just confirm the order. 282 00:14:53,720 --> 00:14:55,920 Speaker 1: He has to click it again, just like on a smartphone. 283 00:14:55,960 --> 00:14:57,600 Speaker 1: He has to, like, look at the device and click it. 284 00:14:57,680 --> 00:15:02,560 Speaker 2: Here's a freebie for any of you, even Richard Lawler, who 285 00:15:02,600 --> 00:15:04,920 Speaker 2: has been in the game for a long time, but 286 00:15:05,120 --> 00:15:08,400 Speaker 2: like any of you tech bloggers, just do a 287 00:15:08,520 --> 00:15:12,600 Speaker 2: side by side with this video and physically do everything 288 00:15:12,640 --> 00:15:17,880 Speaker 2: he's doing in the real time, because a pizza would 289 00:15:17,880 --> 00:15:20,480 Speaker 2: have been ordered way faster if you would have just 290 00:15:20,520 --> 00:15:21,760 Speaker 2: pulled out your phone. 291 00:15:21,520 --> 00:15:23,840 Speaker 1: Well and you could get the pizza you want. Yeah, 292 00:15:23,880 --> 00:15:25,720 Speaker 1: I like the twelve.
293 00:15:25,360 --> 00:15:28,760 Speaker 5: Inch classic marinara, most ordered pizza. 294 00:15:28,840 --> 00:15:31,640 Speaker 2: I want this shit I actually like. That's so weird. 295 00:15:31,880 --> 00:15:35,160 Speaker 1: Yeah yeah. Also, who orders a twelve inch pizza from 296 00:15:35,160 --> 00:15:35,840 Speaker 1: Pizza Hut? 297 00:15:35,920 --> 00:15:37,320 Speaker 2: Nobody? Yeah? 298 00:15:38,320 --> 00:15:41,000 Speaker 1: There's no way that's the most ordered product from 299 00:15:41,040 --> 00:15:42,520 Speaker 1: Pizza Hut. I don't believe it. 300 00:15:43,360 --> 00:15:45,240 Speaker 2: It was definitely paid nonsense. 301 00:15:45,880 --> 00:15:49,480 Speaker 1: Yes, yeah, it's just like yeah, and there's the next clip. 302 00:15:49,480 --> 00:15:51,520 Speaker 1: I don't think we'll actually play it, but like it is, 303 00:15:51,760 --> 00:15:53,880 Speaker 1: it's him saying, hey, plan out like a three day 304 00:15:53,960 --> 00:15:56,560 Speaker 1: vacation in London for me, And as far as I 305 00:15:56,560 --> 00:15:59,520 Speaker 1: can tell, the AI goes for like the first top 306 00:15:59,560 --> 00:16:01,840 Speaker 1: ten list of things to do in London that it finds, 307 00:16:01,840 --> 00:16:05,120 Speaker 1: which was probably written by an AI, and then makes 308 00:16:05,120 --> 00:16:08,040 Speaker 1: an itinerary based on those and it's like, first off, 309 00:16:08,240 --> 00:16:11,720 Speaker 1: are you that basic? Second, planning a vacation is fun? 310 00:16:12,080 --> 00:16:14,000 Speaker 1: Is that not a thing that you want to do? 311 00:16:14,680 --> 00:16:18,680 Speaker 2: Yeah? You're so right? Why would you? Because I look, 312 00:16:18,760 --> 00:16:22,040 Speaker 2: the reason you would go to like a travel agent 313 00:16:22,440 --> 00:16:24,480 Speaker 2: is because they are experts at it. They're gonna find 314 00:16:24,520 --> 00:16:27,600 Speaker 2: the most fun thing for you outside of that, Yeah, 315 00:16:27,640 --> 00:16:29,360 Speaker 2: I want to plan the cool stuff I'm gonna do, 316 00:16:29,520 --> 00:16:32,240 Speaker 2: you know, And and yeah, what about people with fears, 317 00:16:32,280 --> 00:16:35,560 Speaker 2: you know? Or people without skills, which is definitely going 318 00:16:35,640 --> 00:16:38,440 Speaker 2: to be a large margin of people who do this. 319 00:16:38,920 --> 00:16:42,000 Speaker 2: So you you're in London, now you're you're, you're, you're, 320 00:16:42,440 --> 00:16:44,320 Speaker 2: And then it takes you on a trip to Malta 321 00:16:44,640 --> 00:16:47,239 Speaker 2: to go scuba diving. You don't even know how to swim, 322 00:16:47,320 --> 00:16:50,120 Speaker 2: and now you're sitting there like, you already paid 323 00:16:50,120 --> 00:16:52,760 Speaker 2: for it, bro, Yeah. 324 00:16:51,960 --> 00:16:56,400 Speaker 1: You let your fuck for you. It's on you. It's silly, right, 325 00:16:56,440 --> 00:16:58,440 Speaker 1: and I don't want to be I'm gonna say this 326 00:16:58,560 --> 00:17:01,040 Speaker 1: is not the most direct parallel to cult shit we'll have, 327 00:17:01,120 --> 00:17:03,440 Speaker 1: But watching this, I couldn't help but think about a 328 00:17:03,480 --> 00:17:06,080 Speaker 1: cult that was like the subject of our second episode 329 00:17:06,080 --> 00:17:08,720 Speaker 1: for this year, The Finders, And it was one of 330 00:17:08,760 --> 00:17:10,840 Speaker 1: those things.
The guy Marion Pettie who ran it was 331 00:17:10,880 --> 00:17:13,080 Speaker 1: like running games is the way he framed it. And 332 00:17:13,080 --> 00:17:16,640 Speaker 1: people would join the cult and give up their agency, 333 00:17:16,640 --> 00:17:18,480 Speaker 1: and he'd tell them go take a job in this city, 334 00:17:18,600 --> 00:17:21,119 Speaker 1: or like go follow this guy and take notes on 335 00:17:21,200 --> 00:17:23,320 Speaker 1: him for a year, or like have a kid and 336 00:17:23,400 --> 00:17:26,160 Speaker 1: raise it this way. And this is, stuff like this 337 00:17:26,200 --> 00:17:28,800 Speaker 1: is really common within cults. One of the appeals of 338 00:17:28,800 --> 00:17:30,760 Speaker 1: a cult to a lot of people is that you 339 00:17:30,840 --> 00:17:33,920 Speaker 1: both get a sense of purpose by following the cult 340 00:17:33,960 --> 00:17:36,040 Speaker 1: and whatever things it says it's going to do, and 341 00:17:36,119 --> 00:17:38,520 Speaker 1: you give up the burden of having to choose a 342 00:17:38,560 --> 00:17:40,880 Speaker 1: life for yourself. And this is such a common thing 343 00:17:40,920 --> 00:17:44,200 Speaker 1: in cult dynamics that psychologist Robert Lifton, who's kind of 344 00:17:44,200 --> 00:17:47,639 Speaker 1: one of the foundational minds in studying cults, described it 345 00:17:47,720 --> 00:17:51,560 Speaker 1: as voluntary self surrender, and it's a major characteristic of 346 00:17:51,600 --> 00:17:54,560 Speaker 1: a cult. Many of the Finders were not, you know, 347 00:17:54,560 --> 00:17:57,040 Speaker 1: these are not dumb people. These are not like rubes. 348 00:17:57,119 --> 00:18:00,439 Speaker 1: These are not hillbillies as they're often portrayed in our 349 00:18:00,480 --> 00:18:03,520 Speaker 1: popular media. A lot of the Finders had Ivy League degrees. 350 00:18:03,720 --> 00:18:06,880 Speaker 1: One of them owned an oil company, and these guys 351 00:18:06,880 --> 00:18:10,480 Speaker 1: still handed their lives over to a cult leader. Haruki Murakami, 352 00:18:10,480 --> 00:18:13,120 Speaker 1: writing about Aum Shinrikyo, which is the cult that set 353 00:18:13,119 --> 00:18:15,640 Speaker 1: off a bunch of poison gas in the Tokyo subway, 354 00:18:16,040 --> 00:18:18,919 Speaker 1: noted that many of its members were doctors or engineers 355 00:18:18,920 --> 00:18:22,800 Speaker 1: who quote actively sought to be controlled. I found a 356 00:18:22,800 --> 00:18:25,840 Speaker 1: lot of this information on the fundamental characteristics of 357 00:18:25,840 --> 00:18:27,760 Speaker 1: what makes something a cult in an article by Zoe 358 00:18:27,880 --> 00:18:30,160 Speaker 1: Heller published for The New Yorker back in twenty twenty one. 359 00:18:30,920 --> 00:18:33,400 Speaker 1: At the time, she was kind of looking at QAnon 360 00:18:33,640 --> 00:18:36,160 Speaker 1: and trying to decide, there's not like a clear guy 361 00:18:36,400 --> 00:18:38,280 Speaker 1: and that's the cult leader, and there's not like a 362 00:18:38,280 --> 00:18:41,119 Speaker 1: geographic center to this, and usually there is with cults 363 00:18:41,160 --> 00:18:44,040 Speaker 1: in history. Does this still qualify? And I think a 364 00:18:44,040 --> 00:18:45,840 Speaker 1: lot of people would agree that, like, yeah, it does. I 365 00:18:45,840 --> 00:18:48,080 Speaker 1: think a lot of experts tend to agree that, yeah, it does.
366 00:18:48,520 --> 00:18:50,719 Speaker 1: And when she was looking at the QAnon movement as 367 00:18:50,720 --> 00:18:54,520 Speaker 1: a cult, Heller noted this. Robert Lifton suggests that people 368 00:18:54,520 --> 00:18:56,800 Speaker 1: with certain kinds of personal history are more likely to 369 00:18:56,880 --> 00:19:00,280 Speaker 1: experience such a longing, those with quote an early sense 370 00:19:00,280 --> 00:19:03,679 Speaker 1: of confusion or dislocation, or at the opposite extreme, an 371 00:19:03,720 --> 00:19:07,520 Speaker 1: early experience of unusually intense family milieu control. But he 372 00:19:07,600 --> 00:19:10,439 Speaker 1: stresses that the capacity for totalist submission lurks in all 373 00:19:10,440 --> 00:19:13,320 Speaker 1: of us and is probably rooted in childhood, the prolonged 374 00:19:13,359 --> 00:19:15,480 Speaker 1: period of dependence during which we have no choice but 375 00:19:15,520 --> 00:19:19,280 Speaker 1: to attribute to our parents an exaggerated omnipotence. And I 376 00:19:20,080 --> 00:19:22,160 Speaker 1: found particularly the bit where he's talking about like, yeah, 377 00:19:22,200 --> 00:19:25,200 Speaker 1: an early sense of confusion or dislocation makes 378 00:19:25,240 --> 00:19:27,959 Speaker 1: people crave this kind of, to give up this kind 379 00:19:28,000 --> 00:19:31,840 Speaker 1: of control and responsibility. The people running these AI companies, 380 00:19:31,840 --> 00:19:34,280 Speaker 1: and maybe not necessarily the very top, because I think 381 00:19:34,280 --> 00:19:37,119 Speaker 1: those tend to be pretty cynical, realistic human beings. But 382 00:19:37,240 --> 00:19:38,680 Speaker 1: like a lot of the people who are in them, 383 00:19:38,680 --> 00:19:40,760 Speaker 1: and a lot of the people who are latching onto 384 00:19:40,800 --> 00:19:44,560 Speaker 1: AI as a fandom online are people whose childhoods and adolescences, 385 00:19:44,600 --> 00:19:46,280 Speaker 1: like all of ours, were shaped by nine eleven, 386 00:19:46,359 --> 00:19:49,719 Speaker 1: the dislocation and change that that caused, and their young adulthoods. 387 00:19:49,760 --> 00:19:51,240 Speaker 1: A lot of these people, like us, will have come 388 00:19:51,320 --> 00:19:52,879 Speaker 1: of age around the time of the two thousand and 389 00:19:52,920 --> 00:19:55,720 Speaker 1: eight crash. Many of the people who are younger in 390 00:19:55,760 --> 00:19:58,560 Speaker 1: the AI fan base are you know, maybe zoomers and stuff, 391 00:19:58,600 --> 00:20:01,600 Speaker 1: and you know, a lot of them are people who 392 00:20:01,640 --> 00:20:04,199 Speaker 1: have really ugly ideas about like artists shouldn't charge for 393 00:20:04,240 --> 00:20:07,560 Speaker 1: shit or whatever. Yes, but also these are people who, 394 00:20:07,720 --> 00:20:10,000 Speaker 1: a lot of them, came into their careers, went into 395 00:20:10,040 --> 00:20:12,920 Speaker 1: STEM fields because they were told coming up the tech 396 00:20:12,960 --> 00:20:15,000 Speaker 1: industry is the safest place to make, you know, a 397 00:20:15,040 --> 00:20:17,720 Speaker 1: good living for yourself, and that all fell apart a 398 00:20:17,760 --> 00:20:19,720 Speaker 1: couple of years ago, right, it started to at least, 399 00:20:19,800 --> 00:20:23,879 Speaker 1: tech layoffs began. So again dislocation, chaos, the sense that 400 00:20:23,960 --> 00:20:26,639 Speaker 1: like what else am I going to entrust my life to?
401 00:20:27,000 --> 00:20:28,760 Speaker 1: I thought I had a plan and it fell apart. 402 00:20:29,320 --> 00:20:32,520 Speaker 2: Yeah, you know, I, this is this is where 403 00:20:32,520 --> 00:20:36,400 Speaker 2: Ify's going to get real philosophical and big brained. 404 00:20:36,840 --> 00:20:39,639 Speaker 2: But I just finished All About Love by bell hooks, 405 00:20:39,760 --> 00:20:43,640 Speaker 2: and you know she often talks about like the wandering 406 00:20:43,680 --> 00:20:47,000 Speaker 2: life with lovelessness and that, and that's searching for it 407 00:20:47,080 --> 00:20:49,359 Speaker 2: and not having having it. And I feel like that 408 00:20:49,400 --> 00:20:52,280 Speaker 2: goes hand in hand with what you're saying right where 409 00:20:52,320 --> 00:20:55,520 Speaker 2: it's like I want a sense of belonging and I 410 00:20:55,600 --> 00:20:57,719 Speaker 2: want to feel like I'm a part of people, and 411 00:20:57,720 --> 00:21:01,760 Speaker 2: whether that is running into Target trampling people for Stanley cups, 412 00:21:01,920 --> 00:21:04,240 Speaker 2: or it's being a part of like what you perceive 413 00:21:04,320 --> 00:21:06,399 Speaker 2: to be the next big thing. Like I think that 414 00:21:06,520 --> 00:21:10,280 Speaker 2: is the biggest kind of selling point for a lot 415 00:21:10,320 --> 00:21:12,680 Speaker 2: of these AI people who's like this is the future, 416 00:21:12,960 --> 00:21:17,240 Speaker 2: Like that is almost every person who starts a fifty 417 00:21:17,280 --> 00:21:21,080 Speaker 2: tweet thread with shitty examples of why AI rocks starts 418 00:21:21,119 --> 00:21:23,880 Speaker 2: it with this is the future and you just got 419 00:21:23,920 --> 00:21:26,280 Speaker 2: to get over it. And there's so many people who 420 00:21:26,359 --> 00:21:27,719 Speaker 2: just want to be on the ground floor of that. 421 00:21:27,760 --> 00:21:30,199 Speaker 2: They want to be the people who were on it. 422 00:21:30,240 --> 00:21:32,960 Speaker 2: Because how many times even I, you know, when you 423 00:21:33,400 --> 00:21:36,200 Speaker 2: have that like time machine question, you're like, oh, stock 424 00:21:36,240 --> 00:21:39,840 Speaker 2: in Starbucks, ooh uh in Apple. You just want to 425 00:21:39,840 --> 00:21:43,800 Speaker 2: be there before it gets big. So really, 426 00:21:43,960 --> 00:21:45,520 Speaker 2: at the end of the day, it all comes down 427 00:21:45,560 --> 00:21:47,320 Speaker 2: to commerce, you want to be at the top when 428 00:21:47,359 --> 00:21:50,560 Speaker 2: it all shifts. And that is actually the danger in 429 00:21:50,600 --> 00:21:53,240 Speaker 2: this for me is the commerce. I think about it 430 00:21:53,320 --> 00:21:57,960 Speaker 2: often because you know, like you're saying, it like orders 431 00:21:58,000 --> 00:22:01,640 Speaker 2: the top Google search. Google is currently in court right 432 00:22:01,640 --> 00:22:06,000 Speaker 2: now fighting against you know, basically shaking down companies to 433 00:22:06,000 --> 00:22:08,960 Speaker 2: see who would be the top one.
So like the 434 00:22:09,000 --> 00:22:14,480 Speaker 2: future of this being actually you know, a useful app 435 00:22:14,560 --> 00:22:17,560 Speaker 2: kind of lives now in that case because if Google 436 00:22:17,600 --> 00:22:19,920 Speaker 2: wins and they can put whoever's on top, that's only 437 00:22:19,920 --> 00:22:22,560 Speaker 2: going to make it more valuable where they place who's 438 00:22:22,560 --> 00:22:26,160 Speaker 2: on top. Because people are using these weird Rabbits, you know, yeah, exactly, 439 00:22:26,560 --> 00:22:30,880 Speaker 2: it's yeah, it is to them. They see the beginning 440 00:22:30,960 --> 00:22:34,960 Speaker 2: of the future, and I feel like, to me, I'm 441 00:22:35,040 --> 00:22:37,200 Speaker 2: just looking at all the ways it can be abused, 442 00:22:37,280 --> 00:22:39,600 Speaker 2: because if we just look at everything that has come 443 00:22:39,640 --> 00:22:42,240 Speaker 2: before us, we have to think of the ways that 444 00:22:42,320 --> 00:22:43,760 Speaker 2: it has been abused. 445 00:22:43,440 --> 00:22:46,000 Speaker 1: And all the ways it'll be a worse future, you know. 446 00:22:46,320 --> 00:22:48,600 Speaker 1: And I think I really liked that you brought up 447 00:22:48,680 --> 00:22:50,960 Speaker 1: the panic they try to incite in the rest of us, 448 00:22:51,000 --> 00:22:54,080 Speaker 1: the like the FOMO where it's like this is the future, 449 00:22:54,359 --> 00:22:57,160 Speaker 1: get on board or you're gonna get left behind, right? 450 00:22:57,600 --> 00:23:01,080 Speaker 1: That is that is the cult recruitment tactic, right, And 451 00:23:01,359 --> 00:23:03,000 Speaker 1: what they're trying to do. I just brought up that 452 00:23:03,480 --> 00:23:06,080 Speaker 1: a lot of the people who are most vulnerable to 453 00:23:06,800 --> 00:23:09,159 Speaker 1: this are the folks who like, yeah, they have this 454 00:23:09,280 --> 00:23:12,639 Speaker 1: sense of like insecurity, dislocation, and they see getting on 455 00:23:12,680 --> 00:23:14,720 Speaker 1: board with this early, they feel like a sense of 456 00:23:14,720 --> 00:23:17,439 Speaker 1: security there. And by saying you're going to get left behind, 457 00:23:17,560 --> 00:23:20,080 Speaker 1: this is the only way forward, you won't be competitive 458 00:23:20,119 --> 00:23:22,840 Speaker 1: if you don't embrace this stuff, they're trying to 459 00:23:22,880 --> 00:23:26,320 Speaker 1: induce that sense of fear and dislocation to make people vulnerable. 460 00:23:26,359 --> 00:23:28,160 Speaker 1: And I want to read another quote from that art 461 00:23:28,280 --> 00:23:32,119 Speaker 1: from that New Yorker article. The less control we feel 462 00:23:32,119 --> 00:23:34,760 Speaker 1: we have over our circumstances, the more likely we are 463 00:23:34,800 --> 00:23:37,320 Speaker 1: to entrust our fates to a higher power. A classic 464 00:23:37,359 --> 00:23:40,680 Speaker 1: example of this relationship was provided by the anthropologist Bronislaw 465 00:23:40,720 --> 00:23:44,160 Speaker 1: Malinowski, who found that fishermen in the Trobriand Islands 466 00:23:44,200 --> 00:23:46,480 Speaker 1: off the coast of New Guinea engaged in more magic 467 00:23:46,560 --> 00:23:49,280 Speaker 1: rituals the further out to sea they went. And I 468 00:23:49,320 --> 00:23:52,159 Speaker 1: think we all feel like we're getting further out to 469 00:23:52,200 --> 00:23:56,000 Speaker 1: sea these days, right, Like, it's not hard to see why.
470 00:23:56,000 --> 00:23:57,720 Speaker 1: It's like, yeah, I'm near the shore or whatever. I 471 00:23:57,760 --> 00:23:59,679 Speaker 1: don't believe in anything but what's right in front of me. 472 00:23:59,840 --> 00:24:01,960 Speaker 1: And then like you can't see anything but water, and 473 00:24:02,000 --> 00:24:03,920 Speaker 1: you're like, no, there's a god and I can keep 474 00:24:03,960 --> 00:24:04,680 Speaker 1: them happy. 475 00:24:07,359 --> 00:24:11,640 Speaker 2: Yes, yes, indeed, it's it's tough. Yeah, everyone's just kind 476 00:24:11,640 --> 00:24:15,639 Speaker 2: of grasping at what they can to just bolster themselves. 477 00:24:15,640 --> 00:24:20,639 Speaker 2: And sometimes yeah, you're grasping at some weird stuff. 478 00:24:21,200 --> 00:24:24,320 Speaker 1: Yeah yeah, yeah, And it's you know, it's noted often 479 00:24:24,440 --> 00:24:26,639 Speaker 1: by I think a lot of particularly atheists on the Internet, 480 00:24:26,680 --> 00:24:29,840 Speaker 1: that like church attendance is down, people who identify as 481 00:24:29,920 --> 00:24:32,320 Speaker 1: part of an organized religion, like that that is at 482 00:24:32,359 --> 00:24:35,960 Speaker 1: its lowest level basically ever, and this is true. These 483 00:24:36,000 --> 00:24:38,679 Speaker 1: are real trends and they have real effects. But I 484 00:24:38,720 --> 00:24:41,000 Speaker 1: don't think the fact that less people are religious in 485 00:24:41,040 --> 00:24:44,760 Speaker 1: the traditional sense means they're less superstitious or spiritual than 486 00:24:44,800 --> 00:24:46,840 Speaker 1: they ever were. It's just that what they invest with that 487 00:24:46,880 --> 00:24:50,359 Speaker 1: belief has changed, in part because they've seen the world 488 00:24:50,400 --> 00:24:54,040 Speaker 1: dislocate so far out of what most priests and other 489 00:24:54,160 --> 00:24:56,720 Speaker 1: sorts of like religious heads are capable of sort of 490 00:24:56,760 --> 00:24:59,600 Speaker 1: explaining or comforting them over right, It's like, oh, religion 491 00:24:59,640 --> 00:25:02,800 Speaker 1: is less comforting in a world as advanced as ours 492 00:25:03,600 --> 00:25:07,040 Speaker 1: for most people. Now, this may seem like a reach 493 00:25:07,080 --> 00:25:09,040 Speaker 1: still to kind of call what's going on around AI 494 00:25:09,160 --> 00:25:10,639 Speaker 1: a cult, and I get that. I ask you to 495 00:25:10,680 --> 00:25:12,959 Speaker 1: bear with me here, and I do want to note 496 00:25:12,960 --> 00:25:16,240 Speaker 1: there's nothing wrong with the inherent technology that we often 497 00:25:16,320 --> 00:25:18,360 Speaker 1: call AI, or at least not with all of it. 498 00:25:18,680 --> 00:25:20,760 Speaker 1: That's A, because it's used as such a wide banner 499 00:25:20,840 --> 00:25:23,520 Speaker 1: term for stuff. Like, just a text recognition 500 00:25:23,600 --> 00:25:26,000 Speaker 1: program that can listen to human voice and create an 501 00:25:26,040 --> 00:25:30,120 Speaker 1: on the fly transcription, that's an AI. That's an example 502 00:25:30,160 --> 00:25:33,000 Speaker 1: of that kind of technology, right, like it gets folded 503 00:25:33,000 --> 00:25:34,520 Speaker 1: in there. That's one of the things an AI has 504 00:25:34,560 --> 00:25:39,040 Speaker 1: to do, recognizing language, and like facial recognition too, recognizing faces.
505 00:25:39,240 --> 00:25:41,520 Speaker 1: If you're ever going to have an actual artificial intelligence, 506 00:25:41,600 --> 00:25:44,160 Speaker 1: those are two of the baseline capabilities that it needs. 507 00:25:44,640 --> 00:25:47,119 Speaker 1: Chatbots obviously are a big part of this, along with 508 00:25:47,200 --> 00:25:49,040 Speaker 1: like the sundry tools that are being used now to 509 00:25:49,040 --> 00:25:52,120 Speaker 1: clone voices, to generate deep fakes and fuel our now 510 00:25:52,160 --> 00:25:56,680 Speaker 1: constant trip into the Uncanny Valley. CES featured some real 511 00:25:56,760 --> 00:25:59,480 Speaker 1: products that actually did harness the promise of machine learning 512 00:25:59,480 --> 00:26:01,000 Speaker 1: in ways that I thought were cool, as I noted 513 00:26:01,160 --> 00:26:03,840 Speaker 1: on It Could Happen Here. There's like this telescope. It uses 514 00:26:03,920 --> 00:26:07,080 Speaker 1: machine learning to like basically clean up images that you 515 00:26:07,160 --> 00:26:09,159 Speaker 1: take with it at night when there's like a lot 516 00:26:09,200 --> 00:26:11,000 Speaker 1: of light pollution so you can see more clearly. And 517 00:26:11,040 --> 00:26:13,920 Speaker 1: I'm like, yeah, that's dope. That's great, But that lived 518 00:26:13,960 --> 00:26:16,720 Speaker 1: alongside a lot of nonsense, you know. ChatGPT for 519 00:26:16,880 --> 00:26:19,880 Speaker 1: dogs was a real thing I saw, and like, there 520 00:26:19,920 --> 00:26:22,879 Speaker 1: was an AI assisted fleshlight to help you not be 521 00:26:22,960 --> 00:26:24,119 Speaker 1: a premature. 522 00:26:25,000 --> 00:26:27,520 Speaker 2: Because of course that's the one that popped up on my timeline. 523 00:26:27,960 --> 00:26:31,080 Speaker 2: It's like and it was like and then they gamified 524 00:26:31,119 --> 00:26:34,560 Speaker 2: it where you go to different planets, you defeat the planet. 525 00:26:34,640 --> 00:26:37,640 Speaker 2: So I'm like, what you You keep talking about beating 526 00:26:37,640 --> 00:26:40,000 Speaker 2: the planets? So how do I lose? Is it when 527 00:26:40,000 --> 00:26:44,960 Speaker 2: I bust? Is busting a loss? Because you're now introducing shame 528 00:26:45,040 --> 00:26:47,159 Speaker 2: to sex again and I thought we finally got 529 00:26:48,400 --> 00:26:50,760 Speaker 2: past that. Yeah, I can't beat that level. 530 00:26:50,640 --> 00:26:55,679 Speaker 1: Those kind of bad ideas. That's all par for 531 00:26:55,720 --> 00:26:58,200 Speaker 1: the course for CES. But what I saw this year, 532 00:26:58,200 --> 00:27:00,639 Speaker 1: in last year, not just at CES, just over the 533 00:27:00,720 --> 00:27:03,719 Speaker 1: year in the tech industry from futurist fanboys and titans 534 00:27:03,720 --> 00:27:06,879 Speaker 1: of industry like Marc Andreessen, is a kind of unhinged 535 00:27:06,920 --> 00:27:10,960 Speaker 1: messianic fervor that compares better to Scientology than it 536 00:27:11,000 --> 00:27:14,240 Speaker 1: does to the iPhone. And I mean that literally. Marc 537 00:27:14,280 --> 00:27:17,080 Speaker 1: Andreessen is the co founder of Netscape and the capital 538 00:27:17,119 --> 00:27:20,439 Speaker 1: firm Andreessen Horowitz. He is one of the most influential 539 00:27:20,480 --> 00:27:22,920 Speaker 1: investors in tech history, and he's put more money into 540 00:27:22,920 --> 00:27:26,640 Speaker 1: AI startups than almost anyone else.
Last year, he published 541 00:27:26,680 --> 00:27:31,200 Speaker 1: something called the Techno Optimist Manifesto on the Andreessen Horowitz website. 542 00:27:31,640 --> 00:27:33,639 Speaker 1: On the surface, it's a paean to the promise of 543 00:27:33,680 --> 00:27:37,080 Speaker 1: AI and an exhortation to embrace the promise of technology 544 00:27:37,280 --> 00:27:40,800 Speaker 1: and disregard pessimism. Plenty of people called the piece out 545 00:27:40,840 --> 00:27:43,359 Speaker 1: for its logical fallacies. For example, it ignores that a 546 00:27:43,359 --> 00:27:45,960 Speaker 1: lot of tech pessimism is due to real harm caused 547 00:27:46,000 --> 00:27:49,040 Speaker 1: by some of the companies Andreessen invested in, like Facebook. 548 00:27:49,359 --> 00:27:53,040 Speaker 1: What's attracted less attention is the messianic overtones of everything 549 00:27:53,040 --> 00:27:57,800 Speaker 1: Andreessen believes. Quote, we believe artificial intelligence can save lives 550 00:27:57,880 --> 00:28:00,960 Speaker 1: if we let it. Medicine, among many other fields, 551 00:28:01,160 --> 00:28:02,760 Speaker 1: is in the Stone Age compared to what we can 552 00:28:02,800 --> 00:28:05,959 Speaker 1: achieve with joined human and machine intelligence working on new cures. 553 00:28:06,200 --> 00:28:08,199 Speaker 1: There are scores of common causes of death that can 554 00:28:08,240 --> 00:28:11,000 Speaker 1: be fixed with AI, from plane crashes to pandemics to 555 00:28:11,040 --> 00:28:15,400 Speaker 1: wartime friendly fire. Now he's right that there's some medical 556 00:28:15,480 --> 00:28:17,760 Speaker 1: uses for AI. It's being used right now to help 557 00:28:17,800 --> 00:28:20,200 Speaker 1: improve the ability to recognize certain kinds of cancer, and 558 00:28:20,400 --> 00:28:22,879 Speaker 1: there's the potential for stuff like in home devices that 559 00:28:22,960 --> 00:28:24,919 Speaker 1: let you scan your skin to see if you're developing 560 00:28:24,960 --> 00:28:28,760 Speaker 1: a melanoma. And there's debate still over how useful it's 561 00:28:28,800 --> 00:28:31,600 Speaker 1: going to be in medical research. I've talked recently to 562 00:28:31,640 --> 00:28:34,000 Speaker 1: some experts and I've read some stuff that like there 563 00:28:34,040 --> 00:28:36,480 Speaker 1: are some reasons for caution too, for some of the 564 00:28:36,520 --> 00:28:39,400 Speaker 1: same reasons we should have caution with this everywhere. There's 565 00:28:39,400 --> 00:28:43,240 Speaker 1: also disinformation that's spread medically with AI, even to doctors, 566 00:28:43,520 --> 00:28:45,880 Speaker 1: and some of the patterns that using this stuff gets 567 00:28:45,960 --> 00:28:49,680 Speaker 1: medical professionals into can make them discount certain diagnoses as well. 568 00:28:49,960 --> 00:28:51,479 Speaker 1: So I don't say that to like say, there's not 569 00:28:51,560 --> 00:28:53,840 Speaker 1: going to be some significant uses for some of the 570 00:28:53,880 --> 00:28:57,240 Speaker 1: way this technology works medically. Some aspects of AI will 571 00:28:57,280 --> 00:29:00,000 Speaker 1: save lives. It's just the evidence right now doesn't suggest 572 00:29:00,240 --> 00:29:03,440 Speaker 1: it's going to completely revolutionize medical science. It's another 573 00:29:03,480 --> 00:29:05,520 Speaker 1: advancement that will be good in some ways, and there 574 00:29:05,520 --> 00:29:07,280 Speaker 1: will be some negative aspects of it too.
575 00:29:07,440 --> 00:29:07,600 Speaker 2: Right. 576 00:29:08,200 --> 00:29:10,840 Speaker 1: It's also very much not fair to say that, like 577 00:29:10,920 --> 00:29:14,040 Speaker 1: we're going to reduce deaths for human beings as a 578 00:29:14,080 --> 00:29:16,680 Speaker 1: result of AI, because right now the nation of Israel 579 00:29:16,760 --> 00:29:19,600 Speaker 1: is using an AI program called the Gospel to assist 580 00:29:19,600 --> 00:29:22,120 Speaker 1: it in aiming its air strikes, which have been widely 581 00:29:22,200 --> 00:29:27,720 Speaker 1: condemned for their exceptional, outstanding, in many cases genocidal 582 00:29:27,840 --> 00:29:31,080 Speaker 1: level of civilian casualties. Yes, it's just outrageous. 583 00:29:31,360 --> 00:29:34,080 Speaker 2: Yeah, oh, one hundred percent, And you know, you know 584 00:29:34,920 --> 00:29:37,240 Speaker 2: that's exactly what's going on, is a genocide, and you 585 00:29:37,240 --> 00:29:39,720 Speaker 2: know the language in a lot of these speeches says 586 00:29:39,760 --> 00:29:43,560 Speaker 2: as much. Fair, yeah, even more so. Yeah, 587 00:29:43,680 --> 00:29:46,520 Speaker 2: like like you're saying, another thing I want to point out, 588 00:29:46,520 --> 00:29:48,960 Speaker 2: which you might have been about to say, and I'm 589 00:29:49,240 --> 00:29:52,320 Speaker 2: already jumping ahead, is how I think it was Chat 590 00:29:52,440 --> 00:29:56,880 Speaker 2: GPT that has quietly switched their terms of service to 591 00:29:57,440 --> 00:30:00,400 Speaker 2: say that it wouldn't be used for like weapons and. 592 00:30:00,320 --> 00:30:03,360 Speaker 1: Oh yeah, to hurt people, and that's for sure. 593 00:30:03,440 --> 00:30:06,120 Speaker 2: Yeah, and now now it has it has been quietly 594 00:30:06,160 --> 00:30:09,360 Speaker 2: scrubbed from those terms of service, and we do we 595 00:30:09,400 --> 00:30:12,200 Speaker 2: need to talk about that, Yeah, because there's just so 596 00:30:12,320 --> 00:30:17,320 Speaker 2: many things that we have grown accustomed to with tech 597 00:30:17,800 --> 00:30:20,440 Speaker 2: that I think is dangerous as we get into things 598 00:30:20,480 --> 00:30:24,000 Speaker 2: that have more room for error. Because we're used to 599 00:30:24,080 --> 00:30:26,560 Speaker 2: updated terms of service on our iPhone, right every time 600 00:30:26,560 --> 00:30:28,760 Speaker 2: we grab an update, it's like, here's a 601 00:30:28,800 --> 00:30:30,840 Speaker 2: new terms of service and you just kind of scroll 602 00:30:30,880 --> 00:30:33,000 Speaker 2: through it and you go yeah, because you're like, yeah, 603 00:30:33,160 --> 00:30:35,480 Speaker 2: you know, this is just a phone. It's not gonna, 604 00:30:35,560 --> 00:30:39,200 Speaker 2: you know, be used for anything weird yet, uh, so 605 00:30:39,800 --> 00:30:41,920 Speaker 2: you're you're comfortable. But like when you're doing the same 606 00:30:41,960 --> 00:30:46,880 Speaker 2: thing with these ChatGPT machine learning situations where you're 607 00:30:46,920 --> 00:30:49,920 Speaker 2: you're agreeing, and you're agreeing, Okay, I will help this 608 00:30:50,000 --> 00:30:55,400 Speaker 2: thing learn, and now you are just actively helping it 609 00:30:55,520 --> 00:31:00,959 Speaker 2: learn how to be an assassin. Like what happens there?
610 00:31:01,400 --> 00:31:04,800 Speaker 1: Yeah, And it's again it's this back and forth where 611 00:31:05,120 --> 00:31:07,960 Speaker 1: on one hand, there is some technology like AI enabled 612 00:31:08,040 --> 00:31:10,880 Speaker 1: robots that can go run onto a battlefield and pick 613 00:31:10,960 --> 00:31:12,600 Speaker 1: up an injured soldier. I have no desire to see 614 00:31:12,640 --> 00:31:15,320 Speaker 1: some random private bleed to death in a foreign country, 615 00:31:15,520 --> 00:31:19,520 Speaker 1: fine with that. Or anti missile missiles right, using AI 616 00:31:19,640 --> 00:31:21,680 Speaker 1: to intercept and stop a missile from blowing up in 617 00:31:21,720 --> 00:31:24,080 Speaker 1: a civilian area sounds fine, Like I don't. I don't 618 00:31:24,080 --> 00:31:26,840 Speaker 1: want random people to die from missiles, But it's also 619 00:31:26,880 --> 00:31:29,320 Speaker 1: going to be used to target those missiles. And to say, like, 620 00:31:29,400 --> 00:31:32,560 Speaker 1: based on some shit we analyzed on Twitter or whatever, 621 00:31:32,800 --> 00:31:35,840 Speaker 1: we think wiping out this grid square of apartment buildings 622 00:31:35,960 --> 00:31:39,200 Speaker 1: will really get a lot of the bad guys, and based. 623 00:31:38,960 --> 00:31:43,120 Speaker 2: On we should blow them up exactly. It's crazy. 624 00:31:43,640 --> 00:31:46,200 Speaker 1: It's just it's there's certainly it's certainly not fair to 625 00:31:46,240 --> 00:31:49,040 Speaker 1: say there won't be benefits, but it's absolutely unclear in 626 00:31:49,080 --> 00:31:51,560 Speaker 1: every field of endeavor whether or not they will outweigh 627 00:31:51,560 --> 00:31:54,800 Speaker 1: the harms, right, And even if they do, to what extent, 628 00:31:55,000 --> 00:31:56,920 Speaker 1: you know, because a lot of what I'm saying suggests 629 00:31:56,920 --> 00:31:58,680 Speaker 1: that even if the benefits outweigh the harms in a 630 00:31:58,680 --> 00:32:00,960 Speaker 1: lot of fields, it's still I'm not going because of 631 00:32:01,000 --> 00:32:03,200 Speaker 1: the extent of the harms. In part, it's still not 632 00:32:03,240 --> 00:32:04,880 Speaker 1: going to be a massive sea change. 633 00:32:05,000 --> 00:32:05,200 Speaker 2: Right. 634 00:32:05,600 --> 00:32:08,280 Speaker 1: There are a lot of reasons for caution, but Mark 635 00:32:08,360 --> 00:32:11,360 Speaker 1: has no time for doubters. In fact, to him, doubting 636 00:32:11,400 --> 00:32:16,000 Speaker 1: the benefits of AGI artificial general intelligence is the only 637 00:32:16,160 --> 00:32:20,480 Speaker 1: true sin of his religion. Quote. We believe any deceleration 638 00:32:20,600 --> 00:32:24,680 Speaker 1: of AI will cost lives deaths that were preventable by 639 00:32:24,720 --> 00:32:27,160 Speaker 1: the AI that was prevented from existing. Is a form 640 00:32:27,200 --> 00:32:32,920 Speaker 1: of murder. And that's fucked up. That's really dangerous to 641 00:32:33,440 --> 00:32:36,760 Speaker 1: start talking like that. Oh yeah, and this is the 642 00:32:36,760 --> 00:32:39,520 Speaker 1: more direct cult comment here. I want you to compare 643 00:32:39,520 --> 00:32:42,240 Speaker 1: the claim Mark made above that slowing down AI is 644 00:32:42,440 --> 00:32:45,840 Speaker 1: identical to murder. I want you to compare that to 645 00:32:45,880 --> 00:32:48,680 Speaker 1: the claims the Church of Scientologies makes. 
Because the Church 646 00:32:48,720 --> 00:32:51,400 Speaker 1: of Scientology, they have this list of practices and the 647 00:32:51,400 --> 00:32:54,400 Speaker 1: beliefs that they call tech, right, and they believe that 648 00:32:54,720 --> 00:32:57,760 Speaker 1: by taking on tech, by engaging with it, people can 649 00:32:57,800 --> 00:33:00,320 Speaker 1: become clear of all of their flaws, and by 650 00:33:00,360 --> 00:33:02,160 Speaker 1: doing that you can help fix all of 651 00:33:02,160 --> 00:33:04,840 Speaker 1: the problems in the world. Right, the Church of Scientology 652 00:33:04,840 --> 00:33:07,680 Speaker 1: on its website claims that its followers will quote rid 653 00:33:07,720 --> 00:33:10,440 Speaker 1: the planet of insanity, war and crime, and in its 654 00:33:10,440 --> 00:33:13,360 Speaker 1: place create a civilization in which sanity and peace exist. 655 00:33:13,920 --> 00:33:16,800 Speaker 1: How is that in any way different from Mark Andreessen 656 00:33:16,920 --> 00:33:18,960 Speaker 1: saying all of the shit that he's saying, right, that 657 00:33:19,040 --> 00:33:22,000 Speaker 1: it's going to like create this, this... We're going to 658 00:33:22,120 --> 00:33:24,920 Speaker 1: revolutionize medicine, We're going to like end friendly fire, We're 659 00:33:24,920 --> 00:33:26,960 Speaker 1: going to cure pandemics, we're going to stop car crashes. 660 00:33:27,040 --> 00:33:32,000 Speaker 1: What is the difference? Right and Scientology uses that 661 00:33:32,040 --> 00:33:35,400 Speaker 1: claim that Scientology tech is so necessary it's going to 662 00:33:35,520 --> 00:33:38,080 Speaker 1: fix all these problems. So anyone who gets in the 663 00:33:38,120 --> 00:33:41,160 Speaker 1: way of the Church of Scientology and the deployment of 664 00:33:41,160 --> 00:33:44,160 Speaker 1: this tech for mankind's benefit is subject to what they 665 00:33:44,200 --> 00:33:47,520 Speaker 1: call fair game. A person declared fair game quote may 666 00:33:47,520 --> 00:33:49,960 Speaker 1: be deprived of property or injured by any means by 667 00:33:50,000 --> 00:33:53,960 Speaker 1: any Scientologist. And again, Mark Andreessen has not said that 668 00:33:54,040 --> 00:33:56,560 Speaker 1: in his Techno Optimist Manifesto. In fact, he makes some 669 00:33:56,600 --> 00:33:59,280 Speaker 1: claims about like, no, no people are our enemies. 670 00:33:59,360 --> 00:33:59,560 Speaker 2: Right. 671 00:34:00,280 --> 00:34:02,840 Speaker 1: If you're saying you're a murderer for slowing this down, 672 00:34:03,000 --> 00:34:05,360 Speaker 1: it's not hard to see how some people might adopt 673 00:34:06,400 --> 00:34:09,319 Speaker 1: a practice like fair game eventually, right, that's... where 674 00:34:09,360 --> 00:34:10,080 Speaker 1: else does this go? 675 00:34:10,280 --> 00:34:12,160 Speaker 2: Is my way? What do we do with murderers? What 676 00:34:12,200 --> 00:34:15,520 Speaker 2: does... I feel like the general rule across all creeds? Yeah, 677 00:34:15,520 --> 00:34:20,320 Speaker 2: across all beliefs is typically murderers are bad and should die. 678 00:34:20,440 --> 00:34:22,880 Speaker 1: That, or at least to be punished. Right, there's a 679 00:34:22,960 --> 00:34:28,400 Speaker 1: punishment for murderers. Most people agree. Yeah, speaking of murder, 680 00:34:28,560 --> 00:34:30,400 Speaker 1: you know who has never committed a murder? 681 00:34:31,760 --> 00:34:36,839 Speaker 2: You cannot say that, like Blake. Okay, all of our ads.
682 00:34:36,719 --> 00:34:40,600 Speaker 1: You know who, I can't prove we're involved in any murders, 683 00:34:40,640 --> 00:34:43,239 Speaker 1: just like I have no evidence that Jamie Loftus had 684 00:34:43,280 --> 00:34:45,840 Speaker 1: any role with Okay, Okay. 685 00:34:45,320 --> 00:34:49,000 Speaker 5: What I'm saying, one, my girl is innocent, and 686 00:34:49,040 --> 00:34:53,520 Speaker 5: two, you can't say that about ours, about this. Okay, Well, 687 00:34:53,640 --> 00:34:56,560 Speaker 5: we don't pick them. 688 00:34:55,640 --> 00:34:57,760 Speaker 1: Iffy, by the way, we're spreading a rumor online 689 00:34:57,760 --> 00:35:00,600 Speaker 1: that Jamie Loftus was possibly involved in a series of 690 00:35:00,680 --> 00:35:04,399 Speaker 1: murders in Grand Rapids, Michigan. It's a bit, it's it's 691 00:35:04,480 --> 00:35:05,240 Speaker 1: it's a good bit. 692 00:35:08,239 --> 00:35:12,960 Speaker 5: So you know who knows innocent? 693 00:35:13,880 --> 00:35:14,279 Speaker 2: Uh huh. 694 00:35:14,280 --> 00:35:25,480 Speaker 1: Anyway, here's here's some ads. Oh We're back. So the 695 00:35:25,480 --> 00:35:29,040 Speaker 1: more you dig into Andreessen's theology, the more it starts 696 00:35:29,040 --> 00:35:32,840 Speaker 1: to seem like a form of techno capitalist Christianity. AI 697 00:35:33,040 --> 00:35:35,720 Speaker 1: is the savior. In the case of devices like the Rabbit, 698 00:35:35,760 --> 00:35:39,719 Speaker 1: it might literally become our own personal Jesus. And who, 699 00:35:39,840 --> 00:35:43,840 Speaker 1: you might ask, is God? Quote. We believe the market 700 00:35:43,880 --> 00:35:48,280 Speaker 1: economy is a discovery machine, a form of intelligence, an exploratory, 701 00:35:48,360 --> 00:35:55,239 Speaker 1: evolutionary adaptive system. Now God, through this concept of reality, 702 00:35:55,680 --> 00:35:59,400 Speaker 1: capitalism itself, and capitalize the C there because it's a deity, 703 00:35:59,600 --> 00:36:04,440 Speaker 1: has chosen to bring artificial general intelligence into being. All the 704 00:36:04,560 --> 00:36:07,440 Speaker 1: jobs lost, all the incoherent flotsam choking our Internet, 705 00:36:07,440 --> 00:36:10,040 Speaker 1: all the Amazon drop shippers using chat GPT to write 706 00:36:10,040 --> 00:36:13,600 Speaker 1: product descriptions, these are but the market expressing its will. 707 00:36:14,239 --> 00:36:17,760 Speaker 1: Artists have to be plagiarized. Children need to be presented 708 00:36:17,800 --> 00:36:21,360 Speaker 1: with hours of procedurally generated slop and lies on YouTube 709 00:36:21,440 --> 00:36:25,000 Speaker 1: so that we one day can reach the promised land 710 00:36:25,160 --> 00:36:28,680 Speaker 1: of artificial general intelligence. Iffy, isn't it worth it? 711 00:36:29,080 --> 00:36:32,319 Speaker 2: Oh my god, it so isn't. And it's I didn't know. 712 00:36:32,840 --> 00:36:33,040 Speaker 4: You know. 713 00:36:33,160 --> 00:36:36,080 Speaker 2: One of the biggest criticisms with AI is that it, 714 00:36:36,239 --> 00:36:38,520 Speaker 2: you know, it is one of the effects of when, 715 00:36:38,960 --> 00:36:42,759 Speaker 2: you know, creativity and commerce meet. Commerce will always try 716 00:36:42,800 --> 00:36:46,960 Speaker 2: and kill creativity because commerce is more concerned 717 00:36:47,080 --> 00:36:50,440 Speaker 2: with the, the buck than it is 718 00:36:50,520 --> 00:36:53,440 Speaker 2: the outcome or what it takes to get said buck.
719 00:36:53,960 --> 00:36:56,000 Speaker 2: And that was that was going to be a whole thing. 720 00:36:56,040 --> 00:36:58,399 Speaker 2: I was going to drop at some point, and they 721 00:36:58,480 --> 00:37:00,799 Speaker 2: just said it for me. They just send like like 722 00:37:00,840 --> 00:37:02,960 Speaker 2: they I didn't. I thought it was more veiled. I 723 00:37:02,960 --> 00:37:04,840 Speaker 2: thought it was more hidden. But you know, that is 724 00:37:05,160 --> 00:37:09,120 Speaker 2: that is why you can you can try and say 725 00:37:09,160 --> 00:37:12,200 Speaker 2: that it is ethical to take from artists because we're 726 00:37:12,360 --> 00:37:16,440 Speaker 2: making it easier for you. Your fingie's hurt doing all that drawing. 727 00:37:16,640 --> 00:37:19,680 Speaker 2: But if it learns from you, and and and and 728 00:37:19,680 --> 00:37:22,919 Speaker 2: and and you know, uh, don't ask how you're gonna 729 00:37:22,920 --> 00:37:25,239 Speaker 2: get paid, But if it learns from you, we can. 730 00:37:25,520 --> 00:37:27,960 Speaker 2: We can give the people what they want without causing 731 00:37:28,000 --> 00:37:31,600 Speaker 2: you all this labor. You know, but who gets the money. 732 00:37:31,640 --> 00:37:34,200 Speaker 2: It's it's always the dork behind the computer who did 733 00:37:34,239 --> 00:37:38,279 Speaker 2: the code that is just essentially stealing from all of 734 00:37:38,320 --> 00:37:42,600 Speaker 2: these people and learning from them and then just producing 735 00:37:42,640 --> 00:37:46,520 Speaker 2: this amalgation of everything they've done. 736 00:37:46,800 --> 00:37:50,239 Speaker 1: Yeah. Yeah, absolutely. Also, I should know I'm trying to 737 00:37:50,280 --> 00:37:52,480 Speaker 1: be consistent about this. I wrote it down and then 738 00:37:52,520 --> 00:37:54,800 Speaker 1: I think slipped into it. It's Andres and Mark Andrees 739 00:37:54,840 --> 00:37:57,759 Speaker 1: and andresen Horowitz. Uh, it's just a weird name that 740 00:37:57,800 --> 00:38:00,120 Speaker 1: I'm not used to say. I wrote this down and 741 00:38:00,120 --> 00:38:03,160 Speaker 1: then immediately forgot to correct myself at the start of 742 00:38:03,200 --> 00:38:06,560 Speaker 1: the podcast. Again, folks, hack and a fraud. But you 743 00:38:06,560 --> 00:38:08,520 Speaker 1: know who can own well, actually, I won't say only 744 00:38:08,600 --> 00:38:11,239 Speaker 1: humans can be hacks and frauds like that. Because the 745 00:38:11,280 --> 00:38:14,560 Speaker 1: AI is absolutely mispronounced shit and get shit wrong too. 746 00:38:15,640 --> 00:38:20,239 Speaker 1: I guess that maybe they are getting conscious. Can they 747 00:38:20,239 --> 00:38:22,279 Speaker 1: build an AI that's as much a hack and a 748 00:38:22,360 --> 00:38:27,320 Speaker 1: fraud as I am. We'll see. So no, thank you, Sophie, 749 00:38:27,640 --> 00:38:31,440 Speaker 1: I appreciate it. AGI is treated as an inevitability by 750 00:38:31,480 --> 00:38:33,759 Speaker 1: people like Sam Altman of open Ai, who need it 751 00:38:33,800 --> 00:38:36,360 Speaker 1: to be at least perceived as inevitable so their company 752 00:38:36,360 --> 00:38:39,399 Speaker 1: can have the largest possible IPO. Right, there's a lot 753 00:38:39,440 --> 00:38:41,640 Speaker 1: of money on the line and in the people with 754 00:38:41,760 --> 00:38:45,040 Speaker 1: money believing all of the promises that Andresen is making. 
755 00:38:45,440 --> 00:38:48,200 Speaker 1: This messianic fervor has also been adopted by squadrons of 756 00:38:48,280 --> 00:38:51,239 Speaker 1: less influential tech executives who simply need AI to be 757 00:38:51,360 --> 00:38:55,000 Speaker 1: real because it solves a financial problem. Venture capital funding 758 00:38:55,040 --> 00:38:57,560 Speaker 1: for big tech collapsed in the months before chat GPT 759 00:38:57,719 --> 00:39:01,160 Speaker 1: hit public consciousness. The reason cees was packed with so 760 00:39:01,239 --> 00:39:04,120 Speaker 1: many random AI branded products was that sticking those two 761 00:39:04,200 --> 00:39:06,640 Speaker 1: letters on a new company is like they treat it 762 00:39:06,680 --> 00:39:08,960 Speaker 1: like a talisman, right, It's this ritual to bring back 763 00:39:09,000 --> 00:39:11,319 Speaker 1: the rainy season. You know, if you throw AI in 764 00:39:11,360 --> 00:39:14,440 Speaker 1: your shit, people might buy it. Yeah, And it's you know, 765 00:39:14,960 --> 00:39:17,800 Speaker 1: there's versions of this, like laptop makers are throwing AI 766 00:39:17,880 --> 00:39:20,360 Speaker 1: in everything they do now just because like laptop sales 767 00:39:20,480 --> 00:39:22,840 Speaker 1: soared during the start of the pandemic, but they plummeted 768 00:39:22,880 --> 00:39:25,120 Speaker 1: because people, for one thing, don't need to buy laptops 769 00:39:25,120 --> 00:39:29,719 Speaker 1: all the fucking time. Most people wear them out right, yes, exactly. 770 00:39:29,960 --> 00:39:31,920 Speaker 1: And again this comes in. If you can get people 771 00:39:31,920 --> 00:39:33,319 Speaker 1: in the cold, if you can get them scared that 772 00:39:33,320 --> 00:39:35,439 Speaker 1: they're going to fall behind without AI, then maybe they'll 773 00:39:35,480 --> 00:39:37,160 Speaker 1: buy a new AI and I have lap cup top 774 00:39:37,160 --> 00:39:38,360 Speaker 1: because they're like, well, this is what I got to 775 00:39:38,400 --> 00:39:41,719 Speaker 1: do to stay competitive. You know, the terminology that these 776 00:39:41,800 --> 00:39:45,719 Speaker 1: rich tech executives use around AI is generally more grounded 777 00:39:45,760 --> 00:39:49,520 Speaker 1: than Andreson's prophesying, right, but it's just as irrational. The 778 00:39:49,560 --> 00:39:52,040 Speaker 1: most unhinged thing I heard in person at CES was 779 00:39:52,040 --> 00:39:56,759 Speaker 1: from Bartley Richardson, an AI infrastructure manager at Nvidia, who 780 00:39:56,840 --> 00:39:59,799 Speaker 1: opened a panel on deep fakes by announcing, I think 781 00:40:00,000 --> 00:40:02,800 Speaker 1: everybody has a co pilot. Everybody's making a co pilot. 782 00:40:02,840 --> 00:40:05,120 Speaker 1: Everybody wants a co pilot. Right, there's going to be 783 00:40:05,160 --> 00:40:08,040 Speaker 1: a Bartley co pilot maybe sometime in the future. That's 784 00:40:08,080 --> 00:40:10,359 Speaker 1: just a great way to accelerate us as humans, right. 785 00:40:11,760 --> 00:40:15,000 Speaker 1: And it's funny he's named Bartley. If you if you 786 00:40:15,040 --> 00:40:17,360 Speaker 1: know your old Star Trek and you can remember Barkley, 787 00:40:17,520 --> 00:40:22,960 Speaker 1: the the Sad insign. He sounds like that guy and 788 00:40:23,080 --> 00:40:23,839 Speaker 1: resembles him. 
789 00:40:26,960 --> 00:40:30,640 Speaker 2: Oh man, What's what's funny about that speech is it 790 00:40:30,680 --> 00:40:36,480 Speaker 2: sounds like he's trying to convince himself to that Yeah, yeah, 791 00:40:36,520 --> 00:40:38,480 Speaker 2: we were not wasting our time, are we? 792 00:40:38,640 --> 00:40:38,879 Speaker 5: Yeah? 793 00:40:39,120 --> 00:40:42,520 Speaker 1: Yeah again. Later in a separate panel, in Nvidia in 794 00:40:42,600 --> 00:40:45,560 Speaker 1: house council Nikki Pope, who's like the only skeptic they 795 00:40:45,640 --> 00:40:50,120 Speaker 1: let On cited internal research showing consumer trust and brands 796 00:40:50,160 --> 00:40:54,600 Speaker 1: fell whenever they used AI. This gels with research published 797 00:40:54,640 --> 00:40:57,520 Speaker 1: last December that found around twenty five percent of customers 798 00:40:57,719 --> 00:41:01,200 Speaker 1: trust decisions made by AI less than those made by people. 799 00:41:01,680 --> 00:41:05,120 Speaker 1: No one on stage bothered to ask Bartley. It was like, Okay, 800 00:41:05,160 --> 00:41:08,000 Speaker 1: you want to use this thing. We know your own 801 00:41:08,160 --> 00:41:11,160 Speaker 1: company has data that it makes companies less trustworthy. Are 802 00:41:11,200 --> 00:41:13,680 Speaker 1: you worried that if you use it people won't trust you? 803 00:41:13,800 --> 00:41:13,920 Speaker 5: Like? 804 00:41:15,160 --> 00:41:17,640 Speaker 1: Is that not in your head? And that's that was 805 00:41:17,719 --> 00:41:20,520 Speaker 1: kind of a pattern at CEES all of the benefits 806 00:41:20,560 --> 00:41:23,840 Speaker 1: of AI, with some very very specific exceptions. Most of 807 00:41:23,840 --> 00:41:26,399 Speaker 1: the benefits of AI were touted in vague terms. It'll 808 00:41:26,400 --> 00:41:29,000 Speaker 1: make your company nimble, it'll make it more efficient, you know, 809 00:41:29,239 --> 00:41:33,280 Speaker 1: it'll accelerate you harms though, while they were discussed less often, 810 00:41:33,560 --> 00:41:36,759 Speaker 1: they were discussed with a terrible specificity that stood out 811 00:41:36,840 --> 00:41:38,759 Speaker 1: next to the vagueness. One of the guys in the 812 00:41:38,760 --> 00:41:41,200 Speaker 1: deep Fake panel was Ben Coleman, and he's the CEO 813 00:41:41,280 --> 00:41:45,160 Speaker 1: of Reality Defender, which is a company that detects artificially 814 00:41:45,200 --> 00:41:47,520 Speaker 1: generated media. Right their job is like, let you know 815 00:41:47,560 --> 00:41:50,960 Speaker 1: if something's AI generated. And he claims that his company 816 00:41:51,000 --> 00:41:56,320 Speaker 1: expects half a trillion dollars in fraud worldwide this year 817 00:41:57,000 --> 00:42:02,360 Speaker 1: just from voice cloning AI. Not fraud from AI, just 818 00:42:02,520 --> 00:42:06,280 Speaker 1: the fake voice AI happy trillion dollars. 819 00:42:06,640 --> 00:42:09,560 Speaker 2: Yeah. I here's the thing too that I think is 820 00:42:09,600 --> 00:42:11,680 Speaker 2: the scariest part of AI that I don't think we've 821 00:42:11,680 --> 00:42:15,279 Speaker 2: talked about yet is that everyone can use it. You know, 822 00:42:15,400 --> 00:42:17,759 Speaker 2: like it's not this isn't a thing that is like, 823 00:42:18,080 --> 00:42:21,799 Speaker 2: oh well, it's it's you know, these companies is too expensive. 
824 00:42:21,880 --> 00:42:23,839 Speaker 2: People are priced out, and we just got to hope 825 00:42:23,840 --> 00:42:28,680 Speaker 2: that everyone's good. You're getting like SpongeBob rapping Kendrick Lyrics 826 00:42:28,719 --> 00:42:32,759 Speaker 2: on TikTok. You know, we my buddy Bennieam jokingly like 827 00:42:32,840 --> 00:42:35,040 Speaker 2: promoted his show and made it look like he was 828 00:42:35,080 --> 00:42:38,319 Speaker 2: having a FaceTime with Obama, and it was like pretty good. 829 00:42:38,360 --> 00:42:40,399 Speaker 2: The only reason you knew it was it was fake 830 00:42:40,520 --> 00:42:43,840 Speaker 2: was because of just the nature of the video. But like, 831 00:42:44,080 --> 00:42:46,719 Speaker 2: at what point is someone going to stop and go, hey, 832 00:42:47,880 --> 00:42:52,560 Speaker 2: we shouldn't have technology where someone can impersonate world leaders. 833 00:42:53,080 --> 00:42:57,000 Speaker 1: Yeah, it's and you know, to be honest, that's not 834 00:42:57,120 --> 00:43:00,400 Speaker 1: even because just because like I think everyone is ready 835 00:43:00,400 --> 00:43:03,080 Speaker 1: for the idea that like, yeah, people are faking Obama 836 00:43:03,200 --> 00:43:05,000 Speaker 1: or by because we've done little versions of that for 837 00:43:05,120 --> 00:43:08,680 Speaker 1: years now. I think the scariest thing is people aren't 838 00:43:08,680 --> 00:43:11,640 Speaker 1: ready for their loved ones to be imitated by AI. 839 00:43:11,800 --> 00:43:13,920 Speaker 1: And that is a thing that is happening in twenty 840 00:43:13,960 --> 00:43:16,120 Speaker 1: twenty three. And this has happened in a lot to 841 00:43:16,160 --> 00:43:17,719 Speaker 1: a lot of people. There was a specific case that 842 00:43:17,800 --> 00:43:19,680 Speaker 1: kind of went viral of this mother who got a 843 00:43:19,760 --> 00:43:23,759 Speaker 1: call from what sounded like her kidnapped daughter and like, 844 00:43:24,000 --> 00:43:26,120 Speaker 1: the AI generated the voice of her daughter, and then 845 00:43:26,120 --> 00:43:28,040 Speaker 1: a guy was like, give us money or will fucking 846 00:43:28,120 --> 00:43:30,960 Speaker 1: murder her? Right, and her kid was never kidnapped. She 847 00:43:31,080 --> 00:43:34,279 Speaker 1: very nearly sent them money because who wouldn't write yea 848 00:43:34,600 --> 00:43:36,440 Speaker 1: like if you don't know that that's a thing that 849 00:43:36,520 --> 00:43:41,799 Speaker 1: can do. Who would not y like, that's a that's 850 00:43:41,840 --> 00:43:44,439 Speaker 1: a rock and it's the AI was able to clone 851 00:43:44,440 --> 00:43:46,399 Speaker 1: her daughter's voice because her daughter has a TikTok, right, 852 00:43:46,520 --> 00:43:49,000 Speaker 1: it doesn't take that much, you know. And this is 853 00:43:49,000 --> 00:43:50,799 Speaker 1: why by the way that people are talking about ways 854 00:43:50,800 --> 00:43:52,279 Speaker 1: to mitigate this, I think one of them is like 855 00:43:52,360 --> 00:43:55,080 Speaker 1: have a family password or something where it's like, all right, 856 00:43:55,120 --> 00:43:57,520 Speaker 1: if I'm fucking kidnapped, I'm going to say the password 857 00:43:57,640 --> 00:44:00,239 Speaker 1: you know, so that you know some random person with 858 00:44:00,280 --> 00:44:02,440 Speaker 1: your TikTok won't know it or at least has to 859 00:44:02,560 --> 00:44:04,839 Speaker 1: try harder to guess it. 
So great, thanks to AI, 860 00:44:04,920 --> 00:44:07,160 Speaker 1: now we have to have passwords for our families in 861 00:44:07,200 --> 00:44:08,400 Speaker 1: real life. Cool. 862 00:44:08,800 --> 00:44:11,279 Speaker 2: Yeah, if you want to see that Thanksgiving, you got 863 00:44:11,280 --> 00:44:12,239 Speaker 2: to know the password. 864 00:44:12,520 --> 00:44:18,800 Speaker 1: Yeah. Fucking great. At CES and at the Substacks and 865 00:44:18,880 --> 00:44:21,520 Speaker 1: newsletters of all these AI cultists, there's no time to 866 00:44:21,600 --> 00:44:24,719 Speaker 1: dwell on problems like these. Full steam ahead is the 867 00:44:24,760 --> 00:44:28,200 Speaker 1: only serious suggestion they make. You should all be excited. 868 00:44:28,280 --> 00:44:31,480 Speaker 1: Google's VP of Engineering Behshad Behzadi tells us during a 869 00:44:31,480 --> 00:44:35,200 Speaker 1: panel discussion with a McDonald's executive, if you're not using AI, 870 00:44:35,320 --> 00:44:39,960 Speaker 1: Behzadi warned, you're missing out. And I heard variations 871 00:44:40,000 --> 00:44:42,239 Speaker 1: of the same sentiment over and over again. Right, not 872 00:44:42,320 --> 00:44:43,880 Speaker 1: just this stuff is great, but like you're kind of 873 00:44:43,920 --> 00:44:46,359 Speaker 1: doomed if you don't use it, And I will say, 874 00:44:46,680 --> 00:44:49,960 Speaker 1: Nikki Pope was the only skeptic really who had a 875 00:44:50,000 --> 00:44:53,680 Speaker 1: speaking role at CES. It is not coincidental that she 876 00:44:53,880 --> 00:44:56,839 Speaker 1: was an academic and a black woman, because her background 877 00:44:57,239 --> 00:45:01,120 Speaker 1: is studying algorithmic bias in the justice system. Yes, and 878 00:45:01,800 --> 00:45:05,320 Speaker 1: so she, she had some really good points about 879 00:45:05,320 --> 00:45:07,680 Speaker 1: like the actual dangers this stuff had. She was on 880 00:45:07,719 --> 00:45:10,239 Speaker 1: this panel, it was on, like, governing AI risk, and 881 00:45:10,320 --> 00:45:11,960 Speaker 1: kind of her, like, the partner on the panel, the 882 00:45:11,960 --> 00:45:15,560 Speaker 1: guy she was talking with, was Adobe VP Alexandru Costin, 883 00:45:16,040 --> 00:45:18,400 Speaker 1: and she urged the audience, I want you to think 884 00:45:18,480 --> 00:45:23,400 Speaker 1: about the direct harm algorithmic bias could do to marginalized communities. Quote, 885 00:45:23,520 --> 00:45:26,960 Speaker 1: if we create AI that disparately treats one group, tremendously 886 00:45:27,000 --> 00:45:29,560 Speaker 1: in favor of another group, the group that is disadvantaged 887 00:45:29,640 --> 00:45:33,640 Speaker 1: or disenfranchised, that's an existential threat to that group. And 888 00:45:33,680 --> 00:45:36,399 Speaker 1: she was specifically like people talk about the existential threat 889 00:45:36,400 --> 00:45:38,400 Speaker 1: of an AI going crazy and killing us all, but 890 00:45:38,520 --> 00:45:41,600 Speaker 1: like that's not as realistic as what we know will happen. 891 00:45:41,800 --> 00:45:45,080 Speaker 2: Yes, oh yeah. When you have these banks that like 892 00:45:45,560 --> 00:45:49,560 Speaker 2: automatically you know, are using AI to try and approve 893 00:45:50,160 --> 00:45:52,960 Speaker 2: you know, loans, and Deonte sends his name in and 894 00:45:53,000 --> 00:45:54,440 Speaker 2: they're like not approved. 895 00:45:54,880 --> 00:45:55,719 Speaker 1: Yeah exactly.
896 00:45:55,880 --> 00:45:58,399 Speaker 2: He's like nah, good, And. 897 00:45:58,640 --> 00:46:00,959 Speaker 1: I am glad she was there. She again she still 898 00:46:01,040 --> 00:46:02,680 Speaker 1: you know, works for a company it's gonna make money 899 00:46:02,719 --> 00:46:05,319 Speaker 1: off of this. She's not like a doomer on it, 900 00:46:05,360 --> 00:46:07,879 Speaker 1: but like at least one person was being like, could 901 00:46:07,920 --> 00:46:10,560 Speaker 1: we please acknowledge there are dangerous I. 902 00:46:10,520 --> 00:46:14,279 Speaker 2: Know, because here's the thing is and I truly believe this, 903 00:46:14,360 --> 00:46:18,040 Speaker 2: And you were basically saying this earlier that AI as 904 00:46:18,080 --> 00:46:20,839 Speaker 2: a tool is fine. You know, yeah, when you win, 905 00:46:20,920 --> 00:46:23,480 Speaker 2: it is. And a tool is something that is always 906 00:46:23,480 --> 00:46:27,200 Speaker 2: held and used by a human that there's the checks 907 00:46:27,200 --> 00:46:30,040 Speaker 2: and balances. It is only as evil as the person 908 00:46:30,040 --> 00:46:33,840 Speaker 2: who's using it. And that is just any item, physical 909 00:46:34,120 --> 00:46:37,200 Speaker 2: or digital, will ever, you know, will always be under 910 00:46:37,400 --> 00:46:40,040 Speaker 2: But the moment you're like, I'm going to give you 911 00:46:40,160 --> 00:46:45,480 Speaker 2: free reign based on information, and how many times has 912 00:46:45,600 --> 00:46:50,160 Speaker 2: an article gone online that was like, it's scanned Reddit 913 00:46:50,239 --> 00:46:52,920 Speaker 2: or it's scanned Twitter and it's racist. Now you know 914 00:46:53,160 --> 00:46:56,840 Speaker 2: there's and we still like went full steam ahead with 915 00:46:56,960 --> 00:47:00,160 Speaker 2: producing this and thinking we're right and we know, especially 916 00:47:00,200 --> 00:47:03,040 Speaker 2: when you see a lot of these tech leaders being 917 00:47:03,320 --> 00:47:07,320 Speaker 2: predominantly white men, and we know that in general, most 918 00:47:07,440 --> 00:47:12,000 Speaker 2: white men don't care about protecting marginalized people. They care 919 00:47:12,040 --> 00:47:14,399 Speaker 2: about getting their bottom dollar. They don't see they see 920 00:47:14,440 --> 00:47:18,279 Speaker 2: it as a as a rare occurrence because they don't 921 00:47:18,360 --> 00:47:22,680 Speaker 2: live that existence. They don't have the data pun intended 922 00:47:22,960 --> 00:47:26,040 Speaker 2: to build something to defend against it because it's not 923 00:47:26,080 --> 00:47:28,000 Speaker 2: a real problem to them because they don't see it 924 00:47:28,040 --> 00:47:31,840 Speaker 2: because and that is beyond just them being them and 925 00:47:31,880 --> 00:47:34,840 Speaker 2: more into as humans. A lot of times, if you 926 00:47:34,880 --> 00:47:37,520 Speaker 2: don't do the work to see it and understand what 927 00:47:37,680 --> 00:47:41,680 Speaker 2: happens to other people outside of your perspective, you're just 928 00:47:41,719 --> 00:47:45,120 Speaker 2: gonna believe that it's it's not real and or people 929 00:47:45,160 --> 00:47:47,839 Speaker 2: are exaggerating, or it's this and it's that. 
And when 930 00:47:47,920 --> 00:47:49,880 Speaker 2: you are a gung ho and you have drunk the 931 00:47:49,960 --> 00:47:52,920 Speaker 2: kool aid that is the AI kool aid, and you 932 00:47:53,000 --> 00:47:55,759 Speaker 2: are telling people that this is the future, we have 933 00:47:55,880 --> 00:47:59,280 Speaker 2: to do it, you're gonna push ahead. But like we've 934 00:47:59,400 --> 00:48:04,240 Speaker 2: literally have seen a clear cut example of what happens 935 00:48:04,440 --> 00:48:08,560 Speaker 2: when you push past safety and and you just do 936 00:48:08,640 --> 00:48:10,640 Speaker 2: what you want to do just because you have a 937 00:48:10,680 --> 00:48:13,520 Speaker 2: whole bunch of money and a mad Catz controller, Like, 938 00:48:13,719 --> 00:48:14,879 Speaker 2: it gets dangerous. 939 00:48:15,360 --> 00:48:19,200 Speaker 1: So as you started talking about like the dangers of 940 00:48:19,239 --> 00:48:22,280 Speaker 1: certain tools, right and how they the value of the tools, 941 00:48:22,560 --> 00:48:24,839 Speaker 1: I literally looked over at the gun on my table, right, 942 00:48:25,239 --> 00:48:28,680 Speaker 1: and we all agree, even people who really like them, 943 00:48:28,880 --> 00:48:31,520 Speaker 1: there should be regulation. And I think the vast majority 944 00:48:31,680 --> 00:48:35,560 Speaker 1: like agree more regulation, but they are like, again, not 945 00:48:35,640 --> 00:48:37,839 Speaker 1: to say that it's sufficient, but there are a lot 946 00:48:37,840 --> 00:48:40,480 Speaker 1: of laws about like where you can carry a gun legally, 947 00:48:40,560 --> 00:48:44,360 Speaker 1: how you can buy a gun, right, and because people 948 00:48:44,440 --> 00:48:47,280 Speaker 1: understand that, like, yeah, if a tool is that powerful, 949 00:48:47,880 --> 00:48:51,640 Speaker 1: there should be limitations and things and things that you 950 00:48:51,719 --> 00:48:54,080 Speaker 1: can do that get them taken away from you forever. 951 00:48:54,440 --> 00:48:54,560 Speaker 4: Right. 952 00:48:54,600 --> 00:48:56,680 Speaker 1: Yes, I don't know how we do that with AI, 953 00:48:57,080 --> 00:49:01,879 Speaker 1: but I don't think that's a reason not to try. Yeah, 954 00:49:02,000 --> 00:49:02,680 Speaker 1: you do what. 955 00:49:02,960 --> 00:49:04,919 Speaker 2: In that movie Hackers, and it's like you're just being 956 00:49:05,040 --> 00:49:07,600 Speaker 2: from the Internet till you graduate at high school or 957 00:49:07,640 --> 00:49:08,239 Speaker 2: whatever it was. 958 00:49:08,640 --> 00:49:13,240 Speaker 1: Yeah. Yeah, Now, Costin claimed that the biggest issue again, 959 00:49:13,280 --> 00:49:16,880 Speaker 1: so nicky Pope is like, yeah, I think this stuff 960 00:49:16,920 --> 00:49:21,000 Speaker 1: could really hurt marginalized communities, and Alexandrew Costin from Adobe 961 00:49:21,200 --> 00:49:24,160 Speaker 1: responded that like, well, I agree, but the biggest risk 962 00:49:24,239 --> 00:49:28,400 Speaker 1: of generative AI isn't fraud or plagiarism. It's not using AI. 963 00:49:28,800 --> 00:49:28,960 Speaker 4: Right. 964 00:49:29,440 --> 00:49:31,760 Speaker 1: He claims, like this is as big as the Internet, 965 00:49:31,800 --> 00:49:33,880 Speaker 1: and we all just have to get on board. And 966 00:49:33,920 --> 00:49:38,680 Speaker 1: then I'm gonna read verbatim how he ends this particular statement. 
967 00:49:38,920 --> 00:49:40,960 Speaker 1: I think humanity will find a way to tame it 968 00:49:41,000 --> 00:49:51,840 Speaker 1: to our best interest. Hopefully. Great, cool, no way, why, awesome, awesome, 969 00:49:52,320 --> 00:49:55,160 Speaker 1: And the whole week was like that again, these really specific, 970 00:49:55,280 --> 00:49:58,120 Speaker 1: devastating harms and then vague claims of like, yeah, we're 971 00:49:58,160 --> 00:50:01,719 Speaker 1: all just gonna have to adapt. And I brought up 972 00:50:01,760 --> 00:50:04,800 Speaker 1: Scientology earlier. But when I think about touting like vague 973 00:50:04,800 --> 00:50:07,840 Speaker 1: claims of world saving benefits alongside, and it's going to 974 00:50:07,920 --> 00:50:10,279 Speaker 1: hurt too, when you have to accept the pain, I 975 00:50:10,320 --> 00:50:12,840 Speaker 1: think of Keith Raniere, right, the NXIVM guy. We 976 00:50:12,880 --> 00:50:16,040 Speaker 1: all remember Keith. You know, like most cult leaders, Raniere 977 00:50:16,120 --> 00:50:19,280 Speaker 1: promised his mostly female followers, you'll get all these benefits. 978 00:50:19,280 --> 00:50:21,640 Speaker 1: I'm going to like heal you. You'll be extra productive, 979 00:50:21,680 --> 00:50:24,160 Speaker 1: you'll be super good in your business, super good in 980 00:50:24,200 --> 00:50:26,399 Speaker 1: your career. But you have to follow my commands because 981 00:50:26,400 --> 00:50:28,319 Speaker 1: they have to retrain you on some stuff, and so 982 00:50:28,640 --> 00:50:31,120 Speaker 1: it's going to be uncomfortable, right, And the end result 983 00:50:31,200 --> 00:50:32,919 Speaker 1: of this is a bunch of them had their 984 00:50:32,920 --> 00:50:37,560 Speaker 1: flesh branded and partook in sex trafficking. You know, 985 00:50:37,800 --> 00:50:40,879 Speaker 1: these tech executives are not Raniere, but I think they 986 00:50:40,920 --> 00:50:43,279 Speaker 1: see money in aping some of his tactics. Right, the 987 00:50:43,320 --> 00:50:45,800 Speaker 1: benefits are so good, we just have to accept some pain. 988 00:50:45,920 --> 00:50:49,200 Speaker 1: You know I got to hurt you to rebuild you better. Now, 989 00:50:49,520 --> 00:50:52,160 Speaker 1: all of the free money right now is going to AI, 990 00:50:52,400 --> 00:50:54,560 Speaker 1: and these guys know the best way to chase it 991 00:50:54,600 --> 00:50:56,560 Speaker 1: is to throw logic to the wind and promise the 992 00:50:56,640 --> 00:50:59,480 Speaker 1: masses that if we just let this technology run roughshod 993 00:50:59,760 --> 00:51:02,560 Speaker 1: over every field of human endeavor, it will be worth 994 00:51:02,560 --> 00:51:05,479 Speaker 1: it in the end. This is rational for them, because 995 00:51:05,480 --> 00:51:07,239 Speaker 1: they're going to make a lot of money, but it 996 00:51:07,320 --> 00:51:09,640 Speaker 1: is an irrational thing for us to let them do. 997 00:51:10,320 --> 00:51:12,919 Speaker 1: Why would we want to put artists and illustrators who 998 00:51:12,920 --> 00:51:15,240 Speaker 1: we like out of a job. Why would we accept 999 00:51:15,239 --> 00:51:16,920 Speaker 1: a world where it is impossible to talk to a 1000 00:51:17,000 --> 00:51:19,600 Speaker 1: human when you have a problem, and you're instead thrown 1001 00:51:19,680 --> 00:51:22,640 Speaker 1: to a churning swarm of chatbots.
Why would we let 1002 00:51:22,760 --> 00:51:25,640 Speaker 1: Sam Altman hoover up the world's knowledge and resell it 1003 00:51:25,680 --> 00:51:28,680 Speaker 1: back to us. We wouldn't, and we won't unless he 1004 00:51:28,719 --> 00:51:31,239 Speaker 1: can convince us that doing so is the only way 1005 00:51:31,280 --> 00:51:34,319 Speaker 1: to solve the problems that scare us. Climate change, the 1006 00:51:34,360 --> 00:51:36,440 Speaker 1: cure for cancer, an end to war, or at least an 1007 00:51:36,520 --> 00:51:39,160 Speaker 1: end to the fear that we will all be victimized by 1008 00:51:39,160 --> 00:51:41,480 Speaker 1: crime or terrorism. All of these have been touted as 1009 00:51:41,480 --> 00:51:43,640 Speaker 1: benefits of the coming AI age if we can just 1010 00:51:44,000 --> 00:51:48,480 Speaker 1: reach the AI promised land, and we're going to talk 1011 00:51:48,520 --> 00:51:51,240 Speaker 1: about some of the people who believe in that promised 1012 00:51:51,280 --> 00:51:53,560 Speaker 1: land and what they think it'll be like. But first, Iffy, 1013 00:51:53,840 --> 00:51:56,000 Speaker 1: you know what is the real promised land, the only 1014 00:51:56,040 --> 00:52:00,120 Speaker 1: actual paradise any of us will ever know? Why, buying 1015 00:52:00,239 --> 00:52:06,839 Speaker 1: from the sponsors of this show, of course. All right, 1016 00:52:06,880 --> 00:52:16,680 Speaker 1: here's an ad. All right, we're back. So I want 1017 00:52:16,719 --> 00:52:20,200 Speaker 1: to talk about Silicon Valley's latest subculture, emphasis on the 1018 00:52:20,239 --> 00:52:25,719 Speaker 1: cult: Effective Accelerationism, or E slash acc. E-acc, I 1019 00:52:25,719 --> 00:52:28,200 Speaker 1: think, is probably how you could pronounce it. The gist 1020 00:52:28,239 --> 00:52:32,080 Speaker 1: of this movement fits with Mark Andreessen's manifesto: AI development 1021 00:52:32,160 --> 00:52:35,359 Speaker 1: must be accelerated without restriction, no matter the cost. E 1022 00:52:35,440 --> 00:52:37,520 Speaker 1: slash acc has been covered by a number of journalists, but 1023 00:52:37,560 --> 00:52:40,279 Speaker 1: most of that coverage misses how very spiritual some of 1024 00:52:40,320 --> 00:52:43,120 Speaker 1: it seems. One of the inaugural documents of the entire 1025 00:52:43,160 --> 00:52:47,600 Speaker 1: belief system opens with the statement accelerationism is simply the 1026 00:52:47,600 --> 00:52:51,239 Speaker 1: self awareness of capitalism, which has scarcely begun. Again, we 1027 00:52:51,280 --> 00:52:53,960 Speaker 1: see a statement that AI has somehow enmeshed itself with 1028 00:52:54,040 --> 00:52:58,560 Speaker 1: capitalism's ability to understand itself. It is in some way intelligent 1029 00:52:58,640 --> 00:53:00,799 Speaker 1: and can know itself. I don't know how else you 1030 00:53:00,840 --> 00:53:03,080 Speaker 1: interpret this, but as belief in a god built by 1031 00:53:03,120 --> 00:53:06,799 Speaker 1: atheists who like money a lot. The argument continues that 1032 00:53:06,880 --> 00:53:10,080 Speaker 1: nothing matters more than extending the quote light of consciousness 1033 00:53:10,080 --> 00:53:13,160 Speaker 1: into the stars, a belief Elon Musk himself has championed. 1034 00:53:13,560 --> 00:53:16,120 Speaker 1: AI is the force the market will use to do this, 1035 00:53:16,320 --> 00:53:20,080 Speaker 1: and quote this force cannot be stopped.
This is followed 1036 00:53:20,080 --> 00:53:23,799 Speaker 1: by wild claims that next generation life forms will be 1037 00:53:23,840 --> 00:53:27,359 Speaker 1: created inevitably by AI, and then a few sentences down 1038 00:53:27,400 --> 00:53:29,759 Speaker 1: you get the kicker. Those who are first to usher 1039 00:53:29,800 --> 00:53:32,640 Speaker 1: in and control the hyperparameters of AI slash techno 1040 00:53:32,680 --> 00:53:36,200 Speaker 1: capital have immense agency over the future of consciousness. So 1041 00:53:36,239 --> 00:53:38,319 Speaker 1: AI is not just a god. It's a god we 1042 00:53:38,360 --> 00:53:40,760 Speaker 1: can build, and we can use it to shape our future, 1043 00:53:40,840 --> 00:53:43,520 Speaker 1: the future of our reality, to our own whims. And again, 1044 00:53:43,560 --> 00:53:45,920 Speaker 1: some of these guys will acknowledge maybe it'll kill all 1045 00:53:45,920 --> 00:53:48,120 Speaker 1: of us, but as long as it makes a technology 1046 00:53:48,120 --> 00:53:50,200 Speaker 1: that spreads to the stars, that's worth it, because we've 1047 00:53:50,280 --> 00:53:53,960 Speaker 1: kept the light of consciousness alive. Wow, that's not I 1048 00:53:53,960 --> 00:53:57,040 Speaker 1: don't think the mainstream view, but you can definitely find 1049 00:53:57,040 --> 00:54:00,600 Speaker 1: people saying that shit and they'll be like, if you 1050 00:54:00,640 --> 00:54:03,239 Speaker 1: attempt to slow this process down, there are risks, and 1051 00:54:03,280 --> 00:54:05,799 Speaker 1: they're saying the same thing as Andreessen: you stop it from 1052 00:54:05,800 --> 00:54:08,799 Speaker 1: doing all these wonderful things. But also I do kind 1053 00:54:08,840 --> 00:54:11,719 Speaker 1: of view that as a veiled threat, right, because if 1054 00:54:11,920 --> 00:54:13,759 Speaker 1: AI is the only way to spread the light of 1055 00:54:13,800 --> 00:54:16,399 Speaker 1: consciousness to the void, and that is the only thing 1056 00:54:16,440 --> 00:54:18,440 Speaker 1: that matters, what do you do to the people who 1057 00:54:18,480 --> 00:54:21,160 Speaker 1: seek to stop you, right, who seek to stop AI? 1058 00:54:21,440 --> 00:54:23,920 Speaker 1: I actually am fine with extending the light of consciousness 1059 00:54:23,960 --> 00:54:26,279 Speaker 1: into space. I'm a big fan of Star Trek. I 1060 00:54:26,360 --> 00:54:30,480 Speaker 1: just don't believe that the intelligent, hyper aware capitalism is 1061 00:54:30,520 --> 00:54:34,000 Speaker 1: the thing to do it. Again, too much of a 1062 00:54:34,040 --> 00:54:36,480 Speaker 1: Star Trek guy for that. When I look at the 1063 00:54:36,520 --> 00:54:39,360 Speaker 1: people who want to follow Mark Andreessen's vision, who find 1064 00:54:39,400 --> 00:54:42,040 Speaker 1: what these e/acc people are saying is not just compelling 1065 00:54:42,080 --> 00:54:45,120 Speaker 1: but inspiring, I think of another passage from that New 1066 00:54:45,160 --> 00:54:48,960 Speaker 1: Yorker article by Zoe Heller. Quote, not passive victims, they 1067 00:54:48,960 --> 00:54:52,040 Speaker 1: themselves sought to be controlled, Haruki Murakami wrote of the 1068 00:54:52,080 --> 00:54:54,960 Speaker 1: members of Aum Shinrikyo, the cult whose sarin gas attack 1069 00:54:55,000 --> 00:54:57,920 Speaker 1: on the Tokyo subway in nineteen ninety five killed thirteen people.
1070 00:54:58,239 --> 00:55:01,760 Speaker 1: In his book Underground, Murakami describes most Aum members 1071 00:55:01,760 --> 00:55:05,120 Speaker 1: as having deposited all their precious personal holdings of selfhood 1072 00:55:05,280 --> 00:55:08,840 Speaker 1: in the spiritual bank of the cult's leader, Shoko Asahara. 1073 00:55:08,920 --> 00:55:11,360 Speaker 1: Submitting to a higher authority, to someone else's account of 1074 00:55:11,440 --> 00:55:15,759 Speaker 1: reality, was, he claims, their aim. Now the e/acc manifesto 1075 00:55:15,840 --> 00:55:19,320 Speaker 1: newsletter thing uses the term technocapital in conjunction with AI. 1076 00:55:19,680 --> 00:55:21,440 Speaker 1: This is a word that you can find a few 1077 00:55:21,520 --> 00:55:25,040 Speaker 1: different definitions of because it's it's a wonky, like weird 1078 00:55:25,160 --> 00:55:29,160 Speaker 1: academia philosophy term, and there's a number of folks who, 1079 00:55:29,400 --> 00:55:32,280 Speaker 1: you know, will argue about how it ought to be described. 1080 00:55:32,280 --> 00:55:34,160 Speaker 1: But this is broadly kind of the same thing that 1081 00:55:34,200 --> 00:55:36,960 Speaker 1: Andreessen is referring to when he talks about the market 1082 00:55:37,239 --> 00:55:41,279 Speaker 1: as this intelligent discovery organism. Right, And while there are 1083 00:55:41,320 --> 00:55:43,400 Speaker 1: a few different ways you'll see this defined, the e/acc 1084 00:55:43,480 --> 00:55:46,879 Speaker 1: people and Andreessen himself are thinking about how philosopher Nick Land, 1085 00:55:46,920 --> 00:55:50,120 Speaker 1: who's the guy who's generally credited with like popularizing the 1086 00:55:50,200 --> 00:55:54,799 Speaker 1: term technocapitalism, defines it. Land is one of many advocates 1087 00:55:54,840 --> 00:55:58,319 Speaker 1: of the idea of a technological singularity, the point where 1088 00:55:58,360 --> 00:56:02,680 Speaker 1: technological growth driven by improvements in computing becomes irreversible, the 1089 00:56:02,719 --> 00:56:05,960 Speaker 1: moment at which a super intelligent machine begins inventing more 1090 00:56:06,040 --> 00:56:08,480 Speaker 1: and more of itself and expanding tech in a way 1091 00:56:08,560 --> 00:56:12,319 Speaker 1: that humans can't. As one of Land's fans summarized in 1092 00:56:12,360 --> 00:56:16,200 Speaker 1: a Medium post, a runaway reaction of self improvement loops 1093 00:56:16,239 --> 00:56:20,640 Speaker 1: will almost instantaneously create a coherent superintelligent machine. It is 1094 00:56:20,719 --> 00:56:26,000 Speaker 1: man's last invention. The most notable of industries, AI, nanotechnology, femtotechnology, 1095 00:56:26,040 --> 00:56:29,720 Speaker 1: and genetic engineering, will erupt with rapid advancements, quickly exceeding 1096 00:56:29,800 --> 00:56:34,120 Speaker 1: human intelligence. Now, obviously the way Land writes is again 1097 00:56:34,280 --> 00:56:38,000 Speaker 1: kind of worth reading, but dense, perhaps too dense for 1098 00:56:38,160 --> 00:56:41,520 Speaker 1: an entertainment podcast.
So I'm going to read again from 1099 00:56:41,560 --> 00:56:44,239 Speaker 1: a Substack called Regress Studies by a writer named 1100 00:56:44,239 --> 00:56:47,880 Speaker 1: Santi Ruiz, kind of talking about this idea of techno 1101 00:56:47,920 --> 00:56:52,160 Speaker 1: capitalism that Land has from a more critical standpoint. Quote, Nick Land, 1102 00:56:52,200 --> 00:56:54,480 Speaker 1: who coined the term, is a misanthrope. He doesn't like 1103 00:56:54,600 --> 00:56:56,360 Speaker 1: humans much. So the idea that there could be an 1104 00:56:56,480 --> 00:57:00,239 Speaker 1: entity coming, already being born, drawing itself into existence by 1105 00:57:00,320 --> 00:57:04,239 Speaker 1: hyperstitionally preying on the dreams of humanity, cannibalizing their desires, 1106 00:57:04,520 --> 00:57:07,959 Speaker 1: wearing the invisible hand as an ideological skin. He's into 1107 00:57:08,000 --> 00:57:13,760 Speaker 1: that. Technoeconomic interactivity crumbling social order in auto-sophisticating machine runaway, 1108 00:57:13,960 --> 00:57:16,360 Speaker 1: as he would put it. And that's good. You're being 1109 00:57:16,360 --> 00:57:19,080 Speaker 1: colonized by an entity that doesn't care about you, except 1110 00:57:19,120 --> 00:57:21,960 Speaker 1: insofar as you make a good host. We'll talk about 1111 00:57:22,040 --> 00:57:25,280 Speaker 1: hyperstition in a little bit here. So Land is the 1112 00:57:25,400 --> 00:57:28,520 Speaker 1: guru of accelerationism. You might not be surprised to learn 1113 00:57:28,600 --> 00:57:30,680 Speaker 1: that he has a devoted following among the far right. 1114 00:57:30,880 --> 00:57:33,320 Speaker 1: This is because he is quite racist, anti democratic, and 1115 00:57:33,360 --> 00:57:37,200 Speaker 1: obsessed with eugenics. Now, his eugenics are not your grand 1116 00:57:37,240 --> 00:57:40,920 Speaker 1: pappy's eugenics. For him, it involves gene editing, which will 1117 00:57:40,960 --> 00:57:43,520 Speaker 1: be available to greater extents than ever thanks to AI. 1118 00:57:44,200 --> 00:57:48,160 Speaker 1: Land claims to dislike white nationalists and conventional racists 1119 00:57:48,200 --> 00:57:51,120 Speaker 1: because they don't see the whole picture. Quote, and this 1120 00:57:51,160 --> 00:57:54,960 Speaker 1: is me quoting from one of Land's publications, racial identitarianism 1121 00:57:55,000 --> 00:57:59,800 Speaker 1: envisages a conservation of comparative genetic isolation, generally 1122 00:58:00,320 --> 00:58:04,640 Speaker 1: by boundaries corresponding to conspicuous phenotypic variation. It is race 1123 00:58:04,760 --> 00:58:07,400 Speaker 1: realist in that it admits to seeing what everyone does 1124 00:58:07,400 --> 00:58:10,040 Speaker 1: in fact see, which is to say, consistent patterns of 1125 00:58:10,040 --> 00:58:15,160 Speaker 1: striking correlated multidimensional variety between human populations or subspecies. Its 1126 00:58:15,240 --> 00:58:21,280 Speaker 1: unrealism lies in its projections. That's pretty racist. Land is 1127 00:58:21,320 --> 00:58:24,480 Speaker 1: listed by name in Andreessen's manifesto as someone you should 1128 00:58:24,520 --> 00:58:27,600 Speaker 1: read for a better understanding of the wonderful, optimistic future 1129 00:58:27,640 --> 00:58:31,720 Speaker 1: he and his ilk plan for us.
He cites extensively 1130 00:58:31,760 --> 00:58:34,960 Speaker 1: Grigory Cochrane, who posits that space travel spreading to the 1131 00:58:35,000 --> 00:58:39,360 Speaker 1: stars will solve our race problem because it's a natural filter, 1132 00:58:39,600 --> 00:58:42,560 Speaker 1: basically saying some races won't make it into space, so 1133 00:58:42,600 --> 00:58:45,120 Speaker 1: we don't need to be to be violent, like, we 1134 00:58:45,280 --> 00:58:47,280 Speaker 1: just have to spread to space and that will do 1135 00:58:47,320 --> 00:58:51,480 Speaker 1: our eugenics part of it for us. So that's cool. 1136 00:58:52,680 --> 00:58:57,560 Speaker 2: Yeah, you know, I'm stuck on this, like, you know, 1137 00:58:57,800 --> 00:59:02,520 Speaker 2: journey to the Stars through f thing, because I don't know, Robert, 1138 00:59:02,560 --> 00:59:04,360 Speaker 2: do you play Warhammer forty two? 1139 00:59:04,400 --> 00:59:08,960 Speaker 1: Oh? God if he ify, of course I play. I've 1140 00:59:08,960 --> 00:59:11,600 Speaker 1: been playing Warhammer forty K most of my life. 1141 00:59:11,800 --> 00:59:17,120 Speaker 2: Okay, because this is sound a very adeptis mechanics, Yeah right, yes, absolutely, 1142 00:59:17,440 --> 00:59:20,200 Speaker 2: And I'm like, I'm like, what is going on here, 1143 00:59:20,240 --> 00:59:22,360 Speaker 2: I'm deep and well trader, and now I'm like, no, 1144 00:59:22,560 --> 00:59:24,400 Speaker 2: this is them. 1145 00:59:24,680 --> 00:59:27,000 Speaker 1: What is fun is that like Warhammer forty thousand, the 1146 00:59:27,040 --> 00:59:29,760 Speaker 1: deep Fluff envisions a society that it's like a hybrid 1147 00:59:29,840 --> 00:59:33,320 Speaker 1: between the Federation and Star Trek, and what these ai 1148 00:59:33,600 --> 00:59:37,440 Speaker 1: Eyak people dream of that it is this utopia for 1149 00:59:37,520 --> 00:59:40,160 Speaker 1: like ten thousand years because they developed thinking machines and 1150 00:59:40,160 --> 00:59:42,000 Speaker 1: then all of the thinking machines turn on them and 1151 00:59:42,080 --> 00:59:46,200 Speaker 1: murder everybody. And so in the future we just lobottomize 1152 00:59:46,200 --> 00:59:49,320 Speaker 1: people who commit crimes and turn them into computers for 1153 00:59:49,440 --> 00:59:52,480 Speaker 1: us because we can't have intelligent machines anymore. 1154 00:59:53,720 --> 00:59:56,560 Speaker 2: So we're just we're going that's what it looks like. 1155 00:59:56,600 --> 00:59:58,440 Speaker 2: We're marching towards with these folks. 1156 00:59:58,760 --> 01:00:01,640 Speaker 1: Yeah, And obviously the thing that the Warhammer people are 1157 01:00:01,760 --> 01:00:03,960 Speaker 1: are inspired by is like the but Larry and g 1158 01:00:04,120 --> 01:00:07,360 Speaker 1: Hot and Dune, which is the more artful version of 1159 01:00:07,360 --> 01:00:11,200 Speaker 1: that story with less orcs, which makes it inferior in 1160 01:00:11,280 --> 01:00:12,800 Speaker 1: my mind. But I do love Dune. 1161 01:00:12,840 --> 01:00:16,560 Speaker 2: So you need more red because the everything Yeah. 1162 01:00:16,480 --> 01:00:17,680 Speaker 1: Red makes it go faster. 1163 01:00:17,840 --> 01:00:18,520 Speaker 2: Yeah. 1164 01:00:18,600 --> 01:00:23,040 Speaker 1: So Land concludes by imagining both racists and anti racists 1165 01:00:23,080 --> 01:00:26,280 Speaker 1: binding together in defense of the concept of race. 
Right, 1166 01:01:26,560 --> 01:01:28,200 Speaker 1: that's what the result of AI is going to be. 1167 01:01:28,240 --> 01:01:30,920 Speaker 1: The racists need race. You know, we're going to get 1168 01:01:30,960 --> 01:01:33,080 Speaker 1: so good at gene editing, the racists will get angry 1169 01:01:33,080 --> 01:01:35,360 Speaker 1: and the anti racists will get angry because they're so 1170 01:01:35,440 --> 01:01:37,720 Speaker 1: in love with the concept of race. Well, we're just 1171 01:01:37,800 --> 01:01:42,400 Speaker 1: going to improve people and annihilate racial differences through moving to 1172 01:01:42,440 --> 01:01:46,200 Speaker 1: the stars and the natural filter that that implies. By 1173 01:01:46,240 --> 01:01:48,240 Speaker 1: the way, the article Land wrote all 1174 01:01:48,280 --> 01:01:51,960 Speaker 1: this in is called Hyper-Racism. So, cool guy. Glad 1175 01:01:52,040 --> 01:01:54,880 Speaker 1: Mark Andreessen cites him in his manifesto. Glad the 1176 01:01:54,920 --> 01:01:57,680 Speaker 1: biggest venture capital guy in the country was like, yeah, 1177 01:01:57,760 --> 01:02:01,160 Speaker 1: read this, dude. Yeah, that's the sequel to Flom's hyperrealism. 1178 01:02:01,360 --> 01:02:04,400 Speaker 1: So that's yeah, it is interesting. And these guys don't 1179 01:02:04,400 --> 01:02:06,000 Speaker 1: tend to cite it as much. I think, as you get 1180 01:02:06,000 --> 01:02:07,760 Speaker 1: in some of the deeper stuff, they're all talking about 1181 01:02:07,760 --> 01:02:10,959 Speaker 1: these technocapitalist concepts that Nick Land plays with. They don't 1182 01:02:11,000 --> 01:02:12,560 Speaker 1: talk about what I think is actually one of his 1183 01:02:12,720 --> 01:02:16,280 Speaker 1: most sort of insightful points, which is about a concept called hyperstition. 1184 01:02:16,680 --> 01:02:21,240 Speaker 1: And in brief, hyperstition is like creating things in fiction 1185 01:02:21,440 --> 01:02:24,120 Speaker 1: that become real and the process by which that happens. 1186 01:02:24,240 --> 01:02:25,840 Speaker 1: I think about that a lot when I think about 1187 01:02:25,880 --> 01:02:27,920 Speaker 1: things like the Butlerian Jihad, the war against the 1188 01:02:27,960 --> 01:02:30,600 Speaker 1: intelligent machines in Dune, or you know 1189 01:02:30,640 --> 01:02:32,880 Speaker 1: what happened in the Warhammer forty thousand universe. But I 1190 01:02:32,920 --> 01:02:35,800 Speaker 1: also think about how part of why these people are 1191 01:02:35,840 --> 01:02:43,440 Speaker 1: targeting creators, writers, actors, musicians, artists like pen and paper, 1192 01:02:43,520 --> 01:02:46,280 Speaker 1: you know, painting artists and stuff, is because the only 1193 01:02:46,360 --> 01:02:49,280 Speaker 1: way out of this future they have envisioned is in 1194 01:02:49,600 --> 01:02:53,680 Speaker 1: imagining a better one and then making it real, right like, 1195 01:02:53,800 --> 01:02:56,240 Speaker 1: and that is a thing that creatives have a role 1196 01:02:56,320 --> 01:02:58,760 Speaker 1: in doing. So if you can kill that ability, hand 1197 01:02:58,760 --> 01:03:00,960 Speaker 1: it over to the machines that you can control, maybe 1198 01:03:00,960 --> 01:03:06,560 Speaker 1: you can stop them from this path of resistance. Motherfuckers, 1199 01:03:07,160 --> 01:03:07,320 Speaker 1: I know.
1200 01:02:08,080 --> 01:02:10,360 Speaker 2: They're on there with it. They're like, you want something better? 1201 01:02:10,720 --> 01:02:11,880 Speaker 2: We got to take it away from you. 1202 01:02:11,960 --> 01:02:17,120 Speaker 1: Yeah, fuck you. So anyway, I think that's going to 1203 01:02:17,240 --> 01:02:19,800 Speaker 1: end it for us in part one. You know, this 1204 01:02:19,800 --> 01:02:23,240 Speaker 1: whole investigation in much more condensed form, just kind 1205 01:02:23,240 --> 01:02:26,560 Speaker 1: of really focusing specifically on the argument that there's cult dynamics 1206 01:02:26,560 --> 01:02:28,600 Speaker 1: in the fandom, is being published in an article on 1207 01:02:28,720 --> 01:02:31,640 Speaker 1: Rolling Stone. I'll probably edit in like the title or 1208 01:02:31,640 --> 01:02:34,400 Speaker 1: something here so you can find it. But check that out. 1209 01:02:34,440 --> 01:02:37,880 Speaker 1: That's kind of the more easily shareable version, more condensed. Iffy, 1210 01:02:38,080 --> 01:02:40,120 Speaker 1: where should people check out you and your stuff? 1211 01:02:40,360 --> 01:02:40,560 Speaker 6: Oh? 1212 01:02:40,600 --> 01:02:43,280 Speaker 2: Man, I'm Ify Nwadiwe on Twitter and Instagram, 1213 01:02:43,320 --> 01:02:47,840 Speaker 2: you know, so definitely peep me there. Listen to our 1214 01:02:47,880 --> 01:02:50,920 Speaker 2: relationship pod with Ify and Emmy if you want 1215 01:02:50,920 --> 01:02:54,720 Speaker 2: to hear us talk about relationship stuff and uh, yeah, 1216 01:02:54,760 --> 01:02:57,320 Speaker 2: you have Maximum Film for movies, but if you go 1217 01:02:57,360 --> 01:02:59,080 Speaker 2: to Ify Nwadiwe, you'll find all that stuff. 1218 01:02:59,080 --> 01:03:03,000 Speaker 2: And of course watch Dropout. Something absolutely has been... to 1219 01:03:03,000 --> 01:03:05,080 Speaker 2: be announced next week. 1220 01:03:05,880 --> 01:03:08,600 Speaker 1: Yes, watch Dropout, something cool is coming soon and 1221 01:03:08,640 --> 01:03:15,160 Speaker 1: it's also an extremely human endeavor. Yes, yes, just like 1222 01:03:15,320 --> 01:03:18,880 Speaker 1: this show is. So go with, uh, I don't know, 1223 01:03:18,920 --> 01:03:22,439 Speaker 1: whatever god you worship or the machine god you plan 1224 01:03:22,560 --> 01:03:27,400 Speaker 1: to meme into being. Goodbye. 1225 01:03:28,600 --> 01:03:31,320 Speaker 6: Behind the Bastards is a production of Cool Zone Media. 1226 01:03:31,680 --> 01:03:34,960 Speaker 6: For more from Cool Zone Media, visit our website Coolzonemedia 1227 01:03:35,120 --> 01:03:38,320 Speaker 6: dot com, or check us out on the iHeartRadio app, 1228 01:03:38,400 --> 01:03:40,720 Speaker 6: Apple Podcasts, or wherever you get your podcasts.