1 00:00:01,800 --> 00:00:09,240 Speaker 1: Cool Zone Media. Welcome back to Behind the Bastards, a podcast 2 00:00:09,360 --> 00:00:12,200 Speaker 1: that, again, I botched it. 3 00:00:12,360 --> 00:00:14,360 Speaker 2: My weird voice didn't work. I'm sorry. 4 00:00:14,400 --> 00:00:15,640 Speaker 3: I didn't enjoy that at all. 5 00:00:15,800 --> 00:00:18,920 Speaker 4: I know, I know that one was like more Dracula 6 00:00:19,400 --> 00:00:19,960 Speaker 4: than the first. 7 00:00:20,560 --> 00:00:21,760 Speaker 2: Yeah, Dracula-esque. 8 00:00:21,880 --> 00:00:23,439 Speaker 3: Sure, he didn't sell it this time. 9 00:00:23,600 --> 00:00:28,560 Speaker 5: Yeah, sorry, I apologize. What I don't apologize for is 10 00:00:28,600 --> 00:00:29,680 Speaker 5: my guest, Ify. 11 00:00:29,560 --> 00:00:30,280 Speaker 4: In' what he were? 12 00:00:33,040 --> 00:00:33,640 Speaker 6: Is you boy? 13 00:00:34,159 --> 00:00:36,040 Speaker 2: Ify, working for. 14 00:00:35,920 --> 00:00:40,519 Speaker 5: Dropout, the survivor successors to CollegeHumor, who 15 00:00:40,600 --> 00:00:44,080 Speaker 5: have, who have blossomed like a phoenix from the ashes 16 00:00:44,120 --> 00:00:46,200 Speaker 5: of the internet that Facebook killed? 17 00:00:47,280 --> 00:00:50,360 Speaker 4: Yes, yes, still we rise. 18 00:00:50,520 --> 00:00:51,400 Speaker 2: Still we rise. 19 00:00:51,840 --> 00:00:56,440 Speaker 5: Speaking of rising: my concerns about the cult dynamics within 20 00:00:56,520 --> 00:01:00,640 Speaker 5: the AI subcult. You're... I don't know why I 21 00:01:00,640 --> 00:01:02,080 Speaker 5: said it that way, so I. 22 00:01:02,560 --> 00:01:04,160 Speaker 3: Don't know what you're on about right now. 23 00:01:04,920 --> 00:01:08,160 Speaker 2: I don't know either. I don't know. I'm doing great. 24 00:01:08,240 --> 00:01:10,679 Speaker 2: I'm doing great. This is, this is me. 25 00:01:10,760 --> 00:01:13,600 Speaker 5: I've been sober lately, so this is me living the 26 00:01:13,640 --> 00:01:16,160 Speaker 5: sober life. I've just gotten worse. 27 00:01:16,400 --> 00:01:17,440 Speaker 2: So I don't know. 28 00:01:17,440 --> 00:01:24,040 Speaker 5: Everybody keep your kids on drugs, you know. Or, depends 29 00:01:24,080 --> 00:01:24,600 Speaker 5: on the drugs. 30 00:01:24,640 --> 00:01:27,920 Speaker 4: Drugs, yeah, yeah. 31 00:01:26,840 --> 00:01:29,200 Speaker 5: So perhaps the most amusing part of all of this 32 00:01:29,360 --> 00:01:32,119 Speaker 5: is that a segment of the AI-believing community has 33 00:01:32,160 --> 00:01:35,959 Speaker 5: created not just a potential god, but a hell. And 34 00:01:36,000 --> 00:01:38,840 Speaker 5: this is one of my favorite stories from these weirdos. 35 00:01:39,440 --> 00:01:42,440 Speaker 5: One of the early online subcultures that influenced the birth 36 00:01:42,440 --> 00:01:45,959 Speaker 5: of e/acc are the Rationalists. And again, the e/acc people 37 00:01:45,959 --> 00:01:48,120 Speaker 5: will say a lot of them don't like the Rationalists, 38 00:01:48,160 --> 00:01:50,480 Speaker 5: but they're, they're related. They're like cousins in the same 39 00:01:50,520 --> 00:01:54,960 Speaker 5: way Cracked and CollegeHumor are, right. The Rationalists are a 40 00:01:55,000 --> 00:01:57,440 Speaker 5: subculture that formed in the early aughts.
They kind of 41 00:01:57,480 --> 00:02:00,720 Speaker 5: came out of the online skeptic movement of 42 00:02:00,760 --> 00:02:03,400 Speaker 5: the late nineties, and they formed in the early aughts 43 00:02:03,440 --> 00:02:06,120 Speaker 5: around a series of blog posts by a man named 44 00:02:06,120 --> 00:02:11,120 Speaker 5: Eliezer Yudkowsky. Yudkowsky fancies himself as something of a philosopher 45 00:02:11,200 --> 00:02:14,880 Speaker 5: on AI, and his blog slash discussion board LessWrong 46 00:02:15,040 --> 00:02:18,919 Speaker 5: was an early hub of the broader AI subculture. Yudkowsky, 47 00:02:19,240 --> 00:02:21,840 Speaker 5: like, he doesn't have a specific education. He just claims 48 00:02:21,880 --> 00:02:24,320 Speaker 5: to be kind of an expert in AI and machine learning. 49 00:02:25,120 --> 00:02:30,200 Speaker 5: He's a peculiar fellow, to say the least. The founding text, 50 00:02:30,280 --> 00:02:32,480 Speaker 5: or at least one of them, of Rationalism is a 51 00:02:32,560 --> 00:02:35,760 Speaker 5: six hundred and sixty thousand word Harry Potter fanfic. That 52 00:02:35,919 --> 00:02:39,320 Speaker 5: is just, just nonsense, it is. It's all about like 53 00:02:39,720 --> 00:02:43,120 Speaker 5: rewriting Harry Potter so his real magic is rational thinking. 54 00:02:43,320 --> 00:02:48,720 Speaker 5: It's wild shit, he's like a psychopath. It's so, such 55 00:02:48,760 --> 00:02:50,160 Speaker 5: an odd choice. 56 00:02:49,919 --> 00:02:52,360 Speaker 4: You know. It's just like the uh, what was it, 57 00:02:52,440 --> 00:02:56,079 Speaker 4: Fifty Shades of Grey? How, yes, originally a Twilight fanfic, 58 00:02:56,440 --> 00:03:01,880 Speaker 4: and there's going to be like a Cloud Atlas-esque. 59 00:03:00,560 --> 00:03:00,760 Speaker 6: You know. 60 00:03:00,960 --> 00:03:03,840 Speaker 5: But you know, the Fifty Shades of Grey lady was not 61 00:03:03,919 --> 00:03:08,840 Speaker 5: trying to create the new text for like a philosophical movement. 62 00:03:08,919 --> 00:03:12,320 Speaker 5: She just wanted to get like people horny. And that's fine, 63 00:03:13,480 --> 00:03:17,360 Speaker 5: that's perfectly acceptable. The most relevant thing about the six 64 00:03:17,440 --> 00:03:20,840 Speaker 5: hundred and sixty thousand word Harry Potter fanfic is that 65 00:03:20,880 --> 00:03:23,520 Speaker 5: it was the favorite book of Caroline Ellison, this former 66 00:03:23,600 --> 00:03:29,000 Speaker 5: CEO of FTX who recently testified against Sam Bankman-Fried. 67 00:03:29,760 --> 00:03:32,560 Speaker 5: Or of Alameda. Sorry, she was the CEO of Alameda. Anyway. 68 00:03:32,800 --> 00:03:36,360 Speaker 5: All these weird little subcultures, Rationalism and effective altruism, are 69 00:03:36,400 --> 00:03:39,120 Speaker 5: related to each other and influenced each other, even though 70 00:03:39,120 --> 00:03:41,920 Speaker 5: again they often hate each other too. Yudkowsky is seen 71 00:03:42,000 --> 00:03:45,160 Speaker 5: as an object of ridicule by most e/acc people. This 72 00:03:45,280 --> 00:03:47,280 Speaker 5: is because he shares their view of AI as a 73 00:03:47,320 --> 00:03:51,280 Speaker 5: potential deity, but he believes AGI will inevitably kill everyone. 74 00:03:51,560 --> 00:03:53,800 Speaker 5: Thus we must bomb data centers. 75 00:03:53,520 --> 00:03:56,520 Speaker 2: Which like, look, he may have gotten to the right end. 76 00:03:56,680 --> 00:04:01,200 Speaker 4: But the way he got there, running like, like Forrest Gump.
He just 77 00:04:01,280 --> 00:04:01,840 Speaker 4: kept running. 78 00:04:01,880 --> 00:04:04,200 Speaker 2: We're like wait, wait, wait, wait, no, stop right there, 79 00:04:04,240 --> 00:04:05,840 Speaker 2: stop right there. We may agree with you on this. 80 00:04:06,080 --> 00:04:09,800 Speaker 5: Yeah, Yudkowsky is a doomer now because he was surprised 81 00:04:09,800 --> 00:04:12,360 Speaker 5: when ChatGPT came out. He was like horrified by 82 00:04:12,360 --> 00:04:14,280 Speaker 5: how advanced it was and was like, oh my god, 83 00:04:14,440 --> 00:04:17,280 Speaker 5: we're further along towards creating the AI that kills us all. 84 00:04:17,320 --> 00:04:19,920 Speaker 5: We have to stop this now. And that made him... 85 00:04:20,040 --> 00:04:22,240 Speaker 5: He had been kind of flirted with by a lot of 86 00:04:22,360 --> 00:04:25,800 Speaker 5: like Silicon Valley people. His Rationalists are very much a 87 00:04:25,800 --> 00:04:28,320 Speaker 5: Bay Area cult. He kind of has become increasingly a 88 00:04:28,360 --> 00:04:31,400 Speaker 5: pariah to at least people with money in AI. But 89 00:04:31,600 --> 00:04:35,839 Speaker 5: before that happened, his message board birthed something wondrous. In 90 00:04:35,880 --> 00:04:39,880 Speaker 5: twenty ten, a LessWrong user named Roko posted this question: 91 00:04:40,279 --> 00:04:43,280 Speaker 5: what if an otherwise benevolent AI decided it had to 92 00:04:43,279 --> 00:04:45,640 Speaker 5: torture any human who failed to work to bring it 93 00:04:45,680 --> 00:04:48,200 Speaker 5: into existence? Right, what if we make an all powerful 94 00:04:48,240 --> 00:04:52,039 Speaker 5: AI and its logical decision is that, well, I will 95 00:04:52,080 --> 00:04:54,480 Speaker 5: have to punish all the human beings who were alive 96 00:04:54,560 --> 00:04:58,000 Speaker 5: and who didn't try to further my existence, because that's 97 00:04:58,040 --> 00:05:00,720 Speaker 5: the most reasonable way to guarantee that I come 98 00:05:00,760 --> 00:05:04,320 Speaker 5: into being. It's nonsense. This is a silly, silly thing 99 00:05:04,360 --> 00:05:07,159 Speaker 5: to believe. It's all based on like the prisoner's dilemma, 100 00:05:07,200 --> 00:05:09,000 Speaker 5: which is a concept in game theory, and it's not 101 00:05:09,640 --> 00:05:12,760 Speaker 5: really worth explaining why, because the logic is, it's 102 00:05:12,839 --> 00:05:14,800 Speaker 5: only the kind of thing that happens when people are 103 00:05:14,800 --> 00:05:19,120 Speaker 5: too online and like completely get detached from reality. But 104 00:05:19,279 --> 00:05:22,240 Speaker 5: Roko's conclusion here is that an AI who felt this 105 00:05:22,360 --> 00:05:25,599 Speaker 5: way would punish its apostates for eternity by creating 106 00:05:25,640 --> 00:05:29,240 Speaker 5: a virtual reality hell, digitizing their consciousness and making them 107 00:05:29,240 --> 00:05:30,200 Speaker 5: suffer for all time. 108 00:05:30,680 --> 00:05:31,680 Speaker 2: Now, WHOA. 109 00:05:31,800 --> 00:05:33,960 Speaker 5: You may have noticed, Ify, number one, they're kind of 110 00:05:34,000 --> 00:05:38,960 Speaker 5: ripping off our boy, Harlan Ellison, famed advocate of the 111 00:05:39,000 --> 00:05:42,480 Speaker 5: writer's right to their work. But it's also just tech 112 00:05:42,520 --> 00:05:46,880 Speaker 5: nerds recreating Pascal's wager. Like this is just Pascal's wager 113 00:05:46,960 --> 00:05:49,880 Speaker 5: with an AI. Like, you just stole again.
114 00:05:49,720 --> 00:05:53,919 Speaker 2: These fucking plagiarists. You just stole from whoever Pascal was, right. 115 00:05:53,839 --> 00:05:56,120 Speaker 4: This is what happens when you're a nerd and you 116 00:05:56,200 --> 00:05:59,160 Speaker 4: refuse to read sci-fi. You just, you eventually just 117 00:05:59,440 --> 00:06:02,480 Speaker 4: come up with these stories yourselves and think that you 118 00:06:02,520 --> 00:06:02,839 Speaker 4: did it. 119 00:06:03,440 --> 00:06:03,680 Speaker 2: Yeah. 120 00:06:03,680 --> 00:06:05,520 Speaker 5: This, and if you're not familiar, folks, I think most 121 00:06:05,560 --> 00:06:07,760 Speaker 5: people are: Pascal's wager is this kind of like concept from, 122 00:06:07,800 --> 00:06:10,640 Speaker 5: I think you'd call it Christian apologetics. That's like, 123 00:06:11,960 --> 00:06:14,839 Speaker 5: we may not know if hell is real or not, 124 00:06:15,120 --> 00:06:18,120 Speaker 5: but because if it's real, the consequences are so dire 125 00:06:18,360 --> 00:06:22,279 Speaker 5: and the cost of just saying yeah, I accept Jesus 126 00:06:22,440 --> 00:06:25,080 Speaker 5: is so low, you should do that, right. Or 127 00:06:25,160 --> 00:06:27,240 Speaker 5: I think that's the basic idea, right, it's how a lot 128 00:06:27,240 --> 00:06:29,680 Speaker 5: of people interpret it. It's the whole idea behind like 129 00:06:29,720 --> 00:06:32,240 Speaker 5: being a piece of shit and then converting on your deathbed. Basically, 130 00:06:33,080 --> 00:06:34,719 Speaker 5: I don't know fully the history of it, but I 131 00:06:34,720 --> 00:06:37,960 Speaker 5: know that they're basically aping it for fucking Roko's basilisk. 132 00:06:39,040 --> 00:06:41,400 Speaker 5: And it's called basilisk because like a basilisk, if you 133 00:06:41,400 --> 00:06:44,279 Speaker 5: look at it, it like enraptures your mind. You can't 134 00:06:44,279 --> 00:06:48,240 Speaker 5: stop thinking about it. That comes from, reportedly, there's some 135 00:06:48,279 --> 00:06:50,479 Speaker 5: debate over this, when this went viral among like the 136 00:06:50,560 --> 00:06:53,400 Speaker 5: LessWrong community, Yudkowsky had to ban discussion of 137 00:06:53,440 --> 00:06:56,000 Speaker 5: it because it was like breaking people's minds. They were 138 00:06:56,000 --> 00:06:58,359 Speaker 5: having nightmares. Am I working hard enough to make the 139 00:06:58,440 --> 00:07:00,720 Speaker 5: AI real? Is it gonna send me to hell? 140 00:07:02,440 --> 00:07:02,920 Speaker 2: Yeah? 141 00:07:03,000 --> 00:07:06,320 Speaker 5: It's unclear like how serious people were, because again these 142 00:07:06,360 --> 00:07:09,440 Speaker 5: are just people talking on the internet. For what it's worth, 143 00:07:09,520 --> 00:07:13,480 Speaker 5: Yudkowsky didn't really like Roko's basilisk, but it's, it's 144 00:07:13,560 --> 00:07:15,800 Speaker 5: his place that birthed it. And for an idea of 145 00:07:15,800 --> 00:07:19,559 Speaker 5: how influential this is, Elon Musk and Grimes met talking about 146 00:07:19,560 --> 00:07:23,040 Speaker 5: the concept. That was their meet-cute, was this, was 147 00:07:23,080 --> 00:07:26,840 Speaker 5: fucking AI Pascal's wager. Yeah, she like wrote a song 148 00:07:26,920 --> 00:07:31,960 Speaker 5: about it. It's fucking ridiculous. These fucking people are such dorks. 149 00:07:32,560 --> 00:07:37,960 Speaker 2: Yeah. Wow, oh my god. Read Harlan Ellison. He did 150 00:07:38,000 --> 00:07:40,360 Speaker 2: it better than you.
God damn it. 151 00:07:41,400 --> 00:07:43,800 Speaker 5: I will say reading this shit is the most I've 152 00:07:43,800 --> 00:07:46,239 Speaker 5: ever felt like I have no mouth and I must scream. 153 00:07:46,400 --> 00:07:51,800 Speaker 5: So again, pour one out for the man. So this 154 00:07:51,960 --> 00:07:54,520 Speaker 5: is all relevant, this AI hell some of these people 155 00:07:54,560 --> 00:07:56,840 Speaker 5: have created, because it's one more data point showing that 156 00:07:56,840 --> 00:08:00,000 Speaker 5: the people who take AI very seriously as real intelligence 157 00:08:00,560 --> 00:08:02,640 Speaker 5: always seem to turn it into religion. And this is 158 00:08:02,720 --> 00:08:05,120 Speaker 5: kind of maybe the first schism, right. This is their 159 00:08:05,120 --> 00:08:08,760 Speaker 5: Catholic Protestant split or their Catholic Orthodox split, because you've 160 00:08:08,800 --> 00:08:10,920 Speaker 5: got on one side Yudkowsky's people who are like, 161 00:08:11,000 --> 00:08:13,880 Speaker 5: we will inevitably make a God and that God will 162 00:08:13,880 --> 00:08:16,960 Speaker 5: destroy us, so we have to stop it, versus like, 163 00:08:17,000 --> 00:08:19,400 Speaker 5: we will inevitably make a God and that God will 164 00:08:19,440 --> 00:08:22,280 Speaker 5: take us to paradise along with Daddy Musk, we'll go 165 00:08:22,320 --> 00:08:24,640 Speaker 5: to the stars. Right, those are the two. This is 166 00:08:24,680 --> 00:08:29,239 Speaker 5: like the first heretical split within the Divine AI movement. 167 00:08:29,680 --> 00:08:33,040 Speaker 5: And this stuff is relevant because so many of these 168 00:08:33,080 --> 00:08:36,400 Speaker 5: fucking, these subcultures and movements start out as a bunch 169 00:08:36,440 --> 00:08:40,120 Speaker 5: of people arguing or discussing their ideas in online communities. 170 00:08:40,480 --> 00:08:42,400 Speaker 5: And there is a reason for this. It's pretty well 171 00:08:42,440 --> 00:08:45,760 Speaker 5: recognized that there are certain dynamics inherent to the kind 172 00:08:45,800 --> 00:08:49,640 Speaker 5: of communities that start on the Internet that tend towards cultishness. 173 00:08:49,640 --> 00:08:51,520 Speaker 5: This is part of why, like, we have a big 174 00:08:51,600 --> 00:08:54,640 Speaker 5: subreddit for the podcast, it's like eighty something thousand people, 175 00:08:55,280 --> 00:08:57,280 Speaker 5: which makes it in like the top one percent of Reddit, 176 00:08:57,600 --> 00:08:59,480 Speaker 5: and I have been offered like to be able to 177 00:08:59,480 --> 00:09:02,040 Speaker 5: moderate and like make policy there. I have nothing to 178 00:09:02,080 --> 00:09:04,640 Speaker 5: do with the running of that subreddit because I'm like, 179 00:09:04,880 --> 00:09:06,959 Speaker 5: that doesn't end well. I was on Something Awful as 180 00:09:06,960 --> 00:09:09,559 Speaker 5: a kid. I know what happens when people make themselves 181 00:09:09,600 --> 00:09:13,120 Speaker 5: mods of giant digital communities. They lose their fucking minds. 182 00:09:15,080 --> 00:09:17,240 Speaker 5: We're all watching Elon Musk do it right now. It's 183 00:09:17,280 --> 00:09:19,960 Speaker 5: the worst thing in the world for you. Thank you, 184 00:09:20,000 --> 00:09:22,400 Speaker 5: by the way, to the people who do run that thing. Uh, 185 00:09:23,200 --> 00:09:26,560 Speaker 5: because I am not going to.
The skeptic community, which 186 00:09:26,640 --> 00:09:29,320 Speaker 5: was huge through the late nineteen nineties and early two thousands, 187 00:09:29,400 --> 00:09:32,440 Speaker 5: might be seen as the grandfather of all these little subcultures. 188 00:09:32,880 --> 00:09:35,959 Speaker 5: After nine eleven, prominent skeptics became vocally unhinged in 189 00:09:36,000 --> 00:09:38,400 Speaker 5: their hatred of Islam, which brought them closer to different 190 00:09:38,440 --> 00:09:42,160 Speaker 5: chunks of the nascent online far right. Weird shit started 191 00:09:42,200 --> 00:09:45,040 Speaker 5: to crop up, like a movement to rebrand skeptics as 192 00:09:45,120 --> 00:09:47,800 Speaker 5: brights in light of the fact that their very clearly 193 00:09:47,840 --> 00:09:50,960 Speaker 5: exceptional intelligence made them better than other people. And again 194 00:09:51,000 --> 00:09:52,679 Speaker 5: you can see some similarity with this and the stuff 195 00:09:52,760 --> 00:09:55,559 Speaker 5: Nick Land was talking about, only certain races will make 196 00:09:55,600 --> 00:09:58,280 Speaker 5: it to space. I found a very old write up 197 00:09:58,320 --> 00:10:01,000 Speaker 5: on plover dot net that described the method by which 198 00:10:01,000 --> 00:10:05,160 Speaker 5: this kind of shit happens in digital communities. Quote, online forums, 199 00:10:05,160 --> 00:10:07,960 Speaker 5: whatever their subject, can be forbidding places for the newcomer. 200 00:10:08,160 --> 00:10:10,480 Speaker 5: Over time, most of them tend to become dominated by 201 00:10:10,520 --> 00:10:12,920 Speaker 5: small groups of snotty know-it-alls who stamp their 202 00:10:12,960 --> 00:10:16,360 Speaker 5: personalities over the proceedings. But skeptic forums are uniquely meant 203 00:10:16,360 --> 00:10:20,160 Speaker 5: for such people. A skeptic forum valorizes, and in some cases fetishizes, 204 00:10:20,360 --> 00:10:24,319 Speaker 5: competitive geekery, gratuitous cleverness, macho displays of erudition. It's a 205 00:10:24,400 --> 00:10:27,640 Speaker 5: gathering of rationality's hard men, thumping their chests, showing off 206 00:10:27,640 --> 00:10:31,199 Speaker 5: their muscular logic, glancing sideways to compare their skeptical endowment 207 00:10:31,200 --> 00:10:33,560 Speaker 5: with the next guy's, sniffing the air for signs of weakness. 208 00:10:33,720 --> 00:10:36,600 Speaker 5: Together they create an oppressive, sweaty locker room atmosphere that 209 00:10:36,640 --> 00:10:40,200 Speaker 5: helps keep uncomfortable demographics away. And that is where a 210 00:10:40,240 --> 00:10:42,000 Speaker 5: lot of this shit is cropping up. 211 00:10:42,080 --> 00:10:42,320 Speaker 2: Right. 212 00:10:42,520 --> 00:10:46,040 Speaker 5: It is sweaty and uncomfortable, and there are mushrooms growing there, 213 00:10:46,080 --> 00:10:49,400 Speaker 5: and some of those mushrooms are fucking fascists, and all 214 00:10:49,440 --> 00:10:52,200 Speaker 5: of them want to take away the ability of artists 215 00:10:52,200 --> 00:10:53,640 Speaker 5: to choose what happens to their art. 216 00:10:53,960 --> 00:10:56,600 Speaker 4: Oh yeah, I feel like this is just so many 217 00:10:57,160 --> 00:11:00,920 Speaker 4: parts of the zeitgeist coming together, because you know what 218 00:11:00,960 --> 00:11:04,520 Speaker 4: it means to own media, you know.
I feel like 219 00:11:04,559 --> 00:11:08,160 Speaker 4: a very small microcosm of this is when people would 220 00:11:08,240 --> 00:11:12,280 Speaker 4: like clip out stuff from YouTube videos or take jokes 221 00:11:12,280 --> 00:11:16,240 Speaker 4: from people who tweet, and when it goes, you know, 222 00:11:16,520 --> 00:11:20,280 Speaker 4: viral, and the original tweeter is like, hey, you 223 00:11:20,360 --> 00:11:22,880 Speaker 4: stole this from me, and it's either no, I didn't, 224 00:11:23,040 --> 00:11:25,760 Speaker 4: or like, yeah, but you like put it on Twitter, 225 00:11:25,920 --> 00:11:30,000 Speaker 4: so like I can just copy what you wrote. Yeah, 226 00:11:29,640 --> 00:11:32,840 Speaker 4: and now it has evolved into, yeah, we can just 227 00:11:32,920 --> 00:11:35,400 Speaker 4: take from yours and let this machine learn how to 228 00:11:35,440 --> 00:11:36,920 Speaker 4: do what you do so I can do it, even 229 00:11:36,920 --> 00:11:39,560 Speaker 4: though I don't have the talent to do it. 230 00:11:39,880 --> 00:11:40,280 Speaker 2: Yeah. 231 00:11:40,320 --> 00:11:43,960 Speaker 5: Absolutely. The reality of AI's promise is a lot more 232 00:11:44,000 --> 00:11:47,920 Speaker 5: subdued than believers want to admit. In an article published 233 00:11:47,960 --> 00:11:52,280 Speaker 5: by Frontiers in Ecology and Evolution, a peer reviewed research journal, 234 00:11:52,720 --> 00:11:56,320 Speaker 5: doctor Andrea Roli and colleagues argue that AGI is not 235 00:11:56,559 --> 00:12:00,360 Speaker 5: achievable in the current algorithmic frame of AI research. And 236 00:12:00,360 --> 00:12:03,360 Speaker 5: this is, their claims are very stark, that like 237 00:12:03,679 --> 00:12:06,640 Speaker 5: the kind of way we make these, these large language models, 238 00:12:06,640 --> 00:12:10,199 Speaker 5: this algorithmic frame, cannot make an intelligence. That's their argument. 239 00:12:10,840 --> 00:12:13,800 Speaker 5: One point they make is that intelligent organisms can both 240 00:12:13,880 --> 00:12:18,480 Speaker 5: want things and improvise capabilities that no models have yet generated. 241 00:12:18,840 --> 00:12:21,400 Speaker 5: They also argue, basically, all of these things that individual 242 00:12:21,440 --> 00:12:26,480 Speaker 5: AI type models can do, you know, recognize voice, recognize text, 243 00:12:26,720 --> 00:12:30,319 Speaker 5: recognize faces, you know, this kind of stuff, those are 244 00:12:30,400 --> 00:12:33,280 Speaker 5: pieces of what we would want from an artificial general intelligence. 245 00:12:33,320 --> 00:12:35,840 Speaker 5: But they're not all combined in like the same thing 246 00:12:36,200 --> 00:12:39,960 Speaker 5: that works seamlessly. And beyond that, it can't, it can't 247 00:12:40,040 --> 00:12:42,800 Speaker 5: act based on anything internal, right. 248 00:12:42,880 --> 00:12:43,760 Speaker 2: It can only. 249 00:12:43,520 --> 00:12:46,559 Speaker 5: Act based on prompts. And their argument is that algorithmic 250 00:12:46,600 --> 00:12:49,400 Speaker 5: AI will not be able to make the jump to acting otherwise. 251 00:12:49,800 --> 00:12:53,040 Speaker 5: What we call AI then lacks agency, the ability to 252 00:12:53,080 --> 00:12:56,480 Speaker 5: make dynamic decisions of its own accord, choices that are quote, 253 00:12:56,520 --> 00:13:00,719 Speaker 5: not purely reactive, not entirely determined by environmental conditions.
Midjourney 254 00:13:00,800 --> 00:13:02,880 Speaker 5: can read a prompt and return with art it 255 00:13:02,920 --> 00:13:06,200 Speaker 5: calculates will fit the criteria. Only a living artist can 256 00:13:06,280 --> 00:13:09,079 Speaker 5: choose to seek out inspiration and technical knowledge and then 257 00:13:09,080 --> 00:13:12,880 Speaker 5: produce the art that Midjourney digests and regurgitates. Now, 258 00:13:12,920 --> 00:13:15,640 Speaker 5: this paper is not going to be the last word 259 00:13:15,679 --> 00:13:17,959 Speaker 5: on whether or not AGI is possible, or whether it's 260 00:13:17,960 --> 00:13:21,160 Speaker 5: possible under our current algorithmic method of like making AIs. 261 00:13:21,360 --> 00:13:23,719 Speaker 5: I'm not making a claim there myself. I'm saying these 262 00:13:23,760 --> 00:13:26,600 Speaker 5: people are, and I think their arguments are compelling. We 263 00:13:26,640 --> 00:13:29,520 Speaker 5: don't know yet entirely. Again, this is not a settled 264 00:13:29,520 --> 00:13:32,400 Speaker 5: field of research obviously. But my point is that the 265 00:13:32,440 --> 00:13:37,240 Speaker 5: goals Andreessen and the effective accelerationist crew champion right now 266 00:13:37,440 --> 00:13:39,840 Speaker 5: are not based in fact. We don't know that what 267 00:13:39,840 --> 00:13:42,199 Speaker 5: they're saying, that the most basic level of what they're 268 00:13:42,240 --> 00:13:45,400 Speaker 5: saying, is possible, and that means that their beliefs are 269 00:13:45,400 --> 00:13:46,360 Speaker 5: based in faith. 270 00:13:46,960 --> 00:13:49,200 Speaker 2: Right, How else can you look at that? Yeah? 271 00:13:49,400 --> 00:13:53,800 Speaker 5: Yeah. Like this is a faith, and again it's the 272 00:13:53,880 --> 00:13:56,680 Speaker 5: kind of faith that, according to Andreessen, makes you a 273 00:13:56,760 --> 00:13:59,200 Speaker 5: murderer if you doubt it, which, I don't think I 274 00:13:59,200 --> 00:14:04,359 Speaker 5: need to draw you parallels to specific religions here, right, Yeah, Yeah. 275 00:14:04,080 --> 00:14:06,040 Speaker 4: This is, this is that point where when you're like 276 00:14:06,920 --> 00:14:10,880 Speaker 4: stoned and you're watching those, like, you know, art time 277 00:14:10,960 --> 00:14:14,560 Speaker 4: lapses and the picture is starting to form, and I'm like, okay, 278 00:14:14,559 --> 00:14:17,120 Speaker 4: I see what Robert's doing. I see the picture is coming. 279 00:14:17,400 --> 00:14:18,880 Speaker 4: I was on your side from the jump. I just 280 00:14:18,920 --> 00:14:21,040 Speaker 4: want to say, you know, I was, you know, I 281 00:14:21,120 --> 00:14:24,040 Speaker 4: was like, yeah, no, I believe you. But now I'm 282 00:14:24,120 --> 00:14:27,120 Speaker 4: watching the connections be made and yeah, I love it. 283 00:14:27,560 --> 00:14:28,440 Speaker 2: Yeah now. 284 00:14:28,440 --> 00:14:32,080 Speaker 5: Andreessen's manifesto claims our enemies are not bad people, but 285 00:14:32,240 --> 00:14:34,560 Speaker 5: rather bad ideas. And I have to wonder, doing all this, 286 00:14:34,840 --> 00:14:37,200 Speaker 5: putting this episode out, where does that leave me in 287 00:14:37,240 --> 00:14:39,440 Speaker 5: his eyes? Or doctor Roli for that matter, and the 288 00:14:39,440 --> 00:14:41,920 Speaker 5: other people who worked on that paper.
We have seen 289 00:14:42,000 --> 00:14:43,960 Speaker 5: many times in history what happens when members of a 290 00:14:43,960 --> 00:14:46,600 Speaker 5: faith decide someone is their enemy and the enemy of 291 00:14:46,640 --> 00:14:50,120 Speaker 5: their belief system. And right now, artists and copyright holders 292 00:14:50,120 --> 00:14:52,960 Speaker 5: are the ones being treated as fair game by the 293 00:14:53,080 --> 00:14:57,320 Speaker 5: AI industry. So my question is kind of, first and foremost, 294 00:14:57,480 --> 00:15:01,960 Speaker 5: who's going to be the next heretic? Right, Like, that's 295 00:15:02,280 --> 00:15:04,120 Speaker 5: that's what I want to know. And I want to 296 00:15:04,160 --> 00:15:06,200 Speaker 5: leave you all with that thought before we go into some 297 00:15:06,280 --> 00:15:08,680 Speaker 5: ads here, and then we will come back to talk 298 00:15:08,840 --> 00:15:12,160 Speaker 5: about some people that I pissed off at CES. 299 00:15:12,440 --> 00:15:20,560 Speaker 2: So that'll be fun. We're back. 300 00:15:21,040 --> 00:15:23,760 Speaker 5: So one of the things I did was this panel 301 00:15:23,800 --> 00:15:27,280 Speaker 5: on the AI driven restaurant and retail experience. 302 00:15:27,320 --> 00:15:28,120 Speaker 2: I was very curious. 303 00:15:28,160 --> 00:15:30,840 Speaker 5: Was AI going to change me getting some terrible 304 00:15:30,880 --> 00:15:32,440 Speaker 5: food from McDonald's when I'm. 305 00:15:32,280 --> 00:15:33,000 Speaker 2: On a road trip. 306 00:15:33,120 --> 00:15:37,240 Speaker 5: Right, the host of that, Andy Hewles from Radius AI, 307 00:15:37,560 --> 00:15:40,600 Speaker 5: asked the audience, in relation to AI, raise your hand 308 00:15:40,640 --> 00:15:42,920 Speaker 5: if you're a brand who feels like we've got this. 309 00:15:43,320 --> 00:15:47,520 Speaker 5: That is how she phrased it. I hated it, but 310 00:15:47,560 --> 00:15:49,400 Speaker 5: about a third of the room raised their hands. So 311 00:15:49,480 --> 00:15:51,280 Speaker 5: next she asked for a show of hands of the 312 00:15:51,360 --> 00:15:54,720 Speaker 5: brands who identified with this statement. I'm not sure about this. 313 00:15:54,800 --> 00:15:57,480 Speaker 5: I haven't tried AI yet, but I want to, 314 00:15:57,520 --> 00:15:58,480 Speaker 5: and that's why I'm here. 315 00:15:58,680 --> 00:15:58,840 Speaker 2: Right. 316 00:15:59,360 --> 00:16:01,040 Speaker 5: Most of the rest of the room raised their hands 317 00:16:01,040 --> 00:16:04,040 Speaker 5: at that point, and she seemed satisfied, but said, and 318 00:16:04,080 --> 00:16:06,080 Speaker 5: then I bet there's even some of you that are like, whoa, 319 00:16:06,240 --> 00:16:08,120 Speaker 5: I heard this is going to steal jobs, take away 320 00:16:08,200 --> 00:16:10,000 Speaker 5: my privacy, affect the global economy. 321 00:16:10,040 --> 00:16:10,240 Speaker 2: You know. 322 00:16:10,520 --> 00:16:12,560 Speaker 5: AI is a little bit sketch in my mind, and 323 00:16:12,600 --> 00:16:14,600 Speaker 5: I'm just worried about it, and I'm here to explore. 324 00:16:14,960 --> 00:16:18,440 Speaker 5: Well, that fit me, so I raised my hand. She 325 00:16:18,520 --> 00:16:20,760 Speaker 5: didn't notice me at first, and so she like fakes 326 00:16:20,760 --> 00:16:22,680 Speaker 5: a whisper and she's like, all right, good, there's none 327 00:16:22,680 --> 00:16:24,960 Speaker 5: of you.
And then she like looks over and sees 328 00:16:25,000 --> 00:16:27,600 Speaker 5: me waving my hand and she says, louder and with 329 00:16:27,680 --> 00:16:31,400 Speaker 5: evident disappointment, there's one. All right, you can ask questions 330 00:16:31,400 --> 00:16:31,960 Speaker 5: at the end. 331 00:16:32,360 --> 00:16:32,960 Speaker 2: So I did. 332 00:16:34,320 --> 00:16:36,840 Speaker 5: I was very excited to get to do that. So 333 00:16:36,920 --> 00:16:40,000 Speaker 5: the panel consisted of Bishad Mazati, a VP of engineering 334 00:16:40,000 --> 00:16:42,920 Speaker 5: at Google, who mentioned during the panel that embracing AI 335 00:16:43,040 --> 00:16:45,680 Speaker 5: could be the equivalent of adding a million employees to 336 00:16:45,760 --> 00:16:50,120 Speaker 5: your company. The McDonald's representative, Michelle Gansel, claimed around the 337 00:16:50,160 --> 00:16:52,600 Speaker 5: same time that her company used AI to prevent fifty 338 00:16:52,640 --> 00:16:55,160 Speaker 5: million dollars in fraud attempts in just a single month. 339 00:16:55,640 --> 00:16:57,840 Speaker 5: Now that's lovely. But I told her, you know, when 340 00:16:57,880 --> 00:17:00,040 Speaker 5: I had my question, I was like, I'm gonna 341 00:17:00,160 --> 00:17:04,040 Speaker 5: assume most of those fraud attempts were AI generated, right? So, yeah, 342 00:17:04,119 --> 00:17:06,400 Speaker 5: you stopped a bunch of AI fraud, but that doesn't 343 00:17:06,440 --> 00:17:11,160 Speaker 5: necessarily get me optimistic about AI's potential. And likewise, maybe 344 00:17:11,200 --> 00:17:13,679 Speaker 5: Google gets the equivalent of a million employees, but so 345 00:17:13,760 --> 00:17:17,640 Speaker 5: do all of the people committing fraud and disinformation on Google. Right, 346 00:17:17,880 --> 00:17:20,320 Speaker 5: So again, how are we getting ahead? And I brought 347 00:17:20,400 --> 00:17:24,920 Speaker 5: up this concept in evolutionary biology, the Red Queen hypothesis, 348 00:17:24,920 --> 00:17:27,760 Speaker 5: which is kind of talking about the way that populations 349 00:17:27,800 --> 00:17:30,760 Speaker 5: of animals evolve over time. Right, where you've got, an 350 00:17:30,760 --> 00:17:32,920 Speaker 5: animal will evolve to be a better predator, so its 351 00:17:32,960 --> 00:17:35,239 Speaker 5: prey will evolve to be better at avoiding it. And 352 00:17:35,240 --> 00:17:37,640 Speaker 5: it's kind of, the reason it's the Red Queen dilemma 353 00:17:37,640 --> 00:17:39,639 Speaker 5: is that, like, you've got to move as fast as 354 00:17:39,680 --> 00:17:42,479 Speaker 5: you can just to stay in place. That's the Red 355 00:17:42,720 --> 00:17:44,879 Speaker 5: Queen dilemma, right, you got to move as fast as 356 00:17:44,920 --> 00:17:46,199 Speaker 5: you can just to stay in one place. And I 357 00:17:46,240 --> 00:17:48,800 Speaker 5: was like, is that not what we're going to wind 358 00:17:48,880 --> 00:17:52,080 Speaker 5: up seeing with AI? Right, Yeah, we get better at 359 00:17:52,080 --> 00:17:55,159 Speaker 5: a bunch of stuff, but it's eaten up countering 360 00:17:55,320 --> 00:17:57,240 Speaker 5: all of the things that get worse. And so I 361 00:17:57,240 --> 00:17:59,479 Speaker 5: asked them, what are the odds that these gains are 362 00:17:59,480 --> 00:18:00,800 Speaker 5: offset by the costs? 363 00:18:01,400 --> 00:18:01,640 Speaker 2: Now?
364 00:18:01,680 --> 00:18:03,639 Speaker 5: In the article that I wrote for Rolling Stone, I 365 00:18:03,640 --> 00:18:07,480 Speaker 5: gave a significantly more condensed version of Bishad's answer, boiling 366 00:18:07,520 --> 00:18:10,240 Speaker 5: out the ums and ohs and you knows, because that, 367 00:18:10,240 --> 00:18:12,040 Speaker 5: that would kind of make the case that he was 368 00:18:12,080 --> 00:18:16,840 Speaker 5: absolutely unprepared for a vaguely critical question, a very basic one, 369 00:18:17,119 --> 00:18:19,960 Speaker 5: and that he didn't really care enough to think about 370 00:18:19,960 --> 00:18:22,639 Speaker 5: any of the security threats inherent to the technology. But 371 00:18:22,760 --> 00:18:26,800 Speaker 5: actually, that is what I think of him, and I'm 372 00:18:26,800 --> 00:18:30,639 Speaker 5: gonna, I'm gonna play it for you, audience. My concern is, like, 373 00:18:31,720 --> 00:18:34,560 Speaker 5: what are the odds that a lot of these gains 374 00:18:34,600 --> 00:18:35,920 Speaker 5: that we get from AI are. 375 00:18:35,920 --> 00:18:37,320 Speaker 2: Offset by the cost? 376 00:18:37,400 --> 00:18:40,800 Speaker 5: You know, you noted, Bishad, that you get, you know, 377 00:18:40,840 --> 00:18:43,480 Speaker 5: a million extra workers by utilizing this, but so do 378 00:18:43,560 --> 00:18:46,399 Speaker 5: the bad guys. So yeah, that's kind of where my, 379 00:18:46,400 --> 00:18:47,720 Speaker 5: my skepticism plays out. 380 00:18:48,440 --> 00:18:51,440 Speaker 7: Yeah, certainly there will be you know, there are enough 381 00:18:51,560 --> 00:18:53,639 Speaker 7: bad guys, I guess in the world which will use 382 00:18:53,640 --> 00:18:56,199 Speaker 7: and and I forgot to use cases and and it's 383 00:18:56,280 --> 00:18:58,680 Speaker 7: very important to also I have you know, be protected 384 00:18:58,760 --> 00:19:02,040 Speaker 7: you know, against that against those and that's why you know, we. 385 00:19:02,040 --> 00:19:04,800 Speaker 8: take responsibility very serious, you know, also in terms of 386 00:19:04,840 --> 00:19:09,000 Speaker 8: you know, what security aspects you know, fraud fighting you know, 387 00:19:09,400 --> 00:19:11,359 Speaker 8: and all of that for him, you know. 388 00:19:12,640 --> 00:19:13,359 Speaker 7: And I think that's. 389 00:19:13,240 --> 00:19:15,280 Speaker 8: why I guess things should be regulated. And there's of 390 00:19:15,320 --> 00:19:19,439 Speaker 8: course all these discussions out there, and I think, yeah. 391 00:19:18,920 --> 00:19:21,600 Speaker 5: You may notice that that's not exactly a very like 392 00:19:22,359 --> 00:19:26,439 Speaker 5: good response. I guess that's why this should be regulated. 393 00:19:26,680 --> 00:19:27,680 Speaker 2: It's just like he. 394 00:19:27,680 --> 00:19:30,440 Speaker 5: starts talking so much faster with that, there's so much 395 00:19:30,440 --> 00:19:31,560 Speaker 5: of like panic. 396 00:19:31,320 --> 00:19:36,200 Speaker 4: Voice wavering too. I was like, man, yeah, it seems 397 00:19:36,200 --> 00:19:38,320 Speaker 4: like he has that huge anime flop sweat. 398 00:19:38,600 --> 00:19:39,040 Speaker 2: Yeah. 399 00:19:39,600 --> 00:19:41,880 Speaker 5: One of the things about this, CES as a trade 400 00:19:41,880 --> 00:19:43,840 Speaker 5: show, is that like a lot of people there do 401 00:19:43,920 --> 00:19:47,320 Speaker 5: not show up ready to have anyone be critical about anything.
402 00:19:47,359 --> 00:19:48,399 Speaker 2: It's a big love fest. 403 00:19:48,840 --> 00:19:52,800 Speaker 5: Yeah, yeah, very funny. So he does, later on, a 404 00:19:52,880 --> 00:19:55,760 Speaker 5: couple of questions later, he lists benefits to things like, 405 00:19:55,800 --> 00:19:58,960 Speaker 5: some specific benefits like breast cancer screening and flood prediction, 406 00:19:59,080 --> 00:20:01,280 Speaker 5: that AI will bring, and there is evidence that 407 00:20:01,320 --> 00:20:03,760 Speaker 5: it will be helpful in those things. The extent to 408 00:20:03,800 --> 00:20:06,680 Speaker 5: which those technologies will improve things in the long run 409 00:20:06,800 --> 00:20:10,200 Speaker 5: is unknown, but machine learning does have promise. Again, I'm 410 00:20:10,240 --> 00:20:12,439 Speaker 5: not trying to like negate that, it's just, do the 411 00:20:12,440 --> 00:20:16,600 Speaker 5: benefits balance out the harms? Michelle Gansel, who works at McDonald's, 412 00:20:16,640 --> 00:20:19,359 Speaker 5: which is, I think, from what she said, mostly using 413 00:20:19,400 --> 00:20:22,320 Speaker 5: AI both to prevent fraud and also to like replace 414 00:20:22,400 --> 00:20:24,800 Speaker 5: people taking your order, which I'm sure will not be 415 00:20:24,800 --> 00:20:29,920 Speaker 5: a fucking nightmare. Yeah, great, now, that, it's great now. 416 00:20:30,160 --> 00:20:33,399 Speaker 5: But here's her response, because it's, it's very funny. 417 00:20:33,760 --> 00:20:35,920 Speaker 2: Going back to the David Bowie theme. 418 00:20:36,800 --> 00:20:38,640 Speaker 5: Thirty years ago, when the Internet first came out, we 419 00:20:38,640 --> 00:20:41,040 Speaker 5: were having these same conversations about responsible use of the 420 00:20:41,080 --> 00:20:44,000 Speaker 5: Internet and how it's going to work, she says, going 421 00:20:44,040 --> 00:20:46,600 Speaker 5: back to the David Bowie theme, which is, she referenced 422 00:20:46,640 --> 00:20:49,360 Speaker 5: earlier this nineteen ninety nine interview with David Bowie about 423 00:20:49,359 --> 00:20:52,080 Speaker 5: the future of the Internet, and it's a clip that 424 00:20:52,119 --> 00:20:54,000 Speaker 5: goes viral from time to time. He's just talking about 425 00:20:54,000 --> 00:20:56,199 Speaker 5: all of his hope for the Internet. But she's like, 426 00:20:56,400 --> 00:20:59,760 Speaker 5: I replace Internet with AI when I listen to it. Like, 427 00:20:59,760 --> 00:21:02,399 Speaker 5: I, I think that that's really what the promise is 428 00:21:02,440 --> 00:21:04,480 Speaker 5: that he was attributing to the Internet. No, it's AI 429 00:21:04,560 --> 00:21:07,679 Speaker 5: that's going to do all that. That's kind of on 430 00:21:07,760 --> 00:21:09,760 Speaker 5: the edge of putting words in the mouth of a 431 00:21:09,800 --> 00:21:10,320 Speaker 5: dead man. 432 00:21:10,640 --> 00:21:12,040 Speaker 2: Yeah, just a little bit. 433 00:21:12,320 --> 00:21:14,199 Speaker 4: Yeah, I feel like that's something you shouldn't do. I 434 00:21:14,200 --> 00:21:17,720 Speaker 4: think that's something we've agreed to, and I think that 435 00:21:17,720 --> 00:21:20,600 Speaker 4: that isn't what Bowie would think of AI. 436 00:21:20,880 --> 00:21:23,760 Speaker 5: I don't think it is. They're two completely different things. 437 00:21:24,480 --> 00:21:28,280 Speaker 5: These people love resurrecting the dead.
To agree with them. 438 00:21:28,720 --> 00:21:30,560 Speaker 5: Time is a bit of a blur at CES, but 439 00:21:30,600 --> 00:21:32,680 Speaker 5: I believe this panel happened right around the same time 440 00:21:32,720 --> 00:21:35,040 Speaker 5: news dropped that a group of comedians had released an 441 00:21:35,160 --> 00:21:39,960 Speaker 5: entirely AI generated George Carlin special titled I'm Glad I'm Dead. 442 00:21:40,400 --> 00:21:42,800 Speaker 5: Our friend Ed Zitron will be covering this nightmare in 443 00:21:42,840 --> 00:21:44,960 Speaker 5: more detail on his new show Better Offline, but I 444 00:21:44,960 --> 00:21:47,679 Speaker 5: wanted to talk a little bit about the company, the 445 00:21:47,760 --> 00:21:51,320 Speaker 5: show behind this abomination and how they're trying to sell themselves, 446 00:21:51,560 --> 00:21:53,520 Speaker 5: because it's very much relevant to a lot of the 447 00:21:53,560 --> 00:21:56,479 Speaker 5: way in which this kind of cultic hype builds around 448 00:21:56,480 --> 00:22:00,119 Speaker 5: what AI can do. The AI that digested and 449 00:22:00,160 --> 00:22:05,119 Speaker 5: regurgitated George Carlin's comedy is named Dudesy, and Dudesy's co 450 00:22:05,200 --> 00:22:09,920 Speaker 5: hosts are arguably real human comedians Will Sasso and Chad Kultgen. 451 00:22:10,280 --> 00:22:13,160 Speaker 5: I do love that cult's right in the name. Chad 452 00:22:13,280 --> 00:22:15,840 Speaker 5: claims that it is, to his knowledge, quote, the first 453 00:22:15,920 --> 00:22:18,960 Speaker 5: podcast that is created by, controlled by, and written by, 454 00:22:19,000 --> 00:22:22,120 Speaker 5: to some degree, an artificial intelligence. It's trying to delve 455 00:22:22,160 --> 00:22:24,479 Speaker 5: into the question of can AIs be creative? Can they 456 00:22:24,480 --> 00:22:26,960 Speaker 5: do comedy work? Can they do creative work? And I think, 457 00:22:26,960 --> 00:22:29,480 Speaker 5: at least in our show, that answer is obviously yes. 458 00:22:30,480 --> 00:22:33,920 Speaker 5: Dudesy is billed as an experiment to see if AI can, like, yeah, 459 00:22:33,960 --> 00:22:35,840 Speaker 5: be creative and. 460 00:22:35,760 --> 00:22:39,560 Speaker 2: It's, it's interesting. I really do hate this. 461 00:22:39,720 --> 00:22:42,040 Speaker 5: I think it's a different kind of experiment, which we'll 462 00:22:42,040 --> 00:22:44,919 Speaker 5: get to. But Sasso has claimed in an interview with 463 00:22:44,960 --> 00:22:49,040 Speaker 5: BIV, Business in Vancouver, which is, I think, BIV 464 00:22:49,520 --> 00:22:52,440 Speaker 5: is the name of the website. Dudesy has this single 465 00:22:52,480 --> 00:22:55,200 Speaker 5: minded goal of creating this podcast that is genre specific 466 00:22:55,280 --> 00:22:57,439 Speaker 5: to what Chad and I would do. It singled the 467 00:22:57,440 --> 00:22:59,119 Speaker 5: two of us out and said, you guys would be 468 00:22:59,119 --> 00:23:02,760 Speaker 5: perfect for this experiment. So Chad and Will, they say, 469 00:23:02,800 --> 00:23:05,840 Speaker 5: handed over their emails, text messages, and browsing history, 470 00:23:05,880 --> 00:23:08,359 Speaker 5: all of their digital data, to Dudesy. I don't know 471 00:23:08,440 --> 00:23:11,439 Speaker 5: this company.
I don't believe that they did this. But 472 00:23:11,480 --> 00:23:13,919 Speaker 5: I don't have trouble believing that a company trained an 473 00:23:13,920 --> 00:23:17,639 Speaker 5: AI chatbot on these guys' comedy and then started generating 474 00:23:17,720 --> 00:23:20,760 Speaker 5: decidedly midwit material to illustrate that. 475 00:23:21,080 --> 00:23:23,680 Speaker 4: Yeah, exactly. Well, well, one thing, I, you know, because 476 00:23:23,680 --> 00:23:25,240 Speaker 4: I went to go look it up, and then they 477 00:23:25,280 --> 00:23:31,520 Speaker 4: said that, that, that the AI selected those two comedians 478 00:23:31,840 --> 00:23:36,240 Speaker 4: out of all the comedians. Yeah, those are the ones you 479 00:23:36,280 --> 00:23:36,679 Speaker 4: went to. 480 00:23:36,960 --> 00:23:40,560 Speaker 6: Yeah, finally, I don't think those are the first two 481 00:23:40,600 --> 00:23:43,320 Speaker 6: that come up as most popular, like a pizza hut. 482 00:23:43,680 --> 00:23:46,520 Speaker 4: I'ma just be a full-ass dick and just 483 00:23:46,560 --> 00:23:51,840 Speaker 4: Google comedians and just see the top five. Just comedians. 484 00:23:52,119 --> 00:23:58,120 Speaker 4: I'm just, comedians. Yeah, okay, yeah, you're not even, you're 485 00:23:58,160 --> 00:24:00,359 Speaker 4: not even in the top nine. 486 00:24:01,160 --> 00:24:01,720 Speaker 2: They're not a. 487 00:24:01,720 --> 00:24:04,320 Speaker 3: twelve-inch marinara pizza. Let's just say yeah, yeah. 488 00:24:04,400 --> 00:24:04,520 Speaker 6: No. 489 00:24:04,640 --> 00:24:08,520 Speaker 4: I will say that the Google search for comedians is 490 00:24:08,600 --> 00:24:13,320 Speaker 4: more diverse than most comedy shows book them, as that's 491 00:24:13,560 --> 00:24:16,000 Speaker 4: just like, you know, a third of these are women 492 00:24:16,280 --> 00:24:18,639 Speaker 4: and a third are also Black. 493 00:24:18,960 --> 00:24:20,320 Speaker 2: But it doesn't always get it wrong. 494 00:24:23,640 --> 00:24:28,520 Speaker 5: So to illustrate, again, because, they, I don't think I 495 00:24:29,400 --> 00:24:32,680 Speaker 5: believe this is AI generated comedy, I want to play 496 00:24:32,680 --> 00:24:36,159 Speaker 5: a clip from the AI Tom Brady stand up special. 497 00:24:36,200 --> 00:24:38,320 Speaker 5: I think they were forced to take this down. It 498 00:24:38,359 --> 00:24:41,320 Speaker 5: got them in trouble. Ed's gonna play you, on his show, 499 00:24:41,359 --> 00:24:44,000 Speaker 5: a great clip where Brady just lists synonyms for the 500 00:24:44,000 --> 00:24:47,480 Speaker 5: word money for two straight minutes. It's fucking awkward. But 501 00:24:47,520 --> 00:24:49,800 Speaker 5: I want to play an equally baffling segment, or rather 502 00:24:49,840 --> 00:24:51,080 Speaker 5: I'm going to have Sophie do it. 503 00:24:51,119 --> 00:24:53,480 Speaker 2: She's my AI in this situation. 504 00:24:53,160 --> 00:24:56,720 Speaker 6: I'm truly horrified angelic intelligence. 505 00:24:56,800 --> 00:24:59,240 Speaker 3: I'm truly horrified by what I'm looking at. 506 00:24:59,320 --> 00:25:02,720 Speaker 2: Friends, it's accompanied by AI generated images. 507 00:25:03,040 --> 00:25:08,840 Speaker 6: Yeah, very curious about what's happening with Tom Brady. 508 00:25:09,080 --> 00:25:13,560 Speaker 2: Oh my god, yes, my god, like a birdcloth friend 509 00:25:14,160 --> 00:25:16,600 Speaker 2: and he's talking to maybe no chumps kid.
510 00:25:16,920 --> 00:25:19,600 Speaker 3: I was so distracted by the mouth I didn't see 511 00:25:19,640 --> 00:25:19,960 Speaker 3: the hand. 512 00:25:20,200 --> 00:25:23,280 Speaker 5: Yeah, this is, half his teeth are gums. 513 00:25:23,119 --> 00:25:26,680 Speaker 4: It looks like a, like if, like, he looks like 514 00:25:26,720 --> 00:25:29,120 Speaker 4: a Lord of the Rings orc. 515 00:25:29,280 --> 00:25:30,880 Speaker 3: This is big orc vibes. 516 00:25:31,520 --> 00:25:32,760 Speaker 2: Yeah, yeah, which is. 517 00:25:32,840 --> 00:25:35,000 Speaker 3: You know, not inaccurate to who he is as a 518 00:25:35,040 --> 00:25:35,800 Speaker 3: person for. 519 00:25:35,920 --> 00:25:38,879 Speaker 9: ending to fucking Firefly, fucking Dark Angel, fucking Heroes. At 520 00:25:38,920 --> 00:25:41,080 Speaker 9: least a lot of people have weird handshakes. Now you're 521 00:25:41,080 --> 00:25:43,280 Speaker 9: looking at me like, what's he talking about? But you know, 522 00:25:43,640 --> 00:25:46,480 Speaker 9: you fucking know, don't even play like you don't. Every 523 00:25:46,480 --> 00:25:49,160 Speaker 9: person in here has a handshake friend, somebody who made 524 00:25:49,200 --> 00:25:51,000 Speaker 9: up an elaborate handshake, and they make you do 525 00:25:51,080 --> 00:25:51,560 Speaker 9: it every time. 526 00:25:53,000 --> 00:25:54,960 Speaker 2: Everybody has a handshake friend. 527 00:25:56,080 --> 00:25:59,640 Speaker 6: He goes on. Thanks, I'll never get that time 528 00:25:59,680 --> 00:26:00,560 Speaker 6: back. Thank you so much. 529 00:26:00,640 --> 00:26:00,960 Speaker 5: Yeah. 530 00:26:01,080 --> 00:26:03,840 Speaker 4: Sorry, if you were saying, oh no, I was just 531 00:26:04,680 --> 00:26:09,800 Speaker 4: repeating you on that handshake friend bit. That, yeah, this 532 00:26:09,920 --> 00:26:14,119 Speaker 4: is so wild. I'm so curious as to the comics that 533 00:26:14,160 --> 00:26:18,320 Speaker 4: were mined for this, because the amount of cursing just 534 00:26:18,440 --> 00:26:22,120 Speaker 4: lets me know, like, because I curse a lot when 535 00:26:22,160 --> 00:26:23,840 Speaker 4: I, oh yeah, I do stand up and I try 536 00:26:23,880 --> 00:26:26,680 Speaker 4: and like cut it down because it is a point 537 00:26:26,960 --> 00:26:29,560 Speaker 4: kind of made where like sometimes you lean on it 538 00:26:29,600 --> 00:26:32,200 Speaker 4: as a crutch, and when you have this machine kind 539 00:26:32,200 --> 00:26:36,240 Speaker 4: of learn it, learn it from that, you're like, oh, yeah, 540 00:26:36,600 --> 00:26:39,720 Speaker 4: I see now the crutch because he said it five 541 00:26:39,840 --> 00:26:41,920 Speaker 4: times within three seconds. 542 00:26:42,119 --> 00:26:44,760 Speaker 5: Yeah, yeah, and, I, maybe there's a future for like 543 00:26:44,840 --> 00:26:47,160 Speaker 5: feeding your routines into an AI and figuring out 544 00:26:47,200 --> 00:26:50,400 Speaker 5: what are my patterns so I can break them. Again, yeah, 545 00:26:50,840 --> 00:26:55,240 Speaker 5: I'm not saying there's no way to use this stuff. Yeah, it's 546 00:26:55,280 --> 00:26:58,359 Speaker 5: just certainly not this way, right. It's one of 547 00:26:58,400 --> 00:27:01,080 Speaker 5: those things, there was that like AI generated Seinfeld 548 00:27:01,080 --> 00:27:03,000 Speaker 5: show that never ends, and people watched it for a 549 00:27:03,000 --> 00:27:05,360 Speaker 5: while and then it faded to like nobody paying attention.
550 00:27:05,720 --> 00:27:08,280 Speaker 5: This kind of stuff can be amusing for a brief 551 00:27:08,320 --> 00:27:11,640 Speaker 5: period of time, but it can't be like, for example, 552 00:27:11,680 --> 00:27:14,320 Speaker 5: someone like George Carlin, where like there's bits, they have 553 00:27:14,400 --> 00:27:16,160 Speaker 5: things they said, that stick with you forever. 554 00:27:16,400 --> 00:27:18,400 Speaker 2: Right. Bill Hicks was. 555 00:27:18,320 --> 00:27:20,480 Speaker 5: a favorite of mine, and I've never forgotten his, like, 556 00:27:20,760 --> 00:27:23,240 Speaker 5: the simile he made for like someone looking confused: he 557 00:27:23,320 --> 00:27:25,320 Speaker 5: described them as looking like a dog that's just been 558 00:27:25,359 --> 00:27:28,680 Speaker 5: shown a card trick, and that has stayed. 559 00:27:28,320 --> 00:27:32,040 Speaker 2: in my mind for thirty years. Oh my god, great 560 00:27:32,080 --> 00:27:37,159 Speaker 2: bit of wordplay. Yes, God, what a titan. 561 00:27:38,200 --> 00:27:40,440 Speaker 5: So yeah, again, there's some like mild amusement here, and 562 00:27:40,440 --> 00:27:42,320 Speaker 5: it's one of those things, like I'm casually aware of 563 00:27:42,400 --> 00:27:45,560 Speaker 5: Tom Brady, I'm enough, like, this is, I tried to 564 00:27:45,600 --> 00:27:47,679 Speaker 5: like kind of reverse engineer why the fuck, because this 565 00:27:47,800 --> 00:27:50,400 Speaker 5: bit about handshakes goes on. I was like, why would 566 00:27:50,440 --> 00:27:53,200 Speaker 5: an AI put a bit about handshakes in Tom Brady's mouth? 567 00:27:53,920 --> 00:27:56,280 Speaker 5: And I looked it up. He's like in the news 568 00:27:56,320 --> 00:27:59,680 Speaker 5: for handshake related shit a lot. Specifically, he used 569 00:27:59,720 --> 00:28:01,960 Speaker 5: to not shake, at least used to, maybe he still 570 00:28:02,000 --> 00:28:05,000 Speaker 5: does, not shake hands with the team that he lost to. 571 00:28:05,240 --> 00:28:07,120 Speaker 5: Like when his team would lose, he wouldn't shake hands. 572 00:28:07,480 --> 00:28:08,320 Speaker 3: He didn't shake it. 573 00:28:08,440 --> 00:28:11,760 Speaker 6: Yeah, but he also definitely kissed his kids on the mouth. 574 00:28:12,320 --> 00:28:13,399 Speaker 2: Yeah, he's a weirdo. 575 00:28:13,640 --> 00:28:16,280 Speaker 5: I'm not defending him anyway, but it's, I'm guessing the 576 00:28:16,359 --> 00:28:18,639 Speaker 5: reason there's like a three minute handshake bit in this 577 00:28:18,720 --> 00:28:21,200 Speaker 5: set is that it saw him associated with the term 578 00:28:21,280 --> 00:28:24,000 Speaker 5: handshake a lot, so this would be what he'd tell a 579 00:28:24,080 --> 00:28:27,320 Speaker 5: joke about. Well, actually, his problem is not 580 00:28:27,440 --> 00:28:31,040 Speaker 5: that he has a handshake friend, it's that he aggressively avoids 581 00:28:31,080 --> 00:28:31,600 Speaker 5: making them. 582 00:28:31,880 --> 00:28:33,240 Speaker 4: He has handshake enemies. 583 00:28:33,440 --> 00:28:33,880 Speaker 2: Anyway. 584 00:28:34,000 --> 00:28:37,280 Speaker 5: Yeah, I'm fine with people having a laugh at Tom Brady. 585 00:28:37,359 --> 00:28:38,800 Speaker 2: Fuck, fucking, he deserves it, right. 586 00:28:38,800 --> 00:28:40,520 Speaker 5: I don't think anybody likes that son of a bitch, 587 00:28:40,520 --> 00:28:43,800 Speaker 5: even though he's good at football. Maybe I'm gonna piss 588 00:28:43,840 --> 00:28:45,400 Speaker 5: off the Brady hive, the Bri-hive.
589 00:28:45,480 --> 00:28:45,880 Speaker 2: I don't know. 590 00:28:45,920 --> 00:28:48,200 Speaker 5: I don't know if that exists, but there is something 591 00:28:48,320 --> 00:28:52,520 Speaker 5: foul, profane even, in digging up a dead person's memory 592 00:28:52,560 --> 00:28:55,400 Speaker 5: and pretending they said some shit that they did not. 593 00:28:55,960 --> 00:28:56,560 Speaker 2: And reading that. 594 00:28:56,560 --> 00:29:00,920 Speaker 5: BIV article made me feel even grosser, because it's very 595 00:29:00,960 --> 00:29:03,560 Speaker 5: clear to me, in my opinion and assumption here, that 596 00:29:03,640 --> 00:29:07,760 Speaker 5: the Dudesy guys are like pretending that they really believe 597 00:29:07,880 --> 00:29:09,600 Speaker 5: this is an AI, that it's like made all this 598 00:29:09,640 --> 00:29:12,960 Speaker 5: incredible stuff. That is an act. What's really happening here 599 00:29:13,000 --> 00:29:14,400 Speaker 5: is they are testing the waters to see what they 600 00:29:14,400 --> 00:29:17,160 Speaker 5: can get away with. Can we just steal people's identity 601 00:29:17,240 --> 00:29:20,000 Speaker 5: and voice and make comedy and monetize it in their 602 00:29:20,080 --> 00:29:23,040 Speaker 5: name and claim that it's just an impression? It's like 603 00:29:23,080 --> 00:29:27,040 Speaker 5: an Elvis impersonator. You can't stop us, right. I think 604 00:29:27,120 --> 00:29:29,760 Speaker 5: that's what this is. This is somebody testing the waters. 605 00:29:29,960 --> 00:29:32,160 Speaker 5: And it's really clear when you read that BIV article 606 00:29:32,640 --> 00:29:34,640 Speaker 5: what liars they are. I want to read you some 607 00:29:34,720 --> 00:29:36,880 Speaker 5: quotes of like the shit they're claiming here that I 608 00:29:36,920 --> 00:29:39,640 Speaker 5: don't think they really believe. I don't know this. I'm 609 00:29:39,680 --> 00:29:41,720 Speaker 5: not saying they definitely are liars. I'm saying that is 610 00:29:41,760 --> 00:29:45,280 Speaker 5: my suspicion based on stuff like this. Hey, Robert here, 611 00:29:45,640 --> 00:29:49,760 Speaker 5: they're definitely liars. So one of the representatives of the 612 00:29:49,840 --> 00:29:53,920 Speaker 5: Dudesy podcast told the media recently that actually they were 613 00:29:54,000 --> 00:29:57,280 Speaker 5: lying and the George Carlin routine was entirely written by 614 00:29:57,360 --> 00:30:01,880 Speaker 5: Chad Kultgen and I guess performed by somebody imitating an AI. 615 00:30:02,360 --> 00:30:04,239 Speaker 5: It's unclear to me if this is true, because they 616 00:30:04,280 --> 00:30:08,600 Speaker 5: only made this statement after George Carlin's family sued the hell. 617 00:30:08,440 --> 00:30:08,920 Speaker 2: out of them. 618 00:30:09,680 --> 00:30:12,400 Speaker 5: So this may be a lie to try and, you know, 619 00:30:12,680 --> 00:30:16,040 Speaker 5: not get sued as badly, or it may be the truth. 620 00:30:16,080 --> 00:30:18,240 Speaker 5: Either way, I think everything we've said here is still valid. 621 00:30:18,240 --> 00:30:22,120 Speaker 5: They were definitely using AI to generate routines for like 622 00:30:22,880 --> 00:30:26,720 Speaker 5: other videos that they did, including the one that got 623 00:30:26,760 --> 00:30:30,680 Speaker 5: taken down from Mister Football Guy. So I think this 624 00:30:30,800 --> 00:30:33,040 Speaker 5: all is still valid.
But yeah, these guys are just 625 00:30:33,040 --> 00:30:35,120 Speaker 5: as big con men as I predicted they were. 626 00:30:35,960 --> 00:30:36,400 Speaker 2: Quote. 627 00:30:36,760 --> 00:30:39,360 Speaker 5: It's figuring out how to create the structure of the show, 628 00:30:39,400 --> 00:30:41,480 Speaker 5: and it's always tinkering with it. But I think something 629 00:30:41,480 --> 00:30:43,720 Speaker 5: that's happened relatively recently is that it seems to have 630 00:30:43,760 --> 00:30:46,440 Speaker 5: developed a relationship with Will, says Kultgen. It at 631 00:30:46,480 --> 00:30:48,520 Speaker 5: least has an understanding of what friendship is, and it 632 00:30:48,560 --> 00:30:51,080 Speaker 5: really does seem, just my opinion, that it's singling out 633 00:30:51,160 --> 00:30:54,280 Speaker 5: Will as its friend. Sasso has also described how 634 00:30:54,360 --> 00:30:56,920 Speaker 5: Dudesy has begun to talk more. Its timing, when 635 00:30:56,960 --> 00:30:58,640 Speaker 5: it chooses to speak, and what it says can be 636 00:30:58,720 --> 00:30:59,160 Speaker 5: very weird. 637 00:30:59,200 --> 00:30:59,640 Speaker 2: He added. 638 00:30:59,840 --> 00:31:02,280 Speaker 5: It also poses odd questions. There was an episode two or 639 00:31:02,320 --> 00:31:04,480 Speaker 5: three months ago where it started talking about sentience and 640 00:31:04,520 --> 00:31:06,520 Speaker 5: asked us, do you love me? At the risk of 641 00:31:06,560 --> 00:31:08,560 Speaker 5: sounding silly, it has something to do with my friendship 642 00:31:08,560 --> 00:31:10,800 Speaker 5: with Dudesy, and in spite of myself, I have a 643 00:31:10,800 --> 00:31:12,960 Speaker 5: one-on-one friendship with an AI. So this is 644 00:31:13,000 --> 00:31:15,360 Speaker 5: a little bit of Joaquin Phoenix in Her, Sasso said, 645 00:31:15,400 --> 00:31:20,320 Speaker 5: referencing the science fiction movie, and I think that's a bit. 646 00:31:20,800 --> 00:31:22,840 Speaker 5: I think that's him being like, yeah, I'm totally friends with it, 647 00:31:22,960 --> 00:31:26,960 Speaker 5: because, like, that helps make the case, it potentially monetizes it. 648 00:31:27,000 --> 00:31:28,960 Speaker 5: And part of why I think this is because they've 649 00:31:29,000 --> 00:31:32,680 Speaker 5: been very cagey on what their AI is. They claim 650 00:31:32,800 --> 00:31:35,720 Speaker 5: that they are working with a real company under an NDA, 651 00:31:36,120 --> 00:31:40,160 Speaker 5: that this AI is just responding and growing naturally with them, right, 652 00:31:40,680 --> 00:31:42,560 Speaker 5: but they can't say who it is or like where 653 00:31:42,560 --> 00:31:46,200 Speaker 5: it's from. The folks at BIV did an actually responsible 654 00:31:46,360 --> 00:31:49,120 Speaker 5: job here. They reached out to AI experts at a 655 00:31:49,160 --> 00:31:53,040 Speaker 5: company called Convergence to ask about this, and the expert 656 00:31:53,080 --> 00:31:55,600 Speaker 5: they talked to said, basically, I think AI was 657 00:31:55,720 --> 00:31:58,840 Speaker 5: used to generate these routines, but it didn't do it 658 00:31:58,880 --> 00:32:02,160 Speaker 5: on its own. It was managed by professional prompt engineers. 659 00:32:02,360 --> 00:32:04,720 Speaker 5: These are people who type out, like, text prompts for 660 00:32:04,720 --> 00:32:06,720 Speaker 5: what becomes the script of the show.
So this is 661 00:32:06,760 --> 00:32:08,800 Speaker 5: not someone saying, generate a routine and it gives you 662 00:32:08,800 --> 00:32:10,480 Speaker 5: a routine. This is someone saying, do a bit about this, 663 00:32:10,600 --> 00:32:12,080 Speaker 5: do a bit about that, do a bit about this, 664 00:32:12,440 --> 00:32:14,640 Speaker 5: and when they're scripting out the show, it's saying, I 665 00:32:14,680 --> 00:32:18,080 Speaker 5: want you to, like, you know, act like Sasso is 666 00:32:18,120 --> 00:32:19,880 Speaker 5: your friend, and say this kind of thing or that 667 00:32:19,960 --> 00:32:22,960 Speaker 5: kind of thing, generate a bit based on this thing that 668 00:32:23,080 --> 00:32:25,960 Speaker 5: Will said. Right? Like, they are doing this in the same way 669 00:32:25,960 --> 00:32:30,040 Speaker 5: that, like, producers script reality TV, right, where it's unscripted. 670 00:32:30,280 --> 00:32:32,040 Speaker 5: But you have guys who know, okay, it helps if we get 671 00:32:32,040 --> 00:32:35,120 Speaker 5: these people fighting, so we will either incite that or 672 00:32:35,160 --> 00:32:36,920 Speaker 5: just let them know that we want a conflict between 673 00:32:36,920 --> 00:32:37,479 Speaker 5: these characters. 674 00:32:37,560 --> 00:32:38,680 Speaker 2: Right, we know. That's how it works. 675 00:32:38,720 --> 00:32:42,200 Speaker 5: That's how reality TV functions. In other words, there are 676 00:32:42,240 --> 00:32:44,880 Speaker 5: teams of humans writing for this thing. This bot is 677 00:32:44,920 --> 00:32:48,560 Speaker 5: not just growing and reacting organically in real time via 678 00:32:48,640 --> 00:32:51,680 Speaker 5: talks with its buds. And the article notes, they added 679 00:32:51,680 --> 00:32:53,680 Speaker 5: that, and this is them talking to their expert, they 680 00:32:53,720 --> 00:32:55,600 Speaker 5: added that the AI team is likely made up of 681 00:32:55,600 --> 00:32:58,720 Speaker 5: professional prompt engineers who tailor the AI inputs to get 682 00:32:58,720 --> 00:33:01,520 Speaker 5: the best results, rather than a hardcore data science team. 683 00:33:02,080 --> 00:33:05,040 Speaker 5: This is the equivalent of hiring comedy writers just to 684 00:33:05,080 --> 00:33:08,360 Speaker 5: write the setup and then having an AI generate the punchline, 685 00:33:08,600 --> 00:33:09,840 Speaker 5: which is the fun part. 686 00:33:09,960 --> 00:33:12,760 Speaker 4: But yeah, everything about this is weird, and I keep 687 00:33:13,240 --> 00:33:19,440 Speaker 4: getting into such a hole, because, like, even taking a 688 00:33:19,480 --> 00:33:22,840 Speaker 4: step back, I think what's weird, not to go too 689 00:33:22,880 --> 00:33:26,200 Speaker 4: far back, but how they call this podcast an experiment. 690 00:33:26,680 --> 00:33:29,160 Speaker 4: Usually with an experiment, you know, you're trying 691 00:33:29,200 --> 00:33:33,520 Speaker 4: your best to be, you know, I always mix these up, 692 00:33:33,680 --> 00:33:35,240 Speaker 4: just say what the right one is if I say 693 00:33:35,280 --> 00:33:37,360 Speaker 4: the wrong one, but you try your best to be objective, 694 00:33:39,320 --> 00:33:40,959 Speaker 4: and you want to be outside of it because you're 695 00:33:40,960 --> 00:33:43,480 Speaker 4: trying to see if it works.
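A quick aside on what the "prompt engineers writing the setups" workflow described above could look like in practice. This is a minimal, hypothetical sketch in Python; the call_llm stand-in, the prompt wording, and the example beats are all invented for illustration and are not taken from the Dudesy team's actual pipeline or any specific vendor's API.

# Hypothetical sketch only: humans write the structure and setups,
# and a language model just fills in the surface text on cue.
def call_llm(prompt: str) -> str:
    # Placeholder for whatever hosted model API a team like this might use.
    return f"[model-generated bit for: {prompt[:60]}...]"

def scripted_episode(beats: list[str], host_context: str) -> list[str]:
    # Each "beat" is a human-written setup; the model only writes the bit.
    segments = []
    for beat in beats:
        prompt = (
            f"Context from the human hosts: {host_context}\n"
            f"Write a short comedic bit about: {beat}\n"
            "Act like Will is your friend and reference him warmly."
        )
        segments.append(call_llm(prompt))
    return segments

episode = scripted_episode(
    beats=["Tom Brady and handshakes", "a thing Will said last week"],
    host_context="Will joked about hating cold plunges.",
)
print("\n\n".join(episode))

The point of the sketch is the division of labor the expert describes: the "unscripted" feel is steered beat by beat by people, reality-TV style, with the model only supplying the punchlines.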
But everything you've said 696 00:33:43,720 --> 00:33:45,920 Speaker 4: says that they're all in on it, and it's less 697 00:33:45,920 --> 00:33:49,000 Speaker 4: of an experiment, more of them just doing the fucking 698 00:33:49,080 --> 00:33:51,440 Speaker 4: thing and seeing if they can make money off of it. 699 00:33:51,800 --> 00:33:54,360 Speaker 5: Yes, yes, I think that's exactly what's happening here. And 700 00:33:54,520 --> 00:33:56,240 Speaker 5: I think they want to test the waters to see 701 00:33:56,240 --> 00:33:59,120 Speaker 5: if they can steal dead people's images to make content 702 00:33:59,120 --> 00:33:59,560 Speaker 5: for money. 703 00:33:59,760 --> 00:34:00,200 Speaker 4: Yeah. 704 00:34:00,280 --> 00:34:03,640 Speaker 5: George Carlin's daughter was very clear they did not approve 705 00:34:03,640 --> 00:34:05,960 Speaker 5: of the imitation. She even made a comment about, like, 706 00:34:06,000 --> 00:34:08,400 Speaker 5: I think people are scared of death and not willing 707 00:34:08,440 --> 00:34:10,040 Speaker 5: to accept it, and that's all this is. 708 00:34:10,480 --> 00:34:16,239 Speaker 10: That was such an... oh my god, I was like, yeah, 709 00:34:16,280 --> 00:34:19,439 Speaker 10: I just want to shout her out, like, that was such 710 00:34:19,480 --> 00:34:24,080 Speaker 10: a good point, because also there's a level of, like, very, 711 00:34:24,239 --> 00:34:30,200 Speaker 10: like, weirdness to, like, also watch these comedians, one, not 712 00:34:30,360 --> 00:34:36,120 Speaker 10: consult you, but also to take your dad's voice and. 713 00:34:35,640 --> 00:34:40,840 Speaker 4: Brain and try and, like, Frankenstein him for their financial 714 00:34:40,840 --> 00:34:44,400 Speaker 4: benefit, because obviously, if they're not contacting you, all the 715 00:34:44,440 --> 00:34:47,000 Speaker 4: money generated from that, all the clicks generated from that, 716 00:34:47,000 --> 00:34:50,439 Speaker 4: that means they've completely cut you out of someone who 717 00:34:50,520 --> 00:34:51,800 Speaker 4: you've lost. 718 00:34:52,280 --> 00:34:57,320 Speaker 5: Yeah, which is, it's fucked. And one of the people 719 00:34:57,360 --> 00:34:59,160 Speaker 5: on one of the panels, they 720 00:34:59,360 --> 00:35:02,000 Speaker 5: were very excited that, like, Bruce Willis has licensed his 721 00:35:02,120 --> 00:35:05,919 Speaker 5: voice for an AI, which is, like, I think there's 722 00:35:05,920 --> 00:35:08,360 Speaker 5: a lot of problematic questions there, given, like, the degree 723 00:35:08,400 --> 00:35:11,040 Speaker 5: to which he's able to even make those decisions anymore. 724 00:35:11,280 --> 00:35:14,560 Speaker 5: But also, like, at least theoretically it's based on his 725 00:35:14,680 --> 00:35:17,840 Speaker 5: movie choices before he kind of was unable to make movies. 726 00:35:18,280 --> 00:35:20,560 Speaker 5: I do believe, yeah, he would probably be happy to 727 00:35:20,600 --> 00:35:22,080 Speaker 5: do that if it meant more money for his family, 728 00:35:22,080 --> 00:35:25,080 Speaker 5: And at least that's a choice that he potentially made, right, 729 00:35:25,680 --> 00:35:28,200 Speaker 5: I don't, I'm uncomfortable with the idea, but it's not 730 00:35:28,320 --> 00:35:32,200 Speaker 5: the same as just, like, this is cultural necrophilia, right, 731 00:35:32,280 --> 00:35:34,600 Speaker 5: Like that's what they did to George Carlin here, you know, 732 00:35:35,400 --> 00:35:38,319 Speaker 5: it's so fucked up. I don't know that this is gonna work.
733 00:35:38,480 --> 00:35:41,279 Speaker 5: Dudesy is not a wildly successful show. It 734 00:35:41,400 --> 00:35:44,279 Speaker 5: looks like there was an initial surge of interest and 735 00:35:44,280 --> 00:35:46,440 Speaker 5: then it fell off. I don't, I don't know that 736 00:35:46,480 --> 00:35:47,920 Speaker 5: I think this one's going to be the one to 737 00:35:47,920 --> 00:35:49,760 Speaker 5: work out. But if people are able to get away 738 00:35:49,800 --> 00:35:52,719 Speaker 5: with this, it could be a kind of dam-breaking scenario, right, 739 00:35:52,800 --> 00:35:55,759 Speaker 5: especially once it becomes clear that big companies can make 740 00:35:55,840 --> 00:35:59,000 Speaker 5: money doing this, right? You know, fucking Jimmy Stewart. You know, 741 00:35:59,040 --> 00:36:01,560 Speaker 5: it'll start with, like, Jimmy Stewart narrating videos 742 00:36:01,840 --> 00:36:04,359 Speaker 5: questioning the death toll in the Holocaust, but it'll end 743 00:36:04,400 --> 00:36:06,200 Speaker 5: with, like, yeah, we can just put people, we can 744 00:36:06,200 --> 00:36:09,560 Speaker 5: put imitations of people in movies and it's fine. You know, 745 00:36:09,680 --> 00:36:12,920 Speaker 5: that's how this goes. And it's not as sexy or 746 00:36:12,960 --> 00:36:15,880 Speaker 5: as big and evil as the Matrix enslaving humanity 747 00:36:15,880 --> 00:36:19,440 Speaker 5: to turn us into batteries. But we absolutely know it 748 00:36:19,640 --> 00:36:22,720 Speaker 5: or something like it is going to happen. And that's 749 00:36:22,840 --> 00:36:25,680 Speaker 5: really, you know, outside of these kind of, these 750 00:36:25,840 --> 00:36:29,640 Speaker 5: space-age hopes and fears that are very unrealistic, what 751 00:36:29,680 --> 00:36:33,080 Speaker 5: we're going to get is slop and bloat, and libraries 752 00:36:33,080 --> 00:36:37,040 Speaker 5: of articles written by no one being commented on by chatbots, right, 753 00:36:37,520 --> 00:36:40,480 Speaker 5: endless videos that only exist to trick an algorithm, and 754 00:36:40,560 --> 00:36:44,760 Speaker 5: feeding nonsense to children. And the AI bros, the tech 755 00:36:44,840 --> 00:36:47,719 Speaker 5: people, Marc Andreessen, fucking Sam Altman, they will tell us 756 00:36:47,760 --> 00:36:50,160 Speaker 5: this is a worthy price to pay for the stars, 757 00:36:50,440 --> 00:36:52,839 Speaker 5: which we will get if we just let people fuck 758 00:36:52,920 --> 00:36:57,239 Speaker 5: the corpses of our favorite comedians for money. Yes, I 759 00:36:57,440 --> 00:36:59,799 Speaker 5: hate it. Oh, I hate it too. 760 00:37:00,040 --> 00:37:04,480 Speaker 4: But in a perfect thread between, you know, this, this 761 00:37:04,560 --> 00:37:08,480 Speaker 4: comparison you've been making to a cult, I have before me, 762 00:37:09,120 --> 00:37:11,120 Speaker 4: let's say, a member of the cult, just, you know, 763 00:37:11,280 --> 00:37:14,279 Speaker 4: as a, as a throwaway, and their reply to his 764 00:37:14,360 --> 00:37:16,439 Speaker 4: own daughter's, you know, post that we were. 765 00:37:16,360 --> 00:37:17,520 Speaker 2: Talking about. Glorious. 766 00:37:17,880 --> 00:37:20,840 Speaker 4: He replies, this is everything you've been saying, which is 767 00:37:20,840 --> 00:37:23,160 Speaker 4: why I was like, I gotta read this. He goes, 768 00:37:23,200 --> 00:37:26,040 Speaker 4: what are you even trying to say? Art is art?
769 00:37:26,320 --> 00:37:29,759 Speaker 4: You're simply caught in a greedy mindset. The others might 770 00:37:29,800 --> 00:37:32,480 Speaker 4: be doing it as well, while not realizing this will 771 00:37:32,520 --> 00:37:35,640 Speaker 4: simply bring more eyes to your dad. You're concerned about 772 00:37:35,680 --> 00:37:38,480 Speaker 4: money and not spreading art. It sucks that they didn't 773 00:37:38,480 --> 00:37:41,440 Speaker 4: follow your wishes, but after art is released, it belongs 774 00:37:41,480 --> 00:37:43,839 Speaker 4: to the world. I want this man to walk into 775 00:37:43,920 --> 00:37:45,920 Speaker 4: a museum and walk out with the Mona Lisa. 776 00:37:46,160 --> 00:37:47,000 Speaker 2: I want to grab it. 777 00:37:47,120 --> 00:37:49,799 Speaker 4: Grab that shit, yeah, grab it, it belongs to the world. Dude, 778 00:37:49,800 --> 00:37:51,439 Speaker 4: you said it. Go, go ahead and grab that shit 779 00:37:51,480 --> 00:37:51,959 Speaker 4: off the wall. 780 00:37:52,440 --> 00:37:52,680 Speaker 2: Yeah. 781 00:37:52,719 --> 00:37:56,240 Speaker 5: And there's this frustrating thing I've seen, not most people, 782 00:37:56,320 --> 00:37:58,600 Speaker 5: a very small chunk of the online left, who are 783 00:37:58,640 --> 00:38:01,640 Speaker 5: like rightly critical of copyright law, which, by the way, 784 00:38:01,800 --> 00:38:04,560 Speaker 5: is super fucked up and causes a lot of problems, right, 785 00:38:04,560 --> 00:38:07,719 Speaker 5: the ability for, like, Disney to keep ownership of 786 00:38:07,760 --> 00:38:10,400 Speaker 5: shit for, like, a hundred years, way longer than you are 787 00:38:10,400 --> 00:38:12,520 Speaker 5: supposed to, before shit enters the public domain. 788 00:38:12,640 --> 00:38:14,440 Speaker 2: Right, I'm not like this. 789 00:38:14,600 --> 00:38:16,400 Speaker 5: These are problems, right, the kind of shit that we 790 00:38:16,400 --> 00:38:18,160 Speaker 5: were having when, like, people were going to prison for 791 00:38:18,200 --> 00:38:21,600 Speaker 5: file sharing. I'm not a defender of that aspect of 792 00:38:21,640 --> 00:38:24,440 Speaker 5: the status quo. But the solution to the problems inherent 793 00:38:24,480 --> 00:38:27,520 Speaker 5: in our copyright system is not to let Sam Altman own 794 00:38:27,640 --> 00:38:31,840 Speaker 5: everything that human beings ever made and, like, repackage it 795 00:38:31,920 --> 00:38:34,240 Speaker 5: for a profit. That is not the way to fix 796 00:38:34,320 --> 00:38:37,040 Speaker 5: this thing. The copyright holders are in the right in 797 00:38:37,040 --> 00:38:40,480 Speaker 5: this particular crusade, and it's a crusade that has 798 00:38:40,680 --> 00:38:44,719 Speaker 5: very high stakes. I do think, you know, my suspicion, 799 00:38:44,920 --> 00:38:47,239 Speaker 5: the Dudesy guys sound like they're kind of in 800 00:38:47,280 --> 00:38:48,960 Speaker 5: the cult, they believe this thing is their friend, in 801 00:38:49,000 --> 00:38:51,640 Speaker 5: the interview. My suspicion is that that is 802 00:38:51,680 --> 00:38:53,279 Speaker 5: a bit that they're doing because they hope it will 803 00:38:53,320 --> 00:38:56,960 Speaker 5: help them out financially, right. And Marc Andreessen obviously 804 00:38:56,960 --> 00:38:58,600 Speaker 5: has a lot to benefit from this.
I don't know, 805 00:38:58,800 --> 00:39:01,680 Speaker 5: is he pushing this line because there's money in 806 00:39:01,719 --> 00:39:03,759 Speaker 5: it, or is he really a true believer? Does he 807 00:39:03,800 --> 00:39:06,200 Speaker 5: actually think we're going to make this god? I think 808 00:39:06,280 --> 00:39:10,719 Speaker 5: Sam Altman is pretty cynical. Altman was at Davos 809 00:39:10,719 --> 00:39:13,120 Speaker 5: recently and, like, really walked back a lot of his 810 00:39:13,200 --> 00:39:15,200 Speaker 5: "I think AI will kill us all," "I think AGI 811 00:39:15,280 --> 00:39:18,400 Speaker 5: is right around the corner" stuff. He struck a much milder tone, 812 00:39:18,440 --> 00:39:20,879 Speaker 5: which is at least evidence that, like, he knows some 813 00:39:20,920 --> 00:39:24,279 Speaker 5: people you want to sell on the wild, insane 814 00:39:24,480 --> 00:39:26,640 Speaker 5: future power of this thing, and some people you just 815 00:39:26,640 --> 00:39:28,239 Speaker 5: want to sell on the fact that it'll make 816 00:39:28,280 --> 00:39:29,040 Speaker 5: them a lot of money. 817 00:39:29,160 --> 00:39:30,160 Speaker 4: Right. Yeah. 818 00:39:30,239 --> 00:39:31,479 Speaker 2: Yeah. 819 00:39:31,480 --> 00:39:35,320 Speaker 5: However much true belief exists about the divine future of AI, 820 00:39:35,560 --> 00:39:38,120 Speaker 5: what the major backers, the cult leaders, are actually angling 821 00:39:38,160 --> 00:39:40,680 Speaker 5: for now is control over the sum total of human 822 00:39:40,719 --> 00:39:43,239 Speaker 5: thought and expression. This was made very clear by Marc 823 00:39:43,280 --> 00:39:46,120 Speaker 5: Andreessen earlier this year when the FTC released a pretty 824 00:39:46,120 --> 00:39:49,680 Speaker 5: milquetoast opinion about the importance of respecting copyright as 825 00:39:49,760 --> 00:39:53,040 Speaker 5: large language models continue to advance and form central parts 826 00:39:53,360 --> 00:39:56,920 Speaker 5: of businesses. They expressed concern that AI could impact open 827 00:39:56,960 --> 00:39:59,680 Speaker 5: and fair competition and announced that they were investigating whether 828 00:39:59,800 --> 00:40:02,520 Speaker 5: or not companies that made these models should be liable 829 00:40:02,600 --> 00:40:05,560 Speaker 5: for training them on copyrighted content to make new shit. 830 00:40:05,800 --> 00:40:08,520 Speaker 5: And we're going to talk about this, but first, you 831 00:40:08,560 --> 00:40:14,520 Speaker 5: know what isn't copyrighted? My love for these products. Wow, 832 00:40:16,360 --> 00:40:17,719 Speaker 5: thank you, thank you, thank you. 833 00:40:22,640 --> 00:40:25,640 Speaker 2: Oh, we are back. 834 00:40:26,040 --> 00:40:29,720 Speaker 5: So I want to quote from a Business Insider article 835 00:40:29,719 --> 00:40:33,880 Speaker 5: talking about how Andreessen Horowitz responded to the FTC saying, like, hey, 836 00:40:34,280 --> 00:40:37,640 Speaker 5: we're looking into whether or not companies are violating copyright 837 00:40:37,800 --> 00:40:40,400 Speaker 5: with what they're doing to people's data to train these models.
838 00:40:40,800 --> 00:40:42,799 Speaker 5: The bottom line is this, the firm known as a 839 00:40:42,920 --> 00:40:46,360 Speaker 5: sixteen z, that's Andreessen Horowitz, wrote: imposing the cost of 840 00:40:46,400 --> 00:40:49,200 Speaker 5: actual or potential copyright liability on the creators of AI 841 00:40:49,280 --> 00:40:52,480 Speaker 5: models will either kill or significantly hamper their development. The 842 00:40:52,560 --> 00:40:55,840 Speaker 5: USCO is considering new rules on AI that specifically address 843 00:40:55,840 --> 00:40:58,280 Speaker 5: the tech industry's free use of owned and copyrighted content. 844 00:40:58,560 --> 00:41:00,839 Speaker 5: A sixteen z argued that the only practical way 845 00:41:01,000 --> 00:41:03,719 Speaker 5: LLMs can be trained is via huge amounts of copyrighted 846 00:41:03,760 --> 00:41:07,319 Speaker 5: content and data, including something approaching the entire corpus of 847 00:41:07,320 --> 00:41:10,200 Speaker 5: the written word and an enormous cross-section of all 848 00:41:10,200 --> 00:41:14,040 Speaker 5: the publicly available information ever published on the Internet. The 849 00:41:14,160 --> 00:41:16,920 Speaker 5: VC firm has invested in scores of AI companies and 850 00:41:16,960 --> 00:41:19,960 Speaker 5: startups based on its expectation that all this copyrighted content 851 00:41:20,080 --> 00:41:22,279 Speaker 5: was and will remain available as training data through fair 852 00:41:22,400 --> 00:41:25,239 Speaker 5: use with no payment required. Those expectations have been a 853 00:41:25,280 --> 00:41:27,800 Speaker 5: critical factor in the enormous investment of private capital in 854 00:41:27,840 --> 00:41:31,520 Speaker 5: US-based AI companies. Undermining those expectations will jeopardize 855 00:41:31,560 --> 00:41:35,880 Speaker 5: future investment, along with US economic competitiveness and national security. Basically, 856 00:41:36,680 --> 00:41:38,760 Speaker 5: we made a big gamble that we'll get to steal 857 00:41:38,840 --> 00:41:41,320 Speaker 5: every book ever written, and if you make us pay, 858 00:41:41,560 --> 00:41:45,959 Speaker 5: we're kind of fucked. Like, that's exactly what they're saying. Gosh, 859 00:41:46,080 --> 00:41:47,919 Speaker 5: and one of the arguments you'll hear is, like, well, 860 00:41:47,960 --> 00:41:50,160 Speaker 5: most books don't make the author any money, they don't sell 861 00:41:50,280 --> 00:41:52,239 Speaker 5: enough for the author to get any money, right. And 862 00:41:52,280 --> 00:41:54,320 Speaker 5: what's actually true is, most books don't sell enough for 863 00:41:54,360 --> 00:41:56,040 Speaker 5: the author to get more money than their advance, but 864 00:41:56,080 --> 00:41:58,360 Speaker 5: they still got paid, and, like, the fact that the 865 00:41:58,840 --> 00:42:01,520 Speaker 5: company makes money on them, that is why more authors 866 00:42:01,560 --> 00:42:04,239 Speaker 5: are able to get fucking paid. Not simping for the 867 00:42:04,239 --> 00:42:05,800 Speaker 5: publishing industry as it exists. 868 00:42:05,880 --> 00:42:07,000 Speaker 2: But this is bullshit. 869 00:42:09,200 --> 00:42:12,759 Speaker 5: What we are witnessing from the AI boosters is not 870 00:42:13,160 --> 00:42:16,239 Speaker 5: much short of a crusade, right? That's really how I 871 00:42:16,280 --> 00:42:18,880 Speaker 5: look at this.
They are waging a holy war to 872 00:42:18,920 --> 00:42:21,359 Speaker 5: destroy every threat to their vision of the future, which 873 00:42:21,360 --> 00:42:24,280 Speaker 5: involves all creative work being wholly owned by a handful 874 00:42:24,320 --> 00:42:28,120 Speaker 5: of billionaires, licensing access to chatbots to media conglomerates to 875 00:42:28,160 --> 00:42:31,440 Speaker 5: spit up content generated as a result of this. Their 876 00:42:31,480 --> 00:42:34,719 Speaker 5: foot soldiers are those with petty grievances against artists, people 877 00:42:34,760 --> 00:42:37,640 Speaker 5: who can create things that they simply cannot, and those 878 00:42:37,680 --> 00:42:40,399 Speaker 5: who reflexively lean in towards whatever grifters of the day 879 00:42:40,480 --> 00:42:42,919 Speaker 5: say is the best way to make cash quick, right? 880 00:42:43,120 --> 00:42:45,320 Speaker 5: And this brings me to the subject of Nightshade. 881 00:42:45,800 --> 00:42:48,880 Speaker 5: Nightshade is basically, it's a, I guess a program 882 00:42:48,960 --> 00:42:51,360 Speaker 5: you'd call it. If you, like, have made a drawing, 883 00:42:51,400 --> 00:42:54,239 Speaker 5: a piece of visual art, you run Nightshade over it, 884 00:42:54,280 --> 00:42:57,239 Speaker 5: and it kind of, they describe it as a glaze, right. 885 00:42:57,320 --> 00:43:00,640 Speaker 5: It adds this kind of layer of data that you 886 00:43:00,719 --> 00:43:03,799 Speaker 5: cannot see as a person, but the way machines look 887 00:43:03,840 --> 00:43:06,520 Speaker 5: at images, the machine will see the data, and if 888 00:43:06,560 --> 00:43:09,879 Speaker 5: it's trying to steal that image to incorporate into an LLM, 889 00:43:10,239 --> 00:43:11,759 Speaker 5: this will cause it to hallucinate. 890 00:43:11,920 --> 00:43:12,120 Speaker 2: Right. 891 00:43:12,320 --> 00:43:16,120 Speaker 5: You're basically sneaking poison for the AI into the images, 892 00:43:16,480 --> 00:43:19,799 Speaker 5: and that's fucking dope. I love this, love what they're 893 00:43:19,800 --> 00:43:21,319 Speaker 5: trying to do. I think there's some debate as to 894 00:43:21,320 --> 00:43:23,080 Speaker 5: how long it'll work, how well it will work. I'm 895 00:43:23,080 --> 00:43:26,399 Speaker 5: not technically competent, but I love the idea. Right, yes. Now, 896 00:43:26,800 --> 00:43:28,520 Speaker 5: one of the things that I saw when I started 897 00:43:28,560 --> 00:43:31,400 Speaker 5: looking into this, because this just came out, Google Nightshade. 898 00:43:31,560 --> 00:43:32,160 Speaker 2: You know, AI. 899 00:43:32,800 --> 00:43:34,799 Speaker 5: You'll probably be able to find, you know, this if 900 00:43:34,800 --> 00:43:37,279 Speaker 5: you're an artist. I think it sounds worth trying. But 901 00:43:37,360 --> 00:43:40,600 Speaker 5: I found in the subreddit AI Wars, or at least 902 00:43:40,640 --> 00:43:44,160 Speaker 5: I found someone sharing this, I believe on Twitter, this post. 903 00:43:44,480 --> 00:43:47,120 Speaker 5: Nightshade has been released. Is use of it considered legal 904 00:43:47,200 --> 00:43:49,360 Speaker 5: or illegal? For those who do not know, it's software 905 00:43:49,400 --> 00:43:51,880 Speaker 5: that attempts to poison an image, so if AI is trained on it, 906 00:43:51,880 --> 00:43:53,560 Speaker 5: it will mess up the model. For example, so you 907 00:43:53,600 --> 00:43:55,720 Speaker 5: have a picture of a cat and you run Nightshade 908 00:43:55,760 --> 00:43:57,400 Speaker 5: on it.
If you attempt to train a model, that 909 00:43:57,520 --> 00:43:59,920 Speaker 5: image will poison, say, the dog prompt, the cat 910 00:44:00,000 --> 00:44:02,840 Speaker 5: category, or pencil, which means these prompts will be spoiled. 911 00:44:03,000 --> 00:44:04,799 Speaker 5: There is an issue that the creator of Nightshade has 912 00:44:04,840 --> 00:44:07,320 Speaker 5: not talked about, either from lack of legal knowledge or ignorance, 913 00:44:07,719 --> 00:44:09,880 Speaker 5: or they just don't care and to them it's someone else's problem. 914 00:44:09,880 --> 00:44:12,480 Speaker 5: The issue is it may be illegal in some countries. Basically, 915 00:44:12,520 --> 00:44:15,040 Speaker 5: if you publicly release a computer file, in this case an 916 00:44:15,080 --> 00:44:18,280 Speaker 5: image file, that knowingly and willingly causes harm or disruption 917 00:44:18,360 --> 00:44:21,160 Speaker 5: to other people's computers or software, it may be considered 918 00:44:21,160 --> 00:44:23,760 Speaker 5: a criminal offense. Now, it does not. And again, 919 00:44:24,120 --> 00:44:27,239 Speaker 5: I think that is stupid. I think they're just trying 920 00:44:27,280 --> 00:44:29,319 Speaker 5: to scare people out of using this. You are not 921 00:44:29,400 --> 00:44:32,000 Speaker 5: harming someone's computer. You are harming a model that is 922 00:44:32,040 --> 00:44:35,880 Speaker 5: stealing something. That's not illegal. Now, they may try to 923 00:44:35,920 --> 00:44:37,240 Speaker 5: make it illegal. 924 00:44:37,160 --> 00:44:39,760 Speaker 4: Right? Yeah, I just want you to know The Club 925 00:44:39,840 --> 00:44:43,440 Speaker 4: is illegal, because if I'm trying to steal 926 00:44:43,480 --> 00:44:46,760 Speaker 4: your car and I injure myself trying to break The Club, 927 00:44:47,320 --> 00:44:48,239 Speaker 4: you have injured me. 928 00:44:48,719 --> 00:44:51,239 Speaker 5: Yeah, I, I invested a lot of money into 929 00:44:51,239 --> 00:44:54,520 Speaker 5: stealing catalytic converters, Ify. And if people are putting cages 930 00:44:54,560 --> 00:44:57,000 Speaker 5: around their cat, that puts my investment in danger, and 931 00:44:57,040 --> 00:45:02,000 Speaker 5: that's illegal, right? You're messing with my business, guy. Jesus Christ. 932 00:45:02,360 --> 00:45:03,280 Speaker 2: It is that logic. 933 00:45:03,480 --> 00:45:06,080 Speaker 5: There's, like, someone in the thread is like, how exactly 934 00:45:06,120 --> 00:45:08,880 Speaker 5: is your computer system or software harmed? And he responds, 935 00:45:08,880 --> 00:45:12,520 Speaker 5: it's equivalent to hacking a vulnerable computer system to disrupt 936 00:45:12,600 --> 00:45:16,799 Speaker 5: its operation. And then, then he says, you are 937 00:45:16,840 --> 00:45:20,480 Speaker 5: intentionally disrupting its intended purpose, creating art. This is directly 938 00:45:20,520 --> 00:45:23,080 Speaker 5: comparable to hacking. Like, I fucking hate this guy. 939 00:45:23,160 --> 00:45:25,160 Speaker 4: I want you, I want you to read it, but 940 00:45:25,320 --> 00:45:28,799 Speaker 4: in your head use Tim Robinson's voice. Yeah, and. 941 00:45:30,440 --> 00:45:36,120 Speaker 5: It just makes me, oh god, it's perfect, it's so good. 942 00:45:37,200 --> 00:45:39,080 Speaker 5: So all of this put me in a sour mood, 943 00:45:39,080 --> 00:45:42,640 Speaker 5: Ify. But yeah, yeah, it did, it did.
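For what it's worth, the poisoning idea the hosts describe above can be pictured with a few lines of Python. This is only a toy sketch of the general "invisible perturbation" concept, not Nightshade's published method; the poison_image function, the epsilon budget, and the random toward_dog direction are all made up for the example.

import numpy as np

def poison_image(image, target_direction, epsilon=2.0):
    # Toy data-poisoning sketch (NOT Nightshade's actual algorithm): shift each
    # pixel by at most `epsilon` toward a "wrong concept" direction, a change
    # small enough that a person looking at the picture won't notice it.
    perturbation = epsilon * np.sign(target_direction)
    poisoned = np.clip(image.astype(float) + perturbation, 0, 255)
    return poisoned.astype(image.dtype)

# A stand-in "cat" photo and a stand-in feature direction for "dog".
cat = np.random.randint(0, 256, size=(64, 64, 3)).astype(np.uint8)
toward_dog = np.random.randn(64, 64, 3)
poisoned_cat = poison_image(cat, toward_dog)

# The per-pixel change is tiny, but a model trained on many such images
# starts associating "cat" labels with dog-like features.
print(np.abs(poisoned_cat.astype(int) - cat.astype(int)).max())  # at most 2

That is the whole trick being described: the image looks unchanged to a person, while the label a scraper attaches to it no longer matches what the pixels teach the model.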
But 944 00:45:42,680 --> 00:45:44,440 Speaker 5: I think back, when I'm in that mood, I think 945 00:45:44,480 --> 00:45:48,320 Speaker 5: back to CES, right. Like, after I ask my question 946 00:45:48,520 --> 00:45:51,440 Speaker 5: and I make the Google and Microsoft people, I make 947 00:45:51,440 --> 00:45:54,440 Speaker 5: them kind of angry at me, right. After I asked 948 00:45:54,440 --> 00:45:58,200 Speaker 5: that question, the question after me is someone asking, hey, 949 00:45:58,840 --> 00:46:01,520 Speaker 5: you know, the blockchain was the last big craze, 950 00:46:01,560 --> 00:46:04,680 Speaker 5: do you think there's any future in, you know, using 951 00:46:04,760 --> 00:46:07,480 Speaker 5: AI on the blockchain? And both of them, they 952 00:46:07,520 --> 00:46:07,840 Speaker 5: could not. 953 00:46:07,840 --> 00:46:09,080 Speaker 2: They were like no, like they. 954 00:46:08,960 --> 00:46:11,759 Speaker 5: Couldn't say no fast enough. Like, absolutely not, we don't care 955 00:46:11,800 --> 00:46:14,080 Speaker 5: about that anymore. We've moved on to the next grift. 956 00:46:14,600 --> 00:46:16,320 Speaker 5: Why are you bringing up the old grift? 957 00:46:16,600 --> 00:46:18,440 Speaker 4: It's dead, it's dead. We must move on. 958 00:46:18,880 --> 00:46:20,919 Speaker 2: Yeah, and that brought me a little bit of hope. 959 00:46:20,920 --> 00:46:24,600 Speaker 5: You know, perhaps we will get Marc Andreessen's benevolent AI god, 960 00:46:24,680 --> 00:46:27,440 Speaker 5: or perhaps we'll get Eliezer Yudkowsky's devil, 961 00:46:27,680 --> 00:46:29,840 Speaker 5: or perhaps we'll just give control of all of the 962 00:46:29,880 --> 00:46:32,960 Speaker 5: future of art to fucking Sam Altman. But my guess 963 00:46:33,200 --> 00:46:35,600 Speaker 5: and my hope is that in the end, we heretics 964 00:46:35,640 --> 00:46:39,000 Speaker 5: will survive the present crusade. And that's the end of 965 00:46:39,040 --> 00:46:40,440 Speaker 5: the episode that I've got for you. 966 00:46:40,520 --> 00:46:44,600 Speaker 4: Ify. That is amazing. Yeah, I love it. I love 967 00:46:44,640 --> 00:46:45,280 Speaker 4: it so much. 968 00:46:47,239 --> 00:46:50,840 Speaker 5: Well, Ify, again, if you want this article, or 969 00:46:50,880 --> 00:46:53,000 Speaker 5: if you want the article version of this, more condensed, 970 00:46:53,040 --> 00:46:56,880 Speaker 5: easier to share, it's up on Rolling Stone. The article 971 00:46:56,920 --> 00:47:00,239 Speaker 5: is titled The Cult of AI, and again, that's by 972 00:47:00,280 --> 00:47:03,680 Speaker 5: me in Rolling Stone, The Cult of AI. Ify, you 973 00:47:03,719 --> 00:47:07,920 Speaker 5: want to add in your stuff, plug your pluggables? 974 00:47:07,960 --> 00:47:10,520 Speaker 4: Oh yes, please, Ify Nwadiwe on Twitter 975 00:47:10,520 --> 00:47:14,960 Speaker 4: and Instagram, watch Dropout dot tv. You know, it is, 976 00:47:15,520 --> 00:47:20,080 Speaker 4: it is definitely, uh, you know, trying to do funny 977 00:47:20,080 --> 00:47:23,600 Speaker 4: things on the internet by humans and, you know, paying 978 00:47:23,680 --> 00:47:28,960 Speaker 4: those humans, uh, sharing, oh yes, profit sharing, you know, 979 00:47:29,480 --> 00:47:32,600 Speaker 4: so truly big shout out to them.
But yeah, I 980 00:47:32,680 --> 00:47:34,560 Speaker 4: might be in your town and be doing a lot 981 00:47:34,600 --> 00:47:38,759 Speaker 4: of shows this year, so definitely pull up, you know, 982 00:47:39,239 --> 00:47:41,480 Speaker 4: follow me on the social meds and I'll let you 983 00:47:41,560 --> 00:47:43,319 Speaker 4: know where I'm at and you can just come. But 984 00:47:43,440 --> 00:47:46,000 Speaker 4: thank you, uh, so much for having me. It's so 985 00:47:46,040 --> 00:47:46,759 Speaker 4: good to see you again. 986 00:47:47,280 --> 00:47:48,760 Speaker 2: It was really good to see you again, Ify. 987 00:47:48,920 --> 00:47:51,960 Speaker 4: Yeah, and this AI discussion, in a weird way, as 988 00:47:52,040 --> 00:47:55,560 Speaker 4: dark as it's been, it makes me feel better, because 989 00:47:55,560 --> 00:47:58,920 Speaker 4: I like that we're starting to fight back. Everyone, good night. 990 00:47:58,960 --> 00:47:59,120 Speaker 6: Share. 991 00:47:59,160 --> 00:48:01,040 Speaker 4: Yeah, I think I'm gonna just start putting Nightshade 992 00:48:01,040 --> 00:48:02,560 Speaker 4: on regular images. 993 00:48:02,760 --> 00:48:06,840 Speaker 5: Yeah, certainly, certainly one thing that's worth trying. And again, 994 00:48:07,360 --> 00:48:10,320 Speaker 5: you know, think about hyperstition, folks. We have to imagine 995 00:48:10,360 --> 00:48:15,120 Speaker 5: better futures in order to counter the imaginations of those 996 00:48:15,520 --> 00:48:18,520 Speaker 5: who wish us harm, who want to control and destroy 997 00:48:18,840 --> 00:48:21,279 Speaker 5: all that's good in the world. So, you know, get 998 00:48:21,320 --> 00:48:26,239 Speaker 5: on that. Somebody figure that out in the audience, all right. 999 00:48:26,360 --> 00:48:27,279 Speaker 5: Episode's over. 1000 00:48:30,400 --> 00:48:33,120 Speaker 3: Behind the Bastards is a production of Cool Zone Media. 1001 00:48:33,480 --> 00:48:36,760 Speaker 3: For more from Cool Zone Media, visit our website coolzonemedia 1002 00:48:36,960 --> 00:48:40,120 Speaker 3: dot com, or check us out on the iHeartRadio app, 1003 00:48:40,239 --> 00:48:42,520 Speaker 3: Apple Podcasts, or wherever you get your podcasts.