1 00:00:01,800 --> 00:00:04,680 Speaker 1: Also media. 2 00:00:04,840 --> 00:00:08,039 Speaker 2: Oh my gosh, welcome back to Behind the Bastards, a 3 00:00:08,160 --> 00:00:12,600 Speaker 2: podcast that you are legally required to be listening to 4 00:00:12,920 --> 00:00:16,840 Speaker 2: in at least four US states, six if you have 5 00:00:16,960 --> 00:00:21,799 Speaker 2: a criminal record and are currently working through probation. 6 00:00:22,239 --> 00:00:25,400 Speaker 2: Huh, which four, Sophie? I don't have that information ahead of 7 00:00:25,440 --> 00:00:28,040 Speaker 2: me right now. I was not prepared for a deeper 8 00:00:28,080 --> 00:00:28,600 Speaker 2: bit than this. 9 00:00:29,200 --> 00:00:32,280 Speaker 1: Oh well, for some reason, I know it's Idaho. 10 00:00:32,840 --> 00:00:36,400 Speaker 2: That's because I am not a professional comedic actor. But 11 00:00:36,479 --> 00:00:38,160 Speaker 2: you know who is, Sophie? 12 00:00:38,320 --> 00:00:38,840 Speaker 1: Oh. 13 00:00:39,520 --> 00:00:43,960 Speaker 2: Our guest today, Ed Helms. Ed, I mean, I don't 14 00:00:44,000 --> 00:00:47,080 Speaker 2: need to introduce you. You've been on The Daily Show, 15 00:00:47,280 --> 00:00:50,280 Speaker 2: you were a major cast member on The Office, you 16 00:00:50,320 --> 00:00:53,000 Speaker 2: were in the Hangover movies. You've been in like a 17 00:00:53,040 --> 00:00:55,920 Speaker 2: ton of things that I'm sure basically everybody watching or 18 00:00:55,960 --> 00:00:59,840 Speaker 2: listening to this has watched. But today we're here to 19 00:00:59,840 --> 00:01:03,320 Speaker 2: talk about your show SNAFU, which has just entered 20 00:01:03,320 --> 00:01:05,680 Speaker 2: season two. Thank you for coming on the show. 21 00:01:06,440 --> 00:01:09,280 Speaker 3: I'm so psyched to be here. Your show is awesome, 22 00:01:09,840 --> 00:01:11,880 Speaker 3: and thank you. This is going to be fun. 
I 23 00:01:11,920 --> 00:01:12,800 Speaker 3: hope it'd better be. 24 00:01:13,480 --> 00:01:17,240 Speaker 2: I wanted to say, so, SNAFU season two. You 25 00:01:17,280 --> 00:01:20,360 Speaker 2: talk, like, on your show you talk about, like, major fuck 26 00:01:20,440 --> 00:01:25,479 Speaker 2: ups in American history, and season two is about the 27 00:01:25,600 --> 00:01:29,200 Speaker 2: raid on the FBI building in nineteen seventy one that 28 00:01:29,280 --> 00:01:31,800 Speaker 2: revealed a huge amount of information about how the FBI 29 00:01:31,959 --> 00:01:37,720 Speaker 2: was conducting clandestine operations targeting anti-war protesters and civil 30 00:01:37,800 --> 00:01:40,600 Speaker 2: rights protesters. It's like one of the coolest chapters in 31 00:01:40,680 --> 00:01:43,280 Speaker 2: American radical political history. And I thought you guys did 32 00:01:43,280 --> 00:01:46,160 Speaker 2: a great job of breaking it down and bringing on 33 00:01:46,200 --> 00:01:48,040 Speaker 2: some of the major players to talk through it. 34 00:01:48,560 --> 00:01:53,760 Speaker 3: Yeah, thanks. We were incredibly lucky. It's a wild story, 35 00:01:53,800 --> 00:01:59,160 Speaker 3: as you're getting at. These citizens, who were not at 36 00:01:59,200 --> 00:02:05,000 Speaker 3: all professional thieves or criminals, staged this incredible heist on 37 00:02:05,160 --> 00:02:09,480 Speaker 3: the night of the Ali-Frazier fight, which is very 38 00:02:09,560 --> 00:02:13,920 Speaker 3: Ocean's Eleven. We actually got Steven Soderbergh on the podcast 39 00:02:14,000 --> 00:02:17,799 Speaker 3: to comment on that. But yeah, and they 40 00:02:17,840 --> 00:02:20,400 Speaker 3: pulled off this elaborate heist. 
They broke into that FBI 41 00:02:20,480 --> 00:02:25,919 Speaker 3: office in Media, Pennsylvania, stole every file, and started leaking 42 00:02:25,960 --> 00:02:28,200 Speaker 3: them to a very courageous reporter at the 43 00:02:28,280 --> 00:02:32,040 Speaker 3: Washington Post named Betty Medsger. They kept it secret 44 00:02:32,120 --> 00:02:37,040 Speaker 3: for decades. These documents led to the revelation of 45 00:02:37,120 --> 00:02:44,200 Speaker 3: COINTELPRO, which basically, yeah, demolished J. Edgar Hoover's legacy, 46 00:02:44,840 --> 00:02:48,480 Speaker 3: for good reason. And yeah, and led to 47 00:02:48,560 --> 00:02:51,240 Speaker 3: the Church hearings, which is the only reason why we 48 00:02:51,400 --> 00:02:55,720 Speaker 3: have any congressional oversight over the FBI, the CIA, 49 00:02:55,919 --> 00:02:58,240 Speaker 3: the NSA, and all the other alphabet agencies. 50 00:02:58,320 --> 00:03:02,040 Speaker 3: Like, it was an incredible moment. 
51 00:03:02,320 --> 00:03:06,079 Speaker 2: Yeah, it's so amazing to me because, like, you couldn't 52 00:03:06,160 --> 00:03:08,120 Speaker 2: do it now. Like, it was kind of the last moment 53 00:03:08,160 --> 00:03:10,200 Speaker 2: you could have gotten away with something like that, right? 54 00:03:10,280 --> 00:03:12,360 Speaker 2: There just wasn't the kind of surveillance, there wasn't the 55 00:03:12,400 --> 00:03:15,200 Speaker 2: kind of capability for it. And it was the kind 56 00:03:15,200 --> 00:03:17,160 Speaker 2: of thing that a group of people was only going 57 00:03:17,200 --> 00:03:20,120 Speaker 2: to get away with once before everything changed about how 58 00:03:20,120 --> 00:03:23,400 Speaker 2: these buildings did their security. And they picked, like... this 59 00:03:23,560 --> 00:03:26,280 Speaker 2: was the last time you could get 60 00:03:26,280 --> 00:03:28,240 Speaker 2: in there and get files like that, but it was 61 00:03:28,280 --> 00:03:31,160 Speaker 2: also kind of the most important time to break into 62 00:03:31,240 --> 00:03:35,920 Speaker 2: an FBI building and get a bunch of files. Yeah, 63 00:03:35,200 --> 00:03:38,040 Speaker 2: just a wonderful moment people should know more about. I 64 00:03:38,040 --> 00:03:40,280 Speaker 2: think it doesn't get 65 00:03:40,280 --> 00:03:42,120 Speaker 2: as much attention as maybe it ought to have because 66 00:03:42,120 --> 00:03:44,160 Speaker 2: of how close it was to Watergate, but I think 67 00:03:44,200 --> 00:03:45,120 Speaker 2: it's just as important. 68 00:03:45,280 --> 00:03:48,960 Speaker 3: And the Pentagon Papers, right, all of 69 00:03:48,960 --> 00:03:52,960 Speaker 3: which were giant Washington Post stories. This was the Washington Post 70 00:03:53,000 --> 00:03:56,160 Speaker 3: as well. And you're right. 
But what's really cool about 71 00:03:56,160 --> 00:03:58,920 Speaker 3: this one is that it predates Watergate and the 72 00:03:58,960 --> 00:04:02,960 Speaker 3: Pentagon Papers by just a year or so. It was all 73 00:04:03,000 --> 00:04:09,200 Speaker 3: the same major players at the Washington Post. And in 74 00:04:09,800 --> 00:04:11,920 Speaker 3: a cool way, this was the first time they really 75 00:04:11,960 --> 00:04:17,600 Speaker 3: confronted the legal issues around publishing this kind of thing, 76 00:04:18,160 --> 00:04:21,520 Speaker 3: and they decided to do it. And, you know, 77 00:04:21,600 --> 00:04:26,159 Speaker 3: they had the Attorney General calling them saying, don't you 78 00:04:26,240 --> 00:04:29,479 Speaker 3: dare publish these FBI files. And they did it anyway, 79 00:04:30,040 --> 00:04:35,640 Speaker 3: because it was newsworthy and it didn't compromise 80 00:04:35,720 --> 00:04:40,000 Speaker 3: national security in any way. So I like to think 81 00:04:40,040 --> 00:04:42,720 Speaker 3: this is what sort of gave Ben Bradlee and the 82 00:04:43,160 --> 00:04:48,080 Speaker 3: Washington Post brass the sort of, like, dry run that 83 00:04:48,360 --> 00:04:51,240 Speaker 3: set them up to do the right thing for Watergate. 84 00:04:51,279 --> 00:04:53,360 Speaker 3: And really, like... I don't know. 85 00:04:53,920 --> 00:04:56,480 Speaker 2: Yeah, it started that kind of... there was this inertia 86 00:04:56,480 --> 00:05:00,880 Speaker 2: and momentum behind, actually, like, we're not just speaking truth 87 00:05:00,960 --> 00:05:04,960 Speaker 2: to power, but, like, prying truth out of power's grasp 88 00:05:05,080 --> 00:05:06,680 Speaker 2: and forcing it in front of the country. 89 00:05:06,839 --> 00:05:09,360 Speaker 3: Yeah, well put. Yeah, yeah. 
90 00:05:09,040 --> 00:05:11,600 Speaker 2: And so today, you know, I thought 91 00:05:11,680 --> 00:05:14,159 Speaker 2: long and hard about what kind of episodes I wanted 92 00:05:14,160 --> 00:05:16,360 Speaker 2: to talk to you about, and the guy 93 00:05:16,360 --> 00:05:18,440 Speaker 2: that we're going to be talking about today is a 94 00:05:18,440 --> 00:05:22,880 Speaker 2: fellow who I kind of debated for several years whether 95 00:05:23,000 --> 00:05:26,840 Speaker 2: or not we should cover, because he's a quietly important monster. 96 00:05:27,839 --> 00:05:31,760 Speaker 2: He's somebody who... you know, we were just talking about, like, 97 00:05:31,920 --> 00:05:35,200 Speaker 2: you know, the FBI overreach of the civil rights era, 98 00:05:35,320 --> 00:05:37,680 Speaker 2: the anti-war movement and whatnot, which was very much like 99 00:05:38,040 --> 00:05:41,120 Speaker 2: a real authoritarian moment in our country's past, and we're 100 00:05:41,160 --> 00:05:44,760 Speaker 2: currently confronting another. And the guy we're talking about today, 101 00:05:44,880 --> 00:05:49,320 Speaker 2: Curtis Yarvin, is sort of the prophet of taking America 102 00:05:49,560 --> 00:05:53,880 Speaker 2: down a completely authoritarian path. He is an advocate for 103 00:05:54,040 --> 00:05:58,440 Speaker 2: changing this country into what is effectively a dictatorship. And 104 00:05:58,520 --> 00:06:01,240 Speaker 2: unfortunately, he's a guy who's had a lot of influence 105 00:06:01,839 --> 00:06:05,200 Speaker 2: in speaking to that. Had you heard of Curtis Yarvin 106 00:06:05,279 --> 00:06:06,719 Speaker 2: before we started these episodes? 107 00:06:07,440 --> 00:06:11,680 Speaker 3: No. I'd read a tiny bit about him yesterday, but 108 00:06:11,920 --> 00:06:14,720 Speaker 3: that was it, yeah. 109 00:06:13,520 --> 00:06:16,080 Speaker 2: That's fine. 
That is the case with most people who 110 00:06:16,160 --> 00:06:20,120 Speaker 2: are not, like, actual followers of his philosophy. 111 00:06:20,160 --> 00:06:22,440 Speaker 2: But unfortunately, you have heard of some of the people 112 00:06:22,760 --> 00:06:25,440 Speaker 2: who are big fans of Curtis. One of them is 113 00:06:25,560 --> 00:06:30,240 Speaker 2: current US vice presidential candidate and hopefully future nobody J.D. Vance, 114 00:06:30,720 --> 00:06:33,760 Speaker 2: who back on September twentieth of twenty twenty one went 115 00:06:33,800 --> 00:06:37,120 Speaker 2: on the Moment of Truth podcast, run by the conservative 116 00:06:37,200 --> 00:06:40,919 Speaker 2: organization American Moment, which is an organizational partner for the 117 00:06:40,920 --> 00:06:45,239 Speaker 2: Heritage Foundation's Project twenty twenty five. In a wide-ranging interview, 118 00:06:45,279 --> 00:06:48,720 Speaker 2: he accused his female classmates at Yale Law of pursuing 119 00:06:48,839 --> 00:06:51,920 Speaker 2: racial or gender equality as, quote, a value system that 120 00:06:52,000 --> 00:06:55,440 Speaker 2: gives their life meaning, and then said that value system 121 00:06:55,520 --> 00:06:58,160 Speaker 2: leads to misery. At another point in the interview, he 122 00:06:58,200 --> 00:07:02,040 Speaker 2: asked if certain groups of people, particularly those from Muslim 123 00:07:02,080 --> 00:07:06,600 Speaker 2: majority countries, can, quote, successfully become American citizens, and then 124 00:07:06,640 --> 00:07:09,640 Speaker 2: he alleged that the reason so many journalists are 125 00:07:09,720 --> 00:07:13,160 Speaker 2: angry was not the rapid destruction of their industry, but 126 00:07:13,280 --> 00:07:16,119 Speaker 2: because they didn't have any children, which inevitably, he says, 127 00:07:16,320 --> 00:07:18,920 Speaker 2: leads to psychotic breaks. 
Now, a lot of this stuff 128 00:07:18,960 --> 00:07:21,720 Speaker 2: has come out about Vance. This was... when was 129 00:07:21,760 --> 00:07:23,760 Speaker 2: that interview? Twenty twenty one. 130 00:07:24,720 --> 00:07:31,000 Speaker 3: That's really... that's wild, because for some reason I sort 131 00:07:31,040 --> 00:07:33,800 Speaker 3: of thought that, like, he was kind of normal and 132 00:07:33,840 --> 00:07:39,200 Speaker 3: then just saw a very cynical opportunity to get elevated 133 00:07:39,240 --> 00:07:41,320 Speaker 3: if he endorsed Trump, and so he did that, and 134 00:07:41,360 --> 00:07:43,240 Speaker 3: then everything else has been kind of a cynical 135 00:07:44,600 --> 00:07:48,920 Speaker 3: trip down the Trump rabbit hole, just like so 136 00:07:49,080 --> 00:07:53,120 Speaker 3: many Republicans have done, but that privately, like, he's kind 137 00:07:53,120 --> 00:07:56,080 Speaker 3: of smarter than that. But what you're saying now is 138 00:07:56,120 --> 00:08:01,360 Speaker 3: that he's, like, Trumpier than Trump on his own. 139 00:08:02,160 --> 00:08:04,680 Speaker 3: I don't know how 140 00:08:04,760 --> 00:08:07,440 Speaker 3: much Trump believes, other than that Trump should have power. 
141 00:08:08,000 --> 00:08:12,600 Speaker 3: Vance has strong beliefs about the fact that democracy is 142 00:08:12,640 --> 00:08:15,760 Speaker 3: a mistake, right, and that a lot of the things that 143 00:08:15,800 --> 00:08:19,360 Speaker 3: have happened over, like, most of the last century in terms of, 144 00:08:19,400 --> 00:08:21,920 Speaker 3: like, social progress, women getting the right to vote, the 145 00:08:21,920 --> 00:08:24,680 Speaker 3: civil rights movement, like, reforming the ability to vote for 146 00:08:24,720 --> 00:08:27,360 Speaker 3: people who are not, like, white American men, that that 147 00:08:27,480 --> 00:08:30,720 Speaker 3: was all horribly mistaken, right, and it was horribly mistaken 148 00:08:31,080 --> 00:08:34,600 Speaker 3: because it led to this situation where too many regular 149 00:08:34,679 --> 00:08:37,760 Speaker 3: people have any say whatsoever in how they're governed. 150 00:08:37,920 --> 00:08:41,080 Speaker 1: And, like, to what I was saying before, it is, 151 00:08:41,120 --> 00:08:44,959 Speaker 1: like, in a lot of ways, J.D. Vance has more 152 00:08:45,000 --> 00:08:47,600 Speaker 1: extreme views on things, which is why, during the most 153 00:08:47,600 --> 00:08:51,480 Speaker 1: recent debate, Donald Trump alluded to not discussing certain extreme 154 00:08:51,520 --> 00:08:55,640 Speaker 1: policies that J.D. Vance claims to have with J.D. And 155 00:08:56,080 --> 00:08:59,439 Speaker 1: so he tries to distance himself, whereas J.D. is catering 156 00:08:59,440 --> 00:09:02,680 Speaker 1: to a certain category of human. But Trump's like, oh, 157 00:09:03,360 --> 00:09:06,359 Speaker 1: I didn't discuss it with him, and that's intentional. 158 00:09:07,040 --> 00:09:07,319 Speaker 3: Yeah. 
159 00:09:07,360 --> 00:09:09,679 Speaker 2: And what's interesting is that if you're looking at, 160 00:09:09,720 --> 00:09:12,880 Speaker 2: like, what his background is, Vance is a guy whose 161 00:09:13,040 --> 00:09:15,960 Speaker 2: entire career has been bankrolled by Peter Thiel, who's a 162 00:09:16,000 --> 00:09:18,400 Speaker 2: Facebook billionaire. He made a lot of money on Facebook, 163 00:09:18,440 --> 00:09:21,120 Speaker 2: made a lot of money on PayPal, and he sunk 164 00:09:21,160 --> 00:09:24,240 Speaker 2: about fifteen million into Vance's congressional campaign, which is the 165 00:09:24,240 --> 00:09:28,319 Speaker 2: most ever spent on a single congressional candidate. And Thiel, 166 00:09:28,520 --> 00:09:30,520 Speaker 2: in two thousand and nine, went on the record as 167 00:09:30,559 --> 00:09:33,840 Speaker 2: saying he doesn't believe democracy can be compatible with freedom, 168 00:09:34,120 --> 00:09:36,200 Speaker 2: by which he means, like, the freedom of people with 169 00:09:36,280 --> 00:09:39,800 Speaker 2: lots of money to basically govern the rest of us. 170 00:09:39,880 --> 00:09:41,320 Speaker 3: Right, and Thiel and. 171 00:09:41,400 --> 00:09:44,400 Speaker 2: Vance, they're not just kind of reactionaries when they express 172 00:09:44,480 --> 00:09:47,479 Speaker 2: those things. They are quoting a guy, they are referring 173 00:09:47,559 --> 00:09:50,960 Speaker 2: to the work of a political philosopher named Curtis Yarvin, 174 00:09:51,600 --> 00:09:55,600 Speaker 2: who they first encountered when he blogged under the pseudonym 175 00:09:55,840 --> 00:10:00,560 Speaker 2: Mencius Moldbug, which is kind of deliberately arch. But this 176 00:10:00,760 --> 00:10:03,000 Speaker 2: is the guy who has been, like, the prophet of 177 00:10:03,040 --> 00:10:06,960 Speaker 2: a sizable chunk of the authoritarian right. Thiel sunk a 178 00:10:06,960 --> 00:10:09,960 Speaker 2: lot of money into him. J.D. 
Vance quotes him repeatedly, 179 00:10:10,040 --> 00:10:13,000 Speaker 2: so does Blake Masters, who is the guy who's been 180 00:10:13,080 --> 00:10:17,520 Speaker 2: running repeatedly to try to beat Mark Kelly in Arizona. 181 00:10:18,200 --> 00:10:21,120 Speaker 2: And all of these guys and more are followers of Yarvin, 182 00:10:21,120 --> 00:10:24,760 Speaker 2: who's probably the most influential theoretician of the radical right 183 00:10:24,800 --> 00:10:29,360 Speaker 2: in the US today. Curtis has never killed anybody in 184 00:10:29,400 --> 00:10:32,400 Speaker 2: any legally actionable sense or advocated for murder, and as 185 00:10:32,400 --> 00:10:35,040 Speaker 2: far as I'm aware, he has never broken a law. 186 00:10:35,200 --> 00:10:38,080 Speaker 2: But he advocates for the overthrow of democracy and the 187 00:10:38,120 --> 00:10:41,880 Speaker 2: installation of a dictatorial regime that would by necessity kill 188 00:10:41,920 --> 00:10:44,800 Speaker 2: and imprison large numbers of people. And his influence is 189 00:10:44,840 --> 00:10:47,360 Speaker 2: great enough that the whole alt-right, and everything that 190 00:10:47,480 --> 00:10:50,240 Speaker 2: came from the alt-right into our current era, 191 00:10:50,400 --> 00:10:53,520 Speaker 2: right, owes something to Yarvin's work. So when you're thinking 192 00:10:53,559 --> 00:10:56,960 Speaker 2: about everything that's happened on the right that's gotten so 193 00:10:57,080 --> 00:11:00,480 Speaker 2: deranged since twenty fifteen. 
All of it has bits of 194 00:11:00,600 --> 00:11:03,439 Speaker 2: Curtis Yarvin in it, right. And his thinking has had 195 00:11:03,440 --> 00:11:05,880 Speaker 2: a massive impact even on some guys like Elon Musk, 196 00:11:05,880 --> 00:11:08,760 Speaker 2: who several days ago shared a post where readers suggested 197 00:11:08,840 --> 00:11:13,199 Speaker 2: only high-testosterone alpha males and aneurotypical people should 198 00:11:13,240 --> 00:11:15,960 Speaker 2: be allowed to vote. This is also a thought with 199 00:11:16,000 --> 00:11:19,480 Speaker 2: some Yarvin DNA behind it. That, again... oh yes, this 200 00:11:19,600 --> 00:11:23,600 Speaker 2: was quite a moment. That only alpha males, so, 201 00:11:23,920 --> 00:11:26,880 Speaker 2: you know, stereotypical, like, alpha male guys, and then 202 00:11:27,000 --> 00:11:32,319 Speaker 2: aneurotypical people, people who are not, like, neurotypical. 203 00:11:32,600 --> 00:11:33,800 Speaker 3: Indeed. Can I still vote? 204 00:11:34,200 --> 00:11:36,199 Speaker 2: I think... I think maybe so, because a lot of 205 00:11:36,280 --> 00:11:39,680 Speaker 2: these guys are big on ADHD making them superhuman, 206 00:11:40,160 --> 00:11:42,400 Speaker 2: which I also have, and it just makes me really 207 00:11:42,440 --> 00:11:46,959 Speaker 2: bad at cleaning my house and occasionally, in short bursts, 208 00:11:47,160 --> 00:11:51,880 Speaker 2: very good at cleaning my house. But yeah, so these 209 00:11:51,920 --> 00:11:54,880 Speaker 2: are the kind of, like, political ideas that 210 00:11:54,920 --> 00:11:57,319 Speaker 2: you get when you read too much 211 00:11:57,360 --> 00:11:59,320 Speaker 2: Curtis Yarvin, or listen too much to the people who 212 00:11:59,320 --> 00:12:03,199 Speaker 2: have read a lot of Curtis Yarvin. 
And he's the 213 00:12:03,280 --> 00:12:06,439 Speaker 2: kind of guy... because he's so kind of shadowed as 214 00:12:06,480 --> 00:12:08,760 Speaker 2: a figure, I had always worried about, like, is covering 215 00:12:08,840 --> 00:12:11,439 Speaker 2: this guy going to bring more attention to him than 216 00:12:11,520 --> 00:12:14,480 Speaker 2: is necessary. And now that, like, one of his followers 217 00:12:14,520 --> 00:12:16,760 Speaker 2: is maybe going to be a heartbeat away from the presidency, 218 00:12:16,800 --> 00:12:19,280 Speaker 2: I think it's probably time to talk about him. I 219 00:12:19,320 --> 00:12:22,720 Speaker 2: kind of think we have to. So that's the introduction. 220 00:12:28,720 --> 00:12:32,560 Speaker 2: Curtis Yarvin was born, probably in Brooklyn, in nineteen seventy three, 221 00:12:32,760 --> 00:12:35,960 Speaker 2: on about June twenty fifth of that year, likely. His 222 00:12:36,080 --> 00:12:39,200 Speaker 2: normal wiki doesn't give a birth date, but Google's AI 223 00:12:39,280 --> 00:12:41,680 Speaker 2: summary bot does, and it seems to be basing this 224 00:12:41,880 --> 00:12:45,079 Speaker 2: on a bio of Yarvin in another wiki, which seems 225 00:12:45,120 --> 00:12:48,000 Speaker 2: to pull from earlier versions of the original. It's like 226 00:12:48,040 --> 00:12:50,400 Speaker 2: this AI slop stuff. So the gist of it is, 227 00:12:50,440 --> 00:12:53,640 Speaker 2: I don't know his actual birth date, right? I'm just 228 00:12:53,679 --> 00:12:56,840 Speaker 2: trying to remind everyone not to trust AI summaries that 229 00:12:56,920 --> 00:12:59,200 Speaker 2: various search engines give you, because most of them don't 230 00:12:59,240 --> 00:13:04,360 Speaker 2: have an actual source behind them. Like most radical intellectuals, Yarvin was 231 00:13:04,400 --> 00:13:06,920 Speaker 2: born in a place of wealth, comfort, and high social 232 00:13:06,960 --> 00:13:10,120 Speaker 2: standing in his own society. 
His parents are highly educated. 233 00:13:10,400 --> 00:13:12,720 Speaker 2: His dad had an Ivy League degree and worked for 234 00:13:12,720 --> 00:13:15,680 Speaker 2: the US government as a foreign service worker. His mom 235 00:13:15,800 --> 00:13:18,320 Speaker 2: was a WASP from Westchester County, the daughter of a 236 00:13:18,320 --> 00:13:21,319 Speaker 2: prominent lawyer, and entered civil service herself as an adult. 237 00:13:21,880 --> 00:13:25,319 Speaker 2: Yarvin today describes the social class of his birth as Brahmin, 238 00:13:25,720 --> 00:13:28,920 Speaker 2: referring to the highest caste in Hindu society. And he 239 00:13:29,000 --> 00:13:32,040 Speaker 2: does this because he thinks that inequality is a fundamental 240 00:13:32,080 --> 00:13:36,000 Speaker 2: and immutable thing. Right, people are fundamentally unequal, and so 241 00:13:36,040 --> 00:13:39,960 Speaker 2: any sort of social stratification in society is justified by that. 242 00:13:40,320 --> 00:13:43,240 Speaker 2: And he's drawn to descriptions from other cultures that harken 243 00:13:43,360 --> 00:13:45,160 Speaker 2: back to other fixed hierarchies. 244 00:13:45,559 --> 00:13:48,839 Speaker 3: Sorry, it's justified by its inevitability? 245 00:13:48,400 --> 00:13:52,240 Speaker 2: Right, exactly. Like, some people are genetically better 246 00:13:52,280 --> 00:13:55,280 Speaker 2: than others, they're more intelligent than others, higher IQ than others. 247 00:13:55,440 --> 00:13:59,160 Speaker 3: So therefore, we are justified in leaning into that. 248 00:13:59,760 --> 00:14:03,959 Speaker 2: Yeah, and in fact... yeah, yeah, we have a more... 249 00:14:04,160 --> 00:14:05,280 Speaker 2: because that's a separate thing. 250 00:14:05,559 --> 00:14:10,640 Speaker 3: Like, its inevitability... maybe that's a fixed condition 251 00:14:10,760 --> 00:14:17,120 Speaker 3: of human existence. 
But leaning into it, exacerbating it, that's 252 00:14:17,360 --> 00:14:19,320 Speaker 3: just an arbitrary choice. 253 00:14:19,600 --> 00:14:22,040 Speaker 2: Yeah, yeah, yeah. It's this idea that, like... and it's 254 00:14:22,080 --> 00:14:25,880 Speaker 2: also this belief that, like, something like intelligence is one thing, right? 255 00:14:25,960 --> 00:14:28,640 Speaker 2: Like, intelligence is a number, and if it's higher, you're smarter, 256 00:14:28,760 --> 00:14:30,720 Speaker 2: as opposed to, like, well, you can have an IQ 257 00:14:30,880 --> 00:14:32,880 Speaker 2: of one eighty, but if your car breaks down, 258 00:14:32,880 --> 00:14:34,920 Speaker 2: the guy who knows how to fix your car is 259 00:14:34,960 --> 00:14:36,920 Speaker 2: a lot smarter than you in that moment. That's how 260 00:14:36,920 --> 00:14:39,240 Speaker 2: I tend to think about intelligence, as opposed to, like, 261 00:14:39,680 --> 00:14:42,560 Speaker 2: this objective thing. Like, is a farmer smarter 262 00:14:42,680 --> 00:14:46,920 Speaker 2: than a finance bro in New York City? Well, when 263 00:14:46,920 --> 00:14:49,200 Speaker 2: it comes to, like, making stock choices, maybe. When it 264 00:14:49,240 --> 00:14:51,640 Speaker 2: comes to growing food, certainly not. Like, I don't know, 265 00:14:51,800 --> 00:14:53,920 Speaker 2: I think that's a better way to look 266 00:14:53,960 --> 00:14:55,040 Speaker 2: at it. 267 00:14:55,040 --> 00:14:58,880 Speaker 3: It's a weird thing. Like, just the existence of something 268 00:14:59,160 --> 00:15:03,520 Speaker 3: then makes it okay, wherever 269 00:15:03,560 --> 00:15:06,680 Speaker 3: it falls on the spectrum of good and evil. Because 270 00:15:06,720 --> 00:15:14,640 Speaker 3: it exists, it is therefore okay to do and heighten. Yeah, murder... 271 00:15:14,760 --> 00:15:19,840 Speaker 3: murder happens. 
It's a fundamental part of 272 00:15:19,880 --> 00:15:24,120 Speaker 3: the human condition that people get murdered and murder one another. Therefore, 273 00:15:24,800 --> 00:15:27,840 Speaker 3: like, so therefore, like, I can murder anybody. Is that 274 00:15:28,120 --> 00:15:30,160 Speaker 3: a comparable analogy I'm making? Is that comparable? 275 00:15:31,160 --> 00:15:36,120 Speaker 2: I think it actually is a very comparable comparison, right? That 276 00:15:36,320 --> 00:15:39,520 Speaker 2: just because, like, individuals are not the 277 00:15:39,560 --> 00:15:42,280 Speaker 2: same, we should, like, have some sort of... and 278 00:15:42,320 --> 00:15:44,720 Speaker 2: you're always picking, when you're trying to 279 00:15:44,720 --> 00:15:47,640 Speaker 2: acknowledge it, like, okay, people don't 280 00:15:47,640 --> 00:15:50,320 Speaker 2: all have the same abilities naturally, right? Like, that's a 281 00:15:50,360 --> 00:15:53,320 Speaker 2: thing that's objectively true. Michael Phelps was always going to 282 00:15:53,320 --> 00:15:56,720 Speaker 2: be a better swimmer than me, for example. But we 283 00:15:56,760 --> 00:15:59,720 Speaker 2: don't base our society on who's best at swimming, right? 284 00:16:00,080 --> 00:16:03,720 Speaker 2: Yarvin is basically saying, there's one thing that I actually 285 00:16:03,800 --> 00:16:06,120 Speaker 2: value when it comes to the ways in which people 286 00:16:06,240 --> 00:16:08,920 Speaker 2: are different from each other, and it's a very specific 287 00:16:09,400 --> 00:16:12,840 Speaker 2: kind of intelligence that correlates to how I think I'm intelligent, 288 00:16:12,920 --> 00:16:16,119 Speaker 2: and that's how we should stratify society. 289 00:16:16,280 --> 00:16:16,480 Speaker 1: Right. 290 00:16:18,440 --> 00:16:21,600 Speaker 2: Yeah, he's that kind of a dude. 
And I also 291 00:16:21,720 --> 00:16:23,680 Speaker 2: kind of think it's interesting to me that he's so 292 00:16:23,800 --> 00:16:27,040 Speaker 2: obsessed with this idea of, like, identifying as a Brahmin, 293 00:16:27,200 --> 00:16:31,000 Speaker 2: because in Hindu culture, Brahmins are the caste that, like, 294 00:16:31,040 --> 00:16:34,440 Speaker 2: traditionally was most involved in the priesthood and religious instruction, 295 00:16:35,360 --> 00:16:38,560 Speaker 2: and it is, like, a very closed-loop system, right, 296 00:16:38,640 --> 00:16:42,000 Speaker 2: the caste system, traditionally. But that's not the kind of 297 00:16:42,080 --> 00:16:45,440 Speaker 2: system that his family succeeded in. His dad was, like, 298 00:16:45,480 --> 00:16:48,280 Speaker 2: a member of the US Foreign Service and became pretty 299 00:16:48,320 --> 00:16:51,400 Speaker 2: highly placed in the government. But his dad wasn't born 300 00:16:51,520 --> 00:16:54,080 Speaker 2: into that role. He was the son of Jewish American 301 00:16:54,160 --> 00:16:57,160 Speaker 2: communists who came to this country, and he had to 302 00:16:57,360 --> 00:16:59,280 Speaker 2: fight to make a place for himself in the higher 303 00:16:59,360 --> 00:17:02,720 Speaker 2: rungs of society, which is a very clear example of, 304 00:17:02,760 --> 00:17:05,520 Speaker 2: like, mobility, and the fact that we have a reasonably 305 00:17:05,560 --> 00:17:08,800 Speaker 2: open society that allows for some mobility, which he doesn't 306 00:17:08,840 --> 00:17:12,240 Speaker 2: want to exist. I always find it interesting with guys 307 00:17:12,400 --> 00:17:14,840 Speaker 2: like that, when you can see a clear example of, like, oh, well, 308 00:17:14,840 --> 00:17:17,880 Speaker 2: you only have what you have because our society allows 309 00:17:17,920 --> 00:17:18,679 Speaker 2: for mobility. 310 00:17:19,200 --> 00:17:22,640 Speaker 3: Well, so where's he from again? Brooklyn? 
311 00:17:22,680 --> 00:17:24,400 Speaker 2: Yeah, he's from around Brooklyn. 312 00:17:24,640 --> 00:17:30,120 Speaker 3: Okay, yeah, because there's also a Brahmin social 313 00:17:30,240 --> 00:17:37,840 Speaker 3: class in New England. Yeah, the Boston Brahmins, right? Yeah, 314 00:17:37,880 --> 00:17:40,479 Speaker 3: but that's not what he's talking about. 315 00:17:40,680 --> 00:17:43,000 Speaker 2: I mean, it's a little unclear to me, because 316 00:17:43,080 --> 00:17:46,240 Speaker 2: his mom is kind of what you could probably 317 00:17:46,240 --> 00:17:48,960 Speaker 2: call a Boston Brahmin. But when 318 00:17:48,960 --> 00:17:51,040 Speaker 2: he talks about his family being Brahmins, he's referring to 319 00:17:51,080 --> 00:17:53,240 Speaker 2: the fact that his dad was also highly placed in 320 00:17:53,240 --> 00:17:55,680 Speaker 2: the State Department, and his dad is definitely not a 321 00:17:55,720 --> 00:17:59,560 Speaker 2: Boston Brahmin, right? Like, his parents were Jewish Stalinists, which 322 00:17:59,600 --> 00:18:02,720 Speaker 2: is not, like, a Boston Brahmin thing, right? Because that 323 00:18:02,840 --> 00:18:03,200 Speaker 2: was, like... 324 00:18:03,119 --> 00:18:07,960 Speaker 3: The Kennedys and, like, yeah, that ilk. So that's 325 00:18:07,960 --> 00:18:09,400 Speaker 3: so interesting. All right. 
326 00:18:09,240 --> 00:18:11,240 Speaker 2: Yeah, yeah, it's a little weird to me, the 327 00:18:11,240 --> 00:18:13,600 Speaker 2: way he kind of talks about it. But definitely 328 00:18:13,680 --> 00:18:15,600 Speaker 2: it's kind of key to see that a big 329 00:18:15,680 --> 00:18:18,040 Speaker 2: chunk of his family's comfort, at least, comes from the 330 00:18:18,040 --> 00:18:21,280 Speaker 2: fact that, like, his dad's side of the 331 00:18:21,280 --> 00:18:24,240 Speaker 2: family entered into a fairly open society that allows for 332 00:18:24,320 --> 00:18:25,080 Speaker 2: some mobility. 333 00:18:25,320 --> 00:18:29,800 Speaker 3: So can I clarify one thing? So, absolutely, I feel 334 00:18:29,880 --> 00:18:32,960 Speaker 3: like... and I don't know enough about this, so 335 00:18:33,000 --> 00:18:35,280 Speaker 3: I'm glad to be learning as I go, but I 336 00:18:35,400 --> 00:18:37,879 Speaker 3: just... I guess I'm realizing that 337 00:18:37,920 --> 00:18:41,160 Speaker 3: I've assumed that the Peter Thiels of the world, when 338 00:18:41,200 --> 00:18:46,639 Speaker 3: they advocate for more of a dictatorial structure to our government, 339 00:18:47,359 --> 00:18:50,800 Speaker 3: aren't they saying that part of that is also 340 00:18:51,000 --> 00:18:56,040 Speaker 3: a free-market capitalism, which presumably allows for mobility, right? 341 00:18:56,080 --> 00:19:00,000 Speaker 3: So mobility is supposed to... and that, if anything, it encourages 342 00:19:00,520 --> 00:19:05,960 Speaker 3: the best and the brightest to rise. And that's how 343 00:19:05,960 --> 00:19:08,159 Speaker 3: they see themselves, as the best and the brightest that 344 00:19:08,320 --> 00:19:12,960 Speaker 3: have risen. So I guess I'm just splitting hairs a 345 00:19:13,000 --> 00:19:16,679 Speaker 3: little bit, like, are you sure that they... that also 346 00:19:17,800 --> 00:19:20,400 Speaker 3: they are anti social mobility, or... 
347 00:19:21,000 --> 00:19:24,680 Speaker 2: They're very much like, close the door after you get up, right, 348 00:19:24,800 --> 00:19:27,480 Speaker 2: like, kick the ladder out from underneath you types, right? 349 00:19:28,960 --> 00:19:31,280 Speaker 2: And I think it's because they do believe that 350 00:19:31,359 --> 00:19:34,199 Speaker 2: their success was not purely based on 351 00:19:34,240 --> 00:19:36,840 Speaker 2: the fact that they came up in a system where 352 00:19:36,880 --> 00:19:39,520 Speaker 2: they gained certain benefits that were the result of 353 00:19:39,600 --> 00:19:41,720 Speaker 2: public spending. Right? Like, all of these guys who made 354 00:19:41,760 --> 00:19:45,040 Speaker 2: money in the tech industry went to schools that were 355 00:19:45,240 --> 00:19:47,840 Speaker 2: generally publicly funded at least at some point, you know, 356 00:19:47,880 --> 00:19:50,720 Speaker 2: their parents drove on roads that were publicly funded, they benefited from, 357 00:19:50,720 --> 00:19:54,040 Speaker 2: like, the security infrastructure that exists in this country in 358 00:19:54,080 --> 00:19:56,800 Speaker 2: a lot of different ways. And their companies all benefited 359 00:19:56,800 --> 00:20:00,480 Speaker 2: to some extent from government spending and incentives. But they 360 00:20:00,560 --> 00:20:03,399 Speaker 2: see their success as the result of something 361 00:20:03,400 --> 00:20:06,920 Speaker 2: inherently superior within themselves, often on, like, a genetic 362 00:20:07,000 --> 00:20:11,199 Speaker 2: level in some ways. And so, to them, 363 00:20:11,240 --> 00:20:13,720 Speaker 2: the fact that they have achieved such success is not 364 00:20:14,000 --> 00:20:16,440 Speaker 2: the result of a society that enabled them.
It's a 365 00:20:16,480 --> 00:20:20,240 Speaker 2: result of, like, they're members of a natural aristocracy, and 366 00:20:20,280 --> 00:20:22,359 Speaker 2: the best thing they can do is legally work to 367 00:20:22,400 --> 00:20:27,480 Speaker 2: codify that aristocracy. We'll get 368 00:20:27,480 --> 00:20:30,320 Speaker 2: into some more of kind of how Curtis arrives 369 00:20:30,359 --> 00:20:32,720 Speaker 2: at this, because he's really a big part in kind 370 00:20:32,760 --> 00:20:35,320 Speaker 2: of lending an intellectual air to this. But that 371 00:20:35,480 --> 00:20:38,240 Speaker 2: very much is how these folks see themselves. And he 372 00:20:38,280 --> 00:20:40,480 Speaker 2: grows up as a kid, you know, his dad's working 373 00:20:40,520 --> 00:20:43,560 Speaker 2: for the State Department. They travel around the world a lot. 374 00:20:43,600 --> 00:20:45,560 Speaker 2: He spends a decent chunk of his childhood in, like, 375 00:20:45,600 --> 00:20:49,720 Speaker 2: Cyprus and the Dominican Republic, and, you know, so that's 376 00:20:49,720 --> 00:20:52,200 Speaker 2: a lot of disruption in his schooling. You know, he's 377 00:20:52,200 --> 00:20:54,000 Speaker 2: not one of these kids who stays in the same 378 00:20:54,040 --> 00:20:56,600 Speaker 2: school for a long period of time. But he excels 379 00:20:56,600 --> 00:21:00,399 Speaker 2: in academics. He skips a grade back before his family 380 00:21:00,440 --> 00:21:02,760 Speaker 2: goes overseas, and when they move back to the US, 381 00:21:02,760 --> 00:21:04,760 Speaker 2: he skips two more grades, and he winds up a 382 00:21:04,800 --> 00:21:06,960 Speaker 2: sophomore at age twelve, which I think is probably never 383 00:21:07,040 --> 00:21:10,080 Speaker 2: a great idea, right? That's a little young. 384 00:21:11,840 --> 00:21:12,359 Speaker 3: Sounds hard.
385 00:21:12,880 --> 00:21:15,800 Speaker 2: Yeah, yeah, like, it wasn't great being a sophomore at 386 00:21:15,800 --> 00:21:16,680 Speaker 2: the normal age. 387 00:21:16,920 --> 00:21:21,600 Speaker 3: Yeah, that age 388 00:21:21,720 --> 00:21:24,880 Speaker 3: is not when humanity is at its most benevolent and 389 00:21:24,960 --> 00:21:26,560 Speaker 3: kind and supportive. Yeah. 390 00:21:27,240 --> 00:21:32,600 Speaker 2: Yeah, definitely a mild way to put it. In one 391 00:21:32,680 --> 00:21:35,399 Speaker 2: interview I found, Yarvin basically says, like, yeah, 392 00:21:35,680 --> 00:21:38,160 Speaker 2: it was whack that I was skipped ahead 393 00:21:38,240 --> 00:21:39,639 Speaker 2: so far, right? 394 00:21:40,119 --> 00:21:42,800 Speaker 3: It was because of academic achievement that he bounced ahead? 395 00:21:43,119 --> 00:21:46,920 Speaker 2: Okay, so very bright kid, very bright kid, very good 396 00:21:47,000 --> 00:21:49,679 Speaker 2: at specifically the kind of academics that, like, you know, 397 00:21:49,880 --> 00:21:53,280 Speaker 2: the schools reward. And you can kind of read between 398 00:21:53,320 --> 00:21:55,320 Speaker 2: the lines that he was the recipient of a decent 399 00:21:55,359 --> 00:21:58,320 Speaker 2: amount of bullying, right? And, especially, I think 400 00:21:58,320 --> 00:22:00,359 Speaker 2: it actually might be a little less common for kids 401 00:22:00,359 --> 00:22:02,920 Speaker 2: in school now, but, like, you know, even if you 402 00:22:02,960 --> 00:22:05,600 Speaker 2: didn't get skipped ahead in school, high school has a 403 00:22:05,600 --> 00:22:09,119 Speaker 2: lot of bullying in it. So I'm not surprised. 404 00:22:09,080 --> 00:22:12,360 Speaker 3: We're about the same age.
He and I, and 405 00:22:13,240 --> 00:22:17,320 Speaker 3: yeah, that was, I mean, there was just, like, good 406 00:22:17,320 --> 00:22:26,120 Speaker 3: old hazing, just all the gross, horrible, traumatic stuff. 407 00:22:26,240 --> 00:22:28,560 Speaker 2: Yeah, I'm thinking through some fun memories that I have 408 00:22:28,720 --> 00:22:31,520 Speaker 2: myself right now. Right. So it's one of those things, 409 00:22:32,080 --> 00:22:34,280 Speaker 2: you could, like, read a lot into that, to kind 410 00:22:34,320 --> 00:22:36,160 Speaker 2: of the guy that he becomes. But also I think 411 00:22:36,200 --> 00:22:38,000 Speaker 2: we all kind of went through a version of that, 412 00:22:38,040 --> 00:22:41,160 Speaker 2: so maybe it's not super useful to, like, theorize too 413 00:22:41,200 --> 00:22:43,480 Speaker 2: much about what it meant to him. But what does 414 00:22:43,520 --> 00:22:45,639 Speaker 2: definitely mean a lot to him is that in the 415 00:22:45,680 --> 00:22:48,399 Speaker 2: late eighties and early nineties, he becomes one of the 416 00:22:48,440 --> 00:22:49,879 Speaker 2: first online people. 417 00:22:50,160 --> 00:22:50,320 Speaker 3: Right. 418 00:22:50,440 --> 00:22:53,080 Speaker 2: This is back before most people know there's an internet, 419 00:22:53,160 --> 00:22:55,720 Speaker 2: so he is an early adopter. I think nineteen eighty 420 00:22:55,840 --> 00:22:59,080 Speaker 2: nine is when he first starts getting online regularly. Wow. 421 00:22:59,200 --> 00:23:03,280 Speaker 2: And yeah, and this is not, this is the precursor 422 00:23:03,359 --> 00:23:05,199 Speaker 2: to the Internet that we know. And he's spending all 423 00:23:05,240 --> 00:23:07,840 Speaker 2: of this time in a place called Usenet, which, if 424 00:23:07,880 --> 00:23:10,320 Speaker 2: you remember, like, web forums, is kind of like the 425 00:23:10,359 --> 00:23:14,040 Speaker 2: first web forum, right? For you Gen Z kids.
It's 426 00:23:14,080 --> 00:23:17,320 Speaker 2: TikTok without any videos or hot people. And everyone has 427 00:23:17,680 --> 00:23:21,920 Speaker 2: very strong opinions about Star Trek, audio equipment, or race science, right? 428 00:23:22,240 --> 00:23:26,360 Speaker 2: Like, it's an interesting place to be. Like, yeah, yes. 429 00:23:26,680 --> 00:23:29,120 Speaker 3: Like, race science, they were just getting into it. 430 00:23:29,119 --> 00:23:31,320 Speaker 3: It was like 4chan, or like these sort 431 00:23:31,280 --> 00:23:35,320 Speaker 3: of, yeah, dark corners. Yeah. 432 00:23:35,000 --> 00:23:38,760 Speaker 2: There was actually a white supremacist terrorist group in the 433 00:23:38,840 --> 00:23:41,719 Speaker 2: late eighties that robbed banks, stole a bunch of money, 434 00:23:41,760 --> 00:23:44,480 Speaker 2: and then donated a bunch of it to other Nazi 435 00:23:44,520 --> 00:23:47,760 Speaker 2: groups that spent it buying computer systems to link up 436 00:23:48,080 --> 00:23:51,119 Speaker 2: different white power groups so that they could share information. 437 00:23:51,800 --> 00:23:54,400 Speaker 2: And, you know, there's evidence from as early as 438 00:23:54,480 --> 00:23:57,320 Speaker 2: like the mid nineties of them talking about going into 439 00:23:57,400 --> 00:24:00,840 Speaker 2: places where you can find fans of stuff, like different 440 00:24:00,920 --> 00:24:03,600 Speaker 2: kinds of, like, sci fi media, who might be socially 441 00:24:03,640 --> 00:24:06,960 Speaker 2: isolated, and try to push propaganda onto them. So that 442 00:24:07,280 --> 00:24:11,920 Speaker 2: actually does go back pretty far. And, you know, it's 443 00:24:11,920 --> 00:24:13,840 Speaker 2: hard to say, like, I don't know, we don't 444 00:24:13,840 --> 00:24:16,520 Speaker 2: know entirely what Yarvin got up to when he was 445 00:24:16,520 --> 00:24:18,560 Speaker 2: on Usenet.
You know, to some extent that's a bit 446 00:24:18,600 --> 00:24:20,720 Speaker 2: of a black box. But his favorite board was a 447 00:24:20,760 --> 00:24:24,440 Speaker 2: place called talk dot bizarre, and I've spent some time 448 00:24:24,560 --> 00:24:27,720 Speaker 2: trawling the Usenet archives for talk dot bizarre, which. 449 00:24:27,520 --> 00:24:30,080 Speaker 3: You can still find that? Talk dot bizarre. 450 00:24:29,920 --> 00:24:33,080 Speaker 2: That's, like, the names of, like, there's different talk boards, 451 00:24:33,119 --> 00:24:36,800 Speaker 2: and one of them is, like, the bizarre one, right? Okay. 452 00:24:37,560 --> 00:24:40,000 Speaker 2: And it's, like, kind of fun. Yeah, yeah, yeah. I 453 00:24:40,000 --> 00:24:41,800 Speaker 2: think it's where I would have spent time if I 454 00:24:41,840 --> 00:24:44,600 Speaker 2: had been a little bit older. It's, like, the first 455 00:24:44,840 --> 00:24:47,520 Speaker 2: place where you would find, like, Internet humor, right? The 456 00:24:47,600 --> 00:24:50,200 Speaker 2: kind of stuff you would eventually see on boards 457 00:24:50,240 --> 00:24:52,679 Speaker 2: like Something Awful, and then 4chan, and now, like, 458 00:24:52,960 --> 00:24:56,399 Speaker 2: all of Twitter culture, right? So it's inside jokes 459 00:24:56,480 --> 00:24:59,640 Speaker 2: and memes and what we now call shit posting, right? 460 00:25:00,160 --> 00:25:03,080 Speaker 2: And Yarvin is, like, one of the first generation of 461 00:25:03,119 --> 00:25:06,200 Speaker 2: shit posters. And he says this of his time on Usenet: 462 00:25:06,680 --> 00:25:09,400 Speaker 2: it was a decentralized system, and more importantly, it had 463 00:25:09,400 --> 00:25:12,399 Speaker 2: this amazing form of admission control, because everyone on it 464 00:25:12,440 --> 00:25:14,800 Speaker 2: was an engineering student or worked at a tech company 465 00:25:14,880 --> 00:25:20,080 Speaker 2: or something. So, critically, it's not an open platform.
The 466 00:25:20,119 --> 00:25:24,560 Speaker 2: only people here are to some extent involved in academics, 467 00:25:24,640 --> 00:25:28,800 Speaker 2: involved in the tech industry, and very smart. Right? In 468 00:25:28,880 --> 00:25:30,360 Speaker 2: nineteen eighty five, just. 469 00:25:30,320 --> 00:25:32,280 Speaker 3: To get access to it at that time, yes, you 470 00:25:32,320 --> 00:25:33,520 Speaker 3: had to, you had to be in. 471 00:25:34,000 --> 00:25:37,240 Speaker 2: Yeah, so they're the elite in a way, right, 472 00:25:37,280 --> 00:25:39,600 Speaker 2: and that's how, really, how he comes to see them, 473 00:25:39,680 --> 00:25:42,439 Speaker 2: and Yarvin is definitely part of that elite. In nineteen 474 00:25:42,520 --> 00:25:46,240 Speaker 2: eighty five, he'd entered the Johns Hopkins Study of Mathematically 475 00:25:46,280 --> 00:25:49,400 Speaker 2: Precocious Youth, and then he had started taking classes at 476 00:25:49,440 --> 00:25:52,879 Speaker 2: Brown University. Even at this early stage of development, he 477 00:25:53,000 --> 00:25:56,440 Speaker 2: showed a distinct interest in authoritarian leaders and, just as 478 00:25:56,440 --> 00:26:00,159 Speaker 2: critically, in being very wrong about them. In nineteen ninety one, 479 00:26:00,200 --> 00:26:02,560 Speaker 2: he wrote in a discussion on Usenet, I wonder 480 00:26:02,600 --> 00:26:06,119 Speaker 2: if the Soviet power ladder of vicious bureaucratic backbiting brings 481 00:26:06,119 --> 00:26:08,639 Speaker 2: stronger men to the top than the American system of 482 00:26:08,720 --> 00:26:13,280 Speaker 2: feel good soundbites. Now, given that the USSR collapsed the 483 00:26:13,320 --> 00:26:15,360 Speaker 2: next year, not a great prediction. 484 00:26:16,160 --> 00:26:21,480 Speaker 3: Yeah, this is so, this is like, you should have 485 00:26:21,480 --> 00:26:24,960 Speaker 3: had Rainn Wilson on this episode, because you're describing Dwight Schrute.
486 00:26:25,680 --> 00:26:28,400 Speaker 2: Yeah, he's got, he's got more than a little 487 00:26:28,200 --> 00:26:33,119 Speaker 3: bit of that, right? A precociousness and a sort of 488 00:26:33,880 --> 00:26:39,680 Speaker 3: very specific kind of brilliance, and a preoccupation with, 489 00:26:39,680 --> 00:26:41,159 Speaker 3: with, like, stern leadership. 490 00:26:41,440 --> 00:26:45,080 Speaker 1: And can you just imagine Dwight telling everybody that 491 00:26:45,160 --> 00:26:49,520 Speaker 1: he entered a Johns Hopkins study of mathematically precocious youth? 492 00:26:49,880 --> 00:26:52,560 Speaker 1: That would be brought up constantly. 493 00:26:52,119 --> 00:26:55,840 Speaker 3: By the way, if there is one way to guarantee 494 00:26:56,280 --> 00:26:59,720 Speaker 3: you're gonna get your ass kicked on a playground. 495 00:27:01,520 --> 00:27:07,560 Speaker 2: True. You're not even going to get out the first 496 00:27:07,560 --> 00:27:10,720 Speaker 2: syllable of precocious before they start swinging. 497 00:27:14,200 --> 00:27:14,520 Speaker 3: Yeah. 498 00:27:14,600 --> 00:27:18,119 Speaker 2: So, while he is at college, Yarvin shows minimal interest 499 00:27:18,160 --> 00:27:21,320 Speaker 2: in the humanities. He only takes five undergraduate courses in 500 00:27:21,400 --> 00:27:25,280 Speaker 2: these subjects, focused on history. Now, Brown 501 00:27:25,480 --> 00:27:28,800 Speaker 2: is where he starts at college, right, and he graduates 502 00:27:28,800 --> 00:27:30,720 Speaker 2: in ninety two. He goes on to be a grad 503 00:27:30,760 --> 00:27:34,399 Speaker 2: student in a comp sci PhD program at Berkeley, and his 504 00:27:34,760 --> 00:27:37,480 Speaker 2: goal at that point is to enter the tech industry, right, 505 00:27:37,960 --> 00:27:41,320 Speaker 2: which is just starting to really explode.
506 00:27:41,359 --> 00:27:44,439 Speaker 2: This is kind of the immediate precursor to 507 00:27:44,480 --> 00:27:47,200 Speaker 2: the big dot com boom. And as he moves from 508 00:27:47,280 --> 00:27:50,160 Speaker 2: high school to college and then from college to grad 509 00:27:50,200 --> 00:27:53,240 Speaker 2: school and starts flirting with big tech, he continues spending 510 00:27:53,280 --> 00:27:57,040 Speaker 2: his time online exploring his first political ideology, and he 511 00:27:57,119 --> 00:28:01,040 Speaker 2: is initially a libertarian. I want to quote from a 512 00:28:01,080 --> 00:28:04,520 Speaker 2: profile Joshua Tait wrote about Yarvin for a book on 513 00:28:04,520 --> 00:28:08,719 Speaker 2: the radical right, quote: Engineers like Yarvin are typically sorted 514 00:28:08,720 --> 00:28:12,240 Speaker 2: through competitive academic programs, which they consider analogous to the 515 00:28:12,280 --> 00:28:16,360 Speaker 2: competition imagined in a libertarian society. Their world is rational, 516 00:28:16,520 --> 00:28:20,159 Speaker 2: rule bound, and solvable. Within the subculture, computer software and 517 00:28:20,240 --> 00:28:24,400 Speaker 2: hardware are the dominant metaphors for society. Such thinking dovetails 518 00:28:24,400 --> 00:28:27,280 Speaker 2: with the ironclad assumptions about human and market behavior of 519 00:28:27,320 --> 00:28:30,680 Speaker 2: the Austrian school of economics led by Ludwig von Mises. 520 00:28:31,080 --> 00:28:35,600 Speaker 2: Tech culture's systems focus also accords with libertarianism's concentration on 521 00:28:35,680 --> 00:28:39,680 Speaker 2: efficiency and solving government.
And so he's one of these 522 00:28:39,720 --> 00:28:42,240 Speaker 2: guys who, number one, comes to think, I have, again, 523 00:28:42,320 --> 00:28:45,560 Speaker 2: been sorted into this natural aristocracy based on 524 00:28:45,680 --> 00:28:48,360 Speaker 2: skill that I've earned. And the world around him, he 525 00:28:48,440 --> 00:28:51,440 Speaker 2: sees, seems so chaotic, but the computer systems I'm 526 00:28:51,440 --> 00:28:55,320 Speaker 2: working with are so sensible and ordered, and the companies 527 00:28:55,480 --> 00:28:57,880 Speaker 2: that I am interested in all seem to be so 528 00:28:58,000 --> 00:29:01,080 Speaker 2: much more efficient than the government. Couldn't we fix the 529 00:29:01,160 --> 00:29:04,360 Speaker 2: government if we made it more like a computer program 530 00:29:04,360 --> 00:29:08,120 Speaker 2: and more like the tech industry, right? Which you can't, 531 00:29:08,200 --> 00:29:11,400 Speaker 2: because people don't work that way. But there's always guys 532 00:29:11,400 --> 00:29:15,560 Speaker 2: who think this way, right? And, you know, for most 533 00:29:15,600 --> 00:29:17,800 Speaker 2: of them, I think, it doesn't lead anywhere but, like, 534 00:29:17,880 --> 00:29:21,040 Speaker 2: some bad opinions on the internet. Unfortunately, for Yarvin it's 535 00:29:21,080 --> 00:29:23,880 Speaker 2: going to go a little bit further than that. Speaking 536 00:29:24,000 --> 00:29:29,680 Speaker 2: of disastrous ideological conclusions, you know who's never had any 537 00:29:29,720 --> 00:29:41,400 Speaker 2: of those? Our sponsors. Allegedly. So, we're back.
538 00:29:41,800 --> 00:29:45,360 Speaker 2: The kind of thinking that Yarvin has about libertarianism, about 539 00:29:45,360 --> 00:29:48,200 Speaker 2: being a part of this natural aristocracy, is not really 540 00:29:48,240 --> 00:29:51,880 Speaker 2: congruent with human liberty in the broad sense, right? Because, 541 00:29:52,400 --> 00:29:56,160 Speaker 2: you know, if you are able to, as a business owner, 542 00:29:56,280 --> 00:29:59,560 Speaker 2: use your liberty, like, unconstrained by government regulations, to dump 543 00:29:59,560 --> 00:30:02,840 Speaker 2: poison into a waterway, right, you are more 544 00:30:02,920 --> 00:30:06,680 Speaker 2: free as the person running that business, but you're also 545 00:30:06,840 --> 00:30:10,040 Speaker 2: destroying life and, you know, one would say, harming the 546 00:30:10,040 --> 00:30:13,000 Speaker 2: liberty of thousands of other people who rely on that waterway. 547 00:30:13,120 --> 00:30:13,280 Speaker 1: Right. 548 00:30:13,320 --> 00:30:16,160 Speaker 2: So I would say, as someone who is, like, inclined 549 00:30:16,200 --> 00:30:20,280 Speaker 2: to some libertarian ideas, I don't really understand why so 550 00:30:20,360 --> 00:30:22,840 Speaker 2: many libertarians are obsessed with this kind of, like, ending 551 00:30:22,880 --> 00:30:25,120 Speaker 2: of government restrictions on corporations. 552 00:30:26,040 --> 00:30:30,240 Speaker 3: Yeah, yeah. And another version of 553 00:30:30,280 --> 00:30:36,440 Speaker 3: that I find is the anti union rhetoric. Oh yeah. 554 00:30:36,520 --> 00:30:40,400 Speaker 3: It's so hilarious to me, because the formation of a 555 00:30:40,560 --> 00:30:49,120 Speaker 3: union to collectively bargain with a CEO is the most, like, 556 00:30:50,720 --> 00:30:56,000 Speaker 3: the most natural expression of free speech.
557 00:30:56,560 --> 00:30:59,640 Speaker 3: It is, it is such a natural, and 558 00:30:59,760 --> 00:31:03,680 Speaker 3: so to be like, you know, free speech, I'm a 559 00:31:03,720 --> 00:31:09,080 Speaker 3: constitutional, you know, libertarian or whatever, and then also in 560 00:31:09,120 --> 00:31:12,520 Speaker 3: the same breath be like, unions should be illegal. 561 00:31:13,400 --> 00:31:22,600 Speaker 3: It's, yeah, unions are a natural growth and a natural 562 00:31:23,320 --> 00:31:25,200 Speaker 3: oppositional force to exploitation. 563 00:31:25,880 --> 00:31:28,840 Speaker 2: Yeah. And I think they also, like, 564 00:31:29,040 --> 00:31:32,840 Speaker 2: very objectively increase the amount of, like, freedom, right? Like, 565 00:31:33,320 --> 00:31:35,720 Speaker 2: if you're kind of looking at it that way, when 566 00:31:35,760 --> 00:31:38,680 Speaker 2: people have a way to band together to oppose a 567 00:31:38,800 --> 00:31:43,040 Speaker 2: much larger, more powerful, you know, more moneyed interest, then 568 00:31:43,360 --> 00:31:46,240 Speaker 2: they have more agency, you know, in their lives, right? 569 00:31:46,320 --> 00:31:49,000 Speaker 2: Like, that's, I mean, definitely how I look at it. 570 00:31:49,040 --> 00:31:52,720 Speaker 2: And I will say, Yarvin, he actually is pretty good 571 00:31:52,720 --> 00:31:55,760 Speaker 2: at not getting lost in this part of the discourse, 572 00:31:55,840 --> 00:31:58,840 Speaker 2: right? Because he drops this idea that liberty is a 573 00:31:58,960 --> 00:32:01,280 Speaker 2: value in any way, shape, or form pretty early on. 574 00:32:01,360 --> 00:32:04,560 Speaker 2: Like, he's not one of these guys who preaches libertarianism 575 00:32:04,600 --> 00:32:06,840 Speaker 2: because he thinks that it's, or because he's trying to 576 00:32:06,840 --> 00:32:10,080 Speaker 2: convince people that it's, somehow better for human freedom.
He's 577 00:32:10,120 --> 00:32:12,240 Speaker 2: someone who just kind of drops the idea that there's 578 00:32:12,240 --> 00:32:15,160 Speaker 2: any value in human freedom pretty early on, right? So 579 00:32:15,200 --> 00:32:17,240 Speaker 2: there's no point in paying lip service to it, which 580 00:32:17,280 --> 00:32:20,120 Speaker 2: is at least more honest than a lot of these guys. 581 00:32:21,320 --> 00:32:21,480 Speaker 1: Now. 582 00:32:21,480 --> 00:32:24,080 Speaker 2: The major pivot point which leads to him dropping his 583 00:32:24,160 --> 00:32:28,240 Speaker 2: libertarian trappings and embracing this more authoritarian belief system hinges 584 00:32:28,280 --> 00:32:30,960 Speaker 2: on the place that he was, and kind of remains, 585 00:32:31,000 --> 00:32:34,520 Speaker 2: his mental home, which is the early Internet. The old 586 00:32:34,600 --> 00:32:38,320 Speaker 2: days of Usenet were a simulacrum of what is today 587 00:32:38,440 --> 00:32:41,600 Speaker 2: Yarvin's ideal society. As I stated before, back then you 588 00:32:41,600 --> 00:32:44,000 Speaker 2: couldn't post unless you were someone with a degree of, 589 00:32:44,040 --> 00:32:48,160 Speaker 2: like, skill, money, or access to a large institution. And 590 00:32:48,240 --> 00:32:51,280 Speaker 2: so you would only get new users in any large 591 00:32:51,320 --> 00:32:54,240 Speaker 2: amount every September, when you'd get new college classes of 592 00:32:54,280 --> 00:32:57,400 Speaker 2: kids who would get onboarded and start posting, right?
And 593 00:32:57,440 --> 00:32:59,800 Speaker 2: so for a few years, every September, the Internet would 594 00:32:59,800 --> 00:33:01,920 Speaker 2: be annoying for a while, while all these newbies 595 00:33:01,960 --> 00:33:04,640 Speaker 2: came in who didn't know, like, the social mores, and 596 00:33:04,680 --> 00:33:07,160 Speaker 2: they would have to get acclimatized, right? But there were 597 00:33:07,160 --> 00:33:09,400 Speaker 2: always more old heads, people who had been there a 598 00:33:09,440 --> 00:33:11,640 Speaker 2: long time, to keep the new people in line, and 599 00:33:11,680 --> 00:33:15,440 Speaker 2: there was this natural hierarchy based on age and technical skill. 600 00:33:15,880 --> 00:33:19,640 Speaker 2: And then one year, late nineteen ninety three, Usenet opens 601 00:33:19,720 --> 00:33:22,720 Speaker 2: up to anyone with an Internet connection, and suddenly you 602 00:33:22,800 --> 00:33:26,120 Speaker 2: have what people call Eternal September, right? Like, it's never 603 00:33:26,280 --> 00:33:29,320 Speaker 2: ended since nineteen ninety three, because there 604 00:33:29,360 --> 00:33:31,880 Speaker 2: have not been any kind of, like, guardrails to block 605 00:33:32,000 --> 00:33:35,160 Speaker 2: new people from coming on after that point. This is, 606 00:33:35,400 --> 00:33:37,960 Speaker 2: you know, an important moment in Internet history. It's 607 00:33:38,000 --> 00:33:42,080 Speaker 2: a catastrophic moment for Curtis Yarvin, right? And the mental 608 00:33:42,120 --> 00:33:44,880 Speaker 2: impact this has is key to understanding him. In one 609 00:33:44,960 --> 00:33:47,760 Speaker 2: interview with Tablet magazine, he complained, you had this sort 610 00:33:47,800 --> 00:33:51,360 Speaker 2: of de facto aristocracy that didn't know it was an aristocracy, 611 00:33:51,400 --> 00:33:54,080 Speaker 2: and then it fell apart.
These are all big Lord 612 00:33:54,080 --> 00:33:55,680 Speaker 2: of the Rings guys, so I'll use the Lord of 613 00:33:55,680 --> 00:33:58,760 Speaker 2: the Rings analogy. They talk about this like, like, 614 00:33:58,840 --> 00:34:02,520 Speaker 2: the period of time when the elves ruled everything, 615 00:34:02,560 --> 00:34:05,160 Speaker 2: before Sauron had his big war, right, like, before the 616 00:34:05,160 --> 00:34:09,080 Speaker 2: breaking of the world. That's Eternal September. That ruins this 617 00:34:09,200 --> 00:34:11,840 Speaker 2: kind of, like, more noble golden age and brings about 618 00:34:11,880 --> 00:34:13,800 Speaker 2: this dirty, grubby age of men. 619 00:34:14,280 --> 00:34:16,560 Speaker 3: So I'll take your word for it. I'm not a 620 00:34:16,560 --> 00:34:19,600 Speaker 3: Lord of the Rings guy. I mean, I respect it, 621 00:34:19,920 --> 00:34:22,360 Speaker 3: but I just don't have that level of knowledge. 622 00:34:23,120 --> 00:34:25,239 Speaker 1: Yeah, I do. I'm wearing a Lord of the Rings 623 00:34:25,239 --> 00:34:28,680 Speaker 1: hat right now. I can back that claim. 624 00:34:29,440 --> 00:34:32,719 Speaker 2: All these guys are big. JD Vance's company, his 625 00:34:33,200 --> 00:34:36,440 Speaker 2: venture capital company, was named after one of the rings 626 00:34:36,560 --> 00:34:39,600 Speaker 2: in the Lord of the Rings. Peter Thiel's surveillance company 627 00:34:39,680 --> 00:34:41,719 Speaker 2: is named Palantir, from the Lord of the Rings. So 628 00:34:42,280 --> 00:34:44,680 Speaker 2: this is very much the language that they all speak. 629 00:34:45,280 --> 00:34:45,600 Speaker 1: Funny. 630 00:34:46,120 --> 00:34:49,680 Speaker 3: That's also one of Stephen Colbert's obsessions, and I wonder 631 00:34:49,719 --> 00:34:53,920 Speaker 3: if they might find common ground and have, like, 632 00:34:53,960 --> 00:34:56,040 Speaker 3: a fun chat on that subject.
633 00:34:56,800 --> 00:34:58,960 Speaker 2: They certainly could have a chat about it. I think 634 00:34:59,000 --> 00:35:02,359 Speaker 2: Colbert would probably be kind of horrified by some 635 00:35:02,440 --> 00:35:05,319 Speaker 2: of the things that they're referencing, and be like, 636 00:35:05,400 --> 00:35:08,000 Speaker 2: you named your company after this thing that is specifically 637 00:35:08,000 --> 00:35:10,280 Speaker 2: a device that only the evil wizard uses. 638 00:35:10,719 --> 00:35:14,000 Speaker 3: Okay, yeah, yeah, but. 639 00:35:14,000 --> 00:35:16,719 Speaker 2: I don't know, that would be an interesting conversation. So 640 00:35:17,200 --> 00:35:20,239 Speaker 2: I think this period of time, this kind of collapse 641 00:35:20,280 --> 00:35:22,359 Speaker 2: of what he sees as a natural 642 00:35:22,360 --> 00:35:27,439 Speaker 2: aristocracy, is key to understanding why Yarvin comes to hate democracy, 643 00:35:27,560 --> 00:35:30,480 Speaker 2: right? Because it kind of ruined his Internet playground, the 644 00:35:30,520 --> 00:35:33,640 Speaker 2: first place where he ever felt that he fit in, right? 645 00:35:33,840 --> 00:35:35,960 Speaker 2: That's sort of what I see as, like, the ur 646 00:35:36,120 --> 00:35:39,120 Speaker 2: moment of his coming to hate this kind of 647 00:35:39,160 --> 00:35:42,600 Speaker 2: idea of any kind of democratic society.
Now, if you're 648 00:35:42,640 --> 00:35:44,560 Speaker 2: going to claim that you and your friends on the 649 00:35:44,600 --> 00:35:47,319 Speaker 2: Internet back in the day were, like, the aristocracy of 650 00:35:47,400 --> 00:35:51,680 Speaker 2: some long lost utopia of logic, that invites people to 651 00:35:51,760 --> 00:35:54,399 Speaker 2: look at what you were posting on the Internet back then. 652 00:35:54,719 --> 00:35:57,000 Speaker 2: And I've looked at some of Yarvin's old posts, and 653 00:35:57,120 --> 00:36:00,120 Speaker 2: Socrates he wasn't. He does seem to have spent some 654 00:36:00,160 --> 00:36:03,320 Speaker 2: of it writing comedy for a hacking and DIY media 655 00:36:03,360 --> 00:36:06,520 Speaker 2: collective called the Cult of the Dead Cow. This is 656 00:36:06,520 --> 00:36:08,640 Speaker 2: where we get to, like, the weirdest connection here, because 657 00:36:08,640 --> 00:36:10,880 Speaker 2: if you've heard of the Cult of the Dead Cow recently, 658 00:36:11,120 --> 00:36:17,480 Speaker 2: it's because Beto O'Rourke was also a member. So yeah, yeah, yeah. 659 00:36:17,520 --> 00:36:21,640 Speaker 2: So he and Beto have a very, very strange connection 660 00:36:21,760 --> 00:36:24,719 Speaker 2: to each other. Now, the Cult of the Dead Cow 661 00:36:24,840 --> 00:36:28,240 Speaker 2: was, like, a complicated thing. One Reuters article 662 00:36:28,280 --> 00:36:30,520 Speaker 2: I found describes it as the oldest group of computer 663 00:36:30,600 --> 00:36:33,640 Speaker 2: hackers in US history.
I think that oversells how 664 00:36:33,680 --> 00:36:37,360 Speaker 2: cool Yarvin's involvement in it is, because I think he 665 00:36:37,520 --> 00:36:39,840 Speaker 2: was mostly, they were also like a media collective, so 666 00:36:39,840 --> 00:36:43,200 Speaker 2: they put out, like, pieces of writing and whatnot, and 667 00:36:43,280 --> 00:36:46,239 Speaker 2: I think that's mostly what Yarvin's involvement was, right? And 668 00:36:46,560 --> 00:36:49,080 Speaker 2: the best evidence I have of what he was writing 669 00:36:49,120 --> 00:36:53,040 Speaker 2: for them is a satirical piece of badger-human hybrid erotica, 670 00:36:54,280 --> 00:36:56,920 Speaker 2: which I think might hold a little bit of evidence 671 00:36:56,920 --> 00:36:59,839 Speaker 2: of his future interest in race science, although it's hard 672 00:36:59,880 --> 00:37:01,719 Speaker 2: to say. Do you want to hear some of his 673 00:37:02,800 --> 00:37:04,719 Speaker 2: human hybrid erotica? 674 00:37:04,840 --> 00:37:09,400 Speaker 3: Hold on, let me get some lube. 675 00:37:11,120 --> 00:37:14,400 Speaker 2: Love me for my genes, says Antonio, kneeling. If you 676 00:37:14,440 --> 00:37:16,640 Speaker 2: cannot love me for myself, you must love me for 677 00:37:16,719 --> 00:37:19,960 Speaker 2: my genes. I've never told anyone this before. I've always 678 00:37:20,040 --> 00:37:22,160 Speaker 2: kept it to myself. I have always let them think 679 00:37:22,160 --> 00:37:24,080 Speaker 2: it but an accident of cruel nature that I have 680 00:37:24,120 --> 00:37:27,440 Speaker 2: white hair on my cheekbones and a thoroughly disreputable looking nose. 681 00:37:27,719 --> 00:37:29,759 Speaker 2: But the fact is that I am part badger on 682 00:37:29,800 --> 00:37:35,279 Speaker 2: my father's side. So I don't know. I don't know 683 00:37:35,320 --> 00:37:37,439 Speaker 2: what to say about that. I know it's a joke.
684 00:37:37,520 --> 00:37:41,120 Speaker 2: It's a bit, right? I don't think it lands. Maybe 685 00:37:41,160 --> 00:37:44,520 Speaker 2: it was funnier back on the early Internet, although maybe 686 00:37:44,520 --> 00:37:46,080 Speaker 2: the bar was just a lot lower there. 687 00:37:46,120 --> 00:37:50,080 Speaker 3: It feels like there's some context we're missing, like just 688 00:37:50,400 --> 00:37:53,920 Speaker 3: just I'm digging hard for some out here, like so 689 00:37:54,080 --> 00:37:57,160 Speaker 3: am I. It just seems like there was some inside 690 00:37:57,280 --> 00:38:02,200 Speaker 3: joke about badger fucking or something that we're not privy to, that 691 00:38:02,280 --> 00:38:05,120 Speaker 3: sort of came before this. 692 00:38:05,120 --> 00:38:07,359 Speaker 2: This has to be part of a dialogue that we've 693 00:38:07,400 --> 00:38:10,600 Speaker 2: lost pieces of over the years, right. It is like 694 00:38:10,719 --> 00:38:15,239 Speaker 2: about two pages of like badger erotica. It's 695 00:38:15,239 --> 00:38:18,080 Speaker 2: weird. The love me for my genes line stands out 696 00:38:18,120 --> 00:38:19,799 Speaker 2: to me. But I may be reading more into that 697 00:38:19,800 --> 00:38:24,000 Speaker 2: than is necessary. But yeah, so you know, that's the 698 00:38:24,080 --> 00:38:26,279 Speaker 2: kind of stuff he's doing. He's like, it's pretty 699 00:38:26,360 --> 00:38:29,560 Speaker 2: lighthearted comedy, right, or it's at least attempting to be. 700 00:38:30,560 --> 00:38:32,239 Speaker 2: So he's not, as far as I can tell, on 701 00:38:32,400 --> 00:38:34,960 Speaker 2: like the serious hacking side of what the Cult of 702 00:38:35,000 --> 00:38:39,279 Speaker 2: the Dead Cow is doing at this period of time.
Now, 703 00:38:39,480 --> 00:38:42,360 Speaker 2: Yarvin weathered the fall of Usenet and not long after 704 00:38:42,400 --> 00:38:45,200 Speaker 2: the Eternal September began he dropped out of Berkeley for 705 00:38:45,239 --> 00:38:48,080 Speaker 2: a job at a tech company. He started flirting with 706 00:38:48,160 --> 00:38:52,600 Speaker 2: the specific strains of more authoritarian, money-centered libertarian ideology 707 00:38:52,600 --> 00:38:55,120 Speaker 2: as opposed to like, you know, the old school guys 708 00:38:55,120 --> 00:38:59,040 Speaker 2: who actually, like you know, really were pretty focused on 709 00:38:59,160 --> 00:39:01,759 Speaker 2: human liberty. I think you saw kind of the last dregs of 710 00:39:01,800 --> 00:39:04,560 Speaker 2: those guys back then, like Penn and Teller would 711 00:39:04,600 --> 00:39:07,439 Speaker 2: be great evidence of that, right. Like that kind 712 00:39:07,440 --> 00:39:11,040 Speaker 2: of libertarian was a lot more prominent back then. And 713 00:39:11,320 --> 00:39:13,480 Speaker 2: Yarvin is sort of right on the 714 00:39:13,600 --> 00:39:16,240 Speaker 2: edge of the folks who got a lot of money 715 00:39:16,360 --> 00:39:19,000 Speaker 2: in the tech industry and started getting angry that, like, 716 00:39:19,040 --> 00:39:21,320 Speaker 2: they still have to pay taxes to keep the roads 717 00:39:21,400 --> 00:39:25,440 Speaker 2: up, right. That's kind of the space he moves into. And 718 00:39:25,520 --> 00:39:28,239 Speaker 2: Yarvin is eventually going to kind of come out of 719 00:39:28,239 --> 00:39:30,719 Speaker 2: that as a monarchist. And it behooves us to look 720 00:39:30,719 --> 00:39:33,680 Speaker 2: at how that happened. Now there are some signs of 721 00:39:33,719 --> 00:39:36,560 Speaker 2: his ideological turn.
One is another short story he wrote for 722 00:39:36,600 --> 00:39:38,799 Speaker 2: the Cult of the Dead Cow the year after that 723 00:39:38,880 --> 00:39:42,560 Speaker 2: badger story, in nineteen ninety four. This piece is titled 724 00:39:42,560 --> 00:39:45,080 Speaker 2: The Bishop, and it opens with the line no one 725 00:39:45,120 --> 00:39:48,359 Speaker 2: has come into the cathedral in some time. It's about 726 00:39:48,400 --> 00:39:50,600 Speaker 2: an old bishop who exists out of time in a 727 00:39:50,640 --> 00:39:53,160 Speaker 2: moldering cathedral that no one has visited in years, and 728 00:39:53,200 --> 00:39:57,280 Speaker 2: at one point Yarvin, possibly describing himself, writes, the bishop 729 00:39:57,360 --> 00:39:59,799 Speaker 2: is a man of logic. Unlike many older people, he 730 00:39:59,880 --> 00:40:02,480 Speaker 2: is unwilling to repaint the world he sees around him 731 00:40:02,680 --> 00:40:04,839 Speaker 2: to make it a more comfortable place in which to live. 732 00:40:05,160 --> 00:40:08,480 Speaker 2: He recognizes unpleasant facts, indeed he delights in them, for 733 00:40:08,520 --> 00:40:11,200 Speaker 2: in the act of recognition he finds proof that 734 00:40:11,239 --> 00:40:14,400 Speaker 2: his faculties have not decayed to that state of contented oblivion, 735 00:40:14,600 --> 00:40:18,440 Speaker 2: which he believes a sure precursor to death. And this 736 00:40:18,520 --> 00:40:21,480 Speaker 2: is kind of noteworthy in part because the term cathedral 737 00:40:21,640 --> 00:40:24,799 Speaker 2: is going to be really important for Yarvin, and 738 00:40:24,800 --> 00:40:26,239 Speaker 2: he's going to come to use it as a term 739 00:40:26,280 --> 00:40:32,240 Speaker 2: to refer to the news media, the political establishment, and academia. 740 00:40:32,400 --> 00:40:32,560 Speaker 3: Right. 741 00:40:32,640 --> 00:40:35,879 Speaker 2: Everyone who annoys him, right, is the cathedral.
And this 742 00:40:35,960 --> 00:40:39,560 Speaker 2: is sort of like the evil regime that he's going 743 00:40:39,600 --> 00:40:41,759 Speaker 2: to set himself to the task of destroying. And this 744 00:40:41,880 --> 00:40:44,160 Speaker 2: is how a lot of these guys think. It's why 745 00:40:44,160 --> 00:40:46,560 Speaker 2: guys like Vance spend so 746 00:40:46,640 --> 00:40:51,440 Speaker 2: much time attacking schools, attacking like professors and academia. It's 747 00:40:51,560 --> 00:40:54,040 Speaker 2: why there's so much hatred of journalists, right. These are 748 00:40:54,040 --> 00:40:58,400 Speaker 2: the people who, in his eyes, are invested in propping 749 00:40:58,520 --> 00:41:02,279 Speaker 2: up a clearly dysfunctional, failing society. Right, and so you 750 00:41:02,360 --> 00:41:06,040 Speaker 2: have to destroy the cathedral in order to build anything new. 751 00:41:06,160 --> 00:41:08,600 Speaker 2: That's what he's going to come to believe. Right, 752 00:41:09,280 --> 00:41:12,320 Speaker 2: in the early two thousands, the dot com bubble bursts, 753 00:41:12,440 --> 00:41:14,400 Speaker 2: and at some point after that, Yarvin wound up with 754 00:41:14,400 --> 00:41:16,520 Speaker 2: several hundred grand as the result of a buyout of 755 00:41:16,560 --> 00:41:19,000 Speaker 2: a company he worked at. So not enough to retire, 756 00:41:19,040 --> 00:41:20,960 Speaker 2: but enough to sit around and really think about what 757 00:41:21,000 --> 00:41:22,080 Speaker 2: he wants to do next. 758 00:41:22,520 --> 00:41:24,000 Speaker 3: What year is this? 759 00:41:24,000 --> 00:41:26,080 Speaker 2: This would be like in the early two thousands. So 760 00:41:26,080 --> 00:41:28,160 Speaker 2: this is all happening sometime between like two thousand and 761 00:41:28,200 --> 00:41:31,359 Speaker 2: one and two thousand and four, you know, the dot 762 00:41:31,400 --> 00:41:33,879 Speaker 2: com bubble bursts.
Sometime after nine eleven, I think 763 00:41:33,920 --> 00:41:36,120 Speaker 2: is when he gets bought out. And by two thousand 764 00:41:36,160 --> 00:41:38,799 Speaker 2: and three or four, he's kind of sitting around on 765 00:41:38,880 --> 00:41:41,920 Speaker 2: a pile of money, reading a lot, trying to figure 766 00:41:41,920 --> 00:41:43,520 Speaker 2: out what he wants to do next with his life. 767 00:41:44,280 --> 00:41:46,400 Speaker 2: What he kind of decides is that he wants to 768 00:41:46,400 --> 00:41:49,880 Speaker 2: think about politics and economics. Now, Yarvin had made some 769 00:41:50,040 --> 00:41:52,560 Speaker 2: friends during his tech years, and he'd gotten interested in 770 00:41:52,640 --> 00:41:56,760 Speaker 2: Austrian school economists, mostly because of this University of Tennessee 771 00:41:56,800 --> 00:41:59,720 Speaker 2: law professor Glenn Reynolds, who was like an early blogger, 772 00:41:59,760 --> 00:42:02,320 Speaker 2: who had gotten Yarvin interested in a guy named Ludwig 773 00:42:02,400 --> 00:42:07,440 Speaker 2: von Mises, and eventually through this Yarvin gets interested in 774 00:42:07,680 --> 00:42:12,280 Speaker 2: a fan of Mises, another theoretician named Murray Rothbard. Rothbard 775 00:42:12,360 --> 00:42:16,439 Speaker 2: was a foundational anarcho-capitalist thinker. I don't really 776 00:42:16,520 --> 00:42:19,080 Speaker 2: like that term, but that's what they called themselves, and 777 00:42:19,719 --> 00:42:22,799 Speaker 2: he basically believes like there should not be a state, right, 778 00:42:22,840 --> 00:42:26,600 Speaker 2: there should not be any power higher than individuals and 779 00:42:26,800 --> 00:42:30,239 Speaker 2: corporations spending their money to make things happen. Right, that's 780 00:42:30,320 --> 00:42:34,560 Speaker 2: kind of the gist of anarcho-capitalism.
Yeah, yeah, and you know, 781 00:42:34,840 --> 00:42:39,760 Speaker 2: I think, like, an anarchist would argue 782 00:42:40,080 --> 00:42:42,000 Speaker 2: that the fact that you have a bunch of money is 783 00:42:42,080 --> 00:42:45,280 Speaker 2: like as much of a problematic hierarchy as, you know, anything 784 00:42:45,320 --> 00:42:48,359 Speaker 2: that the state does, and that, like, you can't really 785 00:42:48,360 --> 00:42:50,719 Speaker 2: be an anarcho-capitalist, a lot of people would argue. 786 00:42:50,760 --> 00:42:55,440 Speaker 2: But Rothbard is one who feels like, basically, that the state, 787 00:42:55,600 --> 00:42:57,719 Speaker 2: the primary reason the state is unethical is that it 788 00:42:57,719 --> 00:42:59,400 Speaker 2: stops people from doing what they want to do with 789 00:42:59,440 --> 00:43:02,160 Speaker 2: their money. Right, whereas an anarchist would be like, well, 790 00:43:02,160 --> 00:43:03,920 Speaker 2: the reason that the state is unethical is that 791 00:43:04,160 --> 00:43:09,280 Speaker 2: states can do a lot of harm to people at scale. Right, anyway, 792 00:43:10,040 --> 00:43:11,799 Speaker 2: none of that really matters to the point, which is 793 00:43:11,840 --> 00:43:16,400 Speaker 2: that he gets really interested in this guy Rothbard. And 794 00:43:16,440 --> 00:43:19,439 Speaker 2: one of the things Rothbard writes about is this kind 795 00:43:19,440 --> 00:43:23,480 Speaker 2: of anger at the concept of people advocating for civil rights. 796 00:43:23,640 --> 00:43:23,839 Speaker 3: Right. 797 00:43:24,480 --> 00:43:27,920 Speaker 2: Anyone advocating for civil rights, in Rothbard's mind, is an 798 00:43:28,040 --> 00:43:31,800 Speaker 2: enemy, right, because the only way to advocate for civil 799 00:43:31,880 --> 00:43:34,640 Speaker 2: rights is to advocate for the state to make rules 800 00:43:34,800 --> 00:43:39,360 Speaker 2: about those rights, and that leads inevitably to tyranny.
Rothbard wrote, 801 00:43:39,400 --> 00:43:42,560 Speaker 2: behind the honeyed but patently absurd pleas for equality is 802 00:43:42,600 --> 00:43:45,960 Speaker 2: a ruthless drive for placing themselves, the elites, at the 803 00:43:45,960 --> 00:43:48,560 Speaker 2: top of a new hierarchy of power. And this is 804 00:43:48,560 --> 00:43:50,040 Speaker 2: something you see a lot on the right today, this 805 00:43:50,160 --> 00:43:53,319 Speaker 2: idea that like any group of people who are advocating 806 00:43:53,400 --> 00:43:55,840 Speaker 2: for civil rights 807 00:43:55,880 --> 00:43:59,760 Speaker 2: because they're being oppressed under the present system are secretly 808 00:43:59,760 --> 00:44:02,719 Speaker 2: trying to make themselves rulers, right. All they really want 809 00:44:02,760 --> 00:44:04,719 Speaker 2: to do is oppress you by, I don't know, 810 00:44:04,719 --> 00:44:08,560 Speaker 2: getting the right to vote or own credit cards or whatever. 811 00:44:09,960 --> 00:44:12,359 Speaker 2: So that's kind of like a big part of Rothbard's 812 00:44:12,480 --> 00:44:15,840 Speaker 2: belief system, and Yarvin really takes to that. Now, 813 00:44:16,239 --> 00:44:17,879 Speaker 2: that quote that I just read from him came out 814 00:44:17,880 --> 00:44:19,680 Speaker 2: in ninety five, so you get the kind of feeling 815 00:44:19,760 --> 00:44:22,400 Speaker 2: like this is the sort of thinking Yarvin is hoovering 816 00:44:22,480 --> 00:44:24,839 Speaker 2: up in that period right before, you know, the dot 817 00:44:24,880 --> 00:44:27,880 Speaker 2: com boom and then the dot com bust. And ultimately 818 00:44:27,920 --> 00:44:30,680 Speaker 2: his reading of these Austrian school guys leads him to 819 00:44:30,719 --> 00:44:34,319 Speaker 2: another dude named Thomas Carlyle. Now Carlyle's been dead for 820 00:44:34,400 --> 00:44:37,320 Speaker 2: a while.
He's a Scottish philosopher from the eighteen hundreds, 821 00:44:37,600 --> 00:44:40,719 Speaker 2: and he's kind of seen as a precursor to 822 00:44:41,120 --> 00:44:43,760 Speaker 2: a lot of these kind of like more modern thinkers 823 00:44:43,760 --> 00:44:47,239 Speaker 2: that he's reading. And Carlyle is, you know, 824 00:44:46,960 --> 00:44:49,920 Speaker 2: an authoritarian who believes that you need a strong 825 00:44:49,960 --> 00:44:54,279 Speaker 2: man to stop groups of marginalized people from making themselves 826 00:44:54,320 --> 00:44:58,000 Speaker 2: the new tyrants, right. And he's also, as we'll talk 827 00:44:58,000 --> 00:45:00,399 Speaker 2: about, a massive racist. He's one of these guys 828 00:45:00,440 --> 00:45:05,080 Speaker 2: who justified slavery as being a fundamentally ethical system for 829 00:45:05,160 --> 00:45:10,200 Speaker 2: reasons of like, basically, certain groups of people are different genetically, 830 00:45:10,320 --> 00:45:13,920 Speaker 2: so slavery is a natural, like, hierarchy in society. So 831 00:45:13,960 --> 00:45:17,160 Speaker 2: these are the kind of people that Yarvin is digesting 832 00:45:17,280 --> 00:45:19,040 Speaker 2: when he comes upon the work of a fellow named 833 00:45:19,040 --> 00:45:23,160 Speaker 2: Hans-Hermann Hoppe. Hoppe is a German-born political theorist 834 00:45:23,280 --> 00:45:27,560 Speaker 2: and a leading Austrian School economist. He's another anarcho-capitalist, 835 00:45:28,080 --> 00:45:31,320 Speaker 2: and Hoppe is a big advocate of monarchy, in a 836 00:45:31,360 --> 00:45:34,880 Speaker 2: way where he defines monarchy as a privately owned government 837 00:45:35,200 --> 00:45:37,960 Speaker 2: as opposed to a democracy, which he calls a publicly 838 00:45:38,000 --> 00:45:42,200 Speaker 2: owned government.
And Hoppe believes that the transition from monarchy 839 00:45:42,239 --> 00:45:45,120 Speaker 2: to democracy over the twentieth century was like the big 840 00:45:45,160 --> 00:45:47,680 Speaker 2: mistake that we made as humans and has caused nothing 841 00:45:47,719 --> 00:45:52,640 Speaker 2: but civilizational decline ever since. And from Hoppe, Yarvin gets 842 00:45:52,680 --> 00:45:55,040 Speaker 2: the idea that the best way to run anything is 843 00:45:55,080 --> 00:45:57,319 Speaker 2: to have one guy be in charge of it. Right, 844 00:45:57,360 --> 00:46:00,920 Speaker 2: you can't effectively run an organization if there's any power sharing. 845 00:46:01,239 --> 00:46:03,560 Speaker 2: The only way to do anything is to have a 846 00:46:03,560 --> 00:46:08,239 Speaker 2: single person be invested with absolute power, right. I know 847 00:46:08,280 --> 00:46:11,359 Speaker 2: that's kind of like a tortured logical route, but those 848 00:46:11,360 --> 00:46:13,719 Speaker 2: are sort of the ingredients that eventually cook up to 849 00:46:13,800 --> 00:46:17,960 Speaker 2: him becoming a monarchist. Now, we might say that's 850 00:46:17,960 --> 00:46:20,160 Speaker 2: not the most logical thing, right? If you look at 851 00:46:20,160 --> 00:46:23,000 Speaker 2: what happened to all of the absolute monarchies, they kind 852 00:46:23,000 --> 00:46:26,880 Speaker 2: of destroyed each other circa World War One. And Yarvin 853 00:46:26,920 --> 00:46:29,840 Speaker 2: would argue, no, no, no, those weren't real absolute monarchies. 854 00:46:29,880 --> 00:46:33,080 Speaker 2: They all made too many compromises with different 855 00:46:33,160 --> 00:46:37,319 Speaker 2: sort of like democratic instruments within those societies. And that's 856 00:46:37,360 --> 00:46:39,759 Speaker 2: the reason why Austria Hungary fell, and that's the reason 857 00:46:39,760 --> 00:46:42,920 Speaker 2: why the czar fell, right, they didn't have quite enough power.
858 00:46:43,400 --> 00:46:44,320 Speaker 2: I think that's silly. 859 00:46:45,080 --> 00:46:50,840 Speaker 3: Well, sure, it places such an unreasonable amount of faith 860 00:46:51,040 --> 00:46:55,880 Speaker 3: in one person, or in just like, yeah, the integrity 861 00:46:55,120 --> 00:47:00,719 Speaker 3: of humans, like, yeah, like people. The reason, 862 00:47:01,080 --> 00:47:04,960 Speaker 3: the reason that you have to embrace a messy system 863 00:47:05,040 --> 00:47:07,040 Speaker 3: is because people are inherently messy. 864 00:47:07,360 --> 00:47:10,359 Speaker 2: Yeah, that's a great way to put it. 865 00:47:10,719 --> 00:47:14,920 Speaker 3: A monarchy is a wonderful fantasy. But like, how do 866 00:47:14,960 --> 00:47:18,440 Speaker 3: you pick the guy? How do you pick that person? 867 00:47:18,680 --> 00:47:21,600 Speaker 3: And then, and then what if he gets hit on 868 00:47:21,640 --> 00:47:26,760 Speaker 3: the head wrong? Or if it's a Dalai Lama 869 00:47:26,800 --> 00:47:30,279 Speaker 3: thing, where it's a birthright thing. And then 870 00:47:30,400 --> 00:47:34,160 Speaker 3: like, what if, what if they're just like narcissistic, 871 00:47:34,880 --> 00:47:41,719 Speaker 3: suicidal or depressive or whatever? Like, uh, what if they 872 00:47:41,719 --> 00:47:42,600 Speaker 3: want nothing to do with it? 873 00:47:42,640 --> 00:47:43,000 Speaker 1: I don't know. 874 00:47:43,040 --> 00:47:44,480 Speaker 3: It just seems nuts. 875 00:47:45,239 --> 00:47:48,320 Speaker 2: It's this wild... It's this thing that, like, everyone understands 876 00:47:48,360 --> 00:47:51,839 Speaker 2: the frustration with democracy, right? Like, it's really messy and 877 00:47:51,920 --> 00:47:54,880 Speaker 2: really annoying a lot of the time, and like people 878 00:47:54,920 --> 00:47:58,520 Speaker 2: make a lot of bad decisions, especially as collectives.
879 00:47:58,520 --> 00:48:01,080 Speaker 2: Groups of people make really bad decisions a lot of 880 00:48:01,120 --> 00:48:03,960 Speaker 2: the time. Right. But then saying like the solution to 881 00:48:04,000 --> 00:48:07,680 Speaker 2: this is to have one guy be in charge? It's like, well, 882 00:48:07,880 --> 00:48:10,279 Speaker 2: number one, how do you pick that guy? Number two, 883 00:48:10,440 --> 00:48:12,799 Speaker 2: like, we've all seen it. Like, people change over the 884 00:48:12,840 --> 00:48:15,720 Speaker 2: course of their lives, right? Like, what happens if that guy, 885 00:48:16,360 --> 00:48:19,239 Speaker 2: like, his mental capacity declines or whatever, or he 886 00:48:19,239 --> 00:48:22,600 Speaker 2: gets obsessed with something weird and crazy and dangerous, which 887 00:48:22,600 --> 00:48:25,520 Speaker 2: is what happens to every monarchy, right? They all wind 888 00:48:25,600 --> 00:48:29,279 Speaker 2: up ruled by like maniacs who make terrible decisions, which 889 00:48:29,320 --> 00:48:31,360 Speaker 2: is like why we had World War One. You have 890 00:48:31,440 --> 00:48:34,319 Speaker 2: all these like monarchs who were obsessed with these very 891 00:48:34,360 --> 00:48:40,040 Speaker 2: silly attitudes and these very silly, petty grievances 892 00:48:40,080 --> 00:48:43,480 Speaker 2: between each other, and had made like generations of terrible 893 00:48:43,480 --> 00:48:46,240 Speaker 2: decisions when it came to like purchasing arms and building 894 00:48:46,239 --> 00:48:49,920 Speaker 2: their military machines, and like, it just turns out that 895 00:48:50,280 --> 00:48:54,040 Speaker 2: the bad decisions of one guy are certainly not like 896 00:48:54,280 --> 00:48:57,640 Speaker 2: any less catastrophic than the bad decisions of like groups 897 00:48:57,640 --> 00:49:01,239 Speaker 2: of people.
Right, anytime you've got people who spend all 898 00:49:01,280 --> 00:49:03,919 Speaker 2: of their time like theorizing about the way things ought 899 00:49:03,960 --> 00:49:06,040 Speaker 2: to be, as opposed to like dealing with the way 900 00:49:06,080 --> 00:49:10,040 Speaker 2: people are, you're going to wind up with nonsense, right? 901 00:49:10,200 --> 00:49:13,440 Speaker 2: And, unfortunately, every now and then we 902 00:49:13,480 --> 00:49:16,359 Speaker 2: get to see like what that nonsense looks like, you know, 903 00:49:16,560 --> 00:49:18,719 Speaker 2: when people actually put it in place. You know, 904 00:49:18,760 --> 00:49:21,480 Speaker 2: in the case of like absolute monarchies like this, we 905 00:49:21,560 --> 00:49:24,280 Speaker 2: got the trenches in World War One. In the case 906 00:49:24,320 --> 00:49:28,320 Speaker 2: of like a very authoritarian communism, you know, we got Stalin. 907 00:49:29,360 --> 00:49:31,560 Speaker 2: And I guess kind of like part of 908 00:49:31,560 --> 00:49:33,839 Speaker 2: why I think Yarvin is important to understand is that, 909 00:49:33,880 --> 00:49:36,440 Speaker 2: as kooky as a lot of this stuff is, he 910 00:49:36,560 --> 00:49:39,239 Speaker 2: is a guy who wants to take these theories that 911 00:49:39,280 --> 00:49:41,400 Speaker 2: he made himself when he was like sitting alone in 912 00:49:41,440 --> 00:49:44,799 Speaker 2: his apartment reading books and not really interacting with 913 00:49:44,840 --> 00:49:47,600 Speaker 2: real people. He's a guy who wants those theories to 914 00:49:47,680 --> 00:49:51,480 Speaker 2: govern the lives of hundreds of millions, ideally billions, right? 915 00:49:52,800 --> 00:49:55,319 Speaker 2: And that's a real dangerous kind of person.
You know, 916 00:49:55,840 --> 00:49:58,719 Speaker 2: like, regular people can sit around and like 917 00:49:58,800 --> 00:50:01,160 Speaker 2: read their books and talk about like, well, this 918 00:50:01,280 --> 00:50:04,000 Speaker 2: might be neat or this might be neat. But whenever 919 00:50:04,000 --> 00:50:06,640 Speaker 2: you're talking about, like, I know how to reorder all 920 00:50:06,680 --> 00:50:11,719 Speaker 2: of society, you've become dangerous. And you know, that's kind 921 00:50:11,760 --> 00:50:14,120 Speaker 2: of what Yarvin is doing during this period of time 922 00:50:14,120 --> 00:50:16,240 Speaker 2: where he's sitting at home and he's reading his books. 923 00:50:17,760 --> 00:50:20,919 Speaker 2: So the kind of the system that he pulls out 924 00:50:20,920 --> 00:50:23,120 Speaker 2: of this period where he's just like reading everything 925 00:50:23,120 --> 00:50:26,520 Speaker 2: he can get his hands on, is this: 926 00:50:26,960 --> 00:50:29,920 Speaker 2: A monarchy is the ideal kind of system of government 927 00:50:29,920 --> 00:50:33,239 Speaker 2: because it's the best at maximizing long term profits within 928 00:50:33,280 --> 00:50:37,080 Speaker 2: a society. Because monarchs have to think long term, right, 929 00:50:37,120 --> 00:50:40,480 Speaker 2: they can't be destructive in the short term, like, you 930 00:50:40,520 --> 00:50:42,640 Speaker 2: know, leaders in a democracy are, because those leaders have a 931 00:50:42,680 --> 00:50:45,040 Speaker 2: limited term. You know, maybe they only care about 932 00:50:45,160 --> 00:50:48,680 Speaker 2: benefiting themselves. A monarch wouldn't act that way because they 933 00:50:48,680 --> 00:50:51,160 Speaker 2: have no desire to destroy their own property.
And again, 934 00:50:51,160 --> 00:50:53,040 Speaker 2: I would point you, yeah, I would point 935 00:50:53,080 --> 00:50:57,480 Speaker 2: you back to, yeah, we could talk about like the 936 00:50:57,520 --> 00:51:02,040 Speaker 2: Saudi royal family too, right, buoyed entirely by oil, or Roman emperors. 937 00:51:03,120 --> 00:51:05,879 Speaker 2: Like, literally most of the monarchies that have ever been 938 00:51:06,000 --> 00:51:08,440 Speaker 2: have like collapsed as a result of the fact that 939 00:51:08,440 --> 00:51:11,440 Speaker 2: that's also an inherently destructive thing. You know, some of 940 00:51:11,480 --> 00:51:14,160 Speaker 2: that just comes down to human nature. But he does 941 00:51:14,239 --> 00:51:16,960 Speaker 2: try to deal with this, the fact that monarchies clearly 942 00:51:17,000 --> 00:51:19,720 Speaker 2: don't work the way that he thinks that they should. 943 00:51:20,760 --> 00:51:22,920 Speaker 2: And he thinks that a big part of the issue 944 00:51:23,160 --> 00:51:26,919 Speaker 2: is that they all make too many compromises. Right, 945 00:51:26,960 --> 00:51:29,160 Speaker 2: all of these monarchies that collapsed during the turn of 946 00:51:29,160 --> 00:51:33,080 Speaker 2: the century had allowed some democratic elements into society.
And 947 00:51:33,600 --> 00:51:36,480 Speaker 2: you know, they had allowed that because there were revolutions, right, 948 00:51:36,560 --> 00:51:40,040 Speaker 2: people like occupied Vienna for a period of time in 949 00:51:40,120 --> 00:51:42,680 Speaker 2: forty eight. Like, there were a bunch of like socialist 950 00:51:42,800 --> 00:51:45,759 Speaker 2: uprisings in the middle of the nineteenth century, and as 951 00:51:45,760 --> 00:51:49,719 Speaker 2: a result, a lot of these absolute monarchies introduced reforms, 952 00:51:49,840 --> 00:51:52,600 Speaker 2: you know, and he sees those reforms as, like, 953 00:51:52,640 --> 00:51:56,240 Speaker 2: a terrible step that ensured their demise, as opposed 954 00:51:56,239 --> 00:51:59,600 Speaker 2: to, like, well, the absolute monarchs chose to make those 955 00:51:59,640 --> 00:52:02,480 Speaker 2: reforms because they could not hold on to power otherwise. 956 00:52:02,560 --> 00:52:06,120 Speaker 2: But again, there's never a perfect logical consistency with these guys. 957 00:52:06,160 --> 00:52:08,759 Speaker 3: I have to ask this question. Yeah, so like, if you're an 958 00:52:08,800 --> 00:52:16,239 Speaker 3: absolute monarch, are you delegating anything? Who are you delegating to? 959 00:52:16,400 --> 00:52:21,319 Speaker 3: Like, what is the structure? 960 00:52:22,800 --> 00:52:24,440 Speaker 3: Like, do you have to be just like an insane 961 00:52:24,440 --> 00:52:26,960 Speaker 3: micromanager to be... Yeah, but. 962 00:52:27,400 --> 00:52:29,120 Speaker 2: I think the key, to him, is this.
963 00:52:29,520 --> 00:52:32,480 Speaker 2: The difference would be like a bureaucratic structure, wherein there 964 00:52:32,520 --> 00:52:34,960 Speaker 2: are other centers of power, right, like if you've got 965 00:52:35,160 --> 00:52:37,719 Speaker 2: a constitutional monarch, but there's still some kind of like 966 00:52:37,800 --> 00:52:41,000 Speaker 2: Congress or Senate or whatever that has some things that 967 00:52:41,040 --> 00:52:46,200 Speaker 2: are within its scope. And, right, he wants, 968 00:52:46,280 --> 00:52:48,680 Speaker 2: he wants... He does actually view it as a CEO, 969 00:52:48,800 --> 00:52:52,120 Speaker 2: where they do delegate, but the CEO is ultimately the 970 00:52:52,160 --> 00:52:57,399 Speaker 2: guy in power, right, who all the delegates answer to. I mean, 971 00:52:57,920 --> 00:53:01,440 Speaker 2: I think his ideal, like, situation, right, 972 00:53:01,560 --> 00:53:04,400 Speaker 2: like his ideal system of government that he kind of 973 00:53:04,400 --> 00:53:07,600 Speaker 2: comes around to, is like the way Facebook is run, right, 974 00:53:07,680 --> 00:53:10,319 Speaker 2: where you do have like a board of directors technically, but 975 00:53:10,440 --> 00:53:13,439 Speaker 2: Zuckerberg has enough control of stock that like no one 976 00:53:13,480 --> 00:53:16,080 Speaker 2: can force him out. The buck stops with him, like 977 00:53:16,200 --> 00:53:19,120 Speaker 2: he ultimately has all of the power in that organization. 978 00:53:19,719 --> 00:53:25,640 Speaker 2: That's how Yarvin thinks countries should be run, right. Which. 979 00:53:25,520 --> 00:53:28,040 Speaker 3: In his defense, Facebook is a flawless organization. 980 00:53:28,160 --> 00:53:31,680 Speaker 2: Yeah, yeah, we all know that nothing ever goes wrong there.
981 00:53:34,440 --> 00:53:38,279 Speaker 2: So the final straw for Yarvin's tolerance of democracy came 982 00:53:38,280 --> 00:53:39,799 Speaker 2: in two thousand and four as a result of the 983 00:53:39,840 --> 00:53:43,680 Speaker 2: Swift Boat Veterans for Truth scandal. 984 00:53:43,920 --> 00:53:47,440 Speaker 2: You remember this, I'm sure, right? Oh yeah, of course, yeah, yeah, 985 00:53:47,440 --> 00:53:49,640 Speaker 2: this is back in the two thousand and four election. 986 00:53:49,800 --> 00:53:52,680 Speaker 2: John Kerry was the Democratic nominee. Kerry had been wounded 987 00:53:52,719 --> 00:53:56,040 Speaker 2: three times in Vietnam, and then after he had left 988 00:53:56,040 --> 00:53:58,560 Speaker 2: the service, he had become an anti-war activist, right, 989 00:53:58,560 --> 00:54:00,879 Speaker 2: he like testified in Congress, this was a really big deal. 990 00:54:01,480 --> 00:54:04,440 Speaker 2: And so, number one, as a result, conservatives had 991 00:54:04,520 --> 00:54:07,640 Speaker 2: never really forgiven John Kerry for, as they saw it, betraying 992 00:54:07,920 --> 00:54:12,560 Speaker 2: the country in Vietnam. And also, obviously, like, Bush was 993 00:54:13,080 --> 00:54:15,000 Speaker 2: running on the back of two wars that he had 994 00:54:15,040 --> 00:54:18,520 Speaker 2: gotten the country into, and Kerry had been against those. So 995 00:54:18,560 --> 00:54:21,319 Speaker 2: there was this like pretty hideous conflict.
And the way 996 00:54:21,360 --> 00:54:23,160 Speaker 2: that a lot of folks on the right, 997 00:54:23,480 --> 00:54:26,960 Speaker 2: like particularly those within Bush's campaign, chose to respond was 998 00:54:27,000 --> 00:54:30,600 Speaker 2: by bringing up, you know, 999 00:54:30,960 --> 00:54:32,919 Speaker 2: people who had served in Vietnam who claimed that Kerry 1000 00:54:32,960 --> 00:54:35,759 Speaker 2: had lied about his service, right, that he hadn't really 1001 00:54:35,760 --> 00:54:37,879 Speaker 2: done the things he'd done, that his Purple Hearts were 1002 00:54:37,960 --> 00:54:41,840 Speaker 2: essentially like due to exaggerations. And none of this was true, 1003 00:54:42,600 --> 00:54:46,040 Speaker 2: and in fact, like, when journalists actually talked to people 1004 00:54:46,080 --> 00:54:47,839 Speaker 2: who had served with Kerry, they were like, no, he was 1005 00:54:47,960 --> 00:54:53,080 Speaker 2: like a very good soldier who was wounded repeatedly doing 1006 00:54:53,120 --> 00:54:57,120 Speaker 2: his job. But the propaganda campaign largely worked, right, 1007 00:54:57,480 --> 00:55:00,880 Speaker 2: and Yarvin, critically, bought the propaganda campaign, 1008 00:55:00,960 --> 00:55:03,760 Speaker 2: and he was angry that the media, in his eyes, 1009 00:55:03,880 --> 00:55:07,520 Speaker 2: worked to protect Kerry, which proved that it was fundamentally 1010 00:55:07,640 --> 00:55:11,120 Speaker 2: evil and allied with academia and what people now call 1011 00:55:11,200 --> 00:55:15,239 Speaker 2: the deep state, career government employees operating this sort of 1012 00:55:15,280 --> 00:55:18,400 Speaker 2: shadow government that really ran things, right. His attitude is 1013 00:55:18,400 --> 00:55:21,720 Speaker 2: that because John Kerry didn't suffer enough from the swift 1014 00:55:21,760 --> 00:55:25,000 Speaker 2: boat scandal.
That means that the whole media complex in 1015 00:55:25,040 --> 00:55:27,719 Speaker 2: the United States was corrupt and needed to be destroyed, 1016 00:55:27,760 --> 00:55:30,480 Speaker 2: which is a crazy thing to lead you to that conclusion. 1017 00:55:32,600 --> 00:55:35,440 Speaker 2: Like it's just it's one of the it's interesting to 1018 00:55:35,520 --> 00:55:38,480 Speaker 2: me because this guy really does he tries to portray 1019 00:55:38,600 --> 00:55:45,640 Speaker 2: himself as this like dark philosopher, like esoteric almost political 1020 00:55:45,719 --> 00:55:47,879 Speaker 2: madman, but when you get right down to it, 1021 00:55:47,920 --> 00:55:51,760 Speaker 2: he's like your crank uncle who's angry about John Kerry 1022 00:55:51,800 --> 00:55:52,560 Speaker 2: on Facebook. 1023 00:55:53,360 --> 00:55:56,640 Speaker 3: Well, also the swift boating. It's a weird thing to 1024 00:55:57,960 --> 00:56:01,160 Speaker 3: it's a weird thing to take from that whole chapter 1025 00:56:02,080 --> 00:56:06,919 Speaker 3: of American political history, because swift boating worked. Yeah, 1026 00:56:06,960 --> 00:56:09,120 Speaker 3: and then and then media, by the way, took the 1027 00:56:09,160 --> 00:56:14,799 Speaker 3: bait and just like amplified the story and uh yeah, 1028 00:56:14,880 --> 00:56:18,920 Speaker 3: and if they tried to protect Kerry, which I'm I'm 1029 00:56:18,960 --> 00:56:21,640 Speaker 3: sure a few journalists probably. 1030 00:56:21,280 --> 00:56:23,400 Speaker 2: Wanted, certainly individuals. 1031 00:56:22,800 --> 00:56:26,640 Speaker 3: Yeah they failed. Yeah, it didn't work.
1032 00:56:27,680 --> 00:56:29,920 Speaker 2: Like that's like and that's how I would say, is 1033 00:56:29,960 --> 00:56:32,360 Speaker 2: like I think if you're saying what happened the swift 1034 00:56:32,400 --> 00:56:35,120 Speaker 2: Boating thing is why I lost faith in the media, 1035 00:56:35,200 --> 00:56:41,279 Speaker 2: that's reasonable, but not for the reason that he did, right, Yeah, 1036 00:56:41,320 --> 00:56:44,760 Speaker 2: but anyway, that's what that's where he goes, right, speaking 1037 00:56:44,840 --> 00:56:49,879 Speaker 2: of the shadow government that really runs things. That's all 1038 00:56:49,920 --> 00:57:01,000 Speaker 2: of our sponsors are affiliated with. We're back. So now 1039 00:57:01,400 --> 00:57:04,359 Speaker 2: the years that Yarvin is kind of doing his having 1040 00:57:04,360 --> 00:57:06,560 Speaker 2: his like period in the wilderness, coming up with his 1041 00:57:06,600 --> 00:57:10,080 Speaker 2: political ideology largely like two thousand and three or four 1042 00:57:10,120 --> 00:57:12,400 Speaker 2: to like two thousand and seven or so, are the 1043 00:57:12,480 --> 00:57:15,160 Speaker 2: years that the tech industry like that brings us Web 1044 00:57:15,200 --> 00:57:18,160 Speaker 2: two point zero is starting to emerge. You get Google, 1045 00:57:18,720 --> 00:57:21,440 Speaker 2: you get you know, Apple had been around for a while, right, 1046 00:57:21,560 --> 00:57:24,000 Speaker 2: but they you know, we start to see like the 1047 00:57:24,080 --> 00:57:28,360 Speaker 2: what's going to become the smartphone era like grind towards 1048 00:57:28,440 --> 00:57:32,480 Speaker 2: you know, coming into being. Facebook also starts like two 1049 00:57:32,520 --> 00:57:34,120 Speaker 2: thousand and six or seven, I think, is like when 1050 00:57:34,160 --> 00:57:36,680 Speaker 2: it very first starts out. 
So like this is kind 1051 00:57:36,680 --> 00:57:39,560 Speaker 2: of the early birth of the Web two point zero era, 1052 00:57:39,640 --> 00:57:43,080 Speaker 2: which are all of these founder driven startups for the 1053 00:57:43,120 --> 00:57:46,920 Speaker 2: most part, right, And Yarvin comes to see this, this 1054 00:57:47,000 --> 00:57:50,400 Speaker 2: system that gives us Google and Facebook as inherently better 1055 00:57:50,560 --> 00:57:53,280 Speaker 2: than the system that governs the country. Right, it's and 1056 00:57:53,320 --> 00:57:56,760 Speaker 2: it's more akin to his kind of idealized absolute monarchy. 1057 00:57:57,320 --> 00:57:59,440 Speaker 2: So by this point in time, around two thousand and seven, 1058 00:58:00,040 --> 00:58:02,840 Speaker 2: Yarvin has more or less come across all the ingredients 1059 00:58:02,880 --> 00:58:06,480 Speaker 2: of his new ideology, this kind of reactionary monarchism with 1060 00:58:06,600 --> 00:58:10,520 Speaker 2: Austrian economic tendencies. The problem is that none of these 1061 00:58:10,560 --> 00:58:13,320 Speaker 2: philosophers that he likes, these guys like Rothbard and Hoppe, 1062 00:58:13,600 --> 00:58:17,320 Speaker 2: have quite gotten it right. And so he decides, I've 1063 00:58:17,360 --> 00:58:19,840 Speaker 2: got to start putting my ideas out there. I finally 1064 00:58:19,880 --> 00:58:24,160 Speaker 2: figured it out. I've consolidated the contradictions between all these systems, 1065 00:58:24,920 --> 00:58:26,760 Speaker 2: and now I'm going to start putting it out for 1066 00:58:26,800 --> 00:58:30,080 Speaker 2: people to see. Right. So in two thousand and seven 1067 00:58:30,120 --> 00:58:32,240 Speaker 2: he breaks out of this kind of chrysalis of reading 1068 00:58:32,320 --> 00:58:34,720 Speaker 2: that he'd put himself in, and he comes up with 1069 00:58:34,760 --> 00:58:38,640 Speaker 2: a blog under a pen name, Mencius Moldbug.
And it's 1070 00:58:38,720 --> 00:58:40,920 Speaker 2: under this pen name that he's going to start writing 1071 00:58:40,960 --> 00:58:43,920 Speaker 2: a bunch of essays of political theory. In an interview 1072 00:58:43,960 --> 00:58:47,360 Speaker 2: with Max Raskin, Yarvin describes the origin of this nickname 1073 00:58:47,400 --> 00:58:51,120 Speaker 2: Mencius Moldbug this way. It came from two different handles 1074 00:58:51,120 --> 00:58:53,720 Speaker 2: I was using in different places. I would post occasionally 1075 00:58:53,760 --> 00:58:56,040 Speaker 2: on Reddit or Hacker News. Sometimes I would get banned, 1076 00:58:56,040 --> 00:58:58,200 Speaker 2: and I would choose the name of a new classical figure, 1077 00:58:58,240 --> 00:59:00,520 Speaker 2: and I just happened to land on Mencius. And then 1078 00:59:00,560 --> 00:59:03,000 Speaker 2: I was doing some economics posting, and I posted something 1079 00:59:03,040 --> 00:59:05,600 Speaker 2: about gold, but I said mold instead of gold, because 1080 00:59:05,600 --> 00:59:08,640 Speaker 2: I was talking about something with a hypothetical restricted supply. 1081 00:59:09,680 --> 00:59:12,080 Speaker 2: So it's just kind of like a foreign name, but 1082 00:59:12,280 --> 00:59:15,520 Speaker 2: it sounds like a little bit sinister, and it's interesting 1083 00:59:15,560 --> 00:59:18,320 Speaker 2: to me. Mencius, the first name, comes from a Confucian 1084 00:59:18,320 --> 00:59:21,640 Speaker 2: philosopher from the three hundreds BC who was a major 1085 00:59:21,680 --> 00:59:24,919 Speaker 2: figure in that kind of thought, and he had, during 1086 00:59:24,960 --> 00:59:27,880 Speaker 2: the Warring States period, interviewed a bunch of different kings 1087 00:59:27,920 --> 00:59:31,600 Speaker 2: and written a book about like what he'd learned about ruling.
Now, 1088 00:59:31,640 --> 00:59:34,200 Speaker 2: Mencius was kind of focused on getting monarchs to act 1089 00:59:34,240 --> 00:59:37,600 Speaker 2: more benevolently towards the poor and the downtrodden. So he's 1090 00:59:37,600 --> 00:59:40,200 Speaker 2: not really a figure that has a lot to do 1091 00:59:40,280 --> 00:59:42,600 Speaker 2: with the kind of politics Yarvin is about to espouse. 1092 00:59:42,640 --> 00:59:45,280 Speaker 2: I think he largely picked the name because it makes 1093 00:59:45,320 --> 00:59:49,080 Speaker 2: him sound kind of sinister. But he starts putting out 1094 00:59:49,120 --> 00:59:52,280 Speaker 2: his new thoughts on politics in this blog, in a 1095 00:59:52,320 --> 00:59:56,520 Speaker 2: series of essays called Unqualified Reservations, all geared at getting 1096 00:59:56,560 --> 00:59:59,640 Speaker 2: his readers on board with the idea of reorganizing society 1097 00:59:59,680 --> 01:00:02,760 Speaker 2: away from democracy and towards a kind of enlightened one 1098 01:00:02,840 --> 01:00:04,920 Speaker 2: man rule that he believes is going to work a 1099 01:00:04,920 --> 01:00:10,440 Speaker 2: lot better. Unlike most philosophers, Yarvin peppers his essays with casual slurs. 1100 01:00:11,120 --> 01:00:12,880 Speaker 2: In reading one, where he talks about World War Two, 1101 01:00:12,920 --> 01:00:15,680 Speaker 2: he refers to the Japanese repeatedly by a common slur 1102 01:00:15,760 --> 01:00:18,600 Speaker 2: at the time, and in another he makes a satirical 1103 01:00:18,640 --> 01:00:21,360 Speaker 2: statement about how the indigent poor should be destroyed and 1104 01:00:21,400 --> 01:00:25,400 Speaker 2: turned into biodiesel fuel.
This kind of stuff, it has 1105 01:00:25,520 --> 01:00:28,120 Speaker 2: the effect that, like, on the rare occasions in 1106 01:00:28,160 --> 01:00:30,720 Speaker 2: these early days that like major news outlets will look 1107 01:00:30,720 --> 01:00:32,800 Speaker 2: at his work, they'll kind of decide to ignore him 1108 01:00:32,840 --> 01:00:35,680 Speaker 2: because it's this guy dropping a bunch of racial slurs 1109 01:00:35,680 --> 01:00:39,240 Speaker 2: and crude jokes. He's clearly not a serious thinker. But 1110 01:00:39,640 --> 01:00:43,840 Speaker 2: the other thing that this style of discourse does is 1111 01:00:43,880 --> 01:00:48,440 Speaker 2: it's very attractive to young men, particularly young kind of 1112 01:00:48,440 --> 01:00:51,760 Speaker 2: intelligent autodidacts in the tech industry who spend a 1113 01:00:51,760 --> 01:00:54,560 Speaker 2: lot of time reading the internet. Right, And it is 1114 01:00:55,080 --> 01:00:57,400 Speaker 2: kind of in the same way that a lot of 1115 01:00:57,520 --> 01:00:59,800 Speaker 2: like the way people talk on 4chan is going 1116 01:00:59,800 --> 01:01:02,200 Speaker 2: to be attractive to these kinds of guys, right, And 1117 01:01:02,240 --> 01:01:05,280 Speaker 2: what you're seeing in these early Moldbug essays, with this 1118 01:01:05,440 --> 01:01:07,720 Speaker 2: use of slurs and these kind of like joking not 1119 01:01:07,840 --> 01:01:11,960 Speaker 2: joking statements about killing poor people, is the precursor to 1120 01:01:12,040 --> 01:01:15,000 Speaker 2: the way the alt right is going to talk about 1121 01:01:15,000 --> 01:01:17,920 Speaker 2: issues right and use kind of humor and jokes that 1122 01:01:17,960 --> 01:01:21,440 Speaker 2: aren't really jokes to kind of push more extreme ideas. 1123 01:01:21,600 --> 01:01:21,800 Speaker 3: Right.
1124 01:01:22,240 --> 01:01:25,800 Speaker 2: Moldbug is really the guy who starts doing that in 1125 01:01:26,840 --> 01:01:28,320 Speaker 2: I don't know if you'd say he was the first, 1126 01:01:28,360 --> 01:01:30,720 Speaker 2: but he's certainly the first with a platform to be 1127 01:01:30,800 --> 01:01:32,720 Speaker 2: doing that in a way that's really influential to a 1128 01:01:32,760 --> 01:01:33,520 Speaker 2: lot of these people. 1129 01:01:34,160 --> 01:01:36,680 Speaker 3: Now, can I ask another real I'm gonna I have 1130 01:01:36,680 --> 01:01:40,360 Speaker 3: two questions, Yeah, yeah. One is, are we sure we're 1131 01:01:40,400 --> 01:01:43,280 Speaker 3: pronouncing Mencius correctly? Is it not Menstus? 1132 01:01:44,320 --> 01:01:47,120 Speaker 2: I think it is Menshus sorry Menshus, yes, yes, but 1133 01:01:47,160 --> 01:01:49,480 Speaker 2: it's spelled yeah the other way. 1134 01:01:49,680 --> 01:01:52,919 Speaker 3: And I have no idea. I just when you said 1135 01:01:52,960 --> 01:01:56,760 Speaker 3: it was a Confucian, suddenly thought no, maybe anyway. 1136 01:01:56,880 --> 01:01:59,440 Speaker 2: Yeah, I think it is Menshus. Yeah. 1137 01:01:59,480 --> 01:02:07,960 Speaker 3: And then my second question is to what extent I 1138 01:02:08,000 --> 01:02:13,080 Speaker 3: find the humor aspect of this fascinating because it raises 1139 01:02:13,160 --> 01:02:17,480 Speaker 3: the possibility or I guess I should.
I guess my 1140 01:02:17,560 --> 01:02:20,920 Speaker 3: question is like where on the spectrum of like just 1141 01:02:21,120 --> 01:02:29,920 Speaker 3: kind of like very mendacious and angry person who wants 1142 01:02:29,960 --> 01:02:34,280 Speaker 3: to reshape the world versus like like all the way 1143 01:02:34,320 --> 01:02:37,160 Speaker 3: to the other end of just being like a really 1144 01:02:37,320 --> 01:02:42,920 Speaker 3: giddy shit-stirrer gadfly who just wants to throw crazy 1145 01:02:42,960 --> 01:02:46,960 Speaker 3: ideas out there and and get a reaction out of 1146 01:02:47,000 --> 01:02:50,280 Speaker 3: people the way like ninety percent of Twitter is like, 1147 01:02:50,640 --> 01:02:55,040 Speaker 3: where on that spectrum is he? Because it's that does 1148 01:02:55,080 --> 01:03:00,840 Speaker 3: sound like there's like, like, you know, churning up poor 1149 01:03:00,920 --> 01:03:05,800 Speaker 3: people to create biodiesel is a it's a tasteless joke. 1150 01:03:05,840 --> 01:03:08,200 Speaker 2: It's it's like a Swiftian, Thomas Swift, or it's like 1151 01:03:08,280 --> 01:03:13,760 Speaker 2: a Jonathan Swift type joke, right, Like, but it could. 1152 01:03:13,600 --> 01:03:16,400 Speaker 3: Be construed as just like trolling, right right. 1153 01:03:16,960 --> 01:03:19,240 Speaker 2: Well, I think that's kind of the key point.
So, 1154 01:03:19,320 --> 01:03:21,200 Speaker 2: like what you're talking about is like the term we 1155 01:03:21,280 --> 01:03:24,920 Speaker 2: use for it is shit posting, right and and Yarvin 1156 01:03:24,960 --> 01:03:27,360 Speaker 2: is very much a shit poster, right, But he's also 1157 01:03:27,520 --> 01:03:30,320 Speaker 2: using that as a tool where he understands that this 1158 01:03:30,520 --> 01:03:34,280 Speaker 2: is how young men particularly talk on the Internet, and 1159 01:03:34,320 --> 01:03:36,920 Speaker 2: it is something that inherently, if you're talking this way, 1160 01:03:36,920 --> 01:03:40,160 Speaker 2: if you're engaging this way, you have more credibility with 1161 01:03:40,200 --> 01:03:43,640 Speaker 2: them than, you know, people who are trying to be 1162 01:03:43,760 --> 01:03:48,520 Speaker 2: more respectable, who largely like this chunk of folks doesn't 1163 01:03:48,560 --> 01:03:51,280 Speaker 2: think highly of right these like kind of like these 1164 01:03:51,320 --> 01:03:55,400 Speaker 2: traditional sort of like intellectual elites, you know, academics and 1165 01:03:55,480 --> 01:03:58,439 Speaker 2: journalists and the like, they have a lot of disdain for, 1166 01:03:59,080 --> 01:04:02,760 Speaker 2: but they trust someone who communicates like them. And so 1167 01:04:02,960 --> 01:04:05,920 Speaker 2: by using these kind of like by by basically peppering 1168 01:04:05,960 --> 01:04:09,600 Speaker 2: in sort of trolling language in these very serious articles 1169 01:04:10,160 --> 01:04:14,919 Speaker 2: arguing for anti democratic politics, he makes himself credible to them.
1170 01:04:14,960 --> 01:04:18,880 Speaker 2: And he also there's also a sense that because he's 1171 01:04:19,000 --> 01:04:21,840 Speaker 2: including some of this this stuff that is a lot racier, 1172 01:04:22,800 --> 01:04:27,640 Speaker 2: he's there's something almost forbidden knowledge about the stuff that 1173 01:04:27,680 --> 01:04:29,600 Speaker 2: he's putting out right that makes them want to share 1174 01:04:29,640 --> 01:04:32,360 Speaker 2: it with each other. And that's very much like a 1175 01:04:32,400 --> 01:04:34,800 Speaker 2: factor in his success. What he's doing here is like 1176 01:04:35,400 --> 01:04:38,840 Speaker 2: very much intentional and very intelligent and very effective, and 1177 01:04:38,880 --> 01:04:40,760 Speaker 2: it you know, if you want to look at kind 1178 01:04:40,760 --> 01:04:45,880 Speaker 2: of the ultimate, uh, like, evolution of these sort 1179 01:04:45,920 --> 01:04:48,080 Speaker 2: of tactics, I think a great touch point would be 1180 01:04:48,120 --> 01:04:52,120 Speaker 2: the Christchurch shooter's manifesto, which included a lot 1181 01:04:52,120 --> 01:04:54,600 Speaker 2: of these like inside jokes, a lot of like forum 1182 01:04:54,840 --> 01:04:59,960 Speaker 2: troll language wrapped around serious arguments for like why people 1183 01:05:00,040 --> 01:05:03,160 Speaker 2: should carry out white supremacist attacks. And it's a kind 1184 01:05:03,200 --> 01:05:06,640 Speaker 2: of tactic that is really what gave us the alt 1185 01:05:06,720 --> 01:05:09,439 Speaker 2: right as a political force, and it's still very much 1186 01:05:09,880 --> 01:05:12,480 Speaker 2: how these people communicate now. I think it started to 1187 01:05:12,560 --> 01:05:16,240 Speaker 2: hurt them recently.
The whole weird stuff that Tim 1188 01:05:16,280 --> 01:05:19,680 Speaker 2: Walz began pulling out has actually been a really effective 1189 01:05:19,720 --> 01:05:22,600 Speaker 2: thing because when you actually, like, take the way these 1190 01:05:22,640 --> 01:05:24,760 Speaker 2: people talk amongst each other and put it up in 1191 01:05:24,760 --> 01:05:28,200 Speaker 2: front of an audience, it's deeply off putting to most people. 1192 01:05:29,440 --> 01:05:32,760 Speaker 2: But it also kind of led to this establishment of 1193 01:05:32,800 --> 01:05:36,080 Speaker 2: like an internal language for these folks that kind of 1194 01:05:36,160 --> 01:05:41,200 Speaker 2: led to an ossification of their ideological tendencies. Right, we're 1195 01:05:41,200 --> 01:05:43,640 Speaker 2: all using the same kind of terms and words that 1196 01:05:43,680 --> 01:05:46,840 Speaker 2: we've come to recognize as like dog whistles for different things, 1197 01:05:47,840 --> 01:05:50,360 Speaker 2: and Yarvin is really doing that in a very organized way. 1198 01:05:50,400 --> 01:05:53,640 Speaker 2: He's good at developing terms for people to use that 1199 01:05:53,680 --> 01:05:56,360 Speaker 2: get adopted on a large scale. You know, one of 1200 01:05:56,400 --> 01:05:58,960 Speaker 2: the best examples of this would be his term the cathedral, right, 1201 01:05:59,480 --> 01:06:02,840 Speaker 2: which you know he uses to mean this nexus 1202 01:06:02,880 --> 01:06:07,160 Speaker 2: of everything he doesn't like, the liberal media, the university system, academia, 1203 01:06:07,600 --> 01:06:11,840 Speaker 2: you know, like career government employees, everything he considers bad, 1204 01:06:11,840 --> 01:06:15,880 Speaker 2: and everything his ideal monarch would destroy.
Right, in his 1205 01:06:15,960 --> 01:06:19,320 Speaker 2: ideal world, there's not going to be an independent academic community, 1206 01:06:19,320 --> 01:06:21,960 Speaker 2: there's not going to be newspapers or journalists, just a 1207 01:06:22,040 --> 01:06:24,480 Speaker 2: king and an aristocracy, and of course he's going to 1208 01:06:24,480 --> 01:06:30,040 Speaker 2: be a natural member of that aristocracy. Right. Now, kind 1209 01:06:30,080 --> 01:06:32,800 Speaker 2: of the last piece of this ideology he's 1210 01:06:32,800 --> 01:06:35,960 Speaker 2: putting together is he has to explain why a lot 1211 01:06:35,960 --> 01:06:39,120 Speaker 2: of these real world feudalist governments all 1212 01:06:39,200 --> 01:06:41,959 Speaker 2: fell apart. And part of it is obviously they gave 1213 01:06:42,080 --> 01:06:44,400 Speaker 2: too much freedom to people who weren't the monarch. But 1214 01:06:44,480 --> 01:06:46,480 Speaker 2: the other thing he comes up with is that old 1215 01:06:46,520 --> 01:06:50,200 Speaker 2: monarchies denied citizens the freedom to exit. And so in 1216 01:06:50,240 --> 01:06:53,840 Speaker 2: this ideal world, he supposes, countries will be small, like 1217 01:06:54,080 --> 01:06:56,680 Speaker 2: the size of a city in most cases, and they'll 1218 01:06:56,720 --> 01:07:00,320 Speaker 2: compete for citizens, who would have the freedom to leave. Right, 1219 01:07:00,400 --> 01:07:02,720 Speaker 2: So it's fine. Now, there's a lot of questions that 1220 01:07:02,760 --> 01:07:05,440 Speaker 2: aren't answered here, like how do you make a society 1221 01:07:05,920 --> 01:07:10,000 Speaker 2: like function that way in a world as interconnected as ours.
1222 01:07:10,600 --> 01:07:13,480 Speaker 2: How do you stop, you know, one monarch from repeatedly 1223 01:07:13,560 --> 01:07:15,720 Speaker 2: taking over the others? Like, how do you stop, why 1224 01:07:15,800 --> 01:07:19,400 Speaker 2: wouldn't they use force? Why would people just let valuable 1225 01:07:19,440 --> 01:07:22,360 Speaker 2: subjects leave? How do people leave if the monarch can 1226 01:07:22,400 --> 01:07:24,560 Speaker 2: stop them from taking their assets out? All of these 1227 01:07:24,560 --> 01:07:27,800 Speaker 2: things that like would be actual problems if anyone tried 1228 01:07:27,840 --> 01:07:30,520 Speaker 2: to do this sort of thing. Like there's not actually 1229 01:07:30,520 --> 01:07:32,840 Speaker 2: an answer to this, but like that's that's kind of 1230 01:07:32,880 --> 01:07:36,360 Speaker 2: his idealized version of a society. It's a bunch of 1231 01:07:36,400 --> 01:07:39,800 Speaker 2: small monarchies all over the world that people can theoretically 1232 01:07:39,920 --> 01:07:42,840 Speaker 2: leave and move between, the way people leave companies and 1233 01:07:43,120 --> 01:07:45,360 Speaker 2: go to work for other companies. So like, you know 1234 01:07:45,400 --> 01:07:47,760 Speaker 2: how much everybody loves work. That's how the whole government 1235 01:07:47,800 --> 01:07:48,200 Speaker 2: should be. 1236 01:07:49,880 --> 01:07:52,480 Speaker 3: That's it. That's wild. I hadn't thought of it as 1237 01:07:54,480 --> 01:07:58,160 Speaker 3: on such a small scale.
Here's another question, like in 1238 01:07:58,200 --> 01:08:02,280 Speaker 3: the same way that like a company will have a board 1239 01:08:02,960 --> 01:08:07,520 Speaker 3: that can, like, oust a CEO, like, is there 1240 01:08:07,520 --> 01:08:11,920 Speaker 3: any is there any stopgap measure for like a disastrous 1241 01:08:13,800 --> 01:08:16,880 Speaker 3: or someone like let's say someone has like a brain 1242 01:08:16,920 --> 01:08:21,200 Speaker 3: eating worm. Yeah, right, but they are showing no symptoms 1243 01:08:21,439 --> 01:08:26,000 Speaker 3: when they are appointed or ascend to the monarchy, 1244 01:08:26,640 --> 01:08:29,160 Speaker 3: but then over the next five years they become like 1245 01:08:29,360 --> 01:08:34,160 Speaker 3: absolutely batshit crazy. Is there any stopgap there? 1246 01:08:35,120 --> 01:08:37,599 Speaker 2: The only stopgap he builds in is the idea 1247 01:08:37,680 --> 01:08:41,200 Speaker 2: that, well, theoretically, if the ruler's bad, 1248 01:08:41,560 --> 01:08:44,640 Speaker 2: everyone would be able to leave and then their system will. 1249 01:08:44,280 --> 01:08:46,479 Speaker 3: Right right, Oh, it's that thing. 1250 01:08:47,439 --> 01:08:49,200 Speaker 2: Yeah, it's like it's that thing where it's like, well, 1251 01:08:49,240 --> 01:08:51,200 Speaker 2: but what if he wants to shoot people who try. 1252 01:08:51,280 --> 01:08:55,320 Speaker 3: It's like Rand Paul saying, like, civil rights are 1253 01:08:55,439 --> 01:08:59,439 Speaker 3: dumb because if you put like a whites only sign 1254 01:08:59,479 --> 01:09:01,800 Speaker 3: in front of your store, you're gonna lose business and 1255 01:09:01,800 --> 01:09:04,439 Speaker 3: you're gonna go out of business and the market will 1256 01:09:04,520 --> 01:09:12,559 Speaker 3: keep you from being racist.
Meanwhile, like that, yeah, back 1257 01:09:12,600 --> 01:09:16,040 Speaker 3: when people did that, it didn't work until the 1258 01:09:16,120 --> 01:09:19,160 Speaker 3: laws kicked in. Yeah. Yeah no. 1259 01:09:19,240 --> 01:09:21,800 Speaker 2: And it's it's this, it's these it's this weird mix 1260 01:09:21,840 --> 01:09:27,439 Speaker 2: of like naivete uh and like starry eyed thinking that 1261 01:09:27,960 --> 01:09:30,160 Speaker 2: to a degree, I think he's just kind of being 1262 01:09:30,200 --> 01:09:33,519 Speaker 2: dishonest with the naivete, like he knows any state like 1263 01:09:33,600 --> 01:09:37,639 Speaker 2: this would just be a dictatorship like enforced through violence. Right, 1264 01:09:37,960 --> 01:09:39,880 Speaker 2: But that's what he wants as long as he's a 1265 01:09:39,880 --> 01:09:42,000 Speaker 2: part of the aristocracy. And he's just kind of built 1266 01:09:42,040 --> 01:09:44,559 Speaker 2: in this. Well, people would just leave if they didn't 1267 01:09:44,680 --> 01:09:47,679 Speaker 2: like it, because he has to have some answer for it, right, 1268 01:09:47,720 --> 01:09:50,800 Speaker 2: But I kind of think he knows how ugly a 1269 01:09:50,880 --> 01:09:53,040 Speaker 2: system like this would be in practice. He's just more 1270 01:09:53,120 --> 01:09:57,240 Speaker 2: or less fine with it. Right. Now, the last kind 1271 01:09:57,320 --> 01:10:00,400 Speaker 2: of ingredient to the ideological system Yarvin is cooking 1272 01:10:00,479 --> 01:10:03,880 Speaker 2: up is, of course, racism. And I want to read 1273 01:10:03,880 --> 01:10:06,439 Speaker 2: a passage from an article in TechCrunch about Yarvin 1274 01:10:06,479 --> 01:10:08,960 Speaker 2: and his followers and how they are quote obsessed with 1275 01:10:09,000 --> 01:10:12,320 Speaker 2: a concept called human biodiversity, what used to be called 1276 01:10:12,360 --> 01:10:15,920 Speaker 2: scientific racism.
Specifically, they believe that IQ is one of, 1277 01:10:15,960 --> 01:10:18,760 Speaker 2: if not the most important personal traits, and that it's 1278 01:10:18,760 --> 01:10:23,559 Speaker 2: predominantly genetic. Neo reactionaries would replace or supplement the divine 1279 01:10:23,680 --> 01:10:26,760 Speaker 2: right of kings and the aristocracy with the genetic right 1280 01:10:26,840 --> 01:10:27,440 Speaker 2: of elites. 1281 01:10:27,880 --> 01:10:28,160 Speaker 3: Right. 1282 01:10:28,320 --> 01:10:31,000 Speaker 2: So this is another element of how he tries to justify, well, 1283 01:10:31,000 --> 01:10:33,919 Speaker 2: my system's smarter than the old school of monarchies. 1284 01:10:34,000 --> 01:10:34,120 Speaker 3: Right. 1285 01:10:34,160 --> 01:10:36,360 Speaker 2: It's not just these bunch of families are the people 1286 01:10:36,360 --> 01:10:40,080 Speaker 2: who are in charge. Our aristocracy is people who naturally 1287 01:10:40,120 --> 01:10:43,639 Speaker 2: are superior because of their IQ, because obviously that tells 1288 01:10:43,680 --> 01:10:45,200 Speaker 2: you everything about a person. 1289 01:10:46,040 --> 01:10:51,599 Speaker 3: Right, emotional IQ are we Yeah, no, no, no, no, that's 1290 01:10:51,600 --> 01:10:52,479 Speaker 3: not where that could be good. 1291 01:10:52,680 --> 01:10:53,439 Speaker 1: Absolutely not. 1292 01:10:54,920 --> 01:10:57,280 Speaker 3: I'm for that. I'm for people with like very strong 1293 01:10:57,320 --> 01:11:00,240 Speaker 3: emotional IQs being in charge of things. 1294 01:11:00,760 --> 01:11:03,320 Speaker 2: Yeah, yeah, no, no, no, that's not the system. We're 1295 01:11:03,360 --> 01:11:04,960 Speaker 2: going to have just a bunch of guys who are 1296 01:11:05,000 --> 01:11:09,120 Speaker 2: really good at coding running everything. You know, that way 1297 01:11:09,200 --> 01:11:11,360 Speaker 2: everything can finally work the way uber does. 
1298 01:11:12,280 --> 01:11:14,120 Speaker 1: Oh so I'll feel unsafe all the time. 1299 01:11:14,640 --> 01:11:15,559 Speaker 3: Yeah, okay, cool. 1300 01:11:17,880 --> 01:11:22,639 Speaker 2: So it's probably not surprising Moldbug's theories take off among 1301 01:11:22,720 --> 01:11:27,200 Speaker 2: specifically a lot of Silicon Valley young men right who 1302 01:11:27,200 --> 01:11:30,400 Speaker 2: are excessively online, and it also starts to take off. 1303 01:11:30,400 --> 01:11:33,240 Speaker 2: He begins being spread by a lot of like far 1304 01:11:33,400 --> 01:11:35,800 Speaker 2: right folks on the Internet in kind of the mid 1305 01:11:35,880 --> 01:11:38,680 Speaker 2: aughts who find his work and share it amongst themselves. 1306 01:11:39,680 --> 01:11:43,200 Speaker 2: It's just two years after Moldbug starts his blog that 1307 01:11:43,320 --> 01:11:46,880 Speaker 2: Peter Thiel gives a speech about democracy being incompatible with 1308 01:11:46,920 --> 01:11:50,839 Speaker 2: liberty, and Thiel starts putting money Yarvin's way, right. He's 1309 01:11:50,960 --> 01:11:54,559 Speaker 2: probably the number one guy sending money towards Yarvin. 1310 01:11:54,800 --> 01:11:58,320 Speaker 2: He backs a tech company that Yarvin starts, and he's 1311 01:11:58,400 --> 01:12:01,639 Speaker 2: just generally sort of like his early sort 1312 01:12:01,680 --> 01:12:06,559 Speaker 2: of moneyed backer, right, and Yarvin kind of as a result, 1313 01:12:06,600 --> 01:12:09,240 Speaker 2: he starts getting shared almost like people are like handing 1314 01:12:09,280 --> 01:12:11,599 Speaker 2: out drugs to each other. Like, we want to keep 1315 01:12:11,600 --> 01:12:13,400 Speaker 2: this on the down low.
You don't want too many people 1316 01:12:13,439 --> 01:12:16,720 Speaker 2: to know publicly that you're reading Moldbug, 1317 01:12:16,760 --> 01:12:18,640 Speaker 2: but like, have you read this latest article, have you 1318 01:12:18,720 --> 01:12:21,639 Speaker 2: checked out this blog, right, And he starts getting invited 1319 01:12:21,680 --> 01:12:24,880 Speaker 2: to give talks, and he starts saying things in these 1320 01:12:24,920 --> 01:12:27,600 Speaker 2: talks, like speeches at these schools, to these conservative 1321 01:12:27,640 --> 01:12:30,719 Speaker 2: clubs and the like, like if Americans want to change 1322 01:12:30,720 --> 01:12:32,800 Speaker 2: their government, they're going to have to get over their 1323 01:12:32,800 --> 01:12:38,679 Speaker 2: dictatorphobia. There's really no other solution. And that's kind 1324 01:12:38,760 --> 01:12:41,599 Speaker 2: of the thinking that is going to lead directly into 1325 01:12:41,720 --> 01:12:44,439 Speaker 2: the alt right and its embrace of Donald Trump. Yarvin 1326 01:12:44,600 --> 01:12:48,240 Speaker 2: is one of the key ideological pieces there. He is 1327 01:12:48,280 --> 01:12:50,920 Speaker 2: building a bridge that is eventually going to lead to 1328 01:12:50,960 --> 01:12:53,200 Speaker 2: how a lot of these people think about what Trump 1329 01:12:53,240 --> 01:12:53,760 Speaker 2: should be. 1330 01:12:53,920 --> 01:12:54,120 Speaker 3: Right. 1331 01:12:54,160 --> 01:12:56,719 Speaker 2: It's part of why there's a lot of this joking 1332 01:12:56,800 --> 01:12:59,160 Speaker 2: not joking talk about wanting Trump to be like a 1333 01:12:59,200 --> 01:13:02,200 Speaker 2: god king, right, is it's a lot of these guys 1334 01:13:02,240 --> 01:13:06,160 Speaker 2: who are knowingly or unknowingly parroting thoughts that kind of 1335 01:13:06,200 --> 01:13:10,080 Speaker 2: came initially into the right from Yarvin.
And yeah, that's 1336 01:13:10,200 --> 01:13:12,280 Speaker 2: that's part one and part two we're going to talk 1337 01:13:12,320 --> 01:13:17,040 Speaker 2: about like how he actually gets connected to politics and 1338 01:13:17,280 --> 01:13:20,080 Speaker 2: kind of where we are today with this guy. But uh, yeah, 1339 01:13:20,120 --> 01:13:21,200 Speaker 2: how are you feeling ed? 1340 01:13:21,720 --> 01:13:22,759 Speaker 3: I'm a little rattled. 1341 01:13:25,120 --> 01:13:26,559 Speaker 2: It's dark stuff, right. 1342 01:13:26,479 --> 01:13:28,200 Speaker 1: Yeah, that's the right reaction. 1343 01:13:29,240 --> 01:13:35,280 Speaker 3: Where what happens if if he is like in the court, 1344 01:13:35,720 --> 01:13:42,960 Speaker 3: the high court of this monarch and uh and gets 1345 01:13:42,960 --> 01:13:47,680 Speaker 3: a stomach flu and throws up in during a ceremony 1346 01:13:47,680 --> 01:13:52,280 Speaker 3: of some kind, and is like sent to a dungeon 1347 01:13:52,320 --> 01:13:54,640 Speaker 3: for the rest of his life at no fault of 1348 01:13:54,640 --> 01:13:59,240 Speaker 3: his own, Like what which is a very reasonable expectation 1349 01:13:59,479 --> 01:14:04,240 Speaker 3: of of a monarchical system. And so is he then 1350 01:14:04,360 --> 01:14:07,200 Speaker 3: sitting in the dungeon saying it's still the best, This 1351 01:14:07,280 --> 01:14:09,960 Speaker 3: is still the best, this is still the best system. 
1352 01:14:10,400 --> 01:14:12,880 Speaker 2: I don't think he thinks that could happen, because I 1353 01:14:12,920 --> 01:14:15,080 Speaker 2: think he doesn't believe something that you and I believe 1354 01:14:15,080 --> 01:14:18,080 Speaker 2: in that I think most rational people believe, which is 1355 01:14:18,080 --> 01:14:20,960 Speaker 2: that like power corrupts, so like, even if you are 1356 01:14:21,040 --> 01:14:23,360 Speaker 2: not the kind of guy who would throw people in 1357 01:14:23,400 --> 01:14:26,960 Speaker 2: a dungeon when you become king, just the fact that 1358 01:14:27,080 --> 01:14:30,719 Speaker 2: being a king is deranging, right, having that kind of power, 1359 01:14:31,160 --> 01:14:34,719 Speaker 2: you will eventually get used to exercising it and doing 1360 01:14:34,800 --> 01:14:37,200 Speaker 2: things like punishing people who just annoy you. And we 1361 01:14:37,280 --> 01:14:39,200 Speaker 2: know that this happens because we have a lot of 1362 01:14:39,240 --> 01:14:43,360 Speaker 2: examples of like when people are made dictators, how folks 1363 01:14:43,400 --> 01:14:46,200 Speaker 2: who were at least more normal at one point become 1364 01:14:46,400 --> 01:14:49,599 Speaker 2: like more violent and dangerous to be around, right, Like, 1365 01:14:49,640 --> 01:14:53,160 Speaker 2: this is a very well documented thing that comes with power. 1366 01:14:53,680 --> 01:14:56,760 Speaker 2: And I think he doesn't believe that fundamentally because he 1367 01:14:56,840 --> 01:15:03,360 Speaker 2: thinks that power naturally accumulates in natural systems of elites, right, 1368 01:15:03,760 --> 01:15:05,240 Speaker 2: so it can't be bad for them. 
1369 01:15:05,439 --> 01:15:09,400 Speaker 3: Or I suppose an argument might be, well, if I 1370 01:15:09,439 --> 01:15:12,840 Speaker 3: started to see those tendencies in the leader, I would 1371 01:15:12,840 --> 01:15:18,000 Speaker 3: then go to a different monarchy. But what if 1372 01:15:18,000 --> 01:15:20,320 Speaker 3: it is like what if you're the first one? What 1373 01:15:20,360 --> 01:15:25,960 Speaker 3: if you're the first example of that, that guy going crazy? 1374 01:15:27,439 --> 01:15:29,800 Speaker 2: I think it's also like a failure. These guys 1375 01:15:29,840 --> 01:15:32,640 Speaker 2: all consider themselves historians, but they don't study history in 1376 01:15:32,680 --> 01:15:36,000 Speaker 2: any kind of like rigorous academic fashion. And like, every 1377 01:15:36,000 --> 01:15:38,559 Speaker 2: time I hear this argument about well, people would just leave, 1378 01:15:38,720 --> 01:15:41,760 Speaker 2: I think about like what happened to Jewish people in 1379 01:15:41,840 --> 01:15:45,599 Speaker 2: Nazi Germany, where if they wanted to leave, the state 1380 01:15:45,640 --> 01:15:48,920 Speaker 2: would take all of their property effectively. Right, Some people 1381 01:15:48,960 --> 01:15:51,240 Speaker 2: did get to leave, but they didn't get to take 1382 01:15:51,280 --> 01:15:54,040 Speaker 2: their assets with them, right, Like that theft 1383 01:15:54,240 --> 01:15:56,679 Speaker 2: was a part of the system. And it's a thing 1384 01:15:56,760 --> 01:16:00,760 Speaker 2: that a state operated by a single man with absolute 1385 01:16:00,880 --> 01:16:04,599 Speaker 2: power and a grudge can do. And there's no reason 1386 01:16:04,680 --> 01:16:07,200 Speaker 2: in his system that it wouldn't happen to anyone trying 1387 01:16:07,200 --> 01:16:10,800 Speaker 2: to leave a bad, you know, CEO king.
Right. But 1388 01:16:10,840 --> 01:16:14,160 Speaker 2: either, again, he's just not bringing this up because 1389 01:16:14,200 --> 01:16:16,080 Speaker 2: he doesn't care about the people he thinks this would 1390 01:16:16,160 --> 01:16:19,880 Speaker 2: happen to, or he just isn't read enough on the 1391 01:16:19,960 --> 01:16:22,400 Speaker 2: kind of history that's actually relevant to how a system 1392 01:16:22,439 --> 01:16:25,120 Speaker 2: like this would work in real life. You know, that's 1393 01:16:25,160 --> 01:16:28,160 Speaker 2: what I would kind of respect. Yeah, yeah, people have 1394 01:16:28,240 --> 01:16:32,080 Speaker 2: tried this, Curtis, which he may very well be fully 1395 01:16:32,120 --> 01:16:33,800 Speaker 2: aware of, and just kind of trying to do a 1396 01:16:33,800 --> 01:16:36,040 Speaker 2: little sleight of hand here, right, because he's more or 1397 01:16:36,120 --> 01:16:38,559 Speaker 2: less fine with who he thinks would be the people 1398 01:16:38,600 --> 01:16:41,720 Speaker 2: targeted unfairly in this system, which is like, he's one 1399 01:16:41,720 --> 01:16:45,479 Speaker 2: of these guys who is annoyed with the left and progressives, right, 1400 01:16:45,479 --> 01:16:49,240 Speaker 2: he hates social justice and advocates for social justice, so 1401 01:16:49,720 --> 01:16:52,000 Speaker 2: if those people get targeted, he doesn't have a problem 1402 01:16:52,040 --> 01:16:55,040 Speaker 2: with it. You know, I think part of it's just 1403 01:16:55,120 --> 01:16:57,320 Speaker 2: not believing you could ever be the victim of the 1404 01:16:57,320 --> 01:17:00,280 Speaker 2: system you seek to put in place, which, you know, 1405 01:17:00,560 --> 01:17:03,080 Speaker 2: statistically, you want to look at like what happened to 1406 01:17:03,120 --> 01:17:06,320 Speaker 2: the early Bolsheviks after the Bolshevik Revolution.
Most of those 1407 01:17:06,360 --> 01:17:09,680 Speaker 2: guys didn't live to retirement, right, and, you know, if you want 1408 01:17:09,720 --> 01:17:12,519 Speaker 2: to talk about like the first generation of Nazi 1409 01:17:12,560 --> 01:17:15,920 Speaker 2: street fighters, a lot of those guys didn't wind up 1410 01:17:15,960 --> 01:17:22,080 Speaker 2: retiring either. Anyway, Ed, let's retire for this episode until 1411 01:17:22,160 --> 01:17:27,200 Speaker 2: part two. People should check out your podcast SNAFU. Season 1412 01:17:27,280 --> 01:17:31,360 Speaker 2: two is out now, and yeah, we'll be back on Thursday. 1413 01:17:31,840 --> 01:17:33,080 Speaker 3: All right, see you then. 1414 01:17:36,560 --> 01:17:39,280 Speaker 1: Behind the Bastards is a production of Cool Zone Media. 1415 01:17:39,640 --> 01:17:42,920 Speaker 1: For more from Cool Zone Media, visit our website Coolzonemedia 1416 01:17:43,080 --> 01:17:46,280 Speaker 1: dot com, or check us out on the iHeartRadio app, 1417 01:17:46,360 --> 01:17:48,720 Speaker 1: Apple Podcasts, or wherever you get your podcasts.