1 00:00:01,840 --> 00:00:07,240 Speaker 1: Also media. Welcome back to Behind the Bastards, a podcast 2 00:00:07,400 --> 00:00:11,120 Speaker 1: about the worst people in all of history and with 3 00:00:11,200 --> 00:00:14,320 Speaker 1: us today, someone who, as far as I am aware, 4 00:00:15,480 --> 00:00:17,759 Speaker 1: is not one of the worst people in all of history. 5 00:00:18,280 --> 00:00:21,520 Speaker 1: Ed Helms. Ed, have you ever committed a crime? 6 00:00:22,760 --> 00:00:23,440 Speaker 2: Lots. 7 00:00:23,840 --> 00:00:27,320 Speaker 1: Okay? Well, okay, well I expected the opposite answer. 8 00:00:28,320 --> 00:00:36,040 Speaker 2: I've only murdered jerks. Yeah, you know, you're fine. 9 00:00:36,120 --> 00:00:39,440 Speaker 1: I'm on the right side of I still have like 10 00:00:41,040 --> 00:00:43,479 Speaker 1: if you're just killing bad people, like, who's got a 11 00:00:43,479 --> 00:00:50,800 Speaker 1: problem with that? Right? Yeah? Just jerks. It's like we 12 00:00:50,800 --> 00:00:52,760 Speaker 1: were talking about earlier. You can't 13 00:00:52,800 --> 00:00:55,440 Speaker 1: have a black and white like murder's always bad, because 14 00:00:55,480 --> 00:00:58,360 Speaker 1: like what if somebody's a jerk? Just wondering? Can I 15 00:00:58,400 --> 00:01:00,480 Speaker 1: send you a list of names? For no, no apparent 16 00:01:00,520 --> 00:01:05,039 Speaker 1: reason. They're jerks. Yeah? Okay, okay, this is great info 17 00:01:05,080 --> 00:01:09,960 Speaker 1: to have. Welcome back to the show, Ed Helms.
18 00:01:10,880 --> 00:01:14,800 Speaker 1: I mean again, I don't feel like you necessarily need introduction, 19 00:01:15,360 --> 00:01:18,839 Speaker 1: but I will remind everyone about your excellent podcast SNAFU 20 00:01:19,240 --> 00:01:23,040 Speaker 1: and you just had season two drop talking about the 21 00:01:23,240 --> 00:01:27,080 Speaker 1: wonderful activist burglary of an FBI building in nineteen seventy 22 00:01:27,120 --> 00:01:30,759 Speaker 1: one that led to some of the most important revelations 23 00:01:30,840 --> 00:01:35,800 Speaker 1: of the burgeoning security state in history. So yeah, great show. 24 00:01:35,920 --> 00:01:38,959 Speaker 1: People should check it out. Ed, are you ready? 25 00:01:39,280 --> 00:01:42,560 Speaker 2: I'm ready to hear more. As you described season two of SNAFU, 26 00:01:42,640 --> 00:01:46,160 Speaker 2: I couldn't help thinking about how how well it connects 27 00:01:46,200 --> 00:01:47,680 Speaker 2: to our subject matter. 28 00:01:48,200 --> 00:01:50,040 Speaker 1: Yep, untrammeled authority. 29 00:01:50,120 --> 00:01:54,320 Speaker 2: Yeah, because J. Edgar Hoover basically had some degree of unchecked 30 00:01:54,400 --> 00:02:01,040 Speaker 2: authority within Yeah. Oh yeah, and he was doing really 31 00:02:01,240 --> 00:02:02,000 Speaker 2: awful stuff. 32 00:02:02,440 --> 00:02:05,840 Speaker 1: Well, he's a great example because he was this kind 33 00:02:05,880 --> 00:02:09,919 Speaker 1: of CEO god king of the security state for decades 34 00:02:10,240 --> 00:02:14,239 Speaker 1: and it didn't end well or middle well or start 35 00:02:14,280 --> 00:02:18,160 Speaker 1: well come back. 36 00:02:18,200 --> 00:02:20,480 Speaker 2: Comes back to that. I think at the beginning of 37 00:02:20,520 --> 00:02:22,440 Speaker 2: the last episode, I was like, how do you pick 38 00:02:22,520 --> 00:02:25,800 Speaker 2: the guy right? Right? The guy?
How can you be 39 00:02:25,880 --> 00:02:27,800 Speaker 2: sure you got the right guy. 40 00:02:28,840 --> 00:02:31,720 Speaker 1: It's also this thing of like I think with Hoover, 41 00:02:31,800 --> 00:02:34,320 Speaker 1: you really did see a man who he wasn't the 42 00:02:34,360 --> 00:02:36,880 Speaker 1: same guy at the start of his time running the 43 00:02:36,919 --> 00:02:39,000 Speaker 1: FBI as he was at the end, you know, because 44 00:02:39,200 --> 00:02:42,720 Speaker 1: number one, he aged. Our brains change as we age, 45 00:02:42,480 --> 00:02:46,280 Speaker 1: we in many cases get worse at some things, right, 46 00:02:46,800 --> 00:02:50,120 Speaker 1: And like he also, the period of time that he 47 00:02:50,160 --> 00:02:53,760 Speaker 1: had with power, he got used to doing increasingly extreme 48 00:02:54,080 --> 00:02:57,320 Speaker 1: things with it, you know, like there was this constant 49 00:02:57,480 --> 00:03:02,640 Speaker 1: escalation of his authoritarian impulses, like the job was bad 50 00:03:02,720 --> 00:03:05,160 Speaker 1: for him, like it was bad for J. Edgar Hoover, 51 00:03:06,639 --> 00:03:08,800 Speaker 1: and it was it made him a worse person. And 52 00:03:08,840 --> 00:03:12,200 Speaker 1: I think that kind of fundamentally, guys who advocate for 53 00:03:12,240 --> 00:03:14,839 Speaker 1: these systems never take that into account, in part because 54 00:03:14,880 --> 00:03:16,480 Speaker 1: I just don't think they believe it the way that 55 00:03:16,880 --> 00:03:20,079 Speaker 1: I think most normal people do. Like we you don't. 56 00:03:20,240 --> 00:03:22,880 Speaker 1: You don't have to be like on the left or 57 00:03:22,919 --> 00:03:26,119 Speaker 1: a libertarian or an anarchist to be like, well, yeah, 58 00:03:26,120 --> 00:03:28,120 Speaker 1: if you give people access to all of the power 59 00:03:28,160 --> 00:03:33,639 Speaker 1: in the world, they generally do horrible things with it.
Yeah, yeah, yeah, 60 00:03:34,080 --> 00:03:36,800 Speaker 1: yeah I would anyway, huh. I would. Yeah. I would 61 00:03:36,840 --> 00:03:40,040 Speaker 1: do awful, nightmarish things. I'd do really fun things, but 62 00:03:40,320 --> 00:03:43,320 Speaker 1: people might get hurt. I would have a good time, 63 00:03:43,480 --> 00:03:48,080 Speaker 1: but not everyone would have a good time. I think 64 00:03:48,120 --> 00:03:50,280 Speaker 1: maybe that's just how these guys think about it, right, 65 00:03:51,080 --> 00:03:52,840 Speaker 1: I'm going to be the one having a good time. 66 00:03:55,520 --> 00:03:58,280 Speaker 1: So and again, that's part of why we're doing these episodes, 67 00:03:58,320 --> 00:04:01,120 Speaker 1: because Curtis Yarvin could potentially be on the brink of 68 00:04:01,160 --> 00:04:03,600 Speaker 1: having a very good time, and I really want to 69 00:04:03,640 --> 00:04:11,320 Speaker 1: emphasize what a bad idea that would be. So I 70 00:04:11,320 --> 00:04:13,960 Speaker 1: have referred to Yarvin as a monarchist a few times 71 00:04:13,960 --> 00:04:16,680 Speaker 1: in these episodes, and I will again. I do think 72 00:04:16,680 --> 00:04:19,479 Speaker 1: it's important to note he would disagree with me using 73 00:04:19,520 --> 00:04:22,440 Speaker 1: that title because he thinks that calling someone a monarchist 74 00:04:22,480 --> 00:04:26,320 Speaker 1: brings to mind the constitutional monarchies that largely failed in 75 00:04:26,320 --> 00:04:29,520 Speaker 1: the early twentieth century, and he blames that failure on 76 00:04:29,640 --> 00:04:33,520 Speaker 1: compromises in absolute power. Now, we talked last episode about 77 00:04:33,600 --> 00:04:36,720 Speaker 1: Yarvin's core beliefs: democracy isn't really real, and if it 78 00:04:37,120 --> 00:04:39,840 Speaker 1: was real, it wouldn't be a good idea.
The cathedral 79 00:04:39,920 --> 00:04:42,599 Speaker 1: who really runs things are the enemy, and some sort 80 00:04:42,640 --> 00:04:46,159 Speaker 1: of aristocracy and an absolute CEO monarch paired with the 81 00:04:46,200 --> 00:04:48,920 Speaker 1: freedom to exit if you don't like the particular flavor 82 00:04:48,920 --> 00:04:52,320 Speaker 1: of oppression in your home is the ideal state of governance. 83 00:04:52,920 --> 00:04:55,800 Speaker 1: As a tech guy, though, Yarvin doesn't like to couch 84 00:04:55,880 --> 00:04:58,320 Speaker 1: his yearning for a king, which is what this is, right, 85 00:04:58,360 --> 00:05:00,560 Speaker 1: he's a monarchist. He's not the same kind of 86 00:05:00,560 --> 00:05:03,760 Speaker 1: monarchist that, for example, like guys like J. R. R. Tolkien were, 87 00:05:03,800 --> 00:05:06,400 Speaker 1: but he's very much a monarchist, and he doesn't like 88 00:05:06,480 --> 00:05:09,719 Speaker 1: to he doesn't like to seem like the regular people 89 00:05:09,760 --> 00:05:11,880 Speaker 1: who long for the days of having a king. Right, 90 00:05:12,640 --> 00:05:15,520 Speaker 1: He's got to be like a little bit smarter about it, right, 91 00:05:15,560 --> 00:05:17,839 Speaker 1: And this is where kind of the big tech line 92 00:05:17,880 --> 00:05:21,960 Speaker 1: of things comes in and his argument.
He starts making this, 93 00:05:22,240 --> 00:05:24,320 Speaker 1: you know, in the early period of like the web 94 00:05:24,360 --> 00:05:27,159 Speaker 1: two point zero tech boom, you get smartphones, you get 95 00:05:27,200 --> 00:05:30,640 Speaker 1: like apps like Twitter and YouTube and Facebook, and there's 96 00:05:30,680 --> 00:05:33,919 Speaker 1: this period of time, I know it feels very distant 97 00:05:33,960 --> 00:05:36,520 Speaker 1: to us, but where people thought this was going to 98 00:05:36,800 --> 00:05:40,279 Speaker 1: enhance democracy around the globe, right, that all of these 99 00:05:40,320 --> 00:05:43,320 Speaker 1: connecting technologies were going to be a massive boon for 100 00:05:43,480 --> 00:05:48,960 Speaker 1: like liberatory movements across the planet. That's not what happened. 101 00:05:49,080 --> 00:05:52,400 Speaker 1: And Yarvin very early on isn't just saying that's not 102 00:05:52,440 --> 00:05:55,520 Speaker 1: what's going to happen. He's saying that's not what should happen. Right, 103 00:05:55,960 --> 00:05:59,719 Speaker 1: Big tech shouldn't enable democracy around the world. It should 104 00:05:59,720 --> 00:06:03,400 Speaker 1: take control in a very literal sense. And he becomes 105 00:06:03,400 --> 00:06:06,160 Speaker 1: an advocate of a kind of political philosophy he called 106 00:06:06,320 --> 00:06:10,839 Speaker 1: neocameralism, which Francis Sang described in an excellent essay 107 00:06:10,839 --> 00:06:13,359 Speaker 1: as quote, arguing that the state should be run like 108 00:06:13,400 --> 00:06:15,920 Speaker 1: a business, i.e. with a CEO at its head and 109 00:06:15,960 --> 00:06:20,080 Speaker 1: no democratic mechanisms. Yarvin has always taken pains to express 110 00:06:20,080 --> 00:06:23,440 Speaker 1: in public his belief that this change can be done peacefully, 111 00:06:23,839 --> 00:06:26,440 Speaker 1: and you know, we're not.
He thinks we're not, 112 00:06:26,480 --> 00:06:28,479 Speaker 1: we're already not living in a democracy, so there's no 113 00:06:28,520 --> 00:06:30,960 Speaker 1: reason this has to be painful. But a study of 114 00:06:30,960 --> 00:06:32,800 Speaker 1: his writing over the years makes it clear that not 115 00:06:32,920 --> 00:06:36,480 Speaker 1: only is he open to violence, but enthusiastic about it. 116 00:06:36,880 --> 00:06:38,960 Speaker 1: And this brings me to one of the uglier parts 117 00:06:38,960 --> 00:06:42,479 Speaker 1: of our story. In twenty eleven, a Nazi named Anders 118 00:06:42,480 --> 00:06:45,080 Speaker 1: Breivik shot up a summer camp hosted by the Workers 119 00:06:45,160 --> 00:06:49,160 Speaker 1: Youth League, a left wing political organization in Norway. Breivik, 120 00:06:49,200 --> 00:06:52,240 Speaker 1: who considers himself a member of the Knights Templar, acting 121 00:06:52,240 --> 00:06:55,520 Speaker 1: to defend his faith and race from evil communists, shot 122 00:06:55,560 --> 00:06:59,920 Speaker 1: or bombed and killed seventy seven people. Yarvin wrote about 123 00:07:00,120 --> 00:07:03,640 Speaker 1: this as well, arguing that terrorism was a legitimate tactic 124 00:07:03,720 --> 00:07:07,160 Speaker 1: and that Nazi terror had been legitimate because it worked. 125 00:07:07,800 --> 00:07:11,760 Speaker 1: For The Baffler, Corey Pein summarizes, quote, Breivik's killing spree, 126 00:07:11,800 --> 00:07:15,280 Speaker 1: which targeted young Norwegian leftists, was illegitimate because it was 127 00:07:15,280 --> 00:07:19,600 Speaker 1: insufficient to free Norway from Eurocommunism. After all, he 128 00:07:19,640 --> 00:07:22,920 Speaker 1: only killed seventy seven people. We can note the only 129 00:07:22,960 --> 00:07:26,040 Speaker 1: thing he didn't screw up. At least he shot Communists, 130 00:07:26,120 --> 00:07:29,280 Speaker 1: not Muslims. He gored the matador and not the cape.
131 00:07:29,440 --> 00:07:32,520 Speaker 1: Yarvin wrote on July twenty third, twenty eleven, one day 132 00:07:32,520 --> 00:07:36,080 Speaker 1: after the terror in Oslo. So that is I mean, 133 00:07:36,360 --> 00:07:38,120 Speaker 1: we've gotten now from a guy who is just sort 134 00:07:38,160 --> 00:07:41,440 Speaker 1: of like preaching his idiosyncratic political system to a guy 135 00:07:41,440 --> 00:07:44,880 Speaker 1: who was being like, he didn't kill enough kids for 136 00:07:44,960 --> 00:07:48,360 Speaker 1: it to matter, right? Like this is, you know, it's 137 00:07:48,360 --> 00:07:50,920 Speaker 1: important to lean into the fact that like this 138 00:07:51,080 --> 00:07:53,640 Speaker 1: is not just a guy whose politics would lead in 139 00:07:53,720 --> 00:07:56,680 Speaker 1: bad directions. This is a pretty vile person. Where is 140 00:07:56,720 --> 00:08:00,800 Speaker 1: he writing this? On his blog Unqualified Reservations, which is 141 00:08:01,320 --> 00:08:02,920 Speaker 1: just a publicly available blog. 142 00:08:03,600 --> 00:08:06,640 Speaker 2: Yeah, yeah, no, he's not like in the dark recesses 143 00:08:06,680 --> 00:08:07,520 Speaker 2: of 4chan. 144 00:08:08,240 --> 00:08:11,080 Speaker 1: No no, no, no. This is a publicly available blog. 145 00:08:11,200 --> 00:08:14,200 Speaker 1: Now it's not. It's not. It's very popular with a 146 00:08:14,240 --> 00:08:17,720 Speaker 1: certain subset of the world, and he's writing under a pseudonym, 147 00:08:17,760 --> 00:08:20,440 Speaker 1: so people don't know his real name in twenty eleven. 148 00:08:20,640 --> 00:08:24,400 Speaker 2: Right. Now, the pseudonym could mean that there's some measure 149 00:08:24,440 --> 00:08:27,360 Speaker 2: of trolling going on, but not. But but this is 150 00:08:27,840 --> 00:08:33,080 Speaker 2: there's no, there's no, there's no kind of like winky irony.
151 00:08:33,120 --> 00:08:35,280 Speaker 2: I mean, this is some of the most if it's 152 00:08:35,320 --> 00:08:38,280 Speaker 2: remotely humorous as an attempt, it is some of the 153 00:08:38,320 --> 00:08:43,760 Speaker 2: most like wretched, awful and heart wrenching. It failed at humor. 154 00:08:44,120 --> 00:08:49,520 Speaker 2: But but if again, like there's no version in which, 155 00:08:50,280 --> 00:08:55,120 Speaker 2: in using a pseudonym, he's being like, I can't even. 156 00:08:54,960 --> 00:08:57,840 Speaker 1: Think of no, no, this is this is very much 157 00:08:57,920 --> 00:09:01,480 Speaker 1: not him fucking around, like this is because I mean 158 00:09:01,480 --> 00:09:03,440 Speaker 1: you can tell for a degree, like the like talking 159 00:09:03,440 --> 00:09:06,760 Speaker 1: about grinding up people for biodiesel. I've seen some articles 160 00:09:06,800 --> 00:09:08,959 Speaker 1: where people mistake that for a serious position. If you 161 00:09:09,000 --> 00:09:11,720 Speaker 1: read it in context, he's clearly like joking, right? Like 162 00:09:11,800 --> 00:09:14,840 Speaker 1: it's thrown out as a joke. This essay is him 163 00:09:14,840 --> 00:09:18,280 Speaker 1: talking about like why the Breivik attack was bad, and 164 00:09:18,320 --> 00:09:20,760 Speaker 1: it's bad not because he killed all those people, but 165 00:09:20,800 --> 00:09:25,240 Speaker 1: because it's not big enough to destroy the left, right? Like, 166 00:09:25,320 --> 00:09:28,440 Speaker 1: that's that's very much the take that he has on this, 167 00:09:28,480 --> 00:09:31,120 Speaker 1: which is not really, he's not being satirical. He's not, 168 00:09:31,120 --> 00:09:32,680 Speaker 1: not that, you know, it would be good to be 169 00:09:32,760 --> 00:09:35,400 Speaker 1: satirical about this, but that's not really what he's doing. 170 00:09:36,800 --> 00:09:38,560 Speaker 1: And this is this kind of stuff.
Like part of 171 00:09:38,559 --> 00:09:40,880 Speaker 1: why people don't catch on to this more often about 172 00:09:40,960 --> 00:09:43,840 Speaker 1: Yarvin when they write about him for like mainstream news sources, 173 00:09:43,880 --> 00:09:47,160 Speaker 1: is that he writes so much. Like to catch this stuff, 174 00:09:47,520 --> 00:09:49,560 Speaker 1: like I wouldn't have caught all of this stuff if 175 00:09:49,600 --> 00:09:51,360 Speaker 1: I had just been, I wouldn't have had time to 176 00:09:51,400 --> 00:09:54,240 Speaker 1: read all of his archive. Thankfully, guys like Corey Pein 177 00:09:54,600 --> 00:09:56,440 Speaker 1: did a lot of that work, and so people have 178 00:09:56,520 --> 00:09:59,440 Speaker 1: been collecting kind of like the worst hits of Curtis 179 00:09:59,480 --> 00:10:03,320 Speaker 1: Yarvin for a while now, but it's not immediately easy 180 00:10:03,360 --> 00:10:05,640 Speaker 1: for people to do, and it gets missed by the 181 00:10:05,679 --> 00:10:07,960 Speaker 1: people who aren't really like big fans of his a 182 00:10:08,000 --> 00:10:10,280 Speaker 1: lot of the time, right, because there's just so much there. 183 00:10:10,360 --> 00:10:13,640 Speaker 1: And he writes, he's got kind of logorrhea, right, 184 00:10:14,200 --> 00:10:17,440 Speaker 1: diarrhea of the mouth a bit, he's very wordy. So 185 00:10:17,640 --> 00:10:20,480 Speaker 1: around the same time as he's writing his articles about 186 00:10:20,520 --> 00:10:24,560 Speaker 1: the Utøya shooting, Yarvin is also pushing an acronym out 187 00:10:24,559 --> 00:10:27,319 Speaker 1: in his writing to the people who have like built 188 00:10:27,320 --> 00:10:30,000 Speaker 1: this kind of like political circle around him, which is 189 00:10:30,080 --> 00:10:33,640 Speaker 1: kind of liberally sprinkled with Peter Thiel dollars.
And the 190 00:10:33,679 --> 00:10:37,320 Speaker 1: acronym that he starts pushing is RAGE, which means Retire 191 00:10:37,480 --> 00:10:42,040 Speaker 1: All Government Employees. And this is a term that he's 192 00:10:42,120 --> 00:10:45,640 Speaker 1: come to for this policy plan that he starts pushing 193 00:10:45,679 --> 00:10:49,680 Speaker 1: among the young conservatives following him, that we need to 194 00:10:49,720 --> 00:10:52,080 Speaker 1: get a president who is our kind of guy in 195 00:10:52,160 --> 00:10:57,480 Speaker 1: office and have him forcibly retire the entire professional caste 196 00:10:57,640 --> 00:11:00,880 Speaker 1: of the government and replace them with our people, with 197 00:11:00,960 --> 00:11:02,439 Speaker 1: people who think, and that's how we. 198 00:11:02,400 --> 00:11:03,640 Speaker 2: Project twenty twenty five. 199 00:11:04,160 --> 00:11:07,120 Speaker 1: Yes, yes, yes, exactly, that is and this is something 200 00:11:07,160 --> 00:11:10,760 Speaker 1: he is pushing in the early aughts, right, like yes, 201 00:11:11,280 --> 00:11:13,600 Speaker 1: And to be honest, a lot of Project twenty twenty 202 00:11:13,600 --> 00:11:15,800 Speaker 1: five is people cribbing from Yarvin, and it's because it's 203 00:11:15,840 --> 00:11:18,720 Speaker 1: people who were influenced by him or influenced by people 204 00:11:18,760 --> 00:11:21,000 Speaker 1: who are influenced by him, right. But a lot of 205 00:11:21,000 --> 00:11:26,080 Speaker 1: this intellectually starts with him. You know, obviously he's not 206 00:11:26,160 --> 00:11:28,640 Speaker 1: the only guy thinking about stuff like this, but he's 207 00:11:28,679 --> 00:11:30,560 Speaker 1: putting it out in a form that is like a 208 00:11:30,600 --> 00:11:35,080 Speaker 1: cohesive ideology.
So in the early years, and we're talking 209 00:11:35,120 --> 00:11:37,480 Speaker 1: kind of the mid aughts around like twenty eleven up 210 00:11:37,520 --> 00:11:42,640 Speaker 1: through twenty fourteen, his adherents are mostly other tech industry creatures, 211 00:11:42,920 --> 00:11:46,800 Speaker 1: and one of these people is someone named Justine Tunney. 212 00:11:47,360 --> 00:11:50,599 Speaker 1: Justine started out as someone who was like more progressive. 213 00:11:50,640 --> 00:11:54,359 Speaker 1: She was an Occupy activist back during, you know, Occupy, 214 00:11:55,120 --> 00:11:58,280 Speaker 1: and she gets hired as a Google engineer. In March 215 00:11:58,360 --> 00:12:01,960 Speaker 1: of twenty fourteen, Justine published a petition on the White 216 00:12:02,000 --> 00:12:06,520 Speaker 1: House website demanding a national referendum on three points. Number one, 217 00:12:06,720 --> 00:12:11,240 Speaker 1: retire all government employees. Number two, transfer administrative authority to 218 00:12:11,280 --> 00:12:14,880 Speaker 1: the tech industry. And number three, appoint Google CEO Eric 219 00:12:14,880 --> 00:12:19,400 Speaker 1: Schmidt CEO of America. Now, this is a very silly 220 00:12:19,440 --> 00:12:23,360 Speaker 1: idea. For one thing, who, looking at Google today, thinks 221 00:12:23,480 --> 00:12:28,520 Speaker 1: Eric Schmidt should run the country? Like. But in defense 222 00:12:28,559 --> 00:12:31,160 Speaker 1: of this, Tunney wrote, It's time for the US regime 223 00:12:31,240 --> 00:12:33,480 Speaker 1: to politely take its exit from history and do what's 224 00:12:33,520 --> 00:12:36,840 Speaker 1: best for America. The tech industry can offer us good governance 225 00:12:37,000 --> 00:12:41,400 Speaker 1: and prevent further American decline.
Now part of why 226 00:12:41,400 --> 00:12:43,360 Speaker 1: I think Tunney is an interesting case study in like 227 00:12:43,480 --> 00:12:46,800 Speaker 1: followers of Moldbug, because she tells people on Twitter who 228 00:12:46,880 --> 00:12:50,120 Speaker 1: are questioning her about this petition she's put up that 229 00:12:50,160 --> 00:12:53,880 Speaker 1: they need to read Mencius Moldbug. Tunney is interesting because 230 00:12:53,920 --> 00:12:56,840 Speaker 1: she doesn't come out of the traditional right. She's a 231 00:12:56,920 --> 00:13:00,680 Speaker 1: transgender woman who had billed herself as an anarchist earlier 232 00:13:00,720 --> 00:13:04,320 Speaker 1: in her ideological life, prior to finding Moldbug's writing, and 233 00:13:04,360 --> 00:13:07,679 Speaker 1: so this arc she takes from an economic justice advocate 234 00:13:07,760 --> 00:13:11,640 Speaker 1: to tech industry monarchist shows how seductive a lot of 235 00:13:11,679 --> 00:13:15,560 Speaker 1: intelligent people found Yarvin. And the fact that people like Tunney, 236 00:13:15,840 --> 00:13:18,960 Speaker 1: who don't come out of where you would expect someone 237 00:13:18,960 --> 00:13:22,319 Speaker 1: to wind up believing these far right ideas, get enraptured 238 00:13:22,360 --> 00:13:25,400 Speaker 1: by Yarvin's writing is part of why he starts to 239 00:13:25,400 --> 00:13:28,080 Speaker 1: get this reputation for almost being this kind of like 240 00:13:28,720 --> 00:13:32,520 Speaker 1: mental sorcerer, someone whose work has this almost like Lovecraftian 241 00:13:32,559 --> 00:13:36,920 Speaker 1: pull in twisting people's beliefs and ideals. And that's 242 00:13:37,000 --> 00:13:39,480 Speaker 1: very much the reputation that he starts to pick up 243 00:13:39,600 --> 00:13:43,840 Speaker 1: in like the aughts.
There's a philosopher, kind of a 244 00:13:44,000 --> 00:13:48,520 Speaker 1: reactionary philosopher named Nick Land who's a fan of Yarvin's writing, 245 00:13:48,840 --> 00:13:52,240 Speaker 1: who gives it the nickname the Dark Enlightenment, right. That's 246 00:13:52,520 --> 00:13:54,640 Speaker 1: the term that he comes up with to refer to 247 00:13:55,360 --> 00:13:59,800 Speaker 1: these kind of neo reactionary, anti democratic, monarchist policies. And 248 00:13:59,800 --> 00:14:03,600 Speaker 1: it's written that way because supposedly, once you read Yarvin's 249 00:14:04,280 --> 00:14:07,480 Speaker 1: arguments for why democracy can't work and like this sort 250 00:14:07,520 --> 00:14:10,880 Speaker 1: of authoritarian system is better, it's like this through the 251 00:14:10,960 --> 00:14:13,720 Speaker 1: looking glass moment: I've taken the red pill, and 252 00:14:13,760 --> 00:14:15,360 Speaker 1: I can never go back, right. 253 00:14:16,200 --> 00:14:19,920 Speaker 2: Really endowing this guy and these writings with a lot 254 00:14:19,960 --> 00:14:25,400 Speaker 2: of, oh yes, like magical power in a very absurd 255 00:14:25,640 --> 00:14:29,160 Speaker 2: way it is, and it takes them out of the 256 00:14:29,200 --> 00:14:34,400 Speaker 2: realm of like intellectual exploration and dialogue and idea sharing 257 00:14:34,440 --> 00:14:40,280 Speaker 2: and into this like truly like conspiratorial, like come into 258 00:14:40,320 --> 00:14:44,240 Speaker 2: the fold, put on your dark cloak and join the 259 00:14:44,320 --> 00:14:45,440 Speaker 2: Dark Enlightenment. 260 00:14:47,000 --> 00:14:50,320 Speaker 1: And that's what Moldbug wants, right. That's good branding for 261 00:14:50,440 --> 00:14:52,760 Speaker 1: you. If you're this guy, that's certainly how you want 262 00:14:52,800 --> 00:14:56,400 Speaker 1: to be seen.
Okay, I think that's not really accurate 263 00:14:56,400 --> 00:14:58,160 Speaker 1: to what's happening to Tunney or to what's happening to 264 00:14:58,200 --> 00:15:01,160 Speaker 1: most of the people following him, in part because I 265 00:15:01,200 --> 00:15:04,320 Speaker 1: have a degree of, like, a professional understanding from my 266 00:15:04,440 --> 00:15:08,640 Speaker 1: former career writing about and analyzing terrorist groups of how people 267 00:15:08,680 --> 00:15:11,680 Speaker 1: get radicalized, right. And a thing that gets missed a lot, 268 00:15:11,760 --> 00:15:15,000 Speaker 1: and scholar Scott Atran was kind of the guy who 269 00:15:15,600 --> 00:15:17,840 Speaker 1: I started reading who wrote about this a lot. But 270 00:15:17,880 --> 00:15:22,040 Speaker 1: about how people get radicalized: it happens nearly always 271 00:15:22,080 --> 00:15:25,880 Speaker 1: in communities, right, even if it's not a physical community, 272 00:15:25,880 --> 00:15:30,120 Speaker 1: if it's online. People don't generally get brought into radical 273 00:15:30,360 --> 00:15:34,960 Speaker 1: belief systems or extremist politics on their own. They get 274 00:15:34,960 --> 00:15:38,320 Speaker 1: brought in in part because their circle of friends, 275 00:15:38,400 --> 00:15:42,520 Speaker 1: people they respect and think are cool, get drawn into it, right. 276 00:15:42,880 --> 00:15:44,560 Speaker 1: And I think that's what's happening to Tunney, And I 277 00:15:44,560 --> 00:15:48,160 Speaker 1: think that's a lot of the power that Moldbug's writing 278 00:15:48,240 --> 00:15:51,760 Speaker 1: has, is that he is telling the Silicon Valley set 279 00:15:52,160 --> 00:15:54,720 Speaker 1: of people who got a lot of money very quickly 280 00:15:54,760 --> 00:15:57,800 Speaker 1: when they were young and kind of lost their 281 00:15:57,840 --> 00:16:01,840 Speaker 1: minds in their belief about their own genius.
He's telling 282 00:16:01,840 --> 00:16:04,240 Speaker 1: them what they want to hear, that they should run things. 283 00:16:04,560 --> 00:16:07,440 Speaker 1: They start sharing his stuff with each other, they start 284 00:16:07,480 --> 00:16:10,840 Speaker 1: talking like him because of the way he writes, and 285 00:16:10,920 --> 00:16:14,320 Speaker 1: it becomes like the cool thing within a certain set 286 00:16:14,560 --> 00:16:18,760 Speaker 1: in Silicon Valley to be into Moldbug, right. And that's 287 00:16:19,160 --> 00:16:21,360 Speaker 1: a lot of the appeal, right. It's not that his 288 00:16:21,480 --> 00:16:25,800 Speaker 1: writing has some sort of like magical mindwarping effect. It's 289 00:16:25,800 --> 00:16:29,360 Speaker 1: that this community of people that a lot of folks, 290 00:16:29,480 --> 00:16:32,000 Speaker 1: especially like newer folks coming into tech who want 291 00:16:32,000 --> 00:16:34,640 Speaker 1: to, you know, get founder money, who want to 292 00:16:34,640 --> 00:16:38,080 Speaker 1: be part of this like in crowd, they all like, 293 00:16:38,560 --> 00:16:41,480 Speaker 1: it's the cool thing to be talking about, to believe in, right. 294 00:16:41,560 --> 00:16:43,760 Speaker 1: Like that's I think where a lot of the power 295 00:16:44,200 --> 00:16:46,960 Speaker 1: that he initially has comes from. And I think that's 296 00:16:47,000 --> 00:16:48,440 Speaker 1: always the case. If you want to look at like 297 00:16:48,520 --> 00:16:50,800 Speaker 1: why a lot of people joined ISIS, a lot of 298 00:16:50,840 --> 00:16:54,320 Speaker 1: these young communities, it's because ISIS was cool, right, to 299 00:16:54,480 --> 00:16:56,480 Speaker 1: a lot of these young people, the way that they talk, 300 00:16:56,600 --> 00:16:59,600 Speaker 1: the slang they use, the media they put out.
Communities 301 00:16:59,640 --> 00:17:04,080 Speaker 1: of people got radicalized in part because it was attractive 302 00:17:04,359 --> 00:17:06,280 Speaker 1: in that way to them, right. It's not like a 303 00:17:06,400 --> 00:17:09,199 Speaker 1: mind virus. It's kind of the same way fads and 304 00:17:09,320 --> 00:17:12,760 Speaker 1: trends always work, right. This is the way that like 305 00:17:13,160 --> 00:17:16,240 Speaker 1: any fad takes hold. You know, it's just a much 306 00:17:16,320 --> 00:17:19,800 Speaker 1: darker example of that. But that is how radicalization tends 307 00:17:19,800 --> 00:17:22,280 Speaker 1: to occur. And I think that's what's happening with Moldbug's writing. 308 00:17:22,920 --> 00:17:28,600 Speaker 2: Yeah, it's interesting that the, you're saying Tunney, Tunney is 309 00:17:28,920 --> 00:17:32,320 Speaker 2: her name, right? Yes, Justine Tunney. And Tunney was sort 310 00:17:32,359 --> 00:17:35,760 Speaker 2: of brought in, even just talking sort of more generally 311 00:17:35,800 --> 00:17:41,199 Speaker 2: about radicalization, that it's often the product of a community. 312 00:17:42,040 --> 00:17:46,399 Speaker 2: What about the, what about the Yarvins? Like, do you 313 00:17:46,440 --> 00:17:49,000 Speaker 2: think that that was, are they the outlier, the kind 314 00:17:49,040 --> 00:17:53,359 Speaker 2: of like the, well, no, special beacons of, or, or 315 00:17:53,359 --> 00:17:58,240 Speaker 2: did he come from that kind of community, you know, 316 00:17:58,359 --> 00:17:59,080 Speaker 2: from his youth. 317 00:17:59,720 --> 00:18:01,920 Speaker 1: I think that's a great question. But I do think 318 00:18:02,000 --> 00:18:04,160 Speaker 1: if you look at kind of his background, you see 319 00:18:04,160 --> 00:18:07,399 Speaker 1: the community he was radicalized in.
Right when he starts 320 00:18:07,400 --> 00:18:09,879 Speaker 1: working for the tech industry, when, when he's, you know, 321 00:18:10,240 --> 00:18:12,439 Speaker 1: on the internet in the mid nineties, he starts talking 322 00:18:12,480 --> 00:18:15,879 Speaker 1: to these guys who are these big Austrian school advocates, 323 00:18:15,920 --> 00:18:19,320 Speaker 1: who, you know, he admires. These are generally men 324 00:18:19,359 --> 00:18:21,040 Speaker 1: who are a bit older than him, who are more 325 00:18:21,040 --> 00:18:23,320 Speaker 1: accomplished in their careers, and they're telling him, oh, you 326 00:18:23,359 --> 00:18:26,879 Speaker 1: should read, you know, Hans-Hermann Hoppe, you should read Ludwig 327 00:18:26,920 --> 00:18:29,560 Speaker 1: von Mises, you should read Thomas Carlyle, you should read 328 00:18:30,200 --> 00:18:34,240 Speaker 1: Murray Rothbard, right. And he reads these guys and he 329 00:18:34,400 --> 00:18:37,160 Speaker 1: takes their thinking seriously, in part because people he respects 330 00:18:37,160 --> 00:18:39,879 Speaker 1: and thinks are cool are telling him to, right.
So 331 00:18:40,080 --> 00:18:43,400 Speaker 1: I do think it's a version of the same process, right, 332 00:18:43,560 --> 00:18:47,800 Speaker 1: in part, because that's just how human beings adopt ideas, right? 333 00:18:48,320 --> 00:18:50,920 Speaker 1: Like, you know, I know people 334 00:18:50,960 --> 00:18:57,160 Speaker 1: who are like evangelical Christians, right? And there's a couple 335 00:18:57,240 --> 00:19:00,639 Speaker 1: of different attitudes towards how you should evangelize. But the 336 00:19:00,680 --> 00:19:03,080 Speaker 1: one that people I think are generally of good will 337 00:19:03,080 --> 00:19:05,800 Speaker 1: have is that like, well, if you're a really good 338 00:19:05,880 --> 00:19:09,720 Speaker 1: and admirable person, people will be interested 339 00:19:09,760 --> 00:19:11,480 Speaker 1: in like what you believe. And that's the best way 340 00:19:11,480 --> 00:19:14,520 Speaker 1: to proselytize, right, by just actually like being kind of 341 00:19:14,600 --> 00:19:17,760 Speaker 1: rad. And that's how I've found, like, when I got 342 00:19:17,760 --> 00:19:20,399 Speaker 1: into radical politics, it was because I ran into a 343 00:19:20,480 --> 00:19:23,439 Speaker 1: bunch of like anarchists who were on a regular basis 344 00:19:23,440 --> 00:19:25,639 Speaker 1: going out and feeding homeless people, and I thought that 345 00:19:25,760 --> 00:19:27,760 Speaker 1: was dope, and that got me interested in what other 346 00:19:27,840 --> 00:19:32,119 Speaker 1: things do these people believe, because I thought they were cool. Right? Like, 347 00:19:32,240 --> 00:19:34,560 Speaker 1: that is, I think, just like mostly how 348 00:19:34,600 --> 00:19:37,159 Speaker 1: people work. Right.
But it's in the interest of a 349 00:19:37,160 --> 00:19:41,160 Speaker 1: guy like Yarvin to make it feel like his writing 350 00:19:41,240 --> 00:19:44,320 Speaker 1: has this like Lovecraftian power to enrapture and 351 00:19:44,359 --> 00:19:49,720 Speaker 1: warp minds. Anyway, cool people, and the world's all 352 00:19:49,800 --> 00:19:54,960 Speaker 1: just high school, right? Like, oh, so Tunney is, you 353 00:19:55,000 --> 00:19:57,600 Speaker 1: know, a good example of these kind of early adherents 354 00:19:57,680 --> 00:20:01,720 Speaker 1: to Moldbug, who are mostly young, disaffected engineers, software engineers, 355 00:20:01,760 --> 00:20:06,480 Speaker 1: tech industry people. And with Yarvin, part of 356 00:20:06,560 --> 00:20:09,199 Speaker 1: where this sort of dark Enlightenment turn takes off is 357 00:20:09,240 --> 00:20:13,440 Speaker 1: that in around twenty thirteen, twenty fourteen, when mainstream news 358 00:20:13,480 --> 00:20:17,160 Speaker 1: starts writing about him, the most attractive angle to take 359 00:20:17,200 --> 00:20:20,359 Speaker 1: with this guy is not, here is a trend in 360 00:20:20,400 --> 00:20:24,760 Speaker 1: a certain subset of like Silicon Valley tech people, let's 361 00:20:24,760 --> 00:20:26,840 Speaker 1: look at the reasons why this trend might be popular. 362 00:20:26,920 --> 00:20:30,919 Speaker 1: It's, this is this dark enlightenment guy whose work has this like 363 00:20:31,080 --> 00:20:35,560 Speaker 1: dangerous rapture.
Like the media helps create the myth of Yarvin, 364 00:20:35,560 --> 00:20:37,600 Speaker 1: and he manipulates it effectively. And I think this is 365 00:20:37,640 --> 00:20:41,040 Speaker 1: part of why, you know, this reinforces his beliefs to 366 00:20:41,080 --> 00:20:45,440 Speaker 1: an extent about like how bad the cathedral is, these 367 00:20:45,520 --> 00:20:48,320 Speaker 1: like media organizations, that like, wow, it's so easy for 368 00:20:48,359 --> 00:20:53,240 Speaker 1: me to manipulate them. Right? So yeah, anyway, that's kind 369 00:20:53,240 --> 00:20:56,640 Speaker 1: of what's happening in the late aughts with this guy. 370 00:20:57,480 --> 00:20:59,840 Speaker 1: So one of the people who is going to come 371 00:21:00,080 --> 00:21:04,800 Speaker 1: to be sort of a follower of him in this 372 00:21:05,440 --> 00:21:08,440 Speaker 1: kind of later period, after he starts to achieve more 373 00:21:08,480 --> 00:21:14,240 Speaker 1: prominence, is J.D. Vance. And it's because Moldbug's writing is 374 00:21:14,320 --> 00:21:16,960 Speaker 1: really appealing to these people who got a lot of 375 00:21:17,000 --> 00:21:20,400 Speaker 1: money and power very quickly, often because they were either 376 00:21:20,440 --> 00:21:23,080 Speaker 1: in like venture capital or in, you know, 377 00:21:23,119 --> 00:21:25,320 Speaker 1: the tech industry or something. And so they have all 378 00:21:25,320 --> 00:21:29,119 Speaker 1: this wealth, but they 379 00:21:29,119 --> 00:21:31,320 Speaker 1: don't come out of academia. They don't have any sort 380 00:21:31,359 --> 00:21:36,280 Speaker 1: of like place in the traditional like media hierarchy that 381 00:21:36,600 --> 00:21:39,560 Speaker 1: demands respect.
Right, you may have a lot of money, 382 00:21:39,560 --> 00:21:41,719 Speaker 1: but like, why should I care what some software engineer 383 00:21:41,720 --> 00:21:45,000 Speaker 1: at Google says about politics or whatever? Right, why should 384 00:21:45,040 --> 00:21:47,280 Speaker 1: I care about what some like finance dude who was 385 00:21:47,320 --> 00:21:49,480 Speaker 1: really good at like gambling on the housing market has 386 00:21:49,520 --> 00:21:51,359 Speaker 1: to say about politics just because you have a bunch 387 00:21:51,400 --> 00:21:54,760 Speaker 1: of money? And that's where Vance comes out of. You know, 388 00:21:54,800 --> 00:21:56,879 Speaker 1: he's one of these guys who's got like a background 389 00:21:56,960 --> 00:21:59,120 Speaker 1: in private equity, and he's one of these guys who 390 00:21:59,600 --> 00:22:03,280 Speaker 1: is kind of like angry that that doesn't automatically afford 391 00:22:03,359 --> 00:22:05,920 Speaker 1: him the respect of like what he considers to be 392 00:22:05,960 --> 00:22:10,479 Speaker 1: an elite in society. Another one of the proponents of 393 00:22:10,840 --> 00:22:14,680 Speaker 1: Yarvin's thinking, particularly this retire all government employees thing, 394 00:22:14,880 --> 00:22:18,960 Speaker 1: is Blake Masters, who's a twice failed Arizona congressional candidate 395 00:22:19,320 --> 00:22:22,680 Speaker 1: and another Peter Thiel protege, right. In an interview with 396 00:22:22,800 --> 00:22:25,800 Speaker 1: Vanity Fair before the first of his failed campaigns, Masters 397 00:22:25,880 --> 00:22:29,320 Speaker 1: was asked how he would drain the swamp in 398 00:22:29,400 --> 00:22:32,919 Speaker 1: practice if he was like brought into Congress, and he responded, 399 00:22:33,119 --> 00:22:35,840 Speaker 1: one of my friends has this acronym he calls RAGE, 400 00:22:36,119 --> 00:22:39,680 Speaker 1: Retire All Government Employees. And he was referring to Yarvin.
401 00:22:39,680 --> 00:22:42,480 Speaker 1: He and Yarvin are friends, just like Vance and Yarvin, 402 00:22:42,520 --> 00:22:44,240 Speaker 1: you know, go to a lot of the same parties 403 00:22:44,640 --> 00:22:47,480 Speaker 1: and whatnot. Like these are all guys who are not 404 00:22:47,640 --> 00:22:51,359 Speaker 1: just close ideologically, but like are in the same physical 405 00:22:51,440 --> 00:22:54,080 Speaker 1: spaces a lot of the time talking with each other. 406 00:22:54,320 --> 00:22:58,960 Speaker 2: That's an important, yeah, aspect to this. That's, that's new. 407 00:22:59,320 --> 00:23:02,280 Speaker 2: So this isn't just a kind of like, Hey, 408 00:23:02,359 --> 00:23:05,840 Speaker 2: I read your stuff, or I read this guy's stuff, 409 00:23:05,880 --> 00:23:09,359 Speaker 2: and I think it's compelling and I'd like to crib 410 00:23:09,440 --> 00:23:15,360 Speaker 2: some ideas from this. It's also that they're 411 00:23:15,400 --> 00:23:19,920 Speaker 2: hanging out. Yes. And that really starts, we'll 412 00:23:19,920 --> 00:23:22,879 Speaker 2: talk about this in a little bit, but once Yarvin 413 00:23:22,960 --> 00:23:26,240 Speaker 2: gets kind of doxxed, right. So in twenty fourteen, 414 00:23:26,280 --> 00:23:29,160 Speaker 2: I think it is, TechCrunch publishes what his actual 415 00:23:29,280 --> 00:23:32,960 Speaker 2: name is, and there's this period of 416 00:23:33,040 --> 00:23:35,959 Speaker 2: time where there's some backlash against him, but mostly what 417 00:23:36,040 --> 00:23:37,680 Speaker 2: it does is kind of elevate him. 418 00:23:37,720 --> 00:23:40,560 Speaker 1: Like, he's not hiding behind a pseudonym anymore, 419 00:23:40,960 --> 00:23:43,359 Speaker 1: but now he's free to be a public figure in 420 00:23:43,400 --> 00:23:45,639 Speaker 1: more of a way.
And so he starts publicly showing 421 00:23:45,720 --> 00:23:48,480 Speaker 1: up at these parties and events where guys like J.D. 422 00:23:48,640 --> 00:23:52,359 Speaker 1: Vance and Blake Masters are in attendance, where like people 423 00:23:52,359 --> 00:23:57,040 Speaker 1: who are making inroads into the actual like political strata 424 00:23:57,280 --> 00:23:59,399 Speaker 1: of the right are in attendance. And that's a thing 425 00:23:59,400 --> 00:24:02,359 Speaker 1: that's increasingly happening, like once we kind of hit the 426 00:24:02,359 --> 00:24:06,720 Speaker 1: Trump era, and yeah, it's, it's kind of important. The 427 00:24:06,760 --> 00:24:10,560 Speaker 1: best example of the degree to which Yarvin's thoughts and 428 00:24:10,640 --> 00:24:15,080 Speaker 1: policies have entered mainstream politics is RAGE, this idea 429 00:24:15,080 --> 00:24:17,680 Speaker 1: that we need to bring in a guy who's going 430 00:24:17,720 --> 00:24:20,080 Speaker 1: to fire everyone in the government and replace them with 431 00:24:20,160 --> 00:24:23,119 Speaker 1: our people. This is the kind of thing that, like, 432 00:24:23,359 --> 00:24:25,520 Speaker 1: not only is this something that guys like Masters and 433 00:24:25,800 --> 00:24:28,960 Speaker 1: Vance are talking about, Trump tried to do this already 434 00:24:29,040 --> 00:24:32,000 Speaker 1: in twenty twenty at the end of his term. He 435 00:24:32,080 --> 00:24:36,399 Speaker 1: sought to reclassify thousands and thousands of federal employees in 436 00:24:36,520 --> 00:24:39,120 Speaker 1: order to strip them of job protections and make them 437 00:24:39,160 --> 00:24:42,800 Speaker 1: at-will employees that he could fire.
This was reversed 438 00:24:43,400 --> 00:24:45,159 Speaker 1: under Biden, but it is the kind of thing that 439 00:24:45,240 --> 00:24:47,880 Speaker 1: Trump could bring back and plans to bring back if 440 00:24:47,880 --> 00:24:50,480 Speaker 1: he wins again, because he has promised that if he 441 00:24:50,520 --> 00:24:52,800 Speaker 1: takes office again in twenty twenty five, one of the 442 00:24:52,840 --> 00:24:54,920 Speaker 1: first things he's going to do is fire thousands of 443 00:24:54,960 --> 00:25:01,200 Speaker 1: quote unquote crooked government employees. Trump has repeatedly disavowed Project 444 00:25:01,200 --> 00:25:03,720 Speaker 1: twenty twenty five because it's become kind of a toxic 445 00:25:03,800 --> 00:25:07,680 Speaker 1: thing politically for him. But he has promised that when 446 00:25:07,720 --> 00:25:09,639 Speaker 1: he takes office he's going to, like, you know, 447 00:25:09,760 --> 00:25:12,000 Speaker 1: clean house within the federal government and put his own 448 00:25:12,040 --> 00:25:14,640 Speaker 1: people in there. And this is very in line with 449 00:25:14,680 --> 00:25:18,360 Speaker 1: what Heritage Foundation president Kevin Roberts told The New York 450 00:25:18,359 --> 00:25:21,800 Speaker 1: Times he hoped a second Trump term would bring, quote, 451 00:25:21,800 --> 00:25:24,600 Speaker 1: People will lose their jobs. Hopefully their lives are able 452 00:25:24,640 --> 00:25:27,040 Speaker 1: to flourish in spite of that. Buildings will be shut down. 453 00:25:27,119 --> 00:25:30,600 Speaker 1: Hopefully they can be repurposed for private industry. And 454 00:25:30,680 --> 00:25:32,879 Speaker 1: Kevin Roberts is not a guy who's ever said Curtis 455 00:25:32,960 --> 00:25:37,760 Speaker 1: Yarvin's name that I can find. But that's Mencius Moldbug, right? Like, 456 00:25:37,800 --> 00:25:40,600 Speaker 1: that's exactly what he has been talking about for years.
457 00:25:41,480 --> 00:25:45,440 Speaker 2: Would you draw a direct line from Mencius to these positions? 458 00:25:45,480 --> 00:25:47,840 Speaker 2: I mean, and I know you do. But do you 459 00:25:48,760 --> 00:25:51,880 Speaker 2: feel very, very confident that Moldbug is the 460 00:25:51,920 --> 00:25:54,439 Speaker 2: source of these ideas, or was this sort of a 461 00:25:54,520 --> 00:25:59,520 Speaker 2: general sentiment that Mencius was reflecting among like a certain set? 462 00:26:00,280 --> 00:26:03,360 Speaker 1: I think it's best to think about it like a garden, right? 463 00:26:03,640 --> 00:26:09,480 Speaker 1: And what Mencius did was kind of spread fertilizer over 464 00:26:09,520 --> 00:26:12,159 Speaker 1: that garden. That made a climate where a lot of 465 00:26:12,200 --> 00:26:15,080 Speaker 1: these ideas could grow, many of which do come kind of directly 466 00:26:15,240 --> 00:26:20,000 Speaker 1: from his writing. But he also helped kind of set 467 00:26:20,040 --> 00:26:23,439 Speaker 1: the ideological climate on the far right. And as the 468 00:26:23,480 --> 00:26:27,399 Speaker 1: far right took over kind of the center right Republican Party, 469 00:26:27,880 --> 00:26:30,240 Speaker 1: it brought a lot of these Moldbuggian ideas 470 00:26:30,280 --> 00:26:32,400 Speaker 1: with it, which is part of why guys like Peter 471 00:26:32,520 --> 00:26:36,359 Speaker 1: Thiel, who are anti-democratic activists and use a lot of 472 00:26:36,400 --> 00:26:40,520 Speaker 1: their money to that end, have funded Yarvin.
They 473 00:26:40,640 --> 00:26:43,280 Speaker 1: see him as useful for that sort of thing, right? 474 00:26:43,840 --> 00:26:46,320 Speaker 1: And a lot of these guys who have gone on 475 00:26:46,359 --> 00:26:48,720 Speaker 1: to work at the Heritage Foundation, you know, in the 476 00:26:48,760 --> 00:26:52,360 Speaker 1: early two thousands, in two thousand and eight, nine, ten, eleven, 477 00:26:52,800 --> 00:26:55,919 Speaker 1: were like kids in high school and college passing around 478 00:26:55,920 --> 00:26:59,359 Speaker 1: Mencius Moldbug tracts, right? Like, that's kind of how this 479 00:26:59,480 --> 00:27:02,159 Speaker 1: has worked. So I would see it as, he really 480 00:27:02,320 --> 00:27:05,959 Speaker 1: prepared an environment for these kind of politics to grow. 481 00:27:07,440 --> 00:27:10,439 Speaker 1: And I think there is a very direct line between 482 00:27:10,480 --> 00:27:12,080 Speaker 1: him and a lot of what you see with Project 483 00:27:12,119 --> 00:27:13,960 Speaker 1: twenty twenty five, a lot of the stuff. Even though 484 00:27:14,200 --> 00:27:16,720 Speaker 1: Trump certainly has never read one of these guys' articles, 485 00:27:16,880 --> 00:27:20,600 Speaker 1: he's surrounded by people who are telling him, and who 486 00:27:20,600 --> 00:27:23,480 Speaker 1: have ideas about how absolute power can be attained, that 487 00:27:23,520 --> 00:27:27,320 Speaker 1: are based on things Moldbug has written. You know, I 488 00:27:27,359 --> 00:27:31,960 Speaker 1: think that that's absolutely something I'm confident arguing, in 489 00:27:32,000 --> 00:27:35,360 Speaker 1: part because there's so many financial ties and direct personal 490 00:27:35,400 --> 00:27:39,439 Speaker 1: ties between him and people around Trump, you know. So 491 00:27:40,240 --> 00:27:43,359 Speaker 1: I know that's dark.
So let's just go to ads 492 00:27:43,400 --> 00:27:44,960 Speaker 1: and try not to think about it for a second, 493 00:27:45,080 --> 00:27:53,000 Speaker 1: think about these products for a minute. And we're back. 494 00:27:54,040 --> 00:27:57,479 Speaker 1: So one of the things I find interesting about Yarvin 495 00:27:57,600 --> 00:28:00,760 Speaker 1: is there's this degree of insecurity to some of his 496 00:28:00,840 --> 00:28:05,440 Speaker 1: writing, where he definitely is a monarchist, and he's definitely 497 00:28:05,440 --> 00:28:08,159 Speaker 1: someone who feels a sense of nostalgia for some of 498 00:28:08,160 --> 00:28:11,440 Speaker 1: these old absolute monarchies. But he doesn't want to get 499 00:28:11,480 --> 00:28:15,480 Speaker 1: lumped in with the guys who are like unironically stans 500 00:28:16,160 --> 00:28:18,400 Speaker 1: of the czar or whatever and think that, like, oh, 501 00:28:18,440 --> 00:28:21,119 Speaker 1: if only we could bring the Romanovs back to power 502 00:28:21,160 --> 00:28:25,000 Speaker 1: in Russia, everything would be better. Yarvin isn't a 503 00:28:25,000 --> 00:28:27,520 Speaker 1: monarchist for shallow reasons, right. He doesn't want to be 504 00:28:27,600 --> 00:28:30,879 Speaker 1: seen as someone who advocates this because he's nostalgic. He 505 00:28:30,920 --> 00:28:32,520 Speaker 1: wants to be seen as someone who has run the 506 00:28:32,600 --> 00:28:36,440 Speaker 1: numbers and concluded that there's no alternative to this system. 507 00:28:37,040 --> 00:28:39,600 Speaker 1: And so whenever he makes his arguments for why things 508 00:28:39,640 --> 00:28:44,080 Speaker 1: should be this way, he liberally sprinkles them with citations.
Now, 509 00:28:44,120 --> 00:28:47,680 Speaker 1: his citations are a bunch of old reactionaries who had 510 00:28:47,720 --> 00:28:51,200 Speaker 1: like argued against kings and emperors giving up any power 511 00:28:51,280 --> 00:28:54,960 Speaker 1: back in like the eighteen fifties and the like. But 512 00:28:55,720 --> 00:28:58,520 Speaker 1: he has this kind of belief that any sort of 513 00:28:58,600 --> 00:29:02,880 Speaker 1: quote from an old, dead guy is a primary source, 514 00:29:02,920 --> 00:29:06,760 Speaker 1: that is something like people should take more seriously when 515 00:29:06,800 --> 00:29:09,080 Speaker 1: they're like trying to make their minds up about how 516 00:29:09,120 --> 00:29:11,160 Speaker 1: the world works. He has this kind of, he's an 517 00:29:11,200 --> 00:29:14,440 Speaker 1: autodidact, in that he's someone who is self-taught 518 00:29:14,680 --> 00:29:17,520 Speaker 1: through reading a bunch of books about the things he likes. 519 00:29:18,000 --> 00:29:20,800 Speaker 1: And he's convinced that this is sort of like a 520 00:29:20,880 --> 00:29:24,400 Speaker 1: more ideologically rigorous thing than what people do in academia, right? 521 00:29:24,720 --> 00:29:26,800 Speaker 1: And kind of the key thing in academia is that 522 00:29:27,080 --> 00:29:30,120 Speaker 1: if you have ideas or theories, or if you're making arguments, 523 00:29:30,360 --> 00:29:33,280 Speaker 1: you have to expose them to other people and debate 524 00:29:33,360 --> 00:29:36,600 Speaker 1: and have them like torn down. And you know, that's 525 00:29:36,760 --> 00:29:39,800 Speaker 1: like a key part of the way the academic process works.
526 00:29:40,280 --> 00:29:42,920 Speaker 1: He's only ever existed inside his own head, and so 527 00:29:43,040 --> 00:29:45,240 Speaker 1: you get stuff like this two thousand and eight blog 528 00:29:45,280 --> 00:29:50,160 Speaker 1: post titled An Open Letter to Open-Minded Progressives, where 529 00:29:50,200 --> 00:29:54,120 Speaker 1: he writes this about primary sources: The neat thing about 530 00:29:54,120 --> 00:29:57,120 Speaker 1: primary sources is that often it only takes one 531 00:29:57,200 --> 00:29:59,240 Speaker 1: to prove your point. If you find the theory of 532 00:29:59,280 --> 00:30:03,440 Speaker 1: relativity in ancient Greek documents, and you know the documents are authentic, 533 00:30:03,760 --> 00:30:07,160 Speaker 1: you know that the ancient Greeks discovered relativity. How, why, 534 00:30:07,240 --> 00:30:10,240 Speaker 1: it doesn't matter. Your understanding of ancient Greece needs to 535 00:30:10,280 --> 00:30:15,360 Speaker 1: include Greek relativity. Now, that's a non sequitur, because, like, 536 00:30:15,640 --> 00:30:18,920 Speaker 1: if you were to find that an ancient Greek wrote 537 00:30:18,960 --> 00:30:23,000 Speaker 1: about the theory of relativity, that alone wouldn't necessarily alter 538 00:30:23,160 --> 00:30:25,400 Speaker 1: your understanding of ancient Greece, because there's a lot of 539 00:30:25,400 --> 00:30:28,560 Speaker 1: other questions, like did anyone else come across this writing 540 00:30:28,560 --> 00:30:32,120 Speaker 1: at the time, was this idea disseminated, was it adopted 541 00:30:32,160 --> 00:30:34,880 Speaker 1: on any kind of scale into the dominant theoretical models 542 00:30:34,880 --> 00:30:37,200 Speaker 1: of the time, or was it one crank's weird belief 543 00:30:37,200 --> 00:30:39,160 Speaker 1: that he laid out in an old letter to a friend.
Right, 544 00:30:39,480 --> 00:30:41,920 Speaker 1: the fact that you might find something that sounds like 545 00:30:42,040 --> 00:30:45,640 Speaker 1: an argument about relativity in an ancient Greek document 546 00:30:45,880 --> 00:30:49,800 Speaker 1: doesn't necessarily change your understanding of ancient Greece in a 547 00:30:49,840 --> 00:30:52,280 Speaker 1: meaningful way. Right? The fact that you could cite that 548 00:30:52,440 --> 00:30:55,480 Speaker 1: primary source wouldn't be enough to say the ancient Greeks 549 00:30:55,560 --> 00:30:57,960 Speaker 1: had an understanding of relativity, because you don't know that 550 00:30:58,080 --> 00:31:00,560 Speaker 1: it wasn't just one guy who was viewed widely as 551 00:31:00,600 --> 00:31:03,880 Speaker 1: a crank, right? And there's a good example of this kind 552 00:31:03,920 --> 00:31:06,360 Speaker 1: of thing from real world history, because he's just making 553 00:31:06,400 --> 00:31:09,520 Speaker 1: up fake history there. Because again there's not really a 554 00:31:09,560 --> 00:31:10,520 Speaker 1: good argument 555 00:31:10,200 --> 00:31:10,680 Speaker 2: To be made. 556 00:31:11,440 --> 00:31:14,120 Speaker 1: But I will make an interesting argument, and it's about 557 00:31:14,120 --> 00:31:18,520 Speaker 1: something called the aeolipile, which was an ancient steam 558 00:31:18,560 --> 00:31:21,760 Speaker 1: engine that was first described by Hero of Alexandria in 559 00:31:21,840 --> 00:31:25,680 Speaker 1: the first century AD. It was technically a steam turbine, 560 00:31:25,720 --> 00:31:28,200 Speaker 1: but it was a precursor to a steam engine, and it 561 00:31:28,320 --> 00:31:32,400 Speaker 1: existed in ancient Rome. Right? Now, that's a cool bit 562 00:31:32,440 --> 00:31:35,160 Speaker 1: of history. But does that mean that the Romans had 563 00:31:35,160 --> 00:31:38,479 Speaker 1: steam engines and the ability to make trains?
No, because 564 00:31:38,480 --> 00:31:40,560 Speaker 1: this was only ever used as like a party trick. 565 00:31:40,680 --> 00:31:42,920 Speaker 1: Like a couple of prototypes were made and they were 566 00:31:42,960 --> 00:31:46,200 Speaker 1: like made to impress people at like gatherings for rich people. 567 00:31:46,520 --> 00:31:49,080 Speaker 1: No one ever did anything with it. So the fact 568 00:31:49,120 --> 00:31:52,280 Speaker 1: that technically there was the knowledge to make a steam 569 00:31:52,280 --> 00:31:56,000 Speaker 1: engine in ancient Rome doesn't change your understanding of ancient 570 00:31:56,120 --> 00:31:58,640 Speaker 1: Rome, because they still didn't have steam power. 571 00:31:58,920 --> 00:32:05,600 Speaker 2: Right. Yeah, it's like somebody showing off their Aston Martin. Right, 572 00:32:05,720 --> 00:32:08,840 Speaker 2: not everybody had Aston Martins or has Aston Martins. Yeah, some 573 00:32:08,960 --> 00:32:10,600 Speaker 2: rich guy is like, check it out. 574 00:32:11,400 --> 00:32:13,320 Speaker 1: Yeah. And then if you were to say that, like, well, 575 00:32:13,360 --> 00:32:15,560 Speaker 1: this meant that the Aston Martin was the car 576 00:32:15,600 --> 00:32:18,080 Speaker 1: of the twenty-first century. Like, no, like 577 00:32:18,160 --> 00:32:20,640 Speaker 1: a couple hundred people had them or whatever, you know. 578 00:32:20,560 --> 00:32:22,560 Speaker 2: I would say, I mean, I would say that 579 00:32:23,400 --> 00:32:26,600 Speaker 2: he has a point, that depending on the assertion by 580 00:32:26,600 --> 00:32:31,080 Speaker 2: a primary source, sure, it can have like pretty powerful 581 00:32:31,560 --> 00:32:33,600 Speaker 2: value in an historical context. 582 00:32:34,120 --> 00:32:37,640 Speaker 1: Oh absolutely. But you also, you have to take into account.
583 00:32:38,000 --> 00:32:39,800 Speaker 1: I think part of it is that he's talking 584 00:32:39,800 --> 00:32:42,800 Speaker 1: about this in like internet debate terms, where like someone 585 00:32:42,840 --> 00:32:44,920 Speaker 1: makes an argument and you throw in a quote from 586 00:32:44,920 --> 00:32:48,160 Speaker 1: a source, and like it's probably done, because most people 587 00:32:48,200 --> 00:32:50,280 Speaker 1: don't have the time or the breadth of knowledge to 588 00:32:50,320 --> 00:32:53,440 Speaker 1: really argue these things in detail. As opposed to like 589 00:32:53,760 --> 00:32:56,400 Speaker 1: in academia, part of what you would be trying to 590 00:32:56,400 --> 00:32:59,120 Speaker 1: do is like not just here's what one source said 591 00:32:59,160 --> 00:32:59,800 Speaker 1: about this. 592 00:32:59,720 --> 00:33:01,440 Speaker 2: You want it to be irrefutable. 593 00:33:02,000 --> 00:33:04,920 Speaker 1: Yeah, here's the balance of sources. Like we can do 594 00:33:04,960 --> 00:33:06,959 Speaker 1: a survey of all of these different people from the 595 00:33:06,960 --> 00:33:10,120 Speaker 1: time writing about this moment. And you're gonna 596 00:33:10,160 --> 00:33:12,960 Speaker 1: find conflict, right? Because, you know, if you have like 597 00:33:13,080 --> 00:33:16,320 Speaker 1: ten people who are all present at the same shooting 598 00:33:16,440 --> 00:33:19,240 Speaker 1: or car crash or whatever, you're gonna get ten slightly 599 00:33:19,240 --> 00:33:23,000 Speaker 1: different accounts, right? Those are all primary sources. A primary 600 00:33:23,040 --> 00:33:25,680 Speaker 1: source does not mean something is right. It just means 601 00:33:25,760 --> 00:33:30,200 Speaker 1: it's from someone who was there, you know. But that's 602 00:33:30,200 --> 00:33:32,800 Speaker 1: not how Moldbug thinks about it, because he very much 603 00:33:32,840 --> 00:33:37,280 Speaker 1: has this.
He thinks about these old dead reactionary 604 00:33:37,320 --> 00:33:39,920 Speaker 1: writers the way he wants to be thought about, right, 605 00:33:39,960 --> 00:33:42,800 Speaker 1: which is as someone who is like more or less 606 00:33:42,840 --> 00:33:47,560 Speaker 1: ideologically unimpeachable, right, because they're his favorite writers. And so 607 00:33:47,720 --> 00:33:50,040 Speaker 1: I think it behooves us to talk about probably his 608 00:33:50,200 --> 00:33:53,400 Speaker 1: favorite of these old dead reactionaries, because it tells us 609 00:33:53,400 --> 00:33:56,880 Speaker 1: a lot about some of the things that Yarvin believes. 610 00:33:56,880 --> 00:33:58,600 Speaker 1: And this is someone we talked 611 00:33:58,600 --> 00:34:03,520 Speaker 1: about in part one, Thomas Carlyle. Now Carlyle is writing 612 00:34:03,600 --> 00:34:09,399 Speaker 1: in the mid eighteen hundreds. He's a Scottish writer who, 613 00:34:09,640 --> 00:34:11,080 Speaker 1: you know, was 614 00:34:11,120 --> 00:34:13,200 Speaker 1: an early writer kind of talking about the plight 615 00:34:13,280 --> 00:34:17,880 Speaker 1: of the working class under industrialization. So there's a degree 616 00:34:17,880 --> 00:34:19,759 Speaker 1: of what he was writing about that I think was 617 00:34:20,239 --> 00:34:23,560 Speaker 1: fairly valid. But he was also a massive bigot, and 618 00:34:23,640 --> 00:34:25,279 Speaker 1: for kind of an overview of that, I found a 619 00:34:25,320 --> 00:34:28,400 Speaker 1: write-up by the Glasgow Museum of Slavery aptly titled 620 00:34:28,480 --> 00:34:34,760 Speaker 1: Thomas Carlyle: Historian, Writer, Racist, that describes an essay Carlyle wrote. Quote.
621 00:34:35,160 --> 00:34:38,440 Speaker 1: Carlyle complained that emancipated black people in the West Indies 622 00:34:38,480 --> 00:34:41,239 Speaker 1: were lazy, working little, eating well, benefiting from the 623 00:34:41,239 --> 00:34:45,000 Speaker 1: favorable climate and abundance of tropical fruit, while sugarcane on 624 00:34:45,080 --> 00:34:48,040 Speaker 1: British plantations rotted due to lack of labor. In the 625 00:34:48,080 --> 00:34:50,759 Speaker 1: context of recession and unemployment back in Britain and the 626 00:34:50,760 --> 00:34:53,920 Speaker 1: potato famines in Ireland, this was an emotive accusation. 627 00:34:54,200 --> 00:34:57,200 Speaker 1: It was also blatantly untrue, according to figures produced by 628 00:34:57,200 --> 00:34:59,799 Speaker 1: the Anti-Slavery Society in eighteen forty-seven to 629 00:34:59,840 --> 00:35:02,479 Speaker 1: forty-nine, which showed that sugar production had in fact 630 00:35:02,560 --> 00:35:05,680 Speaker 1: gone up. However, the emancipation of slaves in the British 631 00:35:05,719 --> 00:35:08,360 Speaker 1: Empire in eighteen thirty-three had created a labor problem 632 00:35:08,360 --> 00:35:11,840 Speaker 1: that had cut into profit margins, exacerbated in eighteen 633 00:35:11,880 --> 00:35:14,719 Speaker 1: forty-six when the Sugar Duties Act ended subsidies for 634 00:35:14,760 --> 00:35:18,560 Speaker 1: British plantation owners. And this is the kind of thing.
635 00:35:18,600 --> 00:35:23,399 Speaker 1: Because the Glasgow Museum of Slavery is trying to view 636 00:35:23,400 --> 00:35:27,279 Speaker 1: things from an academic and an accurate standpoint, they're able 637 00:35:27,280 --> 00:35:30,000 Speaker 1: to look at what Carlyle said, and what he said 638 00:35:30,040 --> 00:35:32,960 Speaker 1: that was like demonstrably not just racist, but, we 639 00:35:33,040 --> 00:35:36,920 Speaker 1: can argue, completely factually wrong about like the economics of 640 00:35:37,000 --> 00:35:40,160 Speaker 1: the system that he was arguing in favor of. Moldbug 641 00:35:40,239 --> 00:35:43,239 Speaker 1: can't really engage with a lot of these criticisms of 642 00:35:43,280 --> 00:35:47,120 Speaker 1: Carlyle, because the whole reason Carlyle was wrong was that 643 00:35:47,200 --> 00:35:52,400 Speaker 1: plantation owners, this natural aristocracy, were lazy, corrupt, and incompetent, 644 00:35:52,480 --> 00:35:55,200 Speaker 1: and that flies in the face of Yarvin's belief system. 645 00:35:55,520 --> 00:35:59,239 Speaker 1: So instead, when Curtis Yarvin writes an essay on Thomas Carlyle, 646 00:35:59,280 --> 00:36:03,400 Speaker 1: he writes that slavery is, quote, a natural human relationship, like 647 00:36:03,480 --> 00:36:06,920 Speaker 1: that of patron and client, and enthuses that Carlyle is 648 00:36:07,200 --> 00:36:09,760 Speaker 1: the one writer in English whose name can be uttered 649 00:36:09,760 --> 00:36:13,719 Speaker 1: with Shakespeare's. Like, this guy whose big claim to fame 650 00:36:13,800 --> 00:36:16,280 Speaker 1: is his essay on why slavery is good and natural 651 00:36:16,520 --> 00:36:20,360 Speaker 1: is just as brilliant a thinker as William Shakespeare. 652 00:36:20,960 --> 00:36:24,440 Speaker 1: Like, again, an idea of kind of how vile 653 00:36:24,200 --> 00:36:26,360 Speaker 1: this guy is. Speaker 2: How well was it written?
654 00:36:27,200 --> 00:36:33,400 Speaker 1: I mean it's okay, yeah, it's it's it's not, it's no, no, 655 00:36:33,440 --> 00:36:35,640 Speaker 1: it's it's definitely not iambic pentameter. It's not even in 656 00:36:35,719 --> 00:36:41,520 Speaker 1: dactylic hexameter. You know, Like, come on, can we back 657 00:36:41,600 --> 00:36:44,239 Speaker 1: up for one second? Was there anything he said in 658 00:36:44,280 --> 00:36:49,319 Speaker 1: this open letter to open-minded progressives that was, that 659 00:36:49,360 --> 00:36:51,080 Speaker 1: would change a progressive's mind? 660 00:36:51,600 --> 00:36:54,200 Speaker 2: Or like what was the gist? What was the logline 661 00:36:54,280 --> 00:36:57,920 Speaker 2: of that? Because you got into a quote, but I 662 00:36:58,160 --> 00:37:00,279 Speaker 2: I was just fascinated by this open letter. 663 00:37:00,840 --> 00:37:03,680 Speaker 1: Yeah, it's largely a series of arguments. It's really largely 664 00:37:03,680 --> 00:37:06,960 Speaker 1: trying to convince people that these sort of ideas of 665 00:37:07,280 --> 00:37:10,080 Speaker 1: democracy and human progress that are kind of inherent to 666 00:37:10,120 --> 00:37:14,440 Speaker 1: the progressive tradition, right, of this like gradual march of 667 00:37:14,560 --> 00:37:17,680 Speaker 1: progress in terms of like pushing for more rights 668 00:37:17,719 --> 00:37:21,879 Speaker 1: and more justice in society, is kind of fundamentally fallacious 669 00:37:21,920 --> 00:37:25,680 Speaker 1: and flawed. Right, that like none of this actually works, right, 670 00:37:25,760 --> 00:37:29,800 Speaker 1: that it only creates new tyrannies. That's his argument. 671 00:37:29,840 --> 00:37:33,480 Speaker 1: It's this whole, you advocating for your own rights is 672 00:37:33,520 --> 00:37:36,279 Speaker 1: really you restricting my rights? You know, Like that's that's 673 00:37:36,360 --> 00:37:39,840 Speaker 1: kind of the gist.
Of the argument that he's making there. Wow, 674 00:37:39,920 --> 00:37:44,160 Speaker 1: that's in fourteen chapters. Everything he says takes too long. 675 00:37:44,840 --> 00:37:48,040 Speaker 2: Also, slavery is cool, Yeah. 676 00:37:47,920 --> 00:37:52,040 Speaker 1: Yeah, slave, I mean, slavery is a natural relationship, right, 677 00:37:53,600 --> 00:37:56,120 Speaker 1: because again he believes that, like, it can't just 678 00:37:56,160 --> 00:37:58,719 Speaker 1: be that one group of people were at one time 679 00:37:58,840 --> 00:38:01,239 Speaker 1: able to exert more violence than another group of 680 00:38:01,239 --> 00:38:03,960 Speaker 1: people and so they took them into their possession, right, 681 00:38:04,360 --> 00:38:07,840 Speaker 1: because that makes it sound awful. It's got to be that, like, no, 682 00:38:07,960 --> 00:38:10,920 Speaker 1: this group of people are naturally inclined to rule and 683 00:38:11,000 --> 00:38:13,040 Speaker 1: this other group are naturally inclined. 684 00:38:12,640 --> 00:38:13,879 Speaker 2: To be ruled, to serve. 685 00:38:14,920 --> 00:38:20,120 Speaker 1: Yeah. So, when it comes to talking about Yarvin as 686 00:38:20,160 --> 00:38:23,319 Speaker 1: this sort of dark Enlightenment figure, this guy who's able 687 00:38:23,360 --> 00:38:26,799 Speaker 1: to enrapture people's minds with this almost magic quality of 688 00:38:26,840 --> 00:38:29,919 Speaker 1: his prose, I want to kind of... I think we've 689 00:38:29,920 --> 00:38:31,760 Speaker 1: talked a little bit about why I don't think that's 690 00:38:31,960 --> 00:38:35,120 Speaker 1: accurate to how his work was appealing to people.
But 691 00:38:35,160 --> 00:38:38,839 Speaker 1: I want to really puncture that myth by pointing out 692 00:38:38,840 --> 00:38:41,279 Speaker 1: one of the examples where he was very much in 693 00:38:41,360 --> 00:38:43,520 Speaker 1: line with kind of like the gutter of the right 694 00:38:43,560 --> 00:38:46,040 Speaker 1: wing and not at all like this sort of more 695 00:38:46,040 --> 00:38:49,400 Speaker 1: intellectual side that he tries to present himself as, sometimes 696 00:38:49,600 --> 00:38:53,359 Speaker 1: in articles like that open letter to progressives. In two 697 00:38:53,440 --> 00:38:55,879 Speaker 1: thousand and eight, he published a blog post titled Did 698 00:38:55,920 --> 00:38:58,880 Speaker 1: Barack Obama Go to Columbia? And this is written by 699 00:38:58,920 --> 00:39:02,080 Speaker 1: Yarvin as a serious investigation into whether or not 700 00:39:02,160 --> 00:39:05,960 Speaker 1: Barack Obama had faked his attendance at Columbia College. And 701 00:39:06,000 --> 00:39:09,440 Speaker 1: the evidence Yarvin had that Obama had faked his attendance 702 00:39:09,840 --> 00:39:12,600 Speaker 1: was that several other Columbia grads from the same year 703 00:39:12,880 --> 00:39:15,680 Speaker 1: said they didn't know him. Right, this was a huge 704 00:39:15,800 --> 00:39:18,239 Speaker 1: thing on like the right at this time. You would 705 00:39:18,280 --> 00:39:20,799 Speaker 1: get like Trump and stuff, you know, retweeting this way 706 00:39:20,840 --> 00:39:23,239 Speaker 1: back in the day, that like, well, a bunch of 707 00:39:23,280 --> 00:39:26,319 Speaker 1: guys who went to Columbia didn't know Barack Obama. Like, 708 00:39:26,440 --> 00:39:28,640 Speaker 1: this man came out of nowhere. He must have faked 709 00:39:28,640 --> 00:39:32,480 Speaker 1: his attendance at this college.
And here's what Yarvin writes 710 00:39:32,520 --> 00:39:34,600 Speaker 1: about it, so you can get a hint at his 711 00:39:34,719 --> 00:39:38,279 Speaker 1: like sparkling prose. So let me ask anyone who cares 712 00:39:38,320 --> 00:39:41,400 Speaker 1: to comment below, how exactly do we the American people, 713 00:39:41,480 --> 00:39:45,000 Speaker 1: lord help us, know that Barack Obama attended Columbia, or 714 00:39:45,040 --> 00:39:47,759 Speaker 1: more precisely, why should we assume, on the basis of 715 00:39:47,760 --> 00:39:50,680 Speaker 1: the evidence we have, that he did? Do we 716 00:39:50,760 --> 00:39:53,440 Speaker 1: seriously believe it is possible for a future president to 717 00:39:53,440 --> 00:39:58,520 Speaker 1: be unremembered at his alma mater? Now, the very next 718 00:39:58,560 --> 00:40:00,680 Speaker 1: paragraph in that article is, what we know is that 719 00:40:00,719 --> 00:40:04,840 Speaker 1: a Columbia spokesman has confirmed that Obama attended Columbia. And like, 720 00:40:05,200 --> 00:40:07,560 Speaker 1: I would say, well, that's part of how you find out, right? 721 00:40:07,600 --> 00:40:10,399 Speaker 1: You ask the school, did this guy attend, as 722 00:40:10,400 --> 00:40:13,480 Speaker 1: opposed to asking random people who happened to go to 723 00:40:13,520 --> 00:40:16,960 Speaker 1: a school with thousands of folks, did you know this guy? Like, 724 00:40:20,120 --> 00:40:23,319 Speaker 1: it's just, like, the logic there is so... So 725 00:40:24,719 --> 00:40:27,120 Speaker 1: it's very much a guy picking the reality he wants 726 00:40:27,160 --> 00:40:30,000 Speaker 1: to have, right, which is that Barack Obama is a fraud. Right, 727 00:40:30,040 --> 00:40:32,360 Speaker 1: he's a fraud and he faked his attendance at Columbia.
728 00:40:32,640 --> 00:40:35,080 Speaker 1: He wants to believe that. He finds a couple of 729 00:40:35,120 --> 00:40:37,560 Speaker 1: other Columbia grads who don't like Obama and say, well, 730 00:40:37,560 --> 00:40:39,920 Speaker 1: I never knew him, and that's the only fact that 731 00:40:40,040 --> 00:40:43,000 Speaker 1: Yarvin needs here, as opposed to, like, well, is there 732 00:40:43,000 --> 00:40:45,480 Speaker 1: any actual evidence that he attended? And it turns out 733 00:40:45,520 --> 00:40:48,239 Speaker 1: there is, and in like thirty seconds of googling, I found 734 00:40:48,280 --> 00:40:51,000 Speaker 1: an article by a writer in the Jewish Journal who 735 00:40:51,080 --> 00:40:54,160 Speaker 1: attended Columbia at the same time as Obama. And also, 736 00:40:54,719 --> 00:40:57,040 Speaker 1: like all of these guys who were giving interviews to 737 00:40:57,040 --> 00:41:00,280 Speaker 1: right-wing papers in two thousand and eight, didn't remember Barack 738 00:41:00,320 --> 00:41:03,960 Speaker 1: Obama, because it's a big school. But this guy dug 739 00:41:04,000 --> 00:41:06,840 Speaker 1: through a bunch of his old graduation papers and found 740 00:41:06,840 --> 00:41:10,840 Speaker 1: a graduation program that lists Barack Obama by name, because 741 00:41:10,840 --> 00:41:14,760 Speaker 1: he went to Columbia University. People 742 00:41:14,880 --> 00:41:17,480 Speaker 1: pointed this out at the time. There's pictures of him there, 743 00:41:17,719 --> 00:41:20,800 Speaker 1: he's on all sorts of documents. He did know people. 744 00:41:21,160 --> 00:41:24,040 Speaker 1: It was just this like fever dream that spread through 745 00:41:24,040 --> 00:41:27,080 Speaker 1: the right, the same way this myth about Haitians 746 00:41:27,120 --> 00:41:30,120 Speaker 1: eating people's pets in Springfield, Ohio is spreading right now.
747 00:41:30,160 --> 00:41:35,200 Speaker 1: It's deliberate disinformation, and it's it's disinformation that is believed 748 00:41:35,239 --> 00:41:38,279 Speaker 1: by people who need an explanation for like how a 749 00:41:38,320 --> 00:41:41,319 Speaker 1: guy like Obama, who they don't think should be able 750 00:41:41,360 --> 00:41:44,400 Speaker 1: to do the things that Obama did, was able to 751 00:41:44,440 --> 00:41:46,759 Speaker 1: do it, right. These are racists, right, who need a 752 00:41:46,800 --> 00:41:51,920 Speaker 1: reason for how Barack Obama became the president that isn't, well, 753 00:41:52,000 --> 00:41:56,480 Speaker 1: he's the most charismatic man in American politics, right, he 754 00:41:56,600 --> 00:41:59,080 Speaker 1: was really good at elections, he was really good at 755 00:41:59,160 --> 00:42:02,920 Speaker 1: running a campaign. That's not a thing that works with 756 00:42:03,000 --> 00:42:06,799 Speaker 1: their belief systems. So Yarvin has to buy into these 757 00:42:06,880 --> 00:42:10,360 Speaker 1: fantasies in order to accept this reality, because it just 758 00:42:10,440 --> 00:42:15,520 Speaker 1: doesn't correlate with his racism. And in this he's just like, 759 00:42:15,640 --> 00:42:17,759 Speaker 1: I mean, Donald Trump was doing all this shit 760 00:42:17,880 --> 00:42:20,480 Speaker 1: back at the same time, he was like a birther, right? Like, 761 00:42:20,520 --> 00:42:25,000 Speaker 1: this is not any intellectually higher up than birtherism.
But 762 00:42:25,760 --> 00:42:28,640 Speaker 1: you know, I think it's kind of important to look 763 00:42:28,680 --> 00:42:32,000 Speaker 1: at this ugly stuff, because it contrasts with this dark 764 00:42:32,120 --> 00:42:35,040 Speaker 1: Enlightenment puppet master view of Yarvin and paints a picture 765 00:42:35,080 --> 00:42:38,040 Speaker 1: of a guy who is not just writing stuff that 766 00:42:38,160 --> 00:42:41,040 Speaker 1: is influential, but is also going along with the flow 767 00:42:41,120 --> 00:42:44,799 Speaker 1: and getting caught up in conspiracy theories and bullshit the 768 00:42:44,880 --> 00:42:47,759 Speaker 1: same way anyone else in that space is, right? He's 769 00:42:47,800 --> 00:42:56,160 Speaker 1: not special and he's not a hyper genius. Yeah anyway, yeah, yeah, yeah, 770 00:42:56,320 --> 00:42:59,720 Speaker 1: that's dumb. I mean, yeah, I think it's dumb. Yeah. 771 00:43:00,640 --> 00:43:04,080 Speaker 1: I think people don't catch it because his articles are 772 00:43:04,120 --> 00:43:07,560 Speaker 1: always... he like throws in Latin quotes and like references 773 00:43:07,560 --> 00:43:09,879 Speaker 1: and quotes from like these old... You know, he's got 774 00:43:09,960 --> 00:43:14,400 Speaker 1: a great backlog of like witticisms by different historical thinkers. 775 00:43:14,800 --> 00:43:17,440 Speaker 1: I would describe his writing style as like if Frasier 776 00:43:17,480 --> 00:43:20,560 Speaker 1: were written by a fascist, right? Like that's how Curtis 777 00:43:20,640 --> 00:43:24,759 Speaker 1: Yarvin writes. But it very much is kind of just 778 00:43:24,840 --> 00:43:27,080 Speaker 1: to paper over the fact that he's the same kind 779 00:43:27,080 --> 00:43:29,160 Speaker 1: of blowhard as like Rush Limbaugh. 780 00:43:29,360 --> 00:43:33,600 Speaker 2: Right, how does this comport with the right's sort of 781 00:43:33,800 --> 00:43:35,879 Speaker 2: fetishization of the Constitution?
782 00:43:36,719 --> 00:43:40,120 Speaker 1: Well, I'll be clear, Yarvin does not fetishize the Constitution. 783 00:43:40,760 --> 00:43:43,280 Speaker 1: He views it, I think, as pretty clearly 784 00:43:43,320 --> 00:43:46,240 Speaker 1: a misstep, right, because it hands all of this power 785 00:43:46,520 --> 00:43:50,640 Speaker 1: over to groups that should not have it, or at least 786 00:43:50,680 --> 00:43:52,879 Speaker 1: that's what he believes. I think if you were to go back, 787 00:43:52,920 --> 00:43:55,880 Speaker 1: he would argue that, like, the basic ideas behind a 788 00:43:55,920 --> 00:43:57,759 Speaker 1: lot of the founders, which is that we should have 789 00:43:57,840 --> 00:44:01,680 Speaker 1: this republic that is governed by elites, this like natural 790 00:44:01,800 --> 00:44:05,880 Speaker 1: elite aristocracy, like Jefferson believed in this kind of natural 791 00:44:05,920 --> 00:44:10,920 Speaker 1: aristocracy of intelligent white men, right, that is pretty close 792 00:44:11,080 --> 00:44:14,719 Speaker 1: to what Moldbug believes about the world. But obviously, as 793 00:44:14,760 --> 00:44:18,040 Speaker 1: the franchise was extended to larger and larger chunks of people, 794 00:44:18,120 --> 00:44:21,480 Speaker 1: as slavery was ended, as we've repeatedly had these movements 795 00:44:21,480 --> 00:44:25,439 Speaker 1: towards social justice and towards bringing more people into being 796 00:44:25,440 --> 00:44:29,239 Speaker 1: able to have a voice in the system, like... 797 00:44:29,320 --> 00:44:31,560 Speaker 1: the fact that that was allowable at all was a 798 00:44:31,680 --> 00:44:34,799 Speaker 1: terrible flaw in the Constitution as written, right? The fact 799 00:44:34,800 --> 00:44:39,200 Speaker 1: that it included the potential for democratic change was kind 800 00:44:39,200 --> 00:44:41,400 Speaker 1: of its fundamental, fatal failure.
801 00:44:43,440 --> 00:44:47,080 Speaker 2: Yeah, I mean, I don't know the nature of this, 802 00:44:47,360 --> 00:44:54,200 Speaker 2: of Yarvin's relationship or friendship or whatever you want to 803 00:44:54,239 --> 00:45:00,279 Speaker 2: call it with JD Vance, but it's safe to say, 804 00:45:00,280 --> 00:45:01,840 Speaker 2: I think, a lot of people on the right would 805 00:45:01,840 --> 00:45:07,360 Speaker 2: would agree that Yarvin is is taking some seriously 806 00:45:07,560 --> 00:45:12,840 Speaker 2: un-American views here, espousing some very un-American views, 807 00:45:13,160 --> 00:45:17,839 Speaker 2: un-American just meaning, like... oh yeah. And so, and 808 00:45:17,920 --> 00:45:22,319 Speaker 2: yet we have a vice presidential candidate who was presumably at 809 00:45:22,440 --> 00:45:24,919 Speaker 2: least hobnobbing. I don't know if they're friends. I don't 810 00:45:24,960 --> 00:45:27,200 Speaker 2: know if... I don't know if JD Vance would 811 00:45:27,200 --> 00:45:28,200 Speaker 2: say Yarvin... 812 00:45:28,080 --> 00:45:31,719 Speaker 1: Associates. Yeah, yeah, I don't know, associates. 813 00:45:31,560 --> 00:45:37,560 Speaker 2: Has JD Vance been asked about Yarvin? Has he, has 814 00:45:37,600 --> 00:45:40,400 Speaker 2: he... has he had any sort of public 815 00:45:40,560 --> 00:45:41,439 Speaker 2: expression of... 816 00:45:42,280 --> 00:45:44,239 Speaker 1: Yeah, I mean, he's been... there was actually a really good...
817 00:45:44,280 --> 00:45:46,640 Speaker 1: I think it was a New Yorker article from about 818 00:45:46,680 --> 00:45:49,560 Speaker 1: two years ago, right before his Senate run, where he 819 00:45:49,640 --> 00:45:52,920 Speaker 1: was like at a conference with Yarvin and like at 820 00:45:52,960 --> 00:45:57,320 Speaker 1: a party with him, and talking, like, you know, was questioned, 821 00:45:57,440 --> 00:45:59,920 Speaker 1: you know, or the writer of that article talked with 822 00:46:00,040 --> 00:46:02,400 Speaker 1: him in part about like some of Yarvin's ideas, like 823 00:46:02,440 --> 00:46:06,160 Speaker 1: this Retire All Government Employees thing. Like he's he's he 824 00:46:06,200 --> 00:46:09,360 Speaker 1: has been like publicly associated with him. I think 825 00:46:09,760 --> 00:46:12,000 Speaker 1: one of the more interesting points is 826 00:46:12,040 --> 00:46:15,600 Speaker 1: that like Yarvin spent the twenty sixteen election at Peter 827 00:46:15,680 --> 00:46:20,320 Speaker 1: Thiel's house, like watching the returns. Like he's he's gotten 828 00:46:20,400 --> 00:46:23,799 Speaker 1: increasingly sort of like publicly plugged into this set of 829 00:46:23,800 --> 00:46:27,719 Speaker 1: people who have direct connections with power brokers on the right. 830 00:46:28,320 --> 00:46:31,000 Speaker 1: And you're right, one of the interesting things about him 831 00:46:31,040 --> 00:46:35,960 Speaker 1: is that he's not really directly... He's never become famous 832 00:46:36,000 --> 00:46:38,439 Speaker 1: in the same way that like a guy like JD 833 00:46:38,560 --> 00:46:41,600 Speaker 1: Vance has, right, because he is too toxic to like 834 00:46:41,880 --> 00:46:47,120 Speaker 1: bring out and kind of publicly embrace his ideas.
Yes, yes, 835 00:46:47,520 --> 00:46:50,160 Speaker 1: but you will get people who will talk about stuff 836 00:46:50,200 --> 00:46:52,880 Speaker 1: like, you know, Blake Masters and JD Vance are comfortable talking 837 00:46:52,920 --> 00:46:56,680 Speaker 1: about RAGE, right, this idea that really comes out of 838 00:46:56,719 --> 00:47:01,640 Speaker 1: Moldbug's writing. And so... he remains kind of 839 00:47:01,680 --> 00:47:04,279 Speaker 1: toxic enough that you don't want to bring him out 840 00:47:04,280 --> 00:47:08,600 Speaker 1: too publicly, but he's also popular enough that part of 841 00:47:08,680 --> 00:47:11,880 Speaker 1: like how you signal to other people who are on 842 00:47:12,000 --> 00:47:14,279 Speaker 1: this chunk of the right that like you're one of them 843 00:47:14,800 --> 00:47:18,399 Speaker 1: is you kind of signpost that you believe a lot 844 00:47:18,400 --> 00:47:20,279 Speaker 1: of the same things that he believes, right, and you 845 00:47:21,440 --> 00:47:23,319 Speaker 1: like, you show up at the same events, at the 846 00:47:23,360 --> 00:47:27,640 Speaker 1: same kind of, like, you know, like conferences and whatnot 847 00:47:27,640 --> 00:47:31,520 Speaker 1: where he gives speeches. These are kind of like, you 848 00:47:31,560 --> 00:47:34,120 Speaker 1: can see the... it's not very hard to draw the 849 00:47:34,120 --> 00:47:38,000 Speaker 1: connections between these people. But Yarvin's not a face man, right? 850 00:47:38,080 --> 00:47:40,200 Speaker 1: He's never going to run for office, and you certainly 851 00:47:40,239 --> 00:47:43,759 Speaker 1: don't want him like arguing about like his neo-monarchist 852 00:47:43,920 --> 00:47:46,799 Speaker 1: views on Fox News, right? That's still a little bit 853 00:47:46,840 --> 00:47:50,240 Speaker 1: too extreme.
But if you say, we need a strong 854 00:47:50,360 --> 00:47:52,719 Speaker 1: executive leader who's going to run the country like a 855 00:47:52,840 --> 00:47:56,720 Speaker 1: CEO and fire all of these unelected bureaucrats and replace 856 00:47:56,800 --> 00:47:59,080 Speaker 1: them with like people who are going to fight for 857 00:47:59,160 --> 00:48:02,080 Speaker 1: quote unquote liberty, right, and we need to punish all 858 00:48:02,120 --> 00:48:04,520 Speaker 1: of our political opponents, we need to like lock up 859 00:48:04,640 --> 00:48:07,239 Speaker 1: members of the lying news media... now, all of these 860 00:48:07,280 --> 00:48:09,439 Speaker 1: are things that you will get a lot of buy 861 00:48:09,480 --> 00:48:11,520 Speaker 1: in on on the Trumpist right. Right. This is 862 00:48:11,560 --> 00:48:15,480 Speaker 1: all stuff that Trump himself talks about a lot. And 863 00:48:15,520 --> 00:48:19,120 Speaker 1: it's the kind of thing people wonder, how 864 00:48:19,120 --> 00:48:22,400 Speaker 1: have we gone so far down this road seemingly so quickly? 865 00:48:23,120 --> 00:48:26,080 Speaker 1: And it's because it didn't all start with Donald Trump, right. 866 00:48:26,160 --> 00:48:29,840 Speaker 1: People were kind of tilling the ideological soil for years 867 00:48:29,880 --> 00:48:32,400 Speaker 1: before that point. And like one of those guys is 868 00:48:32,560 --> 00:48:35,680 Speaker 1: Curtis Yarvin, and he's one of the most influential ones 869 00:48:35,719 --> 00:48:36,560 Speaker 1: of those guys. 870 00:48:36,320 --> 00:48:40,440 Speaker 2: Has Yarvin taken a position on Trump publicly?
871 00:48:42,080 --> 00:48:45,360 Speaker 1: I mean, yeah, he again, he understands like there's a 872 00:48:45,400 --> 00:48:51,360 Speaker 1: degree of like toxicity to his endorsement, but he watched 873 00:48:51,360 --> 00:48:54,080 Speaker 1: the election at Peter Thiel's house and was like very 874 00:48:54,120 --> 00:48:57,400 Speaker 1: excited that Trump had won. So like he has essentially 875 00:48:58,200 --> 00:48:59,759 Speaker 1: come out being like, this is, I think, 876 00:48:59,800 --> 00:49:02,920 Speaker 1: as close as we're going to 877 00:49:03,000 --> 00:49:05,920 Speaker 1: get to the kind of guy to start the process of turning the country in 878 00:49:06,000 --> 00:49:08,560 Speaker 1: the direction that I want. Now, he's not a 879 00:49:08,560 --> 00:49:12,839 Speaker 1: guy who wants the entirety of the United States run 880 00:49:12,880 --> 00:49:15,800 Speaker 1: by one dude. He seeks this kind of political devolution 881 00:49:16,000 --> 00:49:20,959 Speaker 1: into these competing corporate city states. But he sees Trump 882 00:49:21,040 --> 00:49:23,000 Speaker 1: as like a step on that road. This is a 883 00:49:23,000 --> 00:49:26,960 Speaker 1: guy who will centralize power, who will get these bureaucrats out, 884 00:49:26,960 --> 00:49:30,120 Speaker 1: who will destroy, you know, the left as an organized 885 00:49:30,120 --> 00:49:34,839 Speaker 1: political force, and that will allow these other kind of interests, 886 00:49:35,320 --> 00:49:38,000 Speaker 1: these corporate interests, to kind of take and centralize more 887 00:49:38,040 --> 00:49:41,800 Speaker 1: power themselves as kind of the state gets whittled down, 888 00:49:41,920 --> 00:49:45,080 Speaker 1: and we can devolve power to what are effectively like 889 00:49:45,200 --> 00:49:46,160 Speaker 1: corporate warlords. 890 00:49:46,239 --> 00:49:46,399 Speaker 2: Right. 891 00:49:46,440 --> 00:49:49,080 Speaker 1: That's kind of the end result of his system.
You 892 00:49:49,120 --> 00:49:52,480 Speaker 1: can see it in... there's this group of guys 893 00:49:52,520 --> 00:49:54,759 Speaker 1: in Silicon Valley right now who are 894 00:49:54,880 --> 00:49:58,120 Speaker 1: trying to start their own city backed by like Silicon 895 00:49:58,239 --> 00:50:03,120 Speaker 1: Valley VC money. Yeah. Yeah. And they've talked a lot about... 896 00:50:03,080 --> 00:50:05,360 Speaker 2: They've given up on that, right? 897 00:50:06,160 --> 00:50:09,719 Speaker 1: No, no, no, no, that is still very much an 898 00:50:09,760 --> 00:50:14,360 Speaker 1: effort being made. And there's there's been like efforts 899 00:50:14,400 --> 00:50:16,840 Speaker 1: to kind of take over local San Francisco politics, and 900 00:50:16,880 --> 00:50:21,120 Speaker 1: there's a lot of Moldbug-associated guys in that as well. 901 00:50:21,680 --> 00:50:25,239 Speaker 1: Like, one of the lead figures there is a big 902 00:50:25,280 --> 00:50:27,400 Speaker 1: fan of his writing who has been kind of pushing 903 00:50:27,400 --> 00:50:30,040 Speaker 1: this idea as, again, a step on the road to 904 00:50:30,120 --> 00:50:34,160 Speaker 1: these corporate-controlled city states. Right, this is obviously like 905 00:50:34,239 --> 00:50:38,239 Speaker 1: part of the process of devolving any kind of accountable 906 00:50:38,320 --> 00:50:42,440 Speaker 1: state power into the control of what are effectively like 907 00:50:42,520 --> 00:50:48,640 Speaker 1: CEO kings, right? And yeah, that's that's that's, uh, that's 908 00:50:48,719 --> 00:50:49,799 Speaker 1: kind of where we're going here. 909 00:50:50,440 --> 00:50:52,280 Speaker 2: But we all get to be Twitter users. 910 00:50:52,880 --> 00:50:55,720 Speaker 1: Yeah, everything gets to be run like Twitter. Isn't that exciting? 911 00:50:55,880 --> 00:50:57,240 Speaker 2: We get to live in a Twitter? 912 00:50:58,360 --> 00:50:59,920 Speaker 1: Yeah?
Don't you want to live in Twitter? 913 00:51:00,280 --> 00:51:05,000 Speaker 2: No? I wish I could just be a tweet that 914 00:51:05,200 --> 00:51:08,680 Speaker 2: lived in there, yeah, exclusively. 915 00:51:08,719 --> 00:51:13,279 Speaker 1: Yeah, it seems like such a nice place. Speaking of 916 00:51:13,400 --> 00:51:18,719 Speaker 1: nice places, our sponsors. Oh, they'll create a nice little 917 00:51:18,719 --> 00:51:21,280 Speaker 1: place for your ears to live for like three minutes, 918 00:51:21,400 --> 00:51:31,080 Speaker 1: however long an ad break is. We're back. So I 919 00:51:31,200 --> 00:51:33,239 Speaker 1: kind of... I really debated with myself, like how much do 920 00:51:33,280 --> 00:51:35,320 Speaker 1: we get into of like all of the different terrible 921 00:51:35,320 --> 00:51:37,400 Speaker 1: things he said, like all of each of the different 922 00:51:37,440 --> 00:51:39,319 Speaker 1: like beliefs Yarvin espouses. And this is a 923 00:51:39,320 --> 00:51:42,880 Speaker 1: guy who's been writing thousands of words a week on 924 00:51:42,920 --> 00:51:45,839 Speaker 1: the internet for years, so there's like, there's too much 925 00:51:45,880 --> 00:51:50,959 Speaker 1: there for us to give a comprehensive look into the man. 926 00:51:52,080 --> 00:51:56,799 Speaker 2: He just completely summarized all of his writing, right? 927 00:51:56,840 --> 00:51:58,880 Speaker 1: I mean, there's enough of it out there that it can do... 928 00:51:59,280 --> 00:52:02,040 Speaker 1: Like, it can give you a decent amount on Moldbug, 929 00:52:02,040 --> 00:52:04,560 Speaker 1: although you're going to miss stuff, like, you know, some 930 00:52:04,600 --> 00:52:06,520 Speaker 1: of this, you have to know a little bit more 931 00:52:06,520 --> 00:52:09,200 Speaker 1: about the far right in order to catch references he makes.
932 00:52:09,239 --> 00:52:11,279 Speaker 1: Like, I was reading one essay of his where he 933 00:52:11,320 --> 00:52:13,799 Speaker 1: talks about Rhodesia. Do you know what Rhodesia was? 934 00:52:15,640 --> 00:52:20,879 Speaker 2: The source of those unusual Ridgeback dogs. 935 00:52:21,440 --> 00:52:23,840 Speaker 1: It is the source of those dogs. It was also 936 00:52:24,120 --> 00:52:28,720 Speaker 1: a white ethnostate in Africa that like initially 937 00:52:28,800 --> 00:52:32,239 Speaker 1: started out as a colony of the UK. They refused 938 00:52:32,400 --> 00:52:35,920 Speaker 1: to give up like the power of the white minority, 939 00:52:35,920 --> 00:52:38,279 Speaker 1: which was like three percent of the country and had 940 00:52:38,360 --> 00:52:42,360 Speaker 1: essentially total electoral power. They became like a pariah... this 941 00:52:42,480 --> 00:52:44,680 Speaker 1: is like during like the nineteen seventies, a lot of 942 00:52:44,680 --> 00:52:47,719 Speaker 1: this is happening... and wound up fighting a very long 943 00:52:47,800 --> 00:52:53,160 Speaker 1: war with like the vast majority of the country in 944 00:52:53,239 --> 00:52:56,160 Speaker 1: order to try and maintain this state of white minority rule. 945 00:52:56,200 --> 00:52:59,319 Speaker 1: It was a very brutal period of time. This Bush 946 00:52:59,400 --> 00:53:01,560 Speaker 1: War they executed was where a lot of early 947 00:53:01,560 --> 00:53:03,680 Speaker 1: insurgent tactics were pioneered, and it was carried out 948 00:53:03,719 --> 00:53:06,440 Speaker 1: in the name of keeping white people in charge of 949 00:53:06,480 --> 00:53:10,480 Speaker 1: this massive population of black people who had effectively no power. 950 00:53:11,040 --> 00:53:14,640 Speaker 1: And Rhodesia was close to the ideal state for Moldbug.
951 00:53:14,680 --> 00:53:16,800 Speaker 1: I found an article of his where he writes about 952 00:53:16,880 --> 00:53:20,400 Speaker 1: Ian Smith, who was like the guy who was leading 953 00:53:20,480 --> 00:53:24,319 Speaker 1: Rhodesia during its like quote unquote war for liberation. And 954 00:53:24,400 --> 00:53:28,200 Speaker 1: he writes... because Smith died a few years back, 955 00:53:28,280 --> 00:53:30,840 Speaker 1: and Moldbug like wrote this elegy to him that opened 956 00:53:30,840 --> 00:53:33,759 Speaker 1: with the words, the last great Englishman is dead, and 957 00:53:33,840 --> 00:53:38,879 Speaker 1: fuck who disagrees. And it's basically this piece about how 958 00:53:38,960 --> 00:53:42,399 Speaker 1: like Smith was the last man who was brave enough 959 00:53:42,440 --> 00:53:44,439 Speaker 1: to fight for what we all know is the only 960 00:53:44,560 --> 00:53:46,319 Speaker 1: kind of state that can work, which is one in 961 00:53:46,360 --> 00:53:50,520 Speaker 1: which a natural biological elite rules over the masses who 962 00:53:50,560 --> 00:53:55,160 Speaker 1: are unfit to have any sort of power. It ends 963 00:53:55,200 --> 00:53:57,560 Speaker 1: on these lines, which I think are kind of telling 964 00:53:57,600 --> 00:54:01,160 Speaker 1: as to his ideology, and I quote: we will either be hacked 965 00:54:01,200 --> 00:54:03,480 Speaker 1: to death in our own beds, or some similarly 966 00:54:03,600 --> 00:54:06,640 Speaker 1: nasty thing, or Ian Smith or Enoch Powell, and even 967 00:54:06,680 --> 00:54:09,600 Speaker 1: our own Tail-Gunner Joe, will have another life in bronze. 968 00:54:09,880 --> 00:54:12,160 Speaker 1: But, do you know us? I'm not sure we have 969 00:54:12,200 --> 00:54:15,920 Speaker 1: been introduced. We are the neo-McCarthyists. Our motto: this 970 00:54:16,120 --> 00:54:19,439 Speaker 1: time we will finish the job.
And so what he's saying 971 00:54:19,480 --> 00:54:22,919 Speaker 1: there is that, like, Smith is like Joe McCarthy. He's 972 00:54:22,920 --> 00:54:26,319 Speaker 1: one of these guys who embodies the violence of the 973 00:54:26,360 --> 00:54:30,920 Speaker 1: politics that he's an advocate of. Right, and again, he's never 974 00:54:30,960 --> 00:54:32,560 Speaker 1: going to come out and say, I think we should 975 00:54:32,640 --> 00:54:35,040 Speaker 1: kill all the people who disagree with me. But he 976 00:54:35,120 --> 00:54:39,200 Speaker 1: will harken back to these figures and say, these are 977 00:54:39,760 --> 00:54:42,560 Speaker 1: my kind of ideological heroes, and the movement that I 978 00:54:42,640 --> 00:54:45,040 Speaker 1: am seeking to incite is going to finish the job. 979 00:54:45,120 --> 00:54:48,239 Speaker 1: It's going to kill all the communists. It's going to 980 00:54:48,360 --> 00:54:52,800 Speaker 1: win where Rhodesia lost. Right? Like, it's not as direct 981 00:54:52,840 --> 00:54:55,719 Speaker 1: as saying, you know, I'm a white nationalist. And in fact, 982 00:54:55,719 --> 00:54:57,960 Speaker 1: he will never admit to being a white nationalist. But 983 00:54:58,000 --> 00:55:01,080 Speaker 1: what is the conclusion, when you're talking about the 984 00:55:01,080 --> 00:55:04,359 Speaker 1: failure of Rhodesia as being a tragedy, other than, well, 985 00:55:04,480 --> 00:55:08,680 Speaker 1: you're just someone who supports white nationalism? Right? You can see 986 00:55:08,760 --> 00:55:11,920 Speaker 1: kind of like the inherent violence in what 987 00:55:12,040 --> 00:55:12,799 Speaker 1: he's pushing for. 988 00:55:13,000 --> 00:55:13,239 Speaker 2: There.
989 00:55:14,160 --> 00:55:16,840 Speaker 1: In that same essay, he praises the novel The Camp 990 00:55:16,880 --> 00:55:20,040 Speaker 1: of the Saints, which is a racist book about migrants 991 00:55:20,040 --> 00:55:24,320 Speaker 1: from India flooding Europe and destroying civilization, which is Steve 992 00:55:24,440 --> 00:55:28,919 Speaker 1: Bannon's favorite book. And Steve Bannon is another influential... Yeah, yeah, 993 00:55:28,960 --> 00:55:32,520 Speaker 1: it's great replacement stuff, right. Bannon is a big fan 994 00:55:32,560 --> 00:55:35,960 Speaker 1: of Curtis Yarvin. Again, all of these people are connected 995 00:55:36,000 --> 00:55:39,839 Speaker 1: and they're all fans of each other. Like the kind 996 00:55:39,840 --> 00:55:43,759 Speaker 1: of like ideological simpatico here is a crucial part of 997 00:55:43,800 --> 00:55:47,319 Speaker 1: the story. Now, in the last few years, as he's 998 00:55:47,360 --> 00:55:50,319 Speaker 1: gained more of an influence online, Yarvin has grown a 999 00:55:50,360 --> 00:55:53,680 Speaker 1: bigger head. He started referring to himself as the Sith 1000 00:55:53,760 --> 00:55:56,640 Speaker 1: Lord of the modern anti democratic right, because he is 1001 00:55:57,160 --> 00:55:59,759 Speaker 1: a big nerd. He's also he's the guy who brought 1002 00:55:59,800 --> 00:56:03,160 Speaker 1: the red pill into right wing politics, like taking it 1003 00:56:03,200 --> 00:56:13,719 Speaker 1: from The Matrix. Yeah, that's cool, huh? Yeah, yeah. 1004 00:56:13,760 --> 00:56:16,360 Speaker 1: And he's you know, it's interesting because like obviously The 1005 00:56:16,440 --> 00:56:21,080 Speaker 1: Matrix was a movie written by two trans women and 1006 00:56:22,320 --> 00:56:24,080 Speaker 1: not at all about right wing politics. 1007 00:56:24,160 --> 00:56:27,839 Speaker 2: Right, the Matrix is sort of a great allegory for 1008 00:56:28,280 --> 00:56:29,399 Speaker 2: his beliefs.
1009 00:56:29,880 --> 00:56:32,200 Speaker 1: I do think it is an allegory for his beliefs, 1010 00:56:32,239 --> 00:56:36,040 Speaker 1: and like the violence and oppression that they necessitate. I 1011 00:56:36,080 --> 00:56:39,920 Speaker 1: think like he tends to take it as like realizing 1012 00:56:40,360 --> 00:56:43,040 Speaker 1: that democracy is fake and that like all of these 1013 00:56:43,080 --> 00:56:46,960 Speaker 1: social justice movements are inherently like evil, flawed attempts to 1014 00:56:47,040 --> 00:56:50,839 Speaker 1: like destroy the natural aristocracy. Like that's the matrix, right, 1015 00:56:51,040 --> 00:56:53,799 Speaker 1: is, you know, people who don't look like him being 1016 00:56:53,840 --> 00:56:58,200 Speaker 1: able to vote, which is much shallower. 1017 00:56:58,400 --> 00:57:02,400 Speaker 2: Yeah, the violence that's inherent in these systems that 1018 00:57:02,440 --> 00:57:08,160 Speaker 2: you're referring to. It does it does seem to... And 1019 00:57:08,200 --> 00:57:13,880 Speaker 2: I hate to armchair psychoanalyze anybody, and I'm not equipped 1020 00:57:13,920 --> 00:57:15,840 Speaker 2: to do that. So this is more of a just sort of... 1021 00:57:15,640 --> 00:57:18,320 Speaker 1: We always say we hate it, but we always 1022 00:57:18,360 --> 00:57:18,960 Speaker 1: do a little bit. 1023 00:57:19,120 --> 00:57:22,320 Speaker 2: Oh, we can't help it, right. Yeah, but no, this 1024 00:57:22,440 --> 00:57:25,400 Speaker 2: is just sort of a pontification.
But does it feel 1025 00:57:25,760 --> 00:57:27,480 Speaker 2: to you, because it does to me a little bit, 1026 00:57:27,560 --> 00:57:31,560 Speaker 2: like to really take these these positions seriously and 1027 00:57:31,680 --> 00:57:36,880 Speaker 2: advocate for these things is, in its own way, 1028 00:57:36,920 --> 00:57:42,800 Speaker 2: a kind of enjoyment of violence, and 1029 00:57:42,880 --> 00:57:46,080 Speaker 2: in that way, almost like a psychopathy, like 1030 00:57:46,520 --> 00:57:52,200 Speaker 2: violence is wonderful? Yeah, yeah. Is there anything 1031 00:57:52,200 --> 00:57:55,960 Speaker 2: to be found in Yarvin's writings that 1032 00:57:56,080 --> 00:58:02,280 Speaker 2: laments that violence or subjugation or the suffering of 1033 00:58:02,320 --> 00:58:07,240 Speaker 2: any humans is an unfortunate side effect of 1034 00:58:07,320 --> 00:58:11,280 Speaker 2: these systems but necessary? Or is it just sort of 1035 00:58:11,360 --> 00:58:15,960 Speaker 2: like, no, that's just an awesome part of it. 1036 00:58:15,680 --> 00:58:18,400 Speaker 1: It's more like, number one, I think a big thing 1037 00:58:18,440 --> 00:58:20,720 Speaker 1: that he tries to do is minimize the degree to 1038 00:58:20,760 --> 00:58:23,720 Speaker 1: which that's necessary. But when it does come up, it 1039 00:58:23,800 --> 00:58:27,000 Speaker 1: is this very full throated embrace of like, well, they're communists, 1040 00:58:27,040 --> 00:58:28,600 Speaker 1: you know, like this is what we have to do. 1041 00:58:28,680 --> 00:58:33,160 Speaker 1: Like the terrorism of the Nazis was great because it worked, right. Like, 1042 00:58:33,600 --> 00:58:36,440 Speaker 1: I don't think he's a guy who feels bad at 1043 00:58:36,440 --> 00:58:39,960 Speaker 1: all about the inevitable consequences of his beliefs.
And I 1044 00:58:39,960 --> 00:58:42,400 Speaker 1: certainly don't get that hint from his writing that there's 1045 00:58:42,440 --> 00:58:45,480 Speaker 1: any sort of like real regret. There may be some like 1046 00:58:45,600 --> 00:58:49,600 Speaker 1: signposting of regret about the unfortunate, you know, necessities 1047 00:58:49,640 --> 00:58:52,000 Speaker 1: that will come about. But this is, I think you're 1048 00:58:52,040 --> 00:58:54,320 Speaker 1: right on the money, there's a lot of like violent 1049 00:58:54,520 --> 00:58:57,560 Speaker 1: fantasizing here. Again, this is a guy who spends a 1050 00:58:57,600 --> 00:58:59,720 Speaker 1: lot of time obsessed with the things that annoy him 1051 00:58:59,720 --> 00:59:02,840 Speaker 1: in the world and convinced that like the right 1052 00:59:02,920 --> 00:59:06,360 Speaker 1: solution to those things is a sort of terminal force. 1053 00:59:06,720 --> 00:59:09,360 Speaker 1: And you get this all over with people who get 1054 00:59:09,440 --> 00:59:12,840 Speaker 1: kind of too wrapped up in like the specifics of 1055 00:59:12,880 --> 00:59:16,200 Speaker 1: their ideology and their anger at its discontents. You can 1056 00:59:16,240 --> 00:59:19,880 Speaker 1: find like, you know, Maoists and whatnot online who will 1057 00:59:19,880 --> 00:59:22,480 Speaker 1: fantasize about like when our revolution takes over, you know, 1058 00:59:22,480 --> 00:59:25,040 Speaker 1: we're going to put people in re education camps or whatever. 1059 00:59:26,040 --> 00:59:29,439 Speaker 1: It's a it's a necessary byproduct of not having enough 1060 00:59:29,480 --> 00:59:32,200 Speaker 1: empathy and spending too much time alone in a room 1061 00:59:32,360 --> 00:59:35,560 Speaker 1: obsessing over how right you are.
Right, it's anything that 1062 00:59:35,880 --> 00:59:38,960 Speaker 1: kind of inherently conflicts with that, as the world, in 1063 00:59:39,000 --> 00:59:43,200 Speaker 1: all of its complexity, always will, should be solved 1064 00:59:43,200 --> 00:59:45,440 Speaker 1: by the most violence I can bring to bear, right. 1065 00:59:45,480 --> 00:59:47,960 Speaker 1: And that's why we need to capture the presidency. 1066 00:59:47,600 --> 00:59:49,400 Speaker 2: So a certain solution, which is. 1067 00:59:49,720 --> 00:59:54,120 Speaker 1: Yeah. And that's why these guys, that's why their big 1068 00:59:54,160 --> 00:59:56,960 Speaker 1: goal is the presidency, right, because the way they see it, 1069 00:59:57,000 --> 00:59:58,720 Speaker 1: and certainly the way the Supreme Court has set it up, 1070 00:59:58,800 --> 01:00:02,200 Speaker 1: getting the presidency gives you access to the greatest store 1071 01:00:02,200 --> 01:00:05,840 Speaker 1: of violence that has ever existed in human history. Right, Like, 1072 01:00:05,880 --> 01:00:09,840 Speaker 1: that is what the president has access to. Right.
People 1073 01:00:09,840 --> 01:00:11,680 Speaker 1: don't like to talk about it that way, but there's 1074 01:00:11,680 --> 01:00:15,320 Speaker 1: no other real way to view it, potentially. And if 1075 01:00:15,360 --> 01:00:17,360 Speaker 1: you see the president as someone who should have no 1076 01:00:17,480 --> 01:00:19,760 Speaker 1: guard rails, and this is very much what we're going 1077 01:00:19,800 --> 01:00:22,440 Speaker 1: to advocate for, is a president that has no restrictions 1078 01:00:22,440 --> 01:00:24,960 Speaker 1: on his power, then what you're looking at is a 1079 01:00:25,000 --> 01:00:28,320 Speaker 1: guy who has the ability to use the most violence 1080 01:00:28,360 --> 01:00:31,800 Speaker 1: ever concentrated in cleansing the world of the people who 1081 01:00:31,840 --> 01:00:35,680 Speaker 1: make it not fit this schema that I've cooked up 1082 01:00:35,720 --> 01:00:42,080 Speaker 1: in my head that I find very attractive. Yeah, it's 1083 01:00:42,120 --> 01:00:45,320 Speaker 1: good stuff. So if you're looking for writing on, 1084 01:00:45,760 --> 01:00:47,840 Speaker 1: you know, Yarvin to this day, if you're looking at 1085 01:00:47,920 --> 01:00:50,640 Speaker 1: like what people have come to call the strain of 1086 01:00:50,680 --> 01:00:53,240 Speaker 1: thought that he helped ignite, the term you'll come across 1087 01:00:53,240 --> 01:00:57,680 Speaker 1: the most is neo reactionary, or NRX. Right, this is 1088 01:00:57,760 --> 01:00:59,800 Speaker 1: kind of how you'll see it written about a 1089 01:00:59,800 --> 01:01:04,040 Speaker 1: lot in like blogs by Bay Area techies. And this 1090 01:01:04,760 --> 01:01:07,120 Speaker 1: term started to be used more in like twenty thirteen, 1091 01:01:07,120 --> 01:01:10,000 Speaker 1: twenty fourteen. I think that's when it really took off.
1092 01:01:10,040 --> 01:01:12,880 Speaker 1: You had some sort of writing by people like Klint 1093 01:01:12,880 --> 01:01:16,160 Speaker 1: Finley at TechCrunch in twenty thirteen that noted that 1094 01:01:16,480 --> 01:01:18,720 Speaker 1: you were seeing a lot of this thought take 1095 01:01:18,760 --> 01:01:23,280 Speaker 1: off among influential people in big tech. Finley wrote: PayPal 1096 01:01:23,760 --> 01:01:27,560 Speaker 1: founder Peter Thiel has voiced similar ideas, and Pax Dickinson, 1097 01:01:27,560 --> 01:01:30,680 Speaker 1: the former CTO of Business Insider, says he's been influenced 1098 01:01:30,720 --> 01:01:34,320 Speaker 1: by neo reactionary thought. It may be a small minority worldview, 1099 01:01:34,320 --> 01:01:36,000 Speaker 1: but it's one that I think shines some light on 1100 01:01:36,040 --> 01:01:40,200 Speaker 1: the psyche of contemporary tech culture. Now, it was through 1101 01:01:40,240 --> 01:01:44,040 Speaker 1: TechCrunch in twenty fourteen that Mencius Moldbug was first 1102 01:01:44,080 --> 01:01:47,800 Speaker 1: revealed to be computer scientist Curtis Yarvin, and it was 1103 01:01:47,840 --> 01:01:51,200 Speaker 1: revealed that Yarvin was working at a startup funded by 1104 01:01:51,200 --> 01:01:56,520 Speaker 1: Peter Thiel's money called Tlon. Now, Tlon, the name 1105 01:01:56,560 --> 01:01:59,360 Speaker 1: comes from a short story, like a sci fi short 1106 01:01:59,400 --> 01:02:03,080 Speaker 1: story, written by a fella named Borges in nineteen forty, 1107 01:02:04,040 --> 01:02:06,480 Speaker 1: and one summary of the story I found from Francis 1108 01:02:06,480 --> 01:02:10,240 Speaker 1: Tseng writes that it quote describes a secret society, Orbis 1109 01:02:10,320 --> 01:02:14,560 Speaker 1: Tertius, that architects an entirely new world, Tlön, by establishing 1110 01:02:14,640 --> 01:02:18,400 Speaker 1: an encyclopedia describing it.
Over time, bits of this fictional 1111 01:02:18,400 --> 01:02:21,160 Speaker 1: world begin to emerge in the real world, consuming it. 1112 01:02:21,840 --> 01:02:25,320 Speaker 1: And this is kind of how Moldbug thinks of his writing. Right, 1113 01:02:25,480 --> 01:02:29,680 Speaker 1: I am writing the future into being by theorizing it. 1114 01:02:29,840 --> 01:02:30,000 Speaker 2: Right. 1115 01:02:30,040 --> 01:02:32,320 Speaker 1: That's why this company is named after it. That's why 1116 01:02:32,320 --> 01:02:37,920 Speaker 1: he finds that story so influential. The kind of process 1117 01:02:38,200 --> 01:02:41,000 Speaker 1: of this is called hyperstition. It's this process of like 1118 01:02:41,440 --> 01:02:44,520 Speaker 1: taking ideas that exist only in people's heads and in fiction 1119 01:02:44,720 --> 01:02:48,120 Speaker 1: and like forcing them into the real world. It's one 1120 01:02:48,160 --> 01:02:51,040 Speaker 1: of those premises that seems silly until suddenly a guy 1121 01:02:51,040 --> 01:02:53,720 Speaker 1: who might be the vice president starts ranting about how 1122 01:02:53,800 --> 01:02:56,440 Speaker 1: childless women are psychopaths and we need to fire the 1123 01:02:56,440 --> 01:02:59,480 Speaker 1: government so Donald Trump can remake it in his own image. 1124 01:03:00,120 --> 01:03:03,120 Speaker 1: This is all very like silly seeming until it isn't. 1125 01:03:03,520 --> 01:03:05,720 Speaker 1: And kind of the scariest thing about Yarvin is that 1126 01:03:05,920 --> 01:03:09,280 Speaker 1: he has to an extent been successful in like writing 1127 01:03:09,920 --> 01:03:13,320 Speaker 1: a different world into being. Like he has, he has 1128 01:03:13,360 --> 01:03:16,600 Speaker 1: had more influence in this than you want to believe.
1129 01:03:17,640 --> 01:03:21,120 Speaker 1: Kind of the tipping point for Yarvin's influence on culture 1130 01:03:21,160 --> 01:03:23,480 Speaker 1: and for that kind of politics starting to take over 1131 01:03:23,520 --> 01:03:26,840 Speaker 1: the right was twenty fourteen, and it happened, appropriately enough, 1132 01:03:26,880 --> 01:03:29,200 Speaker 1: on the Internet with something called Gamergate. 1133 01:03:29,600 --> 01:03:30,440 Speaker 2: Ah, and there we are. 1134 01:03:30,480 --> 01:03:33,400 Speaker 1: I was wondering how long it would take Yiannopoulos to show 1135 01:03:33,520 --> 01:03:36,960 Speaker 1: up. Gamergate. A lot of the people who are 1136 01:03:37,560 --> 01:03:40,640 Speaker 1: thought leaders in Gamergate, who are like, like, who are 1137 01:03:40,760 --> 01:03:44,760 Speaker 1: some of the early, uh, like voices behind that and 1138 01:03:44,800 --> 01:03:48,200 Speaker 1: behind that kind of neo reactionary swell as it enters 1139 01:03:48,640 --> 01:03:52,440 Speaker 1: public consciousness, are fans of Yarvin's. One of them is 1140 01:03:52,480 --> 01:03:55,320 Speaker 1: Steve Bannon. Bannon is a big behind the scenes player 1141 01:03:55,400 --> 01:03:59,520 Speaker 1: in what happens there, as is Milo Yiannopoulos, and Yiannopoulos 1142 01:03:59,760 --> 01:04:03,920 Speaker 1: is kind of a guy who comes to fame through Gamergate, 1143 01:04:04,000 --> 01:04:06,280 Speaker 1: like making himself into kind of a voice of 1144 01:04:06,280 --> 01:04:09,960 Speaker 1: the movement, and Yiannopoulos is an adherent of Yarvin's philosophy. 1145 01:04:11,200 --> 01:04:14,800 Speaker 1: The next year, after Gamergate, twenty fifteen, Trump descends his 1146 01:04:14,920 --> 01:04:17,600 Speaker 1: escalator to launch his campaign, and in short order, the 1147 01:04:17,680 --> 01:04:22,760 Speaker 1: alt-right is eternal. Yeah, and the alt-right becomes kind of 1148 01:04:23,200 --> 01:04:26,320 Speaker 1: undeniable to anyone.
Right, it's not something you can ignore anymore, 1149 01:04:26,520 --> 01:04:31,040 Speaker 1: and figures like Yiannopoulos and Bannon are major like faces 1150 01:04:31,160 --> 01:04:35,080 Speaker 1: of the movement, while Yarvin remains kind of obscure. Still, 1151 01:04:35,160 --> 01:04:37,760 Speaker 1: you know, people who are in the know, who understand 1152 01:04:38,080 --> 01:04:40,760 Speaker 1: where a lot of the ideas guys like Yiannopoulos are 1153 01:04:41,200 --> 01:04:44,400 Speaker 1: spouting in public come from, know that he's a player 1154 01:04:44,480 --> 01:04:47,280 Speaker 1: in the field, but he's kind of obscured and shadowed 1155 01:04:47,320 --> 01:04:50,440 Speaker 1: by the cloaking factor of his very dense, clumsy prose. 1156 01:04:51,200 --> 01:04:53,480 Speaker 1: Corey Pein at The Baffler is probably the first guy 1157 01:04:53,520 --> 01:04:57,760 Speaker 1: to cohesively suggest that Yarvin's relationship to Peter Thiel and 1158 01:04:57,800 --> 01:05:00,360 Speaker 1: to the folks around him was part of a wider 1159 01:05:00,360 --> 01:05:04,520 Speaker 1: movement towards anti democratic change. His work was influential enough 1160 01:05:04,560 --> 01:05:07,000 Speaker 1: to get The New York Times to ask Peter Thiel 1161 01:05:07,160 --> 01:05:10,040 Speaker 1: if he was working to fund a monarchist coup, and 1162 01:05:10,120 --> 01:05:13,120 Speaker 1: Thiel replied in a cryptic fashion: it was a full 1163 01:05:13,160 --> 01:05:16,560 Speaker 1: on conspiracy theory. In truth, there's nobody sitting around plotting 1164 01:05:16,560 --> 01:05:19,000 Speaker 1: the future, though I sometimes think it would be better 1165 01:05:19,040 --> 01:05:22,640 Speaker 1: if people were. Which is like the most sinister way 1166 01:05:22,680 --> 01:05:27,200 Speaker 1: to reply to that. Right. So twenty sixteen comes along.
1167 01:05:27,280 --> 01:05:30,680 Speaker 1: Next Trump wins the election, and suddenly even normal people 1168 01:05:30,680 --> 01:05:33,360 Speaker 1: who don't spend their free time online looking at fascists 1169 01:05:33,400 --> 01:05:36,120 Speaker 1: on the internet become aware that something is very wrong. 1170 01:05:36,600 --> 01:05:40,400 Speaker 1: In twenty seventeen, BuzzFeed News publishes an expose based on 1171 01:05:40,480 --> 01:05:43,040 Speaker 1: leaked emails between a bunch of members of the alt right, 1172 01:05:43,160 --> 01:05:46,600 Speaker 1: including Bannon and Yiannopoulos, and I'm going to quote Corey 1173 01:05:46,640 --> 01:05:49,800 Speaker 1: Pein describing these leaks, and some of the most 1174 01:05:49,800 --> 01:05:53,880 Speaker 1: revealing reveals. In that piece, BuzzFeed reported that Thiel and 1175 01:05:53,920 --> 01:05:57,120 Speaker 1: Yiannopoulos had made plans to meet during the July Republican 1176 01:05:57,240 --> 01:06:00,680 Speaker 1: National Convention, but much of Yiannopoulos's knowledge of Thiel seems to 1177 01:06:00,760 --> 01:06:03,520 Speaker 1: come secondhand from other right wing activists, as well as 1178 01:06:03,600 --> 01:06:06,920 Speaker 1: Curtis Yarvin, the blogger who advocates the return of feudalism. 1179 01:06:07,280 --> 01:06:10,760 Speaker 1: The story then quotes this exchange. Yarvin told Yiannopoulos that 1180 01:06:10,800 --> 01:06:14,720 Speaker 1: he had been coaching Thiel. Peter needs guidance on politics, 1181 01:06:14,720 --> 01:06:18,040 Speaker 1: for sure, Yiannopoulos responded. Less than you might think, Yarvin 1182 01:06:18,080 --> 01:06:20,440 Speaker 1: wrote back. I watched the election at his house. I 1183 01:06:20,480 --> 01:06:23,920 Speaker 1: think my hangover lasted until Tuesday. He's fully enlightened. 1184 01:06:23,960 --> 01:06:29,200 Speaker 1: He just plays it very carefully.
And you know, if 1185 01:06:29,240 --> 01:06:31,919 Speaker 1: you're looking for direct connections, it's something that we only 1186 01:06:31,960 --> 01:06:35,400 Speaker 1: really have because of, you know, leaks like that. But 1187 01:06:36,080 --> 01:06:39,720 Speaker 1: these are not like conspiracy theory connections that we have 1188 01:06:39,800 --> 01:06:42,439 Speaker 1: to draw between people. Like, we have texts between these 1189 01:06:42,480 --> 01:06:46,240 Speaker 1: guys talking about how they're trying to convince moneyed interests 1190 01:06:46,320 --> 01:06:49,440 Speaker 1: of their plans to like end democracy in the United States, 1191 01:06:49,480 --> 01:06:52,280 Speaker 1: and they've been doing it for a while. And that's 1192 01:06:52,840 --> 01:06:57,160 Speaker 1: that's most of what I've got to say about Curtis Yarvin. Now, 1193 01:06:57,320 --> 01:06:59,080 Speaker 1: I would be doing you a disservice if I 1194 01:06:59,120 --> 01:07:01,760 Speaker 1: didn't end this by talking at least for a little 1195 01:07:01,800 --> 01:07:04,240 Speaker 1: bit about the other side of his professional life, because 1196 01:07:04,280 --> 01:07:07,880 Speaker 1: while he's been writing all of this fascist theorizing and whatnot, 1197 01:07:08,280 --> 01:07:10,600 Speaker 1: he has a career as a software developer and a 1198 01:07:10,640 --> 01:07:14,240 Speaker 1: project that, he will state, represents like 1199 01:07:14,240 --> 01:07:15,480 Speaker 1: his political dreams. 1200 01:07:15,760 --> 01:07:16,040 Speaker 2: Urbit. 1201 01:07:16,920 --> 01:07:20,880 Speaker 1: Now, Urbit, on paper, is supposed 1202 01:07:20,880 --> 01:07:23,600 Speaker 1: to be basically a peer-to-peer social networking tool 1203 01:07:23,640 --> 01:07:27,640 Speaker 1: like Mastodon that allows individual users more control over 1204 01:07:27,680 --> 01:07:30,880 Speaker 1: their information in digital life.
And when he talks about Urbit, 1205 01:07:31,800 --> 01:07:34,840 Speaker 1: Yarvin talks about it as like, this is me building 1206 01:07:34,840 --> 01:07:39,000 Speaker 1: in software a representation of my ideal form of government, right. 1207 01:07:39,400 --> 01:07:41,640 Speaker 1: And he frames it as like something that gives you 1208 01:07:42,000 --> 01:07:45,760 Speaker 1: control over your own life and data, not some central 1209 01:07:45,800 --> 01:07:48,840 Speaker 1: company like Twitter that like can be corrupt and used 1210 01:07:48,840 --> 01:07:53,760 Speaker 1: to bad ends. And that's complete bullshit, right. The reality 1211 01:07:53,800 --> 01:07:56,840 Speaker 1: is that Urbit is backed by Peter Thiel money, and 1212 01:07:56,920 --> 01:08:02,400 Speaker 1: it is an attempt to effectively like build a different 1213 01:08:02,520 --> 01:08:07,040 Speaker 1: kind of social networking infrastructure for the Internet that Curtis 1214 01:08:07,160 --> 01:08:10,240 Speaker 1: Yarvin has complete control over, right. Now, the good news 1215 01:08:10,360 --> 01:08:14,280 Speaker 1: is it hasn't actually taken off. This company is largely 1216 01:08:14,320 --> 01:08:17,799 Speaker 1: a failure. It's generally agreed to be pretty badly coded. 1217 01:08:18,640 --> 01:08:21,360 Speaker 1: I found a pretty good analysis of it by Francis Tseng, 1218 01:08:21,400 --> 01:08:23,880 Speaker 1: who runs a website called Distributed Web of Care, who 1219 01:08:23,920 --> 01:08:27,360 Speaker 1: points out that, like, while Yarvin tries to frame this 1220 01:08:27,439 --> 01:08:30,360 Speaker 1: as like Mastodon, as like, well, you keep control of 1221 01:08:30,400 --> 01:08:32,920 Speaker 1: your data, you're in charge of your digital life, not 1222 01:08:33,040 --> 01:08:36,200 Speaker 1: some company.
The way everything is set up is you 1223 01:08:36,320 --> 01:08:40,280 Speaker 1: have these like different nodes, and there's a limited number 1224 01:08:40,280 --> 01:08:43,320 Speaker 1: of nodes, and most of them are controlled by Yarvin 1225 01:08:43,439 --> 01:08:45,800 Speaker 1: and the Urbit company. So what he's like really trying 1226 01:08:45,840 --> 01:08:48,799 Speaker 1: to do here is set up like a landlord scam, 1227 01:08:49,000 --> 01:08:51,639 Speaker 1: like, on the Internet, right, where he's the big landlord. 1228 01:08:51,680 --> 01:08:54,160 Speaker 1: And I think this is kind of revealing, not because 1229 01:08:54,200 --> 01:08:57,439 Speaker 1: Urbit's important, but because it shows, we've kind of 1230 01:08:57,439 --> 01:08:59,800 Speaker 1: been talking this whole time, how much of this does 1231 01:08:59,800 --> 01:09:02,559 Speaker 1: he really believe? How much of this is like him 1232 01:09:02,640 --> 01:09:05,679 Speaker 1: kind of dressing it up? And I think you get 1233 01:09:05,720 --> 01:09:08,240 Speaker 1: the real Curtis Yarvin here, which is not a guy 1234 01:09:08,760 --> 01:09:13,280 Speaker 1: with any real high minded intellectual desires beyond I want 1235 01:09:13,320 --> 01:09:15,360 Speaker 1: to be the one in power, right. Like, I want 1236 01:09:15,400 --> 01:09:17,240 Speaker 1: to be the one with the money. I want to 1237 01:09:17,280 --> 01:09:19,920 Speaker 1: be the big landlord, right. He doesn't hate Twitter and 1238 01:09:20,000 --> 01:09:23,920 Speaker 1: Facebook because they ruined his old Internet. He hates them because, 1239 01:09:23,960 --> 01:09:27,439 Speaker 1: like, he's just another guy in all of those systems 1240 01:09:27,439 --> 01:09:29,240 Speaker 1: and he wants to be the king, or at least 1241 01:09:29,240 --> 01:09:31,920 Speaker 1: a knight, right.
I think that's kind of like the 1242 01:09:31,960 --> 01:09:34,479 Speaker 1: note to end on, is pointing out, like, underneath it 1243 01:09:34,520 --> 01:09:38,040 Speaker 1: all, and underneath all of the like dressing up in 1244 01:09:38,040 --> 01:09:41,639 Speaker 1: intellectualism that he puts on, this is just a guy 1245 01:09:41,800 --> 01:09:44,960 Speaker 1: who's angry that he's not currently the one with all 1246 01:09:44,960 --> 01:09:48,640 Speaker 1: the power, right. And that's that's really what it's all about. 1247 01:09:49,400 --> 01:09:58,880 Speaker 2: Whoa. Sorry. Because of all that I've learned, yeah, I'm not a fan. 1248 01:10:00,120 --> 01:10:03,080 Speaker 1: Yeah, that's good to hear. That's good to hear. 1249 01:10:04,000 --> 01:10:06,320 Speaker 2: But I do I do want to read up more 1250 01:10:06,800 --> 01:10:10,040 Speaker 2: and get a sense of his tone, and and 1251 01:10:10,120 --> 01:10:13,760 Speaker 2: maybe, well, I'd love to hear him speak and get 1252 01:10:13,760 --> 01:10:16,160 Speaker 2: a better sense of his persona and kind of 1253 01:10:16,200 --> 01:10:16,559 Speaker 2: what that. 1254 01:10:17,240 --> 01:10:20,920 Speaker 1: Oh, yes, he's on many podcasts. 1255 01:10:21,080 --> 01:10:27,320 Speaker 2: How does a Bannon reconcile someone openly advocating for 1256 01:10:27,320 --> 01:10:32,040 Speaker 2: basically just the burning of the Constitution with also 1257 01:10:33,439 --> 01:10:41,160 Speaker 2: being a leader for just like Republican ascendancy? Like, 1258 01:10:41,200 --> 01:10:44,120 Speaker 2: how does how do you reconcile those things? I'm so 1259 01:10:44,240 --> 01:10:48,639 Speaker 2: confused with that. Unless there's unless it's, is it a, 1260 01:10:49,760 --> 01:10:53,240 Speaker 2: you know, do you think it's a, like, if we 1261 01:10:53,240 --> 01:10:57,439 Speaker 2: were to say JD.
Vance gets to be president one day, 1262 01:10:57,520 --> 01:11:02,439 Speaker 2: do you think it's a bait and switch? Are they like, 1263 01:11:03,160 --> 01:11:08,040 Speaker 2: is it really a, you know, is that Peter Thiel's 1264 01:11:08,080 --> 01:11:10,400 Speaker 2: game, to sort of bait and switch the Americans into 1265 01:11:10,479 --> 01:11:12,640 Speaker 2: like just get him into office and then we're going 1266 01:11:12,720 --> 01:11:15,599 Speaker 2: to make him a dictator? Or is it? 1267 01:11:16,600 --> 01:11:16,920 Speaker 1: Uh? 1268 01:11:17,680 --> 01:11:20,320 Speaker 2: Is it a thing where someone like a JD. Vance, 1269 01:11:20,920 --> 01:11:25,000 Speaker 2: who's participating in American democracy as a candidate, is really 1270 01:11:25,120 --> 01:11:28,320 Speaker 2: sort of like, I like some of these ideas. I 1271 01:11:28,479 --> 01:11:33,080 Speaker 2: like the CEO approach to a presidency, but like, I 1272 01:11:33,120 --> 01:11:37,800 Speaker 2: still believe in the Constitution. I still believe that democracy 1273 01:11:38,120 --> 01:11:41,160 Speaker 2: is our is our bedrock. 1274 01:11:42,520 --> 01:11:44,560 Speaker 1: I don't think, for one thing, just based on a 1275 01:11:44,560 --> 01:11:46,840 Speaker 1: lot of stuff that Vance has said on some of 1276 01:11:46,840 --> 01:11:49,559 Speaker 1: these far right podcasts, I don't believe he believes that 1277 01:11:49,680 --> 01:11:53,519 Speaker 1: democracy is our bedrock. I think a lot of the 1278 01:11:53,680 --> 01:11:56,200 Speaker 1: way these guys will couch it is by saying, like, well, 1279 01:11:56,240 --> 01:11:58,680 Speaker 1: I believe in a republic, right. And kind of what 1280 01:11:58,720 --> 01:12:03,400 Speaker 1: they're hearkening back to is this very classical idea that, well, 1281 01:12:03,680 --> 01:12:08,040 Speaker 1: only like property owning men should be able to vote.
Right, Like, sure, 1282 01:12:08,080 --> 01:12:11,519 Speaker 1: there should be, right, some voting, but like the 1283 01:12:11,560 --> 01:12:14,400 Speaker 1: franchise should certainly be limited. And this is something Vance 1284 01:12:15,080 --> 01:12:17,559 Speaker 1: has embraced when he's talked about how like people without 1285 01:12:17,640 --> 01:12:21,519 Speaker 1: children shouldn't have the same degree of electoral say, right. 1286 01:12:21,760 --> 01:12:23,559 Speaker 1: If you're the head of a family, 1287 01:12:23,600 --> 01:12:26,360 Speaker 1: you should get to have more votes that basically 1288 01:12:26,400 --> 01:12:28,640 Speaker 1: account for how many kids you have, and probably for 1289 01:12:28,720 --> 01:12:32,120 Speaker 1: your wife too, right. These are things Vance has talked 1290 01:12:32,160 --> 01:12:35,200 Speaker 1: about on different podcasts he's appeared on, and Bannon has 1291 01:12:35,880 --> 01:12:38,639 Speaker 1: expressed a lot of anti democratic sentiment. I think part 1292 01:12:38,640 --> 01:12:41,400 Speaker 1: of the problem is that when these people get confronted 1293 01:12:41,600 --> 01:12:46,280 Speaker 1: by the media, there's this inherent impulse that like mainstream 1294 01:12:46,320 --> 01:12:50,400 Speaker 1: political reporters have to normalize them, in a way that 1295 01:12:50,680 --> 01:12:53,680 Speaker 1: I think provides them with 1296 01:12:53,760 --> 01:12:57,120 Speaker 1: cover, right, to not talk about them 1297 01:12:57,160 --> 01:13:00,720 Speaker 1: as if they are people who are trying to end democracy, 1298 01:13:00,840 --> 01:13:04,400 Speaker 1: because they very much are. You know, 1299 01:13:04,840 --> 01:13:07,880 Speaker 1: that's not the norm within like people who vote for 1300 01:13:07,920 --> 01:13:10,760 Speaker 1: the Republican Party.
But when we are talking about guys 1301 01:13:10,760 --> 01:13:16,360 Speaker 1: like JD. Vance and Steve Bannon and Stephen Miller, you know, 1302 01:13:16,760 --> 01:13:21,160 Speaker 1: Trump's former associate, like all of these guys, that is 1303 01:13:21,200 --> 01:13:23,960 Speaker 1: the norm among these people. And part of what part 1304 01:13:23,960 --> 01:13:27,200 Speaker 1: of how they see Trump as useful, and potentially Vance 1305 01:13:27,320 --> 01:13:29,320 Speaker 1: as useful, is not that they're going to get us 1306 01:13:29,400 --> 01:13:32,400 Speaker 1: to this end system that Yarvin theorizes. And when they 1307 01:13:32,479 --> 01:13:35,040 Speaker 1: when they take Yarvin's ideas, I don't think most of 1308 01:13:35,080 --> 01:13:37,679 Speaker 1: them want exactly the same kind of system that he does, 1309 01:13:37,760 --> 01:13:40,240 Speaker 1: because he's kind of a kook, right, and his system 1310 01:13:40,320 --> 01:13:42,719 Speaker 1: is unworkable, and none of them really want to devolve, 1311 01:13:42,760 --> 01:13:45,000 Speaker 1: most of them don't want to devolve power. Right, a 1312 01:13:45,040 --> 01:13:48,519 Speaker 1: guy like Thiel may want to devolve some power so 1313 01:13:48,600 --> 01:13:51,920 Speaker 1: that he can run, you know, effectively his own little 1314 01:13:51,960 --> 01:13:54,280 Speaker 1: city state where he gets to make all of the rules. 1315 01:13:55,560 --> 01:13:58,200 Speaker 1: But there's a degree of like chaos inherent in the 1316 01:13:58,240 --> 01:14:01,760 Speaker 1: actual thing that Yarvin advocates that makes it unworkable, but 1317 01:14:01,800 --> 01:14:05,840 Speaker 1: they find ideas that he has very useful.
And likewise, 1318 01:14:06,240 --> 01:14:09,519 Speaker 1: I think folks like Yarvin think that a guy like Vance, 1319 01:14:09,560 --> 01:14:13,040 Speaker 1: even if he doesn't believe everything they believe, would 1320 01:14:13,080 --> 01:14:16,400 Speaker 1: be useful because he brings us closer to this kind 1321 01:14:16,439 --> 01:14:18,120 Speaker 1: of a system. And part of how we would do 1322 01:14:18,160 --> 01:14:21,520 Speaker 1: that is by purging the left, by purging and destroying 1323 01:14:21,560 --> 01:14:24,320 Speaker 1: like all of these journalists, by locking up our 1324 01:14:24,360 --> 01:14:27,680 Speaker 1: political opposition, right. Like that is a big part of 1325 01:14:27,720 --> 01:14:30,800 Speaker 1: it for them: we want to use the violence 1326 01:14:30,880 --> 01:14:34,200 Speaker 1: that the state has access to to destroy the people 1327 01:14:34,280 --> 01:14:36,679 Speaker 1: who don't want the world to be this way, right. 1328 01:14:36,720 --> 01:14:39,960 Speaker 1: And there very much is this kind of yearning for having 1329 01:14:40,000 --> 01:14:43,800 Speaker 1: a free hand to utilize that violence to crush opposition. That 1330 01:14:43,960 --> 01:14:46,559 Speaker 1: is what binds all these guys together. Like, 1331 01:14:46,640 --> 01:14:49,200 Speaker 1: as you've stated, and I think this is an important point, 1332 01:14:49,560 --> 01:14:52,920 Speaker 1: it's not like Steve Bannon believes to the letter 1333 01:14:53,040 --> 01:14:55,599 Speaker 1: all of the crazy shit that Yarvin writes about, right. 1334 01:14:55,640 --> 01:14:58,320 Speaker 1: It's not like JD Vance is sleeping with a 1335 01:14:58,320 --> 01:15:01,559 Speaker 1: bunch of printouts of Unqualified Reservations under his bed. There's 1336 01:15:01,600 --> 01:15:04,360 Speaker 1: some ideas they find useful.
They like the way he messages, 1337 01:15:04,680 --> 01:15:07,200 Speaker 1: especially like the rage. They think that's 1338 01:15:07,240 --> 01:15:09,320 Speaker 1: good messaging for this thing that we want to do. 1339 01:15:09,960 --> 01:15:12,880 Speaker 1: But they're all kind of tied together by: we want 1340 01:15:12,960 --> 01:15:16,120 Speaker 1: supreme executive power so that we can wield violence against 1341 01:15:16,120 --> 01:15:19,920 Speaker 1: our enemies, right. And that's where they're all simpatico. Right, 1342 01:15:20,040 --> 01:15:22,640 Speaker 1: and that's I think kind of the most important takeaway 1343 01:15:22,680 --> 01:15:26,040 Speaker 1: here: it's not that all of these people have been enraptured 1344 01:15:26,080 --> 01:15:29,760 Speaker 1: by Curtis Yarvin and are like thought zombies to his beliefs. 1345 01:15:30,160 --> 01:15:33,000 Speaker 1: They find some of his ideas useful, and likewise, he 1346 01:15:33,120 --> 01:15:35,680 Speaker 1: and the people who follow him, like, they're all in 1347 01:15:35,760 --> 01:15:38,000 Speaker 1: agreement about one thing, and it's who they want to hurt. 1348 01:15:38,160 --> 01:15:43,720 Speaker 2: Yeah, it's interesting because I remember there 1349 01:15:43,760 --> 01:15:45,519 Speaker 2: was an article, and I think it was New York 1350 01:15:45,600 --> 01:15:50,639 Speaker 2: Magazine, during the run-up to the twenty sixteen election, 1351 01:15:51,400 --> 01:15:57,160 Speaker 2: that said, I think the phrasing was, electing Trump would 1352 01:15:57,200 --> 01:16:02,160 Speaker 2: be an extinction level event for the United States. Yeah, 1353 01:16:03,160 --> 01:16:05,280 Speaker 2: a lot of people are scared of a second Trump 1354 01:16:05,800 --> 01:16:10,320 Speaker 2: presidency because they feel like democracy would crumble under Trump.
1355 01:16:10,360 --> 01:16:14,519 Speaker 2: And that's been a, you know, rallying cry of 1356 01:16:14,560 --> 01:16:18,320 Speaker 2: the left now for quite a while. And I agree 1357 01:16:18,360 --> 01:16:20,679 Speaker 2: to some extent. I do think that he's a threat 1358 01:16:21,080 --> 01:16:24,160 Speaker 2: to democracy. I don't know that he wants to completely 1359 01:16:24,240 --> 01:16:27,320 Speaker 2: topple it, but I do think that how he 1360 01:16:27,400 --> 01:16:30,920 Speaker 2: handled the transition of power was very, very alarming and 1361 01:16:31,000 --> 01:16:37,680 Speaker 2: unnerving and should be disqualifying, at least 1362 01:16:37,800 --> 01:16:44,880 Speaker 2: for a party to nominate him, if not legally disqualifying. 1363 01:16:46,200 --> 01:16:49,560 Speaker 2: But I've always thought that his threat to democracy 1364 01:16:49,640 --> 01:16:53,519 Speaker 2: is more his unwieldiness and his sort of lust, Trump's 1365 01:16:54,280 --> 01:16:59,200 Speaker 2: just narcissism and lust for power and lust for staying 1366 01:16:59,280 --> 01:17:02,080 Speaker 2: the center of attention, like that was almost the 1367 01:17:02,120 --> 01:17:06,439 Speaker 2: most... That's why he's a threat. And 1368 01:17:06,479 --> 01:17:10,680 Speaker 2: that's just kind of wild and crazy and ridiculous 1369 01:17:10,720 --> 01:17:15,280 Speaker 2: of him, of Trump. But it is, it's dangerous 1370 01:17:15,360 --> 01:17:18,439 Speaker 2: if he gets there. But it's a 1371 01:17:18,720 --> 01:17:22,760 Speaker 2: kind of danger that is almost cartoonish, like 1372 01:17:22,840 --> 01:17:26,360 Speaker 2: it's just so ridiculous.
But what you've been laying 1373 01:17:26,360 --> 01:17:30,439 Speaker 2: out these last two episodes, yeah, is 1374 01:17:30,479 --> 01:17:39,000 Speaker 2: a much more sinister intellectual underpinning for dismantling democracy, 1375 01:17:39,360 --> 01:17:46,120 Speaker 2: and that is arguably a far more serious kind 1376 01:17:46,160 --> 01:17:52,400 Speaker 2: of threat, as a doctrine. Yeah. And 1377 01:17:52,479 --> 01:17:55,240 Speaker 2: I've never perceived Trump as someone driven by a sort 1378 01:17:55,280 --> 01:17:57,680 Speaker 2: of like anti-democratic doctrine. I just 1379 01:17:58,000 --> 01:18:01,160 Speaker 2: always thought, like, oh, he wants to be 1380 01:18:01,160 --> 01:18:05,880 Speaker 2: a dictator just because he's a megalomaniac and he wants 1381 01:18:05,920 --> 01:18:07,519 Speaker 2: to have all the power 1382 01:18:07,520 --> 01:18:09,559 Speaker 2: of a dictator. He wants to 1383 01:18:09,600 --> 01:18:13,640 Speaker 2: be like, you know, Mel Brooks, like it's good to 1384 01:18:13,640 --> 01:18:16,480 Speaker 2: be the king. Like that's all he wants. Yeah. Yeah. 1385 01:18:16,520 --> 01:18:19,920 Speaker 2: But here is a group of people, or a person 1386 01:18:20,720 --> 01:18:26,880 Speaker 2: with some measure of influence, really thinking this stuff through. 1387 01:18:28,240 --> 01:18:31,040 Speaker 1: You're right, and I think this has been generally missed.
1388 01:18:31,479 --> 01:18:34,000 Speaker 1: You know, people have started to, particularly like kind of 1389 01:18:34,000 --> 01:18:37,519 Speaker 1: the mainstream Democratic Party, but also just like centrists, 1390 01:18:37,520 --> 01:18:39,960 Speaker 1: you know, people who are not particularly political. 1391 01:18:40,439 --> 01:18:42,639 Speaker 1: I think most people who are people of goodwill don't 1392 01:18:42,640 --> 01:18:44,960 Speaker 1: want to live in a dictatorship, right. That would be 1393 01:18:45,280 --> 01:18:47,400 Speaker 1: how I would describe a person of goodwill: you 1394 01:18:47,400 --> 01:18:49,640 Speaker 1: don't want to institute a dictatorship. What? 1395 01:18:49,920 --> 01:18:50,120 Speaker 2: Well? 1396 01:18:50,160 --> 01:18:54,040 Speaker 1: I think that's a key part, right. I think people 1397 01:18:54,080 --> 01:18:57,799 Speaker 1: have started to realize the intellectual threat and the broader 1398 01:18:57,880 --> 01:19:01,240 Speaker 1: threat that Trumpism has opened the door for, with 1399 01:19:01,320 --> 01:19:04,120 Speaker 1: Project twenty twenty five. But what has happened with Trump? 1400 01:19:04,120 --> 01:19:05,760 Speaker 1: Because I think you're right on the money with your 1401 01:19:06,000 --> 01:19:09,160 Speaker 1: characterization of him. He is not an ideologue. If he 1402 01:19:09,240 --> 01:19:11,640 Speaker 1: thought he could have gotten to power running as a 1403 01:19:11,680 --> 01:19:14,000 Speaker 1: Democrat and, you know, still been the big man on top, 1404 01:19:14,040 --> 01:19:15,800 Speaker 1: he would have done that, right. That just wasn't 1405 01:19:15,960 --> 01:19:17,519 Speaker 1: what wound up working for him. 1406 01:19:17,600 --> 01:19:17,800 Speaker 2: Right.
1407 01:19:18,760 --> 01:19:21,320 Speaker 1: He is a Trumpist, in that he believes in himself, 1408 01:19:21,400 --> 01:19:24,240 Speaker 1: himself being the guy who is the most important person. 1409 01:19:24,840 --> 01:19:29,679 Speaker 1: But when he started running and when he first took 1410 01:19:29,680 --> 01:19:34,880 Speaker 1: over the Republican Party, he started breaking norms, right. And 1411 01:19:35,000 --> 01:19:36,880 Speaker 1: sometimes it's good to break norms, right. It used to 1412 01:19:36,880 --> 01:19:38,360 Speaker 1: be the norm that, you know, a huge chunk of 1413 01:19:38,360 --> 01:19:40,439 Speaker 1: this country was segregated, right, and that was something that 1414 01:19:40,439 --> 01:19:42,479 Speaker 1: had to be forcefully 1415 01:19:42,520 --> 01:19:44,400 Speaker 1: broken, right. The government had to deploy force 1416 01:19:44,479 --> 01:19:46,599 Speaker 1: to do that. And in that case that was very 1417 01:19:46,680 --> 01:19:50,120 Speaker 1: much a good thing. But when you break norms, it 1418 01:19:50,280 --> 01:19:55,400 Speaker 1: opens up space for people who have extreme views to 1419 01:19:55,560 --> 01:19:59,439 Speaker 1: push those views into the public sphere. And what happened 1420 01:19:59,479 --> 01:20:02,719 Speaker 1: with Trump is he destroyed the Republican Party as it had 1421 01:20:02,720 --> 01:20:05,200 Speaker 1: been; he wound up pushing out a lot of people 1422 01:20:05,280 --> 01:20:09,160 Speaker 1: who had been influential in that party, and that opened 1423 01:20:09,240 --> 01:20:12,840 Speaker 1: up space for a whole bunch of people. Basically, 1424 01:20:12,920 --> 01:20:16,799 Speaker 1: anyone could become an influential part of the Republican Party 1425 01:20:17,000 --> 01:20:19,280 Speaker 1: if you could do one key thing, which was bend 1426 01:20:19,360 --> 01:20:22,240 Speaker 1: the knee to Trump and also get along with him.
1427 01:20:22,560 --> 01:20:24,840 Speaker 1: And so guys like Stephen Miller, right, are people who 1428 01:20:24,960 --> 01:20:28,360 Speaker 1: understood how to do that, you know. That's like 1429 01:20:28,400 --> 01:20:30,920 Speaker 1: what JD Vance has done; that's how he became the VP, 1430 01:20:31,200 --> 01:20:34,280 Speaker 1: right. He started courting Trump, he started like helping 1431 01:20:34,360 --> 01:20:36,439 Speaker 1: out with stuff, like when Trump would do public 1432 01:20:36,479 --> 01:20:42,320 Speaker 1: appearances after that big chemical spill in East Palestine 1433 01:20:42,479 --> 01:20:44,439 Speaker 1: a year or so ago, Vance is the guy 1434 01:20:44,439 --> 01:20:46,400 Speaker 1: who handled a lot of the advance work for that. 1435 01:20:46,520 --> 01:20:48,920 Speaker 1: He impressed Trump. He was good at sucking up 1436 01:20:48,920 --> 01:20:51,799 Speaker 1: to him. And they did all this consciously, not because 1437 01:20:52,080 --> 01:20:54,439 Speaker 1: they are all believers that Trump should be the most 1438 01:20:54,439 --> 01:20:58,559 Speaker 1: important person in politics, but because they understood that with 1439 01:20:58,640 --> 01:21:01,599 Speaker 1: access to Trump comes the ability to kind of twist 1440 01:21:01,640 --> 01:21:04,240 Speaker 1: the country in this direction, if you can just convince 1441 01:21:04,320 --> 01:21:08,400 Speaker 1: him it's the way to go, right. And that, 1442 01:21:08,479 --> 01:21:11,360 Speaker 1: this kind of capture of the system, 1443 01:21:11,520 --> 01:21:13,400 Speaker 1: is what they 1444 01:21:13,439 --> 01:21:15,960 Speaker 1: are very consciously trying to do.
And so it's 1445 01:21:16,000 --> 01:21:18,760 Speaker 1: a situation where, because of who he is and the 1446 01:21:18,800 --> 01:21:21,559 Speaker 1: things that he has made possible, we do have to 1447 01:21:21,600 --> 01:21:25,040 Speaker 1: confront the fact that these very extreme people, who 1448 01:21:25,080 --> 01:21:28,200 Speaker 1: have a very dark vision for what our society should 1449 01:21:28,200 --> 01:21:33,640 Speaker 1: be, are kind of at the gates right now. And, 1450 01:21:33,640 --> 01:21:39,600 Speaker 1: you know, it's unfortunate, but I think at this point undeniable, 1451 01:21:40,560 --> 01:21:43,280 Speaker 1: and I do think it's a thing people kind 1452 01:21:43,280 --> 01:21:47,400 Speaker 1: of have to look at with clear eyes, 1453 01:21:47,560 --> 01:21:54,680 Speaker 1: because it's at this point a very immediate threat. Anyway, 1454 01:21:54,560 --> 01:22:00,800 Speaker 1: you had a good Friday? Ruining my day. Maybe it'll 1455 01:22:00,800 --> 01:22:01,000 Speaker 1: be up. 1456 01:22:02,080 --> 01:22:04,920 Speaker 2: I'm gonna go home and be like bitter and angry 1457 01:22:04,960 --> 01:22:09,400 Speaker 2: at my family. What are you doing to help save democracy? 1458 01:22:10,160 --> 01:22:10,360 Speaker 1: Dad? 1459 01:22:10,439 --> 01:22:13,400 Speaker 2: I'm three, but get out there. 1460 01:22:15,640 --> 01:22:18,120 Speaker 1: That's kind of a normal reaction after coming on the show. 1461 01:22:18,240 --> 01:22:21,519 Speaker 2: Unfortunately. Yeah, I'm gonna go home and be mad. 1462 01:22:22,040 --> 01:22:25,320 Speaker 1: I'm gonna be angry all weekend. Well Ed, I'm gonna 1463 01:22:25,320 --> 01:22:27,240 Speaker 1: be happy all weekend because I got to have a 1464 01:22:27,280 --> 01:22:28,759 Speaker 1: fun time talking with you today. 1465 01:22:29,560 --> 01:22:32,640 Speaker 2: This was fun. I really appreciate it. Thanks for having me.
1466 01:22:32,760 --> 01:22:36,559 Speaker 1: Yeah, yeah, thanks for being on. People should check out 1467 01:22:36,600 --> 01:22:41,040 Speaker 1: your podcast, SNAFU. Season two is great. I haven't 1468 01:22:41,040 --> 01:22:43,240 Speaker 1: listened to season one yet, but I'm sure it's also great. 1469 01:22:43,479 --> 01:22:46,920 Speaker 1: I'm excited to get into that as well soon. Yeah, 1470 01:22:46,960 --> 01:22:48,600 Speaker 1: Ed, anything else you want to plug before we 1471 01:22:48,680 --> 01:22:49,360 Speaker 1: roll out today? 1472 01:22:49,439 --> 01:22:56,400 Speaker 2: SNAFU season one. Season one is super fun. Yeah yeah, 1473 01:22:56,479 --> 01:23:00,519 Speaker 2: yeah yeah, so, yeah, you can check that out. 1474 01:23:01,080 --> 01:23:03,760 Speaker 1: Thank you, Ed, and everyone have a good rest of 1475 01:23:03,800 --> 01:23:09,760 Speaker 1: your week. Behind the Bastards is a production of Cool 1476 01:23:09,840 --> 01:23:12,720 Speaker 1: Zone Media. For more from Cool Zone Media, visit our 1477 01:23:12,720 --> 01:23:16,439 Speaker 1: website coolzonemedia dot com, or check us out on the 1478 01:23:16,479 --> 01:23:20,320 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.