M hmm. What's Dick and Matador's. I'm Robert Evans, host of Behind the Bastards, and that is the intro. People requested that intro, Sophie, I don't need to hear your guff about it. I do want, you know, people's feedback, but I don't need guff. Okay, this is a noun. I don't agree with that. Alright. Well, something was in there, Sophie. I think that was, by the standards of my recent introductions, one of the best ones I've done. I just don't like Crimes Against Potatoes. Well, that's actually totally fair. Um, but that introduction, uh, does tie in, yeah, slightly with today's episode. Uh, so, you know, first off, Jamie Loftus, you are our guest today. How you doing? Jamie co-hosts the Bechdel Cast and is the creative mind behind Boss, Whom Is Girl?, which is going to be in Edinburgh probably by the time this episode drops. The most entertaining show I've been to in, like, the last ten years. Fly to Edinburgh. I have, several times. And if you listen to this years later and Jamie Loftus is not in Edinburgh, fly to Edinburgh and demand that they take her back. Like, what happened there?
Yeah, there's... yeah, yeah, yeah, force it upon the town fathers of Edinburgh. I mean, really optimistic of you to even mention that as a possibility, as a time period that will exist. Yeah. Well, I'm an optimist. I'm not... I'm not... well, I don't know, I'm a mix of both, I guess. So today: normally this is a show where I talk about the very worst people in all of history, but today we're doing something a little bit different. We are getting very much behind the bastards, because we're going into some prehistory about where authoritarianism comes from, where fascism comes from, how it might be kind of programmed into our brains to an extent. Um, this is kind of a weird one, and it's based on, you know... I read about primarily dictators and terrible, unethical political leaders and corporate leaders, people who abuse power, uh, basically as a more-than-full-time job. And having done that for a year and change, you start to have some ideas about the nature of power and the nature of authoritarianism and the possibilities of the human race and stuff.
Um, so this is, like, uh, it's not a half-assed, but it's maybe a three-quarters-assed attempt at me putting some of that together and explaining some of the conclusions I've drawn, uh, from all of this research. And, uh, it's not ready for public consumption, but I'm gonna put it out there anyway, because I'm a hack and a fraud. So that's where we are. I've got Sophie and Jamie in the room to tear me down if what I'm saying is some nonsense. And... and... this would be really fun if your conclusions were actually... I was being kind of dramatic before, power is actually really good and I'm going to work for Koch Industries. Bye-bye. This, this ends on a, on a pro-Saddam-Hussein rant. Robert, you know what, leave. Yeah, I've become a Stalinist. Yeah. Oh god. I just want to hear you refer to yourself as a bit of a drama queen. Turns out this year I've just been a bit of a drama queen and things are actually fine. He's like, I've just been binging Mary-Kate and Ashley, you know, that's my new life.
As I learned from Mary-Kate and Ashley, everything's fine as long as you, as long as you lie to your... wait, Mary-Kate... I was too busy trying to remember exactly what had happened in, uh, in that movie where they trick their parents into both getting on a cruise ship with them to make their divorced mom and dad get back together. But hang on, isn't that one... that is... isn't that, uh, Parent Trap? Parent Trap? Yeah. Not Mary-Kate and Ashley, actually. Not actually twins, that's double Lindsay Lohan. Yeah. And... wait, was that Lindsay Lohan? That wasn't even Mary-Kate and Ashley? Goddamn it. I know. I think about, like... such as that time I tried to mention Ariana Grande in casual conversation last year, and you waited maybe a full ten seconds before being like, I don't know who you're talking about. I actually bring that up, thinking about it, once a week. I bring it up constantly with salespeople when they ask me why Robert can't pronounce normal words, and I give them the explanation that he can pronounce some crazy Russian shit but cannot pronounce Ariana Grande.
See, this is frustrating, because I feel like if you, if you add up the number of words that I read every week on this show, I have a very high pronunciation rate. Someone crunched those numbers. Yeah, I mean, we're talking about ten-thousand-ish words a week, you know, you get a couple wrong. Mistakes are made, but usually the words are, like, tree. Look, we... I love trees, and I pronounce them route. Why did you let us come on the show together? We're not going to get anything done. I needed, I needed some shit-talking of my, my, my philosophizing here, because if I'm going to do an episode where I unload a half-baked philosophy, somebody should be there to, like, point out when I've made a huge logical error or pronounced a name wrong. Okay, this actually will tie in with something we talk about later in the episode. I didn't plan that ahead of time, but it totally does, and I'm very proud of myself now. So let's get into it. In nineteen eighty-nine, Francis Fukuyama, a political scientist and author, wrote an essay titled "The End of History?" Looking forward just a little bit.
He was able to see that the fall of the Soviet Union was on the very near horizon. To Fukuyama, and to many at the dawn of the nineteen nineties, it looked as if a new epoch in human society was dawning, one in which the great historical shifts between empires and modes of government that had persisted for eons would cease, because mankind had clearly arrived at the perfect system: liberal democracy. So that's Fukuyama's thinking in, like, nineteen eighty-nine. Now, he turned his essay into a book in nineteen ninety-two, after the fall of the USSR, when it looked like he'd basically read the future. Uh, in The End of History, he wrote that humanity was witnessing, quote, not just the passing of a particular period of postwar history, but the end of history as such: that is, the end point of mankind's ideological evolution and the universalization of Western liberal democracy as the final form of human government. So, a bit of a bold claim to make. Uh, and a little less than thirty years later, in the Year of Our Lord twenty nineteen, Fukuyama's theories have not aged well.
Rather than living at the end of history, we now seem to be living through a period where these once-invulnerable liberal democracies are dropping faster than fat beats at a warehouse rave. Thank you. Thank you for that, Jamie. I'm proud of that one. Under Viktor Orban, Hungary has transitioned to what he calls an illiberal democracy. Under Tayyip Erdogan, uh, Turkey has moved very close to an outright dictatorship. The elections of Jair Bolsonaro in Brazil and Rodrigo Duterte in the Philippines hardly bode any better. But the situation is actually even worse than it looks based on all that. Researchers with a German institute, the Bertelsmann Stiftung, published a study back in March of two thousand eighteen that analyzed the quality of democracy, the market economy, and leadership of some one hundred and twenty-nine nations. They used this to put together what they called a Transformation Index that roughly analyzed the overall levels of freedom, authoritarianism, and inequality in those societies. They found that roughly one billion more people live under dictatorships now than did fifteen years ago. Wow, that's insane.
Yeah, it's not a great statistic. I don't like that. Mm hmm. That's all I have to say about it. That's my only comment. Not into it. Okay, Sophie, not a dictatorship fan. Bold stance to take, considering. Honestly, brave of you. I mean, we can add to that, because I feel like our relationship, Robert, is a dictatorship, and I am that dictator. Yes. In which direction? Nothing would surprise me. And like Stalin, you regularly throw oranges at me and make me watch cowboy movies while you drink. I've seen it. Yeah, absolutely. I hate to see it, and I see it. Yeah, but you are, I mean, you are my son, and I accept you for who you are. Yeah, except for when I don't want to watch cowboy movies. Yeah, not acceptable. Then, that's not acceptable. So, uh, yeah. So these guys looked into the number of people living in dictatorships and how that's changed, and they found out that there's a billion more people living under such regimes now than were fifteen years ago. I found a write-up of this study in The Local, which is a German newspaper. Quote:
Well, the researchers concluded that the number of people living in democracies rose from four billion to four point two billion between two thousand three and two thousand seventeen. They also found that three point three billion people lived under dictatorship last year, compared to two point three billion in two thousand three. So the trends aren't good. Uh, it said that the number of countries classified as having exemplary standards of free and fair elections had dropped from one in six in two thousand six to one in fourteen last year. And while seventeen of the one hundred and twenty-nine countries were considered to have completely unrestricted freedom of press and opinion in two thousand six, that was the case for just ten countries last year. Okay, so we're not, we're not on a great trend line, if you want to look at history that way. Um. Now, in every era there are philosophers whose purpose seems to be to reinforce in the minds of the great and good that whatever systems put them in their lofty place are right and decent and perfect.
Francis Fukuyama was that man, cheering the victors of the Cold War on and assuring them that their world order would persist for all time. And now that mankind seems to be sinking into a darker and more authoritarian era, a new philosopher has arrived to praise the righteousness of this shift. His name is Dr. Jordan Peterson. God, I was having a perfectly lovely conversation. Yeah, we were having a conversation while you were talking, and then I got the Jordan Peterson momo and just had a sharp pain. But before we get into that sharp pain, what would, what would you sell at the Lofty Place? What would you sell at the Lofty Place? All sorts of shit. I mean, it would be... and also none of it would be... it would be, I think, probably a front for something else. Like, it would be mostly... I don't even know. I'm not, I'm not a very good consumer. So, yeah, it would be a shop for... okay, no, it's just supposed to be a shop, but it's actually a trap. Um, so I'll, you know, like, lure in like sword guys and capture Mark Zuckerberg.
Yeah, exactly, like guys who own swords. And then it's really kind of like a re-education kind of thing, and the doors snap shut, and then I teach the sword guys a thing or two about a thing or two, and then they leave mentally healthier. I can respect that. Back to Jordan Peterson. Well, I mean, speaking of sword guys, he kind of ties in there. But he also ties into authoritarianism, because in his best-selling book Twelve Rules for Life: An Antidote to Chaos, Dr. Peterson argues that strict hierarchies are natural and healthy, at least to some extent. According to a write-up in The Conversation, quote: To prove his point, Peterson uses the example of lobsters, which humans share a common evolutionary ancestor with. Peterson argues that, like humans, lobsters exist in hierarchies and have a nervous system attuned to status, which runs on serotonin, a brain chemical often associated with feelings of happiness. The higher up a hierarchy a lobster climbs, this brain mechanism helps to make more serotonin available. The more defeat it suffers, the more restricted the serotonin supply.
Lower serotonin is, in turn, associated with more negative emotions, perhaps making it harder to climb back up the ladder. According to Peterson, hierarchies in humans work in a similar way: we are wired to live in them. So that's Dr. Peterson. And as much as I might personally find him frustrating and disagreeable, he's not the only person making arguments like this. Do we have a se... do we have a second source? I just don't like his... We've got other sources. And, and yeah, unfortunately, we have other sources. Uh, there's a distressing amount of data that reinforces the idea that strict hierarchy may be more natural to humankind than the egalitarian future those of us who grew up watching Star Trek: The Next Generation might prefer. In two thousand eight, a scientific study on hierarchy and the human brain started making the rounds.
Online, websites like PBS NewsHour summarized it with headlines like this: "Social status is hardwired into the brain, study shows." The research this article and others like it discussed was based on a study conducted by the National Institute of Mental Health, using an fMRI to measure the brain activity of seventy-two people playing a computer game with financial rewards on the line. According to the press release, quote: They were assigned a status that they were told was based on their playing skill. In fact, the game outcomes were predetermined and the other players simulated by computer. Participants intermittently saw pictures and scores of an inferior and a superior player they thought were simultaneously playing in other rooms. Although they knew the perceived players' scores would not affect their own outcomes or rewards, and were instructed to ignore them, participants' brain activity and behavior were highly influenced by their position in the implied hierarchy. Now, I found a more detailed breakdown of that study by an actual scientist, Dr.
Kambiz Kamrani, writing on anthropology dot net, and he notes, quote: Overall, this observation implies that social status is highly valued in our subconscious minds, even as much as money. The press is gorging itself on the sound bite; they just love it when something as complex as social hierarchy and brain function is reduced to something as simple as gaining money. Another interesting observation involved subjects that were presented with a superior competitor in the game. When that happened, it triggered activity in, quote, an area near the front of the brain that appears to size people up, making interpersonal judgments and assessing social status. A circuit involving the mid-front part of the brain that processes the intentions and motives of others, and emotion-processing areas deep in the brain, activated when the hierarchy became unstable, allowing for upward and downward mobility. That is the prefrontal cortex. The judge-bitch cortex? Yeah. Dr. Kamrani goes on to write: These results kind of thwart any utopian anarchists out there.
This data shows that our hierarchical consciousness seems to be ingrained in the human brain, so much so that there are distinct circuits activated by concerns over social rank. So, like... so, like, what kind of a study is this? Like, did he... how many? It's an fMRI study. So they're doing these sorts of things to try to put people in situations where they would be specifically, like, maybe led to think about their rank in a hierarchy, and they're also measuring their brains at the same time. And they're finding that, like, because of the way that people's brains react in these studies, it suggests that parts of our brain are hardwired, um, to view ourselves as part of a social hierarchy, as opposed to, like, human beings inherently being egalitarian and social hierarchy being something that's falsely imposed on us from outside. Like, the structure of our brain, uh, seems to reinforce hierarchy. And do we know, is there, like... are we pulling from, like, a wide group of... yeah, a sample, yeah, a sample size.
This is, this is just one study, um, but there are other studies that have found similar things. That specific study does say the sample size of the group. Yeah, this one's, I think, a seventy-two-person study, this most recent one here. Um, but this is, this is like sort of emblematic of one sort of strain of research, and there's a couple of other studies in it that talk about, um, hierarchy and stuff. So I'm not trying to present this as, like, the end-all be-all, but it is kind of a bummer to read stuff like that, um, because, yeah, it would seem to push the conclusion that we are, to some extent at least, irrevocably chained to hierarchy, to systems of inequality, uh, and, in other words, to a world dominated by bastards. And while that would mean eternal job security for Sophie and I, it's not a world that I want to live in, particularly. So I dug a little bit deeper, with a little bit of desperation to it.
Um, and I found evidence to suggest that our primate ancestors, or at least many of our primate ancestors, would have been beings with strict social hierarchies. Scientists think this is plausible because many of our modern ape and monkey relatives show evidence of this too. Gibbons are strictly monogamous, chimpanzees have elaborate sexual hierarchies, silverback gorillas don't exactly work out their differences in mutual self-criticism sessions. Um, I'm trying not to be too absolutist with anything here, but I think it's, it's fair to say, based on a lot of anthropological research, that many of the pre-human primates we descended from would have behaved in similar ways, which is probably why we have brains that are, to some extent, hardwired for hierarchy. Um. Now, this gets more complicated when you also slot in the fact that an increasing body of anthropological research suggests that many of our hunter-gatherer ancestors would have lived in egalitarian communities.
Which is the conclusion that scientists increasingly draw as they study ancient man and modern hunter-gatherers, who are sort of seen as kind of a stand-in for our ancestors. Um, which kind of suggests that, tens of thousands of years in our past, at some point, you know, we sort of evolved with these, like, structures in our brain that kind of function the same way as, like, an addiction to hierarchy, and at some point in our past we got over it, for a period of, like, thousands and thousands and thousands of years. Um, so, yeah, that's interesting to me. Now, in two thousand twelve, researchers writing for the journal Human Nature published the results of a study into a sample of fifty-three human societies in which polyandrous unions were common. Um... I see where we're going here, Robert. I don't know, this is just a little bit... but not for the most part... it's just, just a little bit, just a little bit around your lifestyle. I can't believe this.
Now we demonstrate that 333 00:18:56,080 --> 00:18:58,399 Speaker 1: although polyandry is rare, it is not as rare as 334 00:18:58,400 --> 00:19:00,879 Speaker 1: commonly believed. It is found worldwide and is most 335 00:19:00,920 --> 00:19:04,639 Speaker 1: common in egalitarian societies. We also argue that polyandry likely 336 00:19:04,640 --> 00:19:08,600 Speaker 1: existed during yeah early human history and should be examined 337 00:19:08,640 --> 00:19:11,880 Speaker 1: from an evolutionary perspective. Our analysis reveals that it may 338 00:19:11,880 --> 00:19:15,320 Speaker 1: be a predictable response. Okay, here's the thing. It's a 339 00:19:15,359 --> 00:19:19,840 Speaker 1: predictable response to a high operational sex ratio favoring males, 340 00:19:19,840 --> 00:19:21,600 Speaker 1: and may also be a response to high rates of 341 00:19:21,600 --> 00:19:29,280 Speaker 1: male mortality and possibly male absenteeism. Longer makes sense. No, no, no, no, 342 00:19:29,320 --> 00:19:32,600 Speaker 1: that's not what it is. Because men in ancient societies 343 00:19:32,640 --> 00:19:35,240 Speaker 1: would have died so often, it didn't make sense for 344 00:19:35,280 --> 00:19:37,440 Speaker 1: people to be strictly monogamous. So you should if you're 345 00:19:37,440 --> 00:19:39,400 Speaker 1: going to have men dying at a high rate because 346 00:19:39,400 --> 00:19:42,280 Speaker 1: they're out hunting stuff. It makes sense if everyone in 347 00:19:42,320 --> 00:19:44,440 Speaker 1: the tribe raises all of the kids, and if people 348 00:19:44,440 --> 00:19:47,879 Speaker 1: don't have strong bonds of monogamy. That's what they're saying. Um, 349 00:19:47,920 --> 00:19:51,320 Speaker 1: ancient people weren't polyandrous because they were making an ethical 350 00:19:51,400 --> 00:19:54,080 Speaker 1: choice about it being more ethical than monogamy.
It's just 351 00:19:54,200 --> 00:19:55,960 Speaker 1: if there's a hundred and fifty of you in your 352 00:19:55,960 --> 00:19:58,800 Speaker 1: tribe and people are dropping all the time because they're 353 00:19:58,800 --> 00:20:01,960 Speaker 1: out fucking hunting woolly mammoths and shit, it doesn't make sense 354 00:20:02,160 --> 00:20:06,080 Speaker 1: to like have, oh, her husband died, so now 355 00:20:06,119 --> 00:20:08,440 Speaker 1: her kids don't get food. Like, that's not a great 356 00:20:08,440 --> 00:20:10,280 Speaker 1: way to if there's not that many of you, you 357 00:20:10,359 --> 00:20:12,640 Speaker 1: just can't live that way. You gotta stay horny, stay 358 00:20:12,680 --> 00:20:16,840 Speaker 1: frothy all the time. Yeah, Or it's it's more that like, uh, 359 00:20:17,040 --> 00:20:19,520 Speaker 1: Like one of the things that's really common in particularly 360 00:20:19,560 --> 00:20:23,000 Speaker 1: a lot of Latin American tribal societies is they have 361 00:20:23,160 --> 00:20:26,360 Speaker 1: these beliefs about sperm that once a woman gets pregnant, 362 00:20:26,760 --> 00:20:29,359 Speaker 1: every guy she has sex with after the pregnancy has 363 00:20:29,400 --> 00:20:33,200 Speaker 1: started contributes sperm that helps build the child. And so 364 00:20:33,400 --> 00:20:37,040 Speaker 1: kids have multiple fathers in the tribe and that means 365 00:20:37,080 --> 00:20:38,639 Speaker 1: that like if two or three of them die, you 366 00:20:38,720 --> 00:20:40,520 Speaker 1: still got three or four dads, and like they're all 367 00:20:40,560 --> 00:20:43,919 Speaker 1: responsible for teaching the kids certain things, which is a 368 00:20:44,000 --> 00:20:46,560 Speaker 1: really logical way to have a society if there's if 369 00:20:46,600 --> 00:20:49,800 Speaker 1: you're a hunter gatherer tribe, it makes sense. Logic.
I've 370 00:20:49,840 --> 00:20:54,280 Speaker 1: never found more daddies to be I don't need there 371 00:20:54,320 --> 00:20:56,560 Speaker 1: to be like girls that have or boys that have 372 00:20:56,800 --> 00:20:59,760 Speaker 1: like more than one dad daddy, unless we're talking about 373 00:20:59,800 --> 00:21:04,880 Speaker 1: sugar daddy culture, in which case you do you Yeah, well, 374 00:21:05,400 --> 00:21:07,760 Speaker 1: I think you guys and your reaction when I started 375 00:21:07,760 --> 00:21:10,919 Speaker 1: talking about polyandry does make sense to me because like 376 00:21:10,960 --> 00:21:13,800 Speaker 1: I'm I'm polyamorous, and I'm I'm familiar with a lot 377 00:21:13,840 --> 00:21:16,240 Speaker 1: of frustrating people in that community who will make claims 378 00:21:16,280 --> 00:21:19,080 Speaker 1: that like, oh, it's more natural. Um, it's more ethical 379 00:21:19,119 --> 00:21:21,159 Speaker 1: because our ancestors did it. And it's important to note 380 00:21:21,240 --> 00:21:23,879 Speaker 1: that, like, no, our ancestors, to the extent that they 381 00:21:23,880 --> 00:21:27,000 Speaker 1: were polyandrous, didn't do it for ethical reasons. They did 382 00:21:27,000 --> 00:21:29,280 Speaker 1: it because it like made logical sense for the 383 00:21:29,320 --> 00:21:33,960 Speaker 1: world that they lived in. Yeah, it's it's when it's 384 00:21:34,000 --> 00:21:36,400 Speaker 1: just something to tweet about.
I love when I see 385 00:21:36,400 --> 00:21:40,200 Speaker 1: my polyamorous friends and I and uh, you know, I'm 386 00:21:40,600 --> 00:21:43,840 Speaker 1: they asked me how my monogamous relationship is going, and 387 00:21:43,920 --> 00:21:46,800 Speaker 1: I'm like, oh, it's good, and they're like, well, wow, 388 00:21:47,520 --> 00:21:51,080 Speaker 1: you're missing out over here, and I was like, okay, gang, 389 00:21:51,680 --> 00:21:54,360 Speaker 1: let's just play the board game, all right, just don't 390 00:21:54,440 --> 00:21:57,720 Speaker 1: front, just be respectful of each other. And the other 391 00:21:57,800 --> 00:22:01,159 Speaker 1: thing that is important here is if we're talking about 392 00:22:01,560 --> 00:22:03,320 Speaker 1: like this book Sex at Dawn, which is a really 393 00:22:03,359 --> 00:22:06,560 Speaker 1: interesting book, focuses a lot on one of like the 394 00:22:06,640 --> 00:22:10,159 Speaker 1: polyandrous species of monkeys, bonobos, but ignores that there are 395 00:22:10,200 --> 00:22:12,640 Speaker 1: a lot of monogamous species as well. So it's it's 396 00:22:12,760 --> 00:22:16,240 Speaker 1: entirely possible that, like, we descend more from monogamous types 397 00:22:16,280 --> 00:22:19,000 Speaker 1: of primates than we do from polyandrous types of primates. 398 00:22:19,280 --> 00:22:22,160 Speaker 1: And if that's the case, then this period of time 399 00:22:22,160 --> 00:22:25,479 Speaker 1: in which most human beings were polyandrous isn't a return 400 00:22:25,520 --> 00:22:27,959 Speaker 1: to some natural state. It wasn't like natural for them.
It was something 401 00:22:28,480 --> 00:22:31,360 Speaker 1: that they evolved to do, and no more natural than 402 00:22:31,400 --> 00:22:33,359 Speaker 1: like a cell phone, and like a cell phone was 403 00:22:33,480 --> 00:22:37,280 Speaker 1: essentially like an adaptation people developed over time in order 404 00:22:37,320 --> 00:22:39,600 Speaker 1: to increase their odds of survival, which is what I'm 405 00:22:39,600 --> 00:22:42,560 Speaker 1: getting to here. It is a bigger point than polyandry. 406 00:22:43,119 --> 00:22:49,040 Speaker 1: What else can increase your chances of survival? Products and 407 00:22:49,080 --> 00:22:58,280 Speaker 1: services it's an ad break, Damn they will, especially if 408 00:22:58,280 --> 00:23:01,800 Speaker 1: it's dick Pills, which fits right into what we're talking 409 00:23:01,840 --> 00:23:07,760 Speaker 1: about God, but also Koch Industries, Fox News right, yeah, right, yeah, 410 00:23:08,280 --> 00:23:21,480 Speaker 1: fuck like monkeys. Thanks to Dick Pills products, we're back, 411 00:23:21,760 --> 00:23:24,440 Speaker 1: and I'm I'm continuing to build to my larger point, 412 00:23:24,480 --> 00:23:27,960 Speaker 1: which is gonna keep going on. So there's a two 413 00:23:27,960 --> 00:23:31,160 Speaker 1: thousand fifteen study by University College London, which 414 00:23:31,160 --> 00:23:33,119 Speaker 1: put forward the same suggestion that men and women in 415 00:23:33,160 --> 00:23:36,719 Speaker 1: pre agricultural human society had likely lived in relative equality. 416 00:23:37,119 --> 00:23:40,160 Speaker 1: Mark Dyble, lead author of the study, said sexual equality 417 00:23:40,240 --> 00:23:43,280 Speaker 1: is one of the important changes that distinguishes humans. It 418 00:23:43,320 --> 00:23:45,639 Speaker 1: hasn't really been highlighted before. So again, this guy is 419 00:23:45,680 --> 00:23:47,960 Speaker 1: saying this is a change. It was...
It's not a 420 00:23:48,000 --> 00:23:49,919 Speaker 1: thing that came naturally to us. It's something that we 421 00:23:49,960 --> 00:23:54,360 Speaker 1: adapted to for specific benefits. So these two studies are 422 00:23:54,359 --> 00:23:57,760 Speaker 1: part of a surprisingly large and, to me, pretty convincing body 423 00:23:57,760 --> 00:23:59,880 Speaker 1: of research which makes the case both that the now 424 00:24:00,040 --> 00:24:02,600 Speaker 1: standard nuclear family has not been the norm for much 425 00:24:02,640 --> 00:24:05,720 Speaker 1: of human history, and that human society in the days 426 00:24:05,720 --> 00:24:08,119 Speaker 1: when life was nasty, brutish and short was also a 427 00:24:08,119 --> 00:24:11,320 Speaker 1: lot more equal and less exploitative than it is today, 428 00:24:11,760 --> 00:24:14,159 Speaker 1: not just for reasons of the kind of sexual bonds 429 00:24:14,200 --> 00:24:16,520 Speaker 1: people had. I'm gonna quote from a Guardian report on 430 00:24:16,560 --> 00:24:19,000 Speaker 1: the matter. The first real splash in this arena came 431 00:24:19,000 --> 00:24:22,040 Speaker 1: from the anthropologist Lewis Morgan and his book Ancient Society. 432 00:24:22,160 --> 00:24:24,720 Speaker 1: In the book, Morgan presented the results of his study 433 00:24:24,760 --> 00:24:27,920 Speaker 1: of the Iroquois, a Native American hunter gatherer society in 434 00:24:28,000 --> 00:24:31,200 Speaker 1: upstate New York. The Iroquois Morgan observed lived in large 435 00:24:31,240 --> 00:24:34,080 Speaker 1: family units based on polyamorous relationships, in which men and 436 00:24:34,080 --> 00:24:37,080 Speaker 1: women lived in general equality. Morgan's work had a broader 437 00:24:37,119 --> 00:24:39,400 Speaker 1: audience when it was taken up by Friedrich Engels, most 438 00:24:39,440 --> 00:24:41,919 Speaker 1: famous for being the co author of the Communist Manifesto.
439 00:24:41,960 --> 00:24:44,720 Speaker 1: In his book The Origin of the Family, Private Property and 440 00:24:44,760 --> 00:24:47,439 Speaker 1: the State, Engels drew on Morgan's data, as well as 441 00:24:47,480 --> 00:24:50,080 Speaker 1: evidence from around the world, to argue that prehistoric societies 442 00:24:50,080 --> 00:24:53,920 Speaker 1: had lived in what he called primitive communism. Other anthropologists 443 00:24:53,920 --> 00:24:58,359 Speaker 1: now call this fierce egalitarianism, societies where families were based 444 00:24:58,359 --> 00:25:01,639 Speaker 1: on polyamory, in which people lived in active equality, 445 00:25:01,720 --> 00:25:06,320 Speaker 1: i.e. equality is enforced. And that's the key part here. 446 00:25:06,920 --> 00:25:11,520 Speaker 1: Enforced. In our society, rules are enforced unevenly and imperfectly 447 00:25:11,640 --> 00:25:14,720 Speaker 1: by law enforcement of varying stripes. But we all accept 448 00:25:14,800 --> 00:25:17,040 Speaker 1: that most of the things we consider crimes will not 449 00:25:17,119 --> 00:25:19,879 Speaker 1: be punished. Most drug users won't be busted, most men 450 00:25:19,920 --> 00:25:22,680 Speaker 1: who beat their wives won't go to jail. A significant 451 00:25:22,760 --> 00:25:24,720 Speaker 1: share of murderers get away with their crimes. And I probably don't 452 00:25:24,720 --> 00:25:27,119 Speaker 1: need to point out to this audience that the number 453 00:25:27,160 --> 00:25:30,119 Speaker 1: of rapists who don't get punished for their crimes is 454 00:25:30,160 --> 00:25:35,280 Speaker 1: way higher than that. Um. So we can understand that statistic, 455 00:25:36,600 --> 00:25:38,800 Speaker 1: and that's you can.
You can you can have a 456 00:25:38,880 --> 00:25:42,360 Speaker 1: society that more or less functions with those statistics when 457 00:25:42,359 --> 00:25:45,280 Speaker 1: there's hundreds of millions of you, and there's way more 458 00:25:45,320 --> 00:25:48,399 Speaker 1: food than everyone needs to eat, and the margins of 459 00:25:48,440 --> 00:25:53,280 Speaker 1: survival for our social groups are pretty wide. Primitive hunter 460 00:25:53,400 --> 00:25:57,000 Speaker 1: gatherer humans, however, lived in small bands of several dozen 461 00:25:57,040 --> 00:25:59,159 Speaker 1: to perhaps a hundred and fifty or so at the 462 00:25:59,280 --> 00:26:01,560 Speaker 1: large end of things. They lived in a world in 463 00:26:01,640 --> 00:26:03,440 Speaker 1: a time in which the margins of life and death 464 00:26:03,480 --> 00:26:07,080 Speaker 1: were much thinner. Their tribes could not survive people stealing 465 00:26:07,119 --> 00:26:10,320 Speaker 1: food from each other or committing multiple murders. This is 466 00:26:10,359 --> 00:26:13,240 Speaker 1: one of the reasons why some scientists suspect polyamory was 467 00:26:13,280 --> 00:26:17,760 Speaker 1: so common among humans in this period. Um, or 468 00:26:17,800 --> 00:26:19,879 Speaker 1: did I miss it? Did you say the average lifespan 469 00:26:20,040 --> 00:26:24,520 Speaker 1: around this time? That's not a super useful statistic because 470 00:26:24,560 --> 00:26:27,080 Speaker 1: of infant mortality. Like one of the mistakes 471 00:26:27,080 --> 00:26:28,439 Speaker 1: a lot of people make when they think about the 472 00:26:28,440 --> 00:26:30,560 Speaker 1: past is like, oh, the average lifespan was thirty five. 473 00:26:30,600 --> 00:26:32,560 Speaker 1: That means that at thirty you were an old man.
No, if 474 00:26:32,560 --> 00:26:34,240 Speaker 1: you make it to thirty, you're probably gonna live to 475 00:26:34,280 --> 00:26:37,240 Speaker 1: fifty or sixty at least, and seventy wouldn't even be crazy. 476 00:26:37,480 --> 00:26:39,680 Speaker 1: It's just that so many fucking babies are dying back 477 00:26:39,720 --> 00:26:45,560 Speaker 1: then that it drops the average a lot. Yeah. Yeah, 478 00:26:45,600 --> 00:26:47,679 Speaker 1: it's not at all weird for people who make it 479 00:26:47,720 --> 00:26:49,960 Speaker 1: to thirty to live to like sixty even back then. 480 00:26:50,320 --> 00:26:52,399 Speaker 1: You know, maybe fifty would be a lot more common. 481 00:26:52,600 --> 00:26:56,240 Speaker 1: Sixty's still really old then, but people aren't. It's not 482 00:26:56,280 --> 00:26:58,760 Speaker 1: the norm to die at thirty. It's the norm to 483 00:26:58,840 --> 00:27:00,960 Speaker 1: die as a baby. I mean, yeah, it's like I mean, 484 00:27:01,119 --> 00:27:03,280 Speaker 1: by the time you're thirty, you're on a roll in 485 00:27:03,359 --> 00:27:06,200 Speaker 1: terms of being alive. Yes, you're probably a pretty tough 486 00:27:06,240 --> 00:27:08,080 Speaker 1: son of a bitch if you make it to thirty 487 00:27:08,200 --> 00:27:12,960 Speaker 1: in that kind of world, or daughter of a bastard um, 488 00:27:13,000 --> 00:27:15,400 Speaker 1: although there would not have been a lot of bastards 489 00:27:15,400 --> 00:27:18,199 Speaker 1: back then because societies couldn't survive them, which is the 490 00:27:18,240 --> 00:27:23,760 Speaker 1: point I'm building to.
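[Editor's note: the arithmetic behind that point can be sketched in a few lines. The numbers below are invented purely for illustration, not taken from any study: a population where a large share dies in infancy and the survivors die in middle or old age still shows a low mean lifespan.]

```python
# Illustrative sketch with invented numbers: how high infant mortality
# drags the mean lifespan down even when adults routinely reach old age.

# Hypothetical cohort of 100: 40 die in infancy (age 0),
# the other 60 survive and die around age 55.
ages_at_death = [0] * 40 + [55] * 60

# Naive "average lifespan" over the whole cohort.
mean_lifespan = sum(ages_at_death) / len(ages_at_death)

# Life expectancy conditional on surviving to adulthood (age 30+).
adults = [age for age in ages_at_death if age >= 30]
adult_mean = sum(adults) / len(adults)

print(mean_lifespan)  # 33.0 -- reads as "people died in their early thirties"
print(adult_mean)     # 55.0 -- but anyone who reached 30 lived far longer
```

The same cohort yields a mean of 33 and an adult life expectancy of 55, which is exactly the gap the speakers are describing: the low average reflects dead babies, not thirty-year-old elders.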
So yeah, So, ancient tribal people 491 00:27:23,800 --> 00:27:27,280 Speaker 1: had a huge number of what are called leveling mechanisms 492 00:27:27,320 --> 00:27:31,679 Speaker 1: that's the anthropological term, to defend themselves against dangerous members 493 00:27:31,680 --> 00:27:33,560 Speaker 1: of the group. And this is where I get to 494 00:27:33,600 --> 00:27:35,640 Speaker 1: tell you, guys, one of my very very very very 495 00:27:35,800 --> 00:27:39,240 Speaker 1: very favorite stories. Have you ever heard of the shaming 496 00:27:39,280 --> 00:27:43,679 Speaker 1: of the meat? No? No. It has nothing to 497 00:27:43,680 --> 00:27:45,800 Speaker 1: do with sex. It has something to do with gender, 498 00:27:45,800 --> 00:27:52,560 Speaker 1: but nothing to do with sex whatever. Richard Borshay Lee 499 00:27:52,680 --> 00:27:55,240 Speaker 1: is a Canadian anthropologist who has spent a huge amount 500 00:27:55,240 --> 00:27:58,040 Speaker 1: of time living with and studying and writing about modern 501 00:27:58,119 --> 00:28:01,480 Speaker 1: hunter gatherer peoples like the !Kung and the Ju/'hoansi, both 502 00:28:01,480 --> 00:28:03,720 Speaker 1: of which I'm sure I've mispronounced the names of. Like 503 00:28:03,760 --> 00:28:06,840 Speaker 1: the !Kung, you have to do like a weird I I can't. 504 00:28:06,880 --> 00:28:08,800 Speaker 1: I just I'm not gonna be able to. But they're 505 00:28:08,840 --> 00:28:11,600 Speaker 1: the !Kung people. Uh. These are people who exist today 506 00:28:11,600 --> 00:28:13,640 Speaker 1: in our modern, connected world, but the rhythms of their 507 00:28:13,640 --> 00:28:16,880 Speaker 1: lives and tactics of their societies are seen by anthropologists 508 00:28:16,880 --> 00:28:19,720 Speaker 1: as sort of a window into the human past. Uh. 509 00:28:19,880 --> 00:28:22,280 Speaker 1: Studying them is not a perfect look at our ancestors, 510 00:28:22,280 --> 00:28:24,760 Speaker 1: but it's about as good as we can get.
In 511 00:28:24,760 --> 00:28:27,840 Speaker 1: the nineteen seventies, Richard spent time living with the !Kung, 512 00:28:28,040 --> 00:28:30,280 Speaker 1: and near the end of this period embedded with them, 513 00:28:30,320 --> 00:28:33,320 Speaker 1: it just so happened to coincide with Christmas, and out 514 00:28:33,320 --> 00:28:35,560 Speaker 1: of a sense of festivity and a desire to express 515 00:28:35,600 --> 00:28:38,520 Speaker 1: his gratitude towards the tribe for hosting him, Richard Lee 516 00:28:38,560 --> 00:28:41,480 Speaker 1: bought a gigantic ox to present to them so that everybody 517 00:28:41,480 --> 00:28:44,200 Speaker 1: could have a sweet ass feast. Now, the ox he 518 00:28:44,320 --> 00:28:46,960 Speaker 1: picked weighed twelve hundred pounds, which meant that it was 519 00:28:47,080 --> 00:28:49,280 Speaker 1: enough meat for every man, woman and child among the 520 00:28:49,280 --> 00:28:51,800 Speaker 1: tribe to get like four pounds of meat. So 521 00:28:51,880 --> 00:28:54,480 Speaker 1: he's like, this is fucking awesome. Like, this 522 00:28:54,520 --> 00:28:55,920 Speaker 1: is gonna be a great gift. This is a great 523 00:28:55,920 --> 00:28:58,240 Speaker 1: way to show my gratitude to them. They're gonna love 524 00:28:58,280 --> 00:29:01,800 Speaker 1: this shit. So I'm going to quote now from what 525 00:29:01,880 --> 00:29:04,840 Speaker 1: he wrote about this experience, which is basically an 526 00:29:04,880 --> 00:29:07,640 Speaker 1: article titled Eating Christmas in the Kalahari, which you can 527 00:29:07,640 --> 00:29:09,440 Speaker 1: find online. It will be in the source notes. It's 528 00:29:09,440 --> 00:29:13,640 Speaker 1: a great read. Quote. The next morning, word spread among 529 00:29:13,640 --> 00:29:16,240 Speaker 1: the people that the big solid black one was 530 00:29:16,280 --> 00:29:19,080 Speaker 1: the ox chosen by Unta my bushman name.
It means 531 00:29:19,160 --> 00:29:22,720 Speaker 1: roughly whitey for the Christmas feast. That afternoon, I received 532 00:29:22,760 --> 00:29:26,440 Speaker 1: the first delegation. Binnah, an outspoken sixty year old mother 533 00:29:26,480 --> 00:29:28,840 Speaker 1: of five, came to the point slowly where were you 534 00:29:28,880 --> 00:29:32,240 Speaker 1: planning to eat Christmas right here, I replied, a loner 535 00:29:32,280 --> 00:29:34,440 Speaker 1: with others, I expect to invite all the people to 536 00:29:34,480 --> 00:29:37,320 Speaker 1: eat Christmas with me. Eat what I have purchased. You 537 00:29:37,360 --> 00:29:39,360 Speaker 1: have as black ox, and I am going to slaughter 538 00:29:39,400 --> 00:29:41,560 Speaker 1: and cook it. That's what we were told at the well, 539 00:29:41,600 --> 00:29:44,480 Speaker 1: but refused to believe it until we heard it from yourself. Well, 540 00:29:44,520 --> 00:29:47,440 Speaker 1: it's the black one, I replied expansively, although wondering what 541 00:29:47,560 --> 00:29:50,280 Speaker 1: she was driving at. Oh no, ben A groaned, turning 542 00:29:50,320 --> 00:29:52,720 Speaker 1: to her group. They were right. Turning back to me, 543 00:29:52,800 --> 00:29:54,880 Speaker 1: she asked, do you expect us to eat that bag 544 00:29:54,920 --> 00:29:59,120 Speaker 1: of bones? Bag of bones? It's the biggest ox here, big, yes, 545 00:29:59,200 --> 00:30:01,880 Speaker 1: but old and fin. Everybody knows there's no meat on 546 00:30:01,920 --> 00:30:03,680 Speaker 1: that old ox. What did you expect us to eat? 547 00:30:03,680 --> 00:30:06,440 Speaker 1: Off it? The horns? Everybody chuckled at ben as one 548 00:30:06,480 --> 00:30:08,840 Speaker 1: liner as they walked away. But all I can manage 549 00:30:08,840 --> 00:30:11,920 Speaker 1: was a weak grin. Are you wondering where this is going? 550 00:30:12,400 --> 00:30:15,600 Speaker 1: You're gonna fucking you. You're gonna fucking love it. 
Okay, 551 00:30:15,600 --> 00:30:20,280 Speaker 1: I'm excited. Great. Over the next several days, tribesmen and 552 00:30:20,280 --> 00:30:23,280 Speaker 1: women and children would make repeated mocking gibes to Richard 553 00:30:23,320 --> 00:30:25,720 Speaker 1: about the scrawny size of the enormous ox that he 554 00:30:25,840 --> 00:30:28,240 Speaker 1: bought them. Now, I don't know Lee, but reading his 555 00:30:28,320 --> 00:30:30,600 Speaker 1: writing you get the feeling he's a very open minded, 556 00:30:30,680 --> 00:30:33,040 Speaker 1: friendly and hard to rattle sort of dude, which you'd 557 00:30:33,040 --> 00:30:35,360 Speaker 1: expect from an anthropologist who spent his whole life like 558 00:30:35,440 --> 00:30:38,520 Speaker 1: living among different tribal groups around the world. But even 559 00:30:38,600 --> 00:30:42,080 Speaker 1: he started... The meat story, yeah, it's it's frustrating. Well, they 560 00:30:42,160 --> 00:30:44,800 Speaker 1: keep harping on him. Dozens and dozens of... Like 561 00:30:44,880 --> 00:30:46,840 Speaker 1: everyone in the tribe is making fun of him for 562 00:30:46,960 --> 00:30:50,200 Speaker 1: days about this. So he starts to get frustrated and 563 00:30:50,240 --> 00:30:53,120 Speaker 1: even angry as this goes on, and eventually some of 564 00:30:53,160 --> 00:30:55,400 Speaker 1: his good friends among the tribe explained to him that 565 00:30:55,440 --> 00:30:58,080 Speaker 1: this was common behavior, particularly from other members of the 566 00:30:58,120 --> 00:31:01,240 Speaker 1: tribe towards their young hunters. You you just make fun 567 00:31:01,240 --> 00:31:03,880 Speaker 1: of people for the shittiness of whatever they hunt, regardless 568 00:31:03,880 --> 00:31:05,440 Speaker 1: of how big it is, when it's time to like 569 00:31:05,520 --> 00:31:09,440 Speaker 1: help them clean and cook it.
So, in frustration and confusion, 570 00:31:09,680 --> 00:31:12,320 Speaker 1: Lee asked one of his friends, why insult a man 571 00:31:12,400 --> 00:31:14,160 Speaker 1: after he has gone to all that trouble to track 572 00:31:14,200 --> 00:31:15,680 Speaker 1: and kill an animal, and when he is going to 573 00:31:15,680 --> 00:31:17,360 Speaker 1: share the meat with you so that your children will 574 00:31:17,360 --> 00:31:22,440 Speaker 1: have something to eat. Arrogance was his cryptic answer. Arrogance? Yes, 575 00:31:22,720 --> 00:31:24,920 Speaker 1: when a young man kills much meat, he comes to 576 00:31:24,920 --> 00:31:27,000 Speaker 1: think of himself as a chief or a big man, 577 00:31:27,320 --> 00:31:28,880 Speaker 1: and he thinks of the rest of us as his 578 00:31:28,960 --> 00:31:32,680 Speaker 1: servants or inferiors. We can't accept this. We refuse one 579 00:31:32,720 --> 00:31:35,920 Speaker 1: who boasts, for someday his pride will make him kill somebody, 580 00:31:36,200 --> 00:31:38,560 Speaker 1: So we always speak of his meat as worthless. This 581 00:31:38,560 --> 00:31:41,680 Speaker 1: way we cool his heart and make him gentle. So 582 00:31:41,720 --> 00:31:46,800 Speaker 1: this is like the meat version of like humility. Yeah, 583 00:31:46,840 --> 00:31:50,400 Speaker 1: this is how you enforce humility among a hunter gatherer tribe. 584 00:31:50,480 --> 00:31:53,760 Speaker 1: This is how you attack and fight the male ego 585 00:31:54,400 --> 00:31:56,720 Speaker 1: when you can't afford to let it go out of 586 00:31:56,720 --> 00:32:00,440 Speaker 1: control like it gets to do in our society. Wow. Yeah, 587 00:32:00,480 --> 00:32:02,800 Speaker 1: that's kind of beautiful.
I mean, you like to text 588 00:32:02,840 --> 00:32:05,040 Speaker 1: me all the time telling me that you're embarrassed by 589 00:32:05,040 --> 00:32:07,160 Speaker 1: your gender, which I just wanted to bring up for 590 00:32:07,200 --> 00:32:11,320 Speaker 1: no reason. I get frustrated a lot. That's nice, I mean, 591 00:32:11,360 --> 00:32:13,680 Speaker 1: I do think it is also funny that it's like, Okay, 592 00:32:13,720 --> 00:32:16,360 Speaker 1: how do we get through to the men. It's like, okay, 593 00:32:16,440 --> 00:32:20,400 Speaker 1: let's just wrap a moral in a bunch of meat 594 00:32:21,000 --> 00:32:23,000 Speaker 1: and maybe they won't taste it on the way down. 595 00:32:23,000 --> 00:32:26,160 Speaker 1: This is kind of very Texas by any means necessary. 596 00:32:26,360 --> 00:32:33,000 Speaker 1: I like dream texts. This story is so Robert. Now. 597 00:32:33,080 --> 00:32:37,080 Speaker 1: There are some scientists who theorize that sarcasm and humor 598 00:32:37,160 --> 00:32:40,920 Speaker 1: itself evolved in human culture as a leveling mechanism, as 599 00:32:40,920 --> 00:32:42,800 Speaker 1: a way to cool the hearts of arrogant young men 600 00:32:42,880 --> 00:32:44,880 Speaker 1: before they went mad with power. So that's like why 601 00:32:45,000 --> 00:32:50,360 Speaker 1: we have humor. That's why we have Deadpool. Things, things 602 00:32:50,440 --> 00:32:54,080 Speaker 1: have gotten mutated, but like that was its initial purpose, 603 00:32:54,320 --> 00:32:58,040 Speaker 1: is to allow us to, because like humor is a 604 00:32:58,760 --> 00:33:02,120 Speaker 1: making fun of somebody, insulting somebody is a way to 605 00:33:02,200 --> 00:33:06,040 Speaker 1: attack them without physically fighting and getting into a physical 606 00:33:06,080 --> 00:33:09,080 Speaker 1: battle where people die and are injured.
So that's sort 607 00:33:09,120 --> 00:33:11,480 Speaker 1: of the theory that like maybe this is kind of 608 00:33:11,520 --> 00:33:13,880 Speaker 1: the evolutionary use of a sense of humor or at 609 00:33:13,960 --> 00:33:17,400 Speaker 1: least one of them. If being good at insults is 610 00:33:17,480 --> 00:33:22,840 Speaker 1: just uh not attacking someone I'm a you know, fucking samurai. 611 00:33:23,360 --> 00:33:26,720 Speaker 1: Well yeah, I mean there's a reason why in so 612 00:33:26,840 --> 00:33:31,840 Speaker 1: many cultures around the world, like it's pretty common for 613 00:33:31,880 --> 00:33:35,080 Speaker 1: people's grandmas to be like both kind of in charge 614 00:33:35,080 --> 00:33:38,480 Speaker 1: of the family and also talking shit about everybody all 615 00:33:38,520 --> 00:33:44,720 Speaker 1: the time, like, um, yeah, we the evolution of sarcasm 616 00:33:45,120 --> 00:33:47,880 Speaker 1: is what what a what a dark road to go down? 617 00:33:49,080 --> 00:33:52,440 Speaker 1: Target the other day, I said, sarcasm, it's how I hug. 618 00:33:52,680 --> 00:33:56,120 Speaker 1: I think it really speaks to your point. Uh. And 619 00:33:56,160 --> 00:33:59,200 Speaker 1: also a funk that shirt and anyone that's ever worn it. 620 00:33:59,200 --> 00:34:00,920 Speaker 1: It's gotten out of hand end in the modern era, 621 00:34:01,080 --> 00:34:04,120 Speaker 1: but we can see where it started. Yeah, dead Pool 622 00:34:04,240 --> 00:34:12,680 Speaker 1: Robert mistakes are made. Yeah. So there's a body of 623 00:34:12,680 --> 00:34:16,279 Speaker 1: scientific research that suggests possessing power impacts the brain and 624 00:34:16,320 --> 00:34:20,720 Speaker 1: manner similar to brain damage. Dr Keltner, a psychology professor 625 00:34:20,719 --> 00:34:22,480 Speaker 1: at u C. Berkeley, is one of the scientists on 626 00:34:22,520 --> 00:34:25,720 Speaker 1: the forefront of this field of study. 
From the Atlantic quote, 627 00:25:26,000 --> 00:25:28,400 Speaker 1: subjects under the influence of power, he found, in studies 628 00:25:28,400 --> 00:25:30,600 Speaker 1: spanning two decades acted as if they had suffered a 629 00:25:30,640 --> 00:25:33,840 Speaker 1: traumatic brain injury, becoming more impulsive, less risk aware, and 630 00:25:33,840 --> 00:25:36,560 Speaker 1: crucially less adept at seeing things from other people's point 631 00:25:36,600 --> 00:25:41,000 Speaker 1: of view. Uh. Sukhvinder Obhi, a neuroscientist at McMaster University 632 00:25:41,040 --> 00:25:45,400 Speaker 1: in Ontario, recently described something similar. Unlike Keltner, who studies behaviors, 633 00:25:45,400 --> 00:25:47,600 Speaker 1: Obhi studies brains, and when he put the heads of 634 00:25:47,600 --> 00:25:50,200 Speaker 1: the powerful and the not so powerful under a transcranial 635 00:25:50,239 --> 00:25:53,760 Speaker 1: magnetic stimulation machine, he found that power in fact impairs 636 00:25:53,800 --> 00:25:58,799 Speaker 1: a specific neural process, mirroring, that may be a cornerstone of empathy. Now, 637 00:25:58,800 --> 00:26:00,480 Speaker 1: before we take too much out of this, there's a 638 00:26:00,520 --> 00:26:02,800 Speaker 1: lot of debate about the validity of this research and 639 00:26:02,840 --> 00:26:04,520 Speaker 1: the extent to which it can tell us anything about 640 00:26:04,560 --> 00:26:07,360 Speaker 1: the real world. I found an interesting Neuroskeptic article that 641 00:26:07,400 --> 00:26:09,560 Speaker 1: points out that power priming, which is the kind of 642 00:26:09,560 --> 00:26:13,000 Speaker 1: studies that were conducted to get these results, power priming 643 00:26:13,000 --> 00:26:15,040 Speaker 1: studies have real flaws when we try to apply their 644 00:26:15,080 --> 00:26:19,799 Speaker 1: lessons outside of a research context.
But I think the 645 00:19:20,320 --> 00:19:22,880 Speaker 1: lived experience of the !Kung and other hunter gatherers 646 00:35:22,920 --> 00:35:26,680 Speaker 1: seems to support at least the conclusion that a lot 647 00:35:26,800 --> 00:35:30,280 Speaker 1: of people who live in like traditional, more hunter gatherer 648 00:35:30,280 --> 00:35:34,320 Speaker 1: societies kind of understood that power was bad for people 649 00:35:34,480 --> 00:35:37,560 Speaker 1: and it made them more dangerous to themselves and others, 650 00:35:37,920 --> 00:35:42,760 Speaker 1: and that they needed to guard against it. Egalitarianism was, then again, 651 00:35:42,800 --> 00:35:46,480 Speaker 1: not an ethical decision. It's a defensive reaction to the 652 00:35:46,600 --> 00:35:52,200 Speaker 1: dangers of power. So I find that interesting. I think, yeah, 653 00:35:52,280 --> 00:35:57,239 Speaker 1: that that definitely tracks with a lot of That's part 654 00:35:57,239 --> 00:36:00,399 Speaker 1: of why I do think that like powerful people using 655 00:36:00,560 --> 00:36:03,920 Speaker 1: social media is so extremely interesting, as you can almost 656 00:36:03,960 --> 00:36:05,959 Speaker 1: like see the rot on the edges of their brain. 657 00:36:06,000 --> 00:36:09,600 Speaker 1: And like even like Trump aside, there's like there's so 658 00:36:09,640 --> 00:36:13,520 Speaker 1: many examples of just like you can just see in 659 00:36:13,600 --> 00:36:18,799 Speaker 1: real time the brain rot forming the famous people. Yeah yeah, 660 00:36:18,800 --> 00:36:21,719 Speaker 1: but I mean just like influential people who have too 661 00:36:21,760 --> 00:36:23,919 Speaker 1: much money for their own good, Like you can even 662 00:36:24,000 --> 00:36:28,120 Speaker 1: like social media influencers, the corrosion. Well, they're chaotic evil. 663 00:36:28,160 --> 00:36:30,799 Speaker 1: I don't even include them.
But like I think, like 664 00:36:30,840 --> 00:36:33,400 Speaker 1: the best example of that I've seen recently is Elon 665 00:36:33,520 --> 00:36:37,560 Speaker 1: Musk's bizarre crusade against crediting artists, which is like if 666 00:36:37,640 --> 00:36:39,680 Speaker 1: you had sat down with him, if he'd never gotten 667 00:36:39,719 --> 00:36:41,600 Speaker 1: on Twitter, and you just sat down with him and be like, oh, 668 00:36:41,760 --> 00:36:44,000 Speaker 1: if you share the work of an artist you like, 669 00:36:44,120 --> 00:36:45,600 Speaker 1: you should add their name to it, he would have 670 00:36:45,560 --> 00:36:49,880 Speaker 1: been like, oh, yeah, of course that makes total sense. Yeah, 671 00:36:50,040 --> 00:36:53,000 Speaker 1: but because of the way social media works, somebody says 672 00:36:53,040 --> 00:36:56,399 Speaker 1: that on Twitter and he just immediately attacks them. It's 673 00:36:56,520 --> 00:36:59,560 Speaker 1: like, why are you having this fight, Elon Musk? 674 00:36:59,719 --> 00:37:02,839 Speaker 1: I mean, his entire online presence, like, it should be... 675 00:37:02,920 --> 00:37:05,200 Speaker 1: there should be a thesis paper on it, because it is. 676 00:37:05,200 --> 00:37:11,000 Speaker 1: It's like you you can just see his brain turning 677 00:37:11,040 --> 00:37:15,759 Speaker 1: to dust before our very eyes. Yeah, like, and it's interesting. 678 00:37:15,880 --> 00:37:18,239 Speaker 1: And I'm sure that that's an extension of things that 679 00:37:18,360 --> 00:37:20,839 Speaker 1: have been going on, you know, since the beginning of time, 680 00:37:20,880 --> 00:37:24,240 Speaker 1: but like actually getting to observe it and having people 681 00:37:24,360 --> 00:37:29,840 Speaker 1: volunteer that information to you is fascinating.
I suspect, I 682 00:37:29,840 --> 00:37:31,520 Speaker 1: don't, I can't know this, but like, if we can 683 00:37:31,560 --> 00:37:34,320 Speaker 1: imagine a world in which Donald Trump never had Twitter 684 00:37:34,600 --> 00:37:37,759 Speaker 1: but also still got elected president, my suspicion is that 685 00:37:37,800 --> 00:37:42,640 Speaker 1: he'd suck less. Now, I don't think he would have 686 00:37:42,680 --> 00:37:48,160 Speaker 1: been elected without Twitter. But yeah, I think 687 00:37:48,200 --> 00:37:51,080 Speaker 1: it's been bad for him. Yeah, I think that it 688 00:37:51,120 --> 00:37:55,080 Speaker 1: had worked in his favor during the election, and then 689 00:37:55,200 --> 00:37:59,439 Speaker 1: during his presidency it's been just like a giant... Yeah. Yeah, 690 00:37:59,480 --> 00:38:03,239 Speaker 1: you're welcome, listeners, for that sound effect. Back to my 691 00:38:04,440 --> 00:38:06,880 Speaker 1: my theory here that I'm still building towards. There's a 692 00:38:06,880 --> 00:38:09,360 Speaker 1: lot of... like, I'm taking you on, like, the journey 693 00:38:09,360 --> 00:38:11,520 Speaker 1: of just shit I've been thinking about for the last 694 00:38:11,600 --> 00:38:13,040 Speaker 1: year and a half. So this is kind of like 695 00:38:13,080 --> 00:38:17,640 Speaker 1: the pattern my brain has taken, as best as 696 00:38:17,680 --> 00:38:21,120 Speaker 1: I can recreate it. My brain rot, that's what we're touring.
697 00:38:22,200 --> 00:38:24,600 Speaker 1: So there's a tendency in progressive thought, and it's something 698 00:38:24,600 --> 00:38:26,440 Speaker 1: that I fight against a lot, to look at groups 699 00:38:26,480 --> 00:38:30,239 Speaker 1: like the Iroquois and other research into our egalitarian ancestors 700 00:38:30,239 --> 00:38:32,680 Speaker 1: and make the point that such forms of social organization 701 00:38:32,760 --> 00:38:36,160 Speaker 1: are more natural and thus healthier than the supremely hierarchical 702 00:38:36,200 --> 00:38:39,040 Speaker 1: societies we live in today. But I tend to think 703 00:38:39,040 --> 00:38:41,799 Speaker 1: the evolution of leveling mechanisms, like the shaming of the 704 00:38:41,880 --> 00:38:44,560 Speaker 1: meat, suggests kind of the opposite. Well, at least not 705 00:38:44,600 --> 00:38:47,680 Speaker 1: about the healthy thing, but about it being natural. Hierarchy 706 00:38:47,719 --> 00:38:50,760 Speaker 1: is natural for human beings. It's something our brains slide 707 00:38:50,800 --> 00:38:54,960 Speaker 1: into without careful vigilance. Our ancestors were not egalitarian because 708 00:38:54,960 --> 00:38:59,360 Speaker 1: it felt natural. They evolved to enforce egalitarianism with great 709 00:38:59,440 --> 00:39:02,800 Speaker 1: vigilance as a defense mechanism against the dangers of power. 710 00:39:03,440 --> 00:39:06,200 Speaker 1: And this presents perhaps a less utopian view of man's 711 00:39:06,239 --> 00:39:08,799 Speaker 1: inherent nature, but I think it also posits a more 712 00:39:08,840 --> 00:39:12,120 Speaker 1: optimistic picture of our future, because if Homo sapiens beat 713 00:39:12,160 --> 00:39:15,360 Speaker 1: the problems of ingrained hierarchy once, then we can fucking 714 00:39:15,360 --> 00:39:20,160 Speaker 1: well do it again. Yeah, thank you. And that leads 715 00:39:20,160 --> 00:39:23,319 Speaker 1: pretty naturally to my next question.
If our ancestors once 716 00:39:23,360 --> 00:39:26,880 Speaker 1: lived free, fucking egalitarian lives, sleeping under the stars, probably 717 00:39:26,920 --> 00:39:30,680 Speaker 1: taking hella mushrooms and not enforcing strict gender hierarchy, 718 00:39:31,320 --> 00:39:36,919 Speaker 1: whoa, what went wrong? How did we go from 719 00:39:36,960 --> 00:39:39,439 Speaker 1: all that to the last, you know, ten thousand years 720 00:39:39,520 --> 00:39:41,719 Speaker 1: or whatever of human history, which you know has kind 721 00:39:41,760 --> 00:39:43,640 Speaker 1: of been a shit show. But but do you know 722 00:39:43,680 --> 00:39:46,960 Speaker 1: what is not a shit show? The products and 723 00:39:47,000 --> 00:39:51,680 Speaker 1: services that support the show. Oh, I was I was 724 00:39:51,719 --> 00:39:53,839 Speaker 1: about to drop the ball yet again and be like, 725 00:39:53,920 --> 00:39:58,000 Speaker 1: how do you mean? What do you mean? I'm Robert, 726 00:39:58,120 --> 00:40:09,680 Speaker 1: thank you, product X. We're back. So we're talking about 727 00:40:09,800 --> 00:40:15,720 Speaker 1: why this egalitarian order of the human race that seems 728 00:40:15,760 --> 00:40:18,160 Speaker 1: to have existed at a point in the distant past, 729 00:40:18,560 --> 00:40:21,799 Speaker 1: what made it fall apart? Uh, and I found a 730 00:40:21,800 --> 00:40:24,560 Speaker 1: good write-up on this subject in New Scientist magazine 731 00:40:24,840 --> 00:40:29,120 Speaker 1: by a researcher named Deborah Rogers. She cites social anthropologist 732 00:40:29,160 --> 00:40:32,560 Speaker 1: Christopher Boehm, who believes the suppression of the dominance hierarchies 733 00:40:32,600 --> 00:40:35,719 Speaker 1: of our primate ancestors was a quote central adaptation of 734 00:40:35,800 --> 00:40:38,960 Speaker 1: human evolution.
Boehm thinks we would not have spread across 735 00:40:39,000 --> 00:40:44,160 Speaker 1: the world without the adaptation of egalitarianism. He notes, inequality 736 00:40:44,200 --> 00:40:46,400 Speaker 1: did not spread because it is a better system for 737 00:40:46,440 --> 00:40:50,759 Speaker 1: our survival. So why then did inequality eat the world? Well, 738 00:40:50,760 --> 00:40:54,320 Speaker 1: that's a question that's been posed by a number of history's 739 00:40:54,360 --> 00:40:57,839 Speaker 1: great thinkers. Jean Jacques Rousseau theorized in seventeen fifty four 740 00:40:57,840 --> 00:41:00,640 Speaker 1: that inequality started with the idea of private property. 741 00:41:01,040 --> 00:41:04,160 Speaker 1: Social Darwinists in the eighteen hundreds thought that inequality was 742 00:41:04,200 --> 00:41:07,120 Speaker 1: the inevitable result of the struggle of survival of the fittest, 743 00:41:07,600 --> 00:41:10,120 Speaker 1: in which the more fit, and almost inevitably white, people 744 00:41:10,160 --> 00:41:14,680 Speaker 1: formed a natural aristocracy by dint of their evolutionary success. 745 00:41:14,719 --> 00:41:18,640 Speaker 1: But this thinking has continued to evolve in the twentieth century. 746 00:41:18,680 --> 00:41:22,200 Speaker 1: According to Deborah Rogers quote, by the mid twentieth century, 747 00:41:22,200 --> 00:41:25,600 Speaker 1: a new theory began to dominate. Anthropologists including Julian Steward, 748 00:41:25,680 --> 00:41:29,359 Speaker 1: Leslie White, and Robert Carneiro offered slightly different versions 749 00:41:29,400 --> 00:41:32,520 Speaker 1: of the following story. Population growth meant we needed more food, 750 00:41:32,560 --> 00:41:34,960 Speaker 1: so we turned to agriculture, which led to surplus and 751 00:41:34,960 --> 00:41:37,879 Speaker 1: the need for managers and specialized roles, which in turn 752 00:41:37,920 --> 00:41:41,520 Speaker 1: led to corresponding social classes.
Meanwhile, we began to use 753 00:41:41,600 --> 00:41:44,360 Speaker 1: up natural resources and needed to venture ever further afield 754 00:41:44,400 --> 00:41:47,560 Speaker 1: to seek them out. This expansion bred conflict and conquest, 755 00:41:47,680 --> 00:41:51,400 Speaker 1: with the conquered becoming the underclass. The more recent explanations 756 00:41:51,400 --> 00:41:54,280 Speaker 1: have expanded on these ideas. One line of reasoning suggests 757 00:41:54,280 --> 00:41:56,920 Speaker 1: that self aggrandizing individuals who lived in lands of plenty 758 00:41:56,960 --> 00:42:00,359 Speaker 1: ascended the social ranks by exploiting their surplus, first through 759 00:42:00,400 --> 00:42:03,239 Speaker 1: feasts or gift giving, and later by outright dominance at 760 00:42:03,239 --> 00:42:06,600 Speaker 1: the group level, argue anthropologists Peter Richerson and 761 00:42:06,680 --> 00:42:09,879 Speaker 1: Robert Boyd. Improved coordination and division of labor allowed more 762 00:42:09,920 --> 00:42:13,680 Speaker 1: complex societies to outcompete the simpler, more equal societies. From 763 00:42:13,680 --> 00:42:17,200 Speaker 1: a mechanistic perspective, others argued that once inequality took hold, 764 00:42:17,320 --> 00:42:20,840 Speaker 1: as when uneven resource distribution benefited one family more than others, 765 00:42:21,160 --> 00:42:24,719 Speaker 1: it simply became ever more entrenched. The advent of agriculture 766 00:42:24,719 --> 00:42:28,760 Speaker 1: and trade resulted in private property, inheritance, and larger trade networks, 767 00:42:28,760 --> 00:42:32,680 Speaker 1: which perpetuated and compounded economic advantages.
So it's like when 768 00:42:32,719 --> 00:42:34,560 Speaker 1: you're in college and you have to do a group 769 00:42:34,640 --> 00:42:39,640 Speaker 1: project and you have some people that are either like 770 00:42:39,880 --> 00:42:43,840 Speaker 1: not available or like bad, and then you have 771 00:42:43,880 --> 00:42:46,640 Speaker 1: the people that are busybodies and want to do everything. 772 00:42:47,320 --> 00:42:49,239 Speaker 1: And then you have the people that, you know, have 773 00:42:49,360 --> 00:42:51,560 Speaker 1: the special tutors, so they know how to do everything 774 00:42:51,560 --> 00:42:54,680 Speaker 1: because they have help. And then those people get a 775 00:42:54,719 --> 00:42:57,919 Speaker 1: better grade, and then, you know, people get jobs, people don't 776 00:42:57,920 --> 00:43:04,400 Speaker 1: get jobs... it's like college. It's like college. It's a group project. Or 777 00:43:04,600 --> 00:43:07,560 Speaker 1: do you leave sad and in debt and you may 778 00:43:07,680 --> 00:43:09,839 Speaker 1: or may not get a job, and you probably are 779 00:43:09,880 --> 00:43:13,560 Speaker 1: not getting a job in the thing you studied. Or 780 00:43:13,600 --> 00:43:15,879 Speaker 1: it's like college, in that the people who didn't go 781 00:43:16,239 --> 00:43:18,719 Speaker 1: wound up without tens of thousands of dollars in debt, 782 00:43:18,880 --> 00:43:25,960 Speaker 1: and so uh wind up a lot wealthier again, so 783 00:43:26,040 --> 00:43:28,840 Speaker 1: they benefit from the fruits of the labor of others. 784 00:43:30,120 --> 00:43:33,680 Speaker 1: And then the hierarchy... I see. Okay, so it's college. 785 00:43:34,480 --> 00:43:37,080 Speaker 1: It's like college. Yeah.
Now, if we find this new 786 00:43:37,080 --> 00:43:41,280 Speaker 1: school of thought credible, then hierarchy and authoritarianism itself seem 787 00:43:41,320 --> 00:43:43,480 Speaker 1: not like the natural order of things, but more like 788 00:43:43,560 --> 00:43:46,440 Speaker 1: a virus, one that was forcibly beaten down and almost 789 00:43:46,440 --> 00:43:49,040 Speaker 1: wiped out for thousands of years, but persisted in some 790 00:43:49,080 --> 00:43:52,600 Speaker 1: isolated corners until the development of agriculture and the evolution 791 00:43:52,600 --> 00:43:55,839 Speaker 1: into larger, more organized societies provided it with a chance 792 00:43:55,880 --> 00:43:59,359 Speaker 1: to escape and replicate on a mass scale once more. So, 793 00:43:59,560 --> 00:44:02,520 Speaker 1: we always have had these sorts of tendencies towards hierarchy 794 00:44:02,560 --> 00:44:06,040 Speaker 1: and authoritarianism programmed into our brains, and for a long 795 00:44:06,120 --> 00:44:08,680 Speaker 1: time we fought it in these tiny societies. But once 796 00:44:08,719 --> 00:44:11,880 Speaker 1: we started building these larger societies, it sort of escapes 797 00:44:12,360 --> 00:44:15,360 Speaker 1: and kind of runs wild. It's almost like how measles 798 00:44:15,440 --> 00:44:18,120 Speaker 1: was nearly wiped out by vaccines until enough dumb people 799 00:44:18,120 --> 00:44:20,320 Speaker 1: stopped vaccinating their kids that it was able to spread 800 00:44:20,320 --> 00:44:22,800 Speaker 1: and get a toehold again. Thank you, Jessica Biel, 801 00:44:23,239 --> 00:44:29,719 Speaker 1: Thank you, Jessica Biel. Some people might argue that hierarchy 802 00:44:29,719 --> 00:44:32,680 Speaker 1: and authoritarianism and Jessica Biel make for stronger and more 803 00:44:32,719 --> 00:44:36,320 Speaker 1: competitive societies, and that's why these forms of organization spread 804 00:44:36,360 --> 00:44:41,560 Speaker 1: across the globe.
Yeah, that's certainly an arguable point. Deborah 805 00:44:41,600 --> 00:44:44,760 Speaker 1: Rogers and other researchers, however, have found in their research 806 00:44:44,840 --> 00:44:47,040 Speaker 1: some data that would seem to argue against that point. 807 00:44:47,360 --> 00:44:51,240 Speaker 1: Quote, in a demographic simulation that Omkar Deshpande, 808 00:44:51,360 --> 00:44:54,759 Speaker 1: Marcus Feldman, and I conducted at Stanford University, California, we 809 00:44:54,840 --> 00:44:57,880 Speaker 1: found that rather than imparting advantages to the group, unequal 810 00:44:57,880 --> 00:45:00,920 Speaker 1: access to resources is inherently destabilizing and greatly 811 00:45:01,000 --> 00:45:04,239 Speaker 1: raises the chance of group extinction in stable environments. This 812 00:45:04,360 --> 00:45:06,760 Speaker 1: was true whether we modeled inequality as a multi tiered 813 00:45:06,760 --> 00:45:10,160 Speaker 1: class society or what economists call a Pareto wealth distribution, 814 00:45:10,480 --> 00:45:12,560 Speaker 1: in which, as with the one percent, the rich get 815 00:45:12,560 --> 00:45:16,000 Speaker 1: the lion's share. Counterintuitively, the fact that inequality was so 816 00:45:16,080 --> 00:45:19,279 Speaker 1: destabilizing caused these societies to spread by creating an 817 00:45:19,320 --> 00:45:22,840 Speaker 1: incentive to migrate in search of further resources. The rules 818 00:45:22,840 --> 00:45:25,720 Speaker 1: in our simulation did not allow for migration to already 819 00:45:25,760 --> 00:45:27,880 Speaker 1: occupied locations, but it was clear that this would have 820 00:45:27,920 --> 00:45:30,160 Speaker 1: happened in the real world, leading to a conquest of the 821 00:45:30,200 --> 00:45:33,680 Speaker 1: more stable egalitarian societies, exactly what we see as we 822 00:45:33,719 --> 00:45:36,520 Speaker 1: look back in history.
In other words, inequality did not 823 00:45:36,560 --> 00:45:38,800 Speaker 1: spread from group to group because it is an inherently 824 00:45:38,840 --> 00:45:43,040 Speaker 1: better system for survival, but because it creates demographic instability, 825 00:45:43,080 --> 00:45:45,960 Speaker 1: which drives migration and conflict and leads to the cultural 826 00:45:46,040 --> 00:45:50,440 Speaker 1: or physical extinction of egalitarian societies. And it's interesting if 827 00:45:50,440 --> 00:45:52,799 Speaker 1: you look into, like, who a lot of the Europeans 828 00:45:52,840 --> 00:45:55,680 Speaker 1: who sailed to the New World, as they called it, 829 00:45:56,320 --> 00:45:58,279 Speaker 1: were, there were a lot of second and third and 830 00:45:58,360 --> 00:46:01,239 Speaker 1: fourth sons of wealthier families who like weren't going to 831 00:46:01,280 --> 00:46:03,440 Speaker 1: inherit the family wealth and so had to go make 832 00:46:03,480 --> 00:46:07,280 Speaker 1: their fortune elsewhere. So like this seems like a really 833 00:46:08,200 --> 00:46:11,880 Speaker 1: strongly arguable point to me. And it also it also 834 00:46:12,000 --> 00:46:14,680 Speaker 1: feels more like this comparison that I keep making to 835 00:46:14,800 --> 00:46:18,080 Speaker 1: a virus, because like the way viruses spread, they're 836 00:46:18,080 --> 00:46:22,279 Speaker 1: not sustainable, they're not stable. They need to 837 00:46:22,320 --> 00:46:27,680 Speaker 1: continually like destroy populations of living things and need 838 00:46:27,719 --> 00:46:30,280 Speaker 1: to spread to new populations in order to stay alive. 839 00:46:30,360 --> 00:46:34,440 Speaker 1: So like I think authoritarianism, comparing it to a virus, 840 00:46:34,480 --> 00:46:37,239 Speaker 1: I think there's a lot of... Uh, I think it's 841 00:46:37,239 --> 00:46:41,279 Speaker 1: a useful way to look at it. Um.
Yeah, yeah, 842 00:46:41,400 --> 00:46:44,000 Speaker 1: so it's it's kind of weird that, yeah, you don't 843 00:46:44,000 --> 00:46:48,279 Speaker 1: really hear like social movements or trends ever referred to 844 00:46:48,520 --> 00:46:53,320 Speaker 1: as a virus. I mean, I've never heard that comparison before. 845 00:46:53,560 --> 00:46:56,879 Speaker 1: We talk about virality a lot when we talk about ideas. 846 00:46:56,920 --> 00:47:03,120 Speaker 1: But yeah, I think looking at authoritarianism virally, um, it can 847 00:47:03,120 --> 00:47:06,640 Speaker 1: spread much like a meme of a cat. Yeah, it does. 848 00:47:06,719 --> 00:47:10,680 Speaker 1: It spreads just like a cat meme. Yeah, fascist dictatorships 849 00:47:10,760 --> 00:47:16,520 Speaker 1: spread like a cat meme... a dictatorship. Right. Yeah. So 850 00:47:16,960 --> 00:47:20,239 Speaker 1: uh obviously, like you know, we could try to like 851 00:47:20,320 --> 00:47:23,520 Speaker 1: argue at which points in history authoritarianism hit its peak. 852 00:47:23,719 --> 00:47:25,520 Speaker 1: It's probably more apt to say that it ebbed and 853 00:47:25,560 --> 00:47:28,239 Speaker 1: flowed in different places across distance and time, and every 854 00:47:28,239 --> 00:47:31,720 Speaker 1: now and then individual societies would evolve, like ancient Athens 855 00:47:31,800 --> 00:47:34,560 Speaker 1: or the Iroquois, who established, you know, cultures that were 856 00:47:34,640 --> 00:47:38,240 Speaker 1: more egalitarian than those around them. But in a global sense, 857 00:47:38,360 --> 00:47:41,319 Speaker 1: strict hierarchy and authoritarian means of rule were the order 858 00:47:41,320 --> 00:47:43,480 Speaker 1: of the day for the majority of people across the 859 00:47:43,560 --> 00:47:47,440 Speaker 1: last several thousand years.
And again I'm I'm gonna oversimplify 860 00:47:47,480 --> 00:47:49,640 Speaker 1: here because I'm not a historian and like this is 861 00:47:49,680 --> 00:47:53,600 Speaker 1: not a historic, like, like an academic paper, but 862 00:47:53,680 --> 00:47:55,600 Speaker 1: I think it's fair to say broadly that the last 863 00:47:55,640 --> 00:47:57,880 Speaker 1: eight hundred years or so have seen a major push 864 00:47:58,080 --> 00:48:02,440 Speaker 1: back towards egalitarianism and against authoritarian means of control across 865 00:48:02,480 --> 00:48:04,840 Speaker 1: the globe. And if you're going to pick an arbitrary 866 00:48:04,960 --> 00:48:07,000 Speaker 1: start point for this, you might choose the signing of 867 00:48:07,000 --> 00:48:09,880 Speaker 1: the Magna Carta in June of twelve fifteen, and there 868 00:48:09,880 --> 00:48:13,760 Speaker 1: would be a variety of other dates that would be important, 869 00:48:14,760 --> 00:48:19,520 Speaker 1: like seventeen seventy six, uh, like eighteen sixty five, like 870 00:48:19,680 --> 00:48:24,560 Speaker 1: nineteen... Really patriotic for a second there. And my my 871 00:48:24,719 --> 00:48:27,200 Speaker 1: dates picked are very Western-centric because I don't know 872 00:48:27,239 --> 00:48:30,319 Speaker 1: as much about like Chinese history, Japanese history. But you know, 873 00:48:30,360 --> 00:48:32,920 Speaker 1: I think nineteen seventeen, the Russian Revolution, would be another 874 00:48:32,960 --> 00:48:36,200 Speaker 1: point in that. Uh, and of course nineteen forty five 875 00:48:36,360 --> 00:48:39,839 Speaker 1: would be another big moment in the sort of eight 876 00:48:39,880 --> 00:48:45,080 Speaker 1: hundred year or so push back against authoritarianism. Um.
And 877 00:48:45,120 --> 00:48:48,320 Speaker 1: if we're going to keep rolling with my viral authoritarianism analogy, 878 00:48:48,400 --> 00:48:50,520 Speaker 1: we might look at the global defeat of the Axis 879 00:48:50,560 --> 00:48:52,879 Speaker 1: in World War Two as equivalent to a massive 880 00:48:52,920 --> 00:48:56,360 Speaker 1: mass vaccination campaign. Uh, and if we want to 881 00:48:56,400 --> 00:48:59,239 Speaker 1: extend the analogy even further, we could compare people like 882 00:48:59,320 --> 00:49:01,799 Speaker 1: CIA director Allen Dulles and his penchant for 883 00:49:01,840 --> 00:49:05,440 Speaker 1: authoritarian regime changes in Latin America to the anti-vaxxers 884 00:49:05,440 --> 00:49:10,440 Speaker 1: like Jessica Biel, people who saw socialism. Yeah. So Jessica 885 00:49:10,480 --> 00:49:14,320 Speaker 1: Biel and Allen Dulles, who are the same fucking guy. Yeah damn. 886 00:49:14,400 --> 00:49:18,279 Speaker 1: I would read that piece on medium dot com. Thank you, 887 00:49:18,480 --> 00:49:20,759 Speaker 1: thank you. Medium dot com is where I would put 888 00:49:20,760 --> 00:49:23,799 Speaker 1: this if I weren't such a narcissist with a podcast. 889 00:49:24,040 --> 00:49:29,560 Speaker 1: Um, but his dictatorship. You must put your call. You 890 00:49:29,560 --> 00:49:42,640 Speaker 1: guys got to shame my meat. I'm calling HR. Anderson. Yeah, 891 00:49:42,640 --> 00:49:46,080 Speaker 1: so you've got like this. Uh so. Yeah. The last 892 00:49:46,080 --> 00:49:49,279 Speaker 1: several decades of creeping authoritarianism in our own society have 893 00:49:49,360 --> 00:49:51,600 Speaker 1: been driven in large part by the fact that decades 894 00:49:51,640 --> 00:49:54,880 Speaker 1: of American leaders have supported dictators and strongmen across places 895 00:49:54,920 --> 00:49:57,960 Speaker 1: like Latin America.
The crisis at the border, which provided 896 00:49:58,000 --> 00:50:00,640 Speaker 1: such fuel to the American right, has been driven largely 897 00:50:00,719 --> 00:50:04,200 Speaker 1: by refugees fleeing from places like Guatemala and El Salvador, 898 00:50:04,200 --> 00:50:07,520 Speaker 1: while the US supported and trained death squads and assassinated democratically 899 00:50:07,560 --> 00:50:11,120 Speaker 1: elected leaders. Um, you could also make a point about 900 00:50:11,120 --> 00:50:13,960 Speaker 1: our failure to intervene in Syria, all the refugees who 901 00:50:14,000 --> 00:50:17,600 Speaker 1: fled from Bashar al Assad's fascistic campaign of extermination, and 902 00:50:17,640 --> 00:50:19,920 Speaker 1: how that fueled the far right in Europe and in 903 00:50:19,920 --> 00:50:25,000 Speaker 1: the United States. Um so, the plight of those refugees, 904 00:50:25,000 --> 00:50:26,839 Speaker 1: and their decision to flee to the safety of the US, 905 00:50:26,960 --> 00:50:29,400 Speaker 1: has invigorated a right wing movement that has grown stronger 906 00:50:29,440 --> 00:50:32,400 Speaker 1: and more dangerous over the decades, starting with KKK border 907 00:50:32,400 --> 00:50:35,360 Speaker 1: patrols in the nineteen seventies and ending with concentration camps 908 00:50:35,360 --> 00:50:39,239 Speaker 1: in Texas and Donald Trump in the White House. Um, now, 909 00:50:39,360 --> 00:50:42,680 Speaker 1: I'm not the only person thinking along lines similar to 910 00:50:42,760 --> 00:50:45,160 Speaker 1: this. When I was doing my final research for The 911 00:50:45,160 --> 00:50:47,360 Speaker 1: War on Everyone, the audiobook that I swear is 912 00:50:47,400 --> 00:50:51,640 Speaker 1: coming out soon. I just finished reading it. It's being edited. Yeah, 913 00:50:51,680 --> 00:50:55,240 Speaker 1: it's being edited right now by the audio people. So, Daniel. Yeah, 914 00:50:55,360 --> 00:50:58,960 Speaker 1: it is coming out.
I came across this... Hey, Daniel, 915 00:50:59,160 --> 00:51:00,799 Speaker 1: it actually might be up at the time people 916 00:51:00,800 --> 00:51:04,520 Speaker 1: listen to this episode. I don't know. Um. I came 917 00:51:04,560 --> 00:51:07,120 Speaker 1: across a study of how fascists were using the internet 918 00:51:07,160 --> 00:51:09,160 Speaker 1: back in the late nineties and early two thousands to 919 00:51:09,200 --> 00:51:12,040 Speaker 1: keep their movement alive. The study was written up 920 00:51:12,040 --> 00:51:13,920 Speaker 1: by a researcher named Les Back, and in it he 921 00:51:14,000 --> 00:51:17,640 Speaker 1: cites a book called A Thousand Plateaus: Capitalism and Schizophrenia 922 00:51:17,680 --> 00:51:20,279 Speaker 1: from 1980. Now, the book was written by a pair 923 00:51:20,320 --> 00:51:23,000 Speaker 1: of philosophers, Deleuze and Guattari, and I don't pretend 924 00:51:23,040 --> 00:51:25,440 Speaker 1: to understand the overall thrust of the text because I 925 00:51:25,480 --> 00:51:28,920 Speaker 1: am so bad at reading political theory. But Back cites 926 00:51:28,960 --> 00:51:31,319 Speaker 1: a piece of that book that seems to be making 927 00:51:31,320 --> 00:51:33,760 Speaker 1: a similar argument to the one I've been making, albeit 928 00:51:33,960 --> 00:51:38,640 Speaker 1: with a slightly different analogy. Quote, Deleuze and Guattari argue that 929 00:51:38,719 --> 00:51:41,359 Speaker 1: part of the nature of fascism is a proliferation of 930 00:51:41,440 --> 00:51:45,240 Speaker 1: molecular focuses in interaction, which skip from point to point 931 00:51:45,280 --> 00:51:48,360 Speaker 1: before beginning to resonate together. This comment might well have 932 00:51:48,360 --> 00:51:51,799 Speaker 1: been made about the lateral connectedness found in cyberspace. Rather 933 00:51:51,840 --> 00:51:55,480 Speaker 1: than seeing fascism enshrined in a totalitarian bureaucracy,
they 934 00:51:55,560 --> 00:51:58,320 Speaker 1: argue that fascism was and is manifest in the micro 935 00:51:58,520 --> 00:52:02,160 Speaker 1: organization of everyday life. The power of fascist culture here 936 00:52:02,280 --> 00:52:06,200 Speaker 1: is in its molecular and supple segmentarity, with flows capable 937 00:52:06,200 --> 00:52:10,480 Speaker 1: of suffusing every cell. What makes fascism dangerous is its 938 00:52:10,520 --> 00:52:14,000 Speaker 1: molecular or micro political power, for it is a mass 939 00:52:14,040 --> 00:52:20,759 Speaker 1: movement, a cancerous body rather than a totalitarian organism. Yeah, 940 00:52:21,280 --> 00:52:24,200 Speaker 1: m all right. So I don't even really want to 941 00:52:24,239 --> 00:52:29,600 Speaker 1: comment on that. I think it's it's deep. Yeah, yeah, anyway, 942 00:52:29,640 --> 00:52:35,680 Speaker 1: that's what I got so far. Robert, you presented your worldview 943 00:52:36,160 --> 00:52:41,399 Speaker 1: very succinctly. This was like Robert's ideology on life, more 944 00:52:41,440 --> 00:52:45,520 Speaker 1: or less. Yeah. It's... fascism isn't something that's imposed 945 00:52:45,520 --> 00:52:47,759 Speaker 1: from the top down. It's something that bubbles up from 946 00:52:47,800 --> 00:52:50,960 Speaker 1: within groups of human beings. And if you're going to 947 00:52:51,000 --> 00:52:54,840 Speaker 1: stop it, it requires constant vigilance, like the vigilance of, say, 948 00:52:54,880 --> 00:52:56,839 Speaker 1: a group of !Kung who make sure to keep an 949 00:52:56,840 --> 00:52:59,000 Speaker 1: eye out for a young man who gets too big 950 00:52:59,040 --> 00:53:00,960 Speaker 1: of an idea of his own importance because he brought 951 00:53:01,040 --> 00:53:03,880 Speaker 1: down a fucking gazelle.
And like a virus can move 952 00:53:04,160 --> 00:53:08,160 Speaker 1: and change and and and go away, but it can 953 00:53:08,280 --> 00:53:10,480 Speaker 1: come back. I mean, and you can you can basically 954 00:53:10,480 --> 00:53:13,239 Speaker 1: erase the metaphor at that point of like, how like 955 00:53:13,320 --> 00:53:16,040 Speaker 1: fascism is spreading like a virus now, because it's spreading 956 00:53:16,080 --> 00:53:18,239 Speaker 1: like a virus in the Internet sense, to like it 957 00:53:18,440 --> 00:53:21,960 Speaker 1: is truly one and the same. I think. I will. Okay, 958 00:53:22,080 --> 00:53:27,279 Speaker 1: So Robert, how do we how do we fix it? Yeah? 959 00:53:27,360 --> 00:53:32,640 Speaker 1: And uh, not hearing any answers. Well you know I didn't, 960 00:53:32,719 --> 00:53:37,640 Speaker 1: you know, I only had so much time to be comprehensive. Well, 961 00:53:37,680 --> 00:53:39,920 Speaker 1: you know, it does. That is a little bit of it, Sophie. 962 00:53:39,960 --> 00:53:43,160 Speaker 1: Like one lesson we can take out of human prehistory 963 00:53:43,200 --> 00:53:45,680 Speaker 1: in terms of how we fight fascism is that it's 964 00:53:45,719 --> 00:53:48,720 Speaker 1: not something we fight by voting for the right person. 965 00:53:49,040 --> 00:53:51,920 Speaker 1: It's something we fight on a day to day basis 966 00:53:52,040 --> 00:53:55,319 Speaker 1: in our daily lives. It's something we fight not just 967 00:53:55,440 --> 00:53:57,480 Speaker 1: by keeping an eye out on other people, but by 968 00:53:57,480 --> 00:54:00,960 Speaker 1: fighting the fascist within our own selves, by fighting, like, 969 00:54:01,000 --> 00:54:04,960 Speaker 1: those authoritarian impulses and urges that we all have because 970 00:54:04,960 --> 00:54:08,560 Speaker 1: it's coded into our brains. Um, it's it's a constant 971 00:54:08,600 --> 00:54:13,120 Speaker 1: battle that starts at the bottom.
And if it doesn't persist, 972 00:54:13,160 --> 00:54:15,799 Speaker 1: and if there isn't discipline at every level of society 973 00:54:15,840 --> 00:54:21,400 Speaker 1: to watch against it, it will come back. Terrifying. Okay, 974 00:54:21,640 --> 00:54:23,879 Speaker 1: so the call has been coming from inside the house 975 00:54:23,880 --> 00:54:26,319 Speaker 1: the whole time. The call, the fascism, has been coming from 976 00:54:26,320 --> 00:54:29,280 Speaker 1: inside your brain this whole time. Yeah. Okay, good God, 977 00:54:29,320 --> 00:54:36,200 Speaker 1: goodbye. Bolt cutters! It's one tool that you can 978 00:54:36,320 --> 00:54:40,200 Speaker 1: use against fascism, to yank your brain out. I think that's really 979 00:54:40,200 --> 00:54:42,839 Speaker 1: a good answer. Yeah, if we could just yank out 980 00:54:43,120 --> 00:54:48,960 Speaker 1: the judge bitch cortex from our brain. Bolt cutters! That's... you're 981 00:54:48,960 --> 00:54:52,080 Speaker 1: pretty close to some things Kurt Vonnegut was theorizing about 982 00:54:52,120 --> 00:54:53,640 Speaker 1: near the end of his life. There he was like, 983 00:54:53,640 --> 00:54:55,920 Speaker 1: if we were all just dumber, this would work so 984 00:54:56,000 --> 00:54:58,480 Speaker 1: much better. Yeah, it's like, it's just, everyone just 985 00:54:58,600 --> 00:55:02,480 Speaker 1: needs to just get a lobotomy. Yeah, being smart 986 00:55:02,600 --> 00:55:04,520 Speaker 1: is not worth it. I don't happen to hold to 987 00:55:04,600 --> 00:55:07,680 Speaker 1: that idea, but I'm willing to admit that Kurt Vonnegut 988 00:55:07,719 --> 00:55:09,480 Speaker 1: was way smarter than I will ever be, and he 989 00:55:09,600 --> 00:55:13,640 Speaker 1: might have been right. So it's, I mean, base level. 990 00:55:13,800 --> 00:55:16,560 Speaker 1: I haven't revisited that phase of Kurt Vonnegut in 991 00:55:16,600 --> 00:55:20,359 Speaker 1: a while, but sounds fun.
But Robert, you wouldn't agree 992 00:55:20,360 --> 00:55:23,800 Speaker 1: with that because you are not a judge bitch. Well, 993 00:55:23,920 --> 00:55:26,640 Speaker 1: I try not to be, but I will admit that 994 00:55:26,719 --> 00:55:29,040 Speaker 1: I think when Kurt Vonnegut was a judge bitch, he 995 00:55:29,120 --> 00:55:31,200 Speaker 1: was right to be a judge bitch. Fair. I mean, yeah, 996 00:55:31,239 --> 00:55:35,040 Speaker 1: you choose your moments. I'm constantly, yeah, I mean constant 997 00:55:35,040 --> 00:55:39,799 Speaker 1: suppression of the judge bitch within is necessary. I keep 998 00:55:39,800 --> 00:55:44,480 Speaker 1: her repressed. Yeah, I keep her locked up until necessary. Yeah, 999 00:55:45,040 --> 00:55:47,680 Speaker 1: But there are moments where it's like, oh, there she is. There, 1000 00:55:47,719 --> 00:55:51,279 Speaker 1: she's coming out wearing that outfit sometimes right, and doing 1001 00:55:51,320 --> 00:55:55,480 Speaker 1: that superhero pose, ready to save the fucking day. Well yeah, 1002 00:55:55,520 --> 00:55:58,080 Speaker 1: and it's it's like she has bolt cutters, she's got 1003 00:55:58,120 --> 00:56:01,040 Speaker 1: bolt cutters. Yeah, they can be used for good or evil. 1004 00:56:01,760 --> 00:56:05,279 Speaker 1: Much like authoritarianism. It's not always the wrong thing. Like 1005 00:56:05,440 --> 00:56:08,880 Speaker 1: we have it, like, it's useful in certain situations. Um.
1006 00:56:08,920 --> 00:56:11,319 Speaker 1: You know, if you've got well, if you've got like 1007 00:56:11,360 --> 00:56:14,040 Speaker 1: a wildfire, you need one person being like okay, you 1008 00:56:14,080 --> 00:56:15,480 Speaker 1: go here, you go here, you go here, like this 1009 00:56:15,520 --> 00:56:17,799 Speaker 1: is what we're going to try to do, like our podcast, 1010 00:56:18,400 --> 00:56:21,400 Speaker 1: like our podcast, or like a military unit, where to 1011 00:56:21,520 --> 00:56:24,399 Speaker 1: some extent there's certain kinds of hierarchy that you want 1012 00:56:24,440 --> 00:56:27,839 Speaker 1: in a military unit, like our podcast, like our podcast, 1013 00:56:28,520 --> 00:56:32,640 Speaker 1: or like, uh, that's that's about it. Um. I 1014 00:56:32,680 --> 00:56:35,560 Speaker 1: don't think it's useful nearly as often as we use it. 1015 00:56:35,600 --> 00:56:40,239 Speaker 1: It's not useless, but it has to be carefully controlled. 1016 00:56:41,640 --> 00:56:45,160 Speaker 1: Yeah okay, yeah, so like maybe if we're going to 1017 00:56:45,320 --> 00:56:48,839 Speaker 1: keep having presidents, we execute every president after they finish 1018 00:56:48,920 --> 00:56:51,200 Speaker 1: their term of office, and that way only people who 1019 00:56:51,200 --> 00:56:57,120 Speaker 1: are truly selfless take the job. Now we're out in 1020 00:56:57,160 --> 00:56:59,759 Speaker 1: crazy, crazy town. Yeah, it was like, well, now 1021 00:56:59,840 --> 00:57:04,719 Speaker 1: we're just... It's not the Hunger Games if you're 1022 00:57:04,760 --> 00:57:08,600 Speaker 1: just killing the person at the top. True, true, true, 1023 00:57:08,920 --> 00:57:12,320 Speaker 1: still not, still not. Um, I've only seen the first 1024 00:57:12,320 --> 00:57:17,520 Speaker 1: one. Does it end well? That's right. Are the Hunger Games
1025 00:57:17,560 --> 00:57:20,120 Speaker 1: a good idea for society? Is that the conclusion? We should 1026 00:57:20,120 --> 00:57:22,880 Speaker 1: give it a shot. It should be the Hunger 1027 00:57:22,880 --> 00:57:25,600 Speaker 1: Games, but instead of children, it's like members of 1028 00:57:25,640 --> 00:57:30,000 Speaker 1: the cabinet. Sick. Love it. Great. Oh now that would 1029 00:57:30,040 --> 00:57:32,760 Speaker 1: be amazing. And at the end of like a presidential term, 1030 00:57:32,960 --> 00:57:36,120 Speaker 1: because like then I'd be really excited about some of 1031 00:57:36,120 --> 00:57:38,520 Speaker 1: the people who have been in Trump's cabinet, because I 1032 00:57:38,520 --> 00:57:41,440 Speaker 1: would love to see Stephen Miller and Steve Bannon fighting 1033 00:57:41,440 --> 00:57:44,200 Speaker 1: with like homemade spears over a pit of lava. Like 1034 00:57:44,320 --> 00:57:47,560 Speaker 1: it would be a blast. That would be the fucking greatest. 1035 00:57:47,680 --> 00:57:50,080 Speaker 1: Jared Kushner would hide the whole time, and then somehow 1036 00:57:50,120 --> 00:57:54,200 Speaker 1: win. He would hide the whole time five feet beneath the 1037 00:57:54,200 --> 00:57:57,360 Speaker 1: ground, and watch as a zombied-out Ivanka 1038 00:57:57,480 --> 00:58:00,840 Speaker 1: Trump and Betsy DeVos just throttle each other with 1039 00:58:00,880 --> 00:58:04,480 Speaker 1: like fucking scarves or something. Yeah. That that that's fun. 1040 00:58:04,600 --> 00:58:08,240 Speaker 1: That's fun to think about. It's almost like he's 1041 00:58:08,280 --> 00:58:15,680 Speaker 1: already got the graphic novel storyboarded. Yeah. I don't have 1042 00:58:15,720 --> 00:58:19,800 Speaker 1: more detailed solutions than that, but this is where my 1043 00:58:19,840 --> 00:58:22,520 Speaker 1: thinking's gone in the last eighteen months or so of 1044 00:58:22,560 --> 00:58:25,400 Speaker 1: doing this podcast.
So now you all have to deal 1045 00:58:25,480 --> 00:58:30,120 Speaker 1: with it too. Sorry. I know, I think it's good. 1046 00:58:30,280 --> 00:58:36,439 Speaker 1: I think, well, not good. The takeaways are dire. But yeah, 1047 00:58:36,480 --> 00:58:39,960 Speaker 1: I've never heard it put succinctly like that. Well, I 1048 00:58:40,000 --> 00:58:41,960 Speaker 1: don't know if I'd call it succinct. I've been talking 1049 00:58:42,080 --> 00:58:45,360 Speaker 1: for fifty nine minutes or so. But I did my best. 1050 00:58:46,080 --> 00:58:49,520 Speaker 1: It was only one part. Yeah, it was only one part. 1051 00:58:49,640 --> 00:58:52,480 Speaker 1: Only one part. Well, I would not agree to do 1052 00:58:52,600 --> 00:58:57,919 Speaker 1: two parts, Jamie. Wow. Yeah, no, I was forced into 1053 00:58:58,000 --> 00:59:01,520 Speaker 1: doing this one part. Well, I'm glad that you were 1054 00:59:01,600 --> 00:59:04,080 Speaker 1: here too, because this is just a whole new experience 1055 00:59:04,080 --> 00:59:06,640 Speaker 1: Sophie and I've never had, where we can just make eye 1056 00:59:06,720 --> 00:59:10,160 Speaker 1: rolls to each other directly and then say what we're thinking. 1057 00:59:11,560 --> 00:59:15,760 Speaker 1: I know. Can you imagine if Robert was here? Oh, 1058 00:59:15,800 --> 00:59:18,000 Speaker 1: I'd be, I'd be eye-rolled into a coma by 1059 00:59:18,000 --> 00:59:22,720 Speaker 1: this. Unconscious. You would just be tied up. Yeah. We 1060 00:59:22,720 --> 00:59:26,840 Speaker 1: would have gone out onto the poisonous spell cat 1061 00:59:27,200 --> 00:59:29,680 Speaker 1: for sure. Yeah, the poison room. I would have cracked 1062 00:59:29,720 --> 00:59:33,760 Speaker 1: it open with my podcasting machete. Yeah, you do 1063 00:59:33,840 --> 00:59:38,040 Speaker 1: have a machete that says podcasting on 1064 00:59:38,160 --> 00:59:42,480 Speaker 1: the blade.
I am wearing my throwing Bagels behind the 1065 00:59:42,480 --> 00:59:45,080 Speaker 1: massa T shirt right now, so I need to get 1066 00:59:45,080 --> 00:59:47,520 Speaker 1: one of those. I have one for you. Oh yeah, 1067 00:59:47,720 --> 00:59:49,920 Speaker 1: I'm wearing my What If Frasier Were a Part of 1068 00:59:49,920 --> 00:59:53,200 Speaker 1: the Fantastic Four T shirt today. I love What If 1069 00:59:53,240 --> 00:59:55,800 Speaker 1: Frasier Were a Part of the Fantastic Four. Well, that's 1070 00:59:55,800 --> 01:00:00,200 Speaker 1: the very question this T shirt explores. Everyone think about that, 1071 01:00:00,440 --> 01:00:05,160 Speaker 1: and also, how do we fight against the monster coded 1072 01:00:05,200 --> 01:00:10,120 Speaker 1: into our brains? Is Hitler almost inevitable? Yeah, both of 1073 01:00:10,160 --> 01:00:13,800 Speaker 1: those things, yeah, in that order. Please have have the 1074 01:00:13,840 --> 01:00:17,200 Speaker 1: answers on my desk by Monday. If there's a better 1075 01:00:17,280 --> 01:00:20,360 Speaker 1: symbol for creeping authoritarianism in the human spirit than the 1076 01:00:20,400 --> 01:00:29,480 Speaker 1: television show Frasier, I haven't found it. Yeah, all right, Jamie, 1077 01:00:29,520 --> 01:00:32,600 Speaker 1: you want to plug your plugables? Sure. You can 1078 01:00:32,680 --> 01:00:36,560 Speaker 1: listen to The Bechdel Cast, a feminist movie podcast, every Thursday, 1079 01:00:37,160 --> 01:00:40,200 Speaker 1: or follow me on Twitter at Jamie Loftus Help, 1080 01:00:40,320 --> 01:00:42,600 Speaker 1: or come see my show at Edinburgh Fringe Fest in 1081 01:00:42,640 --> 01:00:49,160 Speaker 1: Scotland all August. Yeah, and uh, Sophie, you want to 1082 01:00:49,160 --> 01:00:52,680 Speaker 1: plug my plugables, because you're on the thing too 1083 01:00:52,920 --> 01:00:56,120 Speaker 1: and we have the same podcast.
Well, um, I didn't 1084 00:56,160 --> 01:01:01,200 Speaker 1: agree to that in my contract, but follow Robert on Twitter 1085 01:01:01,480 --> 01:01:06,720 Speaker 1: at I Write Okay. Thank you. I mean, I 1086 01:01:06,760 --> 01:01:10,080 Speaker 1: just think we could have titled it better. Um, at 1087 01:01:10,120 --> 01:01:13,960 Speaker 1: Bastards Pod on Twitter and Instagram, Behind the Bastards dot 1088 01:01:13,960 --> 01:01:18,200 Speaker 1: com for the sources for this pod. TeePublic: we 1089 01:01:18,320 --> 01:01:22,040 Speaker 1: have t shirts, we have totes, we have phone cases, 1090 01:01:22,440 --> 01:01:25,640 Speaker 1: we have, um, not bolt cutters but soon to be 1091 01:01:25,720 --> 01:01:31,480 Speaker 1: bolt cutters. We have, we have, we have, we have 1092 01:01:31,520 --> 01:01:33,720 Speaker 1: a couple. We have a couple of new designs 1093 01:01:33,800 --> 01:01:36,360 Speaker 1: up there, so check it out. Good stuff, and they 1094 01:01:36,360 --> 01:01:41,080 Speaker 1: have sales like all the time. Yeah, and you can 1095 01:01:41,120 --> 01:01:44,240 Speaker 1: find Sophie on Twitter at Bastards Pod because she 1096 01:01:44,360 --> 01:01:45,920 Speaker 1: runs the Twitter. So if you want to tell her 1097 01:01:45,960 --> 01:01:48,480 Speaker 1: to be less or more mean to me, you can 1098 01:01:48,560 --> 01:01:52,400 Speaker 1: let her know through the direct channel. But in reality, don't 1099 01:01:52,400 --> 01:01:58,760 Speaker 1: do that. Don't do that. Uh, but do, yeah, 1100 01:01:58,840 --> 01:02:02,400 Speaker 1: maybe get some bolt cutters, um. Certainly keep an eye 1101 01:02:02,400 --> 01:02:05,880 Speaker 1: out for creeping authoritarianism in your own daily life and 1102 01:02:06,160 --> 01:02:08,640 Speaker 1: try to shame some meat on your way home.
If 1103 01:02:08,680 --> 01:02:11,480 Speaker 1: you'd like to humble Robert, please find me on Instagram 1104 01:02:11,520 --> 01:02:16,000 Speaker 1: at so underscore right underscore up underscore Sunshine and let 1105 01:02:16,000 --> 01:02:18,080 Speaker 1: me know your opinions on Robert, so I can tell 1106 01:02:18,120 --> 01:02:20,280 Speaker 1: it to him to his face and make sure, that's 1107 01:02:20,320 --> 01:02:24,080 Speaker 1: great, that he doesn't know the size of his, I 1108 01:02:24,080 --> 01:02:26,600 Speaker 1: don't want to say meat, because I don't, Instagram and 1109 01:02:26,800 --> 01:02:29,640 Speaker 1: get, or shame. Shame. Let's go shame. Let's go shame. 1110 01:02:30,400 --> 01:02:31,720 Speaker 1: I don't want to say, I don't like, I just 1111 01:02:31,720 --> 01:02:34,840 Speaker 1: don't enjoy the phrase size of meat in general. No, 1112 01:02:35,000 --> 01:02:38,240 Speaker 1: I think that a meat shaming T shirt is called 1113 01:02:38,320 --> 01:02:44,400 Speaker 1: for at this point. Yeah, yeah, there were serious Robert 1114 01:02:44,440 --> 01:02:47,240 Speaker 1: influences here. The meat shaming story really is, that is 1115 01:02:47,240 --> 01:02:53,240 Speaker 1: so, so Texas. It's fucking amazing. Yeah. Yeah, one of 1116 01:02:53,280 --> 01:02:55,200 Speaker 1: those things I read a while back and it has stuck 1117 01:02:55,240 --> 01:02:56,960 Speaker 1: with me ever since. They should turn it into a 1118 01:02:57,040 --> 01:03:02,560 Speaker 1: children's book. They should. The Shame of the Meat, with, uh, 1119 01:03:02,600 --> 01:03:05,240 Speaker 1: what's that? What's that curious chimpanzee? He seems like the 1120 01:03:05,320 --> 01:03:10,360 Speaker 1: right character for that. Yeah, are you referring to George? Yeah, George. 1121 01:03:10,400 --> 01:03:13,440 Speaker 1: If you'd like to keep my meat big, please message 1122 01:03:13,440 --> 01:03:15,920 Speaker 1: me and tell me how cute my dog is there.
1123 01:03:16,000 --> 01:03:19,240 Speaker 1: All right, well, this has been a rambling enough exit. 1124 01:03:19,880 --> 01:03:25,560 Speaker 1: The episode's done. Go! Yeah, bye. I love you.