Speaker 1: Cool Zone Media.

Speaker 2: Hello and welcome to It Could Happen Here. Last episode, I was joined by your savior, hello, and he's here again because we're gonna get more into what we spoke about last time. Last episode, we painted a hopeful account of humanity's nature, courtesy of my reading of Rutger Bregman's Humankind: A Hopeful History. So I probably fed into the "anarchists are so utopian" narrative a bit with that previous episode. But the truth is that I'm not really being optimistic, I'm being realistic. But realism has been confused with cynicism for so long that even acknowledging both sides of the coin can be seen as overly utopian. People can be bad, and we'll get into the why. But for whatever reason they are bad, that is why, as anarchists have consistently argued, nobody should have authority. Now, there will always be outliers, and this explanation I'm about to share is not going to get into every unique case of badness, but we are going to get into some of the reasons that people do bad and what we can do about it.
Speaker 2: As I said last episode, we took issue with this idea of civilization as a thin veneer, and we put forward the premise that humans are mostly pretty decent. In fact, I didn't mention it last episode, but we don't even really like to kill each other, contrary to popular belief. Bregman actually shares that in World War Two, studies showed that many soldiers didn't shoot their weapons even in combat. Trained soldiers had a difficult time actually pulling the trigger and killing people. There are exceptions, as I said before, but in a lot of cases it's very difficult for people to actually kill. Military strategies ended up changing once authorities realized this, and the training programs of soldiers were redesigned over time to overcome this resistance. But that reluctance to kill does also indicate that it takes some effort to overcome our general decency toward each other, because most people, again most, not all, are not natural born killers. So again, how do we do bad? You know, all sorts of atrocities have been carried out by humans, both in ancient and modern times. What do you think is the cause?
Speaker 3: Self-preservation in some way, either physical or psychological. I'm not an anthropologist, I'm not a sociologist. Most of my experience with people is with both queer people and then looking at Nazis and, like, political extremists, so it's maybe not the best sample size for the general population. I think I tend to exist kind of on the perimeter of most human experience, but probably some form of either psychological or physical self-preservation, in my experience slash opinion.

Speaker 2: That's interesting. I didn't think of that. I think it comes close to what Bregman ends up getting into, but I think self-preservation, well, we'll get into that in a bit. You know, it's difficult to square that with just how brutal some of these disasters have been, you know, these atrocities that have taken place around the world, organized, systemic, industrial cruelty, you know, things like the Holocaust.

Speaker 3: Totally. It's interesting, because I think it's two paradoxical instincts that play off each other.
There's this self-preservation, and there's also, I believe in, I think, I guess, some version of the death drive, and I think those can interact in really odd ways. But the death drive, yeah, it's, like, specifically, like, fascism. And, you know, you can see this in the genocides of the twentieth and twenty-first centuries specifically, but, like, fascism as a political embodiment of the death drive, which is, I think, also an aspect. I think these things exist together in parallel while being paradoxical, and that's what produces a lot of the incongruity around things like fascism, right? It is, like, an inherently paradoxical system.

Speaker 2: When you say self-preservation, are you just talking about on the individual level, or are you seeing, like, community self-preservation as...
Speaker 3: Well, both, both, but also, I think, not even just physical, but also, like, psychological. Like being able to continue existing as yourself, either within a group of people or just you as an individual. Like the psychological things that you need to do to make yourself feel like you're in community, or that you are safe, that you have meaning, or that you have purpose, as well as the physical aspects.

Speaker 2: And you're saying that that lends itself to atrocity?

Speaker 3: I think it can.

Speaker 2: Yeah, yeah, well, that actually is strikingly close to what Bregman ends up uncovering.

Speaker 3: Look at the reasons that people will talk about for, like, why the genocide in Gaza is, like, necessary, right?

Speaker 2: It is.

Speaker 3: It is playing off both of those impulses.

Speaker 2: Yeah, yeah. I mean, all sorts of genocides, when you hear the descriptions of them, this is what you hear from the people who perpetrated them, what their explanations or justifications were, you know, from the Holocaust to Rwanda to Palestine.
Yeah, and Myanmar, you know. Totally, it is deeply evil. It's not something we can look away from. It really is difficult to square with the "most humans are decent" thesis when you look at how some of these societies are. Even the ordinary people, for example the citizen population of Israel, even the civilian population, even they are, like, disturbingly genocidal in their rhetoric. And so, you know, it's like, how do we reach that point? How do we get there? How does an ordinary human baby grow into that?

Speaker 3: It can happen to you. It can happen very easily, and I think it can happen in a short time span, and you can get out of it. I think maybe not just as easily, but you can get out of it, also in a fast time span. It's, like, that "you are not immune to propaganda" idea. You can look at, like, in Nazi Germany, Robert has talked about, quote unquote, the little Nazis, the regular Germans who ended up participating and becoming Nazis, and you are not immune from that. And that can happen as a response to a whole bunch of traumatic impulses as well.
Speaker 3: Whereas I think people now even use, like, politics, you know, this idea of politics as permission to be, like, an overtly cruel person to other people, either, like, in your life or online. Right? You will use various political topics, and that gives you permission to unleash unmitigated hostility against people that you now perceive as being, like, immoral, or you perceive as being, like, ontological enemies.

Speaker 2: Exactly, exactly. I mean, there were particular studies that were undertaken in the twentieth century that are often used to sort of explain that. You know, after all of Nazi Germany and the post-World War Two era, people were seeking explanations for atrocity, and so experiments were done and are now pointed to as explanations for how this could have taken place. You know, so one particular experiment that's really well known is the Stanford Prison Experiment. Right, this idea that you take random students and give them a position of power and they become sadistic guards.
You know, it proves just how thin the veneer of civilization really is, or rather, the evil that civilization could empower. But at least for that particular experiment, the reality was never so straightforward. You know, the guards were literally coached and encouraged to be cruel. You know, they were actually putting on performances. The prisoners were also expected to perform. So rather than being, like, an actual scientific experiment, it was more like guided theater.

Speaker 3: I mean, inadvertently, it becomes an interesting experiment in, like, humans' desire to, like, please authority.

Speaker 2: Right, exactly, exactly, to, like, perform to...

Speaker 3: The expectations of the people who are actually running the experiment, and how capable you are of falling into these roles under, like, under that paradigm, exactly.
Speaker 2: I mean, you see that in Nazi Germany as well. A lot of the people were doing things to please the Führer, you know. Like, they didn't necessarily know, or there was a lot of wiggle room, from what I've read, to interpret the Führer's wishes. Yeah, as people who wanted to rank up and rise up in the organization, they would interpret things in a way that they would presume would please Hitler and his desires. Working towards the Führer, yeah, exactly, exactly, that's the name of the phenomenon. I mean, with the Stanford Prison Experiment, when people tried to recreate the experiment for television, it made for pretty boring TV, because it was bad science in the first place. It's not something that people do naturally; it's what they do when they are pushed, when they are prodded, you know, when certain expectations are set, et cetera.
It's kind of similar with this other famous experiment that Bregman talks about, which is Stanley Milgram's obedience experiments, where volunteers were told to administer increasingly painful electric shocks to a stranger just because a guy in a lab coat told them to. It's, like, another instance of, you know, us doing these things just to please authority, even to the point of murder, because, you know, the dial of the electric shock was deadly after a certain point, and you could hear the screams of the victims. Of course they were fake screams, but, you know, the participants could hear them. But what Bregman ended up uncovering is that most of the participants weren't following the orders blindly. They were following the orders, yes, but they did so because they believed that they were doing something good, something for the good of science. That even though the shocks were uncomfortable, even though it wasn't something they wanted to do, it was a noble sacrifice in the name of progress. Even so, the participants weren't indifferent.
You know, they were distressed, they were shaken, they were sweating, and they were begging to check on the learner. But they also said things like "he agreed to be in the experiment," you know, or "this will help science, right?" or "I don't want to do this, but I have to." The man in the lab coat who was telling them to continue, "please continue, please continue," he was calm, he was professional. And also, even how the nudges that he used were framed made a difference. So if he was directly ordering them and telling them "you have to do this," surprisingly, many people would actually be more likely to resist a direct order framed in that way for such an experiment. But a more subtle nudge, just like, you know, "science, the experiment requires this," you know, "the experiment needs you to do this," that more subtle framing tended to get people to continue. And the people who were interviewed, who did take it up to those higher voltages, they said they did it because they believed they were contributing to scientific development.
So it's really this misguided belief in a higher cause that also contributes to atrocity. It's very easy to get this idea that, oh, you know, those are just monstrous people. You know, we have this idea in pop culture that the Nazis are, like, cartoonish monsters. They are monstrous, but they are monstrous people. You know, they are, at the end of the day, people who do evil with the belief that they're doing good. To a very large extent, anyway. I know that there were some who, you know, recanted, or who knew that what they were doing was wrong but had other pressures that were pushing them in that direction. Right, there are many explanations for why people behave as they do in all sorts of situations, but a lot of these people thought that they were contributing to the right thing. It's not that they didn't care; they were taught to care in the wrong direction. The bad guys don't think that they're the bad guys. And whether we're talking about the Nazis of the past or the Zionists of today, they construct these elaborate narratives to frame themselves as the righteous ones.
You know, as far as the Nazis were concerned, they were purging Germany of a serious threat to the well-being and the safety of their future and all that stuff. The Zionists today, you ask them, even though they're pariahs of the world at this point, you ask them why they believe that this must continue, and they will say, you know, we have to defend ourselves, we have a right to defend ourselves, yada, yada, yada. There are true believers within these groups, you know, who are able to commit some of the worst acts: committed ideologues who boast of their atrocities, who express no remorse, who take pride in their role. And people reach that point of ideology through a process of radicalization. You know, the ten stages of genocide, I think, is the framework people have used before to point out how a segment of a population can become a target of genocide. It's not like one day you wake up and it's just like, oh, we're going to genocide this group of people. It's a process, you know. First you start off with classification: you create a separate group of people, a separate category of person.
You make them signify themselves in some way: carry ID cards or some kind of insignia on their clothing or whatever. They begin to face discrimination of some kind. The discrimination, you know, is ramped up through dehumanizing language; you compare them to vermin or rodents or disease. And that's just the thing, and we're going to get to that. But part of how you get people who would otherwise be caring or compassionate about their fellow human is through distance. Right? So the people who are most bloodthirsty tend to be very far from the front lines. You know, the people who were demanding that World War One continue, for example, they were very far from the actual fighting, whereas at the actual front lines of World War One you had soldiers playing football together during Christmas. That's a separate story. But you create distance: either you create physical distance or you create psychological distance, and dehumanization is one of the ways you create psychological distance. You distance people from seeing their fellow human being as a human being.
Segregation is another way of creating that distance, which then lends itself to dehumanization. Comparing the people to vermin, to animals, to anything other than human is another step in dehumanization, and it conditions people to separate themselves from those people. The next stage is to create specific groups and organizations to enforce discriminatory policies. You flood the broadcasts, you operate the propaganda to polarize the population. And then steps seven, eight, nine, and ten go from actually preparing the removal and relocation of people, to the persecution, to the extermination of the group, and finally the denial that such a crime ever occurred. So that process, it can take years, it can take decades, but it's something that can turn even the most regular person into a virulent proponent of genocide if they are not fastidious in their opposition to any such language, especially in the early stages. Because they get fed this steady stream of extreme propaganda that all the actions are justified, their loyalty to their in-group becomes tested by their willingness to engage in those harmful actions.
To stay with that group, they'll do whatever they're told is good, even if it leads to other people being hurt. And it just creates an evil, but it's an ordinary evil. It's an evil that is convinced of its virtue. It is wrapped up in ideology and social conformity, because, you know, humans are social creatures and it drives us to cooperate. But that sociality can be narrowed down to just our in-group. And that's where Bregman actually gets into an interesting point about empathy, right? Because we tend to see empathy as a positive thing, and it can be. But as Bregman notes, drawing from psychologist Paul Bloom's work, empathy can also make us partial, irrational, and even cruel, because it can narrow our focus to those people who are like us and ignore others. That's why soldiers can fight and kill other people: because they feel empathy for their in-group, their homeland, citizens, or their comrades in arms. Their loyalty and affection for the people they care about supersedes the lives of the people that they don't care about.
Now, of course, we want to look at systems, and we're talking about this because I don't think that this hijacking of empathy is inevitable. You know, nationalism, propaganda, these things play a role in how people end up being separated this way into in-groups and out-groups. But, you know, there are also indications that in-group and out-group separation can occur even in the absence of a stake, so it is something we have to be continuously vigilant about.

Speaker 2: Another aspect of a systemic analysis or approach is looking at how our position within society also shapes how we operate, how we treat people, how we think, and how we act. Bregman cites neuroscience research that demonstrates how authority literally changes how we think. Powerful people become less empathetic and more likely to see others as tools rather than independent people. You know, this is not new information, per se.
You know, the environments that powerful people are in both shape them and are shaped by them. The saying has long gone that power corrupts and absolute power corrupts absolutely. And spaces like Silicon Valley, like Wall Street, like Washington, D.C., like corporate boardrooms and all the other upper echelons of government divorce rulers and authorities from ordinary people. They're insular spaces that keep them from being challenged or being grounded by the impact of their actions on others, so powerful people don't have to care. And I think such hierarchies are attractive to people who are already inclined to do bad, even if they believe that they're doing good. The authoritarians, the supremacists, the abusers: they are attracted to those positions.
But even good-intentioned people can lose themselves in authority too, because authorities as a whole exist in this bubble that rewards their worst instincts, and they further shape the system around their worst instincts, around distrust, selfishness, exploitation, and so on, to reward themselves and their patterns of behavior. And thus, through that social nocebo effect, people end up living down to the expectation created by the system.

Speaker 3: I guess my only comment here is that these systems are not just exclusive to, like, state power or, like, corporate authority. These same mechanisms reproduce themselves in all sorts of social arrangements, including, like, radical politics, and frankly especially radical politics. You see it a lot with groups, whether they're communists, whether they're anarchists, whether they're, I don't know, social democrats, they probably have this problem. But no, specifically, like, in anarchist scenes, you see this happen constantly. It is almost funny how much of these things just get naively recreated, and, like, in-group out-group dynamics are always a big issue.
I mean, like, 334 00:19:45,840 --> 00:19:50,560 Speaker 3: you can also point to the book Cultish, which explains 335 00:19:50,640 --> 00:19:56,720 Speaker 3: how American culture is pretty defined by, like, cult-like tendencies. 336 00:19:56,800 --> 00:19:59,280 Speaker 3: Not saying that every single group is a cult, but 337 00:19:59,400 --> 00:20:02,720 Speaker 3: cult dynamics explain a large part of everyday American life. 338 00:20:04,240 --> 00:20:07,160 Speaker 3: And that's both good and bad. Sometimes being in a cult 339 00:20:07,240 --> 00:20:10,000 Speaker 3: is fun until it's not very fun. So these dynamics 340 00:20:10,000 --> 00:20:13,639 Speaker 3: themselves are not necessarily, you know, bad, but they're something 341 00:20:13,680 --> 00:20:15,280 Speaker 3: to be, like, mindful of. 342 00:20:15,880 --> 00:20:21,119 Speaker 2: Yeah, exactly. So being mindful of it, you know, 343 00:20:21,720 --> 00:20:23,359 Speaker 2: that's an aspect of it. You know, we have to 344 00:20:23,400 --> 00:20:30,760 Speaker 2: find solutions to this epidemic of badness, of behaviors being 345 00:20:30,800 --> 00:20:35,080 Speaker 2: reinforced by these systems, that cause harm to people 346 00:20:35,119 --> 00:20:38,399 Speaker 2: and harm to the world. And so what I always 347 00:20:38,400 --> 00:20:43,080 Speaker 2: advocate for, in ways big and small, and I wouldn't 348 00:20:43,119 --> 00:20:46,159 Speaker 2: call it the one solution to everything, but it 349 00:20:46,200 --> 00:20:50,600 Speaker 2: does encompass a lot, is just understanding and taking 350 00:20:50,640 --> 00:20:55,359 Speaker 2: on a dynamic social revolutionary approach to change, you know, 351 00:20:55,560 --> 00:21:00,679 Speaker 2: from the effort to confront the existing system, 352 00:21:00,720 --> 00:21:03,120 Speaker 2: to stand up against it. 
But there are also the things 353 00:21:03,160 --> 00:21:06,200 Speaker 2: that you do to put forward an alternative, to put 354 00:21:06,280 --> 00:21:11,160 Speaker 2: forward and to practice alternatives. So one of the things 355 00:21:11,160 --> 00:21:14,359 Speaker 2: that we can do is to create and, you know, 356 00:21:14,640 --> 00:21:20,120 Speaker 2: perpetuate a positive and trusting take on human decency, 357 00:21:20,600 --> 00:21:23,440 Speaker 2: you know, to create that social placebo effect that 358 00:21:23,480 --> 00:21:26,399 Speaker 2: can shape how people treat each other for the better. But 359 00:21:26,920 --> 00:21:28,760 Speaker 2: that can be boiled down to just being nicer to 360 00:21:28,760 --> 00:21:32,159 Speaker 2: each other, so there's more to be done than that. Of course, 361 00:21:32,560 --> 00:21:34,679 Speaker 2: on the systems front, we also have to change how we 362 00:21:34,840 --> 00:21:39,200 Speaker 2: educate each other in radical spaces and also in terms 363 00:21:39,200 --> 00:21:43,239 Speaker 2: of how we raise children. We have to organize, you know, 364 00:21:43,560 --> 00:21:50,760 Speaker 2: alternative economic systems and alternative social arrangements that get us 365 00:21:50,800 --> 00:21:56,040 Speaker 2: in the habit of trust, of trusting people's freedom, of 366 00:21:56,160 --> 00:22:03,159 Speaker 2: practicing freedom, and also of emphasizing greater intrinsic motivation in 367 00:22:03,280 --> 00:22:05,800 Speaker 2: people as well. You know, a lot of our society 368 00:22:05,920 --> 00:22:10,320 Speaker 2: is built around control and mechanisms of control through extrinsic 369 00:22:10,359 --> 00:22:14,359 Speaker 2: forms of motivation, you know, like punishments and prisons and 370 00:22:15,080 --> 00:22:18,080 Speaker 2: grades and bonuses and wages, all the different things that 371 00:22:18,119 --> 00:22:21,080 Speaker 2: are meant to keep us going here and now. 
But I 372 00:22:21,119 --> 00:22:24,960 Speaker 2: think a system that leans more into intrinsic motivation is 373 00:22:24,960 --> 00:22:27,399 Speaker 2: something that we should be working toward, you know, that 374 00:22:27,480 --> 00:22:30,280 Speaker 2: people do things intrinsically, for reasons that 375 00:22:30,359 --> 00:22:33,080 Speaker 2: we are driven by. That, I think, is far more 376 00:22:33,280 --> 00:22:37,920 Speaker 2: sustainable long term and more fulfilling long term than continuing 377 00:22:37,920 --> 00:22:41,680 Speaker 2: to be stuck with the punishments and rewards that come 378 00:22:41,760 --> 00:22:45,320 Speaker 2: from outside. Yes, we have to develop a revolutionary consciousness 379 00:22:45,320 --> 00:22:50,280 Speaker 2: that is also very much grounded in, you know, people's 380 00:22:50,320 --> 00:22:53,680 Speaker 2: intrinsic motivation to have their needs met, to pursue their interests, 381 00:22:54,160 --> 00:22:56,960 Speaker 2: to care for others. And that is where I think 382 00:22:56,960 --> 00:22:59,879 Speaker 2: we'll sustain it long term, because you can 383 00:23:00,080 --> 00:23:04,720 Speaker 2: create all these bonuses and incentives externally, but I don't 384 00:23:04,760 --> 00:23:07,960 Speaker 2: think it's something that will last. There are, you know, 385 00:23:08,119 --> 00:23:12,440 Speaker 2: experiments out there with a great emphasis on intrinsic motivation, 386 00:23:13,200 --> 00:23:16,679 Speaker 2: not even necessarily radical experiments. In fact, Bregman actually 387 00:23:16,680 --> 00:23:20,159 Speaker 2: looks at examples of schools that don't have grades or 388 00:23:20,200 --> 00:23:23,719 Speaker 2: fixed curriculums, and at companies that don't have managers, that 389 00:23:23,720 --> 00:23:26,919 Speaker 2: are run entirely by employees. 
I mean, anarchists have 390 00:23:26,920 --> 00:23:30,040 Speaker 2: long known about these, but he emphasizes that the people in 391 00:23:30,080 --> 00:23:33,520 Speaker 2: these environments thrive because they've been trusted to direct themselves. 392 00:23:34,000 --> 00:23:36,160 Speaker 2: They can bring out the best in themselves because they've 393 00:23:36,200 --> 00:23:39,360 Speaker 2: been given the room to do so, you know. And 394 00:23:39,680 --> 00:23:43,280 Speaker 2: spaces like free schools and makerspaces and cooperatives, they 395 00:23:43,320 --> 00:23:47,120 Speaker 2: give us the room to develop our cooperation and creativity. 396 00:23:47,320 --> 00:23:47,440 Speaker 1: You know. 397 00:23:47,480 --> 00:23:49,760 Speaker 2: Of course, the system isn't going to stand by as 398 00:23:49,800 --> 00:23:54,160 Speaker 2: these transformations take place. It might tolerate or even celebrate 399 00:23:54,240 --> 00:23:57,520 Speaker 2: some, like the examples that Bregman had looked at, but 400 00:23:57,560 --> 00:24:00,080 Speaker 2: those are always going to be treated as exceptions. The 401 00:24:00,119 --> 00:24:02,080 Speaker 2: second you try to make them the norm, I think 402 00:24:02,080 --> 00:24:06,000 Speaker 2: you're going to face some real challenges, because ordinary people 403 00:24:06,000 --> 00:24:08,960 Speaker 2: want these things, but the rulers don't. It's like 404 00:24:09,000 --> 00:24:12,120 Speaker 2: the example that I had brought up earlier, you know, 405 00:24:12,520 --> 00:24:15,880 Speaker 2: the famous nineteen fourteen Christmas truce during World War One, 406 00:24:16,400 --> 00:24:19,840 Speaker 2: where British and German soldiers put down their guns, they 407 00:24:19,840 --> 00:24:24,000 Speaker 2: sang songs, they played football, but eventually the high command 408 00:24:24,160 --> 00:24:27,600 Speaker 2: cracked down on these truces. 
The fraternization of people who are 409 00:24:27,600 --> 00:24:29,880 Speaker 2: different from each other was a threat to the war 410 00:24:29,960 --> 00:24:34,720 Speaker 2: machine, because these systems are invested in maintaining hostility and division, 411 00:24:35,080 --> 00:24:38,440 Speaker 2: and so we have to consciously and openly stand up 412 00:24:38,600 --> 00:24:43,040 Speaker 2: against hostility and division to build systems that bring out 413 00:24:43,040 --> 00:24:46,000 Speaker 2: the best in people. I don't think that a hopeful 414 00:24:46,320 --> 00:24:49,719 Speaker 2: view of human nature should be seen as utopian. As 415 00:24:49,760 --> 00:24:53,880 Speaker 2: I said earlier, it's realistic. Cynicism is not realism. They're 416 00:24:53,880 --> 00:24:57,080 Speaker 2: not the same thing. Having hope does not mean that 417 00:24:57,119 --> 00:25:00,800 Speaker 2: you are completely deluded about the dark side or dark 418 00:25:00,880 --> 00:25:05,000 Speaker 2: aspects of humanity and humanity's possibilities. But it means that 419 00:25:05,040 --> 00:25:08,680 Speaker 2: you don't limit yourself to that outcome, that you challenge 420 00:25:08,680 --> 00:25:12,199 Speaker 2: that narrative, and that you seek to do better and 421 00:25:12,280 --> 00:25:15,840 Speaker 2: to create something better. And that's really what I care about, 422 00:25:16,440 --> 00:25:18,840 Speaker 2: and that's all I have to share. All power to 423 00:25:18,880 --> 00:25:19,480 Speaker 2: all the people. 424 00:25:20,119 --> 00:25:26,120 Speaker 1: It Could Happen Here is a production of Cool 425 00:25:26,200 --> 00:25:29,359 Speaker 1: Zone Media. For more podcasts from Cool Zone Media, visit 426 00:25:29,400 --> 00:25:32,520 Speaker 1: our website coolzonemedia dot com, or check us out on 427 00:25:32,560 --> 00:25:36,760 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. 
428 00:25:37,040 --> 00:25:38,959 Speaker 1: You can now find sources for It Could Happen Here 429 00:25:39,000 --> 00:25:42,000 Speaker 1: listed directly in episode descriptions. Thanks for listening.