Speaker 1: Would you electrocute someone if you were told to by an authority figure? Would you torture a prisoner just because you were put in the uniform of a prison guard? How much of your behavior is a function of your situation? And what does any of this have to do with matching the length of a line, or soldiers posing with dead bodies of their enemies, or propaganda, or dehumanization, or how we should educate our children?

Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford, and in these episodes I examine the intersection between our brains and our lives, and we sail deeply into our three-pound universe to understand why and how our lives look the way they do.

In last week's episode, we talked about dehumanization and how your brain can dial up and down the degree to which you view another person as human. And I gave you a particular example of empathy, which is where you're simulating what it is like to be someone else, and we saw how empathy can be modulated based on whether those people are in your in-group or your out-group. For today, I want to drill down a little bit deeper into the heart of a related issue, which is that when you look at people who are committing these very violent acts throughout history, the assumption for a long time has been that it's something about the disposition of those people. In other words, there's something really wrong with those people. But this started coming into question some years ago, because there were so many hundreds of thousands, sometimes millions, of people participating in these violent acts, and it's a strange theory to say that all of them had something wrong with their brains. Instead, what researchers started thinking about is that maybe there are situational forces that make people behave in these incredibly awful ways. So there's something to be understood here about the social context that people find themselves in that causes them to behave a certain way.
And of course this leads to the question of whether you could behave this way if you found yourself in that situation. And this makes us all very uncomfortable to even think about, because we know we're good people and we're not going to behave in these terrible ways. But the reason it's important to ask these questions is that social psychologists got really interested in what was happening with what came to be known as the banality of evil. So after World War Two, for example, Adolf Eichmann was on trial. He was one of the main coordinators of the Final Solution for the Jewish population. He personally had the blood of hundreds of thousands, or maybe millions, of people on his hands. And the thing is, as the journalist Hannah Arendt put it as she covered his trial, she called this the banality of evil, because there was nothing particularly special about Adolf Eichmann. He was on the stand and he said, I was just doing my job. He was part of this machinery. He had this opportunity to impress his wife and the people around him. There were all kinds of situational forces at play here. Now, this is no defense of his behavior, but what it does encourage us to do is try to understand what these situational forces are that steer whole populations of people to do incredibly awful things that in other situations you wouldn't even consider. And this is why the whole research question about situational forces came to the forefront.

So right after World War Two there was a research psychologist named Solomon Asch, and he decided, I want to understand how it is that social forces can change people's decision-making. So he did a very simple experiment. You come in to participate in the lab, and you see there are seven other people that are there to participate as well, just like you. And you're all shown a line on the screen of a certain length, and then you're shown three more lines marked A, B, and C, and you're asked which one matches the original line in terms of length. But it just so happens that you're sitting in the eighth chair, and so the first person registers his answer, and the next person calls out her answer, and so on, and it goes down the line. And it turns out that all these people are shills. That means they are plants from the experimenter. They're not just like you, even though they appear to be just like you. And sometimes what they'll do is they'll all say the wrong answer, but they'll say it confidently. They'll maybe pick the shortest line over there, and they'll say, oh yeah, it's definitely line C. And you stare at it and you think it's line B. But person number one, person number two, number three, they're all picking line C. So what do you do when it comes time for your turn? Are you going to say, you know what, you guys are all wrong, it's line B? Or do you think, gosh, maybe there's something wrong with me in the way that I'm seeing this?

Now, Solomon Asch didn't think this would work. He figured people would go ahead and stick with what they thought was the right answer. But the results were really surprising, because what happened is that almost everybody conformed to the group, whatever the group was saying. The experimental subject was reluctant to say otherwise, even though this was a clear, easy perceptual task with a right and wrong answer. So this was a surprising result. Now, as it turns out, Asch had a student in his lab, a young man named Stanley Milgram, who watched this. And Milgram, being Jewish just like Asch, and having just seen what happened in World War Two, was very interested in these issues.
And Milgram noticed something, which is that there was no social consequence to this line experiment. It wasn't a moral decision of any sort. It was just a very simple perceptual decision. So Milgram decided to launch an experiment that drilled a lot deeper, and it's one of the most famous experiments in psychology, but not everyone knows the details. So let's just walk through this.

First, you see an ad that's advertising for people to participate in a study about memory. It says, we will pay five hundred men to help us complete a scientific study of memory and learning. The flyer says that you will get paid for one hour of your time and there's no further obligation. So you show up. It's this laboratory at Yale University, and you're told that in this experiment you're going to play the role of the teacher. And there's another volunteer over on the other side of the wall, and you're told this is a study on the effect of punishment on how well people can learn. And so it's explained to you that the learner will be asked to memorize arbitrary pairs of words, and he's going to be continuously quizzed on it, and he's just been strapped to a chair that can give him a small electrical shock. So if he gets an answer right, the experimenter moves on to the next problem. But if the guy gets the answer wrong, your job is to deliver a small electrical shock. Now, you're sitting in front of this device which has thirty switches in a row on it, and these switches are marked on the left side as slight shock, all the way up to the right side where it says danger, extreme shock. And so you hear the experiment begin. The job of the learner is to learn these associations between the words, and he gets the first answer right, and then he gets the second answer right, and so on. But finally he can't remember some pairing and he gets the answer wrong.
So now the experimenter in the white coat says to you, I want you to press the button for the lowest level of shock, level one. So that's all you need to do. You press the button that reads slight shock. And even though it seems like the kind of thing you might not normally do, you're being instructed by a professional researcher, and so you hit the button. You can't see the learner and you don't hear anything, so the whole thing is pretty straightforward. Okay, so he's going along and he gets another wrong answer, and you're told to give him a tiny shock again, and this goes on. But the guy isn't performing that well at memorizing the pairs of words, and so the experimenter tells you at some point, okay, each time he gets another wrong answer, I want you to increase the level of the shock.

So the learner gets it wrong and you're told to hit the second button, and this goes on, and every time he gets the wrong answer, you have to move up in the level of shock. So you work your way up along the buttons, and it changes from mild shock up to extreme intensity shock, and over on the right side the buttons read danger, severe shock, and in fact the last few buttons are past that danger sign and they're unlabeled. So you're getting a little worried about this. And as the learner is going along, you're hoping that he'll get the right answers so you won't have to keep going up very high. But the guy gets a wrong answer, and then another, and eventually another, and the experimenter very calmly tells you to give him these higher and higher shocks. And what's happening by midway up this scale is that when you give a shock, you hear the learner in the other room say, ow. And as you're going up higher than that, he says, ow, that really hurt. And as you go up higher, he says, let me out of here, I don't want to be part of this experiment.
So you look pleadingly to the experimenter, and the experimenter in his white coat says, keep going. So maybe you keep going, and the guy says, ow, I want out of here, let me out. And the experimenter says to you, keep going. And you say, I don't want to keep going, he's obviously in pain. And the experimenter says, don't worry, I take full responsibility. You are not responsible for any of this. You are just participating in an experiment. So the question is how high do you keep going? Because at some point, when you reach a pretty high level, the learner stops responding. You press the shock button, but you don't hear any cries anymore. Is he unconscious? Is he dead? There's nothing but silence now. And the question is, will you keep going even beyond that level? Will anybody go all the way to the top?

Now, as it turns out, the learner was a shill. He was not a real volunteer. He was working with the experimenter. You were the experimental subject. So Stanley Milgram talked to a group of psychiatrists, and he said, what's your prediction? How high will people go? How many people do you think will go all the way to the top and give somebody the strongest electrical shock, even though the learner seems to be unconscious at this point, or maybe dead? The psychiatrists concluded that their prediction was that one percent of people would do this, because psychiatrists are experts in human nature, and they were thinking about this in a dispositional way. In other words, who has that disposition? What kind of person would allow themselves to do that? And they figured, well, the only person who would do that is somebody who is psychopathic, and psychopaths make up about one percent of the population. But as it turned out, sixty-five percent of Milgram's volunteers went all the way to the top. They delivered four hundred and fifty volt shocks to the learner. Why? Simply because they were asked to.

So Milgram wrote a famous book after this, called Obedience to Authority, because he couldn't believe what he had just found. He couldn't believe that people would listen to him all the way up to the very top, where they were perhaps killing somebody, somebody who was a total stranger to them, who had done nothing to them. And in his book he describes nineteen different versions of the experiment that he did. He tweaked every possible parameter. He figured out the details of our behavior, like if you, the teacher, actually have to be close to the learner, where you can see him, then compliance goes down. Or similarly, if the experimenter, the guy in the white coat, is farther away from you, then compliance also goes down. And in the extreme case, when the experimenter is just talking to you on a telephone, compliance drops, but only to twenty percent. Now, twenty percent is still awfully high for delivering these strong electrical shocks. And by the way, he ran the experiment also with female participants instead of male, and even though the women expressed more stress about it, they still did exactly as much shocking: sixty-five percent of them went up to the topmost switch.

Now, I want to make a quick note here about the thirty-five percent, those people who did not go all the way up to four hundred and fifty volts. Thank goodness there's that thirty-five percent. Those are the people that we need to socially model. Those are the people who were perhaps raised in households where they were always asked to question why they were doing something. And even though we talked in the last episode about all these bloody events in the past century, there were always people who hid Jewish families in the Holocaust, or protected Chinese women during the invasion of Nanking, or people who sheltered Tutsis in Rwanda. So thank goodness those people exist.
And our job really is to take that thin radio signal to the world and amplify it, to work every day to cultivate that kind of bravery in ourselves and our children. And I'm not talking about acting like we know what's right and wrong and going and hurting people. I'm talking about being the kind of people who don't allow that to happen, no matter who is doing the hurting.

Okay, so back to Milgram's experiments. He wanted to make sure that this had nothing to do, in particular, with the academic prestige of Yale University, where he was, so he rented some random office space in downtown New Haven, Connecticut, and just said he was a random researcher, and people still complied just as much. So Milgram's experiments were a shocking illustration of how easy it is to get people to listen to authority.

Now, it turns out there is another kind of social influence besides authority, and that is the influence of your peers. It happens that Milgram had a high school friend named Philip Zimbardo. Milgram ended up at Yale, and Zimbardo ended up working at Stanford. Zimbardo was interested in how prison systems run and why people behave the way they do in prisons. So he recruited people for a two-week study, and he made sure to do full psychological tests on them to ensure they were all in a normal range, essentially random, normal research participants. Then he assigned them to the role of a prisoner or the role of a guard. For the guards, he gave them things like sunglasses to cover their eyes and billy clubs to carry, and for the prisoners, he stripped them of all their clothing except for a simple gown. And he made things as realistic as possible, so he actually picked up the prisoners in police cars and brought them in and had them handcuffed and checked in. And he had three different shifts of guards who would switch off every eight hours, whereas the prisoners actually lived there in this basement, which was set up to be just like a jail cell.

And you may have heard about the outcome of this experiment. The guards started acting like bullies to the prisoners, making them do arbitrary things just for the sake of punishment. So, for example, the prisoners had to line up to count off in the morning, and that was a perfect time for the guards to make up arbitrary rules like, okay, now you have to do it backwards. Now you have to sing. Now you're not singing sweetly enough, you have to do it again. And they would do this kind of thing for hours just to persecute these guys. But it just kept magnifying. The guards started becoming so creatively evil in coming up with punishments and rules, and this all happened quite quickly. The guards started taking away food from the prisoners, taking their beds away, locking them in solitary confinement, and everybody involved became psychologically a bit traumatized, and the experiment had to shut down early, because essentially right away the prisoners and guards fell into these roles. And what happened was the community was shocked by what had just transpired in this experiment, because these were just normal young men who, just by dint of being put in these roles, ended up behaving so differently. And Zimbardo wrote a great book on this called The Lucifer Effect, about how people can turn into such bad actors depending on the situation in which they find themselves. And what Zimbardo emphasized is the situational nature of our behavior.
In other words, it matters what role you're playing. And beyond a particular situation, he said, you have to understand whole systems to understand how humans behave. You have to understand more than their disposition, in other words, the kind of person that individual is, and you have to understand more than the situation they're in right now. You have to understand the whole system they're in. Because what happens in prisons, for example, is not unique to the Stanford prison experiment. It's typical of what happens in prisons, where the system is set up with guards and prisoners and they've each got their roles, and the guards want total compliance and the prisoners want to resist that, so the guards will keep upping the arms race until they're certain that they can get compliance from the prisoners.

And in this light, it's no real surprise what happened, for example, in Abu Ghraib, where photographs emerged of torture and humiliation of the insurgents that were being held there by the US forces. It's exactly the character of interaction that happened in the Stanford prison experiment. So you may have seen, some years ago, where photos surfaced of prisoners being stripped naked and humiliated. Another was of a man being terrorized by US guard dogs. Or there's a photo of a man standing on a small box, and he's draped in a sheet with his head covered, and he has two electrode wires attached to his fingers, and he was told that as soon as he loses his strength and falls off the box, he's going to be electrocuted. Zimbardo's point is that it's not as easy as thinking about this as a few bad apples in the system, which is how the Army worked to portray this. The Army said, we are shocked at what happened in Abu Ghraib. There were obviously some bad apples who behaved badly. Zimbardo's point is that it's not really a few bad apples; it's a systemic problem. It's a system that sets up particular situations like the Stanford prison experiment.
He wasn't making a defense of the soldiers who behaved badly like this, torturing their prisoners, but it's important to try to figure out how to build or repair these systems so that it doesn't happen. And shortly after that, eighteen pictures emerged of American soldiers who were posing with dead members of the Taliban, making it look like the bodies were doing things like having a hand on someone's shoulder. And what I thought was interesting was the headlines. The LA Times said, photo of US soldiers posing with Afghan corpses prompts condemnation, and in the subtitle, American officials denounce the actions of troops photographed with dead insurgents. But this condemnation is a little hard to understand, because the Army tells you, look, we want you to go over there, we want you to kill these guys, to wreck their roads and burn their bridges, but don't take any pictures with them, because that's disrespectful. This condemnation is complicated because the soldiers' behavior was part of the system, part of the situational variables that were set up. You get these young men and women to have vim and vigor and go out to kill the enemy, you give them propaganda, you dehumanize the enemy, and then you say, hey, we're outraged that you didn't treat this corpse respectfully.

And by the way, just as a side note, it turns out that with modern communication channels, these photographs surfaced quickly, and everyone thought this was some awful new sign of the times. But this is as old as war itself. People have always posed with the dead bodies of their enemies for as long as there have been cameras, and before photography, they would do things like cut off people's ears or take out teeth or stuff like that, and make belts and necklaces out of them. So there's nothing new going on here. Again, this is not a defense of that behavior, but it is to say there's something about the systemic variables in wartime that change the way people make decisions in these situations.
And I want to be clear about one point, which is that when I'm talking about these situational variables, this doesn't get rid of individual responsibility, but it gives us the tools to understand the variables that chronically cause situations like this. People who behave badly in Abu Ghraib prison or anyplace else still have to face punishment, for many reasons, because justice, it turns out, tries to accomplish many things at the same time, such as slaking public bloodlust and setting up examples for the next people. So the people who commit bad acts still have to get punished. The importance of studying all the variables that steer human behavior is not to let people off the hook. It's to prevent the next generation from ending up in the same situation and performing as badly.

Now, in the last episode we talked about dehumanization and its neural underpinnings, and in this episode we talked about the situational variables which play a role in it. So the question is, what can we do about dehumanization? As we come to understand the neural basis and the psychological basis and the contextual basis, how does that steer us? So I want to propose three lessons that emerge for us.

The first is the unspeakably important role of education, specifically education about propaganda and dehumanization and the social context that influences us. It's critical for us to teach the Milgram experiments and the Zimbardo experiments to our children and to ourselves, such that this becomes part of our background knowledge, so that everyone knows and talks about these sorts of experiments, and they know what to do about it. Because, after all, in his book Obedience to Authority, Milgram said, look, I'm going to take what we've learned here and distill it down to all the rules that people use when they want other people to do their bidding.

So he said, first of all, if a person wants you to be obedient to them, first they're going to prearrange some form of contractual obligation. In this case, it was, we're going to pay you four bucks, and then you're going to participate in this experiment. Next, they'll give you a meaningful role, like you're a teacher or you're a guard, and those activate particular response scripts. People feel like, oh, I know exactly what to do with that. Then the person will present basic rules to be followed, like when the student gets the wrong answer, you move up to the next level of shock. And these rules can be arbitrarily changed. Later, the person will change the semantics of the act. Instead of calling it hurting the victim, they'll call it helping the experimenter. The person will allow for a diffusion of responsibility. So remember, in Milgram's setup, the experimenter said, don't worry, I'll be responsible for anything that happens to him, just keep going. This way, when you're doing the act, you don't have to take everything onto your own shoulders. The person will start the path with small steps, like, just give him a little electrical shock, he'll barely feel it. And then as things go on, they'll say just a little more, just a little higher, until before you know it, you are at the top of the scale, delivering dangerously high levels to a person who might already be unconscious. It's like the frog in the frying pan: if the heat gets turned up slowly, the frog doesn't jump out, and humans are no different in this way. The person will also make exit costs high, but they'll allow verbal dissent. In other words, they won't let you leave the experiment, but they will allow you to express distress. They'll allow you to complain, because that'll make you feel better. A complaining person will still go up to four hundred and fifty volts, but they'll feel better about themselves if they say, oh, I feel really uncomfortable, I don't want to do this, but they still do it. And finally, the person will offer larger goals, some ideology that your small contribution helps with. In the case of the Milgram experiment, it was as simple as, we're trying to study the science of memory. In the case of genocide, it's usually about pride or purity, or economics, or restoring dignity and opportunity, or whatever the story is. But the idea is that as you shock somebody, or perhaps shoot somebody, you are working in service of a larger goal.

So Milgram was able to distill all these rules that we find whenever people blindly follow authority, and you see these rules played out the same way across place and time. And this is what we need to teach our children, so they know the signs to look for, so they know how not to get lured into the trap. I mean, for God's sake, we teach all children how to do long division by hand, and we teach them how to play soccer and how to watercolor. But why isn't it mandatory that we teach them lessons like this, like how to know when they are getting manipulated, how easy it is to get manipulated, how to develop immunity against manipulation by simply knowing the signs to look for? That would be an education worth having.

Okay, so the first thing we need is meaningful, universal education about these issues. The second thing we need is social modeling. In the last episode, I talked about Syndrome E, which is where your neural circuitry for caring about other people gets turned down or turned off, and people act like psychopaths, performing actions like murdering mothers and their babies on camera, things that would normally not even be thinkable or conscionable. And I spoke about it as though everyone in wartime can catch Syndrome E, or that everyone in Milgram's experiments showed inappropriate obedience to authority. But in fact there are always heroes who stand up against authority. In Nazi Germany, for example, there was a group of students known as the White Rose. They put all their efforts into making and disseminating flyers and pamphlets against the actions of the Third Reich.
Now, tragically, they were eventually captured and rounded up, and they were all executed by the Nazis. But this is the kind of thing for us to teach our children about and keep their names alive, celebrating heroes who stand against authority when they see something going horribly wrong. And I'm not talking about just being a pain in the neck to authority, because that's trivial and not always useful. I'm talking about seeing something that's actually really wrong. And even though it appears that all the adults know what they're doing and have good reasons, and they're only asking you to do something small, and they'll take the responsibility and so on, think about whether it's the kind of action you would want to take if you thought about it from your own first principles. Would you feel that it's conscionable to murder your neighbors and take their stuff? If the answer is no, then it should remain no, even if the world gets a little nutty. And this is where social modeling helps. We learn about heroes who stuck with their conscience. So if you know these stories, then the next time you find yourself in some situation, you've at least got a template that you can think about following. So number two is about teaching our children and ourselves about those who stand strongly against things that are asked of them in a time of war and madness.

The third defense against dehumanization is clever social structuring. I talked about this a few episodes ago with the Iroquois Native Americans, who lived up around what's now upstate New York. They're known as the League of Peace and Power. But they weren't always known as that, and certainly not four hundred years ago. There used to be six tribes who were always fighting with one another, real bloody battles. But in the sixteen hundreds they were brought together by a man who came to be known as the Great Peacemaker. He combined them into one nation. By the way, combining people is not enough.
It turns out that if you simply push people together, that can fall back apart easily. He did something more clever. He structured clans such that each tribe member ended up belonging to one of nine clans. So I might be a member of the Seneca tribe, but I'm a member of the Wolf clan, and you're a member of the Mohawk tribe, but you're also a member of the Wolf clan. And the key is that the memberships to tribes and clans cross-cut. So how is the Seneca tribe going to fight against the Mohawk tribe when I'm a Wolf and you're a Wolf? And by the way, my Seneca friend is in the Hawk clan, and your Mohawk friend is in the Hawk clan too. So when we all consider waging war, we think, I don't know, I've got friends over there, I've got fellow clansmen in that tribe. So by cleverly structuring things in a society, by cultivating cross-cutting ties, you can tamp down people's natural vigor to make easy out-groups. You can complexify their allegiances. I think it's likely naive for us to think about obtaining world peace by just getting everyone to get along, because we're very hardwired for in-groups and out-groups. But we can structure things carefully, like the Iroquois chief did, so that things have counterbalance, so that it's not so easy for people to raise arms against one another. So that's our third tool: social structuring to create or enhance cross-cutting allegiances.

So let's wrap up. In this episode and the last one, I've told you about the way that the human brain is so social and comes to understand other people as people, but also how easy it is for brains to form in-groups and out-groups, and how the circuits in your brain that understand other people can come to see them more like objects. They're no longer human; they are dehumanized.
And once someone has become an object, it becomes much easier, sometimes trivial, to do what you need to make them not be a problem to you anymore. And pushing people into the out-group is not hard to do. We saw last week the tools of the trade of propaganda and other techniques of dehumanization, and in this week's episode we saw how the situation you're in can influence decision-making as well. All it takes sometimes is an authority figure telling you there's a contract and you have to do this, and this is for a greater purpose, and don't worry, the responsibility will be diffused off of you, just press the button. Or, in the case of Zimbardo's experiment, systems have an inherent structure such that guards and prisoners have their own implicit scripts, and it's very easy to find that you know those scripts, and in that situation you play out those scripts. All of these situations make it much easier to take dehumanization on board, to treat another person not like someone just like you, but instead like an object. And although we're a massively social species capable of such empathy, it's just not difficult to set up situations where we're capable of such violence.

And I suggested three ways that we might take our knowledge of this and reorganize ourselves. The first is education of our community about the tricks of the trade of propaganda and obedience to authority, because the truth is, once you know this stuff, it becomes so obvious what people are trying to accomplish, and you have a meaningful immunity to it. But without education on it, the youth of each new generation is at risk. So let's get this information into schools and communities. The second way is social modeling, that is, looking at people who stood up before us, like the people in Milgram's experiments who said, sorry, I'm not going to deliver that next shock. And the guy says, but you have to, that's the experiment,
I will take responsibility. And you say, no, I'm not going to do it. It takes courage to be that kind of person, and we'd all be better off if we saw lots of examples of that kind of behavior. Then it wouldn't seem so foreign to us, and we would find it easier to discover that courage when we need it. And the third thing is figuring out ways to surface the ties between people that perhaps they weren't aware of before, to establish ties that bind across all the typical boundaries. And I'll talk in a different episode about how we might do this by leveraging the power of social media. Social media is not going away, so let's see if we can leverage it for unity instead of division.

So those are three strategies we can take. And the reason this matters is because we have evolved for eusociality. We're not independent contributors. We have succeeded as a species because we behave as a superorganism, as a group. And the reason I think it's so absolutely critical to study all this is because this is what is going to define our future. I mean, we pour billions of dollars into working to understand the science of Alzheimer's and cancer and diabetes, as we should, but these issues of dehumanization affect our species in an even deeper way, and there's comparatively little research about this. The important thing for our future is understanding why and how people can behave so badly towards one another. This may be the single most important question in terms of our legislation, the education of our children, and the future of our species.

Go to Eagleman dot com slash podcast for more information and to find further reading. Send me an email at podcasts at eagleman dot com with questions or discussion, and I'll be making an episode soon in which I address those. Until next time, I'm David Eagleman, and this is Inner Cosmos.