Speaker 1: Why is there so much polarization in the world, and what does this have to do with the brain? And what does any of this have to do with how you picture a cat, or why we respond to certain cartoons, or the British nobleman Lord Gordon, or the Iroquois Native Americans? And why do you naturally feel that everyone who disagrees with you is a troll or misinformed, and that if you could just shout loudly enough in all caps on Twitter, they would see that you're right? Why can't they see that you know the truth?

Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford University, and I've spent my whole career studying the intersection between how the brain works and how we experience life.

It hasn't escaped anybody's notice that we are in a time in which polarization and disagreement are higher than most of us have seen in our lives so far, and so in the past decade I've become very interested in the brain science behind that, behind polarization, and more generally, how we come to believe our own political opinions, and why we're so certain that everyone else is wrong, and how, if we could just talk to them, if they could just listen to us, they would see the light and they would know that we are right and they were mistaken.

Now, I want to set the stage. Polarization is not a new thing. Although we are in a polarized era right now, this is far from unique. Just think about the Civil War in America, where you had brothers and neighbors taking up arms against one another. Or the nineteen sixties and seventies, when people here held vastly different opinions about the war in Vietnam and how to treat the returning soldiers. Or take stuff that's even bigger, like Nazism in Germany, which was the most advanced country in Europe.
The thing to recognize is that in the nineteen thirty-four elections in Germany, every single seat in the Reichstag, the German Parliament, was either Communist Party, far left, or National Socialist Party, far right. Or look more generally at the whole twentieth century: the Communist Revolution in China or in the USSR, or the Hutu massacre of the Tutsi in Rwanda, or the Khmer Rouge in Cambodia, and on and on. There's nothing new about polarization and people taking up arms. And what I want to talk about today is why.

So the question on lots of people's minds, it seems, is: is this because of social media? I don't think that has much of anything to do with it. Keep in mind that all the examples I just named preceded Twitter and Facebook, and those were much worse than what we have going on currently. The fact is, it doesn't take much to get people all worked up over different opinions and taking up arms, and you don't need social media for that. And my goal today is to explain why it is so easy for people to get so worked up and believe their truth. So this is what we're going to explore: why does everyone have different opinions? And why does everyone with a different opinion from yours seem misinformed or obstreperous or like a troll?

So I'll start with a cartoon and a story. The cartoon was one that I saw recently on Twitter. It shows a bunch of people walking on a road, and up ahead there's a fork in the road. Off to one side there's a small number of smart, thoughtful people following a winding path marked "complex but right," and on the other side of the fork there is a very packed group of people, all walking, and it's labeled "simple but wrong," and this takes them to a cliff that they eventually tumble off of. Now, I'm going to come back to this cartoon later, and when I come back, we're going to understand it in a slightly different way. But first I want to turn to this
story, which is a true story. Many years ago, I got my PhD in neuroscience. I was a second-year graduate student when the new class of first-year students came in, and one student really stood out. I'm going to call her Tanya. She seemed very sweet. And what we found out was that Tanya had extraordinary grades. She'd come from a very impoverished neighborhood in Chicago, but she had these incredible grades and these incredible letters of recommendation and great scores on her GREs. So during the first month or two of school, she was given a special award, and there was going to be a banquet for it, and to my surprise, she asked me if I would be her date to the banquet. So I said yes. I didn't know her well, but I thought she seemed very sweet, and so I said yes.

So the banquet was supposed to be on Friday of that week, but as it turns out, the banquet never happened. Why? It's because on Tuesday of that week, Tanya was in the women's restroom with one of the administrators at the school, and they started talking. The administrator said, wow, Tanya, everything is so amazing about you and your grades and your skills. I want to know how your school cultivated a thinker like you. And Tanya just had some humble answer. And so this administrator decided she was going to call the school and find out how they produced somebody like Tanya. So she calls to talk to one of the professors who wrote a letter of recommendation for Tanya. She dials up and she gets the secretary and she asks to speak to the professor, and the secretary says, who? And the administrator repeats the name: I'm looking for Professor So-and-so. And the woman on the other end says, there's nobody here by that name.
So the administrator says, yeah, this is the professor that wrote this letter, and the secretary says, I've worked here thirty years and there is nobody here by that name. So it turned out to be a fake letter of recommendation. So the administrator calls about the second letter of recommendation: same story. So she calls the third recommender and gets connected, and it turns out that it was Tanya's mother's office. And so what quickly unraveled is that Tanya had faked everything: her transcript, her GREs. And this was, by the way, back in the nineties. So she did this by digitally scanning her GREs and then changing them with early versions of Photoshop, and then reprinting them. Anyway, one of my colleagues quipped that she should get a PhD just for the cleverness of her deception. But the thing that struck me was how blind we all were to the deception. We were completely fooled by it.

So anyway, the graduate school said to her, whoa, there's something really strange here, and you have to come up with an explanation for this. And Tanya just ran away. But one of my classmates caught her on the way out at the door, and she had an excuse for everything. She said, I got screwed by this person, and this person cheated me, and I thought they were writing a real letter, and I thought the school was accredited, but they lied to me. So she had an excuse for everything. Now, obviously the details of this story stuck with me through the years, because I had almost gone on a date with this girl.

Anyway, that was the end of the Tanya story, or so I thought. A couple of years later, someone pointed me to an article from the Yale University newspaper, and that's how we learned that Tanya had left our school and gone to Yale next and faked her way into graduate school there, and Yale had caught her, just like the first time. But Yale was mad, because they had been paying her a stipend, and so they put her in jail.
And I pictured the girl that I almost went on a date with sitting on a cement bench in jail. And the next newspaper article I found showed that in jail, Tanya had bitten two guards. At some point she was released from jail, and then we heard nothing about her after that, until two years later. Because Tanya went home to where she lived in Chicago, and she and her mother decided to do a big drug deal with two men who turned out to be undercover agents, and they got caught, and she was going to be sentenced for a long time. And so she and her cousin came up with an idea.

So they went out and they found a woman who was a drug addict, who was approximately her size and looked a bit like her. And they said, hey, we're going to give you free drugs if you come to the dentist. She told the woman she was doing an insurance scam, and she said, come to the dentist and say that your name is my name. And she had the woman wear gloves to the dentist's office so there wouldn't be any fingerprints, and she accompanied her and signed the paperwork for her. And then, after they had the imprints, they brought this woman home and they smashed her on the head with a brick and injected her with a bunch of insulin to make her pass out. That's all they had access to, and that's why they used the insulin. And their plan was to kill this woman and burn the body, so that the dental imprints would be found and they would conclude that Tanya was dead, and then she wouldn't have to go to jail. As it turns out, this poor woman eventually came to in the basement of Tanya's cousin's house, and she managed to scramble out of the window, and she ran across the street to a Kentucky Fried Chicken restaurant.
She screamed for help, and Tanya realized that the woman had just escaped, and she ran into the Kentucky Fried Chicken right after her and started screaming that the woman had stolen her money. And everything was very confusing, but the police showed up, and they couldn't figure out what was happening, so they arrested everyone. Here was this woman with blood all over her, so they just took everyone into the station, and eventually the whole thing came unraveled. And everyone involved in this case was jaw-dropped as the story unfolded: the ease and creativity with which Tanya thought of a plan that went, okay, how about I just kill this woman and then burn the body. And when you burn the body, the dental imprints are the things that last. And because this woman had gone to the dentist under Tanya's name, then the world would conclude, I'm dead, and then I won't go to jail. And I was thinking, wow, I almost went on a date with this woman.

So this was one of the moments in my life when I was struck by how different people can be on the inside, and how little insight we have into the cosmos of someone else's brain and mind. And happily this was positioned at the very beginning of my neuroscience career, and it influenced what I've been studying since. It has always struck me as fascinating, the differences between people. Everyone is very different on the inside, sometimes much more than we expect.

Now, it turns out my father was a forensic psychiatrist, and he was involved in most of the big mass murder cases in New Mexico, where I grew up. One of them was a guy named William Wayne Gilbert, who had killed three people, cold-blooded murder, and my father became the psychiatrist in that case. Now, I was a child, and we went to some social event with my father, and I remember somebody saying to my father, Gilbert should not get the death penalty, because presumably he feels terrible for what he's done.
Presumably he feels deep regret for having killed three people. And as a kid, I remember totally feeling like I agreed with that. But I remember my father's surprise at the statement, because my father had just spent hours in deposition with Gilbert, and he explained to this man, genuinely and professionally, that it simply wasn't the case that Gilbert had regret. Because when William Wayne Gilbert would think about the idea of going to murder somebody, he said, he had the same level of excitement that he did as a child on the night before Christmas. That's what it felt like for him when he was thinking of killing someone. And so my father's point to this man, and to me when I was eight years old, is that you can't actually stick yourself in other people's shoes, as much as you'd like to. You think everybody's just like me, especially when you're a child, but in fact people can be quite different on the inside.

And it turns out that the governor of New Mexico at the time commuted William Wayne Gilbert's sentence, since he was sentenced to death, so Gilbert was not going to die. Now, to show his gratitude, he managed to smuggle a pistol into the prison and led one of the most stunning prison breaks from a maximum security penitentiary, where he let out several other murderers. And so for a few months it was very tense in New Mexico, because there was this group of mass murderers on the loose. They were finally found again, but only after this very scary month or two where they were hiding out and taking hostages. So all this stuff really got me thinking from a young age about the differences between people.

So I'll just mention one more anecdote here. My father deposed a guy who was on trial. This guy had gone into a Western Union office and asked the clerk to use the phone, and the clerk had said, sorry, but the phone is only for the back office people and not for customers.
So this guy jumped the counter and beat the clerk almost to death. And the interesting part is that he said to my father during the deposition, but anybody would have done the same thing in that situation, right? And he meant it. He was being genuine, because we all have an internal model of the world, of what constitutes appropriate behavior in the world, and we assume that everyone else's model is the same as ours. And this guy genuinely could not imagine anyone having a different reaction in that circumstance.

Now, when we think about the differences between people, we're used to thinking about extreme cases like Tanya or William Wayne Gilbert or this guy in the Western Union office. These people presumably have psychopathy. Psychopaths make up about one percent of the population. These are people who simply don't care about your feelings. You're just an obstacle that they're trying to get around to get what they want. But this idea that other people can be different from you on the inside can be generalized. So take something like schizophrenia. You see a man on the street corner and he's yelling. He's in an angry dialogue with somebody who's not there. He's delusional; he's not in contact with reality. So in situations like this, you look at the man and you say, okay, I guess that man's internal model isn't the same as mine. He's not experiencing reality the same way I do. So you might think, okay, I get that about different models on the inside for psychopaths and for people with schizophrenia, but otherwise the intuition is that the rest of us are all about the same. But there are some important things to note here. First, when it comes to something like psychopathy or schizophrenia, we tend to think about these as being categories, like, okay, that person's psychopathic and I'm not, or that person is schizophrenic and I'm not. But in fact, all of these things are now appreciated as living on a spectrum, so you have different degrees along the spectrum. So take psychopathy, for example.
There's a well-used way to measure this. It's a questionnaire, and you score a certain number of points on it, from one to forty, and in the United States, if you're above a thirty, then you are labeled a psychopath. But you don't want to be roommates with the guy who scores twenty-nine on this test: even though they're not technically a psychopath, they share a lot of the fundamental characteristics. So it's a spectrum issue. And by the way, this is the same with everything in psychiatry. This has been the direction of the Bible of psychiatry, called the Diagnostic and Statistical Manual. It used to be all about categories, and now everything is about living on a spectrum.

But let's keep drilling down. When we look at mental illnesses more generally, we find things that influence people's thoughts or feelings or behaviors, including not just things like schizophrenia or psychopathy, but you presumably know people with clinical depression or bipolar disorder, or anxiety disorder, or obsessive-compulsive disorder, or borderline or narcissistic or avoidant personality disorder, or an eating disorder, or dissociative identity disorder or panic disorder, or many other things like that. There are at least one hundred and fifty-seven of these in the Diagnostic and Statistical Manual. So it becomes increasingly difficult to assert that everybody is exactly like you on the inside. Despite superficial appearances, people can be very different in terms of what is happening on the inside. And if you've read my books or listened to my other podcasts, you know that when we see people with strokes or brain injuries to different parts of the brain, it's not hard to say, oh, I guess their reality is a little bit different than mine. But it proves harder to think about this in terms of everybody that you know and love, because we assume that the people in our lives have essentially the same thing going on on the inside
that we do: the same opinions, the same way of sense-making, the same way of gathering meaning, the same political views on the world. But they don't. Everyone that you know is having a slightly different reality going on than you are. We are very different people on the inside.

Now, I want to be clear that I'm not just saying this as a philosophical claim. We can increasingly measure so many examples of this, and this is something I've worked on in my lab for most of my career. Neuroscience has always cared about the big disorders, the things that are most obvious and societally costly. But when we start looking at the more innocuous details, we uncover these clear and measurable ways that reality can be different inside different heads.

So, for example, imagine that you and I and a bunch of other people are looking over Times Square in New York. We're standing on a corner and enjoying watching the crowd. So you open your eyes and there's the world in all its blues and golds and greens. But if you happen to be colorblind, you're seeing it differently than the person next to you. Maybe you can't distinguish reds from greens; those look exactly the same to you. Or maybe you have a more extreme form of color blindness in which there are no colors at all, just shades of gray. So for you and the person next to you, the internal experience can be quite different, even though you're looking at the same scene. And we now know that a small fraction of the female population has not just three types of color photoreceptors in their retinas, but four types of color photoreceptors, which means they are seeing colors that the rest of us can't even imagine. This is called tetrachromacy, and I'll come back to it in a future episode. But the point I want to make now is that we might all be watching the same corner in Times Square, but having totally different internal experiences.
This is the type of issue that I've been studying in my lab for many years, so you might have heard my episode on synesthesia. In synesthesia, people have a blending of the senses. About three percent of the population will, for example, see a letter on a page, and that'll trigger a color experience for them on the inside. So maybe S is purple for you, and L triggers a green experience, and so on. It's different for every synesthete, and there are many forms of synesthesia. You might hear a sound and that triggers a visual experience, or you taste something and it puts a feeling on your fingertips. Essentially, any sense can end up having crosstalk with any other sense in these different forms of synesthesia, and my colleagues and I have documented dozens and dozens of forms throughout the population. Now, synesthesia is not considered a disease or a disorder. It is simply an alternative reality. The point is that people can have very different experiences on the inside, but to a synesthete, their experience is precisely as real as anything you might experience. So in neuroscience, this is just one more recent appreciation that from one person to the next, reality can be a little bit different.

And let me give you one more example that's even newer: the issue of how we imagine a visual scene inside our heads. So I'm going to ask you to picture this: picture a gray and white cat on a picnic table, eating colorful cereal and looking at you suspiciously. Really picture that in your head now. There were two researchers who began looking at this question of mental imagery some years ago, Stephen Kosslyn and Zenon Pylyshyn, and they ended up disagreeing very strongly. Kosslyn said, when you're imagining something, you're essentially running your visual cortex to see it like a movie. It's like vision. And Pylyshyn said, that's ridiculous. You're not seeing something. It's purely conceptual. It doesn't involve seeing or vision.
And they both did experiments back and forth, and Pylyshyn said, look, you're insane, it's not stored like a picture, and Kosslyn said, no, you're insane, it's not stored like a proposition; you're actually seeing it. And it was very difficult for a conclusion to be reached here. Both argued passionately for their side of the argument, and this went on for twenty years in the literature. So why couldn't they come to an agreement on this?

The answer is, Kosslyn had what we now call hyperphantasia, which means he has extremely vivid mental imagery. When he imagines something, it's as vivid as real seeing. Now, this is the opposite of what Pylyshyn had, which is called aphantasia, where mental visual imagery is not present. He doesn't see anything in particular; he just has a concept. And all of this lives on a spectrum, from hyperphantasia to aphantasia. Everyone in the population is somewhere on this spectrum. So think about this for a moment. Picture an ant crawling on a checkered tablecloth towards a jar of purple jelly. Are you closer to the hyperphantasia side, where you're seeing it like a movie, or closer to the aphantasia side, where you can understand the concept perfectly fine, but you're not seeing anything? My lab has studied this in detail, and we use neuroimaging to figure out what's going on in the brain along this spectrum. But what I want to focus on today is why there was such a spirited debate in the literature for two decades before anyone realized there was a spectrum. It's because both researchers were operating under the assumption that everyone else experiences visual imagery just like they do. So when Pylyshyn introspected, he wasn't seeing a picture, and so he felt it was clear that other people don't either. And when Kosslyn introspected, he was seeing a super clear image, and he assumed that's what happens inside every head. And I want to use
the debate between them as a more general metaphor: we all assume that everyone else is experiencing the world the way we do. My point in talking about color vision and synesthesia and visual imagery is this: as neuroscience and psychology move on from studying the really big disorders, we find increasingly more subtle issues, which cause us to say, wow, I didn't realize that someone could experience the world so differently than I do. But the fact is that everyone is having a different internal experience.

And this led me to search for a good metaphor. And when I saw the movie poster for The Martian, I thought, oh, that's it. Because the poster shows a single person, Matt Damon, walking around on his own planet. He's the only one there. And I thought, that's the perfect model. We're each living on our own planet, we're each having our experiences, and we think, yeah, this is reality, and it makes sense that everyone has the same experience that I do. But in fact, just like in any galaxy, each planet is different. Everyone's got their own atmosphere, their own landscape, their own experiences. But we always feel certain that he or she feels exactly the same way that I do about whatever.

Take, for example, what comes to mind when I say the word justice. What happens inside everyone's head is slightly different. Or fairness: what comes to your mind might not be the same thing that comes to someone else's. Or attractiveness, or love, home, freedom, success. These concepts are triggering different neural networks in different people. They trigger different meaning, which is tied to your whole personal history and your aspirations. But we assume when we use words that the other person knows exactly what we mean. We operate under the assumption that words mean the same thing to me that they do to you. But in fact that never happens, because we each have different internal lives. And one of the ways that you can always appreciate this is just to look around you
the next time you're in the bookstore or the library. There are so many different sections, and nobody walks in and explores all the sections equally. Instead, people go in and they go straight to the section they want, the thing that resonates with their internal model of the world: mysteries or romances or westerns or sci-fi or whatever. They gravitate to particular things and not others because of the differences in their brains.

So this is the first important lesson that I want to establish: others see the world differently than you do. But why is this true? Why can't we have one really smart person who writes a blog post and says, hey, I think this is the way we should run the country, and all three hundred and sixty-five million people read it and say, yeah, that's pretty good? Why are we all so different?

This question has been at the heart of a very long-standing debate, where people attribute differences to either genetic factors or environmental factors; in other words, the nature versus nurture debate. Why do you argue with your sibling about political issues? Does your sibling have very different genes than you do, even though you have the same parents? Does your sibling have different nurture than you, even though you were raised in the same household? So are we determined by our genetics or our environment? And traditionally there have been very strong advocates on both sides of this. Well, both of these have something to say. So let's start with genetics. Do genetic differences matter? Heck yeah, they do. Although we're all members of this species, Homo sapiens, there are millions of differences in our genomes from person to person. For the cognoscenti, these are single nucleotide polymorphisms, or substitution variants, or copy number variants, and so on. And your genetics matter for who you are.
Take, just as an example from my book Incognito: I compiled statistics showing that if you are a carrier of certain genes, your probability of committing violent crime goes up eight hundred and eighty-two percent. I took statistics from the US Department of Justice, and I broke these down into two groups, those who carry the genes and those who do not. And it's a massive difference. Can you guess what this collection of genes is? We summarize it as the Y chromosome. If you have these genes, we call you a male. So genes matter.

But it's not all genes. It's also your experiences in the world. We drop into the world with half-baked brains, and we absorb everything around us. Everything that you know, that you believe, your language, your culture, your memories, it's all stored in this giant neural network. And how does it get stored? By reconfiguration of the network. This is known as brain plasticity. Brains absorb the world around them, and that's how the world shapes you. We're influenced by our culture, our friends, our neighbors, our generation, and so on. So you're shaped by both your genes and your environment, and these are intertwined in very complex ways, such that it's really rare that we can point to one or the other and conclude that it's responsible for something that we see. It's all about what we nowadays call gene-environment interactions. So you've got the genes that you drop into the world with, and then you've got all these experiences, and these intertwine in this complex way. Your experiences actually shape your nervous system and can feed all the way down to the level of which genes are getting expressed and which are getting suppressed. And by the way, you don't choose your genes, and you don't choose your childhood experiences. None of that is about choice, but it makes you who you are. Now, these differences can be quite subtle.
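(A quick aside on the arithmetic behind that "eight hundred and eighty-two percent" figure: here's a minimal sketch of the two-group comparison, assuming hypothetical rates, since the episode doesn't state the underlying Department of Justice numbers.)

```python
# A minimal sketch of the two-group breakdown described above.
# The rates below are HYPOTHETICAL placeholders, not the actual
# US Department of Justice figures the episode refers to.

carrier_rate = 0.0088      # assumed violent-crime rate among carriers (males)
non_carrier_rate = 0.0009  # assumed rate among non-carriers (females)

# Relative increase in probability, expressed as a percentage:
# (carrier rate / non-carrier rate - 1) * 100
relative_increase = (carrier_rate / non_carrier_rate - 1) * 100
print(f"relative increase: {relative_increase:.0f}%")  # ~878% with these made-up rates
```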
You can disagree with your sibling politically even though you're genetically similar and were raised in a similar environment, because small differences can take you off in very different directions. So you're shaped by both your genes and your environment, and hence the nature versus nurture question is dead. And brains end up as different from one another as faces. Just walk along the street and look at how different people's faces are, the variety of faces that you see around you. Well, there's that much variety in brains too. And just as a side note, I can recognize all my students just from their brain scans, because brains actually physically come out looking different from one another.

Okay, so we've established that people are very different on the inside, and given a sense of how that comes about. But you may think, okay, other people are different from me, but what is clear is that I see the truth. And if I could just sit down with them and have them listen to me, or I could just shout in capital letters on Twitter, everyone would see the correctness of my position, right? So the main question that comes up in our lives politically, whether on social media or in dinner conversations, is: why can't everyone see the truth?

So now we'll turn to Act Three, where we'll ask: how do you land on your opinions, your notion of the truth, and how accurate and complete is it really? The critical concept I want to tell you about here is this: your brain is locked in silence and darkness, and your brain's job is to build an internal model of the outside world so that it understands what is happening out there. So everything in your life, everything about the way the world works, is represented in your brain, usually unconsciously. How you deal with people, where your house is, how to operate the appliances in your kitchen, what language you speak and your spouse speaks, how to drive your car.
Everything in your life is represented in this internal model. And I'm not going to talk too much about the internal model today, except to say that one of the fascinating things is that usually it's totally unconscious. I gave an example of this in a recent episode, about putting your hands up on an imaginary steering wheel in front of you and pretending that you're driving thirty miles an hour down the road, and I asked you to make a lane change from the center lane into the right lane. And what essentially everyone does with their hands is they turn the steering wheel to the right and back to center. But that would actually steer you off the road and you would crash. When you actually get in the car and watch what your hands are doing, you'll see that the way you make a lane change is by turning to the right, back to center, just as far to the left, and then back to center again. That's how you make a lane change. Your brain has made a model of the physics of cars and steering wheels and roads and so on, but you don't even know how you do this, and you didn't even know that your brain had that model. This is the gap between what your brain knows under the hood and what your conscious mind has access to.

And the point I want to make is that you have this same sort of model about the whole world around you and its political truths. Now, the details of your trajectory in the world up to this moment convince you that you know the truths, even though someone else's internal model might tell them that they know the truth, and it might be different. Now, the really important point here, the thing that no one talks about, is that we don't usually take into account how poor our internal models are. Here's just an example. Starting in early twenty twenty, when the pandemic hit, why did all the bank lobbies close? After all, there were lots of other shops that were open.
The florist was open, the 585 00:34:54,440 --> 00:34:56,879 Speaker 1: hair salons were open, and all of these were much 586 00:34:57,200 --> 00:35:01,680 Speaker 1: smaller spaces than the spacious lobbies of the bank. And 587 00:35:01,719 --> 00:35:04,759 Speaker 1: it's not that the banks couldn't put up plexiglass in 588 00:35:04,800 --> 00:35:07,560 Speaker 1: front of the tellers' windows. In fact, that was usually 589 00:35:07,600 --> 00:35:10,360 Speaker 1: already in place. And it's not that the banks wanted 590 00:35:10,400 --> 00:35:13,600 Speaker 1: to be closed, because they still staffed the drive-up 591 00:35:13,640 --> 00:35:14,799 Speaker 1: windows throughout the day. 592 00:35:15,200 --> 00:35:16,239 Speaker 2: So what was going on? 593 00:35:17,000 --> 00:35:19,719 Speaker 1: Most people didn't know why. And this is an example 594 00:35:19,800 --> 00:35:23,320 Speaker 1: of internal models. If something is not in your model, 595 00:35:23,360 --> 00:35:26,000 Speaker 1: then it just doesn't strike you. The answer was that 596 00:35:26,080 --> 00:35:28,840 Speaker 1: in the spring of twenty twenty, most of the population 597 00:35:28,960 --> 00:35:33,319 Speaker 1: began wearing masks, and the bankers didn't want masked customers 598 00:35:33,360 --> 00:35:36,800 Speaker 1: coming through all day. It's a perfect costume for a robbery, 599 00:35:37,160 --> 00:35:39,719 Speaker 1: so they closed their lobbies. The point is simply that 600 00:35:39,840 --> 00:35:42,520 Speaker 1: we always think our understanding of the world is complete, 601 00:35:42,640 --> 00:35:45,279 Speaker 1: but we're always facing situations where we say, oh, I 602 00:35:45,280 --> 00:35:48,359 Speaker 1: guess I didn't know that; now it's complete. By the way, 603 00:35:48,400 --> 00:35:51,600 Speaker 1: the limits of your internal model are the engine 604 00:35:52,120 --> 00:35:55,120 Speaker 1: of comedy. So a comedian will say something like this: 605 00:35:55,600 --> 00:35:57,400 Speaker 1: I went to the doctor the other day and the 606 00:35:57,440 --> 00:36:00,720 Speaker 1: doctor said, put your clothes there in the corner next 607 00:36:00,760 --> 00:36:04,280 Speaker 1: to mine. It's funny and surprising only because your brain 608 00:36:04,920 --> 00:36:08,640 Speaker 1: makes a full world of assumptions about the doctor, and 609 00:36:08,680 --> 00:36:11,840 Speaker 1: then you realize that your model didn't have all the information. 610 00:36:12,360 --> 00:36:15,480 Speaker 1: And there are so many examples of the limitations of 611 00:36:15,520 --> 00:36:18,360 Speaker 1: your model. Imagine I were to draw a handful of 612 00:36:18,440 --> 00:36:21,240 Speaker 1: diagonal lines on a page and ask you what you see. 613 00:36:21,560 --> 00:36:23,920 Speaker 1: You'd say diagonal lines. But if I say to you, 614 00:36:24,560 --> 00:36:27,959 Speaker 1: how many are there here, you'd realize that you don't 615 00:36:28,000 --> 00:36:30,719 Speaker 1: know the answer, and you have to deploy your attention 616 00:36:31,360 --> 00:36:34,200 Speaker 1: to seek the answer. So this happens to us all 617 00:36:34,239 --> 00:36:36,840 Speaker 1: the time, where we think we have a complete understanding 618 00:36:36,880 --> 00:36:38,759 Speaker 1: of what's in front of us, but a little bit 619 00:36:38,760 --> 00:36:41,640 Speaker 1: of questioning unmasks that we don't actually have all the 620 00:36:41,640 --> 00:36:44,560 Speaker 1: details of the picture.
And I'm using this all as 621 00:36:44,560 --> 00:36:49,560 Speaker 1: a metaphor to emphasize the importance of genuine dialogue with 622 00:36:49,600 --> 00:36:53,680 Speaker 1: other people, because sometimes you can't see what questions you 623 00:36:53,840 --> 00:36:57,400 Speaker 1: haven't asked, or the things that you're not even aware 624 00:36:57,480 --> 00:37:00,520 Speaker 1: you're not aware of. And by genuine dialogue, I 625 00:37:00,520 --> 00:37:03,000 Speaker 1: don't mean how do I convince the other person that 626 00:37:03,080 --> 00:37:07,400 Speaker 1: I'm right and they're wrong. I mean listening and considering 627 00:37:07,640 --> 00:37:11,359 Speaker 1: and questioning, and trying where appropriate to change your own 628 00:37:11,440 --> 00:37:15,600 Speaker 1: mind or to stand in a slightly different viewpoint than 629 00:37:15,640 --> 00:37:18,520 Speaker 1: you were before. Now, I want to come back to 630 00:37:18,560 --> 00:37:21,200 Speaker 1: this issue that it's so easy to poke holes in 631 00:37:21,280 --> 00:37:24,400 Speaker 1: our internal models. And the question I've always wondered is 632 00:37:24,440 --> 00:37:27,600 Speaker 1: why do we think our models are complete when they 633 00:37:27,719 --> 00:37:32,239 Speaker 1: lack so many answers? So consider this. Do you know 634 00:37:32,280 --> 00:37:35,399 Speaker 1: what a bicycle looks like? Of course you do. Now, 635 00:37:35,480 --> 00:37:37,279 Speaker 1: if you're in a place where you can get out 636 00:37:37,320 --> 00:37:39,680 Speaker 1: a piece of paper, I'd like you to get it 637 00:37:39,719 --> 00:37:43,480 Speaker 1: out and sketch a bicycle. Go ahead. And if you 638 00:37:44,080 --> 00:37:46,719 Speaker 1: can't get the paper, then go ahead and sketch it 639 00:37:46,719 --> 00:37:49,759 Speaker 1: in the air with your finger. Just the basic outlines 640 00:37:49,760 --> 00:37:53,279 Speaker 1: of the frame, the wheels, the seat, the handlebars. 641 00:37:53,320 --> 00:37:54,440 Speaker 2: That's all. Okay. 642 00:37:54,480 --> 00:37:56,960 Speaker 1: Now, I hope you'll actually do this, because assuming you do, 643 00:37:57,120 --> 00:38:01,480 Speaker 1: you will find yourself shocked at how poorly you are 644 00:38:01,520 --> 00:38:05,680 Speaker 1: able to actually reproduce the bike on paper. You think 645 00:38:06,000 --> 00:38:09,440 Speaker 1: you have your bicycle pictured perfectly in your mind, but 646 00:38:09,560 --> 00:38:14,840 Speaker 1: your model, your internal model, is actually quite lousy. For example, 647 00:38:14,880 --> 00:38:17,560 Speaker 1: does the chain connect to both the front and the 648 00:38:17,600 --> 00:38:21,280 Speaker 1: back wheel? And what shape is the frame exactly? And 649 00:38:21,360 --> 00:38:24,680 Speaker 1: how do the handlebars plug into the front wheel? It's 650 00:38:24,880 --> 00:38:27,440 Speaker 1: shocking because you know what a bicycle looks like, or 651 00:38:27,440 --> 00:38:30,319 Speaker 1: at least you think you do. But it's actually a 652 00:38:30,360 --> 00:38:34,680 Speaker 1: big challenge when you're really pushed on your understanding. And 653 00:38:34,719 --> 00:38:39,560 Speaker 1: this is known as the illusion of explanatory depth. Just 654 00:38:39,600 --> 00:38:42,920 Speaker 1: because you're convinced that your model has the full picture, 655 00:38:43,160 --> 00:38:46,880 Speaker 1: it doesn't mean that it actually does. So here's another example.
656 00:38:46,960 --> 00:38:49,759 Speaker 1: Imagine that I ask you if you know how the 657 00:38:49,800 --> 00:38:53,759 Speaker 1: electoral college works in this country, and you might say, yeah, 658 00:38:53,800 --> 00:38:56,160 Speaker 1: I know how that works. But now I say, great, 659 00:38:56,480 --> 00:38:59,520 Speaker 1: please explain it to me, and you might find that 660 00:38:59,560 --> 00:39:02,080 Speaker 1: you get stuck. You think you know, but as soon 661 00:39:02,080 --> 00:39:04,960 Speaker 1: as I scratch the surface of something, you find that 662 00:39:05,000 --> 00:39:08,279 Speaker 1: it's not quite as clear as you suspected. And there 663 00:39:08,280 --> 00:39:10,480 Speaker 1: are a million examples of this sort of thing. And 664 00:39:10,520 --> 00:39:14,040 Speaker 1: if you start paying attention, you'll see more and more 665 00:39:14,560 --> 00:39:18,279 Speaker 1: of the limitations of your knowledge. So it's useful to 666 00:39:18,360 --> 00:39:21,600 Speaker 1: question ourselves, with all of our political opinions, about where 667 00:39:21,640 --> 00:39:27,280 Speaker 1: our knowledge is lacking, because not knowing information doesn't mean 668 00:39:27,800 --> 00:39:31,319 Speaker 1: that you don't have a high sense of certainty about it, 669 00:39:31,920 --> 00:39:34,600 Speaker 1: especially if you really don't know much about a topic 670 00:39:34,640 --> 00:39:37,719 Speaker 1: at all. My graduate advisor once told me to go 671 00:39:37,760 --> 00:39:40,799 Speaker 1: to the library to learn about lattice gases, and I 672 00:39:40,800 --> 00:39:42,840 Speaker 1: had never heard of that at all. So I walked 673 00:39:42,880 --> 00:39:46,080 Speaker 1: over to the library and I discovered there was half 674 00:39:46,080 --> 00:39:49,839 Speaker 1: a shelf with books all about lattice gases. And I 675 00:39:49,920 --> 00:39:53,480 Speaker 1: was shocked, because how could smart people have devoted their 676 00:39:53,520 --> 00:39:56,919 Speaker 1: scientific careers to writing about something that I had never 677 00:39:57,000 --> 00:40:00,400 Speaker 1: even heard of just twenty minutes before? And this is 678 00:40:00,440 --> 00:40:04,279 Speaker 1: the kind of lesson that emerges as you mature in 679 00:40:04,320 --> 00:40:07,359 Speaker 1: the world. But interestingly, it takes a great deal of 680 00:40:07,440 --> 00:40:12,040 Speaker 1: work to get there. If you've studied less about a subject, 681 00:40:12,480 --> 00:40:16,879 Speaker 1: you tend to overinflate your knowledge. This is what's known 682 00:40:16,920 --> 00:40:20,160 Speaker 1: as the Dunning-Kruger effect. These are a couple 683 00:40:20,200 --> 00:40:23,120 Speaker 1: of psychologists, and they ran studies where they asked 684 00:40:23,560 --> 00:40:26,800 Speaker 1: people a bunch of questions about humor 685 00:40:26,960 --> 00:40:30,799 Speaker 1: or grammar or logic. If you then take the people who 686 00:40:30,880 --> 00:40:35,839 Speaker 1: score in the bottom quartile, they grossly overestimate their 687 00:40:35,840 --> 00:40:39,319 Speaker 1: performance on the test. Although their test scores put them, 688 00:40:39,440 --> 00:40:43,040 Speaker 1: let's say, in the twelfth percentile, they estimate themselves to 689 00:40:43,120 --> 00:40:45,960 Speaker 1: be in the sixty-second percentile when you ask them, 690 00:40:46,239 --> 00:40:48,640 Speaker 1: how much do you know about this compared to other people?
691 00:40:48,960 --> 00:40:51,560 Speaker 1: In other words, the less that you know about a topic, 692 00:40:52,000 --> 00:40:55,880 Speaker 1: the more confidence you have in your own abilities. What 693 00:40:56,080 --> 00:40:58,480 Speaker 1: happens is that as you learn more about a topic, 694 00:40:58,680 --> 00:41:01,880 Speaker 1: your confidence goes down, and it's only much later, when 695 00:41:01,920 --> 00:41:05,080 Speaker 1: you become an expert, that it starts to go back up. So, now, 696 00:41:05,120 --> 00:41:07,200 Speaker 1: given everything I've told you so far, I want to 697 00:41:07,239 --> 00:41:09,880 Speaker 1: return to the cartoon that I described at the beginning 698 00:41:09,880 --> 00:41:12,719 Speaker 1: of this podcast, about the fork in the road where 699 00:41:12,760 --> 00:41:16,279 Speaker 1: one sign points to complex but right and the other 700 00:41:16,320 --> 00:41:19,719 Speaker 1: points to simple but wrong, and almost everybody is going 701 00:41:19,719 --> 00:41:21,919 Speaker 1: down the simple but wrong path, except for the few 702 00:41:21,960 --> 00:41:25,520 Speaker 1: people winding their way up the steep, complex but right path. 703 00:41:25,960 --> 00:41:29,239 Speaker 1: Now the cartoon struck me as funny, but perhaps not 704 00:41:29,480 --> 00:41:32,440 Speaker 1: for the obvious reasons, because when I saw this cartoon 705 00:41:32,480 --> 00:41:35,640 Speaker 1: on Twitter, I noticed that it had racked up many 706 00:41:35,800 --> 00:41:39,840 Speaker 1: thousands of likes. So I started to research who exactly 707 00:41:39,960 --> 00:41:43,360 Speaker 1: were the enthusiasts, and I had a suspicion that I 708 00:41:43,440 --> 00:41:45,680 Speaker 1: knew the answer, and that turned out to be correct. 709 00:41:46,120 --> 00:41:52,080 Speaker 1: Each person, of whatever political persuasion, sees himself among the 710 00:41:52,560 --> 00:41:56,719 Speaker 1: complex-but-correct thinkers winding up the steep path. Whether you are 711 00:41:56,760 --> 00:42:00,680 Speaker 1: on the right or the left, whether with the Independents 712 00:42:00,840 --> 00:42:04,600 Speaker 1: or the Green Party or the Libertarians, whether you're a fan of 713 00:42:04,880 --> 00:42:11,080 Speaker 1: Antifa or QAnon, whether you're a denizen of Wokistan or Magastan, 714 00:42:11,800 --> 00:42:15,879 Speaker 1: you fundamentally know that you are a person who engages 715 00:42:16,080 --> 00:42:20,880 Speaker 1: in refined and proper thinking. You appreciate data that is 716 00:42:21,040 --> 00:42:24,920 Speaker 1: intricate and meaningful, while people on the other side, for 717 00:42:25,040 --> 00:42:28,880 Speaker 1: reasons that you can only guess, they believe incorrect things. 718 00:42:29,400 --> 00:42:31,879 Speaker 1: What I want to emphasize is that each person who 719 00:42:31,920 --> 00:42:35,560 Speaker 1: clicked the thumbs-up button knew that he was not 720 00:42:35,840 --> 00:42:38,720 Speaker 1: like the sheep on the opposite side of the fork. Instead, 721 00:42:39,000 --> 00:42:44,680 Speaker 1: he possessed an intricate and accurate view of the world. Okay. 722 00:42:44,719 --> 00:42:48,440 Speaker 1: So which side of the political debate did the cartoonist support? Well, 723 00:42:48,480 --> 00:42:51,279 Speaker 1: who knows, who cares? It was presumably one of his 724 00:42:51,440 --> 00:42:56,239 Speaker 1: most successful cartoons because it was the skeleton key that 725 00:42:56,400 --> 00:43:15,239 Speaker 1: fit the lock of each reader's internal model.
So the 726 00:43:15,239 --> 00:43:18,880 Speaker 1: first step to rising above our internal models is to 727 00:43:18,920 --> 00:43:22,520 Speaker 1: start watching for these bug traps that lure us in. 728 00:43:23,160 --> 00:43:26,279 Speaker 1: I just saw an example yesterday, a bumper sticker that 729 00:43:26,400 --> 00:43:31,560 Speaker 1: read Make America America Again. And everyone on the road 730 00:43:31,560 --> 00:43:36,000 Speaker 1: who saw this bumper sticker presumably thought that sounded great. 731 00:43:36,280 --> 00:43:40,160 Speaker 1: Why? Because both sides of the political spectrum are equally 732 00:43:40,200 --> 00:43:47,160 Speaker 1: happy to engage in retrospective romanticization. Liberals and conservatives can 733 00:43:47,200 --> 00:43:50,359 Speaker 1: equally well reach back in memory to a time that 734 00:43:50,440 --> 00:43:55,719 Speaker 1: seemed less complicated, a totally illusory era where the nation 735 00:43:56,120 --> 00:43:59,800 Speaker 1: agreed with the logic of your political viewpoint, before the 736 00:44:00,400 --> 00:44:03,239 Speaker 1: arrival of the crazies with whom you now have to deal. 737 00:44:03,719 --> 00:44:06,400 Speaker 1: So the only thing that the bumper sticker really points 738 00:44:06,440 --> 00:44:11,440 Speaker 1: back to is the impoverishment of our memories. The cartoon 739 00:44:11,640 --> 00:44:14,279 Speaker 1: and the bumper sticker, these work because they can mean 740 00:44:14,400 --> 00:44:18,280 Speaker 1: anything to anyone, and what they reveal is the degree 741 00:44:18,440 --> 00:44:22,520 Speaker 1: to which we live inside our internal models. And we 742 00:44:22,640 --> 00:44:27,560 Speaker 1: assume everyone shares the same model we do, and anyone 743 00:44:27,560 --> 00:44:31,640 Speaker 1: who has a different internal model we tend to demonize. 744 00:44:31,680 --> 00:44:35,080 Speaker 1: And that's because we believe our models so strongly, because 745 00:44:35,080 --> 00:44:38,000 Speaker 1: that's all we have, whether that's our religion or our 746 00:44:38,080 --> 00:44:41,640 Speaker 1: political side of the spectrum; whether we're the Communists or 747 00:44:41,680 --> 00:44:44,800 Speaker 1: the Nazis in nineteen thirty-four, it makes us angry 748 00:44:44,880 --> 00:44:48,279 Speaker 1: that other people can't see the truth as clearly as 749 00:44:48,320 --> 00:44:51,600 Speaker 1: we can, and we are suspicious of them. And this 750 00:44:51,719 --> 00:44:54,560 Speaker 1: leads me to the final chapter of today's episode, which 751 00:44:54,600 --> 00:44:58,680 Speaker 1: is the notion of empathy. And there's an important aspect 752 00:44:58,760 --> 00:45:01,640 Speaker 1: of this that's typically overlooked. So I'll illustrate this 753 00:45:01,680 --> 00:45:05,839 Speaker 1: with a historical example. In the late seventeen hundreds, there 754 00:45:05,920 --> 00:45:09,800 Speaker 1: was a British nobleman named Lord Gordon, who was born 755 00:45:09,880 --> 00:45:13,960 Speaker 1: into privilege, but he found himself caring deeply about the 756 00:45:13,960 --> 00:45:18,279 Speaker 1: welfare of the sailors. He was an officer, but he 757 00:45:18,440 --> 00:45:22,640 Speaker 1: campaigned energetically to improve their conditions, and his empathy was 758 00:45:22,719 --> 00:45:26,480 Speaker 1: broader than that.
When they sailed to Jamaica, he was 759 00:35:26,520 --> 00:35:30,000 Speaker 1: disgusted by the slavery there and he berated the British 760 00:35:30,040 --> 00:35:32,600 Speaker 1: governor about it. So here's an example of a guy 761 00:35:32,680 --> 00:35:36,240 Speaker 1: where, everywhere he went, he sought to improve the well 762 00:35:36,280 --> 00:35:37,000 Speaker 1: being of 763 00:35:36,920 --> 00:35:38,440 Speaker 2: those less fortunate. 764 00:35:38,880 --> 00:35:42,160 Speaker 1: So the question, from a neuroscience point of view, is 765 00:35:42,239 --> 00:35:45,560 Speaker 1: why did Lord Gordon care so much for others? And 766 00:35:45,680 --> 00:35:48,480 Speaker 1: why do any of us help strangers? After all, the 767 00:35:48,640 --> 00:35:52,960 Speaker 1: driving force of evolution is survival of the fittest, not 768 00:35:53,360 --> 00:35:57,640 Speaker 1: of the friendliest. Well, fortunately there's another force at work. 769 00:35:58,120 --> 00:35:59,000 Speaker 2: Our brains are 770 00:35:58,960 --> 00:36:04,080 Speaker 1: constantly in the business of simulating the experiences of other people, 771 00:36:04,520 --> 00:36:08,520 Speaker 1: and under the right circumstances, this leads to empathy, the 772 00:36:08,560 --> 00:36:15,839 Speaker 1: experiencing of another's emotions. Empathy is what counterbalances our appetite 773 00:36:15,880 --> 00:36:20,480 Speaker 1: for power and tribalism and violence. Empathy is the glue 774 00:36:20,560 --> 00:36:25,799 Speaker 1: that binds society together. Our species' dominance is due in 775 00:36:25,840 --> 00:36:29,480 Speaker 1: part to our empathy, which helps us to cooperate flexibly 776 00:36:29,520 --> 00:36:34,480 Speaker 1: in large groups. Now, how can we study empathy neuroscientifically? 777 00:36:35,000 --> 00:36:38,560 Speaker 1: So in my lab we performed a brain imaging study 778 00:36:39,040 --> 00:36:41,240 Speaker 1: in which you are in the scanner and you see 779 00:36:41,280 --> 00:36:44,840 Speaker 1: six hands on the screen in front of you, and 780 00:36:44,920 --> 00:36:49,040 Speaker 1: the computer goes around and randomly picks one of the hands. 781 00:36:49,360 --> 00:36:52,000 Speaker 1: So one of two things happens. Either the hand gets 782 00:36:52,120 --> 00:36:56,160 Speaker 1: touched with a Q-tip or it gets stabbed with 783 00:36:56,280 --> 00:36:57,520 Speaker 1: a syringe needle. 784 00:36:57,800 --> 00:36:58,359 Speaker 2: And when you 785 00:36:58,280 --> 00:37:01,799 Speaker 1: see it get stabbed, it's very cringeworthy. And so 786 00:37:02,040 --> 00:37:05,200 Speaker 1: what we're doing is we're looking in the brain images 787 00:37:05,280 --> 00:37:09,160 Speaker 1: to understand what is the difference between these two cases 788 00:37:09,520 --> 00:37:12,200 Speaker 1: that are visually quite similar, but in one of the 789 00:37:12,200 --> 00:37:16,040 Speaker 1: cases you have this very visceral response. And what we 790 00:37:16,200 --> 00:37:19,160 Speaker 1: find is that when the hand gets stabbed with the 791 00:37:19,200 --> 00:37:22,680 Speaker 1: syringe needle, this network of areas in your brain that 792 00:37:22,719 --> 00:37:27,160 Speaker 1: we summarize as the pain matrix, this comes online. And 793 00:37:27,239 --> 00:37:30,040 Speaker 1: these areas in your brain are what would come online 794 00:37:30,080 --> 00:37:34,000 Speaker 1: if your own hand got stabbed.
So when you see 795 00:47:34,040 --> 00:47:39,200 Speaker 1: someone else's hand get stabbed, that activates this same pain matrix. 796 00:47:39,680 --> 00:47:41,839 Speaker 2: You are not getting hurt, but you 797 00:47:41,800 --> 00:47:43,960 Speaker 1: are simulating what it would be like to be that 798 00:47:44,160 --> 00:47:47,880 Speaker 1: person and have your hand get stabbed. This is the 799 00:47:47,960 --> 00:47:55,480 Speaker 1: neural basis of empathy. Now, if the story ended there, 800 00:47:56,080 --> 00:47:59,399 Speaker 1: all of us humans would operate like a big, cooperative 801 00:47:59,480 --> 00:48:04,200 Speaker 1: ant colony. But the reality is more complex. So let's 802 00:48:04,200 --> 00:48:08,880 Speaker 1: return to Lord Gordon. He empathized with sailors and slaves, 803 00:48:09,200 --> 00:48:13,120 Speaker 1: but he had nothing but hatred for his Catholic neighbors. 804 00:48:13,520 --> 00:48:18,160 Speaker 1: He worked tirelessly to repeal the civil rights of Catholics, 805 00:48:18,640 --> 00:48:22,839 Speaker 1: and in seventeen eighty Lord Gordon marched a crowd of 806 00:48:22,960 --> 00:48:26,240 Speaker 1: fifty thousand people to the Houses of Parliament in London, 807 00:48:26,480 --> 00:48:30,560 Speaker 1: and for a week, the mob destroyed Catholic churches and 808 00:48:30,600 --> 00:48:33,880 Speaker 1: Catholic homes in what came to be known as the 809 00:48:33,920 --> 00:48:38,440 Speaker 1: Gordon Riots, which was the most destructive domestic upheaval in 810 00:48:38,480 --> 00:48:42,640 Speaker 1: the history of London. So why did Lord Gordon, a 811 00:48:42,719 --> 00:48:47,799 Speaker 1: person so capable of empathy, have such antipathy for Catholics? 812 00:48:48,280 --> 00:48:52,080 Speaker 1: The answer points to a fundamental fact about human nature, which 813 00:48:52,120 --> 00:48:56,600 Speaker 1: is our tendency to form in-groups and out-groups, groups 814 00:48:56,600 --> 00:48:59,200 Speaker 1: that we feel attached to and those that we feel 815 00:48:59,239 --> 00:49:05,279 Speaker 1: opposed to. Our empathy is selective. So, especially after the 816 00:49:05,280 --> 00:49:09,279 Speaker 1: Second World War, psychologists began to study this issue of 817 00:49:09,360 --> 00:49:11,840 Speaker 1: in-groups and out-groups and how this can so 818 00:49:12,160 --> 00:49:15,319 Speaker 1: easily lead to violence. And my lab and a lot 819 00:49:15,320 --> 00:49:17,560 Speaker 1: of others have done a lot of research into this 820 00:49:17,680 --> 00:49:21,760 Speaker 1: issue of how easily we form in-groups and out-groups. 821 00:49:21,800 --> 00:49:25,279 Speaker 1: And I'll give you just one example of this. So 822 00:49:25,640 --> 00:49:27,960 Speaker 1: come back to this experiment with the people in the 823 00:49:27,960 --> 00:49:32,000 Speaker 1: brain scanner watching the hands get stabbed. Now we take 824 00:49:32,040 --> 00:49:35,080 Speaker 1: these six hands on the screen and we just add 825 00:49:35,160 --> 00:49:42,400 Speaker 1: one-word labels to each hand: Christian, Jewish, Muslim, Hindu, Scientologist, atheist. 826 00:49:42,960 --> 00:49:45,399 Speaker 1: And now the computer goes around, boo-boo-boo-boop, 827 00:49:45,440 --> 00:49:48,080 Speaker 1: and it randomly picks a hand, and you see that 828 00:49:48,239 --> 00:49:51,200 Speaker 1: hand get touched with a Q-tip or stabbed with 829 00:49:51,239 --> 00:49:54,400 Speaker 1: a syringe needle.
And the question is, does your brain 830 00:49:54,560 --> 00:49:58,560 Speaker 1: care as much if it's an out-group versus your 831 00:49:58,680 --> 00:50:02,080 Speaker 1: in-group? We tested people of all different faiths and 832 00:50:02,120 --> 00:50:06,040 Speaker 1: atheists also, and the result is that you care more 833 00:50:06,080 --> 00:50:10,600 Speaker 1: about your in-group and you care less about out-groups. 834 00:50:10,960 --> 00:50:13,640 Speaker 1: When you see the hand get stabbed that is labeled 835 00:50:13,640 --> 00:50:16,200 Speaker 1: with your in-group, we can measure a very big 836 00:50:16,239 --> 00:50:19,239 Speaker 1: response in the pain matrix. And when you see a 837 00:50:19,239 --> 00:50:21,880 Speaker 1: hand get stabbed that has one of the other labels 838 00:50:21,920 --> 00:50:25,279 Speaker 1: on it, we see a very small response in the 839 00:50:25,320 --> 00:50:30,319 Speaker 1: pain matrix. Your brain just doesn't care as much. This 840 00:50:30,520 --> 00:50:33,360 Speaker 1: is a large effect and it's depressing that it's true, 841 00:50:33,719 --> 00:50:36,080 Speaker 1: but it's just the way humans are. We care a 842 00:50:36,120 --> 00:50:39,600 Speaker 1: lot about our in-groups. And these are just single-843 00:50:39,640 --> 00:50:42,600 Speaker 1: word labels. I mean, all the hands look the same 844 00:50:42,920 --> 00:50:45,239 Speaker 1: and they just have different colored wristbands on them so 845 00:50:45,280 --> 00:50:47,879 Speaker 1: you can distinguish them. But it turns out we are 846 00:50:48,000 --> 00:50:52,279 Speaker 1: really, really sensitive to these labels. So the issue of 847 00:50:52,400 --> 00:50:57,280 Speaker 1: empathy is subtle and complex. With just a single-word label, 848 00:50:57,360 --> 00:50:59,959 Speaker 1: your brain can feel more or less empathy for someone. 849 00:51:00,080 --> 00:51:03,800 Speaker 1: One can run the imagery about them and their pain 850 00:51:04,520 --> 00:51:08,680 Speaker 1: more or less vividly. Now, what's fascinating is how rapidly 851 00:51:08,719 --> 00:51:12,439 Speaker 1: our levels of empathy can change. So we next took 852 00:51:12,480 --> 00:51:15,120 Speaker 1: the exact same subjects and we presented them with a 853 00:51:15,160 --> 00:51:19,080 Speaker 1: single sentence: the year is twenty twenty-five, and these 854 00:51:19,120 --> 00:51:22,440 Speaker 1: three groups have teamed up against these three groups. And 855 00:51:22,520 --> 00:51:25,040 Speaker 1: so now you find your in-group teamed up on 856 00:51:25,120 --> 00:51:27,520 Speaker 1: one side or the other. The computer has picked these 857 00:51:27,560 --> 00:51:30,799 Speaker 1: sides randomly, and so you've got this team and the 858 00:51:30,880 --> 00:51:33,879 Speaker 1: other team. So what do you think happens? You now care 859 00:51:33,960 --> 00:51:37,719 Speaker 1: about your allies, the two groups that have randomly got 860 00:51:37,880 --> 00:51:40,759 Speaker 1: lumped in there with your in-group. So suddenly when 861 00:51:40,760 --> 00:51:44,120 Speaker 1: you see their hand get stabbed, you have a larger 862 00:51:44,160 --> 00:51:46,719 Speaker 1: empathy response than you did just a moment ago, 863 00:51:46,800 --> 00:51:48,040 Speaker 2: when you didn't care about them.
864 00:51:48,239 --> 00:51:50,319 Speaker 1: You still don't care about the out-groups on the 865 00:51:50,320 --> 00:51:53,640 Speaker 1: other side, but you care about these allies now more, 866 00:51:53,840 --> 00:51:57,040 Speaker 1: which is not surprising. Like, for example, when the Soviets 867 00:51:57,160 --> 00:52:00,279 Speaker 1: fought side by side with the Americans in World War II. 868 00:52:00,640 --> 00:52:04,760 Speaker 1: They had been bitter enemies before; then World War Two happened, 869 00:52:04,800 --> 00:52:08,560 Speaker 1: and suddenly they're allies, they're fighting together. They're clapping each 870 00:52:08,560 --> 00:52:10,759 Speaker 1: other on the back and sharing cigarettes and so on. 871 00:52:11,239 --> 00:52:13,520 Speaker 1: And then the war ends and now they're enemies again. 872 00:52:16,680 --> 00:52:19,320 Speaker 1: Now take a moment to think about your own level 873 00:52:19,440 --> 00:52:24,200 Speaker 1: of empathy towards others. Imagine that you see a seventy-874 00:52:24,200 --> 00:52:26,680 Speaker 1: five-year-old man get hit in the face and 875 00:52:26,760 --> 00:52:29,960 Speaker 1: his nose gets cut and he's bleeding. Do you feel 876 00:52:30,080 --> 00:52:33,920 Speaker 1: an empathic sting with that? Okay. Well, now imagine that 877 00:52:34,000 --> 00:52:38,240 Speaker 1: he's at a rally for Joe Biden or for Donald Trump, 878 00:52:38,719 --> 00:52:41,920 Speaker 1: or just anyone you agree or disagree with. Is your 879 00:52:42,040 --> 00:52:46,759 Speaker 1: empathy any different? And if so, does that challenge your 880 00:52:46,880 --> 00:52:51,480 Speaker 1: view of yourself as an empathic person? If you felt 881 00:52:51,600 --> 00:52:55,239 Speaker 1: unequal responses in those two situations, a Biden rally or 882 00:52:55,239 --> 00:52:59,080 Speaker 1: a Trump rally, you're not alone. People generally assess their 883 00:52:59,120 --> 00:53:02,560 Speaker 1: own empathy by thinking about those in their in-group. 884 00:53:03,160 --> 00:53:06,560 Speaker 1: I've always been struck by this in action-adventure movies. 885 00:53:06,600 --> 00:53:09,960 Speaker 1: When we see a person get hurt, if it's the protagonist, 886 00:53:10,000 --> 00:53:13,560 Speaker 1: we really wince. But if it's the antagonist and he's 887 00:53:13,600 --> 00:53:15,960 Speaker 1: falling off a hundred-foot cliff to his death, we 888 00:53:16,040 --> 00:53:19,719 Speaker 1: feel just fine about that, possibly happy about that. So 889 00:53:19,800 --> 00:53:23,160 Speaker 1: what this means is that we have the capacity to 890 00:53:23,320 --> 00:53:27,359 Speaker 1: feel someone else's pain in different ways, depending on whether 891 00:53:27,360 --> 00:53:30,080 Speaker 1: they're a member of our tribe or not. And the 892 00:53:30,160 --> 00:53:35,160 Speaker 1: tribal tendencies of humans can incite murder and torture, 893 00:53:35,320 --> 00:53:40,520 Speaker 1: from the Spanish Inquisition to the Rwandan genocide. They can 894 00:53:40,640 --> 00:53:45,760 Speaker 1: underlie the appeal of nationalist visions, from Hitler's Final Solution 895 00:53:45,920 --> 00:53:51,000 Speaker 1: to Mao's Cultural Revolution. So, given how deeply our biases 896 00:53:51,040 --> 00:53:55,480 Speaker 1: are ingrained, the question is, are we doomed to repeat 897 00:53:55,520 --> 00:53:59,440 Speaker 1: these kinds of atrocities forever? So I'm going to suggest 898 00:53:59,640 --> 00:54:04,160 Speaker 1: perhaps not.
I'm going to give five strategies here 899 00:54:04,600 --> 00:54:09,560 Speaker 1: to narrow the empathic divide between people. The first thing 900 00:54:09,600 --> 00:54:13,279 Speaker 1: has to do with just understanding our own biases. We 901 00:54:13,320 --> 00:54:17,080 Speaker 1: can increase our awareness of our own internal thought patterns 902 00:54:17,080 --> 00:54:21,759 Speaker 1: so that we recognize our partisanship as we experience it. 903 00:54:22,200 --> 00:54:25,239 Speaker 1: For example, in our social echo chambers, we tend to 904 00:54:25,840 --> 00:54:28,680 Speaker 1: accept the logic of our in-group and we reject 905 00:54:28,840 --> 00:54:33,240 Speaker 1: the logic of out-groups. And we're also predisposed to help 906 00:54:33,280 --> 00:54:36,360 Speaker 1: those in our in-groups rather than those a little 907 00:54:36,640 --> 00:54:38,520 Speaker 1: farther away who might need help 908 00:54:38,560 --> 00:54:38,799 Speaker 2: more. 909 00:54:39,520 --> 00:54:43,919 Speaker 1: Understanding the biases behind our actions in this way can 910 00:54:44,000 --> 00:54:48,600 Speaker 1: help lead us to more altruistic behavior. The second strategy 911 00:54:48,880 --> 00:54:52,480 Speaker 1: for narrowing the empathic divide has to do with building 912 00:54:52,560 --> 00:54:56,480 Speaker 1: a better model of other people. So instead of concluding 913 00:54:56,520 --> 00:55:01,080 Speaker 1: that your brother or a coworker is a troll or an idiot, 914 00:55:01,440 --> 00:55:04,720 Speaker 1: just try taking a crack at understanding his point of view. 915 00:55:05,280 --> 00:55:07,680 Speaker 1: It's not the same as agreeing with his point of view, 916 00:55:07,719 --> 00:55:10,920 Speaker 1: but it's trying to step into that person's world to 917 00:55:11,120 --> 00:55:16,359 Speaker 1: avoid the oversimplifications that we typically accept. And by the way, 918 00:55:16,400 --> 00:55:19,920 Speaker 1: this is often accomplished through art and literature, which have 919 00:55:20,239 --> 00:55:23,200 Speaker 1: for a long time waged a behind-the-scenes battle 920 00:55:23,239 --> 00:55:27,879 Speaker 1: against dehumanization. Theater and books and movies let people 921 00:55:28,080 --> 00:55:30,840 Speaker 1: step into the shoes of other people, and in the 922 00:55:30,880 --> 00:55:34,040 Speaker 1: fourteen forties, when the printing press was invented, this allowed 923 00:55:34,440 --> 00:55:38,520 Speaker 1: stories to spread widely. So, for example, when Harriet Beecher 924 00:55:38,600 --> 00:55:43,400 Speaker 1: Stowe published the anti-slavery novel Uncle Tom's Cabin in 925 00:55:43,440 --> 00:55:47,759 Speaker 1: eighteen fifty-two, readers stepped inside a shack that they 926 00:55:47,920 --> 00:55:51,120 Speaker 1: otherwise wouldn't have ever entered, and once inside, it was 927 00:55:51,200 --> 00:55:55,239 Speaker 1: no longer so easy to relegate the characters to an 928 00:55:55,280 --> 00:55:59,920 Speaker 1: out-group. The third strategy is to learn and resist 929 00:56:00,440 --> 00:56:04,200 Speaker 1: the tactics of dehumanization.
There are a lot of tricks 930 00:56:04,239 --> 00:56:07,719 Speaker 1: that governments and propagandists employ, and I'm going to do 931 00:56:07,760 --> 00:56:10,600 Speaker 1: a different episode on that, but I'll mention here that 932 00:56:10,760 --> 00:56:15,279 Speaker 1: one common ploy is what's called moral pollution, in which 933 00:56:15,360 --> 00:56:20,239 Speaker 1: a group is socially smeared by association with something repulsive 934 00:56:20,440 --> 00:56:24,400 Speaker 1: like vermin or insects, or anything that envelops them in 935 00:56:24,440 --> 00:56:28,640 Speaker 1: a negative emotional cloud. Once you have a negative emotional 936 00:56:28,680 --> 00:56:32,480 Speaker 1: reaction to a group, it becomes harder to hear their perspectives. 937 00:56:33,120 --> 00:56:35,600 Speaker 1: So when you can recognize that a person is being 938 00:56:35,680 --> 00:56:39,920 Speaker 1: attacked for his identity rather than his arguments, you can 939 00:56:39,960 --> 00:56:44,120 Speaker 1: better defend yourself against this trick. The fourth strategy has 940 00:56:44,120 --> 00:56:48,880 Speaker 1: to do with blinding your biases: design processes and 941 00:56:49,000 --> 00:56:54,240 Speaker 1: organizations that remove the chance that prejudices interfere with your judgment. 942 00:56:54,719 --> 00:56:58,600 Speaker 1: For example, a lot of software companies here in Silicon Valley, 943 00:56:58,840 --> 00:57:02,840 Speaker 1: they'll ask job candidates to submit code rather than to 944 00:57:02,840 --> 00:57:07,040 Speaker 1: show up in person. And many orchestras have blind auditions, 945 00:57:07,080 --> 00:57:09,640 Speaker 1: which means they audition people behind a curtain, so you 946 00:57:09,760 --> 00:57:12,880 Speaker 1: can't see the gender or the race of the person 947 00:57:13,120 --> 00:57:15,800 Speaker 1: who's looking for the job. You're just listening to the music. 948 00:57:16,200 --> 00:57:19,240 Speaker 1: And in the same way, many universities have a need-949 00:57:19,320 --> 00:57:25,080 Speaker 1: blind application process so they can separate intelligence from financial considerations. 950 00:57:25,520 --> 00:57:30,360 Speaker 1: So the idea is, wherever biases can be subconsciously triggered, 951 00:57:30,720 --> 00:57:34,160 Speaker 1: it's best if you just remove the opportunity. And the 952 00:57:34,200 --> 00:57:37,720 Speaker 1: fifth strategy I think is the least intuitive, and that 953 00:57:37,880 --> 00:57:42,920 Speaker 1: is to entangle group memberships. So what I mean is 954 00:57:43,480 --> 00:57:48,840 Speaker 1: work to ensure that communities are intertwined. So to see 955 00:57:48,840 --> 00:57:52,800 Speaker 1: how this would work in practice, consider the five tribes 956 00:57:53,000 --> 00:57:57,000 Speaker 1: of the Iroquois Native Americans, who fought intensely with each 957 00:57:57,080 --> 00:58:00,120 Speaker 1: other in the fifteenth century. So they had a new 958 00:58:00,200 --> 00:58:03,280 Speaker 1: leader come in named Deganawida, who came to be known 959 00:58:03,320 --> 00:58:06,000 Speaker 1: as the Great Peacemaker. And what he did is he 960 00:58:06,040 --> 00:58:11,760 Speaker 1: assigned each tribe member to one of nine different clans, 961 00:58:12,240 --> 00:58:14,480 Speaker 1: the Wolf clan or the Bear clan, or the Turtle 962 00:58:14,520 --> 00:58:16,480 Speaker 1: clan or the Sandpiper, 963 00:58:15,920 --> 00:58:17,120 Speaker 2: the Deer, and so on.
964 00:58:17,800 --> 00:58:21,720 Speaker 1: So members of each clan had representation from all the 965 00:58:21,760 --> 00:58:26,480 Speaker 1: different tribes, and these relationships were cross-cutting. So I 966 00:58:26,560 --> 00:58:29,040 Speaker 1: say to you, hey, tribe member, let's go attack that 967 00:58:29,120 --> 00:58:31,480 Speaker 1: other tribe over there. And you say, oh, you know, 968 00:58:31,600 --> 00:58:34,120 Speaker 1: I would, but I'm a member of the Eagle clan 969 00:58:34,200 --> 00:58:36,400 Speaker 1: and so is he, and so I'm not really that 970 00:58:36,480 --> 00:58:41,880 Speaker 1: interested in attacking him anymore. So by emphasizing the overlapping 971 00:58:42,160 --> 00:58:47,480 Speaker 1: dual allegiances to tribe and to clan, Deganawida complicated 972 00:58:47,480 --> 00:58:50,880 Speaker 1: the notions of us and them, and in this way 973 00:58:50,920 --> 00:58:55,680 Speaker 1: he was able to defang the intertribal warfare. So what 974 00:58:55,720 --> 00:58:58,880 Speaker 1: we've seen in today's episode is how different we are 975 00:58:58,880 --> 00:59:02,680 Speaker 1: on the inside, and yet how strongly we believe our 976 00:59:02,720 --> 00:59:07,520 Speaker 1: own truths, even though our knowledge of everything is so impoverished. 977 00:59:08,080 --> 00:59:10,120 Speaker 1: And yet we all walk around with the impression that 978 00:59:10,160 --> 00:59:12,400 Speaker 1: if we could just sit down with another person on 979 00:59:12,440 --> 00:59:17,040 Speaker 1: the other side, we could show them the truth. So 980 00:59:17,080 --> 00:59:19,600 Speaker 1: if you have the same opinions as everyone else in 981 00:59:19,640 --> 00:59:22,919 Speaker 1: your life, great. But I hope you don't. I hope 982 00:59:22,920 --> 00:59:26,720 Speaker 1: you can take the opportunity to dig deep and find 983 00:59:26,760 --> 00:59:29,200 Speaker 1: out how the other folks in your life see the 984 00:59:29,200 --> 00:59:31,800 Speaker 1: world and listen to them. It's not the same as 985 00:59:31,840 --> 00:59:34,560 Speaker 1: agreeing with them or giving in to them, but it 986 00:59:34,600 --> 00:59:37,840 Speaker 1: is acknowledging that your point of view doesn't have a 987 00:59:38,080 --> 00:59:41,520 Speaker 1: lock on the absolute truth, and it's allowing that the 988 00:59:41,520 --> 00:59:44,560 Speaker 1: most important thing you can learn is an ability to 989 00:59:45,160 --> 00:59:49,760 Speaker 1: dialogue in conditions of disagreement and discomfort. It's the most 990 00:59:49,800 --> 00:59:52,960 Speaker 1: important thing that you can do for each other and 991 00:59:53,080 --> 00:59:58,440 Speaker 1: for your own brain. As Voltaire said, uncertainty is an 992 00:59:58,600 --> 01:00:04,440 Speaker 1: uncomfortable position, but certainty is an absurd position.
So I've 993 01:00:04,480 --> 01:00:07,680 Speaker 1: given five directions for helping us to learn how to 994 01:00:07,720 --> 01:00:10,720 Speaker 1: bridge that gap, not by assuming we're right, but by 995 01:00:10,760 --> 01:00:14,680 Speaker 1: having the intellectual humility to realize that it's a big, 996 01:00:14,760 --> 01:00:19,320 Speaker 1: pluralistic world out there, and that everyone, including you, has 997 01:00:19,360 --> 01:00:22,600 Speaker 1: a model of the truth, and that only by adopting 998 01:00:22,640 --> 01:00:27,360 Speaker 1: the stance of genuine dialogue and understanding your own biases 999 01:00:27,920 --> 01:00:31,800 Speaker 1: and the possibility that we might be wrong can we 1000 01:00:31,880 --> 01:00:39,960 Speaker 1: hope to move things forward. If you're interested in learning more, 1001 01:00:40,240 --> 01:00:43,640 Speaker 1: find further readings on these topics at eagleman dot com 1002 01:00:43,680 --> 01:00:48,720 Speaker 1: slash podcast. And if you have questions, comments, or thoughts, email 1003 01:00:48,840 --> 01:00:52,120 Speaker 1: us at podcast at eagleman dot com, where we've been 1004 01:00:52,160 --> 01:00:56,640 Speaker 1: getting great responses. Watch full video episodes and leave comments 1005 01:00:56,680 --> 01:01:00,960 Speaker 1: on YouTube at Inner Cosmos Pod. Until then, this is David 1006 01:01:00,960 --> 01:01:03,840 Speaker 1: Eagleman signing off from the Inner Cosmos.