1 00:00:05,120 --> 00:00:10,560 Speaker 1: Would a utopia be possible and would it even be desirable? 2 00:00:11,119 --> 00:00:15,400 Speaker 1: Are we wired up with desires and preferences like tribalism 3 00:00:15,440 --> 00:00:20,080 Speaker 1: and jealousy that make perfect societies difficult to achieve? 4 00:00:20,840 --> 00:00:22,599 Speaker 2: Do we love hierarchies? 5 00:00:23,200 --> 00:00:28,080 Speaker 1: Why are primate brains such excellent detectors of unfairness? 6 00:00:28,480 --> 00:00:30,160 Speaker 2: Do we actually like struggle? 7 00:00:30,520 --> 00:00:35,120 Speaker 1: Did the Church's disavowal of first cousin marriages lead to 8 00:00:35,200 --> 00:00:39,559 Speaker 1: better politics? This week we'll talk with psychologist Paul Bloom 9 00:00:39,720 --> 00:00:48,800 Speaker 1: about the possibility slash impossibility of achieving societal utopias. 10 00:00:49,800 --> 00:00:52,080 Speaker 2: Welcome to Inner Cosmos with me David Eagleman. 11 00:00:52,200 --> 00:00:55,360 Speaker 1: I'm a neuroscientist and author at Stanford, and in these 12 00:00:55,400 --> 00:00:59,400 Speaker 1: episodes we sail deeply into our three pound universe to 13 00:00:59,480 --> 00:01:02,640 Speaker 1: understand why and how our lives and 14 00:01:02,560 --> 00:01:04,840 Speaker 2: societies look the way they do. 15 00:01:19,400 --> 00:01:23,960 Speaker 1: Today's episode is about whether we humans yoked with our 16 00:01:24,000 --> 00:01:27,800 Speaker 1: brains and our psychologies, whether we will ever get to 17 00:01:27,920 --> 00:01:33,240 Speaker 1: a utopian society. Imagine a world where everyone has what 18 00:01:33,360 --> 00:01:37,640 Speaker 1: they need, no crime, no hunger, no injustice, a place 19 00:01:37,760 --> 00:01:41,399 Speaker 1: of peace and fairness and fulfillment. For as long as 20 00:01:41,400 --> 00:01:43,839 Speaker 1: we have been human, we have dreamed of such a place.
21 00:01:43,920 --> 00:01:46,720 Speaker 1: We've even given it a name: utopia. 22 00:01:47,319 --> 00:01:47,520 Speaker 2: Now. 23 00:01:47,560 --> 00:01:52,440 Speaker 1: The word utopia was coined by Thomas More in fifteen sixteen. 24 00:01:52,720 --> 00:01:58,440 Speaker 1: He described a fictional island society where private property didn't exist, 25 00:01:58,480 --> 00:02:02,680 Speaker 1: and where everyone worked and where everyone shared equally. But 26 00:02:02,800 --> 00:02:06,520 Speaker 1: even then he chose the word as a pun from 27 00:02:06,600 --> 00:02:11,280 Speaker 1: the roots ou topos, which means no place. This was 28 00:02:11,320 --> 00:02:16,000 Speaker 1: his hint that such perfection might not exist anywhere or 29 00:02:16,120 --> 00:02:19,600 Speaker 1: anywhen. But this possibility of it not coming to 30 00:02:19,680 --> 00:02:23,760 Speaker 1: fruition has never stopped people from thinking about it and 31 00:02:23,919 --> 00:02:27,960 Speaker 1: working toward it, and often picking up arms to try 32 00:02:28,000 --> 00:02:31,480 Speaker 1: to achieve it. But again and again the road to 33 00:02:31,680 --> 00:02:35,120 Speaker 1: utopia gets off ramped by human nature. 34 00:02:35,480 --> 00:02:37,200 Speaker 2: One historical example 35 00:02:36,840 --> 00:02:40,320 Speaker 1: of this comes from the French Revolution in seventeen eighty nine. 36 00:02:40,400 --> 00:02:45,080 Speaker 1: It was all about liberty and equality and fraternity, and 37 00:02:45,160 --> 00:02:47,880 Speaker 1: that's nothing but good stuff. It makes total sense to 38 00:02:47,919 --> 00:02:51,919 Speaker 1: be on board with that. The monarchy was overthrown, they 39 00:02:52,120 --> 00:02:56,440 Speaker 1: tossed out feudal privileges, they declared a new republic. But 40 00:02:56,720 --> 00:03:00,919 Speaker 1: by seventeen ninety two France was tearing itself apart.
You 41 00:03:01,000 --> 00:03:04,720 Speaker 1: had royalists and moderates and radical factions. They were all 42 00:03:05,000 --> 00:03:08,280 Speaker 1: in the fight for dominance. And so out of this chaos, 43 00:03:08,320 --> 00:03:13,160 Speaker 1: in seventeen ninety three emerged the Committee of Public Safety, 44 00:03:13,600 --> 00:03:17,880 Speaker 1: which was an emergency body to protect the revolution from 45 00:03:18,120 --> 00:03:23,600 Speaker 1: enemies foreign and domestic, and its leading voice was Maximilien Robespierre. 46 00:03:23,919 --> 00:03:28,239 Speaker 1: He was known as the Incorruptible because of his unbending 47 00:03:28,280 --> 00:03:31,399 Speaker 1: devotion to revolutionary virtue. 48 00:03:31,480 --> 00:03:32,760 Speaker 2: He believed all the 49 00:03:32,680 --> 00:03:37,200 Speaker 1: way down that the revolution had to be defended militarily, 50 00:03:37,200 --> 00:03:43,760 Speaker 1: of course, but also morally, through purity, unity, the destruction 51 00:03:44,080 --> 00:03:51,920 Speaker 1: of corruption. To him, terror was justice. So they made 52 00:03:52,080 --> 00:03:57,960 Speaker 1: revolutionary tribunals to accuse people of counter revolutionary activity. This 53 00:03:58,120 --> 00:04:03,800 Speaker 1: was anyone: nobles, clergy, political opponents, former allies. And what 54 00:04:03,920 --> 00:04:08,840 Speaker 1: was called the law of suspects made essentially everyone vulnerable 55 00:04:08,880 --> 00:04:14,600 Speaker 1: to arrest if they were showing insufficient enthusiasm for the revolution. 56 00:04:15,080 --> 00:04:18,200 Speaker 1: And as you remember from your history classes, the guillotine 57 00:04:18,640 --> 00:04:21,159 Speaker 1: became the symbol of the new order because it was 58 00:04:21,200 --> 00:04:26,359 Speaker 1: proposed as an egalitarian instrument of execution.
So over the 59 00:04:26,400 --> 00:04:30,960 Speaker 1: next two years there were seventeen thousand executions, and at 60 00:04:31,040 --> 00:04:34,560 Speaker 1: least that many more who died in prisons or summary killings. 61 00:04:34,839 --> 00:04:36,280 Speaker 2: So what the heck happened here? 62 00:04:36,800 --> 00:04:40,440 Speaker 1: What Robespierre so desperately hoped for was what he called 63 00:04:40,760 --> 00:04:44,760 Speaker 1: a republic of virtue, where citizens would be motivated by 64 00:04:45,160 --> 00:04:49,320 Speaker 1: civic morality rather than self interest. But in practice, what 65 00:04:49,400 --> 00:04:54,400 Speaker 1: he thought of as virtue became enforced conformity. Anybody could 66 00:04:54,400 --> 00:04:59,440 Speaker 1: become a suspect. They could be labeled as corrupt, or aristocratic, 67 00:04:59,640 --> 00:05:05,640 Speaker 1: or insufficiently revolutionary. So what happened is the revolution, which 68 00:05:05,839 --> 00:05:09,320 Speaker 1: looked like it was aiming towards the utopia, quickly became 69 00:05:09,400 --> 00:05:14,279 Speaker 1: known as the Reign of Terror. Everyone was frantic and scared. 70 00:05:14,680 --> 00:05:18,320 Speaker 1: Revolutionaries turned on each other. A bunch of the early 71 00:05:18,440 --> 00:05:22,640 Speaker 1: leaders were sent to the guillotine by their former ally, Robespierre. 72 00:05:22,800 --> 00:05:26,120 Speaker 1: So by mid seventeen ninety four, everyone who knew Robespierre, 73 00:05:26,200 --> 00:05:30,120 Speaker 1: even as close allies, everyone was in fear for their lives. 74 00:05:30,440 --> 00:05:33,640 Speaker 1: So finally, in July of seventeen ninety four, he got 75 00:05:33,800 --> 00:05:37,760 Speaker 1: arrested and guillotined, and the Reign of Terror finally ended. 
76 00:05:38,680 --> 00:05:41,960 Speaker 1: So this is just one of a gajillion examples where 77 00:05:42,000 --> 00:05:47,400 Speaker 1: the pursuit of moral purity and ideal justice slips into violence. 78 00:05:47,880 --> 00:05:50,800 Speaker 1: This is exactly what happened in the Soviet Union. The 79 00:05:50,920 --> 00:05:55,680 Speaker 1: original communist revolutionaries were so clear and bright eyed on 80 00:05:55,760 --> 00:05:59,480 Speaker 1: the utopia that they wanted to build. But between Lenin 81 00:05:59,560 --> 00:06:04,560 Speaker 1: and Stalin there were an estimated one million political executions. 82 00:06:04,920 --> 00:06:08,400 Speaker 1: Both Lenin and Stalin believed they were creating a new 83 00:06:08,880 --> 00:06:15,560 Speaker 1: purified world, an egalitarian utopia free from exploitation. But just 84 00:06:15,600 --> 00:06:20,040 Speaker 1: like with Robespierre, the pursuit of purity turned into the 85 00:06:20,080 --> 00:06:25,440 Speaker 1: destruction of perceived impurity. Just like the French Revolution, the 86 00:06:25,480 --> 00:06:29,719 Speaker 1: Soviet experiment began with a dream of equality and ended 87 00:06:29,839 --> 00:06:34,080 Speaker 1: in terror. In the thousands of cases like these, utopian 88 00:06:34,200 --> 00:06:41,080 Speaker 1: zeal plus human psychology, fear and rivalry and paranoia and tribalism, 89 00:06:41,600 --> 00:06:47,799 Speaker 1: this combination produces the opposite of paradise. Whenever movements aim 90 00:06:47,880 --> 00:06:54,160 Speaker 1: to purify human nature itself, they eventually turn the blade inward.
91 00:06:54,320 --> 00:06:58,520 Speaker 1: And this is where psychology and neuroscience become key parts 92 00:06:58,560 --> 00:07:01,960 Speaker 1: of the analysis, because at the center of every grand 93 00:07:02,040 --> 00:07:07,240 Speaker 1: political blueprint sits the human brain, a brain that evolved 94 00:07:07,279 --> 00:07:11,840 Speaker 1: in survival conditions, tuned for tribalism. 95 00:07:11,320 --> 00:07:13,240 Speaker 2: For hierarchy, for rivalry. 96 00:07:13,480 --> 00:07:15,920 Speaker 1: This has been a topic close to my heart for years, 97 00:07:15,960 --> 00:07:17,920 Speaker 1: and in fact, one of my short stories in my 98 00:07:17,960 --> 00:07:21,360 Speaker 1: book Sum is exactly on this topic. So to set 99 00:07:21,400 --> 00:07:25,360 Speaker 1: the table for today's podcast, I'll read that story now. 100 00:07:25,600 --> 00:07:30,680 Speaker 1: This is called Egalitaire. In the afterlife, you discover that 101 00:07:30,720 --> 00:07:34,240 Speaker 1: God understands the complexities of life. 102 00:07:34,280 --> 00:07:36,200 Speaker 2: She had originally submitted to 103 00:07:36,200 --> 00:07:39,760 Speaker 1: peer pressure when she structured her universe like all the 104 00:07:39,800 --> 00:07:44,040 Speaker 1: other gods had, with a binary categorization of people into 105 00:07:44,120 --> 00:07:47,200 Speaker 1: good and evil. But it didn't take long for her 106 00:07:47,240 --> 00:07:50,320 Speaker 1: to realize that humans could be good in many ways 107 00:07:50,640 --> 00:07:55,720 Speaker 1: and simultaneously corrupt and mean spirited in other ways. How 108 00:07:55,800 --> 00:07:58,760 Speaker 1: was she to arbitrate who goes to heaven and who 109 00:07:58,920 --> 00:08:03,040 Speaker 1: to hell? It might not be possible. She considered that a 110 00:08:03,080 --> 00:08:07,360 Speaker 1: man could be an embezzler and still give to charitable causes.
111 00:08:07,880 --> 00:08:11,640 Speaker 1: Might not a woman be an adulteress, but bring pleasure 112 00:08:11,720 --> 00:08:15,120 Speaker 1: and security to two men's lives? Might not a child 113 00:08:15,560 --> 00:08:21,320 Speaker 1: unwittingly divulge secrets that splinter a family? Dividing the population 114 00:08:21,520 --> 00:08:25,320 Speaker 1: into two categories, good and bad, seemed like a more 115 00:08:25,400 --> 00:08:29,559 Speaker 1: reasonable task when she was younger, but with experience these 116 00:08:29,600 --> 00:08:34,240 Speaker 1: decisions became more difficult. She composed complex formulas to weigh 117 00:08:34,320 --> 00:08:37,880 Speaker 1: hundreds of factors and ran computer programs that rolled out 118 00:08:37,960 --> 00:08:42,760 Speaker 1: long strips of paper with eternal decisions, but her sensitivities 119 00:08:42,840 --> 00:08:46,800 Speaker 1: revolted at this automation, and when the computer generated a 120 00:08:46,880 --> 00:08:50,920 Speaker 1: decision she disagreed with, she took the opportunity to kick 121 00:08:50,960 --> 00:08:54,600 Speaker 1: out the plug in rage. That afternoon, she listened to 122 00:08:54,679 --> 00:08:58,839 Speaker 1: the grievances of the dead from two warring nations. Both 123 00:08:58,920 --> 00:09:04,000 Speaker 1: sides had suffered, both sides had legitimate grievances, both pled 124 00:09:04,040 --> 00:09:08,840 Speaker 1: their cases earnestly. She covered her ears and moaned in misery. 125 00:09:09,240 --> 00:09:12,560 Speaker 1: She knew her humans were multi dimensional, and she could 126 00:09:12,600 --> 00:09:17,199 Speaker 1: no longer live under the rigid architecture of her youthful choices. 127 00:09:18,320 --> 00:09:22,120 Speaker 1: Not all gods suffer over this.
We can consider ourselves 128 00:09:22,240 --> 00:09:25,679 Speaker 1: lucky that in death we answer to a god with 129 00:09:25,840 --> 00:09:31,400 Speaker 1: deep sensitivity to the byzantine hearts of her creations. For months, 130 00:09:31,480 --> 00:09:34,920 Speaker 1: she moped around her living room in Heaven, head drooped 131 00:09:34,960 --> 00:09:39,600 Speaker 1: like a bulrush while the lines piled up. Her advisers 132 00:09:39,800 --> 00:09:43,960 Speaker 1: advised her to delegate the decision making, but she loved 133 00:09:43,960 --> 00:09:46,520 Speaker 1: her humans too much to leave them to the care 134 00:09:46,559 --> 00:09:49,960 Speaker 1: of anyone else. In a moment of desperation, the thought 135 00:09:50,000 --> 00:09:53,600 Speaker 1: crossed her mind to let everyone wait in line indefinitely, letting 136 00:09:53,640 --> 00:09:56,680 Speaker 1: them work it out on their own. But then a 137 00:09:56,840 --> 00:10:00,680 Speaker 1: better idea struck her generous spirit. She could afford it. 138 00:10:00,920 --> 00:10:05,320 Speaker 1: She would grant everyone, every last human, a place in heaven. 139 00:10:05,880 --> 00:10:09,160 Speaker 1: After all, everyone had something good inside. It was part 140 00:10:09,160 --> 00:10:13,280 Speaker 1: of the design specifications. Her new plan brought back the 141 00:10:13,400 --> 00:10:16,520 Speaker 1: bounce to her gait, returned the color 142 00:10:16,280 --> 00:10:17,120 Speaker 2: to her cheeks. 143 00:10:17,559 --> 00:10:20,800 Speaker 1: She shut down the operations in Hell, fired the devil, 144 00:10:21,080 --> 00:10:23,840 Speaker 1: and brought every last human to be by her side 145 00:10:23,960 --> 00:10:29,560 Speaker 1: in Heaven, newcomers or old timers, nefarious or righteous. Under 146 00:10:29,600 --> 00:10:33,040 Speaker 1: the new system, everyone gets equal time to speak with her.
147 00:10:33,520 --> 00:10:37,319 Speaker 1: Most people find her a little garrulous and oversolicitous, 148 00:10:37,520 --> 00:10:41,160 Speaker 1: but she cannot be accused of not caring. The most 149 00:10:41,240 --> 00:10:44,360 Speaker 1: important part of her new system is that everyone is 150 00:10:44,440 --> 00:10:48,840 Speaker 1: treated equally. There is no longer fire for some and 151 00:10:48,920 --> 00:10:52,720 Speaker 1: harp music for others. The afterlife is no longer defined 152 00:10:52,800 --> 00:10:57,880 Speaker 1: by cots versus water beds, raw potatoes versus sushi, hot 153 00:10:57,920 --> 00:11:02,000 Speaker 1: water versus champagne. Everyone is a brother to all, 154 00:11:02,200 --> 00:11:05,960 Speaker 1: and for the first time, an idea has been realized 155 00:11:06,080 --> 00:11:08,640 Speaker 1: that never came to fruition on earth. 156 00:11:08,960 --> 00:11:10,679 Speaker 2: True equality. 157 00:11:11,600 --> 00:11:16,120 Speaker 1: The Communists are baffled and irritated because they have finally 158 00:11:16,160 --> 00:11:19,600 Speaker 1: achieved their perfect society, but only by the help 159 00:11:19,400 --> 00:11:21,600 Speaker 2: of a god in whom they don't want to believe. 160 00:11:22,440 --> 00:11:26,640 Speaker 1: The meritocrats are abashed that they're stuck for eternity in 161 00:11:26,840 --> 00:11:31,319 Speaker 1: an incentiveless system with a bunch of pinkos. The conservatives 162 00:11:31,559 --> 00:11:36,040 Speaker 1: have no penniless to disparage, the liberals have no downtrodden 163 00:11:36,080 --> 00:11:39,360 Speaker 1: to promote. So God sits on the edge of her 164 00:11:39,360 --> 00:11:43,600 Speaker 1: bed and weeps at night because the only thing everyone 165 00:11:43,679 --> 00:11:49,720 Speaker 1: can agree on is that they're all in hell. That 166 00:11:49,880 --> 00:11:53,480 Speaker 1: was the story Egalitaire from my book Sum.
So let's 167 00:11:53,520 --> 00:11:55,960 Speaker 1: get back to the human foibles that get in the 168 00:11:56,000 --> 00:11:59,840 Speaker 1: way of utopias. An important clue is that these aren't 169 00:12:00,080 --> 00:12:03,480 Speaker 1: learned behaviors. You can look at how predisposed we are 170 00:12:03,640 --> 00:12:07,680 Speaker 1: to tribalism, for example, by looking at children. An experiment 171 00:12:07,760 --> 00:12:10,240 Speaker 1: in the nineteen fifties took two groups of boys at 172 00:12:10,240 --> 00:12:13,959 Speaker 1: summer camp and randomly separated them and gave them different 173 00:12:14,200 --> 00:12:18,000 Speaker 1: team names, the Rattlers and the Eagles, and the psychologists 174 00:12:18,040 --> 00:12:23,319 Speaker 1: watched closely and measured how quickly this descended into tribal hostility, 175 00:12:23,679 --> 00:12:28,280 Speaker 1: hurling insults, raiding each other's cabins, burning each other's flags, 176 00:12:28,600 --> 00:12:31,160 Speaker 1: breaking into physical violence. 177 00:12:31,600 --> 00:12:32,560 Speaker 2: You don't need 178 00:12:32,440 --> 00:12:36,120 Speaker 1: decades of history to divide two groups. The division can 179 00:12:36,160 --> 00:12:41,280 Speaker 1: happen in days or hours because group identity us versus them, 180 00:12:41,920 --> 00:12:45,040 Speaker 1: lives deep in our circuitry. And by the way, the 181 00:12:45,080 --> 00:12:47,680 Speaker 1: things that get in our way with building utopias aren't 182 00:12:47,720 --> 00:12:51,520 Speaker 1: limited to humans. You can see the general elements throughout 183 00:12:51,559 --> 00:12:54,640 Speaker 1: the animal kingdom.
In one study, link in the 184 00:12:54,679 --> 00:12:58,360 Speaker 1: show notes, you have capuchin monkeys trained to do a 185 00:12:58,360 --> 00:13:01,320 Speaker 1: little task for a slice of cucumber, and they're perfectly 186 00:13:01,520 --> 00:13:05,600 Speaker 1: happy with that payment until they see the monkey next 187 00:13:05,600 --> 00:13:08,839 Speaker 1: to them get a nice, juicy grape which is better 188 00:13:09,080 --> 00:13:12,200 Speaker 1: for the same amount of work. And suddenly the cucumber 189 00:13:12,520 --> 00:13:15,640 Speaker 1: becomes an insult. The monkeys fling it back at the 190 00:13:15,679 --> 00:13:17,559 Speaker 1: researcher and pound 191 00:13:17,200 --> 00:13:18,640 Speaker 2: the cage in outrage. 192 00:13:18,960 --> 00:13:21,800 Speaker 1: Because even for our cousins in the animal kingdom, they 193 00:13:21,840 --> 00:13:26,120 Speaker 1: are keenly attuned to others getting more than them. This 194 00:13:26,320 --> 00:13:32,520 Speaker 1: insults a very deep-seated sense of fairness. So what 195 00:13:32,559 --> 00:13:35,240 Speaker 1: does this mean when you have societies with millions of 196 00:13:35,320 --> 00:13:38,440 Speaker 1: humans and everyone is looking at their neighbors to see 197 00:13:38,480 --> 00:13:41,880 Speaker 1: if they feel anyone is getting something better than they are? 198 00:13:42,679 --> 00:13:45,880 Speaker 1: So the question I want to explore today is if 199 00:13:46,040 --> 00:13:49,640 Speaker 1: children fracture into tribes with just some labels and monkeys 200 00:13:49,760 --> 00:13:52,199 Speaker 1: riot when they feel like they're not getting enough, and 201 00:13:52,559 --> 00:13:56,600 Speaker 1: every adult human attempt at rebooting the system seems to 202 00:13:56,679 --> 00:13:59,560 Speaker 1: end up in a reign of terror and political executions, 203 00:14:00,200 --> 00:14:06,080 Speaker 1: what chance does utopia stand?
Are humans fundamentally incompatible with 204 00:14:06,360 --> 00:14:11,359 Speaker 1: paradise or are these drives flexible and capable of being reshaped? 205 00:14:11,640 --> 00:14:13,640 Speaker 1: So this is what I want to explore today with 206 00:14:13,720 --> 00:14:16,240 Speaker 1: my guest psychologist, Paul Bloom. 207 00:14:16,600 --> 00:14:19,280 Speaker 2: He has thought deeply about human 208 00:14:19,080 --> 00:14:24,760 Speaker 1: nature, our capacities for kindness and cruelty, our hunger for fairness, 209 00:14:25,200 --> 00:14:28,280 Speaker 1: our appetite for meaning. We're going to talk about whether 210 00:14:28,440 --> 00:14:34,560 Speaker 1: utopia is politically possible, or are our flawed, ambitious, competitive 211 00:14:34,600 --> 00:14:38,240 Speaker 1: brains always going to get in the way, or are 212 00:14:38,280 --> 00:14:42,000 Speaker 1: there ways we might at least move in the right directions? 213 00:14:42,640 --> 00:14:45,040 Speaker 1: Paul Bloom is a professor of psychology at the University 214 00:14:45,040 --> 00:14:48,400 Speaker 1: of Toronto and Professor Emeritus at Yale. He's the author 215 00:14:48,440 --> 00:14:51,400 Speaker 1: of Psych: The Story of the Human Mind, and many 216 00:14:51,440 --> 00:14:55,160 Speaker 1: other influential books. So let's jump into this conversation at 217 00:14:55,160 --> 00:15:04,280 Speaker 1: the crossroads of psychology and society. So, Paul, almost all 218 00:15:04,320 --> 00:15:07,880 Speaker 1: thinkers have thought about the issue of utopia and whether 219 00:15:07,920 --> 00:15:11,880 Speaker 1: it's possible, and most political movements are trying to get 220 00:15:11,920 --> 00:15:15,120 Speaker 1: themselves in that direction. But you're a little less hopeful 221 00:15:15,160 --> 00:15:20,200 Speaker 1: about the possibility of reaching utopia because of human nature. 222 00:15:20,320 --> 00:15:21,320 Speaker 1: So tell us about that.
223 00:15:21,680 --> 00:15:26,320 Speaker 3: So I find the idea of utopia really interesting. It's 224 00:15:26,360 --> 00:15:28,200 Speaker 3: not such a practical concern. I mean, if you and 225 00:15:28,200 --> 00:15:30,440 Speaker 3: I were to talk about the troubles of today, we 226 00:15:30,440 --> 00:15:33,920 Speaker 3: probably wouldn't settle on whether we'll ever reach utopia or not. 227 00:15:34,040 --> 00:15:36,480 Speaker 3: We'll be talking about, you know, local improvements, and maybe 228 00:15:36,520 --> 00:15:38,920 Speaker 3: things suck in certain ways, let's make them better. 229 00:15:39,440 --> 00:15:43,840 Speaker 3: But Nick Bostrom said it really nicely, describing utopias as 230 00:15:44,200 --> 00:15:47,120 Speaker 3: a thought experiment, as a way of exploring what we 231 00:15:47,240 --> 00:15:50,000 Speaker 3: want and what we're capable of. And I think the 232 00:15:50,080 --> 00:15:53,680 Speaker 3: idea of utopia is an excellent way to think about 233 00:15:53,760 --> 00:15:58,160 Speaker 3: human nature. And in particular, if you think deeply about utopia, 234 00:15:58,840 --> 00:16:02,000 Speaker 3: you'll come to see certain aspects of our nature, 235 00:16:02,000 --> 00:16:05,720 Speaker 3: of human nature, that make it untenable, that we are 236 00:16:05,720 --> 00:16:09,560 Speaker 3: not suited for a perfect world, that certain aspects of 237 00:16:09,600 --> 00:16:14,440 Speaker 3: how we are mean perfection will be forever out of reach. 238 00:16:14,640 --> 00:16:16,920 Speaker 3: It doesn't mean we can't make things better. It doesn't 239 00:16:16,960 --> 00:16:22,120 Speaker 3: mean near utopias aren't possible, but a perfect world, unless 240 00:16:22,120 --> 00:16:25,440 Speaker 3: you radically reconfigure human nature so that we're not human anymore, 241 00:16:25,720 --> 00:16:28,040 Speaker 3: will be forever unattainable.
242 00:16:27,680 --> 00:16:30,240 Speaker 1: And give us a sense of those facets of human 243 00:16:30,320 --> 00:16:32,440 Speaker 1: nature like self interest and envy. 244 00:16:32,800 --> 00:16:38,000 Speaker 3: Yeah. Absolutely. Well, take this particular case. In every utopia 245 00:16:38,160 --> 00:16:42,160 Speaker 3: people have thought of, there's some degree of equality and 246 00:16:42,200 --> 00:16:45,320 Speaker 3: a sort of mutual love where people care about everybody 247 00:16:45,320 --> 00:16:48,840 Speaker 3: else within this utopian community. You can see why that 248 00:16:48,880 --> 00:16:52,040 Speaker 3: would make sense. You think about tribes and families, and 249 00:16:51,760 --> 00:16:55,160 Speaker 3: they push against each other and our interests get torn apart. 250 00:16:55,440 --> 00:16:58,960 Speaker 3: And there are utopian thinkers, you know, such as Mao, 251 00:16:59,400 --> 00:17:02,840 Speaker 3: who thought we should simply dissolve these special ties. Mao would 252 00:17:02,880 --> 00:17:05,800 Speaker 3: force people from different social classes to marry one another, 253 00:17:06,440 --> 00:17:08,040 Speaker 3: you know, taking away all choice, and say, well, this 254 00:17:08,080 --> 00:17:11,600 Speaker 3: will bring people together. A lot more benign example is 255 00:17:11,480 --> 00:17:14,399 Speaker 3: the Israeli kibbutz. When I was a teenager, I spent 256 00:17:14,440 --> 00:17:17,439 Speaker 3: a summer in a kibbutz in the Israeli desert, and this 257 00:17:17,520 --> 00:17:20,200 Speaker 3: is a traditional kibbutz where the children were raised communally. 258 00:17:20,680 --> 00:17:23,040 Speaker 3: People would have a baby, the baby was sort of 259 00:17:23,200 --> 00:17:26,000 Speaker 3: given to a sort of general daycare area where it would 260 00:17:26,000 --> 00:17:29,240 Speaker 3: be taken care of, and babies ended up ultimately knowing 261 00:17:29,280 --> 00:17:32,760 Speaker 3: who their biological parents were.
But there was nothing, no 262 00:17:32,800 --> 00:17:34,600 Speaker 3: big deal was made of it. The idea would be 263 00:17:34,600 --> 00:17:37,359 Speaker 3: the bonds of family could easily be dissolved. Now, this 264 00:17:37,480 --> 00:17:40,879 Speaker 3: experiment was a failure. The kibbutz was a failure. People 265 00:17:40,960 --> 00:17:44,720 Speaker 3: love their children, children love their parents. People have greater 266 00:17:44,880 --> 00:17:47,280 Speaker 3: ties to their siblings and their parents and their aunts 267 00:17:47,320 --> 00:17:52,280 Speaker 3: and uncles than they do to strangers. And there's every 268 00:17:52,320 --> 00:17:54,720 Speaker 3: indication that this is part of how we're wired up 269 00:17:54,960 --> 00:17:58,399 Speaker 3: to be. It's, to some extent, evolutionary biology one 270 00:17:58,480 --> 00:18:01,320 Speaker 3: oh one, which is, you know, the forces that guided 271 00:18:01,320 --> 00:18:04,600 Speaker 3: the evolution of our desires and our preferences were highly 272 00:18:04,640 --> 00:18:07,600 Speaker 3: sensitive to whether or not other entities shared your genes. 273 00:18:08,119 --> 00:18:10,280 Speaker 3: You know. So parents love their children, because parents who 274 00:18:10,280 --> 00:18:12,520 Speaker 3: didn't love their children didn't reproduce as much as 275 00:18:12,560 --> 00:18:16,800 Speaker 3: parents who did, and that means that the bonds of 276 00:18:16,880 --> 00:18:20,600 Speaker 3: family push against somebody who'd say, well, you just want 277 00:18:20,640 --> 00:18:22,440 Speaker 3: a good society, you want people to just care about 278 00:18:22,440 --> 00:18:25,439 Speaker 3: society, care about friends and strangers the same way. So 279 00:18:25,600 --> 00:18:28,320 Speaker 3: too with the bonds of friendship.
So too with the 280 00:18:28,320 --> 00:18:30,960 Speaker 3: bonds of romantic love. And romantic love sets up other 281 00:18:31,000 --> 00:18:34,119 Speaker 3: problems like jealousy. If I'm really attracted to somebody and 282 00:18:35,000 --> 00:18:38,199 Speaker 3: I want them, if they don't choose to be with me, 283 00:18:39,280 --> 00:18:41,600 Speaker 3: it could be very painful for me. And if they 284 00:18:41,600 --> 00:18:43,240 Speaker 3: do choose to be with me, often I want them 285 00:18:43,280 --> 00:18:44,920 Speaker 3: to only choose to be with me, and they want 286 00:18:45,000 --> 00:18:46,879 Speaker 3: me to only be with them, and that could be 287 00:18:46,880 --> 00:18:51,480 Speaker 3: painful too. And in fact, every utopia has had problems 288 00:18:51,520 --> 00:18:55,639 Speaker 3: with sex. On the one extreme, they 289 00:18:55,720 --> 00:18:58,520 Speaker 3: say no sex except for procreation, trying to make 290 00:18:58,520 --> 00:19:01,480 Speaker 3: the problem go away that way, or everybody has sex 291 00:19:01,520 --> 00:19:04,600 Speaker 3: with everybody else, communal marriage, and those don't work out either. 292 00:19:05,680 --> 00:19:24,560 Speaker 3: So I mean, short version, human nature is messy. 293 00:19:24,640 --> 00:19:27,600 Speaker 1: So we have things like tribalism, we have rivalry, and 294 00:19:27,640 --> 00:19:30,000 Speaker 1: your point is these are hardwired in and they are 295 00:19:30,040 --> 00:19:34,000 Speaker 1: adaptive generally, but they're also destructive to 296 00:19:34,040 --> 00:19:36,920 Speaker 2: the concept of a utopia. 297 00:19:37,000 --> 00:19:39,359 Speaker 1: So your take is that a world without conflict or 298 00:19:39,359 --> 00:19:43,440 Speaker 1: competition would actually run counter to how our motivations work.
299 00:19:43,880 --> 00:19:46,000 Speaker 3: So we talked a little bit about the bonds of 300 00:19:46,080 --> 00:19:50,800 Speaker 3: family and friendship and love. Another feature of our nature 301 00:19:50,960 --> 00:19:55,800 Speaker 3: is we are hierarchical beings. We are, you know, ultimately primates, 302 00:19:56,080 --> 00:19:58,480 Speaker 3: and we care about how we stand relative to other people, 303 00:19:59,200 --> 00:20:01,760 Speaker 3: so, you know, we don't want an 304 00:20:01,760 --> 00:20:05,159 Speaker 3: equal world, for instance. We want a world where merit 305 00:20:05,560 --> 00:20:09,479 Speaker 3: corresponds to results, at least to some extent, and 306 00:20:09,520 --> 00:20:12,320 Speaker 3: so we chafe against pure equality. There's been a series 307 00:20:12,359 --> 00:20:15,440 Speaker 3: of studies that have been done, some of them from 308 00:20:15,440 --> 00:20:19,120 Speaker 3: my lab, of children. And despite what you sometimes hear, 309 00:20:19,200 --> 00:20:24,120 Speaker 3: we're not natural born egalitarians. If you work twice as 310 00:20:24,160 --> 00:20:26,880 Speaker 3: hard as me, even like a four year old thinks 311 00:20:26,920 --> 00:20:29,439 Speaker 3: you should get more than me for your work. And 312 00:20:29,520 --> 00:20:31,280 Speaker 3: a situation where we were to get the 313 00:20:31,359 --> 00:20:37,840 Speaker 3: same would be, you know, considered unfair. And this sort 314 00:20:37,920 --> 00:20:43,879 Speaker 3: of idea that merit should be rewarded again means 315 00:20:43,920 --> 00:20:46,679 Speaker 3: the ideal world from a psychological point of view is 316 00:20:46,720 --> 00:20:49,760 Speaker 3: an unequal one. But the unequal one leads to resentments 317 00:20:49,760 --> 00:20:53,880 Speaker 3: and frustration and so on.
And you know, existing political 318 00:20:53,880 --> 00:20:57,120 Speaker 3: systems like communism and socialism and capitalism try to deal 319 00:20:57,160 --> 00:21:00,720 Speaker 3: with this through some complicated set of compromises. And I 320 00:21:00,760 --> 00:21:03,080 Speaker 3: think the compromises are kind of the best we're going 321 00:21:03,160 --> 00:21:05,480 Speaker 3: to get. It'll be kind of equal in this way 322 00:21:05,520 --> 00:21:08,080 Speaker 3: and unequal in this way. We'll reward in this way, 323 00:21:08,119 --> 00:21:09,679 Speaker 3: but not in this way. We'll set up limits in 324 00:21:09,720 --> 00:21:11,520 Speaker 3: this way, not in that way. But the idea of 325 00:21:11,560 --> 00:21:15,919 Speaker 3: a perfectly smooth world where we're all the same, maybe 326 00:21:15,920 --> 00:21:18,440 Speaker 3: it could be imposed, but people would be very unhappy with it. 327 00:21:18,840 --> 00:21:21,760 Speaker 1: What's an example of a piece of legislation that finds 328 00:21:21,760 --> 00:21:22,480 Speaker 1: a compromise? 329 00:21:22,760 --> 00:21:25,600 Speaker 3: Well, I'll say the tax system. So you can imagine 330 00:21:25,600 --> 00:21:28,760 Speaker 3: a tax system which treated the rich and poor identically. 331 00:21:28,840 --> 00:21:31,479 Speaker 3: The billionaire has to pay a percentage and the pauper 332 00:21:31,560 --> 00:21:35,480 Speaker 3: has to pay the same percentage. That seems absurd. Just the 333 00:21:35,560 --> 00:21:38,119 Speaker 3: idea of sort of marginal utility would say that these 334 00:21:38,160 --> 00:21:41,520 Speaker 3: people should be treated differently. You should tax the millionaire more. 335 00:21:41,720 --> 00:21:46,040 Speaker 3: On the flip side, a system that entirely confiscated the 336 00:21:46,080 --> 00:21:50,840 Speaker 3: money of the rich would also discourage the accumulation of wealth, 337 00:21:50,840 --> 00:21:55,160 Speaker 3: discourage businesses, discourage creation.
So what you have in most societies 338 00:21:55,160 --> 00:21:58,479 Speaker 3: is a progressive tax system. So, you know, 339 00:21:58,720 --> 00:22:01,800 Speaker 3: the poor pay nothing, or five percent, and others ten 340 00:22:01,840 --> 00:22:05,280 Speaker 3: percent, and others fifteen percent, and nobody's happy 341 00:22:05,320 --> 00:22:07,439 Speaker 3: with this, because everybody says, oh, it should be 342 00:22:07,480 --> 00:22:10,720 Speaker 3: higher here or lower here. And I certainly have no 343 00:22:10,800 --> 00:22:13,320 Speaker 3: ideal solution. But the point is, there isn't going to 344 00:22:13,320 --> 00:22:15,560 Speaker 3: be an ideal solution, just some sort of rough and 345 00:22:15,640 --> 00:22:19,920 Speaker 3: ready compromise. So too with the balances between individual freedom 346 00:22:20,320 --> 00:22:23,080 Speaker 3: and the safety of other people. So every society is 347 00:22:23,080 --> 00:22:25,719 Speaker 3: going to have some constraints on free speech. You know, 348 00:22:25,800 --> 00:22:28,880 Speaker 3: you can't threaten to kill somebody. You can't 349 00:22:28,880 --> 00:22:32,200 Speaker 3: do insider trading, you can't blackmail people or extort people. 350 00:22:32,880 --> 00:22:35,560 Speaker 3: If it gets too onerous and political speech is blocked, people 351 00:22:35,640 --> 00:22:39,560 Speaker 3: push back, and so many of our political debates 352 00:22:39,600 --> 00:22:43,600 Speaker 3: right now struggle with this at the margins. You know, right now, 353 00:22:43,760 --> 00:22:45,359 Speaker 3: right now in the United States, you know, some people 354 00:22:45,400 --> 00:22:49,200 Speaker 3: want flag burning banned; others say flag burning is speech, 355 00:22:49,320 --> 00:22:52,640 Speaker 3: we shouldn't ban it. My point isn't 356 00:22:52,640 --> 00:22:54,720 Speaker 3: to settle these particular debates.
But it is to say 357 00:22:54,760 --> 00:22:57,240 Speaker 3: that this is our fate: we're going to 358 00:22:57,280 --> 00:23:00,400 Speaker 3: be arguing about these basic things, because there's 359 00:23:00,400 --> 00:23:01,640 Speaker 3: no optimal solution. 360 00:23:02,080 --> 00:23:04,280 Speaker 1: So, Paul, you and I did a podcast a while 361 00:23:04,280 --> 00:23:07,879 Speaker 1: ago where we talked about loneliness and the question of 362 00:23:07,920 --> 00:23:11,919 Speaker 1: whether, if somebody isn't working on all the painful stuff 363 00:23:11,960 --> 00:23:15,840 Speaker 1: about relationships, maybe they won't get better at them, and 364 00:23:15,880 --> 00:23:18,520 Speaker 1: maybe that will actually be a disservice to them. By 365 00:23:18,560 --> 00:23:23,040 Speaker 1: the same token, I think you feel that if all 366 00:23:23,080 --> 00:23:26,720 Speaker 1: struggle were removed, in, let's say, a utopia of the 367 00:23:26,760 --> 00:23:29,280 Speaker 1: future where AI and machines and other things could take 368 00:23:29,320 --> 00:23:32,879 Speaker 1: care of everything for us, we would lose something about 369 00:23:33,000 --> 00:23:34,360 Speaker 1: meaning in our lives. 370 00:23:34,440 --> 00:23:36,600 Speaker 2: Tell us your take on that. 371 00:23:37,440 --> 00:23:39,320 Speaker 3: We would. I wrote about this in my book The Sweet Spot, 372 00:23:39,359 --> 00:23:42,720 Speaker 3: where I argued that there's an optimal level of struggle, 373 00:23:42,800 --> 00:23:45,760 Speaker 3: of difficulty, that's important for a meaningful and good life. 374 00:23:46,200 --> 00:23:49,000 Speaker 3: And then Nick Bostrom recently wrote a book on utopia 375 00:23:49,240 --> 00:23:51,560 Speaker 3: where he took up this issue and talked about a 376 00:23:51,640 --> 00:23:54,800 Speaker 3: possible sort of post-scarcity world where we have everything 377 00:23:54,800 --> 00:23:56,239 Speaker 3: we need.
We have all the food we need, all 378 00:23:56,280 --> 00:24:00,040 Speaker 3: the shelter we need, anything we want we can have, and 379 00:24:00,480 --> 00:24:03,159 Speaker 3: so there need not be any struggle. Maybe there'll be 380 00:24:03,160 --> 00:24:06,520 Speaker 3: no disease by then. And I have two thoughts about this. 381 00:24:07,160 --> 00:24:09,399 Speaker 3: One thought is that that would be a boring world, 382 00:24:09,520 --> 00:24:13,000 Speaker 3: a frustrating world, one that we would find 383 00:24:13,040 --> 00:24:16,600 Speaker 3: leads to ennui and misery. We like struggle. I mean, 384 00:24:16,760 --> 00:24:20,320 Speaker 3: too much struggle is miserable, but too little struggle is 385 00:24:21,440 --> 00:24:24,800 Speaker 3: bad too. And so the first thought is that that world 386 00:24:25,000 --> 00:24:29,280 Speaker 3: would be terrible. The second thought is, a world with 387 00:24:29,840 --> 00:24:33,840 Speaker 3: no conflict and no struggle is not, I think, possible, 388 00:24:34,800 --> 00:24:37,000 Speaker 3: because maybe we'll have enough food and maybe we'll have 389 00:24:37,080 --> 00:24:42,119 Speaker 3: enough shelter, enough resources. But, you know, there will always 390 00:24:42,119 --> 00:24:44,520 Speaker 3: be a case, as long as we're people, that somebody 391 00:24:44,520 --> 00:24:47,399 Speaker 3: will love somebody and they won't love them back. There 392 00:24:47,400 --> 00:24:49,520 Speaker 3: will always be the case, so long as we're people, 393 00:24:49,840 --> 00:24:52,800 Speaker 3: that you will want an award, a prize, an honor, 394 00:24:53,119 --> 00:24:54,640 Speaker 3: and I will want it too, and there's just one 395 00:24:54,680 --> 00:24:55,320 Speaker 3: to go around. 396 00:24:55,720 --> 00:24:57,040 Speaker 2: And to follow up on that point.
397 00:24:57,359 --> 00:25:02,119 Speaker 1: Your view also, I think, is that inequality is structurally unavoidable, 398 00:25:02,200 --> 00:25:06,000 Speaker 1: because even if you had a society where everyone had everything, 399 00:25:06,080 --> 00:25:10,200 Speaker 1: people will seek new markers of status and distinction. 400 00:25:11,080 --> 00:25:11,600 Speaker 2: That's true. 401 00:25:11,680 --> 00:25:14,360 Speaker 3: I mean, we see this in sort of artificial societies 402 00:25:14,440 --> 00:25:19,159 Speaker 3: like communes, or even academic departments, where everyone is an 403 00:25:19,200 --> 00:25:22,520 Speaker 3: assistant professor or a full professor. People jockey 404 00:25:22,520 --> 00:25:26,600 Speaker 3: for position, jockey for status. I could easily 405 00:25:26,640 --> 00:25:28,119 Speaker 3: imagine a world, and I think it would be a 406 00:25:28,160 --> 00:25:31,600 Speaker 3: nice world, where status didn't translate into access to 407 00:25:31,680 --> 00:25:34,840 Speaker 3: resources that we desperately need, the kind where 408 00:25:35,080 --> 00:25:37,960 Speaker 3: either you will have enough food to eat or I will, 409 00:25:37,960 --> 00:25:40,600 Speaker 3: but not both. Much better to avoid such a world. 410 00:25:41,440 --> 00:25:44,280 Speaker 3: But we're never going to have a world where everybody 411 00:25:44,320 --> 00:25:48,280 Speaker 3: gets the amount of respect they think they deserve, because 412 00:25:48,320 --> 00:25:50,600 Speaker 3: many of us at certain times think we deserve more 413 00:25:50,600 --> 00:25:54,399 Speaker 3: respect than other people.
So, you know, if you want 414 00:25:54,520 --> 00:25:56,680 Speaker 3: your voice to be heard, you want to have sway 415 00:25:56,720 --> 00:25:59,640 Speaker 3: over things and to be maximally respected, and I want 416 00:25:59,640 --> 00:26:02,440 Speaker 3: this too, and so too for the five other people 417 00:26:02,480 --> 00:26:06,040 Speaker 3: around us, there is a clash. And some 418 00:26:06,160 --> 00:26:14,760 Speaker 3: resources, like respect, love, often sexual attraction, friendship, are inherently limited. 419 00:26:15,600 --> 00:26:18,199 Speaker 3: And so in some way the worry that we have, 420 00:26:18,320 --> 00:26:20,119 Speaker 3: that, oh, what do we do with a world of 421 00:26:20,240 --> 00:26:23,600 Speaker 3: no conflict? How will we survive? It isn't going to 422 00:26:23,640 --> 00:26:26,360 Speaker 3: happen, because there will always be some sort of conflict. 423 00:26:26,160 --> 00:26:26,520 Speaker 2: That's right. 424 00:26:26,560 --> 00:26:29,600 Speaker 1: And so your view is that the competition for recognition 425 00:26:29,840 --> 00:26:35,440 Speaker 1: and influence and prestige is always going to reintroduce hierarchy 426 00:26:35,960 --> 00:26:38,040 Speaker 1: and undermine this utopian ideal. 427 00:26:38,480 --> 00:26:41,200 Speaker 3: Yeah. I gave a talk on utopia for the first 428 00:26:41,240 --> 00:26:45,520 Speaker 3: time at this wonderful festival in Wales called How the 429 00:26:45,600 --> 00:26:49,639 Speaker 3: Light Gets In, and they have a little poster that 430 00:26:49,680 --> 00:26:52,320 Speaker 3: they put up saying, this is our festival and everything, 431 00:26:52,520 --> 00:26:55,640 Speaker 3: and they list all the names of the luminaries who 432 00:26:55,720 --> 00:26:58,800 Speaker 3: are presenting at the festival, and I was fifteenth on 433 00:26:58,840 --> 00:27:03,640 Speaker 3: the list. So it's things like that, you know.
Now, 434 00:27:03,880 --> 00:27:05,639 Speaker 3: the people in front of me were, like, 435 00:27:06,080 --> 00:27:08,720 Speaker 3: Slavoj Žižek and Steven Pinker, great names, so, you know, 436 00:27:08,720 --> 00:27:12,480 Speaker 3: it's fine. But still, I think everybody on that list 437 00:27:12,520 --> 00:27:14,520 Speaker 3: would have said, I'd kind of like to be first 438 00:27:14,600 --> 00:27:18,919 Speaker 3: or second or third. And that's a tiny example, but 439 00:27:19,000 --> 00:27:21,520 Speaker 3: so much of life is like that, you know. 440 00:27:21,560 --> 00:27:24,160 Speaker 1: I just heard a routine from Sarah Silverman, the comedian, 441 00:27:24,280 --> 00:27:26,639 Speaker 1: and she said that she checked into a hotel 442 00:27:26,680 --> 00:27:29,240 Speaker 1: and the person said, wow, Sarah Silverman, you're in my 443 00:27:29,359 --> 00:27:32,680 Speaker 1: top four comedians. And she immediately felt like, okay, well, 444 00:27:32,680 --> 00:27:34,720 Speaker 1: that means I'm number four then, right, because otherwise 445 00:27:34,760 --> 00:27:38,760 Speaker 1: you wouldn't have phrased it that way. Yeah, exactly. 446 00:27:38,840 --> 00:27:42,560 Speaker 3: And you see this in kids. You know, I have 447 00:27:42,640 --> 00:27:44,919 Speaker 3: two sons, and when they were little, 448 00:27:44,960 --> 00:27:47,080 Speaker 3: I've seen them fight over a sock, 449 00:27:48,080 --> 00:27:50,639 Speaker 3: you know, because they were playing with socks 450 00:27:50,680 --> 00:27:53,080 Speaker 3: they could have each played with, but they each wanted 451 00:27:53,119 --> 00:27:56,120 Speaker 3: that sock. You know, one kid jumps onto 452 00:27:56,119 --> 00:27:57,399 Speaker 3: a chair, the other kid wants to sit in the 453 00:27:57,400 --> 00:28:01,679 Speaker 3: same chair. Certainly they're fighting for their parents' attention.
And 454 00:28:01,760 --> 00:28:04,720 Speaker 3: this is, you know, this is just how people work. 455 00:28:05,600 --> 00:28:09,120 Speaker 3: So we're never going to have this 456 00:28:09,119 --> 00:28:11,680 Speaker 3: frictionless world. Back to the idea of friction, back 457 00:28:11,720 --> 00:28:13,760 Speaker 3: to the idea of some sort of suffering and conflict. 458 00:28:14,000 --> 00:28:16,280 Speaker 2: I don't think this is some sort of horrible fate. 459 00:28:17,160 --> 00:28:19,080 Speaker 3: I think it's just part of our lot that 460 00:28:19,880 --> 00:28:21,560 Speaker 3: by the time, you know, you and I, you know, 461 00:28:21,680 --> 00:28:24,720 Speaker 3: pass away, we will have had our share of disappointments, 462 00:28:24,960 --> 00:28:29,240 Speaker 3: of unrequited love, of awards we didn't receive, honors we 463 00:28:29,280 --> 00:28:32,160 Speaker 3: felt we deserved and didn't get, moments of lack 464 00:28:32,200 --> 00:28:35,800 Speaker 3: of respect. But, you know, if we're lucky, we've also found, 465 00:28:35,880 --> 00:28:39,560 Speaker 3: you know, occasional flashes of true love and surprise honors 466 00:28:39,600 --> 00:28:44,120 Speaker 3: and wins. Again, it's this 467 00:28:44,680 --> 00:28:47,680 Speaker 3: messiness that we're condemned to, but it's not 468 00:28:47,720 --> 00:28:48,440 Speaker 3: a terrible fate. 469 00:28:48,880 --> 00:28:49,600 Speaker 2: Here's a question. 470 00:28:49,800 --> 00:28:55,840 Speaker 1: So history often shows that attempts at utopia end in authoritarianism. 471 00:28:55,880 --> 00:28:58,720 Speaker 1: Let's think, you know, communism in the twentieth century or something. 472 00:28:59,640 --> 00:29:02,240 Speaker 1: How would you tie that to psychology? 473 00:29:03,200 --> 00:29:06,920 Speaker 3: I think the move to authoritarianism comes because we not 474 00:29:06,960 --> 00:29:10,320 Speaker 3: only have preferences.
It's not only true to say, I 475 00:29:10,360 --> 00:29:12,240 Speaker 3: want to be able to spend time with my children, 476 00:29:12,360 --> 00:29:14,680 Speaker 3: I want to be able to marry somebody who I 477 00:29:14,720 --> 00:29:16,800 Speaker 3: love and who loves me, I want to be free 478 00:29:16,840 --> 00:29:19,920 Speaker 3: to sell things to people who want to buy them 479 00:29:19,960 --> 00:29:21,960 Speaker 3: from me, I want to be free to have property. 480 00:29:22,160 --> 00:29:26,080 Speaker 3: Not only that, but we also have an appetite for 481 00:29:26,120 --> 00:29:28,960 Speaker 3: what you can call autonomy, even for its own sake. 482 00:29:29,280 --> 00:29:33,240 Speaker 3: We want to be able to, within the limits possible, 483 00:29:33,360 --> 00:29:36,600 Speaker 3: get our way, and we don't like constraints on it. 484 00:29:37,360 --> 00:29:39,680 Speaker 3: There's actually a really interesting body of work in social 485 00:29:39,680 --> 00:29:42,520 Speaker 3: psychology that finds that if you tell people you can't 486 00:29:42,560 --> 00:29:46,160 Speaker 3: do something, all of a sudden it becomes immensely desirable. 487 00:29:47,360 --> 00:29:49,800 Speaker 3: I mean, the classic case of this is Eve, you know, 488 00:29:49,840 --> 00:29:52,360 Speaker 3: in the Garden of Eden, doing the one thing she 489 00:29:52,480 --> 00:29:55,360 Speaker 3: was told she couldn't do. There's something incredibly human about that. 490 00:29:55,880 --> 00:29:59,120 Speaker 3: You know, if I told you, David, don't press that button, 491 00:29:59,200 --> 00:30:01,560 Speaker 3: I demand you don't press it, you would find yourself with an 492 00:30:01,680 --> 00:30:06,320 Speaker 3: urge to press that button.
And so this urge for autonomy, 493 00:30:06,440 --> 00:30:09,480 Speaker 3: sometimes for perversity, is within, I think, all of us, 494 00:30:09,520 --> 00:30:12,840 Speaker 3: to different degrees, and then, you know, 495 00:30:13,120 --> 00:30:16,280 Speaker 3: a state needs to stamp it down. If a 496 00:30:16,280 --> 00:30:18,240 Speaker 3: state wants to tell you you can't love who you 497 00:30:18,320 --> 00:30:20,400 Speaker 3: want to love, you can't live where you want to live, 498 00:30:20,440 --> 00:30:23,640 Speaker 3: you can't say what you want to say, well, they 499 00:30:23,680 --> 00:30:27,479 Speaker 3: often have to use force, because people will 500 00:30:27,480 --> 00:30:30,160 Speaker 3: not quietly go along with this. And so 501 00:30:30,640 --> 00:30:35,520 Speaker 3: utopian ideas typically end up, you know, honestly, end 502 00:30:35,600 --> 00:30:38,760 Speaker 3: up in concentration camps, end up with mass starvation, because 503 00:30:39,120 --> 00:30:43,480 Speaker 3: people are not suited for the utopian plans and 504 00:30:43,520 --> 00:30:44,200 Speaker 3: will push back. 505 00:30:44,480 --> 00:30:49,080 Speaker 1: Now, you're not opposed to seeking progress in societies; in fact, 506 00:30:49,120 --> 00:30:51,320 Speaker 1: you're very much for that. So tell us how you 507 00:30:52,000 --> 00:30:55,600 Speaker 1: balance that in your head: utopia may be impossible, but 508 00:30:55,680 --> 00:30:57,280 Speaker 1: we should be making progress. 509 00:30:57,560 --> 00:31:00,959 Speaker 3: I think we should certainly make progress. And, you know, 510 00:31:01,000 --> 00:31:05,520 Speaker 3: to some extent I'll concede the obvious, which 511 00:31:05,560 --> 00:31:11,360 Speaker 3: is, progress often involves force.
If you say, well, here's 512 00:31:11,440 --> 00:31:14,240 Speaker 3: our new law, and the new law says such and so, 513 00:31:14,280 --> 00:31:16,520 Speaker 3: and it gives people more freedom and more rights, 514 00:31:16,560 --> 00:31:19,800 Speaker 3: people may violate the law, and in order 515 00:31:19,840 --> 00:31:22,360 Speaker 3: to stop them violating the law, you might need to apply force. 516 00:31:22,800 --> 00:31:25,880 Speaker 3: You know, there's no such thing as a voluntary tax system. 517 00:31:26,040 --> 00:31:28,560 Speaker 3: I'm a believer in free speech, and I think if 518 00:31:28,560 --> 00:31:31,440 Speaker 3: somebody comes to campus and they're invited to give a talk, 519 00:31:31,560 --> 00:31:35,440 Speaker 3: they should be able to give a talk uninterrupted. But 520 00:31:35,520 --> 00:31:38,560 Speaker 3: that means that somebody might have to threaten, or punish, 521 00:31:38,800 --> 00:31:42,880 Speaker 3: or even drag away people who would violently protest against them. 522 00:31:43,280 --> 00:31:45,120 Speaker 3: You're not going to find a decent law without the 523 00:31:45,160 --> 00:31:49,760 Speaker 3: possibility of force, even in the simplest instantiations. 524 00:31:49,800 --> 00:31:53,760 Speaker 3: As for ideas of change, I think there's all sorts of 525 00:31:53,800 --> 00:31:55,880 Speaker 3: ways this world could be better, all sorts of ways 526 00:31:55,880 --> 00:31:58,280 Speaker 3: this world has been made better. You know, just take 527 00:31:58,320 --> 00:32:02,160 Speaker 3: a simple example: marriage for gay people seems to 528 00:32:02,200 --> 00:32:05,520 Speaker 3: be something that has had a million pluses and turned 529 00:32:05,520 --> 00:32:08,520 Speaker 3: out to have almost no minuses, besides some people being 530 00:32:08,600 --> 00:32:12,320 Speaker 3: unhappy with it.
But I think we want to be 531 00:32:12,440 --> 00:32:16,200 Speaker 3: very conscious about having changes that go too much against 532 00:32:16,200 --> 00:32:18,800 Speaker 3: the way people are. Changes, for instance, that, 533 00:32:19,400 --> 00:32:22,560 Speaker 3: you know, try to destroy ties of family, ties of love, 534 00:32:22,960 --> 00:32:26,959 Speaker 3: ties of friendship, even tribal ties. I think, for instance, 535 00:32:27,000 --> 00:32:30,840 Speaker 3: of people of the same ethnicity who feel a kinship towards 536 00:32:30,880 --> 00:32:33,440 Speaker 3: their ethnicity. I don't think the state should try to 537 00:32:33,520 --> 00:32:35,560 Speaker 3: stop them, or that the state should try to 538 00:32:35,600 --> 00:32:38,720 Speaker 3: stop people from worshiping or from practicing their sort of 539 00:32:38,840 --> 00:32:42,000 Speaker 3: communal customs. I think that runs deep, and if you 540 00:32:42,120 --> 00:32:45,360 Speaker 3: underestimate how deep it runs, often you end up with 541 00:32:45,520 --> 00:32:49,120 Speaker 3: kind of a terrible rebellious population, all sorts of trouble. 542 00:32:49,560 --> 00:32:52,600 Speaker 3: So I'm all for human progress; maybe 543 00:32:52,640 --> 00:32:55,760 Speaker 3: that's the liberal progressive side. The 544 00:32:55,800 --> 00:33:00,120 Speaker 3: conservative side is: don't push too much against the 545 00:33:00,160 --> 00:33:00,840 Speaker 3: way people are. 546 00:33:18,920 --> 00:33:20,320 Speaker 1: I want to come back to this issue about in 547 00:33:20,360 --> 00:33:22,320 Speaker 1: groups and outgroups.
You may know that we did these 548 00:33:22,320 --> 00:33:25,040 Speaker 1: studies in my lab a while ago where you're in 549 00:33:25,080 --> 00:33:28,680 Speaker 1: the brain scanner, fMRI, and you see six hands arrayed 550 00:33:28,760 --> 00:33:31,120 Speaker 1: on the screen, and the computer randomly picks one of 551 00:33:31,120 --> 00:33:33,840 Speaker 1: those hands, and you see it get stabbed 552 00:33:33,480 --> 00:33:34,600 Speaker 2: with a syringe needle. 553 00:33:34,920 --> 00:33:38,800 Speaker 1: And so what happens is your brain has an empathy response. Essentially, 554 00:33:38,840 --> 00:33:42,000 Speaker 1: your pain matrix lights up. It's not your hand getting stabbed, 555 00:33:42,040 --> 00:33:46,280 Speaker 1: but nonetheless you have this reaction of lighting up these 556 00:33:46,320 --> 00:33:48,680 Speaker 1: pain networks as though maybe it was your hand. 557 00:33:49,040 --> 00:33:52,479 Speaker 1: That's presumably the neural basis of empathy. But what we 558 00:33:52,520 --> 00:33:55,160 Speaker 1: did then is added a one word label to each 559 00:33:55,200 --> 00:33:59,760 Speaker 1: hand: Christian, Jewish, Muslim, Scientologist, Hindu, atheist. And now the 560 00:34:00,240 --> 00:34:02,440 Speaker 1: computer goes around, picks a hand, you see the hand get stabbed, 561 00:34:02,680 --> 00:34:05,200 Speaker 1: and the question is, depending on what your in group is, 562 00:34:05,280 --> 00:34:07,600 Speaker 1: how do you feel about the out groups? And it 563 00:34:07,640 --> 00:34:12,560 Speaker 1: turns out, across all religions, everyone cares more about their 564 00:34:12,560 --> 00:34:18,000 Speaker 1: in group. This very fast neural response, this first neural 565 00:34:18,080 --> 00:34:20,399 Speaker 1: response, is much larger if you see your in group 566 00:34:20,440 --> 00:34:23,279 Speaker 1: get stabbed, and much smaller when you see a member 567 00:34:23,320 --> 00:34:24,880 Speaker 1: of your outgroup get stabbed.
And by the way, this 568 00:34:25,000 --> 00:34:27,759 Speaker 1: was true for atheists as well. They cared more when 569 00:34:27,800 --> 00:34:29,640 Speaker 1: they see an atheist's hand get stabbed. So it's not 570 00:34:29,680 --> 00:34:32,880 Speaker 1: even an indictment of religion. It's just about in groups 571 00:34:33,160 --> 00:34:37,480 Speaker 1: and out groups. And so what's cool is that people 572 00:34:37,680 --> 00:34:41,239 Speaker 1: can have all kinds of cognitive layers on top of 573 00:34:41,280 --> 00:34:43,839 Speaker 1: that so that they can still continue to do the 574 00:34:43,920 --> 00:34:48,759 Speaker 1: right thing in their societies. But fundamentally we're yoked with 575 00:34:48,840 --> 00:34:50,040 Speaker 1: that kind of bias. 576 00:34:51,120 --> 00:34:54,439 Speaker 3: Yeah, I love that work. I'm familiar with other work, a 577 00:34:54,480 --> 00:34:56,960 Speaker 3: similar study showing more of a reaction if you're 578 00:34:57,000 --> 00:34:59,439 Speaker 3: white to seeing a white hand stabbed than a black 579 00:34:59,480 --> 00:35:01,880 Speaker 3: hand stabbed, and vice versa if you're black. There's some 580 00:35:01,920 --> 00:35:04,840 Speaker 3: studies in Europe with soccer fans where, you know, you 581 00:35:04,840 --> 00:35:06,719 Speaker 3: watch somebody be shocked, and if they're a member of 582 00:35:06,760 --> 00:35:09,760 Speaker 3: a different soccer team, or a fan of a different 583 00:35:09,760 --> 00:35:12,520 Speaker 3: soccer team, you feel pleasure instead of pain. 584 00:35:13,800 --> 00:35:15,960 Speaker 3: In some way, I take that as a challenge to 585 00:35:16,000 --> 00:35:18,319 Speaker 3: what I just said.
So I said that you kind 586 00:35:18,320 --> 00:35:20,880 Speaker 3: of have to accept human nature as it is, but 587 00:35:21,200 --> 00:35:23,080 Speaker 3: I think there are times where you do have to 588 00:35:23,080 --> 00:35:27,240 Speaker 3: push back a little bit, and our tremendously strong favoring of the 589 00:35:27,320 --> 00:35:30,480 Speaker 3: in group kind of has to go a little bit. 590 00:35:30,560 --> 00:35:35,400 Speaker 3: In a healthy democratic society, it's really good for people 591 00:35:35,440 --> 00:35:38,719 Speaker 3: of one ethnicity to care about people from another ethnicity. 592 00:35:39,480 --> 00:35:42,040 Speaker 3: It's really good, you know, for the atheist to care 593 00:35:42,040 --> 00:35:44,759 Speaker 3: about the religious person, for the Catholic to care about 594 00:35:44,760 --> 00:35:48,240 Speaker 3: a Protestant, and so on. And I wouldn't be so 595 00:35:48,840 --> 00:35:51,200 Speaker 3: conservative as to say, well, it lies in our natures to care 596 00:35:51,239 --> 00:35:53,879 Speaker 3: about our own, so leave it alone. I think to 597 00:35:53,880 --> 00:35:58,120 Speaker 3: some extent societies should push towards some sort of cosmopolitanism. 598 00:35:58,840 --> 00:36:00,680 Speaker 3: So I would draw 599 00:36:00,719 --> 00:36:04,279 Speaker 3: a distinction between me caring about my fellow Canadians way 600 00:36:04,320 --> 00:36:06,600 Speaker 3: more than Americans, which is, I think, something which you 601 00:36:06,600 --> 00:36:09,359 Speaker 3: should sort of try to tamp down, maybe 602 00:36:09,360 --> 00:36:12,759 Speaker 3: try to ameliorate a little bit, versus me 603 00:36:12,920 --> 00:36:17,160 Speaker 3: caring about my children versus other people's children, where I 604 00:36:17,160 --> 00:36:19,080 Speaker 3: think, to some extent, you have to take that kind 605 00:36:19,080 --> 00:36:22,399 Speaker 3: of as a given.
You'd want to construct a good 606 00:36:22,400 --> 00:36:24,600 Speaker 3: society, and a moral language, which accepts that we are stuck 607 00:36:24,640 --> 00:36:28,680 Speaker 3: with that and we cannot mess with that. So some 608 00:36:28,719 --> 00:36:32,320 Speaker 3: parts of human nature are sort of so tightly weaved 609 00:36:32,360 --> 00:36:34,800 Speaker 3: in that you just, you know, you should take it 610 00:36:34,800 --> 00:36:37,440 Speaker 3: as a premise. Other parts, and you're 611 00:36:37,440 --> 00:36:39,399 Speaker 3: right to point this out, you may want to push 612 00:36:39,400 --> 00:36:40,000 Speaker 3: back on a bit. 613 00:36:40,239 --> 00:36:40,640 Speaker 2: That's right. 614 00:36:40,840 --> 00:36:43,080 Speaker 1: You know, we often do this in legislation. This in 615 00:36:43,120 --> 00:36:46,600 Speaker 1: some sense represents our longest term thinking. So, for example, 616 00:36:47,000 --> 00:36:50,880 Speaker 1: anti discrimination housing laws are a way of saying, look, 617 00:36:50,920 --> 00:36:53,080 Speaker 1: we get it, we know that you like people who 618 00:36:53,120 --> 00:36:56,160 Speaker 1: look like you better, but we're going to establish this 619 00:36:56,200 --> 00:36:58,200 Speaker 1: as a rule in our society: you can't do 620 00:36:58,280 --> 00:36:58,920 Speaker 1: that anymore. 621 00:36:59,640 --> 00:37:02,400 Speaker 3: That's right. I mean, Joe Henrich made the argument in 622 00:37:02,440 --> 00:37:05,080 Speaker 3: his wonderful book The WEIRDest People in the World that 623 00:37:05,200 --> 00:37:08,560 Speaker 3: one of the great moral revolutions of our time, of 624 00:37:08,640 --> 00:37:15,080 Speaker 3: human history, happened when the Church disavowed cousin marriage.
And 625 00:37:15,200 --> 00:37:18,879 Speaker 3: as a result of this, it rippled through, and all 626 00:37:18,880 --> 00:37:22,000 Speaker 3: of a sudden the bonds of family became less salient 627 00:37:22,040 --> 00:37:27,720 Speaker 3: in our political culture, and people became more individualistic, 628 00:37:27,880 --> 00:37:31,560 Speaker 3: less kin-based, less blood-related, and then the world was 629 00:37:31,600 --> 00:37:34,640 Speaker 3: transformed as a result. And right now, you know, 630 00:37:35,000 --> 00:37:38,880 Speaker 3: I said you shouldn't push back upon the bonds of family, 631 00:37:39,120 --> 00:37:40,880 Speaker 3: but we do a little bit. We have anti 632 00:37:40,920 --> 00:37:44,440 Speaker 3: nepotism laws. If I'm looking for a research assistant to hire, 633 00:37:45,000 --> 00:37:47,840 Speaker 3: I can't hire my wife or my sons or my cousin, 634 00:37:48,440 --> 00:37:50,080 Speaker 3: you know. And I think those are good laws. And I 635 00:37:50,120 --> 00:37:52,160 Speaker 3: think we accept that in the realm of the 636 00:37:52,160 --> 00:37:56,040 Speaker 3: world of the market and the world of universities, these 637 00:37:56,120 --> 00:37:58,040 Speaker 3: kinship bonds should not matter. 638 00:37:58,400 --> 00:38:00,400 Speaker 1: And so your view as a psychologist is that the 639 00:38:00,480 --> 00:38:02,759 Speaker 1: point is not to give up on moving in the 640 00:38:02,840 --> 00:38:07,320 Speaker 1: direction of a utopia, but instead to replace utopian dreams 641 00:38:07,360 --> 00:38:10,799 Speaker 1: with reforms that are achievable given the way we are, 642 00:38:10,840 --> 00:38:11,680 Speaker 1: given our wiring. 643 00:38:12,160 --> 00:38:15,399 Speaker 3: That's a beautiful way of putting it. I mean, a sort 644 00:38:15,440 --> 00:38:17,360 Speaker 3: of even more constructive way of putting it is this.
People 645 00:38:17,440 --> 00:38:19,760 Speaker 3: who want to make the world a better place, 646 00:38:19,920 --> 00:38:23,320 Speaker 3: and we should all be such people, as we embark 647 00:38:23,360 --> 00:38:27,240 Speaker 3: on this project, should think deeply about human nature. 648 00:38:28,000 --> 00:38:31,440 Speaker 3: That'll tell us what changes will come easy. It'll tell 649 00:38:31,520 --> 00:38:34,200 Speaker 3: us what changes will be difficult but worthwhile, and what 650 00:38:34,320 --> 00:38:37,120 Speaker 3: changes we shouldn't even try to go near. And I 651 00:38:37,120 --> 00:38:38,920 Speaker 3: think that that's really useful. 652 00:38:43,440 --> 00:38:46,399 Speaker 1: That was my conversation with Paul Bloom. So where does 653 00:38:46,400 --> 00:38:50,240 Speaker 1: that leave us? If there's one very critical lesson from history, 654 00:38:50,320 --> 00:38:55,800 Speaker 1: it's that grand designs for perfection tend to unravel. Utopias 655 00:38:55,880 --> 00:39:00,200 Speaker 1: very often curdle into dystopias. We've seen that all 656 00:39:00,200 --> 00:39:04,440 Speaker 1: throughout history, where paradises very quickly become prisons. And this 657 00:39:04,520 --> 00:39:08,040 Speaker 1: is because the human brain, for all its massive success, 658 00:39:08,560 --> 00:39:13,319 Speaker 1: gives a tricky foundation for building heaven on Earth. But 659 00:39:13,760 --> 00:39:16,080 Speaker 1: the hope that I think Paul and I both share 660 00:39:16,520 --> 00:39:20,680 Speaker 1: is that maybe a close study of psychology, combined with 661 00:39:21,040 --> 00:39:25,480 Speaker 1: a close read of history, can carry forward the seeds 662 00:39:25,520 --> 00:39:28,279 Speaker 1: of progress.
In other words, the right way to move 663 00:39:28,560 --> 00:39:32,600 Speaker 1: in the direction of utopia isn't to try to erase 664 00:39:32,719 --> 00:39:36,640 Speaker 1: human drives, but instead to figure out how to channel them. 665 00:39:36,880 --> 00:39:38,800 Speaker 2: So let me just say a couple of things. First, 666 00:39:38,840 --> 00:39:41,759 Speaker 1: it's important to note that these instincts we have are 667 00:39:41,880 --> 00:39:45,000 Speaker 1: often double edged. For example, I told you at the 668 00:39:45,000 --> 00:39:50,160 Speaker 1: beginning about the experiment of dividing summer camp kids into 669 00:39:50,280 --> 00:39:53,280 Speaker 1: teams and watching them polarize. But the thing to note 670 00:39:53,280 --> 00:39:57,840 Speaker 1: is that the same grouping reflex, this team reflex, is 671 00:39:57,880 --> 00:40:03,120 Speaker 1: what binds neighbors into communities. The capacity for division 672 00:40:03,239 --> 00:40:07,239 Speaker 1: is also the capacity for belonging. It's double edged, and 673 00:40:07,320 --> 00:40:09,480 Speaker 1: it always has been. And that leads us to a 674 00:40:09,520 --> 00:40:13,680 Speaker 1: deeper question, which is: how can you take our tribal instincts, 675 00:40:13,920 --> 00:40:18,080 Speaker 1: which probably aren't going away, and find ways to widen 676 00:40:18,200 --> 00:40:22,120 Speaker 1: the circle of what counts as us? As I've talked 677 00:40:22,160 --> 00:40:25,120 Speaker 1: about in other episodes, I think the Internet actually gives 678 00:40:25,200 --> 00:40:28,960 Speaker 1: us a powerful tool to do this, because the algorithms, 679 00:40:29,200 --> 00:40:35,280 Speaker 1: if programmed correctly, can complexify relationships by surfacing the things 680 00:40:35,520 --> 00:40:38,480 Speaker 1: that we have in common.
And some years ago I 681 00:40:38,520 --> 00:40:41,680 Speaker 1: wrote about this in an article in The Economist called 682 00:40:42,239 --> 00:40:46,680 Speaker 1: Does your brain care about other people? It depends. And 683 00:40:46,719 --> 00:40:50,560 Speaker 1: that article was all about how our brains are predisposed 684 00:40:50,600 --> 00:40:53,320 Speaker 1: for in-groups and out-groups. And one of the things 685 00:40:53,360 --> 00:40:56,680 Speaker 1: this does is make it very easy for governments to 686 00:40:57,080 --> 00:41:00,080 Speaker 1: leverage propaganda. But as it turns out, one of the 687 00:41:00,080 --> 00:41:03,480 Speaker 1: most effective things we can do as a society is 688 00:41:03,520 --> 00:41:08,400 Speaker 1: to learn the basic tricks of propaganda. For example, calling 689 00:41:08,440 --> 00:41:13,359 Speaker 1: the other group of people animals or viruses or insects 690 00:41:13,719 --> 00:41:19,360 Speaker 1: or anything other than other humans. What propaganda does is 691 00:41:19,400 --> 00:41:22,799 Speaker 1: it dials down particular networks in our brain that are 692 00:41:22,840 --> 00:41:28,919 Speaker 1: involved in understanding other people, and that makes violent polarization 693 00:41:29,200 --> 00:41:32,560 Speaker 1: much easier, because now you're taking up arms against some 694 00:41:32,880 --> 00:41:37,000 Speaker 1: group that your brain has come to view as subhuman. So, 695 00:41:37,040 --> 00:41:40,360 Speaker 1: as I said, the way to get societal immunity against 696 00:41:40,440 --> 00:41:44,000 Speaker 1: this basic psychological trick is to put it on the 697 00:41:44,040 --> 00:41:45,000 Speaker 1: table as 698 00:41:44,880 --> 00:41:48,000 Speaker 2: part of our education, so that the next 699 00:41:47,719 --> 00:41:52,160 Speaker 1: time you hear a muckraker or pundit or official say 700 00:41:52,239 --> 00:41:55,640 Speaker 1: things like they're a bunch of animals, you.
701 00:41:55,600 --> 00:41:59,880 Speaker 2: Can immediately see through the magic trick instead of falling 702 00:42:00,120 --> 00:42:00,440 Speaker 2: for it. 703 00:42:01,239 --> 00:42:03,600 Speaker 1: Now zooming back out, I want to note that when 704 00:42:03,640 --> 00:42:06,239 Speaker 1: we look at the long arc of history and we 705 00:42:06,320 --> 00:42:10,920 Speaker 1: see massive declines in poverty and rises in literacy and 706 00:42:11,120 --> 00:42:15,360 Speaker 1: fewer wars between great powers, we can see that something 707 00:42:15,480 --> 00:42:19,560 Speaker 1: is working directionally. It's very imperfect, but things are, on 708 00:42:19,640 --> 00:42:24,400 Speaker 1: the timescale of decades or centuries, working, and given the 709 00:42:24,480 --> 00:42:27,400 Speaker 1: brains we're yoked with, this may be the best we 710 00:42:27,440 --> 00:42:31,040 Speaker 1: can do. The lesson that surfaces from psychology and history 711 00:42:31,360 --> 00:42:35,720 Speaker 1: is that utopia is probably not a destination but a direction. 712 00:42:36,239 --> 00:42:39,360 Speaker 1: I mentioned at the outset that Thomas More coined the 713 00:42:39,360 --> 00:42:44,760 Speaker 1: word utopia from the root words which mean no place. 714 00:42:44,960 --> 00:42:48,399 Speaker 1: That was pretty prescient. Margaret Atwood put it this way: 715 00:42:48,800 --> 00:42:54,200 Speaker 1: Utopia is not one place, but many, always receding ahead 716 00:42:54,200 --> 00:42:57,120 Speaker 1: of us. And I would say that's the point of 717 00:42:57,120 --> 00:43:01,200 Speaker 1: today's episode. We're probably never going to reach it, but 718 00:43:02,000 --> 00:43:07,840 Speaker 1: the point is the endless act of reaching.
So without 719 00:43:07,960 --> 00:43:12,920 Speaker 1: being delusional, let's keep making progress, building up a world 720 00:43:13,160 --> 00:43:16,440 Speaker 1: where tomorrow is just a little kinder, a little fairer, 721 00:43:16,800 --> 00:43:23,799 Speaker 1: and even more abundant than today. Go to eagleman dot 722 00:43:23,800 --> 00:43:27,040 Speaker 1: com slash podcast for more information and to find further reading. 723 00:43:27,520 --> 00:43:30,600 Speaker 1: Join the weekly discussions on my Substack, and check out 724 00:43:30,600 --> 00:43:34,200 Speaker 1: and subscribe to Inner Cosmos on YouTube for videos of 725 00:43:34,239 --> 00:43:37,480 Speaker 1: each episode and to leave comments. Until next time, I'm 726 00:43:37,560 --> 00:43:40,239 Speaker 1: David Eagleman, and this is Inner Cosmos.