Speaker 1: Humans disagree on many political fronts, but we're finding ourselves globally in a period of higher polarization, and the question is: is there anything to be done about that? Presumably we're not going to get humans to not have conflict, but is there any such thing as better conflict? What would that mean? And what does any of that have to do with brains, or with the fact that modern humans only spread out across the globe sixty thousand years ago, or with social media recommender algorithms, or the Iroquois Native Americans?

Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford, and in these episodes I examine the intersection of our brains and our lives. And today's episode is about something that, you know, I'm obsessed with from the point of view of brain science, which is issues of conflict and empathy and in-groups and outgroups.

We find ourselves in a time of conflict. Now, the things on everyone's mind are the war in Ukraine and the war between Israel and Hamas. And even though those have sucked up most of the media's attention, it's critical to note that we have lots of other conflicts also going on around the globe right now. There's ongoing internal conflict in Myanmar. There's an Islamist insurgency in the Maghreb region of Africa. There's the Boko Haram insurgency in Nigeria. There's a civil war between rival factions in Sudan. There's the multi-sided conflict of the Syrian Civil War. There's the ongoing civil wars in Somalia and Ethiopia, and Afghanistan has been in near-continuous armed conflict for essentially as long as I've been alive.

Now, if you've been listening to this podcast for a while, you know that I'm very interested in why conflict happens between humans and why it happens so readily and consistently. So in this light, there are three main things to note from the neuroscience point of view. The first thing is that we all have the same brains on the inside.
So despite cultural differences and geographic differences, your pink brain looks exactly the same as someone else's, anyone else's, anywhere around the globe. And I don't mean that as a feel-good statement. That's just a biological fact. Anatomically modern humans radiated out of Africa only about sixty thousand years ago, and some turned left and became European, and others turned right and became Asian, and those who stayed remained African. But the timescale that we're talking about is some tens of thousands of years. That is simply not enough time on the evolutionary timescale for brains to change in any meaningful way, to change their operation or their algorithms. So we're all running the same version of the biological operating system, despite superficial differences in size and shape and culture and the costumes that we wear and how much melanin we have in our skin and so on. We can end up seeming pretty different as a result of these surface properties, but we have the same cognitive and emotional engine on the inside.

Now, unfortunately, one of the things that operating system does is it gives us all a very strong drive toward having in-groups and outgroups. This tribalism is something that characterizes our species. We belong to groups based on our neighborhood or the way we look, or our religion or our family or whatever it is, and we tend to feel closer to our in-groups and treat them better and listen to their opinions more strongly. Our outgroups, not so much. When it comes to someone from the other side of the railroad tracks, or who happens to have grown up under a different deity, or who sings different songs or uses different language sounds to transmit information, or whatever it is, we tend to treat them with more suspicion. And in decades of neuroscience experiments, including those from my lab, we've been able to show that your brain simply does not care as much about people in your outgroup.
At the extreme, your brain will view somebody in an outgroup the same way it will view an object: not like a fellow human the way it does with people in your in-group, but instead as something that's not particularly worthy of empathy, something that you have to deal with in the same way that you would deal with a virus or a cockroach or a rat. And if you're interested in more detail on this point, I dove deep into examples of this in episode twenty, and one of the things I talked about was an experiment that we did in my lab that got at this in-group/outgroup difference, where all we needed were single-word labels. So I'll just tell you the short version here of the experiment. We put you in brain imaging, in fMRI, and you're looking at six hands on the screen, and one of them gets stabbed by a syringe needle. Now, seeing that activates a network in your brain that we summarize as the pain matrix. Your brain is showing a big spike, and what that represents is you are feeling the other person's pain even though it's not your hand. The areas in your brain that care about pain are coming online. This is the basis of empathy, of feeling someone else's pain. But now we label the six hands with one-word labels: Christian, Jewish, Muslim, Hindu, Scientologist, atheist. And now we measure people of all religions in the scanner as these different hands get stabbed. And what we found is that everyone has a larger brain response when it's their own group that's getting stabbed, and their brain shows a smaller response when it's any of the other five outgroups. And this was true across religions, and even for the atheist group, which cared more when the atheist hand gets stabbed than the other hands.
So this is not an indictment of religion, but more broadly, it highlights your brain's exquisite skills at distinguishing your in-group from your outgroups, and the fact that your brain simply does not spend as much energy on simulating what it is like to be someone who is in your outgroup. If you want more details on that, please listen to episode twenty, which is called "Why does your brain care about some people more than others?"

And beyond the fact that we are naturally tribalistic, there's a third point to note, which is that every one of us feels like we know the truth, and it's not clear to us why other people don't see the truth as clearly as we do. They must be stupid or ill-informed, or trolls, or stubborn, or who knows why they don't simply admit to what is so obviously the truth. And deep down most people have an intuition that if you could just simply shout loudly enough in all caps on social media, that everyone would agree with you, everyone would see the light. And this is true of everything, not just political positions. Take religion. There are two thousand active religions on this small planet, and many people feel that if they could just share with all the infidels why their religion is right and everyone else is wrong, that everyone would agree with them. Everyone would come to know that the religion you happened to grow up in is the correct one. Now, if there actually were one correct religion, we might expect it to spread everywhere equally. But obviously that's not the case. Religions remain clustered around where they started, and you're not going to see a blossoming of Islam in Boise, Idaho, just like you won't expect to see a blossoming of Protestantism in Mecca. But here's the interesting part. Whether we're examining one's political preferences, or religious affiliation, or whatever, people are generally reluctant to change or to consider the perspectives of other people.
So, given the deep wiring that we have for having our in-groups and for feeling we know the right answer because of our very limited internal models, this leads to the critical question about whether there would be any way for eight billion brains to find peaceful coexistence. Now, if you've been a regular listener to the podcast, you know that I'm generally highly optimistic about everything, but I would suggest the answer to the question of whether we will achieve peace is probably not. We're probably not going to find ourselves ever in some magical moment where all the newspapers say, wow, nobody is fighting, we're taking a day off today. And we're not going to find some extraordinary presidential candidate where everyone says, yeah, that person's great. I think we can all agree presidential elections are always fairly close to fifty-fifty. Why? This results from the enormous variety of internal models inside everyone's heads. There is no perfect candidate, because no one can fit all the constraints at once of all the different models.

Okay, so if we take this all as our starting point for today, the question is: what does this all mean for the future of conflict? Are we facing a future where wars and polarization will stay as they are or even grow worse? Well, that's certainly a reasonable fear. But a little while ago I discovered there was a movement of political scholars around the idea of how to have better conflict. Now, what does better conflict mean? We'll get into that in a second, but first I just want to say I've been moved and inspired by this movement. It feels like it's the opposite of what we see online every day, where people pick a side and scream and yell for it, or fight and die for it, at the cost of not meaningfully considering the complexity of human situations. Just take what's happening in the Middle East.
Typically, if someone points out that the history there is extraordinarily multi-layered and complex, people will often say, no, this is actually quite simple, and then they will give their version of the history. But the extraordinary part is that people on both sides will deliver that same line. They will say, it's actually very simple. And to me, that's a barometer that tells me the situation is not so simple, but instead, like most human conflict, it's full of different paths that you can take through the history to come to a variety of conclusions. And we witness this sort of complexity whenever you have millions of people of all personalities involved in conflict on all sides. You have sinners and saints, and you have psychopaths and peace seekers, and no one can hope to derive a meaningful solution without doing the hard work of digging in and trying to understand what happens between groups of millions of humans, rather than imagining there's a good-guy side and a bad-guy side.

So, as I said, I was very moved to discover that some people really take on this sort of political intellectual humility and try to figure out: what do we do with humans? What do we do about the fact that they're never going to agree, that there's always going to be conflict? How do we make conflict more productive rather than simply being about bloodshed? So to explore this, I called up my colleague Jonathan Stray. Jonathan is a former journalist who is now a senior scientist at the Berkeley Center for Human-Compatible AI, where he works on the algorithms that feed up content to us online and how their operation affects things like polarization. And he writes a terrific newsletter called The Better Conflict Bulletin. So here's Jonathan.

So you are a conflict researcher. Tell us about what that means.
Speaker 3: Well, I try to study what causes people to disagree, and then how that disagreement transforms into polarization and eventually violence if unchecked, and how to build computer-controlled media systems like social media that will prevent that.

Speaker 1: I know that you think about conflict in terms of good and bad conflict. So what does that mean?

Speaker 3: Well, we don't want no conflict in society. Conflict is, well, first of all, it's inevitable. We all disagree, we want different things, unavoidably. But sometimes when we disagree, it comes to something productive, you know, we talk about it, we figure something out, and sometimes it doesn't, right? Sometimes it escalates to some sort of zero-sum game or even violence. The idea is, when we think about conflict, we're not trying to prevent people from fighting. We're trying to have them fight in some sort of productive way.

Speaker 1: So what does that look like? What does that mean?

Speaker 3: Well, the simplest example might be violent versus nonviolent, right? So we can all understand the difference between, you know, people trying to get their way by using force or getting their way by discussion, yes, but also nonviolent conflict tactics, you know, protests. You know, there's famous historical figures like Dr. Martin Luther King or Gandhi who really developed these techniques and showed that you could achieve, you know, large political victories without using physical force. So that's maybe the most obvious way, but there's a bunch of other stuff when you start thinking about this. So maybe you think about this at a personal level. You know, we've all had the experience of an argument where, you know, we said some ugly things that can't be unsaid, and you feel worse at the end of it.
And then I think most of us have had the experience of an argument where, you know, there were some hard truths, but it was honest, it was caring, it was empathetic, and even if we couldn't solve the problem at the end of it, we still have a positive regard for each other. And so that would be an example of good versus bad conflict.

Speaker 1: So one of the things that has always intrigued me is that we all have this illusion that we know the truth, that we have a complete story of what's going on, and therefore other people that disagree with us seem like they're trolls, or they're uninformed, or they're disingenuous. And so the question is, how do we address that illusion that we have? How do we get ourselves a little bit outside our own fishbowls to see the other side?

Speaker 3: Yeah, no, that's a great question. So there's a bunch of ways of answering that. I was at a conference yesterday on the topic of intellectual humility, the idea that actually we don't know the answer, there's always something we're missing. And this is important not just as an epistemological practice, that is, to be more correct in what we believe, but as a relational practice, meaning it changes how we relate and how we can relate to other people. And I think part of this is we always have a partial picture, and we tend to, in conflict situations, believe that the other person actually holds more extreme views than they do. There's really good evidence of that. This is the case in the American culture war, where if you ask one side what the other side thinks, they will imagine the other side is actually more extreme than they are. So conflict in particular is prone to these misperceptions of the other.

Speaker 1: So give us an example of how we know that each side thinks the other side is more extreme.

Speaker 3: Yeah. Yeah, there's tons of work on this.
There's a group called the Perception Gap which has done a bunch of research on this. And so, for example, if you ask Republicans how many Democrats would support completely open borders, they'll say something like seventy-five percent. If you ask Democrats themselves how many of them would support completely open borders, the number is much lower, so they're twenty or twenty-five percentage points off. And this pattern is consistent across a range of issues, and it's bidirectional as well. So, you know, if you ask Democrats something like, how many Republicans would say that everyone should have access to as many guns as they want, Democrats will guess that number a lot higher than Republicans will say.

Speaker 1: So, speaking of Democrats and Republicans, here's the question. Are we more polarized currently in America? Is that just an impression, or is that backed up by statistics?

Speaker 3: No, unfortunately, it's real, and you can measure it a lot of different ways. You can look at where people are on political issues, you can look at how people feel about each other. You can look at congressional voting patterns, whether, you know, members of Congress will cross the aisle to vote on each other's bills, and all of these measures show the same basic pattern, which is that polarization has been increasing since the mid seventies or maybe the early eighties and is now at historically high levels. And it's not just America, it's several other countries in the world as well.

Speaker 1: Right. Now, I know you and I have independently made arguments about why this is not entirely about social media, and of course a good thing to point to is the fact that this started in the seventies. But tell us your sense of the involvement and the non-involvement of social media in this polarization.

Speaker 3: Yeah, well, there's definitely some involvement.
There are a number of ways that social media, in particular social media algorithms, that is to say, the systems that decide what we see and in what order, can have some involvement. So probably a lot of your listeners have heard of the idea of the filter bubble, the idea that, you know, we're each trapped in this sort of algorithmically produced reality where we don't see the other side. The evidence tends to be against that. We actually do see quite a lot of cross-cutting content, you know, even if it's, you know, rage-clicking on an article that has a different viewpoint. But there's a bunch of other things going on. So, for example, most social media and other systems which select content for us, like news recommenders and so forth, rank things based on basically the number of clicks they get, on engagement. And we really do pay more attention to things that are valuable to us, but also we pay more attention to things that are threatening, offensive, fearful. So this approach of ranking things by how much attention we give them also tends to amplify more extreme or scarier items.
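To make that ranking idea concrete, here is a minimal Python sketch of engagement-based ranking of the kind described above. It is not any platform's actual code; the post fields, the numbers, and the outrage penalty in the second function are all illustrative assumptions, and the penalty variant is just one way a different scoring rule could change the ordering, not a description of Jonathan's own proposals.

```python
# Minimal sketch (illustrative only) of engagement-based ranking:
# ordering items purely by predicted engagement floats the most
# provocative post to the top of the feed.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float  # hypothetical engagement prediction
    outrage_score: float     # hypothetical "how provocative is this" signal

def rank_by_engagement(posts):
    """Pure engagement ranking: whatever draws the most attention wins."""
    return sorted(posts, key=lambda p: p.predicted_clicks, reverse=True)

def rank_with_penalty(posts, penalty=1.0):
    """One possible alternative: keep engagement but discount divisive content."""
    return sorted(posts,
                  key=lambda p: p.predicted_clicks - penalty * p.outrage_score,
                  reverse=True)

feed = [
    Post("Calm policy explainer", predicted_clicks=3.0, outrage_score=0.5),
    Post("Outrage-bait hot take", predicted_clicks=5.0, outrage_score=4.0),
]
print([p.text for p in rank_by_engagement(feed)])  # hot take ranks first
print([p.text for p in rank_with_penalty(feed)])   # explainer ranks first
```

The point of the sketch is only that the choice of scoring function, not just what people post, shapes what gets amplified.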
Speaker 1: Now, I did an episode a few episodes ago where I mentioned that for the Iroquois Native Americans, up in essentially Wisconsin and Canada, some hundreds of years ago, they were having bloody battles between these different tribes all the time, and they had a new leader come in who came to be known as the Great Peacemaker, because one of the things he did was assign everybody in the tribes to different clans. So you might be a member of the Turtle clan and I'm a member of the Heron clan, even though we're members of the same tribe. And it turns out these allegiances were cross-cutting, so that each person had their allegiance to their tribe and also to their clan. And these weren't equivalent, and that ended up bringing peace, because now it wasn't so simple to have a clear in-group and outgroup, because they were mixed. And I know that you in your work with social media have been looking at something similar to this, which is this issue of how an algorithm should rank posts. So tell us about that.

Speaker 3: Yeah, so that is a great example. I've heard of this Iroquois example as well. And what is happening is conflict is tied up with identity. We fight basically against people who are different than us. And in the last few decades in this country, identity has become more and more collapsed into a single dimension of, you know, left versus right, red versus blue, Republicans versus Democrats. It didn't actually used to be this way. People didn't identify with their political parties in the same way a generation ago. And also there was much more mixed identification at the national versus local level. So, you know, you would have some town where, you know, you had a Republican mayor, but you had, you know, progressive politics, because that's what the people there believed in and wanted. Politics has become very nationalized. There's less local news, there's less split-ticket voting. Every issue takes place at this huge scale where there's only two ends to the spectrum. So how you would have to reverse this is you would have to have some sort of sense of not necessarily local, but community-level values. It would have to be okay for, you know, specific groups online, or perhaps discussions, to have opinions that just don't quite fall along this left-right axis.

Speaker 1: And what's the reason that we have seen more sorting into these groups? What's the reason that particular issues go together that don't necessarily belong together?

Speaker 3: Well, again, it's an identity thing.
One of the things that makes all of these issues sort of one big blob, where you only have two choices, is merely exposure to people who are far away from us, either physically or socially. So, you know, maybe you only talk to your neighbors in your town, or you only read the local newspaper, and so you could have this sort of quixotic local politics in a way that was healthy. But now we are exposed to people all across the country and indeed the world on a daily basis. So now we're comparing our point of view to someone, you know, that we'll never meet, who's far away from us. And there's simulations that show you don't need any antagonism for people to sort themselves into two big tribes. All you need is, when you talk to someone and they're, you know, near enough like you, you get a little bit closer to them, right, which is a natural human thing. When two people talk, they tend to, you know, converge a little on their views and identity. But because those interactions are now happening all across the country, we're coalescing into sort of two big groups. It's kind of a paradox, right? We want to be able to talk to people who are very far away from us, you know, that's the promise of these online networks, and yet it seems to make our conflicts global.

Speaker 1: And there are network science models on this, right, that demonstrate how groups end up dividing like this. And what is the evidence that these network science models actually cash out in real life?

Speaker 3: Yeah. So these models developed actually from models built in the nineteen sixties to study physical racial segregation. And what they showed is that you don't need racism, in the sense of not wanting to live near people who are a different race than you.
All you need is a slight preference, you know, maybe a few percent, to live near people who are more like you, and that is enough for entire neighborhoods to eventually segregate. And so those same models are now being applied to online networks, and they show very similar results. People, if they have a slight preference for people who are maybe a little more like them in their politics, will spend more time, in a bunch of ways, right, hanging out in the same groups, or maybe, you know, friending them or following them, that sort of thing. And so we see this sort of global splitting.
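The nineteen-sixties segregation models mentioned here are usually attributed to Thomas Schelling. Below is a toy one-dimensional sketch of that idea; the parameters, the swap rule, and the clustering measure are made-up illustrations rather than anything from a specific study, but they show the basic point that a mild preference for similar neighbors tends to produce more clustering than random mixing.

```python
# Toy 1D Schelling-style sorting model (illustrative parameters only):
# agents want just a minority of their neighbors to be like them,
# yet clusters of like agents still tend to form.
import random

def run_sorting_model(n=200, similar_wanted=0.4, steps=20000, seed=None):
    if seed is not None:
        random.seed(seed)
    agents = [random.randint(0, 1) for _ in range(n)]  # two identities on a line

    def unhappy(i):
        # Unhappy if fewer than `similar_wanted` of the immediate neighbors
        # share agent i's identity.
        neighbors = [agents[j] for j in (i - 1, i + 1) if 0 <= j < n]
        same = sum(1 for v in neighbors if v == agents[i])
        return same / len(neighbors) < similar_wanted

    for _ in range(steps):
        i, j = random.randrange(n), random.randrange(n)
        if unhappy(i):  # unhappy agents relocate by swapping with a random spot
            agents[i], agents[j] = agents[j], agents[i]

    # Clustering measure: fraction of adjacent pairs with the same identity
    # (roughly 0.5 under random mixing).
    same_pairs = sum(agents[i] == agents[i + 1] for i in range(n - 1))
    return same_pairs / (n - 1)

print(run_sorting_model())  # usually ends up above the ~0.5 of random mixing
```

Running it a few times and comparing the result with the roughly 0.5 you get before any moves gives a feel for the "sorting without antagonism" effect described in the conversation.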
Speaker 1: Right. So before we drop this, I just want to return to this issue of what does social media not have to do with the current polarization. And certainly it seems like one of the issues is this, you know, network segregation. What else do you see where you think that social media is maybe not the sole culprit, as opposed to human behavior?

Speaker 3: Yeah. Well, one of the things I like to say is that, you know, if Facebook could fix our democracy by changing one hundred lines of code, they would have done it already, right? If only our problem was that simple. But polarization is one of these problems that requires an all-of-society approach. So it's, you know, social media, yes, but it's also the media, journalists. Journalists are going to have to learn to cover politics in different ways. And it's the politicians themselves. There is always an advantage to taking a view that is more extreme, because it motivates people, and so it's kind of a devil's bargain, right? You can get people to turn out and show up for your cause by pressing on a divisive issue, but doing that divides society further. So we actually need a whole bunch of sectors of society to move in the same direction. And there's a pair of conflict scholars called Guy and Heidi Burgess, and they have a wonderful article on forty different roles that people have to do to create a healthy and peaceful democracy. And what do they say? Well, some of them are exactly what you'd expect, right? So we already talked about journalists and so forth. But they also talk about issue analysts. There are people who have to go deep on particular issues. They talk about healers, people who present a vision where we have emotionally healthy relationships with each other. They talk about, they call them democracy firsters, people who are concerned first and foremost with the correct functioning of elections and rule of law. Once you start thinking this way, you realize there's a lot of different roles to play, and a lot of people have to be pulling in the same direction. And they say that what we need is a generation of people interested in conflict, in much the same way that we now have a generation of people who are working on climate change.

Speaker 1: Specifically, who are interested in doing conflict right, exactly, because we certainly have a generation interested in conflict. What do you see when you look around at college campuses? Jonathan, you're located at Berkeley, and I know that you are a very wise person who keeps a foot in both camps and tries to see things from all sides. That's not the reputation that Berkeley has in particular. So when you look around at the campus, what do you see? And is anything different about this generation than previous ones?

Speaker 3: Yeah. I mean, you know, that's a loaded question, right, the kids today and so forth. But yeah, I mean, of course Berkeley is a liberal campus, right? You know, it's a public university in one of the most liberal places in America. So it has its own politics.
I do see, and of course it's not just Berkeley, that the current generation of students are, I would say, more involved in politics, which I would say is a good thing, but also less open or less compromising, which troubles me in particular when it leads to the suppression of alternative views. Now, I don't want to say that, you know, anybody should be able to say anything without any consequences. I think there are real issues here, for example, inclusion. You don't want to make people feel like they don't belong at an academic institution. However, where it sort of crosses a line for me is where saying something a little unorthodox or a little challenging becomes impossible. People are scared to do it. And I'm not talking about, you know, the haters here. I'm talking about people who are trying to engage in good faith and maybe saying something that is, you know, a little controversial, and there's just less tolerance for that than there used to be. And I see this both sort of anecdotally, you know, talking to the faculty, but also there's a bunch of data on this. Ironically, Berkeley, which is associated with free speech very closely, right, we have the Free Speech Cafe on campus, and this was the heart of the free speech movement in the nineteen sixties, is now a place where people are often criticizing free speech as being too permissive, and I find that very troubling.

Speaker 1: So how do you think about dealing with that? As you think about training the next generation, for example, with all these different roles that we might need to have better conflict, how do you do that?

Speaker 3: Yeah, well, you know, I tried to do this in my classes yesterday. I mean, it's really about, sort of, I think it requires two things, right. One is you have to create a sense of psychological safety in some way.
Right? In my classes, you know, we discuss algorithm design, we discuss racially biased algorithms, we discuss all this stuff. And one of the things I say at the top of those classes is like, okay, so, you know, we're going to talk about some issues which are very charged. I understand that some of you are going to have, you know, real upsetting personal experience with this stuff. But, you know, I ask that you give it a shot, and that, you know, you try to engage in a spirit of curiosity. And we're not always going to get things right, but I don't want anyone to ever accuse us of not being thoughtful or careful or empathetic. And I think that's the spirit you have to approach this with, this spirit of curiosity. So, for example, why do we disagree about this? What is it about, you know, this student's experience and that student's experience that leads to such dramatically different conclusions on, say, affirmative action, or, you know, media bias, or how we should deal with crime, or what we should do about immigration? That came from somewhere. And very often what we find when we have the conversation that way is either there's some intense personal experience, family background. You know, my mother came from El Salvador, my father was deported, you know, I had to grow up in economic uncertainty, you know, something like this. Or we find that people shaped their ideas from their social network and never really thought about it. And this is part of what polarization is, is that all of our political ideas end up sort of collapsing onto this left-right axis. There's no logical reason why, if you are pro-choice, you should also be concerned about climate change, and yet here we are. So there's some sort of social process that makes everybody split into two camps in this way, and we can counteract that by thinking carefully for ourselves and having discussions with people who might have a different view.
Speaker 1: So you address this in your class, but how do you think about taking this to a larger world? And before we get into social media algorithms, which I want to do, you know, one of the things that struck me: let's take the Israel-Hamas conflict going on right now. Whenever it comes up about, okay, which side launched that rocket, it turns out that no amount of evidence sways anybody on that. People have their side and they generally stick with it, as far as I can tell. This is just a, you know, view from surfing a lot of social media on this stuff. The question is, how do you expand beyond your class to get people to ask these questions about changing their point of view? Not even necessarily changing it, but just being willing to examine other pieces of evidence.

Speaker 3: First, let's talk about facts. I believe that facts matter. I believe that they're deeply important. I used to be an investigative journalist. We can't know everything, but we can know some things. The problem is what facts mean depends on who you ask. So in the, you know, Israel-Palestine conflict, the fact that Palestinians were living there before nineteen forty-seven is a fact that matters deeply to a lot of people. The fact that Jews were living there three thousand years ago is a fact that matters deeply to a lot of people. So these facts are not in dispute. What is in dispute is the meaning of these facts, which points to the problem being fundamentally relational in many cases rather than factual. And people can be misinformed, I'm not disputing that, but often what you find is it is about how people feel about each other and about how they are relating to each other. Now, so you ask what we can do. So this is probably the moment to mention that I write a newsletter called the Better Conflict Bulletin, and we are news and analysis for a better culture war.
576 00:33:41,640 --> 00:33:45,560 Speaker 3: And the idea here is that many people have this 577 00:33:45,840 --> 00:33:50,800 Speaker 3: gut sense that we're fighting ugly, and we try to 578 00:33:50,840 --> 00:33:53,960 Speaker 3: explore in this newsletter what would it be to fight better? 579 00:33:54,400 --> 00:33:59,600 Speaker 3: So we cover conflict research, the science of how people 580 00:34:00,520 --> 00:34:05,120 Speaker 3: misunderstand and misperceive each other, and we cover people who 581 00:34:05,160 --> 00:34:08,319 Speaker 3: are successfully navigating the culture in a more productive way. 582 00:34:08,920 --> 00:34:12,279 Speaker 3: So there are people exploring these topics, there's not a 583 00:34:12,280 --> 00:34:17,360 Speaker 3: lot of coverage for it because, as they say in 584 00:34:17,400 --> 00:34:21,480 Speaker 3: the peace building field, sometimes peace has no natural constituency. 585 00:34:21,680 --> 00:34:24,399 Speaker 3: It's very easy to get people excited about winning. It's 586 00:34:24,400 --> 00:34:27,520 Speaker 3: harder to get people excited about living together harmoniously. 587 00:34:27,800 --> 00:34:29,960 Speaker 1: That's right. That's why when I met I just met 588 00:34:30,000 --> 00:34:32,239 Speaker 1: you very recently, Jonathan, just a few weeks ago, and 589 00:34:32,280 --> 00:34:35,399 Speaker 1: I was so excited by the kind of work you're doing. 590 00:34:35,440 --> 00:34:38,200 Speaker 1: Because I come from a neuroscience angle. I study a 591 00:34:38,200 --> 00:34:41,279 Speaker 1: lot about in groups and outgroups and empathy and how 592 00:34:41,320 --> 00:34:45,920 Speaker 1: we so easily relegate others to a different group. I 593 00:34:46,000 --> 00:34:47,839 Speaker 1: was so excited to learn that you and others are 594 00:34:47,840 --> 00:34:51,719 Speaker 1: doing the boots on the ground work of trying to 595 00:34:52,120 --> 00:34:56,040 Speaker 1: bring groups of people together. And so I still want 596 00:34:56,040 --> 00:34:58,719 Speaker 1: to drill in on this point though. So you've got 597 00:34:58,719 --> 00:35:01,880 Speaker 1: the newsletter, which is to I'm a paid subscriber to that. 598 00:35:02,440 --> 00:35:04,720 Speaker 1: What are the things that you think about though, besides 599 00:35:04,760 --> 00:35:06,440 Speaker 1: social media, which we'll get to in a second, And 600 00:35:06,480 --> 00:35:09,799 Speaker 1: that's obviously a big leverage point. But are there any 601 00:35:09,840 --> 00:35:12,080 Speaker 1: other things you can do when you think about how 602 00:35:12,080 --> 00:35:13,720 Speaker 1: do we get people to fight less ugly. 603 00:35:14,520 --> 00:35:17,319 Speaker 3: Yeah, so there's a bunch of sort of immediate things 604 00:35:17,360 --> 00:35:20,600 Speaker 3: that you can do. So, first of all, there's this 605 00:35:21,600 --> 00:35:25,520 Speaker 3: perception gap, so misperceptions, so we tend to be both 606 00:35:25,600 --> 00:35:29,200 Speaker 3: misinformed about what the other side actually believes and we 607 00:35:29,280 --> 00:35:33,280 Speaker 3: tend to stereotype them, meaning you know, if a Republican 608 00:35:33,320 --> 00:35:35,960 Speaker 3: thinks about a Democrat, or a Democrat thinks about a Republican, 609 00:35:36,000 --> 00:35:39,919 Speaker 3: they think about the most extreme version of that right. 
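A minimal sketch of the perception-gap measurement Jonathan refers to here and in the next exchange: comparing what partisans estimate about the other side with what the other side actually reports. The survey numbers and the simple gap formula below are illustrative assumptions, not data or methods from any real study.

```python
# Hypothetical illustration of a "perception gap": how much respondents
# overestimate the share of the out-party that holds an extreme view.

def perception_gap(perceived_shares, actual_share):
    """Average overestimate, in percentage points, of out-party extremity."""
    avg_perceived = sum(perceived_shares) / len(perceived_shares)
    return avg_perceived - actual_share

# Each value: one respondent's estimate of the percent of the other party that
# would, say, condone violence if their side lost an election (made-up numbers).
perceived = [60, 45, 70, 55, 50]
actual = 12  # made-up share of the other party that actually reports that view

print(f"Average perceived: {sum(perceived) / len(perceived):.0f}%")
print(f"Actual: {actual}%")
print(f"Perception gap: {perception_gap(perceived, actual):.0f} points")
```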
610 00:35:40,400 --> 00:35:41,920 Speaker 3: And you can see this in the data when you 611 00:35:41,920 --> 00:35:44,440 Speaker 3: ask people, you know, what is the distribution 612 00:35:44,600 --> 00:35:48,239 Speaker 3: of how many of, you know, the opposite side 613 00:35:48,280 --> 00:35:50,279 Speaker 3: will support, let's say, violence if they don't get their 614 00:35:50,280 --> 00:35:52,680 Speaker 3: way in an election, right, and you get these like 615 00:35:52,880 --> 00:35:56,279 Speaker 3: very extreme distributions, where in reality there's actually much 616 00:35:56,320 --> 00:35:59,720 Speaker 3: more overlap. So yeah, first, you can inform yourself 617 00:35:59,719 --> 00:36:02,480 Speaker 3: about what other people actually believe. 618 00:36:02,560 --> 00:36:03,560 Speaker 4: Honest in its way. 619 00:36:04,040 --> 00:36:09,000 Speaker 3: Second, there is a set of techniques for how to 620 00:36:09,040 --> 00:36:15,160 Speaker 3: have conversations with someone who, not just like, you know, 621 00:36:15,280 --> 00:36:18,600 Speaker 3: a polite conversation, but like genuinely has different values than you. 622 00:36:19,400 --> 00:36:22,200 Speaker 4: And it's got to start with curiosity. It's got to 623 00:36:22,200 --> 00:36:22,960 Speaker 4: start with listening. 624 00:36:23,520 --> 00:36:26,239 Speaker 3: And I would suggest don't go into those conversations with 625 00:36:26,280 --> 00:36:27,719 Speaker 3: the goal of changing someone's mind. 626 00:36:27,800 --> 00:36:29,480 Speaker 4: You wouldn't want them to do that with you. 627 00:36:29,920 --> 00:36:32,040 Speaker 3: Go into those conversations with the goal of trying to 628 00:36:32,120 --> 00:36:36,200 Speaker 3: understand how they got to that place. Because when we're 629 00:36:36,200 --> 00:36:40,080 Speaker 3: talking about the American conflict, we're talking about how you disagree 630 00:36:40,120 --> 00:36:43,480 Speaker 3: with half of America. Well, half of America, you know, 631 00:36:43,600 --> 00:36:48,360 Speaker 3: isn't stupid. You know, they're not like fundamentally broken or 632 00:36:48,440 --> 00:36:51,319 Speaker 3: evil or something. They're going to be pretty average, just 633 00:36:51,360 --> 00:36:53,560 Speaker 3: like the other half of America. So how is it 634 00:36:53,640 --> 00:36:57,280 Speaker 3: that smart and kind people can end up believing something 635 00:36:57,320 --> 00:37:01,200 Speaker 3: completely different than you? That's the thing to get curious about. 636 00:37:01,760 --> 00:37:04,279 Speaker 3: And the third thing I would say is watch for 637 00:37:04,360 --> 00:37:08,399 Speaker 3: your own emotional reactions. Watch for where, you know, you 638 00:37:08,440 --> 00:37:13,560 Speaker 3: get, you know, angry or let's just say uptight, or 639 00:37:14,000 --> 00:37:15,799 Speaker 3: you know, really have this feeling that you have to 640 00:37:15,840 --> 00:37:21,000 Speaker 3: defend something or protect something, because those reactions will lead 641 00:37:21,040 --> 00:37:24,880 Speaker 3: you to, first, what is it that you're scared of, 642 00:37:24,960 --> 00:37:25,800 Speaker 3: that you're worried about? 643 00:37:25,840 --> 00:37:27,320 Speaker 4: Where are your fears and concerns?
644 00:37:27,560 --> 00:37:32,560 Speaker 3: And second, if you can notice those and not be 645 00:37:32,800 --> 00:37:35,680 Speaker 3: overtaken by them, you can have much better relationships with 646 00:37:35,680 --> 00:37:38,400 Speaker 3: people who disagree with you. And actually, one of the 647 00:37:38,520 --> 00:37:43,240 Speaker 3: practices that we teach is to speak them aloud, right, 648 00:37:43,320 --> 00:37:46,600 Speaker 3: don't project your anger onto the person across from you, 649 00:37:46,840 --> 00:37:49,880 Speaker 3: who you're probably stereotyping, who, you know, never did you 650 00:37:49,920 --> 00:37:52,520 Speaker 3: personally any wrong. Just say, you know, when you say that, 651 00:37:53,800 --> 00:37:56,720 Speaker 3: I get very upset, I have a reaction, I feel anger 652 00:37:56,840 --> 00:37:59,760 Speaker 3: thinking about that, without directing it towards the other person. 653 00:38:00,239 --> 00:38:03,160 Speaker 3: We can't hide our emotions. If we're going to relate better, 654 00:38:03,200 --> 00:38:04,360 Speaker 3: they have to be on the table. 655 00:38:20,520 --> 00:38:23,480 Speaker 1: If you were suddenly assigned to be the education czar 656 00:38:23,520 --> 00:38:27,239 Speaker 1: for the nation, how would you build junior high or 657 00:38:27,320 --> 00:38:32,080 Speaker 1: high school classes to teach kids about this, about how 658 00:38:32,120 --> 00:38:35,080 Speaker 1: to relate, about how to, let's say, steelman each other's 659 00:38:35,239 --> 00:38:37,959 Speaker 1: arguments so that they can understand them better? 660 00:38:38,760 --> 00:38:43,879 Speaker 3: Yeah, so what we need to do is give young 661 00:38:44,000 --> 00:38:49,799 Speaker 3: people the ability and the experience of relating successfully across 662 00:38:50,520 --> 00:38:53,279 Speaker 3: value divides. And there's actually a bunch of organizations that 663 00:38:53,360 --> 00:38:56,000 Speaker 3: do this both at the K through twelve and the 664 00:38:56,080 --> 00:38:59,399 Speaker 3: university level. You know, these programs, some of them 665 00:38:59,400 --> 00:39:03,000 Speaker 3: are in-class, teacher-led, where they ask students, 666 00:39:03,640 --> 00:39:05,160 Speaker 3: you know, what is the thing you 667 00:39:05,160 --> 00:39:08,760 Speaker 3: feel very strongly about, and either try to find students 668 00:39:08,800 --> 00:39:12,319 Speaker 3: who have a different opinion and teach them to constructively 669 00:39:12,400 --> 00:39:16,120 Speaker 3: talk about that, or do things like, there was a 670 00:39:16,160 --> 00:39:19,440 Speaker 3: recent New York Times piece about a writing teacher who says, 671 00:39:19,960 --> 00:39:22,400 Speaker 3: come up with a character who you think is a 672 00:39:22,480 --> 00:39:27,120 Speaker 3: terrible human being, or something that no one should absolutely 673 00:39:27,120 --> 00:39:30,520 Speaker 3: ever say, and write a story where they say that 674 00:39:30,600 --> 00:39:33,560 Speaker 3: thing in context in a way which is sympathetic or 675 00:39:33,600 --> 00:39:36,640 Speaker 3: humanizing towards them. And I think this is the 676 00:39:36,680 --> 00:39:40,520 Speaker 3: fundamental skill: to be able to see the humanity in people, 677 00:39:40,640 --> 00:39:45,919 Speaker 3: even in moments of profound disagreement. And I don't mean 678 00:39:45,920 --> 00:39:51,000 Speaker 3: to minimize the actual stakes of these types of conversations, right.
679 00:39:51,080 --> 00:39:53,920 Speaker 3: You know, if you are an immigrant who's talking to 680 00:39:53,960 --> 00:39:56,719 Speaker 3: someone who says, you know, we shouldn't allow any more 681 00:39:56,719 --> 00:40:00,440 Speaker 3: immigration to this country, that has a deep personal impact 682 00:40:00,480 --> 00:40:02,120 Speaker 3: on you because it may means you never get to 683 00:40:02,120 --> 00:40:03,120 Speaker 3: see your family again. 684 00:40:03,600 --> 00:40:03,799 Speaker 4: Right. 685 00:40:04,080 --> 00:40:06,480 Speaker 3: So I'm not saying that, you know, we should all 686 00:40:06,480 --> 00:40:09,000 Speaker 3: talk until we get along. I'm saying we have to 687 00:40:09,040 --> 00:40:14,000 Speaker 3: see our political adversaries as human. And there are a 688 00:40:14,040 --> 00:40:16,480 Speaker 3: bunch of organizations who try to train people to do 689 00:40:16,520 --> 00:40:19,719 Speaker 3: this and also give people the experience of I had 690 00:40:19,760 --> 00:40:23,240 Speaker 3: a conversation with someone who disagrees with me and it went, okay. 691 00:40:24,040 --> 00:40:28,400 Speaker 3: We are aversive to these conversations. They're hard, they're emotionally taxing, 692 00:40:28,960 --> 00:40:31,560 Speaker 3: there's no guarantee they're going to come out well. So 693 00:40:31,600 --> 00:40:34,560 Speaker 3: we have to give people the confidence to engage in 694 00:40:34,600 --> 00:40:35,040 Speaker 3: this way. 695 00:40:36,280 --> 00:40:38,719 Speaker 1: So I'm so glad to hear you use the word humanizing, 696 00:40:38,760 --> 00:40:41,920 Speaker 1: because that's really the key from a neuroscience point of view. 697 00:40:41,960 --> 00:40:44,440 Speaker 1: The issue is that people in our out group, we 698 00:40:44,520 --> 00:40:48,560 Speaker 1: actually analyze them with our brains in a different way, 699 00:40:48,719 --> 00:40:51,520 Speaker 1: such that they are more like an object than a 700 00:40:51,600 --> 00:40:55,680 Speaker 1: fellow human. And that's how that opens the door to 701 00:40:56,640 --> 00:41:00,480 Speaker 1: you know, genocide of various sorts, or a lower level 702 00:41:00,600 --> 00:41:03,480 Speaker 1: you know, violence, or even just insulting or whatever the 703 00:41:03,520 --> 00:41:05,640 Speaker 1: thing is. We just don't care about them in the 704 00:41:05,640 --> 00:41:07,720 Speaker 1: way that we care about the people that we consider 705 00:41:08,160 --> 00:41:11,080 Speaker 1: in our in group. And yet obviously we're all made 706 00:41:11,080 --> 00:41:13,480 Speaker 1: of the same biology. We all come about from our 707 00:41:13,520 --> 00:41:17,799 Speaker 1: genetics and our experience, and it feels like humanization is 708 00:41:17,840 --> 00:41:20,080 Speaker 1: really the key. So you were just telling me about 709 00:41:20,120 --> 00:41:22,160 Speaker 1: what could be done with high school kids. Tell me 710 00:41:22,520 --> 00:41:25,239 Speaker 1: what you and others are doing with adults in terms 711 00:41:25,280 --> 00:41:28,240 Speaker 1: of helping them bridge the gap. 712 00:41:28,920 --> 00:41:31,480 Speaker 3: So there's a bunch of bridge building organizations. There's a 713 00:41:31,480 --> 00:41:33,880 Speaker 3: big one called braver Angels. And what this is is 714 00:41:33,920 --> 00:41:38,440 Speaker 3: an organization which organizes they call them Red Blue Conversations. 
715 00:41:39,200 --> 00:41:41,799 Speaker 3: There's a bunch of local chapters, plus they do an 716 00:41:41,800 --> 00:41:44,120 Speaker 3: online version, so you can actually sign up and say, you know, 717 00:41:44,680 --> 00:41:47,319 Speaker 3: I'm in, you know, this town in Indiana, and I 718 00:41:47,360 --> 00:41:49,040 Speaker 3: want to have a conversation with people who. 719 00:41:48,880 --> 00:41:49,440 Speaker 4: Disagree with me. 720 00:41:50,000 --> 00:41:53,239 Speaker 3: And they have a particular way that they mediate and 721 00:41:53,280 --> 00:41:57,120 Speaker 3: facilitate these conversations to make them productive. And so if 722 00:41:57,120 --> 00:41:59,520 Speaker 3: you want to have that encounter, you can have it. 723 00:41:59,520 --> 00:42:00,920 Speaker 3: They are a group that will set that up for 724 00:42:00,960 --> 00:42:02,120 Speaker 3: you and show you how to do it. 725 00:42:02,360 --> 00:42:04,239 Speaker 1: Do the people who sign up for that, are 726 00:42:04,600 --> 00:42:06,920 Speaker 1: some of them just trying to win, and they think, yes, 727 00:42:07,000 --> 00:42:08,759 Speaker 1: I want to meet people from the other side so 728 00:42:08,800 --> 00:42:09,920 Speaker 1: I can convince them? 729 00:42:10,600 --> 00:42:16,200 Speaker 3: Possibly. But the moderation format is designed to prevent that. 730 00:42:16,520 --> 00:42:18,480 Speaker 3: They don't let people bludgeon each other. And they 731 00:42:18,560 --> 00:42:21,120 Speaker 3: say right up front, the goal here is not to 732 00:42:21,200 --> 00:42:24,799 Speaker 3: change someone's mind. The goal is to increase understanding. 733 00:42:25,320 --> 00:42:27,160 Speaker 1: Oh, that's lovely. Okay, great. So you're about to tell 734 00:42:27,160 --> 00:42:28,240 Speaker 1: me about another organization. 735 00:42:28,760 --> 00:42:31,239 Speaker 3: Yeah, so there's a bunch of organizations doing this kind 736 00:42:31,239 --> 00:42:34,759 Speaker 3: of bridging work. I'm also involved in an organization called 737 00:42:34,800 --> 00:42:37,960 Speaker 3: the Dignity Index. And what this is is they've created 738 00:42:37,960 --> 00:42:42,880 Speaker 3: a scale from contempt to dignity to rate political speech. 739 00:42:43,200 --> 00:42:47,520 Speaker 3: So when a politician talks about their opponent, are they saying, 740 00:42:47,719 --> 00:42:50,080 Speaker 3: you know, those people are evil, we have to destroy 741 00:42:50,120 --> 00:42:53,760 Speaker 3: them to save America? Or are they saying, I respect 742 00:42:53,840 --> 00:42:56,360 Speaker 3: what they believe. I believe something different. I think my 743 00:42:56,440 --> 00:42:58,160 Speaker 3: plan is better and if I win, we're going to 744 00:42:58,200 --> 00:43:01,640 Speaker 3: work together. Those are really different things, and the scale 745 00:43:01,640 --> 00:43:04,680 Speaker 3: actually goes, it's an eight-point scale. It goes from 746 00:43:04,960 --> 00:43:08,719 Speaker 3: literally calling for genocide to literally saying that you see 747 00:43:08,719 --> 00:43:11,839 Speaker 3: no difference between self and other. And so what 748 00:43:11,880 --> 00:43:14,480 Speaker 3: they're using this for is a couple different things.
First 749 00:43:14,640 --> 00:43:18,080 Speaker 3: is they're putting together sort of scorecards in the upcoming election 750 00:43:18,760 --> 00:43:22,440 Speaker 3: that just rate different candidates, especially at the local level, on 751 00:43:22,480 --> 00:43:24,960 Speaker 3: whether they speak with contempt or dignity. 752 00:43:25,040 --> 00:43:25,839 Speaker 4: And the second thing. 753 00:43:25,719 --> 00:43:30,440 Speaker 3: They're doing is organizing student groups in universities to train 754 00:43:30,560 --> 00:43:34,759 Speaker 3: people to rate political speech on this scale, which is 755 00:43:34,840 --> 00:43:38,440 Speaker 3: less about producing the ratings and more about getting people 756 00:43:38,560 --> 00:43:41,880 Speaker 3: to think in this way and to notice when people 757 00:43:41,920 --> 00:43:47,120 Speaker 3: are engaging in, let's say, constructive versus destructive disagreement. 758 00:43:47,880 --> 00:43:51,520 Speaker 1: Excellent. Okay, so those are things that can be done 759 00:43:51,640 --> 00:43:55,680 Speaker 1: on an individual level. On a societal level, tell me 760 00:43:55,760 --> 00:43:58,480 Speaker 1: how you think about, for example, journalism and what might 761 00:43:58,520 --> 00:43:59,080 Speaker 1: be done there. 762 00:44:00,040 --> 00:44:00,239 Speaker 4: Yeah. 763 00:44:00,280 --> 00:44:03,600 Speaker 3: So, as I mentioned, I used to be a professional journalist. 764 00:44:03,600 --> 00:44:05,640 Speaker 3: I was an editor at the AP, I was an 765 00:44:05,640 --> 00:44:10,000 Speaker 3: investigative journalist at ProPublica. So I've seen the machine from 766 00:44:10,080 --> 00:44:14,040 Speaker 3: the inside. And what I will say about journalists is, 767 00:44:14,040 --> 00:44:16,760 Speaker 3: first of all, I have enormous respect for my colleagues 768 00:44:16,760 --> 00:44:19,600 Speaker 3: in journalism, and they are some of the most deeply 769 00:44:19,640 --> 00:44:22,480 Speaker 3: idealistic people that I know. Right, this isn't 770 00:44:22,520 --> 00:44:27,480 Speaker 3: a conspiracy. But most of them are pretty politically liberal, 771 00:44:27,920 --> 00:44:32,120 Speaker 3: and that's not a conspiracy either. That is because, especially 772 00:44:32,200 --> 00:44:34,840 Speaker 3: with the decline of local news, most of them work 773 00:44:35,040 --> 00:44:38,000 Speaker 3: for national outlets in big cities on. 774 00:44:37,960 --> 00:44:38,600 Speaker 4: The East Coast. 775 00:44:39,239 --> 00:44:42,920 Speaker 3: The social context in which they exist is pretty liberal, 776 00:44:42,920 --> 00:44:45,400 Speaker 3: and in particular, much farther to the left than the 777 00:44:45,440 --> 00:44:46,320 Speaker 3: median American. 778 00:44:47,680 --> 00:44:48,719 Speaker 4: You know what that. 779 00:44:48,719 --> 00:44:53,719 Speaker 3: Means is they will have less contact with and less 780 00:44:53,800 --> 00:44:59,040 Speaker 3: understanding of conservatives, which means that conservatives will not see 781 00:44:59,280 --> 00:45:04,000 Speaker 3: their views or identity reflected in the coverage of mainstream journalism. 782 00:45:04,400 --> 00:45:07,160 Speaker 1: And this is because conservatives tend to live in more 783 00:45:07,239 --> 00:45:08,000 Speaker 1: rural areas. 784 00:45:08,200 --> 00:45:14,640 Speaker 3: Yes, yeah, it's because demographically journalists are not like conservatives, right. 785 00:45:14,680 --> 00:45:18,359 Speaker 3: And it's not, again, this isn't a conspiracy.
This just 786 00:45:18,719 --> 00:45:19,839 Speaker 3: it's pure demographics. 787 00:45:19,920 --> 00:45:24,000 Speaker 4: Right. They are educated, urban people. 788 00:45:24,640 --> 00:45:27,280 Speaker 3: And you know, one of the strongest correlates of political 789 00:45:27,320 --> 00:45:29,560 Speaker 3: identity is population density. 790 00:45:29,640 --> 00:45:32,719 Speaker 4: Right, It's that simple in many ways. That's another way 791 00:45:32,719 --> 00:45:34,400 Speaker 4: you can talk about the divides in this country is 792 00:45:34,440 --> 00:45:35,399 Speaker 4: between urban and rural. 793 00:45:36,320 --> 00:45:38,960 Speaker 3: So anyway, given that that is the state of affairs, 794 00:45:38,960 --> 00:45:42,400 Speaker 3: what you get is there are only a few media outlets, 795 00:45:42,880 --> 00:45:48,200 Speaker 3: notably Fox, which speak in a language and a value 796 00:45:48,239 --> 00:45:53,359 Speaker 3: system which resonates with conservatives, and that leaves a sort 797 00:45:53,400 --> 00:45:56,280 Speaker 3: of vacuum where people who want that kind of coverage 798 00:45:56,360 --> 00:46:00,760 Speaker 3: have to end up going to let's say, less credible 799 00:46:00,960 --> 00:46:03,080 Speaker 3: or even fringe news sites. Right, so you start to 800 00:46:03,080 --> 00:46:07,600 Speaker 3: get to your Newsmax or your One America, and whatever 801 00:46:07,640 --> 00:46:11,120 Speaker 3: you can say about their politics from a pure sort 802 00:46:11,120 --> 00:46:15,719 Speaker 3: of journalism quality perspective, they're just not very good. And 803 00:46:15,760 --> 00:46:18,919 Speaker 3: so that's part of why we see, you know, higher 804 00:46:19,040 --> 00:46:22,279 Speaker 3: rates of misinformation and so forth on the political right 805 00:46:22,360 --> 00:46:28,640 Speaker 3: in the US, And my suggestion, which is somewhat controversial, 806 00:46:29,320 --> 00:46:32,680 Speaker 3: is that we need more conservative journalists. We need more 807 00:46:32,680 --> 00:46:37,440 Speaker 3: well trained people who understand the values of the people 808 00:46:37,480 --> 00:46:40,640 Speaker 3: who most journalists don't cover well. 809 00:46:41,280 --> 00:46:42,920 Speaker 1: Okay, So, by the way, you just pointed out this 810 00:46:42,960 --> 00:46:47,319 Speaker 1: difference between left and right wing journalism because of the 811 00:46:47,400 --> 00:46:51,600 Speaker 1: distribution in the country between rural and urban. But generally, 812 00:46:51,600 --> 00:46:55,160 Speaker 1: one of the things I've found so important is this issue. 813 00:46:55,200 --> 00:46:58,200 Speaker 1: I know this is something you've looked at about, for example, 814 00:46:58,280 --> 00:47:02,400 Speaker 1: conspiracy theories on the left and the right, and essentially 815 00:47:02,440 --> 00:47:05,799 Speaker 1: that they are equal. In other words, both sides, all 816 00:47:05,880 --> 00:47:09,840 Speaker 1: parties are just as subject to this sort of thinking. 817 00:47:10,400 --> 00:47:13,920 Speaker 1: And yet both parties accuse the other of this, just 818 00:47:13,960 --> 00:47:16,279 Speaker 1: in the way that both parties accuse the other of 819 00:47:16,400 --> 00:47:20,200 Speaker 1: doing book banning when both are guilty of this. 
Where 820 00:47:20,239 --> 00:47:22,799 Speaker 1: else do you see this sort of thing and looking 821 00:47:22,800 --> 00:47:25,360 Speaker 1: for other examples where the left and the right accuse 822 00:47:25,440 --> 00:47:27,400 Speaker 1: each other of things that they are equally guilty of. 823 00:47:28,239 --> 00:47:31,560 Speaker 3: Yeah, so you're raising the issue of symmetry versus asymmetry 824 00:47:31,600 --> 00:47:34,279 Speaker 3: and conflict, and this is a big issue, right, So 825 00:47:34,360 --> 00:47:39,080 Speaker 3: you have broadly speaking, sort of two schools of thought 826 00:47:39,280 --> 00:47:42,279 Speaker 3: or ways that people talk about conflict. One is, you know, 827 00:47:42,360 --> 00:47:46,719 Speaker 3: their side is obviously worse, that they want to destroy democracy. 828 00:47:47,160 --> 00:47:51,000 Speaker 3: They're the oppressor, they're the abuser. And you have this 829 00:47:51,040 --> 00:47:53,799 Speaker 3: other way of talking about it, which is, look, we're 830 00:47:53,840 --> 00:47:57,239 Speaker 3: all human, we are all contributing to being locked in 831 00:47:57,280 --> 00:48:02,600 Speaker 3: this escalating conflict spiral. Nobody's immune from misperceptions or mistakes. 832 00:48:03,760 --> 00:48:07,080 Speaker 3: Both of these things can be true, right. There really 833 00:48:07,200 --> 00:48:12,319 Speaker 3: are cases where one side is doing heinous things and 834 00:48:12,840 --> 00:48:15,000 Speaker 3: the other side is not, or at least some sort 835 00:48:15,040 --> 00:48:18,440 Speaker 3: of difference between the two. And I think part of 836 00:48:18,719 --> 00:48:23,280 Speaker 3: thinking about constructive conflict is bringing in issues of justice, 837 00:48:24,440 --> 00:48:28,520 Speaker 3: so you know, sometimes it really is on one side 838 00:48:28,560 --> 00:48:32,080 Speaker 3: to change their behavior, and that's where we require accountability 839 00:48:32,080 --> 00:48:36,279 Speaker 3: in various forms. On the other hand, conflict is a case, 840 00:48:36,360 --> 00:48:38,799 Speaker 3: especially conflict escalation is a case where it really does 841 00:48:38,840 --> 00:48:42,360 Speaker 3: take two to tango. I tend to think about the 842 00:48:42,400 --> 00:48:46,560 Speaker 3: symmetries more than the asymmetries in the American conflict, largely 843 00:48:46,600 --> 00:48:50,440 Speaker 3: because everyone else is focusing on the asymmetries. And so 844 00:48:50,480 --> 00:48:53,680 Speaker 3: you mentioned conspiracy theories, so that's a great example. The 845 00:48:53,760 --> 00:48:59,120 Speaker 3: sort of media narrative generally is that there's much more 846 00:48:59,160 --> 00:49:02,080 Speaker 3: sort of conspiratory thinking on the right in the US. 847 00:49:03,239 --> 00:49:07,000 Speaker 3: And if you make a huge list of conspiracy theories, 848 00:49:08,040 --> 00:49:12,160 Speaker 3: you know, everything from you know, secret Jewish cabal's controlling 849 00:49:12,200 --> 00:49:16,960 Speaker 3: the world to Holocaust denial to chemtrails, what you find 850 00:49:17,160 --> 00:49:20,799 Speaker 3: is that it's pretty bipartisan there. You know, about the 851 00:49:20,880 --> 00:49:24,719 Speaker 3: same number of conspiracy theories are more commonly believed. 852 00:49:24,440 --> 00:49:26,319 Speaker 4: On the left as opposed to the right. 
853 00:49:27,200 --> 00:49:33,000 Speaker 3: However, if you look at misinformation consumption, you find that 854 00:49:33,200 --> 00:49:35,840 Speaker 3: it is definitely more of a right wing thing. And 855 00:49:35,880 --> 00:49:38,439 Speaker 3: I want to put a sort of big asterisk 856 00:49:38,520 --> 00:49:41,920 Speaker 3: here and say, well, you know, doesn't this depend on 857 00:49:41,960 --> 00:49:46,839 Speaker 3: who's defining misinformation? And what we find is that when 858 00:49:46,880 --> 00:49:49,239 Speaker 3: you ask bipartisan panels, so you get a bunch of 859 00:49:49,280 --> 00:49:53,560 Speaker 3: Democrats and Republicans together and you put a news article in 860 00:49:53,600 --> 00:49:56,880 Speaker 3: front of them, or, let's say, a purported news article, 861 00:49:57,280 --> 00:49:59,239 Speaker 3: and you say, you know, is this true or not? 862 00:49:59,400 --> 00:50:01,240 Speaker 3: You know, take your time, you can use any reference 863 00:50:01,320 --> 00:50:03,560 Speaker 3: materials you want, you know, look it 864 00:50:03,640 --> 00:50:08,160 Speaker 3: up online. You find that there is generally strong agreement 865 00:50:08,280 --> 00:50:11,960 Speaker 3: between bipartisan panels and professional fact checkers. This is the 866 00:50:12,040 --> 00:50:15,279 Speaker 3: level of evidence that I want to see to say 867 00:50:15,280 --> 00:50:16,759 Speaker 3: that there really is an asymmetry. 868 00:50:16,960 --> 00:50:17,960 Speaker 4: And I do think it's true. 869 00:50:18,000 --> 00:50:23,359 Speaker 3: There's just much more low quality information circulating in right 870 00:50:23,360 --> 00:50:24,759 Speaker 3: wing spaces. 871 00:50:24,719 --> 00:50:26,880 Speaker 1: And this is because of the journalism issue that you 872 00:50:26,920 --> 00:50:27,800 Speaker 1: were mentioning. 873 00:50:28,440 --> 00:50:29,920 Speaker 3: Yeah, I think there's a number of things going on, 874 00:50:30,040 --> 00:50:31,759 Speaker 3: right. One of them is there's sort of a news 875 00:50:31,840 --> 00:50:36,440 Speaker 3: void, right. There just isn't a lot of right wing journalism. 876 00:50:36,520 --> 00:50:38,560 Speaker 3: So if people have a demand for that, you know, 877 00:50:38,640 --> 00:50:43,560 Speaker 3: that creates an incentive for people who don't really care 878 00:50:43,600 --> 00:50:45,840 Speaker 3: about the journalism to publish things that are going to 879 00:50:45,840 --> 00:50:49,840 Speaker 3: get attention, because there isn't anything else in 880 00:50:49,880 --> 00:50:52,520 Speaker 3: that political space that is well done. So 881 00:50:52,600 --> 00:50:54,920 Speaker 3: I think that is real. But 882 00:50:55,560 --> 00:50:58,480 Speaker 3: when people say there's an asymmetry, right, it's 883 00:50:58,560 --> 00:51:00,759 Speaker 3: really those people who are the problem, I think we 884 00:51:00,800 --> 00:51:03,320 Speaker 3: should have a high bar for evidence. We should have 885 00:51:03,400 --> 00:51:06,239 Speaker 3: a high standard for saying, yes, this is real. It 886 00:51:06,320 --> 00:51:09,960 Speaker 3: isn't just a cudgel that you're using to try to 887 00:51:10,000 --> 00:51:14,440 Speaker 3: win the culture war, because the culture war is not winnable. 888 00:51:14,680 --> 00:51:18,160 Speaker 3: That's a fantasy.
You can't exclude half of the population 889 00:51:18,960 --> 00:51:22,399 Speaker 3: from politics forevermore. So we have to find some other 890 00:51:22,400 --> 00:51:23,480 Speaker 3: way to approach each other. 891 00:51:24,280 --> 00:51:26,280 Speaker 1: And so one of the things that you are really 892 00:51:26,320 --> 00:51:30,120 Speaker 1: concentrating on is social media as a leverage point. So 893 00:51:30,200 --> 00:51:34,600 Speaker 1: again, we talked about individual ways to help with conflict resolution, 894 00:51:34,760 --> 00:51:36,560 Speaker 1: we've talked about societal ways as we were just talking 895 00:51:36,560 --> 00:51:39,640 Speaker 1: about journalism, but as far as social media goes, I 896 00:51:39,680 --> 00:51:41,640 Speaker 1: know that the way you think about this is there's 897 00:51:41,680 --> 00:51:46,960 Speaker 1: this interplay between human psychology, which cares about threats, and 898 00:51:47,239 --> 00:51:50,680 Speaker 1: recommender algorithms, with the social media companies in terms 899 00:51:50,719 --> 00:51:52,840 Speaker 1: of what they're serving up to you, and then the 900 00:51:52,960 --> 00:51:56,640 Speaker 1: content producers, who are going to do the things that 901 00:51:56,760 --> 00:52:00,560 Speaker 1: get them the views. And so human psychology we probably 902 00:52:00,600 --> 00:52:04,360 Speaker 1: can't change too much, and the content producers, we're probably 903 00:52:04,360 --> 00:52:08,000 Speaker 1: not going to dissuade them from producing things that get views. 904 00:52:08,320 --> 00:52:12,480 Speaker 1: So really it's the recommender algorithms that are up for 905 00:52:12,640 --> 00:52:16,040 Speaker 1: grabs there. So we touched on this before, but let's 906 00:52:16,040 --> 00:52:18,959 Speaker 1: return to that. What do you see as the possibilities there? 907 00:52:19,960 --> 00:52:22,760 Speaker 3: Yeah, so one of the reasons that I study 908 00:52:23,400 --> 00:52:26,600 Speaker 3: recommender algorithms, which by the way, isn't just social media. 909 00:52:26,719 --> 00:52:31,320 Speaker 3: It's, you know, news recommenders, it's job recommendations, it's Amazon products, 910 00:52:31,360 --> 00:52:34,959 Speaker 3: it's Netflix, it's music, it's podcasts, it's everything. Right, 911 00:52:35,080 --> 00:52:37,480 Speaker 3: all of this stuff is picked for us by machines now, 912 00:52:38,160 --> 00:52:40,520 Speaker 3: and potentially all of it has political content. You may 913 00:52:40,560 --> 00:52:43,840 Speaker 3: not think that, you know, who cares what Spotify's recommender 914 00:52:43,920 --> 00:52:47,239 Speaker 3: is doing. Well, you know, this podcast is on Spotify, right, 915 00:52:47,320 --> 00:52:50,320 Speaker 3: so that matters too. So broader than social media, 916 00:52:50,800 --> 00:52:53,760 Speaker 3: there's two reasons I think focusing on social media is interesting. 917 00:52:53,880 --> 00:52:56,920 Speaker 3: One is the direct effects and the other is the indirect 918 00:52:56,920 --> 00:53:02,000 Speaker 3: effects via incentives for producers. So, direct effects. What 919 00:53:02,040 --> 00:53:05,600 Speaker 3: I would like to see is less use of engagement 920 00:53:05,640 --> 00:53:09,960 Speaker 3: signals in content ranking. So in other words, how much 921 00:53:10,000 --> 00:53:13,560 Speaker 3: somebody clicked on something, you know, how many seconds they 922 00:53:13,600 --> 00:53:17,080 Speaker 3: spent watching that TikTok video, et cetera, should have less 923 00:53:17,080 --> 00:53:20,240 Speaker 3: of an influence on whether it is shown to other people. 924 00:53:21,040 --> 00:53:24,480 Speaker 3: And to some extent, this change is already starting to happen. 925 00:53:25,080 --> 00:53:30,960 Speaker 3: So there are at least three platforms, of which Facebook 926 00:53:31,040 --> 00:53:34,320 Speaker 3: is the only one that has said this publicly. They basically 927 00:53:34,440 --> 00:53:38,880 Speaker 3: don't use resharing as a signal for civic and health content. 928 00:53:39,239 --> 00:53:41,919 Speaker 3: So maybe for entertainment, whatever catches your attention is fine, 929 00:53:41,960 --> 00:53:45,480 Speaker 3: but maybe for civic and health and politics and these 930 00:53:45,520 --> 00:53:51,640 Speaker 3: types of critical information sources, we shouldn't use, you know, 931 00:53:51,920 --> 00:53:54,440 Speaker 3: whether it went viral as a signal for whether it's 932 00:53:54,440 --> 00:53:56,600 Speaker 3: any good, and so that is starting to happen. 933 00:53:56,680 --> 00:53:57,799 Speaker 4: I'd like to see more of that. 934 00:53:58,239 --> 00:54:01,759 Speaker 3: I recently published a paper with a bunch of collaborators 935 00:54:02,160 --> 00:54:06,880 Speaker 3: where we cataloged all of the alternatives to using engagement 936 00:54:06,920 --> 00:54:09,759 Speaker 3: as a content ranking signal. It's called What We Know 937 00:54:09,840 --> 00:54:12,840 Speaker 3: About Using Non-Engagement Signals in Content Ranking, to just 938 00:54:12,920 --> 00:54:15,320 Speaker 3: try to get this knowledge out there and to socialize 939 00:54:15,320 --> 00:54:17,400 Speaker 3: it, because a lot of this stuff is sort of 940 00:54:17,440 --> 00:54:20,520 Speaker 3: very diffuse across industry. People in industry know it but 941 00:54:21,160 --> 00:54:22,600 Speaker 3: can't talk about it because. 942 00:54:22,400 --> 00:54:23,040 Speaker 4: It's all private. 943 00:54:23,520 --> 00:54:25,359 Speaker 3: So what we did is we got together people from 944 00:54:25,360 --> 00:54:29,279 Speaker 3: eight platforms for an off the record discussion about what 945 00:54:29,360 --> 00:54:31,560 Speaker 3: can we say about how to do this better, and 946 00:54:31,640 --> 00:54:37,360 Speaker 3: then we reconstructed their conclusions from public sources: scattered academic literature, 947 00:54:38,160 --> 00:54:41,919 Speaker 3: old company blog posts, but also many references from the 948 00:54:42,040 --> 00:54:45,439 Speaker 3: Facebook Files, which were the leaks that Frances Haugen brought 949 00:54:45,440 --> 00:54:48,440 Speaker 3: out in twenty twenty one. We sort of learned what 950 00:54:48,560 --> 00:54:51,319 Speaker 3: to look for in those files. So that's the first 951 00:54:51,320 --> 00:54:53,759 Speaker 3: thing. I think social media can be better. We can 952 00:54:53,800 --> 00:54:56,839 Speaker 3: build it not to optimize for outrage. And in fact 953 00:54:57,200 --> 00:55:00,640 Speaker 3: the frontier is something called bridging-based ranking, and the 954 00:55:00,719 --> 00:55:06,720 Speaker 3: idea there is you find content that both sides agree 955 00:55:06,920 --> 00:55:09,160 Speaker 3: is good. So, you know, think about this: do you 956 00:55:09,200 --> 00:55:13,600 Speaker 3: want the inflammatory news article that appeals to Democrats? Do 957 00:55:13,600 --> 00:55:16,120 Speaker 3: you want the inflammatory news article that appeals to Republicans? 958 00:55:16,640 --> 00:55:19,759 Speaker 3: Or do you want the article that everybody reads and says, yeah, 959 00:55:19,800 --> 00:55:23,120 Speaker 3: that's kind of good? Now, maybe, you know, psychologically you're 960 00:55:23,160 --> 00:55:26,400 Speaker 3: much more likely to click on the inflammatory headline, but 961 00:55:26,440 --> 00:55:29,799 Speaker 3: that doesn't mean our better selves actually want that. And 962 00:55:29,880 --> 00:55:32,840 Speaker 3: so I'm involved with a bunch of experiments trying to 963 00:55:33,160 --> 00:55:37,720 Speaker 3: find this bridging content and promote it. So those are the 964 00:55:37,760 --> 00:55:40,080 Speaker 3: sort of direct changes, right. There's a bunch of algorithmic 965 00:55:40,160 --> 00:55:43,000 Speaker 3: changes that, you know, I and many of my colleagues 966 00:55:43,080 --> 00:55:45,719 Speaker 3: are exploring and trying. 967 00:55:45,480 --> 00:55:46,279 Speaker 4: To advocate for. 968 00:55:46,920 --> 00:55:49,720 Speaker 3: But then one of the really interesting things about doing 969 00:55:49,960 --> 00:55:53,719 Speaker 3: this in the algorithm space is that, precisely because these 970 00:55:53,719 --> 00:55:57,400 Speaker 3: algorithms decide what everybody sees, changing them can change the 971 00:55:57,480 --> 00:56:03,440 Speaker 3: incentive for producers. So if inflammatory material is downranked 972 00:56:03,480 --> 00:56:07,240 Speaker 3: and less popular, that means it's less profitable to produce, 973 00:56:07,719 --> 00:56:12,600 Speaker 3: and therefore that changes the kind of content that journalists, politicians, 974 00:56:13,560 --> 00:56:18,200 Speaker 3: you know, think tanks, et cetera, find successful, find reaches 975 00:56:18,200 --> 00:56:22,200 Speaker 3: an audience. And so this second order or indirect effect 976 00:56:22,480 --> 00:56:26,319 Speaker 3: is very interesting, because it says that, you know, maybe 977 00:56:26,320 --> 00:56:28,760 Speaker 3: if you can get ten platforms to change their algorithms 978 00:56:28,760 --> 00:56:31,880 Speaker 3: to use bridging-based ranking, that could have an effect 979 00:56:31,960 --> 00:56:34,080 Speaker 3: on a much broader media ecosystem. 980 00:56:34,440 --> 00:56:35,600 Speaker 4: So it's a leverage point. 981 00:56:36,320 --> 00:56:40,360 Speaker 1: Yes, how do you convince the social media platforms to change? 982 00:56:40,920 --> 00:56:45,239 Speaker 3: Well, I think there are sort of three cases here, 983 00:56:45,760 --> 00:56:48,719 Speaker 3: and this is why we're testing these algorithms. So 984 00:56:49,360 --> 00:56:52,360 Speaker 3: I am running something called the Pro-Social Ranking Challenge.
985 00:56:52,880 --> 00:56:56,040 Speaker 3: It is an open competition for better social media algorithms 986 00:56:56,080 --> 00:56:59,759 Speaker 3: where teams from around the world are competing, first of 987 00:56:59,800 --> 00:57:02,719 Speaker 3: all for a cash prize, but mostly we're going to 988 00:57:02,719 --> 00:57:07,160 Speaker 3: take the winning algorithms, as judged by a panel of scientists, 989 00:57:07,600 --> 00:57:11,960 Speaker 3: and we're going to test them on Facebook, Twitter, and 990 00:57:12,160 --> 00:57:16,200 Speaker 3: Reddit using a custom browser extension. And so we're actually 991 00:57:16,240 --> 00:57:20,760 Speaker 3: going to look to see if it changes polarization, wellbeing, 992 00:57:21,040 --> 00:57:26,960 Speaker 3: and other types of attitudes and outcomes, including engagement. Crucially, 993 00:57:27,000 --> 00:57:30,080 Speaker 3: we are testing whether it changes both short term and 994 00:57:30,160 --> 00:57:31,600 Speaker 3: long term use of these products. 995 00:57:32,320 --> 00:57:33,360 Speaker 4: And so from that. 996 00:57:33,560 --> 00:57:36,360 Speaker 3: We will learn which of three worlds we live in. 997 00:57:36,760 --> 00:57:40,480 Speaker 3: If the universe is kind, we will discover that producing 998 00:57:40,520 --> 00:57:45,920 Speaker 3: a better product that reduces polarization also increases long term retention. 999 00:57:46,280 --> 00:57:48,440 Speaker 3: So you make a higher quality product, people stay on 1000 00:57:48,480 --> 00:57:51,160 Speaker 3: the platform, maybe not in the short term, but certainly 1001 00:57:51,160 --> 00:57:53,400 Speaker 3: in the long term. And then you can do well 1002 00:57:53,400 --> 00:57:55,080 Speaker 3: by doing good, right. And then it's just sort of 1003 00:57:55,120 --> 00:57:57,440 Speaker 3: getting the word out. Or we could live in a 1004 00:57:57,480 --> 00:58:01,800 Speaker 3: world where, you know, there are neutral to maybe 1005 00:58:01,840 --> 00:58:07,360 Speaker 3: slightly negative effects to using algorithms that reduce polarization. And 1006 00:58:08,000 --> 00:58:11,320 Speaker 3: it's not unheard of for platforms to make, you know, 1007 00:58:11,480 --> 00:58:15,800 Speaker 3: slightly revenue reducing changes to their algorithms in the interest 1008 00:58:15,840 --> 00:58:18,000 Speaker 3: of public good. You know, I collect examples of this. 1009 00:58:18,320 --> 00:58:20,920 Speaker 3: So then it's an advocacy campaign, right, this is the 1010 00:58:21,000 --> 00:58:23,760 Speaker 3: right thing, you should do it. There's lots of groups 1011 00:58:23,760 --> 00:58:25,880 Speaker 3: that exist to put pressure on companies to do the 1012 00:58:25,920 --> 00:58:30,600 Speaker 3: right things. Or perhaps, in some sense, the worst outcome 1013 00:58:30,680 --> 00:58:34,480 Speaker 3: is we discover that making algorithms which produce better conflict 1014 00:58:34,520 --> 00:58:39,240 Speaker 3: outcomes tends to reduce usage of the product in a 1015 00:58:39,280 --> 00:58:43,120 Speaker 3: way that is meaningful from a business perspective. And then 1016 00:58:43,160 --> 00:58:46,760 Speaker 3: what we have is a collective action problem. You have 1017 00:58:46,800 --> 00:58:50,120 Speaker 3: a first mover disadvantage, in that whoever changes their 1018 00:58:50,160 --> 00:58:53,760 Speaker 3: algorithm to be better first loses money relative to everyone else.
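A toy sketch of the bridging-based ranking idea Jonathan describes, contrasted with pure engagement ranking: score each item by its worst-case approval across the two political camps, so only content both sides rate as good can rank highly. The items, numbers, and the min-across-groups rule are assumptions for illustration only, not the actual method of any platform or of the Pro-Social Ranking Challenge.

```python
# Toy comparison of engagement-based vs. bridging-based ranking.
# All data below are invented for illustration.

items = {
    # item: (clicks, avg rating by Democrats 0-1, avg rating by Republicans 0-1)
    "inflammatory_left":  (9000, 0.9, 0.1),
    "inflammatory_right": (8500, 0.1, 0.9),
    "bridging_article":   (3000, 0.7, 0.7),
}

def engagement_rank(data):
    # Rank purely by clicks, the kind of signal the interview argues against.
    return sorted(data, key=lambda name: data[name][0], reverse=True)

def bridging_rank(data):
    # Rank by worst-case approval across the two groups, so content must be
    # acceptable to both camps to rank highly.
    return sorted(data, key=lambda name: min(data[name][1], data[name][2]), reverse=True)

print("By engagement:", engagement_rank(items))
print("By bridging:  ", bridging_rank(items))
# The bridging article ranks last on raw engagement but first under bridging.
```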
1019 00:58:54,120 --> 00:58:58,120 Speaker 3: And then you have to look at regulation, because that 1020 00:58:58,240 --> 00:59:00,400 Speaker 3: can level the playing field, in very much the same 1021 00:59:00,400 --> 00:59:03,120 Speaker 3: way that environmental regulation prevents a race to the bottom. 1022 00:59:03,160 --> 00:59:06,760 Speaker 3: You know, nobody wants to be the first to use 1023 00:59:06,800 --> 00:59:10,040 Speaker 3: a more expensive process that results in less pollution, because 1024 00:59:10,080 --> 00:59:12,400 Speaker 3: then they would lose their market share. But if everybody 1025 00:59:12,440 --> 00:59:14,960 Speaker 3: has to do it, then it's okay. So those are 1026 00:59:14,960 --> 00:59:16,960 Speaker 3: the three outcomes, right. Either it's just a matter of 1027 00:59:17,160 --> 00:59:20,560 Speaker 3: spreading the good word, or it's advocacy, or it's regulation, 1028 00:59:20,720 --> 00:59:22,960 Speaker 3: depending on where the science takes us. 1029 00:59:22,840 --> 00:59:25,760 Speaker 1: What else, in conclusion, would you like to say? 1030 00:59:26,200 --> 00:59:30,480 Speaker 3: If we care about relating to each other better, and 1031 00:59:30,520 --> 00:59:32,600 Speaker 3: I don't just mean like kumbaya, why can't we 1032 00:59:32,640 --> 00:59:37,680 Speaker 3: all get along? But actually having a politics that functions better, 1033 00:59:37,720 --> 00:59:40,760 Speaker 3: where we get to fight for what we believe in without 1034 00:59:40,960 --> 00:59:45,040 Speaker 3: dehumanizing the other side, without misperceiving what they're actually about, 1035 00:59:45,520 --> 00:59:48,960 Speaker 3: without things turning ugly and violent. That's something that we 1036 00:59:49,000 --> 00:59:54,600 Speaker 3: can do. There's many, many ways to have better political conflict, 1037 00:59:55,360 --> 00:59:58,320 Speaker 3: but it's going to take a fundamentally different attitude. As 1038 00:59:58,360 --> 01:00:01,760 Speaker 3: I said before, the culture war is not winnable. 1039 01:00:02,200 --> 01:00:04,680 Speaker 3: There is no world in which you get to exclude 1040 01:00:04,760 --> 01:00:10,439 Speaker 3: your political opponents from politics indefinitely. That's what a democracy is, right. 1041 01:00:11,120 --> 01:00:14,680 Speaker 3: We accept someone winning an election because the next election 1042 01:00:14,720 --> 01:00:19,120 Speaker 3: they might lose. So there are ways to get involved, 1043 01:00:19,160 --> 01:00:20,320 Speaker 3: there are things you can do. 1044 01:00:20,840 --> 01:00:24,360 Speaker 4: The situation is not hopeless. 1045 01:00:27,720 --> 01:00:31,520 Speaker 1: That was Jonathan Stray at Berkeley. So let's take the 1046 01:00:31,560 --> 01:00:35,280 Speaker 1: work that Jonathan and others are doing to address our 1047 01:00:35,360 --> 01:00:39,520 Speaker 1: illusions that people who disagree with us are misinformed trolls. 1048 01:00:39,680 --> 01:00:43,040 Speaker 1: It's a very useful exercise to figure out how we 1049 01:00:43,120 --> 01:00:46,000 Speaker 1: can look at somebody who disagrees with us as not 1050 01:00:46,160 --> 01:00:50,200 Speaker 1: being cold and incompetent, but possibly someone who is kind 1051 01:00:50,280 --> 01:00:55,320 Speaker 1: and generous and has a different opinion.
This starts with 1052 01:00:55,480 --> 01:00:59,520 Speaker 1: intellectual humility, understanding that we don't know it all, and 1053 01:00:59,560 --> 01:01:02,000 Speaker 1: I know that not from a philosophical point of view, 1054 01:01:02,000 --> 01:01:06,000 Speaker 1: but from a neuroscience point of view. Because of brain plasticity, 1055 01:01:06,480 --> 01:01:10,200 Speaker 1: we each form an internal model of the world based 1056 01:01:10,240 --> 01:01:14,000 Speaker 1: on our very thin trajectory of space and time, and 1057 01:01:14,080 --> 01:01:17,800 Speaker 1: we form our political opinions based on just the little 1058 01:01:17,800 --> 01:01:22,200 Speaker 1: bit that we're exposed to. We shape our ideas from 1059 01:01:22,200 --> 01:01:25,360 Speaker 1: the social networks we happen to be embedded in, and 1060 01:01:25,440 --> 01:01:28,680 Speaker 1: we're not consciously aware that we're doing this. So at 1061 01:01:28,680 --> 01:01:31,040 Speaker 1: the heart of all of this is a need for 1062 01:01:31,240 --> 01:01:34,200 Speaker 1: intellectual humility, and we're going to need a lot of 1063 01:01:34,240 --> 01:01:37,640 Speaker 1: this to address the kind of polarization, the kind of 1064 01:01:37,720 --> 01:01:42,320 Speaker 1: fear and loathing that we're seeing across the globe. Meaningful 1065 01:01:42,600 --> 01:01:46,000 Speaker 1: dialogues are a great start, but also there's a need 1066 01:01:46,080 --> 01:01:50,600 Speaker 1: for scaling. How do we build this into our educational systems? 1067 01:01:50,640 --> 01:01:54,440 Speaker 1: How do we build this into the fundamental algorithms underlying 1068 01:01:54,440 --> 01:01:58,080 Speaker 1: our social media? How do we build what some scholars 1069 01:01:58,160 --> 01:02:03,880 Speaker 1: like Heidi and Guy Burgess call massively parallel peacebuilding? There's 1070 01:02:03,880 --> 01:02:05,800 Speaker 1: still a lot of work to be done on this front, 1071 01:02:06,160 --> 01:02:11,880 Speaker 1: especially as we're moving from communicating via soapbox speeches and 1072 01:02:12,080 --> 01:02:16,400 Speaker 1: hand delivered pamphlets to instant communication that allows you to 1073 01:02:16,480 --> 01:02:20,800 Speaker 1: deliver your speech or your pamphlet to everyone's mobile rectangle 1074 01:02:20,800 --> 01:02:24,400 Speaker 1: around the planet. So I suggest that an important angle 1075 01:02:24,560 --> 01:02:28,040 Speaker 1: on all this is to understand the neuroscience at the 1076 01:02:28,120 --> 01:02:30,560 Speaker 1: base of everything, why we think the way we do, 1077 01:02:31,000 --> 01:02:34,320 Speaker 1: why we behave the way we behave, and then work 1078 01:02:34,400 --> 01:02:38,120 Speaker 1: to build our societal structures, like our media, our dialogue, 1079 01:02:38,120 --> 01:02:42,160 Speaker 1: our education, with that in mind. We can no longer 1080 01:02:42,560 --> 01:02:47,160 Speaker 1: make the romantic assumption that we each are just objective 1081 01:02:47,440 --> 01:02:51,840 Speaker 1: holders of truth and that ours is this single logical 1082 01:02:51,960 --> 01:02:55,600 Speaker 1: argument or position that should convince everyone. And this is 1083 01:02:55,640 --> 01:02:58,920 Speaker 1: because our internal models of the world give us different 1084 01:02:58,960 --> 01:03:03,080 Speaker 1: biases towards different groups.
We have different sensitivities towards issues, 1085 01:03:03,120 --> 01:03:06,520 Speaker 1: we have different levels of knowledge, we have different emotional 1086 01:03:06,560 --> 01:03:10,200 Speaker 1: affiliations to things happening in the world. So the first 1087 01:03:10,240 --> 01:03:13,920 Speaker 1: step to better conflict is to have a more realistic 1088 01:03:14,040 --> 01:03:18,480 Speaker 1: understanding that we are each living on our own planet 1089 01:03:18,840 --> 01:03:23,920 Speaker 1: mentally and emotionally, and the important goal is to bridge planets, 1090 01:03:23,960 --> 01:03:27,840 Speaker 1: to set up some signaling across the vast reaches of 1091 01:03:27,920 --> 01:03:31,160 Speaker 1: space between us. It's easy to say that the people 1092 01:03:31,160 --> 01:03:35,080 Speaker 1: who disagree with you are ugly trolls, but as we discussed, 1093 01:03:35,080 --> 01:03:38,960 Speaker 1: it can make much more sense to ask how someone 1094 01:03:38,960 --> 01:03:43,240 Speaker 1: who is smart and kind can end up believing something 1095 01:03:43,440 --> 01:03:46,520 Speaker 1: different than you do. This is not a plea to 1096 01:03:46,880 --> 01:03:50,800 Speaker 1: agree with the other side, but to better understand their 1097 01:03:50,840 --> 01:03:55,720 Speaker 1: motivation and their philosophy, and fundamentally, to better understand our 1098 01:03:55,880 --> 01:04:03,360 Speaker 1: shared biology and therefore our shared humanity. Go to eagleman 1099 01:04:03,440 --> 01:04:06,600 Speaker 1: dot com slash podcast for more information and to find 1100 01:04:06,720 --> 01:04:10,440 Speaker 1: further reading. Send me an email at podcasts at eagleman 1101 01:04:10,520 --> 01:04:14,200 Speaker 1: dot com with any questions or discussions, and check out 1102 01:04:14,200 --> 01:04:17,680 Speaker 1: and subscribe to Inner Cosmos on YouTube for videos of 1103 01:04:17,680 --> 01:04:21,560 Speaker 1: each episode and to leave comments. Until next time, I'm 1104 01:04:21,680 --> 01:04:24,720 Speaker 1: David Eagleman, and this is Inner Cosmos.