WEBVTT - Ep 53 "Can societies fight better?"

0:00:04.600 --> 0:00:09.040
<v Speaker 1>Humans disagree on many political fronts, but we're finding ourselves

0:00:09.080 --> 0:00:13.760
<v Speaker 1>globally in a period of higher polarization, and the question

0:00:13.920 --> 0:00:17.760
<v Speaker 1>is, is there anything to be done about that? Presumably

0:00:17.760 --> 0:00:20.759
<v Speaker 1>we're not going to get humans to not have conflict,

0:00:21.160 --> 0:00:25.279
<v Speaker 1>but is there any such thing as better conflict? What

0:00:25.320 --> 0:00:27.400
<v Speaker 1>would that mean? And what does any of that have

0:00:27.480 --> 0:00:30.360
<v Speaker 1>to do with brains or with the fact that modern

0:00:30.440 --> 0:00:34.040
<v Speaker 1>humans only spread out across the globe sixty thousand years ago,

0:00:34.560 --> 0:00:40.640
<v Speaker 1>or with social media recommender algorithms, or the Iroquois Native Americans?

0:00:43.200 --> 0:00:46.680
<v Speaker 1>Welcome to Inner Cosmos with me David Eagleman. I'm a

0:00:46.720 --> 0:00:50.400
<v Speaker 1>neuroscientist and an author at Stanford and in these episodes,

0:00:50.520 --> 0:00:54.000
<v Speaker 1>I examine the intersection of our brains and our lives.

0:00:54.400 --> 0:00:57.440
<v Speaker 1>And today's episode is about something that you know, I'm

0:00:57.480 --> 0:01:00.560
<v Speaker 1>obsessed with from the point of view of brain science, which

0:01:00.600 --> 0:01:05.959
<v Speaker 1>is issues of conflict and empathy and in groups and outgroups.

0:01:06.840 --> 0:01:10.760
<v Speaker 1>We find ourselves in a time of conflict. Now, the

0:01:10.840 --> 0:01:14.640
<v Speaker 1>things on everyone's mind are the war in Ukraine and

0:01:14.720 --> 0:01:18.440
<v Speaker 1>the war between Israel and Hamas. And even though those

0:01:18.520 --> 0:01:21.679
<v Speaker 1>have sucked up most of the media's attention, it's critical

0:01:21.680 --> 0:01:25.319
<v Speaker 1>to note that we have lots of other conflicts also

0:01:25.400 --> 0:01:30.080
<v Speaker 1>going on around the globe right now. There's ongoing internal

0:01:30.120 --> 0:01:34.600
<v Speaker 1>conflict in Myanmar. There's an Islamist insurgency in the Maghreb

0:01:34.680 --> 0:01:39.280
<v Speaker 1>region of Africa. There's the Boko Haram insurgency in Nigeria.

0:01:39.319 --> 0:01:44.080
<v Speaker 1>There's a civil war between rival factions in Sudan. There's

0:01:44.080 --> 0:01:48.080
<v Speaker 1>a multi sided conflict of the Syrian Civil War. There's

0:01:48.120 --> 0:01:53.120
<v Speaker 1>the ongoing civil wars in Somalia and Ethiopia, and Afghanistan

0:01:53.200 --> 0:01:57.400
<v Speaker 1>has been in near continuous armed conflict for essentially as

0:01:57.400 --> 0:01:59.920
<v Speaker 1>long as I've been alive. Now, if you've been listening

0:01:59.960 --> 0:02:02.320
<v Speaker 1>to this podcast for a while, you know that I'm

0:02:02.680 --> 0:02:07.800
<v Speaker 1>very interested in why conflict happens between humans and why

0:02:07.880 --> 0:02:12.920
<v Speaker 1>it happens so readily and consistently. So in this light,

0:02:13.520 --> 0:02:17.280
<v Speaker 1>there are three main things to note from the neuroscience

0:02:17.320 --> 0:02:20.520
<v Speaker 1>point of view. The first thing is that we all

0:02:21.080 --> 0:02:25.799
<v Speaker 1>have the same brains on the inside. So despite cultural

0:02:25.800 --> 0:02:30.720
<v Speaker 1>differences and geographic differences, your pink brain looks exactly the

0:02:30.760 --> 0:02:35.040
<v Speaker 1>same as anyone else's anywhere around the globe.

0:02:35.080 --> 0:02:37.080
<v Speaker 1>And I don't mean that as a feel good statement.

0:02:37.120 --> 0:02:42.760
<v Speaker 1>That's just a biological fact. Anatomically modern humans radiated out

0:02:42.760 --> 0:02:46.200
<v Speaker 1>of Africa only about sixty thousand years ago, and some

0:02:46.280 --> 0:02:49.080
<v Speaker 1>turned left and became European, and others turned right and

0:02:49.120 --> 0:02:53.000
<v Speaker 1>became Asian, and those who stayed remained African. But the

0:02:53.200 --> 0:02:57.120
<v Speaker 1>timescale that we're talking about is some tens of thousands of years.

0:02:57.639 --> 0:03:01.640
<v Speaker 1>That is simply not enough time on the evolutionary timescale

0:03:01.919 --> 0:03:05.519
<v Speaker 1>for brains to change in any meaningful way to change

0:03:05.560 --> 0:03:10.320
<v Speaker 1>their operation or their algorithms. So we're all running the

0:03:10.360 --> 0:03:16.440
<v Speaker 1>same version of the biological operating system, despite superficial differences

0:03:16.480 --> 0:03:19.800
<v Speaker 1>in size and shape and culture and the costumes that

0:03:19.840 --> 0:03:22.359
<v Speaker 1>we wear and how much melanin we have in our

0:03:22.400 --> 0:03:26.680
<v Speaker 1>skin and so on. We can end up seeming pretty

0:03:26.720 --> 0:03:29.480
<v Speaker 1>different as a result of these surface properties, but we

0:03:29.600 --> 0:03:36.160
<v Speaker 1>have the same cognitive and emotional engine on the inside. Now, unfortunately,

0:03:36.760 --> 0:03:40.200
<v Speaker 1>one of the things that operating system does is it

0:03:40.240 --> 0:03:44.320
<v Speaker 1>gives us all a very strong drive toward having in

0:03:44.400 --> 0:03:50.720
<v Speaker 1>groups and outgroups. This tribalism is something that characterizes our species.

0:03:51.160 --> 0:03:53.880
<v Speaker 1>We belong to groups based

0:03:53.560 --> 0:03:56.520
<v Speaker 2>on our neighborhood or the way we look, or our

0:03:56.560 --> 0:03:59.720
<v Speaker 2>religion or our family or whatever it is, and we

0:03:59.800 --> 0:04:03.640
<v Speaker 2>tend to feel closer to our in groups and treat

0:04:03.680 --> 0:04:06.760
<v Speaker 2>them better and listen to their opinions more strongly.

0:04:07.200 --> 0:04:10.280
<v Speaker 1>Our outgroups not so much. When it comes to someone

0:04:10.600 --> 0:04:13.720
<v Speaker 1>from the other side of the railroad tracks, or who

0:04:13.800 --> 0:04:17.640
<v Speaker 1>happens to have grown up under a different deity, or

0:04:17.680 --> 0:04:22.039
<v Speaker 1>who sings different songs or uses different language sounds to

0:04:22.080 --> 0:04:25.799
<v Speaker 1>transmit information, or whatever it is, we tend to treat

0:04:25.880 --> 0:04:31.280
<v Speaker 1>them with more suspicion. And in decades of neuroscience experiments,

0:04:31.320 --> 0:04:34.120
<v Speaker 1>including those from my lab, we've been able to show

0:04:34.520 --> 0:04:37.880
<v Speaker 1>that your brain simply does not care as much about

0:04:37.960 --> 0:04:42.680
<v Speaker 1>people in your outgroup. At the extreme, your brain will

0:04:42.760 --> 0:04:45.040
<v Speaker 1>view somebody in an outgroup the same way it will

0:04:45.120 --> 0:04:49.160
<v Speaker 1>view an object, not like a fellow human the way

0:04:49.200 --> 0:04:51.560
<v Speaker 1>it does with people in your in group, but instead

0:04:52.000 --> 0:04:56.560
<v Speaker 1>as something that's not particularly worthy of empathy, something that

0:04:56.600 --> 0:04:58.279
<v Speaker 1>you have to deal with in the same way that

0:04:58.320 --> 0:05:01.160
<v Speaker 1>you would deal with a virus, or a cockroach

0:05:01.279 --> 0:05:03.920
<v Speaker 1>or a rat. And if you're interested in more detail

0:05:04.000 --> 0:05:07.360
<v Speaker 1>on this point, I dove deep into examples of this

0:05:07.600 --> 0:05:10.880
<v Speaker 1>in episode twenty and one of the things I talked

0:05:10.920 --> 0:05:13.680
<v Speaker 1>about was an experiment that we did in my lab

0:05:14.080 --> 0:05:18.000
<v Speaker 1>that got at this in group outgroup difference, where all

0:05:18.040 --> 0:05:21.880
<v Speaker 1>we needed were single word labels. So I'll just tell

0:05:21.880 --> 0:05:25.200
<v Speaker 1>you the short version here of the experiment. We put

0:05:25.240 --> 0:05:29.640
<v Speaker 1>you in a brain imaging machine, an fMRI, and you're looking at

0:05:29.720 --> 0:05:32.919
<v Speaker 1>six hands on the screen and one of them gets

0:05:33.200 --> 0:05:37.720
<v Speaker 1>stabbed by a syringe needle. Now, seeing that activates a

0:05:38.279 --> 0:05:42.600
<v Speaker 1>network in your brain that we summarize as the pain matrix.

0:05:42.680 --> 0:05:46.120
<v Speaker 1>Your brain is showing a big spike, and what that

0:05:46.200 --> 0:05:50.040
<v Speaker 1>represents is you are feeling the other person's pain even

0:05:50.080 --> 0:05:53.279
<v Speaker 1>though it's not your hand. The areas in your brain

0:05:53.279 --> 0:05:56.279
<v Speaker 1>that care about pain are coming online. This is the

0:05:56.320 --> 0:06:01.640
<v Speaker 1>basis of empathy, of feeling someone else's pain. But

0:06:01.880 --> 0:06:11.040
<v Speaker 1>now we label the six hands with one-word labels: Christian, Jewish, Muslim, Hindu, Scientologist, atheist.

0:06:11.360 --> 0:06:14.120
<v Speaker 1>And now we measure people of all religions in the

0:06:14.160 --> 0:06:17.560
<v Speaker 1>scanner as these different hands get stabbed. And what we

0:06:17.720 --> 0:06:21.880
<v Speaker 1>found is that everyone has a larger brain response when

0:06:21.920 --> 0:06:25.240
<v Speaker 1>it's their own group that's getting stabbed, and their brain

0:06:25.320 --> 0:06:28.720
<v Speaker 1>shows a smaller response when it's any of the other

0:06:29.000 --> 0:06:33.479
<v Speaker 1>five out groups. And this was true across religions and

0:06:33.720 --> 0:06:36.800
<v Speaker 1>even for the atheist group, which cared more when the

0:06:36.839 --> 0:06:40.520
<v Speaker 1>atheist hand gets stabbed than the other hands. So this

0:06:40.640 --> 0:06:43.560
<v Speaker 1>is not an indictment of religion, but more broadly, it

0:06:43.680 --> 0:06:49.920
<v Speaker 1>highlights your brain's exquisite skills at distinguishing in group from

0:06:49.960 --> 0:06:52.920
<v Speaker 1>your outgroups and the fact that your brain simply does

0:06:53.000 --> 0:06:57.200
<v Speaker 1>not spend as much energy on simulating what it is

0:06:57.440 --> 0:07:01.279
<v Speaker 1>like to be someone who is in your outgroup. If you

0:07:01.320 --> 0:07:04.599
<v Speaker 1>want more details on that, please listen to episode twenty,

0:07:04.640 --> 0:07:07.039
<v Speaker 1>which is called why does your brain care about some

0:07:07.160 --> 0:07:11.280
<v Speaker 1>people more than others? And beyond the fact that we

0:07:11.320 --> 0:07:15.160
<v Speaker 1>are naturally tribalistic, there's a third point to note, which

0:07:15.200 --> 0:07:18.400
<v Speaker 1>is that every one of us feels like we know

0:07:18.720 --> 0:07:21.440
<v Speaker 1>the truth, and it's not clear to us why other

0:07:21.520 --> 0:07:24.840
<v Speaker 1>people don't see the truth as clearly as we do.

0:07:24.880 --> 0:07:29.800
<v Speaker 1>They must be stupid or ill informed, or trolls or

0:07:30.240 --> 0:07:34.080
<v Speaker 1>stubborn or who knows why they don't simply admit to

0:07:34.160 --> 0:07:38.360
<v Speaker 1>what is so obviously the truth. And deep down most

0:07:38.400 --> 0:07:41.880
<v Speaker 1>people have an intuition that if you could just simply

0:07:42.400 --> 0:07:46.040
<v Speaker 1>shout loudly enough in all caps on social media, that

0:07:46.280 --> 0:07:49.800
<v Speaker 1>everyone would agree with you, everyone would see the light.

0:07:50.480 --> 0:07:53.240
<v Speaker 1>And this is true of everything, not just political positions.

0:07:53.320 --> 0:07:57.800
<v Speaker 1>Take religion. There are two thousand active religions on this

0:07:57.920 --> 0:08:01.000
<v Speaker 1>small planet, and many people feel that if they could

0:08:01.120 --> 0:08:04.760
<v Speaker 1>just share with all the infidels why their religion is

0:08:04.840 --> 0:08:09.240
<v Speaker 1>right and everyone else is wrong, that everyone would agree

0:08:09.280 --> 0:08:12.600
<v Speaker 1>with them. Everyone would come to know that the religion

0:08:12.720 --> 0:08:17.120
<v Speaker 1>you happened to grow up in is the correct one. Now,

0:08:17.160 --> 0:08:20.200
<v Speaker 1>if there actually were one correct religion, we might expect

0:08:20.240 --> 0:08:23.920
<v Speaker 1>it to spread everywhere equally. But obviously that's not the case.

0:08:24.200 --> 0:08:28.120
<v Speaker 1>Religions remain clustered around where they started, and you're not

0:08:28.200 --> 0:08:32.360
<v Speaker 1>going to see a blossoming of Islam in Boise, Idaho,

0:08:32.600 --> 0:08:35.560
<v Speaker 1>just like you won't expect to see a blossoming of

0:08:35.600 --> 0:08:39.600
<v Speaker 1>Protestantism in Mecca. But here's the interesting part. Whether we're

0:08:39.640 --> 0:08:44.920
<v Speaker 1>examining one's political preferences, or religious affiliation or whatever, people

0:08:45.040 --> 0:08:50.640
<v Speaker 1>are generally reluctant to change, to consider the perspectives of

0:08:50.840 --> 0:08:55.520
<v Speaker 1>other people. So, given the deep wiring that we have

0:08:55.760 --> 0:08:58.160
<v Speaker 1>for having our in groups and for feeling we know

0:08:58.320 --> 0:09:01.319
<v Speaker 1>the right answer because of our very limited internal models,

0:09:01.840 --> 0:09:05.520
<v Speaker 1>this leads to the critical question about whether there would

0:09:05.520 --> 0:09:11.800
<v Speaker 1>be any way for eight billion brains to find peaceful coexistence. Now,

0:09:11.840 --> 0:09:14.360
<v Speaker 1>if you've been a regular listener to the podcast, you

0:09:14.440 --> 0:09:18.120
<v Speaker 1>know that I'm generally highly optimistic about everything, but I

0:09:18.160 --> 0:09:21.160
<v Speaker 1>would suggest the answer to the question of whether we

0:09:21.240 --> 0:09:26.120
<v Speaker 1>will achieve peace is probably not. We're probably not going

0:09:26.200 --> 0:09:30.040
<v Speaker 1>to find ourselves ever in some magical moment where all

0:09:30.080 --> 0:09:33.760
<v Speaker 1>the newspapers say, Wow, nobody is fighting, we're taking a

0:09:33.840 --> 0:09:36.200
<v Speaker 1>day off today, and we're not going to find some

0:09:36.320 --> 0:09:41.320
<v Speaker 1>extraordinary presidential candidate where everyone says, yeah, that person's great.

0:09:41.400 --> 0:09:45.080
<v Speaker 1>I think we can all agree presidential elections are always

0:09:45.160 --> 0:09:48.880
<v Speaker 1>fairly close to fifty-fifty. Why? This results from

0:09:48.920 --> 0:09:53.320
<v Speaker 1>the enormous variety of internal models inside everyone's heads. There

0:09:53.440 --> 0:09:57.720
<v Speaker 1>is no perfect candidate because no one can fit all

0:09:57.760 --> 0:10:02.120
<v Speaker 1>the constraints at once of all the different models. Okay,

0:10:02.160 --> 0:10:04.480
<v Speaker 1>so if we take this all as our starting point

0:10:04.559 --> 0:10:08.240
<v Speaker 1>for today, the question is what does this all mean

0:10:08.480 --> 0:10:12.800
<v Speaker 1>for the future of conflict? Are we facing a future

0:10:13.120 --> 0:10:16.760
<v Speaker 1>where wars and polarization will stay as they are or

0:10:16.840 --> 0:10:21.480
<v Speaker 1>even grow worse. Well, that's certainly a reasonable fear. But

0:10:21.559 --> 0:10:24.760
<v Speaker 1>a little while ago I discovered there was a movement

0:10:24.920 --> 0:10:29.080
<v Speaker 1>of political scholars around the idea of how to have

0:10:29.679 --> 0:10:33.600
<v Speaker 1>better conflict. Now, what does better conflict mean? We'll get

0:10:33.640 --> 0:10:36.400
<v Speaker 1>into that in a second, but first I just want

0:10:36.440 --> 0:10:39.959
<v Speaker 1>to say I've been moved and inspired by this movement.

0:10:40.000 --> 0:10:43.080
<v Speaker 1>It feels like it's the opposite of what we see

0:10:43.120 --> 0:10:46.959
<v Speaker 1>online every day, where people pick a side and scream and

0:10:47.080 --> 0:10:50.280
<v Speaker 1>yell for it or fight and die for it at

0:10:50.280 --> 0:10:57.760
<v Speaker 1>the cost of not meaningfully considering the complexity of human situations.

0:10:58.600 --> 0:11:01.640
<v Speaker 1>Just take what's happening in the Middle East. Typically, if

0:11:01.679 --> 0:11:05.880
<v Speaker 1>someone points out that the history there is extraordinarily multi

0:11:06.000 --> 0:11:10.079
<v Speaker 1>layered and complex, people will often say, no, this is

0:11:10.120 --> 0:11:13.040
<v Speaker 1>actually quite simple, and then they will give their version

0:11:13.120 --> 0:11:16.040
<v Speaker 1>of the history. But the extraordinary part is that people

0:11:16.080 --> 0:11:19.199
<v Speaker 1>on both sides will deliver that same line. They will say,

0:11:19.200 --> 0:11:22.920
<v Speaker 1>it's actually very simple, And to me, that's a barometer

0:11:23.040 --> 0:11:26.520
<v Speaker 1>that tells me the situation is not so simple. But instead,

0:11:26.760 --> 0:11:30.520
<v Speaker 1>like most human conflict, it's full of different paths that

0:11:30.559 --> 0:11:33.880
<v Speaker 1>you can take through the history to come to a

0:11:34.040 --> 0:11:38.079
<v Speaker 1>variety of conclusions. And we witness this sort of complexity

0:11:38.440 --> 0:11:43.160
<v Speaker 1>whenever you have millions of people with all personalities involved

0:11:43.640 --> 0:11:48.040
<v Speaker 1>in conflict on all sides. You have sinners and saints,

0:11:48.120 --> 0:11:52.440
<v Speaker 1>and you have psychopaths and peace seekers, and no one

0:11:52.520 --> 0:11:57.400
<v Speaker 1>can hope to derive a meaningful solution without doing the

0:11:57.480 --> 0:12:01.040
<v Speaker 1>hard work of digging in and trying to understand what

0:12:01.240 --> 0:12:06.439
<v Speaker 1>happens between groups of millions of humans, rather than imagining

0:12:06.480 --> 0:12:09.800
<v Speaker 1>there's a good guy side and a bad guy side. So,

0:12:09.960 --> 0:12:12.880
<v Speaker 1>as I said, I was very moved to discover that

0:12:12.960 --> 0:12:17.720
<v Speaker 1>some people really take on this sort of political intellectual

0:12:17.800 --> 0:12:21.200
<v Speaker 1>humility and try to figure out what do we do

0:12:21.760 --> 0:12:23.840
<v Speaker 1>with humans? What do we do about the fact that

0:12:23.840 --> 0:12:26.840
<v Speaker 1>they're never going to agree, There's always going to be conflict.

0:12:27.080 --> 0:12:32.000
<v Speaker 1>How do we make conflict more productive rather than simply

0:12:32.080 --> 0:12:36.240
<v Speaker 1>being about bloodshed? So to explore this, I called up

0:12:36.280 --> 0:12:40.680
<v Speaker 1>my colleague Jonathan Stray. Jonathan is a former journalist who

0:12:40.720 --> 0:12:43.800
<v Speaker 1>is now a senior scientist at the Berkeley Center for

0:12:43.920 --> 0:12:48.240
<v Speaker 1>Human Compatible AI, where he works on the algorithms that

0:12:48.400 --> 0:12:52.960
<v Speaker 1>feed content to us online and how their operation affects

0:12:53.000 --> 0:12:56.839
<v Speaker 1>things like polarization. And he writes a terrific newsletter called

0:12:57.160 --> 0:13:05.760
<v Speaker 1>The Better Conflict Bulletin. So here's Jonathan. So you

0:13:05.840 --> 0:13:09.400
<v Speaker 1>are a conflict researcher. Tell us about what that means.

0:13:09.920 --> 0:13:15.040
<v Speaker 3>Well, I try to study what causes people to disagree

0:13:15.040 --> 0:13:19.120
<v Speaker 3>and then how that disagreement transforms into polarization and eventually

0:13:19.200 --> 0:13:23.520
<v Speaker 3>violence if unchecked, And how to build computer controlled media

0:13:23.559 --> 0:13:25.760
<v Speaker 3>systems like social media that will prevent that.

0:13:26.280 --> 0:13:30.120
<v Speaker 1>I know that you think about conflict in terms of

0:13:30.240 --> 0:13:33.160
<v Speaker 1>good and bad conflict. So what does that mean?

0:13:33.800 --> 0:13:37.240
<v Speaker 3>Well, we don't want no conflict in society. Conflict is

0:13:37.360 --> 0:13:39.960
<v Speaker 3>inevitable, first of all. We all disagree, we

0:13:40.000 --> 0:13:45.520
<v Speaker 3>want different things unavoidably. But sometimes when we disagree, it

0:13:45.600 --> 0:13:47.520
<v Speaker 3>comes to something productive, you know, we talk about it,

0:13:47.559 --> 0:13:50.520
<v Speaker 3>we figure something out, and sometimes it doesn't right, Sometimes

0:13:50.520 --> 0:13:54.040
<v Speaker 3>it escalates to some sort of zero sum game or

0:13:54.080 --> 0:13:58.679
<v Speaker 3>even violence. The idea is, when we think about conflict,

0:13:58.679 --> 0:14:01.000
<v Speaker 3>we're not trying to prevent people from fighting. We're trying

0:14:01.040 --> 0:14:03.080
<v Speaker 3>to have them fight in some sort of productive way.

0:14:03.800 --> 0:14:05.480
<v Speaker 1>So what does that look like? What does that mean?

0:14:06.000 --> 0:14:09.199
<v Speaker 3>Well, the simplest example might be violent versus nonviolent, right,

0:14:09.280 --> 0:14:13.080
<v Speaker 3>so we can all understand the difference between you know,

0:14:13.520 --> 0:14:15.920
<v Speaker 3>people trying to get their way by using force or

0:14:15.960 --> 0:14:20.080
<v Speaker 3>getting their way by discussion, yes, but also non violent

0:14:20.160 --> 0:14:24.520
<v Speaker 3>conflict tactics, you know, protests. You know, there's famous historical

0:14:24.600 --> 0:14:29.040
<v Speaker 3>figures like Dr. Martin Luther King or Gandhi who really

0:14:29.080 --> 0:14:33.880
<v Speaker 3>developed these techniques and showed that you could achieve you know,

0:14:34.000 --> 0:14:38.200
<v Speaker 3>large political victories without using physical force. So that's that's

0:14:38.240 --> 0:14:41.200
<v Speaker 3>maybe the most obvious way but there's a bunch of

0:14:41.200 --> 0:14:42.800
<v Speaker 3>other stuff when you start thinking about this.

0:14:43.040 --> 0:14:45.480
<v Speaker 4>So maybe you think about this at a personal level.

0:14:45.520 --> 0:14:47.520
<v Speaker 3>You know, we've all had the experience of an argument

0:14:47.560 --> 0:14:50.320
<v Speaker 3>where you know, we said some ugly things that can't

0:14:50.360 --> 0:14:52.560
<v Speaker 3>be unsaid, and you feel worse at the end of it.

0:14:52.920 --> 0:14:54.280
<v Speaker 3>And then I think most of us have had the

0:14:54.360 --> 0:14:56.600
<v Speaker 3>experience of an argument where you know, there were some

0:14:56.680 --> 0:15:00.200
<v Speaker 3>hard truths, but it was honest, it was caring, it

0:15:00.240 --> 0:15:03.200
<v Speaker 3>was empathetic, and even if we couldn't solve the problem

0:15:03.280 --> 0:15:05.480
<v Speaker 3>at the end of it, we still have a positive

0:15:05.520 --> 0:15:06.400
<v Speaker 3>regard for each other.

0:15:06.600 --> 0:15:08.520
<v Speaker 4>And so that would be an example of good versus

0:15:08.520 --> 0:15:09.200
<v Speaker 4>bad conflict.

0:15:09.720 --> 0:15:11.880
<v Speaker 1>So one of the things that has always intrigued me

0:15:12.000 --> 0:15:14.880
<v Speaker 1>is that we all have this illusion that we know

0:15:15.040 --> 0:15:17.640
<v Speaker 1>the truth, that we have a complete story of what's

0:15:17.720 --> 0:15:21.600
<v Speaker 1>going on, and therefore other people that disagree with us

0:15:21.680 --> 0:15:26.240
<v Speaker 1>seem like they're trolls, or they're uninformed, or they're disingenuous.

0:15:26.720 --> 0:15:30.440
<v Speaker 1>And so the question is, how do we address that

0:15:30.720 --> 0:15:33.440
<v Speaker 1>illusion that we have. How do we get ourselves a

0:15:33.520 --> 0:15:36.400
<v Speaker 1>little bit outside our own fish bowls to see the

0:15:36.440 --> 0:15:37.040
<v Speaker 1>other side.

0:15:37.560 --> 0:15:40.200
<v Speaker 3>Yeah, no, that's a great question. So there's a bunch

0:15:40.200 --> 0:15:42.000
<v Speaker 3>of ways of answering that. I was at

0:15:42.040 --> 0:15:45.320
<v Speaker 3>a conference yesterday on the topic of intellectual humility, the

0:15:45.400 --> 0:15:48.600
<v Speaker 3>idea that actually we don't know the answer, there's always

0:15:48.600 --> 0:15:51.600
<v Speaker 3>something we're missing. And this is important not just as

0:15:51.720 --> 0:15:55.400
<v Speaker 3>an epistemological practice, that is, to be more correct in

0:15:55.440 --> 0:15:59.680
<v Speaker 3>what we believe, but as a relational practice, meaning it

0:15:59.800 --> 0:16:02.040
<v Speaker 3>changes how we relate and how we can relate

0:16:02.040 --> 0:16:04.880
<v Speaker 3>to other people. And I think part of this is

0:16:05.680 --> 0:16:08.520
<v Speaker 3>we always have a partial picture, and we tend to,

0:16:09.080 --> 0:16:13.000
<v Speaker 3>in conflict situations, believe that the other person actually holds

0:16:13.040 --> 0:16:16.560
<v Speaker 3>more extreme views than they do. So there's really good

0:16:16.600 --> 0:16:18.600
<v Speaker 3>evidence of that. This is the case in the American

0:16:18.600 --> 0:16:21.520
<v Speaker 3>Culture War, where if you ask one side what the

0:16:21.520 --> 0:16:24.320
<v Speaker 3>other side thinks, they will imagine the other side is

0:16:24.360 --> 0:16:28.920
<v Speaker 3>actually more extreme than they are. So conflict in particular

0:16:29.520 --> 0:16:31.480
<v Speaker 3>is prone to these misperceptions of the other.

0:16:32.320 --> 0:16:35.400
<v Speaker 1>So give us an example of how we know that

0:16:35.400 --> 0:16:37.880
<v Speaker 1>that each side thinks the other side is more extreme.

0:16:38.400 --> 0:16:39.720
<v Speaker 4>Yeah. Yeah, there's tons of work on this.

0:16:39.760 --> 0:16:41.840
<v Speaker 3>So there's a group called the Perception Gap which has

0:16:41.880 --> 0:16:45.040
<v Speaker 3>done a bunch of research on this. And so, for example,

0:16:45.120 --> 0:16:52.480
<v Speaker 3>if you ask Republicans how many Democrats would support completely

0:16:52.480 --> 0:16:56.200
<v Speaker 3>open borders, they'll say something like seventy five percent. If

0:16:56.200 --> 0:16:58.400
<v Speaker 3>you ask Democrats how many of them would support completely

0:16:58.480 --> 0:17:03.160
<v Speaker 3>open borders, it's far lower, so they're twenty or twenty

0:17:03.160 --> 0:17:06.560
<v Speaker 3>five percentage points off. And this pattern is consistent across

0:17:06.560 --> 0:17:09.479
<v Speaker 3>a range of issues, and it's bidirectional as well. So

0:17:09.720 --> 0:17:11.880
<v Speaker 3>you know, if you ask Republicans or if you ask

0:17:11.920 --> 0:17:16.159
<v Speaker 3>Democrats something like, you know, how many Republicans would say that,

0:17:17.200 --> 0:17:20.000
<v Speaker 3>you know, everyone should have access to as many guns

0:17:20.040 --> 0:17:22.840
<v Speaker 3>as they want, Democrats will guess that number a lot

0:17:22.880 --> 0:17:24.760
<v Speaker 3>higher than Republicans will say.

0:17:24.880 --> 0:17:28.320
<v Speaker 1>So, speaking of Democrats and Republicans, here's the question. Are

0:17:28.359 --> 0:17:33.239
<v Speaker 1>we more polarized currently in America? Is that just an

0:17:33.280 --> 0:17:35.480
<v Speaker 1>impression or is that backed up by statistics?

0:17:36.119 --> 0:17:39.280
<v Speaker 3>No. Unfortunately, it's real, and you can measure it a

0:17:39.280 --> 0:17:41.920
<v Speaker 3>lot of different ways. You can look at where people

0:17:41.960 --> 0:17:44.639
<v Speaker 3>are on political issues, you can look at how people

0:17:44.680 --> 0:17:48.400
<v Speaker 3>feel about each other. You can look at congressional voting patterns,

0:17:48.440 --> 0:17:51.720
<v Speaker 3>whether you know members of Congress will cross the aisle

0:17:51.800 --> 0:17:54.680
<v Speaker 3>to vote on each other's bills, and all of these

0:17:54.880 --> 0:17:57.760
<v Speaker 3>measures show the same basic pattern, which is that polarization

0:17:57.840 --> 0:18:01.120
<v Speaker 3>has been increasing since the mid seventies or maybe

0:18:01.119 --> 0:18:04.760
<v Speaker 3>the early eighties and is now at historically high levels.

0:18:05.160 --> 0:18:07.399
<v Speaker 3>And it's not just America, it's several other countries in

0:18:07.440 --> 0:18:08.560
<v Speaker 3>the world as well.

0:18:08.640 --> 0:18:11.359
<v Speaker 1>Right now, I know you and I have independently made

0:18:11.440 --> 0:18:15.200
<v Speaker 1>arguments about why this is not entirely about social media,

0:18:15.240 --> 0:18:17.560
<v Speaker 1>and, of course, a good thing to

0:18:17.600 --> 0:18:19.840
<v Speaker 1>point to is the fact that this started in the seventies.

0:18:19.880 --> 0:18:24.240
<v Speaker 1>But tell us your sense of the involvement and the

0:18:24.280 --> 0:18:27.000
<v Speaker 1>non involvement of social media in this polarization.

0:18:27.760 --> 0:18:29.920
<v Speaker 4>Yeah, well, there's definitely some involvement.

0:18:30.119 --> 0:18:34.000
<v Speaker 3>There are a number of ways that social media, in

0:18:34.040 --> 0:18:38.160
<v Speaker 3>particular social media algorithms, that is to say, the systems

0:18:38.160 --> 0:18:40.120
<v Speaker 3>that decide what we see

0:18:40.040 --> 0:18:42.639
<v Speaker 4>and in what order, can have some involvement.

0:18:43.160 --> 0:18:45.240
<v Speaker 3>So probably a lot of your listeners have heard of

0:18:45.240 --> 0:18:47.520
<v Speaker 3>the idea of the filter bubble, the idea that, you know,

0:18:47.600 --> 0:18:51.560
<v Speaker 3>we're each trapped in this sort of algorithmically produced reality

0:18:51.560 --> 0:18:54.639
<v Speaker 3>where we don't see the other side. The evidence tends

0:18:54.680 --> 0:18:57.560
<v Speaker 3>to be against that. We actually do see quite a

0:18:57.600 --> 0:18:59.720
<v Speaker 3>lot of cross cutting content, you know, even if it's

0:18:59.760 --> 0:19:02.920
<v Speaker 3>you know, rage clicking on an article that has a

0:19:02.920 --> 0:19:06.199
<v Speaker 3>different viewpoint, but there's a bunch of other things going on. So,

0:19:06.480 --> 0:19:12.000
<v Speaker 3>for example, most social media and other systems which select

0:19:12.000 --> 0:19:15.360
<v Speaker 3>content for us, like news recommenders and so forth, rank

0:19:15.480 --> 0:19:18.560
<v Speaker 3>things based on, basically, the number of clicks they get: engagement.

0:19:19.320 --> 0:19:22.040
<v Speaker 3>And we really do pay more attention to things that

0:19:22.080 --> 0:19:24.520
<v Speaker 3>are valuable to us, but also we pay more attention

0:19:24.600 --> 0:19:31.320
<v Speaker 3>to things that are threatening, offensive, fearful. So this approach

0:19:31.359 --> 0:19:34.160
<v Speaker 3>of ranking things by how much attention we give them

0:19:34.400 --> 0:19:54.840
<v Speaker 3>also tends to amplify more extreme or scarier items.

0:19:57.800 --> 0:20:00.159
<v Speaker 1>Now, I did an episode a few episodes ago

0:20:00.240 --> 0:20:05.000
<v Speaker 1>where I mentioned the Iroquois Native Americans up in

0:20:05.240 --> 0:20:09.479
<v Speaker 1>essentially Wisconsin and Canada, some hundreds of years ago, they

0:20:09.480 --> 0:20:13.160
<v Speaker 1>were having bloody battles between these different tribes all the time,

0:20:13.680 --> 0:20:16.840
<v Speaker 1>and they had a new leader come in who came

0:20:16.880 --> 0:20:19.000
<v Speaker 1>to be known as the Great Peacemaker because one of

0:20:19.000 --> 0:20:22.680
<v Speaker 1>the things he did was assign everybody in the tribes

0:20:23.080 --> 0:20:25.879
<v Speaker 1>to different clans. So you might be a member of

0:20:25.880 --> 0:20:28.920
<v Speaker 1>the Turtle clan and I'm a member of the Heron clan,

0:20:28.960 --> 0:20:31.199
<v Speaker 1>even though we're members of the same tribe. And it

0:20:31.280 --> 0:20:35.000
<v Speaker 1>turns out these allegiances were cross cutting, so that each

0:20:35.080 --> 0:20:37.760
<v Speaker 1>person had their allegiance to their tribe and also to

0:20:37.800 --> 0:20:41.200
<v Speaker 1>their clan. And these weren't equivalent, and that ended up

0:20:41.240 --> 0:20:45.000
<v Speaker 1>bringing peace because now it wasn't so simple to have

0:20:45.080 --> 0:20:48.360
<v Speaker 1>a clear in group and out group because they were mixed.

0:20:48.760 --> 0:20:51.720
<v Speaker 1>And I know that you in your work with social

0:20:51.760 --> 0:20:55.680
<v Speaker 1>media have been looking at something similar to this, which

0:20:55.720 --> 0:21:00.600
<v Speaker 1>is this issue of how an algorithm should rank posts.

0:21:00.680 --> 0:21:01.600
<v Speaker 1>So tell us about that.

0:21:02.640 --> 0:21:05.720
<v Speaker 3>Yeah, so that is a great example. I've heard of

0:21:05.760 --> 0:21:09.439
<v Speaker 3>this Iroquois example as well. And what is happening is

0:21:09.480 --> 0:21:13.480
<v Speaker 3>conflict is tied up with identity. We fight basically against

0:21:13.560 --> 0:21:17.200
<v Speaker 3>people who are different than us. And in the last

0:21:17.200 --> 0:21:20.480
<v Speaker 3>few decades in this country, identity has become more and

0:21:20.600 --> 0:21:24.560
<v Speaker 3>more collapsed into a single dimension of you know, left

0:21:24.640 --> 0:21:28.080
<v Speaker 3>versus right, red versus blue, Republicans versus Democrats.

0:21:28.200 --> 0:21:29.640
<v Speaker 4>It didn't actually used to be this way.

0:21:30.080 --> 0:21:34.080
<v Speaker 3>People didn't identify with their political parties in the same

0:21:34.080 --> 0:21:38.880
<v Speaker 3>way a generation ago. And also there was much more

0:21:39.000 --> 0:21:44.000
<v Speaker 3>mixed identification at the national versus local level. So you know,

0:21:44.080 --> 0:21:46.360
<v Speaker 3>you would have some town where, you know, you had

0:21:46.840 --> 0:21:50.080
<v Speaker 3>a Republican mayor, but you had, you know, progressive politics

0:21:50.080 --> 0:21:52.560
<v Speaker 3>because that's what the people there believed

0:21:52.160 --> 0:21:52.760
<v Speaker 4>in and wanted.

0:21:53.760 --> 0:21:58.360
<v Speaker 3>Politics has become very nationalized. There's less local news, there's

0:21:58.480 --> 0:21:59.760
<v Speaker 3>less split ticket voting.

0:22:00.520 --> 0:22:02.440
<v Speaker 4>Every issue takes place

0:22:02.200 --> 0:22:05.359
<v Speaker 3>at this huge scale where there's only two ends of the spectrum.

0:22:06.280 --> 0:22:08.960
<v Speaker 3>So how you would have to reverse this is you

0:22:09.000 --> 0:22:13.440
<v Speaker 3>would have to have some sort of sense of not

0:22:13.480 --> 0:22:17.479
<v Speaker 3>necessarily local, but community level values.

0:22:17.560 --> 0:22:18.360
<v Speaker 4>Right, it would have to be

0:22:18.280 --> 0:22:23.200
<v Speaker 3>okay for, you know, specific groups online or perhaps discussions

0:22:23.400 --> 0:22:27.399
<v Speaker 3>to have opinions that just don't quite fall along this

0:22:27.560 --> 0:22:28.600
<v Speaker 3>left right axis.

0:22:29.160 --> 0:22:32.920
<v Speaker 1>And what's the reason that we have seen more sorting

0:22:33.240 --> 0:22:36.439
<v Speaker 1>into these groups? What's the reason that particular issues go

0:22:36.600 --> 0:22:39.280
<v Speaker 1>together that don't necessarily belong together?

0:22:40.000 --> 0:22:44.359
<v Speaker 3>Well, again, it's an identity thing. One of the

0:22:44.400 --> 0:22:47.560
<v Speaker 3>things that makes all of these issues sort of one

0:22:48.040 --> 0:22:51.960
<v Speaker 3>big blob where you only have two choices is merely

0:22:52.080 --> 0:22:55.920
<v Speaker 3>exposure to people who are far away from us, either

0:22:56.000 --> 0:22:59.720
<v Speaker 3>physically or socially. So you know, maybe you only talk

0:22:59.760 --> 0:23:01.800
<v Speaker 3>to your neighbors in your town, or you only read

0:23:01.800 --> 0:23:03.800
<v Speaker 3>the local newspaper, and so you could have this sort

0:23:03.840 --> 0:23:07.399
<v Speaker 3>of quixotic local politics in a way that was healthy.

0:23:07.720 --> 0:23:10.840
<v Speaker 3>But now we are exposed to people all across the

0:23:10.920 --> 0:23:13.840
<v Speaker 3>country and indeed the world on a daily basis. So

0:23:13.880 --> 0:23:17.439
<v Speaker 3>now we're comparing our point of view to someone you

0:23:17.440 --> 0:23:20.400
<v Speaker 3>know, that we will never meet, who's far away from us.

0:23:20.960 --> 0:23:25.160
<v Speaker 3>And there's simulations that show you don't need any antagonism

0:23:25.359 --> 0:23:27.640
<v Speaker 3>for people to sort themselves into two big tribes.

0:23:27.680 --> 0:23:30.000
<v Speaker 4>All you need is when you talk.

0:23:29.840 --> 0:23:34.320
<v Speaker 3>to someone, and they're, you know, near enough like you,

0:23:34.320 --> 0:23:36.720
<v Speaker 3>you get a little bit closer to them, right, which

0:23:36.760 --> 0:23:38.720
<v Speaker 3>is a natural human thing. When two people talk, they

0:23:38.720 --> 0:23:40.480
<v Speaker 3>tend to, you know, converge a little on

0:23:40.520 --> 0:23:46.040
<v Speaker 3>their views and identity. But because those interactions are

0:23:46.080 --> 0:23:49.520
<v Speaker 3>now happening all across the country, we're coalescing into sort

0:23:49.560 --> 0:23:50.639
<v Speaker 3>of two big groups.

0:23:51.080 --> 0:23:52.520
<v Speaker 4>It's kind of a paradox, right.

0:23:52.560 --> 0:23:54.440
<v Speaker 3>We want to be able to talk to people who

0:23:54.480 --> 0:23:56.439
<v Speaker 3>are very far away from us, you know, that's the

0:23:56.480 --> 0:23:58.440
<v Speaker 3>promise of these online networks.

0:23:58.440 --> 0:24:00.880
<v Speaker 4>And yet it seems to make our conflicts global.

0:24:01.520 --> 0:24:04.400
<v Speaker 1>And there are network science models on this right that

0:24:04.600 --> 0:24:07.840
<v Speaker 1>demonstrate how groups end up dividing like this. And what

0:24:07.960 --> 0:24:12.520
<v Speaker 1>is the evidence that these network science models actually cash

0:24:12.560 --> 0:24:13.840
<v Speaker 1>out in real life?

0:24:14.520 --> 0:24:17.159
<v Speaker 3>Yeah, So these models developed actually from models built in

0:24:17.200 --> 0:24:22.119
<v Speaker 3>the nineteen sixties to study physical racial segregation. And what

0:24:22.160 --> 0:24:26.240
<v Speaker 3>they showed is that you don't need racism in the

0:24:26.320 --> 0:24:29.119
<v Speaker 3>sense of not wanting to live near people who are

0:24:29.160 --> 0:24:31.919
<v Speaker 3>a different race than you. All you need is a

0:24:31.960 --> 0:24:34.520
<v Speaker 3>slight preference, you know, maybe a few percent to live

0:24:34.560 --> 0:24:37.280
<v Speaker 3>near people who are more like you, and that is

0:24:37.440 --> 0:24:41.640
<v Speaker 3>enough for entire neighborhoods to eventually segregate. And so those

0:24:41.680 --> 0:24:44.600
<v Speaker 3>same models are now being applied to online networks and

0:24:44.640 --> 0:24:47.679
<v Speaker 3>they show very similar results. People if they have a

0:24:47.720 --> 0:24:49.920
<v Speaker 3>slight preference for people who are maybe a little more

0:24:49.960 --> 0:24:52.000
<v Speaker 3>like them in their politics, will spend

0:24:51.840 --> 0:24:54.040
<v Speaker 4>more time, you know, in a bunch of ways, right,

0:24:54.280 --> 0:24:55.120
<v Speaker 4>hanging out

0:24:55.359 --> 0:24:59.440
<v Speaker 3>in the same groups, or maybe, you know, friending them

0:24:59.680 --> 0:25:02.040
<v Speaker 3>or following them, that sort of thing. And so we

0:25:02.119 --> 0:25:03.840
<v Speaker 3>see this sort of global splitting.

0:25:04.520 --> 0:25:07.360
<v Speaker 1>Right. So before we drop this topic, I just want

0:25:07.359 --> 0:25:10.119
<v Speaker 1>to return to this issue of what does social media

0:25:10.400 --> 0:25:12.720
<v Speaker 1>not have to do with the current polarization. And it

0:25:12.800 --> 0:25:16.080
<v Speaker 1>certainly seems like one of the issues is this, you know,

0:25:16.160 --> 0:25:19.680
<v Speaker 1>network segregation. What else do you see where you think

0:25:19.680 --> 0:25:23.440
<v Speaker 1>that social media is maybe not the sole culprit as

0:25:23.440 --> 0:25:25.000
<v Speaker 1>opposed to human behavior.

0:25:25.640 --> 0:25:25.840
<v Speaker 4>Yeah.

0:25:25.880 --> 0:25:27.719
<v Speaker 3>Well, one of the things I like to say is that,

0:25:27.840 --> 0:25:31.880
<v Speaker 3>you know, if Facebook could fix our democracy by changing

0:25:31.880 --> 0:25:34.320
<v Speaker 3>one hundred lines of code, they would have done it already, right,

0:25:34.400 --> 0:25:38.840
<v Speaker 3>if only our problem was that simple. But polarization is

0:25:38.840 --> 0:25:41.560
<v Speaker 3>one of these problems that requires an all-of-society approach.

0:25:41.640 --> 0:25:44.920
<v Speaker 3>So it's you know, social media, yes, but it's also

0:25:45.560 --> 0:25:48.920
<v Speaker 3>media journalists. Journalists are going to have to learn to

0:25:48.960 --> 0:25:52.959
<v Speaker 3>cover politics in different ways, and it's the politicians themselves.

0:25:53.440 --> 0:25:58.199
<v Speaker 3>There is always an advantage to taking a view that

0:25:58.280 --> 0:26:01.760
<v Speaker 3>is more extreme because it motivates people, and so it's

0:26:01.840 --> 0:26:04.240
<v Speaker 3>kind of a devil's bargain, right. You can get people

0:26:04.280 --> 0:26:06.720
<v Speaker 3>to turn out and show up for your cause by

0:26:07.240 --> 0:26:11.000
<v Speaker 3>pressing on a divisive issue, but doing that divides society further.

0:26:11.560 --> 0:26:13.800
<v Speaker 3>So we actually need a whole bunch of sectors of

0:26:13.920 --> 0:26:18.880
<v Speaker 3>society to move in the same direction. And there's a pair of

0:26:18.960 --> 0:26:22.640
<v Speaker 3>conflict scholars called Guy and Heidi Burgess, and they have

0:26:22.680 --> 0:26:28.760
<v Speaker 3>a wonderful article on forty different roles that people have

0:26:28.840 --> 0:26:31.760
<v Speaker 3>to play to create a healthy and peaceful democracy. And

0:26:31.800 --> 0:26:34.639
<v Speaker 3>what do they say? Well, some of them are exactly

0:26:34.640 --> 0:26:36.959
<v Speaker 3>what you'd expect, right, So we already talked about journalists

0:26:37.000 --> 0:26:40.720
<v Speaker 3>and so forth, But they also talk about issue analysts.

0:26:40.760 --> 0:26:43.120
<v Speaker 3>There are people who have to go deep on particular issues.

0:26:43.359 --> 0:26:47.119
<v Speaker 3>They talk about healers, people who present a vision where

0:26:47.200 --> 0:26:50.800
<v Speaker 3>we have emotionally healthy relationships with each other. They talk

0:26:50.840 --> 0:26:54.440
<v Speaker 3>about what they call democracy firsters, people who are concerned

0:26:54.760 --> 0:26:58.040
<v Speaker 3>first and foremost with the correct functioning of elections and

0:26:58.160 --> 0:27:00.560
<v Speaker 3>rule of law. Once you start thinking this way, you

0:27:00.600 --> 0:27:03.240
<v Speaker 3>realize there's a lot of different roles to play, and

0:27:03.240 --> 0:27:04.560
<v Speaker 3>a lot of people have to be pulling in the

0:27:04.600 --> 0:27:07.879
<v Speaker 3>same direction. And they say that what we need is

0:27:08.359 --> 0:27:12.479
<v Speaker 3>a generation of people interested in conflict who in much

0:27:12.560 --> 0:27:14.399
<v Speaker 3>the same way that we now have a generation of

0:27:14.400 --> 0:27:16.560
<v Speaker 3>people who are working on climate change.

0:27:16.920 --> 0:27:20.680
<v Speaker 1>Specifically, who are interested in doing conflict right, exactly, because

0:27:20.680 --> 0:27:24.000
<v Speaker 1>we certainly have a generation interested in conflict. What do

0:27:24.040 --> 0:27:26.640
<v Speaker 1>you see when you look around at college campuses? Jonathan,

0:27:26.680 --> 0:27:29.040
<v Speaker 1>you're located at Berkeley, and I know that you are

0:27:29.080 --> 0:27:33.280
<v Speaker 1>a very wise person who keeps a foot in both

0:27:33.359 --> 0:27:38.000
<v Speaker 1>camps and tries to see things from all sides. That's

0:27:38.040 --> 0:27:41.800
<v Speaker 1>not the reputation that Berkeley has in particular. So when

0:27:41.880 --> 0:27:44.760
<v Speaker 1>you look around at the campus, what do you see?

0:27:44.760 --> 0:27:47.359
<v Speaker 1>And is anything different about this generation than previous ones?

0:27:48.200 --> 0:27:48.560
<v Speaker 4>Yeah?

0:27:48.720 --> 0:27:50.359
<v Speaker 3>I mean, you know, that's a loaded question, right, the

0:27:50.440 --> 0:27:51.480
<v Speaker 3>kids today and so forth.

0:27:51.560 --> 0:27:53.240
<v Speaker 4>But yeah, I mean, of course Berkeley

0:27:52.880 --> 0:27:56.359
<v Speaker 3>is a liberal campus, right, you know, it's a public

0:27:56.440 --> 0:27:58.760
<v Speaker 3>university in one of the most liberal places in America.

0:27:58.840 --> 0:28:00.000
<v Speaker 4>So it has its own politics.

0:28:01.000 --> 0:28:03.560
<v Speaker 3>I do see, and of course it's not just Berkeley,

0:28:04.600 --> 0:28:08.639
<v Speaker 3>that the current generation of students are, I would say,

0:28:09.320 --> 0:28:11.680
<v Speaker 3>more involved in politics, which I would say is a

0:28:11.720 --> 0:28:17.000
<v Speaker 3>good thing, but also less open or less compromising, which

0:28:17.040 --> 0:28:21.520
<v Speaker 3>troubles me in particular when it leads to the suppression

0:28:21.720 --> 0:28:24.840
<v Speaker 3>of alternative views. Now, I don't want to say that,

0:28:25.280 --> 0:28:27.200
<v Speaker 3>you know, anybody should be able to say anything without

0:28:27.200 --> 0:28:32.000
<v Speaker 3>any consequences. I think there are real issues here, for example, inclusion.

0:28:32.400 --> 0:28:36.080
<v Speaker 3>You don't want to make people feel like they don't belong

0:28:36.040 --> 0:28:37.880
<v Speaker 4>at an academic institution.

0:28:38.520 --> 0:28:41.120
<v Speaker 3>However, where it sort of crosses a line for me

0:28:41.400 --> 0:28:44.600
<v Speaker 3>is where saying something a little unorthodox or a little

0:28:44.680 --> 0:28:50.480
<v Speaker 3>challenging becomes impossible. People are scared to do it. And

0:28:50.480 --> 0:28:53.120
<v Speaker 3>I'm not talking about, you know, the haters here. I'm

0:28:53.160 --> 0:28:56.240
<v Speaker 3>talking about people who are trying to engage in good

0:28:56.280 --> 0:28:58.960
<v Speaker 3>faith and maybe saying something that is, you know, a

0:28:58.960 --> 0:29:02.120
<v Speaker 3>little controversial, and there's just less

0:29:01.920 --> 0:29:04.120
<v Speaker 4>tolerance for that than there used to be.

0:29:04.920 --> 0:29:07.040
<v Speaker 3>And I think this is both just sort of anecdotal, you know,

0:29:07.080 --> 0:29:09.600
<v Speaker 3>talking to the faculty, but also there's a bunch of

0:29:09.720 --> 0:29:13.600
<v Speaker 3>data on this. Ironically, Berkeley, which is associated with free

0:29:13.600 --> 0:29:16.160
<v Speaker 3>speech very closely, right, we have the Free Speech Cafe

0:29:16.600 --> 0:29:18.320
<v Speaker 3>on campus, and this was the heart of the free

0:29:18.320 --> 0:29:22.680
<v Speaker 3>speech movement in the nineteen sixties, is now a place

0:29:22.760 --> 0:29:28.880
<v Speaker 3>where people are often criticizing free speech as being too permissive,

0:29:29.320 --> 0:29:30.560
<v Speaker 3>and I find that very troubling.

0:29:31.280 --> 0:29:33.800
<v Speaker 1>So how do you think about dealing with that as

0:29:33.840 --> 0:29:36.600
<v Speaker 1>you think about training the next generation, for example, with

0:29:36.600 --> 0:29:39.280
<v Speaker 1>all these different roles that we might need to have

0:29:39.400 --> 0:29:41.840
<v Speaker 1>better conflict, how do you do that?

0:29:42.600 --> 0:29:44.560
<v Speaker 3>Yeah, well, you know, I tried to do this in

0:29:44.400 --> 0:29:47.840
<v Speaker 3>my classes yesterday. I mean, it's really about sort of

0:29:48.400 --> 0:29:50.160
<v Speaker 3>I think it requires two things, right. One is you

0:29:50.200 --> 0:29:54.520
<v Speaker 3>have to create a sense of psychological safety in some way.

0:29:54.720 --> 0:29:58.080
<v Speaker 3>Right, in my classes, you know, we discuss algorithm design,

0:29:58.160 --> 0:30:01.280
<v Speaker 3>we discuss racially biased algorithms, we discuss all this stuff.

0:30:01.800 --> 0:30:03.040
<v Speaker 3>And one of the things I say at the top

0:30:03.080 --> 0:30:05.880
<v Speaker 3>of those classes is like, okay, so you know, we're

0:30:05.880 --> 0:30:08.280
<v Speaker 3>going to talk about some issues which are very charged.

0:30:08.840 --> 0:30:10.600
<v Speaker 3>I understand that some of you are going to have,

0:30:11.480 --> 0:30:16.040
<v Speaker 3>you know, real upsetting personal experience with this stuff. But

0:30:16.880 --> 0:30:18.760
<v Speaker 3>you know, I ask that you give it a shot,

0:30:19.640 --> 0:30:22.440
<v Speaker 3>and that you know, you try to engage in a

0:30:22.480 --> 0:30:26.120
<v Speaker 3>spirit of curiosity. And we're not always going to get

0:30:26.120 --> 0:30:29.080
<v Speaker 3>things right, But I don't want anyone to ever accuse

0:30:29.160 --> 0:30:32.560
<v Speaker 3>us of not being thoughtful or careful or empathetic.

0:30:33.080 --> 0:30:34.959
<v Speaker 3>And I think that's the spirit you have to approach this with,

0:30:34.960 --> 0:30:38.360
<v Speaker 3>this spirit of curiosity. So for example, why do

0:30:38.400 --> 0:30:41.120
<v Speaker 3>we disagree about this? What is it about you know,

0:30:41.520 --> 0:30:45.240
<v Speaker 3>this student's experience and that student's experience that leads to

0:30:45.240 --> 0:30:50.960
<v Speaker 3>such dramatically different conclusions on, say, affirmative action,

0:30:51.360 --> 0:30:55.480
<v Speaker 3>you know, media bias, you know, how we should deal

0:30:55.520 --> 0:30:57.800
<v Speaker 3>with crime, how we should deal with immigration. That

0:30:57.920 --> 0:31:02.080
<v Speaker 3>came from somewhere. And very often what we find when

0:31:02.120 --> 0:31:04.960
<v Speaker 3>we have the conversation that way is either there's some

0:31:05.080 --> 0:31:08.800
<v Speaker 3>intense personal experience, family background. You know, my mother came

0:31:08.840 --> 0:31:12.680
<v Speaker 3>from El Salvador, my father was deported, you know, I

0:31:12.720 --> 0:31:16.040
<v Speaker 3>had to grow up in economic uncertainty, you know, something

0:31:16.120 --> 0:31:21.360
<v Speaker 3>like this. Or we find that people shaped their ideas

0:31:21.400 --> 0:31:23.480
<v Speaker 3>from their social network and never really thought about it.

0:31:23.800 --> 0:31:26.680
<v Speaker 3>And this is part of what polarization is, is that

0:31:26.760 --> 0:31:30.920
<v Speaker 3>all of our political ideas end up sort of collapsing

0:31:30.960 --> 0:31:35.000
<v Speaker 3>into this left right axis. There's no logical reason why

0:31:35.440 --> 0:31:39.240
<v Speaker 3>if you are pro choice you should also be concerned

0:31:39.240 --> 0:31:42.920
<v Speaker 3>about climate change and yet here we are. So there's

0:31:42.960 --> 0:31:48.120
<v Speaker 3>some sort of social process that makes everybody split into two

0:31:48.120 --> 0:31:50.840
<v Speaker 3>camps in this way, and we can counteract that by

0:31:50.960 --> 0:31:54.200
<v Speaker 3>thinking carefully for ourselves and having discussions with people who

0:31:54.360 --> 0:31:55.480
<v Speaker 3>might have a different view.

0:31:56.000 --> 0:31:59.080
<v Speaker 1>So you address this in your class, but how do

0:31:59.120 --> 0:32:02.880
<v Speaker 1>you think about taking it to the larger world? And before

0:32:02.880 --> 0:32:05.160
<v Speaker 1>we get into social media algorithms, which I want to do,

0:32:05.680 --> 0:32:08.160
<v Speaker 1>you know, one of the things that struck me. Let's

0:32:08.240 --> 0:32:11.840
<v Speaker 1>take the Israel Hamas conflict going on right now. Whenever

0:32:11.880 --> 0:32:15.320
<v Speaker 1>it comes up about, okay, which side launched that rocket,

0:32:15.440 --> 0:32:19.040
<v Speaker 1>it turns out that no amount of evidence sways anybody

0:32:19.080 --> 0:32:22.800
<v Speaker 1>on that. People have their side and they generally stick

0:32:22.840 --> 0:32:24.400
<v Speaker 1>with it. As far as I can tell, this is

0:32:24.480 --> 0:32:27.520
<v Speaker 1>just a you know, view from surfing a lot of

0:32:27.520 --> 0:32:30.080
<v Speaker 1>social media on this stuff. The question is how do

0:32:30.160 --> 0:32:33.000
<v Speaker 1>you expand beyond your class to get people to ask

0:32:33.080 --> 0:32:36.760
<v Speaker 1>these questions about changing their point of view, not even

0:32:36.760 --> 0:32:39.960
<v Speaker 1>necessarily changing it, but just being willing to examine other

0:32:40.120 --> 0:32:40.960
<v Speaker 1>pieces of evidence.

0:32:41.360 --> 0:32:43.960
<v Speaker 3>First, let's talk about facts. I believe that facts matter.

0:32:44.040 --> 0:32:45.920
<v Speaker 3>I believe that they're deeply important. I used to be

0:32:45.960 --> 0:32:49.040
<v Speaker 3>an investigative journalist. We can't know everything, but we can

0:32:49.160 --> 0:32:54.080
<v Speaker 3>know some things. The problem is what facts mean depends

0:32:54.080 --> 0:32:57.000
<v Speaker 3>on who you ask. So in the you know Israel

0:32:57.040 --> 0:33:00.680
<v Speaker 3>Palestine conflict, the fact that Palestinians were living there before

0:33:00.760 --> 0:33:04.120
<v Speaker 3>nineteen forty seven is a fact that matters deeply to

0:33:04.160 --> 0:33:07.200
<v Speaker 3>a lot of people. The fact that Jews were living

0:33:07.200 --> 0:33:09.400
<v Speaker 3>there three thousand years ago is a fact that matters

0:33:09.440 --> 0:33:12.040
<v Speaker 3>deeply to a lot of people. So these facts are

0:33:12.080 --> 0:33:14.480
<v Speaker 3>not in dispute. What is in dispute is the meaning

0:33:14.560 --> 0:33:18.880
<v Speaker 3>of these facts, which points to the problem being fundamentally

0:33:18.920 --> 0:33:22.160
<v Speaker 3>relational in many cases rather than factual, and people can

0:33:22.200 --> 0:33:25.160
<v Speaker 3>be misinformed. I'm not disputing that, but often what you

0:33:25.240 --> 0:33:28.200
<v Speaker 3>find is it is about how people feel about each

0:33:28.240 --> 0:33:30.120
<v Speaker 3>other and about how they are relating to each other.

0:33:30.240 --> 0:33:32.680
<v Speaker 4>Now, so you ask what we can do.

0:33:33.240 --> 0:33:35.760
<v Speaker 3>So this is probably the moment to mention that I

0:33:35.800 --> 0:33:38.959
<v Speaker 3>write a newsletter called the Better Conflict Bulletin, and we

0:33:39.000 --> 0:33:40.360
<v Speaker 3>are news and analysis for

0:33:40.360 --> 0:33:41.200
<v Speaker 4>a better culture war.

0:33:41.640 --> 0:33:45.560
<v Speaker 3>And the idea here is that many people have this

0:33:45.840 --> 0:33:50.800
<v Speaker 3>gut sense that we're fighting ugly, and we try to

0:33:50.840 --> 0:33:53.960
<v Speaker 3>explore in this newsletter what would it be to fight better?

0:33:54.400 --> 0:33:59.600
<v Speaker 3>So we cover conflict research, the science of how people

0:34:00.520 --> 0:34:05.120
<v Speaker 3>misunderstand and misperceive each other, and we cover people who

0:34:05.160 --> 0:34:08.319
<v Speaker 3>are successfully navigating the culture in a more productive way.

0:34:08.920 --> 0:34:12.279
<v Speaker 3>So there are people exploring these topics, but there's not a

0:34:12.280 --> 0:34:17.360
<v Speaker 3>lot of coverage for it because, as they say in

0:34:17.400 --> 0:34:21.480
<v Speaker 3>the peace building field, sometimes peace has no natural constituency.

0:34:21.680 --> 0:34:24.399
<v Speaker 3>It's very easy to get people excited about winning. It's

0:34:24.400 --> 0:34:27.520
<v Speaker 3>harder to get people excited about living together harmoniously.

0:34:27.800 --> 0:34:29.960
<v Speaker 1>That's right. That's why, when I met you, I just met

0:34:30.000 --> 0:34:32.239
<v Speaker 1>you very recently, Jonathan, just a few weeks ago, and

0:34:32.280 --> 0:34:35.399
<v Speaker 1>I was so excited by the kind of work you're doing.

0:34:35.440 --> 0:34:38.200
<v Speaker 1>Because I come from a neuroscience angle. I study a

0:34:38.200 --> 0:34:41.279
<v Speaker 1>lot about in groups and outgroups and empathy and how

0:34:41.320 --> 0:34:45.920
<v Speaker 1>we so easily relegate others to a different group. I

0:34:46.000 --> 0:34:47.839
<v Speaker 1>was so excited to learn that you and others are

0:34:47.840 --> 0:34:51.719
<v Speaker 1>doing the boots on the ground work of trying to

0:34:52.120 --> 0:34:56.040
<v Speaker 1>bring groups of people together. And so I still want

0:34:56.040 --> 0:34:58.719
<v Speaker 1>to drill in on this point though. So you've got

0:34:58.719 --> 0:35:01.880
<v Speaker 1>the newsletter, which I'm a paid subscriber to.

0:35:02.440 --> 0:35:04.720
<v Speaker 1>What are the things that you think about though, besides

0:35:04.760 --> 0:35:06.440
<v Speaker 1>social media, which we'll get to in a second. And

0:35:06.480 --> 0:35:09.799
<v Speaker 1>that's obviously a big leverage point. But are there any

0:35:09.840 --> 0:35:12.080
<v Speaker 1>other things you can do when you think about how

0:35:12.080 --> 0:35:13.720
<v Speaker 1>do we get people to fight less ugly?

0:35:14.520 --> 0:35:17.319
<v Speaker 3>Yeah, so there's a bunch of sort of immediate things

0:35:17.360 --> 0:35:20.600
<v Speaker 3>that you can do. So, first of all, there's this

0:35:21.600 --> 0:35:25.520
<v Speaker 3>perception gap, so misperceptions, so we tend to be both

0:35:25.600 --> 0:35:29.200
<v Speaker 3>misinformed about what the other side actually believes and we

0:35:29.280 --> 0:35:33.280
<v Speaker 3>tend to stereotype them, meaning you know, if a Republican

0:35:33.320 --> 0:35:35.960
<v Speaker 3>thinks about a Democrat, or a Democrat thinks about a Republican,

0:35:36.000 --> 0:35:39.919
<v Speaker 3>they think about the most extreme version of that, right.

0:35:40.400 --> 0:35:41.920
<v Speaker 3>And you can see this in the data when you

0:35:41.920 --> 0:35:44.440
<v Speaker 3>ask people, you know, what is the distribution

0:35:44.600 --> 0:35:48.239
<v Speaker 3>of how many, you know, of the opposite side

0:35:48.280 --> 0:35:50.279
<v Speaker 3>would support, let's say, violence if they don't get their

0:35:50.280 --> 0:35:52.680
<v Speaker 3>way in an election, right, and you get these, like,

0:35:52.880 --> 0:35:56.279
<v Speaker 3>very extreme distributions, where in reality there's actually much

0:35:56.320 --> 0:35:59.720
<v Speaker 3>more overlap. So yeah, first, you can inform yourself

0:35:59.719 --> 0:36:02.480
<v Speaker 3>about what other people actually believe in.

0:36:02.560 --> 0:36:03.560
<v Speaker 4>Honest in its way.

0:36:04.040 --> 0:36:09.000
<v Speaker 3>Second, there is a set of techniques for how to

0:36:09.040 --> 0:36:15.160
<v Speaker 3>have conversations with someone who, not just, like, you know,

0:36:15.280 --> 0:36:18.600
<v Speaker 3>a polite conversation, but like genuinely has different values than you.

0:36:19.400 --> 0:36:22.200
<v Speaker 4>And it's got to start with curiosity. It's got to

0:36:22.200 --> 0:36:22.960
<v Speaker 4>start with listening.

0:36:23.520 --> 0:36:26.239
<v Speaker 3>And I would suggest don't go into those conversations with

0:36:26.280 --> 0:36:27.719
<v Speaker 3>the goal of changing someone's mind.

0:36:27.800 --> 0:36:29.480
<v Speaker 4>You wouldn't want them to do that with you.

0:36:29.920 --> 0:36:32.040
<v Speaker 3>Go into those conversations with the goal of trying to

0:36:32.120 --> 0:36:36.200
<v Speaker 3>understand how they got to that place. Because when we're

0:36:36.200 --> 0:36:40.080
<v Speaker 3>talking about the American conflict, we're talking about you disagree

0:36:40.120 --> 0:36:43.480
<v Speaker 3>with half of America. Well half of America, you know,

0:36:43.600 --> 0:36:48.360
<v Speaker 3>isn't stupid. You know, they're not like fundamentally broken or

0:36:48.440 --> 0:36:51.319
<v Speaker 3>evil or something. They're going to be pretty average, just

0:36:51.360 --> 0:36:53.560
<v Speaker 3>like the other half of America. So how is it

0:36:53.640 --> 0:36:57.280
<v Speaker 3>that smart and kind people can end up believing something

0:36:57.320 --> 0:37:01.200
<v Speaker 3>completely different than you. That's the thing to get curious about.

0:37:01.760 --> 0:37:04.279
<v Speaker 3>And the third thing I would say is watch for

0:37:04.360 --> 0:37:08.399
<v Speaker 3>your own emotional reactions. Watch for where you know, you

0:37:08.440 --> 0:37:13.560
<v Speaker 3>get you know, angry or let's just say uptight, or

0:37:14.000 --> 0:37:15.799
<v Speaker 3>you know, really have this feeling that you have to

0:37:15.840 --> 0:37:21.000
<v Speaker 3>defend something or protect something, because those reactions will lead

0:37:21.040 --> 0:37:24.880
<v Speaker 3>you to first, what is it that you're scared of

0:37:24.960 --> 0:37:25.800
<v Speaker 3>that you're worried about?

0:37:25.840 --> 0:37:27.320
<v Speaker 4>Where are your fears and concerns?

0:37:27.560 --> 0:37:32.560
<v Speaker 3>And second, if you can notice those and not be

0:37:32.800 --> 0:37:35.680
<v Speaker 3>overtaken by them, you can have much better relationships with

0:37:35.680 --> 0:37:38.400
<v Speaker 3>people who disagree with you. And actually, one of the

0:37:38.520 --> 0:37:43.240
<v Speaker 3>practices that we teach is speak them aloud, right,

0:37:43.320 --> 0:37:46.600
<v Speaker 3>don't project your anger onto the person across from you,

0:37:46.840 --> 0:37:49.880
<v Speaker 3>who you're probably stereotyping, who you know, never did you

0:37:49.920 --> 0:37:52.520
<v Speaker 3>personally any wrong. Just say, you know, when you say that,

0:37:53.800 --> 0:37:56.720
<v Speaker 3>I get very upset, I have a reaction. I feel anger

0:37:56.840 --> 0:37:59.760
<v Speaker 3>thinking about that, without directing it towards the other person.

0:38:00.239 --> 0:38:03.160
<v Speaker 3>We can't hide our emotions. If we're going to relate better,

0:38:03.200 --> 0:38:04.360
<v Speaker 3>they have to be on the table.

0:38:20.520 --> 0:38:23.480
<v Speaker 1>If you were suddenly assigned to be the education czar

0:38:23.520 --> 0:38:27.239
<v Speaker 1>for the nation, how would you build junior high or

0:38:27.320 --> 0:38:32.080
<v Speaker 1>high school classes to teach kids about this, about how

0:38:32.120 --> 0:38:35.080
<v Speaker 1>to relate, about how to, let's say, steelman each other's

0:38:35.239 --> 0:38:37.959
<v Speaker 1>arguments so that they can understand them better.

0:38:38.760 --> 0:38:43.879
<v Speaker 3>Yeah, so what we need to do is give young

0:38:44.000 --> 0:38:49.799
<v Speaker 3>people the ability and the experience of relating successfully across

0:38:50.520 --> 0:38:53.279
<v Speaker 3>value divides. And there's actually a bunch of organizations that

0:38:53.360 --> 0:38:56.000
<v Speaker 3>do this both at the K through twelve and the

0:38:56.080 --> 0:38:59.399
<v Speaker 3>university level. You know, some of these programs

0:38:59.400 --> 0:39:03.000
<v Speaker 3>are in class, teacher led, where they ask students,

0:39:03.640 --> 0:39:05.160
<v Speaker 3>you know, what is the thing you

0:39:05.160 --> 0:39:08.760
<v Speaker 3>feel very strongly about and either try to find students

0:39:08.800 --> 0:39:12.319
<v Speaker 3>who have a different opinion and teach them to constructively

0:39:12.400 --> 0:39:16.120
<v Speaker 3>talk about that, or do things like there was a

0:39:16.160 --> 0:39:19.440
<v Speaker 3>recent New York Times piece about a writing teacher who says,

0:39:19.960 --> 0:39:22.400
<v Speaker 3>come up with a character who you think is a

0:39:22.480 --> 0:39:27.120
<v Speaker 3>terrible human being, or something that absolutely no one should

0:39:27.120 --> 0:39:30.520
<v Speaker 3>ever say, and write a story where they say that

0:39:30.600 --> 0:39:33.560
<v Speaker 3>thing in context in a way which is sympathetic or

0:39:33.600 --> 0:39:36.640
<v Speaker 3>humanizing towards them. And I think this is the

0:39:36.680 --> 0:39:40.520
<v Speaker 3>fundamental skill to be able to see the humanity in people,

0:39:40.640 --> 0:39:45.919
<v Speaker 3>even in moments of profound disagreement. And I don't mean

0:39:45.920 --> 0:39:51.000
<v Speaker 3>to minimize the actual stakes of these types of conversations, right.

0:39:51.080 --> 0:39:53.920
<v Speaker 3>You know, if you are an immigrant who's talking to

0:39:53.960 --> 0:39:56.719
<v Speaker 3>someone who says, you know, we shouldn't allow any more

0:39:56.719 --> 0:40:00.440
<v Speaker 3>immigration to this country, that has a deep personal impact

0:40:00.480 --> 0:40:02.120
<v Speaker 3>on you, because it may mean you never get to

0:40:02.120 --> 0:40:03.120
<v Speaker 3>see your family again.

0:40:03.600 --> 0:40:03.799
<v Speaker 4>Right.

0:40:04.080 --> 0:40:06.480
<v Speaker 3>So I'm not saying that, you know, we should all

0:40:06.480 --> 0:40:09.000
<v Speaker 3>talk until we get along. I'm saying we have to

0:40:09.040 --> 0:40:14.000
<v Speaker 3>see our political adversaries as human. And there are a

0:40:14.040 --> 0:40:16.480
<v Speaker 3>bunch of organizations who try to train people to do

0:40:16.520 --> 0:40:19.719
<v Speaker 3>this and also give people the experience of I had

0:40:19.760 --> 0:40:23.240
<v Speaker 3>a conversation with someone who disagrees with me, and it went okay.

0:40:24.040 --> 0:40:28.400
<v Speaker 3>We are aversive to these conversations. They're hard, they're emotionally taxing,

0:40:28.960 --> 0:40:31.560
<v Speaker 3>there's no guarantee they're going to come out well. So

0:40:31.600 --> 0:40:34.560
<v Speaker 3>we have to give people the confidence to engage in

0:40:34.600 --> 0:40:35.040
<v Speaker 3>this way.

0:40:36.280 --> 0:40:38.719
<v Speaker 1>So I'm so glad to hear you use the word humanizing,

0:40:38.760 --> 0:40:41.920
<v Speaker 1>because that's really the key from a neuroscience point of view.

0:40:41.960 --> 0:40:44.440
<v Speaker 1>The issue is that people in our outgroup, we

0:40:44.520 --> 0:40:48.560
<v Speaker 1>actually analyze them with our brains in a different way,

0:40:48.719 --> 0:40:51.520
<v Speaker 1>such that they are more like an object than a

0:40:51.600 --> 0:40:55.680
<v Speaker 1>fellow human. And that's what opens the door to

0:40:56.640 --> 0:41:00.480
<v Speaker 1>you know, genocide of various sorts, or, at a lower level,

0:41:00.600 --> 0:41:03.480
<v Speaker 1>you know, violence, or even just insulting or whatever the

0:41:03.520 --> 0:41:05.640
<v Speaker 1>thing is. We just don't care about them in the

0:41:05.640 --> 0:41:07.720
<v Speaker 1>way that we care about the people that we consider

0:41:08.160 --> 0:41:11.080
<v Speaker 1>in our in group. And yet obviously we're all made

0:41:11.080 --> 0:41:13.480
<v Speaker 1>of the same biology. We all come about from our

0:41:13.520 --> 0:41:17.799
<v Speaker 1>genetics and our experience, and it feels like humanization is

0:41:17.840 --> 0:41:20.080
<v Speaker 1>really the key. So you were just telling me about

0:41:20.120 --> 0:41:22.160
<v Speaker 1>what could be done with high school kids. Tell me

0:41:22.520 --> 0:41:25.239
<v Speaker 1>what you and others are doing with adults in terms

0:41:25.280 --> 0:41:28.240
<v Speaker 1>of helping them bridge the gap.

0:41:28.920 --> 0:41:31.480
<v Speaker 3>So there's a bunch of bridge building organizations. There's a

0:41:31.480 --> 0:41:33.880
<v Speaker 3>big one called Braver Angels. And what this is is

0:41:33.920 --> 0:41:38.440
<v Speaker 3>an organization which organizes they call them Red Blue Conversations.

0:41:39.200 --> 0:41:41.799
<v Speaker 3>There's a bunch of local chapters, plus they do them

0:41:41.800 --> 0:41:44.120
<v Speaker 3>online so you can actually sign up and say, you know,

0:41:44.680 --> 0:41:47.319
<v Speaker 3>I'm in you know, this town in Indiana, and I

0:41:47.360 --> 0:41:49.040
<v Speaker 3>want to have a conversation with people who.

0:41:48.880 --> 0:41:49.440
<v Speaker 4>Disagree with me.

0:41:50.000 --> 0:41:53.239
<v Speaker 3>And they have a particular way that they mediate and

0:41:53.280 --> 0:41:57.120
<v Speaker 3>facilitate these conversations to make them productive. And so if

0:41:57.120 --> 0:41:59.520
<v Speaker 3>you want to have that encounter, you can have it.

0:41:59.520 --> 0:42:00.920
<v Speaker 3>They are a group that will set that up for

0:42:00.960 --> 0:42:02.120
<v Speaker 3>you and show you how to do it.

0:42:02.360 --> 0:42:04.239
<v Speaker 1>Do the people who sign up for that are they

0:42:04.600 --> 0:42:06.920
<v Speaker 1>are some of them just trying to win, and they think, yes,

0:42:07.000 --> 0:42:08.759
<v Speaker 1>I want to meet people from the other side, so

0:42:08.800 --> 0:42:09.920
<v Speaker 1>I can convince them.

0:42:10.600 --> 0:42:16.200
<v Speaker 3>Possibly, But the moderation format is designed to prevent that.

0:42:16.520 --> 0:42:18.480
<v Speaker 3>They don't let people bludgeon each other. And they

0:42:18.560 --> 0:42:21.120
<v Speaker 3>say right up front, the goal here is not to

0:42:21.200 --> 0:42:24.799
<v Speaker 3>change someone's mind. The goal is to increase understanding.

0:42:25.320 --> 0:42:27.160
<v Speaker 1>Oh that's lovely, Okay, great, So you're about to tell

0:42:27.160 --> 0:42:28.240
<v Speaker 1>me about another organization.

0:42:28.760 --> 0:42:31.239
<v Speaker 3>Yeah, so there's a bunch of organizations doing this kind

0:42:31.239 --> 0:42:34.759
<v Speaker 3>of bridging work. I'm also involved in an organization called

0:42:34.800 --> 0:42:37.960
<v Speaker 3>the Dignity Index. And what this is is they've created

0:42:37.960 --> 0:42:42.880
<v Speaker 3>a scale from contempt to dignity to rate political speech.

0:42:43.200 --> 0:42:47.520
<v Speaker 3>So when a politician talks about their opponent, are they saying,

0:42:47.719 --> 0:42:50.080
<v Speaker 3>you know, those people are evil, we have to destroy

0:42:50.120 --> 0:42:53.760
<v Speaker 3>them to save America. Or are they saying, I respect

0:42:53.840 --> 0:42:56.360
<v Speaker 3>what they believe. I believe something different. I think my

0:42:56.440 --> 0:42:58.160
<v Speaker 3>plan is better and if I win, we're going to

0:42:58.200 --> 0:43:01.640
<v Speaker 3>work together. Those are really different things, and the scale

0:43:01.640 --> 0:43:04.680
<v Speaker 3>actually goes, it's an eight-point scale. It goes from

0:43:04.960 --> 0:43:08.719
<v Speaker 3>literally calling for genocide to literally saying that you see

0:43:08.719 --> 0:43:11.839
<v Speaker 3>no difference between self and other. And so what

0:43:11.880 --> 0:43:14.480
<v Speaker 3>they're using this for is a couple different things. First

0:43:14.640 --> 0:43:18.080
<v Speaker 3>is they're putting together sort of scorecards in the upcoming election,

0:43:18.760 --> 0:43:22.440
<v Speaker 3>that rate different candidates, especially at the local level, on

0:43:22.480 --> 0:43:24.960
<v Speaker 3>whether they speak with contempt or dignity.

0:43:25.040 --> 0:43:25.839
<v Speaker 4>And the second thing.

0:43:25.719 --> 0:43:30.440
<v Speaker 3>They're doing is organizing student groups in universities to train

0:43:30.560 --> 0:43:34.759
<v Speaker 3>people to rate political speech on this scale, which is

0:43:34.840 --> 0:43:38.440
<v Speaker 3>less about producing the ratings and more about getting people

0:43:38.560 --> 0:43:41.880
<v Speaker 3>to think in this way and to notice when people

0:43:41.920 --> 0:43:47.120
<v Speaker 3>are engaging in let's say, constructive versus destructive disagreement.
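
The eight-point scale just described runs from contempt to dignity. A minimal sketch of how such a rubric might be represented follows, with one important caveat: the conversation only specifies the endpoints (calling for genocide at one end, seeing no difference between self and other at the other), so every intermediate label here is a hypothetical placeholder, not the actual Dignity Index rubric.

```python
# Hypothetical sketch of an eight-point contempt-to-dignity scale.
# Only the endpoints (1 and 8) come from the conversation; the
# intermediate labels are illustrative placeholders, NOT the real
# Dignity Index rubric.
DIGNITY_SCALE = {
    1: "calls for violence against or elimination of the other side",
    2: "dehumanizes the other side",                 # hypothetical label
    3: "attacks the other side's moral character",   # hypothetical label
    4: "mocks or dismisses the other side",          # hypothetical label
    5: "states disagreement without attacking people",   # hypothetical
    6: "engages the other side's actual arguments",      # hypothetical
    7: "affirms the other side's dignity while disagreeing",  # hypothetical
    8: "sees no difference between self and other",
}

def rate_speech(score: int) -> str:
    """Map a rater's numeric score to its scale description."""
    if score not in DIGNITY_SCALE:
        raise ValueError("score must be between 1 and 8")
    return DIGNITY_SCALE[score]

def is_contempt(score: int) -> bool:
    """Treat the lower half of the scale as the contempt side."""
    return score <= 4
```

Keeping the scale as plain data makes it easy to swap in the organization's real level descriptions without touching the rating logic.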

0:43:47.880 --> 0:43:51.520
<v Speaker 1>Excellent. Okay, so those are things on an individual level

0:43:51.640 --> 0:43:55.680
<v Speaker 1>that can be done. On a societal level, tell me

0:43:55.760 --> 0:43:58.480
<v Speaker 1>how you think about, for example, journalism and what might

0:43:58.520 --> 0:43:59.080
<v Speaker 1>be done there.

0:44:00.040 --> 0:44:00.239
<v Speaker 4>Yeah.

0:44:00.280 --> 0:44:03.600
<v Speaker 3>So, as I mentioned, I used to be a professional journalist.

0:44:03.600 --> 0:44:05.640
<v Speaker 3>I was an editor at the AP, I was an

0:44:05.640 --> 0:44:10.000
<v Speaker 3>investigative journalist at ProPublica. So I've seen the machine from

0:44:10.080 --> 0:44:14.040
<v Speaker 3>the inside. And what I will say about journalists is,

0:44:14.040 --> 0:44:16.760
<v Speaker 3>first of all, I have enormous respect for my colleagues

0:44:16.760 --> 0:44:19.600
<v Speaker 3>in journalism, and they are some of the most deeply

0:44:19.640 --> 0:44:22.480
<v Speaker 3>idealistic people that I know. Right, this isn't

0:44:22.520 --> 0:44:27.480
<v Speaker 3>a conspiracy. But most of them are pretty politically liberal,

0:44:27.920 --> 0:44:32.120
<v Speaker 3>and that's not a conspiracy either. That is because, especially

0:44:32.200 --> 0:44:34.840
<v Speaker 3>with the decline of local news, most of them work

0:44:35.040 --> 0:44:38.000
<v Speaker 3>for national outlets in big cities on.

0:44:37.960 --> 0:44:38.600
<v Speaker 4>The East Coast.

0:44:39.239 --> 0:44:42.920
<v Speaker 3>The social context in which they exist is pretty liberal,

0:44:42.920 --> 0:44:45.400
<v Speaker 3>and in particular, much farther to the left than the

0:44:45.440 --> 0:44:46.320
<v Speaker 3>median American.

0:44:47.680 --> 0:44:48.719
<v Speaker 4>You know what that.

0:44:48.719 --> 0:44:53.719
<v Speaker 3>Means is they will have less contact with and less

0:44:53.800 --> 0:44:59.040
<v Speaker 3>understanding of conservatives, which means that conservatives will not see

0:44:59.280 --> 0:45:04.000
<v Speaker 3>their views or identity reflected in the coverage of mainstream journalism.

0:45:04.400 --> 0:45:07.160
<v Speaker 1>And this is because conservatives tend to live in more

0:45:07.239 --> 0:45:08.000
<v Speaker 1>rural areas.

0:45:08.200 --> 0:45:14.640
<v Speaker 3>Yes, yeah, it's because demographically journalists are not like conservatives, right.

0:45:14.680 --> 0:45:18.359
<v Speaker 3>And it's not, again, this isn't a conspiracy. It's

0:51:18.719 --> 0:51:19.839
<v Speaker 3>just pure demographics.

0:45:19.920 --> 0:45:24.000
<v Speaker 4>Right. They are educated, urban people.

0:45:24.640 --> 0:45:27.280
<v Speaker 3>And you know, one of the strongest correlates of political

0:45:27.320 --> 0:45:29.560
<v Speaker 3>identity is population density.

0:45:29.640 --> 0:45:32.719
<v Speaker 4>Right, It's that simple in many ways. That's another way

0:45:32.719 --> 0:45:34.400
<v Speaker 4>you can talk about the divides in this country is

0:45:34.440 --> 0:45:35.399
<v Speaker 4>between urban and rural.

0:45:36.320 --> 0:45:38.960
<v Speaker 3>So anyway, given that that is the state of affairs,

0:45:38.960 --> 0:45:42.400
<v Speaker 3>what you get is there are only a few media outlets,

0:45:42.880 --> 0:45:48.200
<v Speaker 3>notably Fox, which speak in a language and a value

0:45:48.239 --> 0:45:53.359
<v Speaker 3>system which resonates with conservatives, and that leaves a sort

0:45:53.400 --> 0:45:56.280
<v Speaker 3>of vacuum where people who want that kind of coverage

0:45:56.360 --> 0:46:00.760
<v Speaker 3>have to end up going to let's say, less credible

0:46:00.960 --> 0:46:03.080
<v Speaker 3>or even fringe news sites. Right, so you start to

0:46:03.080 --> 0:46:07.600
<v Speaker 3>get to your Newsmax or your One America, and whatever

0:46:07.640 --> 0:46:11.120
<v Speaker 3>you can say about their politics from a pure sort

0:46:11.120 --> 0:46:15.719
<v Speaker 3>of journalism quality perspective, they're just not very good. And

0:46:15.760 --> 0:46:18.919
<v Speaker 3>so that's part of why we see, you know, higher

0:46:19.040 --> 0:46:22.279
<v Speaker 3>rates of misinformation and so forth on the political right

0:46:22.360 --> 0:46:28.640
<v Speaker 3>in the US. And my suggestion, which is somewhat controversial,

0:46:29.320 --> 0:46:32.680
<v Speaker 3>is that we need more conservative journalists. We need more

0:46:32.680 --> 0:46:37.440
<v Speaker 3>well trained people who understand the values of the people

0:46:37.480 --> 0:46:40.640
<v Speaker 3>who most journalists don't cover well.

0:46:41.280 --> 0:46:42.920
<v Speaker 1>Okay, So, by the way, you just pointed out this

0:46:42.960 --> 0:46:47.319
<v Speaker 1>difference between left and right wing journalism because of the

0:46:47.400 --> 0:46:51.600
<v Speaker 1>distribution in the country between rural and urban. But generally,

0:46:51.600 --> 0:46:55.160
<v Speaker 1>one of the things I've found so important is this issue.

0:46:55.200 --> 0:46:58.200
<v Speaker 1>I know this is something you've looked at about, for example,

0:46:58.280 --> 0:47:02.400
<v Speaker 1>conspiracy theories on the left and the right, and essentially

0:47:02.440 --> 0:47:05.799
<v Speaker 1>that they are equal. In other words, both sides, all

0:47:05.880 --> 0:47:09.840
<v Speaker 1>parties are just as subject to this sort of thinking.

0:47:10.400 --> 0:47:13.920
<v Speaker 1>And yet both parties accuse the other of this, just

0:47:13.960 --> 0:47:16.279
<v Speaker 1>in the way that both parties accuse the other of

0:47:16.400 --> 0:47:20.200
<v Speaker 1>doing book banning when both are guilty of this. Where

0:47:20.239 --> 0:47:22.799
<v Speaker 1>else do you see this sort of thing? I'm looking

0:47:22.800 --> 0:47:25.360
<v Speaker 1>for other examples where the left and the right accuse

0:47:25.440 --> 0:47:27.400
<v Speaker 1>each other of things that they are equally guilty of.

0:47:28.239 --> 0:47:31.560
<v Speaker 3>Yeah, so you're raising the issue of symmetry versus asymmetry

0:47:31.600 --> 0:47:34.279
<v Speaker 3>in conflict, and this is a big issue, right. So

0:47:34.360 --> 0:47:39.080
<v Speaker 3>you have broadly speaking, sort of two schools of thought

0:47:39.280 --> 0:47:42.279
<v Speaker 3>or ways that people talk about conflict. One is, you know,

0:47:42.360 --> 0:47:46.719
<v Speaker 3>their side is obviously worse, that they want to destroy democracy.

0:47:47.160 --> 0:47:51.000
<v Speaker 3>They're the oppressor, they're the abuser. And you have this

0:47:51.040 --> 0:47:53.799
<v Speaker 3>other way of talking about it, which is, look, we're

0:47:53.840 --> 0:47:57.239
<v Speaker 3>all human, we are all contributing to being locked in

0:47:57.280 --> 0:48:02.600
<v Speaker 3>this escalating conflict spiral. Nobody's immune from misperceptions or mistakes.

0:48:03.760 --> 0:48:07.080
<v Speaker 3>Both of these things can be true, right. There really

0:48:07.200 --> 0:48:12.319
<v Speaker 3>are cases where one side is doing heinous things and

0:48:12.840 --> 0:48:15.000
<v Speaker 3>the other side is not, or at least there's some sort

0:48:15.040 --> 0:48:18.440
<v Speaker 3>of difference between the two. And I think part of

0:48:18.719 --> 0:48:23.280
<v Speaker 3>thinking about constructive conflict is bringing in issues of justice,

0:48:24.440 --> 0:48:28.520
<v Speaker 3>so you know, sometimes it really is on one side

0:48:28.560 --> 0:48:32.080
<v Speaker 3>to change their behavior, and that's where we require accountability

0:48:32.080 --> 0:48:36.279
<v Speaker 3>in various forms. On the other hand, conflict is a case,

0:48:36.360 --> 0:48:38.799
<v Speaker 3>especially conflict escalation is a case where it really does

0:48:38.840 --> 0:48:42.360
<v Speaker 3>take two to tango. I tend to think about the

0:48:42.400 --> 0:48:46.560
<v Speaker 3>symmetries more than the asymmetries in the American conflict, largely

0:48:46.600 --> 0:48:50.440
<v Speaker 3>because everyone else is focusing on the asymmetries. And so

0:48:50.480 --> 0:48:53.680
<v Speaker 3>you mentioned conspiracy theories, so that's a great example. The

0:48:53.760 --> 0:48:59.120
<v Speaker 3>sort of media narrative generally is that there's much more

0:48:59.160 --> 0:49:02.080
<v Speaker 3>sort of conspiratorial thinking on the right in the US.

0:49:03.239 --> 0:49:07.000
<v Speaker 3>And if you make a huge list of conspiracy theories,

0:49:08.040 --> 0:49:12.160
<v Speaker 3>you know, everything from you know, secret Jewish cabal's controlling

0:49:12.200 --> 0:49:16.960
<v Speaker 3>the world to Holocaust denial to chemtrails, what you find

0:49:17.160 --> 0:49:20.799
<v Speaker 3>is that it's pretty bipartisan. You know, about the

0:49:20.880 --> 0:49:24.719
<v Speaker 3>same number of conspiracy theories are more commonly believed

0:49:24.440 --> 0:49:26.319
<v Speaker 4>on the left as opposed to the right.

0:49:27.200 --> 0:49:33.000
<v Speaker 3>However, if you look at misinformation consumption, you find that

0:49:33.200 --> 0:49:35.840
<v Speaker 3>it is definitely more of a right wing thing. And

0:49:35.880 --> 0:49:38.439
<v Speaker 3>I want to put a sort of big asterisk

0:49:38.520 --> 0:49:41.920
<v Speaker 3>here and say, well, you know, doesn't this depend on

0:49:41.960 --> 0:49:46.839
<v Speaker 3>who's defining misinformation? And what we find is that when

0:49:46.880 --> 0:49:49.239
<v Speaker 3>you ask bipartisan panels, so you get a bunch of

0:49:49.280 --> 0:49:53.560
<v Speaker 3>Democrats and Republicans together and you put a news article in

0:49:53.600 --> 0:49:56.880
<v Speaker 3>front of them, or a let's say, a purported news article,

0:49:57.280 --> 0:49:59.239
<v Speaker 3>and you say, you know, is this true or not?

0:49:59.400 --> 0:50:01.240
<v Speaker 3>You know, take your time, you can use any reference

0:50:01.320 --> 0:50:03.560
<v Speaker 3>materials you want, you know, look it

0:50:03.640 --> 0:50:08.160
<v Speaker 3>up online. You find that there is generally strong agreement

0:50:08.280 --> 0:50:11.960
<v Speaker 3>between bipartisan panels and professional fact checkers. This is the

0:50:12.040 --> 0:50:15.279
<v Speaker 3>level of evidence that I want to see to say

0:50:15.280 --> 0:50:16.759
<v Speaker 3>that there really is an asymmetry.

0:50:16.960 --> 0:50:17.960
<v Speaker 4>And I do think it's true.

0:50:18.000 --> 0:50:23.359
<v Speaker 3>There's just much more low quality information circulating in right

0:50:23.360 --> 0:50:24.759
<v Speaker 3>wing spaces.

0:50:24.719 --> 0:50:26.880
<v Speaker 1>And this is because of the journalism issue that you

0:50:26.920 --> 0:50:27.800
<v Speaker 1>were mentioning.

0:50:28.440 --> 0:50:29.920
<v Speaker 3>Yeah, I think there's a number of things going on

0:50:30.040 --> 0:50:31.759
<v Speaker 3>right. One of them is there's sort of a news

0:50:31.840 --> 0:50:36.440
<v Speaker 3>void, right. There just isn't a lot of right wing journalism.

0:50:36.520 --> 0:50:38.560
<v Speaker 3>So if people have a demand for that, then, you know,

0:50:38.640 --> 0:50:43.560
<v Speaker 3>that creates an incentive for people who don't really care

0:50:43.600 --> 0:50:45.840
<v Speaker 3>about the journalism to publish things that are going to

0:50:45.840 --> 0:50:49.840
<v Speaker 3>get attention, because there isn't anything else in

0:50:49.880 --> 0:50:52.520
<v Speaker 3>that political space that is well done. So

0:50:52.600 --> 0:50:54.920
<v Speaker 3>I think that is real. But I want,

0:50:55.560 --> 0:50:58.480
<v Speaker 3>when people say there's an asymmetry, right, it's

0:50:58.560 --> 0:51:00.759
<v Speaker 3>really those people who are the problem. I think we

0:51:00.800 --> 0:51:03.320
<v Speaker 3>should have a high bar for evidence. We should have

0:51:03.400 --> 0:51:06.239
<v Speaker 3>a high standard for saying, yes, this is real. It

0:51:06.320 --> 0:51:09.960
<v Speaker 3>isn't just a cudgel that you're using to try to

0:51:10.000 --> 0:51:14.440
<v Speaker 3>win the culture war, because the culture war is not winnable.

0:51:14.680 --> 0:51:18.160
<v Speaker 3>That's a fantasy. You can't exclude half of the population

0:51:18.960 --> 0:51:22.399
<v Speaker 3>from politics forevermore. So we have to find some other

0:51:22.400 --> 0:51:23.480
<v Speaker 3>way to approach each other.

0:51:24.280 --> 0:51:26.280
<v Speaker 1>And so one of the things that you are really

0:51:26.320 --> 0:51:30.120
<v Speaker 1>concentrating on is social media as a leverage point. So

0:51:30.200 --> 0:51:34.600
<v Speaker 1>again we talked about individual ways to help with conflict resolution,

0:51:34.760 --> 0:51:36.560
<v Speaker 1>We've talked about societal ways, as we were just talking

0:51:36.560 --> 0:51:39.640
<v Speaker 1>about journalism, but as far as social media goes, I

0:51:39.680 --> 0:51:41.640
<v Speaker 1>know that the way you think about this is there's

0:51:41.680 --> 0:51:46.960
<v Speaker 1>this interplay between human psychology, which cares about threats and

0:51:47.239 --> 0:51:50.680
<v Speaker 1>recommender algorithms, with the social media companies in terms

0:51:50.719 --> 0:51:52.840
<v Speaker 1>of what they're serving up to you, and then the

0:51:52.960 --> 0:51:56.640
<v Speaker 1>content producers, who are going to do the things that

0:51:56.760 --> 0:52:00.560
<v Speaker 1>get them the views. And so human psychology we probably

0:52:00.600 --> 0:52:04.360
<v Speaker 1>can't change too much, and the content producers, we're probably

0:52:04.360 --> 0:52:08.000
<v Speaker 1>not going to dissuade them from producing things that get views.

0:52:08.320 --> 0:52:12.480
<v Speaker 1>So really it's the recommender algorithms that are up for

0:52:12.640 --> 0:52:16.040
<v Speaker 1>grabs there. So we touched on this before, but let's

0:52:16.040 --> 0:52:18.959
<v Speaker 1>return to that. What do you see as the possibilities there?

0:52:19.960 --> 0:52:22.760
<v Speaker 3>Yeah, so, one of the reasons that I study

0:52:23.400 --> 0:52:26.600
<v Speaker 3>recommender algorithms, which by the way, isn't just social media.

0:52:26.719 --> 0:52:31.320
<v Speaker 3>It's you know, news recommenders, it's job recommendations, it's Amazon products,

0:52:31.360 --> 0:52:34.959
<v Speaker 3>it's Netflix, it's music, it's podcasts, it's everything. Right,

0:52:35.080 --> 0:52:37.480
<v Speaker 3>all of this stuff is picked for us by machines now,

0:52:38.160 --> 0:52:40.520
<v Speaker 3>and potentially all of it has political content. You may

0:52:40.560 --> 0:52:43.840
<v Speaker 3>not think that you know, who cares what Spotify's recommender

0:52:43.920 --> 0:52:47.239
<v Speaker 3>is doing. Well, you know this podcast is on Spotify, right,

0:52:47.320 --> 0:52:50.320
<v Speaker 3>so that that matters too. So broader than social media,

0:52:50.800 --> 0:52:53.760
<v Speaker 3>there's two reasons I think focusing on social media is interesting.

0:52:53.880 --> 0:52:56.920
<v Speaker 3>One is the direct effects and the other is the indirect

0:52:56.920 --> 0:53:02.000
<v Speaker 3>effects via incentives for producers. So, direct effects. What

0:53:02.040 --> 0:53:05.600
<v Speaker 3>I would like to see is less use of engagement

0:53:05.640 --> 0:53:09.960
<v Speaker 3>signals in content ranking. So in other words, how much

0:53:10.000 --> 0:53:13.560
<v Speaker 3>somebody clicked on something, you know, how many seconds they

0:53:13.600 --> 0:53:17.080
<v Speaker 3>spent watching that TikTok video, et cetera, should have less

0:53:17.080 --> 0:53:20.240
<v Speaker 3>of an influence on whether it is shown to other people.
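
The change being described, letting engagement count for less when ranking civic and health content, can be sketched as a weighted scoring function. The signal names, the weights, and the content categories below are hypothetical illustrations, not any platform's actual formula.

```python
# Hypothetical sketch of downweighting engagement signals for civic and
# health content. Signal names, weights, and categories are illustrative,
# not any platform's real ranking formula.
ENGAGEMENT_SIGNALS = {"clicks", "watch_seconds", "reshares"}

# Categories where engagement should count for less (per the discussion).
SENSITIVE_CATEGORIES = {"civic", "health"}

def rank_score(signals: dict[str, float], category: str) -> float:
    """Combine normalized signals into one ranking score.

    For civic/health content, engagement-derived signals are scaled
    down so non-engagement signals (e.g. survey-based quality ratings)
    dominate the score.
    """
    engagement_weight = 0.1 if category in SENSITIVE_CATEGORIES else 1.0
    score = 0.0
    for name, value in signals.items():
        weight = engagement_weight if name in ENGAGEMENT_SIGNALS else 1.0
        score += weight * value
    return score
```

With the same signals, an entertainment post keeps its full engagement boost while a civic post's score is driven mostly by its quality rating, which is the point of the design.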

0:53:21.040 --> 0:53:24.480
<v Speaker 3>And to some extent, this change is already starting to happen.

0:53:25.080 --> 0:53:30.960
<v Speaker 3>So there are at least three platforms, of which Facebook

0:53:31.040 --> 0:53:34.320
<v Speaker 3>is the only one who's said this publicly. They basically

0:53:34.440 --> 0:53:38.880
<v Speaker 3>don't use resharing as a signal for civic and health content.

0:53:39.239 --> 0:53:41.919
<v Speaker 3>So maybe for entertainment, whatever catches your attention is fine,

0:53:41.960 --> 0:53:45.480
<v Speaker 3>But maybe for civic and health and politics and these

0:53:45.520 --> 0:53:51.640
<v Speaker 3>types of critical information sources we shouldn't use, you know,

0:53:51.920 --> 0:53:54.440
<v Speaker 3>whether it went viral as a signal for whether it's

0:53:54.440 --> 0:53:56.600
<v Speaker 3>any good, and so that is starting to happen.

0:53:56.680 --> 0:53:57.799
<v Speaker 4>I'd like to see more of that.

0:53:58.239 --> 0:54:01.759
<v Speaker 3>I recently published a paper with a bunch of collaborators

0:54:02.160 --> 0:54:06.880
<v Speaker 3>where we cataloged all of the alternatives to using engagement

0:54:06.920 --> 0:54:09.759
<v Speaker 3>as a content ranking signal. It's called What We Know

0:54:09.840 --> 0:54:12.840
<v Speaker 3>About Using Non-Engagement Signals in Content Ranking, to just

0:54:12.920 --> 0:54:15.320
<v Speaker 3>try to get this knowledge out there and to socialize

0:54:15.320 --> 0:54:17.400
<v Speaker 3>it because a lot of this stuff is sort of

0:54:17.440 --> 0:54:20.520
<v Speaker 3>very diffuse across industry. People in industry know it but

0:54:21.160 --> 0:54:22.600
<v Speaker 3>can't talk about it because.

0:54:22.400 --> 0:54:23.040
<v Speaker 4>It's all private.

0:54:23.520 --> 0:54:25.359
<v Speaker 3>So what we did is we got together people from

0:54:25.360 --> 0:54:29.279
<v Speaker 3>eight platforms for an off the record discussion about what

0:54:29.360 --> 0:54:31.560
<v Speaker 3>can we say about how to do this better, and

0:54:31.640 --> 0:54:37.360
<v Speaker 3>then we reconstructed their conclusions from public sources scattered academic literature,

0:54:38.160 --> 0:54:41.919
<v Speaker 3>old company blog posts, but also many references from the

0:54:42.040 --> 0:54:45.439
<v Speaker 3>Facebook files which were the leaks that Frances Haugen brought

0:54:45.440 --> 0:54:48.440
<v Speaker 3>out in twenty twenty one. We sort of learned what

0:54:48.560 --> 0:54:51.319
<v Speaker 3>to look for in those files. So that's the first

0:54:51.320 --> 0:54:53.759
<v Speaker 3>thing I think social media can be better. We can

0:54:53.800 --> 0:54:56.839
<v Speaker 3>build it not to optimize for outrage. And in fact

0:54:57.200 --> 0:55:00.640
<v Speaker 3>the frontier is something called bridging based ranking, and the

0:55:00.719 --> 0:55:06.720
<v Speaker 3>idea there is you find content that both sides agree

0:55:06.920 --> 0:55:09.160
<v Speaker 3>is good. So, you know, think about this, do you

0:55:09.200 --> 0:55:13.600
<v Speaker 3>want the inflammatory news article that appeals to Democrats? Do

0:55:13.600 --> 0:55:16.120
<v Speaker 3>you want the inflammatory news article that appeals to Republicans?

0:55:16.640 --> 0:55:19.759
<v Speaker 3>Or do you want the article that everybody reads and says, yeah,

0:55:19.800 --> 0:55:23.120
<v Speaker 3>that's kind of good. Now, maybe you know, psychologically you're

0:55:23.160 --> 0:55:26.400
<v Speaker 3>much more likely to click on the inflammatory headline, but

0:55:26.440 --> 0:55:29.799
<v Speaker 3>that doesn't mean our better selves actually want that. And

0:55:29.880 --> 0:55:32.840
<v Speaker 3>so I'm involved with a bunch of experiments trying to

0:55:33.160 --> 0:55:37.720
<v Speaker 3>find this bridging content and promote it. So that's the

0:55:37.760 --> 0:55:40.080
<v Speaker 3>sort of direct changes, right, There's a bunch of algorithmic

0:55:40.160 --> 0:55:43.000
<v Speaker 3>changes that you know, I and many of my colleagues

0:55:43.080 --> 0:55:45.719
<v Speaker 3>are exploring and trying.

0:55:45.480 --> 0:55:46.279
<v Speaker 4>To advocate for.
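
The bridging idea described here, promoting content that both sides agree is good, can be sketched with a simple estimator: score each item by its lowest per-group approval rate, so one-sided inflammatory content ranks poorly. The group names and vote data are hypothetical, and real bridging systems use more sophisticated methods than a raw minimum.

```python
# Minimal sketch of bridging-based ranking: an item is scored by the
# MINIMUM approval rate across groups, so it must appeal to both sides
# to rank highly. Group names and votes are hypothetical examples;
# production systems use more sophisticated estimators.
def bridging_score(approvals: dict[str, list[bool]]) -> float:
    """Return the lowest per-group approval rate for one item."""
    rates = []
    for votes in approvals.values():
        if not votes:
            return 0.0  # no data from some group: don't promote
        rates.append(sum(votes) / len(votes))
    return min(rates, default=0.0)

# An inflammatory item one side loves and the other rejects scores low;
# an item with broad approval from both sides scores high.
partisan_item = {"blue": [True, True, True], "red": [False, False, True]}
bridging_item = {"blue": [True, True, False], "red": [True, True, True]}
```

Using the minimum (rather than the average) is what makes the score "bridging": a high average can be reached by thrilling one side, but a high minimum cannot.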

0:55:46.920 --> 0:55:49.720
<v Speaker 3>But then one of the really interesting things about doing

0:55:49.960 --> 0:55:53.719
<v Speaker 3>this in the algorithm space is that, precisely because these

0:55:53.719 --> 0:55:57.400
<v Speaker 3>algorithms decide what everybody sees, changing them can change the

0:55:57.480 --> 0:56:03.440
<v Speaker 3>incentive for producers. So if inflammatory material is downranked

0:56:03.480 --> 0:56:07.240
<v Speaker 3>and less popular, that means it's less profitable to produce,

0:56:07.719 --> 0:56:12.600
<v Speaker 3>and therefore that changes the kind of content that journalists, politicians,

0:56:13.560 --> 0:56:18.200
<v Speaker 3>you know, think tanks, et cetera, find successful, find reaches

0:56:18.200 --> 0:56:22.200
<v Speaker 3>an audience. And so this second order or indirect effect

0:56:22.480 --> 0:56:26.319
<v Speaker 3>is very interesting because it says that, you know, maybe

0:56:26.320 --> 0:56:28.760
<v Speaker 3>if you can get ten platforms to change their algorithms

0:56:28.760 --> 0:56:31.880
<v Speaker 3>to use bridging-based ranking, that could have an effect

0:56:31.960 --> 0:56:34.080
<v Speaker 3>on a much broader media ecosystem.

0:56:34.440 --> 0:56:35.600
<v Speaker 4>So it's a leverage point.

0:56:36.320 --> 0:56:40.360
<v Speaker 1>Yes. How do you convince a social media platform to change?

0:56:40.920 --> 0:56:45.239
<v Speaker 3>Well, I think there's a sort of three cases here.

0:56:45.760 --> 0:56:48.719
<v Speaker 3>So and this is why we're testing these algorithms. So

0:56:49.360 --> 0:56:52.360
<v Speaker 3>I am running something called the Prosocial Ranking Challenge.

0:56:52.880 --> 0:56:56.040
<v Speaker 3>It is an open competition for better social media algorithms

0:56:56.080 --> 0:56:59.759
<v Speaker 3>where teams from around the world are competing, first of

0:56:59.800 --> 0:57:02.719
<v Speaker 3>all for a cash prize, but mostly we're going to

0:57:02.719 --> 0:57:07.160
<v Speaker 3>take the winning algorithms, as judged by a panel of scientists,

0:57:07.600 --> 0:57:11.960
<v Speaker 3>and we're going to test them on Facebook, Twitter, and

0:57:12.160 --> 0:57:16.200
<v Speaker 3>Reddit using a custom browser extension. And so we're actually

0:57:16.240 --> 0:57:20.760
<v Speaker 3>going to look to see if it changes polarization, wellbeing,

0:57:21.040 --> 0:57:26.960
<v Speaker 3>and other types of attitudes and outcomes, including engagement. Crucially,

0:57:27.000 --> 0:57:30.080
<v Speaker 3>we are testing whether it changes both short term and

0:57:30.160 --> 0:57:31.600
<v Speaker 3>long term use of these products.

0:57:32.320 --> 0:57:33.360
<v Speaker 4>And so from that.

0:57:33.560 --> 0:57:36.360
<v Speaker 3>We will learn which of three worlds we live in.

0:57:36.760 --> 0:57:40.480
<v Speaker 3>If the universe is kind, we will discover that producing

0:57:40.520 --> 0:57:45.920
<v Speaker 3>a better product that reduces polarization also increases long term retention.

0:57:46.280 --> 0:57:48.440
<v Speaker 3>So you make a higher quality product, people stay on

0:57:48.480 --> 0:57:51.160
<v Speaker 3>the platform, maybe not in the short term, but certainly

0:57:51.160 --> 0:57:53.400
<v Speaker 3>in the long term. And then you can do well

0:57:53.400 --> 0:57:55.080
<v Speaker 3>by doing good right, And then it's just sort of

0:57:55.120 --> 0:57:57.440
<v Speaker 3>getting the word out. Or we could live in a

0:57:57.480 --> 0:58:01.800
<v Speaker 3>world where, you know, it has neutral to maybe

0:58:01.840 --> 0:58:07.360
<v Speaker 3>slightly negative effects to use algorithms that reduce polarization. And

0:58:08.000 --> 0:58:11.320
<v Speaker 3>it's not unheard of for platforms to make, you know,

0:58:11.480 --> 0:58:15.800
<v Speaker 3>slightly revenue reducing changes to their algorithms in the interest

0:58:15.840 --> 0:58:18.000
<v Speaker 3>of public good. You know, I collect examples of this.

0:58:18.320 --> 0:58:20.920
<v Speaker 3>So then it's an advocacy campaign, right, this is the

0:58:21.000 --> 0:58:23.760
<v Speaker 3>right thing, you should do it. There's lots of groups

0:58:23.760 --> 0:58:25.880
<v Speaker 3>that exist to put pressure on companies to do the

0:58:25.920 --> 0:58:30.600
<v Speaker 3>right things. Or perhaps in some sense, the worst outcome

0:58:30.680 --> 0:58:34.480
<v Speaker 3>is we discover that making algorithms which produce better conflict

0:58:34.520 --> 0:58:39.240
<v Speaker 3>outcomes tends to reduce usage of the product in a

0:58:39.280 --> 0:58:43.120
<v Speaker 3>way that is meaningful from a business perspective. And then

0:58:43.160 --> 0:58:46.760
<v Speaker 3>what we have is a collective action problem. You have

0:58:46.800 --> 0:58:50.120
<v Speaker 3>a first-mover disadvantage, in that whoever changes their

0:58:50.160 --> 0:58:53.760
<v Speaker 3>algorithm to be better first loses money relative to everyone else.

0:58:54.120 --> 0:58:58.120
<v Speaker 3>And then you have to look at regulation because that

0:58:58.240 --> 0:59:00.400
<v Speaker 3>can level the playing field, in very much the same

0:59:00.400 --> 0:59:03.120
<v Speaker 3>way that environmental regulation prevents a race to the bottom.

0:59:03.160 --> 0:59:06.760
<v Speaker 3>You know, nobody wants to be the first to use

0:59:06.800 --> 0:59:10.040
<v Speaker 3>a more expensive process that results in less pollution, because

0:59:10.080 --> 0:59:12.400
<v Speaker 3>then they would lose their market share. But if everybody

0:59:12.440 --> 0:59:14.960
<v Speaker 3>has to do it, then it's okay. So those are

0:59:14.960 --> 0:59:16.960
<v Speaker 3>the three outcomes, right, Either it's just a matter of

0:59:17.160 --> 0:59:20.560
<v Speaker 3>spreading the good word, or it's advocacy, or it's regulation,

0:59:20.720 --> 0:59:22.960
<v Speaker 3>depending on where the science takes.

0:59:22.840 --> 0:59:25.760
<v Speaker 1>Us. Or what else, in conclusion, would you like to say?

0:59:26.200 --> 0:59:30.480
<v Speaker 3>If we care about relating to each other better, and

0:59:30.520 --> 0:59:32.600
<v Speaker 3>I don't just mean like kumbaya, why can't we

0:59:32.640 --> 0:59:37.680
<v Speaker 3>all get along? But actually having a politics that functions better,

0:59:37.720 --> 0:59:40.760
<v Speaker 3>where we get to fight for what we believe in without

0:59:40.960 --> 0:59:45.040
<v Speaker 3>dehumanizing the other side, without misperceiving what they're actually about,

0:59:45.520 --> 0:59:48.960
<v Speaker 3>without things turning ugly and violent. That's something that we

0:59:49.000 --> 0:59:54.600
<v Speaker 3>can do. There's many many ways to have better political conflict,

0:59:55.360 --> 0:59:58.320
<v Speaker 3>but it's going to take a fundamentally different attitude. As

0:59:58.360 --> 1:00:01.760
<v Speaker 3>I said before, the culture war is not winnable.

1:00:02.200 --> 1:00:04.680
<v Speaker 3>There is no world in which you get to exclude

1:00:04.760 --> 1:00:10.439
<v Speaker 3>your political opponents from politics indefinitely. That's what a democracy is, right.

1:00:11.120 --> 1:00:14.680
<v Speaker 3>We accept someone winning an election because the next election

1:00:14.720 --> 1:00:19.120
<v Speaker 3>they might lose. So there are ways to get involved,

1:00:19.160 --> 1:00:20.320
<v Speaker 3>there are things you can do.

1:00:20.840 --> 1:00:24.360
<v Speaker 4>The situation is not hopeless.

1:00:27.720 --> 1:00:31.520
<v Speaker 1>That was Jonathan Stray at Berkeley. So let's take the

1:00:31.560 --> 1:00:35.280
<v Speaker 1>work that Jonathan and others are doing to address our

1:00:35.360 --> 1:00:39.520
<v Speaker 1>illusions that people who disagree with us are misinformed trolls.

1:00:39.680 --> 1:00:43.040
<v Speaker 1>It's a very useful exercise to figure out how we

1:00:43.120 --> 1:00:46.000
<v Speaker 1>can look at somebody who disagrees with us as not

1:00:46.160 --> 1:00:50.200
<v Speaker 1>being cold and incompetent, but possibly someone who is kind

1:00:50.280 --> 1:00:55.320
<v Speaker 1>and generous and has a different opinion. This starts with

1:00:55.480 --> 1:00:59.520
<v Speaker 1>intellectual humility, understanding that we don't know it all, and

1:00:59.560 --> 1:01:02.000
<v Speaker 1>I don't mean that from a philosophical point of view,

1:01:02.000 --> 1:01:06.000
<v Speaker 1>but from a neuroscience point of view. Because of brain plasticity,

1:01:06.480 --> 1:01:10.200
<v Speaker 1>we each form an internal model of the world based

1:01:10.240 --> 1:01:14.000
<v Speaker 1>on our very thin trajectory of space and time, and

1:01:14.080 --> 1:01:17.800
<v Speaker 1>we form our political opinions based on just the little

1:01:17.800 --> 1:01:22.200
<v Speaker 1>bit that we're exposed to. We shape our ideas from

1:01:22.200 --> 1:01:25.360
<v Speaker 1>the social networks we happen to be embedded in, and

1:01:25.440 --> 1:01:28.680
<v Speaker 1>we're not consciously aware that we're doing this. So at

1:01:28.680 --> 1:01:31.040
<v Speaker 1>the heart of all of this is a need for

1:01:31.240 --> 1:01:34.200
<v Speaker 1>intellectual humility, and we're going to need a lot of

1:01:34.240 --> 1:01:37.640
<v Speaker 1>this to address the kind of polarization, the kind of

1:01:37.720 --> 1:01:42.320
<v Speaker 1>fear and loathing that we're seeing across the globe. Meaningful

1:01:42.600 --> 1:01:46.000
<v Speaker 1>dialogues are a great start, but also there's a need

1:01:46.080 --> 1:01:50.600
<v Speaker 1>for scaling. How do we build this into our educational systems,

1:01:50.640 --> 1:01:54.440
<v Speaker 1>How do we build this into the fundamental algorithms underlying

1:01:54.440 --> 1:01:58.080
<v Speaker 1>our social media. How do we build what some scholars

1:01:58.160 --> 1:02:03.880
<v Speaker 1>like Heidi and Guy Burgess call massively parallel peacebuilding? There's

1:02:03.880 --> 1:02:05.800
<v Speaker 1>still a lot of work to be done on this front,

1:02:06.160 --> 1:02:11.880
<v Speaker 1>especially as we're moving from communicating via soapbox speeches and

1:02:12.080 --> 1:02:16.400
<v Speaker 1>hand delivered pamphlets to instant communication that allows you to

1:02:16.480 --> 1:02:20.800
<v Speaker 1>deliver your speech or your pamphlet to everyone's mobile rectangle

1:02:20.800 --> 1:02:24.400
<v Speaker 1>around the planet. So I suggest that an important angle

1:02:24.560 --> 1:02:28.040
<v Speaker 1>on all this is to understand the neuroscience at the

1:02:28.120 --> 1:02:30.560
<v Speaker 1>base of everything, why we think the way we do,

1:02:31.000 --> 1:02:34.320
<v Speaker 1>why we behave the way we behave, and then work

1:02:34.400 --> 1:02:38.120
<v Speaker 1>to build our societal structures like our media, our dialogue,

1:02:38.120 --> 1:02:42.160
<v Speaker 1>our education with that in mind. We can no longer

1:02:42.560 --> 1:02:47.160
<v Speaker 1>make the romantic assumption that we each are just objective

1:02:47.440 --> 1:02:51.840
<v Speaker 1>holders of truth and that ours is this single logical

1:02:51.960 --> 1:02:55.600
<v Speaker 1>argument or position that should convince everyone. And this is

1:02:55.640 --> 1:02:58.920
<v Speaker 1>because our internal models of the world give us different

1:02:58.960 --> 1:03:03.080
<v Speaker 1>biases towards different groups. We have different sensitivities towards issues,

1:03:03.120 --> 1:03:06.520
<v Speaker 1>we have different levels of knowledge, we have different emotional

1:03:06.560 --> 1:03:10.200
<v Speaker 1>affiliations to things happening in the world. So the first

1:03:10.240 --> 1:03:13.920
<v Speaker 1>step to better conflict is to have a more realistic

1:03:14.040 --> 1:03:18.480
<v Speaker 1>understanding that we are each living on our own planet

1:03:18.840 --> 1:03:23.920
<v Speaker 1>mentally and emotionally, and the important goal is to bridge planets,

1:03:23.960 --> 1:03:27.840
<v Speaker 1>to set up some signaling across the vast reaches of

1:03:27.920 --> 1:03:31.160
<v Speaker 1>space between us. It's easy to say that the people

1:03:31.160 --> 1:03:35.080
<v Speaker 1>who disagree with you are ugly trolls, but as we discussed,

1:03:35.080 --> 1:03:38.960
<v Speaker 1>it can make slightly more sense to ask how someone

1:03:38.960 --> 1:03:43.240
<v Speaker 1>who is smart and kind can end up believing something

1:03:43.440 --> 1:03:46.520
<v Speaker 1>different than you do. This is not a plea to

1:03:46.880 --> 1:03:50.800
<v Speaker 1>agree with the other side, but to better understand their

1:03:50.840 --> 1:03:55.720
<v Speaker 1>motivation and their philosophy, and fundamentally, to better understand our

1:03:55.880 --> 1:04:03.360
<v Speaker 1>shared biology and therefore our shared humanity. Go to eagleman

1:04:03.440 --> 1:04:06.600
<v Speaker 1>dot com slash podcast for more information and to find

1:04:06.720 --> 1:04:10.440
<v Speaker 1>further reading. Send me an email at podcasts at eagleman

1:04:10.520 --> 1:04:14.200
<v Speaker 1>dot com with any questions or discussions, and check out

1:04:14.200 --> 1:04:17.680
<v Speaker 1>and subscribe to Inner Cosmos on YouTube for videos of

1:04:17.680 --> 1:04:21.560
<v Speaker 1>each episode and to leave comments. Until next time, I'm

1:04:21.680 --> 1:04:24.720
<v Speaker 1>David Eagleman and this is Inner Cosmos.