1 00:00:15,516 --> 00:00:15,956 Speaker 1: Pushkin. 2 00:00:20,676 --> 00:00:23,356 Speaker 2: It's no secret that politics around the world has become 3 00:00:23,436 --> 00:00:26,996 Speaker 2: more divided and more toxic. Here in the US, our 4 00:00:27,076 --> 00:00:30,596 Speaker 2: political parties seem further apart on key issues than ever before. 5 00:00:31,236 --> 00:00:31,596 Speaker 3: Back in the. 6 00:00:31,636 --> 00:00:35,156 Speaker 2: Nineteen eighties, Democrats and Republicans reported liking people in their 7 00:00:35,156 --> 00:00:38,196 Speaker 2: own party and feeling relatively neutral about folks on the 8 00:00:38,196 --> 00:00:42,156 Speaker 2: other side. But today those feelings are much more polarized. 9 00:00:42,556 --> 00:00:45,876 Speaker 2: One survey from twenty twenty found that Democrats and Republicans 10 00:00:45,916 --> 00:00:48,716 Speaker 2: now dislike their rivals more than they like the people 11 00:00:48,716 --> 00:00:52,596 Speaker 2: who share their political views. This level of emotional polarization 12 00:00:52,916 --> 00:00:55,916 Speaker 2: is new in American politics, but it's not that surprising 13 00:00:55,996 --> 00:00:59,156 Speaker 2: given, well, pretty much every screen you look at these days, 14 00:00:59,436 --> 00:01:03,116 Speaker 2: cable news shows, social media sites, emailed attack ads, they 15 00:01:03,156 --> 00:01:06,076 Speaker 2: all show us a very extreme version of the other side. 16 00:01:06,276 --> 00:01:09,556 Speaker 2: Our opponents aren't just wrong about the policies, they're immoral 17 00:01:09,916 --> 00:01:11,476 Speaker 2: and out to destroy our nation. 18 00:01:11,556 --> 00:01:14,556 Speaker 4: The brainwashing that's going to the young people right now 19 00:01:14,676 --> 00:01:15,556 Speaker 4: is unbelievable.
20 00:01:15,596 --> 00:01:18,276 Speaker 5: Now they're inviting the government into their bedroom to tell 21 00:01:18,316 --> 00:01:20,156 Speaker 5: any woman what to do with her private parts. 22 00:01:20,276 --> 00:01:20,996 Speaker 2: I don't think that's right. 23 00:01:21,196 --> 00:01:24,836 Speaker 5: We're being flooded in by illegal people coming in. A 24 00:01:24,836 --> 00:01:28,156 Speaker 5: lot of people that vote for a certain candidate are voting 25 00:01:28,196 --> 00:01:33,036 Speaker 5: against democracy. This level of division, distrust and anger, it 26 00:01:33,076 --> 00:01:37,396 Speaker 5: takes a huge toll on us. It's exhausting. I'm Jamil Zaki, 27 00:01:37,436 --> 00:01:39,836 Speaker 5: and in my lab at Stanford, we found that more 28 00:01:39,876 --> 00:01:42,996 Speaker 5: than eighty percent of Americans on both sides are fed 29 00:01:43,076 --> 00:01:45,756 Speaker 5: up with it and wish that the country was less divided. 30 00:01:46,236 --> 00:01:48,436 Speaker 5: But if you're convinced that the other side is made 31 00:01:48,516 --> 00:01:51,876 Speaker 5: up of terrible, hostile people, there's little hope for things 32 00:01:51,876 --> 00:01:52,596 Speaker 5: to get any better. 33 00:01:52,836 --> 00:01:56,076 Speaker 6: They're so annoying, they come with so many negatives. That's 34 00:01:56,076 --> 00:01:57,036 Speaker 6: what Republicans come with. 35 00:01:57,196 --> 00:01:59,236 Speaker 4: You know, it's always put out there that Republicans are this, 36 00:01:59,356 --> 00:02:00,916 Speaker 4: Republicans are that, we're racists. 37 00:02:00,996 --> 00:02:02,396 Speaker 7: We don't listen to other people. 38 00:02:02,556 --> 00:02:05,476 Speaker 4: A lot of my family are Republicans, but I'm not 39 00:02:05,556 --> 00:02:07,636 Speaker 4: friends with any Republicans.
40 00:02:08,116 --> 00:02:10,316 Speaker 5: But are we right to imagine that everyone on the 41 00:02:10,356 --> 00:02:13,116 Speaker 5: other side is just plain awful? And if we're not, 42 00:02:13,556 --> 00:02:16,676 Speaker 5: can we fight this polarization and connect with our fellow 43 00:02:16,716 --> 00:02:20,116 Speaker 5: citizens across the aisle? Is there any hope that we 44 00:02:20,156 --> 00:02:21,876 Speaker 5: even could be less divided? 45 00:02:24,756 --> 00:02:27,076 Speaker 2: Our minds are constantly telling us what to do to be happy. 46 00:02:27,596 --> 00:02:29,876 Speaker 2: What if our minds are wrong? What if our minds 47 00:02:29,956 --> 00:02:32,396 Speaker 2: are lying to us, leading us away from what will 48 00:02:32,436 --> 00:02:35,796 Speaker 2: really make us happy? The good news is that understanding the 49 00:02:35,796 --> 00:02:37,836 Speaker 2: science of the mind can point us all back in 50 00:02:37,876 --> 00:02:41,436 Speaker 2: the right direction. You're listening to the Happiness Lab with me, 51 00:02:41,716 --> 00:02:43,356 Speaker 2: doctor Laurie Santos. 52 00:02:43,076 --> 00:02:54,916 Speaker 5: And me, doctor Jamil Zaki. Amanda, it's great to meet you. 53 00:02:55,036 --> 00:02:55,836 Speaker 5: I'm a huge fan. 54 00:02:56,076 --> 00:02:57,116 Speaker 3: Oh, thanks for saying that. 55 00:02:57,196 --> 00:02:58,996 Speaker 5: Yeah, I recommend your book for everybody. 56 00:02:59,156 --> 00:03:00,756 Speaker 3: Likewise, I've really been enjoying it. 57 00:03:00,796 --> 00:03:03,396 Speaker 5: When I was writing Hope for Cynics, I wanted to learn more 58 00:03:03,436 --> 00:03:07,476 Speaker 5: about how cynical thinking might worsen polarization. One of the 59 00:03:07,476 --> 00:03:10,516 Speaker 5: people who inspired me the most was Amanda Ripley. 60 00:03:10,876 --> 00:03:13,436 Speaker 3: I've been a journalist for twenty years and I spent 61 00:03:13,516 --> 00:03:16,436 Speaker 3: a lot of time covering conflict.
That's part of the job, 62 00:03:16,516 --> 00:03:18,356 Speaker 3: and I thought I was pretty good at it. I 63 00:03:18,356 --> 00:03:20,316 Speaker 3: thought I understood it pretty well. I thought I was 64 00:03:20,316 --> 00:03:22,716 Speaker 3: comfortable with it. And then, you know, six or eight 65 00:03:22,796 --> 00:03:26,116 Speaker 3: years ago, I started to feel like things were happening 66 00:03:26,316 --> 00:03:31,236 Speaker 3: in the country politically that didn't make sense, and anything 67 00:03:31,276 --> 00:03:33,396 Speaker 3: I might do as a journalist was either going to 68 00:03:33,556 --> 00:03:36,636 Speaker 3: have no impact or make things worse. 69 00:03:37,156 --> 00:03:39,996 Speaker 5: People have gotten more polarized in the past few decades, 70 00:03:40,316 --> 00:03:42,876 Speaker 5: but so has the media. When I was a kid, 71 00:03:43,156 --> 00:03:46,476 Speaker 5: not quite in the age of the dinosaurs, TV news 72 00:03:46,596 --> 00:03:49,796 Speaker 5: was like the ancient land mass of Pangaea, a single 73 00:03:49,916 --> 00:03:53,476 Speaker 5: continent of information on which we all lived. We might 74 00:03:53,516 --> 00:03:55,796 Speaker 5: not have shared our views, but at least we shared 75 00:03:55,796 --> 00:03:58,516 Speaker 5: a sense of what was going on. But then the 76 00:03:58,596 --> 00:04:02,996 Speaker 5: tectonic plates shifted, that single continent of news broke into 77 00:04:03,036 --> 00:04:07,396 Speaker 5: pieces that drifted apart. With media companies competing for our attention, 78 00:04:08,236 --> 00:04:11,516 Speaker 5: a new business model emerged. Instead of trying to get 79 00:04:11,556 --> 00:04:16,716 Speaker 5: the most viewers, news channels started cultivating audience loyalty by 80 00:04:16,716 --> 00:04:19,676 Speaker 5: feeding people what they wanted to hear twenty four hours 81 00:04:19,716 --> 00:04:24,356 Speaker 5: a day.
Partisan cable news was born, pumping out divisive 82 00:04:24,396 --> 00:04:27,716 Speaker 5: rhetoric and encouraging us to fear and loathe our rivals. 83 00:04:28,396 --> 00:04:31,676 Speaker 3: It's just painful because you can see the ways in 84 00:04:31,756 --> 00:04:36,996 Speaker 3: which the places I worked exploited and incited conflict without 85 00:04:37,276 --> 00:04:38,316 Speaker 3: anyone knowing it. 86 00:04:38,996 --> 00:04:42,836 Speaker 5: So, according to Amanda, the news wasn't just reflecting polarization, 87 00:04:43,356 --> 00:04:46,836 Speaker 5: it was profiting from it. Research suggests this is spreading 88 00:04:46,836 --> 00:04:51,636 Speaker 5: cynicism through what's called mean world syndrome. The more people 89 00:04:51,636 --> 00:04:54,596 Speaker 5: tune into the news, the worse they think humanity is. 90 00:04:55,076 --> 00:04:58,276 Speaker 5: And that makes sense because, in addition to becoming more divisive, 91 00:04:58,636 --> 00:05:02,396 Speaker 5: the news has also grown more negative. Since the beginning 92 00:05:02,436 --> 00:05:05,916 Speaker 5: of this century, the presence of angry words in headlines 93 00:05:06,036 --> 00:05:08,836 Speaker 5: has increased by one hundred percent and the presence of 94 00:05:08,836 --> 00:05:10,876 Speaker 5: fear by one hundred and fifty percent. 95 00:05:11,516 --> 00:05:14,996 Speaker 2: Unsurprisingly, none of this makes us happy. One survey asked 96 00:05:14,996 --> 00:05:18,156 Speaker 2: people to complete the following sentence: the news makes me 97 00:05:18,196 --> 00:05:22,796 Speaker 2: feel blank. The top answers: hopeless, agitated, and despair. What 98 00:05:22,836 --> 00:05:26,636 Speaker 2: would your answers be? Amanda was devastated that her industry 99 00:05:26,756 --> 00:05:29,436 Speaker 2: was making people miserable. She hated the idea that she 100 00:05:29,556 --> 00:05:32,636 Speaker 2: might be contributing to the conflict brewing all around us.
101 00:05:33,116 --> 00:05:36,396 Speaker 3: There are stories I've written or headlines I've approved that 102 00:05:36,516 --> 00:05:40,316 Speaker 3: definitely piled on, that exploited the conflict in order to 103 00:05:40,356 --> 00:05:43,196 Speaker 3: get attention or status or profit. 104 00:05:43,596 --> 00:05:46,876 Speaker 2: Amanda wanted to better understand how good journalists wound up 105 00:05:46,916 --> 00:05:49,556 Speaker 2: fanning the flames of polarization and what they could do 106 00:05:49,756 --> 00:05:50,596 Speaker 2: to break free. 107 00:05:50,716 --> 00:05:53,996 Speaker 3: So I started following people who have been through really 108 00:05:54,116 --> 00:05:58,756 Speaker 3: dysfunctional conflict and now are in healthier conflict. 109 00:05:58,956 --> 00:06:02,356 Speaker 2: Amanda interviewed rival gang leaders in Chicago. She traveled to 110 00:06:02,396 --> 00:06:05,716 Speaker 2: Colombia to learn about that country's civil war. She shadowed 111 00:06:05,756 --> 00:06:08,916 Speaker 2: radical environmentalists and even did a deep dive into small-town 112 00:06:08,916 --> 00:06:13,156 Speaker 2: politics. What she discovered turned into a book, High Conflict: 113 00:06:13,436 --> 00:06:15,316 Speaker 2: Why We Get Trapped and How We Get Out. 114 00:06:15,676 --> 00:06:18,156 Speaker 3: Conflict is not a problem, right, and we know this 115 00:06:18,196 --> 00:06:20,876 Speaker 3: from all of the research into human behavior. Conflict 116 00:06:20,876 --> 00:06:24,076 Speaker 3: unto itself can be really important and helpful. The problem 117 00:06:24,276 --> 00:06:28,916 Speaker 3: is high conflict or intractable conflict, which becomes kind of 118 00:06:28,996 --> 00:06:31,356 Speaker 3: like conflict for conflict's sake. 119 00:06:31,756 --> 00:06:34,396 Speaker 5: So if conflict can be just fine, when does it 120 00:06:34,396 --> 00:06:38,196 Speaker 5: turn into a problem?
According to Amanda, the first step 121 00:06:38,356 --> 00:06:40,516 Speaker 5: is when we draw up rigid battle lines. 122 00:06:40,196 --> 00:06:44,316 Speaker 3: We literally lose our peripheral vision. We start to 123 00:06:44,636 --> 00:06:47,676 Speaker 3: cleanly divide the world into us and them and miss 124 00:06:47,716 --> 00:06:49,916 Speaker 3: all the people that don't really fit. 125 00:06:50,596 --> 00:06:53,036 Speaker 5: Once we define ourselves as members of a group, we 126 00:06:53,116 --> 00:06:56,636 Speaker 5: begin stereotyping the other group as the enemy. Caught in 127 00:06:56,676 --> 00:07:00,596 Speaker 5: a zero sum competition, we win only if they lose. 128 00:07:01,156 --> 00:07:05,036 Speaker 5: This is when emotional polarization takes over. We begin to 129 00:07:05,076 --> 00:07:08,276 Speaker 5: dislike the other side and take pleasure in their suffering, 130 00:07:09,476 --> 00:07:13,076 Speaker 5: known as schadenfreude. But it even opens the door to 131 00:07:13,196 --> 00:07:16,036 Speaker 5: us finding a justification to hurt them ourselves. 132 00:07:16,716 --> 00:07:20,956 Speaker 3: We know that when people feel like they are better 133 00:07:21,036 --> 00:07:25,036 Speaker 3: than another person or another group, when they feel maybe 134 00:07:25,116 --> 00:07:28,636 Speaker 3: humiliated by that other person or other group, all of 135 00:07:28,676 --> 00:07:31,876 Speaker 3: that makes violent conflict much more likely. 136 00:07:32,516 --> 00:07:35,636 Speaker 5: So this cycle of fear creates hatred and even violence. 137 00:07:35,996 --> 00:07:38,796 Speaker 5: It's exhausting and most of us wish it would stop. 138 00:07:39,356 --> 00:07:42,756 Speaker 5: Remember that statistic saying eighty percent of us want less division? 139 00:07:43,116 --> 00:07:45,476 Speaker 5: But it does help at least one group of people. 140 00:07:46,116 --> 00:07:49,596 Speaker 5: Amanda calls them conflict entrepreneurs.
141 00:07:49,636 --> 00:07:52,876 Speaker 3: People who exploit or inflame conflict for their own ends. 142 00:07:53,316 --> 00:07:56,876 Speaker 5: We've always had conflict entrepreneurs who earned a living saying 143 00:07:56,876 --> 00:08:01,276 Speaker 5: outrageous things in newspapers or on TV. But these days 144 00:08:01,396 --> 00:08:05,436 Speaker 5: anyone can engineer conflict and profit from it, reaching millions 145 00:08:05,436 --> 00:08:07,316 Speaker 5: of people from the comfort of their own home. 146 00:08:07,716 --> 00:08:09,956 Speaker 3: You know, you can go viral online, as we all know, 147 00:08:10,116 --> 00:08:14,636 Speaker 3: with like really cool, inspiring, hopeful, surprising content. You can 148 00:08:14,676 --> 00:08:18,036 Speaker 3: absolutely do that, but it's a lot easier to go 149 00:08:18,196 --> 00:08:22,996 Speaker 3: viral with like really uninspired and kind of mediocre content 150 00:08:23,356 --> 00:08:26,196 Speaker 3: if you can provoke outrage. So we've kind of set 151 00:08:26,196 --> 00:08:29,236 Speaker 3: things up to create a golden age for conflict entrepreneurs, 152 00:08:29,356 --> 00:08:32,116 Speaker 3: and it's very easy to be one. You can raise 153 00:08:32,196 --> 00:08:35,236 Speaker 3: money by being a conflict entrepreneur. You can get attention, 154 00:08:35,516 --> 00:08:38,676 Speaker 3: you can get a sense of belonging, you get followers. 155 00:08:38,156 --> 00:08:38,836 Speaker 1: You get likes. 156 00:08:39,196 --> 00:08:41,676 Speaker 5: You know, Amanda, it's easy to think about the conflict 157 00:08:41,836 --> 00:08:45,396 Speaker 5: entrepreneur as somebody cynical who's trying to make money from 158 00:08:45,756 --> 00:08:50,556 Speaker 5: or gain notoriety from their ideological bomb-throwing.
But 159 00:08:50,636 --> 00:08:54,556 Speaker 5: you're actually saying, sometimes conflict entrepreneurs are looking for what 160 00:08:54,676 --> 00:08:57,636 Speaker 5: the rest of us are looking for, community and belonging. 161 00:08:58,836 --> 00:09:00,796 Speaker 3: Yeah, that's a great way of putting it, Jamil. I mean, 162 00:09:00,796 --> 00:09:06,276 Speaker 3: I think certainly profit is maybe the least seductive reward 163 00:09:07,116 --> 00:09:09,276 Speaker 3: for conflict entrepreneurs. It's definitely in there. 164 00:09:09,596 --> 00:09:10,556 Speaker 1: But what is money? 165 00:09:10,676 --> 00:09:13,556 Speaker 3: Right? Money is often, especially for someone who's been doing 166 00:09:13,556 --> 00:09:17,276 Speaker 3: this their whole life, it's a proxy for being loved, 167 00:09:17,436 --> 00:09:20,996 Speaker 3: for being respected, for being powerful. Right, So it's a 168 00:09:21,036 --> 00:09:23,276 Speaker 3: way for people to feel like they matter. 169 00:09:24,076 --> 00:09:27,876 Speaker 5: Mattering, it's a core psychological need we all share. We 170 00:09:27,996 --> 00:09:31,116 Speaker 5: go back again and again to whatever makes us feel 171 00:09:31,116 --> 00:09:34,756 Speaker 5: that way, and conflict, even when it's scary, can make 172 00:09:34,836 --> 00:09:38,996 Speaker 5: us feel a sense of purpose, belonging and mattering. Because 173 00:09:39,036 --> 00:09:41,596 Speaker 5: of this it can suck people in, turning any of 174 00:09:41,676 --> 00:09:45,636 Speaker 5: us into mini conflict entrepreneurs. And once it gets going, 175 00:09:45,756 --> 00:09:48,196 Speaker 5: it's also hard to get out of. Once we're in 176 00:09:48,196 --> 00:09:50,996 Speaker 5: a state of high conflict, we experience a whole host 177 00:09:51,076 --> 00:09:56,476 Speaker 5: of psychological changes. Our thinking becomes oversimplified. If you think 178 00:09:56,516 --> 00:09:59,636 Speaker 5: you're smart, your enemy must be dumb.
If you're good, 179 00:09:59,876 --> 00:10:02,916 Speaker 5: they're bad. If you want one thing, they must want 180 00:10:02,956 --> 00:10:06,476 Speaker 5: the exact opposite. But lots of these assumptions are just 181 00:10:06,716 --> 00:10:07,396 Speaker 5: plain wrong. 182 00:10:07,476 --> 00:10:09,116 Speaker 1: The Democratic way, it drives me absolute. 183 00:10:09,596 --> 00:10:12,036 Speaker 3: I can't tell you how many terrorists are actually coming 184 00:10:12,116 --> 00:10:13,396 Speaker 3: into our country right now. 185 00:10:13,436 --> 00:10:17,516 Speaker 7: I think their Republicans is very hard paper. 186 00:10:17,716 --> 00:10:21,676 Speaker 5: Democrats are real stupid, and they don't know much, and they 187 00:10:22,076 --> 00:10:22,836 Speaker 1: better than everyone. 188 00:10:24,036 --> 00:10:26,916 Speaker 5: At least in the US, political groups are wrong about 189 00:10:26,916 --> 00:10:29,996 Speaker 5: each other in almost every way we can measure. We 190 00:10:30,076 --> 00:10:31,956 Speaker 5: don't know what people on the other side are like, 191 00:10:32,236 --> 00:10:36,036 Speaker 5: even at a basic level. For example, Democrats tend to 192 00:10:36,036 --> 00:10:39,436 Speaker 5: think that Republicans are super rich. When asked how many 193 00:10:39,476 --> 00:10:42,116 Speaker 5: Republicans have a salary of over two hundred and fifty 194 00:10:42,156 --> 00:10:46,116 Speaker 5: thousand dollars a year, Democrats, on average, guess around forty 195 00:10:46,156 --> 00:10:46,716 Speaker 5: four percent. 196 00:10:46,956 --> 00:10:49,396 Speaker 3: I think the real number is, like, twelve percent. 197 00:10:49,716 --> 00:10:52,476 Speaker 5: I don't want to correct Amanda, but the actual statistic 198 00:10:52,796 --> 00:10:53,516 Speaker 5: is two percent. 199 00:10:53,876 --> 00:10:54,996 Speaker 6: See, even I just did it. 200 00:10:56,956 --> 00:10:58,076 Speaker 1: I just made the same mistake.
201 00:10:58,556 --> 00:11:01,556 Speaker 5: The right gets the left wrong too. When Republicans are 202 00:11:01,556 --> 00:11:05,876 Speaker 5: asked how many Democrats identify as LGBTQ, they assumed it 203 00:11:05,916 --> 00:11:09,836 Speaker 5: was around thirty eight percent. The actual proportion: six percent. 204 00:11:10,396 --> 00:11:13,196 Speaker 5: We don't know who the other side is or what 205 00:11:13,316 --> 00:11:17,556 Speaker 5: they think. This is called false polarization. Of course, we 206 00:11:17,636 --> 00:11:21,436 Speaker 5: are polarized, but these divisions are larger in our minds 207 00:11:21,676 --> 00:11:25,956 Speaker 5: than in reality. Both Democrats and Republicans report that the 208 00:11:25,996 --> 00:11:29,236 Speaker 5: average person they disagree with is much more extreme than 209 00:11:29,276 --> 00:11:32,716 Speaker 5: they really are. And it gets worse. We also think 210 00:11:32,756 --> 00:11:35,836 Speaker 5: our average rival is about twice as hateful as they 211 00:11:35,876 --> 00:11:36,316 Speaker 5: really are. 212 00:11:36,396 --> 00:11:38,916 Speaker 3: We kind of caricature each other, or rather, I should say, 213 00:11:38,956 --> 00:11:44,036 Speaker 3: conflict entrepreneurs caricature each other, and then we believe these things, 214 00:11:44,196 --> 00:11:47,076 Speaker 3: especially when we're really segregated like we are right now. 215 00:11:47,436 --> 00:11:50,156 Speaker 5: The good news is there is way more common ground 216 00:11:50,196 --> 00:11:53,636 Speaker 5: in our political landscape than we realize. Most of us 217 00:11:53,676 --> 00:11:56,436 Speaker 5: agree on at least some issues, and almost all of 218 00:11:56,516 --> 00:12:00,756 Speaker 5: us share certain values. Most Americans want peace and democracy. 219 00:12:01,196 --> 00:12:04,516 Speaker 5: The bad news is that conflict
Entrepreneurs convince us that 220 00:12:04,556 --> 00:12:07,276 Speaker 5: people on the other side are awful, so we avoid 221 00:12:07,356 --> 00:12:10,756 Speaker 5: talking with them, even when those conversations might help 222 00:12:10,876 --> 00:12:14,036 Speaker 5: us become less wrong. What's worse, we end up thinking 223 00:12:14,116 --> 00:12:17,036 Speaker 5: that we need to escalate conflict even though almost no 224 00:12:17,116 --> 00:12:21,556 Speaker 5: one wants to. This feeds even more division. An exhausted 225 00:12:21,596 --> 00:12:25,916 Speaker 5: majority wishes things could be different, but feels, incorrectly, that 226 00:12:25,956 --> 00:12:26,636 Speaker 5: they're alone. 227 00:12:26,756 --> 00:12:28,796 Speaker 3: I think people are getting really tired of it, but 228 00:12:28,916 --> 00:12:32,956 Speaker 3: they also feel totally trapped. We know this doesn't feel right. 229 00:12:33,156 --> 00:12:35,756 Speaker 3: I think we all know that something is wrong in 230 00:12:35,756 --> 00:12:39,676 Speaker 3: our culture, in our politics, and that we need to 231 00:12:39,716 --> 00:12:43,236 Speaker 3: do better. I think there's huge unmet demand for a 232 00:12:43,276 --> 00:12:46,516 Speaker 3: different way to do politics, a different way to do journalism, 233 00:12:46,836 --> 00:12:47,756 Speaker 3: a different way to fight. 234 00:12:48,516 --> 00:12:50,436 Speaker 5: Is there hope that we can find a way out 235 00:12:50,436 --> 00:12:54,076 Speaker 5: of high conflict? Well, after the break, we'll meet someone 236 00:12:54,076 --> 00:12:57,916 Speaker 5: who's had to defuse lots of polarizing situations and open 237 00:12:57,956 --> 00:13:00,996 Speaker 5: a type of dialogue not where we agree, but where 238 00:13:00,996 --> 00:13:03,996 Speaker 5: we can still see each other's humanity and common goals, 239 00:13:04,596 --> 00:13:06,116 Speaker 5: even with our families.
240 00:13:05,796 --> 00:13:10,196 Speaker 2: Half my family's Republicans. It always makes for a great Thanksgiving. 241 00:13:09,796 --> 00:13:12,516 Speaker 5: More on that when The Happiness Lab returns 242 00:13:12,596 --> 00:13:13,076 Speaker 5: in a moment. 243 00:13:24,956 --> 00:13:27,916 Speaker 7: A lot of people surprise us, both good and bad. 244 00:13:28,676 --> 00:13:30,876 Speaker 2: Brit Barron tries not to see the world in black 245 00:13:30,916 --> 00:13:34,356 Speaker 2: and white. Instead, she's a self proclaimed master of nuance. 246 00:13:34,716 --> 00:13:38,156 Speaker 6: You know, traditional cartoon Disney, there's just a good guy, 247 00:13:38,236 --> 00:13:40,596 Speaker 6: there's a bad guy. You want the good guy to win. 248 00:13:40,716 --> 00:13:43,396 Speaker 1: That is it. That is the entire story. And so 249 00:13:43,476 --> 00:13:44,196 Speaker 1: we have this 250 00:13:44,196 --> 00:13:47,436 Speaker 6: idea in our mind that things are that clean and 251 00:13:47,436 --> 00:13:48,916 Speaker 6: that things can be that clean. 252 00:13:49,316 --> 00:13:52,836 Speaker 2: But Brit's own identities never fit cleanly into simple categories. 253 00:13:53,036 --> 00:13:55,276 Speaker 7: I grew up in the Evangelical church. 254 00:13:55,396 --> 00:13:59,316 Speaker 6: My parents were both very committed, and so that identity 255 00:13:59,676 --> 00:14:02,676 Speaker 6: actually shaped a lot of decisions I made, schools 256 00:14:02,676 --> 00:14:04,796 Speaker 6: I went to, things I was a part of. 257 00:14:05,076 --> 00:14:07,516 Speaker 2: But growing up as a black woman in these predominantly 258 00:14:07,516 --> 00:14:11,276 Speaker 2: white religious spaces wasn't easy. To achieve a sense of belonging,
259 00:14:11,316 --> 00:14:14,836 Speaker 2: Brit wound up embracing some strong fundamentalist beliefs, like the 260 00:14:14,916 --> 00:14:18,316 Speaker 2: idea that homosexuality is a sin and that girls need 261 00:14:18,356 --> 00:14:19,956 Speaker 2: to save themselves for marriage. 262 00:14:20,316 --> 00:14:23,036 Speaker 6: I think when I was like twelve or something, I 263 00:14:23,076 --> 00:14:25,676 Speaker 6: was in this youth group and the youth pastor passed 264 00:14:25,716 --> 00:14:28,996 Speaker 6: around a glass of water and everyone spit in the water, 265 00:14:29,356 --> 00:14:31,356 Speaker 6: and in the end he said, would anyone want to 266 00:14:31,396 --> 00:14:31,916 Speaker 6: drink this? 267 00:14:31,916 --> 00:14:34,196 Speaker 7: This is what happens every time you sleep with someone. 268 00:14:34,396 --> 00:14:35,276 Speaker 7: This is what's happening. 269 00:14:35,396 --> 00:14:38,036 Speaker 6: I mean even now saying that, I'm like, you know, 270 00:14:38,236 --> 00:14:40,996 Speaker 6: throwing up in my mouth. But yeah, I carried all 271 00:14:41,116 --> 00:14:43,396 Speaker 6: these stories and then I passed these stories on. 272 00:14:43,876 --> 00:14:46,716 Speaker 2: Brit was a dynamic speaker. She quickly rose through the 273 00:14:46,796 --> 00:14:49,756 Speaker 2: leadership ranks of her mostly white congregation. 274 00:14:49,716 --> 00:14:52,076 Speaker 6: And I found myself at a very young age of twenty six 275 00:14:52,156 --> 00:14:53,996 Speaker 6: being a pastor at a megachurch. 276 00:14:54,516 --> 00:14:58,476 Speaker 2: Things were going smoothly until Brit and the church's creative director, Sammy, 277 00:14:58,636 --> 00:14:59,236 Speaker 2: fell in love. 278 00:14:59,436 --> 00:15:01,636 Speaker 6: And so that is where I feel like I had 279 00:15:01,756 --> 00:15:03,116 Speaker 6: a decision to make.
280 00:15:03,556 --> 00:15:05,996 Speaker 2: Sammy is not the sort of person that Brit expected 281 00:15:06,036 --> 00:15:09,516 Speaker 2: to fall for. Sammy is a woman, and Brit had 282 00:15:09,516 --> 00:15:12,916 Speaker 2: spent her career preaching that same sex relationships were evil. 283 00:15:13,036 --> 00:15:16,316 Speaker 6: I struggled a lot with feeling like this seems like 284 00:15:16,356 --> 00:15:17,636 Speaker 6: a good thing, but what if it's bad? 285 00:15:17,676 --> 00:15:19,436 Speaker 7: What if this is a trick? What if, you know, 286 00:15:19,516 --> 00:15:20,516 Speaker 7: the devil's tricking you, you know? 287 00:15:20,596 --> 00:15:22,876 Speaker 6: I mean, I was like so deeply in this religion, 288 00:15:22,996 --> 00:15:24,516 Speaker 6: cult, you know, whatever you want to call it. 289 00:15:24,556 --> 00:15:25,436 Speaker 1: I was so deep in it. 290 00:15:25,676 --> 00:15:28,956 Speaker 2: Eventually, Brit and Sammy decided to declare their love openly 291 00:15:29,436 --> 00:15:31,196 Speaker 2: and came out to their church community. 292 00:15:31,276 --> 00:15:33,356 Speaker 1: I thought we would lose our jobs, which we both did. 293 00:15:33,756 --> 00:15:37,596 Speaker 6: I thought that we would get weird messages on the internet, 294 00:15:37,636 --> 00:15:38,156 Speaker 6: and we did. 295 00:15:38,796 --> 00:15:41,276 Speaker 2: Brit and Sammy lost much of the community they'd built 296 00:15:41,396 --> 00:15:44,436 Speaker 2: over decades. Many of Brit's friends were so steeped in 297 00:15:44,516 --> 00:15:47,916 Speaker 2: binary thinking they simply couldn't accept that their former friends 298 00:15:47,996 --> 00:15:49,196 Speaker 2: had become a loving couple.
299 00:15:49,596 --> 00:15:53,396 Speaker 6: We got engaged not too long after coming out, and 300 00:15:53,876 --> 00:15:57,196 Speaker 6: there were three people, some of our closest friends in life, 301 00:15:57,236 --> 00:15:59,636 Speaker 6: who we asked to be in our wedding, who eventually 302 00:15:59,676 --> 00:16:01,916 Speaker 6: said this sort of stands against what they believe in. 303 00:16:01,956 --> 00:16:03,956 Speaker 1: They're not able to participate 304 00:16:03,476 --> 00:16:03,756 Speaker 6: in it with us. 305 00:16:04,476 --> 00:16:07,556 Speaker 2: The experience was devastating, but it got Brit thinking: could 306 00:16:07,556 --> 00:16:10,556 Speaker 2: people who disagreed so deeply on key issues ever 307 00:16:10,596 --> 00:16:13,236 Speaker 2: find a way to connect? She dedicated the next stages 308 00:16:13,276 --> 00:16:16,356 Speaker 2: of her career to helping people start meaningful dialogue, to 309 00:16:16,396 --> 00:16:19,356 Speaker 2: see beyond their usual black and white views. I first 310 00:16:19,476 --> 00:16:22,036 Speaker 2: learned about Brit's work through her book Do You Still 311 00:16:22,076 --> 00:16:25,036 Speaker 2: Talk to Grandma? When the Problematic People in Our Lives 312 00:16:25,156 --> 00:16:26,036 Speaker 2: Are the Ones We Love. 313 00:16:26,276 --> 00:16:28,836 Speaker 6: I was at dinner with a group of friends and 314 00:16:28,956 --> 00:16:31,556 Speaker 6: one of our friends was saying, oh, yeah, my grandma 315 00:16:31,676 --> 00:16:35,236 Speaker 6: voted for Trump. And someone else was like, do you 316 00:16:35,276 --> 00:16:38,476 Speaker 6: still talk to her? And she was like, my nana? 317 00:16:39,036 --> 00:16:41,716 Speaker 6: And I was like, oh my gosh, are we canceling grandmas? 318 00:16:41,756 --> 00:16:41,836 Speaker 2: Like? 319 00:16:41,836 --> 00:16:42,556 Speaker 1: Are we saying? 320 00:16:42,716 --> 00:16:45,916 Speaker 6: It's too hard for me to sit in any gray?
321 00:16:46,036 --> 00:16:48,836 Speaker 6: So I am just going to talk to people with 322 00:16:48,916 --> 00:16:50,316 Speaker 6: whom I'm on the same page. 323 00:16:50,596 --> 00:16:52,956 Speaker 2: Brit argues that we'd all be happier if we found 324 00:16:52,956 --> 00:16:55,076 Speaker 2: ways to disagree more agreeably. 325 00:16:55,236 --> 00:16:58,716 Speaker 6: My assumption, my feeling, my hunch is that a lot 326 00:16:58,796 --> 00:17:02,596 Speaker 6: of us are actually looking for a way to disagree 327 00:17:02,676 --> 00:17:04,476 Speaker 6: and still be in relationships with each other. 328 00:17:05,036 --> 00:17:07,516 Speaker 5: But to do that, we need to recognize that we 329 00:17:07,716 --> 00:17:11,036 Speaker 5: actually agree with the vast majority of people more than 330 00:17:11,076 --> 00:17:15,556 Speaker 5: our lying minds think. False polarization is just that: false. 331 00:17:16,156 --> 00:17:18,916 Speaker 5: Take the US for example. Did you know that seventy 332 00:17:18,956 --> 00:17:21,996 Speaker 5: two percent of Americans think that climate change is happening 333 00:17:22,156 --> 00:17:24,796 Speaker 5: and agree we should take action? Or that more than 334 00:17:24,836 --> 00:17:27,756 Speaker 5: eighty percent of Americans believe that racism is a problem 335 00:17:27,836 --> 00:17:30,916 Speaker 5: we need to address? Over eighty percent of our fellow 336 00:17:30,956 --> 00:17:33,596 Speaker 5: citizens agree we should be fighting for a free press, 337 00:17:33,956 --> 00:17:37,556 Speaker 5: for freedom of speech, and for reasonable gun control measures 338 00:17:37,596 --> 00:17:42,156 Speaker 5: like background checks. Ninety percent of Americans support everyone's right 339 00:17:42,236 --> 00:17:45,476 Speaker 5: to a fair vote. Those stats make it clear we 340 00:17:45,636 --> 00:17:48,196 Speaker 5: actually see eye to eye on way more of the 341 00:17:48,236 --> 00:17:49,596 Speaker 5: fundamentals than we think.
342 00:17:49,876 --> 00:17:52,396 Speaker 6: If we can agree on the what, then we don't 343 00:17:52,436 --> 00:17:53,756 Speaker 6: necessarily have to agree 344 00:17:53,476 --> 00:17:53,996 Speaker 7: on the how. 345 00:17:54,436 --> 00:17:57,316 Speaker 5: The problem is that we don't realize how aligned we 346 00:17:57,396 --> 00:18:01,036 Speaker 5: really are. But what would happen if we did realize? 347 00:18:01,556 --> 00:18:05,236 Speaker 5: If we corrected our mistaken intuitions and developed a more 348 00:18:05,316 --> 00:18:08,916 Speaker 5: accurate view of what people on the other side actually believe? 349 00:18:09,916 --> 00:18:13,396 Speaker 5: That's what Stanford sociologist Rob Willer and his colleagues tested. 350 00:18:14,076 --> 00:18:17,316 Speaker 5: They brought Democrats and Republicans into the lab and asked them, 351 00:18:17,476 --> 00:18:19,836 Speaker 5: how much do you feel it's justified for your party 352 00:18:19,876 --> 00:18:23,636 Speaker 5: to use violence to advance political goals? He then had 353 00:18:23,676 --> 00:18:26,276 Speaker 5: the same people predict what the other side would answer. 354 00:18:26,596 --> 00:18:30,276 Speaker 5: Willer found that each side overestimated the other's approval of 355 00:18:30,396 --> 00:18:35,516 Speaker 5: violence by nearly three hundred percent. Three hundred percent. But 356 00:18:35,596 --> 00:18:38,436 Speaker 5: Willer and his colleagues didn't stop there. They showed a 357 00:18:38,436 --> 00:18:41,756 Speaker 5: different group of people how big the usual overestimate was.
358 00:18:42,396 --> 00:18:46,036 Speaker 5: They explained just how off the average person was in 359 00:18:46,116 --> 00:18:49,996 Speaker 5: their predictions about the other side. And what happened? Well, 360 00:18:49,996 --> 00:18:53,596 Speaker 5: those new participants corrected their beliefs about the other side's 361 00:18:53,676 --> 00:18:57,916 Speaker 5: violent tendencies, and maybe more importantly, their own support for 362 00:18:57,996 --> 00:19:02,836 Speaker 5: political violence went down. Simply correcting our inaccurate perceptions of 363 00:19:02,876 --> 00:19:06,596 Speaker 5: our rivals can help us dial back our willingness to 364 00:19:06,716 --> 00:19:10,236 Speaker 5: strike preemptively. It can make us seek more peace. 365 00:19:11,036 --> 00:19:13,636 Speaker 2: Britt thinks that results like these prove that we need 366 00:19:13,676 --> 00:19:16,476 Speaker 2: to embrace a bit more nuance and drop the binary 367 00:19:16,516 --> 00:19:19,356 Speaker 2: judgments we make about say a grandma who's kind and 368 00:19:19,436 --> 00:19:21,516 Speaker 2: loving but also votes for someone 369 00:19:21,276 --> 00:19:23,596 Speaker 1: we hate. So is she good or is she bad? 370 00:19:23,996 --> 00:19:25,556 Speaker 6: Right, very few of us are willing to sit with 371 00:19:25,596 --> 00:19:28,996 Speaker 6: the fact that Grandma's both, and so are we. 372 00:19:29,596 --> 00:19:32,116 Speaker 2: And that gets to the heart of Britt's strategy for 373 00:19:32,196 --> 00:19:34,956 Speaker 2: disagreeing better. We need to bring some humility to our 374 00:19:34,996 --> 00:19:35,756 Speaker 2: own views too. 375 00:19:36,036 --> 00:19:38,476 Speaker 6: I think a lot of us have convinced ourselves that 376 00:19:38,516 --> 00:19:41,476 Speaker 6: we can fully exist on the quote unquote right side 377 00:19:41,476 --> 00:19:43,636 Speaker 6: of the line.
So if someone can be all bad, 378 00:19:43,836 --> 00:19:47,076 Speaker 6: then we can be all good, and we can exist 379 00:19:47,156 --> 00:19:49,956 Speaker 6: in only right answers, and we can know the formula 380 00:19:50,116 --> 00:19:51,956 Speaker 6: and we can have it right, and we know the 381 00:19:52,036 --> 00:19:54,356 Speaker 6: right people to follow, we know the right things to repost, 382 00:19:54,396 --> 00:19:56,676 Speaker 6: we know the right books to read, and oh my gosh, 383 00:19:56,716 --> 00:19:58,636 Speaker 6: I just wipe this one off my brow because I 384 00:19:58,676 --> 00:20:00,876 Speaker 6: don't have to worry about ever being bad. 385 00:20:01,516 --> 00:20:04,116 Speaker 2: Britt worries that we don't just think we're right right now. 386 00:20:04,436 --> 00:20:07,356 Speaker 2: We assume we've always been right and that we always 387 00:20:07,436 --> 00:20:10,836 Speaker 2: will be right. It's a bias she calls progressive amnesia. 388 00:20:10,876 --> 00:20:16,036 Speaker 6: Progressive amnesia is our ability to conveniently forget a time 389 00:20:16,116 --> 00:20:19,116 Speaker 6: before we knew what we know now. We have all 390 00:20:19,196 --> 00:20:22,196 Speaker 6: seen a movie that we like, love from our childhood. 391 00:20:22,636 --> 00:20:25,636 Speaker 6: We rewatch it only to find, and we've all said 392 00:20:25,636 --> 00:20:28,596 Speaker 6: this phrase before, that the movie doesn't hold up. And owning 393 00:20:28,676 --> 00:20:31,836 Speaker 6: that and experiencing that should give us a different perspective. 394 00:20:32,516 --> 00:20:34,956 Speaker 7: It should change our lens 395 00:20:34,676 --> 00:20:38,076 Speaker 6: to sort of soften some of that black and white 396 00:20:38,116 --> 00:20:39,516 Speaker 6: line we've drawn so firmly.
397 00:20:40,276 --> 00:20:42,956 Speaker 2: Noticing just how much our own views have changed can 398 00:20:42,996 --> 00:20:45,556 Speaker 2: help us remember that other people have the capacity for 399 00:20:45,676 --> 00:20:48,676 Speaker 2: change too. It's a reminder that Britt turns to whenever 400 00:20:48,716 --> 00:20:50,436 Speaker 2: she encounters someone she disagrees with. 401 00:20:50,716 --> 00:20:54,196 Speaker 6: I have outgrown so many beliefs I used to firmly 402 00:20:54,276 --> 00:20:57,636 Speaker 6: hold, right? I believed these fundamentally Christian things. I had 403 00:20:57,676 --> 00:21:01,316 Speaker 6: thoughts about women and purity culture. I used to believe 404 00:21:01,436 --> 00:21:04,636 Speaker 6: that being gay was a sin, like I bought into that. 405 00:21:04,916 --> 00:21:06,836 Speaker 6: Now I can't look at the people who are there 406 00:21:07,036 --> 00:21:11,036 Speaker 6: as just like antiquated, prehistoric idiots who don't stand a 407 00:21:11,116 --> 00:21:13,236 Speaker 6: chance and they're just dumb and they're bad, when in 408 00:21:13,276 --> 00:21:14,396 Speaker 6: reality they're just me 409 00:21:14,436 --> 00:21:15,276 Speaker 1: fifteen years ago. 410 00:21:15,596 --> 00:21:17,716 Speaker 6: I think there's just a lot of empathy I feel 411 00:21:17,756 --> 00:21:20,316 Speaker 6: for people who have only ever been told one story 412 00:21:20,716 --> 00:21:23,556 Speaker 6: and now are retelling that story because what else 413 00:21:23,396 --> 00:21:23,836 Speaker 1: would they know? 414 00:21:24,276 --> 00:21:26,316 Speaker 6: And that doesn't change the work that we have to 415 00:21:26,356 --> 00:21:29,396 Speaker 6: do in terms of justice and fighting for equity, but 416 00:21:29,476 --> 00:21:31,516 Speaker 6: it should change our approach to the work as we 417 00:21:31,636 --> 00:21:33,676 Speaker 6: understand those people in a different way.
418 00:21:34,316 --> 00:21:35,956 Speaker 2: But we don't just need to change the way we 419 00:21:35,996 --> 00:21:38,436 Speaker 2: think about people on the other side of political debates. 420 00:21:38,676 --> 00:21:41,716 Speaker 2: We also need to change how we act. We actually 421 00:21:41,756 --> 00:21:44,076 Speaker 2: need to talk to the people we disagree 422 00:21:43,676 --> 00:21:47,356 Speaker 6: with. And so talking to people is so wildly important 423 00:21:47,436 --> 00:21:49,836 Speaker 6: because that is how we see them as people 424 00:21:49,876 --> 00:21:52,316 Speaker 7: again. I think there are a lot of ways in 425 00:21:52,276 --> 00:21:55,996 Speaker 6: which human interaction can actually turn the needle quite a bit, 426 00:21:56,316 --> 00:21:57,556 Speaker 1: because I've experienced that. 427 00:21:58,276 --> 00:22:01,196 Speaker 2: But how do we get those tricky conversations going? We'll 428 00:22:01,236 --> 00:22:04,316 Speaker 2: find out when the Happiness Lab returns after the break. 429 00:22:12,516 --> 00:22:13,476 Speaker 2: Hey Luisa, how's it going? 430 00:22:13,676 --> 00:22:13,996 Speaker 6: Hello? 431 00:22:14,196 --> 00:22:14,876 Speaker 1: Doing well? 432 00:22:15,276 --> 00:22:16,436 Speaker 4: Thank you for having 433 00:22:16,116 --> 00:22:17,636 Speaker 3: me. No, thanks for coming on. 434 00:22:17,756 --> 00:22:18,676 Speaker 7: I love, love love. 435 00:22:18,876 --> 00:22:22,556 Speaker 5: This is Luisa Santos, no relation to Laurie. In fact, 436 00:22:22,636 --> 00:22:25,356 Speaker 5: she hails from a different continent than our fearless host. 437 00:22:26,036 --> 00:22:28,676 Speaker 4: I was born in Brazil, and Brazil has an interesting 438 00:22:28,996 --> 00:22:32,476 Speaker 4: history because we had a military dictatorship.
So my parents 439 00:22:32,516 --> 00:22:35,276 Speaker 4: lived through that and I kind of saw their shifting 440 00:22:35,276 --> 00:22:36,316 Speaker 4: in their political beliefs. 441 00:22:36,396 --> 00:22:40,156 Speaker 5: Luisa's homeland has seen decades of political upheaval and division, 442 00:22:40,596 --> 00:22:44,556 Speaker 5: and sadly, those disagreements have driven a painful wedge directly 443 00:22:44,596 --> 00:22:46,476 Speaker 5: between members of her closest family. 444 00:22:47,196 --> 00:22:50,596 Speaker 4: My mom and one of my cousins stopped talking until 445 00:22:50,596 --> 00:22:55,476 Speaker 4: this day, don't really communicate, and it's been, you know, 446 00:22:55,556 --> 00:22:58,956 Speaker 4: ten years, and so that was really shocking for me 447 00:22:59,116 --> 00:23:03,156 Speaker 4: to see how something that at times in people's lives 448 00:23:03,276 --> 00:23:09,276 Speaker 4: feels abstract can really corrode these very real relationships. 449 00:23:09,396 --> 00:23:13,036 Speaker 5: Luisa left Brazil to study psychology in the US, eventually working 450 00:23:13,076 --> 00:23:16,636 Speaker 5: with me at Stanford, and quickly realized that the rise 451 00:23:16,676 --> 00:23:21,156 Speaker 5: of populist politicians was causing worrying levels of polarization in 452 00:23:21,196 --> 00:23:21,996 Speaker 5: both countries. 453 00:23:22,396 --> 00:23:25,276 Speaker 4: So I was in college when the twenty sixteen election happened, 454 00:23:25,316 --> 00:23:28,116 Speaker 4: and I kind of saw the fabric of both countries 455 00:23:28,356 --> 00:23:32,316 Speaker 4: unraveling because of these dynamics, and I became really interested 456 00:23:32,356 --> 00:23:36,116 Speaker 4: in if there's anything we could use to help mitigate 457 00:23:36,156 --> 00:23:37,756 Speaker 4: some of these effects.
458 00:23:37,996 --> 00:23:41,956 Speaker 5: Luisa became interested in false polarization, the idea that people 459 00:23:42,036 --> 00:23:45,196 Speaker 5: simply don't disagree as much as they think. She began 460 00:23:45,276 --> 00:23:47,716 Speaker 5: to wonder what was causing us to be so wrong 461 00:23:47,796 --> 00:23:52,156 Speaker 5: about the views of others. Luisa gravitated towards an answer. 462 00:23:52,516 --> 00:23:55,356 Speaker 5: Study after study showed that, like in her own family, 463 00:23:55,756 --> 00:23:58,356 Speaker 5: people don't know what the other side really thinks because 464 00:23:58,356 --> 00:24:00,356 Speaker 5: they're unwilling to even talk to them. 465 00:24:00,436 --> 00:24:02,756 Speaker 4: People might see value in it, but they are just 466 00:24:03,236 --> 00:24:06,436 Speaker 4: very concerned about how these conversations are going to go 467 00:24:06,476 --> 00:24:09,636 Speaker 4: and tend to imagine kind of the worst, a breeding 468 00:24:09,636 --> 00:24:11,876 Speaker 4: ground for these misperceptions. 469 00:24:11,556 --> 00:24:14,836 Speaker 2: And that breeding ground has grown pretty extreme. One recent 470 00:24:14,916 --> 00:24:18,476 Speaker 2: experiment found that people will refuse money if it requires 471 00:24:18,516 --> 00:24:22,036 Speaker 2: listening to a political opponent. In another study, subjects predicted 472 00:24:22,036 --> 00:24:25,076 Speaker 2: that talking to a political opponent would feel about as 473 00:24:25,076 --> 00:24:27,836 Speaker 2: bad as getting a tooth pulled, and family members who 474 00:24:27,836 --> 00:24:30,756 Speaker 2: spent their Thanksgiving dinners in a town where people voted 475 00:24:30,756 --> 00:24:33,716 Speaker 2: differently than they did wound up leaving that holiday meal 476 00:24:33,836 --> 00:24:36,836 Speaker 2: fifty minutes earlier than those who ate in communities that 477 00:24:36,956 --> 00:24:37,956 Speaker 2: voted more similarly.
478 00:24:38,036 --> 00:24:40,276 Speaker 5: Hold up, Laurie, wait a minute, you're saying that these 479 00:24:40,316 --> 00:24:43,716 Speaker 5: people leave before dessert. They're giving up pie. I mean 480 00:24:44,276 --> 00:24:48,356 Speaker 5: pumpkin pie, apple pie. That's an incredible desire to avoid 481 00:24:48,396 --> 00:24:50,436 Speaker 5: a conversation if you're willing to give that up. 482 00:24:51,236 --> 00:24:55,276 Speaker 4: So our idea was, can we get everyday people who 483 00:24:55,356 --> 00:24:58,396 Speaker 4: disagree on topics that are important for the country to 484 00:24:58,716 --> 00:25:01,716 Speaker 4: just sit and have a conversation for twenty minutes and 485 00:25:01,756 --> 00:25:04,196 Speaker 4: what are the consequences of that? 486 00:25:04,476 --> 00:25:07,276 Speaker 2: The idea turned into Luisa's PhD dissertation. 487 00:25:07,676 --> 00:25:11,876 Speaker 4: We basically went out and asked a bunch of Republicans and 488 00:25:11,916 --> 00:25:16,556 Speaker 4: Democrats about their beliefs on three hot button topics. It 489 00:25:16,596 --> 00:25:21,676 Speaker 4: was immigration, gun control, and climate change. And we explained 490 00:25:21,676 --> 00:25:24,156 Speaker 4: to them that we were trying to have this study 491 00:25:24,156 --> 00:25:28,396 Speaker 4: where people would meet over Zoom and talk about disagreements 492 00:25:28,436 --> 00:25:31,476 Speaker 4: with a person who supported the other party. And then 493 00:25:31,516 --> 00:25:33,796 Speaker 4: we asked them to forecast how these conversations were going 494 00:25:33,876 --> 00:25:36,036 Speaker 4: to go. So we asked them, how pleasant do you 495 00:25:36,076 --> 00:25:37,196 Speaker 4: think the conversation will be? 496 00:25:37,236 --> 00:25:37,916 Speaker 7: How productive?
497 00:25:37,956 --> 00:25:39,356 Speaker 4: How much do you think you're going to like the 498 00:25:39,396 --> 00:25:42,756 Speaker 4: person on the other side? And a week later we 499 00:25:42,796 --> 00:25:45,196 Speaker 4: actually paired people to have conversations. 500 00:25:45,436 --> 00:25:49,556 Speaker 2: Luisa's method seems straightforward enough, but her PhD advisor was 501 00:25:49,596 --> 00:25:51,076 Speaker 2: a little worried, more than a little. 502 00:25:51,356 --> 00:25:53,916 Speaker 5: One of our core missions in research, as you know, Laurie, 503 00:25:53,956 --> 00:25:56,076 Speaker 5: is not just to find out about people, but to 504 00:25:56,156 --> 00:25:59,676 Speaker 5: keep them safe. Luisa and I were terrified that people 505 00:25:59,716 --> 00:26:03,196 Speaker 5: would threaten each other, or dox each other, you name it. 506 00:26:03,556 --> 00:26:06,236 Speaker 5: We wanted to do everything we could to avoid harm, 507 00:26:06,516 --> 00:26:07,356 Speaker 5: so Luisa 508 00:26:07,076 --> 00:26:09,436 Speaker 2: and Jamil put protections in place to make sure 509 00:26:09,476 --> 00:26:12,796 Speaker 2: that the conversations went smoothly. They added a moderator who 510 00:26:12,796 --> 00:26:15,756 Speaker 2: had explicit instructions about what to do if the conversation 511 00:26:15,836 --> 00:26:19,196 Speaker 2: went south. The moderator began the conversation by reminding the 512 00:26:19,196 --> 00:26:23,236 Speaker 2: participants about the political questions they'd answered before, and explained 513 00:26:23,276 --> 00:26:25,796 Speaker 2: that they'd now be having a conversation with someone who 514 00:26:25,876 --> 00:26:29,116 Speaker 2: disagreed with their views. The moderator then introduced the first 515 00:26:29,196 --> 00:26:33,076 Speaker 2: of two hot button topics, say gun control, and invited 516 00:26:33,116 --> 00:26:35,156 Speaker 2: the strangers to exchange their views.
517 00:26:35,276 --> 00:26:37,876 Speaker 4: And then a moderator would mute themselves and turn their 518 00:26:37,916 --> 00:26:40,916 Speaker 4: camera off, and then people would basically have this ten 519 00:26:40,956 --> 00:26:44,156 Speaker 4: minute conversation with the stranger, that person they just met, 520 00:26:44,196 --> 00:26:45,476 Speaker 4: on this issue. 521 00:26:45,716 --> 00:26:48,436 Speaker 5: Well, I think we need guns because there are a 522 00:26:48,476 --> 00:26:49,636 Speaker 5: lot of crazies out there. 523 00:26:50,276 --> 00:26:52,276 Speaker 3: I think the real problem is that there are just 524 00:26:52,356 --> 00:26:54,916 Speaker 3: too many crazy people with guns. 525 00:26:55,356 --> 00:26:59,036 Speaker 2: When I first heard about Luisa's study, my initial reaction was, man, 526 00:26:59,116 --> 00:27:01,796 Speaker 2: I hope she paid those poor moderators a lot of money, 527 00:27:02,236 --> 00:27:05,436 Speaker 2: because I assumed most conversations would turn into twenty minute 528 00:27:05,476 --> 00:27:08,916 Speaker 2: screaming matches. And I wasn't the only one with this prediction. 529 00:27:09,396 --> 00:27:12,196 Speaker 2: Luisa recruited a different group of people and asked them 530 00:27:12,236 --> 00:27:16,236 Speaker 2: to forecast how many conversations would require moderator intervention. 531 00:27:17,036 --> 00:27:21,436 Speaker 4: And people thought that twenty five percent of conversations would 532 00:27:21,436 --> 00:27:24,636 Speaker 4: have to be stopped, when in fact none did. 533 00:27:25,556 --> 00:27:26,436 Speaker 7: I was shocked. 534 00:27:26,676 --> 00:27:29,036 Speaker 4: I did not think that they were gonna love it, 535 00:27:29,676 --> 00:27:30,556 Speaker 4: but people 536 00:27:30,356 --> 00:27:33,436 Speaker 2: did love it.
They shared their views civilly and listened 537 00:27:33,476 --> 00:27:35,836 Speaker 2: to stories from a complete stranger on the other side 538 00:27:35,956 --> 00:27:37,316 Speaker 2: with genuine curiosity. 539 00:27:37,636 --> 00:27:38,756 Speaker 4: My mom's got guns. 540 00:27:38,916 --> 00:27:41,476 Speaker 5: I mean, she's in her eighties. Says it makes her 541 00:27:41,516 --> 00:27:45,556 Speaker 5: feel safer. I guess I can understand that. My sister's 542 00:27:45,596 --> 00:27:46,156 Speaker 5: the same way. 543 00:27:46,556 --> 00:27:49,356 Speaker 2: But the most shocking finding came when Luisa asked people 544 00:27:49,356 --> 00:27:52,356 Speaker 2: how much they enjoyed the conversation on a scale from 545 00:27:52,436 --> 00:27:55,556 Speaker 2: one, not at all, to one hundred, extremely enjoyable. 546 00:27:56,196 --> 00:27:59,116 Speaker 4: The most common response was one hundred out of one hundred. 547 00:27:59,236 --> 00:28:02,676 Speaker 4: People thought it was extremely pleasant, that these conversations were 548 00:28:02,716 --> 00:28:06,036 Speaker 4: extremely productive, that they found a lot of common ground 549 00:28:06,036 --> 00:28:09,756 Speaker 4: with their partner, and that they liked their conversation partner 550 00:28:09,956 --> 00:28:13,476 Speaker 4: a lot. And a very common comment at the end 551 00:28:13,556 --> 00:28:18,116 Speaker 4: of the survey was that people were shocked by how 552 00:28:18,156 --> 00:28:20,636 Speaker 4: well it went and how much they liked the person 553 00:28:20,676 --> 00:28:23,316 Speaker 4: they talked to.
They even wrote things like I would 554 00:28:23,356 --> 00:28:26,356 Speaker 4: love to have this person over for cocktails and dinner, 555 00:28:26,396 --> 00:28:30,396 Speaker 4: and you know, they really seemed to connect with one another, 556 00:28:30,556 --> 00:28:34,436 Speaker 4: even though they were talking about these fraught topics of 557 00:28:34,476 --> 00:28:37,156 Speaker 4: political disagreement. And I think it speaks a little bit 558 00:28:37,196 --> 00:28:40,556 Speaker 4: to the hunger that people have to find these types 559 00:28:40,636 --> 00:28:43,036 Speaker 4: of connections, you know, to kind of feel that there 560 00:28:43,196 --> 00:28:46,236 Speaker 4: is common ground out there when everything around 561 00:28:45,956 --> 00:28:47,476 Speaker 1: us seemed to indicate otherwise. 562 00:28:47,636 --> 00:28:51,196 Speaker 2: But why did these conversations go so shockingly well? Luisa 563 00:28:51,196 --> 00:28:54,036 Speaker 2: thinks part of the success involved the way strangers talked 564 00:28:54,036 --> 00:28:56,756 Speaker 2: about their views. Rather than give some lecture aimed at 565 00:28:56,756 --> 00:29:00,076 Speaker 2: convincing the other side, most participants just talked about their 566 00:29:00,076 --> 00:29:03,756 Speaker 2: lived experience. They told stories about how and why they 567 00:29:03,756 --> 00:29:04,916 Speaker 2: came to believe what they did. 568 00:29:05,476 --> 00:29:08,996 Speaker 4: For example, in one of our conversations, this person, 569 00:29:09,196 --> 00:29:14,116 Speaker 4: the Republican gun supporter, shared that he's gay, he lives 570 00:29:14,156 --> 00:29:17,076 Speaker 4: in an area where he feels very unsafe, and he 571 00:29:17,156 --> 00:29:20,156 Speaker 4: has received death threats in the past, and that was 572 00:29:20,196 --> 00:29:22,836 Speaker 4: the time where he bought his first gun.
So he 573 00:29:22,916 --> 00:29:24,596 Speaker 4: kind of shares the story about how he was actually 574 00:29:24,636 --> 00:29:28,196 Speaker 4: opposed to guns before and then this setting of being 575 00:29:28,276 --> 00:29:32,596 Speaker 4: frightened made him purchase a gun, and nowadays he feels 576 00:29:32,916 --> 00:29:36,476 Speaker 4: safer because of that purchase. I feel like stories like 577 00:29:36,556 --> 00:29:39,236 Speaker 4: that really shatter kind of preconceived ideas of who that 578 00:29:39,236 --> 00:29:41,076 Speaker 4: person on the other side is, and it kind of 579 00:29:41,116 --> 00:29:43,756 Speaker 4: opens us up to understanding how a person can disagree 580 00:29:43,796 --> 00:29:46,276 Speaker 4: with us, and it might not fully change our beliefs, 581 00:29:46,316 --> 00:29:48,396 Speaker 4: but at least gives insight into the humanity of that 582 00:29:48,436 --> 00:29:49,516 Speaker 4: person who believes that. 583 00:29:49,916 --> 00:29:53,276 Speaker 2: Luisa's conversations involved a rhetorical tactic that we've talked about 584 00:29:53,316 --> 00:29:57,116 Speaker 2: before on the Happiness Lab, asking deep questions. Research shows 585 00:29:57,116 --> 00:30:00,116 Speaker 2: that people feel more heard when their conversation partners ask 586 00:30:00,196 --> 00:30:03,556 Speaker 2: deep questions and listen to their replies. Hearing a partner's 587 00:30:03,636 --> 00:30:06,396 Speaker 2: question can also help us think more carefully about our 588 00:30:06,396 --> 00:30:07,156 Speaker 2: own views on a 589 00:30:07,116 --> 00:30:10,356 Speaker 4: topic. Like sometimes when people ask enough questions, even the 590 00:30:10,396 --> 00:30:13,196 Speaker 4: person who was very confident in their beliefs starting out 591 00:30:13,556 --> 00:30:15,716 Speaker 4: becomes a little less confident as it goes on.
592 00:30:16,316 --> 00:30:19,716 Speaker 2: Asking questions also helps your conversation partner know that you 593 00:30:19,756 --> 00:30:22,396 Speaker 2: really care about their experiences and perspective. 594 00:30:22,876 --> 00:30:25,756 Speaker 4: So I think that sometimes people come to conversations thinking 595 00:30:25,796 --> 00:30:28,036 Speaker 4: that the sole goal of the other side is to 596 00:30:28,036 --> 00:30:30,676 Speaker 4: persuade you, but actually people are more interested in learning 597 00:30:30,716 --> 00:30:31,716 Speaker 4: than we give them credit for. 598 00:30:32,436 --> 00:30:36,036 Speaker 2: Luisa's study shows that hard political conversations are much more 599 00:30:36,036 --> 00:30:39,436 Speaker 2: effective and enjoyable than we think. Her findings about the 600 00:30:39,436 --> 00:30:42,196 Speaker 2: power of a quick one on one political dialogue have 601 00:30:42,276 --> 00:30:44,676 Speaker 2: made her much more hopeful that we can fight our 602 00:30:44,676 --> 00:30:48,116 Speaker 2: polarization and fix the toxic conflict that so many of 603 00:30:48,196 --> 00:30:48,836 Speaker 2: us despise. 604 00:30:49,116 --> 00:30:52,236 Speaker 4: Most people do not want the world that we currently 605 00:30:52,316 --> 00:30:54,596 Speaker 4: live in, and they kind of would like more connection. 606 00:30:55,476 --> 00:30:59,036 Speaker 5: Luisa's research is so encouraging and compelling, but has it 607 00:30:59,116 --> 00:31:01,316 Speaker 5: fixed the rift between her mother and her cousin?
608 00:31:01,516 --> 00:31:04,916 Speaker 4: I try in private conversations with each of them, and 609 00:31:05,036 --> 00:31:07,916 Speaker 4: both of them asked me about each other, you know, 610 00:31:07,956 --> 00:31:10,716 Speaker 4: so there's a lot of care there and love. And 611 00:31:10,756 --> 00:31:13,876 Speaker 4: I think that, like, both of them seem, when I 612 00:31:13,876 --> 00:31:15,796 Speaker 4: talk to them in private, very open to the idea 613 00:31:15,796 --> 00:31:18,316 Speaker 4: of it. But no one wants to take the first step. 614 00:31:18,436 --> 00:31:21,076 Speaker 4: So I hope I can come back to this podcast 615 00:31:21,116 --> 00:31:24,956 Speaker 4: in a few years and say that that conflict has ended. 616 00:31:25,636 --> 00:31:28,916 Speaker 5: This points to a new frontier for Luisa's work. Getting 617 00:31:28,956 --> 00:31:31,396 Speaker 5: strangers to be polite to each other is one thing, 618 00:31:31,916 --> 00:31:34,676 Speaker 5: but the emotions of a family dynamic are on another 619 00:31:34,756 --> 00:31:38,036 Speaker 5: level of challenge, one we hope to explore in the future. 620 00:31:39,116 --> 00:31:42,116 Speaker 5: Of course, there are real disagreements in our country and 621 00:31:42,156 --> 00:31:45,876 Speaker 5: around the world. There are real violent extremists and bad 622 00:31:45,916 --> 00:31:49,236 Speaker 5: faith actors out there, and real threats to our values 623 00:31:49,476 --> 00:31:53,476 Speaker 5: and even democracy itself. But our problems only get worse 624 00:31:53,716 --> 00:31:57,556 Speaker 5: when we cynically decide that absolutely everyone we disagree with 625 00:31:57,716 --> 00:32:01,236 Speaker 5: is awful. The way we see each other now shapes 626 00:32:01,276 --> 00:32:04,436 Speaker 5: the future we can imagine together. When we write off 627 00:32:04,476 --> 00:32:07,476 Speaker 5: the other side, we give up on things getting any better.
628 00:32:07,996 --> 00:32:11,156 Speaker 5: The common ground we could explore together remains an 629 00:32:11,236 --> 00:32:14,236 Speaker 5: undiscovered country. But we don't have to give in to this. 630 00:32:14,796 --> 00:32:18,076 Speaker 5: The data about what people really want is more accurate 631 00:32:18,116 --> 00:32:22,236 Speaker 5: than our assumptions, and more hopeful too. By opening our 632 00:32:22,276 --> 00:32:26,036 Speaker 5: minds and paying closer attention, we can rediscover the things 633 00:32:26,116 --> 00:32:30,196 Speaker 5: most of us want and maybe even start to mend division. 634 00:32:31,076 --> 00:32:33,876 Speaker 5: And that new approach probably won't hurt our happiness either. 635 00:32:35,156 --> 00:32:37,836 Speaker 2: In our next episode, in this special season on Finding Hope, 636 00:32:37,996 --> 00:32:40,276 Speaker 2: we'll build on what you've heard so far and talk 637 00:32:40,316 --> 00:32:42,756 Speaker 2: to people who put their cynicism aside to work with 638 00:32:42,836 --> 00:32:45,716 Speaker 2: others to fix the problems our society is facing, and 639 00:32:45,716 --> 00:32:49,276 Speaker 2: in doing so made their communities and themselves a lot happier. 640 00:32:49,556 --> 00:32:51,756 Speaker 7: There was no guarantee we were going to be successful. 641 00:32:51,836 --> 00:32:54,076 Speaker 3: If anything, everybody told us all the reasons 642 00:32:53,756 --> 00:32:56,476 Speaker 7: we would fail, and yet we 643 00:32:56,316 --> 00:32:57,556 Speaker 3: were willing to try anyways. 644 00:32:57,996 --> 00:33:00,396 Speaker 2: All that next time on the Happiness Lab with 645 00:33:00,796 --> 00:33:03,876 Speaker 2: doctor Laurie Santos and me, doctor Jamil Zaki. 646 00:33:09,196 --> 00:33:09,276 Speaker 5: Yes,