Speaker 1: Pushkin.

I'm not a groupie person, really, at all. I don't really care about sports. I don't really care about teams. I did not care about baseball at all. I was really only there because I was so smitten with him.

Mina Cikara is telling me about an early date she had with her husband, Carrie. Carrie is an avid Boston Red Sox fan, so they decided to check out a game against the Sox's arch rivals, the Yankees, and they did so in enemy territory: Yankee Stadium. He is a little bit risk seeking, and so he decided to wear his Red Sox hat to this game. Even just getting off at the subway station, there was already a bunch of jokes and ribbing and "look at this guy." And what was really interesting to me, though, was that it seemed very good-hearted at first, right? It was just sort of people acting out the script. But as the game wore on, the score got closer and closer, and the entire stadium grew tense. The banter stopped being playful, and Carrie wasn't reacting well.
So at some point I decided, I'm going to take the hat from him, because I don't want him to get into a fist fight here. But I didn't have anywhere to put this hat, so I put it on my own head, thinking: I'm not a fan of either team, I really couldn't care less about this, no one's gonna talk to me, no one's gonna say anything. And I couldn't have been more wrong about that assumption. The second I put on the hat, people started calling me names. They started telling me about my mother. They started saying all manner of horrible things about me.

But there was a second assumption Mina was wrong about. She thought that if she got taunted, it wouldn't bother her. But within minutes of putting on that Red Sox cap, Mina found herself screaming at Yankee fans in the seats all around her.

My husband actually had to put himself between me and this other guy. And this experience gave me this incredible insight, which was that by virtue of marking myself as a member of Red Sox Nation, I started to get treated that way.
And once I started to receive that treatment, I started to react on behalf of Red Sox Nation.

Mina had unwittingly fallen prey to some of the most dangerous forces in all of human nature: our intergroup biases.

So it's that shift that happens when you approach an idea or an interaction not through the lens of me and you, but rather through the lens of us and them.

As humans, we naturally divide the world into in-groups and out-groups, and not just on sports fields; we also do so across political, ethnic, racial, national, and ideological lines. The bonding we get from being part of a group can sometimes feel good. It can make us feel connected, like we're part of something bigger than ourselves. But our partisan urges can also cause us to feel pretty miserable. They can steal opportunities to make meaningful connections with people who are different from us. They can make us feel angry at the other side and cause us to engage in nasty, sometimes even violent behaviors. And our tendency toward us-versus-them thinking has even led to much worse outcomes than a dip in our personal happiness.
These urges hinder important progress in politics. They can fuel lethal racist violence, deadly ethnic conflicts, and some of the worst atrocities human history has ever seen. By some counts, over two hundred million civilians — not soldiers, civilians — perished in the last century as a result of large-scale group violence. So, in a time when our society is feeling more divided than ever, what can we do to avoid all the anger and bitterness? What can we do to fight the intergroup biases that lead to so much unhappiness? These are the very, very hard issues that we'll try to tackle scientifically in this season's final two episodes of The Happiness Lab.

Our minds are constantly telling us what to do to be happy. But what if our minds are wrong? What if our minds are lying to us, leading us away from what will really make us happy? The good news is that understanding the science of the mind can point us all back in the right direction. You're listening to The Happiness Lab with Doctor Laurie Santos.
We left Mina's story with her screaming uncontrollably, having to be restrained by her boyfriend from squaring off with people she'd never met, her passions inflamed by a game she had zero interest in just hours before. It may sound crazy that one little Red Sox hat could cause all that trouble, but it happened, and it was actually the impetus for my dissertation.

Mina is now a professor of psychology at Harvard University. She's become a world expert on the neural underpinnings of our intergroup biases. As we got to talking, I learned that Mina's unfortunate foray into Red Sox fandom wasn't the first time she'd seen the dark side of our us-versus-them thinking.

My dad is Serbian, my mom is Bosnian, and my entire family is from the former Yugoslavia.

Mina and her parents moved to the United States in the nineteen eighties, but her extended family stayed in the former Yugoslavia as her homeland descended into bloody civil war.
You know, I would hear these harrowing stories of people who had been neighbors for decades, who had raised their children together. They had been friends, they had been in each other's weddings. And basically, when things took a turn politically, they turned on each other and murdered one another, or tried to murder one another. And this was really, I mean, horrifying, but also really fascinating to me, in part because what it revealed to me was how quickly these dynamics can change.

The speed with which our group instincts can lead to all-out violence is especially shocking because, generally speaking, humans really don't like doing bad things to one another.

Laurie, did you punch somebody today?

I have not punched somebody today. It's been a good day.

Have you punched anyone in the last month?

No, in fact.

Okay, how about the last year?

Actually, no. I'm kind of a non-puncher most of the time.

Yeah, right. And what's really interesting about this is that there seem to be very strong moral prohibitions against harm that guide most people's behavior most of the time.
Decades and decades of psychological research show that we really don't like doing mean stuff. One study by the neuroscientist Molly Crockett found that participants were more reluctant to administer an electric shock to a stranger than they were to shock themselves. Harvard psychologist Fiery Cushman found that people even get queasy when they pretend to harm other people. He had his subjects point a toy gun in someone's face or smash the skull of a realistic-looking plastic baby. Cushman found that even though people knew these mean actions were fake, they still showed a strong physiological reaction to doing them. They had an increased heart rate and other bodily signs of arousal, and participants showed these physiological reactions more when performing the fake actions themselves versus watching a similar action being performed by somebody else. We just hate doing mean stuff. And yet every day we see videos of people doing violent things to strangers who've done them no harm.
So that's this puzzle, and what it suggests is that there have to be certain preconditions, or certain factors in place, in order for people to overcome this aversion to harm.

Mina has spent the last decade studying what leads people down this awful road toward actively wanting to hurt members of other groups. She's found that the first step is what psychologists have christened the intergroup empathy gap. Normally, we feel sad when others are sad, and a bit of empathic pain when others get hurt. But not always.

It wouldn't make sense for us to empathize with all people all the time, right? If we really felt the weight of every person in the world, we'd never get out of bed. And it turns out that more and more evidence indicates that these failures of empathy are particularly likely when targets are socially distant, so when they belong to other social or ethnic groups.

Tons of studies show that we literally don't experience the pain of out-group members the same way we do for people in our own group.
There's evidence, for example, that white doctors are less likely to prescribe pain medication to their black patients, and even when such pain reducers are prescribed, they're given in lower quantities. Our indifference to other groups' pain means that even doctors can inadvertently cause people who are unlike them to suffer more than is necessary. But an absence of empathy still doesn't mean that we're cool with knowingly causing bad things to happen to other people. To do that, we need to take the next troubling step on that dangerous intergroup path, one that's summed up by a German word: schadenfreude, literally "harm joy."

Schadenfreude really specifically refers to the malicious pleasure that people feel when they see another person suffering. Oftentimes, if you just don't like someone, if you perceive that someone has acted in an unjust way, if they're deserving, or if you envied them, these would all be precursors for feeling pleasure when you saw that person suffer misfortune.

But Mina wanted to know whether schadenfreude could also be at the root of between-group conflicts too.
Would it be enough just to know that someone came from a different group in order to engender this kind of aggressive, malicious pleasure?

Human history has shown us that it's surprisingly easy to get people to feel schadenfreude toward out-group members.

Sports is this great microcosm in which to study these dynamics, because people not only feel allowed but emboldened to say really horrible things about the out-group. So what we realized is that if we could just tap into individuals who identified as sports fans, then maybe we could get some honest responding when we just asked them: how good does it make you feel to see this bad thing happen to these other folks?

Mina recruited, of course, Red Sox and Yankees fans. She stuck them inside a brain scanner and showed them cartoon versions of baseball plays involving lots of different teams. But on the critical trials, the ones Mina was really interested in, they got to watch good and bad things happen to their rivals.
What we found was that watching your rival fail engaged several different brain regions, but the only one that was associated with just how much pleasure participants said they felt was this region called the ventral striatum.

The ventral striatum is part of our brain's reward circuit, but this region doesn't just register, "Hey, that event felt really good." It's also critically involved in learning. That means that when the ventral striatum is activated, it helps us decide how we personally should behave in the future. It notes when there's a surprising positive event in the environment and says, "Okay, let's come back to taking the action that brought that event about, because that's going to be the thing that's rewarding in the future." Mina's Red Sox fans were starting to make a mental connection between that great schadenfreude feeling and the possibility that they could use their own personal behaviors to cause that nice feeling again, perhaps by actively harming someone else.
Those participants who exhibited that much more ventral striatal activation in response to watching their rival fail were the same people who told me two weeks later that they would be that much more likely to heckle, hit, and insult a rival fan. So that, for me, established this suggestive link between the sort of pleasure of watching out-group failure or harm and, potentially, the likelihood that it was related to your own desire to become the agent of harm in other circumstances.

And these awful dynamics don't just play out on sports fields. The same processes are at least partially at work in settings where people engage in more large-scale violence. Think ethnic cleansing like Mina's family experienced in Yugoslavia, or hate crimes against marginalized groups, or the long legacy of lethal violence that law enforcement personnel have inflicted on black people in the United States.
The processes Mina has observed in her baseball fans likely contribute to the many acts of racist violence we see in the news shockingly often, especially in cases where there are structural features in place that help inflame our sense of competition and increase our fear of the other side. Now, understanding the processes that lead to these violent acts of course doesn't excuse them; I want to be super clear on that point. But Mina's work is incredibly important here because it shows just how easily situational and structural factors can lead otherwise non-violent people toward brutal actions.

To Mina, the most striking thing about schadenfreude is not that it happens, but how quickly we can shift from empathy or indifference to taking pleasure in other people's pain.

And there are cases in which out-group harm appears to be driven by the sheer hedonic benefit. It just feels good. And I find that totally fascinating.
Now, there are lots and lots of structural changes needed to stop the large-scale intergroup violence we see all over the world, but Mina's work suggests that we might also be able to intervene psychologically to curb at least some parts of these often lethal downward cycles. And as is often the case in the Happiness Lab, part of the solution might involve recognizing the mistakes our minds are making all the time.

I've been talking a lot about how competition is doing quite a bit of work in these contexts, right? And I think that a huge part of conflict escalation is actually a mistake that we make in intergroup contexts, which is that we don't deal with the person in front of us. Instead, what we're doing is dealing with some idea, some model, some stereotype of who they are.

When we get back from the break, we'll examine how we can turn off our intergroup empathy gap, and do so in a way that can not only make us happier but also holds the promise of helping us make society a kinder and less polarized place. The Happiness Lab will be right back.

I'm very worried.
I think that there are lots of trends that are pushing us away from what is really our truer, natural state, which is to be interconnected with each other. That said, I'm not fatalistic, right? I think that there are things we can do to push back.

This is my friend Jamil Zaki, a professor of psychology at Stanford University. Jamil has just written an important new book called The War for Kindness: Building Empathy in a Fractured World.

It used to be called Choosing Empathy, which now is the title of one of the chapters. I started writing it in twenty fifteen, and I don't know, around late twenty sixteen, early twenty seventeen — I can't quite put my finger on what it was, but something changed in our culture. I felt like things were getting crueler and less connected, and people were getting really exhausted with trying to connect with each other and really embracing social division in a way that I hadn't seen in my adult life.
I felt like I was being a Pollyanna, just writing this kind of positive "hey, you know, you can choose empathy too," when all around me it seemed like this giant tire fire where people just hated each other more than ever. And it felt to me like I needed and wanted to acknowledge that to be empathic, to choose empathy, is a radical choice in today's culture. It is a fight against other forces that are pushing us in the opposite direction.

Jamil wants us to get pissed off at the current polarized state of society and to take up arms for a coming battle, but his war doesn't involve weapons or the usual intergroup bloodshed. Jamil wants us to fight divisiveness and our ever-increasing sense of disconnection. He wants each and every one of us to commit to being kinder to one another. If you've paid any attention to the news in the last few years, you understand that Jamil's war for kindness is becoming more and more of an uphill battle. A growing body of work shows that empathy in general seems to be decreasing over time.
One study presented people with a series of statements and asked them how well each described them, on a scale from one (not at all) to five (fits you perfectly). The statements were things like "I often have tender, concerned feelings for people who are less fortunate than me," and "When I'm upset at someone, I usually try to put myself in their shoes for a while." And what they found was that in nineteen seventy-nine, the average American scored like a four out of five, which sounds not terrible, maybe a B. But by two thousand and nine, the average American had dropped down to a three point five out of five. So, to put that in perspective, the average American in two thousand and nine was less empathic by this measure than seventy-five percent of Americans just thirty years before. This rising level of disconnection means that more and more of us are missing out on a potential boost to our well-being.

It's surprising to a lot of people that empathy is good for us, the empathizer, right?
We typically think of it almost like a transfer, like I give up my money or time or emotional peace in order to help you have more of it. It's sort of the quintessential act of self-sacrifice. It turns out, though, that the data point almost exactly in the opposite direction: caring for others is one of the most important ways we can care for ourselves.

People who experience a lot of empathy also tend to be happier, less stressed, and experience less depression. They find it easier to make new friends and to maintain important relationships, like their marriages. Seventh graders who are able to understand what others feel are also better able to survive seventh grade, which is not easy, if my recollection serves. The false intuition that empathic work reduces our happiness is hard to shake. Jamil saw this himself when he taught a class at Stanford called Becoming Kinder.

So every weekend I would give students these kindness challenges, these little practical assignments meant to help push them to empathize more, and one of the very first ones that we did was "spend on someone else."
So in a moment when 321 00:18:21,276 --> 00:18:24,996 Speaker 1: you don't feel like you have enough time or energy 322 00:18:25,476 --> 00:18:28,276 Speaker 1: for yourself, do the thing that doesn't come naturally to 323 00:18:28,396 --> 00:18:31,236 Speaker 1: you and help someone else instead. And the students were 324 00:18:31,316 --> 00:18:33,676 Speaker 1: really worried about this because it was midterm season. It 325 00:18:33,716 --> 00:18:36,996 Speaker 1: feels like it's always midterm season. It's always been midterm season, 326 00:18:37,276 --> 00:18:40,476 Speaker 1: it's somehow always midterm season. But they were freaked out. 327 00:18:40,596 --> 00:18:43,316 Speaker 1: They were overwhelmed, and they thought, God, I don't have 328 00:18:43,356 --> 00:18:46,916 Speaker 1: the time to do stuff for other people. And reliably, 329 00:18:47,076 --> 00:18:49,556 Speaker 1: they came back from that challenge feeling like, I was 330 00:18:49,796 --> 00:18:53,156 Speaker 1: shocked, because after I helped someone else, I didn't feel depleted. 331 00:18:53,716 --> 00:18:56,636 Speaker 1: I felt energized. I kind of felt like, if I 332 00:18:56,716 --> 00:18:59,116 Speaker 1: can do for someone else, then I must be doing 333 00:18:59,196 --> 00:19:02,596 Speaker 1: okay myself. When we don't take actions that could make 334 00:19:02,636 --> 00:19:05,276 Speaker 1: the people around us feel better, when we don't check 335 00:19:05,316 --> 00:19:07,196 Speaker 1: in with friends or notice if a co-worker is 336 00:19:07,236 --> 00:19:10,116 Speaker 1: in pain, or stop to aid a stranger, it means 337 00:19:10,316 --> 00:19:13,396 Speaker 1: we're each contributing to that tire fire culture Jamil talked 338 00:19:13,396 --> 00:19:17,036 Speaker 1: about earlier. But Jamil's work shows that doesn't have to 339 00:19:17,116 --> 00:19:20,196 Speaker 1: be the case.
Another trick that our minds play on 340 00:19:20,316 --> 00:19:23,876 Speaker 1: us, and that our culture plays on us, is convincing 341 00:19:23,996 --> 00:19:26,316 Speaker 1: us that we can't change. I think there's this big 342 00:19:26,436 --> 00:19:30,116 Speaker 1: stereotype that some people are empathic, some people are not, 343 00:19:30,836 --> 00:19:33,516 Speaker 1: and whatever level of empathy you have, it's like your 344 00:19:33,516 --> 00:19:36,516 Speaker 1: adult height or your eye color, you'll have it for life. 345 00:19:37,116 --> 00:19:39,716 Speaker 1: But I think that the evidence actually again points in 346 00:19:39,836 --> 00:19:44,196 Speaker 1: the opposite direction. There are things we can do to 347 00:19:44,356 --> 00:19:47,556 Speaker 1: push back. I mean, the fact that empathy has declined 348 00:19:47,876 --> 00:19:50,996 Speaker 1: so much in the last thirty years means that it's malleable. 349 00:19:51,756 --> 00:19:55,076 Speaker 1: Things that go down can come up. And I think 350 00:19:55,156 --> 00:19:57,316 Speaker 1: that one of the first things that I want people 351 00:19:57,356 --> 00:20:01,596 Speaker 1: to understand is that empathy is under our control more 352 00:20:01,676 --> 00:20:05,236 Speaker 1: than we realize. There are specific strategies each of us 353 00:20:05,276 --> 00:20:08,236 Speaker 1: can use today to increase our empathy, and not just 354 00:20:08,356 --> 00:20:11,116 Speaker 1: in a parochial way where we extend kindness only to 355 00:20:11,196 --> 00:20:13,756 Speaker 1: the people who are like us.
The science says we 356 00:20:13,836 --> 00:20:16,876 Speaker 1: can turn up compassion to fight the dangerous intergroup empathy 357 00:20:16,956 --> 00:20:19,556 Speaker 1: gaps that plague our culture, and that we may even 358 00:20:19,596 --> 00:20:22,076 Speaker 1: be able to use some of these strategies to start 359 00:20:22,156 --> 00:20:25,756 Speaker 1: reducing the biggest and most painful divides in society. We'll 360 00:20:25,796 --> 00:20:29,556 Speaker 1: examine all these exciting possibilities when The Happiness Lab returns 361 00:20:29,636 --> 00:20:43,516 Speaker 1: in a moment. Oh, and welcome back, everybody. It turns out that, in fact, empathy 362 00:20:43,796 --> 00:20:45,996 Speaker 1: is like a skill, and there are lots of things 363 00:20:46,036 --> 00:20:49,436 Speaker 1: that we can do to cultivate empathy in ourselves and others. 364 00:20:49,716 --> 00:20:52,556 Speaker 1: When Jamil taught his Becoming Kinder class, he gave his 365 00:20:52,636 --> 00:20:56,236 Speaker 1: students a super hard assignment, an empathy challenge he christened 366 00:20:56,596 --> 00:20:59,396 Speaker 1: Disagreeing Better. You can check 367 00:20:59,396 --> 00:21:01,556 Speaker 1: out a version on his website. Find someone with whom 368 00:21:01,636 --> 00:21:05,356 Speaker 1: you have an ideological difference of opinion. But then, instead 369 00:21:05,396 --> 00:21:07,676 Speaker 1: of yelling at each other, or judging each other, or 370 00:21:07,716 --> 00:21:11,476 Speaker 1: even debating, I want you to try to cultivate curiosity 371 00:21:11,596 --> 00:21:15,076 Speaker 1: about each other. Ask this person how they came to 372 00:21:15,236 --> 00:21:18,196 Speaker 1: have their opinion in the first place, and share with 373 00:21:18,316 --> 00:21:20,796 Speaker 1: them the story of how you came to have your 374 00:21:20,836 --> 00:21:24,636 Speaker 1: opinion in the first place.
Students embarked on hard conversations 375 00:21:24,836 --> 00:21:29,236 Speaker 1: with racist-Facebook-posting uncles and frank discussions about sexuality 376 00:21:29,396 --> 00:21:32,676 Speaker 1: with their less-than-progressive parents. They predicted that these 377 00:21:32,716 --> 00:21:36,116 Speaker 1: exchanges would end in frustration or even tears, but in 378 00:21:36,236 --> 00:21:40,636 Speaker 1: nearly all cases those story-sharing conversations went better than expected. 379 00:21:41,276 --> 00:21:45,556 Speaker 1: When you start with narratives, instead of either calling people 380 00:21:45,596 --> 00:21:49,716 Speaker 1: out or saying how wrong they are, you get to 381 00:21:49,876 --> 00:21:53,236 Speaker 1: a new type of discussion right away, one in which 382 00:21:53,756 --> 00:21:57,116 Speaker 1: it actually doesn't matter as much if you would agree 383 00:21:57,196 --> 00:22:01,116 Speaker 1: on every point. But something just as important, or maybe 384 00:22:01,156 --> 00:22:04,436 Speaker 1: even more important, happens, which is that they grow to 385 00:22:04,516 --> 00:22:09,476 Speaker 1: appreciate that people they disagree with are not necessarily bad people. 386 00:22:11,076 --> 00:22:14,036 Speaker 1: They're just people with different stories than their own. No 387 00:22:14,156 --> 00:22:18,276 Speaker 1: one owes anyone empathy, especially if they're expressing bigoted viewpoints. 388 00:22:19,076 --> 00:22:22,316 Speaker 1: But Jamil says his students were still surprisingly grateful for 389 00:22:22,436 --> 00:22:24,956 Speaker 1: having been given the challenge, because it taught them that 390 00:22:25,116 --> 00:22:30,636 Speaker 1: making connections across seemingly unbridgeable divides is actually possible. But 391 00:22:30,836 --> 00:22:33,356 Speaker 1: you might be saying this is just an anecdote from 392 00:22:33,396 --> 00:22:36,516 Speaker 1: one college class of students talking to their family members.
393 00:22:37,356 --> 00:22:40,516 Speaker 1: Is there scientific evidence that sharing stories and empathic work 394 00:22:40,596 --> 00:22:44,156 Speaker 1: like this really does the job? Can connecting over shared 395 00:22:44,196 --> 00:22:47,836 Speaker 1: experiences actually reduce the intergroup disconnection we see all 396 00:22:47,876 --> 00:22:54,636 Speaker 1: over the world? Did you vote to allow gay and 397 00:22:54,756 --> 00:22:57,076 Speaker 1: lesbian couples to continue to marry, or did you vote 398 00:22:57,116 --> 00:22:59,356 Speaker 1: to ban gay and lesbian couples from being able 399 00:22:59,396 --> 00:23:05,196 Speaker 1: to marry? Typical political canvassing involves knocking on someone's door 400 00:23:05,356 --> 00:23:08,596 Speaker 1: and launching into a one-way conversation filled with facts, figures, 401 00:23:08,636 --> 00:23:12,756 Speaker 1: and strong arguments. This style of canvassing doesn't really work, 402 00:23:13,476 --> 00:23:17,116 Speaker 1: especially when the usual political partisanship tightens its grip. My 403 00:23:17,236 --> 00:23:20,516 Speaker 1: name's Josh Kalla. I'm an assistant professor of political science 404 00:23:20,556 --> 00:23:23,796 Speaker 1: and data science at Yale University. People just tune it out. 405 00:23:23,956 --> 00:23:27,236 Speaker 1: They won't engage, they won't pay attention, or they'll argue 406 00:23:27,236 --> 00:23:30,516 Speaker 1: against it. Most people are just so consistently Democrat or 407 00:23:30,556 --> 00:23:34,436 Speaker 1: so consistently Republican, there's often not much room to 408 00:23:34,596 --> 00:23:37,076 Speaker 1: change someone's mind there.
But Josh and his colleagues have 409 00:23:37,196 --> 00:23:40,116 Speaker 1: begun studying the effectiveness of a new kind of canvassing, 410 00:23:40,756 --> 00:23:43,596 Speaker 1: one that can break down our intergroup blinders, and 411 00:23:43,796 --> 00:23:46,316 Speaker 1: one that also employs a lot of the same empathic 412 00:23:46,436 --> 00:23:49,716 Speaker 1: practices that Jamil and his students used in that Disagreeing 413 00:23:49,796 --> 00:23:53,356 Speaker 1: Better assignment. Deep canvassing is a longer form of canvassing 414 00:23:53,556 --> 00:23:58,356 Speaker 1: that really involves sharing personal narratives about an issue. It 415 00:23:58,516 --> 00:24:02,436 Speaker 1: often involves a canvasser sharing a moment in their life 416 00:24:02,676 --> 00:24:06,036 Speaker 1: that is somehow relevant to the issue that's being studied. 417 00:24:06,716 --> 00:24:09,516 Speaker 1: I'm a gay guy. I doubt that totally shocks you, 418 00:24:09,876 --> 00:24:12,796 Speaker 1: and I was in a relationship for eighteen years. Often 419 00:24:12,836 --> 00:24:16,916 Speaker 1: those stories then prompt the voter to share their own 420 00:24:16,956 --> 00:24:21,276 Speaker 1: story. When my wife died or whatever, it broke my heart. Well, no, 421 00:24:21,396 --> 00:24:23,076 Speaker 1: it didn't break my heart. Put a hole in it, 422 00:24:23,356 --> 00:24:25,676 Speaker 1: and it won't ever be whole. Yeah, it sounds like 423 00:24:25,836 --> 00:24:29,116 Speaker 1: marriage is incredibly important to you. I was married forty 424 00:24:29,116 --> 00:24:31,196 Speaker 1: seven years, so that should tell you.
But what we find 425 00:24:31,316 --> 00:24:35,356 Speaker 1: is that by talking through these stories, the type of 426 00:24:35,396 --> 00:24:38,756 Speaker 1: discrimination that we're trying to reduce becomes much more concrete, 427 00:24:39,196 --> 00:24:42,876 Speaker 1: and it also reduces a lot of fears that the 428 00:24:42,996 --> 00:24:45,716 Speaker 1: voter has towards that out-group. It helps them understand 429 00:24:46,156 --> 00:24:48,796 Speaker 1: what does this word transgender actually mean? Or what does 430 00:24:48,836 --> 00:24:51,916 Speaker 1: this word undocumented immigrant actually mean? By walking through the 431 00:24:51,996 --> 00:24:55,116 Speaker 1: life of an undocumented immigrant or a transgender person through 432 00:24:55,196 --> 00:24:59,556 Speaker 1: hearing their story. Hearing the canvasser's stories tends to shut 433 00:24:59,596 --> 00:25:02,716 Speaker 1: off the indifference we typically feel for people outside our tribe. 434 00:25:03,516 --> 00:25:06,476 Speaker 1: Deep canvassing also forces the listener to see people from 435 00:25:06,556 --> 00:25:10,916 Speaker 1: unfamiliar groups and with unfamiliar views as people, and that 436 00:25:11,036 --> 00:25:15,036 Speaker 1: empathic boost allows deep canvassers to do something that billions 437 00:25:15,076 --> 00:25:18,916 Speaker 1: of dollars of political ads can't. They actually change people's 438 00:25:18,956 --> 00:25:22,396 Speaker 1: minds about controversial political issues. You know this issue is 439 00:25:22,436 --> 00:25:24,436 Speaker 1: going to come up for a vote again in the future, 440 00:25:26,836 --> 00:25:29,196 Speaker 1: vote in favor of allowing gay and lesbian couples to 441 00:25:29,276 --> 00:25:33,236 Speaker 1: marry? Good.
Josh's careful field research has found that deep 442 00:25:33,276 --> 00:25:36,596 Speaker 1: canvassing gets about five to seven new supporters for every 443 00:25:36,716 --> 00:25:39,556 Speaker 1: hundred people they talk to. Now that might not sound 444 00:25:39,596 --> 00:25:42,396 Speaker 1: like much, but for ballot measures that are typically won 445 00:25:42,436 --> 00:25:45,996 Speaker 1: or lost by less than five percentage points, deep canvassing 446 00:25:46,116 --> 00:25:48,836 Speaker 1: can make or break the adoption of a progressive new law. 447 00:25:49,396 --> 00:25:53,076 Speaker 1: But what's most impressive about Josh's deep canvassing findings is 448 00:25:53,116 --> 00:25:57,356 Speaker 1: that these persuasion effects last a long, long time. We'll 449 00:25:57,556 --> 00:26:00,316 Speaker 1: keep doing those follow-up surveys two, three, four or 450 00:26:00,316 --> 00:26:03,996 Speaker 1: five months later, and typically we run out of money 451 00:26:04,036 --> 00:26:08,596 Speaker 1: to run more surveys before the effects dissipate. But aside 452 00:26:08,596 --> 00:26:11,436 Speaker 1: from the policy wins, Josh has also seen how powerful 453 00:26:11,476 --> 00:26:15,596 Speaker 1: deep canvassing can be in empathically connecting people from different identities. 454 00:26:16,356 --> 00:26:19,556 Speaker 1: One of his favorite examples comes from an encounter in Florida, 455 00:26:19,916 --> 00:26:23,236 Speaker 1: during a campaign for trans rights. There was a transgender canvasser, 456 00:26:23,396 --> 00:26:26,116 Speaker 1: and he shows up to a house that has a 457 00:26:26,196 --> 00:26:29,916 Speaker 1: big American flag and a pickup truck, and he's really 458 00:26:29,956 --> 00:26:33,356 Speaker 1: bracing himself for what he expects to be a really 459 00:26:33,436 --> 00:26:37,876 Speaker 1: difficult conversation.
The canvasser shared his story anyway. He described 460 00:26:37,876 --> 00:26:40,596 Speaker 1: the kinds of prejudice and misunderstanding he faces on a 461 00:26:40,716 --> 00:26:43,916 Speaker 1: near daily basis. He even shared a story about being 462 00:26:43,956 --> 00:26:45,996 Speaker 1: made fun of and called an animal on the New 463 00:26:46,076 --> 00:26:50,716 Speaker 1: York subway. After sharing this story of personal discrimination, he 464 00:26:50,836 --> 00:26:55,436 Speaker 1: asked the person he's canvassing, this white, macho, pickup- 465 00:26:55,516 --> 00:26:58,476 Speaker 1: truck-driving, American-flag guy, have you ever faced 466 00:26:58,516 --> 00:27:01,476 Speaker 1: anything like that? The guy he's canvassing really pauses for 467 00:27:01,556 --> 00:27:04,116 Speaker 1: a minute and says, the experiences that you face, the 468 00:27:04,156 --> 00:27:07,276 Speaker 1: discrimination that you face, and people's lack of empathy and understanding, 469 00:27:07,356 --> 00:27:11,076 Speaker 1: it's not that different than the experiences that I face. 470 00:27:11,356 --> 00:27:14,876 Speaker 1: I served in Afghanistan. I did two tours there. I 471 00:27:14,996 --> 00:27:17,396 Speaker 1: came back and I had PTSD. People would look at 472 00:27:17,436 --> 00:27:19,676 Speaker 1: me like I was crazy and they wouldn't understand what 473 00:27:19,876 --> 00:27:21,996 Speaker 1: was going on. And what I love about that story 474 00:27:22,036 --> 00:27:24,596 Speaker 1: is it just questions so many of the assumptions that 475 00:27:24,996 --> 00:27:27,836 Speaker 1: we all make, and shows that again, if we're patient, 476 00:27:27,876 --> 00:27:30,596 Speaker 1: if we question those assumptions, if we're vulnerable and try 477 00:27:30,956 --> 00:27:33,636 Speaker 1: and listen and share our experiences, we can 478 00:27:33,716 --> 00:27:38,196 Speaker 1: be successful at changing minds.
Jamil is a huge fan 479 00:27:38,276 --> 00:27:41,636 Speaker 1: of Josh's deep canvassing work because it provides a wonderful 480 00:27:41,716 --> 00:27:44,396 Speaker 1: example of the central claim in his War for Kindness book, 481 00:27:44,876 --> 00:27:46,996 Speaker 1: that empathy is a skill we can build over time, 482 00:27:47,476 --> 00:27:49,796 Speaker 1: a tool we can use to do some amazing things 483 00:27:50,436 --> 00:27:52,116 Speaker 1: if and when we have the bandwidth to use it. 484 00:27:52,796 --> 00:27:56,676 Speaker 1: I think our culture right now includes a lot of 485 00:27:56,756 --> 00:28:03,036 Speaker 1: threat at home, online, outside. We constantly feel as though 486 00:28:03,116 --> 00:28:07,036 Speaker 1: there are people who threaten our identity, who threaten our 487 00:28:07,156 --> 00:28:10,956 Speaker 1: way of life, who threaten our beliefs, and I'm not 488 00:28:11,036 --> 00:28:14,956 Speaker 1: going to say that that's untrue. And although it's easy 489 00:28:15,036 --> 00:28:18,196 Speaker 1: and natural to engage in, I guess, what you'd 490 00:28:18,316 --> 00:28:22,356 Speaker 1: call call-out culture, right, so just sort of attacking people 491 00:28:22,676 --> 00:28:28,636 Speaker 1: who have toxic or problematic attitudes, I think the hard 492 00:28:29,236 --> 00:28:32,516 Speaker 1: but often very productive thing to do is to be 493 00:28:32,676 --> 00:28:36,476 Speaker 1: the person who takes that first step, who puts their 494 00:28:36,556 --> 00:28:41,516 Speaker 1: guard down and decides to be vulnerable. And Jamil himself 495 00:28:41,596 --> 00:28:44,676 Speaker 1: recognizes just how hard that first step can be. It 496 00:28:44,796 --> 00:28:47,756 Speaker 1: can be really exhausting to try to empathize with people 497 00:28:47,956 --> 00:28:50,676 Speaker 1: who are different from us, especially if they have opinions 498 00:28:51,196 --> 00:28:56,036 Speaker 1: that we might fear or abhor.
I try 499 00:28:56,156 --> 00:28:59,116 Speaker 1: really hard to be an understanding person, and I truly 500 00:28:59,156 --> 00:29:02,116 Speaker 1: believe in the importance of Jamil's battle for kindness. But 501 00:29:02,996 --> 00:29:05,676 Speaker 1: almost every day I see some view online that makes 502 00:29:05,716 --> 00:29:08,796 Speaker 1: me see red. When people seem to be so hateful, 503 00:29:09,116 --> 00:29:11,116 Speaker 1: it's really, really hard for me to see them as 504 00:29:11,156 --> 00:29:15,036 Speaker 1: deserving of my compassion or my emotional energy. I was 505 00:29:15,116 --> 00:29:17,316 Speaker 1: surprised that the guy who literally wrote the book on 506 00:29:17,356 --> 00:29:21,116 Speaker 1: empathy got exactly what I was saying. Trust me, I 507 00:29:21,276 --> 00:29:23,916 Speaker 1: feel that way all the time. I still remember when 508 00:29:23,916 --> 00:29:26,796 Speaker 1: the New York Times had this whole, very sympathetic portrayal 509 00:29:26,916 --> 00:29:30,396 Speaker 1: of a family in Illinois that happened to be Nazis, 510 00:29:30,796 --> 00:29:32,636 Speaker 1: and I remember a detail where they were trying to 511 00:29:32,756 --> 00:29:35,676 Speaker 1: humanize this family by talking about how they cooked their pasta, 512 00:29:35,716 --> 00:29:38,036 Speaker 1: and I just remember thinking, I don't want to hear 513 00:29:38,076 --> 00:29:40,996 Speaker 1: about your Nazi pasta. You know, I don't want to 514 00:29:41,076 --> 00:29:46,396 Speaker 1: humanize you. It's exhausting to connect, and it's especially exhausting 515 00:29:46,476 --> 00:29:48,636 Speaker 1: to connect with people who say things that are awful 516 00:29:49,116 --> 00:29:53,236 Speaker 1: and that don't really deserve a platform.
So I think 517 00:29:53,276 --> 00:29:57,916 Speaker 1: it's perfectly okay for people to think about what they 518 00:29:58,036 --> 00:30:00,716 Speaker 1: have the energy for, what they have the space for, 519 00:30:01,316 --> 00:30:05,836 Speaker 1: and no one should feel like they're obligated to connect 520 00:30:05,956 --> 00:30:10,716 Speaker 1: with or empathize with somebody who's saying awful things. Now, 521 00:30:10,796 --> 00:30:13,316 Speaker 1: no one has to do this, it's not anybody's job. 522 00:30:14,396 --> 00:30:19,316 Speaker 1: But when we do, it's remarkable how powerful that can be, 523 00:30:20,076 --> 00:30:22,996 Speaker 1: because sometimes what you realize is that people on the 524 00:30:23,076 --> 00:30:27,316 Speaker 1: other side are also waiting for a chance to be human. 525 00:30:28,676 --> 00:30:31,836 Speaker 1: Despite the uphill battle, Jamil is optimistic that his war 526 00:30:31,956 --> 00:30:36,236 Speaker 1: for kindness is gaining new recruits. I've received hundreds of 527 00:30:36,356 --> 00:30:40,516 Speaker 1: emails that are something along the lines of, I am 528 00:30:40,636 --> 00:30:43,236 Speaker 1: so fed up with this culture of division. I want 529 00:30:43,356 --> 00:30:46,156 Speaker 1: more empathy in the world. But I'm the only one, 530 00:30:47,076 --> 00:30:49,116 Speaker 1: and I'm almost like, can I put you all in 531 00:30:49,196 --> 00:30:52,076 Speaker 1: a group chat or something? There are, like, so many 532 00:30:52,116 --> 00:30:55,516 Speaker 1: of you. And I think we often feel alone, like 533 00:30:55,796 --> 00:30:59,436 Speaker 1: we are the only ones swimming upstream against a culture 534 00:30:59,476 --> 00:31:04,316 Speaker 1: of hatred and division and isolation.
But I think it's 535 00:31:05,196 --> 00:31:07,876 Speaker 1: really shocking how powerful it can be to take the 536 00:31:07,916 --> 00:31:11,556 Speaker 1: first step, to be that change instead of waiting for it, 537 00:31:12,356 --> 00:31:16,156 Speaker 1: because when we take a step towards listening to others, 538 00:31:16,276 --> 00:31:19,716 Speaker 1: towards being vulnerable with them, oftentimes we find that they're 539 00:31:19,756 --> 00:31:22,636 Speaker 1: ready to do the same thing. And I think if 540 00:31:23,596 --> 00:31:26,556 Speaker 1: each one of us does that, that can change our lives, 541 00:31:26,756 --> 00:31:29,076 Speaker 1: and it can change the lives of people around us, 542 00:31:29,116 --> 00:31:32,756 Speaker 1: maybe even save lives. The thing that really inspires me 543 00:31:32,996 --> 00:31:35,956 Speaker 1: is what could happen if a lot of us did that, 544 00:31:36,796 --> 00:31:39,516 Speaker 1: if most of us did that, because then we wouldn't 545 00:31:39,556 --> 00:31:41,996 Speaker 1: be changing lives one at a time, we'd have the 546 00:31:42,076 --> 00:31:45,916 Speaker 1: chance to actually change our entire culture. I try to 547 00:31:45,956 --> 00:31:49,156 Speaker 1: be an optimistic person, but right now, in twenty twenty, 548 00:31:49,396 --> 00:31:51,556 Speaker 1: the idea that our entire culture will change for the 549 00:31:51,636 --> 00:31:55,276 Speaker 1: better seems a pretty distant hope. So much about how 550 00:31:55,316 --> 00:31:58,396 Speaker 1: our institutions work seems to be wrong, and the flaws 551 00:31:58,436 --> 00:32:02,196 Speaker 1: in these systems lead to prejudice, cruelty, and injustice, and 552 00:32:02,316 --> 00:32:04,436 Speaker 1: lots of people don't seem to realize that the burden 553 00:32:04,476 --> 00:32:07,396 Speaker 1: of all these awful things continues to hurt some marginalized 554 00:32:07,436 --> 00:32:11,036 Speaker 1: groups more than others.
Seeing all this makes me really 555 00:32:11,156 --> 00:32:14,036 Speaker 1: sad and angry at the groups I feel are responsible. 556 00:32:14,596 --> 00:32:17,756 Speaker 1: I hate the injustice. I hate the divisions, and I 557 00:32:17,876 --> 00:32:21,276 Speaker 1: hate the hate. Even on my best days, it's hard 558 00:32:21,316 --> 00:32:24,036 Speaker 1: not to lose hope that I personally can make any 559 00:32:24,116 --> 00:32:27,356 Speaker 1: difference in these historic problems. But I don't want to 560 00:32:27,396 --> 00:32:29,716 Speaker 1: just retreat to my in-group, and I don't want 561 00:32:29,716 --> 00:32:32,116 Speaker 1: to empathize with only people who are exactly like me, 562 00:32:32,956 --> 00:32:35,156 Speaker 1: nor gloat at the pain and misfortune of those who 563 00:32:35,196 --> 00:32:39,036 Speaker 1: hold different views or who've lived different lives. So I'm 564 00:32:39,076 --> 00:32:42,156 Speaker 1: now committing to trying to follow Jamil's advice. I'm going 565 00:32:42,236 --> 00:32:44,676 Speaker 1: to remember that the science shows my intuitions are wrong, 566 00:32:45,276 --> 00:32:47,036 Speaker 1: that if I try to take the first step and 567 00:32:47,116 --> 00:32:49,796 Speaker 1: put my guard down, at least in those cases where 568 00:32:49,836 --> 00:32:52,276 Speaker 1: I have the emotional bandwidth to do so, it might 569 00:32:52,316 --> 00:32:55,556 Speaker 1: be more effective than I think. Rather than only seeing 570 00:32:55,596 --> 00:32:57,756 Speaker 1: someone as a member of an identity I disagree with, 571 00:32:58,356 --> 00:33:01,076 Speaker 1: I'll try to connect a bit better. I'll ask people 572 00:33:01,116 --> 00:33:04,116 Speaker 1: to share their stories, and if they'll listen, I'll share 573 00:33:04,156 --> 00:33:07,356 Speaker 1: my own.
But like Jamil, I also want to make 574 00:33:07,396 --> 00:33:10,196 Speaker 1: sure that all this empathic labor is a bit more 575 00:33:10,276 --> 00:33:14,036 Speaker 1: evenly distributed, that the hard work of deep connection doesn't 576 00:33:14,076 --> 00:33:17,316 Speaker 1: just fall to historically marginalized groups who've long been on 577 00:33:17,396 --> 00:33:20,236 Speaker 1: the receiving side of all the injustice. These are the 578 00:33:20,276 --> 00:33:22,556 Speaker 1: folks who are least likely to have the needed emotional 579 00:33:22,636 --> 00:33:26,156 Speaker 1: bandwidth to make connections. I also want to make sure 580 00:33:26,196 --> 00:33:29,236 Speaker 1: that we're distributing the work of correcting these injustices a 581 00:33:29,356 --> 00:33:32,196 Speaker 1: little more fairly, and that the blind spots of our 582 00:33:32,236 --> 00:33:35,956 Speaker 1: mind don't prevent well-intentioned people like me from inadvertently 583 00:33:36,036 --> 00:33:40,396 Speaker 1: making all those structural inequalities worse. And so when The 584 00:33:40,396 --> 00:33:43,916 Speaker 1: Happiness Lab returns next time, we'll tackle all these issues directly. 585 00:33:44,556 --> 00:33:46,676 Speaker 1: In our next episode, How to Be a Better Ally, 586 00:33:47,196 --> 00:33:49,236 Speaker 1: we'll hear what the science says about how you can 587 00:33:49,316 --> 00:33:53,036 Speaker 1: fight the structures that lead to some of society's worst injustices, 588 00:33:53,676 --> 00:33:55,996 Speaker 1: and how the lies of our mind sometimes cause good 589 00:33:56,036 --> 00:33:59,876 Speaker 1: people to unknowingly make things worse.
We'll also see that 590 00:33:59,996 --> 00:34:03,436 Speaker 1: using evidence-based strategies for becoming a better ally can 591 00:34:03,476 --> 00:34:06,796 Speaker 1: not only boost our own personal well-being, but more importantly, 592 00:34:07,316 --> 00:34:10,276 Speaker 1: can make us more effective and contribute positively to the 593 00:34:10,356 --> 00:34:13,156 Speaker 1: causes we care about most. And so I hope you'll 594 00:34:13,196 --> 00:34:15,796 Speaker 1: return next week to hear the final season two episode 595 00:34:15,836 --> 00:34:28,676 Speaker 1: of The Happiness Lab with me, Doctor Laurie Santos. The 596 00:34:28,756 --> 00:34:31,236 Speaker 1: Happiness Lab is co-written and produced by Ryan Dilley. 597 00:34:31,796 --> 00:34:35,636 Speaker 1: Our original music was composed by Zachary Silver, with additional scoring, 598 00:34:35,756 --> 00:34:39,276 Speaker 1: mixing and mastering by Evan Viola. Pete Naton also helped 599 00:34:39,276 --> 00:34:42,716 Speaker 1: with production. Joseph Fridman checked our facts and our editing 600 00:34:42,836 --> 00:34:45,956 Speaker 1: was done by Sophie Crane McKibben. Special thanks to Mia Lobel, 601 00:34:46,196 --> 00:34:50,836 Speaker 1: Carly Migliori, Heather Fain, Julia Barton, Maggie Taylor, Maya Kani, 602 00:34:51,196 --> 00:34:54,836 Speaker 1: Jacob Weisberg and my agent Ben Davis. The Happiness Lab 603 00:34:54,956 --> 00:34:57,596 Speaker 1: is brought to you by Pushkin Industries and me, Doctor 604 00:34:57,676 --> 00:34:58,396 Speaker 1: Laurie Santos.