1 00:00:04,280 --> 00:00:08,200 Speaker 1: It's springtime in Minneapolis, my hometown. The snow is melting 2 00:00:08,360 --> 00:00:11,240 Speaker 1: and a lot of signage is on display: signs posted on 3 00:00:11,280 --> 00:00:14,000 Speaker 1: people's lawns, taped to their windows, tacked to their doors, 4 00:00:14,520 --> 00:00:16,919 Speaker 1: and one of the most popular reads, hate has no 5 00:00:17,079 --> 00:00:20,120 Speaker 1: home here. Maybe you've seen that sign too. There's an 6 00:00:20,120 --> 00:00:22,720 Speaker 1: online map full of pins where they've been spotted, from 7 00:00:22,760 --> 00:00:26,720 Speaker 1: South Korea to Somalia. That sign campaign, started in two 8 00:00:26,760 --> 00:00:29,920 Speaker 1: thousand seventeen by a group of neighbors in Chicago, means 9 00:00:30,000 --> 00:00:32,839 Speaker 1: to resist bigotry and to extend kindness to people on 10 00:00:32,880 --> 00:00:35,760 Speaker 1: the margins of our society, and the spirit of the 11 00:00:35,840 --> 00:00:40,000 Speaker 1: sign is unimpeachable. But the literalist in me wonders what 12 00:00:40,159 --> 00:00:43,760 Speaker 1: that phrase really means. Like, do the people who live 13 00:00:43,800 --> 00:00:48,160 Speaker 1: in those houses permit themselves to hate a concept like bigotry? 14 00:00:49,159 --> 00:00:51,720 Speaker 1: Late at night over drinks with friends, would they admit 15 00:00:51,760 --> 00:00:55,320 Speaker 1: to hating extremists of the opposing party? Are we allowed 16 00:00:55,320 --> 00:01:00,760 Speaker 1: to hate hate groups? This is Deeply Human, the show 17 00:01:00,800 --> 00:01:03,400 Speaker 1: about why you do what you do. I'm Dessa, host 18 00:01:03,480 --> 00:01:06,800 Speaker 1: and fellow human being, often overwhelmed by the vitriol and 19 00:01:06,920 --> 00:01:09,840 Speaker 1: violence on the news and in the social feeds. So 20 00:01:09,880 --> 00:01:13,880 Speaker 1: the question du jour is how does hurt or anger 21 00:01:14,560 --> 00:01:18,760 Speaker 1: metastasize into hatred? Why do we hate one another? And 22 00:01:18,800 --> 00:01:22,440 Speaker 1: how can we stop? Also, even though the topic is 23 00:01:22,560 --> 00:01:25,640 Speaker 1: necessarily heavy, I promise this isn't a lay-face-down 24 00:01:25,640 --> 00:01:27,560 Speaker 1: on-the-carpet-because-why-even-try-anymore kind 25 00:01:27,600 --> 00:01:30,360 Speaker 1: of episode. Even the dark stuff needs a little light. 26 00:01:34,080 --> 00:01:37,200 Speaker 1: When do kids start saying I hate you? Probably those 27 00:01:37,200 --> 00:01:39,399 Speaker 1: words would come out either just before or just after 28 00:01:39,520 --> 00:01:43,240 Speaker 1: three years old. That is Emma Carlson, who goes by 29 00:01:43,480 --> 00:01:46,479 Speaker 1: M. She's a clinical therapist who works with kids and adolescents. 30 00:01:46,880 --> 00:01:50,360 Speaker 1: The word hate may come out because they don't have 31 00:01:50,400 --> 00:01:53,680 Speaker 1: any other language for it. Their vocabulary is so limited. 32 00:01:54,160 --> 00:01:56,560 Speaker 1: You have a four-year-old? I do. Have you 33 00:01:56,640 --> 00:02:01,160 Speaker 1: heard it? Oh yes. She was very, very ticked off 34 00:02:01,280 --> 00:02:05,080 Speaker 1: that I wouldn't let her jump off of her bed 35 00:02:05,240 --> 00:02:09,080 Speaker 1: onto the floor. She was livid.
You know, my sweet, kind, 36 00:02:09,440 --> 00:02:14,320 Speaker 1: empathic little person just became this ball of rage that 37 00:02:14,400 --> 00:02:17,320 Speaker 1: would cry at the drop of a hat and would 38 00:02:17,480 --> 00:02:21,640 Speaker 1: throw a tantrum over what I would consider next to nothing. 39 00:02:25,160 --> 00:02:28,600 Speaker 1: Such big emotions coming out of this tiny, tiny body, 40 00:02:28,760 --> 00:02:32,800 Speaker 1: and she would go, I hate you. And then, like, 41 00:02:32,840 --> 00:02:34,920 Speaker 1: does the clinical part of you know, like, ah, this 42 00:02:35,000 --> 00:02:37,600 Speaker 1: is just simply, you know, a child struggling with verbal 43 00:02:37,639 --> 00:02:41,200 Speaker 1: skills to express frustration? Absolutely, and like my head knows that, 44 00:02:41,720 --> 00:02:45,920 Speaker 1: but my heart was breaking. Everything that I had, you know, 45 00:02:45,960 --> 00:02:48,560 Speaker 1: thought I had dealt with about my relationship with my 46 00:02:48,600 --> 00:02:54,440 Speaker 1: mom just became completely unearthed. And it was the catastrophic 47 00:02:54,480 --> 00:02:57,080 Speaker 1: thinking of, Okay, she hates me. Now it is just 48 00:02:57,200 --> 00:03:01,280 Speaker 1: going to get worse. The thing is, M does hate 49 00:03:01,280 --> 00:03:03,680 Speaker 1: her own mom, or at least she did for a 50 00:03:03,680 --> 00:03:06,240 Speaker 1: big part of her life, and so hearing that word 51 00:03:06,240 --> 00:03:09,880 Speaker 1: from her daughter delivered more than the standard jolt of pain. 52 00:03:11,480 --> 00:03:13,560 Speaker 1: We'll come back to M and her trials with her 53 00:03:13,560 --> 00:03:19,520 Speaker 1: own mom, but let's switch gears for a minute. Most 54 00:03:19,600 --> 00:03:22,520 Speaker 1: of us, if we're lucky, don't feel hatred towards members 55 00:03:22,560 --> 00:03:25,320 Speaker 1: of our immediate family. The place we might be most 56 00:03:25,480 --> 00:03:29,880 Speaker 1: likely to encounter hatred is online, and you might be thinking, well, 57 00:03:30,120 --> 00:03:32,400 Speaker 1: on social media, haters gonna hate. That's just part of 58 00:03:32,440 --> 00:03:36,840 Speaker 1: the deal. But global agencies are taking it seriously. The 59 00:03:36,920 --> 00:03:39,760 Speaker 1: United Nations has called for the regulation of hate speech on 60 00:03:39,800 --> 00:03:46,880 Speaker 1: social media. For the recipient, hate online feels like bedbugs 61 00:03:46,960 --> 00:03:50,600 Speaker 1: of the mind, and when you are not around them, 62 00:03:50,640 --> 00:03:53,240 Speaker 1: you kind of still feel them crawling in your head. 63 00:03:54,000 --> 00:03:58,000 Speaker 1: That's Dylan Marron, who I met as an actor. He's 64 00:03:58,040 --> 00:04:01,800 Speaker 1: tall and slim, thoughtful and goofy, with a signature 65 00:04:01,800 --> 00:04:04,880 Speaker 1: flop of dark curls. And in the years since we met, 66 00:04:05,080 --> 00:04:08,320 Speaker 1: he's become known as a cultural critic, particularly for a 67 00:04:08,400 --> 00:04:13,920 Speaker 1: series of online videos that championed progressive causes: LGBT rights, 68 00:04:14,440 --> 00:04:18,840 Speaker 1: media representation, feminism. And some of his stuff has gone viral, 69 00:04:19,200 --> 00:04:24,000 Speaker 1: like viral, viral viral. People who say it doesn't exist 70 00:04:24,080 --> 00:04:29,560 Speaker 1: are full of it today.
Um, unboxing, police brutality, the 71 00:04:29,720 --> 00:04:33,799 Speaker 1: two biggest videos, I think, were like fifteen, twenty million. 72 00:04:34,400 --> 00:04:37,279 Speaker 1: The higher his star rose, the spicier the comments 73 00:04:37,279 --> 00:04:40,520 Speaker 1: section got. People who disagreed with the ideas in Dylan's 74 00:04:40,600 --> 00:04:45,160 Speaker 1: videos started writing some really, really foul stuff about him online. 75 00:04:45,720 --> 00:04:48,359 Speaker 1: He created a folder on his computer called the hate 76 00:04:48,400 --> 00:04:55,960 Speaker 1: folder, where he keeps screenshots. Let's see, uh, you are 77 00:04:56,080 --> 00:05:01,160 Speaker 1: cancer. And another person said, please decapitate yourself. I remember 78 00:05:01,200 --> 00:05:04,880 Speaker 1: one person wrote that they wanted to hear the sound 79 00:05:04,880 --> 00:05:08,599 Speaker 1: of my septum crushing under their fist. Was there some 80 00:05:08,600 --> 00:05:11,000 Speaker 1: part of you that celebrated the haters because it meant 81 00:05:11,080 --> 00:05:12,800 Speaker 1: that you were touching a nerve, or that you were doing 82 00:05:12,800 --> 00:05:15,880 Speaker 1: something important? Oh my god. Yeah. The first time that 83 00:05:15,920 --> 00:05:18,600 Speaker 1: I started getting it, I was like, oh my god, 84 00:05:18,960 --> 00:05:23,960 Speaker 1: like I I can't believe like I matter enough to 85 00:05:24,120 --> 00:05:26,839 Speaker 1: be hated. If you're going to speak truth to power, 86 00:05:26,960 --> 00:05:29,880 Speaker 1: you gotta brace for blowback. And for a second it 87 00:05:29,960 --> 00:05:33,160 Speaker 1: was easy to ride pretty high on that buzz of righteousness. 88 00:05:34,320 --> 00:05:37,280 Speaker 1: Both of these things are true at the same time, right? 89 00:05:37,720 --> 00:05:44,760 Speaker 1: It is incredibly psychologically damaging, and also I felt that 90 00:05:44,960 --> 00:05:49,280 Speaker 1: they were proof of my power. You are just like, 91 00:05:49,560 --> 00:05:54,680 Speaker 1: I am the king of the world. Over time, Dylan 92 00:05:54,760 --> 00:05:58,360 Speaker 1: started to believe that, in an important way, social platforms 93 00:05:58,360 --> 00:06:02,159 Speaker 1: were bringing out the very worst in people, including himself. 94 00:06:02,680 --> 00:06:05,640 Speaker 1: The most negative thing you can write, which is also, 95 00:06:06,200 --> 00:06:09,960 Speaker 1: confusingly and dangerously, sometimes the funniest thing that you can write, 96 00:06:10,000 --> 00:06:14,240 Speaker 1: is easiest to upvote, because hyperbole and intensity is 97 00:06:14,240 --> 00:06:19,200 Speaker 1: what cuts through online. Dylan recently wrote a book about 98 00:06:19,200 --> 00:06:22,320 Speaker 1: his experience, and I read it on a plane, nodding 99 00:06:22,360 --> 00:06:25,400 Speaker 1: athletically for much of the ride. As an indie musician, 100 00:06:25,760 --> 00:06:27,760 Speaker 1: I spend a lot of my time online, more than 101 00:06:27,760 --> 00:06:31,520 Speaker 1: I would like, promoting tours or albums, and I've seen 102 00:06:31,560 --> 00:06:34,280 Speaker 1: how a clever diss rises like a hot air balloon 103 00:06:34,320 --> 00:06:37,960 Speaker 1: over more nuanced perspectives.
And I too have screenshotted 104 00:06:38,000 --> 00:06:41,839 Speaker 1: hateful messages where people question my gender or call me 105 00:06:41,920 --> 00:06:44,800 Speaker 1: names you can't say on the radio, and I fixated 106 00:06:44,920 --> 00:06:48,000 Speaker 1: on those words. I've lost sleep over them. I've 107 00:06:48,000 --> 00:06:50,760 Speaker 1: carried mace, afraid somebody might bring that aggression to the 108 00:06:50,760 --> 00:06:54,240 Speaker 1: world outside the screen. And the way that my friends 109 00:06:54,240 --> 00:06:57,080 Speaker 1: tried to comfort me sounds a lot like the way 110 00:06:57,120 --> 00:07:01,279 Speaker 1: that Dylan's friends tried to comfort him. Forget them. They're 111 00:07:01,360 --> 00:07:04,640 Speaker 1: just sad, lonely guys who live in their mother's basement. 112 00:07:04,800 --> 00:07:07,440 Speaker 1: And it's like, well, first of all, if they're sad, 113 00:07:07,839 --> 00:07:10,880 Speaker 1: I too feel sad many times. If they live in 114 00:07:10,920 --> 00:07:14,160 Speaker 1: their mother's basement, I also lived with my mom for 115 00:07:14,200 --> 00:07:16,960 Speaker 1: a long time post college. Like I get it, you know, 116 00:07:17,040 --> 00:07:22,880 Speaker 1: like that's relatable to me. One night, Dylan started clicking 117 00:07:22,880 --> 00:07:25,440 Speaker 1: through the profile of a guy named Josh, who had 118 00:07:25,440 --> 00:07:27,600 Speaker 1: written to tell him that being gay is a sin 119 00:07:28,080 --> 00:07:31,560 Speaker 1: and also that Dylan is a moron. But sifting through 120 00:07:31,640 --> 00:07:35,840 Speaker 1: Josh's posts humanized him for Dylan. It really is like 121 00:07:36,560 --> 00:07:41,200 Speaker 1: sending a hate letter and then paperclipping, you know, 122 00:07:41,480 --> 00:07:45,840 Speaker 1: photos from your family reunion, a partial family tree, and 123 00:07:46,120 --> 00:07:49,880 Speaker 1: your resume as you send it. There was one post 124 00:07:49,920 --> 00:07:52,720 Speaker 1: where he talked about being alone on a Friday night. 125 00:07:53,440 --> 00:07:57,080 Speaker 1: There were so many posts he made about like crying 126 00:07:57,160 --> 00:08:01,000 Speaker 1: at a movie, about feeling alone and wanting to hang 127 00:08:01,040 --> 00:08:05,760 Speaker 1: out with someone, and I just saw myself reflected in him. 128 00:08:05,880 --> 00:08:09,160 Speaker 1: Almost on impulse, Dylan reached out to Josh, who was 129 00:08:09,200 --> 00:08:11,760 Speaker 1: a senior in high school and having a pretty horrible 130 00:08:11,800 --> 00:08:15,000 Speaker 1: time of it. I was just angry about it all. 131 00:08:16,040 --> 00:08:18,480 Speaker 1: It was just, a lot of it was a buildup 132 00:08:18,560 --> 00:08:21,000 Speaker 1: of all your multiple videos you made. Dylan 133 00:08:21,120 --> 00:08:24,640 Speaker 1: offered that he'd been bullied in high school too. Josh 134 00:08:24,920 --> 00:08:27,360 Speaker 1: had a lot of family in law enforcement, and a 135 00:08:27,440 --> 00:08:30,480 Speaker 1: video that Dylan had made about police brutality had pushed 136 00:08:30,480 --> 00:08:33,440 Speaker 1: a tender spot. How do you feel that people like 137 00:08:33,520 --> 00:08:38,240 Speaker 1: you and me can have productive conversations?
I think that 138 00:08:38,679 --> 00:08:40,760 Speaker 1: if you're trying to have a conversation with someone that's 139 00:08:40,800 --> 00:08:45,760 Speaker 1: completely different than you, then take everything away that makes 140 00:08:45,640 --> 00:08:48,880 Speaker 1: us different. The conversation was not a total kumbaya 141 00:08:48,920 --> 00:08:51,840 Speaker 1: full of tearful epiphanies, but it was a conversation, 142 00:08:52,520 --> 00:08:55,200 Speaker 1: and Josh apologized. It was a person who was like, 143 00:08:55,280 --> 00:08:57,760 Speaker 1: this is who I am, and I'm really sorry that 144 00:08:57,800 --> 00:09:00,600 Speaker 1: I hurt you. Dylan went on to have lots and 145 00:09:00,679 --> 00:09:03,960 Speaker 1: lots of conversations with people who have written awful things 146 00:09:04,000 --> 00:09:06,520 Speaker 1: about him, and he just published a book about what 147 00:09:06,559 --> 00:09:09,800 Speaker 1: he's learned. It's called Conversations with People Who Hate Me. 148 00:09:10,240 --> 00:09:12,200 Speaker 1: That's the one I read on the plane. And his 149 00:09:12,320 --> 00:09:16,640 Speaker 1: takeaway from all these exchanges? Our tendency to write people off, 150 00:09:17,000 --> 00:09:21,600 Speaker 1: to dismiss haters as trolls, is just fundamentally flawed. It 151 00:09:21,760 --> 00:09:25,440 Speaker 1: is this fantasy we tell ourselves that the people who 152 00:09:25,480 --> 00:09:30,040 Speaker 1: write negative things online are one type of person, and 153 00:09:30,120 --> 00:09:34,720 Speaker 1: the word troll evokes this image of this distant monster 154 00:09:34,920 --> 00:09:38,000 Speaker 1: who lives under a bridge, and we are the good 155 00:09:38,280 --> 00:09:43,439 Speaker 1: and noble townspeople who are tortured by this monster. As 156 00:09:43,520 --> 00:09:47,240 Speaker 1: Dylan writes, their entire lives are not built around tormenting 157 00:09:47,240 --> 00:09:53,880 Speaker 1: the villagers. On the contrary, they are fellow villagers. We're 158 00:09:53,920 --> 00:09:56,880 Speaker 1: often really eager to recognize other people as haters, but 159 00:09:57,120 --> 00:10:01,199 Speaker 1: real slow to name that impulse in ourselves. That's documented 160 00:10:01,200 --> 00:10:05,000 Speaker 1: in formal psychological surveys too. When asked if they hate anyone, 161 00:10:05,280 --> 00:10:08,880 Speaker 1: most people say no, but a significant number of people 162 00:10:08,920 --> 00:10:12,920 Speaker 1: say they've been subject to hate. However, that pattern doesn't 163 00:10:12,920 --> 00:10:15,840 Speaker 1: hold in communities that have been locked in long-term 164 00:10:16,080 --> 00:10:20,400 Speaker 1: violent conflict. In that context, some people will announce flat 165 00:10:20,440 --> 00:10:24,319 Speaker 1: out that they just hate the other side. I served 166 00:10:24,679 --> 00:10:26,840 Speaker 1: as an officer in a special unit in the 167 00:10:26,880 --> 00:10:30,840 Speaker 1: Israeli Army. It's mandatory in Israel, so everyone is serving 168 00:10:31,080 --> 00:10:35,480 Speaker 1: in the army, and twenty-four years ago I was 169 00:10:35,679 --> 00:10:41,040 Speaker 1: very, very seriously injured in Lebanon in a fight with 170 00:10:41,440 --> 00:10:46,680 Speaker 1: Hezbollah soldiers. I was hospitalized for a very long time, 171 00:10:46,840 --> 00:10:50,679 Speaker 1: for almost four years.
Both of my hands were paralyzed 172 00:10:50,720 --> 00:10:54,240 Speaker 1: for a very long time. They weren't functioning. That is 173 00:10:54,280 --> 00:10:57,920 Speaker 1: Eran Halperin, a professor of social psychology at the Hebrew 174 00:10:58,000 --> 00:11:02,160 Speaker 1: University in Jerusalem in Israel. He now studies the psychology 175 00:11:02,160 --> 00:11:05,559 Speaker 1: at play in long-term intergroup conflicts like the one 176 00:11:05,600 --> 00:11:08,319 Speaker 1: that's plagued the region he was born into, the one 177 00:11:08,360 --> 00:11:12,880 Speaker 1: that's consumed generations of Palestinians and Israelis. And the questions 178 00:11:12,920 --> 00:11:15,720 Speaker 1: that have motivated his life's work are the same ones 179 00:11:15,800 --> 00:11:18,520 Speaker 1: that came to him as he lay injured in a hospital bed. 180 00:11:19,200 --> 00:11:22,600 Speaker 1: You really have a lot of time to think. Do 181 00:11:22,679 --> 00:11:26,199 Speaker 1: we have to be in this situation? There must be 182 00:11:26,280 --> 00:11:28,800 Speaker 1: something that we can do. People don't want to keep 183 00:11:28,800 --> 00:11:32,320 Speaker 1: on hurting each other and killing each other. Why do 184 00:11:32,400 --> 00:11:36,160 Speaker 1: people hate? Eran spends his time trying to find out 185 00:11:36,240 --> 00:11:39,480 Speaker 1: what hate is, exactly how it works, and how it 186 00:11:39,559 --> 00:11:42,720 Speaker 1: might be stopped. I don't see hatred as an extreme 187 00:11:42,880 --> 00:11:46,640 Speaker 1: version of any other emotion. I don't think that hatred 188 00:11:46,679 --> 00:11:49,080 Speaker 1: is an extreme version of dislike. I don't think that 189 00:11:49,080 --> 00:11:52,200 Speaker 1: it's an extreme version of anger. Eran doesn't use the 190 00:11:52,320 --> 00:11:54,760 Speaker 1: term in the same way that we do casually, the 191 00:11:54,800 --> 00:11:59,120 Speaker 1: way we might hate lima beans or the nasal whining 192 00:11:59,160 --> 00:12:02,080 Speaker 1: of a particular pop star. That stuff wouldn't qualify. If 193 00:12:02,120 --> 00:12:05,800 Speaker 1: I feel hate, I don't hate the action that this 194 00:12:06,000 --> 00:12:09,920 Speaker 1: person did. I hate the person itself. They did 195 00:12:10,000 --> 00:12:13,000 Speaker 1: something wrong, and they did it because, you know, this 196 00:12:13,120 --> 00:12:17,960 Speaker 1: is who they are. It's in their nature or character 197 00:12:18,360 --> 00:12:22,040 Speaker 1: or culture, and this can never, can never be changed. 198 00:12:22,400 --> 00:12:25,120 Speaker 1: You might be angry at someone for stealing from you, 199 00:12:25,720 --> 00:12:28,560 Speaker 1: or you might hate them for being an irredeemable thief. 200 00:12:29,400 --> 00:12:32,079 Speaker 1: In Eran's work, to hate another person or group 201 00:12:32,320 --> 00:12:35,280 Speaker 1: is to perceive them as intrinsically bad or evil, a 202 00:12:35,360 --> 00:12:38,280 Speaker 1: threat to you or those you care about. Are there 203 00:12:38,360 --> 00:12:43,080 Speaker 1: certain personal characteristics that make some people more prone to 204 00:12:43,120 --> 00:12:46,640 Speaker 1: hatred than other people?
Definitely, people who hate are people 205 00:12:46,679 --> 00:12:51,840 Speaker 1: that cannot tolerate ambiguity or complexity, and for them it's 206 00:12:51,960 --> 00:12:55,760 Speaker 1: much, much easier, and in many, many ways also addresses 207 00:12:55,800 --> 00:12:59,880 Speaker 1: their, like, psychological needs, to somehow see the world as, 208 00:13:00,200 --> 00:13:03,800 Speaker 1: you know, black versus white, the good people versus the bad people. 209 00:13:06,120 --> 00:13:09,559 Speaker 1: Eran has spent his adult life trying to understand hatred 210 00:13:09,600 --> 00:13:13,839 Speaker 1: in quantifiable terms, analyzing data and looking for trend lines. 211 00:13:14,880 --> 00:13:18,280 Speaker 1: M, the therapist we spoke to earlier, spent many years 212 00:13:18,280 --> 00:13:21,280 Speaker 1: sorting through the messiness of her own personal hate before 213 00:13:21,320 --> 00:13:24,840 Speaker 1: she started her clinical work. The relationship with her 214 00:13:24,880 --> 00:13:28,480 Speaker 1: mother had been difficult for a long time. M's mom 215 00:13:28,520 --> 00:13:30,880 Speaker 1: had berated her since she was little, but there was 216 00:13:30,920 --> 00:13:35,040 Speaker 1: a specific moment when all that strife jelled into something stronger. 217 00:13:38,320 --> 00:13:40,920 Speaker 1: My mom had stayed home from work, which was happening 218 00:13:40,960 --> 00:13:45,760 Speaker 1: more and more frequently, and she was so hungover she 219 00:13:46,120 --> 00:13:48,679 Speaker 1: fell down on the floor of the kitchen and had 220 00:13:48,720 --> 00:13:52,160 Speaker 1: a seizure in front of me. So like, I called 221 00:13:52,200 --> 00:13:57,240 Speaker 1: the paramedics, and I called my dad, and then 222 00:13:57,240 --> 00:13:59,920 Speaker 1: it just never got talked about ever again. M was 223 00:14:00,080 --> 00:14:04,040 Speaker 1: sixteen at the time of the kitchen episode. Nobody checked 224 00:14:04,080 --> 00:14:06,199 Speaker 1: up on her to make sure she was okay afterwards, 225 00:14:06,960 --> 00:14:11,280 Speaker 1: and that disregard broke some last straw inside her. M 226 00:14:11,320 --> 00:14:14,640 Speaker 1: was not swept up in a tantrum. She realized that 227 00:14:14,720 --> 00:14:18,800 Speaker 1: she just really and truly hated her mother. It was 228 00:14:18,840 --> 00:14:21,840 Speaker 1: easier to put all of those feelings into just kind 229 00:14:21,840 --> 00:14:24,320 Speaker 1: of like a ball and just bury them down and 230 00:14:24,400 --> 00:14:27,880 Speaker 1: hold onto them so tightly. I was so angry. I 231 00:14:28,000 --> 00:14:31,800 Speaker 1: was so angry. M cut ties, and with a very 232 00:14:31,800 --> 00:14:37,120 Speaker 1: sharp knife. Her mom called, clearly in a precarious situation, struggling 233 00:14:37,160 --> 00:14:40,360 Speaker 1: with addiction and her own mental health crises. But M 234 00:14:40,480 --> 00:14:44,400 Speaker 1: was just done. When she got engaged, she decided not 235 00:14:44,440 --> 00:14:46,960 Speaker 1: to invite her mother to the wedding. I clung to 236 00:14:47,040 --> 00:14:50,840 Speaker 1: that hate. Sometimes it was the only thing that kept 237 00:14:50,840 --> 00:15:04,200 Speaker 1: me going. Hatred can be effectively used as fuel, 238 00:15:04,880 --> 00:15:07,080 Speaker 1: and not just by those of us trying to traverse 239 00:15:07,160 --> 00:15:12,000 Speaker 1: thorny and painful personal relationships.
Institutions with global influence have 240 00:15:12,120 --> 00:15:18,480 Speaker 1: tried to harness hate too. There was a fascinating study 241 00:15:18,520 --> 00:15:22,680 Speaker 1: that found that after World War Two, only a tiny 242 00:15:22,760 --> 00:15:26,200 Speaker 1: fraction of soldiers actually fired their guns at the enemy, 243 00:15:26,280 --> 00:15:29,400 Speaker 1: and those that did fire their guns very often intentionally 244 00:15:29,680 --> 00:15:33,440 Speaker 1: missed their targets, their human targets. So after World War Two, 245 00:15:33,440 --> 00:15:36,080 Speaker 1: when it was discovered this was the case, military leaders 246 00:15:36,120 --> 00:15:41,440 Speaker 1: decided to initiate psychological training programs, which resulted in a 247 00:15:41,560 --> 00:15:45,640 Speaker 1: massive increase in the willingness to kill in a war setting. 248 00:15:46,200 --> 00:15:49,840 Speaker 1: That is hate crime expert Matthew Williams. He's a professor of 249 00:15:49,840 --> 00:15:53,240 Speaker 1: criminology at Cardiff University, and there he was describing the 250 00:15:53,280 --> 00:15:57,080 Speaker 1: psychological programs employed by the U.S. military to increase 251 00:15:57,200 --> 00:16:00,440 Speaker 1: kill rates after the Second World War. The army started 252 00:16:00,520 --> 00:16:03,880 Speaker 1: using man-shaped targets instead of bullseyes and tried 253 00:16:03,960 --> 00:16:07,800 Speaker 1: dispersing responsibility for killing throughout the troops, or placing 254 00:16:07,880 --> 00:16:11,280 Speaker 1: it onto an authority figure like the commanding officer. They 255 00:16:11,320 --> 00:16:15,240 Speaker 1: also aimed to recast the enemy. One side is told 256 00:16:15,280 --> 00:16:18,600 Speaker 1: that the other side is in some way subhuman. They 257 00:16:18,640 --> 00:16:22,560 Speaker 1: are akin to, say, parasites; they are cockroaches; they're vermin, 258 00:16:22,680 --> 00:16:26,120 Speaker 1: for example. So if hate can cause violence, then what 259 00:16:26,240 --> 00:16:30,960 Speaker 1: causes hate? Matthew identifies what he calls accelerants: social or 260 00:16:30,960 --> 00:16:34,840 Speaker 1: psychological forces that push people to hatred. So a social 261 00:16:34,840 --> 00:16:38,280 Speaker 1: force might be, for example, an economic condition 262 00:16:38,320 --> 00:16:41,520 Speaker 1: where competition is fierce because the resources are incredibly scarce. 263 00:16:42,120 --> 00:16:46,480 Speaker 1: In situations where resources are scarce, in times of economic downturn 264 00:16:46,560 --> 00:16:48,960 Speaker 1: (for example, we saw it in two thousand and eight 265 00:16:48,960 --> 00:16:53,320 Speaker 1: with the crash), division grows. In an environment where money 266 00:16:53,440 --> 00:16:56,600 Speaker 1: is really tight, for example, people might fight to obtain 267 00:16:56,680 --> 00:17:00,320 Speaker 1: resources for themselves and their community and denigrate other people 268 00:17:00,360 --> 00:17:03,560 Speaker 1: who are trying to do the same. That's a social accelerant. 269 00:17:03,960 --> 00:17:08,000 Speaker 1: The second type of accelerant is psychological. For example, a 270 00:17:08,080 --> 00:17:11,760 Speaker 1: trauma that's happened to a person in their lifetime.
Say 271 00:17:11,760 --> 00:17:14,359 Speaker 1: they've been unemployed for a very long time, and they 272 00:17:14,400 --> 00:17:17,840 Speaker 1: feel shame, embarrassment at the way their life has developed, and 273 00:17:17,880 --> 00:17:19,879 Speaker 1: then they're told by a political leader that it's not 274 00:17:19,960 --> 00:17:22,920 Speaker 1: their fault that they are unemployed, and in fact, it's 275 00:17:23,000 --> 00:17:25,919 Speaker 1: the fault of immigrants because they're taking all the jobs. 276 00:17:26,440 --> 00:17:29,000 Speaker 1: Then all of a sudden, this psychological trauma is being 277 00:17:29,000 --> 00:17:34,720 Speaker 1: weaponized in a way that demonizes an outgroup. The social 278 00:17:34,720 --> 00:17:40,600 Speaker 1: accelerant, job scarcity, compounds with the psychological accelerant, humiliation, and 279 00:17:40,680 --> 00:17:43,720 Speaker 1: Matthew says that if those forces bear down hard enough, 280 00:17:44,320 --> 00:17:48,120 Speaker 1: hatred is forged in the pressure, and hateful sentiment can 281 00:17:48,160 --> 00:17:52,680 Speaker 1: become hateful action. I was a victim of a hate 282 00:17:52,720 --> 00:17:55,439 Speaker 1: crime around twenty years ago when I was standing in 283 00:17:55,480 --> 00:17:58,400 Speaker 1: a gay bar in London. I lit up a cigarette 284 00:17:58,400 --> 00:18:01,520 Speaker 1: and someone asked me for a light. I offered, and 285 00:18:01,760 --> 00:18:06,399 Speaker 1: within seconds I was on the floor and I looked 286 00:18:06,480 --> 00:18:10,520 Speaker 1: up at my attacker and they used a homophobic slur. 287 00:18:11,240 --> 00:18:14,520 Speaker 1: Two other men were in on the assault. Queer bashing 288 00:18:14,800 --> 00:18:17,000 Speaker 1: was the name for that kind of attack then. It 289 00:18:17,119 --> 00:18:21,160 Speaker 1: wasn't about robbery or personal conflict. It was more like sport. 290 00:18:22,240 --> 00:18:25,920 Speaker 1: Matthew remembers hearing their laughter as they left. They didn't 291 00:18:25,960 --> 00:18:28,800 Speaker 1: hate him personally. Their contempt was for his general way 292 00:18:28,800 --> 00:18:33,440 Speaker 1: of life. The hate in hate crime overlaps or maybe 293 00:18:33,440 --> 00:18:38,280 Speaker 1: blurs into other terms like bigotry or moral disregard. After 294 00:18:38,320 --> 00:18:41,280 Speaker 1: the attack, Matthew lay on the floor, bleeding from the mouth, 295 00:18:41,680 --> 00:18:46,000 Speaker 1: head ringing. He's never held his partner's hand in public since. 296 00:18:47,440 --> 00:18:50,360 Speaker 1: He became obsessed with the question of his attackers' motives. 297 00:18:50,640 --> 00:18:53,760 Speaker 1: What could those guys have possibly gained from beating him up? 298 00:18:54,280 --> 00:18:57,360 Speaker 1: I was looking for something concrete to separate them from me. 299 00:18:57,400 --> 00:19:00,000 Speaker 1: I was looking to see that there was something, maybe, 300 00:19:00,280 --> 00:19:03,400 Speaker 1: you know, they are so different from me because, biologically speaking, 301 00:19:03,440 --> 00:19:07,000 Speaker 1: they are fundamentally different. I wanted to find that, but 302 00:19:07,080 --> 00:19:09,400 Speaker 1: I didn't. The surprise for me was that we're all 303 00:19:09,480 --> 00:19:13,640 Speaker 1: capable of hatred. Given the right set of circumstances, anyone 304 00:19:13,720 --> 00:19:18,480 Speaker 1: can become the attacker.
Because all my attackers were young 305 00:19:19,119 --> 00:19:23,240 Speaker 1: black men, one of my concerns was that the attack 306 00:19:23,359 --> 00:19:27,400 Speaker 1: may have changed the way I think about young black men. 307 00:19:28,800 --> 00:19:33,880 Speaker 1: As the saying goes, hurt people hurt people. We're subject 308 00:19:33,960 --> 00:19:36,600 Speaker 1: to different accelerants, and our ability to cope with them 309 00:19:36,680 --> 00:19:39,800 Speaker 1: varies too, but none of us are immune to them, 310 00:19:39,840 --> 00:19:42,919 Speaker 1: and Matthew was self-aware enough to guard against his 311 00:19:43,000 --> 00:19:47,359 Speaker 1: own experience pushing him to reactionary or retaliatory attitudes of 312 00:19:47,359 --> 00:19:50,480 Speaker 1: his own. There's an onus on all of us to 313 00:19:50,600 --> 00:19:56,159 Speaker 1: prevent our particular pains from calcifying into general prejudices. Without 314 00:19:56,200 --> 00:20:00,359 Speaker 1: willful intervention, hate can have a very long life span. 315 00:20:01,800 --> 00:20:06,360 Speaker 1: Back to Eran, the psychologist. Ask Israelis today what does 316 00:20:06,400 --> 00:20:09,800 Speaker 1: it mean to be an Israeli, and five out of 317 00:20:09,840 --> 00:20:13,200 Speaker 1: the first seven things they will say would be related 318 00:20:13,200 --> 00:20:15,840 Speaker 1: to Palestinians. You know, we're better than them, we're 319 00:20:15,880 --> 00:20:19,119 Speaker 1: more moral than them. They want to damage or hurt us 320 00:20:19,119 --> 00:20:22,840 Speaker 1: in many ways, and we have to stick together to 321 00:20:23,000 --> 00:20:25,240 Speaker 1: cope with that situation. Would it be too much to 322 00:20:25,240 --> 00:20:27,720 Speaker 1: say that a sense of hatred, that feeling hatred, can 323 00:20:27,840 --> 00:20:33,560 Speaker 1: unite people? Definitely, unfortunately. I think that in many, many, 324 00:20:33,600 --> 00:20:38,960 Speaker 1: many ways, hate is the ultimate glue that brings people 325 00:20:39,000 --> 00:20:44,159 Speaker 1: together into groups. Probably the most powerful, extreme and also 326 00:20:44,240 --> 00:20:49,520 Speaker 1: mobilizing emotion is hatred. In the cartoon studio of imagination, 327 00:20:50,040 --> 00:20:52,520 Speaker 1: it's easy to picture people in the grips of hate 328 00:20:52,680 --> 00:20:57,240 Speaker 1: as looming over others, victimizing the less powerful. But according 329 00:20:57,280 --> 00:21:02,680 Speaker 1: to Eran, those who actually experience hate understand themselves as victims, 330 00:21:02,720 --> 00:21:05,320 Speaker 1: like they're on the right side of a moral battle, 331 00:21:05,680 --> 00:21:08,479 Speaker 1: defending a way of life and their virtues from an 332 00:21:08,560 --> 00:21:13,639 Speaker 1: enemy hell-bent on their destruction. Given that hate is 333 00:21:13,680 --> 00:21:18,400 Speaker 1: so morally salient, so effective at connecting us and motivating action, 334 00:21:19,520 --> 00:21:23,399 Speaker 1: how would we ever hope to quit it? Matthew, the 335 00:21:23,400 --> 00:21:28,479 Speaker 1: criminologist, says our efforts have to start early. You cannot 336 00:21:28,560 --> 00:21:32,400 Speaker 1: turn off the cultural tap to prevent the information flooding 337 00:21:32,440 --> 00:21:35,719 Speaker 1: into the brain.
So this is why, when trying to 338 00:21:35,760 --> 00:21:41,760 Speaker 1: initiate anti-racist policies, anti-racist initiatives, this has to 339 00:21:41,760 --> 00:21:43,560 Speaker 1: be done at a very young age. We have to 340 00:21:43,640 --> 00:21:46,119 Speaker 1: start under the age of eleven, at least under the 341 00:21:46,160 --> 00:21:49,920 Speaker 1: age of eleven, to see real differences. That idea rests 342 00:21:50,000 --> 00:21:53,760 Speaker 1: on several studies of children's brains, which indicate that fear 343 00:21:53,880 --> 00:21:57,199 Speaker 1: responses to black faces are learned. They don't show up 344 00:21:57,200 --> 00:22:00,359 Speaker 1: when kids are little, but they do by adolescence. But 345 00:22:00,440 --> 00:22:02,680 Speaker 1: of course we can't just wait for an improved crop 346 00:22:02,720 --> 00:22:04,720 Speaker 1: of ten-year-olds to grow up and save the world. 347 00:22:05,400 --> 00:22:08,840 Speaker 1: What works to stop people from hating? So that's the 348 00:22:08,920 --> 00:22:12,320 Speaker 1: million-dollar question. I think when people hate, it is driven 349 00:22:12,359 --> 00:22:16,560 Speaker 1: by a more general idea that, you know, people and 350 00:22:16,720 --> 00:22:20,960 Speaker 1: groups simply cannot change. But in his formal studies, Eran 351 00:22:21,320 --> 00:22:23,879 Speaker 1: has found that if you can convince people that change 352 00:22:23,880 --> 00:22:26,800 Speaker 1: is possible, not even in the context of the group 353 00:22:26,880 --> 00:22:30,119 Speaker 1: with whom they're in conflict, but like generally, that people 354 00:22:30,160 --> 00:22:33,840 Speaker 1: can evolve over time, then it's possible to make some headway. 355 00:22:33,960 --> 00:22:37,479 Speaker 1: In two thousand eleven, Science magazine published the findings of 356 00:22:37,480 --> 00:22:40,239 Speaker 1: one such study. So we talked about the fact that 357 00:22:40,560 --> 00:22:44,800 Speaker 1: in history people change and groups change their views, 358 00:22:44,880 --> 00:22:50,080 Speaker 1: their attitudes, and their behavior. We managed to reduce their 359 00:22:50,160 --> 00:22:55,600 Speaker 1: levels of hatred towards the other group by almost... and 360 00:22:55,640 --> 00:22:58,240 Speaker 1: that sort of reduction might have real consequences in the 361 00:22:58,320 --> 00:23:00,840 Speaker 1: kind of policies that people would consider endorsing, 362 00:23:01,320 --> 00:23:05,639 Speaker 1: and by that also to increase their willingness to engage 363 00:23:05,720 --> 00:23:10,680 Speaker 1: in actions that would, you know, entail making huge compromises, 364 00:23:10,760 --> 00:23:15,160 Speaker 1: political compromises, in order to promote peace. So just by 365 00:23:15,160 --> 00:23:19,679 Speaker 1: convincing people that the groups can change, we decreased hatred and 366 00:23:19,880 --> 00:23:24,680 Speaker 1: increased people's support for compromises. Now back to M, who 367 00:23:24,680 --> 00:23:28,800 Speaker 1: sees no promise of any compromise with her mom. By 368 00:23:28,960 --> 00:23:32,000 Speaker 1: this point in her story, she had spent years hating 369 00:23:32,000 --> 00:23:36,080 Speaker 1: her mother. The Hallmark version of this story would involve 370 00:23:36,080 --> 00:23:39,560 Speaker 1: a reconciliation. That's not the real-life version of the story. 371 00:23:39,840 --> 00:23:44,159 Speaker 1: Absolutely not.
As M moved through her adulthood, the hate 372 00:23:44,680 --> 00:23:47,600 Speaker 1: that had at one point served as an engine became 373 00:23:47,640 --> 00:23:51,520 Speaker 1: an anchor. When I got married, I was like, I am, 374 00:23:51,560 --> 00:23:54,000 Speaker 1: I have to let this go. I have to let 375 00:23:54,000 --> 00:23:58,560 Speaker 1: it go. That's easy to say, but hard to do. 376 00:24:00,119 --> 00:24:03,720 Speaker 1: For M, that didn't mean mending fences. It just meant 377 00:24:03,800 --> 00:24:06,879 Speaker 1: letting go. The electrified wire feelings that I have 378 00:24:07,040 --> 00:24:11,280 Speaker 1: towards my mom, it's often easier to feel nothing about 379 00:24:11,280 --> 00:24:14,280 Speaker 1: her than it is to feel anything about her, so 380 00:24:14,320 --> 00:24:15,920 Speaker 1: I try really hard most of the time to feel 381 00:24:15,960 --> 00:24:18,399 Speaker 1: nothing about it. Do you think falling in love with 382 00:24:18,440 --> 00:24:21,560 Speaker 1: your husband helped you to fall out of hate with your mom? Yeah, 383 00:24:21,640 --> 00:24:26,440 Speaker 1: I think so. And in turn it allowed me the 384 00:24:26,440 --> 00:24:30,080 Speaker 1: permission to fall back in love with myself. I became 385 00:24:30,119 --> 00:24:33,240 Speaker 1: a much better student at school when I stopped hating 386 00:24:33,280 --> 00:24:36,920 Speaker 1: my mom. It's probably a correlation-not-causation thing, but 387 00:24:37,440 --> 00:24:41,120 Speaker 1: I had so much more time and energy to put 388 00:24:41,160 --> 00:24:45,680 Speaker 1: into school, which I love. Even after everything, I do 389 00:24:46,119 --> 00:24:50,240 Speaker 1: have to thank my mom, because without her I would 390 00:24:50,240 --> 00:24:56,760 Speaker 1: not be in the career that I am now. It's peace. 391 00:24:57,800 --> 00:25:00,560 Speaker 1: It's not the best of all possible peaces, or the 392 00:25:00,600 --> 00:25:03,159 Speaker 1: one that she'd design if she were in charge, but 393 00:25:03,240 --> 00:25:06,560 Speaker 1: it's peace. Even when a war ends, it doesn't mean 394 00:25:06,560 --> 00:25:14,120 Speaker 1: the underlying conflicts do. The pressures of our lives, social 395 00:25:14,359 --> 00:25:18,360 Speaker 1: or psychological, can move us to hate one another. Binary 396 00:25:18,440 --> 00:25:21,679 Speaker 1: thinking that casts some groups as virtuous and others as 397 00:25:21,800 --> 00:25:25,560 Speaker 1: villains can make us more susceptible to hatred. When we're 398 00:25:25,600 --> 00:25:28,840 Speaker 1: depersonalized to one another on a forum like the Internet, 399 00:25:29,000 --> 00:25:32,440 Speaker 1: or as a product of propaganda, it's easier to hate 400 00:25:32,440 --> 00:25:37,320 Speaker 1: somebody who doesn't seem fully human. Moreover, hatred might take 401 00:25:37,400 --> 00:25:40,560 Speaker 1: special vigilance to stave off, because when you're in it, 402 00:25:40,560 --> 00:25:46,040 Speaker 1: it can feel not only justified, but righteous. I hope 403 00:25:46,119 --> 00:25:49,159 Speaker 1: that posting a sign outside that says hate has no 404 00:25:49,320 --> 00:25:53,680 Speaker 1: home here is a comfort to our neighbors, particularly those 405 00:25:53,720 --> 00:25:58,120 Speaker 1: that look, love, talk, pray differently than us.
But we'd 406 00:25:58,119 --> 00:26:01,640 Speaker 1: probably do just as well to routinely search inside our houses 407 00:26:02,200 --> 00:26:05,359 Speaker 1: for any sign that a small tendril of something uninvited 408 00:26:05,400 --> 00:26:09,159 Speaker 1: hasn't come up through a floorboard, disguised as high-minded 409 00:26:09,200 --> 00:26:14,280 Speaker 1: indignation or moral purity, but capable of cracking the foundation nonetheless. 410 00:26:17,200 --> 00:26:20,760 Speaker 1: Deeply Human is a BBC World Service and American Public 411 00:26:20,760 --> 00:26:23,720 Speaker 1: Media co-production with iHeartMedia. It's written and 412 00:26:23,800 --> 00:26:27,480 Speaker 1: hosted by me, Dessa. Find me online at Dessa on 413 00:26:27,560 --> 00:26:34,200 Speaker 1: Instagram and Dessa Darling on Twitter. What is a vampire facial? 414 00:26:35,400 --> 00:26:37,399 Speaker 1: How did a plastic surgeon come to be one of 415 00:26:37,440 --> 00:26:42,520 Speaker 1: Brazil's national icons? Is a makeover a healthy ego boost 416 00:26:43,200 --> 00:26:46,880 Speaker 1: or a concession to a pretty messed-up cosmetic industry? 417 00:26:47,240 --> 00:26:51,800 Speaker 1: On the next Deeply Human, we're investigating beauty, ethics, and 418 00:26:51,840 --> 00:26:53,280 Speaker 1: the intersection between them.