Hi friends, it's Charles W. "Chuck" Bryant here, your friendly neighborhood podcaster, setting up this week's Saturday Selects episode. And this week, everybody, I picked the episode all about hate. Oh, I hate this; hate it when people do that; hate it when that happens. You probably say that stuff a lot. But what does hate really mean? What is at the core of hate? And what does it mean for the world and our community? Well, we talk about all that stuff in this episode, How Hate Works.

Welcome to Stuff You Should Know, a production of iHeartRadio.

Hey, and welcome to the podcast. I'm Josh Clark. With me as always is the chipper and cheerful Charles W. "Chuck" Bryant. Man, I'm going ten different directions, buddy. Yeah, I'm a little screwy. Yeah, we'll focus on this one. Okay, okay, because we're going in one direction, and that's hate. I hate to focus. Okay. You hate broccoli. I do hate broccoli, and you know that. I also hate peas, like split peas. I remember declaring, um, as a child that peas are some of my most hated enemies. I think a lot of kids don't like peas because they're mushy.
Yeah, well, that's the problem with all vegetables, really; they're mushy. They're overcooked. If you cook something... No, I've had pretty nasty broccoli. But broccoli, that's separate. It's just disgusting in every single way. But creamed spinach, I love that. It's awesome. Yeah, that's good stuff. You and I shared a creamed spinach at Morton's Steakhouse recently, like two ladies. Yeah, it was something we couldn't even finish, it was so rich. It was really good. So, Chuck, we don't hate creamed spinach. I hate broccoli. And, um, one of the things I hate more than anything else is not having an intro, which I don't, because I was looking online and, strangely... The online world is a repository for hate in a certain way, as in, like, um, neo-Nazi punk bands. Not pants, huh? This article calls it pop music. Yeah, um, or, you know, Facebook groups dedicated to hate, like, you know, Holocaust denial and that kind of stuff. Sure. Um, but this word is so ubiquitous in our culture that there was nothing there. Like, I found a guy, um, in Mass. who was accused of a hate crime.
Um, everybody wants to know why, um, Cleveland fans hate LeBron. I can answer that. But I mean, like, we throw this word around, like, you know, some reality TV series was "the show you love to hate." Right. Um, we use this word a lot. But yeah, I found a study out of the University of Texas that asked people how often they hated, and, um, nobody said every day. It's not an everyday thing. So, like, we hate things like broccoli, but we also realize that there's a real distinction between hating something and experiencing actual hate. You hit it on the head, and this is a pretty old distinction, right? Like, philosophers have been aware of this before. I think Aristotle was pretty sure he hated peas, but he really hated hemlock. Yeah. And he's not Webster, so I will read his definition, because he's Aristotle. Yeah, he said it was a dislike for someone based on our negative perception of that person's nature that is so intense that whoever feels it wants to cause real harm to another. Like, I really want to harm you.
Yeah. So that's the difference. Like you said, people throw that word around: I hate broccoli. But you're not going to go out and try and burn down broccoli farms. No, I know, that's silly. I'm not gonna go burn down the Broccoli family's broccoli farm that was used to fund the James Bond movies. But Josh, I think, and this is me surmising in my own personal purview, I think there are kind of two types of hate. Well, three types, really. One type where you just throw the word around, like "I hate that show, I hate broccoli." One that is real hate, which I think is fear-based, when you don't know someone personally or a group personally, where you hate a group of people. And then there's, like, the anger- or retribution-based hate, like someone personally has wronged you so badly that you hate them and either want to cause or wish ill upon them. Right. Well, you just brought up a huge can of worms by using the word anger. Like, there's a real debate over whether hate and anger are the same thing. Right, right. They say they're not. It depends on who you talk to.
But the people who say they're not say things like, um, hate is brought on by humiliation or ill treatment or being devalued, where anger is brought on when you're treated, um, in a way that you consider unfair. Right, right. Anger is the result of, um, not having any recourse. Right, frustration perhaps coupled with that. Right, and that kind of dances along the border, because people who hate, you know, other groups often are frustrated. Like when we talked about fascism in the Fascism podcast, getting, um, groups all riled up against a scapegoat is one of the tenets of fascism, right? And so these people are frustrated at their lot in life, their unemployment is high because of the Jews or something like that, right? But really they're not. They're angry about their job while they hate the Jews. So the two are really intertwined, but there's a lot of people who think, if you look at them deeply enough, they're not one and the same. Right. Well, I think a lot of times hate is kind of displaced anger and frustration at your own, you know... a lot like what we're saying.
Yeah, but there is also a very, um, strong physiological basis to it as well. I mean, it's an emotion, supposedly, although it's not one of the basic emotions. Anger is, yeah. What are the basic emotions? Anger, joy, fear, disgust, and peckishness. I thought it was "joy, pain, sunshine and rain." No... Who's that, Rob Base? No? I can't remember. I could sing it, but I can't remember. Sing it. No, no, no. I think it's Rob Base. No, it was... it was a duo. Oh no, no, I'm thinking of "I want money, lots and lots of money." That was a duo, "I Wanna Be Rich"? Remember that stupid song? Yeah, kind of. They wrote a song about being rich. Oh yeah, how great it was. And that was their only song, so unless they were already rich, then they never were from that song. That makes sense. Yeah, it does. Just blew my mind, buddy. So do you hate that song? Uh, I do now, because it's in my head. So, Chuck, what is this, um, physiological basis of anger? Well, it's pointed out in the article with an Iron Maiden song, which I thought was an odd choice.
"There's a thin line between love and hate." Yeah, it's like there's a whole other song called "There's a Thin Line Between Love and Hate." Well, there's a much more popular song, I think by the Persuaders, which was "It's a Thin Line Between Love and Hate," the old Motown song. Right. Have you ever heard the Pretenders' version of it? No. It's hands down the greatest version ever. Really? "It's a thin line between love and..." Hey, the Pretenders covered the Persuaders. Yep. All right, I'm telling you. All right. So apparently Iron Maiden... I actually listened to that song on YouTube the other day, and it's an Iron Maiden song. Yeah, no, I looked it up to make sure that Iron Maiden hadn't covered the Persuaders, and, uh, no, Bruce Dickinson came up with his own lyrics, his own version. He's like, that one's fine, I'm doing this one. That's right. So the point of all this, Josh, is that there is a thin line between love and hate as far as, uh, the brain goes.
Because, um, in two thousand eight there was a study at University College London, and that's in the UK, and, um, they got seventeen people. Not very wide-ranging; I had a lot of problems with this study. But they got seventeen people who said they hated someone else. Maybe that's why... maybe they had a hard time finding someone who hates someone else. Maybe. Not me, because I don't hate anyone. I was about to ask you that. Well, we'll get to the personal stuff in a minute. Okay. So this study, what they did was they found seventeen people who hated someone else, threw them under the old wonder machine, and showed them pictures of the people they hated to record the results. I guess they're like, you need to bring pictures of the people you hate for the study. Yeah, they could have just said "think of the person you hate," I think, and it would achieve the same goal, I guess. So anyway, what they found out was that a couple of regions in the brain... there's like a hate circuit.
They call it that: the putamen, and Jeri laughed at that, and the insular cortex. Both fired up with pictures of people that they hated, right. And the significance of this is that both of those regions also fire up when you see a picture of or think about someone you love, which is the longest way to say it's a thin line between love and hate. Right, and I think everybody kind of senses that. It's like, um, when passions flare, it's virtually the same thing. They're two sides of one coin. In my opinion, if you truly hate somebody, the real hate to fear is not the one where somebody's like, "oh, I hate you so much," you know, because that can be turned. That means that they have some sort of emotional connection to you. The one to be afraid of is the detached, calm, cool kind of hatred, because that's the one where you end up dead somewhere. Like, I'm the Green River Killer and I hate prostitutes. Well, that brings up an interesting sidebar, right? Um, do serial killers hate their victims? No. End of sidebar.
Well, they have long said that serial killers don't experience emotion on that scale, but they're starting to, uh, change their thinking in certain cases, because a lot of serial killers suffer from antisocial personality disorder, and people who suffer from that experience a range of emotions. So it's not always... I think it's both. You know, you can't say every serial killer is the same. Well, they've been saying that for a long time. They've been trying to find the threads that connect them. And I told you about the sociologist I talked to. He was just really up in arms that psychology has spent four decades or so looking at serial killers, and the best they come up with is antisocial personality disorder. He's like, of course they have a personality disorder, they're serial killers. Yeah, that's a good point. Yeah. So back to that study, though, about the brain and the difference between love and hate. Um, they did see
, um, a difference, a key difference, because the areas of the frontal cortex associated with judgment and critical thinking become less active when you see someone you love in the fMRI machine. But when you saw someone you hate, most of your frontal cortex remains active. Yeah, so that's a big difference. But that makes sense as well, Chuck, because, I mean... I know you don't hate anybody, so you wouldn't understand this, but when you see someone you hate, it's like a personality flaw. You tend to criticize them in your head, like, oh, you're wearing that sweater? You look so fat and stupid in that sweater. I hope you somehow, uh, strangle yourself on that sweater. Yeah. So the point is that hatred is an active thing. It's an active rumination. It's not a knee-jerk thing like when you might see a picture of someone you love. All right. So that's interesting, right? That's what that study came up with, with the seventeen people. Yeah. It couldn't get... you know.
And the other problem is, I'm sure they were, um, WEIRD: Western, educated, uh... something... rich, and developed? I can't remember what the I stands for. And what does that spell? WEIRD. It's basically the idea that all of these studies that are cited, a lot of them, are just college kids. So it's like this really narrow niche of the human population that they extrapolate onto. Yeah. Good point, and in this case they just used seventeen of them. Well, we're here to report it and then criticize it, and we're done. We did both, that's right. Uh, what's the deal with, like, old hate, though? Like, don't they have some inclination of, like, early hate, with cavemen and the like? Well, yeah, because the closer to the center or the brain stem that you get in the brain, the more ancient that part of the brain is, and if there's a region like the putamen that's associated with a certain thing, e.g., hate, then that means that hate's been around for a very long time, because that part of the brain
, uh, has been able to carry out that function for that long, ostensibly. Gotcha. Right. But then it's also new, with the prefrontal cortex, which is a fairly newer, um, aspect. So maybe we just hated, but we didn't criticize. We just hated. And they think possibly that we developed hate as a species, or a capacity to hate, as a survival mechanism way back in hunter-gatherer days, where we could feel justified by, say, taking food from another group because we hated them. Which I actually found... that's a pretty inspiring idea. Yeah, that you had to work up hate enough to go and pillage. Yes. Okay, I think that's kind of neat, because it makes it seem like we aren't naturally hateful beings, and I don't think we are. I believe everybody has hate, and everyone has a vast capacity to hate, but I wouldn't characterize us as generally hateful. That's good. I'm kind of surprised to hear you say that. It's true, though. All right, so, uh, it's in the Bible, Josh. It's in ancient texts all over the place. Hate's been around a long time. Mm. Right. Are we gonna talk about Carthage?
Yeah, because I know you love this. The Carthaginian general, Hannibal... "Carthaginian"... you gotta stop that. Hannibal pledged to his father: Dad, I hate Rome, I hate Romans. I don't like the Italians. I hate them forever, and I swear retribution, because they have seized our provinces. He said, "Father..." "Yes, son?" "I'm going to kill the Romans." And he did. He made good on that, invaded Italy, and did quite a bit of damage. Of course, the Romans fired back, because they hated the Carthage... why can't I say that word? They hated people from Carthage, and, uh, in one forty-six BC they did some pretty bad things, like burning them in their houses while they screamed. But is that hate? Like, I don't know. And that's... I think that's an issue that I have here and there with this, is that that's kind of a jump to a conclusion. Like, is it hate? I don't know. Does hate form the basis for war or horrible acts in war? Well, I don't know, because it's condemned pretty much in, like, the New Testament in the Bible; it's condemned in the Koran.
Uh, "let not hatred of a people incite you not to act equitably." And in, uh, medieval and Renaissance Europe... well, in Italy, they came up with the vendetta, which is, uh, very much retribution for hatred. There you go. I... let's see, that's what I'm saying. Like, I think... let's say a Roman soldier comes to your town while you're away using the latrine pit that your village has dug, and they burn your family alive in your house while you're using the bathroom. And you come back and you see the Roman legions going away and your family is dead, burned to crisps. Right. Um, I don't think the Romans necessarily felt hatred to commit that act, but that act would incite hatred in the person that it befell. So I think the vendetta is an excellent example of hatred, because somebody done you wrong, and you're gonna get back at them. Or they did something to your father or something. The vendetta is very long-lasting, from what I understand. Yeah, and it's not...
I mean, this is... obviously we're talking about mafia vendettas and war vendettas, but it can happen on a smaller level. You might not think of it as a vendetta, though. But if someone done you really wrong and you're like, I'm gonna get that person back by doing this in six months when they least expect it, that's a vendetta. Yeah, but you don't call it a vendetta. It's just, uh... well, in Italy, they... it's just comeuppance. Um, "I'm gonna get you, sucka." Yeah, bad people do that, though. There was a word for it, though, in medieval and Renaissance Europe: inimicitia, which is Latin for "unfriendship." It was a legal term for hating somebody. Okay. So what we've done is established that hatred is definitely a thing that's been around a long time. Is that what we've done? Yeah. And, Chuck, of course it's still around, um, in recent modern history. There's other examples that we could go into, like hate groups. Yeah. Well, let's talk about the Nazis real quick, because again, we talked about fascism, and one of the tenets of it being, um, I guess, inciting a group to hate.
Yeah, group hate. That's where we are, for sure. And a lot of that gave a body of data for people to study, that they're still studying. But, um, one guy in particular named Martin Oppenheimer, who is a sociologist from Rutgers University, um, basically said, like, look, the Nazis are proof positive that you can, number one, get an entire group to hate another group, and that you do this by, um, identifying and exploiting the frustrations of the group that you're with, say unemployment, joblessness, and then basically saying, those are the people who are at fault. That's how you stir the pot, exactly. That's how you incite hatred, which has gotta be one of the worst things you can do. One of the worst nonviolent acts I think a person can commit is inciting hatred. Yeah, you know. Yeah. And also, what came to mind to me when I was reading this was some of these same tactics... like, marginalized people, people who are insecure, who are seeking safety somewhere. It's also kind of the same thing they do with cults and brainwashing.
They're seeking out these same types of 328 00:20:00,560 --> 00:20:02,639 Speaker 1: people and saying, hey, you feel marginalized, you feel like 329 00:20:02,680 --> 00:20:05,440 Speaker 1: you're not loved, you need a safe haven. But they're 330 00:20:05,480 --> 00:20:09,200 Speaker 1: not saying go hate someone else. They're saying, just come 331 00:20:09,240 --> 00:20:12,679 Speaker 1: and be with our group. Well, our association of, 332 00:20:12,720 --> 00:20:16,440 Speaker 1: like, in-group and out-group is like this emotional, 333 00:20:16,480 --> 00:20:20,119 Speaker 1: psychological razor blade that can be exploited in any number 334 00:20:20,119 --> 00:20:23,520 Speaker 1: of ways. Exactly. You know, but it's always marginalized people, 335 00:20:24,000 --> 00:20:28,040 Speaker 1: it seems like. Yeah. Yeah. But you mean 336 00:20:28,040 --> 00:20:30,800 Speaker 1: the people who are stirred, who have hatred stirred up 337 00:20:30,800 --> 00:20:34,040 Speaker 1: in them? Yeah, or go join a cult or something 338 00:20:34,080 --> 00:20:38,320 Speaker 1: like that. Yeah, you mean teenagers. And, uh, well, a 339 00:20:38,320 --> 00:20:41,320 Speaker 1: Stanford study in two thousand and ten basically said, hey, 340 00:20:41,359 --> 00:20:44,760 Speaker 1: if you want to, um, teach teenagers to hate, here's 341 00:20:44,760 --> 00:20:48,439 Speaker 1: how you do it. You can't just overtly say go 342 00:20:48,720 --> 00:20:53,119 Speaker 1: hate this group, you know, hate Muslims, hate black people, 343 00:20:53,240 --> 00:20:57,640 Speaker 1: hate Jewish people, hate gays. You can't just say that; 344 00:20:57,720 --> 00:21:01,040 Speaker 1: it's not good enough.
But if you tell a story 345 00:21:01,400 --> 00:21:06,080 Speaker 1: that basically implies these are people you should 346 00:21:06,119 --> 00:21:11,360 Speaker 1: hate and here's why, right, like, um, homosexuals are pederasts, 347 00:21:11,440 --> 00:21:15,520 Speaker 1: and so you know you can't let them into certain groups. 348 00:21:15,680 --> 00:21:18,119 Speaker 1: And by the way, you should hate them because of 349 00:21:18,200 --> 00:21:23,320 Speaker 1: the story. Right. Then, I think I had a 350 00:21:23,320 --> 00:21:25,280 Speaker 1: problem with that one because it was like, that's true 351 00:21:25,320 --> 00:21:27,920 Speaker 1: for everything. If you tell the story, it's gonna hit 352 00:21:28,000 --> 00:21:30,760 Speaker 1: home more personally with somebody. Yeah, you can't say, hey, 353 00:21:30,880 --> 00:21:34,920 Speaker 1: go love, uh, Seabiscuit because he ran a horse 354 00:21:35,000 --> 00:21:37,199 Speaker 1: race that was pretty neat. But if you tell the 355 00:21:37,200 --> 00:21:39,600 Speaker 1: story of Seabiscuit, all of a sudden you're gonna 356 00:21:39,640 --> 00:21:41,960 Speaker 1: leave that thing going, man, I'm getting my butt to 357 00:21:41,960 --> 00:21:44,720 Speaker 1: the Kentucky Derby next year because I love me some 358 00:21:44,760 --> 00:21:48,320 Speaker 1: horse racing. I love Seabiscuit. So you saw the movie, right? No, 359 00:21:48,440 --> 00:21:52,800 Speaker 1: I didn't, um. But the funny thing is 360 00:21:52,880 --> 00:21:55,159 Speaker 1: that that whole study made the careers of two 361 00:21:55,240 --> 00:21:58,000 Speaker 1: Stanford researchers. Right. But they do have a point, 362 00:21:58,040 --> 00:22:00,360 Speaker 1: because they point out in this article, or they don't, 363 00:22:00,359 --> 00:22:04,960 Speaker 1: but we do: um, D.W. Griffith's awful, um, movie, 364 00:22:05,840 --> 00:22:09,600 Speaker 1: awful on content, Birth of a Nation from nineteen fifteen.
365 00:22:09,800 --> 00:22:12,159 Speaker 1: It's no Seabiscuit. It's no Seabiscuit. But it 366 00:22:12,240 --> 00:22:15,479 Speaker 1: did a really good job of getting people to hate 367 00:22:15,720 --> 00:22:18,479 Speaker 1: black people in the United States. Yeah, doesn't it 368 00:22:18,520 --> 00:22:21,639 Speaker 1: feature, like, the, well, since it was nineteen fifteen, it 369 00:22:21,680 --> 00:22:23,800 Speaker 1: is like the first of everything. But it's like the 370 00:22:23,840 --> 00:22:28,560 Speaker 1: first on-screen rape, uh, or implied rape. There was 371 00:22:28,600 --> 00:22:31,600 Speaker 1: a rape of a white woman by, like, an escaped slave, 372 00:22:31,680 --> 00:22:34,240 Speaker 1: I think, by a white actor in blackface, of 373 00:22:34,280 --> 00:22:37,840 Speaker 1: course, at the time. And, um, it was a big, 374 00:22:37,920 --> 00:22:41,399 Speaker 1: huge movie. It grossed ten million dollars in nineteen fifteen. 375 00:22:41,600 --> 00:22:46,400 Speaker 1: That's like, that's two hundred sixteen million dollars today 376 00:22:46,480 --> 00:22:50,200 Speaker 1: is what it grossed. Yeah. And it was based on a play. Yeah, 377 00:22:50,200 --> 00:22:52,080 Speaker 1: it is. It is based on a play and a 378 00:22:52,119 --> 00:22:56,639 Speaker 1: book called The Clansman. And D.W. Griffith felt so 379 00:22:56,720 --> 00:23:00,439 Speaker 1: bad about this afterward that he made a follow-up 380 00:23:00,480 --> 00:23:03,760 Speaker 1: film that he called Intolerance, which was a three-hour 381 00:23:03,920 --> 00:23:10,119 Speaker 1: silent-film meditation on four parallel stories of man's intolerance 382 00:23:10,160 --> 00:23:13,920 Speaker 1: throughout history. Oh, I didn't know he did that. That's good. Yeah, 383 00:23:14,280 --> 00:23:17,080 Speaker 1: because I want to like D.W. Griffith.
Yeah, I 384 00:23:17,080 --> 00:23:20,760 Speaker 1: mean, he didn't write Birth of a Nation, he just 385 00:23:21,560 --> 00:23:24,280 Speaker 1: directed it. Not like getting him off the hook or anything. 386 00:23:24,359 --> 00:23:26,280 Speaker 1: But I think at the time, he was just trying 387 00:23:26,280 --> 00:23:28,360 Speaker 1: to make a movie that sold a lot of tickets, 388 00:23:29,160 --> 00:23:30,639 Speaker 1: and that was the way to do it. Yeah, that's 389 00:23:30,680 --> 00:23:33,440 Speaker 1: the way to do it. And then the Nazis, of course. 390 00:23:33,480 --> 00:23:39,040 Speaker 1: Anyone who saw Inglourious Basterds knows that Goebbels, Joseph Goebbels, 391 00:23:40,000 --> 00:23:42,520 Speaker 1: was in charge of, you know, the propaganda department, with 392 00:23:42,640 --> 00:23:46,600 Speaker 1: feature films, and they had one called Jud Süss. Is 393 00:23:46,600 --> 00:23:50,159 Speaker 1: it Judd, or, you'd, probably it'd be Yood Süss. So 394 00:23:50,240 --> 00:23:52,720 Speaker 1: you're the one who speaks German. How did you say it, 395 00:23:52,840 --> 00:23:55,040 Speaker 1: Jud Süss? I don't know. I was concentrating on the 396 00:23:55,119 --> 00:23:59,280 Speaker 1: umlaut part and the Süss. Okay, so yeah, Yood Süss. 397 00:23:59,800 --> 00:24:03,439 Speaker 1: But that featured a main character, a Jewish main character, 398 00:24:03,520 --> 00:24:06,280 Speaker 1: who was shunned by a gentile woman, and so he 399 00:24:06,400 --> 00:24:10,399 Speaker 1: raped her, oh yeah, among other things, and it was 400 00:24:10,440 --> 00:24:14,520 Speaker 1: required viewing for the stormtroopers. Yeah, they loved it. And 401 00:24:14,520 --> 00:24:18,159 Speaker 1: then they gave them crystal meth. Really? Yeah, from what 402 00:24:18,200 --> 00:24:22,080 Speaker 1: I understand. That'll do it. Um. And that didn't just 403 00:24:22,119 --> 00:24:25,720 Speaker 1: go out with the Nazis.
Um, media has been playing, 404 00:24:25,760 --> 00:24:30,639 Speaker 1: like, more and more of a role, um, among, I 405 00:24:30,640 --> 00:24:33,760 Speaker 1: guess, hate groups, hatred as a concept and 406 00:24:33,880 --> 00:24:39,040 Speaker 1: as a practice, right. Yeah, because, um, I think in 407 00:24:39,040 --> 00:24:44,480 Speaker 1: the nineties, uh, Bosnian Serb TV showed, um, something that's 408 00:24:44,560 --> 00:24:47,639 Speaker 1: kind of referred to now as, like, basically a hate-mongering, 409 00:24:47,640 --> 00:24:52,760 Speaker 1: um, series called Genocide that stirred up emotion against 410 00:24:52,840 --> 00:25:00,440 Speaker 1: the Bosnian Muslims, right. Yeah. Uh, and, uh, well, 411 00:25:00,480 --> 00:25:02,600 Speaker 1: you know what happened with that in the Balkan War. Yeah. 412 00:25:02,600 --> 00:25:05,919 Speaker 1: Al Qaeda has done similar things on the web. Obviously, the 413 00:25:05,920 --> 00:25:07,480 Speaker 1: web is a good place to go try and 414 00:25:07,520 --> 00:25:10,920 Speaker 1: get this thing done these days. And they had chat rooms, 415 00:25:11,200 --> 00:25:15,960 Speaker 1: they have chat rooms, with Facebook becoming increasingly, um, available 416 00:25:16,040 --> 00:25:21,560 Speaker 1: for people who have, um, hate-based ideologies. Um. And 417 00:25:21,600 --> 00:25:25,000 Speaker 1: Facebook is like, look, I mean, 418 00:25:25,000 --> 00:25:27,119 Speaker 1: we'll find them and shut them down when we 419 00:25:27,240 --> 00:25:52,520 Speaker 1: can, but, like, they're all over the place. Yeah. Um. 420 00:25:52,560 --> 00:25:57,720 Speaker 1: And then, uh, also, Chuck: pop music. Yeah, they called 421 00:25:57,800 --> 00:26:00,720 Speaker 1: it pop music.
And the reason, no, I can't 422 00:26:00,760 --> 00:26:03,080 Speaker 1: call it pop music, is because I've seen some of 423 00:26:03,080 --> 00:26:05,400 Speaker 1: those specials, and I saw a really good one, I can't 424 00:26:05,400 --> 00:26:08,760 Speaker 1: remember where, on neo-Nazis, and they have, you know, they 425 00:26:08,960 --> 00:26:13,240 Speaker 1: have musical groups that sing neo-Nazi songs, and they 426 00:26:13,280 --> 00:26:16,240 Speaker 1: just sing about hating other people, and it's, you know, 427 00:26:16,280 --> 00:26:19,200 Speaker 1: it's aggressive music. It's not pop music, pop 428 00:26:19,280 --> 00:26:23,080 Speaker 1: in no sense. No, it's not hummable. So, um, Chuck, 429 00:26:23,240 --> 00:26:26,800 Speaker 1: the article begs a pretty, um, interesting question, I think. 430 00:26:27,440 --> 00:26:33,120 Speaker 1: Um, is hate a mental illness? Because, you know, don't 431 00:26:33,160 --> 00:26:35,920 Speaker 1: you have to be slightly mentally ill to burn down 432 00:26:35,960 --> 00:26:40,359 Speaker 1: a house with an entire family trapped inside? Or maybe 433 00:26:40,359 --> 00:26:45,120 Speaker 1: you're just following orders? Okay, you know. Exactly. I think 434 00:26:45,160 --> 00:26:49,679 Speaker 1: you just hit upon it. Our understanding of hate is 435 00:26:49,760 --> 00:26:54,880 Speaker 1: incomplete because our understanding of the things that we do 436 00:26:54,960 --> 00:26:58,960 Speaker 1: that we associate with hate is also incomplete. Are you 437 00:26:59,080 --> 00:27:01,560 Speaker 1: just following orders? Are you being whipped up into a 438 00:27:01,600 --> 00:27:05,399 Speaker 1: mob mentality? Do you actually hate this other group because 439 00:27:05,440 --> 00:27:08,680 Speaker 1: you lost your job? Or is this emotion just being 440 00:27:08,720 --> 00:27:13,399 Speaker 1: exploited by someone else, a third party? Um.
441 00:27:14,040 --> 00:27:16,760 Speaker 1: And also, I think our understanding of mental illness isn't 442 00:27:16,880 --> 00:27:19,359 Speaker 1: refined enough to say, yes, hate's the product of a 443 00:27:19,400 --> 00:27:22,960 Speaker 1: mental illness. Sure, because they referenced, um, Hitler and Osama 444 00:27:22,960 --> 00:27:25,200 Speaker 1: bin Laden as two people they suspect might have been 445 00:27:25,200 --> 00:27:30,760 Speaker 1: mentally ill, um, or at least antisocial, and they 446 00:27:30,880 --> 00:27:35,399 Speaker 1: also referenced the Columbine shooters: one of them suffered 447 00:27:35,440 --> 00:27:37,879 Speaker 1: from depression, and they had these hate-filled rants that 448 00:27:37,960 --> 00:27:41,159 Speaker 1: they ended up finding. And was there a link between 449 00:27:41,160 --> 00:27:45,280 Speaker 1: that depression and hatred? Right. And I guess 450 00:27:45,520 --> 00:27:48,600 Speaker 1: that begs the question: 451 00:27:49,240 --> 00:27:51,960 Speaker 1: were Osama bin Laden and Hitler and Dylan Klebold, like, 452 00:27:52,080 --> 00:27:55,000 Speaker 1: so wrapped up in hatred that they were crazy? Or 453 00:27:55,040 --> 00:28:00,360 Speaker 1: was, um, hatred a byproduct of, you know, any 454 00:28:00,400 --> 00:28:03,399 Speaker 1: mental illness they may or may not have had? These 455 00:28:03,440 --> 00:28:06,520 Speaker 1: are questions we don't know.
But my whole idea, that 456 00:28:06,840 --> 00:28:12,200 Speaker 1: hatred is brought out when you are, um, mistreated by 457 00:28:12,240 --> 00:28:16,600 Speaker 1: someone else, is backed up by a two thousand study 458 00:28:16,680 --> 00:28:19,480 Speaker 1: of people from Kosovo, and those who had gone through 459 00:28:19,520 --> 00:28:25,159 Speaker 1: the most trauma and stress hated the Serbian troops who'd, 460 00:28:25,240 --> 00:28:27,920 Speaker 1: um, you know, brought that on them, more than 461 00:28:28,200 --> 00:28:31,919 Speaker 1: other people who maybe had pleasant exchanges with Serbian troops. 462 00:28:32,480 --> 00:28:35,400 Speaker 1: I guess that makes sense. Yeah. We gotta mention hate 463 00:28:35,440 --> 00:28:39,600 Speaker 1: crimes and hate groups briefly. A hate crime is obviously a 464 00:28:39,640 --> 00:28:42,440 Speaker 1: crime carried out against somebody based on their skin color, 465 00:28:43,280 --> 00:28:48,400 Speaker 1: their ethnicity, their national origin, their gender, disability, sexual orientation; 466 00:28:49,120 --> 00:28:51,600 Speaker 1: that's one you hear a lot about. Yeah, disability is a 467 00:28:51,640 --> 00:28:54,200 Speaker 1: sad one because it took a while to, um, get 468 00:28:54,640 --> 00:28:59,160 Speaker 1: that into hate crime bills. Oh really? Yeah, interesting. But 469 00:28:59,520 --> 00:29:03,640 Speaker 1: Congress has passed legislation now that makes hate crimes 470 00:29:03,720 --> 00:29:07,120 Speaker 1: more serious offenses than just, like, a regular assault. Well, yeah, 471 00:29:07,200 --> 00:29:09,720 Speaker 1: which is pretty awesome. Yeah, and how it should be.
472 00:29:10,200 --> 00:29:13,640 Speaker 1: I remember, um, when there was a child safety 473 00:29:13,720 --> 00:29:16,760 Speaker 1: law that was being passed in two thousand and six, 474 00:29:17,400 --> 00:29:21,800 Speaker 1: and there was hate crime language that was attached 475 00:29:21,840 --> 00:29:28,400 Speaker 1: to it that made, um, sexual orientation crimes hate crimes 476 00:29:28,440 --> 00:29:31,600 Speaker 1: on a federal level, and there was a big outrage about 477 00:29:31,640 --> 00:29:34,240 Speaker 1: it among religious groups. Do you remember that? I think 478 00:29:34,560 --> 00:29:38,360 Speaker 1: they were like, wait, we have a First Amendment freedom 479 00:29:38,400 --> 00:29:42,239 Speaker 1: to hate gay people as part of our religion. You know, 480 00:29:42,360 --> 00:29:45,920 Speaker 1: so you're saying that that in and of 481 00:29:45,960 --> 00:29:48,960 Speaker 1: itself is a hate crime, by saying, like, no, 482 00:29:49,120 --> 00:29:52,680 Speaker 1: these people are wrong, homosexuality is bad, it's wrong, that 483 00:29:52,800 --> 00:29:55,760 Speaker 1: kind of thing, um. And they thought that that 484 00:29:55,840 --> 00:29:58,560 Speaker 1: kind of infringed on that, which I don't think it does, 485 00:29:58,720 --> 00:30:00,880 Speaker 1: but that was their argument for a while. I don't 486 00:30:00,920 --> 00:30:04,360 Speaker 1: think it worked. Interesting. Uh, so I have a list 487 00:30:04,440 --> 00:30:06,440 Speaker 1: here first, Josh. I know, we have a couple 488 00:30:06,440 --> 00:30:10,280 Speaker 1: more little stats, um, about hate groups. 489 00:30:10,720 --> 00:30:15,000 Speaker 1: The Southern Poverty Law Center claims that the number of hate 490 00:30:15,000 --> 00:30:18,120 Speaker 1: groups in the US has grown by more than ... 491 00:30:18,320 --> 00:30:22,640 Speaker 1: since two thousand.
Yeah, and they had the top 492 00:30:22,680 --> 00:30:25,960 Speaker 1: five states with the biggest concentrations of hate groups. And 493 00:30:26,000 --> 00:30:28,040 Speaker 1: this one was continued on the next page. And when 494 00:30:28,040 --> 00:30:30,480 Speaker 1: I was reading it, I was like, please, Georgia, don't 495 00:30:30,480 --> 00:30:33,400 Speaker 1: be on there, please don't be on there. And it's 496 00:30:33,440 --> 00:30:36,400 Speaker 1: not. And we will count them down from five to 497 00:30:36,440 --> 00:30:40,320 Speaker 1: create suspense. Idaho is number five for hate groups, evidently. 498 00:30:41,320 --> 00:30:44,720 Speaker 1: Wyoming is number four. You got Arkansas at number three, 499 00:30:45,720 --> 00:30:49,280 Speaker 1: Mississippi at number two, two from the South, and then, 500 00:30:49,520 --> 00:30:52,960 Speaker 1: um, number one according to the Southern Poverty Law Center 501 00:30:53,120 --> 00:30:59,719 Speaker 1: is Montana. Yep. That's, you know, Montana. Grab your guns, fellas. Yeah. Well, 502 00:30:59,760 --> 00:31:01,960 Speaker 1: there's a lot of militias in Montana. Yeah, but there's 503 00:31:02,000 --> 00:31:04,560 Speaker 1: also a lot of, like, super chill, cool, like, fly 504 00:31:04,680 --> 00:31:08,240 Speaker 1: fishing and, uh, microbrew-drinking hippies out there. It's an 505 00:31:08,280 --> 00:31:10,720 Speaker 1: interesting mix. And I've spent time there, and I saw 506 00:31:10,800 --> 00:31:13,240 Speaker 1: both in this town, and I could feel 507 00:31:13,280 --> 00:31:18,680 Speaker 1: the friction even between those groups, like with an Indian burn.
Yeah, 508 00:31:18,720 --> 00:31:19,960 Speaker 1: like, I was out in a 509 00:31:20,160 --> 00:31:22,480 Speaker 1: saloon having a good time with some locals, and 510 00:31:22,520 --> 00:31:25,320 Speaker 1: then a couple of, like, cowboys came in that didn't 511 00:31:25,360 --> 00:31:28,280 Speaker 1: like the people from L.A. being in there, and 512 00:31:28,680 --> 00:31:31,760 Speaker 1: you could, like, definitely sense there's two different types of 513 00:31:31,760 --> 00:31:35,719 Speaker 1: people in Montana. There's probably more than two, but I'm generalizing. No, 514 00:31:35,840 --> 00:31:39,520 Speaker 1: there's two. There's just two. Okay, hate groups and hippies. 515 00:31:39,920 --> 00:31:44,080 Speaker 1: So, um, Chuck, you got some stats for us? Uh, yeah, 516 00:31:44,120 --> 00:31:50,800 Speaker 1: you dug this up, right? On who people hate: acquaintances, friends, 517 00:31:51,400 --> 00:31:56,080 Speaker 1: family members twelve percent. That's sad. Ex-boyfriends and girlfriends 518 00:31:56,080 --> 00:32:01,120 Speaker 1: twelve percent. And within the family, it's fathers who are hated 519 00:32:01,440 --> 00:32:08,840 Speaker 1: the most, then mothers-in-law, and siblings at three percent. 520 00:32:08,960 --> 00:32:11,320 Speaker 1: That's kind of sweet. That's surprising to me, though. I 521 00:32:11,320 --> 00:32:13,360 Speaker 1: would think siblings would be the highest because they're the 522 00:32:13,400 --> 00:32:15,880 Speaker 1: ones that beat the tar out of you most frequently 523 00:32:15,920 --> 00:32:18,760 Speaker 1: in most families. All right, so do you hate people? 524 00:32:19,040 --> 00:32:23,640 Speaker 1: Let's finish up with that. Um, I have found 525 00:32:23,720 --> 00:32:27,120 Speaker 1: that the best way to hate somebody is to just 526 00:32:28,120 --> 00:32:32,000 Speaker 1: check them off. So you'll write someone off but not 527 00:32:32,080 --> 00:32:34,800 Speaker 1: have that active hatred.
I don't generally, like, I 528 00:32:34,840 --> 00:32:37,200 Speaker 1: will just be like, I can't believe you wore that sweater, 529 00:32:37,320 --> 00:32:40,560 Speaker 1: you fat pig idiot, in my head, but it's usually 530 00:32:40,600 --> 00:32:44,000 Speaker 1: because I'm in, like, a bad mood about something else. 531 00:32:44,920 --> 00:32:47,680 Speaker 1: Like, I don't walk around just actively hating people. It's 532 00:32:47,680 --> 00:32:50,360 Speaker 1: a waste of time. It's a total waste of time. 533 00:32:51,320 --> 00:32:54,800 Speaker 1: I don't think I ever hated anybody. I had 534 00:32:54,800 --> 00:32:59,239 Speaker 1: a situation: an ex-girlfriend shacked up with one 535 00:32:59,240 --> 00:33:00,880 Speaker 1: of my best friends after I moved 536 00:33:00,960 --> 00:33:04,000 Speaker 1: states, and we were broken up, quote unquote, but I 537 00:33:04,040 --> 00:33:07,320 Speaker 1: also was like, I'm coming back for you, like, you know, 538 00:33:07,400 --> 00:33:11,120 Speaker 1: this isn't over. You were going to find work in California? 539 00:33:11,360 --> 00:33:14,760 Speaker 1: I was going west in my wagon, and 540 00:33:15,120 --> 00:33:18,040 Speaker 1: they shacked up pretty quick after I left. And I 541 00:33:18,040 --> 00:33:21,000 Speaker 1: had, like, a few years of 542 00:33:21,680 --> 00:33:24,280 Speaker 1: periodic bad dreams. It wasn't like every day I woke 543 00:33:24,360 --> 00:33:26,840 Speaker 1: up thinking about it, and it faded away. But it 544 00:33:26,880 --> 00:33:29,720 Speaker 1: was never even hate. It was just like, man, you 545 00:33:29,760 --> 00:33:33,120 Speaker 1: had to do that? Really? Yeah. I was just like, 546 00:33:33,160 --> 00:33:36,640 Speaker 1: that sucks. Don't do that, friends. That's one 547 00:33:36,720 --> 00:33:38,520 Speaker 1: end of the spectrum.
The other end of the spectrum 548 00:33:38,600 --> 00:33:42,440 Speaker 1: is, like, people who go and, like, kill those people, 549 00:33:42,600 --> 00:33:45,280 Speaker 1: those two people. Well, yeah, and that's, like, former famous 550 00:33:45,320 --> 00:33:47,920 Speaker 1: football stars, and that's that. I think it's all in 551 00:33:47,960 --> 00:33:50,640 Speaker 1: the wiring. You're wired a certain way, and I'm not 552 00:33:50,720 --> 00:33:54,480 Speaker 1: wired to indulge those kinds of things. I suspect 553 00:33:54,520 --> 00:33:57,000 Speaker 1: that all has to do with the amygdala. You think? 554 00:33:57,760 --> 00:34:00,200 Speaker 1: All right, well, if you want to learn more about 555 00:34:00,200 --> 00:34:02,720 Speaker 1: the amygdala, you can type that word into the search 556 00:34:02,760 --> 00:34:05,080 Speaker 1: bar at how stuff works dot com. You can also type 557 00:34:05,080 --> 00:34:07,400 Speaker 1: in the word hate to bring up the article that 558 00:34:07,480 --> 00:34:10,239 Speaker 1: we worked off of today. I should point out, too, 559 00:34:10,320 --> 00:34:13,239 Speaker 1: Josh, that I made right with the dude years later, 560 00:34:14,080 --> 00:34:18,960 Speaker 1: and, uh, never made right with the girl. What does 561 00:34:19,000 --> 00:34:21,200 Speaker 1: that say? I think it says that you hated the 562 00:34:21,200 --> 00:34:24,680 Speaker 1: girl more. I just never felt the need to dredge 563 00:34:24,680 --> 00:34:26,920 Speaker 1: that back up with her. Gotcha. But the dude, I 564 00:34:26,960 --> 00:34:28,480 Speaker 1: was like, man, you can't have, like, an old friend 565 00:34:28,480 --> 00:34:30,080 Speaker 1: that you're not friends with anymore. At least I can't. 566 00:34:30,640 --> 00:34:33,400 Speaker 1: I don't like that stuff. Yeah, man, I don't like 567 00:34:33,440 --> 00:34:35,799 Speaker 1: that hanging over my head. Okay, try to make it right. 568 00:34:35,880 --> 00:34:40,200 Speaker 1: That's what I say.
You done? Now I'm done. Sorry. Anyway, 569 00:34:40,239 --> 00:34:42,239 Speaker 1: I think, did I even say handy search bar? You 570 00:34:42,280 --> 00:34:45,480 Speaker 1: totally threw me off. All right, well, handy search bar at 571 00:34:45,560 --> 00:34:47,920 Speaker 1: how stuff works dot com. I said that, Chuck. So 572 00:34:48,000 --> 00:34:54,160 Speaker 1: that means it's your turn for listener mail. Yes, Josh, 573 00:34:54,200 --> 00:34:56,840 Speaker 1: this is on suicide bombing, and this guy Nick brings up 574 00:34:56,840 --> 00:34:58,640 Speaker 1: a very good point that I think kind of fits 575 00:34:58,640 --> 00:35:01,680 Speaker 1: in with this podcast. Okay. Hi, guys and Jerry. I 576 00:35:01,680 --> 00:35:04,560 Speaker 1: think y'all are very brave for taking on the issue 577 00:35:04,560 --> 00:35:07,920 Speaker 1: of suicide bombing. I don't know about brave, but I 578 00:35:07,960 --> 00:35:10,719 Speaker 1: appreciate it. I don't want to contribute too much to 579 00:35:10,760 --> 00:35:12,960 Speaker 1: the deluge of emails, but I would like to say 580 00:35:13,360 --> 00:35:16,320 Speaker 1: you could have more explicitly underscored something that I believe 581 00:35:16,360 --> 00:35:20,640 Speaker 1: is key to understanding suicide bombing and terrorism in general: 582 00:35:21,560 --> 00:35:25,839 Speaker 1: both are weapons of the weak and the beleaguered. Sort 583 00:35:25,880 --> 00:35:30,080 Speaker 1: of like our hate thing. Okay. Do you agree? Um, yeah, well, 584 00:35:30,080 --> 00:35:32,120 Speaker 1: I mean, we did say a suicide bomb costs 585 00:35:32,120 --> 00:35:34,960 Speaker 1: about a hundred and fifty bucks. Exactly.
He points out, 586 00:35:35,160 --> 00:35:38,600 Speaker 1: if Palestinians, for instance, had access to Predator drones and 587 00:35:38,640 --> 00:35:42,040 Speaker 1: guided missile systems rather than rocks and slingshots, I don't 588 00:35:42,080 --> 00:35:45,279 Speaker 1: think that Palestinians would resort to martyrdom. I would also 589 00:35:45,360 --> 00:35:49,279 Speaker 1: point to suicide bombings carried out by the Viet Minh 590 00:35:50,000 --> 00:35:53,160 Speaker 1: during the French occupation of Vietnam, or the example of 591 00:35:53,320 --> 00:35:57,040 Speaker 1: the Tamil Tigers of Sri Lanka, both of which movements were 592 00:35:57,080 --> 00:35:59,200 Speaker 1: secular in nature. All I want to say is it 593 00:35:59,200 --> 00:36:02,719 Speaker 1: seems like suicide bombing is a phenomenon often arising from 594 00:36:02,760 --> 00:36:06,600 Speaker 1: situations in which there is a huge asymmetry of power between 595 00:36:06,600 --> 00:36:10,920 Speaker 1: an occupying or apartheid regime and a native or 596 00:36:10,960 --> 00:36:13,680 Speaker 1: oppressed population. You guys did mention this, but I 597 00:36:13,719 --> 00:36:16,239 Speaker 1: think this dimension is at least as important to the 598 00:36:16,280 --> 00:36:20,200 Speaker 1: issue as religion or notions of martyrdom. Um, that is, 599 00:36:20,520 --> 00:36:23,160 Speaker 1: sincerely, from Nick. And I kind of agree, Nick. And 600 00:36:23,280 --> 00:36:25,680 Speaker 1: Nick is a sharp tack. He's, like, right on the money. Yeah, 601 00:36:25,719 --> 00:36:29,879 Speaker 1: thanks for that one. Wow. Okay. Well, if you think 602 00:36:29,880 --> 00:36:32,160 Speaker 1: you're a sharp tack, we want to hear from you. Right, Chuck? 603 00:36:32,320 --> 00:36:35,160 Speaker 1: That's right. Send us an email about anything at all, 604 00:36:35,239 --> 00:36:40,600 Speaker 1: anything at all, to Stuff Podcast at how stuff works 605 00:36:40,640 --> 00:36:45,440 Speaker 1: dot com.
Stuff You Should Know is a production of 606 00:36:45,440 --> 00:36:48,759 Speaker 1: I Heart Radio. For more podcasts from I Heart Radio, visit 607 00:36:48,800 --> 00:36:51,600 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 608 00:36:51,680 --> 00:36:52,960 Speaker 1: listen to your favorite shows.