Speaker 1: Brought to you by the reinvented two thousand twelve Camry. It's ready. Are you? Welcome to Stuff You Should Know from HowStuffWorks dot com. Hey, and welcome to the podcast. I'm Josh Clark. With me as always is a chipper and cheerful Charles W. "Chuck" Bryant. Man, I'm going ten different directions, buddy. Yeah, I'm a little screwy. Are you? Yeah. Well, focus on this one. Okay, okay, because we're going in one direction, and that's hate. I hate to focus. Okay. You hate broccoli. I do hate broccoli, and you know that. I also hate peas, like split peas. I remember declaring, um, as a child that peas are some of my most hated enemies. I think a lot of kids don't like peas because they're mushy. Yeah. Well, that's the problem with all vegetables, really. They're mushy, they're overcooked. If you cook something... No, I've had pretty nasty broccoli. But broccoli, that's all separate. It's just disgusting in every single way. But creamed spinach, I love that. It's awesome. Yeah, that's good stuff. You and I shared a creamed spinach at Morton's steakhouse recently. Two ladies.
Speaker 1: Yeah, it was something we couldn't even finish, it was so rich. It was really good. So Chuck, we don't hate creamed spinach; I hate broccoli. And, um, one of the things I hate more than anything else is not having an intro, which I don't, because I was looking online and, strangely, the online world is a repository for hate in a certain way, as in, like, um, neo-Nazi punk bands. No, huh, this article calls it pop music. Pop music? Yeah. Um, or, you know, Facebook groups dedicated to hate, like, you know, Holocaust denial and that kind of stuff. Sure. Um, but this word is so ubiquitous in our culture that there was nothing there. Like, I found a guy in Massachusetts who was accused of a hate crime. Um, everybody wants to know why, um, Cleveland fans hate LeBron. I can answer that. But I mean, like, we throw this word around, like, you know, some reality TV series was "the show you love to hate," right? Um, we use this word a lot. But yeah, I found a study out of the University of Texas that asked people how often they hated, and, um, nobody said every day.
Speaker 1: It's not an everyday thing. So, like, we hate things like broccoli, but we also realize that there's a real distinction between hating something and experiencing actual hate. You hit it on the head, and this is a pretty old distinction, right? Like, philosophers have been aware of this before. I think Aristotle was pretty sure he hated peas, but he really hated hemlock. Yeah. And he's not Webster, so I will read his definition, because he's Aristotle. Uh, he said it was a dislike for someone, based on our negative perception of that person's nature, that is so intense that whoever feels it wants to cause real harm to another. Like, I really want to harm you. Yeah, so that's the difference. Like you said, people throw that word around. I hate broccoli, but you're not gonna go out and try and burn down broccoli farms. No, I know, that's silly. I'm not gonna go burn down Cubby Broccoli's family's broccoli farm. It was used to fund the James Bond movies.
Speaker 1: But Josh, I think, and this is me surmising in my own personal purview, I think there are kind of two types of hate. Well, three types, really. The one type where you just throw the word around, like "I hate that show," "I hate broccoli." One that is real hate, which I think is fear-based, when you don't know someone personally, or a group personally, where you hate a group of people. And then there's, like, the anger, retribution-based hate, like someone personally has wronged you so badly that you hate them and either want to cause or wish ill upon them. Right. Well, you just brought up a huge can of worms by using the word anger. Like, there's a real debate over whether hate and anger are the same thing. Right, right. They say they're not. It depends on who you talk to. But the people who say they're not say things like, um, hate is brought on by humiliation or ill treatment or being devalued, where anger is brought on when you're treated, um, in a way that you consider unfair. Right, right. Anger is the result of, um, not having any recourse.
Speaker 1: Right, frustration, perhaps, coupled with that. Right, and that kind of dances along the border, because people who hate, you know, other groups often are frustrated. Like when we talked about fascism in the fascism podcast: getting, um, getting groups all riled up against a scapegoat is one of the tenets of fascism, right? And so these people are frustrated at their lot in life. Their unemployment's high because of the Jews, or something like that, right? But really they're not. They're angry about their job while they hate the Jews. So the two are really intertwined, but there's a lot of people who think, if you look at them deeply enough, they're not one and the same. Well, I think a lot of times that kind of hate is displaced anger and frustration at your own, you know... Yeah, but there is also a very, um, strong physiological basis to it as well. I mean, it's an emotion, supposedly, although it's not one of the basic emotions. Anger is. Yeah, what are the basic emotions? Anger, joy, fear, disgust, and peckishness. I thought it was joy, pain, sunshine and rain. No. Is that Rob Base? No?
Speaker 1: I can't remember. I could sing it, but I can't remember. Sing it. No, no, no. I think it's Rob Base. No, it was a duo. Oh no, no, I'm thinking of, uh, "I want money, lots and lots of money." That was a duo. "To be rich." Remember that stupid song? Yeah, kind of. They wrote a song about being rich. Oh yeah, how great it was, and that was their only song. So unless they were already rich, then they never were, from that song. That makes sense. Yeah, it does. Just blew my mind, buddy. So do you hate that song? Uh, I do now, because it's in my head. So, Chuck, what is this, um, physiological basis of anger? Well, it's pointed out in the article with an Iron Maiden song, which I thought was an odd choice: "There's a thin line between love and hate." Yeah. It's like, there's a whole other song called "There's a Thin Line Between Love and Hate." Well, there's a much more popular song, I think, the Persuaders, which was "It's a Thin Line Between Love and Hate," the old Motown song. Right. Have you ever heard the Pretenders' version of it?
Speaker 1: No. It's hands down the greatest version ever. Really? "It's a thin line between love and hate..." Hey, the Pretenders covered the Persuaders? And... yep. All right, I'm telling you. All right. So apparently Iron Maiden... I actually listened to that song on YouTube the other day, and it's an Iron Maiden song. Yeah, now, I looked it up to make sure that Iron Maiden hadn't covered the Persuaders, and, uh, no, Bruce Dickinson came up with his own lyrics, his own version. He's like, that one's fine, I'm doing this one. That's right. So the point of all this, Josh, is that there is a thin line between love and hate as far as, uh, the brain goes. Because, um, in two thousand and eight there was a study at University College London, and that's in the UK, and, um, they got seventeen people. Not very wide-ranging. I had a lot of problems with this study. But they got seventeen people who say they hated someone else. Maybe that's why they... maybe they have a hard time finding someone who hates someone else. Maybe not, because I don't hate anyone. I was about to ask you that.
Speaker 1: Well, we'll get to the personal stuff in a minute. So this study, what they did was they found seventeen people who hated someone else, threw them under the old wonder machine, and showed them pictures of the people they hated, which is so funny, to record the results. I guess they're like, "You need to bring pictures of people you hate for the study." Yeah, they could have just said think of the person you hate, I think, and it would achieve the same goal, I guess. So anyway, what they found out was that a couple of regions in the brain, there's like a hate circuit, they call it: the putamen, okay, and Jerry laughed at that, and the insular cortex. Both fired up with pictures of people that they hated, right. And the significance of this is that both of those regions also fire up when you see a picture of or think about someone you love, which is the longest way to say it's a thin line between love and hate, right. And I think everybody kind of senses that. It's like, um, when passions flare, it's virtually the same thing. They're two sides of one coin.
Speaker 1: In my opinion, if you truly hate somebody... the real hate to fear is not the one where somebody's like, "Oh, I hate you so much," you know, because that can be turned. That means that they have some sort of emotional connection to you. The one to be afraid of is the detached, calm, cool kind of hatred, because that's the one where you end up dead somewhere. Like, I'm the Green River Killer and I hate prostitutes. Well, that brings up an interesting sidebar, right? Um, do serial killers hate their victims? No. End of sidebar. Well, they have long said that serial killers don't experience emotion on that scale, but they're starting to, uh, change their thinking in certain cases, because a lot of serial killers suffer from antisocial personality disorder, and people who suffer from that experience a range of emotions. So it's not always... I think it's both. You know, you can't say every serial killer is the same. Well, they've been saying that for a long time. They've been trying to find the threads that connect them. And I told you about the sociologist I talked to.
Speaker 1: He was just really up in arms that psychology has spent four decades or so looking at serial killers and has come up with antisocial personality disorder. He's like, of course they have a personality disorder, they're serial killers. That's a good point. So back to that study, though, about the brain and the difference between love and hate: they did see, um, a difference, a key difference, because the areas of the frontal cortex associated with judgment and critical thinking become less active when you see someone you love on the fMRI machine, but when you saw someone you hate, most of your frontal cortex remains active. Yeah, so that's a big difference. But that makes sense as well, Chuck, because, I mean, if you see... I know you don't hate anybody, so you wouldn't understand this, but when you see someone you hate, it's like it's personal. You tend to criticize them in your head, like, "Oh, you're wearing that sweater today. You look so fat and stupid in that sweater."
Speaker 1: "I hope you somehow get..." Did you strangle yourself on that sweater? Yeah. So the point is that hatred is an active thing. It's an active rumination. It's not a knee-jerk thing, like when you might see a picture of someone you love. All right. So that's interesting, right? That's what that study came up with from the set of seventeen people. Yeah, with this... yeah, it couldn't, you know. And the other problem is, I'm sure they were, um, WEIRD: Western, educated, uh... something... rich, and developed. I can't recall what the I stands for. And wouldn't that just spell "weird"? It's basically, like, the idea that all of these studies that are cited, a lot of them, they're just college kids. So it's like this really narrow niche of the human population that they extrapolate onto. Good point. And in this case they just used seventeen of them. Well, we're here to report it and then criticize it, and we're done. We did both, that's right. Uh, what's the deal with, like, old hate, though? Like, don't they have some inclination of, like, early hate, with cavemen and the like?
Speaker 1: Well, yeah, because parts of the... you know, the closer to the center, or the brain stem, that you get in the brain, the more ancient that part of the brain is. And if there's a region like the putamen that's associated with a certain thing, e.g., hate, then that means that hate's been around for a very long time, because that part of the brain, uh, has been able to carry out that function for that long, ostensibly. Gotcha. Right. But then it's also new, with the prefrontal cortex, which is a fairly newer aspect. So maybe we just hated, but we didn't criticize. We just hated. And they think, possibly, that we developed hate as a species, or a capacity to hate, as a survival mechanism way back in hunter-gatherer days, where we could feel justified by, say, taking food from another group because we hated them, which I actually found pretty... that's a pretty inspiring idea. Yeah, that you had to work up hate enough to go and pillage. I think that's kind of neat, because it makes it seem like we aren't naturally hateful beings, and I don't think we are.
Speaker 1: I believe everybody has hate, and everyone has a vast capacity to hate, but I wouldn't characterize us as generally hateful. That's good. I'm kind of surprised to hear you say that. That's true, though. All right. So, uh, it's in the Bible, Josh. It's in ancient texts all over the place. Hate's been around a long time. Mm, right. We gonna talk about Carthage? Yeah, because I know you love this. The Carthaginian general Hannibal... "Carthaginian," and you gotta stop that... Hannibal pledged to his father: "Dad, I hate Rome. I hate Romans. I don't like the Italians. I hate them forever, and I will swear retribution, because they have seized our provinces." He said, "Father..." "Yes, son?" "I'm going to kill the Romans." And he did. He made good on that, invaded Italy, and did quite a bit of damage. Of course, the Romans fired back, because they hated the Carthag... why can't I say that word? They hated people from Carthage, the Carthaginians, and in one forty-six BC they did some pretty bad things, like burning them in their houses while they screamed.
Speaker 1: But is that hate? Like, I don't know. And that's... I think that's an issue that I have, um, here and there with this: that's kind of a jump to a conclusion. Like, is it hate? I don't know. Does hate form the basis for war, or horrible acts in war? Well, I don't know, because it's condemned pretty much in, like, the New Testament in the Bible. It's condemned in the Koran: "Let not hatred of the people incite you not to act equitably." And in, uh, medieval and Renaissance Europe... you came up in... but in Italy they came up with the vendetta, which is, uh, very much retribution born of hatred. There you go. I... let's see, that's what I'm saying. Like, I think... let's say a Roman soldier comes to your town while you're away using the, uh, latrine pit that your village has dug, and they burn your family alive in your house while you're using the bathroom. And you come back and you see the Roman legions going away, and your family is dead, burned to crisps. Right. Um.
Speaker 1: I don't think the Romans necessarily felt hatred to commit that act, but that act would incite hatred in the person that it befell. So I think the vendetta is an excellent example of hatred, because somebody done you wrong and you're gonna get back at them, or they did something to your father or something. The vendetta is very long-lasting, from what I understand. Yeah, and it's not... I mean, obviously we're talking about mafia vendettas and war vendettas, but it can happen on a smaller level. You might not think of it as a vendetta, though. But if someone done you really wrong, and you're like, "I'm gonna get that person back by doing this in six months, when they least expect it," that's a vendetta. But you don't call it a vendetta. It's just, uh... well, in Italy, they just come up and say, "Sir, um, I'm gonna get you, sucka." Bad people do that, though. There was a word for it, though, in medieval and Renaissance Europe: inimicitia, which is Latin for "unfriendship." It was a legal term for hating somebody.
Speaker 1: Okay, so what we've done is established that hatred is definitely a thing that's been around a long time. Is that what we've done? Yeah. And, Chuck, of course it's still around, um, in recent modern history. There are other examples that we could go into, like hate groups. Yeah. Well, let's talk about the Nazis real quick, because again, we talked about fascism and one of the tenets of it being, um, I guess, inciting one group to hate another. Yeah, group hate. That's where we are, for sure. And a lot of that gave a lot of... a body of data for people to study, and they're still studying it. But, um, one guy in particular, named Martin Oppenheimer, is a sociologist from Rutgers University. Um, he basically said, like, look, the Nazis are proof positive that, number one, you can get an entire group to hate another group, and that you do this by, um, identifying and exploiting the frustrations of the group that you're with, say unemployment, joblessness, and then basically saying: those are the people who are at fault. That's how you stir the pot, exactly.
Speaker 1: That's how you incite hatred, which has gotta be one of the worst things you can do. One of the worst nonviolent acts I think a person can commit is to incite hatred, you know. Yeah. And also, what came to mind for me when I was reading this was that some of these same tactics... like, a marginalized people, people who are insecure, who are seeking safety somewhere. It's also kind of the same thing they do with cults and brainwashing. They're seeking out these same types of people and saying, hey, you feel marginalized, you feel like you're not loved, you need a safe haven. But they're not saying go hate someone else. They're saying just come and be with our group. Well, our association of in-group and out-group is like this emotional, psychological razor blade that can be exploited in any number of ways, exactly. You know, but it's always a marginalized people, it seems like. Yeah, yeah. Or... but you mean the people who are stirred, who have hatred stirred up in them? Yeah, or who go join a cult or something like that. Yeah. You mean teenagers?
330 00:19:45,920 --> 00:19:48,399 Speaker 1: And uh, well, a Stanford study in two thousand and 331 00:19:48,440 --> 00:19:51,679 Speaker 1: ten basically said, hey, if you want to, um, teach 332 00:19:51,960 --> 00:19:54,760 Speaker 1: teenagers to hate, here's how you do it. You can't 333 00:19:54,840 --> 00:20:00,600 Speaker 1: just overtly say go hate this group, you know, hate Muslims, 334 00:20:00,600 --> 00:20:05,480 Speaker 1: hate black people, hate Jewish people, hate gays. You can't 335 00:20:05,520 --> 00:20:08,040 Speaker 1: just say that. It's not good enough. But if you 336 00:20:08,080 --> 00:20:13,959 Speaker 1: tell a story that basically implies these are 337 00:20:13,960 --> 00:20:17,399 Speaker 1: people you should hate and here's why, right, like, um, 338 00:20:17,440 --> 00:20:21,800 Speaker 1: homosexuals are pederasts, and so you know you can't 339 00:20:21,880 --> 00:20:24,760 Speaker 1: let them into certain groups, and by the way, you 340 00:20:24,760 --> 00:20:30,280 Speaker 1: should hate them because of the story. Right. Then I 341 00:20:30,320 --> 00:20:32,760 Speaker 1: think that I had a problem with that one, because 342 00:20:32,800 --> 00:20:35,080 Speaker 1: it was like, that's true for everything. If you tell 343 00:20:35,080 --> 00:20:38,440 Speaker 1: the story, it's gonna hit home more personally for somebody. Yeah, 344 00:20:38,520 --> 00:20:42,720 Speaker 1: you can't say, hey, go love, uh, Seabiscuit because 345 00:20:42,720 --> 00:20:45,119 Speaker 1: he ran a horse race that was pretty neat. But 346 00:20:45,160 --> 00:20:47,359 Speaker 1: if you tell the story of Seabiscuit, all of 347 00:20:47,400 --> 00:20:49,720 Speaker 1: a sudden you're gonna leave that theater going, man, I'm 348 00:20:49,760 --> 00:20:52,359 Speaker 1: getting my butt to the Kentucky Derby next year, because 349 00:20:52,400 --> 00:20:55,160 Speaker 1: I love me some horse racing. I love Seabiscuit.
350 00:20:55,280 --> 00:20:59,359 Speaker 1: So you saw the movie, right? No, I didn't. Um, 351 00:20:59,600 --> 00:21:02,080 Speaker 1: but the funny thing is that that whole 352 00:21:02,119 --> 00:21:05,560 Speaker 1: study made the careers of two Stanford researchers. Right. 353 00:21:05,600 --> 00:21:07,719 Speaker 1: But they do have a point, because they point out 354 00:21:07,720 --> 00:21:10,080 Speaker 1: in this article, or they don't, but we do, um, 355 00:21:10,240 --> 00:21:16,160 Speaker 1: D.W. Griffith's awful, um, movie, awful on content, Birth 356 00:21:16,160 --> 00:21:19,280 Speaker 1: of a Nation from nineteen fifteen. It's no Seabiscuit. 357 00:21:19,480 --> 00:21:21,480 Speaker 1: It's no Seabiscuit, but it did a really good 358 00:21:21,600 --> 00:21:25,399 Speaker 1: job of getting people to hate black people in the 359 00:21:25,480 --> 00:21:29,160 Speaker 1: United States. Yeah, doesn't it feature, like, the... Well, since 360 00:21:29,160 --> 00:21:31,240 Speaker 1: it was nineteen fifteen, it's like the first of everything. 361 00:21:31,640 --> 00:21:35,560 Speaker 1: But it's like the first on-screen rape, uh, or 362 00:21:35,640 --> 00:21:38,320 Speaker 1: implied rape. There was a rape of a white woman 363 00:21:38,320 --> 00:21:41,200 Speaker 1: by, like, an escaped slave, I think, played by a white 364 00:21:41,200 --> 00:21:45,520 Speaker 1: actor in blackface, of course, at the time. And, um, 365 00:21:45,560 --> 00:21:48,080 Speaker 1: it was a big, huge movie. It grossed ten million 366 00:21:48,080 --> 00:21:53,280 Speaker 1: dollars in nineteen fifteen. That's two hundred, that's two hundred 367 00:21:53,720 --> 00:21:56,679 Speaker 1: sixteen million dollars today, is what it grossed. Yeah. And it 368 00:21:56,720 --> 00:21:59,960 Speaker 1: was based on a play. Yeah, it is. It's based 369 00:22:00,000 --> 00:22:03,080 Speaker 1: on a play and a book called The Clansman. And 370 00:22:03,480 --> 00:22:06,760 Speaker 1: D.W.
Griffith felt so bad about this afterward that 371 00:22:07,440 --> 00:22:10,520 Speaker 1: he made a follow-up film the next year called Intolerance, 372 00:22:11,080 --> 00:22:15,399 Speaker 1: which was a three-hour silent film meditation on, uh, 373 00:22:15,520 --> 00:22:20,320 Speaker 1: four parallel stories of man's intolerance throughout history. Oh, I 374 00:22:20,359 --> 00:22:23,159 Speaker 1: didn't know he did that. That's good. Yeah, because I 375 00:22:23,200 --> 00:22:25,840 Speaker 1: want to like D.W. Griffith. Yeah, I mean, he 376 00:22:25,880 --> 00:22:30,600 Speaker 1: didn't write Birth of a Nation, he just directed it. 377 00:22:31,000 --> 00:22:33,040 Speaker 1: Not like that's getting him off the hook or anything, but 378 00:22:33,320 --> 00:22:34,800 Speaker 1: I think at the time he was just trying to 379 00:22:35,080 --> 00:22:37,840 Speaker 1: make a movie that sold a lot of tickets, and that 380 00:22:37,880 --> 00:22:39,440 Speaker 1: was the way to do it. Yeah, that's the way 381 00:22:39,480 --> 00:22:42,320 Speaker 1: to do it. And then the Nazis, of course. Anyone 382 00:22:42,320 --> 00:22:48,560 Speaker 1: who saw Inglourious Basterds knows that Goebbels, Joseph Goebbels, was 383 00:22:48,640 --> 00:22:51,880 Speaker 1: in charge of, you know, the propaganda department, with feature films, 384 00:22:52,520 --> 00:22:55,480 Speaker 1: and they had one called Jud Süss. Is it Judd 385 00:22:55,560 --> 00:22:59,000 Speaker 1: or Yud? Probably it'd be Yud Süss. So you're the 386 00:22:59,000 --> 00:23:01,200 Speaker 1: one who speaks German. And how did you say 387 00:23:01,320 --> 00:23:03,520 Speaker 1: Jud Süss? I don't know. I was concentrating on the 388 00:23:03,640 --> 00:23:08,840 Speaker 1: umlaut part, and that's just okay.
So yeah, but that 389 00:23:08,920 --> 00:23:12,960 Speaker 1: featured a main character, a Jewish main character, who was 390 00:23:13,000 --> 00:23:15,320 Speaker 1: shunned by a gentile woman and so he raped her, 391 00:23:16,200 --> 00:23:20,800 Speaker 1: among other things, and it was required viewing for the stormtroopers. 392 00:23:22,400 --> 00:23:24,400 Speaker 1: They loved it. And then they'd give them crystal meth. 393 00:23:25,240 --> 00:23:29,320 Speaker 1: Really? Yeah, from what I understand. That will do it. Um, 394 00:23:29,400 --> 00:23:32,720 Speaker 1: and that didn't just go out with the Nazis. Um, 395 00:23:33,200 --> 00:23:35,280 Speaker 1: media has been playing, like, more and more of a 396 00:23:35,400 --> 00:23:41,440 Speaker 1: role, um, in, I guess, hatred as a 397 00:23:41,440 --> 00:23:46,960 Speaker 1: concept and as a practice. Right, yeah, because, um, 398 00:23:47,080 --> 00:23:51,680 Speaker 1: I think in the nineties, uh, Bosnian Serb TV showed, 399 00:23:51,880 --> 00:23:54,479 Speaker 1: um, something that's kind of referred to now as, like, 400 00:23:54,520 --> 00:23:59,840 Speaker 1: basically a hate-mongering, um, series called Genocide that stirred 401 00:24:00,119 --> 00:24:07,199 Speaker 1: up emotion against the Bosnian Muslims. Right, yeah. Uh, 402 00:24:07,200 --> 00:24:09,960 Speaker 1: and, uh, well, you know what happened with that in 403 00:24:09,960 --> 00:24:12,600 Speaker 1: the Balkan War. Yeah. Al Qaeda has done similar things on 404 00:24:12,640 --> 00:24:15,360 Speaker 1: the web. Obviously, the web is a good place to 405 00:24:15,359 --> 00:24:17,359 Speaker 1: go try and get this thing done these days. 406 00:24:18,240 --> 00:24:21,040 Speaker 1: And they have chat rooms, they have forums, with Facebook 407 00:24:21,040 --> 00:24:27,480 Speaker 1: becoming increasingly, um, available for people who have, um, hate 408 00:24:27,480 --> 00:24:31,600 Speaker 1: based ideologies.
Um, and Facebook is like, look, we can't, 409 00:24:32,200 --> 00:24:34,400 Speaker 1: we can't do much. I mean, we'll find them and shut 410 00:24:34,440 --> 00:24:37,280 Speaker 1: them down when we can, but, like, 411 00:24:37,320 --> 00:24:43,040 Speaker 1: they're all over the place. Yeah. Um, and then, uh, 412 00:24:43,240 --> 00:24:46,960 Speaker 1: also, Chuck, pop music. Yeah, they called it pop music. 413 00:24:47,600 --> 00:24:49,880 Speaker 1: And the reason I know you can't call it pop music 414 00:24:49,920 --> 00:24:52,800 Speaker 1: is because I've seen some of those specials, and I 415 00:24:52,840 --> 00:24:55,000 Speaker 1: saw a really good one, I can't remember where, on neo-Nazis, 416 00:24:55,040 --> 00:24:58,439 Speaker 1: and they have, you know, they have musical groups that 417 00:24:59,080 --> 00:25:02,919 Speaker 1: sing neo-Nazi songs, and they just sing about hating 418 00:25:02,920 --> 00:25:05,760 Speaker 1: other people, and it's, you know, it's aggressive music. 419 00:25:05,800 --> 00:25:09,880 Speaker 1: It's not pop. There's no 'N Sync. No, it's not Hanson. 420 00:25:10,280 --> 00:25:14,800 Speaker 1: So, um, Chuck, the article begs a pretty, um, interesting question, 421 00:25:14,840 --> 00:25:21,080 Speaker 1: I think, um: is hate a mental illness? Because, you know, 422 00:25:21,280 --> 00:25:24,000 Speaker 1: don't you have to be slightly mentally ill to burn 423 00:25:24,080 --> 00:25:28,480 Speaker 1: down a house with an entire family trapped inside? Or 424 00:25:28,520 --> 00:25:33,520 Speaker 1: maybe you're just following orders? Okay. Yeah, excellent. I think 425 00:25:33,560 --> 00:25:38,880 Speaker 1: you just hit upon it. Our understanding of hate is incomplete, 426 00:25:39,160 --> 00:25:43,439 Speaker 1: because our understanding of the things that we do that 427 00:25:43,480 --> 00:25:47,560 Speaker 1: we associate with hate is also incomplete. Are you just 428 00:25:47,600 --> 00:25:50,920 Speaker 1: following orders?
Are you being whipped up into a mob mentality? 429 00:25:51,440 --> 00:25:54,240 Speaker 1: Do you actually hate this other group because you lost 430 00:25:54,280 --> 00:25:58,040 Speaker 1: your job? Or is this emotion just being exploited by 431 00:25:58,119 --> 00:26:02,920 Speaker 1: someone else, a third party? Um, and also 432 00:26:03,000 --> 00:26:06,040 Speaker 1: I think our understanding of mental illness isn't refined enough 433 00:26:06,080 --> 00:26:08,920 Speaker 1: to say, yes, hate's the product of a mental illness. Sure, 434 00:26:09,000 --> 00:26:12,000 Speaker 1: because they referenced, um, Hitler and Osama bin Laden as 435 00:26:12,000 --> 00:26:15,880 Speaker 1: two people they suspect might have been mentally ill, um, 436 00:26:16,000 --> 00:26:20,159 Speaker 1: or at least antisocial. And they also referenced the 437 00:26:20,440 --> 00:26:24,879 Speaker 1: Columbine shooters, as one of them suffered from depression and 438 00:26:24,880 --> 00:26:27,240 Speaker 1: they had these hate-filled rants that they ended up finding. 439 00:26:27,880 --> 00:26:31,880 Speaker 1: And was there a link between that depression and hatred? Right? 440 00:26:32,640 --> 00:26:35,159 Speaker 1: And I guess that begs the question, like, 441 00:26:35,320 --> 00:26:38,800 Speaker 1: were they? Was Osama bin Laden and 442 00:26:38,880 --> 00:26:41,760 Speaker 1: Hitler and Dylan Klebold, like, so wrapped up in hatred 443 00:26:41,800 --> 00:26:46,639 Speaker 1: that they were crazy? Or was, um, hatred a byproduct 444 00:26:47,200 --> 00:26:50,159 Speaker 1: of, you know, any mental illness they may or may 445 00:26:50,200 --> 00:26:53,000 Speaker 1: not have had? These are questions we don't know.
But 446 00:26:53,680 --> 00:26:57,360 Speaker 1: my whole idea that hatred is brought out when you 447 00:26:57,440 --> 00:27:03,240 Speaker 1: are, um, mistreated by someone else is backed up by 448 00:27:03,280 --> 00:27:07,200 Speaker 1: a two thousand study of people from Kosovo, and those 449 00:27:07,240 --> 00:27:11,520 Speaker 1: who had gone through the most trauma and stress hated 450 00:27:11,520 --> 00:27:15,080 Speaker 1: the Serbian troops who'd, um, you know, brought that down 451 00:27:15,119 --> 00:27:18,760 Speaker 1: on them more than other people who maybe had pleasant 452 00:27:18,800 --> 00:27:23,000 Speaker 1: exchanges with Serbian troops. I guess that makes sense. We 453 00:27:23,040 --> 00:27:27,160 Speaker 1: gotta mention hate crimes and hate groups briefly. A hate crime 454 00:27:27,240 --> 00:27:30,000 Speaker 1: is obviously a crime carried out against somebody based on 455 00:27:30,040 --> 00:27:35,240 Speaker 1: their skin color, their ethnicity, their national origin, their gender, disability, 456 00:27:35,720 --> 00:27:39,720 Speaker 1: sexual orientation. That's the one you hear a lot about. Yeah, disabilities 457 00:27:39,920 --> 00:27:41,920 Speaker 1: is a sad one, because it took a while to, um, 458 00:27:42,200 --> 00:27:46,840 Speaker 1: get that into hate crime bills. Oh, really? Yeah. Interesting. 459 00:27:47,480 --> 00:27:51,480 Speaker 1: But Congress has passed legislation now that makes hate 460 00:27:51,520 --> 00:27:54,920 Speaker 1: crimes more serious offenses than just, like, a regular assault. 461 00:27:55,000 --> 00:27:57,639 Speaker 1: Well, yeah, which is pretty awesome. Yeah, and how it 462 00:27:57,640 --> 00:28:01,119 Speaker 1: should be.
I remember, um, there was 463 00:28:01,160 --> 00:28:04,320 Speaker 1: a child safety law that was being passed in two 464 00:28:04,359 --> 00:28:08,639 Speaker 1: thousand six, and there was hate crime language that 465 00:28:08,760 --> 00:28:15,960 Speaker 1: was attached to it that made, um, sexual orientation crimes 466 00:28:16,080 --> 00:28:19,080 Speaker 1: hate crimes on a federal level, and there was a big 467 00:28:19,119 --> 00:28:22,080 Speaker 1: outrage about it among religious groups. Do you remember that? 468 00:28:22,359 --> 00:28:24,520 Speaker 1: I think they were like, wait, we have a First 469 00:28:24,560 --> 00:28:29,480 Speaker 1: Amendment freedom to hate gay people as part of our religion. 470 00:28:30,400 --> 00:28:34,000 Speaker 1: You know, so you're saying that that in 471 00:28:34,040 --> 00:28:37,359 Speaker 1: and of itself is a hate crime, by saying, like, no, 472 00:28:37,520 --> 00:28:41,080 Speaker 1: these people are wrong, homosexuality is bad, it's wrong, that 473 00:28:41,200 --> 00:28:44,120 Speaker 1: kind of thing, um. And they thought that that 474 00:28:44,240 --> 00:28:46,960 Speaker 1: kind of infringed on it, which I don't think it does, 475 00:28:47,120 --> 00:28:49,280 Speaker 1: but that was their argument for a while. I don't 476 00:28:49,320 --> 00:28:52,400 Speaker 1: think it worked. Interesting. Yeah. Uh, so I have a 477 00:28:52,480 --> 00:28:54,760 Speaker 1: list here first, Josh, and then we have a couple 478 00:28:54,840 --> 00:28:58,680 Speaker 1: more little stats, um, about hate groups. 479 00:28:59,080 --> 00:29:03,360 Speaker 1: The Southern Poverty Law Center claims that the number of hate 480 00:29:03,400 --> 00:29:06,320 Speaker 1: groups in the US has grown by more than, um, 481 00:29:06,720 --> 00:29:11,000 Speaker 1: since two thousand.
Yeah, and they had the top 482 00:29:11,080 --> 00:29:14,280 Speaker 1: five states with the biggest concentrations of hate groups. And 483 00:29:14,400 --> 00:29:16,400 Speaker 1: this one was continued on the next page, and when 484 00:29:16,440 --> 00:29:18,840 Speaker 1: I was reading it, I was like, please, Georgia, don't 485 00:29:18,880 --> 00:29:21,760 Speaker 1: be on there, please don't be on there. And it's 486 00:29:21,840 --> 00:29:24,760 Speaker 1: not. And we will count them down from five to 487 00:29:24,840 --> 00:29:28,680 Speaker 1: create suspense. Idaho is number five for hate groups, evidently. 488 00:29:29,720 --> 00:29:33,080 Speaker 1: Wyoming is number four. You got Arkansas at number three, 489 00:29:34,120 --> 00:29:38,040 Speaker 1: Mississippi is number two, two from the South, and then, um, 490 00:29:38,280 --> 00:29:42,960 Speaker 1: number one, according to the Southern Poverty Law Center, is Montana. Yep, 491 00:29:44,560 --> 00:29:48,320 Speaker 1: that's, you know, Montana. Grab your guns, fellas. Yeah, there's 492 00:29:48,320 --> 00:29:50,600 Speaker 1: a lot of militias in Montana. Yeah, but there's also 493 00:29:50,640 --> 00:29:53,280 Speaker 1: a lot of, like, super chill, cool, like, fly-fishing, 494 00:29:53,360 --> 00:29:57,280 Speaker 1: uh, microbrew-drinking hippies out there. It's an interesting mix. 495 00:29:57,920 --> 00:29:59,480 Speaker 1: And I spent time there and I saw both in 496 00:29:59,520 --> 00:30:02,480 Speaker 1: this town I was in. I could feel the friction even 497 00:30:03,680 --> 00:30:07,160 Speaker 1: between those groups, like with an Indian burn.
Yeah, like, 498 00:30:07,240 --> 00:30:09,080 Speaker 1: I was out in a saloon 499 00:30:09,480 --> 00:30:11,640 Speaker 1: having a good time with some locals, and then a 500 00:30:11,720 --> 00:30:14,040 Speaker 1: couple of, like, cowboys came in that didn't like the 501 00:30:14,920 --> 00:30:17,400 Speaker 1: people from L.A. being in there, and you could, 502 00:30:17,440 --> 00:30:21,320 Speaker 1: like, definitely sense there's two different types of people in Montana. 503 00:30:21,400 --> 00:30:24,600 Speaker 1: There's probably more than two, but I'm generalizing. No, there's two. 504 00:30:24,760 --> 00:30:29,360 Speaker 1: There's just two. Okay, hate groups and hippies. So, um, Chuck, 505 00:30:29,680 --> 00:30:32,800 Speaker 1: you've got some stats for us. Uh, yeah, you dug 506 00:30:32,880 --> 00:30:40,560 Speaker 1: this up, right, on who people hate? Acquaintances, friends, family members 507 00:30:40,600 --> 00:30:45,240 Speaker 1: twelve percent. That's sad. Ex-boyfriends and girlfriends, twelve percent. 508 00:30:46,280 --> 00:30:50,240 Speaker 1: And within the family, fathers are hated the most, 509 00:30:50,520 --> 00:30:57,200 Speaker 1: then mothers, then in-laws, and siblings at three percent. 510 00:30:57,360 --> 00:30:59,680 Speaker 1: That's kind of sweet. That's surprising to me, though. I 511 00:30:59,680 --> 00:31:01,760 Speaker 1: would think siblings would be the highest, because they're the 512 00:31:01,800 --> 00:31:04,240 Speaker 1: ones that beat the tar out of you most frequently 513 00:31:04,320 --> 00:31:07,120 Speaker 1: in most families. All right, so do you hate people? 514 00:31:07,440 --> 00:31:12,160 Speaker 1: Let's finish up with that. Um, I've found that 515 00:31:12,360 --> 00:31:16,720 Speaker 1: the best way to hate somebody is to just write 516 00:31:16,760 --> 00:31:20,320 Speaker 1: them off. So usually you'll write someone off but not 517 00:31:20,480 --> 00:31:23,120 Speaker 1: have that active hatred.
I don't generally, like, I 518 00:31:23,240 --> 00:31:25,560 Speaker 1: will just be like, I can't believe you wore that sweater, 519 00:31:25,720 --> 00:31:28,920 Speaker 1: you fat pig idiot, in my head. But it's usually 520 00:31:29,000 --> 00:31:33,360 Speaker 1: because I'm in, like, a bad mood about something else. Like, 521 00:31:33,480 --> 00:31:36,120 Speaker 1: I don't walk around just actively hating people. It's a 522 00:31:36,160 --> 00:31:39,720 Speaker 1: waste of time. It's a total waste of time. I 523 00:31:39,760 --> 00:31:43,240 Speaker 1: don't think I ever hated anybody. I had a 524 00:31:43,320 --> 00:31:47,680 Speaker 1: situation: an ex-girlfriend shacked up with 525 00:31:47,760 --> 00:31:50,640 Speaker 1: one of my best friends after I moved states, and 526 00:31:50,800 --> 00:31:52,960 Speaker 1: we were broken up, quote unquote. But I also was like, 527 00:31:53,360 --> 00:31:56,320 Speaker 1: I'm coming back for you, like, you know, this isn't over. 528 00:31:56,520 --> 00:32:00,480 Speaker 1: Were you gonna go find work in California? Out west 529 00:32:01,200 --> 00:32:04,480 Speaker 1: in my wagon? And they shacked up pretty 530 00:32:04,520 --> 00:32:07,240 Speaker 1: quick after I left, and I had, like, a few 531 00:32:07,320 --> 00:32:11,200 Speaker 1: years of, like, periodic bad dreams. I 532 00:32:11,280 --> 00:32:13,440 Speaker 1: wasn't, like, every day I woke up thinking about it, 533 00:32:14,000 --> 00:32:16,000 Speaker 1: and it faded away. But it was never even hate. 534 00:32:16,040 --> 00:32:20,480 Speaker 1: It was just like, man, why'd you gotta do that? Really? Yeah, 535 00:32:20,560 --> 00:32:24,120 Speaker 1: it's just like, that's, that sucks. Don't do that, friends. 536 00:32:24,320 --> 00:32:26,280 Speaker 1: That's one end of the spectrum.
The other end 537 00:32:26,320 --> 00:32:28,480 Speaker 1: of the spectrum is, like, people who go and, like, 538 00:32:29,080 --> 00:32:32,680 Speaker 1: kill those people, those two people. Well, yeah, and that's, 539 00:32:32,720 --> 00:32:35,880 Speaker 1: like, former famous football stars. And that's that. I think 540 00:32:35,920 --> 00:32:38,640 Speaker 1: it's all in the wiring. You're wired a certain way, 541 00:32:38,680 --> 00:32:41,520 Speaker 1: and I'm not wired to indulge those kinds of things. 542 00:32:42,240 --> 00:32:44,600 Speaker 1: I suspect that all has to do with the amygdala, 543 00:32:45,040 --> 00:32:48,080 Speaker 1: you know. All right, well, if you want to learn 544 00:32:48,160 --> 00:32:50,720 Speaker 1: more about the amygdala, you can type that word into 545 00:32:50,760 --> 00:32:52,680 Speaker 1: the search bar at how stuff works dot com. You 546 00:32:52,760 --> 00:32:55,000 Speaker 1: can also type in the word hate to bring up 547 00:32:55,040 --> 00:32:58,120 Speaker 1: the article that we worked off of today. I should 548 00:32:58,160 --> 00:33:00,760 Speaker 1: point out, too, Josh, that I made up with the dude 549 00:33:01,080 --> 00:33:04,760 Speaker 1: years later and, uh, never made it right with the girl. 550 00:33:07,080 --> 00:33:09,120 Speaker 1: What does that say? I think it says that you 551 00:33:09,200 --> 00:33:12,400 Speaker 1: hated the girl more. I just never felt the need 552 00:33:12,520 --> 00:33:15,240 Speaker 1: to dredge that back up with her. But the dude, 553 00:33:15,240 --> 00:33:16,640 Speaker 1: I was like, man, you can't have, like, an old 554 00:33:16,680 --> 00:33:18,440 Speaker 1: friend that you're not friends with anymore. At least I can't. 555 00:33:19,000 --> 00:33:21,760 Speaker 1: I don't like that stuff. Yeah, man, I don't like 556 00:33:21,840 --> 00:33:24,200 Speaker 1: that hanging over my head. Okay, try to make it right. 557 00:33:24,280 --> 00:33:28,600 Speaker 1: That's what I say. You done?
Now I'm done. Sorry. Anyway, 558 00:33:28,600 --> 00:33:30,600 Speaker 1: I think, did I even say handy search bar? You 559 00:33:30,680 --> 00:33:33,840 Speaker 1: totally threw me off. All right, well, handy search bar, 560 00:33:33,960 --> 00:33:36,280 Speaker 1: how stuff works dot com. I said that, Chuck. So 561 00:33:36,400 --> 00:33:42,480 Speaker 1: that means it's your turn for listener mail. Yes, Josh, 562 00:33:42,560 --> 00:33:45,240 Speaker 1: this is on suicide bombing, and Nick brings up 563 00:33:45,280 --> 00:33:47,000 Speaker 1: a very good point that I think kind of fits 564 00:33:47,040 --> 00:33:50,240 Speaker 1: in with this podcast. Hi, guys and Jerry. I think 565 00:33:50,600 --> 00:33:53,000 Speaker 1: y'all are very brave for taking on the issue of 566 00:33:53,040 --> 00:33:56,800 Speaker 1: suicide bombing. I don't know about brave, but I appreciate it. 567 00:33:57,600 --> 00:33:59,720 Speaker 1: I don't want to contribute too much to the deluge 568 00:33:59,720 --> 00:34:02,080 Speaker 1: of emails, but I would like to say, you could 569 00:34:02,120 --> 00:34:05,040 Speaker 1: have more explicitly underscored something that I believe is key 570 00:34:05,600 --> 00:34:10,359 Speaker 1: to understanding suicide bombing and terrorism in general: both 571 00:34:10,440 --> 00:34:14,440 Speaker 1: are weapons of the weak and the believer, sort of like 572 00:34:14,520 --> 00:34:18,440 Speaker 1: our hate thing. Okay, do you agree? Um, yeah, well, 573 00:34:18,480 --> 00:34:20,480 Speaker 1: I mean, we did say a suicide bomb costs 574 00:34:20,520 --> 00:34:23,319 Speaker 1: about a hundred and fifty bucks exactly.
He points out, 575 00:34:23,560 --> 00:34:26,960 Speaker 1: if Palestinians, for instance, had access to Predator drones and 576 00:34:27,040 --> 00:34:30,439 Speaker 1: guided missile systems rather than rocks and slingshots, I don't 577 00:34:30,440 --> 00:34:33,640 Speaker 1: think that Palestinians would resort to martyrdom. I would also 578 00:34:33,760 --> 00:34:37,640 Speaker 1: point to suicide bombings carried out by the Viet Minh 579 00:34:38,400 --> 00:34:41,480 Speaker 1: during the French occupation of Vietnam, or the example of 580 00:34:41,719 --> 00:34:45,440 Speaker 1: the Tamil Tigers of Sri Lanka, both of which movements were 581 00:34:45,440 --> 00:34:47,600 Speaker 1: secular in nature. All I want to say is it 582 00:34:47,600 --> 00:34:51,080 Speaker 1: seems like suicide bombing is a phenomenon often arising from 583 00:34:51,160 --> 00:34:54,160 Speaker 1: situations in which there is a huge asymmetry of power 584 00:34:54,640 --> 00:34:59,160 Speaker 1: between an occupying or apartheid regime and a native 585 00:34:59,280 --> 00:35:01,719 Speaker 1: or oppressed population. You guys did mention this, 586 00:35:01,840 --> 00:35:04,440 Speaker 1: but I think this dimension is at least as important 587 00:35:04,440 --> 00:35:08,120 Speaker 1: to the issue as religion or notions of martyrdom, um. 588 00:35:08,239 --> 00:35:11,399 Speaker 1: That is sincerely from Nick. And I kind of agree, Nick, 589 00:35:11,520 --> 00:35:13,400 Speaker 1: and Nick is a sharp tack. He's like right on 590 00:35:13,480 --> 00:35:17,759 Speaker 1: the money. Yeah, thanks for that one. Wow. Okay, well, 591 00:35:17,800 --> 00:35:19,560 Speaker 1: if you think you're a sharp tack, we want to 592 00:35:19,600 --> 00:35:21,960 Speaker 1: hear from you, right, Chuck? That's right, send us an 593 00:35:22,000 --> 00:35:27,080 Speaker 1: email about anything at all, anything at all.
To Stuff 594 00:35:27,160 --> 00:35:35,000 Speaker 1: Podcast at how stuff works dot com. Be sure to 595 00:35:35,080 --> 00:35:37,840 Speaker 1: check out our new video podcast, Stuff from the Future. 596 00:35:38,200 --> 00:35:40,480 Speaker 1: Join How Stuff Works staff as we explore the most 597 00:35:40,520 --> 00:35:46,680 Speaker 1: promising and perplexing possibilities of tomorrow, brought to you by 598 00:35:46,680 --> 00:35:50,040 Speaker 1: the reinvented two thousand twelve Camry. It's ready. Are you?