1 00:00:00,560 --> 00:00:03,880 Speaker 1: Hey, everybody, get this. We have a mind bending announcement 2 00:00:03,920 --> 00:00:06,880 Speaker 1: to make. The Stuff you Should Know episode on Vinyl 3 00:00:07,200 --> 00:00:10,920 Speaker 1: is now on vinyl. You can learn about records by 4 00:00:11,000 --> 00:00:14,200 Speaker 1: listening to a record. It's possibly the first time a 5 00:00:14,240 --> 00:00:17,520 Speaker 1: podcast episode has ever been put to wax, and we 6 00:00:17,640 --> 00:00:21,040 Speaker 1: did it along with our friends at Born Losers Records. 7 00:00:21,640 --> 00:00:24,880 Speaker 1: It comes in three awesome colors, black, white, and a 8 00:00:24,960 --> 00:00:27,920 Speaker 1: super cool splattercore, and you can order it for pre 9 00:00:28,040 --> 00:00:32,960 Speaker 1: sale now at syskvinyl dot com. Records will ship on 10 00:00:33,000 --> 00:00:36,960 Speaker 1: October twentieth, just in time for Halloween, whatever that means. 11 00:00:37,159 --> 00:00:40,519 Speaker 1: So go to sysk vinyl dot com right now to 12 00:00:40,600 --> 00:00:43,920 Speaker 1: get this super duper limited edition, super cool Stuff you 13 00:00:43,960 --> 00:00:46,599 Speaker 1: Should Know thing, a record on records. 14 00:00:50,000 --> 00:00:55,800 Speaker 2: Welcome to Stuff you Should Know, a production of iHeartRadio. 15 00:01:00,080 --> 00:01:02,400 Speaker 1: Welcome to the podcast. I'm Josh, and there's Chuck, and 16 00:01:02,480 --> 00:01:04,679 Speaker 1: Jerry's here too, and this is Stuff you Should Know, 17 00:01:05,760 --> 00:01:12,600 Speaker 1: the podcast about revenge. Revenge. We've done an episode on 18 00:01:13,360 --> 00:01:16,520 Speaker 1: it, it was like a top ten of cases, Legendary Cases 19 00:01:16,520 --> 00:01:19,280 Speaker 1: of Revenge. Oh yeah, I remember that. But we didn't 20 00:01:19,280 --> 00:01:22,720 Speaker 1: talk much about revenge itself, and I felt it was 21 00:01:22,840 --> 00:01:25,680 Speaker 1: high time.
We've been dancing around it for decades now. 22 00:01:27,200 --> 00:01:30,400 Speaker 2: And here we are. I thought this was a great idea, 23 00:01:30,480 --> 00:01:32,880 Speaker 2: so kudos to you. Dave helped us out 24 00:01:32,880 --> 00:01:37,920 Speaker 2: with this one. And it's a lot of, like, science, 25 00:01:38,240 --> 00:01:41,480 Speaker 2: and studies have sort of, and I'm not going to 26 00:01:41,520 --> 00:01:44,920 Speaker 2: spoil anything, but have sort of produced results that fly 27 00:01:45,040 --> 00:01:48,040 Speaker 2: in the face of what one might typically think about 28 00:01:48,080 --> 00:01:50,880 Speaker 2: revenge and what it means for the person getting the revenge. 29 00:01:51,560 --> 00:01:56,400 Speaker 1: Yeah, I think for most people, how we feel about revenge, 30 00:01:56,520 --> 00:02:01,240 Speaker 1: it's from watching movies, and it's like deeply satisfying to 31 00:02:01,280 --> 00:02:06,040 Speaker 1: watch the bad guy who deserves revenge get their comeuppance, right, 32 00:02:06,240 --> 00:02:09,120 Speaker 1: sure is, or even be killed, just like, yes, that 33 00:02:09,160 --> 00:02:13,480 Speaker 1: guy deserved that kind of thing. But in reality, carrying 34 00:02:13,480 --> 00:02:17,640 Speaker 1: out acts of revenge, it's just not like 35 00:02:17,680 --> 00:02:19,560 Speaker 1: the movies, I guess, is what I'm trying to say. 36 00:02:19,600 --> 00:02:22,560 Speaker 1: And yet there's a lot of evidence of revenge in 37 00:02:22,639 --> 00:02:25,959 Speaker 1: real life, so much so that the New York Police 38 00:02:25,960 --> 00:02:29,440 Speaker 1: Department came out with a study in twenty twelve and 39 00:02:29,639 --> 00:02:32,720 Speaker 1: found out that forty two percent of the homicides in 40 00:02:32,800 --> 00:02:37,600 Speaker 1: New York were motivated by revenge.
Man. So, and that 41 00:02:37,720 --> 00:02:41,760 Speaker 1: actually kind of underscores like a problem with revenge, which is 42 00:02:42,440 --> 00:02:46,880 Speaker 1: that when you enact vengeance on somebody and you leave 43 00:02:46,919 --> 00:02:51,840 Speaker 1: them alive, almost invariably that person feels like you overdid it 44 00:02:53,800 --> 00:02:56,640 Speaker 1: in response to what they did. It was disproportionate, so 45 00:02:56,680 --> 00:02:59,480 Speaker 1: now they have to strike back again, and it can 46 00:02:59,520 --> 00:03:01,880 Speaker 1: go back and forth until somebody dies, or else somebody 47 00:03:01,880 --> 00:03:04,280 Speaker 1: can die right away as the first act of revenge. 48 00:03:04,960 --> 00:03:07,280 Speaker 1: But the point of the whole thing is that 49 00:03:07,880 --> 00:03:10,720 Speaker 1: once you do carry out revenge, no matter if it's 50 00:03:10,760 --> 00:03:15,360 Speaker 1: petty, like signing somebody up for spam, or killing somebody in 51 00:03:15,440 --> 00:03:20,480 Speaker 1: response to whatever slight, like road rage, they cut you 52 00:03:20,520 --> 00:03:24,600 Speaker 1: off in traffic, you don't feel good afterward. You actually 53 00:03:24,639 --> 00:03:29,600 Speaker 1: feel worse. And that's the underlying point of this entire episode. 54 00:03:29,880 --> 00:03:32,360 Speaker 2: Yeah, you know, my favorite petty... I don't do it, 55 00:03:32,400 --> 00:03:36,240 Speaker 2: but my favorite petty revenge to witness is when, and 56 00:03:36,320 --> 00:03:40,560 Speaker 2: it's so dumb, everyone just settle down, it's on a highway, 57 00:03:40,600 --> 00:03:44,040 Speaker 2: when someone is on an expressway and they clean their 58 00:03:44,080 --> 00:03:46,520 Speaker 2: windows and it gets all over the car behind them. 59 00:03:46,560 --> 00:03:49,280 Speaker 2: I see people all the time race in front of 60 00:03:49,280 --> 00:03:51,120 Speaker 2: that person and do the same thing back to them.
61 00:03:51,240 --> 00:03:54,640 Speaker 1: God, really, yeah, that is petty. 62 00:03:55,560 --> 00:03:58,040 Speaker 2: That is Tom Petty. That's not Tom Petty, 'cause Tom 63 00:03:58,040 --> 00:04:01,520 Speaker 2: Petty was great. That's just... and I also wanted to 64 00:04:01,560 --> 00:04:05,160 Speaker 2: say too, you talked about revenge coming back harder or whatever. 65 00:04:05,840 --> 00:04:09,760 Speaker 2: Emily has her own personal saying, like when we're messing 66 00:04:09,840 --> 00:04:12,280 Speaker 2: around and, like, I will do something to her, 67 00:04:13,000 --> 00:04:14,800 Speaker 2: or I'll say something kind of mean as a joke, 68 00:04:15,000 --> 00:04:19,440 Speaker 2: she'll eviscerate me, and she calls it coming back double. 69 00:04:19,480 --> 00:04:21,480 Speaker 3: She goes, I come back double. Oh boy. 70 00:04:22,480 --> 00:04:25,280 Speaker 2: She's one of those people that, when she gets pushed 71 00:04:25,279 --> 00:04:28,880 Speaker 2: into a corner, man, she comes out hard. So it's 72 00:04:29,240 --> 00:04:31,359 Speaker 2: a good trait, I think, and one to be wary of. 73 00:04:31,440 --> 00:04:34,240 Speaker 1: At the same time, yes, I'm suddenly way more wary 74 00:04:34,240 --> 00:04:36,799 Speaker 1: of Emily than I was before. Luckily, I've always stayed 75 00:04:36,800 --> 00:04:37,520 Speaker 1: on her good side. 76 00:04:37,640 --> 00:04:39,680 Speaker 3: Yeah, you wouldn't come at Emily anyway. You're smart. 77 00:04:39,800 --> 00:04:39,880 Speaker 2: No. 78 00:04:40,600 --> 00:04:46,320 Speaker 1: So there's a lot of questions revolving around revenge.
If 79 00:04:47,240 --> 00:04:49,240 Speaker 1: we know for a fact it feels good to 80 00:04:49,240 --> 00:04:52,799 Speaker 1: think about, but then it feels bad to do, despite 81 00:04:52,839 --> 00:04:54,440 Speaker 1: the fact that we're thinking about it like, this 82 00:04:54,560 --> 00:04:56,640 Speaker 1: is going to feel good. It's not the act of 83 00:04:56,760 --> 00:04:59,479 Speaker 1: thinking about it that feels good, it's fantasizing about how 84 00:04:59,720 --> 00:05:01,960 Speaker 1: good it's going to feel to get that person back, 85 00:05:02,360 --> 00:05:04,920 Speaker 1: to set the universe right again, to do all sorts 86 00:05:04,960 --> 00:05:09,520 Speaker 1: of things that revenge allegedly does. And it turns out, 87 00:05:10,279 --> 00:05:12,600 Speaker 1: when you carry out an act of revenge, you are 88 00:05:12,640 --> 00:05:18,360 Speaker 1: playing the chump to evolution on behalf of society as 89 00:05:18,400 --> 00:05:21,719 Speaker 1: a whole, and that's kind of like the whole basis 90 00:05:21,720 --> 00:05:26,800 Speaker 1: of revenge. There's an evolutionary instinct that's very, very old. 91 00:05:27,040 --> 00:05:31,320 Speaker 1: It's found extensively in the animal kingdom, and it really 92 00:05:31,760 --> 00:05:36,920 Speaker 1: collides with the modern evolved humans that live in these 93 00:05:36,960 --> 00:05:40,440 Speaker 1: complex societies we've formed today. When you put those two 94 00:05:40,440 --> 00:05:43,160 Speaker 1: things together, an interesting podcast comes out. 95 00:05:43,600 --> 00:05:44,120 Speaker 3: That's right. 96 00:05:44,640 --> 00:05:46,679 Speaker 2: What you're talking about in the animal kingdom is also 97 00:05:46,720 --> 00:05:52,160 Speaker 2: called retaliatory aggression, and that is the idea that, so, 98 00:05:52,240 --> 00:05:56,680 Speaker 2: let's say a mama lion goes out and kills an 99 00:05:56,720 --> 00:06:01,120 Speaker 2: animal to leave for her little cubs to eat.
Another 100 00:06:01,240 --> 00:06:03,760 Speaker 2: animal is like, ooh, you know, let me see if 101 00:06:03,800 --> 00:06:05,479 Speaker 2: I can sneak in there and eat some of that too. 102 00:06:06,040 --> 00:06:09,320 Speaker 2: The mama lion doesn't just scare this thing off to 103 00:06:09,440 --> 00:06:12,480 Speaker 2: preserve that meat for the kids. The mama lion goes 104 00:06:12,920 --> 00:06:16,920 Speaker 2: and hunts down and kills that animal. Yes, that's the 105 00:06:16,920 --> 00:06:18,960 Speaker 2: come back double, Emily style, right. 106 00:06:19,279 --> 00:06:22,040 Speaker 1: I mean, like, the problem is solved, the hyena has been 107 00:06:22,120 --> 00:06:25,520 Speaker 1: chased away. But to leave your kids and go find 108 00:06:25,560 --> 00:06:29,760 Speaker 1: it and kill it, that just seems retaliatorily aggressive. 109 00:06:30,360 --> 00:06:34,360 Speaker 2: Yeah, and this next one too, I'm gonna mention, these 110 00:06:34,360 --> 00:06:36,599 Speaker 2: are interesting because it made me sort of question the 111 00:06:36,640 --> 00:06:39,720 Speaker 2: idea of revenge versus punishment, right, because I think those 112 00:06:39,760 --> 00:06:42,520 Speaker 2: are different things. But the rhesus monkey. We've talked a 113 00:06:42,520 --> 00:06:45,839 Speaker 2: lot about their vocalizations, like they're all about the group, 114 00:06:46,000 --> 00:06:47,880 Speaker 2: or they should be at least. And like when they 115 00:06:47,960 --> 00:06:50,440 Speaker 2: find food, let's say, they will tell everyone, hey, I 116 00:06:50,480 --> 00:06:52,960 Speaker 2: found food. But if a rhesus monkey is ever like, 117 00:06:53,360 --> 00:06:54,719 Speaker 2: you know, I'm gonna have a little bit of this 118 00:06:54,800 --> 00:06:57,920 Speaker 2: first before I call out, and if they find that out, 119 00:06:58,279 --> 00:07:00,640 Speaker 2: there's a punishment for that rhesus monkey.
I don't think 120 00:07:00,680 --> 00:07:03,760 Speaker 2: they kill it, but there is a punishment. And this 121 00:07:03,880 --> 00:07:10,480 Speaker 2: is the idea that these retaliatory aggressions are deterrence. It's 122 00:07:10,520 --> 00:07:14,760 Speaker 2: like a punishment for everyone to see, to prevent future 123 00:07:14,760 --> 00:07:17,680 Speaker 2: transgressions, like hey, did you, hyena, see that? Did 124 00:07:17,720 --> 00:07:20,760 Speaker 2: you other rhesus monkeys see that? So that, you know, 125 00:07:20,800 --> 00:07:25,480 Speaker 2: would be an advantageous thing evolutionarily speaking, so that gene 126 00:07:25,520 --> 00:07:26,200 Speaker 2: gets passed on. 127 00:07:27,000 --> 00:07:30,160 Speaker 1: Yeah, because the more you're prone to do that, 128 00:07:30,400 --> 00:07:33,760 Speaker 1: the likelier you are to not have food stolen from 129 00:07:33,800 --> 00:07:36,040 Speaker 1: you for your kids, the likelier it is for your 130 00:07:36,120 --> 00:07:40,160 Speaker 1: kids to survive and your lineage to survive. So it 131 00:07:40,280 --> 00:07:45,200 Speaker 1: makes sense, evolutionarily speaking, this retaliatory aggression does at least, right. 132 00:07:45,600 --> 00:07:48,800 Speaker 2: Yeah, which I would still argue is punishment more than revenge. 133 00:07:50,080 --> 00:07:53,160 Speaker 2: I think there's an emotional component that's missing, but we're getting 134 00:07:52,840 --> 00:07:55,720 Speaker 1: to that. Absolutely. I think you're absolutely right. And there's 135 00:07:55,760 --> 00:07:59,200 Speaker 1: a story, a couple of stories, of tigers actually engaging 136 00:07:59,240 --> 00:08:04,320 Speaker 1: in what can only be described as revenge, and it's very 137 00:08:04,400 --> 00:08:06,680 Speaker 1: much up in the air whether what we're witnessing is 138 00:08:06,720 --> 00:08:10,800 Speaker 1: actual revenge.
But like, there was a very famous 139 00:08:10,840 --> 00:08:14,960 Speaker 1: story out of Russia where like a poacher not only 140 00:08:15,280 --> 00:08:18,880 Speaker 1: shot a tiger but also took some of their kill, 141 00:08:19,400 --> 00:08:21,800 Speaker 1: and the tiger tracked the guy down, found his 142 00:08:22,440 --> 00:08:25,600 Speaker 1: little lodging, destroyed everything he could find in the lodging, 143 00:08:25,640 --> 00:08:27,600 Speaker 1: and then waited outside for the hunter to come back, 144 00:08:27,640 --> 00:08:30,320 Speaker 1: and then killed him. And the tiger managed to 145 00:08:30,360 --> 00:08:32,640 Speaker 1: hold this idea in his head, I think it was 146 00:08:32,760 --> 00:08:35,800 Speaker 1: her head, for up to maybe twenty four hours 147 00:08:36,040 --> 00:08:41,079 Speaker 1: after the hunter shot her. There's a couple of stories 148 00:08:41,080 --> 00:08:43,880 Speaker 1: out there that seem to pertain to tigers specifically where 149 00:08:44,200 --> 00:08:47,080 Speaker 1: it's almost like it does contain an emotional component to it. 150 00:08:47,240 --> 00:08:50,040 Speaker 1: But for the most part, yes, it's solving a 151 00:08:50,080 --> 00:08:55,199 Speaker 1: problem and then maybe preventing future problems among the animals. 152 00:08:55,520 --> 00:08:55,800 Speaker 3: Yeah. 153 00:08:55,840 --> 00:08:58,520 Speaker 2: You know, one of my favorite sayings is revenge is 154 00:08:58,559 --> 00:09:01,400 Speaker 2: a meal best served cold. I don't know why, because 155 00:09:01,400 --> 00:09:04,080 Speaker 2: I'm not a revenge guy really, but I just 156 00:09:04,120 --> 00:09:05,920 Speaker 2: think that is just such a great saying. 157 00:09:06,679 --> 00:09:07,319 Speaker 3: I just like it. 158 00:09:07,720 --> 00:09:10,000 Speaker 2: You know, there's something about like, oh no, no, no, 159 00:09:10,040 --> 00:09:12,200 Speaker 2: the real revenge is like when you've waited around.
160 00:09:11,880 --> 00:09:12,360 Speaker 3: For a while. 161 00:09:12,440 --> 00:09:16,000 Speaker 2: Oh yeah, and then, when you might not be suspected, 162 00:09:16,040 --> 00:09:17,400 Speaker 2: you come back and take that revenge. 163 00:09:17,480 --> 00:09:20,079 Speaker 1: Yeah, because if you just immediately do it in response, 164 00:09:20,120 --> 00:09:22,880 Speaker 1: you're a hot head and a dummy. Anybody can do that. 165 00:09:22,960 --> 00:09:24,920 Speaker 1: But to sit there and really stew on it and 166 00:09:24,960 --> 00:09:27,559 Speaker 1: figure out the best way to really get back at 167 00:09:27,559 --> 00:09:30,760 Speaker 1: the person, that takes intellect. Yeah, I agree, and a 168 00:09:30,760 --> 00:09:33,559 Speaker 1: little bit of craziness, I just have to say. 169 00:09:33,640 --> 00:09:36,440 Speaker 2: All right, so now we can get to the humans. 170 00:09:36,480 --> 00:09:39,800 Speaker 2: As far as evolution is concerned, we have that same 171 00:09:40,080 --> 00:09:45,079 Speaker 2: sort of instinct ingrained in our DNA for that retaliatory aggression. 172 00:09:46,120 --> 00:09:49,600 Speaker 2: For our ancestors, when they were living in hunter gatherer groups, 173 00:09:49,880 --> 00:09:51,960 Speaker 2: there was a lot of relying on one another, obviously a 174 00:09:51,960 --> 00:09:55,679 Speaker 2: lot of communication and cooperation, and thus a lot of 175 00:09:56,240 --> 00:10:00,480 Speaker 2: punishing to be done if people, either outsiders or people 176 00:10:00,480 --> 00:10:02,480 Speaker 2: inside the group, didn't cooperate and do the right thing. 177 00:10:02,920 --> 00:10:05,920 Speaker 1: Yes, and so this is again the same thing that 178 00:10:06,040 --> 00:10:09,520 Speaker 1: you were talking about. You're punishing the person who transgressed. 179 00:10:09,559 --> 00:10:15,440 Speaker 1: You're also deterring future behavior.
And the more we became social, 180 00:10:15,840 --> 00:10:18,599 Speaker 1: the more important this kind of stuff became, because we 181 00:10:19,520 --> 00:10:23,320 Speaker 1: started depending on other people, and so as a result, 182 00:10:23,440 --> 00:10:27,120 Speaker 1: we started monitoring one another, and that in and of 183 00:10:27,160 --> 00:10:32,199 Speaker 1: itself can act as a deterrent in the future, because 184 00:10:32,200 --> 00:10:37,080 Speaker 1: you know that there's a vengeance instinct, and there's a 185 00:10:37,160 --> 00:10:43,240 Speaker 1: set amount or set structure of norms and rules, and 186 00:10:43,280 --> 00:10:45,600 Speaker 1: then other people are watching you to see what you're doing, 187 00:10:45,679 --> 00:10:49,120 Speaker 1: and you're watching them, and that kind of creates an 188 00:10:49,200 --> 00:10:53,199 Speaker 1: atmosphere of conformity. And say what you want about conformity, 189 00:10:53,200 --> 00:10:57,040 Speaker 1: but if you have a large group of people following 190 00:10:57,040 --> 00:11:01,480 Speaker 1: the same rules, you're taking care of a really basic 191 00:11:01,640 --> 00:11:04,800 Speaker 1: problem and issue, and you can then kind of evolve 192 00:11:05,120 --> 00:11:09,200 Speaker 1: into more and more complex societies. Yeah, I 193 00:11:09,200 --> 00:11:13,040 Speaker 1: saw one person say that revenge is ultimately what provided 194 00:11:13,040 --> 00:11:16,400 Speaker 1: the basis for human civilization and allowed it to grow, 195 00:11:16,440 --> 00:11:18,640 Speaker 1: knowing that there was such a thing as revenge that 196 00:11:18,760 --> 00:11:20,120 Speaker 1: humans were capable of. 197 00:11:19,720 --> 00:11:21,000 Speaker 3: Totally. 198 00:11:22,080 --> 00:11:24,000 Speaker 2: So kind of put a pin in that for a second, 199 00:11:24,320 --> 00:11:28,000 Speaker 2: because we should talk about this idea of like sweet revenge.
200 00:11:28,960 --> 00:11:31,560 Speaker 2: That's a word that's often associated with revenge. 201 00:11:31,600 --> 00:11:35,680 Speaker 2: And you talked about the fantasy of revenge, and it's, 202 00:11:35,760 --> 00:11:40,760 Speaker 2: you know, it's a fantasy because, for very good reasons, 203 00:11:41,040 --> 00:11:45,480 Speaker 2: if you are physically hurt, obviously, or emotionally or psychologically 204 00:11:45,559 --> 00:11:50,400 Speaker 2: wounded by somebody, it's a natural instinct to think 205 00:11:50,400 --> 00:11:53,920 Speaker 2: about getting back at that person, right, and the feelings 206 00:11:53,920 --> 00:11:56,880 Speaker 2: that come with that take place in a part of 207 00:11:56,880 --> 00:12:00,000 Speaker 2: the brain called the dorsal striatum, and it's the same 208 00:12:00,040 --> 00:12:03,960 Speaker 2: part of the brain that controls the reward system 209 00:12:04,040 --> 00:12:08,120 Speaker 2: of like, hey, that pecan pie tastes great, that sexual 210 00:12:08,200 --> 00:12:13,040 Speaker 2: orgasm feels amazing, or the drug that you really want 211 00:12:12,920 --> 00:12:13,760 Speaker 3: to take feels good. 212 00:12:14,120 --> 00:12:18,400 Speaker 2: It's that same lizard brain pathway that revenge lights up, 213 00:12:18,440 --> 00:12:22,120 Speaker 2: that lights up whenever you do anything that feels rewarding 214 00:12:22,360 --> 00:12:23,680 Speaker 2: or satisfying for somebody. 215 00:12:23,880 --> 00:12:27,679 Speaker 1: Yeah, it's extraordinarily powerful and hard to deny and overcome, 216 00:12:27,800 --> 00:12:30,040 Speaker 1: because it's just such a basic response. 217 00:12:30,120 --> 00:12:31,199 Speaker 3: Right, totally. 218 00:12:31,320 --> 00:12:34,520 Speaker 1: But again, the problem, and this is where the tension arises.
219 00:12:35,120 --> 00:12:40,559 Speaker 1: We have evolved to a point where we've created these 220 00:12:40,559 --> 00:12:44,520 Speaker 1: societies with rules and expectations that in part say, like, 221 00:12:44,640 --> 00:12:48,400 Speaker 1: you can't carry out revenge. It's not okay. And you 222 00:12:48,600 --> 00:12:51,040 Speaker 1: know that that's not okay as a modern human living 223 00:12:51,040 --> 00:12:54,480 Speaker 1: in modern human society. And yet we have that part 224 00:12:54,480 --> 00:12:56,840 Speaker 1: of our brain, that really powerful, basic part of our brain, 225 00:12:57,200 --> 00:13:00,600 Speaker 1: telling us to do it, and we know we're not 226 00:13:00,600 --> 00:13:02,360 Speaker 1: supposed to. And that's kind of like the point in 227 00:13:02,480 --> 00:13:04,400 Speaker 1: human evolution that we live in right now. 228 00:13:04,920 --> 00:13:08,720 Speaker 3: That's right. Should we take a break? Sure. 229 00:13:08,960 --> 00:13:10,839 Speaker 2: All right, that sounds like a good stopping point. So 230 00:13:10,880 --> 00:13:12,880 Speaker 2: we'll take a break and we'll come back and we'll 231 00:13:12,920 --> 00:13:15,559 Speaker 2: hit on the thing that you brought up earlier about 232 00:13:15,559 --> 00:13:18,360 Speaker 2: the fact that actually getting that revenge may not be 233 00:13:18,440 --> 00:13:42,480 Speaker 2: so sweet. Stop. 234 00:13:43,760 --> 00:13:47,400 Speaker 1: So, Chuck, one of the things about revenge that makes 235 00:13:47,400 --> 00:13:49,720 Speaker 1: it different from the drug that you were talking about, 236 00:13:49,840 --> 00:13:53,000 Speaker 1: or the orgasm or whatever, is that when you think 237 00:13:53,040 --> 00:13:57,679 Speaker 1: of it, it's more fulfilling than when you actually do it.
238 00:13:57,720 --> 00:14:00,880 Speaker 1: So like, if you think about a drug, it 239 00:14:00,960 --> 00:14:03,600 Speaker 1: might be pleasurable, but it's probably nothing compared to what 240 00:14:03,640 --> 00:14:06,439 Speaker 1: the drug's doing in your brain when you actually take it, right? 241 00:14:06,440 --> 00:14:10,560 Speaker 1: That's not true with revenge. Not only is the thought, 242 00:14:10,720 --> 00:14:16,040 Speaker 1: the fantasy of revenge, more fulfilling and will hit that 243 00:14:16,080 --> 00:14:19,040 Speaker 1: limbic system harder, but when you actually do carry out an 244 00:14:19,040 --> 00:14:23,560 Speaker 1: act of revenge, it actually creates negative feelings in you as well. 245 00:14:23,720 --> 00:14:27,080 Speaker 2: Yeah, which is interesting, because like, how can an idea 246 00:14:27,160 --> 00:14:30,560 Speaker 2: of something, how can a fantasy of something, trigger the 247 00:14:30,600 --> 00:14:34,440 Speaker 2: same cascades as the other pleasurable things in your life 248 00:14:34,440 --> 00:14:38,000 Speaker 2: that you're actually doing? And when you think about it, 249 00:14:37,640 --> 00:14:42,040 Speaker 2: it actually does make sense, because revenge, actually taking revenge, 250 00:14:42,280 --> 00:14:47,000 Speaker 2: is risky. Thinking about revenge, fantasizing about revenge, is 251 00:14:47,040 --> 00:14:50,720 Speaker 2: not risky per se. I mean it could be dangerous, 252 00:14:50,760 --> 00:14:54,400 Speaker 2: you know, negative for a person perhaps eventually if 253 00:14:54,400 --> 00:14:57,200 Speaker 2: you've become obsessed with it, but initially it's 254 00:14:57,320 --> 00:15:00,240 Speaker 2: a feel good feeling. But carrying through on it can 255 00:15:00,240 --> 00:15:02,320 Speaker 2: be risky. If you're in the hunter 256 00:15:02,480 --> 00:15:05,920 Speaker 2: gatherer group and someone invades your group and steals your meat.
257 00:15:06,640 --> 00:15:08,520 Speaker 2: You could just sit there and think 258 00:15:08,520 --> 00:15:10,280 Speaker 2: about how great it would be to get them back, 259 00:15:10,280 --> 00:15:12,640 Speaker 2: and that's probably the safest move. Or you could actually 260 00:15:12,720 --> 00:15:15,080 Speaker 2: go to that other camp and try and kill that person. 261 00:15:15,400 --> 00:15:17,200 Speaker 2: But you're taking a big risk at that point. 262 00:15:17,400 --> 00:15:21,080 Speaker 1: You individually are, that's right, but you're doing it on 263 00:15:21,280 --> 00:15:24,320 Speaker 1: behalf of the group, or the group benefits whether you're 264 00:15:24,320 --> 00:15:25,600 Speaker 1: doing it on their behalf or not. 265 00:15:26,080 --> 00:15:26,600 Speaker 3: That's right. 266 00:15:26,920 --> 00:15:30,600 Speaker 1: If you didn't do anything, though, that group, not just you, 267 00:15:30,680 --> 00:15:33,280 Speaker 1: but the group you're a member of, would seem weak 268 00:15:33,960 --> 00:15:39,320 Speaker 1: to other groups. Yeah, and it sounds really 269 00:15:40,040 --> 00:15:44,280 Speaker 1: kind of Cro-Magnon or something like that, but that's it. 270 00:15:44,400 --> 00:15:47,920 Speaker 1: It's important. You can't have, like I was saying before, 271 00:15:47,960 --> 00:15:52,280 Speaker 1: you can't have society without the knowledge that if you 272 00:15:52,360 --> 00:15:56,400 Speaker 1: transgress there will be consequences for it. I 273 00:15:56,400 --> 00:15:59,440 Speaker 1: saw it put... there's a neurologist and psychologist named Jeff 274 00:15:59,480 --> 00:16:03,480 Speaker 1: Victoroff.
He said that reciprocal altruism, which is how 275 00:16:03,520 --> 00:16:10,000 Speaker 1: people cooperate between groups and within groups, that it rewards 276 00:16:10,120 --> 00:16:14,800 Speaker 1: and requires a costly signal demonstrating risk taking on behalf 277 00:16:14,840 --> 00:16:17,160 Speaker 1: of the in group. So for people to be able 278 00:16:17,200 --> 00:16:19,440 Speaker 1: to trade with one another, for people to be able 279 00:16:19,480 --> 00:16:21,680 Speaker 1: to get along in a society and not kill each 280 00:16:21,720 --> 00:16:24,320 Speaker 1: other or whatever, you have to know that there's a 281 00:16:24,400 --> 00:16:27,760 Speaker 1: threat to you if you transgress. It has to be 282 00:16:27,880 --> 00:16:32,560 Speaker 1: there, or else people will invariably cheat or kill 283 00:16:32,600 --> 00:16:35,960 Speaker 1: you or do whatever. And it's a really basic, paranoid 284 00:16:36,000 --> 00:16:38,840 Speaker 1: way of looking at the world. But if you start 285 00:16:38,880 --> 00:16:44,000 Speaker 1: to study revenge, it seems like it's a linchpin of society. 286 00:16:44,160 --> 00:16:47,560 Speaker 1: It's just, you can't have a 287 00:16:47,640 --> 00:16:52,520 Speaker 1: society, animal or human, without that threat of revenge hanging over 288 00:16:52,560 --> 00:16:53,360 Speaker 1: you. 289 00:16:53,520 --> 00:16:54,680 Speaker 3: Yeah, totally. Yeah. 290 00:16:54,720 --> 00:16:57,760 Speaker 2: I got another quote, and this kind 291 00:16:57,760 --> 00:17:01,800 Speaker 2: of supports the idea that the revenge itself, you know, 292 00:17:01,840 --> 00:17:03,560 Speaker 2: it's risky and it could be bad for you. It's 293 00:17:03,600 --> 00:17:07,199 Speaker 2: really the idea of it that's better, or at 294 00:17:07,280 --> 00:17:10,560 Speaker 2: least better for the individual. This is from Francis Bacon.
A 295 00:17:10,600 --> 00:17:14,120 Speaker 2: man that studieth revenge keeps his own wounds green, which 296 00:17:14,160 --> 00:17:17,280 Speaker 2: otherwise would heal and do well. And that's sort of 297 00:17:17,320 --> 00:17:19,840 Speaker 2: the thing that, you know, has come up over and 298 00:17:19,840 --> 00:17:22,160 Speaker 2: over again in studies that we're going to be talking about, 299 00:17:22,560 --> 00:17:25,400 Speaker 2: is that, you know, the path of the Buddha, 300 00:17:25,480 --> 00:17:29,159 Speaker 2: the getting over things and not seeking revenge, is really 301 00:17:29,200 --> 00:17:39,880 Speaker 2: the path that ultimately will bring someone... what, satisfaction? Tranquility, yeah, tranquility. 302 00:17:40,080 --> 00:17:42,760 Speaker 2: Maybe not satisfaction, actually, yeah. 303 00:17:42,800 --> 00:17:45,800 Speaker 1: I think it's getting past the need for satisfaction that 304 00:17:46,000 --> 00:17:48,320 Speaker 1: will lead you to the point that you really want 305 00:17:48,320 --> 00:17:50,199 Speaker 1: to get to, which is feeling good again, like 306 00:17:50,280 --> 00:17:53,800 Speaker 1: you felt before you were wronged. Right. 307 00:17:53,800 --> 00:17:56,960 Speaker 1: The thing is, with 308 00:17:57,119 --> 00:18:02,120 Speaker 1: revenge, again, what you have is an innate, automatic 309 00:18:02,320 --> 00:18:05,840 Speaker 1: impulse to smash the other person in the face to 310 00:18:05,920 --> 00:18:08,560 Speaker 1: get back at them for insulting you or your 311 00:18:08,600 --> 00:18:13,359 Speaker 1: family or your favorite football team or whatever. It's a 312 00:18:13,400 --> 00:18:17,359 Speaker 1: really basic instinct that, if you can learn to overcome it, 313 00:18:17,920 --> 00:18:20,959 Speaker 1: not just you as an individual can evolve, we as 314 00:18:21,000 --> 00:18:24,800 Speaker 1: a society can evolve.
The thing is, you still 315 00:18:24,880 --> 00:18:30,080 Speaker 1: need that just to keep society going and functioning. 316 00:18:30,560 --> 00:18:34,280 Speaker 1: What we've figured out as further evolved humans is that we 317 00:18:34,400 --> 00:18:41,000 Speaker 1: can externalize that revenge instinct and imbue our institutions with it, 318 00:18:41,520 --> 00:18:46,040 Speaker 1: where we've created court systems and justice systems. They're responsible 319 00:18:46,359 --> 00:18:50,520 Speaker 1: for carrying out acts of vengeance or retribution or righting 320 00:18:50,600 --> 00:18:54,679 Speaker 1: wrongs and serving justice on behalf of the individuals of 321 00:18:54,720 --> 00:18:58,359 Speaker 1: society and for society as a whole, so 322 00:18:58,400 --> 00:19:01,119 Speaker 1: that we don't have to carry out acts of revenge on 323 00:19:01,119 --> 00:19:03,919 Speaker 1: one another. And in fact, we have rules now that 324 00:19:04,000 --> 00:19:06,040 Speaker 1: if you do carry out an act of revenge, you 325 00:19:06,119 --> 00:19:08,720 Speaker 1: can be punished by those same institutions that are there 326 00:19:08,960 --> 00:19:11,320 Speaker 1: to enact vengeance on your behalf. 327 00:19:12,240 --> 00:19:14,880 Speaker 2: Yeah, because what happened is, you know, we went from 328 00:19:14,880 --> 00:19:17,159 Speaker 2: the hunter gatherers, where you literally had to do this 329 00:19:17,240 --> 00:19:20,880 Speaker 2: for your group to survive, to eventually settling down.
Once 330 00:19:20,920 --> 00:19:26,040 Speaker 2: we became farmers and settlers and eventually urbanites, those 331 00:19:26,160 --> 00:19:29,480 Speaker 2: same instincts were there, but they became moral codes, 332 00:19:30,040 --> 00:19:33,000 Speaker 2: and all of a sudden, you know, we had these 333 00:19:33,080 --> 00:19:36,200 Speaker 2: moral codes, like you don't cheat on your friend's wife 334 00:19:36,240 --> 00:19:39,879 Speaker 2: and stuff like that, and you know that's not punishable 335 00:19:39,920 --> 00:19:43,000 Speaker 2: by death, but that revenge instinct is still there to 336 00:19:43,040 --> 00:19:46,560 Speaker 2: overcome these moral codes. There was a psychologist named 337 00:19:46,840 --> 00:19:51,760 Speaker 2: Herbert Gintis that talked about revenge seeking as moral behavior. 338 00:19:51,800 --> 00:19:55,240 Speaker 2: Individuals seek revenge not when they've been hurt, but when they've 339 00:19:55,240 --> 00:19:58,560 Speaker 2: been morally wronged. I would also argue, you know, some 340 00:19:58,640 --> 00:20:00,920 Speaker 2: people seek revenge when they literally have been physically hurt 341 00:20:00,960 --> 00:20:03,960 Speaker 2: as well, right. But it's also a moral wronging that 342 00:20:04,040 --> 00:20:07,119 Speaker 2: happens if someone, you know, jumps you and beats you 343 00:20:07,200 --> 00:20:08,800 Speaker 2: up at a football game or something. 344 00:20:09,000 --> 00:20:12,240 Speaker 1: And that's actually a tension that philosophers have been trying to 345 00:20:12,280 --> 00:20:15,239 Speaker 1: figure out for a while.
John Stuart Mill was a 346 00:20:15,280 --> 00:20:19,919 Speaker 1: fan of the deterrent explanation of revenge or punishment or 347 00:20:20,000 --> 00:20:22,720 Speaker 1: whatever you want to call it, and he was saying, like, 348 00:20:22,720 --> 00:20:26,640 Speaker 1: like with the animal kingdom, when you punish the transgressor, 349 00:20:26,960 --> 00:20:30,280 Speaker 1: you're deterring future behavior by making an example of them. 350 00:20:30,600 --> 00:20:36,800 Speaker 1: Immanuel Kant said no, revenge exists because when somebody transgresses 351 00:20:37,840 --> 00:20:42,040 Speaker 1: against a person, you're being morally wronged, and just 352 00:20:43,320 --> 00:20:48,400 Speaker 1: remove everything else. Morally speaking, that person deserves to be punished. 353 00:20:48,760 --> 00:20:50,960 Speaker 1: And he put it in a really kind of alarming way. 354 00:20:51,160 --> 00:20:53,679 Speaker 1: But philosophers always operate on the fringes anyway to make 355 00:20:53,720 --> 00:20:54,280 Speaker 1: their points. 356 00:20:54,720 --> 00:20:54,960 Speaker 3: Right. 357 00:20:55,040 --> 00:21:02,800 Speaker 1: He was saying that a genuinely, I guess, a legitimate society, 358 00:21:03,080 --> 00:21:05,240 Speaker 1: even if it was disbanding, it was in the act 359 00:21:05,280 --> 00:21:10,879 Speaker 1: of disbanding that they were required to 360 00:21:10,960 --> 00:21:15,000 Speaker 1: go in and kill all of the remaining prisoners, all 361 00:21:15,040 --> 00:21:17,240 Speaker 1: the murderers, like, go execute the rest of 362 00:21:17,280 --> 00:21:21,360 Speaker 1: the murderers.
Just because your society's disbanding is no excuse 363 00:21:21,400 --> 00:21:24,399 Speaker 1: whatsoever for the people that you've imprisoned that transgressed against 364 00:21:24,400 --> 00:21:28,199 Speaker 1: the society, because they committed a moral wrong that is 365 00:21:28,560 --> 00:21:31,840 Speaker 1: larger and more important than any individual or even 366 00:21:31,920 --> 00:21:34,760 Speaker 1: any society, and that they deserve to be punished. 367 00:21:34,760 --> 00:21:38,640 Speaker 1: And that's the function of revenge, according to Kant. Oh interesting. Yeah, 368 00:21:38,680 --> 00:21:40,360 Speaker 1: it's kind of a Khan of vengeance. 369 00:21:40,400 --> 00:21:44,280 Speaker 2: Khan was, he was serious, man. Yeah, he came back 370 00:21:44,280 --> 00:21:48,199 Speaker 2: double. You mentioned earlier, you know, we have systems set up, 371 00:21:48,240 --> 00:21:52,159 Speaker 2: you know, court systems, police forces, things like that these days. 372 00:21:52,160 --> 00:21:57,639 Speaker 2: But it's interesting that they found that historically these places 373 00:21:57,640 --> 00:22:01,000 Speaker 2: in the world where the culture was what you would 374 00:22:01,040 --> 00:22:05,480 Speaker 2: call like a culture of honor were more prone to, 375 00:22:05,960 --> 00:22:08,040 Speaker 2: you know, committing acts of violence as revenge. Like the 376 00:22:08,080 --> 00:22:11,879 Speaker 2: American South was historically known as a culture of honor, 377 00:22:11,960 --> 00:22:14,119 Speaker 2: where you would go out and defend your honor or 378 00:22:14,119 --> 00:22:16,160 Speaker 2: fight somebody or have a shootout with somebody. 379 00:22:16,440 --> 00:22:20,280 Speaker 1: I saw specifically, that's white Southern culture. The patterns of 380 00:22:20,400 --> 00:22:25,720 Speaker 1: African American retribution or crimes like that don't really vary geographically.
381 00:22:25,760 --> 00:22:28,879 Speaker 1: It's the white American South that's the one that does that. 382 00:22:29,560 --> 00:22:30,520 Speaker 3: Yeah, that makes sense. 383 00:22:30,960 --> 00:22:32,480 Speaker 2: I mean, I think that's what they're talking about as 384 00:22:32,520 --> 00:22:35,919 Speaker 2: sort of, you know, antebellum and stuff like that. 385 00:22:37,720 --> 00:22:38,120 Speaker 1: Culture. 386 00:22:39,440 --> 00:22:42,880 Speaker 2: Middle Eastern cultures historically can kind of be the same 387 00:22:42,920 --> 00:22:46,600 Speaker 2: way as far as revenge goes, restoring honor. And they 388 00:22:46,680 --> 00:22:49,120 Speaker 2: found, and this is what gets really interesting, is that 389 00:22:49,640 --> 00:22:52,560 Speaker 2: cultures and areas that have a history of weak law 390 00:22:52,640 --> 00:22:57,560 Speaker 2: enforcement may engage in revenge more often. When you hear 391 00:22:57,560 --> 00:23:01,560 Speaker 2: about like street justice, you might think of a low 392 00:23:01,600 --> 00:23:04,720 Speaker 2: income community that maybe mistrusts the system, they don't think 393 00:23:04,760 --> 00:23:06,600 Speaker 2: the courts or the police are on their side to 394 00:23:06,640 --> 00:23:09,119 Speaker 2: begin with or would take care of them. So that's 395 00:23:09,119 --> 00:23:11,200 Speaker 2: where you're going to see more sort of street justice 396 00:23:11,240 --> 00:23:12,440 Speaker 2: revenge carried out right. 397 00:23:13,400 --> 00:23:17,920 Speaker 1: And then same with workplace environments and schools. Apparently three 398 00:23:18,160 --> 00:23:21,959 Speaker 1: in five school shootings from nineteen seventy four to twenty 399 00:23:22,080 --> 00:23:26,879 Speaker 1: twenty were acts of vengeance, revenge. I'm surprised it was 400 00:23:26,920 --> 00:23:31,000 Speaker 1: actually that low of a ratio.
And then if you 401 00:23:31,400 --> 00:23:35,600 Speaker 1: work in a place where your complaints to management or 402 00:23:35,600 --> 00:23:38,560 Speaker 1: whatever seem to be falling on deaf ears, that can 403 00:23:38,600 --> 00:23:41,560 Speaker 1: also lead to vengeance in the workplace like workplace shootings. 404 00:23:41,840 --> 00:23:45,320 Speaker 1: Like remember in our going postal episode, where like the 405 00:23:45,320 --> 00:23:50,080 Speaker 1: common factor was that management was not only like dismissing 406 00:23:50,160 --> 00:23:55,280 Speaker 1: complaints about bullying, they were often engaged in bullying themselves. Yeah, 407 00:23:55,480 --> 00:23:58,520 Speaker 1: and like it doesn't justify or excuse it, but that 408 00:23:58,640 --> 00:24:01,600 Speaker 1: is an example of somebody carrying out an act 409 00:24:01,640 --> 00:24:03,240 Speaker 1: of revenge, at least in their mind. 410 00:24:03,880 --> 00:24:07,880 Speaker 2: Yeah, for sure. You were talking about philosophers earlier. Now 411 00:24:07,920 --> 00:24:11,240 Speaker 2: we get to finally talk about our old friend Sigmund 412 00:24:11,240 --> 00:24:15,679 Speaker 2: Freud and Josef Breuer, his mentor, because they had what 413 00:24:15,760 --> 00:24:19,520 Speaker 2: was known as the catharsis theory of aggression or the 414 00:24:19,760 --> 00:24:24,520 Speaker 2: hydraulic model. And this is the idea that a 415 00:24:24,520 --> 00:24:28,280 Speaker 2: lot of psychoses, or most of them, were repression, and 416 00:24:28,320 --> 00:24:31,360 Speaker 2: it was negative emotion that was building up.
And if 417 00:24:31,400 --> 00:24:34,119 Speaker 2: you repress these emotions, if you have these negative emotions, 418 00:24:34,359 --> 00:24:37,479 Speaker 2: if you have anger towards someone or frustration, and it 419 00:24:37,520 --> 00:24:40,760 Speaker 2: builds up like a hydraulic pump, eventually you're going to 420 00:24:40,840 --> 00:24:43,480 Speaker 2: pop, or you're going to have what's called a catharsis, 421 00:24:43,600 --> 00:24:49,560 Speaker 2: from the Greek meaning cleansing or purging, and you will release 422 00:24:49,600 --> 00:24:52,159 Speaker 2: that in an unhealthy way, which is probably going to 423 00:24:52,200 --> 00:24:57,120 Speaker 2: be revenge. Freud said it could manifest as hysteria. And 424 00:24:58,000 --> 00:25:01,359 Speaker 2: here's the thing, though, is that stuff falls apart when 425 00:25:01,400 --> 00:25:05,080 Speaker 2: you actually apply science to it. They have found that, 426 00:25:05,600 --> 00:25:09,600 Speaker 2: you know, punching a punching bag can maybe 427 00:25:09,680 --> 00:25:12,879 Speaker 2: give you an immediate relief, but a lot of times 428 00:25:12,880 --> 00:25:15,400 Speaker 2: that stuff only serves to work you up more when 429 00:25:15,400 --> 00:25:16,479 Speaker 2: you apply science to it. 430 00:25:16,680 --> 00:25:19,520 Speaker 1: Yeah, because the basis of the catharsis theory was that 431 00:25:20,000 --> 00:25:23,000 Speaker 1: rather than going and killing the person who wronged you, 432 00:25:23,000 --> 00:25:25,800 Speaker 1: you could go hit that punching bag and pretend the 433 00:25:25,840 --> 00:25:27,960 Speaker 1: punching bag was them, and you would get out that 434 00:25:28,040 --> 00:25:31,000 Speaker 1: repressed anger and feel better and could move on. But yes, 435 00:25:31,560 --> 00:25:35,040 Speaker 1: starting, I think, in the fifties, they were like, wait 436 00:25:35,040 --> 00:25:37,119 Speaker 1: a minute, this is not right at all.
437 00:25:37,160 --> 00:25:39,480 Speaker 1: It turns out that when you do that, it just 438 00:25:39,600 --> 00:25:45,280 Speaker 1: extends the sour feelings that caused you to want 439 00:25:45,280 --> 00:25:48,040 Speaker 1: revenge in the first place. And like we said, if 440 00:25:48,080 --> 00:25:52,000 Speaker 1: you can find a way to forgive or forget or 441 00:25:52,119 --> 00:25:57,040 Speaker 1: move on or whatever, you will ultimately be happier in 442 00:25:57,080 --> 00:26:00,960 Speaker 1: the long run, and even immediately, compared to somebody who 443 00:26:01,080 --> 00:26:04,120 Speaker 1: actually carries out an act of revenge or even goes 444 00:26:04,160 --> 00:26:06,840 Speaker 1: and punches a punching bag, pretending like it's the person 445 00:26:06,840 --> 00:26:08,879 Speaker 1: they want to carry out an act of revenge against. 446 00:26:09,560 --> 00:26:12,720 Speaker 2: Yeah, there's that psychologist you were talking about, R.H. Hornberger. 447 00:26:13,280 --> 00:26:15,000 Speaker 2: One of the studies they did, he would have an 448 00:26:15,040 --> 00:26:17,800 Speaker 2: actor come in and like insult someone in a study. 449 00:26:18,320 --> 00:26:20,160 Speaker 1: Can you just see Ted Danson doing this early 450 00:26:20,200 --> 00:26:20,760 Speaker 1: in his career. 451 00:26:22,520 --> 00:26:26,560 Speaker 2: Yeah, you half-wit, look at that nose. So somebody 452 00:26:26,600 --> 00:26:29,520 Speaker 2: would get really mad, apparently, and be instructed to go 453 00:26:30,280 --> 00:26:33,200 Speaker 2: bang nails, hit nails into a board for ten minutes, to, 454 00:26:33,560 --> 00:26:36,440 Speaker 2: you know, apparently let out that frustration with their fists. 455 00:26:36,720 --> 00:26:37,240 Speaker 3: That's right.
456 00:26:37,320 --> 00:26:39,520 Speaker 2: The other half of the people just had to sit 457 00:26:39,560 --> 00:26:41,880 Speaker 2: there and think about it for ten minutes, and then 458 00:26:41,960 --> 00:26:44,280 Speaker 2: they were given a chance to criticize the person who 459 00:26:44,280 --> 00:26:47,480 Speaker 2: insulted them. And if you subscribe to Freud and the 460 00:26:47,520 --> 00:26:51,960 Speaker 2: catharsis theory, then the nail pounders would have been, you know, relieved, 461 00:26:52,280 --> 00:26:53,879 Speaker 2: and their aggression would have been let out and they 462 00:26:53,880 --> 00:26:56,800 Speaker 2: would have been less hostile. But the exact opposite happened. 463 00:26:56,800 --> 00:26:59,640 Speaker 2: They were even more hostile toward the actor after they pounded those nails 464 00:26:59,680 --> 00:27:02,080 Speaker 2: than the people who did nothing. 465 00:27:02,560 --> 00:27:06,280 Speaker 1: Yeah, that was Hornberger in the fifties. It's still being 466 00:27:06,359 --> 00:27:11,080 Speaker 1: proven today.
There's a psychologist named Brad Bushman who made 467 00:27:11,080 --> 00:27:15,280 Speaker 1: a slightly more robust study of the whole thing, but 468 00:27:15,320 --> 00:27:19,680 Speaker 1: it followed essentially the same methodology, where you were thinking 469 00:27:19,760 --> 00:27:22,360 Speaker 1: about whoever you wanted to get revenge on while you were 470 00:27:22,400 --> 00:27:26,560 Speaker 1: hitting a punching bag, or another group was hitting a 471 00:27:26,600 --> 00:27:28,879 Speaker 1: punching bag, but they were told to think about the 472 00:27:28,880 --> 00:27:32,160 Speaker 1: health benefits of boxing, and then the other one didn't 473 00:27:32,200 --> 00:27:36,359 Speaker 1: punch anything, the third group. And they found that the 474 00:27:36,440 --> 00:27:39,080 Speaker 1: rumination group was the one who displayed the most anger, 475 00:27:40,160 --> 00:27:42,960 Speaker 1: and then the distraction group, who were also punching the 476 00:27:42,960 --> 00:27:46,840 Speaker 1: bag but thinking about how great boxing is, they were second. 477 00:27:47,600 --> 00:27:49,920 Speaker 1: And then the last group, the people who didn't punch anything, 478 00:27:50,000 --> 00:27:52,360 Speaker 1: they were the happiest, they were the least hostile as 479 00:27:52,400 --> 00:27:57,080 Speaker 1: a result afterward. Yeah, and you know, again we're getting 480 00:27:57,080 --> 00:27:59,840 Speaker 1: into social psychology territory here. But these people are working 481 00:27:59,840 --> 00:28:03,240 Speaker 1: as best they can while staying within ethical boundaries, 482 00:28:03,280 --> 00:28:06,560 Speaker 1: like you can't actually harm somebody, but they do have 483 00:28:06,640 --> 00:28:13,000 Speaker 1: ways of making you feel insulted or cheated. That's another 484 00:28:13,040 --> 00:28:16,120 Speaker 1: big one too.
And when you're in one of these experiments, 485 00:28:16,440 --> 00:28:19,120 Speaker 1: you don't know that they're researching revenge. You think they're 486 00:28:19,119 --> 00:28:21,560 Speaker 1: researching how well you can play like a game with 487 00:28:21,640 --> 00:28:24,199 Speaker 1: others or something. You have no idea that that's what 488 00:28:24,240 --> 00:28:27,359 Speaker 1: they're researching. So there are some pretty good models for 489 00:28:27,440 --> 00:28:30,400 Speaker 1: testing revenge without actually putting anyone in harm's way. 490 00:28:31,119 --> 00:28:31,840 Speaker 3: Yeah, for sure. 491 00:28:32,720 --> 00:28:35,160 Speaker 2: Should we take a break? Yeah, let's. All right, let's 492 00:28:35,160 --> 00:28:37,200 Speaker 2: take a break, and we'll talk a little bit about how 493 00:28:38,840 --> 00:28:48,880 Speaker 2: sometimes revenge can feel good and explain those studies as well. 494 00:29:06,400 --> 00:29:08,880 Speaker 2: All right, before we broke, we talked about the fact 495 00:29:08,920 --> 00:29:13,920 Speaker 2: that revenge, basically actually undertaking the revenge, or in these 496 00:29:13,920 --> 00:29:16,160 Speaker 2: studies at least it's not revenge, but, you know, letting 497 00:29:16,160 --> 00:29:19,280 Speaker 2: out that aggression, thinking about the person who did you wrong 498 00:29:19,400 --> 00:29:20,920 Speaker 2: while you punch a heavy bag. 499 00:29:21,240 --> 00:29:22,520 Speaker 3: Really just makes you feel worse. 500 00:29:23,840 --> 00:29:27,840 Speaker 2: Revenge, under the right circumstances, can make you feel better, apparently, 501 00:29:28,280 --> 00:29:33,400 Speaker 2: according to the studies of German psychologist Mario, I'm sorry, 502 00:29:33,440 --> 00:29:39,920 Speaker 2: he's a psychological scientist, Mario Gollwitzer, and this gets a 503 00:29:39,960 --> 00:29:44,480 Speaker 2: little interesting.
I think he talked about comparative suffering and 504 00:29:45,280 --> 00:29:49,200 Speaker 2: the notion that seeing the person who wronged 505 00:29:49,280 --> 00:29:53,800 Speaker 2: you suffer might restore a balance, an emotional balance, between 506 00:29:53,880 --> 00:29:57,040 Speaker 2: yourself and the universe at large, even. And then his 507 00:29:57,160 --> 00:30:00,920 Speaker 2: other theory, what's called the understanding hypothesis, which 508 00:30:01,000 --> 00:30:04,640 Speaker 2: is that if the person who did you wrong, 509 00:30:05,600 --> 00:30:08,560 Speaker 2: if they suffer, that's fine, but that's really not enough. 510 00:30:08,560 --> 00:30:11,400 Speaker 2: They have to know that they're suffering because of what 511 00:30:11,440 --> 00:30:14,440 Speaker 2: they did to you. Yeah, and that can actually 512 00:30:14,520 --> 00:30:19,080 Speaker 2: bring some, I don't know about positive emotions, but make 513 00:30:19,160 --> 00:30:21,760 Speaker 2: someone feel good as opposed to feeling worse. 514 00:30:21,840 --> 00:30:24,520 Speaker 1: Yeah. And he came up with a pretty clever experiment 515 00:30:24,560 --> 00:30:28,480 Speaker 1: to test which one was correct, comparative suffering or the understanding hypothesis. 516 00:30:29,040 --> 00:30:32,720 Speaker 1: And essentially what happened was the research participants thought that 517 00:30:32,760 --> 00:30:35,840 Speaker 1: they were trying to compete for raffle tickets with another 518 00:30:35,920 --> 00:30:38,560 Speaker 1: person who was in another room.
They were paired up 519 00:30:38,600 --> 00:30:42,080 Speaker 1: with a partner, team member, yeah, partner, and after they 520 00:30:42,080 --> 00:30:44,120 Speaker 1: won all the raffle tickets, they were told that they 521 00:30:44,120 --> 00:30:47,400 Speaker 1: and the partner could divvy them up between them, and 522 00:30:47,880 --> 00:30:51,520 Speaker 1: basically all of the participants, you know, cut them in half, 523 00:30:51,640 --> 00:30:54,640 Speaker 1: distributed them evenly. But they found that the other people 524 00:30:55,000 --> 00:30:57,920 Speaker 1: had really shorted them on their tickets. Their partner had 525 00:30:57,960 --> 00:31:01,440 Speaker 1: really kept a bunch of tickets rather than distributing them evenly. 526 00:31:01,800 --> 00:31:05,959 Speaker 1: So they had been wronged in some way, and they were 527 00:31:06,000 --> 00:31:09,520 Speaker 1: given a chance to right that wrong by carrying out revenge. 528 00:31:10,320 --> 00:31:15,800 Speaker 1: They were allowed to redistribute the tickets, like a second chance. 529 00:31:16,040 --> 00:31:19,640 Speaker 1: And in that case they almost invariably screwed the other 530 00:31:19,720 --> 00:31:22,920 Speaker 1: person over in return. So they enacted revenge. 531 00:31:23,000 --> 00:31:26,800 Speaker 1: And then this is where Gollwitzer really kind of shone 532 00:31:26,920 --> 00:31:30,480 Speaker 1: for me. He figured out a way to test how 533 00:31:30,560 --> 00:31:33,480 Speaker 1: satisfied those people were with that act of revenge. 534 00:31:34,400 --> 00:31:37,000 Speaker 2: Yeah, I think sixty percent of the people 535 00:31:38,440 --> 00:31:42,160 Speaker 2: ended up shorting them in return, sometimes even more than 536 00:31:42,160 --> 00:31:44,680 Speaker 2: they were shorted to begin with. Like they came back 537 00:31:44,720 --> 00:31:48,240 Speaker 2: double, Emily-style.
So he went that one extra step, 538 00:31:48,320 --> 00:31:50,520 Speaker 2: like you were saying, and he said, all right, here's 539 00:31:50,600 --> 00:31:55,280 Speaker 2: what you do. Now you can write a note to 540 00:31:55,400 --> 00:31:59,360 Speaker 2: the person and say whatever you want. You can reference, 541 00:31:59,400 --> 00:32:03,240 Speaker 2: you know, the justice of it. So one person wrote, sorry 542 00:32:03,280 --> 00:32:05,760 Speaker 2: for taking the tickets away. And remember, now it gets 543 00:32:05,800 --> 00:32:08,360 Speaker 2: a little convoluted, but this is someone who initially was 544 00:32:08,400 --> 00:32:11,640 Speaker 2: shorted and then took revenge by shorting the other person, 545 00:32:11,800 --> 00:32:14,080 Speaker 2: maybe even more, right? And so they sent them a 546 00:32:14,080 --> 00:32:16,800 Speaker 2: note that said, sorry for taking the tickets away, but unfortunately 547 00:32:17,080 --> 00:32:21,239 Speaker 2: you only cared about yourself. That's so childish. So they 548 00:32:21,280 --> 00:32:23,920 Speaker 2: would write a note. Many, many of them would 549 00:32:23,960 --> 00:32:27,760 Speaker 2: write notes saying, I really want you to understand, 550 00:32:27,880 --> 00:32:30,040 Speaker 2: this is why you're getting shorted. Right. 551 00:32:30,160 --> 00:32:34,080 Speaker 1: So then what Gollwitzer figured out was that he could 552 00:32:34,080 --> 00:32:38,920 Speaker 1: test the understanding hypothesis and comparative suffering by getting two different 553 00:32:39,040 --> 00:32:42,360 Speaker 1: kinds of notes to the people, that group of participants 554 00:32:42,560 --> 00:32:44,960 Speaker 1: that had carried out revenge and then sent a note 555 00:32:44,960 --> 00:32:48,280 Speaker 1: saying, I wronged you because you wronged me. And the 556 00:32:48,320 --> 00:32:51,239 Speaker 1: first note was kind of, it was like contrition.
They 557 00:32:51,240 --> 00:32:53,240 Speaker 1: were saying, yeah, I understand you really gave it to 558 00:32:53,320 --> 00:32:56,320 Speaker 1: me because I had wronged you initially. And then the 559 00:32:56,320 --> 00:32:58,920 Speaker 1: other note was like, hey, you way overdid it. 560 00:32:58,960 --> 00:33:00,720 Speaker 1: I didn't do it that bad to you. I'm a 561 00:33:00,760 --> 00:33:08,600 Speaker 1: little indignant. And so if comparative suffering was correct, 562 00:33:09,160 --> 00:33:11,640 Speaker 1: just knowing that those people had been put out by 563 00:33:11,680 --> 00:33:15,760 Speaker 1: the revenge, the retaliation, should have been satisfying enough. But 564 00:33:15,880 --> 00:33:18,160 Speaker 1: what Gollwitzer found was that that's not the case at all. 565 00:33:18,520 --> 00:33:22,560 Speaker 1: That the group that got the note back that said, man, 566 00:33:22,640 --> 00:33:24,880 Speaker 1: you really stuck it to me and I feel like 567 00:33:24,920 --> 00:33:29,040 Speaker 1: a schmo because of what I did to you, they 568 00:33:29,080 --> 00:33:33,440 Speaker 1: were far more satisfied than the people who had just 569 00:33:33,480 --> 00:33:35,800 Speaker 1: gotten the note back saying, like, I'm a little indignant, 570 00:33:35,800 --> 00:33:38,600 Speaker 1: you overdid it. So just knowing that they suffered was 571 00:33:38,640 --> 00:33:42,280 Speaker 1: not enough. They had to know why they were suffering. 572 00:33:43,040 --> 00:33:49,120 Speaker 2: Yeah, I think that's really interesting. Like, I guess it's 573 00:33:49,160 --> 00:33:53,280 Speaker 2: the idea of, like, if somebody, and trust me, 574 00:33:53,400 --> 00:33:55,560 Speaker 2: no one should ever do anything like this, it means 575 00:33:55,560 --> 00:33:57,920 Speaker 2: you're a truly bad person if you do.
But if 576 00:33:57,920 --> 00:33:59,920 Speaker 2: you engage in road rage and someone cuts you off 577 00:34:00,240 --> 00:34:02,200 Speaker 2: and you follow them to the gas station and like 578 00:34:02,440 --> 00:34:05,720 Speaker 2: cut their tire when they're in the store and leave, 579 00:34:06,240 --> 00:34:10,440 Speaker 2: that wouldn't be as satisfying scientifically as if you do 580 00:34:10,560 --> 00:34:12,560 Speaker 2: that and leave a note that says, like, you know, 581 00:34:12,640 --> 00:34:14,480 Speaker 2: this is what you get for cutting me off. 582 00:34:14,360 --> 00:34:18,719 Speaker 1: Right, exactly. So you said something in there that I 583 00:34:18,800 --> 00:34:23,040 Speaker 1: think is really important too, that that group who retaliated, 584 00:34:23,200 --> 00:34:27,520 Speaker 1: the participants, when they retaliated, they often distributed things even 585 00:34:27,560 --> 00:34:31,200 Speaker 1: more unfairly than their partner had initially. Right, right. 586 00:34:31,280 --> 00:34:33,840 Speaker 1: And that's something that's a big problem with the cycle 587 00:34:33,880 --> 00:34:37,279 Speaker 1: of revenge that a researcher named Arlene Stillwell from the State 588 00:34:37,440 --> 00:34:41,200 Speaker 1: University of New York at Potsdam pointed out. The problem 589 00:34:41,360 --> 00:34:44,359 Speaker 1: is that when you are on the side of 590 00:34:44,400 --> 00:34:49,160 Speaker 1: avenging yourself for a wrong, you think that after you've 591 00:34:49,200 --> 00:34:53,600 Speaker 1: done that, things are right again. You've created equilibrium in 592 00:34:53,960 --> 00:34:58,640 Speaker 1: the moral universe again.
But when you're the recipient of vengeance, 593 00:34:59,200 --> 00:35:02,560 Speaker 1: you feel like what that person did was disproportionate to the wrong 594 00:35:02,640 --> 00:35:05,360 Speaker 1: that you inflicted, and so, like I was saying earlier, 595 00:35:05,440 --> 00:35:07,160 Speaker 1: now you feel like you might need to get them 596 00:35:07,200 --> 00:35:09,360 Speaker 1: back again. And it just goes tit for tat and 597 00:35:09,400 --> 00:35:13,759 Speaker 1: tit for tat. And that's why the safest, smartest, most 598 00:35:13,920 --> 00:35:16,560 Speaker 1: highly evolved, most Buddhist thing you can do is to 599 00:35:16,640 --> 00:35:19,200 Speaker 1: just short circuit the whole thing and let it go 600 00:35:19,520 --> 00:35:22,960 Speaker 1: and just move on and know, yes, you've been morally wronged, 601 00:35:22,960 --> 00:35:25,920 Speaker 1: and you have the power within you to not do 602 00:35:26,000 --> 00:35:29,600 Speaker 1: a thing about it and like live a happier life 603 00:35:29,600 --> 00:35:31,560 Speaker 1: than you would if you did something about it. 604 00:35:32,520 --> 00:35:35,799 Speaker 2: Yeah. And I think that kind of holds with, they 605 00:35:35,840 --> 00:35:38,520 Speaker 2: found some pretty good research on what's called impact bias, 606 00:35:39,040 --> 00:35:43,480 Speaker 2: and that's the idea that people tend to overestimate how 607 00:35:43,560 --> 00:35:46,799 Speaker 2: much, like, one kind of, even sometimes small, single thing 608 00:35:47,360 --> 00:35:48,600 Speaker 2: will affect their future. 609 00:35:48,640 --> 00:35:49,600 Speaker 3: They overestimate it. 610 00:35:49,640 --> 00:35:52,040 Speaker 2: And the example he used is like a kid, a 611 00:35:52,120 --> 00:35:53,960 Speaker 2: high school kid, saying, well, if I don't get an 612 00:35:53,960 --> 00:35:55,759 Speaker 2: A in this class, it'll ruin my chances to get 613 00:35:55,760 --> 00:35:59,000 Speaker 2: into a good college.
And that's probably overestimating things, because 614 00:35:59,080 --> 00:36:03,080 Speaker 2: getting into a college is about more than this one class, maybe, 615 00:36:03,480 --> 00:36:08,800 Speaker 2: or this one test. But with anger, apparently, it goes 616 00:36:08,840 --> 00:36:11,120 Speaker 2: the opposite way. People will underestimate how hard it is 617 00:36:11,160 --> 00:36:15,239 Speaker 2: to shake angry thoughts. So you might think that you 618 00:36:15,280 --> 00:36:18,359 Speaker 2: can get over something by committing the act of revenge, but 619 00:36:18,600 --> 00:36:22,160 Speaker 2: you're really underestimating it, and those feelings are going 620 00:36:22,239 --> 00:36:24,640 Speaker 2: to stick with you even past that revenge act. 621 00:36:24,760 --> 00:36:28,000 Speaker 1: Yeah. With anger in particular, it's its own thing. It 622 00:36:28,000 --> 00:36:31,520 Speaker 1: doesn't follow the rules of other emotions, right yeah. And 623 00:36:31,719 --> 00:36:34,520 Speaker 1: that is of course part and parcel with revenge. You 624 00:36:34,560 --> 00:36:38,239 Speaker 1: are angry, maybe even hateful, and you have to carry 625 00:36:38,280 --> 00:36:40,799 Speaker 1: out some sort of act of vengeance, right yeah. 626 00:36:40,800 --> 00:36:43,239 Speaker 2: And I think there's also something to the idea that 627 00:36:45,480 --> 00:36:49,200 Speaker 2: even though you think committing that act of revenge will 628 00:36:50,000 --> 00:36:53,200 Speaker 2: fulfill you, what it does in the end is, 629 00:36:53,680 --> 00:36:55,480 Speaker 2: you've heard about, like, you know, you brought me down 630 00:36:55,480 --> 00:36:58,239 Speaker 2: to your level. Like if you go and slash that 631 00:36:58,239 --> 00:37:01,319 Speaker 2: guy's tire for cutting you off, you know, I got you.
632 00:37:01,800 --> 00:37:04,120 Speaker 2: But those negative thoughts about yourself are going to creep 633 00:37:04,120 --> 00:37:07,560 Speaker 2: in, because you have now stooped and done something even 634 00:37:07,640 --> 00:37:10,320 Speaker 2: worse than cutting someone off. You, you know, cost someone 635 00:37:10,360 --> 00:37:13,160 Speaker 2: money and ruined their property and potentially created a danger 636 00:37:13,200 --> 00:37:13,520 Speaker 2: for them. 637 00:37:14,400 --> 00:37:16,799 Speaker 1: You know, well, even if you were cut off by 638 00:37:16,800 --> 00:37:19,560 Speaker 1: the person, I've seen people do this too, and you 639 00:37:20,120 --> 00:37:21,839 Speaker 1: rev your engine and catch up to them and cut 640 00:37:21,920 --> 00:37:25,239 Speaker 1: them off. Right, so it's literally tit for tat, right? 641 00:37:25,520 --> 00:37:29,520 Speaker 1: Nothing was done beyond that, it's completely even, and you're 642 00:37:29,560 --> 00:37:33,239 Speaker 1: still gonna feel bad about having done that. And it's 643 00:37:33,239 --> 00:37:38,359 Speaker 1: insane how it happens. Like you are just driven by 644 00:37:38,400 --> 00:37:40,799 Speaker 1: this rage, just feeling like this is what you're 645 00:37:40,840 --> 00:37:42,960 Speaker 1: supposed to be doing, this is what the universe is 646 00:37:43,160 --> 00:37:46,200 Speaker 1: demanding you do to set things right again. And then 647 00:37:46,239 --> 00:37:49,640 Speaker 1: the moment you do it, you feel terrible about yourself 648 00:37:49,680 --> 00:37:51,959 Speaker 1: in one way or another. And it is just such 649 00:37:51,960 --> 00:37:56,759 Speaker 1: a, just such a BS evolutionary relic, that like you're 650 00:37:56,800 --> 00:38:00,919 Speaker 1: being manipulated by genetics at that point. Yeah, at that moment, 651 00:38:01,000 --> 00:38:03,640 Speaker 1: you're being manipulated. You are a puppet.
And so the 652 00:38:03,640 --> 00:38:05,960 Speaker 1: best thing you can do to control your own destiny 653 00:38:06,160 --> 00:38:08,839 Speaker 1: again is to say, man, that guy cut me off. 654 00:38:08,960 --> 00:38:11,439 Speaker 1: He's a putz. Or even better, that guy cut me off, 655 00:38:11,640 --> 00:38:15,120 Speaker 1: maybe his leg is bleeding and he's 656 00:38:15,160 --> 00:38:17,680 Speaker 1: got to get to the hospital. You know, there are 657 00:38:17,719 --> 00:38:19,319 Speaker 1: a lot of things you can do, but when you 658 00:38:19,440 --> 00:38:24,520 Speaker 1: do that, you're overcoming your genetic destiny and taking it 659 00:38:24,560 --> 00:38:26,000 Speaker 1: in an even better direction. 660 00:38:27,120 --> 00:38:27,640 Speaker 3: Yeah. 661 00:38:27,760 --> 00:38:31,120 Speaker 2: I mean, it's tough stuff, man, to follow that path, 662 00:38:32,360 --> 00:38:35,080 Speaker 2: for me, for a lot of people. I have tried 663 00:38:35,120 --> 00:38:38,720 Speaker 2: and tried, the older I've gotten, to 664 00:38:38,760 --> 00:38:43,440 Speaker 2: try, right? It's hard, but try to think, like, 665 00:38:44,239 --> 00:38:47,000 Speaker 2: what happened to that person? Why are they like that? 666 00:38:47,080 --> 00:38:49,520 Speaker 2: Like when you see someone who you can tell is 667 00:38:49,560 --> 00:38:51,839 Speaker 2: a bad person, not someone who cuts you off in traffic, but. 668 00:38:52,080 --> 00:38:53,600 Speaker 1: Like if you see someone do a hit-and-run in traffic.
669 00:38:55,000 --> 00:38:58,000 Speaker 2: No, but when you know someone is doing the wrong thing, 670 00:38:58,040 --> 00:39:00,960 Speaker 2: and someone is just a bad human doing bad things, 671 00:39:01,760 --> 00:39:04,360 Speaker 2: I try to find some empathy of like what happened 672 00:39:04,360 --> 00:39:06,960 Speaker 2: to them that made them that way and what happened 673 00:39:07,000 --> 00:39:10,400 Speaker 2: to them today to make them that angry, And I 674 00:39:11,120 --> 00:39:12,839 Speaker 2: try to seek those moments out. 675 00:39:12,880 --> 00:39:13,880 Speaker 3: It's tough though. 676 00:39:13,719 --> 00:39:18,920 Speaker 1: I'm trying too, But to be real, there's a crew 677 00:39:19,280 --> 00:39:25,399 Speaker 1: of appliance delivery dudes who scratched my wood floor two 678 00:39:25,480 --> 00:39:29,160 Speaker 1: years ago, and the company has refused to pay, 679 00:39:29,440 --> 00:39:32,319 Speaker 1: refused to pay over a year ago. Yeah, and I 680 00:39:32,560 --> 00:39:35,960 Speaker 1: still am like, should I get those guys back? And 681 00:39:36,000 --> 00:39:38,400 Speaker 1: if so, how? Like, well, you break into their 682 00:39:38,400 --> 00:39:40,799 Speaker 1: house and scratch their floor? Man? Yeah, tit for tat? 683 00:39:40,920 --> 00:39:41,080 Speaker 2: Right? 684 00:39:41,719 --> 00:39:44,640 Speaker 1: Yeah, but that also dovetails with that idea, that 685 00:39:44,960 --> 00:39:47,000 Speaker 1: tit for tat, or even better, an eye for an 686 00:39:47,000 --> 00:39:49,640 Speaker 1: eye leaves the whole world blind, is part of that 687 00:39:49,840 --> 00:39:53,879 Speaker 1: escalation and cycle of retaliation.
Which is such a great 688 00:39:53,960 --> 00:39:56,680 Speaker 1: quote and everybody attributes it to Gandhi, but apparently it 689 00:39:56,760 --> 00:39:59,680 Speaker 1: was not word for word, but the sentiment was 690 00:39:59,719 --> 00:40:03,239 Speaker 1: by a Canadian named George Perry Graham. He was a journalist 691 00:40:03,520 --> 00:40:10,120 Speaker 1: and a politician and a huge fan of the Raptors. 692 00:40:10,760 --> 00:40:15,239 Speaker 1: No, Gordon Lightfoot, oh, of course, even though it comes 693 00:40:15,239 --> 00:40:17,279 Speaker 1: back to G.L. I think he was probably dead 694 00:40:17,320 --> 00:40:20,239 Speaker 1: long before Gordon Lightfoot was alive. I just wanted to 695 00:40:20,239 --> 00:40:23,200 Speaker 1: give a little shout out for Canada's sake. 696 00:40:24,600 --> 00:40:26,080 Speaker 2: Well, why don't we wrap it all up with this 697 00:40:26,239 --> 00:40:30,800 Speaker 2: idea of mutually assured destruction because it dovetails nicely, right. 698 00:40:31,000 --> 00:40:35,759 Speaker 1: It does, Chuck, and here's why, I think, anyway. So, 699 00:40:37,080 --> 00:40:39,040 Speaker 1: just as a refresher. We've talked about it before. The 700 00:40:39,080 --> 00:40:42,960 Speaker 1: doctrine of mutually assured destruction was that if you are 701 00:40:43,000 --> 00:40:45,680 Speaker 1: a nuclear superpower, say the Soviet Union during the Cold 702 00:40:45,719 --> 00:40:48,799 Speaker 1: War or the United States during the Cold War, if 703 00:40:48,840 --> 00:40:53,759 Speaker 1: you launched an initial first strike, the other side, even 704 00:40:53,800 --> 00:40:56,520 Speaker 1: though they were doomed because your nuclear warheads were in 705 00:40:56,560 --> 00:40:59,600 Speaker 1: the air and going to come and kill everybody there, 706 00:41:00,160 --> 00:41:03,640 Speaker 1: in the meantime, they were going to launch a retaliatory strike.
707 00:41:04,239 --> 00:41:08,080 Speaker 1: It would do nothing to save anybody's life on their side. 708 00:41:08,360 --> 00:41:11,560 Speaker 1: It wouldn't do anything to stop those missiles from coming. 709 00:41:11,840 --> 00:41:17,040 Speaker 1: It was strictly revenge, and the human awareness of the 710 00:41:18,120 --> 00:41:20,160 Speaker 1: concept of revenge and that that was a very real 711 00:41:20,239 --> 00:41:22,960 Speaker 1: thing that the other side really would do. That is 712 00:41:23,000 --> 00:41:26,319 Speaker 1: what kept people from carrying out an initial strike during 713 00:41:26,360 --> 00:41:29,719 Speaker 1: the Cold War, according to mutually assured destruction doctrine. 714 00:41:29,840 --> 00:41:30,279 Speaker 3: That's right. 715 00:41:30,680 --> 00:41:32,400 Speaker 1: Oh, did I take up the whole thing? I'm sorry, 716 00:41:32,760 --> 00:41:33,320 Speaker 1: you shouldn't be. 717 00:41:33,800 --> 00:41:35,480 Speaker 2: I mean, I think it's a great way to 718 00:41:35,520 --> 00:41:38,920 Speaker 2: end it. For all this talk of revenge, it's like, 719 00:41:39,960 --> 00:41:42,880 Speaker 2: is that the thing that has kept humanity on the earth? 720 00:41:43,320 --> 00:41:45,960 Speaker 1: Yeah, I mean, that's what some people say. And 721 00:41:46,000 --> 00:41:50,120 Speaker 1: I mean, Yeah, the doctrine of mutually assured destruction keeps 722 00:41:50,120 --> 00:41:53,040 Speaker 1: getting questioned, like, was that really what was keeping 723 00:41:53,320 --> 00:41:55,279 Speaker 1: things in check? Or was there really like behind the 724 00:41:55,280 --> 00:41:57,319 Speaker 1: scenes stuff we didn't know about. And it seems like 725 00:41:57,400 --> 00:42:00,480 Speaker 1: more and more it really was keeping things in check.
726 00:42:00,520 --> 00:42:02,960 Speaker 1: And it seems to be because there was a total 727 00:42:03,000 --> 00:42:06,040 Speaker 1: awareness that the other side would kill you just because 728 00:42:06,080 --> 00:42:10,759 Speaker 1: you killed them first. Yeah, so that's revenge, everybody. I 729 00:42:10,840 --> 00:42:13,359 Speaker 1: think if there was one point to this episode, it's 730 00:42:13,560 --> 00:42:16,479 Speaker 1: get past it as best you can, and if you can't, 731 00:42:16,520 --> 00:42:18,839 Speaker 1: don't be too hard on yourself. Just try again next time. 732 00:42:19,880 --> 00:42:24,319 Speaker 2: Yeah, try. That narrow path is narrow, and do your best. 733 00:42:24,360 --> 00:42:28,319 Speaker 2: It's hard, we all struggle with it. But see if, 734 00:42:28,320 --> 00:42:30,799 Speaker 2: the next time you want to get revenge on 735 00:42:30,880 --> 00:42:34,239 Speaker 2: somebody in traffic or wherever, see if you can take 736 00:42:34,280 --> 00:42:37,920 Speaker 2: that narrow path and calm yourself down, and you know, 737 00:42:38,000 --> 00:42:40,120 Speaker 2: there's a really good chance you're gonna feel better about 738 00:42:40,120 --> 00:42:41,680 Speaker 2: yourself and the world's going to be a better 739 00:42:41,480 --> 00:42:42,359 Speaker 3: place because of it. 740 00:42:42,640 --> 00:42:45,680 Speaker 1: Or squeeze your steering wheel until you bleed from your 741 00:42:45,719 --> 00:42:50,000 Speaker 1: palms and your butt cheeks. Since Chuck said butt cheeks, 742 00:42:50,000 --> 00:42:52,200 Speaker 1: everybody knows that's time for listener mail. 743 00:42:55,680 --> 00:42:58,640 Speaker 2: This is a great follow up. Quite a while ago, 744 00:42:58,800 --> 00:43:00,759 Speaker 2: not that long ago, but like last year sometime, we 745 00:43:01,440 --> 00:43:02,880 Speaker 2: or it may have been this year, I don't know.
746 00:43:03,080 --> 00:43:05,719 Speaker 2: We read an email about a guy who had the 747 00:43:05,760 --> 00:43:06,480 Speaker 2: cussing dentist. 748 00:43:07,160 --> 00:43:08,480 Speaker 1: Oh that was just a few months ago. 749 00:43:08,600 --> 00:43:11,560 Speaker 2: Yeah, okay, I have no concept of time anymore. Understood, 750 00:43:13,160 --> 00:43:15,440 Speaker 2: the guy whose dentist cussed, and we liked it because 751 00:43:15,480 --> 00:43:18,160 Speaker 2: my doctor cusses, and I just think it's funny, you know, 752 00:43:18,200 --> 00:43:21,200 Speaker 2: it's a funny story about the dentist dropping an f bomb. 753 00:43:21,200 --> 00:43:22,880 Speaker 2: We got to pull these f and teeth or whatever. 754 00:43:24,120 --> 00:43:26,359 Speaker 2: And then we get this email. Hey guys, my name 755 00:43:26,440 --> 00:43:28,920 Speaker 2: is Ginger and I'm a dental assistant in Blueville, Maine. 756 00:43:29,480 --> 00:43:31,279 Speaker 2: A patient came into our office this week and said 757 00:43:31,280 --> 00:43:34,000 Speaker 2: my boss was famous. She then proceeded to tell us 758 00:43:34,160 --> 00:43:35,600 Speaker 2: that she listened to your show and at the end 759 00:43:35,600 --> 00:43:37,839 Speaker 2: you read letters from viewers, and went on to say 760 00:43:37,840 --> 00:43:40,200 Speaker 2: that the patient wrote in about the dentist that swears. 761 00:43:40,760 --> 00:43:43,200 Speaker 2: I went back found the episode. August. Oh yeah, here 762 00:43:43,200 --> 00:43:46,640 Speaker 2: it is right here, August twenty three. August third? Really. 763 00:43:46,640 --> 00:43:51,919 Speaker 3: Wow, oh boy, that was this month. Well, we read 764 00:43:51,960 --> 00:43:54,239 Speaker 3: it, to be fair, five weeks before that. 765 00:43:55,200 --> 00:43:57,359 Speaker 2: The episode was the last meal ritual and it has 766 00:43:57,440 --> 00:43:59,960 Speaker 2: to be my boss, doctor Travis Castle.
767 00:44:00,080 --> 00:44:02,000 Speaker 3: By the way, I said it was okay to read this. 768 00:44:02,760 --> 00:44:04,719 Speaker 2: By any chance did the patient say who the dentist was? 769 00:44:05,040 --> 00:44:07,320 Speaker 2: Because my boss thinks it could totally be another dentist, 770 00:44:07,360 --> 00:44:10,000 Speaker 2: but I don't believe it. If you could, could you 771 00:44:10,040 --> 00:44:11,839 Speaker 2: send me an email back so I could know if 772 00:44:11,880 --> 00:44:14,759 Speaker 2: I was right? Hehe. So I did 773 00:44:14,719 --> 00:44:15,439 Speaker 3: look this up. 774 00:44:16,600 --> 00:44:19,320 Speaker 2: The original gentleman did not give the name of the dentist, 775 00:44:19,880 --> 00:44:23,120 Speaker 2: but he was a dentist in Maine. And what are 776 00:44:23,160 --> 00:44:26,840 Speaker 2: the chances that there are two cursing dentists in Maine. 777 00:44:26,960 --> 00:44:29,240 Speaker 1: They're essentially zero, probably zero. 778 00:44:29,440 --> 00:44:31,920 Speaker 2: So Ginger goes on to say he does like to cuss, 779 00:44:32,360 --> 00:44:35,360 Speaker 2: obviously not in front of kids, and he's not everyone's 780 00:44:35,400 --> 00:44:36,840 Speaker 2: cup of tea, but he surely is a lot of 781 00:44:36,840 --> 00:44:37,520 Speaker 2: fun to work with. 782 00:44:37,920 --> 00:44:39,280 Speaker 3: He's a genuine dentist. 783 00:44:39,960 --> 00:44:41,919 Speaker 2: I don't even know what that means, but he takes 784 00:44:42,000 --> 00:44:44,040 Speaker 2: no bs from rude patients. I've worked with him for 785 00:44:44,040 --> 00:44:45,759 Speaker 2: two years and I still love to come to work 786 00:44:45,800 --> 00:44:47,960 Speaker 2: every day. Thank you for taking the time to read 787 00:44:48,000 --> 00:44:50,880 Speaker 2: my email. Thanks to our patient, you have a new listener, 788 00:44:51,440 --> 00:44:53,319 Speaker 2: and I look forward to hearing from you.
That is 789 00:44:53,320 --> 00:44:58,279 Speaker 2: from Ginger, the dental assistant of doctor Castleberry. So I 790 00:44:58,320 --> 00:45:01,600 Speaker 2: almost want to save up a cleaning and go see 791 00:45:01,880 --> 00:45:03,560 Speaker 2: Ginger and doctor Castleberry sometime. 792 00:45:03,880 --> 00:45:06,759 Speaker 1: Why not just book a plane ride up to Maine and. 793 00:45:06,760 --> 00:45:09,000 Speaker 2: Get your teeth cleaned and then come on have a 794 00:45:09,000 --> 00:45:10,360 Speaker 2: swim in their cold, cold ocean. 795 00:45:11,000 --> 00:45:16,440 Speaker 1: That email, by the way, was signed Gingervitis. Sorry Ginger, 796 00:45:17,680 --> 00:45:22,319 Speaker 1: sorry to everybody who hates puns, including me. Well, if 797 00:45:22,360 --> 00:45:24,160 Speaker 1: you want to get in touch with us, like Ginger did, 798 00:45:24,200 --> 00:45:26,560 Speaker 1: you can send us an email. Send it off to 799 00:45:26,640 --> 00:45:32,400 Speaker 1: stuff podcast at iHeartRadio dot com. 800 00:45:32,560 --> 00:45:34,879 Speaker 3: Stuff you Should Know is a production of iHeartRadio. 801 00:45:35,360 --> 00:45:38,560 Speaker 2: For more podcasts from iHeartRadio, visit the iHeartRadio app, 802 00:45:38,760 --> 00:45:41,680 Speaker 2: Apple Podcasts, or wherever you listen to your favorite shows.