Hey, you may have noticed this past Saturday you got an extra episode of Stuff You Should Know. That's, uh, SYSK Selects. That's right, it was not a mistake. What we decided to do here after nine-plus years is, um, you know, maybe you don't know that we have nine hundred-plus episodes. Uh, so we're gonna start throwing out... well, I don't want to call it a rerun. No, it's a hand-selected, curated episode by us. Yeah, a classic, if you will. Josh will pick one out, I'll pick one out. It might be newsy, it might just be one of our favorites, and we're gonna run those on Saturday. If you haven't heard it, check it out. If you have, we'd love for you to listen again. Sure. So check it out in your podcast feed. It's as simple as that.

Welcome to Stuff You Should Know from HowStuffWorks.com.

Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck, lean on my shoulder" Bryant, and Jerry. How about a hug, Rowland? No? Actually, I'm sorry, Jerry's here in spirit. Our guest producer today is Noel "my beard heals all" Brown. Yes, everybody knows it's Noel Brown. Are you using your empathy voice? Yeah. Is it working? ...Nobody? Really? That's the beady eyes. Let's say I hit you up for ten dollars. Oh, how are you, sir? I'm feeling empathetic. Good, I'm doing good. I have some very strong opinions on empathy, and not just empathy, but empathy research in particular, as I'm sure you're not at all surprised to hear. I'm not at all surprised to hear. Did you come to the same or similar conclusions as I did? I don't know yet, because we don't talk about this stuff beforehand. That's true, that's how we... that's the magic, going in blind. Did you know that there's, like, an Atlanta magic thing now? What do you mean? A society or something along those lines. I just saw a sign for it in Old Fourth Ward. There seems to be a legitimate magicians' society, like that castle in L.A. Oh, the Magic Castle.
It's not that, but it's probably something that the people who do the Atlanta thing are, I'm sure, aware of: the Magic Castle. Probably. And then you did a double take at the sign and it disappeared in a puff of smoke. That would be great. I went to the Magic Castle once. Lucky. Yeah, it's awesome. I think we had this conversation because I asked you if you'd seen that documentary about the kids' competition at the Magic Castle. Yeah, and I have not, but um, it's supposed to be really good. Yeah, highly recommend it. If you can get in, you gotta know somebody. You gotta know Ben Stiller. Oh really? No, there's a movie that he was in that took place in the Magic Castle, and he was like the bad guy, I think. I don't remember what it was. Maybe it was that documentary.

Well, let's talk empathy, Chuck. Wait, hold on, I have an intro. I have an intro. Are you familiar with Frank Rich, the left-leaning, well, lefty-as-heck essayist? I don't think so. He's good. He's about as good an essayist as you'll find on the left. Um, he's a consultant on Veep. He's hilarious and he knows his stuff, right? He usually writes for Harper's, but he's also got a regular gig at New York magazine. And in New York magazine recently he published a column, I think this week, well, this week as of when we're recording this, and I think it was called, like, "No Sympathy for the Hillbilly" or something like that. And it was basically, and this is really astounding coming from him, it was basically him saying: you know what, um, I know that on the left people tend to be bleeding-heart liberals and want to empathize with everybody and feel everyone else's pain and understand where people are coming from. But I believe that if you're angry that Trump is president, you should be angry at the people who voted him into office as well.
And he's basically beating a drum, which I've also started to see in other places as well, where it's like: no, you don't have to understand people who voted for Trump, you don't have to love your enemy. Let's just go to war with these people. And it's legitimate. He's totally serious, too. And it amounts to basically a call to go to the dark side, to resist everything that, you know, the left has traditionally prided itself on, and just go full-bore culture war against the right. And um, it just seems like a really bad idea to me. But one of the things that stuck out to me about it the most was that it was so contrary to the, um, ethos, the prevailing thought of the time, or at least what made up the Obama administration, which was: we need to be more empathetic, we need to understand people's plight more. And even after Hillary lost, one of the big postmortems was that Hillary didn't connect with blue-collar workers who were out of work. She was totally out of touch with that, she couldn't empathize with them. Well, I think a further postmortem has been, like: Hillary could empathize with those people all day, but they hated her and they were never going to vote for her. And now Frank Rich is saying, so hate them back, is the thing. Again, I disagree with that, but it really points out what a fragile turning point we're at right now, this point in history in America. Are we gonna stay and just keep trying to be empathetic, or are we just gonna go full-bore to the dark side, and everybody's gonna hate everybody who's not like them? Quite an intro. Thank you. For a coastal elite. Oh, I'm not a coastal elite. I'm just kidding, I just like that phrase. I hope I'm not, man. I really don't think I am, and I hope people don't think I am. I do stick my pinky in the air when I take sips of water, and that water has been strained through a, um, Franciscan monk's mouth first.
It's the only water I'll drink. I don't think you can be a coastal elite if you have your roots in Toledo. Right, exactly. You know, and I don't forget where I'm from, man, and my family, you know, has long roots in Tennessee and Mississippi. You know this from reading my Wikipedia page, right? Does it say that you're part Choctaw on there yet? I'm sure it will soon.

All right, so we're talking empathy here. Um, a lot of this sounded familiar, so much so that I, like, quadruple-checked that we had not done this. Um, and I think we've just talked about it a lot, namely in our Mirror Neurons episode. Yeah. Um, and I thought about that one a lot when I was researching this. Well, I think it's definitely a component of empathy, but it's not to be confused with empathy. It's like part of it, I think, is the impression I have. Agreed. So, um, empathy, if you look at our, uh, not-a-great article, they do define it. Um, you know, everyone kind of knows what it is, but just to be clear, it's not sympathy. If you can feel and share someone else's emotions, that's empathy, which is different from sympathy, in that you're not feeling it, but you do care about it. Right, right. It's like, um, you can understand why someone would be feeling like they're feeling. It's intellectual. Yeah, like sympathy's from the brain and empathy is from, say, the heart. Yeah, and a lot of these words, when we get into the definitions of empathy versus compassion, it gets a little, uh, I don't know. Sometimes I feel like people are kind of splitting hairs with that. To me, Chuck, that is a huge red flag that the field is not nearly as established as people like to think.
Like, if there's still confusion on basic terms like empathy and sympathy and they're used interchangeably, it just means that no one is doing the right kind of hardcore research or publishing the right kind of hardcore papers that say this is what it is, or this is what it isn't. I almost just said "this is what it not is." This is what it ain't, no coastal elite.

But there was an original German word, um, Einfühlung, which means "feeling into," and that's where empathy comes from. And if you talk to an expert or a researcher, um, these days, they're gonna talk about a couple of types of empathy: um, affective, or maybe emotional, empathy, and cognitive empathy. And um, the distinction, as it turns out, is pretty important. And to me, well, to me, this is where a little bit of the hair-splitting comes in, because as far as talking about, um, affective empathy versus compassion, like, is it the same thing? Or, I'm sorry, cognitive empathy would be more like compassion, because you're not really taking on someone else's pain. So compassion, I think, is even like a third word. So this is what I came up with. You've got cognitive empathy, which is sympathy, right? You can understand why someone would be feeling a certain way. Then you've got affective empathy, which is what this one dude calls it, okay, which is like you're really putting yourself in that person's shoes and you're feeling how they're feeling. Right. But then compassion, it seems to me, is the end goal of this. That's where you actually move to act. That's where you do something about it. That's where you put your hand on someone's shoulder and say it's gonna be all right, or, you know, here's a check for five hundred dollars, um, get some groceries with it. Who knows what you're gonna do. But to me, compassion is the act, like the action, the end goal of empathy, whether it's cognitive or, um, affective. And that's what I think.
And you know what, this field is so unestablished that I can just say that stuff. Yeah, and it's probably right. Let's just say that it's true. No one can really come along and say definitively that you're not right. Right. Uh, so, you know, to give you an example of what that might mean, affective or emotional empathy: um, if you have a friend or family member going through a very hard time, uh, and they're distraught, and then you are also distraught just like they are, then that is definitely affective empathy. Whereas you're not just like, oh man, you know, your uncle passed away, I'm really sorry to hear that, and I feel terribly for you. But if you are, you know, actively taking that on to the point where you're crying too, and you didn't know the uncle, because that would be the differentiation, right? Like, you don't have a personal stake in it, but you're still taking it on as if it is your own. Yes. And then, depending on your view of things, and we'll talk a lot about this, there's this really great psychologist named Paul Bloom who has basically dedicated a lot of his life to shooting down ideas of how great empathy is. Yeah, I thought he made a lot of good points, and some I didn't quite agree with either. But he's great. He's really good at poking holes in the concept of empathy. But he points out that, um, I guess it's probably good if someone's in a great mood and you're empathetic and sharing in that great mood and amplifying it.
But on the flip side of the coin, if somebody is in a horrifically, tragically sad mood and you're sitting there amplifying that by joining in part and parcel with it, then you're doing a disservice, right? So in some ways... um, well, I'll just say Paul Bloom's whole thesis, and I subscribe to it as well, is that cognitive is far and away the superior of the two types of empathy as far as the ultimate goal, which again, to me, is compassion. Yes. You want to just pepper in some of his stuff as we go? Does that make sense? Because here's a great spot to. Uh, and this is one of the studies, I imagine. I don't know if you had a problem with it, but I had a problem with a lot of these studies. Um, but there was a study, um, at least one, where psychologists asked: how much money will you donate to develop a drug that would save one child's life? Um, and then another group was asked: how much would you donate to develop a drug that would save eight kids? And it was about the same answer. Um, where things changed was when they asked a third group about the one child, but they showed a picture of the kid, and, like, you know, said, this is little Joe, he's fourteen years old, and this is his sad little face. And then donations really shot up. And this is where, um, what was his name? Paul Bloom. Paul Bloom, the psychologist, yeah. This is where Paul Bloom says that, um, this emotional empathy is for the birds, because A, it's, um, narrow, and B, it's very... like, people tend to want to help people that are like them. So it's... I mean, is biased the right word? Super biased. Yeah. And it makes no sense. Not only does it not scale upward as the number of people affected by, say, a tragedy increases, it actually goes the other way, where the more people that are affected by something, the less empathetic a person tends to be.
Whereas if, say, it's one person, and you know that person's name, and you see that person's picture on the news, and yeah, they look like you or your neighbor or your daughter, you're gonna empathize a lot. But at the same time, the same thing could be happening to fifty other people, and if you'd just vote a certain way you could alleviate their suffering, you wouldn't lift a finger to do it, especially if it meant slightly higher taxes for you. So in that sense, empathy makes no sense whatsoever. Yeah. I mean, he even quoted Mother Teresa in this essay, which is, um, quote, "If I look at the mass, I will never act. If I look at the one, I will." So he's going with the heavy hitters there, you know, when you bring Mother Teresa in to kind of make a point. Yeah, but you know, he makes a good point. Oh yeah. And that study does... I didn't have a big problem with that study, because it does kind of prove that out, right? That was Tehila Kogut and, uh, Ilana Ritov, they're psychologists. And then Ritov and another, um, co-author conducted another study, um, that kind of pointed out one of the problems with empathy, which was, they said, okay, um: two different groups of people heard that, um, a vaccine maker's vaccine cost a child her life, killed the child. Now, um, should the vaccine maker be fined? And then one group was told that the fine would probably make the vaccine maker, um, follow guidelines even more strictly and would probably prevent further accidents. And then the, um, other group was told that this fine would probably make the vaccine maker get out of the business, and more people would die because they couldn't get the vaccine. And both groups said that yes, the vaccine maker should be punished with, um, the highest fine possible. Extreme prejudice. Right.
So the upshot of all of this is that, especially with, um, affective empathy as we understand it, it doesn't follow any kind of rational guidelines, the basis of rationality being that many are more important than one, right? And empathy just doesn't go in that direction. Yeah. But, um, interestingly, um, while you can train yourself to be more empathetic, it definitely, to me, feels like something that you were sort of born with to a certain degree, or maybe something you might gain in the formative years. Um, but uh, in Bloom's article, he talks about babies, and as soon as a baby can get up and start getting around, they're gonna try and comfort. Like, if you go into a preschool and there's a baby crying, you will probably see another little baby walking over there and patting the little baby and stroking the baby. There's nothing more adorable. It's pretty adorable, um. And you know, it happens in the animal kingdom, um, although, um, Frans de Waal, the primatologist, notes that it kind of follows humans in a way, in that, um, a chimpanzee might really, um, like, hug a victim of an attack, but it's got to be another chimp. Like, they will smash the brains out of another kind of monkey, maybe, if it wanders into their little village. That, to me, kind of underscores this whole thing. Like, when we look at empathy, the first question that people have is, like, why don't we have more empathy, or why don't we have empathy for everybody? We're all here. And it seems like, based on Frans de Waal's studies and, um, other studies about the evolution of in-group and out-group behavior,
like, we evolved over hundreds of thousands, if not millions, of years, but I guess more than that if you're also looking at the great apes, to see other groups that aren't like us as threatening, right? It makes sense, evolutionarily speaking. Right. And it's only in, like, the last, uh, ten, eleven thousand years that we settled down and started forming cities. But even then there was in-group and out-group. People you didn't recognize were coming to kill you for your crops, so you needed to fight those people. You didn't need to empathize with them, like, "Oh, you're hungry, so you're gonna take my life, I understand." Right. That didn't jibe with natural selection. But then you add jets into the mix, and then TV, and then the Internet, and all of a sudden we're exposed to more in-groups and out-groups and are expected to get along more civilly than ever before. But our evolution hasn't caught up quite enough. Right. So now we're at this point where it's like, okay, we just need to figure out how to empathize more, and this last vestige that's holding back our completely civil global society will fade away. And Frans de Waal put it pretty well. He said this is the challenge of our time: globalization by a tribal species. And that's what we're facing right now. And right now it feels like, at least in the United States, we're backsliding. Yeah. Well, that's a good place to take a break, I think. Yeah, all right. Well, we're gonna come back in just a minute and talk a little bit about something called the racial empathy gap, right after this.

All right, so I promised some talk about race. And there's something called the racial empathy gap. Um, I mean, if you walk around as a living, breathing human being, you can probably tell that that's a thing. But they have done studies on it, and, um, a lot of these studies are a little hinky to me.
But uh, in one, they showed video clips of a needle going into someone's skin, uh, notably a white person's skin at first. And what they found was, um, white people reacted with more empathy when the needle went into white skin than when it went into dark skin. Right, or they showed more signs of distress, like they started to sweat a little more, sure, their hearts started to beat a little faster. Yeah, that's where I think mirror neurons might come into play. Um, right. Yeah, that's brain wiring. That's a huge problem with reading about empathy in the popular media: there are huge jumps from mirror neurons to full-on affective empathy with just the switch of a sentence or the stroke of a headline. And so people are not talking about the same thing. And I'm sure there's plenty of empathy researchers out there that are just like, guys, guys, this is not... like, you're making huge jumps to the conclusion. And everybody's like, shut up, it doesn't matter, we're selling clicks, you know. But so, yes. So it is surely setting off mirror neurons; I don't understand how it's being translated into empathy, aside from... I think a lot of the empathy studies involve self-reporting. So I think what they're doing is they're saying, oh well, uh, subject twenty-nine, um, their heart really started beating, and look, on this questionnaire they filled out, they really consider themselves an empathetic person. Ipso facto, an empathetic person is responding very empathetically right now to seeing this needle. Yeah. Like, what if they painted someone's skin green? Well, they have. They've done violet-tinted skin, and actually, to tell you the truth, as far as correlating with self-reports, um, that does tend to be a pretty good, um, control, because apparently all people respond to that one. Huh. Isn't that interesting?
Um, there is something going on there, though. I mean, we're not, like, discounting that, because they have done studies that show that minorities, um, maybe don't get pain medication like they should compared to white people. Uh, and I don't know, it seems like a racial empathy gap is a pretty decent explanation for that, for sure. Or in the criminal justice system, which we've talked a lot about. Or maybe just in empathy altogether between races. So if you're a judge, though, and you're not following sentencing guidelines, you're just using your own personal biases to hand out sentences, and you have people's lives and futures in your hands... yeah, you're not following the law, you're following your own bias. You're a piece of garbage. Well, that has nothing to do with you being an empathetic person or not. What about that judge who... remember the guy, the swimmer, who raped the girl by the dumpster? It was obvious that judge was kind of like, oh, look at this kid. Like, oh, I don't want to ruin his future. I don't want to ruin his future. Like, that could have been my son. You know, he's kind of like me. There was clearly bias and empathy going on, because he was like him. And there's no way, if that had been some black kid, that he wouldn't have ruled differently. No one can convince me that that's not the truth. Right. And I think that there's another distinction that's eventually going to be hammered out, too. Like, I don't think he was empathizing with that swimmer kid. If he was... I could be wrong, who knows. But I think he was, um, at the very least exhibiting a bias. Yes, he let the kid off the hook, um, because he looked like him. I think he might have been sympathizing with him, though, because he even flat-out said, like, this could ruin his life. Yeah, he was definitely sympathizing, at least, for sure. Boy. Uh, so, um.
Going back a bit to, uh, philosopher Adam Smith, way back in the day: I think he was clearly talking about mirror neurons, even though he didn't know that was a thing at the time, when he wrote that, um, persons of delicate fibres who notice a beggar's sores and ulcers "are apt to feel an itching or uneasy sensation in the correspondent part of their own bodies." I mean, that's absolutely mirror neurons firing off. And we've been saying that a lot. If you don't know what we're talking about, listen to, uh, great episode, Can You Feel Someone Else's Pain? Yeah, Can You Feel Someone Else's Pain? It was from a few years ago, but it was one of my favorites we've ever done, just because it's so fascinating. It really is. But the brain is wired like that, and it's the reason why, and this is, you know, the easiest way to explain it: like, if you see, in a football game, someone's leg get broken, and you literally feel pain shoot through your body, those are mirror neurons. Did you see... there was a Simpsons recently where Kirk Van Houten is back in college, and he goes to, like, high-five... he's like a lacrosse player. He goes to high-five the college mascot, which is like a guy in a suit of armor, and he breaks his wrist in, like, fifty places, and they cut to the sideline and Joe Theismann takes his headset off and throws up. Man, I remember that Theismann thing. I think we talked about that in the episode. Yeah. I don't think I have ever seen it. You don't need to. I think I do, though. Like, how can I be walking and talking through life and not have seen Joe Theismann break his leg? Well, it's one of those things, when you see a body get bent in a very unnatural, like, direction, it's just... yeah, your brain is hardwired to not accept that. I know, it makes you faint, because your brain is like, I can't see anymore.
Speaking of the brain, Chuck, um, let's talk a little bit about the brain. Right. So, um, we've already kind of touched on one of the issues that I think we both have with, um, empathy research, which is that the designs of the studies are just so shoddy it's mind-boggling. But then the other part of it is, like, well, just leave it to neuroscience. But neuroscience is still using the same old fMRIs that it was before. And again, all it's showing is where more oxygen is in a part of the brain, and then we're gonna correlate that to that part of the brain being lit up, so that means that this part of the brain has to do with, um, looking at pictures of boobs; this is the boob region, right? And this is, like, the level that neurology is at as far as behavioral studies go. Right. You put these two together, and this is the state of the art with empathy research. But with the brain, as far as that goes, they have kind of isolated a few different parts. And again, this is kind of like: we think this has to do with this process, just because in trial after trial, the same circuit has been followed, or the same region has lit up, when we've applied this stimulus to different people. Um, so there's good evidence that this does have to do with, say, empathizing or whatever, but it's still a very rudimentary understanding at this point, I think, compared to, say, fifty years from now. Right. So what they think they've figured out is that there's a, um, part of the brain, and I love parts of the brain... The affective empathy part of the brain is called the insular cortex. That's where they think the affective region, or part of the affective region, lies: the anterior insular cortex. And then cognitive empathy, uh, is thought to reside, or originate, in the midcingulate cortex.
And actually, those came from a Monash University research, um, paper that looked at the concentration of gray matter, the density of gray matter, and that's like the neurons, whereas white matter is like the connecting material. Right. Um, and so they're saying people who have, um, really strong affective empathy have denser insular cortices, and people who have really serious cognitive empathy have dense midcingulate cortices. That's where it's at right now. Yeah. They did a pretty interesting test, um, this, uh, Tania Singer and this dude named Matthieu Ricard. He's a Buddhist monk. And I get the idea that they picked this guy because he can very much control his brain and emotions. So what they did was, he's a Buddhist monk, they did some fMRI brain scanning on this guy, and they said, all right, sir, Mr. Ricard... he's like, please call me Matthieu... we would like you to engage in some different types of compassion, uh, meditate and direct that meditation towards people who are suffering. And then they hooked him up to the brain-scan magic machine, and they found that the meditative states... it was actually surprising to them: it did not activate parts of the brain that are usually activated in non-meditators when they think about pain. But he said, you know, it was good for me, basically. It was a warm, positive state. And then they said, all right, now put yourself in this, what, you know, they would call the emotional empathetic state. Um, and I guess he's able to turn that on like a switch, right? Yeah, exactly, and blood just comes out of his nose. Yeah. And different parts of the brain lit up, and he said, "This empathetic sharing very quickly became intolerable to me. I felt emotionally exhausted, very similar to being burnt out." So that's one of the big arguments against this emotional or affective empathy: you can't take on everyone else's pain like this.
Let's say you're a social worker, or you're a nurse or a doctor. Like, it's gonna drive you insane. Oh yeah, well, you'll burn out. It's called empathy distress. Yeah. And when they've talked to patients, like hospital patients, they don't want that either. They want maybe someone who has some sympathy. But patients are more likely to feel better... I was just imagining a doctor coming in and just falling to pieces at your condition. Yeah, well, like you said, you don't want a doctor like that. No, they feel better if their doctor is kind of clinical and reassuring and really seems like they have it together, which makes sense. Yeah, and you don't want somebody who's like, frankly, I couldn't care less whether you live or die. I want somewhere in between those two. Which is where... "Oh my god, you're gonna die!" Like, you don't want that out of your doctor. No. But it seems like the middle of those two ends of the spectrum is where cognitive empathy comes in. Yeah. Well, Chuck, how about we take a break here? Second break? That sounds good. And we'll come back, we promise.

All right, man, what do you want to talk about? Sacha Baron Cohen? I still have never actually looked up whether that's his brother or cousin or what. Yeah. Psychologist Simon Baron-Cohen wrote a book in two thousand eleven called The Science of Evil, and he's, he's way down with empathy. Yeah. And I guess a "thoughtful defender" of empathy is what Bloom describes him as. Um, and he has a ranking system, an empathy curve from zero to six, and zero is no empathy, basically, or sociopath, and six is, I guess, the most hardcore of, uh, emotional empaths. Yeah, you're in, he calls it, a constant state of hyper-arousal. And he had this one woman that he used as his little example, named Hannah, who was a therapist.
It's probably a great job for her, but she's just one of these people that, uh, by all accounts, is just wired that way. Like, her friends and her family and her patients, like, she just really feels for them all. Like, it's not just her job. Which, in some ways, probably helps some people, but in other ways it's probably, number one, off-putting, and even if everybody liked it, it's bad for her in the end. Like, we're not designed to carry everybody's problems and issues with us all the time. Yeah. And that's kind of the main point Bloom is making, that people like Hannah are headed toward burnout. And he also makes the point that friends and family do need a certain amount of that empathy, but you don't want someone that's always, like, in that state. Like, you also want someone that's like, all right, let's turn that frown upside down and let's go out and take a walk, you know. Like, you don't want someone that always cries when you cry, you know. Right. You're just gonna be like, I thought I had a bad day. But, and you can extend that also to, um, the way that people react, in some ways, to, say, like, a mass tragedy or something like that. Like, look at Newtown, right, the Sandy Hook shooting. Twenty small kids were killed. Six adults were also killed at the elementary school. It was the most horrific tragedy, I think, that ever took place in the United States. It was basically the one that everyone who believes in very strict gun control was waiting for, knew was gonna happen sooner or later, and thought, this is gonna be the tipping point. And it didn't happen. Right. What people reacted with was outpourings of donations, lots of stuffed animals. Apparently three for every resident of the town were sent. Um, yeah. And, um, lots of thoughts and prayers.
And if you ever have 582 00:34:10,840 --> 00:34:15,040 Speaker 1: seen, um, you know, Anthony Jeselnik, he 583 00:34:15,080 --> 00:34:17,720 Speaker 1: has a Netflix special, I think it's still on, called 584 00:34:17,760 --> 00:34:20,120 Speaker 1: Thoughts and Prayers, and you watch that and he explains 585 00:34:20,120 --> 00:34:22,359 Speaker 1: to you just how valuable your thoughts and prayers are, 586 00:34:22,440 --> 00:34:26,920 Speaker 1: especially on Twitter, um. But Paul Bloom points out 587 00:34:27,040 --> 00:34:31,200 Speaker 1: that this outpouring actually proved to 588 00:34:31,239 --> 00:34:34,240 Speaker 1: be an additional burden on this town, which was already 589 00:34:34,239 --> 00:34:37,040 Speaker 1: suffering tremendously. Like, they had to, um... There was 590 00:34:37,080 --> 00:34:40,960 Speaker 1: something like eight volunteers who were tasked with handling all 591 00:34:40,960 --> 00:34:44,520 Speaker 1: the donations, um, whether it was stuffed animals or money, 592 00:34:44,640 --> 00:34:47,200 Speaker 1: and they apparently had to get a warehouse to put 593 00:34:47,239 --> 00:34:50,400 Speaker 1: all the stuffed animals in. And I think even some 594 00:34:50,440 --> 00:34:53,200 Speaker 1: of the public officials were like, please stop sending us stuff. 595 00:34:53,560 --> 00:34:55,920 Speaker 1: Send stuff, but send it to other people. We've got 596 00:34:55,960 --> 00:34:58,640 Speaker 1: everything we need, send it to other people. And everyone said, no, 597 00:34:58,800 --> 00:35:01,640 Speaker 1: shut up, this is about us, not you. And I 598 00:35:01,680 --> 00:35:06,440 Speaker 1: think that that's part of, um, affective empathy. That outpouring 599 00:35:06,920 --> 00:35:09,399 Speaker 1: of stuff that seems like a nice gesture, that makes 600 00:35:09,400 --> 00:35:12,759 Speaker 1: you feel better but doesn't actually help in any real 601 00:35:12,840 --> 00:35:17,040 Speaker 1: substantial way, I think that kind of underlies or betrays 602 00:35:17,200 --> 00:35:21,080 Speaker 1: what, um, what affective empathy is all about, and why 603 00:35:21,120 --> 00:35:24,000 Speaker 1: we are moved to do something with affective empathy: 604 00:35:24,040 --> 00:35:27,759 Speaker 1: because we're feeling something right then, and writing a check 605 00:35:27,880 --> 00:35:30,560 Speaker 1: or sending a teddy bear is a good way 606 00:35:30,560 --> 00:35:35,080 Speaker 1: for us to feel better. Whereas cognitive 607 00:35:35,120 --> 00:35:40,040 Speaker 1: empathy would be like, um, I'm going to see to 608 00:35:40,200 --> 00:35:44,759 Speaker 1: it that every senator who blocked the gun control bill 609 00:35:44,840 --> 00:35:48,799 Speaker 1: following Newtown is voted right out of office. That 610 00:35:48,840 --> 00:35:52,440 Speaker 1: would be cognitive empathy. You're empathizing with the parents, you're 611 00:35:52,480 --> 00:35:55,759 Speaker 1: empathizing with future kids who haven't been killed yet, and 612 00:35:55,800 --> 00:35:57,719 Speaker 1: you're gonna do what you can to make sure it 613 00:35:57,760 --> 00:36:01,040 Speaker 1: doesn't happen, rather than writing a check, um, or sending 614 00:36:01,080 --> 00:36:04,319 Speaker 1: a teddy bear.
Those to me are the real distinctions 615 00:36:04,400 --> 00:36:09,280 Speaker 1: between cognitive and affective empathy as far as that ultimate 616 00:36:09,320 --> 00:36:12,759 Speaker 1: goal is concerned, which is, again, compassion. And compassion is 617 00:36:13,920 --> 00:36:16,440 Speaker 1: doing what you can to improve the outcome for the 618 00:36:16,480 --> 00:36:21,480 Speaker 1: greater good. Yeah. Yeah, that's interesting. Another thing 619 00:36:21,520 --> 00:36:24,760 Speaker 1: that, um, kind of jumped out at me was these 620 00:36:24,800 --> 00:36:29,960 Speaker 1: psychologists Vicki Helgeson and Heidi Fritz. They were researching why 621 00:36:30,040 --> 00:36:33,400 Speaker 1: women are more likely, I think twice as likely as 622 00:36:33,440 --> 00:36:37,680 Speaker 1: men, to get depressed and experience depression. And they thought, 623 00:36:37,800 --> 00:36:39,359 Speaker 1: you know, they said, you know what, I think it's 624 00:36:39,360 --> 00:36:42,960 Speaker 1: because women are more empathetic, and, you know, emotionally empathetic. 625 00:36:42,960 --> 00:36:45,400 Speaker 1: And they took this on, and, uh, they said that 626 00:36:45,440 --> 00:36:48,839 Speaker 1: there's a propensity for what they called unmitigated communion, which 627 00:36:48,920 --> 00:36:51,640 Speaker 1: is, quote, an excessive concern with others and placing 628 00:36:51,640 --> 00:36:55,640 Speaker 1: others' needs before one's own, end quote. And they, you know, 629 00:36:55,719 --> 00:36:57,960 Speaker 1: gave people... and this is one of those, like, 630 00:36:58,080 --> 00:37:01,279 Speaker 1: nine-item questionnaires. How much can you really learn? Um. 631 00:37:01,320 --> 00:37:03,920 Speaker 1: But, uh, some of the statements to agree or disagree with 632 00:37:03,960 --> 00:37:05,640 Speaker 1: were like: for me to be happy, I need others 633 00:37:05,680 --> 00:37:07,719 Speaker 1: to be happy; I can't say no when someone asks 634 00:37:07,800 --> 00:37:11,520 Speaker 1: for help; I often worry about others' problems. And kind of 635 00:37:11,560 --> 00:37:14,640 Speaker 1: across the board, women score higher than men do on 636 00:37:14,680 --> 00:37:18,080 Speaker 1: this. And, um, you know, I think a lot of 637 00:37:18,080 --> 00:37:20,480 Speaker 1: that probably has to do with evolution too, with, 638 00:37:21,840 --> 00:37:24,200 Speaker 1: you know, women having to care for their babies right 639 00:37:24,200 --> 00:37:27,040 Speaker 1: out of the gate. Which, Took Took's wife... you 640 00:37:27,080 --> 00:37:29,840 Speaker 1: know, although Took Took, we know, never took a wife. 641 00:37:30,120 --> 00:37:33,240 Speaker 1: Um, Took Took kind of got around. He got around. 642 00:37:33,640 --> 00:37:36,319 Speaker 1: But the women that Took Took would knock up, 643 00:37:37,680 --> 00:37:40,600 Speaker 1: they would immediately be in charge of those babies. And 644 00:37:40,640 --> 00:37:44,040 Speaker 1: that's what, um, that primatologist talked about too, was, you know, 645 00:37:44,160 --> 00:37:47,480 Speaker 1: this is kind of straight-up evolution, or natural selection, 646 00:37:47,560 --> 00:37:49,800 Speaker 1: right out of the gate.
We have this empathy 647 00:37:49,840 --> 00:37:53,759 Speaker 1: because we have to care for young, and then, um, 648 00:37:53,960 --> 00:37:56,400 Speaker 1: I think we already mentioned too, that definitely 649 00:37:56,440 --> 00:38:02,200 Speaker 1: evolves into protect the tribe, right, because we're better off 650 00:38:02,239 --> 00:38:04,279 Speaker 1: if the people around us are healthy and happy and 651 00:38:04,320 --> 00:38:09,759 Speaker 1: ready to ward off attacks. Um. But the idea that 652 00:38:09,880 --> 00:38:15,719 Speaker 1: women are more prone to experience, say, affective empathy, or 653 00:38:15,840 --> 00:38:18,920 Speaker 1: just even empathy in general, actually has 654 00:38:18,960 --> 00:38:21,560 Speaker 1: a biological basis, to tell you the truth, Chuck. 655 00:38:22,000 --> 00:38:27,480 Speaker 1: Um, in adolescence or puberty, apparently girls 656 00:38:27,520 --> 00:38:32,720 Speaker 1: score high for affective empathy throughout their entire adolescence, whereas 657 00:38:32,760 --> 00:38:38,960 Speaker 1: between about ages thirteen and sixteen, boys' affective empathy declines. Yeah, 658 00:38:39,040 --> 00:38:43,000 Speaker 1: and they say, oh, oh, you feel bad? You're about 659 00:38:43,000 --> 00:38:45,600 Speaker 1: to feel worse, because I'm gonna give you a swirly. Yeah. 660 00:38:45,680 --> 00:38:48,640 Speaker 1: I don't know what it is, but it's 661 00:38:48,640 --> 00:38:52,840 Speaker 1: where you stick someone's head in the toilet and flush. A swirly. 662 00:38:52,960 --> 00:38:56,239 Speaker 1: Never heard of that, fortunately. I had only heard of it, 663 00:38:56,360 --> 00:38:58,840 Speaker 1: never witnessed it or had it done to me. We 664 00:38:58,880 --> 00:39:02,560 Speaker 1: did noogies and, uh, was it wedgies, when you did 665 00:39:02,600 --> 00:39:08,759 Speaker 1: the underwear? Yeah. Yeah, they're terrible. They are terrible, and 666 00:39:08,760 --> 00:39:13,040 Speaker 1: that's bullying behavior. And there are some theories about bullies too, 667 00:39:13,120 --> 00:39:18,080 Speaker 1: that they actually use empathy to manipulate people, like they'll 668 00:39:18,200 --> 00:39:23,359 Speaker 1: use it against them. Well, yeah, yeah, they 669 00:39:23,719 --> 00:39:28,719 Speaker 1: use cognitive empathy to calculate the best, most effective way 670 00:39:28,760 --> 00:39:33,080 Speaker 1: to hurt somebody, and then, um, they turn off any 671 00:39:33,120 --> 00:39:37,600 Speaker 1: potential, like, affective empathy, um, when they're actually carrying out 672 00:39:37,600 --> 00:39:41,640 Speaker 1: their act of bullying. Yeah. And with the teenagers too, they 673 00:39:41,680 --> 00:39:46,800 Speaker 1: say that if you develop affective and cognitive empathy, um, 674 00:39:46,880 --> 00:39:49,839 Speaker 1: you're going to be happier, you're gonna argue less 675 00:39:49,840 --> 00:39:54,280 Speaker 1: with your parents, you're gonna have more healthy relationships, which, 676 00:39:54,320 --> 00:39:56,680 Speaker 1: you know, kind of all makes sense. Sure. And they 677 00:39:56,719 --> 00:39:59,440 Speaker 1: also were saying too, and we will get into 678 00:39:59,680 --> 00:40:02,000 Speaker 1: how to increase your own empathy, if you think that 679 00:40:02,080 --> 00:40:04,960 Speaker 1: kind of thing is a good idea. Um.
But that 680 00:40:05,640 --> 00:40:08,759 Speaker 1: babies learn empathy right out of the gate 681 00:40:08,880 --> 00:40:12,799 Speaker 1: by being empathized with, by being treated warmly by their 682 00:40:12,840 --> 00:40:16,720 Speaker 1: parents and other adults, being responded to in a warm manner, 683 00:40:17,040 --> 00:40:20,239 Speaker 1: that actually is the beginning of empathy. And it's 684 00:40:20,280 --> 00:40:21,840 Speaker 1: like you said, you can see a little kid in 685 00:40:21,840 --> 00:40:24,919 Speaker 1: a preschool go over and comfort or console another little kid, 686 00:40:25,680 --> 00:40:28,359 Speaker 1: um, who's in distress. Boy, that's why when I hear 687 00:40:28,360 --> 00:40:33,680 Speaker 1: about neglect, like baby and infant neglect, it's just, man, 688 00:40:33,800 --> 00:40:36,680 Speaker 1: that's like the most heartbreaking thing you can imagine. It's 689 00:40:36,680 --> 00:40:38,759 Speaker 1: like a baby just left in a room to 690 00:40:38,800 --> 00:40:42,719 Speaker 1: cry and cry and cry forever. Plus, also, when we 691 00:40:42,719 --> 00:40:45,560 Speaker 1: were talking about the breastfeeding episode, that body-to-body 692 00:40:45,600 --> 00:40:49,319 Speaker 1: contact of being held has been shown to 693 00:40:49,880 --> 00:40:52,960 Speaker 1: affect their development if they don't have it enough. It's 694 00:40:53,000 --> 00:40:54,920 Speaker 1: just all sorts of terrible things that happen to you 695 00:40:54,920 --> 00:40:58,680 Speaker 1: when you're neglected as a baby. Yeah, it's terrible. So, Chuck, 696 00:40:58,680 --> 00:41:00,640 Speaker 1: there are plenty of people who say, well, we need 697 00:41:00,680 --> 00:41:03,120 Speaker 1: to empathize more, so just get out there and learn 698 00:41:03,120 --> 00:41:05,000 Speaker 1: how to empathize. And there's plenty of people out there 699 00:41:05,040 --> 00:41:07,759 Speaker 1: who will teach you techniques on empathizing with people more, 700 00:41:08,239 --> 00:41:10,879 Speaker 1: and they may be worth trying. Like, I found them 701 00:41:11,000 --> 00:41:16,120 Speaker 1: very helpful in a lot of cases, especially in interpersonal communication. Right. 702 00:41:16,800 --> 00:41:19,440 Speaker 1: But as far as, like, changing the world on a 703 00:41:19,480 --> 00:41:22,279 Speaker 1: massive scale for the better, is it a good 704 00:41:22,280 --> 00:41:25,240 Speaker 1: idea to go out and just empathize, empathize, empathize? Because 705 00:41:25,280 --> 00:41:28,719 Speaker 1: there's a big question mark with that: who exactly are 706 00:41:28,719 --> 00:41:32,040 Speaker 1: you supposed to empathize with? Like, with just about every problem, 707 00:41:32,080 --> 00:41:34,480 Speaker 1: there's a group that's being helped by something and a 708 00:41:34,560 --> 00:41:37,400 Speaker 1: group that's being harmed by something, especially when it 709 00:41:37,440 --> 00:41:40,680 Speaker 1: comes to public policy, right? So which group are you gonna 710 00:41:40,680 --> 00:41:43,680 Speaker 1: empathize with? If you empathize with the current victims and 711 00:41:43,719 --> 00:41:45,920 Speaker 1: you change public policy to help them, well, then you're 712 00:41:45,960 --> 00:41:49,640 Speaker 1: leaving the people who are currently benefiting out in the cold. Right.
713 00:41:49,719 --> 00:41:52,520 Speaker 1: So there's a big question of who you should empathize 714 00:41:52,560 --> 00:41:54,759 Speaker 1: with at any given point in time, which makes this 715 00:41:54,800 --> 00:42:01,400 Speaker 1: whole behavioral science, nudge politics BS that is ultimately behind 716 00:42:01,440 --> 00:42:06,080 Speaker 1: this whole push to empathize more, um... that's not 717 00:42:06,160 --> 00:42:08,960 Speaker 1: taking that into consideration. And then there's kind of 718 00:42:09,000 --> 00:42:12,640 Speaker 1: a second facet to that, which is, studies have found 719 00:42:12,680 --> 00:42:18,440 Speaker 1: that when you increase empathy in people, um, they tend 720 00:42:18,480 --> 00:42:22,200 Speaker 1: to empathize more with their own group, but it also 721 00:42:22,239 --> 00:42:26,160 Speaker 1: in kind increases hostility in those people toward out-groups. 722 00:42:27,120 --> 00:42:29,000 Speaker 1: You know what I'm saying? Like, they see their friend 723 00:42:29,040 --> 00:42:31,239 Speaker 1: who's being hurt as more of a victim, and how 724 00:42:31,280 --> 00:42:34,080 Speaker 1: could you do this to them? And now I want 725 00:42:34,080 --> 00:42:36,440 Speaker 1: to get you back. Because one of the sour sides 726 00:42:36,480 --> 00:42:40,400 Speaker 1: of empathy is that it frequently comes with a taste 727 00:42:40,400 --> 00:42:42,879 Speaker 1: for retribution too, I think is how Paul Bloom put 728 00:42:42,920 --> 00:42:47,799 Speaker 1: it: the dark side of empathy. So yeah, there 729 00:42:47,840 --> 00:42:49,960 Speaker 1: is a dark side. There's a dark side to everything 730 00:42:50,000 --> 00:42:56,359 Speaker 1: in there. Yeah, except you, you're free from all dark side. You're 731 00:42:56,400 --> 00:43:01,640 Speaker 1: all heart, kid. So we'll finish up here with a 732 00:43:01,640 --> 00:43:08,320 Speaker 1: bit on people with autism, because there's this stereotype, um, 733 00:43:08,400 --> 00:43:11,880 Speaker 1: that everyone's probably heard, that, you know what, 734 00:43:11,960 --> 00:43:16,120 Speaker 1: people with autism lack empathy and they don't understand emotions. 735 00:43:16,600 --> 00:43:21,160 Speaker 1: And if you know anybody who, uh, either has autism 736 00:43:21,320 --> 00:43:24,080 Speaker 1: or is a parent of a child with autism, they 737 00:43:24,120 --> 00:43:27,960 Speaker 1: will dispel that myth pretty straight up just from their 738 00:43:27,960 --> 00:43:31,560 Speaker 1: own lives. Um. But these people did some studying and 739 00:43:31,600 --> 00:43:33,640 Speaker 1: some research, because they were like, that's not good enough 740 00:43:33,680 --> 00:43:36,960 Speaker 1: for me, and it's not good enough to just say, like, 741 00:43:37,560 --> 00:43:41,319 Speaker 1: you know, autism is different for everyone, so some 742 00:43:41,400 --> 00:43:44,280 Speaker 1: people with autism show empathy, 743 00:43:44,360 --> 00:43:48,279 Speaker 1: but everyone's different, so who cares about investigating that? Yeah. 744 00:43:48,280 --> 00:43:50,160 Speaker 1: So I really love the approach they took here. They 745 00:43:50,160 --> 00:43:52,680 Speaker 1: really wanted to keep digging, which I 746 00:43:52,680 --> 00:43:55,800 Speaker 1: really respected. So, uh, they said, you know what I 747 00:43:55,840 --> 00:43:57,640 Speaker 1: think might be going on here?
There's this other, 748 00:43:58,000 --> 00:44:06,120 Speaker 1: um, condition called alexithymia, and alexithymia means you have a 749 00:44:06,160 --> 00:44:11,440 Speaker 1: difficult time understanding your own emotions. So you might, you know, 750 00:44:11,560 --> 00:44:14,040 Speaker 1: you might have a feeling that you're experiencing an emotion, 751 00:44:14,320 --> 00:44:16,120 Speaker 1: but you just don't know what it is. And about 752 00:44:16,120 --> 00:44:18,960 Speaker 1: ten percent of people have it in the regular population. 753 00:44:19,320 --> 00:44:23,440 Speaker 1: About fifty percent of people with autism have alexithymia. But they're not 754 00:44:23,520 --> 00:44:28,279 Speaker 1: the same thing. No. And these guys actually found that, um, 755 00:44:28,400 --> 00:44:32,440 Speaker 1: people with autism who do not have alexithymia tend to 756 00:44:32,480 --> 00:44:37,560 Speaker 1: display empathy. Yeah, and even, you know, lots of empathy. 757 00:44:38,239 --> 00:44:42,359 Speaker 1: Lots of empathy. Yeah, empathy. They got binders full of empathy. 758 00:44:42,560 --> 00:44:46,880 Speaker 1: Binders full of empathy. Oh yeah, we remember when that 759 00:44:46,920 --> 00:44:50,160 Speaker 1: was the most controversial thing going in politics. Oh man, 760 00:44:50,440 --> 00:44:53,920 Speaker 1: binders full of empathy. Uh, yeah, like they 761 00:44:53,920 --> 00:44:58,160 Speaker 1: scored, you know, very strong when it came to measuring empathy. Uh. 762 00:44:58,160 --> 00:45:00,200 Speaker 1: And what they did was, you know, it makes 763 00:45:00,200 --> 00:45:02,080 Speaker 1: sense the way they did it. I really like 764 00:45:02,200 --> 00:45:05,560 Speaker 1: this study. They had four groups: uh, individuals with autism 765 00:45:05,680 --> 00:45:11,319 Speaker 1: and alexithymia, uh, individuals with autism without it, individuals with 766 00:45:11,320 --> 00:45:15,040 Speaker 1: alexithymia but not autism, and then people that didn't have 767 00:45:15,120 --> 00:45:21,600 Speaker 1: either one. And it basically seems to kind of prove that, yeah, 768 00:45:21,960 --> 00:45:25,720 Speaker 1: it's just not true that people with autism don't have empathy. 769 00:45:25,920 --> 00:45:29,440 Speaker 1: It's really alexithymia that's what's going on, right? Which is, I 770 00:45:29,480 --> 00:45:32,960 Speaker 1: think, a novel finding, or a novel hypothesis. I don't 771 00:45:33,000 --> 00:45:36,000 Speaker 1: think this is part of a larger field. I think 772 00:45:36,040 --> 00:45:38,480 Speaker 1: these guys came up with that. Yeah. And did 773 00:45:38,520 --> 00:45:41,800 Speaker 1: you see that other study, the, um, one from Goldsmiths, University 774 00:45:41,800 --> 00:45:46,200 Speaker 1: of London, about the facial expressions? I thought that was 775 00:45:46,200 --> 00:45:50,400 Speaker 1: pretty interesting too. Yeah, they investigated that, um, 776 00:45:50,440 --> 00:45:54,680 Speaker 1: if you expose people with autism to the sounds of 777 00:45:54,680 --> 00:45:57,520 Speaker 1: people's voices and ask them to rate what emotion that 778 00:45:57,560 --> 00:46:02,480 Speaker 1: person is experiencing, they're far better, um, at calling that correctly 779 00:46:03,000 --> 00:46:06,919 Speaker 1: than from faces. And apparently it's because people with autism tend 780 00:46:06,960 --> 00:46:10,400 Speaker 1: to spend much less time studying faces, not because they 781 00:46:10,440 --> 00:46:13,800 Speaker 1: can't empathize.
They just aren't using cues that, um, people 782 00:46:13,800 --> 00:46:20,040 Speaker 1: without autism use to, um, conclude what emotions people are experiencing. Yeah, 783 00:46:20,160 --> 00:46:22,439 Speaker 1: really interesting stuff. And I don't know why this didn't 784 00:46:22,440 --> 00:46:26,719 Speaker 1: get more play, because it still seems like people are 785 00:46:26,800 --> 00:46:30,400 Speaker 1: kind of banging that drum that, you know, people with 786 00:46:30,440 --> 00:46:33,040 Speaker 1: autism aren't empathetic. Yeah, I don't know 787 00:46:33,080 --> 00:46:37,279 Speaker 1: why either. It just makes sense that... yeah, um, we need 788 00:46:37,320 --> 00:46:40,480 Speaker 1: to do an entire episode on autism. Yeah, maybe alexithymia. 789 00:46:40,600 --> 00:46:42,400 Speaker 1: I've never heard of that. We also need to do 790 00:46:42,440 --> 00:46:45,480 Speaker 1: one on psychopaths too, which is another group that tends 791 00:46:45,520 --> 00:46:48,440 Speaker 1: to be pointed to, kind of incorrectly, as far 792 00:46:48,440 --> 00:46:51,600 Speaker 1: as empathy goes, where if you're lacking empathy, you're a psychopath. 793 00:46:51,920 --> 00:46:54,040 Speaker 1: It actually turns out that if you have what's called 794 00:46:54,040 --> 00:46:58,560 Speaker 1: a shallow affect, meaning, like, across the board emotionally 795 00:46:58,560 --> 00:47:03,200 Speaker 1: you're pretty stunted and, um, shallow or superficial, that's what 796 00:47:03,320 --> 00:47:07,360 Speaker 1: really qualifies you as a psychopath, not just missing empathy. Um. 797 00:47:07,400 --> 00:47:10,360 Speaker 1: But yet again, it's another popular misconception that's being allowed 798 00:47:10,400 --> 00:47:15,080 Speaker 1: to persist. I'm just irritated, Chuck. I've got a great 799 00:47:15,160 --> 00:47:18,200 Speaker 1: quote, though, from Paul Bloom. And I also want to 800 00:47:18,239 --> 00:47:22,000 Speaker 1: say that I think, um, that the different 801 00:47:22,080 --> 00:47:26,280 Speaker 1: kinds of empathy also get divided among the genders as well. 802 00:47:26,560 --> 00:47:29,240 Speaker 1: And we even talked about that study 803 00:47:29,280 --> 00:47:32,160 Speaker 1: that concluded that women tend to suffer from depression because 804 00:47:32,200 --> 00:47:35,720 Speaker 1: they're more empathetic. I think that maybe that's the case, 805 00:47:35,760 --> 00:47:38,399 Speaker 1: and there is a biological basis for it in adolescence. 806 00:47:38,760 --> 00:47:42,000 Speaker 1: But one thing that seems to persist everywhere is that, 807 00:47:42,160 --> 00:47:45,960 Speaker 1: um, different types of empathy, or different techniques 808 00:47:46,000 --> 00:47:49,920 Speaker 1: to produce empathy, can be learned. They can be taught. 809 00:47:50,440 --> 00:47:52,000 Speaker 1: And I think if you just say, like, well, wait 810 00:47:52,040 --> 00:47:53,879 Speaker 1: a minute, I really want to solve this problem, I'm 811 00:47:53,880 --> 00:47:55,480 Speaker 1: not going to fly off the handle, or I'm not 812 00:47:55,480 --> 00:47:58,400 Speaker 1: gonna lose my marbles, I'm gonna, like, really put some 813 00:47:58,480 --> 00:48:00,560 Speaker 1: thought into it, and I can still be passionate, but 814 00:48:00,560 --> 00:48:03,080 Speaker 1: I don't have to completely experience someone else's pain. I 815 00:48:03,120 --> 00:48:06,680 Speaker 1: don't think that that's a biological imperative one way or another.
816 00:48:06,760 --> 00:48:09,640 Speaker 1: I think if you decide to make a choice or 817 00:48:09,680 --> 00:48:12,160 Speaker 1: a change in the way you approach situations, that has 818 00:48:12,200 --> 00:48:14,520 Speaker 1: nothing to do with gender. So I just wanted to 819 00:48:14,560 --> 00:48:17,440 Speaker 1: point that out. Yeah. And as far as teaching empathy, 820 00:48:18,239 --> 00:48:20,000 Speaker 1: like, there's been a little bit of pooh-poohing of 821 00:48:20,040 --> 00:48:22,720 Speaker 1: emotional empathy, but I think it's definitely 822 00:48:23,000 --> 00:48:24,720 Speaker 1: like a pretty good thing to do as a parent, 823 00:48:24,800 --> 00:48:28,319 Speaker 1: to try and teach your child, like, hey, you know, 824 00:48:28,840 --> 00:48:30,879 Speaker 1: how would you feel if someone was doing this to you? 825 00:48:31,719 --> 00:48:34,680 Speaker 1: And that's how they learn. Yeah, exactly. You don't learn 826 00:48:34,680 --> 00:48:37,360 Speaker 1: it on your own. I think it has to be 827 00:48:37,440 --> 00:48:44,520 Speaker 1: imparted by good parents. Agreed. And, um, again, the goal, 828 00:48:44,719 --> 00:48:46,760 Speaker 1: and this is a Paul Bloom quote, the goal isn't 829 00:48:47,640 --> 00:48:50,719 Speaker 1: to love every single person like you love the 830 00:48:51,160 --> 00:48:54,200 Speaker 1: people closest to you, but to value other people just 831 00:48:54,239 --> 00:48:56,479 Speaker 1: for the very fact that they're human beings. Right, that's 832 00:48:56,560 --> 00:49:00,120 Speaker 1: the goal that everybody's looking for with empathy. And 833 00:49:00,120 --> 00:49:03,319 Speaker 1: he says, quote, our best hope for the future is 834 00:49:03,360 --> 00:49:05,720 Speaker 1: not to get people to think of all humanity as family. 835 00:49:05,800 --> 00:49:09,000 Speaker 1: That's impossible. It lies instead in an appreciation of the 836 00:49:09,040 --> 00:49:12,719 Speaker 1: fact that even if we don't empathize with distant strangers, 837 00:49:13,040 --> 00:49:15,640 Speaker 1: their lives have the same value as the lives of 838 00:49:15,680 --> 00:49:21,040 Speaker 1: those we love. That's the key. Very interesting. Yeah, good stuff, 839 00:49:21,320 --> 00:49:26,280 Speaker 1: good stuff. We should subtitle this one Empathy: A Loosey-Goosey 840 00:49:26,320 --> 00:49:32,239 Speaker 1: Episode, also known as What Paul Bloom Says. Thank 841 00:49:32,280 --> 00:49:36,000 Speaker 1: you, Paul Bloom. Yeah, big big ups to Paul Bloom. Uh, 842 00:49:36,040 --> 00:49:38,520 Speaker 1: and since I said big ups to Paul Bloom, that 843 00:49:38,640 --> 00:49:44,000 Speaker 1: means it's time for listener mail, Chuck. Um, I'm gonna 844 00:49:44,040 --> 00:49:50,920 Speaker 1: call this Hookworms. Nice. Um. Hello from the sunny South, 845 00:49:51,200 --> 00:49:55,880 Speaker 1: United States. Southerners aren't lazy and dumb, they just had hookworm. 846 00:49:56,040 --> 00:49:59,680 Speaker 1: Great title, by the way. Josh brought back a childhood memory 847 00:49:59,719 --> 00:50:00,960 Speaker 1: for us kids who 848 00:50:00,960 --> 00:50:02,840 Speaker 1: grew up in Florida, so we spent most of the 849 00:50:02,880 --> 00:50:05,719 Speaker 1: summer with our shoes off, uh, and I remember my 850 00:50:05,760 --> 00:50:09,080 Speaker 1: mother distinctly reminded me to wear shoes, uh, so I 851 00:50:09,080 --> 00:50:13,160 Speaker 1: wouldn't get the ground itch. This never happened.
I called 852 00:50:13,160 --> 00:50:14,600 Speaker 1: my mom, who is now eighty-eight years old, to 853 00:50:14,680 --> 00:50:16,880 Speaker 1: verify a few facts. And about when I was a 854 00:50:16,880 --> 00:50:19,239 Speaker 1: little girl, I believe around five to seven or eight 855 00:50:19,360 --> 00:50:21,719 Speaker 1: years old, before school started, my mother would give me a 856 00:50:21,800 --> 00:50:26,200 Speaker 1: worm treatment on my feet. I explained to her what 857 00:50:26,200 --> 00:50:28,640 Speaker 1: I had learned during the podcast about hookworms and how 858 00:50:28,640 --> 00:50:30,640 Speaker 1: they affected the body. When I mentioned how they cause 859 00:50:31,120 --> 00:50:33,640 Speaker 1: severe anemia and cause the body to be more susceptible 860 00:50:33,680 --> 00:50:36,400 Speaker 1: to illness, she remembered a story about my father's cousin. 861 00:50:36,960 --> 00:50:40,160 Speaker 1: Apparently the cousin became so incredibly ill 862 00:50:40,719 --> 00:50:42,360 Speaker 1: she was very close to dying. They took her to 863 00:50:42,360 --> 00:50:45,160 Speaker 1: the hospital and found out she was severely anemic, and 864 00:50:45,239 --> 00:50:47,879 Speaker 1: before they began any other diagnostics, they decided to test 865 00:50:47,880 --> 00:50:51,560 Speaker 1: her for hookworm, and bingo, as my mother said, she 866 00:50:51,760 --> 00:50:54,240 Speaker 1: was full of them. She had a high worm burden. 867 00:50:54,760 --> 00:50:58,120 Speaker 1: She did. Uh, Mom said it took three treatments to 868 00:50:58,120 --> 00:51:00,560 Speaker 1: get rid of the worms. The story was she was 869 00:51:00,600 --> 00:51:03,960 Speaker 1: so infested they literally came out of her mouth when 870 00:51:03,960 --> 00:51:08,239 Speaker 1: she was being treated. Oh my god. Wow, that is 871 00:51:08,280 --> 00:51:10,400 Speaker 1: the best story I've heard in a while. And she 872 00:51:10,480 --> 00:51:13,360 Speaker 1: put in parentheses, I know, right? Because I think she 873 00:51:13,400 --> 00:51:16,440 Speaker 1: anticipated that reaction. That's why you don't want to be 874 00:51:17,120 --> 00:51:23,440 Speaker 1: a six point oh, um, affectively empathetic person. Yeah, that's right. Uh, 875 00:51:23,520 --> 00:51:26,200 Speaker 1: this cousin is actually still alive and in her early nineties, 876 00:51:26,920 --> 00:51:29,759 Speaker 1: so, uh, this would have been back in the thirties. I 877 00:51:29,760 --> 00:51:33,360 Speaker 1: hope she doesn't listen to this show. Hookworm and Fancy 878 00:51:33,400 --> 00:51:37,880 Speaker 1: Free in Florida. That's from Terry Brunson of Panama City. Nice. 879 00:51:37,880 --> 00:51:40,040 Speaker 1: Thanks a lot, Terry. That was a great email. It 880 00:51:40,120 --> 00:51:42,960 Speaker 1: had everything. It was a roller coaster ride. There 881 00:51:42,960 --> 00:51:45,160 Speaker 1: was a cousin who had worms coming out of her mouth. 882 00:51:45,520 --> 00:51:51,640 Speaker 1: I laughed, I cried. There was a mom, an old cousin. 883 00:51:52,360 --> 00:51:54,640 Speaker 1: I'd like to know what the worm treatment consisted of. 884 00:51:54,680 --> 00:51:58,800 Speaker 1: I'll bet there was dead cat in there somewhere. Uh, 885 00:51:58,840 --> 00:52:02,440 Speaker 1: if you want to tell us about your family's weird remedies,
886 00:52:02,480 --> 00:52:04,840 Speaker 1: we want to know the ingredients, and you can tweet 887 00:52:04,880 --> 00:52:07,279 Speaker 1: them to us at S Y S K Podcast, or 888 00:52:07,680 --> 00:52:11,839 Speaker 1: hit me up at Josh Underscore um Underscore Clark. You 889 00:52:11,880 --> 00:52:14,160 Speaker 1: can hang out with us on Facebook at Facebook dot 890 00:52:14,200 --> 00:52:16,839 Speaker 1: com slash Stuff You Should Know or Facebook dot com 891 00:52:16,920 --> 00:52:19,920 Speaker 1: slash Charles W. Chuck Bryant. Uh, send us an email 892 00:52:19,960 --> 00:52:22,279 Speaker 1: to Stuff Podcast at how stuff works dot com, and 893 00:52:22,360 --> 00:52:24,120 Speaker 1: join us as always at our home on the web, 894 00:52:24,239 --> 00:52:30,640 Speaker 1: Stuff You Should Know dot com. For more on this 895 00:52:30,840 --> 00:52:33,360 Speaker 1: and thousands of other topics, visit how stuff works 896 00:52:33,360 --> 00:52:44,279 Speaker 1: dot com