Good afternoon, good morning, good evening, good Saturday to you. Chuck Bryant here, podcaster, co-host of Stuff You Should Know, to introduce the Saturday Selects episode. My pick this week, everybody, is empathy. We all need a little empathy in our lives, and I remember this being a pretty good episode. This is from April 6. Listen close and take heed: How Empathy Works.

Welcome to Stuff You Should Know, a production of iHeartRadio.

Hey, and welcome to the podcast. I'm Josh Clark. There's Charles W. "Chuck, Lean On My Shoulder" Bryant, and Jerry. How about a hug? No, actually, I'm sorry, Jerry's here in spirit. Our guest producer today is Noel "My Beard Is All Brown"... Yes, everybody knows it's Noel Brown. Are you using your empathy voice? Yeah. Is it working? Einfühlung? No? Really, it's the sad eyes. Let's say I cut you for ten dollars.

Oh, how are you, sir? I'm feeling, um, empathetic. Good. I'm doing good. I have some very strong opinions on empathy, and not just empathy, but empathy research in particular, as I'm sure you're not at all surprised to hear. I'm not at all surprised to hear. Did you come to the same or similar conclusions as I did? I don't know yet, because we don't talk about this stuff beforehand. That's true, that's how we... that's the magic, going in blind.

Did you know that there's, like, an Atlanta magic thing now? What do you mean, like a society or something? Something along those lines. I just saw a sign for it in Old Fourth Ward, but there seems to be a legitimate magician. What's that castle in L.A.? Oh, the Magic Castle. It's not that, but the people who do the Atlanta thing are, I'm sure, aware of the Magic Castle. Probably. And then you did a double take at the sign and it disappeared in a poof of smoke. That would be great. I went to the Magic Castle once. Lucky. Yeah, it's awesome.
I think we had this conversation, because I asked you if you've seen that documentary about the kids' competition at the Magic Castle. Yeah. I have not, but, um, it's a really good one. Yeah, I highly recommend it. And if you can get in... you gotta know somebody. You gotta know Ben Stiller. Oh really? Now, there was a movie that he was in that took place in the Magic Castle, and he was, like, the bad guy, I think. I don't remember what it was. Maybe it was that documentary.

Well, let's talk empathy, Chuck. Alrighty. Wait, hold on, I have an intro. I have an intro. Okay. Are you familiar with Frank Rich, the left-leaning, well, lefty-as-heck essayist? I don't think so. He's, uh, he's good. He's about as good an essayist as you'll find on the left. Um, he's a consultant on Veep. He's hilarious and he knows his stuff, right? He usually writes for Harper's, but he's also got a regular gig in New York Magazine, and in New York Magazine recently he published a column, I think this week, um, well, this week as of when we're recording this, and I think it was called, like, "No Sympathy for the Hillbilly" or something like that. And it was basically, and this is really astounding coming from him, but it was basically him saying, you know what, um, I know that on the left people tend to be bleeding-heart liberals and want to empathize with everybody and feel everyone else's pain and understand where people are coming from. But I believe that if you're angry that Trump is president, you should be angry at the people who voted him into office as well. And he's basically beating a drum, which I also started to see in other places as well, where it's like, no, you don't have to understand people who voted for Trump, you don't have to love your enemy.
Let's just go to war with these people. And it's legitimate, he is totally serious too. And it amounts to basically a call to go to the dark side, to resist everything that, you know, the left has traditionally prided itself on, and just go full-bore culture war against the right. And, um, it just seems like a really bad idea to me. But one of the things that stuck out to me about it the most was that it was so contrary to the, um, ethos, the prevailing thought of the time, or at least what made up the Obama administration, which was: we need to be more empathetic, we need to understand people's plight more. And even after Hillary lost, one of the big post mortems was that Hillary didn't connect with blue-collar workers who were out of work; she was totally out of touch with that, she couldn't empathize with them. Well, I think the further post mortem has been, like, Hillary could empathize with those people all day, but they hated her and they were never going to vote for her. And now Frank Rich is saying, so hate them back, is the thing. Again, I disagree with that, but it really points out what a fragile turning point we're at right now, this point in history in America. Are we gonna stay and just keep trying to be empathetic, or are we gonna go full-bore to the dark side, and everybody's gonna hate everybody who's not like them?

Quite an intro. Thank you. For a coastal elite. Oh, I'm not a coastal elite. I'm just kidding, I just like that phrase. I hope I'm not, man. I really don't think I am, and I hope people don't think I am. I do stick my pinkie in the air when I take sips of water, and that water has been strained through a, um, Franciscan monk's mouth first. The only water I'll drink. I don't think you can be a coastal elite if you have your roots in Toledo. Right, exactly.
And I don't forget where I'm from, man, and my family, you know, has long roots in Tennessee and Mississippi. You know this by reading my Wikipedia page, right? Does it say that you're part Choctaw on there yet? I'm sure it will soon.

All right, so we're talking empathy here. Um, a lot of this sounded familiar, so much so that I, like, quadruple-checked that we had not done this. Um, and I think we've just talked about it a lot, namely, uh, in our Mirror Neurons episode. Yeah. Um, and I thought about that one a lot when I was researching this. Well, I think it's definitely a component of empathy, but it's not to be confused with empathy. It's, like, part of it, I think, is the impression I have. Agreed.

So, um, empathy. If you look at ours, that's a great article, uh, they do define it. Um, you know, everyone kind of knows what it is, but just to be clear, it's not sympathy. If you can feel and share someone else's emotions, that's empathy, which is different from sympathy, in that, uh, you're not feeling it, but you do care about it. Right, right. It's like, um, you can understand why someone would be feeling like they're feeling. It's intellectual. Yeah, like sympathy is from the brain and empathy is from, say, the heart. Yeah.

And a lot of these words, when we get into the definitions of empathy versus compassion, it gets a little, uh, I don't know, sometimes I feel like people are kind of splitting hairs with that. That, to me, Chuck, is a huge red flag that the field is not nearly as established as people like to think. Like, if there's still confusion on basic terms like empathy and sympathy, and they're used interchangeably, it just means that no one is doing the right kind of hardcore research, or publishing the right kind of hardcore papers that say this is what it is, and this is what it isn't.
I almost just said, "this is what it not is." This is what it ain't. No coastal elite.

But there was an original German word, um, Einfühlung, which means "feeling into," and that's where empathy comes from. And if you talk to an expert or a researcher, um, these days, they're gonna talk about a couple of types of empathy: um, affective, or maybe emotional, empathy, and cognitive empathy. And, um, the distinction, as it turns out, is pretty important. And to me, well, to me, this is where a little bit of the splitting hairs comes in, because as far as talking about, um, affective empathy versus compassion, like, is it the same thing? Or, I'm sorry, cognitive empathy would be more like compassion, because you're not really taking on someone else's pain. So compassion, I think, is even like a third word. So this is what I came up with. You've got cognitive empathy, which is sympathy, right? You can understand why someone would be feeling a certain way. Then you've got affective empathy, which is like you're really putting yourself in that person's shoes and you're feeling how they're feeling. Right. But then compassion, it seems to me, is the end goal of this. That's where you actually move to act, that's where you do something about it. It's where you put your hand on someone's shoulder and say it's gonna be all right, or, you know, here's a check for five dollars, um, get some groceries with it. Who knows what you're gonna do. But I think, to me, compassion is the act, like the action, the end goal of empathy, whether it's cognitive or, um, affective. That's what I think. And you know what, this field is so unestablished that I can just say that stuff. Yeah, and it's probably right. Let's just say that that's true. No one can really come along and say definitively that you're not right. Right.

Uh, so, you know, to give you an example of what that might mean by affective or emotional empathy:
if you have a friend or family member going through a very hard time, uh, and they're distraught, and then you are also distraught just like they are, then that is definitely affective empathy. Whereas you're not just like, oh man, you know, your uncle passed away, I'm really sorry to hear that, and I feel terribly for you. But if you were, you know, actively taking that on to the point where you're crying too, and you didn't know the uncle. Because that would be the differentiation, right? It's like, you don't have a personal stake in it, but you're still taking it on as if it is your own. Yes.

And then, depending on your view of things, and we'll talk a lot about this, there's this really great psychologist named Paul Bloom who has basically dedicated a lot of his life to shooting down ideas of how great empathy is. Yeah, I thought he made a lot of good points, and some I didn't quite agree with either. But he's great. He's really good at poking holes in the concept of empathy. But he points out that, um, I guess it's probably good if somebody's... someone's in a great mood and you're empathetic and sharing in that great mood and amplifying it. But on the flip side of the coin, if somebody is in a horrifically, tragically sad mood and you're sitting there amplifying that by joining in part and parcel with it, then you're doing a disservice. Right.

So, in some ways... um, well, I'll just say Paul Bloom's whole basic... his whole thesis, and I subscribe to it as well, is that cognitive is far and away the superior of the two types of empathy as far as the ultimate goal, which again, to me, is compassion. Yes. You want to just pepper in some of his stuff as we go? Does that make sense? Because here's a great spot, too. Uh, and this is one of the studies, I imagine...
I don't know if you had a problem with it, but I had a problem with a lot of these studies. Um. But there was a study, um, at least one, where psychologists asked, um, how much money will you donate to develop a drug that would save one child's life? Um, and then another group was asked, how much would you donate to develop a drug that would save eight kids? And it was about the same answer. Um, where things changed was when they asked a third group about the one child, but they showed a picture of the kid, and, like, you know, said, this is little Joey, he's fourteen years old, and this is his sad little face. And then donations really shot up.

And this is where, um, what was his name? Paul Bloom. Paul Bloom, the psychologist. Yeah, this is where Paul Bloom says that, um, this emotional empathy is for the birds, because, A, it's, um, narrow, and, B, it's very... like, people tend to want to help people that are like them. So it's, yeah... I mean, is biased the right word? Super biased. Yeah. And it makes no sense. Not only does it not scale upward as the number of people affected by, say, a tragedy increases, it actually goes the other way, where the more people that are affected by something, the less empathetic a person tends to be. Whereas if, say, it's one person, and you know that person's name, and you see that person's picture on the news, and, yeah, they look like you or your neighbor or your daughter, you're gonna empathize a lot. But at the same time, the same thing could be happening to other people, and if you'd just vote a certain way, you could alleviate their suffering, and you wouldn't lift a finger to do it, especially if it meant slightly higher taxes for you. So in that sense, empathy makes no sense whatsoever.

Yeah. I mean, he even quoted Mother Teresa in his, uh, in the essay, which is, um, quote, "If I look at the mass, I will never act.
If I look at the one, I will." So he's going with the heavy hitters there, you know, when you bring Mother Teresa in there to kind of make a point. Yeah, but you know, he makes a good point. Um. Oh yeah. Like, and that study does. I didn't have a big problem with that study, because it does kind of prove that out, right? That was Tehila Kogut and, uh, Ilana Ritov; they're psychologists. And then Ritov and another, um, co-author conducted another study that kind of pointed out one of the problems with empathy, which was, they said, okay, um, two different groups of people heard that, um, a vaccine maker cost a child her life, killed the child because of the vaccine. Now, um, should the vaccine maker be fined? And then one group was told that the fine would probably make the vaccine maker, um, follow guidelines even more strictly and would probably prevent further accidents, and then the, um, other group was told that this fine would probably make the vaccine maker get out of the business, and more people would die because they couldn't get the vaccine. And both groups said that, yes, the vaccine maker should be punished with, um, the highest fine possible, with extreme prejudice. Right.

So the upshot of all of this is that, especially with, um, affective empathy as we understand it, it doesn't follow any kind of rational guidelines, the basis of rationality being that two is more important than one, and empathy just doesn't go in that direction. Yeah.

But, um, interestingly, um, while you can train yourself to be more empathetic, it definitely, to me, feels like something that you're sort of born with to a certain degree, or maybe in the formative years you might gain, um... But, uh, in Bloom's article, he talks about babies, and as soon as a baby can get up and start getting around, they're gonna try and comfort.
Like, if you go into a preschool and there's a baby crying, you will probably see another little baby walking over there and patting the little baby and stroking the baby. There's nothing more adorable. Than pretty adorable? Um. And, you know, it happens in the animal kingdom, um, although they did note, um, this... Frans de Waal, the primatologist, notes that it kind of follows humans in a way, in that, um, a chimpanzee might really, um, like, hug a victim of an attack, but it's got to be another chimp. Like, they will smash the brains out of another kind of monkey, maybe, if it wanders into their little village.

That, to me, kind of underscores this whole thing. Like, when we look at empathy, the first question that people have is, like, why don't we have more empathy, or why don't we have empathy for everybody? We're all humans. And it seems like, based on Frans de Waal's studies and, um, other studies about the evolution of in-group and out-group behavior, like, we evolved over hundreds of thousands, if not millions of years, I guess more than that if you're also looking at the great apes, right, to see other groups that aren't like us as threatening. Right, it makes sense, evolutionarily speaking. Right. And it's only in, like, the last, uh, ten, eleven thousand years that we settled down and started forming cities. But even then there was in-group and out-group; people you didn't recognize were coming to kill you for your crops, so you needed to fight those people. You didn't need to empathize with them: oh, you're hungry, so you're gonna take my life, I understand. Right. That didn't jibe with natural selection. But then you add jets into the mix, and then TV, and then the Internet, and all of a sudden we're exposed to more in-groups and out-groups and are expected to get along more civilly than ever before. But our evolution hasn't caught
up quite enough, right? So now we're faced at this point where it's like, okay, we just need to figure out how to empathize more, and this last vestige that's holding back a completely civil global society will fade away. And Frans de Waal put it pretty well. He said, this is the challenge of our time: globalization by a tribal species. And that's what we're facing right now. And right now it feels like, at least in the United States, we're backsliding. Yeah. Well, that's a good place to take a break, I think. Yeah. All right, well, we're gonna come back in just a minute and talk a little bit about something called the racial empathy gap, right after this.

All right. So I promised some talk about race, and there's something called the racial empathy gap. Um, studies have, kind of... I mean, if you walk around as a living, breathing human being, you can probably tell that that's a thing. But they have done studies on it, and, um, a lot of these studies are a little hinky to me. But, uh, in one, they showed video clips of a needle going into someone's skin, uh, notably a white person's skin at first. And what they found was, um, white people reacted more, or with more empathy, when the needle went into white skin than when it went into dark skin. Right, or they showed more signs of distress, like they started to sweat a little more. Sure. Their hearts started to beat a little faster. Yeah. That's where I think mirror neurons might come into play. Um, right. Yeah, that's what they're... like, it's brain wiring. That's a huge problem with reading about empathy in the popular media: there are huge jumps from mirror neurons to full-on affective empathy with just the switch of a sentence or the stroke of a headline, and so people are not talking about the same thing.
And I'm sure there are plenty of empathy researchers out there that are just like, guys, guys, you're making huge jumps to the conclusion. And everybody's like, shut up, doesn't matter, we're selling clicks, you know.

But so, yes, it is surely setting off mirror neurons. I don't understand how it's being translated into empathy, aside from... I think a lot of the empathy studies involve self-reporting. So I think what they're doing is they're saying, oh, well, uh, subject nine, um, their heart really started beating, and look at this: on this questionnaire they filled out, they really consider themselves an empathetic person. Ipso facto, an empathetic person is responding very empathetically right now to seeing this needle. Yeah. Like, what if they painted someone's skin green? Well, they have, they've done violet-tinted, and actually, to tell you the truth, as far as correlating with self-reports, um, that does tend to be a pretty good control, because apparently all people respond to that one. Huh, isn't that interesting? Yeah, it is, actually.

Um, there is something going on there, though. I mean, we're not, like, discounting that, because they have done studies that show that minorities, um, maybe don't get pain medication like they should compared to white people. Uh, and I don't know, it seems like a racial empathy gap is a pretty decent explanation for that, for sure. Or in the criminal justice system, which we've talked a lot about. Or maybe just in empathy altogether between races. Yeah.

So if you're a judge, though, and you're not following sentencing guidelines, you're just using your own personal biases to hand out sentences, and you have people's lives and futures in your hands... Yeah, you're not following the law, you're following your own bias. You're a piece of garbage. Well, and that has nothing to do with you being an empathetic person or not. What about that
judge who, remember, the guy, the swimmer who raped the girl by the dumpster? It was obvious that judge was kind of like, oh, look at this kid, like, oh, I don't want to ruin his future, I don't want to ruin his future, like, that could have been my son. You know, he's kind of like me. It was clearly bias and empathy going on, because he was like him. Right. And there's no way, if that would have been some black kid, that he wouldn't have ruled differently. No one can convince me that that's not the truth. Right. And I think that there's another distinction that's eventually going to be hammered out, too. Like, I don't think he was empathizing with that swimmer kid. If he was... I could be wrong, who knows, but I think he was, um, at the very least exhibiting a bias, that, yes, he let the kid off the hook, um, because he looked like him. I think he might have been sympathizing with him, though. Sure. Yeah, because he even flat out said, like, this could ruin his life. Yeah, he was definitely sympathizing, at least, for sure. Boy.

Uh, so, um, going back a bit to, uh, philosopher Adam Smith, way back in the day. I think he was clearly talking about mirror neurons, even though he didn't know that was a thing at the time, when he wrote that, um, persons of delicate fibres who notice a beggar's sores and ulcers "are apt to feel an itching or uneasy sensation in the correspondent part of their own bodies." I mean, that's absolutely mirror neurons firing off. And we've been saying that a lot. If you don't know what we're talking about, listen to, uh, "Feel Someone Else's Pain"? Yeah, "Can You Feel Someone Else's Pain?" It was from a few years ago, but it was one of my favorites we've ever done, just because it's so fascinating. It really is. The brain is wired like that, and it's the reason why. And this is the, you know, the easiest way to explain it.
Like, if you see, like, in a football game, someone's leg gets broken, and you literally feel, like, pain shoot through your body, those are mirror neurons. Did you see? There was a Simpsons recently where Kirk Van Houten is back in college, and he goes to, like, high five... He's, like, a lacrosse player, and he goes to high five the college mascot, which is, like, a guy in a suit of armor, and he breaks his wrist in, like, fifty places. And they cut to the sideline, and Joe Theismann takes his hat off and throws up into it, man. I remember that Theismann thing. I think we talked about that in that episode. Yeah. I still don't think I have ever seen it. You don't need to. I think I do, though. Like, how can I be walking and talking through life and not have seen Joe Theismann break his leg? Well, it's one of those things: when you see a body get bent in a very unnatural, like, direction, it's just... yeah, your brain is hardwired to not accept that. I know, it makes you faint, because your brain is like, I can't see anymore.

Speaking of the brain, Chuck. Um, let's talk a little bit about the brain. Right. So, um, one of the... we've already kind of touched on one of the issues that I think we both have with, um, empathy research, which is that the designs of the studies are just so shoddy it's mind-boggling. But then the other part of it, it's like, well, just leave it to neuroscience. But neuroscience is still using the same old MRIs that it was before, and again, all it's showing is where more oxygen is in the part of the brain, right, and so we're gonna correlate that to that part of the brain being lit up. So that means that this part
of the brain has to do with, um, looking at pictures of boobs: this is the boob region. Right. And this is, like, the level that neurology is at, as far as behavioral studies go. Right. You put these two together, and this is the state of the art with empathy research. But with the brain, as far as that goes, they have kind of isolated a few different parts. And again, this is kind of like, we think that this has to do with this process just because, in trial after trial, the same circuit has been followed, or the same region has lit up, when we've applied this stimulus to different people. Um, so there's good evidence that this does have to do with, say, empathizing or whatever. But it's still just a very rudimentary understanding at this point, I think, compared to, say, like, fifty years from now. Right.

So what they think they figured out is that there's a part of the brain, and I love parts of the brain... The affective empathy part of the brain is called the insular cortex. That's where they think that the affective region, or part of the affective region, lies: the anterior insular cortex. And then the cognitive empathy, uh, is thought to reside, or originate, in the midcingulate cortex. And actually, those came from a Monash University research, um, paper that looked at the concentration of gray matter, the density of gray matter, and that's, like, the neurons, whereas white matter is, like, the connecting material. Right. Um, and so they're saying people who have, um, really affective empathy have denser insular cortexes... cortices. And then people who have really serious cognitive empathy have dense midcingulate cortices. That's where it's at right now.

Yeah. They did a pretty interesting test, um, this, uh, Tania Singer and this dude named Matthieu Ricard. He's a Buddhist monk. And
I get the idea that they picked this guy because he can very much control his brain and emotions. Right. So what they did was, he's a Buddhist monk, they did some fMRI brain scanning on this guy, and they said, all right, sir, Mr. Ricard... Um, he's like, please call me Matthieu. Matthieu, we would like you to engage in some different types of compassion and meditate, and direct that meditation towards people who are suffering. And then they hooked him up to the brain scan magic machine, and they found that the meditative states, um... it was actually surprising to them, it did not activate parts of the brain that are usually activated by non-meditators when they think about pain. But he said, you know, it was good for me, basically, it was a warm, positive state. And they said, all right, now put yourself in this, what, you know, they would call the emotional empathetic state, um... And I guess he's able to turn that on like a switch. Right, he's like, watch this. Yeah, exactly, and blood just comes out of his nose. Yeah. And different parts of the brain lit up. And he said, "This empathetic sharing very quickly became intolerable to me. I felt emotionally exhausted, very similar to being burnt out."

So that's one of the big arguments against this emotional, or affective, empathy: you can't take on everyone else's pain like this. Let's say you're a social worker, or you're a nurse or a doctor. Like, it's gonna drive you insane. Oh yeah, well, you'll burn out. It's called empathy distress. Yeah. And when they've talked to patients, like hospital patients, they don't want that either. They want maybe someone who has some sympathy. But patients are more likely to feel better... But I was just imagining a doctor coming in and just falling to pieces over your condition. Doctors aren't like... coming in... yeah, well, you don't...
Yeah, like you said, you don't want a doctor like that. No, they feel better if their doctor is kind of clinical and reassuring and really seems like they have it together. Which makes... yeah. And you don't want somebody who's like, frankly, I could care less whether you live or die. I want somewhere in between those two. Which is where... oh my god, you're gonna die! Like, you don't want that out of your doctor. No. But it seems like the middle of those two ends of the spectrum is where cognitive empathy comes in. Well, Chuck, how about we take a break here? Second break? That sounds good. And we'll come back, we promise.

All right, man, what do you want to talk about, Sacha Baron Cohen? I still have never actually looked up whether that's his brother or cousin or what. Simon. Yeah, psychologist Simon Baron-Cohen wrote a book in 2011 called The Science of Evil, and he's way down with empathy. Yeah, and I guess they describe him... a thoughtful defender of empathy is what Bloom describes him as. Um, and he has a ranking system, an empathy curve from zero to six, and zero is no empathy, basically, or sociopath, and six is, I guess, the most hardcore of, uh, emotional empaths. Yeah. You're in, he calls it, a constant state of hyperarousal. And he had this one woman that he used in his little example, named Hannah, who was a therapist. It's probably a great job for her, but she's just one of these people that, uh, by all accounts, is just wired that way. Like, her friends and her family and her patients, like, she just really feels for them all. Like, it's not just her job. Which, in some ways, probably helps some people, but in other ways, it's probably, number one, really off-putting, and even if everybody liked it, it's bad for her in the end. Like, we're not designed to carry everybody's problems and issues with us all the time.
Yeah. And that's kind of the main point Bloom is making: that people like Hannah are headed for burnout. And he also does make the point that friends and family, like, they need a certain amount of that empathy, but you don't want someone that's always, like, in that state. Like, you also want someone that's like, all right, let's turn that frown upside down and let's go out and take a walk, you know. Like, you don't want someone that always cries when you cry, you know. Right, you're just gonna be like, man, I thought I had it bad.

And you can extend that also to, um, the way that people react, in some ways, to, say, like, a mass tragedy or something like that. Like, look at Newtown, right? The Sandy Hook shooting: twenty small kids were killed, six adults were also killed at the elementary school. It was the most horrific tragedy, I think, that ever took place in the United States. It was basically the one that everyone who believes in very strict gun control was waiting for, knew was gonna happen sooner or later, and thought, this is gonna be the tipping point. And it didn't happen, right? What people reacted with was outpourings of donations, lots of stuffed animals. Apparently there were three for every resident of the town sent. Yeah. And, um, lots of thoughts and prayers. And if you ever have seen, um, you know, Anthony Jeselnik... He, yeah, he has a Netflix special, I think it's still on, called Thoughts and Prayers. Watch that, and he explains to you just how valuable your thoughts and prayers are, especially on Twitter. Um, but what Paul Bloom points out is, like, this outpouring actually proved to be an additional burden on this town, which was already suffering tremendously. But, like, they had to, um...
There was 584 00:34:18,000 --> 00:34:21,880 Speaker 1: something like eight volunteers who were tasked with handling all 585 00:34:21,920 --> 00:34:25,480 Speaker 1: the donations, um, whether it was stuffed animals or money, 586 00:34:25,560 --> 00:34:28,160 Speaker 1: and they apparently had to get a warehouse to put 587 00:34:28,160 --> 00:34:31,359 Speaker 1: all the stuffed animals in. And I think even some 588 00:34:31,400 --> 00:34:34,120 Speaker 1: of the public officials were like, please stop sending us stuff; 589 00:34:34,480 --> 00:34:36,879 Speaker 1: send stuff, but send it to other people. We've got 590 00:34:36,920 --> 00:34:39,600 Speaker 1: everything we need, send it to other people. And everyone said, no, 591 00:34:39,760 --> 00:34:42,560 Speaker 1: shut up, this is about us, not you. And I 592 00:34:42,640 --> 00:34:47,400 Speaker 1: think that that's part of, um, affective empathy, that outpouring 593 00:34:47,880 --> 00:34:50,319 Speaker 1: of stuff that seems like a nice gesture that makes 594 00:34:50,360 --> 00:34:53,719 Speaker 1: you feel better but doesn't actually help in any real, 595 00:34:53,760 --> 00:34:57,960 Speaker 1: substantial way. I think that kind of underlies or betrays 596 00:34:58,120 --> 00:35:02,000 Speaker 1: what, um, affective empathy is all about, and 597 00:35:02,080 --> 00:35:04,960 Speaker 1: why we are moved to do something with affective empathy: 598 00:35:04,960 --> 00:35:08,880 Speaker 1: because we're feeling something right then, and writing a check or 599 00:35:08,920 --> 00:35:11,600 Speaker 1: sending a teddy bear is a good way 600 00:35:11,880 --> 00:35:16,480 Speaker 1: for us to feel better. Whereas cognitive empathy 601 00:35:16,520 --> 00:35:21,319 Speaker 1: would be like, um, I'm going to see to it 602 00:35:21,680 --> 00:35:26,200 Speaker 1: that every senator who blocked the gun control bill following 603 00:35:26,239 --> 00:35:30,000 Speaker 1: Newtown is voted right out of office. That would 604 00:35:30,040 --> 00:35:34,000 Speaker 1: be cognitive empathy. You're empathizing with the parents, you're empathizing 605 00:35:34,000 --> 00:35:36,880 Speaker 1: with future kids who haven't been killed yet, and you're 606 00:35:36,920 --> 00:35:39,400 Speaker 1: gonna do what you can to make sure it doesn't happen, 607 00:35:39,680 --> 00:35:42,680 Speaker 1: rather than writing a check, um, or sending a teddy bear. 608 00:35:42,960 --> 00:35:47,040 Speaker 1: Those to me are the real distinctions between cognitive and 609 00:35:47,560 --> 00:35:51,160 Speaker 1: affective empathy as far as that ultimate goal is concerned, 610 00:35:51,160 --> 00:35:55,440 Speaker 1: which is, again, compassion. But compassion is doing what you 611 00:35:55,520 --> 00:36:00,000 Speaker 1: can to improve the outcome for the greater good. Yeah, 612 00:36:00,040 --> 00:36:03,960 Speaker 1: that's interesting. And another thing that, um, kind of 613 00:36:04,040 --> 00:36:08,200 Speaker 1: jumped out at me was these psychologists Vicki Helgeson and 614 00:36:08,280 --> 00:36:12,800 Speaker 1: Heidi Fritz. They were researching why women are more likely, 615 00:36:13,200 --> 00:36:15,560 Speaker 1: I think twice as likely as men, to get depressed 616 00:36:15,600 --> 00:36:19,560 Speaker 1: and experience depression.
And they thought, you know, they said, 617 00:36:19,600 --> 00:36:21,200 Speaker 1: you know what, I think it's because women are more 618 00:36:21,200 --> 00:36:24,319 Speaker 1: empathetic, and, you know, emotionally empathetic. And they take 619 00:36:24,360 --> 00:36:27,200 Speaker 1: this on, and, uh, they said that there's a propensity 620 00:36:27,239 --> 00:36:30,720 Speaker 1: for what they called unmitigated communion, which is, quote, 621 00:36:30,719 --> 00:36:33,480 Speaker 1: an excessive concern with others and placing others' needs before 622 00:36:33,560 --> 00:36:37,240 Speaker 1: one's own, end quote. And they, you know, gave people, 623 00:36:37,320 --> 00:36:40,160 Speaker 1: and this is one of those, like, nine-item questionnaires, 624 00:36:40,200 --> 00:36:43,279 Speaker 1: how much can you really learn? Um. But, uh, some 625 00:36:43,360 --> 00:36:45,400 Speaker 1: of the statements to agree or disagree with were like, for 626 00:36:45,440 --> 00:36:47,120 Speaker 1: me to be happy, I need others to be happy; 627 00:36:47,480 --> 00:36:50,040 Speaker 1: I can't say no when someone asks for help; I often 628 00:36:50,040 --> 00:36:53,160 Speaker 1: worry about others' problems. And kind of across the board, 629 00:36:53,160 --> 00:36:58,160 Speaker 1: women score higher than men do on this. And, um, 630 00:36:58,239 --> 00:36:59,600 Speaker 1: you know, I think a lot of that probably has 631 00:36:59,640 --> 00:37:03,360 Speaker 1: to do with evolution too, with, you know, women 632 00:37:03,800 --> 00:37:05,719 Speaker 1: having to care for their babies right out of the gate. 633 00:37:06,360 --> 00:37:08,759 Speaker 1: Which, Took Took's wife, you know. Although Took 634 00:37:08,760 --> 00:37:12,360 Speaker 1: Took, we know, never took a wife. Um, Took 635 00:37:12,520 --> 00:37:15,800 Speaker 1: Took got around. He got around. But the women that Took 636 00:37:15,800 --> 00:37:20,040 Speaker 1: Took would knock up, they would immediately be in 637 00:37:20,160 --> 00:37:23,760 Speaker 1: charge of those babies. And that's what, um, that primatologist 638 00:37:23,800 --> 00:37:25,759 Speaker 1: talked about too. You know, this is kind of 639 00:37:25,920 --> 00:37:29,000 Speaker 1: straight-up evolution, or natural selection: right out of 640 00:37:29,040 --> 00:37:31,400 Speaker 1: the gate, we have this empathy because we have to 641 00:37:31,440 --> 00:37:35,520 Speaker 1: care for young. And then, um, I think we already 642 00:37:35,520 --> 00:37:40,440 Speaker 1: mentioned too, that definitely evolves into protect the tribe, 643 00:37:41,600 --> 00:37:44,200 Speaker 1: right, because we're better off if the people around us 644 00:37:44,200 --> 00:37:48,960 Speaker 1: are healthy and happy and ready to ward off attacks. Um. 645 00:37:49,480 --> 00:37:53,640 Speaker 1: But the idea that women are more prone to 646 00:37:54,040 --> 00:37:58,160 Speaker 1: experience affective empathy, or just even empathy in general, 647 00:37:58,520 --> 00:38:01,399 Speaker 1: actually has a biological basis, to tell 648 00:38:01,400 --> 00:38:05,960 Speaker 1: you the truth, Chuck, um, in adolescence or puberty. 649 00:38:06,400 --> 00:38:11,320 Speaker 1: Apparently girls score high for affective empathy throughout 650 00:38:11,320 --> 00:38:15,920 Speaker 1: their entire adolescence, whereas between about ages thirteen and sixteen, 651 00:38:16,239 --> 00:38:19,959 Speaker 1: boys' affective empathy declines.
They take a little vacation. Yeah, 652 00:38:19,960 --> 00:38:23,919 Speaker 1: and they say, oh, oh, you feel bad? You're about 653 00:38:23,960 --> 00:38:26,560 Speaker 1: to feel worse, because I'm gonna give you a swirly. Yeah, 654 00:38:26,600 --> 00:38:29,400 Speaker 1: I don't know what a swirly is, but it's... it's 655 00:38:29,440 --> 00:38:31,160 Speaker 1: where you stick someone's head in the toilet and 656 00:38:31,160 --> 00:38:36,600 Speaker 1: flush. A swirly. Never heard of that. Fortunately, I had only 657 00:38:36,640 --> 00:38:39,359 Speaker 1: heard of it, never witnessed it or had it done 658 00:38:39,360 --> 00:38:43,160 Speaker 1: to me. We did noogies, and, uh, was it wedgies, 659 00:38:43,200 --> 00:38:48,719 Speaker 1: when you did the underwear? Yeah. Yeah, they're terrible. They 660 00:38:48,719 --> 00:38:52,880 Speaker 1: are terrible, and that's bullying behavior. And there are some 661 00:38:52,920 --> 00:38:56,200 Speaker 1: theories about bullies too, that they actually use empathy to 662 00:38:56,600 --> 00:39:02,880 Speaker 1: manipulate people, like they'll use it against them. Well, yeah, 663 00:39:03,160 --> 00:39:08,680 Speaker 1: they use cognitive empathy to calculate the best, 664 00:39:08,800 --> 00:39:12,600 Speaker 1: most effective way to hurt somebody, and then, um, they 665 00:39:12,680 --> 00:39:17,439 Speaker 1: turn off any potential, like, affective empathy, um, when they're 666 00:39:17,480 --> 00:39:21,279 Speaker 1: actually carrying out their act of bullying. Yeah. And with the 667 00:39:21,320 --> 00:39:25,400 Speaker 1: teenagers too, they say that if you develop affective 668 00:39:25,400 --> 00:39:29,800 Speaker 1: and cognitive empathy, um, you're going to be happier, 669 00:39:29,840 --> 00:39:32,640 Speaker 1: you're gonna argue less with your parents, you're gonna have 670 00:39:32,760 --> 00:39:35,880 Speaker 1: more healthy relationships, which, you know, kind of all 671 00:39:35,920 --> 00:39:38,799 Speaker 1: makes sense. Sure. And they also were saying too, and 672 00:39:38,880 --> 00:39:42,040 Speaker 1: we will get into how to increase your own 673 00:39:42,080 --> 00:39:43,600 Speaker 1: empathy if you think that kind of thing is a 674 00:39:43,600 --> 00:39:48,719 Speaker 1: good idea, um, but that babies learn empathy right out of 675 00:39:48,760 --> 00:39:51,399 Speaker 1: the gate by being empathized with, by 676 00:39:51,560 --> 00:39:55,839 Speaker 1: being treated warmly by their parents and other adults, being 677 00:39:55,880 --> 00:39:59,000 Speaker 1: responded to in a warm manner; that actually is 678 00:39:59,160 --> 00:40:01,799 Speaker 1: the beginning of empathy. And like you said, you 679 00:40:01,840 --> 00:40:03,640 Speaker 1: can see a little kid in a preschool go over 680 00:40:03,719 --> 00:40:08,480 Speaker 1: and comfort or console another little kid, um, who's in distress. Boy, 681 00:40:08,520 --> 00:40:11,640 Speaker 1: that's why, when I hear about neglect, like baby 682 00:40:11,640 --> 00:40:15,439 Speaker 1: and infant neglect, it's just, man, that's like the most 683 00:40:15,440 --> 00:40:18,600 Speaker 1: heartbreaking thing you can imagine. It's like a baby just 684 00:40:18,680 --> 00:40:20,640 Speaker 1: left in a room to cry and cry and 685 00:40:20,680 --> 00:40:24,360 Speaker 1: cry forever.
Plus also, when we were talking about the 686 00:40:24,400 --> 00:40:27,719 Speaker 1: breastfeeding episode, that body-to-body contact of being held 687 00:40:28,600 --> 00:40:32,200 Speaker 1: has been shown to affect their development if 688 00:40:32,200 --> 00:40:34,680 Speaker 1: they don't have it enough. There are just all sorts of 689 00:40:34,760 --> 00:40:36,880 Speaker 1: terrible things that happen to you when you're neglected as 690 00:40:36,920 --> 00:40:40,360 Speaker 1: a baby. Yeah, it's terrible. So, Chuck, there are plenty 691 00:40:40,360 --> 00:40:42,719 Speaker 1: of people who say, well, we need to empathize more, 692 00:40:43,000 --> 00:40:44,800 Speaker 1: so just get out there and learn how to empathize. 693 00:40:44,840 --> 00:40:46,520 Speaker 1: And there are plenty of people out there who will teach 694 00:40:46,520 --> 00:40:50,200 Speaker 1: you techniques on empathizing with people more, and they may 695 00:40:50,200 --> 00:40:52,840 Speaker 1: be worth trying. Like, I found them very helpful in 696 00:40:52,880 --> 00:40:57,799 Speaker 1: a lot of cases, especially in interpersonal communication. Right. But 697 00:40:58,160 --> 00:41:00,800 Speaker 1: as far as, like, changing the world on a massive 698 00:41:00,840 --> 00:41:03,520 Speaker 1: scale for the better, is it a good idea 699 00:41:03,560 --> 00:41:06,439 Speaker 1: to go out and just empathize, empathize, empathize? Because there's 700 00:41:06,480 --> 00:41:09,719 Speaker 1: a big question mark with that: who exactly are you 701 00:41:09,719 --> 00:41:13,000 Speaker 1: supposed to empathize with? Like, with just about every problem, 702 00:41:13,040 --> 00:41:15,440 Speaker 1: there's a group that's being helped by something and a 703 00:41:15,480 --> 00:41:18,360 Speaker 1: group that's being harmed by something, especially when it 704 00:41:18,360 --> 00:41:21,640 Speaker 1: comes to public policy, right. So which group are you gonna 705 00:41:21,640 --> 00:41:24,640 Speaker 1: empathize with? If you empathize with the current victims and 706 00:41:24,680 --> 00:41:26,839 Speaker 1: you change public policy to help them, well, then you're 707 00:41:26,920 --> 00:41:30,600 Speaker 1: leaving the people who are currently benefiting out in the cold, right. 708 00:41:30,640 --> 00:41:33,480 Speaker 1: So there's a big question of who you should empathize 709 00:41:33,480 --> 00:41:35,719 Speaker 1: with at any given point in time, which this 710 00:41:35,760 --> 00:41:42,320 Speaker 1: whole behavioral-science nudge-politics BS that is ultimately behind 711 00:41:42,360 --> 00:41:47,040 Speaker 1: this whole push to empathize more, um, is not 712 00:41:47,120 --> 00:41:49,920 Speaker 1: taking into consideration. And then there's kind of 713 00:41:49,920 --> 00:41:53,600 Speaker 1: a second facet to that, which is studies have found 714 00:41:53,640 --> 00:41:59,360 Speaker 1: that when you increase empathy in people, um, they tend 715 00:41:59,400 --> 00:42:02,960 Speaker 1: to empathize more with their own group, but it also 716 00:42:03,200 --> 00:42:07,120 Speaker 1: in kind increases hostility in those people towards out-groups. 717 00:42:08,080 --> 00:42:09,960 Speaker 1: You know what I'm saying? Like, they see their friend 718 00:42:10,000 --> 00:42:12,200 Speaker 1: who's being hurt as more of a victim, and how 719 00:42:12,239 --> 00:42:15,000 Speaker 1: could you do this to them?
And now I want 720 00:42:15,040 --> 00:42:17,400 Speaker 1: to get you back, because one of the sour sides 721 00:42:17,400 --> 00:42:21,320 Speaker 1: of empathy is that it frequently comes with a taste 722 00:42:21,360 --> 00:42:23,960 Speaker 1: for retribution too, I think is how Paul Bloom put it: 723 00:42:26,200 --> 00:42:28,840 Speaker 1: the dark side of empathy. So, yeah, there is 724 00:42:28,880 --> 00:42:33,040 Speaker 1: a dark side. There's a dark side to everything, isn't there? Yeah, 725 00:42:33,200 --> 00:42:40,359 Speaker 1: except you. I'm all dark side, you're all light, kind of. 726 00:42:41,280 --> 00:42:44,280 Speaker 1: So we'll finish up here with a bit on people 727 00:42:44,320 --> 00:42:50,560 Speaker 1: with autism, because there's this stereotype, um, 728 00:42:50,920 --> 00:42:53,200 Speaker 1: everyone's probably heard it, that, you know what, people with 729 00:42:53,280 --> 00:42:58,000 Speaker 1: autism lack empathy and they don't understand emotions. And if 730 00:42:58,040 --> 00:43:02,640 Speaker 1: you know anybody who, uh, either has autism or is 731 00:43:02,680 --> 00:43:06,000 Speaker 1: a parent of a child with autism, they will dispel 732 00:43:06,080 --> 00:43:10,239 Speaker 1: that myth pretty straight up just from their own lives. Um. 733 00:43:10,280 --> 00:43:13,440 Speaker 1: But these people did some studying and some research because 734 00:43:13,440 --> 00:43:15,359 Speaker 1: they were like, that's not good enough for me, and 735 00:43:15,360 --> 00:43:18,759 Speaker 1: it's not good enough to just say, like, you know, 736 00:43:19,520 --> 00:43:22,840 Speaker 1: autism is different for everyone, so some people with 737 00:43:23,000 --> 00:43:26,880 Speaker 1: autism show empathy, but everyone's different, 738 00:43:26,920 --> 00:43:29,759 Speaker 1: so who cares about investigating that? Yeah. So I really 739 00:43:29,760 --> 00:43:31,640 Speaker 1: love the approach they took here. They 740 00:43:31,680 --> 00:43:35,760 Speaker 1: really wanted to keep digging, which I really respected. So, uh, 741 00:43:35,800 --> 00:43:37,279 Speaker 1: they said, you know what I think might be 742 00:43:37,280 --> 00:43:42,360 Speaker 1: going on here? There's this other, um, condition called alexithymia. 743 00:43:43,280 --> 00:43:48,560 Speaker 1: And alexithymia means you have a difficult time understanding your 744 00:43:48,560 --> 00:43:53,080 Speaker 1: own emotions. So you might, you know, you might have 745 00:43:53,120 --> 00:43:55,600 Speaker 1: a feeling that you're experiencing an emotion, but you just 746 00:43:55,640 --> 00:43:57,680 Speaker 1: don't know what it is. And about ten percent of 747 00:43:57,800 --> 00:44:01,640 Speaker 1: people have it in the regular population. About fifty percent of people 748 00:44:01,640 --> 00:44:06,040 Speaker 1: with autism have alexithymia. But they're not the same thing. No. 749 00:44:06,719 --> 00:44:10,400 Speaker 1: And these guys actually found that, um, people with autism 750 00:44:10,400 --> 00:44:15,279 Speaker 1: who do not have alexithymia tend to display empathy. Yeah, 751 00:44:15,400 --> 00:44:21,280 Speaker 1: and even, you know, lots of empathy. Lots of empathy. Yeah, empathy. 752 00:44:21,320 --> 00:44:24,719 Speaker 1: They got binders full of empathy. Binders full of empathy.
753 00:44:25,680 --> 00:44:28,799 Speaker 1: Oh yeah, we remember when that was the most controversial 754 00:44:28,800 --> 00:44:33,200 Speaker 1: thing going in politics. Oh man, binders full of empathy. 755 00:44:33,360 --> 00:44:36,560 Speaker 1: Uh, yeah, like they scored, you know, very 756 00:44:36,560 --> 00:44:39,319 Speaker 1: strong when it came to measuring empathy. Uh, and what 757 00:44:39,360 --> 00:44:41,360 Speaker 1: they did was, you know... that makes sense. The 758 00:44:41,400 --> 00:44:43,640 Speaker 1: way they did it, I really like this study. 759 00:44:43,920 --> 00:44:48,960 Speaker 1: They had four groups: uh, individuals with autism and alexithymia, uh, 760 00:44:49,000 --> 00:44:53,920 Speaker 1: individuals with autism without it, individuals with alexithymia but not autism, 761 00:44:54,000 --> 00:44:57,719 Speaker 1: and then people that didn't have either one. And it 762 00:44:58,440 --> 00:45:03,239 Speaker 1: basically seems to kind of prove that, yeah, it's just 763 00:45:03,400 --> 00:45:07,040 Speaker 1: not true that people with autism don't have empathy. It's 764 00:45:07,080 --> 00:45:10,400 Speaker 1: really alexithymia that's going on, right. Which is, I 765 00:45:10,440 --> 00:45:13,919 Speaker 1: think, a novel finding, or a novel hypothesis. I don't 766 00:45:13,920 --> 00:45:16,919 Speaker 1: think this is part of a larger field; I think 767 00:45:16,960 --> 00:45:19,400 Speaker 1: these guys came up with that. Yeah. And did 768 00:45:19,440 --> 00:45:22,720 Speaker 1: you see that other study, um, from Goldsmiths, University 769 00:45:22,719 --> 00:45:27,040 Speaker 1: of London, about the facial expressions? Yeah, I thought that 770 00:45:27,080 --> 00:45:31,360 Speaker 1: was pretty interesting too. Yeah. They investigated that, um, 771 00:45:31,400 --> 00:45:35,600 Speaker 1: if you expose people with autism to the sounds of 772 00:45:35,600 --> 00:45:38,480 Speaker 1: people's voices and ask them to rate what emotion that 773 00:45:38,520 --> 00:45:42,279 Speaker 1: person is experiencing, they're far better at, um, calling that 774 00:45:42,920 --> 00:45:47,640 Speaker 1: correctly than with faces. And apparently it's because people with autism 775 00:45:47,640 --> 00:45:51,200 Speaker 1: tend to spend much less time studying faces, not because 776 00:45:51,239 --> 00:45:54,360 Speaker 1: they can't empathize. They just aren't using cues that, um, 777 00:45:54,440 --> 00:45:59,000 Speaker 1: people without autism use to, um, conclude what emotions people 778 00:45:59,000 --> 00:46:02,719 Speaker 1: are experiencing. Yeah, really interesting stuff. And I don't 779 00:46:02,760 --> 00:46:06,759 Speaker 1: know why this didn't get more play, because it still 780 00:46:06,800 --> 00:46:10,279 Speaker 1: seems like people are kind of banging that drum that, 781 00:46:10,400 --> 00:46:13,600 Speaker 1: you know, people with autism aren't empathetic. Yeah, I don't 782 00:46:13,600 --> 00:46:17,839 Speaker 1: know why either. It just makes sense. Yeah, um, 783 00:46:17,880 --> 00:46:20,399 Speaker 1: we need to do an entire episode on autism. Yeah, 784 00:46:20,480 --> 00:46:23,120 Speaker 1: maybe alexithymia. I've never heard of that.
We also need 785 00:46:23,160 --> 00:46:25,880 Speaker 1: to do one on psychopaths too, which is another group 786 00:46:25,920 --> 00:46:29,120 Speaker 1: that tends to be pointed to kind of incorrectly 787 00:46:29,160 --> 00:46:31,640 Speaker 1: as far as empathy goes, where if you're lacking empathy, 788 00:46:31,680 --> 00:46:34,279 Speaker 1: you're a psychopath. It actually turns out that if you 789 00:46:34,320 --> 00:46:38,240 Speaker 1: have what's called a shallow affect, meaning, like, across 790 00:46:38,280 --> 00:46:42,879 Speaker 1: the board emotionally you're pretty stunted and, um, shallow or superficial, 791 00:46:43,760 --> 00:46:46,160 Speaker 1: that's what really qualifies you as a psychopath, not just 792 00:46:46,360 --> 00:46:50,560 Speaker 1: missing empathy. Um. But yet again, it's another popular misconception 793 00:46:50,560 --> 00:46:55,520 Speaker 1: that's being allowed to persist. I'm just irritated, Chuck. I've 794 00:46:55,560 --> 00:46:58,600 Speaker 1: got a great quote, though, from Paul Bloom. And I 795 00:46:58,640 --> 00:47:02,040 Speaker 1: also want to say that I think, um, that 796 00:47:02,120 --> 00:47:05,080 Speaker 1: the different kinds of empathy also get divided among 797 00:47:05,280 --> 00:47:08,600 Speaker 1: the genders as well. And we've even 798 00:47:08,640 --> 00:47:11,960 Speaker 1: talked about that study that concluded that women tend to 799 00:47:11,960 --> 00:47:15,200 Speaker 1: suffer from depression because they're more empathetic. I think that 800 00:47:15,520 --> 00:47:18,320 Speaker 1: maybe that's the case, and there is a biological basis 801 00:47:18,320 --> 00:47:21,040 Speaker 1: for it in adolescence. But one thing that seems to 802 00:47:21,080 --> 00:47:25,239 Speaker 1: persist everywhere is that, um, different types of empathy, or 803 00:47:25,280 --> 00:47:28,880 Speaker 1: different techniques to produce empathy, can be learned; 804 00:47:30,000 --> 00:47:32,759 Speaker 1: they can be taught. And I think if you just say, like, well, 805 00:47:32,760 --> 00:47:34,640 Speaker 1: wait a minute, I really want to solve this problem, 806 00:47:34,640 --> 00:47:36,319 Speaker 1: I'm not going to fly off the handle, or I'm 807 00:47:36,360 --> 00:47:39,160 Speaker 1: not gonna lose my marbles, I'm gonna, like, really put 808 00:47:39,200 --> 00:47:41,400 Speaker 1: some thought into it, and I can still be compassionate 809 00:47:41,400 --> 00:47:43,680 Speaker 1: but I don't have to completely experience someone else's pain, 810 00:47:43,960 --> 00:47:47,160 Speaker 1: I don't think that that's a biological imperative one way 811 00:47:47,239 --> 00:47:49,880 Speaker 1: or another. I think if you decide to make a 812 00:47:50,160 --> 00:47:52,799 Speaker 1: choice or a change in the way you approach situations, 813 00:47:52,840 --> 00:47:55,120 Speaker 1: that has nothing to do with gender. So I just 814 00:47:55,120 --> 00:47:57,200 Speaker 1: wanted to point that out.
Yeah, and as far as 815 00:47:57,239 --> 00:48:00,680 Speaker 1: teaching empathy, like, there's been a little bit of poo-pooing 816 00:48:00,800 --> 00:48:03,080 Speaker 1: of emotional empathy, but I think 817 00:48:03,080 --> 00:48:05,239 Speaker 1: it's definitely, like, a pretty good thing to do as 818 00:48:05,239 --> 00:48:08,919 Speaker 1: a parent to try and teach your child, like, hey, 819 00:48:09,040 --> 00:48:11,399 Speaker 1: you know, how would you feel if someone was doing 820 00:48:11,400 --> 00:48:14,800 Speaker 1: this to you? Yeah. And that's how they learn. Yeah, exactly. 821 00:48:15,040 --> 00:48:17,799 Speaker 1: You don't learn it on your own. I think it 822 00:48:17,840 --> 00:48:23,440 Speaker 1: has to be imparted by good parents. Agreed. And, um, again, 823 00:48:23,719 --> 00:48:27,000 Speaker 1: the goal, and this is a Paul Bloom quote, 824 00:48:27,040 --> 00:48:30,480 Speaker 1: the goal isn't to love every single person like 825 00:48:30,520 --> 00:48:34,440 Speaker 1: you love the people closest to you, but to value 826 00:48:34,480 --> 00:48:36,560 Speaker 1: other people just for the very fact that they're human 827 00:48:36,600 --> 00:48:39,959 Speaker 1: beings, right? That's the goal that everybody's looking for with 828 00:48:39,960 --> 00:48:43,080 Speaker 1: empathy. And he says, quote, our best hope for 829 00:48:43,120 --> 00:48:45,440 Speaker 1: the future is not to get people to think of 830 00:48:45,480 --> 00:48:48,799 Speaker 1: all humanity as family. That's impossible. It lies instead in 831 00:48:48,840 --> 00:48:51,760 Speaker 1: an appreciation of the fact that, even if we don't 832 00:48:51,840 --> 00:48:55,640 Speaker 1: empathize with distant strangers, their lives have the same value 833 00:48:55,840 --> 00:49:01,279 Speaker 1: as the lives of those we love. That's the key. Interesting. Yeah, 834 00:49:01,320 --> 00:49:06,440 Speaker 1: good stuff, good stuff. We should subtitle this one Empathy: 835 00:49:06,680 --> 00:49:11,120 Speaker 1: A Loosey-Goosey Episode, also known as What Paul Bloom Says. 836 00:49:13,000 --> 00:49:16,920 Speaker 1: Thank you, Paul Bloom. Yeah, big big ups to Paul Bloom. Uh, 837 00:49:16,960 --> 00:49:19,480 Speaker 1: and since I said big ups to Paul Bloom, that 838 00:49:19,600 --> 00:49:24,680 Speaker 1: means it's time for listener mail, Chuck. Um, I'm 839 00:49:24,680 --> 00:49:31,120 Speaker 1: gonna call this Hookworms. Nice. Um. Hello from the 840 00:49:31,160 --> 00:49:35,520 Speaker 1: sunny South, United States. Southerners aren't lazy and dumb, they 841 00:49:35,600 --> 00:49:39,560 Speaker 1: just had hookworm. Great title, by the way. Josh brought 842 00:49:39,560 --> 00:49:41,799 Speaker 1: back a childhood memory, and I finally had to write in. Guys, 843 00:49:41,800 --> 00:49:43,719 Speaker 1: I grew up in Florida, so we spent most of 844 00:49:43,760 --> 00:49:46,560 Speaker 1: the summer with our shoes off. Uh, and I remember 845 00:49:46,600 --> 00:49:49,920 Speaker 1: my mother distinctly reminding me to wear shoes, uh, so 846 00:49:49,960 --> 00:49:53,759 Speaker 1: I wouldn't get the ground itch. It never happened.
I 847 00:49:53,840 --> 00:49:55,480 Speaker 1: called my mom, who is now eighty-eight years old, 848 00:49:55,520 --> 00:49:57,759 Speaker 1: to verify a few facts. And when I was 849 00:49:57,760 --> 00:50:00,000 Speaker 1: a little girl, I believe around five to seven or 850 00:50:00,080 --> 00:50:02,520 Speaker 1: eight years old, before school started, my mother would give me 851 00:50:02,600 --> 00:50:07,000 Speaker 1: a worm treatment on my feet. I explained to her 852 00:50:07,040 --> 00:50:09,400 Speaker 1: what I had learned during the podcast about hookworms and 853 00:50:09,440 --> 00:50:11,279 Speaker 1: how they affected the body. When I mentioned how they 854 00:50:11,320 --> 00:50:14,040 Speaker 1: cause severe anemia and cause the body to be more 855 00:50:14,080 --> 00:50:17,320 Speaker 1: susceptible to illness, she remembered a story about my father's cousin. 856 00:50:17,920 --> 00:50:21,120 Speaker 1: Apparently the cousin became so incredibly ill 857 00:50:21,680 --> 00:50:23,279 Speaker 1: she was very close to dying. They took her to 858 00:50:23,320 --> 00:50:26,120 Speaker 1: the hospital and found out she was severely anemic, and 859 00:50:26,160 --> 00:50:28,839 Speaker 1: before they began any other diagnostics, they decided to test 860 00:50:28,840 --> 00:50:32,520 Speaker 1: her for hookworm, and bingo. As my mother said, she 861 00:50:32,680 --> 00:50:35,200 Speaker 1: was full of them. She had a high worm burden. 862 00:50:35,719 --> 00:50:39,040 Speaker 1: She did, uh. Mom said it took three treatments to 863 00:50:39,080 --> 00:50:41,520 Speaker 1: get rid of the worms. The story was she was 864 00:50:41,560 --> 00:50:44,879 Speaker 1: so infested they literally came out of her mouth when 865 00:50:44,920 --> 00:50:49,160 Speaker 1: she was being treated. Oh my god. Wow, that is 866 00:50:49,200 --> 00:50:51,360 Speaker 1: the best story I've heard in a while. And she 867 00:50:51,400 --> 00:50:54,319 Speaker 1: put in parentheses, I know, right? Because I think she 868 00:50:54,360 --> 00:50:57,319 Speaker 1: anticipated that reaction. That's why you don't want to be, 869 00:50:57,719 --> 00:51:04,359 Speaker 1: uh, a six-point-oh, um, affective empathy person. Yeah, that's right. Uh, 870 00:51:04,480 --> 00:51:07,160 Speaker 1: this cousin is actually still alive and in her early nineties, 871 00:51:07,880 --> 00:51:10,680 Speaker 1: so, uh, this would have been in the nineties. I 872 00:51:10,719 --> 00:51:14,280 Speaker 1: hope she doesn't listen to this show. Hookworm and Fancy 873 00:51:14,320 --> 00:51:18,799 Speaker 1: Free in Florida. That's from Terry Brunson of Panama City. Nice. 874 00:51:18,840 --> 00:51:21,000 Speaker 1: Thanks a lot, Terry. That was a great email. It 875 00:51:21,040 --> 00:51:23,880 Speaker 1: had everything. It was a roller coaster ride. There 876 00:51:23,920 --> 00:51:26,920 Speaker 1: was a cousin who had worms coming out of her mouth. I laughed, 877 00:51:26,920 --> 00:51:33,560 Speaker 1: I cried. There was a mom, an old cousin. I'd 878 00:51:33,560 --> 00:51:35,799 Speaker 1: like to know what the worm treatment consisted of. I'll 879 00:51:35,800 --> 00:51:39,719 Speaker 1: bet there was a dead cat in there somewhere. Oh my god. Uh.
880 00:51:39,760 --> 00:51:43,360 Speaker 1: If you want to tell us about your family's weird remedies, 881 00:51:43,440 --> 00:51:45,800 Speaker 1: we want to know the ingredients, and you can tweet 882 00:51:45,840 --> 00:51:48,239 Speaker 1: them to us at SYSK Podcast, or 883 00:51:48,920 --> 00:51:51,240 Speaker 1: you can hang out with us on Facebook at facebook 884 00:51:51,280 --> 00:51:53,799 Speaker 1: dot com slash Stuff You Should Know or Facebook dot 885 00:51:53,880 --> 00:51:56,839 Speaker 1: com slash Charles W. Chuck Bryant. Uh, send us an 886 00:51:56,840 --> 00:51:59,280 Speaker 1: email to Stuff Podcast at how Stuff Works dot com, 887 00:51:59,440 --> 00:52:01,279 Speaker 1: and join us as always at our home on the web, 888 00:52:01,440 --> 00:52:06,440 Speaker 1: Stuff You Should Know dot com. Stuff You Should Know 889 00:52:06,520 --> 00:52:09,320 Speaker 1: is a production of I Heart Radio. For more podcasts 890 00:52:09,400 --> 00:52:12,799 Speaker 1: from I Heart Radio, visit the I Heart Radio app, Apple Podcasts, 891 00:52:12,880 --> 00:52:17,600 Speaker 1: or wherever you listen to your favorite shows.