Pushkin. I was driving home late at night after spending the night in Seattle, and as I was coming over a bridge back into Tacoma, a little dog ran out in front of my car. I did exactly what most people do when this happens, and what I now know you shouldn't do, which is I swerved to try to avoid hitting it. And the result of both swerving and then ultimately hitting it anyway was that my car was sent into a sort of fishtail and then a spin across the freeway.

This is Abby Marsh. She was only nineteen years old when the events in this story took place.

When the car finally came to a stop, it was in the fast lane of the freeway, just past the crest of this bridge I'd been crossing, which meant that I was invisible to the oncoming traffic.
Unfortunately, they were quite visible to me, because my car was now facing backward into the oncoming traffic, and the engine of the car sort of sputtered to a halt. I didn't have a phone, and I had no way of escaping, because this bridge didn't have any shoulders on it, so there was nowhere to go even if I were to get out of the car. And I just panicked. I couldn't get the car to turn back on, and every time one of these trucks or semis passed me, the whole car would shudder as they went by. I was like, oh shit, now what am I gonna do? You know, your mind is sort of stuttering through the different options. Do I get out of the car? Do I stay in the car? If I got out of the car, I was risking a car hitting me. But if I stayed in the car, I was definitely going to get hit eventually, because these cars that were coming over the crest were swerving barely in time to avoid me.

Abby was a teenager, all alone and trapped on the freeway, confronting what seemed like certain death.

And it was just this sense of futility and blankness.
It was awful.

But what happened next propelled Abby on a totally new journey, a journey that would bring her face to face with the worst and best parts of human nature, and one that has allowed her to unlock a counterintuitive secret to what makes life happier and a bit more worth living.

Our minds are constantly telling us what to do to be happy. But what if our minds are wrong? What if our minds are lying to us, leading us away from what will really make us happy? The good news is that understanding the science of the mind can point us all back in the right direction. You're listening to The Happiness Lab with Dr. Laurie Santos.

Abby Marsh was trapped in her disabled car in the dark, facing the wrong way on the highway. She watched in terror as oncoming traffic swerved by. She needed a miracle.

It's funny, because you know, I don't believe in guardian angels, and in fact it frustrates me when people refer to very altruistic people as angels, because I feel like it sort of takes away from how compassionate real live human beings can be.
You know, you don't have to be supernatural to help somebody else. But it did have that sense of, he just appeared out of nowhere.

Abby looked over to see that a complete stranger was knocking calmly on her passenger-side door.

My memory of him is that he was wearing a suit and a lot of gold jewelry and sunglasses. It was the middle of the night, so that made no sense. But he said, you look like you could use some help, and I said, yeah, I think I could. And he ran around the front of my car into the traffic, got into the driver's seat, and then he got the car started again and got us back across the road and parked us behind his car. I was shaking, and I'm sure I was gray, and I felt awful. And he said, you don't look so good, do you need me to follow you to make sure you get home okay? I was like, no, no, no, I'll be fine, I'll be fine. I'm pretty sure I didn't say thank you. And he's like, okay, you take care of yourself. And he got out of my car, back into his own, and disappeared.
Abby has spent a lot of time wondering about the man who rescued her and the reasons behind his actions.

Practically the instant he saw my car, he must have pulled over and then run across five lanes of freeway traffic in the dark to get to me. Why would somebody do that? Why would somebody do that? What was that moment that happened inside this other person's head that I owe my life to? What's interesting to me is that just knowing that people do it, sort of semantically, you know, you read about it in a newspaper, you can sort of be like, oh, that's interesting. But there's something about it happening to you, a real human being made this choice to save my life even though he risked being killed himself, that makes you want to understand it. And the fact is that we don't really have good explanations for why somebody would do something like this. In fact, it defies a lot of conventional wisdom about human motivation.
You know, the idea that all human beings are fundamentally selfish, that's something many people believe, and so what could be more interesting than a concrete fact that defies a lot of established ideas?

But Abby wasn't content to just sit and wonder why her savior chose to help her. She decided to get to the bottom of his actions scientifically.

I'm not sure how often these motivations that drive us are clear in the moment, but in retrospect, it's very clear that my research took a very distinct track since then.

Abby is now a professor in the Department of Psychology and the Interdisciplinary Program in Neuroscience at Georgetown. She's a world expert on how we process other people's emotions.

I started studying how we respond to other people's fear initially, so why it is that the sight of somebody who's frightened elicits caring responses in people who see it.
But that's a really difficult question to study in the lab, because you know, you can't ethically induce extreme fear in people, and it's really hard to measure people's behavior when it comes to things like altruism in the lab in a way that doesn't make them do what you want them to do.

So Abby decided to employ a common psychology research logic.

If you want to understand a concept, one of the better ways to do that is to find a population of people who are missing the thing you're interested in and try to understand what makes them different, and hopefully that will help you understand where the process you're interested in comes from.

Which led Abby to explore the nature of altruistic actions using a seemingly strange population: psychopaths.

We had known for a while, and scientists had known for a while, that people who are psychopathic don't respond normally to the sight of other people's fear, or even the idea of other people's fear.
And one of my favorite examples of this comes from a story a colleague of mine was telling me. She was testing a bunch of psychopathic adult inmates on their ability to recognize other people's facial expressions, and one psychopathic inmate she was testing was particularly bad at recognizing other people's fear, so bad he missed every single fearful expression she showed him. But he knew he was doing badly, because he got to the last fearful expression in the set and he's like, you know, I don't know what that expression is called, but I know that's what people look like right before you stab them. I find that so incredibly profound, because, you know, here's a man who is imprisoned because he does things that cause other people to believe they're going to die, and he's like, oh yeah, I know what they look like, you know, I recognize that face. But he couldn't link it to the emotion fear. He just couldn't make sense of that vivid, you know, wide-eyed, distressed expression and understand the emotional content behind it.
Abby's now done some elegant work exploring why psychopaths have this problem recognizing others' distress.

We found that there's a structure in their brain called the amygdala that doesn't seem to respond normally to other people's fear, whereas in most people this particular structure seems to be very active in response to somebody's fear, and that seems to help you interpret that emotion. People who are psychopathic just show no response at all.

But the biggest idea that came from Abby's work on the brains of psychopaths was an insight that eventually led her back to the question she first started asking on that highway many years ago.

We have understood now for a while that psychopathy is not a sort of cluster of individuals that's qualitatively different from everybody else. It's a spectrum. And so there are people who are highly, highly psychopathic and people who are only moderately psychopathic, and then those traits vary continuously throughout the population. And what's interesting about that fact is it suggests that there must be such a thing as an anti-psychopath.
So if most of us are sort of moderately compassionate, and we've got psychopaths on one end who have no compassion, well, there must be a mirror image of that: people who are unusually compassionate. And I got to thinking about what that might look like.

What would it look like to be anti-psychopathic? Are they, for example, the kind of people who would run into oncoming traffic to save a complete stranger? Could they be the key to understanding why some people are willing to lose everything to help others? After the break, you'll hear more about these so-called anti-psychopaths, and how their unusual choices demonstrate what you can personally do to become a little bit happier. The Happiness Lab will be back in a moment.

I started out thinking it would be fun and, you know, edifying to study people who were heroic rescuers, like the man who saved my life. But at the time it wasn't at all clear how I would find them.
After exploring the brains of psychopathic killers, researcher Abby Marsh wanted to understand the minds at the polar opposite end of the psychological spectrum.

It's not an easy thing to, like, put an ad in the newspaper for, have you ever saved a stranger's life and risked your own? I wasn't aware of any way to recruit them, and so I thought, well, there are other ways to save lives that involve significant risk and sacrifice. And at the time, a number of articles and a book had come out about altruistic kidney donors, people who give away a kidney to save the life of a stranger in end-stage renal failure.

Patients in end-stage renal failure often wait three to five years to get a kidney from a deceased donor. Living kidney donors, people who are willing to give up one of their two functioning kidneys, can cut down on the wait time for the nearly one hundred thousand people who are on that wait list. The donation procedure is more straightforward than you might think. Most donors return to their usual activities in a few weeks.
But like all surgeries, kidney donations come with at least some risk of serious complications, things like blood clots, infection, or even death.

And I thought, you know, if anything is altruistic, it is that. It is this very significant decision to give away an internal organ, a vital internal organ, to save the life of somebody that you've never met, and who in most cases has been picked off a list for you. So it struck me that if anything could be considered anti-psychopathic, it's the decision to give a kidney to a stranger.

So Abby had her target group, kidney donors, but at the time it was a pretty exclusive club. There were only a few thousand in the entire country who'd gone through that invasive procedure. Meaningful results in scientific studies require having as many test subjects as possible, and so Abby was a bit worried that she wouldn't be able to find enough donors. In spite of the odds, she put an ad on an organ transplant listserv: have you ever donated a kidney to a stranger? If so, a researcher at Georgetown is interested in conducting a study with you.
Abby reasonably assumed that she'd never find enough participants from such an absolutely tiny pool of potential recruits.

I have this vivid memory of sitting down and being like, oh, I wonder if anybody's responded to my ads for kidney donors, and opening up my laptop, and my inbox was just flooded with new messages, many with all-caps subject lines, from altruistic kidney donors who were just very excited to be taking part in my research. You know, I would love to be your guinea pig, please sign me up.

Abby was immediately shocked by this population's generosity. Hundreds of them were ready to fly to her lab at a moment's notice, and once they got there, they were happy to go to great lengths to be helpful and considerate.

Working with lots of different populations over the years, it can be a real trick to just get people to come in, and to come in on time.
And the first three altruistic kidney donors we brought in, they'd come in from all over the country, and they were staying in a hotel just a few blocks from the campus where we were going to be scanning them. And they were so worried about being late to their brain scans that they came three hours early to the first scan, which is like unheard of. And they ended up getting lost in the bowels of the university hospital, and almost ended up breaking through a fire door to get to the scan center and setting off alarms all over the hospital, because again, they were so incredibly concerned about not being late.

Abby also found that her kidney donors were unusually humble. They didn't like her hypothesis that they were in any way special, or at the extreme end of some goodness spectrum.

A number of them very kindly told me that they were happy to participate in the study, they were happy to help out, but they were pretty sure that I was barking up the wrong tree. The idea that there was anything different about them at all was just wrong.
They're not unusually altruistic, they're not unusually compassionate, they're just like anybody else. They happened to be in the right place at the right time. Which is not how anybody else talks about people who give kidneys to strangers. But their real sense of humility has been really striking, just an unwillingness to think of themselves as better than anybody else.

Of course, Abby's results showed that the kidney donors were wrong. They were different, at least when it came to their brains. Abby used neuroimaging techniques to measure the size of the donors' amygdalas, that same brain structure involved in processing fear, the very same part that was significantly smaller than average in psychopaths. It turns out that her kidney donors also had peculiar amygdalas, but theirs were eight percent larger than those of average people. Now, it could be that her donors were just born that way, but it was also possible that performing acts of kindness over time had caused the enlargement, like a muscle responding to exercise.
Whatever the reason, these extreme altruists had amygdalas that indeed looked like the polar opposite of what she'd seen in her malicious criminals. Abby had finally identified a population of anti-psychopaths, which was a pretty cool result for a budding neuroscientist. But the most important thing about studying this new population for Abby wasn't just that she had discovered a completely new, neurally atypical population. The altruistic kidney donors finally gave Abby the opportunity to pose the question that had puzzled her for decades, the question she wanted to ask her highway savior: why would somebody do that? Why did her rescuer choose to save her? Abby conducted interview after interview, asking, what was going through your head when you decided to help someone in such an extreme way?

The most common answer I get to the question is, it just hit me like a bolt of lightning. I found out that there are people who were dying from kidney failure, there's one hundred thousand of them on the waiting list, and most of us can give a kidney away and be none the worse for wear for it.
And I thought, I'll do that. I mean, there is no decision process, I think, is the interesting thing. It's not a hemming-and-hawing process for really almost anybody I've talked to. It's just, oh well, you can do this, somebody's life is gonna be saved, I'll do it.

At first glance, this sort of answer fits with Abby's initial hypothesis that there had to be something fundamentally different about people who would risk their lives for strangers without a moment's thought. But if that were the case, her results wouldn't be as relevant for all of you, and so I wouldn't be talking about them here on The Happiness Lab. As Abby probed more deeply, she realized that this couldn't be the whole story. As she heard more about her participants' lives, she realized that many of them got to this act of kidney donation through lots of smaller acts of generosity.

Nobody goes from sort of ground zero to donating a kidney. Almost always, the people we've worked with are longtime blood donors, platelet donors, some have been marrow donors. Many of them work in volunteer positions, rescue animals, foster children.
They've all done things in the past that involved giving of themselves to help other people.

Abby realized that many of her kidney donors wound up getting to what seemed like an extreme altruistic choice through lots of baby steps, smaller nice actions, the kinds of things that lots of us do or could easily do. Over time, the donors recognized that performing these smaller acts of kindness felt, well, kind of nice.

The sense that I get is that they have had the wonderful opportunity to discover how rewarding that is, what a sense of joy and happiness it gives you to help other people. And it's just like any other reinforcement process, you sort of work your way up. You're like, well, that was so rewarding, what else could I do? If donating blood is good, I guess donating marrow is even better. If donating a kidney is good, I guess donating a piece of my liver is even better. And I now have several kidney donors I've worked with who've also donated a portion of their liver.
As she heard 325 00:17:03,076 --> 00:17:06,716 Speaker 1: from more interviewees, Abby's big savior-on-the-highway puzzle 326 00:17:06,996 --> 00:17:11,356 Speaker 1: started to become clearer. Altruists did what they did because they 327 00:17:11,356 --> 00:17:14,876 Speaker 1: had learned a simple yet counterintuitive principle of human motivation. 328 00:17:15,476 --> 00:17:19,476 Speaker 1: Doing nice things for other people feels really good, even 329 00:17:19,516 --> 00:17:22,396 Speaker 1: in cases where it's a bit costly. Helping others can 330 00:17:22,436 --> 00:17:25,836 Speaker 1: provide a big spike to our well-being. Altruistic kidney 331 00:17:25,836 --> 00:17:29,196 Speaker 1: donors just take the usual well-being spike we all experience 332 00:17:29,436 --> 00:17:32,636 Speaker 1: to an extreme. It's therefore no surprise that they tend 333 00:17:32,676 --> 00:17:35,356 Speaker 1: to be a really happy group on average. It's a 334 00:17:35,796 --> 00:17:39,796 Speaker 1: universal response I get from them. They are so glad 335 00:17:39,836 --> 00:17:41,556 Speaker 1: that they made the decision to donate. They'd do it 336 00:17:41,556 --> 00:17:43,276 Speaker 1: one hundred more times if they could. 337 00:17:43,276 --> 00:17:44,996 Speaker 1: It's one of the best things that they 338 00:17:44,996 --> 00:17:48,516 Speaker 1: ever did, and it gives them this sense of joy 339 00:17:48,756 --> 00:17:50,996 Speaker 1: that sticks with them, as far as I can tell, forever. 340 00:17:51,316 --> 00:17:53,756 Speaker 1: Some of the people I've worked with donated close to 341 00:17:53,796 --> 00:17:56,436 Speaker 1: twenty years ago now, and it doesn't ever seem to 342 00:17:56,476 --> 00:18:01,236 Speaker 1: go away, that sense of vicarious joy of having been 343 00:18:01,276 --> 00:18:03,636 Speaker 1: able to do this thing for somebody else.
I've had 344 00:18:03,676 --> 00:18:07,556 Speaker 1: many interviews end in tears as people are describing the 345 00:18:07,676 --> 00:18:10,636 Speaker 1: aftereffects of their donation and hearing that the child 346 00:18:10,716 --> 00:18:13,876 Speaker 1: who received their kidney, like days after donation, 347 00:18:13,956 --> 00:18:16,356 Speaker 1: was making plans to go to the beach and camping 348 00:18:16,356 --> 00:18:18,236 Speaker 1: for like the first time. He'd never been able to 349 00:18:18,276 --> 00:18:21,276 Speaker 1: do these things. And like the kidney donor's sobbing relating this, 350 00:18:21,356 --> 00:18:24,716 Speaker 1: and I'm sobbing relating this, it's incredibly profound. As Abby 351 00:18:24,796 --> 00:18:27,436 Speaker 1: heard more of these stories and saw the incredible joy 352 00:18:27,476 --> 00:18:30,436 Speaker 1: that her subjects experienced, she started to think that her 353 00:18:30,436 --> 00:18:33,836 Speaker 1: extreme subjects might be onto something important, something that the 354 00:18:33,876 --> 00:18:36,716 Speaker 1: rest of us could learn from, too. If you want 355 00:18:36,756 --> 00:18:40,556 Speaker 1: to make a good decision about bringing joy and meaning 356 00:18:40,556 --> 00:18:42,676 Speaker 1: and a sense of connectedness into your own life, helping 357 00:18:42,716 --> 00:18:45,556 Speaker 1: people is clearly the way to do it. Abby started 358 00:18:45,596 --> 00:18:48,276 Speaker 1: to realize that all of us can benefit from doing 359 00:18:48,356 --> 00:18:50,996 Speaker 1: nice things for others, even if we're not yet ready 360 00:18:51,036 --> 00:18:53,276 Speaker 1: to give up a body part to a stranger. We 361 00:18:53,356 --> 00:18:56,116 Speaker 1: all have our own ways that we can make the 362 00:18:56,156 --> 00:18:59,076 Speaker 1: lives of other people better. You know, donating a kidney is 363 00:18:59,116 --> 00:19:01,476 Speaker 1: one way, but it's certainly not the only way.
But 364 00:19:01,556 --> 00:19:04,516 Speaker 1: is Abby right? I mean, it's clear that her donors 365 00:19:04,556 --> 00:19:07,876 Speaker 1: get a huge happiness boost from their generous act. But 366 00:19:07,956 --> 00:19:11,236 Speaker 1: can the average person really become happier by making a 367 00:19:11,316 --> 00:19:14,836 Speaker 1: small sacrifice to aid a stranger? Can shifting your focus 368 00:19:14,876 --> 00:19:17,956 Speaker 1: to helping other people really be a strategy for improving 369 00:19:17,956 --> 00:19:20,916 Speaker 1: well-being? And if there is a path to becoming 370 00:19:20,916 --> 00:19:23,676 Speaker 1: a happy altruist, is there a step along that path 371 00:19:23,956 --> 00:19:27,556 Speaker 1: that you could take today? The Happiness Lab will be 372 00:19:27,636 --> 00:19:39,356 Speaker 1: right back. When we think of small, everyday things we 373 00:19:39,396 --> 00:19:41,676 Speaker 1: can do to boost our mood, we often think of 374 00:19:41,716 --> 00:19:44,756 Speaker 1: the idea of pampering ourselves. And whenever I think of 375 00:19:44,796 --> 00:19:47,596 Speaker 1: personal pampering, I'm reminded of one of my favorite scenes 376 00:19:47,676 --> 00:19:50,156 Speaker 1: from the TV show Parks and Recreation. Once a year, 377 00:19:50,276 --> 00:19:52,716 Speaker 1: Donna and I spend a day treating ourselves. What do 378 00:19:52,756 --> 00:19:57,516 Speaker 1: we treat ourselves to? Clothes? Treat yourself. Massage? 379 00:19:57,676 --> 00:20:02,036 Speaker 1: Treat yourself. Mimosa? Treat yourself. Fine leather goods? Treat yourself. 380 00:20:02,076 --> 00:20:04,356 Speaker 1: It's the best day of the year, the best day 381 00:20:04,356 --> 00:20:06,956 Speaker 1: of the year. When we want to be happier, we 382 00:20:07,036 --> 00:20:10,356 Speaker 1: think it's time to spoil ourselves, or, in the popular 383 00:20:10,356 --> 00:20:12,996 Speaker 1: parlance of Parks and Rec: I've got three words for you. 384 00:20:13,676 --> 00:20:16,676 Speaker 1: Treat. Yo. Self. On the show, Tom and Donna observe Treat 385 00:20:16,716 --> 00:20:20,596 Speaker 1: Yourself Day every October thirteenth. It's now become a cultural phenomenon, 386 00:20:21,156 --> 00:20:23,636 Speaker 1: so much so that Retta, the actress who plays Donna, 387 00:20:23,996 --> 00:20:26,196 Speaker 1: can't post a photo of a cocktail or a purse 388 00:20:26,236 --> 00:20:29,356 Speaker 1: on Instagram without some fan telling her to go ahead 389 00:20:29,436 --> 00:20:33,036 Speaker 1: and treat yourself. But is this the right strategy? Should we 390 00:20:33,076 --> 00:20:35,996 Speaker 1: be treating ourselves to feel happier, or are we missing 391 00:20:36,076 --> 00:20:40,316 Speaker 1: other more powerful opportunities to boost our moods? You know, 392 00:20:40,356 --> 00:20:42,956 Speaker 1: I don't think treating ourselves is a terrible idea, like 393 00:20:42,996 --> 00:20:46,476 Speaker 1: spending money on ourselves can be good. This is Liz Dunn, 394 00:20:46,836 --> 00:20:50,076 Speaker 1: a psychology professor at the University of British Columbia and 395 00:20:50,116 --> 00:20:53,316 Speaker 1: author of the book Happy Money: The Science of Happier Spending. 396 00:20:53,756 --> 00:20:56,156 Speaker 1: It's just that this idea that spending money on 397 00:20:56,156 --> 00:20:59,356 Speaker 1: somebody else could actually be helpful, I think, is especially 398 00:20:59,436 --> 00:21:02,036 Speaker 1: easy to overlook, because I think we do just get 399 00:21:02,116 --> 00:21:05,076 Speaker 1: focused on ourselves. Liz studies the cases where our so-called 400 00:21:05,156 --> 00:21:09,276 Speaker 1: treat yourself intuitions can lead us astray, especially when 401 00:21:09,276 --> 00:21:11,796 Speaker 1: it comes to spending our disposable income. I first got 402 00:21:11,796 --> 00:21:14,436 Speaker 1: interested in this idea, like, not because I was especially 403 00:21:14,516 --> 00:21:17,876 Speaker 1: interested in generosity, but because I was interested in money. 404 00:21:18,516 --> 00:21:21,996 Speaker 1: So I managed to make it through my twenties without 405 00:21:21,996 --> 00:21:25,356 Speaker 1: ever holding a real job. At twenty seven, I got 406 00:21:25,356 --> 00:21:27,716 Speaker 1: my first real job and they actually started paying me, 407 00:21:27,756 --> 00:21:29,516 Speaker 1: and I was like, oh wow, Like, what do I 408 00:21:29,556 --> 00:21:31,196 Speaker 1: do with all of this money? Like this is more 409 00:21:31,236 --> 00:21:33,996 Speaker 1: money than I need to survive, you know, and what 410 00:21:34,076 --> 00:21:36,036 Speaker 1: do I do with it? I was surprised at the 411 00:21:36,076 --> 00:21:38,636 Speaker 1: time by how little research there was on this topic. 412 00:21:39,316 --> 00:21:41,516 Speaker 1: Liz combed the literature to figure out the best way 413 00:21:41,556 --> 00:21:44,036 Speaker 1: to spend our money to feel happier, and all the 414 00:21:44,116 --> 00:21:46,956 Speaker 1: existing studies seemed to point in the same direction. The 415 00:21:47,036 --> 00:21:50,196 Speaker 1: science shows that treating ourselves doesn't make us as happy 416 00:21:50,516 --> 00:21:53,476 Speaker 1: as treating other people, and that result is not just 417 00:21:53,556 --> 00:21:57,316 Speaker 1: true for extremely altruistic people like Abby's kidney donors, even 418 00:21:57,316 --> 00:22:00,116 Speaker 1: when we look at pretty diverse regions of the world.
419 00:22:00,436 --> 00:22:02,836 Speaker 1: In fact, in all seven major regions of the world, 420 00:22:02,916 --> 00:22:06,356 Speaker 1: we find this relationship whereby people who donate money to 421 00:22:06,436 --> 00:22:09,276 Speaker 1: charity are happier than those who don't. So I thought, well, okay, 422 00:22:09,316 --> 00:22:12,076 Speaker 1: would we actually get more happiness bang 423 00:22:12,156 --> 00:22:14,876 Speaker 1: for our buck by spending on others than by spending 424 00:22:14,876 --> 00:22:18,516 Speaker 1: on ourselves? Liz and her colleagues decided to test this in 425 00:22:18,556 --> 00:22:21,316 Speaker 1: a rather simple experiment. They walked up to people on 426 00:22:21,356 --> 00:22:24,556 Speaker 1: the street and handed them twenty dollars. We asked them 427 00:22:24,556 --> 00:22:26,436 Speaker 1: to spend it by the end of the day, but 428 00:22:26,556 --> 00:22:28,316 Speaker 1: with a catch. So we told half the people they 429 00:22:28,316 --> 00:22:31,076 Speaker 1: had to spend it on themselves, and we told half 430 00:22:31,076 --> 00:22:33,796 Speaker 1: the people they had to spend it to benefit others. 431 00:22:34,036 --> 00:22:36,316 Speaker 1: Imagine for a second that you're a subject in this study. 432 00:22:36,876 --> 00:22:39,156 Speaker 1: You just got twenty bucks out of the blue, and 433 00:22:39,196 --> 00:22:42,956 Speaker 1: you're asked to spend it. What would feel better: spending 434 00:22:42,956 --> 00:22:45,396 Speaker 1: that money on a nice free meal or a shiny 435 00:22:45,436 --> 00:22:48,956 Speaker 1: manicure for yourself, or using that same amount of money 436 00:22:49,076 --> 00:22:52,596 Speaker 1: to help someone else? If you're like Liz's subjects, you 437 00:22:52,716 --> 00:22:56,836 Speaker 1: probably think the treat yourself condition would feel better.
In fact, 438 00:22:57,196 --> 00:23:00,036 Speaker 1: Liz asked over a hundred people to predict which condition 439 00:23:00,036 --> 00:23:02,796 Speaker 1: they would prefer, and about two thirds of them went 440 00:23:02,836 --> 00:23:06,276 Speaker 1: with the treat yourself option. But what did Liz find 441 00:23:06,316 --> 00:23:09,196 Speaker 1: when people really spent that cash windfall? People were in 442 00:23:09,236 --> 00:23:11,676 Speaker 1: a better mood at the end of the day when 443 00:23:11,836 --> 00:23:14,076 Speaker 1: they'd been asked to spend this money on other people 444 00:23:14,236 --> 00:23:17,276 Speaker 1: rather than on themselves. The simple act of spending twenty 445 00:23:17,276 --> 00:23:20,876 Speaker 1: dollars on another person was enough to significantly raise people's 446 00:23:20,916 --> 00:23:23,316 Speaker 1: well-being levels. But Liz has found that the same 447 00:23:23,356 --> 00:23:26,716 Speaker 1: effect holds for smaller amounts of money too. Her team 448 00:23:26,756 --> 00:23:29,556 Speaker 1: tested a different group of subjects, who were given only 449 00:23:29,596 --> 00:23:33,036 Speaker 1: five dollars to spend on themselves or someone else. This 450 00:23:33,156 --> 00:23:36,236 Speaker 1: second group showed exactly the same effect as those who 451 00:23:36,276 --> 00:23:38,756 Speaker 1: were given more cash. You don't necessarily have to be 452 00:23:38,796 --> 00:23:43,076 Speaker 1: spending crazy amounts of money on others, even, like, say 453 00:23:43,116 --> 00:23:46,516 Speaker 1: five dollars, or even just two dollars, and shifting it 454 00:23:46,556 --> 00:23:49,556 Speaker 1: towards using that money to benefit other people does seem 455 00:23:49,556 --> 00:23:53,156 Speaker 1: to provide this detectable benefit for moods.
Liz and her colleagues have 456 00:23:53,236 --> 00:23:56,196 Speaker 1: now replicated the same effect in people all over the world, 457 00:23:56,636 --> 00:24:00,556 Speaker 1: in Canada, India, Uganda, and even remote villages in the 458 00:24:00,556 --> 00:24:04,276 Speaker 1: islands of Vanuatu. The results are always the same for 459 00:24:04,436 --> 00:24:07,756 Speaker 1: rich and for poor people. One study of South African 460 00:24:07,796 --> 00:24:10,516 Speaker 1: subjects found that people were happier spending money on 461 00:24:10,556 --> 00:24:13,956 Speaker 1: others, even when they reported not having enough money to 462 00:24:13,996 --> 00:24:16,556 Speaker 1: buy food for their families in the last year. But 463 00:24:16,676 --> 00:24:19,996 Speaker 1: what's most impressive is that Liz has shown that generosity 464 00:24:20,276 --> 00:24:22,796 Speaker 1: doesn't just feel good in adulthood. We started to wonder, like, 465 00:24:22,916 --> 00:24:25,036 Speaker 1: you know, is this a fundamental part of human nature? 466 00:24:25,156 --> 00:24:27,436 Speaker 1: So my student Lara Aknin and I teamed up with 467 00:24:27,556 --> 00:24:31,076 Speaker 1: Kiley Hamlin, who's a developmental psychologist, and we brought toddlers 468 00:24:31,116 --> 00:24:33,436 Speaker 1: just under the age of two into the lab. Now, 469 00:24:33,436 --> 00:24:36,036 Speaker 1: of course, toddlers don't really care about money, so we 470 00:24:36,076 --> 00:24:39,236 Speaker 1: worked with like the closest thing to toddler gold, which 471 00:24:39,236 --> 00:24:42,996 Speaker 1: of course is goldfish crackers. And so we gave these 472 00:24:43,036 --> 00:24:46,596 Speaker 1: little kids a windfall of goldfish for themselves, as well as 473 00:24:46,596 --> 00:24:48,716 Speaker 1: a chance to give some of those goldfish away to 474 00:24:48,796 --> 00:24:52,276 Speaker 1: a puppet named Monkey.
The researchers watched how many goldfish 475 00:24:52,276 --> 00:24:55,076 Speaker 1: crackers kids gave away, and then they coded their facial 476 00:24:55,116 --> 00:24:58,676 Speaker 1: expressions to see how happy toddlers seemed afterwards. What we 477 00:24:58,676 --> 00:25:01,156 Speaker 1: see in the study is that even children under the 478 00:25:01,156 --> 00:25:04,276 Speaker 1: age of two seem to exhibit pleasure from giving their 479 00:25:04,276 --> 00:25:09,596 Speaker 1: resources away. Counterintuitively, the kids smiled more and seemed much 480 00:25:09,636 --> 00:25:13,316 Speaker 1: happier after losing a bunch of their goldfish crackers. It's 481 00:25:13,396 --> 00:25:15,716 Speaker 1: kind of just reassuring. Like, as many problems as we 482 00:25:15,756 --> 00:25:18,276 Speaker 1: have in the world right now, it's like the tiny 483 00:25:18,356 --> 00:25:22,436 Speaker 1: humans are starting out with this proclivity to derive joy 484 00:25:22,836 --> 00:25:27,476 Speaker 1: from giving their stuff away. That to me, I don't know, 485 00:25:27,476 --> 00:25:32,716 Speaker 1: it makes me optimistic again about the world. What's less optimistic, though, 486 00:25:32,916 --> 00:25:35,676 Speaker 1: is that we adults don't realize that doing nice things 487 00:25:35,676 --> 00:25:38,676 Speaker 1: for others feels so good that it can have such 488 00:25:38,676 --> 00:25:41,716 Speaker 1: a positive impact on our mood. I mean, I definitely 489 00:25:41,716 --> 00:25:44,116 Speaker 1: want to be happier, but I haven't given away a 490 00:25:44,156 --> 00:25:47,276 Speaker 1: really significant chunk of my income, let alone a kidney. 491 00:25:47,876 --> 00:25:51,116 Speaker 1: I bet you haven't either. Our lying minds keep saying 492 00:25:51,196 --> 00:25:54,236 Speaker 1: treat yourself, which means we tend not to even take 493 00:25:54,316 --> 00:25:57,356 Speaker 1: baby steps towards kindness nearly as much as we could.
494 00:25:57,676 --> 00:26:00,316 Speaker 1: It's so interesting because I think on a broad level, 495 00:26:00,676 --> 00:26:04,436 Speaker 1: people totally recognize that this is the case, and I 496 00:26:04,516 --> 00:26:10,996 Speaker 1: get postcards and emails and stuff from people saying, why 497 00:26:11,116 --> 00:26:13,756 Speaker 1: did you, as a scientist, need to waste your time 498 00:26:13,956 --> 00:26:16,396 Speaker 1: showing this? Where people are just saying like, oh, we 499 00:26:16,396 --> 00:26:19,196 Speaker 1: already knew this from, like, the Bible or from, like, 500 00:26:19,596 --> 00:26:22,356 Speaker 1: you know, what our parents taught us. And I think, 501 00:26:22,396 --> 00:26:25,556 Speaker 1: you know, on a broad level, people recognize that generosity 502 00:26:25,636 --> 00:26:29,156 Speaker 1: feels good. I think what they miss is that, you know, 503 00:26:29,196 --> 00:26:31,836 Speaker 1: when they're looking at how to spend the twenty dollars 504 00:26:31,836 --> 00:26:36,276 Speaker 1: in their pocket, that's where they make the error. So 505 00:26:36,316 --> 00:26:39,916 Speaker 1: it's like, in this moment, with this like piece of 506 00:26:40,236 --> 00:26:43,996 Speaker 1: extra money in my hand, it doesn't maybe occur to 507 00:26:44,036 --> 00:26:46,796 Speaker 1: me to spend it on something else, or, you know, 508 00:26:47,116 --> 00:26:50,756 Speaker 1: it feels much more tempting to use it to benefit 509 00:26:50,836 --> 00:26:53,716 Speaker 1: myself rather than to spend it on others. And again, 510 00:26:53,756 --> 00:26:56,396 Speaker 1: I think we forget that, like, oh, I could buy 511 00:26:56,436 --> 00:26:59,916 Speaker 1: a slightly less expensive car and then have a lot 512 00:26:59,956 --> 00:27:03,676 Speaker 1: of money left over to use to help other people 513 00:27:03,716 --> 00:27:06,356 Speaker 1: in my life or donate to charity or whatever.
And 514 00:27:06,396 --> 00:27:08,236 Speaker 1: I think, you know, that's where the error creeps in, 515 00:27:08,356 --> 00:27:10,476 Speaker 1: is that we kind of forget that we would actually 516 00:27:10,516 --> 00:27:13,236 Speaker 1: benefit from using the money less on ourselves and more 517 00:27:13,276 --> 00:27:16,236 Speaker 1: on other people. So if you really want to treat 518 00:27:16,276 --> 00:27:19,276 Speaker 1: yourself to a happier day, give up something to benefit 519 00:27:19,316 --> 00:27:22,796 Speaker 1: another person. You can start with money, just give up 520 00:27:22,836 --> 00:27:25,236 Speaker 1: a dollar or two. But if you're strapped for cash, 521 00:27:25,596 --> 00:27:28,676 Speaker 1: you can also give up time, like letting someone cut 522 00:27:28,676 --> 00:27:30,596 Speaker 1: in front of you in line at the grocery store. 523 00:27:30,956 --> 00:27:33,476 Speaker 1: It could even be a small service, like helping a 524 00:27:33,516 --> 00:27:36,236 Speaker 1: neighbor clear off the snow, or maybe taking the time 525 00:27:36,276 --> 00:27:39,356 Speaker 1: to rate and review your favorite podcast. And doing nice 526 00:27:39,396 --> 00:27:42,356 Speaker 1: things for others doesn't just boost your mood, it also 527 00:27:42,396 --> 00:27:44,916 Speaker 1: makes the person who receives your kind actions a little 528 00:27:44,956 --> 00:27:47,396 Speaker 1: happier too. And that's the kind of tip I most 529 00:27:47,436 --> 00:27:50,476 Speaker 1: love sharing on this podcast, one that lets my listeners 530 00:27:50,476 --> 00:27:53,276 Speaker 1: selfishly bump up their own happiness in a way that 531 00:27:53,396 --> 00:27:56,396 Speaker 1: also helps to make the world a better, more empathic place. 532 00:27:57,476 --> 00:27:59,996 Speaker 1: And who knows, maybe you won't be satisfied with the 533 00:28:00,076 --> 00:28:03,196 Speaker 1: small act of donating two dollars here or five bucks there.
534 00:28:04,116 --> 00:28:07,116 Speaker 1: Maybe you'll also graduate from those baby steps to the 535 00:28:07,236 --> 00:28:10,156 Speaker 1: kind of selfless acts that lead to a huge, huge, 536 00:28:10,236 --> 00:28:13,556 Speaker 1: long-standing boost in well-being, ones that really, really 537 00:28:13,556 --> 00:28:17,396 Speaker 1: help people. Maybe you'll even save the life of another person, 538 00:28:18,196 --> 00:28:22,116 Speaker 1: a person like Abby, stuck on the highway alone, braced 539 00:28:22,156 --> 00:28:24,796 Speaker 1: for what could have been certain death, a moment that 540 00:28:24,836 --> 00:28:28,156 Speaker 1: has stayed with her for decades. It just stuck with me, 541 00:28:28,436 --> 00:28:30,156 Speaker 1: the fact that I owe my life to the stranger 542 00:28:30,156 --> 00:28:32,836 Speaker 1: who was willing to risk his to save me. I'm 543 00:28:32,876 --> 00:28:36,116 Speaker 1: hoping that your altruistic savior is out there somewhere. If 544 00:28:36,156 --> 00:28:39,716 Speaker 1: he's listening right now, what would you say? That's a 545 00:28:39,756 --> 00:28:42,876 Speaker 1: big one. But, you know, if he's out there listening today, 546 00:28:42,996 --> 00:28:47,916 Speaker 1: I just want to say how profoundly grateful I 547 00:28:47,956 --> 00:28:51,356 Speaker 1: am for the beautiful thing that you did and the 548 00:28:51,396 --> 00:28:56,956 Speaker 1: opportunities you've given me, and I hope to do your 549 00:28:58,076 --> 00:29:12,276 Speaker 1: tremendous act of bravery justice. The Happiness Lab is co-written 550 00:29:12,316 --> 00:29:14,636 Speaker 1: and produced by Ryan Dilley with the help of 551 00:29:14,636 --> 00:29:17,876 Speaker 1: Pete Naton. Our original music was composed by Zachary Silver, 552 00:29:18,236 --> 00:29:21,756 Speaker 1: with additional scoring, mixing and mastering by Evan Viola.
The 553 00:29:21,796 --> 00:29:24,916 Speaker 1: show was edited by Sophie McKibbon and fact-checked by 554 00:29:24,996 --> 00:29:29,596 Speaker 1: Joseph Fridman. Special thanks to Mia Lobel, Carly Migliori, Heather Fain, 555 00:29:29,916 --> 00:29:34,636 Speaker 1: Julia Barton, Maggie Taylor, Maya Kanik, Jacob Weisberg, and my agent, 556 00:29:34,756 --> 00:29:37,716 Speaker 1: Ben Davis. The Happiness Lab is brought to you by 557 00:29:37,756 --> 00:29:41,036 Speaker 1: Pushkin Industries and by me, Doctor Laurie Santos.