Speaker 1: Pushkin. Happiness Lab listeners, we're still putting the finishing touches on our next season of shows, a whole series addressing situations in which I, the happiness professor, am flunking class. Our new season will be out on June third, but here's a quick sneak preview.

Speaker 2: Can you hear me now?

Speaker 1: I'm Doctor Laurie Santos, and I'm devoting the new season of my podcast, The Happiness Lab, to topics that are dear to my heart, with people dear to my heart, like my mom.

Speaker 3: Wait a minute, let me put the TV.

Speaker 1: I'll be finding out why I personally struggle so badly with perfectionism, stress, and even sitting still and doing nothing. But I feel like I'm bad at boredom because you're bad at boredom.

Speaker 2: Yeah. No, I didn't do well with doing nothing.

Speaker 1: And once I find out why these things affect me so badly, I'm hoping to do something about it. So join me on my journey, wherever you get your podcasts. So be sure to stay tuned.
Speaker 1: But today I wanted to share an amazing episode of The TED Interview podcast, which focuses on the work of my friend and close collaborator Liz Dunn, a psychology professor from the University of British Columbia. Liz has been a guest on The Happiness Lab a bunch, but in this episode, she'll take Chris Anderson from TED on a deep dive across her entire career, including some of her important work on how we can all become happier. I hope you like this episode, and be sure to check out more editions of The TED Interview show wherever you get your podcasts.

Speaker 3: Hello everyone, I'm Chris Anderson. Welcome to The TED Interview. This season, we're delving deep into the concept of generosity, which I've come to believe is the single most crucial idea that our modern world needs. It's an idea that I've spent the last three years thinking about and researching for a book titled Infectious Generosity. Now, today's episode is a special one for two reasons. First of all, I'm sitting in front of a live audience at TED twenty twenty-four.
Speaker 3: The audience is full of brilliant people, and they will be helping co-create this episode by asking questions partway through. And then secondly, our guest today: she has played a catalytic role in developing my thinking about generosity. Her name is Elizabeth Dunn. She's a social psychology professor at the University of British Columbia, and apart from delivering a beautiful TED Talk on the links between generosity and happiness, she also was involved in an experience which forced me to go ahead and write this book. Honestly, we will be talking about that shortly, but for now, please join me in welcoming Elizabeth Dunn. It's great to have you here.

Speaker 2: Oh, thank you for having me.

Speaker 3: Let's start by your sharing a little bit of your background and what drew you to the study of happiness, generosity, and related topics.

Speaker 4: Yeah.
Speaker 4: So, I have been studying happiness for over twenty years now as a social psychologist, and I got into studying generosity primarily because I wanted to understand what made people happy. And specifically, actually, when I got my first faculty job as a professor at the University of British Columbia, I started earning more money than I needed to stay alive for the first time in my life, and I was like, oh, I have disposable income, this is a novelty for me, you know. And so I wanted to understand how people could use money in order to enhance happiness, and I thought generosity might actually be a pretty good way to go. So we started running experiments, first small experiments just here in Vancouver, and then built it out from there. So that's sort of where I started. And then my research has taken me in all kinds of different directions to try to understand what really matters for human happiness.

Speaker 3: So a typical experiment would be: some students get given ten or twenty bucks and told to do different things with it.
Speaker 4: Yeah. So in the first experiment we ever ran on this topic, we sent our research assistants out on campus at UBC armed with cash. We gave them five- and twenty-dollar bills. We had them literally walk up and hand them to people, and we either told each individual to spend this money to benefit themselves or to benefit others. And then we called them at the end of the day, found out how their day had been, asked them to complete measures of happiness. And what we discovered was that people felt happier at the end of the day when they'd been assigned, basically by the flip of a coin, to spend this money on others.

Speaker 3: And do you think they actually did spend it on others?

Speaker 4: You know, they give us pretty detailed reports of, like, exactly what they did, so yeah, I'm pretty confident.

Speaker 3: Give us some more headlines of big findings that have happened over the last decade or two.

Speaker 4: So another big finding for us was to show that this effect wasn't just limited to Vancouver, to folks with a reasonable amount of disposable income.
Speaker 4: We wanted to try doing this in countries around the world. We explored this idea with the littlest humans, with toddlers, showing that even two-year-olds get joy from giving their resources to others. Toddlers don't really care about money, so we worked with toddler gold: specifically, goldfish crackers. And so, you know, we've explored this idea in a number of different ways. We've also looked at other aspects that matter for happiness. In a project that grew out of a collaboration at TED, we've been examining how to increase social connections by bringing together architects and software engineers to figure out how we can design physical spaces to bring people together in positive ways. So I've done all kinds of different things, but along the way, happiness has been sort of my north star.

Speaker 3: I mean, I think your team wrote a paper connected to a giant Gallup survey that asked more than two hundred thousand people across the world about their happiness levels. Can you say a bit about that?

Speaker 4: Yeah.
Speaker 4: So, taking advantage of really amazing data collected by Gallup, what we discovered was that around the world, people who had donated money to charity in the past month reported greater happiness than those who didn't. And this wasn't a small effect. In fact, when we really dug into the data, what we saw was that donating money to charity was basically equivalent to a doubling of household income in terms of its relationship with happiness.

Speaker 3: People work their whole year trying to justify it: maybe if I could just get twenty percent more income, that's what would make all the difference to me and my family. A doubling of income rarely happens to most people. It's like, it's not quite winning the lottery, but it's close. And yet, on the face of it, just committing to being generous to charities could actually deliver the same lift in happiness. That seems so implausible. Can that really be true?
Speaker 4: Well, first off, you know, in that particular study, it's important to recognize it's just a correlation, right? So people who give money to charity are happier than people who do not, right? And you can imagine that people who are giving to charity might differ in all kinds of ways. And we do our best as scientists to try to control for the obvious ones, like income, and that relationship still holds. But I wouldn't presume that it's entirely causal, that it's just about spending money on others leading to happiness. Now that said, it is still fascinating, you know, when we see this relationship emerge in experimental work as well. So we know there is this causal relationship, and yet it doesn't seem like people are always aware of it.

Speaker 3: So it's a bit like the riddle about marriage and happiness. Like, all the stats show that on average married people tend to be happier. But did the marriage cause it? Or is it that the happier people get married, because no one wants to marry those of us who... well, yeah.
Speaker 4: And you know, the really interesting thing with marriage is that the best longitudinal studies on this, where you follow the same people over time, suggest that people do get happier when they get married, but that boost in happiness lasts two years. Now, that's an average; individuals vary. So, like, if you went to your friend's wedding two years ago, you don't need to send a little, you know, condolence note. But certainly that's the sort of fundamental force that we as happiness researchers are working against, this thing called hedonic adaptation: whatever wonderful thing we have, we adapt to it. So it's super exciting to walk down the aisle; it's less exciting to walk down and see your spouse at the breakfast table.

Speaker 3: One thing I've learned about you, Liz, is that you have all the skepticism that you would hope for from a scientist. And recently you've been involved in work trying to address what's been this crisis in social science: that all these weird and wild experiments that were reported, some of them didn't replicate.
Speaker 3: Talk about the work you've been doing recently about that.

Speaker 4: Yeah, so this is really important, and I think it's something that many people who are interested in psychology and behavioral science don't realize: there's really been a revolution in behavioral science over the past decade, where we came to realize that some of the really neat, exciting effects that some of you might have heard about just don't replicate. And so my lab has been involved, along with many other people in my field, in trying to change that, trying to create a more replicable, reliable, robust science. And so one of the big things that we've been involved in doing is pre-registering our studies. So I'm going to get a little nerdy for a second, but I promise it's going to be worth it, because it's super important.
Speaker 4: So in science, sometimes it's like we throw a bunch of darts against the wall and then we draw the bullseye on afterward and say, hey, look, we found what we expected to find, right? And there's an obvious problem with that, because you don't know how many darts we actually shot. And so this is a problem not just for psychology, certainly not just for happiness research, but for really all of quantitative science. And so now what we're trying to do is pre-register our studies and say up front, we're going to call our shot ahead of time and say, this is what I'm going to do, this is how I'm going to test it, and I'm going to prove myself wrong if that shot doesn't land. And so one thing that I've been doing is going and trying to reassess the happiness literature in particular, to try to understand, you know, how good does it look when we apply these newer approaches. Another really key piece is using larger samples.
Speaker 4: So if you look at some of the older psychology studies out there, they'll often have a teeny tiny number of participants, and this can leave us with a situation where those effects don't actually replicate. You can get little weird, fluky findings when you only study small numbers of people. So increasingly in my field, we're using much larger sample sizes, and that gives us much more reliable conclusions.

Speaker 3: And your work found that actually quite a large percentage of some of the earlier work probably wasn't that valid.

Speaker 4: Right. I mean, we don't know for sure which studies would fail to replicate, but we looked at some of the most widely reported ideas about happiness in the media. So we looked at strategies like getting out in nature, practicing meditation and mindfulness, as well as exercise, very popular strategies for promoting happiness, and we found that ninety-five percent of those experiments really did not meet our current standards.
Speaker 4: It doesn't mean that those things don't work. Like, if you're super into meditation or exercise, I'm not saying, give those things up and just watch Netflix, because, whatever. There's good reasons to believe those strategies should work. But it's just remarkable that we don't actually have the kind of evidence that we would consider the modern gold standard to support them.

Speaker 3: I mean, that seems like really important social science to do. I mean, as someone who loves being in nature, I would really like to know that my impression that it brings happiness wasn't just my imagination.

Speaker 4: Right. And I mean, you may know, right? Like, one of the wonderful things about happiness is that you can experiment on yourself, right, and you can see, hey, does this make me happy? And that's great. So if you find being in nature makes you happy, then do it. But if we want to make broader recommendations and really push forward policies and stuff, we want to know what has the most reliable and the biggest effects for the most people.
Speaker 4: It turns out that if you do little things, like you say, oh, let me analyze the data a little bit differently, let me control for gender, these people seemed like they weren't really paying attention so I'm going to cut them from my study, then you can quickly actually create an effect where none really exists. And that is what we are trying to change, and I think where our field has come so far over the past decade.

Speaker 3: So, wearing your full skeptical hat: you and I had a conversation in twenty twenty about a crazy idea that happened because there was a donor in the tech community who wanted to give away a million dollars, and to do some social science in the process. Talk about what happened.

Speaker 4: This was like the coolest phone call I've ever gotten, where Chris called me and said, you know, I've got this interesting opportunity.
Speaker 4: Is this something that your lab would like to study, where we could potentially, I think initially, give ten thousand US dollars to one hundred people in multiple countries around the world and study what happened? And I was blown away. I was very excited. I was like, count me in. And then we got to talking more, and we thought it would be interesting to vary, you know, to have a couple of different conditions or groups, so give these groups some different instructions and be able to test what kind of difference that made. And so we thought, well, maybe we could take that hundred people and split it into two groups of fifty. And that's where I gave you, like, a boring lecture on statistics, and how, like, when you have fifty people per group, it's actually not necessarily enough to detect these effects in a really reliable way. So I said, is there any way we could have two groups of one hundred people? You know, would two million dollars be possible? And let me tell you, as somebody who, like, applies to academic funding agencies, I just expected the answer to be no.
Speaker 4: But I was like, ah, you might as well mention this, and you can...

Speaker 3: This is... the tech community is full of surprises. It really is full of surprises. But also, be careful what proposal you put to Liz Dunn. But so, yeah, it ended up being two hundred people getting ten thousand dollars each. That was the plan. How did you structure the experiment?

Speaker 4: Yeah, so we had one group of one hundred people who were told to share their participation in this experiment publicly. So they were told to share this with their social networks on Twitter (at the time; now X), and then we told the other one hundred to just keep this on the down-low. You know, you can tell a couple of close friends and family if you need to, but basically keep this news private. And so this allowed us to look at whether sharing this news publicly would make a difference for how people spent the money or how they felt about it.
Speaker 4: And also, in one of our early conversations, I said, you know what would be really great is if we could have a control group of people that don't get any money, and we could study them all along the way. And you were like, okay, Liz, yeah, sure, we can do that too. So we had this control group of folks that were chosen at random. So, after recruiting people and making sure they qualified, we randomly assigned people to either get this money and share it publicly, get this money and keep the news private, or not get any money but complete all of our surveys, and get a small fee for completing the surveys.

Speaker 3: And specifically, what they were told is that this money was up to them as to how they could spend it. First of all, when they were recruited (because I got to play a role in recruiting people), I put out a tweet saying, hey, do you want to participate in an interesting experiment? Could be stressful, could be potentially life-changing, will take quite a bit of time. What do you think? But with no mention of money.
313 00:15:35,556 --> 00:15:38,596 Speaker 3: So we got people not knowing really what they were 314 00:15:38,876 --> 00:15:41,036 Speaker 3: signing up for. And then when we told them they 315 00:15:41,076 --> 00:15:43,956 Speaker 3: were getting ten thousand dollars, like, some of them were like, oh, 316 00:15:44,036 --> 00:15:44,716 Speaker 3: it's a scam. 317 00:15:46,756 --> 00:15:48,756 Speaker 4: And I think that was one of the biggest challenges, right, 318 00:15:48,756 --> 00:15:50,836 Speaker 4: because if you go to the spam folder of your 319 00:15:50,836 --> 00:15:53,836 Speaker 4: inbox right now, you can probably find an offer like this, right, 320 00:15:54,116 --> 00:15:55,396 Speaker 4: So we had to I think one of the biggest 321 00:15:55,436 --> 00:15:58,116 Speaker 4: challenges was convincing people was real. And so you made 322 00:15:58,116 --> 00:16:00,276 Speaker 4: a great video of yourself, just like kind of a 323 00:16:00,316 --> 00:16:03,676 Speaker 4: low production video explaining to people what was happening. And 324 00:16:03,716 --> 00:16:05,996 Speaker 4: I think that was really important and kind of convincing 325 00:16:05,996 --> 00:16:08,636 Speaker 4: people like, no, for real, We're going to send you 326 00:16:08,676 --> 00:16:09,956 Speaker 4: ten thousand dollars The. 327 00:16:09,876 --> 00:16:11,836 Speaker 3: Only real strings attached then was that they just had 328 00:16:11,876 --> 00:16:13,636 Speaker 3: to report back to us how to fill out a 329 00:16:13,716 --> 00:16:15,756 Speaker 3: survey every few months. 330 00:16:15,436 --> 00:16:18,836 Speaker 4: Basically once yeah, every month for three months, and then 331 00:16:18,876 --> 00:16:19,276 Speaker 4: again at. 332 00:16:19,196 --> 00:16:19,996 Speaker 2: A six month mark. 333 00:16:20,156 --> 00:16:20,356 Speaker 5: Yeah. 334 00:16:20,596 --> 00:16:23,876 Speaker 3: Right. So what did we discover? 
335 00:16:24,636 --> 00:16:26,636 Speaker 4: Well, we learned a lot from this experiment, and I 336 00:16:26,716 --> 00:16:29,236 Speaker 4: just want to say one of the really beautiful things 337 00:16:29,276 --> 00:16:32,596 Speaker 4: about this whole experience was that we were able to 338 00:16:32,636 --> 00:16:34,876 Speaker 4: conduct this experiment at a scale that I never would 339 00:16:34,876 --> 00:16:38,516 Speaker 4: have thought was possible, and the donors were so committed 340 00:16:38,556 --> 00:16:41,516 Speaker 4: to doing the science really carefully too, and so we 341 00:16:41,516 --> 00:16:45,636 Speaker 4: were able to address several long standing fundamental questions in 342 00:16:45,716 --> 00:16:48,676 Speaker 4: behavioral science. And I think the one that was probably 343 00:16:48,676 --> 00:16:52,036 Speaker 4: the core question that we started with was how would 344 00:16:52,036 --> 00:16:54,316 Speaker 4: people spend this money and would they use it in 345 00:16:54,396 --> 00:16:57,516 Speaker 4: ways that benefited others? I think one of the remarkable 346 00:16:57,556 --> 00:17:01,396 Speaker 4: findings from this study was that people used the majority 347 00:17:01,476 --> 00:17:05,076 Speaker 4: of this money, over six thousand dollars of the ten 348 00:17:05,116 --> 00:17:08,756 Speaker 4: thousand dollars windfall, in ways that benefited other people. 349 00:17:09,316 --> 00:17:10,596 Speaker 2: Now, we were 350 00:17:10,396 --> 00:17:14,036 Speaker 4: defining benefiting other people in a broad way, as, you know, 351 00:17:14,396 --> 00:17:16,436 Speaker 4: any way that was, you know, it could be anything 352 00:17:16,436 --> 00:17:18,836 Speaker 4: from taking friends out for dinner to making a donation 353 00:17:18,956 --> 00:17:22,316 Speaker 4: to charity, to treating a loved one to something special.
354 00:17:22,836 --> 00:17:25,596 Speaker 4: So perhaps in that sense, it's not so surprising that 355 00:17:25,596 --> 00:17:27,676 Speaker 4: that number was big. But even when we look at 356 00:17:27,676 --> 00:17:31,196 Speaker 4: a much narrower definition, when we look just at charitable donations, 357 00:17:31,316 --> 00:17:34,276 Speaker 4: we still found that people spent almost seventeen hundred dollars 358 00:17:34,356 --> 00:17:36,396 Speaker 4: on average just on charitable donations. 359 00:17:36,876 --> 00:17:38,716 Speaker 2: And you know that that kind of blew me away. 360 00:17:39,196 --> 00:17:43,636 Speaker 3: In general, people, I think, on average, spend two or 361 00:17:43,636 --> 00:17:47,276 Speaker 3: three or four percent at most of their income on charity, 362 00:17:47,756 --> 00:17:51,636 Speaker 3: and that's a massive increase. So what's going on there? 363 00:17:51,796 --> 00:17:55,396 Speaker 4: Well, I mean, I think one important component here is 364 00:17:55,436 --> 00:17:58,156 Speaker 4: that it was a gift, right, So it wasn't like 365 00:17:58,196 --> 00:18:01,476 Speaker 4: people earned this money per se. It wasn't something that 366 00:18:01,476 --> 00:18:06,116 Speaker 4: they were expecting. It began with an act of generosity by 367 00:18:06,156 --> 00:18:09,396 Speaker 4: the donors. And I think that's still very interesting, right, 368 00:18:09,436 --> 00:18:12,476 Speaker 4: that one powerful act of generosity, and in some cases 369 00:18:12,516 --> 00:18:16,436 Speaker 4: the kinds of donations that people were making could themselves 370 00:18:16,516 --> 00:18:20,156 Speaker 4: create other opportunities, could have these incredible ripple effects into 371 00:18:20,196 --> 00:18:21,236 Speaker 4: some of the communities. 372 00:18:21,596 --> 00:18:23,436 Speaker 2: To me, that was one piece of it that was 373 00:18:23,476 --> 00:18:24,276 Speaker 2: just so powerful.
374 00:18:24,836 --> 00:18:27,796 Speaker 3: Was there a difference in the behavior between people who 375 00:18:28,596 --> 00:18:32,836 Speaker 3: were on social media proudly boasting of the amount that they 376 00:18:32,836 --> 00:18:36,476 Speaker 3: were giving away and those who were keeping it private? 377 00:18:36,596 --> 00:18:38,756 Speaker 4: Yeah, so we expected that there would be a difference. 378 00:18:38,796 --> 00:18:41,236 Speaker 4: We thought people who were sharing this publicly would spend 379 00:18:41,276 --> 00:18:44,156 Speaker 4: more on others. If everyone knows you've got this money 380 00:18:44,196 --> 00:18:47,316 Speaker 4: and whatever, you can imagine, you would spend it more generously. 381 00:18:47,636 --> 00:18:50,436 Speaker 4: We pre registered that prediction, so we said up front, Hey, 382 00:18:50,476 --> 00:18:51,636 Speaker 4: this is what we think we're going to find, this is 383 00:18:51,676 --> 00:18:52,556 Speaker 4: how we're going to test it. 384 00:18:52,876 --> 00:18:53,716 Speaker 2: And we were wrong. 385 00:18:54,196 --> 00:18:57,396 Speaker 4: So we found that it did not matter whether people 386 00:18:57,596 --> 00:19:01,396 Speaker 4: were publicly sharing this information or not. Either way, we found 387 00:19:01,396 --> 00:19:04,236 Speaker 4: these very high levels of generosity, which you could call 388 00:19:04,236 --> 00:19:07,076 Speaker 4: a field experiment, but I'd call, like, a beautiful testament 389 00:19:07,116 --> 00:19:11,036 Speaker 4: to humanity that, like, even when their spending choices are private, 390 00:19:11,396 --> 00:19:13,556 Speaker 4: they still chose to spend the majority of the money 391 00:19:13,596 --> 00:19:14,876 Speaker 4: in ways that benefited others. 392 00:19:15,396 --> 00:19:18,716 Speaker 3: So it really was, this was a response to generosity. 393 00:19:18,836 --> 00:19:23,436 Speaker 3: It wasn't an attempt to boost reputation, it seemed.
394 00:19:23,756 --> 00:19:25,836 Speaker 4: Yeah, I mean we can't. Of course, it's possible that 395 00:19:25,876 --> 00:19:28,036 Speaker 4: some people were, you know, taking friends out for dinner 396 00:19:28,076 --> 00:19:30,396 Speaker 4: in a way that can have reputational benefits. So 397 00:19:30,436 --> 00:19:34,436 Speaker 4: we can almost never erase reputation. And that's part of it. 398 00:19:34,476 --> 00:19:35,836 Speaker 4: And something that I love in your book is that 399 00:19:35,876 --> 00:19:38,196 Speaker 4: you say, you know, it's not necessarily bad if people 400 00:19:38,316 --> 00:19:42,676 Speaker 4: are engaging in generous behavior to enhance their reputations. Like, 401 00:19:42,676 --> 00:19:45,716 Speaker 4: what a wonderful feature of humanity, that that's a 402 00:19:45,756 --> 00:19:48,476 Speaker 4: way that we would want to enhance our reputation. 403 00:19:48,676 --> 00:19:51,556 Speaker 4: But yet we designed what we felt was a very 404 00:19:51,596 --> 00:19:56,196 Speaker 4: powerful manipulation of reputational concerns and it did not matter. 405 00:19:56,476 --> 00:19:57,516 Speaker 2: So that tells us, to me, 406 00:19:57,716 --> 00:20:00,796 Speaker 4: my takeaway from that is that, you know, perhaps reputational 407 00:20:00,836 --> 00:20:05,316 Speaker 4: concerns are not a necessary or driving force behind generosity. 408 00:20:05,676 --> 00:20:08,716 Speaker 3: And have you heard from other social scientists that they 409 00:20:08,836 --> 00:20:11,516 Speaker 3: view this as a contribution to the science?
410 00:20:11,676 --> 00:20:14,276 Speaker 4: Yeah, we've gotten an amazing response to this from the 411 00:20:14,316 --> 00:20:17,596 Speaker 4: scientific community, and I think because we had the opportunity 412 00:20:17,756 --> 00:20:21,196 Speaker 4: to scale this up, to have this diverse, worldwide sample, 413 00:20:21,596 --> 00:20:25,596 Speaker 4: to work with amounts of money that are deeply meaningful. 414 00:20:25,716 --> 00:20:29,716 Speaker 4: So for folks in lower income countries who participated, 415 00:20:30,236 --> 00:20:34,276 Speaker 4: we were essentially doubling their annual income with this gift. 416 00:20:34,356 --> 00:20:36,356 Speaker 2: So it was a huge amount of money. 417 00:20:36,556 --> 00:20:39,156 Speaker 4: In contrast, the kinds of experiments my lab has done 418 00:20:39,156 --> 00:20:41,476 Speaker 4: throughout my career are usually more on the order of 419 00:20:41,636 --> 00:20:44,356 Speaker 4: ten dollars, and so one might say, well, sure people 420 00:20:44,396 --> 00:20:46,076 Speaker 4: feel happy when they give away a few dollars, but 421 00:20:46,116 --> 00:20:46,876 Speaker 4: they don't really care. 422 00:20:47,036 --> 00:20:48,596 Speaker 2: And so this shows us, with 423 00:20:48,676 --> 00:20:51,116 Speaker 4: very large amounts of money, people are giving a lot 424 00:20:51,436 --> 00:20:53,476 Speaker 4: and, as we'll I'm sure talk about more, getting a 425 00:20:53,476 --> 00:20:54,076 Speaker 4: lot of joy from it. 426 00:20:55,196 --> 00:20:57,116 Speaker 3: So talk about the first paper you published. 427 00:20:57,476 --> 00:20:57,876 Speaker 2: Yeah, so. 428 00:20:59,356 --> 00:21:01,196 Speaker 4: Chris started out, of course, with this interest in how 429 00:21:01,276 --> 00:21:02,636 Speaker 4: generously are people going to spend the money? 430 00:21:02,636 --> 00:21:04,036 Speaker 2: And I wanted to know the answer to that too.
431 00:21:04,076 --> 00:21:05,836 Speaker 4: But I was like, this experiment can answer a lot 432 00:21:05,876 --> 00:21:08,356 Speaker 4: of fascinating questions. And one thing that I wanted to 433 00:21:08,396 --> 00:21:12,916 Speaker 4: know was just how much happiness two million dollars can buy 434 00:21:13,436 --> 00:21:17,276 Speaker 4: when it's distributed across a large, diverse group of people, 435 00:21:17,596 --> 00:21:21,436 Speaker 4: as opposed to concentrated in the hands of one affluent couple. 436 00:21:21,836 --> 00:21:25,516 Speaker 4: And so, in a paper that is entitled Wealth Redistribution 437 00:21:25,676 --> 00:21:31,316 Speaker 4: Promotes Happiness, we demonstrate and quantify just how much happiness 438 00:21:31,316 --> 00:21:33,756 Speaker 4: two million dollars can buy. And one of the things 439 00:21:33,796 --> 00:21:37,036 Speaker 4: that we demonstrate is that by giving this money away 440 00:21:37,156 --> 00:21:41,956 Speaker 4: to this diverse group of people, this couple provided two 441 00:21:42,036 --> 00:21:44,916 Speaker 4: hundred and twenty five times as much happiness as they 442 00:21:44,956 --> 00:21:47,716 Speaker 4: could possibly have found for themselves from this money. 443 00:21:48,036 --> 00:21:51,836 Speaker 3: And that's an incredible number, and wonderful. When you think 444 00:21:51,836 --> 00:21:54,236 Speaker 3: about it, in a way, it's kind of obvious. I mean, when 445 00:21:54,236 --> 00:21:56,956 Speaker 3: someone's rich, how much happier can a bit of extra 446 00:21:56,956 --> 00:22:00,876 Speaker 3: money actually make them? Not much? But when it's shared out, 447 00:22:00,956 --> 00:22:02,716 Speaker 3: we know that that can make a lot of difference 448 00:22:02,756 --> 00:22:04,636 Speaker 3: to a lot of people.
I've just been hit so 449 00:22:04,756 --> 00:22:08,156 Speaker 3: many times since thinking about this work and writing the 450 00:22:08,156 --> 00:22:10,756 Speaker 3: book by just how big a deal it is that 451 00:22:11,076 --> 00:22:16,076 Speaker 3: generosity is fundamentally asymmetric. This is a really, really exciting 452 00:22:16,116 --> 00:22:18,156 Speaker 3: feature, because we normally think of the world 453 00:22:18,196 --> 00:22:21,676 Speaker 3: in zero-sum terms. You know, the reason people don't 454 00:22:21,756 --> 00:22:24,036 Speaker 3: give is because of loss aversion: you don't want 455 00:22:24,076 --> 00:22:26,396 Speaker 3: to give it away, you lose it. It feels like, you know, 456 00:22:26,516 --> 00:22:29,956 Speaker 3: your gain, my loss, no thank you. But actually there 457 00:22:29,956 --> 00:22:33,636 Speaker 3: are so many circumstances when that is not the case. 458 00:22:33,996 --> 00:22:38,076 Speaker 3: There's so much inequality in the world, and that money 459 00:22:38,156 --> 00:22:41,316 Speaker 3: is sitting there, and the cost to the richer person 460 00:22:41,396 --> 00:22:44,636 Speaker 3: literally is pretty low. And in fact, you could argue, 461 00:22:44,756 --> 00:22:46,836 Speaker 3: and in fact some of the data you've 462 00:22:46,836 --> 00:22:48,956 Speaker 3: shown is that the actual act of generosity doesn't cost 463 00:22:48,996 --> 00:22:52,596 Speaker 3: them any happiness either; it actually boosts their happiness. And 464 00:22:52,636 --> 00:22:55,676 Speaker 3: so it's this kind of thinking which really gave 465 00:22:55,716 --> 00:22:58,756 Speaker 3: me no choice but to write this book. And if 466 00:22:58,796 --> 00:23:03,156 Speaker 3: there was a culture of generosity, everyone would benefit in the 467 00:23:03,196 --> 00:23:05,956 Speaker 3: most amazing way, and we're all connected, and so it 468 00:23:05,956 --> 00:23:08,196 Speaker 3: should be easier to do this than ever.
So, so, 469 00:23:08,756 --> 00:23:11,916 Speaker 3: how on earth is it that we're actually experiencing being 470 00:23:11,916 --> 00:23:13,036 Speaker 3: in a world that's getting meaner? 471 00:23:13,756 --> 00:23:15,436 Speaker 4: Well, you know, I don't know if we are living 472 00:23:15,476 --> 00:23:18,196 Speaker 4: in a world that's getting meaner. I think that, you know, 473 00:23:18,276 --> 00:23:20,316 Speaker 4: we have that perception, but I think one thing that 474 00:23:20,356 --> 00:23:23,156 Speaker 4: we've been seeing already just this week at TED so 475 00:23:23,276 --> 00:23:25,996 Speaker 4: far is how much of that may be an illusion 476 00:23:26,036 --> 00:23:28,516 Speaker 4: and how much good really is happening. And that's one 477 00:23:28,516 --> 00:23:30,716 Speaker 4: thing that I love about this experiment is that it 478 00:23:30,756 --> 00:23:34,316 Speaker 4: really highlights, like when we did the science very carefully 479 00:23:34,876 --> 00:23:39,356 Speaker 4: and tracked exactly what happened, you know, we were discovering 480 00:23:39,396 --> 00:23:43,396 Speaker 4: that people were spending this money in ways that benefited others, 481 00:23:43,476 --> 00:23:44,236 Speaker 4: paying this 482 00:23:44,396 --> 00:23:45,956 Speaker 2: act of generosity forward. 483 00:23:46,556 --> 00:23:50,436 Speaker 4: And so, actually, especially after doing this experiment, I 484 00:23:50,476 --> 00:23:55,476 Speaker 4: feel pretty good about humanity's potential for goodness. 485 00:23:55,596 --> 00:23:58,516 Speaker 3: So part of the riddle is that we are filtering 486 00:23:58,996 --> 00:24:01,396 Speaker 3: the stories we tell each other about humanity in a 487 00:24:01,396 --> 00:24:05,036 Speaker 3: way that's very destructive.
Social media algorithms are probably helping 488 00:24:05,556 --> 00:24:08,436 Speaker 3: with this, and when you actually pull the camera 489 00:24:08,516 --> 00:24:12,036 Speaker 3: back and look for actual data about what's happening in 490 00:24:12,076 --> 00:24:14,996 Speaker 3: the world, a really different picture can emerge. And certainly, 491 00:24:15,116 --> 00:24:19,196 Speaker 3: you know, in doing research for this book, my brilliant 492 00:24:19,196 --> 00:24:22,116 Speaker 3: researcher is actually here, Kate. Kate Honey spent a couple 493 00:24:22,116 --> 00:24:24,996 Speaker 3: of years researching and looking for stories under the radar, 494 00:24:25,036 --> 00:24:30,116 Speaker 3: and there are so many stories of people doing kind 495 00:24:30,196 --> 00:24:35,116 Speaker 3: things that create these amazing ripple effects. Are they on 496 00:24:35,156 --> 00:24:37,836 Speaker 3: your news feed? They are not. And so it's tragic, 497 00:24:37,876 --> 00:24:41,476 Speaker 3: because there's a saying, we are shaped by the stories 498 00:24:41,836 --> 00:24:43,716 Speaker 3: we tell ourselves. There's evidence for this. 499 00:24:43,836 --> 00:24:45,876 Speaker 4: Yes, that's right. And I think it can also be 500 00:24:45,996 --> 00:24:49,436 Speaker 4: a kind of self reinforcing cycle where if we feel 501 00:24:49,476 --> 00:24:52,476 Speaker 4: like the world is this terrible place where nobody's doing 502 00:24:52,556 --> 00:24:55,036 Speaker 4: anything good, then why should I step up and do 503 00:24:55,116 --> 00:25:01,156 Speaker 4: something positive? And so I think flipping that script is 504 00:25:01,196 --> 00:25:09,396 Speaker 4: incredibly important.
505 00:25:14,756 --> 00:25:17,036 Speaker 3: Well, one of the joyful things was that after the 506 00:25:17,076 --> 00:25:19,356 Speaker 3: experiment was done and the science was sort of locked down, 507 00:25:19,836 --> 00:25:22,036 Speaker 3: I got to write to some of the people who'd 508 00:25:22,036 --> 00:25:24,956 Speaker 3: participated, because they'd told us, you know, what they'd done, 509 00:25:25,356 --> 00:25:28,916 Speaker 3: and I got to ask them, why did you do that? 510 00:25:29,396 --> 00:25:34,356 Speaker 3: And it was an amazingly consistent picture that came back. 511 00:25:34,396 --> 00:25:37,636 Speaker 3: People often used the same language. A typical thing that 512 00:25:37,676 --> 00:25:42,676 Speaker 3: people said, and this surprised me, was, I felt seen, 513 00:25:43,356 --> 00:25:45,396 Speaker 3: you know, the act, this is weird to me, like, 514 00:25:45,716 --> 00:25:48,956 Speaker 3: being given ten thousand dollars by a stranger on the internet. 515 00:25:48,956 --> 00:25:52,436 Speaker 3: They felt, I felt seen, and I felt like I 516 00:25:53,636 --> 00:25:56,356 Speaker 3: needed to let other people feel seen the way I 517 00:25:56,396 --> 00:26:00,436 Speaker 3: had felt seen. So this was quite a powerful, you know, 518 00:26:00,516 --> 00:26:03,996 Speaker 3: biological thing that really surprised me. But several of them said, 519 00:26:04,036 --> 00:26:06,036 Speaker 3: if I'd won this money in a lottery, I wouldn't 520 00:26:06,036 --> 00:26:06,916 Speaker 3: have behaved this way. 521 00:26:07,236 --> 00:26:09,916 Speaker 4: Well, and that's a really interesting hypothesis, right, So is 522 00:26:09,956 --> 00:26:12,396 Speaker 4: it the case, you know, that they wouldn't have been 523 00:26:12,396 --> 00:26:15,516 Speaker 4: so generous had we described the source of the money differently?
524 00:26:15,516 --> 00:26:17,956 Speaker 4: And I think you and I had a conversation after 525 00:26:17,996 --> 00:26:20,436 Speaker 4: we'd collected the data, after we'd run the experiment, and 526 00:26:20,476 --> 00:26:22,836 Speaker 4: you said, I wish we'd, you know, had this other 527 00:26:22,836 --> 00:26:24,836 Speaker 4: condition where we told people the money was, you know, 528 00:26:24,876 --> 00:26:26,676 Speaker 4: from a lottery or if they'd earned it or something. 529 00:26:26,836 --> 00:26:28,556 Speaker 2: And I was like, welcome to being a scientist. 530 00:26:28,716 --> 00:26:30,956 Speaker 4: This is the experience you always have after you run 531 00:26:31,036 --> 00:26:32,076 Speaker 4: the experiment, you wish 532 00:26:31,956 --> 00:26:32,556 Speaker 2: you want the next thing. 533 00:26:32,596 --> 00:26:35,356 Speaker 4: But that's also the beautiful thing about science, that 534 00:26:35,596 --> 00:26:38,996 Speaker 4: each question we try to answer propels us toward 535 00:26:39,036 --> 00:26:40,996 Speaker 4: the next one. And I think that's a really, really 536 00:26:40,996 --> 00:26:43,436 Speaker 4: fascinating one that, you know, if anyone wants to donate 537 00:26:43,476 --> 00:26:46,556 Speaker 4: two million dollars, I'd love to pursue. 538 00:26:46,876 --> 00:26:50,916 Speaker 3: But there was, there's a third paper you're working on. 539 00:26:51,196 --> 00:26:52,036 Speaker 3: Tell us about that one. 540 00:26:52,196 --> 00:26:54,636 Speaker 4: Yeah. So in the paper that we're currently writing up, 541 00:26:54,636 --> 00:26:56,316 Speaker 4: so you all are among the first to hear about it, 542 00:26:57,116 --> 00:27:00,596 Speaker 4: we were looking at the choices that people made about 543 00:27:00,596 --> 00:27:02,836 Speaker 4: how to spend the money in terms of the implications 544 00:27:02,836 --> 00:27:07,556 Speaker 4: for their own happiness.
So we painstakingly coded each and 545 00:27:07,596 --> 00:27:11,196 Speaker 4: every spending description to assess how people had spent the money. 546 00:27:11,196 --> 00:27:15,396 Speaker 4: So we placed each purchase into seventeen different possible categories, 547 00:27:15,556 --> 00:27:20,556 Speaker 4: ranging from paying for utility bills, buying durable goods, making 548 00:27:20,636 --> 00:27:23,316 Speaker 4: purchases that would save you time, all kinds of different things. 549 00:27:23,596 --> 00:27:26,236 Speaker 4: And then we looked at how much happiness each purchase 550 00:27:26,436 --> 00:27:30,316 Speaker 4: provided participants. And so what we discovered is that out 551 00:27:30,316 --> 00:27:34,796 Speaker 4: of all seventeen categories, the type of spending that provided 552 00:27:34,836 --> 00:27:39,116 Speaker 4: the highest level of happiness was, in fact, making charitable donations, 553 00:27:39,756 --> 00:27:42,636 Speaker 4: so it confirmed what we had seen in previous research, 554 00:27:42,716 --> 00:27:46,236 Speaker 4: but with much larger spending amounts. If you're curious what 555 00:27:46,316 --> 00:27:50,076 Speaker 4: some of the others were: second place was buying experiences, 556 00:27:50,516 --> 00:27:53,436 Speaker 4: so, you know, going on trips, going out for special meals. 557 00:27:53,476 --> 00:27:56,916 Speaker 4: These also provided a lot of happiness. One that hasn't 558 00:27:56,916 --> 00:28:00,156 Speaker 4: popped up in previous research, but that we discovered in 559 00:28:00,196 --> 00:28:03,116 Speaker 4: this study produced particularly high levels of happiness, was spending 560 00:28:03,116 --> 00:28:03,876 Speaker 4: money on education. 561 00:28:03,956 --> 00:28:06,356 Speaker 2: And as a professor, I like that one.
562 00:28:06,996 --> 00:28:11,116 Speaker 3: The one thing I'm puzzled by is that, I mean, 563 00:28:11,116 --> 00:28:15,196 Speaker 3: on one level, it makes sense why being generous would 564 00:28:15,236 --> 00:28:18,596 Speaker 3: bring happiness with it. If you think from an evolutionary perspective, 565 00:28:19,236 --> 00:28:21,356 Speaker 3: you would want there to be a reward for cooperative 566 00:28:21,396 --> 00:28:25,436 Speaker 3: behavior, because there's obviously, you know, the belief is that 567 00:28:25,596 --> 00:28:28,636 Speaker 3: species that learn to cooperate benefit from it, and 568 00:28:28,676 --> 00:28:32,556 Speaker 3: so you need that reward to do it. And yet, 569 00:28:32,676 --> 00:28:36,676 Speaker 3: unlike with other types of happiness, it doesn't seem to 570 00:28:36,676 --> 00:28:40,596 Speaker 3: be advertised. So if I'm hungry, I know that if 571 00:28:40,676 --> 00:28:43,596 Speaker 3: I eat, I will feel better, and you can say 572 00:28:43,636 --> 00:28:46,196 Speaker 3: the same for most other sort of obvious sort of 573 00:28:46,196 --> 00:28:50,596 Speaker 3: biological urges. But on this one, I think a lot 574 00:28:50,676 --> 00:28:53,756 Speaker 3: of people don't know that they're going to be happy. 575 00:28:54,276 --> 00:28:57,356 Speaker 3: It becomes a matter of sort of wisdom passed down 576 00:28:57,436 --> 00:28:59,436 Speaker 3: by the elders in your community or something like that, 577 00:28:59,676 --> 00:29:03,476 Speaker 3: or something that you eventually discover. Why do you think 578 00:29:03,756 --> 00:29:05,316 Speaker 3: that is hidden? 579 00:29:05,796 --> 00:29:08,756 Speaker 4: Yeah, I mean, I think it's actually surprisingly difficult for 580 00:29:08,796 --> 00:29:11,596 Speaker 4: people to figure out what makes them happy. 581 00:29:12,436 --> 00:29:13,036 Speaker 2: For one thing.
582 00:29:13,116 --> 00:29:16,236 Speaker 4: You know, we can notice that we feel happy after, 583 00:29:16,836 --> 00:29:20,236 Speaker 4: you know, helping a friend, engaging in some act of generosity, 584 00:29:20,476 --> 00:29:22,556 Speaker 4: but we might say, oh, it's because, you know, my 585 00:29:22,596 --> 00:29:24,716 Speaker 4: friend was in this particular situation and I'm glad that 586 00:29:24,716 --> 00:29:25,876 Speaker 4: I was able to help her out with that. But 587 00:29:25,916 --> 00:29:29,076 Speaker 4: we don't necessarily make the right broader inference of, like, 588 00:29:29,156 --> 00:29:31,916 Speaker 4: this is about generosity as a whole. And I think 589 00:29:31,956 --> 00:29:33,756 Speaker 4: there's also just a lot of smoke screens in our 590 00:29:33,796 --> 00:29:37,316 Speaker 4: society where that's not necessarily the message that we hear. 591 00:29:37,396 --> 00:29:41,116 Speaker 4: Although it's interesting, because I have gotten, you know, letters 592 00:29:41,156 --> 00:29:43,596 Speaker 4: from people saying, why did we need you as a 593 00:29:43,596 --> 00:29:45,916 Speaker 4: social scientist to tell us that giving makes us happy? 594 00:29:45,956 --> 00:29:47,756 Speaker 4: This is something that is, you know, taught to us 595 00:29:47,756 --> 00:29:51,516 Speaker 4: in religious traditions, by grandparents and so forth. So those 596 00:29:51,556 --> 00:29:54,236 Speaker 4: messages are out there. But I think it's really interesting, 597 00:29:54,276 --> 00:29:57,116 Speaker 4: the point you make about, you know, feeling hungry, because 598 00:29:57,436 --> 00:29:59,476 Speaker 4: I think it's when people are in the moment that 599 00:29:59,516 --> 00:30:02,276 Speaker 4: they don't necessarily realize, oh, the best thing 600 00:30:02,316 --> 00:30:04,076 Speaker 4: that I could do with this money is use it 601 00:30:04,116 --> 00:30:06,476 Speaker 4: to benefit somebody else.
And in fact, we've run studies 602 00:30:06,596 --> 00:30:09,156 Speaker 4: where we say, hey, we can give you money and 603 00:30:09,236 --> 00:30:12,116 Speaker 4: want you to tell us what would make you the happiest, 604 00:30:12,116 --> 00:30:15,116 Speaker 4: and they don't tend to think of using it to 605 00:30:15,156 --> 00:30:17,516 Speaker 4: benefit others. So there does seem to be something about 606 00:30:17,596 --> 00:30:20,796 Speaker 4: money in particular that puts people in this mindset of 607 00:30:20,876 --> 00:30:24,036 Speaker 4: looking out for themselves, and so that may actually 608 00:30:24,076 --> 00:30:26,956 Speaker 4: serve to distract us from this broader knowledge that we 609 00:30:27,036 --> 00:30:29,516 Speaker 4: have maybe picked up from important people and traditions in 610 00:30:29,556 --> 00:30:32,796 Speaker 4: our life, that may get lost when we're faced with that, 611 00:30:32,956 --> 00:30:34,196 Speaker 4: you know, wallet. 612 00:30:34,596 --> 00:30:41,556 Speaker 3: In your wonderful TED Talk, you said that generosity doesn't 613 00:30:41,556 --> 00:30:45,636 Speaker 3: always bring happiness. What matters is how it's done, and 614 00:30:45,676 --> 00:30:50,836 Speaker 3: that generosity that is done where there is direct contact, 615 00:30:50,876 --> 00:30:53,596 Speaker 3: for example, with a person, when you can feel or 616 00:30:53,716 --> 00:30:57,596 Speaker 3: see the human impact of it, works a lot better 617 00:30:57,636 --> 00:31:00,836 Speaker 3: in terms of bringing happiness. Could you say a bit 618 00:31:00,876 --> 00:31:01,396 Speaker 3: more about that? 619 00:31:01,796 --> 00:31:04,116 Speaker 4: Yeah. So I would say there's sort of three key 620 00:31:04,276 --> 00:31:08,196 Speaker 4: ingredients that are pretty essential in turning generosity into happiness.
621 00:31:08,356 --> 00:31:11,836 Speaker 4: So one is feeling a sense of connection, ideally with, 622 00:31:11,916 --> 00:31:14,076 Speaker 4: you know, the individuals or the causes you are helping, 623 00:31:14,116 --> 00:31:17,476 Speaker 4: having that actual contact. In my case, for example, we were 624 00:31:17,516 --> 00:31:20,756 Speaker 4: able to privately sponsor a family of Syrian refugees to 625 00:31:20,756 --> 00:31:22,716 Speaker 4: come to Vancouver, and we were literally picking them up 626 00:31:22,716 --> 00:31:24,716 Speaker 4: at the airport and hugging them, and so that's like 627 00:31:24,756 --> 00:31:26,876 Speaker 4: the ultimate form of contact. But even if you're a 628 00:31:26,876 --> 00:31:28,956 Speaker 4: little bit more removed from that, feeling that sense of 629 00:31:28,956 --> 00:31:31,956 Speaker 4: connection is critically important, and I think a lot of 630 00:31:32,036 --> 00:31:34,996 Speaker 4: charitable giving opportunities don't offer that, and that there's this 631 00:31:35,036 --> 00:31:37,876 Speaker 4: beautiful space for innovation and figuring out how to create that, 632 00:31:37,916 --> 00:31:40,716 Speaker 4: particularly when we're giving to people far away. So connection 633 00:31:40,836 --> 00:31:43,716 Speaker 4: is probably the number one. But also impact matters, so 634 00:31:43,796 --> 00:31:47,756 Speaker 4: being able to really understand or at least vividly imagine 635 00:31:47,836 --> 00:31:50,076 Speaker 4: how your generosity is making a difference. 636 00:31:50,516 --> 00:31:52,836 Speaker 2: And finally, choice matters. So feeling that 637 00:31:52,756 --> 00:31:55,756 Speaker 4: you have a sense of choice, of autonomy, of agency.
638 00:31:56,276 --> 00:31:58,956 Speaker 4: There's no better way to rob people of the joy 639 00:31:58,956 --> 00:32:00,876 Speaker 4: of giving than to back them into a corner and 640 00:32:00,876 --> 00:32:02,836 Speaker 4: make them feel like they've been forced to give. And 641 00:32:02,876 --> 00:32:05,316 Speaker 4: so I think, you know, we want to keep these 642 00:32:05,436 --> 00:32:08,596 Speaker 4: ingredients in mind, because ideally we are building a future 643 00:32:09,156 --> 00:32:12,036 Speaker 4: that is filled with opportunities to give in these joyful 644 00:32:12,076 --> 00:32:15,316 Speaker 4: ways that involve, you know, connection, impact, and choice. 645 00:32:15,356 --> 00:32:18,236 Speaker 3: So help connect the dots here, because that makes a 646 00:32:18,236 --> 00:32:21,836 Speaker 3: lot of sense to me, but it also seems to 647 00:32:22,116 --> 00:32:26,236 Speaker 3: contradict another piece of advice that feels important, which is 648 00:32:26,236 --> 00:32:29,516 Speaker 3: that we want people to bring their minds to their giving. 649 00:32:29,956 --> 00:32:32,756 Speaker 3: So much of, you know, life is this battle between 650 00:32:32,796 --> 00:32:38,236 Speaker 3: our instincts and then our reflective selves. And Paul Bloom, 651 00:32:38,236 --> 00:32:40,516 Speaker 3: who's been on this podcast series, wrote a book 652 00:32:40,676 --> 00:32:43,316 Speaker 3: called Against Empathy where he was arguing that, you know, 653 00:32:43,396 --> 00:32:45,956 Speaker 3: we have these powerful feelings of empathy. You see someone suffering, 654 00:32:45,956 --> 00:32:49,036 Speaker 3: you want to help them, and quite possibly that does 655 00:32:49,076 --> 00:32:52,996 Speaker 3: bring with it the most happiness, but it may distract us 656 00:32:52,996 --> 00:32:57,236 Speaker 3: from the wisest spending of money.
For so many people, their 657 00:32:57,316 --> 00:33:00,756 Speaker 3: charitable lives are limited to, oh, I saw some need 658 00:33:00,796 --> 00:33:03,996 Speaker 3: on TV of some disaster, and so I'll text the 659 00:33:04,036 --> 00:33:06,996 Speaker 3: money there and then you forget about it. And part 660 00:33:06,996 --> 00:33:09,756 Speaker 3: of what we want to argue for is for people 661 00:33:09,796 --> 00:33:13,596 Speaker 3: to be reflective and almost strategic about their giving so 662 00:33:13,636 --> 00:33:17,956 Speaker 3: that they can spend the money wisely. Now that may 663 00:33:18,036 --> 00:33:21,036 Speaker 3: mean that that doesn't make them as happy, because you're 664 00:33:21,076 --> 00:33:25,276 Speaker 3: not triggering those instincts, our human instinctive selves. How 665 00:33:25,356 --> 00:33:28,276 Speaker 3: might we bridge and get the best of both worlds here? 666 00:33:28,516 --> 00:33:30,716 Speaker 4: Yeah, I think the key to bridging those worlds is 667 00:33:30,756 --> 00:33:34,276 Speaker 4: to focus on impact, because obviously impact is what matters 668 00:33:34,276 --> 00:33:37,356 Speaker 4: when we're kind of thinking with our heads. But we 669 00:33:37,436 --> 00:33:39,796 Speaker 4: also see in our research that people get a lot 670 00:33:39,876 --> 00:33:42,676 Speaker 4: more joy from giving when they can see or vividly 671 00:33:42,716 --> 00:33:46,596 Speaker 4: imagine the impact that they're having. And so I think 672 00:33:46,876 --> 00:33:50,996 Speaker 4: making it possible for charities that really do have a 673 00:33:51,436 --> 00:33:56,196 Speaker 4: genuine impact, bringing that out in a way that potential 674 00:33:56,236 --> 00:34:00,356 Speaker 4: donors can see it, understand it, vividly experience it, I 675 00:34:00,356 --> 00:34:02,836 Speaker 4: think that is an opportunity where our heads and our 676 00:34:02,916 --> 00:34:06,556 Speaker 4: hearts meet and can center around impact.
That said, you 677 00:34:06,636 --> 00:34:10,276 Speaker 4: mentioned, you know, is this the wisest use 678 00:34:10,116 --> 00:34:10,756 Speaker 2: of your money? 679 00:34:10,836 --> 00:34:15,476 Speaker 4: And my favorite part of your book was arguing that 680 00:34:16,316 --> 00:34:17,876 Speaker 4: maybe that's not the right question. 681 00:34:18,036 --> 00:34:19,636 Speaker 2: Maybe we shouldn't be trying 682 00:34:19,396 --> 00:34:24,756 Speaker 4: for the wisest use of our charitable donation money or 683 00:34:24,836 --> 00:34:28,236 Speaker 4: our pro-social spending, because that puts us in a trap. 684 00:34:28,276 --> 00:34:30,396 Speaker 4: And I've experienced this myself, where I'm like, I have 685 00:34:30,436 --> 00:34:34,316 Speaker 4: to find the best charity doing X, right, and then 686 00:34:34,476 --> 00:34:37,236 Speaker 4: I don't get around to donating, or I just get 687 00:34:37,236 --> 00:34:39,196 Speaker 4: in my head about it. And so I love your 688 00:34:39,236 --> 00:34:41,876 Speaker 4: book for sort of freeing people to say, is this 689 00:34:41,996 --> 00:34:44,596 Speaker 4: a good use of the money? And if the answer 690 00:34:44,636 --> 00:34:49,196 Speaker 4: is yes, then maybe greenlight yourself to go ahead and donate. 691 00:34:49,676 --> 00:34:52,076 Speaker 3: Yes, I like that. We have to really know ourselves 692 00:34:52,716 --> 00:34:55,916 Speaker 3: and figure out, one, how to avoid that trap. Two, 693 00:34:56,356 --> 00:34:59,316 Speaker 3: how to feed some of those human instincts. So, say 694 00:34:59,356 --> 00:35:04,556 Speaker 3: you discover, make an intellectual conclusion, that an organization is 695 00:35:05,116 --> 00:35:09,996 Speaker 3: a wise one to support, don't just stop there. Give 696 00:35:10,076 --> 00:35:13,316 Speaker 3: your humanness a chance to actually see the impact.
So 697 00:35:13,516 --> 00:35:17,516 Speaker 3: the statistics coming out of that group won't be enough, 698 00:35:18,076 --> 00:35:22,516 Speaker 3: but maybe if you actually can meet and get to 699 00:35:22,556 --> 00:35:26,476 Speaker 3: know some of the stories that are actually behind those statistics, 700 00:35:26,636 --> 00:35:29,036 Speaker 3: and perhaps join a community of other people who are 701 00:35:29,036 --> 00:35:31,956 Speaker 3: supporting them, that is the kind of thing that can 702 00:35:32,116 --> 00:35:34,676 Speaker 3: carry you on and turn a sort of short-term 703 00:35:34,676 --> 00:35:37,436 Speaker 3: intellectual decision that's smart into an emotional thing that actually 704 00:35:37,676 --> 00:35:39,676 Speaker 3: brings with it joy and habit-making. 705 00:35:40,036 --> 00:35:44,196 Speaker 4: Yeah, and I think for folks working in the nonprofit world, 706 00:35:44,476 --> 00:35:48,276 Speaker 4: contemplating not just how can we get donors to give 707 00:35:48,356 --> 00:35:51,076 Speaker 4: more money, but how can we make the experience of 708 00:35:51,116 --> 00:35:55,676 Speaker 4: donation more joyful is potentially a way to tackle that 709 00:35:55,756 --> 00:35:57,676 Speaker 4: question from a different perspective. 710 00:35:57,796 --> 00:36:01,436 Speaker 3: Well, okay, so, TED audience, this is your turn now. 711 00:36:01,596 --> 00:36:04,796 Speaker 3: If you have a question for Liz, please raise your hand. 712 00:36:04,876 --> 00:36:07,396 Speaker 3: A microphone will come to you and we're going to 713 00:36:07,396 --> 00:36:10,396 Speaker 3: get through as many as we can. Hi. 714 00:36:10,516 --> 00:36:11,196 Speaker 2: Yes, Sarah. 715 00:36:11,276 --> 00:36:14,156 Speaker 6: So I'm wondering, is there a direct correlation between the 716 00:36:14,196 --> 00:36:17,036 Speaker 6: amount of happiness and the amount of generosity?
Do you 717 00:36:17,076 --> 00:36:19,356 Speaker 6: get so much more happiness out of giving a 718 00:36:19,396 --> 00:36:22,276 Speaker 6: million versus out of giving a thousand? And what does 719 00:36:22,316 --> 00:36:23,556 Speaker 6: that sort of ratio look like? 720 00:36:23,716 --> 00:36:28,156 Speaker 4: This is actually a surprisingly difficult question to answer. Certainly 721 00:36:28,196 --> 00:36:31,636 Speaker 4: there is a relationship, and in fact there is. 722 00:36:31,636 --> 00:36:34,996 Speaker 4: What we see in the data from the Mystery experiment 723 00:36:35,276 --> 00:36:38,036 Speaker 4: is that just in general, people get more happiness from 724 00:36:38,036 --> 00:36:41,156 Speaker 4: more expensive stuff. So, like, the more money they spent 725 00:36:41,356 --> 00:36:44,036 Speaker 4: on a purchase, the higher they tend to 726 00:36:44,116 --> 00:36:46,396 Speaker 4: rate it in terms of happiness. Not perfectly, but certainly 727 00:36:46,436 --> 00:36:47,516 Speaker 4: that relationship is there. 728 00:36:48,236 --> 00:36:49,836 Speaker 2: So I would say, 729 00:36:49,636 --> 00:36:54,796 Speaker 4: overall, larger amounts of money spent charitably do provide more happiness. 730 00:36:54,916 --> 00:36:57,836 Speaker 4: And yet it is not at all like a perfect 731 00:36:57,956 --> 00:37:01,796 Speaker 4: or super strong relationship, because people can also spend a 732 00:37:01,876 --> 00:37:04,396 Speaker 4: pretty small amount of money but do so in a 733 00:37:04,436 --> 00:37:07,996 Speaker 4: way that provides a ton of connection, a lot of 734 00:37:08,036 --> 00:37:10,156 Speaker 4: impact, and that feels really chosen, and that can 735 00:37:10,196 --> 00:37:12,636 Speaker 4: deliver a big boost in terms of happiness. So it's 736 00:37:12,636 --> 00:37:16,716 Speaker 4: certainly not just like a dollar-for-dollar kind of relationship.
737 00:37:16,716 --> 00:37:18,436 Speaker 4: And I will say too, I just want to mention, 738 00:37:18,796 --> 00:37:22,236 Speaker 4: there's more to discover in these data. And we have 739 00:37:22,756 --> 00:37:24,916 Speaker 4: done a lot of work on our side to make 740 00:37:24,956 --> 00:37:25,836 Speaker 4: these data 741 00:37:26,156 --> 00:37:28,716 Speaker 2: accessible to researchers. So researchers need to come to us. 742 00:37:28,956 --> 00:37:31,356 Speaker 4: We'll check their credentials and everything, but we think there 743 00:37:31,356 --> 00:37:33,996 Speaker 4: are unanswered questions. So if you are a scientist 744 00:37:34,156 --> 00:37:36,196 Speaker 4: or a curious philanthropist and want to try to figure 745 00:37:36,196 --> 00:37:38,676 Speaker 4: something out, one thing we've talked about is trying to 746 00:37:38,716 --> 00:37:41,316 Speaker 4: make these data as much of a gift to the 747 00:37:41,356 --> 00:37:44,876 Speaker 4: scientific community as we can by allowing people to use 748 00:37:44,916 --> 00:37:46,636 Speaker 4: the data to answer their own questions. 749 00:37:47,196 --> 00:37:49,916 Speaker 5: So, I'm Veronique, thank you for this. This is fascinating. 750 00:37:50,276 --> 00:37:52,396 Speaker 5: So I work in the US, but I live in 751 00:37:52,436 --> 00:37:56,316 Speaker 5: Europe, and my question is to what extent does culture 752 00:37:56,436 --> 00:38:00,356 Speaker 5: impact how generous we are, because what I've seen is 753 00:38:00,356 --> 00:38:03,396 Speaker 5: that in the US it's a very giving culture when 754 00:38:03,436 --> 00:38:07,396 Speaker 5: it comes to writing a check, supporting your local charity, 755 00:38:08,156 --> 00:38:11,796 Speaker 5: donating your time. In Europe it's very, very different, and 756 00:38:12,396 --> 00:38:16,636 Speaker 5: it's really hard to find local charities, not even for children.
757 00:38:16,876 --> 00:38:20,836 Speaker 5: I tried to put something together for my daughter's school, 758 00:38:20,916 --> 00:38:23,956 Speaker 5: and everybody was so shocked, you know, and I think 759 00:38:23,996 --> 00:38:28,116 Speaker 5: they even doubted my intentions. It was really crazy. So 760 00:38:28,276 --> 00:38:30,316 Speaker 5: I had a thought, and I'd love to know what 761 00:38:30,396 --> 00:38:33,916 Speaker 5: you think. So in America we pay fewer taxes, really, 762 00:38:34,116 --> 00:38:37,836 Speaker 5: and there isn't this social safety net. So maybe 763 00:38:38,396 --> 00:38:40,756 Speaker 5: we have to do this because we have to fill 764 00:38:40,796 --> 00:38:43,236 Speaker 5: the gaps, and maybe in Europe that's not the case 765 00:38:43,236 --> 00:38:45,836 Speaker 5: and we pay a lot of taxes. That's a theory, 766 00:38:45,876 --> 00:38:48,036 Speaker 5: but I think it's probably more complex than that. 767 00:38:48,156 --> 00:38:49,316 Speaker 3: So what are your thoughts? 768 00:38:49,396 --> 00:38:54,116 Speaker 4: I mean, certainly we see big cultural differences. When you look, 769 00:38:54,316 --> 00:38:57,476 Speaker 4: for example, at the Gallup World data that Chris mentioned, 770 00:38:57,676 --> 00:39:00,956 Speaker 4: there are substantial differences between different countries in terms of 771 00:39:00,996 --> 00:39:03,436 Speaker 4: how much people give to charity. I would say the 772 00:39:03,476 --> 00:39:06,916 Speaker 4: fascinating thing in this particular experiment was that, you know, 773 00:39:06,916 --> 00:39:10,956 Speaker 4: we had people from three lower income countries, so Kenya, Indonesia, 774 00:39:10,996 --> 00:39:15,636 Speaker 4: and Brazil, and four higher income countries, the US, the UK, Australia, 775 00:39:15,676 --> 00:39:18,436 Speaker 4: and Canada, and we didn't have enough people within each 776 00:39:18,476 --> 00:39:22,156 Speaker 4: country to treat each country separately.
But we compared the 777 00:39:22,196 --> 00:39:24,556 Speaker 4: lower income countries with the higher income countries, and we 778 00:39:24,956 --> 00:39:28,036 Speaker 4: expected that perhaps people in the higher income countries would 779 00:39:28,116 --> 00:39:30,036 Speaker 4: spend more money on others because they have a lot 780 00:39:30,076 --> 00:39:32,876 Speaker 4: more disposable income on average, but we didn't find that. 781 00:39:32,916 --> 00:39:35,796 Speaker 4: We actually found people in the lower income countries spent 782 00:39:35,916 --> 00:39:38,716 Speaker 4: just as much on others compared to those in the 783 00:39:38,796 --> 00:39:39,716 Speaker 4: higher income countries. 784 00:39:40,276 --> 00:39:42,596 Speaker 7: First of all, let me say I'm one hundred percent 785 00:39:42,676 --> 00:39:44,956 Speaker 7: on giving. I'm a philanthropist. I worked with my local 786 00:39:44,956 --> 00:39:49,036 Speaker 7: community foundations. The first challenging question I have is this: 787 00:39:49,196 --> 00:39:52,436 Speaker 7: since you're measuring happiness, I want to focus on that 788 00:39:52,556 --> 00:39:55,756 Speaker 7: word, because I want to talk about happiness versus joy. 789 00:39:56,316 --> 00:39:58,596 Speaker 7: I noticed that you use those two terms, I guess, 790 00:39:58,596 --> 00:40:01,196 Speaker 7: interchangeably too, and I'm going to ask you how you 791 00:40:01,276 --> 00:40:03,636 Speaker 7: define happiness in order to measure happiness. 792 00:40:03,836 --> 00:40:05,676 Speaker 4: Thank you for asking, ma'am, because it is helpful to 793 00:40:05,716 --> 00:40:10,876 Speaker 4: clarify the definition. So we define happiness as subjective 794 00:40:10,916 --> 00:40:13,276 Speaker 4: well-being. That's like the technical jargon-y term if 795 00:40:13,276 --> 00:40:17,196 Speaker 4: you want to google research in this area. And broadly in
796 00:40:17,196 --> 00:40:19,836 Speaker 2: my field of social psychology, that is 797 00:40:19,876 --> 00:40:23,716 Speaker 4: the dominant way that we think about happiness. And so 798 00:40:23,876 --> 00:40:27,036 Speaker 4: subjective well-being has three core components. So we 799 00:40:27,156 --> 00:40:30,796 Speaker 4: have positive affect, and that can include feelings like joy, 800 00:40:30,836 --> 00:40:33,516 Speaker 4: which is sort of very central. It's like capturing 801 00:40:33,876 --> 00:40:38,116 Speaker 4: sort of the core elements of just feeling good, basically. 802 00:40:38,436 --> 00:40:41,756 Speaker 4: We have negative affect, and so we're looking, you know, 803 00:40:42,036 --> 00:40:46,316 Speaker 4: just to be clear, even happy people experience negative emotions. 804 00:40:46,356 --> 00:40:49,276 Speaker 4: Negative emotions are healthy and they're good; they're part of 805 00:40:49,316 --> 00:40:50,636 Speaker 4: who we're meant to 806 00:40:50,556 --> 00:40:51,436 Speaker 2: be as a species. 807 00:40:51,636 --> 00:40:54,076 Speaker 4: But, you know, very happy people tend to experience a 808 00:40:54,116 --> 00:40:57,036 Speaker 4: lot more positive emotion than negative emotion on a typical day. 809 00:40:57,436 --> 00:41:00,356 Speaker 4: And then the third component, which is really important, is 810 00:41:00,436 --> 00:41:04,196 Speaker 4: life satisfaction, and that is a more cognitive, evaluative, 811 00:41:04,276 --> 00:41:07,276 Speaker 4: more reflective judgment of, like, am I leading the kind 812 00:41:07,316 --> 00:41:10,316 Speaker 4: of life that I want to have?
And the remarkable 813 00:41:10,316 --> 00:41:14,076 Speaker 4: thing about this experiment is that we saw substantial changes 814 00:41:14,236 --> 00:41:18,276 Speaker 4: not only in positive emotions and negative emotions, but also 815 00:41:18,436 --> 00:41:21,556 Speaker 4: in life satisfaction. And in fact, when my lab reviewed 816 00:41:21,676 --> 00:41:24,956 Speaker 4: all of the preregistered experiments that have ever been conducted 817 00:41:25,116 --> 00:41:29,036 Speaker 4: on happiness, we found that this experiment had the largest 818 00:41:29,076 --> 00:41:31,876 Speaker 4: impact on life satisfaction that's ever been found. 819 00:41:31,996 --> 00:41:33,636 Speaker 3: So I think what the question is getting at is 820 00:41:33,676 --> 00:41:35,276 Speaker 3: that we want to feel that there's a difference between 821 00:41:35,276 --> 00:41:40,556 Speaker 3: the temporary pleasure of eating strawberry ice cream versus sort 822 00:41:40,596 --> 00:41:44,156 Speaker 3: of the deeper life satisfaction or joy that can 823 00:41:44,196 --> 00:41:46,876 Speaker 3: come from giving. And I mean, does science support 824 00:41:46,916 --> 00:41:49,516 Speaker 3: the idea that the form of happiness that you 825 00:41:49,556 --> 00:41:52,036 Speaker 3: get from generosity, you know, goes to that deeper happiness? 826 00:41:52,076 --> 00:41:54,916 Speaker 3: That it's longer lasting than just the sort of temporary positive 827 00:41:54,956 --> 00:41:56,316 Speaker 3: affect that you might otherwise get? 828 00:41:56,516 --> 00:41:58,836 Speaker 4: Yeah, I mean, we see it in this work and 829 00:41:59,156 --> 00:42:01,996 Speaker 4: in other studies. We see that the effects of generosity 830 00:42:02,076 --> 00:42:04,676 Speaker 4: are pretty broad and robust.
So we see them both 831 00:42:04,716 --> 00:42:08,196 Speaker 4: in terms of this immediate increase in positive mood, but 832 00:42:08,516 --> 00:42:12,476 Speaker 4: also over time, this seems to result in actual changes 833 00:42:12,516 --> 00:42:14,836 Speaker 4: in people's satisfaction with their lives. 834 00:42:15,316 --> 00:42:18,156 Speaker 8: When I'm giving, should I be giving a dollar a 835 00:42:18,276 --> 00:42:22,076 Speaker 8: day to get my happiness, or quarterly? What's a cadence? 836 00:42:22,316 --> 00:42:24,076 Speaker 8: And then should I be paying attention to the percentage 837 00:42:24,116 --> 00:42:26,756 Speaker 8: of my income or the percentage of the receiver's income? I 838 00:42:26,716 --> 00:42:29,116 Speaker 4: would say, again, it's not so much about the exact 839 00:42:29,196 --> 00:42:31,756 Speaker 4: number of dollars, but I would say looking for opportunities 840 00:42:31,796 --> 00:42:34,396 Speaker 4: to give where you really feel a sense of the impact. 841 00:42:35,156 --> 00:42:38,516 Speaker 3: In the book, I suggest to people who are well 842 00:42:38,516 --> 00:42:41,236 Speaker 3: off that we could do worse than look at what 843 00:42:41,316 --> 00:42:44,876 Speaker 3: the religious traditions and the expectations are, which in 844 00:42:45,316 --> 00:42:48,996 Speaker 3: Christianity and Judaism are ten percent of income and in 845 00:42:49,636 --> 00:42:53,036 Speaker 3: Islam it's two and a half percent of net worth annually. 846 00:42:53,116 --> 00:42:55,596 Speaker 3: And I say, if you really want to embrace the 847 00:42:55,636 --> 00:42:58,236 Speaker 3: notion, as secular people, and many of us here at 848 00:42:58,236 --> 00:43:01,876 Speaker 3: TED are probably secular people, do we want our moral standards to 849 00:43:01,876 --> 00:43:03,716 Speaker 3: be at least as high as those of the religions?
850 00:43:04,076 --> 00:43:07,396 Speaker 3: If so, there's an argument that you should try and 851 00:43:07,436 --> 00:43:09,796 Speaker 3: get to the position where you can commit to the 852 00:43:09,916 --> 00:43:12,836 Speaker 3: higher of those two standards, ten percent of income or 853 00:43:12,876 --> 00:43:15,316 Speaker 3: two and a half percent of net worth. What amazed me 854 00:43:15,356 --> 00:43:16,956 Speaker 3: in doing the book is that if you accept that, 855 00:43:17,036 --> 00:43:19,636 Speaker 3: embrace that, and do the math on what that would raise, 856 00:43:20,436 --> 00:43:22,436 Speaker 3: it would transform the world and we could switch the 857 00:43:22,476 --> 00:43:26,036 Speaker 3: conversation around philanthropy from being this slightly awkward thing to 858 00:43:26,116 --> 00:43:30,356 Speaker 3: being one of thrilling imagination and possibility. 859 00:43:30,716 --> 00:43:34,356 Speaker 9: Is there a difference in happiness experience between giving away 860 00:43:34,436 --> 00:43:37,436 Speaker 9: your money and giving away your time? And do you 861 00:43:37,716 --> 00:43:40,636 Speaker 9: get exponentially greater happiness by giving away both at the 862 00:43:40,676 --> 00:43:41,196 Speaker 9: same time? 863 00:43:42,316 --> 00:43:43,516 Speaker 2: Yeah, fascinating question. 864 00:43:44,316 --> 00:43:46,436 Speaker 4: You know, I can't think of an experiment that has 865 00:43:46,556 --> 00:43:51,316 Speaker 4: directly contrasted those, but there's really strong, robust evidence that 866 00:43:51,676 --> 00:43:54,756 Speaker 4: does meet these kinds of gold standards of modern behavioral 867 00:43:54,836 --> 00:43:59,196 Speaker 4: science showing that using money to benefit others promotes happiness. 868 00:43:59,196 --> 00:44:03,676 Speaker 4: And interestingly, the research on volunteering, for example, is one 869 00:44:03,676 --> 00:44:08,196 Speaker 4: way of giving time.
Strangely, that literature hasn't produced the 870 00:44:08,236 --> 00:44:11,276 Speaker 4: strongest results, and I'm curious about why that is, and 871 00:44:11,636 --> 00:44:14,636 Speaker 4: I would love to see more large-scale work on 872 00:44:14,636 --> 00:44:17,156 Speaker 4: that topic. But I especially love your insight about bringing 873 00:44:17,156 --> 00:44:19,316 Speaker 4: the time and the money together, in part because I 874 00:44:19,356 --> 00:44:22,036 Speaker 4: think putting in some of the time can maybe unleash 875 00:44:22,116 --> 00:44:23,156 Speaker 4: the benefits of the money. 876 00:44:23,196 --> 00:44:24,276 Speaker 2: And in my TED 877 00:44:24,116 --> 00:44:27,476 Speaker 4: Talk I described a local charity just down the street 878 00:44:27,476 --> 00:44:31,436 Speaker 4: from here, where folks in Vancouver will get together, donate 879 00:44:31,516 --> 00:44:35,116 Speaker 4: money to this organization, and then you go and you 880 00:44:35,596 --> 00:44:38,476 Speaker 4: make dinner for people on the Downtown Eastside, and 881 00:44:38,516 --> 00:44:40,796 Speaker 4: you get to meet them, talk to them, and then 882 00:44:40,836 --> 00:44:43,316 Speaker 4: the money doesn't just buy them dinner. It also helps 883 00:44:43,356 --> 00:44:45,756 Speaker 4: to deal with the food security problem more broadly by 884 00:44:45,756 --> 00:44:48,036 Speaker 4: providing lunches throughout the week. And so that I think 885 00:44:48,076 --> 00:44:50,996 Speaker 4: is a beautiful model of a program where it bridges 886 00:44:51,036 --> 00:44:53,756 Speaker 4: the money and the time in a way that creates genuine, 887 00:44:53,756 --> 00:44:56,356 Speaker 4: meaningful connection for both donors and recipients. 888 00:44:56,756 --> 00:45:00,196 Speaker 10: Hi. I just wondered, is generosity really a function 889 00:45:00,276 --> 00:45:04,396 Speaker 10: of feeling needed, because one feels better after giving?
So 890 00:45:04,636 --> 00:45:07,556 Speaker 10: is it for my own satisfaction that I'm 891 00:45:07,596 --> 00:45:10,596 Speaker 10: doing it, or is happiness just a label for it? 892 00:45:11,596 --> 00:45:13,716 Speaker 4: I love the answer you give to this question in 893 00:45:13,716 --> 00:45:14,836 Speaker 4: the book, so I think you should. 894 00:45:15,956 --> 00:45:23,316 Speaker 3: So I have been dismayed at how the conversation around generosity, 895 00:45:23,596 --> 00:45:27,876 Speaker 3: especially in terms of philanthropy, is happening in the modern culture, 896 00:45:27,956 --> 00:45:32,556 Speaker 3: where it feels like every opportunity is taken to poke 897 00:45:32,636 --> 00:45:36,436 Speaker 3: at people and to criticize and snipe at decisions of generosity, 898 00:45:36,716 --> 00:45:40,996 Speaker 3: of saying, oh, there's mixed motivation here, oh, they're only 899 00:45:40,996 --> 00:45:43,396 Speaker 3: doing it to make themselves feel good or to boost 900 00:45:43,516 --> 00:45:46,956 Speaker 3: their reputation, or, oh, couldn't they have given more? Or, 901 00:45:47,036 --> 00:45:48,996 Speaker 3: oh, how did they make that money in the first place? 902 00:45:49,236 --> 00:45:52,716 Speaker 3: All these things are said, and I think it's toxic. 903 00:45:52,756 --> 00:45:56,356 Speaker 3: I think generosity has actually always... there's no such thing 904 00:45:56,396 --> 00:45:59,556 Speaker 3: as pure generosity. I think the philosopher Immanuel Kant is 905 00:45:59,596 --> 00:46:02,636 Speaker 3: wrong on this. Even when I was brought up, it 906 00:46:02,716 --> 00:46:05,676 Speaker 3: was give and you shall receive. You know, even our 907 00:46:05,756 --> 00:46:10,076 Speaker 3: parents had to apply these other incentives to give. 908 00:46:10,556 --> 00:46:13,156 Speaker 3: And I think as a philosophy student, I used to 909 00:46:13,236 --> 00:46:16,676 Speaker 3: agonize over this.
It was like, but I give to 910 00:46:16,796 --> 00:46:19,956 Speaker 3: satisfy my conscience. But it feels good to satisfy my conscience. 911 00:46:20,436 --> 00:46:23,556 Speaker 3: And so is that generosity? Well, yes, it darn well 912 00:46:23,596 --> 00:46:26,716 Speaker 3: is generosity. And so I think we should not look 913 00:46:26,796 --> 00:46:30,356 Speaker 3: for reasons to ding generosity. We should look for reasons 914 00:46:30,756 --> 00:46:33,316 Speaker 3: to celebrate it. And in the connected age there are 915 00:46:33,316 --> 00:46:36,516 Speaker 3: more reasons than ever why. You know, generosity can spread. 916 00:46:36,556 --> 00:46:39,236 Speaker 3: It can change how you're regarded. It can introduce your 917 00:46:39,276 --> 00:46:41,516 Speaker 3: work to thousands of other people who may want to 918 00:46:41,556 --> 00:46:45,396 Speaker 3: work with you. It can enhance your reputation. It can 919 00:46:45,436 --> 00:46:48,316 Speaker 3: bring you happiness. It will bring you happiness. Just as 920 00:46:48,316 --> 00:46:52,156 Speaker 3: it's hard to decide to go and work out, but 921 00:46:52,276 --> 00:46:54,716 Speaker 3: you know that long term, you know, afterwards you'll feel 922 00:46:54,756 --> 00:46:58,076 Speaker 3: good about it, this is in the same category. It's 923 00:46:58,116 --> 00:47:00,316 Speaker 3: hard to do, but we should celebrate it even though 924 00:47:00,316 --> 00:47:03,356 Speaker 3: we know that there are rewards to the giver. And 925 00:47:03,396 --> 00:47:05,396 Speaker 3: so yeah, I've got a chapter in the book called 926 00:47:05,396 --> 00:47:09,916 Speaker 3: imperfect generosity. Generosity is the classic case in which the perfect 927 00:47:09,916 --> 00:47:12,196 Speaker 3: becomes the enemy of the good. Let's not do that, 928 00:47:12,756 --> 00:47:14,436 Speaker 3: and that way we'll have a lot more generosity in 929 00:47:14,476 --> 00:47:14,836 Speaker 3: the world.
930 00:47:19,276 --> 00:47:22,276 Speaker 11: Religiously, in the different traditions, I mean, there's the idea of 931 00:47:22,396 --> 00:47:25,676 Speaker 11: having a sincere heart. Or is the action itself good enough? 932 00:47:25,716 --> 00:47:28,396 Speaker 11: And I'm curious from the study, can I go in 933 00:47:28,436 --> 00:47:32,036 Speaker 11: with really bad motivations and still get the happiness effect, 934 00:47:32,316 --> 00:47:33,156 Speaker 11: or does it change me? 935 00:47:33,276 --> 00:47:33,876 Speaker 3: I'm curious. 936 00:47:36,716 --> 00:47:39,196 Speaker 4: Well, we didn't ask people if they had bad motivations 937 00:47:39,636 --> 00:47:41,916 Speaker 4: going in, so yeah, I don't know. 938 00:47:41,996 --> 00:47:45,636 Speaker 3: What do you think, Chris? I mean, look, define bad motivations. 939 00:47:45,796 --> 00:47:49,316 Speaker 3: There's definitely a level of cynicism, which is something I 940 00:47:49,316 --> 00:47:53,596 Speaker 3: guess you can't claim is generous. But I think if 941 00:47:53,636 --> 00:47:58,276 Speaker 3: you see someone who needs something and you decide you 942 00:47:58,276 --> 00:48:02,876 Speaker 3: would like to meet that need, there's enough good motivation 943 00:48:02,996 --> 00:48:05,436 Speaker 3: in there for me to celebrate that act. 944 00:48:05,716 --> 00:48:08,796 Speaker 4: Yeah, I mean, certainly we do see that overall people 945 00:48:08,956 --> 00:48:11,396 Speaker 4: are getting the highest levels of happiness from what we 946 00:48:11,476 --> 00:48:13,676 Speaker 4: might call, you know, the purest form of giving, of 947 00:48:14,236 --> 00:48:15,476 Speaker 4: making charitable donations. 948 00:48:15,876 --> 00:48:18,036 Speaker 2: But also, you know, there's this interesting little finding.
949 00:48:18,076 --> 00:48:19,276 Speaker 4: I don't want to make too much of it because 950 00:48:19,276 --> 00:48:23,196 Speaker 4: we didn't expect it, it's totally exploratory, but we saw that 951 00:48:23,396 --> 00:48:25,996 Speaker 4: in the public condition, where people had to share the 952 00:48:26,036 --> 00:48:28,196 Speaker 4: decisions they were making along the way with this money, 953 00:48:28,556 --> 00:48:30,996 Speaker 4: we actually saw those folks getting a little bit less 954 00:48:31,036 --> 00:48:34,716 Speaker 4: happiness from their charitable donations compared to those who were 955 00:48:34,796 --> 00:48:37,436 Speaker 4: keeping it private. So it actually suggests that when you're 956 00:48:37,436 --> 00:48:39,956 Speaker 4: trying to be a little showy about this, it might detract. 957 00:48:39,996 --> 00:48:42,476 Speaker 2: Now, it doesn't mean we could never do this in 958 00:48:42,436 --> 00:48:44,716 Speaker 4: a way that would work, where we could both share 959 00:48:44,756 --> 00:48:47,236 Speaker 4: it and feel happy about it. But maybe it speaks 960 00:48:47,276 --> 00:48:50,556 Speaker 4: to the idea that generosity isn't all about just looking 961 00:48:50,556 --> 00:48:53,156 Speaker 4: good to other people, and that maybe when we're doing 962 00:48:53,196 --> 00:48:55,356 Speaker 4: it in these more private ways, it can feel great. 963 00:48:56,236 --> 00:48:56,396 Speaker 2: Oh. 964 00:48:56,436 --> 00:49:00,156 Speaker 12: Hi, I'm Marla. I'm curious about the intake process for 965 00:49:00,276 --> 00:49:06,396 Speaker 12: your research. Did you track previous generosity, previous charitable donations, 966 00:49:06,396 --> 00:49:09,356 Speaker 12: et cetera? And I'm also wondering if you plan to 967 00:49:09,436 --> 00:49:13,596 Speaker 12: track, moving forward, whether they continue to be generous, and 968 00:49:13,876 --> 00:49:15,276 Speaker 12: if so, how.
969 00:49:15,276 --> 00:49:18,956 Speaker 4: We did not track people's previous, you know, charitable donations or 970 00:49:18,996 --> 00:49:21,996 Speaker 4: other spending choices. We did ask them, you know, just 971 00:49:22,076 --> 00:49:23,876 Speaker 4: as I think kind of a point of interest, we 972 00:49:23,916 --> 00:49:25,796 Speaker 4: did ask them some questions to make sure it would 973 00:49:25,796 --> 00:49:27,996 Speaker 4: be safe for them to be in this study. So 974 00:49:28,036 --> 00:49:30,476 Speaker 4: we did do a pretty careful intake process where we asked, 975 00:49:30,556 --> 00:49:32,796 Speaker 4: you know, we'd like to tell you about some wacky 976 00:49:32,836 --> 00:49:35,476 Speaker 4: things that could happen to you. Would any of these 977 00:49:35,556 --> 00:49:39,516 Speaker 4: cause you danger or serious distress? And so, like, having 978 00:49:39,556 --> 00:49:42,116 Speaker 4: a movie star show up on your doorstep, getting ten 979 00:49:42,116 --> 00:49:44,596 Speaker 4: thousand dollars out of the blue, you know, all of 980 00:49:44,596 --> 00:49:46,956 Speaker 4: these things. And so we did not include people who 981 00:49:46,996 --> 00:49:49,076 Speaker 4: told us that it could be a danger to them, 982 00:49:49,076 --> 00:49:52,036 Speaker 4: but we didn't assess, you know, their previous giving. I 983 00:49:52,036 --> 00:49:54,676 Speaker 4: would love to know how this changes people in the 984 00:49:54,716 --> 00:49:57,396 Speaker 4: future and follow up with them, see, you know, how 985 00:49:57,436 --> 00:50:00,796 Speaker 4: happy they are, what choices they've made down the road. 986 00:50:00,836 --> 00:50:03,196 Speaker 4: I think that would be a wonderful, fascinating thing to do.
987 00:50:03,676 --> 00:50:06,796 Speaker 3: Okay, Liz, do you have any final thought you'd like 988 00:50:06,876 --> 00:50:10,516 Speaker 3: to share from the mystery experiment or just in general, 989 00:50:10,676 --> 00:50:13,356 Speaker 3: something from your work that you wish was more widely known? 990 00:50:13,876 --> 00:50:16,996 Speaker 4: Yeah, I mean, I think this experiment, along with a 991 00:50:17,076 --> 00:50:20,516 Speaker 4: growing body of research, has really dealt the final death 992 00:50:20,556 --> 00:50:25,676 Speaker 4: blow to the notion of Homo economicus as this self-interested creature, 993 00:50:26,116 --> 00:50:28,516 Speaker 4: and it is time to leave that vision behind. 994 00:50:28,716 --> 00:50:30,516 Speaker 2: And I think that that is very freeing. 995 00:50:31,396 --> 00:50:33,316 Speaker 4: I'll also just leave you with one other stat that 996 00:50:33,316 --> 00:50:37,196 Speaker 4: didn't come up, which is that we found that people 997 00:50:37,676 --> 00:50:41,476 Speaker 4: in lower income countries got three times the happiness boost 998 00:50:41,596 --> 00:50:44,676 Speaker 4: from this money as those in higher income countries. So again, 999 00:50:44,756 --> 00:50:46,916 Speaker 4: in terms of an asymmetry, it suggests how we can 1000 00:50:46,956 --> 00:50:47,556 Speaker 4: really make 1001 00:50:47,436 --> 00:50:49,036 Speaker 2: the most of our money. 1002 00:50:49,436 --> 00:50:53,196 Speaker 4: That said, we found that there were detectable benefits for 1003 00:50:53,476 --> 00:50:57,436 Speaker 4: individuals making up to one hundred and twenty-three thousand 1004 00:50:57,436 --> 00:51:00,796 Speaker 4: dollars per year.
Ninety-nine percent of the world's population 1005 00:51:01,076 --> 00:51:03,316 Speaker 4: makes less than that, and so I do think it 1006 00:51:03,396 --> 00:51:08,716 Speaker 4: speaks to the incredible potential power of redistribution of wealth 1007 00:51:08,796 --> 00:51:11,276 Speaker 4: to create more happiness as we move into the future. 1008 00:51:13,036 --> 00:51:25,716 Speaker 3: Elizabeth Dunn, thank you so much. That was spectacular. Thank you. Okay, well, 1009 00:51:25,796 --> 00:51:28,316 Speaker 3: that's all for today. A reminder that if you'd like 1010 00:51:28,356 --> 00:51:31,636 Speaker 3: to dig deeper into this conversation about the power of generosity, 1011 00:51:31,956 --> 00:51:36,196 Speaker 3: please consider reading my book Infectious Generosity or listening to it. 1012 00:51:36,756 --> 00:51:39,876 Speaker 3: Thanks to the incredible generosity of a donor in the 1013 00:51:39,876 --> 00:51:42,996 Speaker 3: TED community, you can claim a free copy of the 1014 00:51:42,996 --> 00:51:47,756 Speaker 3: book by heading to Ted dot com slash Generosity. Next 1015 00:51:47,756 --> 00:51:51,316 Speaker 3: week is our final episode of this season. We'll be 1016 00:51:51,356 --> 00:51:55,516 Speaker 3: speaking with a visionary in the world of philanthropy, Natalie Cargill, 1017 00:51:55,756 --> 00:52:00,076 Speaker 3: whose work we actually just referenced here with Liz, about 1018 00:52:00,356 --> 00:52:03,516 Speaker 3: the potential for truly big-scale philanthropy. She's on a 1019 00:52:03,556 --> 00:52:07,556 Speaker 3: fascinating and essential mission to replace pessimism about the world's 1020 00:52:07,556 --> 00:52:11,036 Speaker 3: biggest problems with plans for actually solving them.
And if 1021 00:52:11,076 --> 00:52:13,956 Speaker 3: you're keen to start your own generosity journey but not 1022 00:52:13,956 --> 00:52:17,156 Speaker 3: sure where to start, I would love you to check 1023 00:52:17,156 --> 00:52:20,836 Speaker 3: out a new tool that we've created called Tig. Tig 1024 00:52:20,916 --> 00:52:24,076 Speaker 3: is an AI assistant that can help you brainstorm ideas 1025 00:52:24,116 --> 00:52:26,596 Speaker 3: for what you can do with a little generosity and 1026 00:52:26,676 --> 00:52:29,836 Speaker 3: creativity in your own life or community. It's actually really 1027 00:52:30,076 --> 00:52:32,716 Speaker 3: fun to play with, and you can find Tig at 1028 00:52:32,956 --> 00:52:36,956 Speaker 3: infectious generosity dot org. The Ted Interview is part of 1029 00:52:36,996 --> 00:52:40,196 Speaker 3: the Ted Audio Collective, a collection of podcasts dedicated to 1030 00:52:40,236 --> 00:52:44,476 Speaker 3: sparking curiosity and sharing ideas that matter. This episode was 1031 00:52:44,516 --> 00:52:47,636 Speaker 3: produced by Jess Shane; our team, who are there at 1032 00:52:47,636 --> 00:52:53,036 Speaker 3: the back, includes Constanza Gaiado, Grace Rubinstein, Van van Cheng, 1033 00:52:53,396 --> 00:52:58,476 Speaker 3: Michelle Quint, Roxanne Hai Lash, and Danielle Ballereso. This show 1034 00:52:58,556 --> 00:53:01,756 Speaker 3: is mixed by Sarah Bruguer, and it was co-created 1035 00:53:02,196 --> 00:53:10,076 Speaker 3: by you, our amazing live audience here in Vancouver. All right, 1036 00:53:10,116 --> 00:53:11,956 Speaker 3: if you like this show, please do share it with 1037 00:53:11,996 --> 00:53:15,396 Speaker 3: others wherever you can. Thanks so much for listening. Until 1038 00:53:15,436 --> 00:53:15,836 Speaker 3: next week.