Pushkin. Hey, Happiness Lab listeners. Today is Giving Tuesday, a day where you're supposed to think more about charitable giving. Being generous can give a big boost to our well-being. That said, I don't normally make donations to a good cause just to make myself feel happier, but I do sometimes wonder if I'm getting the most bang for my buck, both in terms of how much good my gift is doing in the world and in terms of how much of a happiness boost I'm getting in return. The reason why we're spending money on charity, let's say, is because we really want to help. We want to do some good, and very few of us would say, well, I want to do good, but not that much good. This is my friend and colleague Josh Greene, a professor of psychology at Harvard University, where he studies how our brains make moral decisions. But before I launched into this episode with Josh, he and I had a little catching up to do. Josh, you don't know this, but you were actually featured in a recent episode of this show, on fun and moments of peak fun in my life. And my goodness.
But do you remember our eighties sing-along that we did? We were singing Bon Jovi. It was so funny. Yeah, it's like I wasn't drinking, but it probably seemed like I was. For this Giving Tuesday bonus episode of The Happiness Lab, I spoke with Josh about what we should keep in mind if we want our charitable gifts to be as effective as possible. So, Josh, we're about to enter the holiday season, which can be a super stressful time for people. But ironically, one of the best ways that we can give ourselves a little bit of self-care over the holidays is to think about doing for others. And so talk a little bit about some of the psychological benefits we get from doing nice stuff for other people. Well, you know, this is work that I think you know much better than I do. But what research suggests, and I'm thinking here primarily of research by Elizabeth Dunn and Mike Norton, is that people really misperceive this: if you just say, here's a hundred dollars, go out and make yourself happy, and then you check in later with them, the people who spent it on other people are happier.
It seems like using your resources to do something nice for somebody else, maybe especially something unexpected, really gives you a boost. It makes you feel connected to other people. And so this is like the first thing we get wrong when it comes to donating and happiness: just that donating can make us happy. But there's a second thing our mind gets wrong, which is something that you've focused on, which is that even when we decide to give to a charity, we often don't do it in the best way for our well-being, because not all ways of helping others seem to work the same, right? Yeah. So people in the last ten years, especially an organization called GiveWell, have been doing this research to try to figure out how you can do the most good with your dollars. The most effective charities in the world are orders of magnitude more effective than typical charities. And when I say effectiveness, I'm talking about how many lives you can save per dollar, or per thousand or ten thousand dollars, or how much can you improve people's lives?
And this is not an easy thing to measure, but it can be done. But the charities that do the most good per dollar are not the ones that we may feel the most immediate connection to. Right. A lot of people, and I feel this too, right? You know, my wife and I, we want to support our local schools and the Greater Boston Food Bank and fighting for racial justice and equity in the criminal justice system, etc. It's hard to do randomized controlled experiments to figure out what works and what doesn't, but that doesn't mean it's not worth supporting. So there are things that I think all of us feel very strongly drawn to personally, but that are not necessarily the things that the evidence shows have the highest impact, in some cases because they just don't have that impact, and in some cases because we just don't know and it's just too hard to measure. So there's this kind of tension. But the instinct to do the local thing means we sometimes don't rationally pay attention to the big differences between what charities are really capable of.
And this was something I was actually shocked by until I started reading your work: just the level of difference in effect that you could get from one charity to another. And so talk about how big this difference is. Yeah, so the difference can be a hundred times or even a thousand times. A really nice, salient example from Toby Ord: in the United States, training a guide dog to help a blind person costs about fifty thousand dollars, whereas in the developing world there's an infectious disease called trachoma that can be treated with a simple surgical procedure that costs less than one hundred dollars. And in that case, the difference in impact is about a thousand times, right? For the cost of helping one person in the United States manage their blindness, you can prevent five hundred, even a thousand people from going blind in the first place through trachoma surgery. And that's just kind of hard to get your head around, right? And you know, that's a very extreme case, but it's really not unusual for the most effective charities to be a hundred times more effective than typical charities.
But very few people wake up and think, you know, what I really want to do is support trachoma surgery, this disease that I've never heard of, right? The challenge here is: how can we make decisions that work with our feelings in a sort of pragmatic way, but nevertheless enable us to have more impact, to really do more good? Because that's the point, right? I want to feel good, but it's not about my feelings. That's not why I think I'm doing this, I hope. Right. You've talked about this distinction in terms of the difference between giving with your heart and giving with your head, right? I mean, it seems like these very effective strategies, you know, it's like, oh, that's kind of giving with my head, but it doesn't maybe feel the same way as giving, you know, with your heart. So talk about this distinction, because it fits with some of the other work you've done on moral psychology too, right? Like in the context of what are called trolley problems. So what's a trolley problem?
And give me an example of the most famous one that we tend to use in psychology and philosophy. Yeah, yeah. So I think in general there are different ways that we can make a decision, and these sorts of trolley cases that we've used as cognitive probes to understand moral cognition highlight this distinction between a gut reaction that says do this or don't do this, and thinking in terms of costs and benefits on a large scale. So in one version of the trolley problem, the trolley is headed towards five people, and you can hit a switch that will turn the trolley away from the five people and put it on a sidetrack. But there's one person there, and so that person will be killed. And if you ask people, is it okay to hit the switch to minimize the number of deaths, most people say that that's fine. Then there's the footbridge case, where the trolley is again headed towards five people. You're on a footbridge over the tracks, in between the oncoming trolley and the five, and the only way to save those five people is to do something that feels really awful.
There's this person wearing, let's say, a giant backpack, standing next to you, and you can push them off of the footbridge and they'll land on the tracks and they'll be killed by the trolley, but it'll stop the trolley from killing the five people. People inevitably say that it's wrong, or at least that it feels very wrong, to trade one life for five in the footbridge case. So what is going on there? So I and other researchers have since spent a lot of time looking at people's behavior, and reaction times, and brains. So we have what we call a dual-process dynamic, where you have these feelings that say, no, you can't do that, right? And then you have this cost-benefit reasoning that says, but doesn't this make sense? That's in the footbridge case. In the switch case, you don't have the feeling as strongly. The work that we've been doing on charitable giving has a similar kind of dual-process dynamic, but the feelings are in the positive domain, right? But there's still a kind of tension there to be navigated, and that's what we're trying to do here.
Josh's interests in morality go beyond hypothetical situations like the trolley problem. He's part of a movement called effective altruism, one that's very much grounded in reality. So the idea of effective altruism is to use the resources you have to do as much good as possible, and to make those decisions about what does as much good as possible based on reason, evidence, and as clear an analysis as possible. And so there are different categories here. The most straightforward thing that anyone with disposable income can do is just to donate money to super effective charities. Right. But effective altruism is also broader than that.
It has to do with choosing your career, right? And this has the same kind of heart/head dynamic. Maybe the most good one could do in principle is going into finance and making as much money as possible and then giving ninety-five percent of it away, right? But very few people are going to do that, and most people wouldn't enjoy that if they're not the kind of person who would go into finance anyway. And if they are, they might not want to give ninety-five percent of their money away, right? So it's about finding a balance, where you say, okay, what's something that I can do that makes good use of my talents but uses those resources in a way that really can do a lot of good? And there's a great organization called 80,000 Hours that offers advice to people and helps them figure out, in a personal way, how can I do something that feels good and is meaningful to me, but also really uses my talents and my skills in a way that does a lot of good for the world.
So it's choosing a career, it's choosing what you do with your resources, and it can be personal decisions as well, things like what you choose to eat. But the general idea is that we should take the evidence seriously. I mean, it's funny that effective altruism is a new idea, right? Right, that we really need a movement to do this rationally. Yeah. If I went to Harvard Business School across the river and said, I've got this new idea: when you're trying to figure out how to invest your money, look to see what kind of return you're going to get. You know, the idea of investing for impact in the business world is just like the biggest duh ever, right? But when it comes to trying to do good for the world more generally, you know, it's only recently that people have been doing the kind of serious analysis that investors have been doing for decades and even centuries. And so it's just getting serious about it in the same way that you get serious about business as a businessperson.
And so let's talk about some of the biases that mess us up with this. Like, you know, why we can think about it in the business domain, but it's so hard in the charitable giving domain. You know, one reason we mess this up is that our brain isn't really good at seeing everybody in need who's worthy of our help. So talk about this idea of the moral circle and why it might be a little bit more narrow than we think. Yeah. So where does morality come from, right? And where does human sociality come from? And my view, not unique to me but not shared by everybody, is that the fundamental principle of life, not just humans, is cooperation.
If you go all the way back to the beginning of the history of life, what you see are molecules coming together to form larger molecules that can make copies of themselves better, coming together to form cells, and then multicellular colonies and organisms, and complicated animals with different organs that cooperate and function together, and then social animals like ants and chimpanzees and us. And then, starting with us, there are hunter-gatherer bands, and there are more complex chiefdoms and tribal societies and nation-states. So the story of life is a story of cooperation at increasingly complex levels. But that cooperation isn't just there because it's nice. It's there because it evolved. And anything that evolves, evolves because it has a competitive advantage. So teamwork is a competitive weapon, right? It serves a competitive purpose at the highest level: you don't compete as much with the other people on your team, so that you can more effectively compete with the other team. And so there's this challenge here.
Our social emotions are designed to produce cooperative interactions in the service of out-competing others. So cooperation can exist up to a point, but then it's always strained at the highest level, because the very force that caused it to evolve, biologically or culturally, is competitive. But we humans, we're unusual. We have the ability to understand all of this stuff, climb that evolutionary ladder, and then kick it away: we're going to do what makes sense for us given our values, which are not necessarily the same thing as the values that are implicit in the biological process. And I view effective altruism as: we understand the process, and we're saying, okay, we as a species have created enough resources that no one has to be hungry, right? So why don't we do one better? Why don't we take the cooperative social-emotional capacities that we evolved for competition and apply them not just cooperatively within our local groups, but more broadly?
And so my mission is to expand that circle all the way out, as best as I can. Because that means if we keep the circle too narrow, we might be missing out on the happiness benefits that come from doing real good in the world, but just in this kind of narrow scope. But another thing we get wrong is this idea of what's called scope neglect, when it comes to giving more broadly. So what is scope neglect? Explain this concept, which I think is so powerful. Yeah, well, I mean, it's just that our emotions are not designed to be numerate, to take numbers into account, right? So there's a very real sense in which saving a thousand people's lives is a thousand times better than saving one person's life. But research suggests that, if anything, saving a single person is more emotionally salient than saving a thousand; the numbers become very abstract. And even when you think about the numbers, there's a kind of diminishing returns where after a while it's just a lot.
And so we kind of have to compensate for that if we really want to make choices that really maximize the amount of good that we can do. And we can see this even in the way charities sometimes advertise things, right? Oftentimes charities will put a picture of one person in need. This is what researchers have called the identifiable victim effect. So what's that? I think it's powerful. Yeah. So this goes back to work by Deborah Small and George Loewenstein. They did some really nice lab experiments where they kind of created, in an economic way, a victim in the lab: people lost their money, and then they could ask other participants, hey, do you want to give some of your money to make up for the person who lost it? And they did this two different ways. There's a very subtle manipulation; it's kind of amazing that it worked, although it makes sense. In one version they said, do you want to help the person, whoever it's going to be, who was harmed by this? You know, it would be like one of six people, but it wasn't determined who it would be.
And then in another version they said, do you want to help person number four, the person this happened to, right? And you don't know anything about them; it's just that it's been determined. And people were more willing to help just when you said that it's person number four, as opposed to some person to be determined, right? And this is like the thinnest possible shift. But when it's a real identifiable victim, like one you can see in an ad on TV, that has a much more powerful effect. And you know, one worry about this is that if you draw people's attention to it, one way to go is to say, okay, I'm going to care about all of those anonymous children more, instead of just focusing on the single person who's very emotionally salient. But one worry, and there's some evidence to suggest that this can happen, is that when people understand what's going on, instead they just say, okay, well, I'm not going to care about either. I'm going to care about the one person less.
And so the challenge here is: how do you take that pro-social feeling and scale it up, or align it more with what the scope of the need is and what you can do about it? I love this idea that once we recognize our mind's biases, we can find ways to work with them rather than against them. We'll hear more about this when The Happiness Lab returns in a moment. Psychology professor Josh Greene wanted to understand what would make people decide to give more donations to his list of high-impact charities instead of to causes that hold more personal appeal. He wanted people to give with their heads rather than with their hearts. But purely rational appeals didn't turn out to be very effective. So you mentioned, instead of fighting the biases, the idea here is to not fight them, to work with them rather than against them. And what's interesting is, I started out trying to fight them, you know. I was convinced to support highly effective charities basically by the philosopher Peter Singer, and he gave a famous argument for doing this.
You know, he said, look, if 327 00:17:01,036 --> 00:17:03,276 Speaker 1: you were walking by a pond and there was a 328 00:17:03,356 --> 00:17:05,996 Speaker 1: child drowning in that pond, and you could save the child. 329 00:17:06,436 --> 00:17:08,316 Speaker 1: But if you do that, you're going to wade in 330 00:17:08,316 --> 00:17:10,396 Speaker 1: and you're going to ruin the, you know, nice suit 331 00:17:10,436 --> 00:17:12,636 Speaker 1: you're wearing, or whatever it is. You know, would you 332 00:17:12,676 --> 00:17:14,436 Speaker 1: still save the child? Would it be okay for you 333 00:17:14,476 --> 00:17:16,196 Speaker 1: to let the child drown because you don't want to 334 00:17:16,236 --> 00:17:18,356 Speaker 1: ruin your suit? Of course it would be terrible. 335 00:17:18,396 --> 00:17:19,956 Speaker 1: You'd be a monster, right, if you let the kid 336 00:17:19,996 --> 00:17:23,196 Speaker 1: drown because you're worried about your suit. And then Singer says, well, 337 00:17:23,636 --> 00:17:26,156 Speaker 1: there are children who are drowning in poverty all around 338 00:17:26,156 --> 00:17:29,036 Speaker 1: the world, who badly need food and medicine. And 339 00:17:29,116 --> 00:17:31,516 Speaker 1: for the price of a nice suit, you can save, 340 00:17:31,636 --> 00:17:34,956 Speaker 1: or contribute to saving, many of these children. So why 341 00:17:34,956 --> 00:17:37,836 Speaker 1: do you have any more of an obligation to wade 342 00:17:37,836 --> 00:17:41,076 Speaker 1: into the pond than you do to use what resources 343 00:17:41,116 --> 00:17:43,116 Speaker 1: you have to help those people? So I was very 344 00:17:43,196 --> 00:17:45,756 Speaker 1: much convinced by that argument when I was in my 345 00:17:46,116 --> 00:17:48,636 Speaker 1: late teens, I think. Do you remember when you read it? Like, 346 00:17:48,796 --> 00:17:51,476 Speaker 1: do you remember having a moment of like, oh crap?
347 00:17:51,636 --> 00:17:53,836 Speaker 1: Like, it was in college. I was in an urban 348 00:17:53,916 --> 00:17:57,356 Speaker 1: environment for the first time, and I would see homeless 349 00:17:57,356 --> 00:17:59,996 Speaker 1: people a lot, and I started thinking about this, and 350 00:18:00,036 --> 00:18:01,716 Speaker 1: thinking about, you know, I'd be on my way to 351 00:18:01,716 --> 00:18:03,476 Speaker 1: go buy something. This was back when I used to 352 00:18:03,516 --> 00:18:07,396 Speaker 1: buy a lot of CDs, as in music, for those 353 00:18:07,476 --> 00:18:11,356 Speaker 1: young people listening. And, you know, I'd think, like, why 354 00:18:11,476 --> 00:18:13,236 Speaker 1: is it more important for me to have this, like, 355 00:18:13,356 --> 00:18:15,636 Speaker 1: you know, John Coltrane CD than it is for me 356 00:18:15,716 --> 00:18:17,996 Speaker 1: to help this person, right? And then, you know, I 357 00:18:18,076 --> 00:18:22,196 Speaker 1: talked to Jonathan Baron, the psychology professor, and he's like, oh, 358 00:18:22,196 --> 00:18:25,156 Speaker 1: you should read this guy, Peter Singer. And I was like, oh, 359 00:18:25,196 --> 00:18:27,236 Speaker 1: and you know, Singer did a much better job of 360 00:18:27,356 --> 00:18:29,716 Speaker 1: laying this all out than I did. And I didn't 361 00:18:29,756 --> 00:18:31,716 Speaker 1: realize that, you know, he had already laid this out 362 00:18:31,716 --> 00:18:33,436 Speaker 1: a couple of years before I was born. But then 363 00:18:33,476 --> 00:18:35,676 Speaker 1: when I read that, you know, then it really gripped me. 364 00:18:35,756 --> 00:18:38,356 Speaker 1: So that was how I became convinced of this. And 365 00:18:38,396 --> 00:18:40,636 Speaker 1: I thought, well, if that's what worked for me, I'll 366 00:18:40,636 --> 00:18:43,636 Speaker 1: try to convince other people.
And so with various people, 367 00:18:43,676 --> 00:18:46,236 Speaker 1: I've tried experiments where you kind of lay out the 368 00:18:46,236 --> 00:18:49,276 Speaker 1: Peter Singer sort of argument and see if people are 369 00:18:49,316 --> 00:18:51,676 Speaker 1: willing to, you know, donate something, if you give them 370 00:18:51,676 --> 00:18:54,036 Speaker 1: some money that they could keep or donate, or otherwise 371 00:18:54,076 --> 00:18:57,036 Speaker 1: convince them. And what we found is that this works 372 00:18:57,116 --> 00:19:00,236 Speaker 1: a little bit at best. You know, sometimes it doesn't 373 00:19:00,276 --> 00:19:02,396 Speaker 1: work at all, and sometimes it works a little tiny bit, 374 00:19:02,436 --> 00:19:04,476 Speaker 1: but people do not respond to this in general the 375 00:19:04,516 --> 00:19:08,076 Speaker 1: way that I did. And so more recently I started thinking, 376 00:19:08,356 --> 00:19:11,316 Speaker 1: maybe there's another way here: instead of fighting it, to 377 00:19:11,436 --> 00:19:13,796 Speaker 1: go with it, to be like water a bit. And 378 00:19:13,836 --> 00:19:16,996 Speaker 1: so this is the new project with the amazing Lucius 379 00:19:17,076 --> 00:19:20,036 Speaker 1: Caviola, who's currently a postdoc in my lab. And 380 00:19:20,076 --> 00:19:23,196 Speaker 1: we thought, rather than saying to people, don't give to 381 00:19:23,276 --> 00:19:26,996 Speaker 1: the charity that you love, that is personally meaningful to you; instead, 382 00:19:27,276 --> 00:19:30,116 Speaker 1: give to this charity that distributes malaria nets or provides 383 00:19:30,156 --> 00:19:32,716 Speaker 1: deworming treatments, which you don't know anything about. For a 384 00:19:32,756 --> 00:19:35,076 Speaker 1: lot of people, that just feels cold and alien.
And 385 00:19:35,076 --> 00:19:36,556 Speaker 1: they said, like, yeah, I kind of get that, but 386 00:19:36,916 --> 00:19:39,396 Speaker 1: that's not where my heart is, right? And again, I 387 00:19:39,716 --> 00:19:42,156 Speaker 1: get that too. You know, I don't exclusively give to 388 00:19:42,516 --> 00:19:46,356 Speaker 1: the super duper effective recommended charities. So then we thought, well, 389 00:19:46,356 --> 00:19:48,596 Speaker 1: what if we just ask people to do both? We just 390 00:19:48,636 --> 00:19:51,876 Speaker 1: said, hey, all right, pick a charity that you love, 391 00:19:51,996 --> 00:19:54,436 Speaker 1: but also, you know, here's one that experts say is 392 00:19:54,516 --> 00:19:57,756 Speaker 1: incredibly effective. So we started doing experiments with this, where 393 00:19:57,756 --> 00:20:00,276 Speaker 1: we said, you have this money, you can give it 394 00:20:00,316 --> 00:20:02,356 Speaker 1: to a charity that you choose or to this one 395 00:20:02,396 --> 00:20:05,356 Speaker 1: that's super effective. And we found that almost everybody chooses 396 00:20:05,356 --> 00:20:07,676 Speaker 1: the charity that they chose, not surprising. But then we 397 00:20:07,716 --> 00:20:10,116 Speaker 1: found if we just said, hey, you've got choices: give 398 00:20:10,116 --> 00:20:12,236 Speaker 1: it all to the one you picked, give it all 399 00:20:12,276 --> 00:20:15,476 Speaker 1: to the one we're recommending, or do a fifty-fifty split. 400 00:20:15,676 --> 00:20:18,316 Speaker 1: And people were very happy to do the fifty-fifty split, 401 00:20:18,516 --> 00:20:22,116 Speaker 1: so much so that more money went to the charity 402 00:20:22,156 --> 00:20:25,316 Speaker 1: that we chose with the fifty-fifty splits than when 403 00:20:25,356 --> 00:20:27,956 Speaker 1: people only had the option to do one or the other. 404 00:20:28,156 --> 00:20:30,476 Speaker 1: So we said, oh, maybe we're onto something here.
And 405 00:20:30,516 --> 00:20:33,036 Speaker 1: we did some other experiments to try to understand the 406 00:20:33,076 --> 00:20:35,716 Speaker 1: psychology in more detail. And, you know, the long and 407 00:20:35,796 --> 00:20:38,396 Speaker 1: short of it is that there's a kind of diminishing 408 00:20:38,396 --> 00:20:41,396 Speaker 1: returns that you get from supporting your favorite charity: 409 00:20:41,436 --> 00:20:43,476 Speaker 1: when you support the charity that you love, it's not 410 00:20:43,516 --> 00:20:46,076 Speaker 1: so important to you whether you give fifty dollars or 411 00:20:46,076 --> 00:20:48,756 Speaker 1: one hundred dollars. It's just that you want to support it. 412 00:20:48,756 --> 00:20:50,156 Speaker 1: And yeah, it feels a little bit better to give 413 00:20:50,196 --> 00:20:52,396 Speaker 1: twice as much, but not twice as good. But then 414 00:20:52,436 --> 00:20:54,956 Speaker 1: that makes room for doing something else. What we found 415 00:20:55,036 --> 00:20:57,156 Speaker 1: is that giving to a highly effective charity is not 416 00:20:57,196 --> 00:20:59,716 Speaker 1: only something that people are willing to do, but there's 417 00:20:59,716 --> 00:21:02,836 Speaker 1: something especially appealing about it, because it has this kind 418 00:21:02,836 --> 00:21:06,236 Speaker 1: of heart-head complementarity. And then we thought, okay, so we 419 00:21:06,236 --> 00:21:07,436 Speaker 1: want to try to see if we can do this 420 00:21:07,436 --> 00:21:10,436 Speaker 1: out in the world. Okay, we can do the obvious 421 00:21:10,436 --> 00:21:12,436 Speaker 1: thing and say, well, what if we incentivize people? We say, 422 00:21:12,476 --> 00:21:14,956 Speaker 1: all right, if you make a split donation between one 423 00:21:15,036 --> 00:21:18,516 Speaker 1: you choose and one that experts recommend, we'll add, you know, 424 00:21:18,596 --> 00:21:20,996 Speaker 1: twenty-five percent on top of both parts of your donation.
And 425 00:21:21,436 --> 00:21:23,196 Speaker 1: a nice thing about this is that it's both, right? 426 00:21:23,276 --> 00:21:25,956 Speaker 1: We're not only encouraging you to do the 427 00:21:25,996 --> 00:21:30,756 Speaker 1: thing that we're kind of suggesting; we're supporting your charity, 428 00:21:30,796 --> 00:21:32,556 Speaker 1: the one that you picked, right? And we're happy 429 00:21:32,596 --> 00:21:34,556 Speaker 1: to do that. I mean, we found that people really 430 00:21:34,556 --> 00:21:36,916 Speaker 1: loved this. And then there's one other piece to this. 431 00:21:37,356 --> 00:21:40,196 Speaker 1: We said, okay, well, we need these matching funds, and 432 00:21:40,276 --> 00:21:41,596 Speaker 1: one way to do this would be to have a 433 00:21:41,676 --> 00:21:44,476 Speaker 1: kind of angel investor donor who would sort of put 434 00:21:44,516 --> 00:21:46,996 Speaker 1: up the matching funds for this. But we thought maybe 435 00:21:47,236 --> 00:21:50,116 Speaker 1: we can do this in a new way. We ask people, okay, 436 00:21:50,156 --> 00:21:52,796 Speaker 1: so you've just had these matching funds with this donation: 437 00:21:53,156 --> 00:21:55,556 Speaker 1: would you be willing to take part of your donation, 438 00:21:55,596 --> 00:21:57,436 Speaker 1: the part that was going to go to the charity 439 00:21:57,436 --> 00:21:59,756 Speaker 1: that you actually didn't choose, but that is highly effective, 440 00:22:00,076 --> 00:22:01,916 Speaker 1: would you be willing to put that in a fund 441 00:22:02,196 --> 00:22:04,676 Speaker 1: that would provide matching funds for other people? So it's a kind 442 00:22:04,676 --> 00:22:07,196 Speaker 1: of pay-it-forward thing. And we found that a 443 00:22:07,316 --> 00:22:09,876 Speaker 1: significant number of people were very happy to do that.
444 00:22:10,036 --> 00:22:13,556 Speaker 1: So Lucius and I, with the help of some wonderful 445 00:22:13,836 --> 00:22:17,556 Speaker 1: web developers, and one in particular, Fabio Kun, created a 446 00:22:17,636 --> 00:22:21,996 Speaker 1: site called Giving Multiplier, which gives people the option to 447 00:22:22,076 --> 00:22:24,756 Speaker 1: do just that. You can go to giving multiplier 448 00:22:24,796 --> 00:22:28,836 Speaker 1: dot org slash Happiness Lab, all caps, all one word, 449 00:22:28,836 --> 00:22:31,236 Speaker 1: where we've got a little special landing page just for 450 00:22:31,876 --> 00:22:35,436 Speaker 1: listeners of this podcast. And it's very simple. We have 451 00:22:35,476 --> 00:22:39,156 Speaker 1: a little search field where you can find any charity 452 00:22:39,196 --> 00:22:42,516 Speaker 1: that's registered in the United States. You enter the amount 453 00:22:42,596 --> 00:22:45,316 Speaker 1: of money that you want to donate, and then we 454 00:22:45,436 --> 00:22:48,796 Speaker 1: have this nifty, I think it's very nifty, little slider, 455 00:22:49,036 --> 00:22:51,316 Speaker 1: and you can slide it to decide, okay, do 456 00:22:51,396 --> 00:22:53,596 Speaker 1: I want to split it fifty-fifty, or do I 457 00:22:53,676 --> 00:22:56,676 Speaker 1: want to do, like, an eighty-twenty split one way or the other? 458 00:22:56,996 --> 00:22:59,556 Speaker 1: Right now, we're at the highest matching rate we've ever had, 459 00:22:59,676 --> 00:23:03,916 Speaker 1: especially with the Happiness Lab code, so we're experimenting with 460 00:23:03,956 --> 00:23:05,796 Speaker 1: seeing if we can go this high and still be 461 00:23:05,876 --> 00:23:09,036 Speaker 1: self-sustaining. So I hope, I hope that works out, 462 00:23:09,116 --> 00:23:11,636 Speaker 1: and I hope Happiness Lab listeners will give this a try. 463 00:23:11,876 --> 00:23:13,396 Speaker 1: I love this.
I love that we get, you know, 464 00:23:13,676 --> 00:23:16,956 Speaker 1: special multiplication on our donations. But I mean, you said, 465 00:23:17,036 --> 00:23:18,916 Speaker 1: I hope this will work, but you've already seen a 466 00:23:18,916 --> 00:23:21,156 Speaker 1: lot of success from Giving Multiplier, right? Like, give me 467 00:23:21,156 --> 00:23:23,356 Speaker 1: a sense of, like, how much money people are donating. 468 00:23:23,756 --> 00:23:28,156 Speaker 1: We launched this in November of last year, and we thought, okay, 469 00:23:28,236 --> 00:23:30,956 Speaker 1: we'll be really happy if this, you know, raises 470 00:23:30,996 --> 00:23:33,596 Speaker 1: like twenty thousand dollars or something like that, you know, 471 00:23:33,716 --> 00:23:35,436 Speaker 1: better than a bake sale. You know, that was what we 472 00:23:35,756 --> 00:23:40,116 Speaker 1: were aiming for, and the response was unbelievable. We 473 00:23:40,196 --> 00:23:43,516 Speaker 1: have now, I think, we're up to about six 474 00:23:43,636 --> 00:23:47,196 Speaker 1: hundred and fifty thousand dollars total funds raised, but I 475 00:23:47,236 --> 00:23:49,396 Speaker 1: think we're really sort of just getting started, and it 476 00:23:49,476 --> 00:23:52,676 Speaker 1: seems like people like it and get it. Do you 477 00:23:52,676 --> 00:23:54,676 Speaker 1: think after, you know, after the end of the year, 478 00:23:54,716 --> 00:23:56,716 Speaker 1: we can come back and say how much money, like, 479 00:23:56,796 --> 00:23:59,676 Speaker 1: my listeners raised for this? Oh, absolutely. And I think 480 00:23:59,676 --> 00:24:02,796 Speaker 1: this is a great opportunity to take that competitive instinct 481 00:24:03,076 --> 00:24:05,556 Speaker 1: and turn it into something cooperative.
So let's see if 482 00:24:05,596 --> 00:24:07,756 Speaker 1: you can kick the crap out of the other podcasts. 483 00:24:08,076 --> 00:24:11,636 Speaker 1: I won't name them, but yeah, I will report back, 484 00:24:11,676 --> 00:24:14,556 Speaker 1: I'll come back with the numbers and let you 485 00:24:14,596 --> 00:24:16,436 Speaker 1: know how you did. So for anyone who's on the 486 00:24:16,436 --> 00:24:19,316 Speaker 1: fence about maybe using this Giving Tuesday, you know, to 487 00:24:19,356 --> 00:24:21,916 Speaker 1: do more for others, or who's on the fence 488 00:24:21,916 --> 00:24:24,196 Speaker 1: about doing more for others in this very effective way: 489 00:24:24,316 --> 00:24:27,316 Speaker 1: any final advice for jumping in and making the decision 490 00:24:27,356 --> 00:24:29,796 Speaker 1: to give a little bit more with your heart and 491 00:24:30,156 --> 00:24:34,276 Speaker 1: your mind? Well, it's very hard to live one's life 492 00:24:34,676 --> 00:24:38,236 Speaker 1: being a pure effectiveness maximizer, right? I mean, in the 493 00:24:38,316 --> 00:24:41,596 Speaker 1: limiting case, this is like no birthday party for your kids. 494 00:24:41,956 --> 00:24:44,996 Speaker 1: You know, you don't need two kidneys. And so again, 495 00:24:45,076 --> 00:24:47,316 Speaker 1: this is not, it's not about finger-wagging, and it's 496 00:24:47,316 --> 00:24:49,636 Speaker 1: not about saying thou shalt, you know, do this or 497 00:24:49,636 --> 00:24:52,916 Speaker 1: do that. But I mean, for me, once you kind 498 00:24:52,956 --> 00:24:56,516 Speaker 1: of know that you can be extremely effective, like hundreds 499 00:24:56,516 --> 00:24:59,796 Speaker 1: of times more effective, then, you know, once you have 500 00:24:59,876 --> 00:25:04,396 Speaker 1: that knowledge, it's like, how can you ignore it? Giving 501 00:25:04,396 --> 00:25:07,036 Speaker 1: to others can provide us with a much-needed happiness boost.
502 00:25:07,676 --> 00:25:09,516 Speaker 1: And that's why I wanted to give you that special 503 00:25:09,516 --> 00:25:13,756 Speaker 1: web address from Josh one more time. It's giving multiplier 504 00:25:13,956 --> 00:25:17,756 Speaker 1: dot org slash Happiness Lab, all caps, all one word. 505 00:25:18,356 --> 00:25:20,316 Speaker 1: But even if you aren't in a position to donate 506 00:25:20,356 --> 00:25:22,836 Speaker 1: to a charity right now, I hope you've picked up 507 00:25:22,876 --> 00:25:26,036 Speaker 1: some useful strategies from this episode to help you maximize 508 00:25:26,076 --> 00:25:28,836 Speaker 1: how you help others. And whether you give with your 509 00:25:28,836 --> 00:25:31,676 Speaker 1: head or your heart, with money or with time this 510 00:25:31,756 --> 00:25:35,276 Speaker 1: holiday season, the important thing is that you're making the effort. 511 00:25:36,196 --> 00:25:39,076 Speaker 1: This is the final Happiness Lab episode for twenty twenty one, 512 00:25:39,716 --> 00:25:41,996 Speaker 1: but we will be back on January fifth with a 513 00:25:42,076 --> 00:25:45,156 Speaker 1: new year mini season. We'll take a deep dive into 514 00:25:45,196 --> 00:25:49,156 Speaker 1: what we usually think of as negative emotions, things like sadness, grief, 515 00:25:49,196 --> 00:25:52,436 Speaker 1: and anxiety, and we'll talk to experts like Brené Brown, 516 00:25:52,516 --> 00:25:56,676 Speaker 1: Adam Grant, and Julia Samuel about how understanding these emotions 517 00:25:56,716 --> 00:26:00,516 Speaker 1: better can help us improve our overall happiness. I'm also 518 00:26:00,556 --> 00:26:03,956 Speaker 1: releasing six special meditations to go along with these January 519 00:26:03,956 --> 00:26:07,876 Speaker 1: shows that will be exclusive to Pushkin Plus subscribers.
Pushkin 520 00:26:07,916 --> 00:26:10,636 Speaker 1: Plus is available on the show page in Apple Podcasts 521 00:26:10,996 --> 00:26:16,036 Speaker 1: or at Pushkin dot fm slash plus. Sign 522 00:26:16,116 --> 00:26:18,196 Speaker 1: up now and you'll have access to ad-free listening 523 00:26:18,236 --> 00:26:21,836 Speaker 1: across many Pushkin Industries shows. I hope you have a 524 00:26:21,836 --> 00:26:24,876 Speaker 1: fantastic holiday season, and I look forward to seeing you 525 00:26:24,956 --> 00:26:30,236 Speaker 1: in twenty twenty two. Until then, stay safe and stay happy. 526 00:26:35,836 --> 00:26:38,516 Speaker 1: The Happiness Lab is co-written and produced by Ryan Dilley. 527 00:26:38,796 --> 00:26:42,396 Speaker 1: Our original music was composed by Zachary Silver, with additional scoring, 528 00:26:42,516 --> 00:26:46,916 Speaker 1: mixing, and mastering by Evan Viola. Joseph Fridman checked our facts. 529 00:26:46,956 --> 00:26:50,956 Speaker 1: Sophie Crane McKibbon edited our scripts. Emily Ann Vaughan offered 530 00:26:50,956 --> 00:26:55,876 Speaker 1: additional production support. Special thanks to Mia Lobel, Carly Migliori, 531 00:26:56,116 --> 00:27:01,116 Speaker 1: Heather Fain, Maggie Taylor, Daniella Lucarne, Maya Kanig, Nicole Morano, 532 00:27:01,476 --> 00:27:05,996 Speaker 1: Eric Sandler, Royston Baserve, Jacob Weisberg, and my agent Ben Davis. 533 00:27:06,476 --> 00:27:09,316 Speaker 1: The Happiness Lab is brought to you by Pushkin Industries. I'm 534 00:27:09,516 --> 00:27:10,596 Speaker 1: Doctor Laurie Santos.