Speaker 1: Pushkin.

Speaker 2: I'm Jacob Goldstein. I host a Pushkin podcast called What's Your Problem? And I am here randomly talking to you right now because today is the day before Giving Tuesday. Giving Tuesday, as you may already know, is the Tuesday after Thanksgiving, and it's supposed to be the day we give money to charity. And I'm gonna be honest with you. In my middle-aged, somewhat calcified heart, I cringe a little bit every time I hear the phrase Giving Tuesday. I think Giving Tuesday is not a real thing. It's not a real day. It's just something somebody made up a few years ago. But that cynicism is not helping anybody. In fact, as it turns out, it isn't even helping me. I know this to be true because over the past decade or so, research has made two things really clear. One, giving money away makes us feel better than we think it will make us feel. In other words, we underestimate the benefit to ourselves of giving money to others. That's thing one. Thing
Two is this: there are charities that are proven, proven by really robust evidence, to do a tremendous amount of good with the money we give them. So today, I and my colleagues at Pushkin are leaning into Giving Tuesday. We are putting out this special Giving Tuesday show to get into this evidence, to really understand why giving money makes us happy, why we don't do it more, and who we should give money to. To start out on the show, I'm going to talk with Laurie Santos. Laurie's a Yale psychologist who hosts a Pushkin show called The Happiness Lab, and Laurie and I are going to talk about the evidence that shows that giving makes us happy, and then the obvious puzzle that follows from that evidence: if giving makes us so happy, why don't we give more? Later on the show, I'll talk with Elie Hassenfeld. Elie is the co-founder and CEO of GiveWell, and he has spent nearly two decades scouring the world, studying the research, to try to figure out which charities do the most good with every dollar. And then finally I'll talk with Maria Konnikova and Nate Silver.
Maria and Nate are a pair of writers who host a Pushkin podcast called Risky Business, but they're also both professional poker players. They are people whose livelihoods depend on making optimal bets. So I'll be talking to them about how they bring that thinking to their charitable giving. Laurie Santos, tell me why I am not giving enough money to charity.

Speaker 1: Well, it's probably because your mind is leading you astray, right? I mean, you're like a smart person, right? You probably think through what would be the pros and cons of giving to charity. You probably do some simulations in your head about how it would feel for you and, you know, how the recipients of that money would feel. And there's just tons of psychological evidence showing that when we do those simulations, we get them really, really wrong.

Speaker 2: Huh. We don't know what makes us happy with giving money, as with everything.

Speaker 1: As with everything. Yeah, I mean, in some ways the giving money part shouldn't be surprising.

Speaker 2: You know.

Speaker 1: I have a whole podcast about how we get happiness wrong all the time.
But this one's really insidious, because it means that we're leaving opportunities not just to make ourselves happier kind of on the table, but we're also leaving opportunities to just do good in the world and do good in society on the table too. So in some ways it's like even sadder, yeah.

Speaker 2: Like, people talk about win-win. This is like lose-lose, right? I feel worse and everybody feels worse.

Speaker 1: And it's another case where we could be building the pie.

Speaker 3: Right.

Speaker 1: You know, say I have, you know, ten bucks sitting around in my pocket, right? I could spend it in a way that makes me happy, or I could donate it to a good cause. Right? I'd probably feel better, the research would show, spending that ten dollars on a good charity than I would kind of blowing it on myself. But now the money is going to increase happiness in other people, right? Presumably somebody who really needed that money. And so we're losing these opportunities to grow the pie. And we really just need to understand this bias better so that we can be happier.
Speaker 2: So you alluded to the research that shows that giving money away makes us happier. Tell me more about that. Like, what is the academic work that's been done on this subject?

Speaker 1: Yeah, well, there's tons of studies now, you know. One of my favorites is a really straightforward one. It comes out of the lab of Elizabeth Dunn and her colleagues at the University of British Columbia, and their method is really straightforward. They walk up to some person on the street and they say, hey, do you want to be in a psych study? I think the person kind of begrudgingly is like, okay, fine. But then it turns out it's an awesome psych study, because Liz and her colleagues just hand you twenty bucks and you're like, cool. The key, though, is that she tells you how to spend that money. She either says, hey, by the end of the day, do something nice for yourself with this money. Treat yourself, you know, to something you wouldn't have expected. Or: by the end of the day, do something nice with this money for somebody else.
Right? You could donate it to charity, you could buy your friend a latte. Right? It doesn't matter, but it has to be for somebody else. And then the key is that she calls participants later that day, and even in some cases later in the week, and what she finds is that people tend to feel happier when they donate the money to somebody else or do something nice for somebody else with the money, more so than they feel when they spend the equivalent amount of money on themselves. And the study I love because it's just so straightforward. It just suggests that this isn't what we predict will happen, right? And Liz has actually done these studies where she asks a different group of participants: hey, imagine you were in the study where I walked up to you on the street and gave you twenty bucks. Would you be happier spending that on yourself or somebody else? And, like, robustly, people say, oh, I'd be happier spending on myself, right? Because I get something out of the deal. But what she finds is that our prediction is just totally wrong. When we spend on others, we're happier.
Speaker 2: So I feel like there's a subtlety there in the spending-on-others group. Right? It is in some ways more intuitive to me that, like, whatever, buying lunch for my friend would make me happy, because my friend would be so happy and we'd be happy together, and the thing would be happening in this very, you know, social, real, physical way. Like, I get that one. It seems less obvious to me that, like, giving twenty bucks to a charity helping people in sub-Saharan Africa would make me happy, even though clearly, intellectually, analytically, I know that the twenty bucks going to sub-Saharan Africa is going to do more to increase abstract human happiness than buying lunch for my friend, who could have bought lunch for herself. So, like, how does that piece of it work? How do we think about that piece of it?
Speaker 1: You are onto an important point, which is that there are better and worse ways to give to charity, right? In terms of, like, boosting our own happiness and sort of feeling the impact from that, we are a lot happier if we can see the impact of our work. Right? But even when we don't see the impact of our work, the act of donating winds up making us feel better than we think. Again, I share the intuition that you have, Jacob. Right? Like, I know these studies. I can kind of quote the stats, and I still don't have the intuition that it works. But the results really just suggest that we feel better than we assume we will.

Speaker 2: Why do you think we get it wrong? I mean, I know we get everything wrong, but why do you think we get this wrong?

Speaker 1: Yeah, I mean, we get everything wrong, right? Our minds, I wish we could just, like, update. Like, mind two point zero would be so much better. I think there's some reasons that we get this one wrong.
One is sort of what we focus on when we're doing an act of kindness, as opposed to what the recipient will focus on. Right? I'm sort of focused on whether or not my gift is kind of, in some sense, competent. Like, am I doing the right thing? Am I picking the right charity? Maybe, in more local acts of kindness, am I doing it the right way? Right?

Speaker 2: Yeah. I don't want to be awkward. I don't want to be rude or make this person feel some sense of obligation or reciprocity that might not work for them. Overthinking it. You're saying we're overthinking.

Speaker 1: But in terms of the overthinking, that's not what's happening on the other side. Think of the recipient of a compliment, right? If, you know, someone walks up to you in the street and is like, hey, love those glasses, they really see you, Jacob, nicely done. You're thinking, you mean it and I like it. But you're not thinking, like, did they say it right? Did they use the right adjective? Was it cool glasses or stylish?
You're just like, oh my gosh, I'm surprised, and I have this incredible warm feeling. And so this is part of the disconnect: when we're making the decision to do something nice, we're overthinking, we're caught up in whether we're doing it right and so on. But the recipients, they don't feel any of that. They're just like, oh my gosh, I feel amazing. And so we kind of mispredict what they're paying attention to when they react, and that means their reactions are often a lot more positive than we expect. And then we're like, oh, I guess, I guess it was nice to do that kind thing for somebody.

Speaker 2: It's the broader lesson of, like, everybody's just thinking about themselves all the time. We're thinking about ourselves as givers: am I the optimal giver? Am I giving in the optimal way? But the recipients aren't thinking about you. They're just thinking about...

Speaker 1: Them. Yeah. And we get so caught up in the awkwardness of it.

Speaker 3: Right.
Speaker 1: You know, how many compliments have you not given just because you're like, oh, I don't want to do it wrong or seem weird? And there's some of Nick Epley's data, he's a professor at the University of Chicago. He finds that about a third of the compliments we think in our heads, we don't actually tell the people around us. Right? Which, when you think compliments are usually received really well and make people's day, it's like a lot of positivity that's just stuck inside people's heads that we're not giving out.

Speaker 2: Friction seems like another interesting piece, right? There's, like, you're in your own head too much. And the other core piece is like, ah, I don't know, I'm just some guy in the world, how do I figure out who to give to? And, like, I feel like that one is underrated in the world in general. Right? Like, we're like water, we just flow to the easiest route. I mean, how do you see that playing out in charity more generally, in giving more generally?

Speaker 1: Yeah, I mean, I think that friction is a huge thing, right?
I mean, a good friend of mine just had a baby with his wife, and my instant reaction was like, oh my gosh, I should do something nice for them. Maybe I'll get them some food or some onesies or something. But I was like, do I just show up at their house? Like, do they have any dietary restrictions that I'm forgetting about?

Speaker 2: Is this weird?

Speaker 1: Like, again, all this overthinking in my head. But another good friend of ours was like, I'm setting up a meal train. Here's the day, you click on this link, it's super easy, right? Here's what you're going to, you know, give them, if it's a lasagna or whatever, here's how you drop it off. She just made it really easy for me to do the nice thing that I was thinking about doing anyway. I needed somebody to make that friction go away for me to help.
And so I think there are so many cases of this in terms of what we could do to do nice things for others, whether that's with a charitable donation or even just, you know, asking a friend if they need some support, checking in on the people we care about, sharing compliments and so on. The friction kind of gets in the way. And I think this is the idea: we can overcome the friction by kind of reducing how much work it is for us to do the nice thing. Right? The nice thing is just texting a friend, or you're already there, you know, like in the subway, and you compliment somebody who's walking by. Right? These are the kinds of things that we can do quickly, and if we do them enough, then there's a second way that we can reduce friction, which is that it just kind of becomes a habit. Right? If we just get in the habit of doing this over and over again, doing more and more nice things, then all of a sudden it just becomes easier.
Because so much that we know about human psychology, even though we're kind of, you know, in the crappy beta version, shows that when we do something over and over again, it just becomes easier to do that same thing. And so one of the Giving Tuesday practices that I talk about on my show, The Happiness Lab, is just: hey, practice doing nice things, and it will make it easier. You'll kind of experience less friction over time, just because it's just the thing you do when you see somebody you know at work who is wearing something nice or they do something great in a team meeting. You'll just get good at expressing compliments, expressing gratitude. It'll just become second nature.

Speaker 2: Told you. So, you have this project through your show The Happiness Lab of giving money away, built on this premise that we're talking about: that not only would recipients be better off if people gave more, but the givers themselves would be better off if people gave more. You have this project that you do every year for Giving Tuesday, which is coming up. Tell me about that project.

Speaker 1: Yeah.
So the site that we've worked with is this group called Giving Multiplier dot org, and their goal is to fight a different kind of thing that can go wrong when we think about donating to charities, which is that many of us really do want to be kind of competent about it. We want our money to go to really good causes in the world, but we also kind of fall prey to the causes that feel really close to our hearts. Right? You know, like, I might want to give to my local food bank, which is great, you know, it's good to do that. But that ten bucks I give to my local food bank, it might not have as much impact as, you know, giving to somebody maybe in extreme poverty, right, in, like, sub-Saharan Africa. You know, I haven't really analyzed, is my local food bank doing the best with the money, and so on. And so Giving Multiplier dot org has this really nice combination: they say, okay, you really feel compelled to give to your food bank, but what if you gave just part of that ten bucks to one of these so-called super effective charities?
Right? They've done the research. They're like, the dollar that you give from that ten bucks to the super effective charity is going to go even further. And so they kind of allow you to make this distinction between your heart, what you kind of really feel locally, the kind of thing that makes you feel good because you can see the impact in your community, versus what's doing the best work out there. And Giving Multiplier dot org this year has picked a really great super effective charity, which is called GiveDirectly. This is this group that just gives these unconditional cash transfers, like, no strings attached, like a cash bonus, to people living in extreme poverty.

Speaker 2: And there's a Happiness Lab URL, right? Shout it out. Shout it from the rooftops. What is it?

Speaker 1: It's Giving Multiplier dot org slash Happiness Lab. Super easy.

Speaker 2: Go right now. Phones are open, operators are standing by, you know.
Speaker 1: And one of the things we've seen is that a lot of our listeners will donate five bucks, three bucks in some cases, but those kinds of donations really add up, and especially if part of your donation is going to one of these super effective charities, like, that dollar is going a really long way.

Speaker 2: Yeah, I'll say. And I know, like, analyzing super effective charities ends up being about, like, randomized controlled trials, which is great, like, real evidence. But I will say that I actually, for a story I did ten years ago or so, I went to Kenya, to a village where GiveDirectly was giving money, and I saw how profound the impact is. I mean, people get one thousand dollars, at least at that time, no strings attached, and, like, I talked to a guy who bought a motorcycle so that he could start a motorcycle taxi business. Right? So it's not just like they buy food and then the money runs out. It's, people have no capital, they have no money, and so getting a thousand dollars allows them to make these investments that can change their lives forever.
Speaker 1: Yeah. We saw that last year, where we really focused on GiveDirectly in particular and one community specifically. So we worked with this community, Kebobo in Rwanda, which is a tiny village just on the outside of Kigali, the capital. But, like, most of the people in the community live off less than a dollar a day, and just like you're saying, they just lack so many of the basic conveniences that we take for granted. Right? They have to hike two hours to get access to water, and then the water comes back and it's like, are you going to drink some water, or are you going to give your kid a shower? Right? There's no access to schools and these kinds of things. And last year, Happiness Lab listeners were able to generate over one hundred thousand dollars for this community in particular. And so, just like you're saying, GiveDirectly was able to give each person in the community a one-thousand-dollar unconditional cash transfer, and the money went to things like motorbikes, like you mentioned, fixing roofs, buying mattresses.
Right, most of the people in 344 00:16:06,316 --> 00:16:08,516 Speaker 1: Kebobo were sleeping on the floor. They just didn't have 345 00:16:08,556 --> 00:16:11,556 Speaker 1: access to a mattress. But some people did these really 346 00:16:11,596 --> 00:16:14,196 Speaker 1: creative things. One of the things I didn't expect 347 00:16:14,236 --> 00:16:16,556 Speaker 1: is that one of the couples that got the cash transfer 348 00:16:16,636 --> 00:16:18,996 Speaker 1: bought a pub, which you might think like, oh, they 349 00:16:19,036 --> 00:16:19,516 Speaker 1: got a pub. 350 00:16:19,636 --> 00:16:22,316 Speaker 2: A pub? Like a bar? Like, I love that. 351 00:16:22,516 --> 00:16:23,236 Speaker 2: But the bar. 352 00:16:23,356 --> 00:16:26,476 Speaker 1: It wound up employing people in the community. It became 353 00:16:26,516 --> 00:16:29,316 Speaker 1: this community hub where people could get together with each 354 00:16:29,316 --> 00:16:32,156 Speaker 1: other at night, and it's generated more income for them. 355 00:16:32,156 --> 00:16:34,276 Speaker 1: So now they're turning kind of the side of their 356 00:16:34,316 --> 00:16:37,276 Speaker 1: pub that they put together into a little mini grocery store, 357 00:16:37,476 --> 00:16:39,236 Speaker 1: which is one of the first spots that people can 358 00:16:39,316 --> 00:16:41,116 Speaker 1: buy food in town, so that they don't have to 359 00:16:41,196 --> 00:16:43,076 Speaker 1: leave town. And so it's like, if you leave it 360 00:16:43,156 --> 00:16:45,636 Speaker 1: up to people's ingenuity, they kind of come up with 361 00:16:45,676 --> 00:16:46,596 Speaker 1: these interesting things. 362 00:16:46,756 --> 00:16:49,236 Speaker 2: Yeah, I mean, there's a really simple idea.
Like the 363 00:16:49,316 --> 00:16:51,196 Speaker 2: reason I wanted to do that story all those years 364 00:16:51,236 --> 00:16:54,916 Speaker 2: ago is, like, people know what they need, right? Like, 365 00:16:55,836 --> 00:16:59,396 Speaker 2: they know if they need food or a motorcycle or 366 00:16:59,436 --> 00:17:02,636 Speaker 2: a roof, they just don't have the money, yes, right? 367 00:17:02,716 --> 00:17:04,516 Speaker 2: So, like, if you give them the money, they can 368 00:17:04,556 --> 00:17:06,796 Speaker 2: buy what they need. That's the great thing about money. 369 00:17:07,156 --> 00:17:10,076 Speaker 1: Yeah, and this is something we forget with gifts in general. 370 00:17:10,116 --> 00:17:11,996 Speaker 1: I think this comes up in charity, but there's also, 371 00:17:12,036 --> 00:17:14,916 Speaker 1: you know, Giving Tuesday is sort of the prelude 372 00:17:14,956 --> 00:17:18,036 Speaker 1: to other holidays and gift giving moments coming up, and 373 00:17:18,156 --> 00:17:19,956 Speaker 1: it's just something we get wrong all the time. Like, 374 00:17:19,996 --> 00:17:21,396 Speaker 1: we want to be able to come up with the 375 00:17:21,436 --> 00:17:23,716 Speaker 1: creative gift for somebody, but one of the best ways 376 00:17:23,716 --> 00:17:25,676 Speaker 1: to figure out the best gift is to just ask 377 00:17:25,716 --> 00:17:28,396 Speaker 1: people, what do you want? And if you buy someone 378 00:17:28,516 --> 00:17:30,716 Speaker 1: that thing, they're going to be happy because that was 379 00:17:30,756 --> 00:17:31,516 Speaker 1: what they wanted. 380 00:17:31,796 --> 00:17:34,636 Speaker 2: Yeah, it goes back to the, like, we're thinking about 381 00:17:34,636 --> 00:17:39,196 Speaker 2: ourselves, right? Even when we're giving gifts, this notionally, you know, 382 00:17:39,356 --> 00:17:41,876 Speaker 2: other-focused thing, we're actually like, oh, am I a 383 00:17:41,916 --> 00:17:44,756 Speaker 2: good gift giver?
Am I a good... It's just ego. 384 00:17:44,836 --> 00:17:50,316 Speaker 2: It's just, we're just screwing ourselves with our ego, as always. Yeah, 385 00:17:50,436 --> 00:17:51,916 Speaker 2: it was great to talk with you, Laurie. It was, 386 00:17:52,076 --> 00:17:54,796 Speaker 2: it was truly a delightful conversation. Thank you. 387 00:17:55,036 --> 00:17:57,756 Speaker 1: This was super. Thanks for sharing the love on Giving Tuesday. 388 00:17:59,876 --> 00:18:02,796 Speaker 2: Laurie Santos is a professor of psychology at Yale and 389 00:18:02,836 --> 00:18:05,276 Speaker 2: the host of the Happiness Lab. They have a whole 390 00:18:05,276 --> 00:18:08,556 Speaker 2: episode on the psychology of generosity coming out this week. 391 00:18:09,516 --> 00:18:11,396 Speaker 2: We'll be back in a minute with my conversation with 392 00:18:11,436 --> 00:18:15,196 Speaker 2: Ellie Hassenfeld, who spent nearly two decades scouring the planet 393 00:18:15,196 --> 00:18:17,756 Speaker 2: to find the most effective ways to spend money on 394 00:18:17,876 --> 00:18:39,756 Speaker 2: other people. Okay, so Laurie Santos explained convincingly that giving 395 00:18:39,796 --> 00:18:42,956 Speaker 2: away money makes us feel good. So now the question 396 00:18:43,116 --> 00:18:45,916 Speaker 2: is, who do we give the money to? That is 397 00:18:45,996 --> 00:18:49,436 Speaker 2: basically the question that Ellie Hassenfeld asked himself almost twenty 398 00:18:49,556 --> 00:18:52,196 Speaker 2: years ago. It's a question that led him to co 399 00:18:52,316 --> 00:18:55,756 Speaker 2: found GiveWell, where he's now the CEO, and it's 400 00:18:55,796 --> 00:18:58,876 Speaker 2: a question that, in some really interesting ways, as you 401 00:18:58,876 --> 00:19:02,676 Speaker 2: will hear, has started to change the way charities themselves 402 00:19:02,876 --> 00:19:06,396 Speaker 2: think about what they do.
To start, I asked Ellie 403 00:19:06,476 --> 00:19:09,036 Speaker 2: how he came to found GiveWell in the first place. 404 00:19:13,836 --> 00:19:16,116 Speaker 4: Back in two thousand and six, I was a couple 405 00:19:16,116 --> 00:19:17,956 Speaker 4: of years out of college working at a hedge fund, 406 00:19:18,076 --> 00:19:21,036 Speaker 4: and a friend, Holden Karnofsky, and I wanted to give 407 00:19:21,076 --> 00:19:22,996 Speaker 4: to charity. And at the time, we were trying to 408 00:19:22,996 --> 00:19:25,796 Speaker 4: give a few thousand bucks away and we wanted to 409 00:19:25,836 --> 00:19:29,156 Speaker 4: find charitable organizations that were getting a lot of bang 410 00:19:29,196 --> 00:19:33,436 Speaker 4: for their buck. And when we went looking online for information, 411 00:19:33,996 --> 00:19:38,036 Speaker 4: we just couldn't find great information about what charities do 412 00:19:38,276 --> 00:19:41,356 Speaker 4: and how well it works. We heard a lot about 413 00:19:41,396 --> 00:19:44,396 Speaker 4: the overhead ratio, how much did they spend on administration 414 00:19:44,676 --> 00:19:47,916 Speaker 4: versus programs, but nothing that said this is what they're 415 00:19:47,916 --> 00:19:49,716 Speaker 4: doing and this is how many people they'll help with 416 00:19:49,756 --> 00:19:53,956 Speaker 4: their programs. We spent months trying to answer this question. 417 00:19:54,676 --> 00:19:56,476 Speaker 4: The two of us got a little bit obsessed with it, 418 00:19:56,876 --> 00:20:00,356 Speaker 4: and eventually, after about a year of working on this 419 00:20:00,396 --> 00:20:02,836 Speaker 4: project, left our jobs to start GiveWell as a 420 00:20:02,836 --> 00:20:05,116 Speaker 4: full time project. And the idea was to create the 421 00:20:05,156 --> 00:20:07,876 Speaker 4: resource that we had been looking for as donors.
422 00:20:07,916 --> 00:20:12,236 Speaker 2: And there is this interesting sort of broader idea in 423 00:20:13,196 --> 00:20:17,516 Speaker 2: the charity world, right, in the philanthropic world, which is, 424 00:20:18,436 --> 00:20:20,996 Speaker 2: what are they measuring? You know, you can have a 425 00:20:21,076 --> 00:20:23,836 Speaker 2: charity that builds schools, and they might tell you how 426 00:20:23,876 --> 00:20:27,196 Speaker 2: many schools they build, but presumably you're not actually giving 427 00:20:27,196 --> 00:20:29,596 Speaker 2: money to build the school, right? You're giving money so 428 00:20:29,636 --> 00:20:33,276 Speaker 2: that children get a better education. And so I'm curious, 429 00:20:33,396 --> 00:20:36,516 Speaker 2: I mean, as you started to look deeper, at the 430 00:20:36,556 --> 00:20:39,476 Speaker 2: time you founded GiveWell, like, what was just 431 00:20:39,556 --> 00:20:44,196 Speaker 2: the basic landscape of measurement within the charity world? 432 00:20:45,116 --> 00:20:48,956 Speaker 4: It's just really hard to get information about the outcomes 433 00:20:49,036 --> 00:20:51,516 Speaker 4: that we cared about, that I think donors ultimately do 434 00:20:51,596 --> 00:20:55,476 Speaker 4: care about, and those outcomes would be things like, do 435 00:20:55,676 --> 00:21:00,196 Speaker 4: you save children's lives if you are providing funds for 436 00:21:00,236 --> 00:21:03,116 Speaker 4: health programs? If you're trying to reduce poverty, do you 437 00:21:03,676 --> 00:21:06,476 Speaker 4: increase people's incomes so that they can buy more of 438 00:21:06,556 --> 00:21:08,556 Speaker 4: the kinds of things that they want? And I would 439 00:21:08,556 --> 00:21:13,036 Speaker 4: say that, by and large, this information was not available 440 00:21:13,476 --> 00:21:16,836 Speaker 4: when we were calling up organizations and asking them for information.
441 00:21:17,436 --> 00:21:20,716 Speaker 4: They were often shocked that anyone would even be asking 442 00:21:20,836 --> 00:21:24,636 Speaker 4: a question like this, because it was just, in two 443 00:21:24,676 --> 00:21:27,516 Speaker 4: thousand and six, two thousand and seven, it was completely 444 00:21:27,556 --> 00:21:31,156 Speaker 4: unusual that someone would be wondering about, like, what 445 00:21:31,236 --> 00:21:33,316 Speaker 4: is the program actually accomplishing? What is the impact that 446 00:21:33,356 --> 00:21:34,156 Speaker 4: it's having on the world? 447 00:21:34,476 --> 00:21:37,956 Speaker 2: Huh. I mean, is it almost a rude question? Was 448 00:21:37,996 --> 00:21:41,556 Speaker 2: it almost like, look, we're spending our lives helping these people, 449 00:21:41,596 --> 00:21:45,996 Speaker 2: we're giving them cows, we're building clinics, like, who are you? 450 00:21:45,716 --> 00:21:48,396 Speaker 2: What are you asking about? Why? Where do you come 451 00:21:48,396 --> 00:21:49,636 Speaker 2: off asking these questions? 452 00:21:50,476 --> 00:21:53,716 Speaker 4: I think it's definitely an odd question to ask. Something 453 00:21:53,756 --> 00:21:56,156 Speaker 4: that I say a lot internally at GiveWell now 454 00:21:56,236 --> 00:21:59,076 Speaker 4: is that, you know, we're the people who react skeptically 455 00:21:59,476 --> 00:22:03,116 Speaker 4: to organizations saying we're just trying to help people around 456 00:22:03,116 --> 00:22:04,996 Speaker 4: the world, and we say, well, how do you know, 457 00:22:05,076 --> 00:22:07,556 Speaker 4: and can you prove it?
And you know, that's not 458 00:22:07,956 --> 00:22:11,316 Speaker 4: a socially normal thing to do, but I think it's 459 00:22:11,356 --> 00:22:14,556 Speaker 4: necessary, because, gosh, it's so hard to have an impact 460 00:22:14,596 --> 00:22:16,876 Speaker 4: on people around the world, and asking those questions helps 461 00:22:16,916 --> 00:22:19,436 Speaker 4: get better information so we can ensure that funding goes 462 00:22:19,476 --> 00:22:20,116 Speaker 4: to the best place. 463 00:22:20,676 --> 00:22:25,156 Speaker 2: So you do have this short list of top charities 464 00:22:25,236 --> 00:22:27,116 Speaker 2: that seems kind of like the center of what you 465 00:22:27,196 --> 00:22:29,276 Speaker 2: do in some way, right? You've looked at all of 466 00:22:29,316 --> 00:22:31,556 Speaker 2: these charities in the world, then you've landed on this 467 00:22:31,716 --> 00:22:35,516 Speaker 2: very small number. Briefly, what are they? 468 00:22:36,076 --> 00:22:38,716 Speaker 4: Yeah, so you know, these top charities account for about 469 00:22:38,756 --> 00:22:41,396 Speaker 4: two thirds of the funds we direct. There are four 470 00:22:41,436 --> 00:22:44,796 Speaker 4: of them. One is the Against Malaria Foundation, which delivers 471 00:22:44,796 --> 00:22:49,756 Speaker 4: malaria nets in Africa. The second one is called Malaria Consortium, 472 00:22:49,796 --> 00:22:53,556 Speaker 4: and we support their Seasonal Malaria Chemoprevention program. That's 473 00:22:53,556 --> 00:22:57,316 Speaker 4: a preventative malaria program giving medicine to young children. The 474 00:22:57,716 --> 00:22:59,956 Speaker 4: third one, and these are in no particular order, but 475 00:22:59,996 --> 00:23:03,916 Speaker 4: the third one is Helen Keller International's Vitamin A Supplementation program.
476 00:23:04,156 --> 00:23:08,316 Speaker 4: This is a program that gives a small amount of 477 00:23:08,356 --> 00:23:11,356 Speaker 4: vitamin A to children under the age of five, and it 478 00:23:11,436 --> 00:23:15,396 Speaker 4: is shown in numerous studies to reduce child mortality. And 479 00:23:15,436 --> 00:23:18,716 Speaker 4: then finally, New Incentives, which is the organization that provides 480 00:23:19,036 --> 00:23:22,796 Speaker 4: conditional cash transfers, small cash transfers, to encourage immunization. 481 00:23:23,676 --> 00:23:26,756 Speaker 4: You know, the top charities reflect, you know, roughly two 482 00:23:26,756 --> 00:23:28,996 Speaker 4: thirds of the funds that we direct, and we see 483 00:23:28,996 --> 00:23:32,036 Speaker 4: them as, you know, really the tried and true. 484 00:23:32,076 --> 00:23:33,436 Speaker 4: Like, if you're a donor and you want to have 485 00:23:33,476 --> 00:23:35,876 Speaker 4: a lot of impact and you want to have confidence 486 00:23:36,436 --> 00:23:39,636 Speaker 4: in that impact, these are organizations that we have followed 487 00:23:39,676 --> 00:23:41,516 Speaker 4: for many years and we have a lot of confidence 488 00:23:41,556 --> 00:23:44,316 Speaker 4: in, because there's a lot of evidence that 489 00:23:44,356 --> 00:23:45,796 Speaker 4: supports their impact. 490 00:23:46,356 --> 00:23:48,996 Speaker 2: So how do you get from sort of generally being 491 00:23:49,036 --> 00:23:54,716 Speaker 2: interested in charity and in, you know, research driven outcomes 492 00:23:55,836 --> 00:24:00,196 Speaker 2: to specifically focusing on saving the lives of children in 493 00:24:00,236 --> 00:24:04,236 Speaker 2: the developing world? 494 00:24:04,076 --> 00:24:08,596 Speaker 4: Yeah, I mean, so at its core, GiveWell is about finding outstanding programs that we 495 00:24:08,596 --> 00:24:12,676 Speaker 4: can support with the aim of having the most impact 496 00:24:12,796 --> 00:24:15,436 Speaker 4: with the funds that we direct. And when we started, 497 00:24:16,036 --> 00:24:19,356 Speaker 4: we didn't know where we were going to find those programs, 498 00:24:19,636 --> 00:24:24,276 Speaker 4: so we were looking at health programs focused on low 499 00:24:24,316 --> 00:24:28,196 Speaker 4: income countries, but also social programs focused on New York 500 00:24:28,196 --> 00:24:30,996 Speaker 4: City, where we lived at the time, so job training programs, 501 00:24:31,076 --> 00:24:35,076 Speaker 4: education programs, et cetera. And after our first year of 502 00:24:35,116 --> 00:24:38,596 Speaker 4: work, where we were focused on both US social programs 503 00:24:38,636 --> 00:24:42,476 Speaker 4: and also global programs, we looked at the data and 504 00:24:42,596 --> 00:24:46,756 Speaker 4: just saw how big the difference was in what a 505 00:24:46,836 --> 00:24:50,276 Speaker 4: dollar could accomplish overseas versus at home. And just to 506 00:24:50,276 --> 00:24:54,476 Speaker 4: make it concrete, you know, we estimate, roughly, but I 507 00:24:54,516 --> 00:24:57,636 Speaker 4: think it's the right ballpark, that five thousand or so 508 00:24:57,716 --> 00:25:00,476 Speaker 4: dollars will avert the death of a young child in 509 00:25:00,476 --> 00:25:04,276 Speaker 4: a low income country. That's about what it costs to 510 00:25:04,836 --> 00:25:08,516 Speaker 4: put a child through school for a couple of years 511 00:25:09,196 --> 00:25:13,356 Speaker 4: in a New York City charter school.
And so that 512 00:25:13,556 --> 00:25:18,476 Speaker 4: differential really showed us that the opportunities to use money 513 00:25:18,516 --> 00:25:20,756 Speaker 4: to have a big impact on the world were 514 00:25:20,916 --> 00:25:24,276 Speaker 4: stronger overseas, and it drove us, it drove us 515 00:25:24,276 --> 00:25:28,516 Speaker 4: to focus our efforts there. We're finding the groups that are, 516 00:25:30,276 --> 00:25:33,116 Speaker 4: I think importantly, not sure that their own 517 00:25:33,196 --> 00:25:36,476 Speaker 4: programs are working, and so want to ensure that they're 518 00:25:36,516 --> 00:25:39,356 Speaker 4: gathering the data so that they know where the programs 519 00:25:39,356 --> 00:25:43,716 Speaker 4: are effective, where they're struggling, so that they can make 520 00:25:43,876 --> 00:25:46,156 Speaker 4: changes to run those programs more effectively. 521 00:25:46,556 --> 00:25:49,156 Speaker 2: Yeah, I mean, so that's an interesting idea, right? Like, 522 00:25:49,196 --> 00:25:53,796 Speaker 2: that idea of the groups themselves being unsure, it requires 523 00:25:53,836 --> 00:26:00,276 Speaker 2: a sense of, what is the real end point? Right? 524 00:26:00,396 --> 00:26:04,036 Speaker 2: I think quite often, and reasonably, like, things are clearly 525 00:26:04,476 --> 00:26:08,836 Speaker 2: helpful. If you, whatever, give someone a cow and, you 526 00:26:08,836 --> 00:26:11,596 Speaker 2: know, training on how to take care of that cow, like, 527 00:26:12,236 --> 00:26:14,596 Speaker 2: pretty clearly that person is going to be better off 528 00:26:14,636 --> 00:26:16,956 Speaker 2: than if you hadn't done it. And so it might 529 00:26:16,996 --> 00:26:19,916 Speaker 2: not be obvious to say, oh, we need to measure: well, 530 00:26:19,916 --> 00:26:21,556 Speaker 2: how much does it cost to give them the cow? 531 00:26:21,596 --> 00:26:24,116 Speaker 2: How much better off are they?
Are there things we 532 00:26:24,116 --> 00:26:26,836 Speaker 2: could change that would be even more helpful? Like, most 533 00:26:26,836 --> 00:26:29,476 Speaker 2: people clearly don't do that, right? Most people, in their 534 00:26:29,556 --> 00:26:33,116 Speaker 2: jobs, in many domains, are not constantly measuring and trying 535 00:26:33,156 --> 00:26:33,996 Speaker 2: to optimize. 536 00:26:34,356 --> 00:26:36,316 Speaker 4: Yeah, I mean, I just think the stakes are so 537 00:26:36,396 --> 00:26:39,836 Speaker 4: high that it's just absolutely critical that there is a 538 00:26:39,916 --> 00:26:44,036 Speaker 4: recognition that failure can happen and we have to do 539 00:26:44,076 --> 00:26:47,796 Speaker 4: the best we can. Billions of dollars go to health 540 00:26:47,836 --> 00:26:50,916 Speaker 4: aid every year, and the stakes are quite literally life 541 00:26:50,956 --> 00:26:54,516 Speaker 4: and death. And so, therefore, there's the difference between some of 542 00:26:54,876 --> 00:26:59,676 Speaker 4: the best programs, which can, very roughly, avert the 543 00:26:59,716 --> 00:27:01,956 Speaker 4: death of a young child for approximately five to ten 544 00:27:01,996 --> 00:27:05,396 Speaker 4: thousand dollars, and then other programs which could have very 545 00:27:05,396 --> 00:27:08,356 Speaker 4: limited impact or, I think in the worst case, even cause harm. 546 00:27:09,276 --> 00:27:14,596 Speaker 4: Measurement and that feedback loop, to say, we want 547 00:27:14,636 --> 00:27:17,596 Speaker 4: to see whether it's working, we want to see the 548 00:27:17,676 --> 00:27:19,836 Speaker 4: extent to which it's working, and we want to learn 549 00:27:20,316 --> 00:27:21,956 Speaker 4: from what we've done so that we can do better. 550 00:27:21,996 --> 00:27:24,156 Speaker 4: That's true for the organizations we work with. That's true 551 00:27:24,196 --> 00:27:26,196 Speaker 4: for us as an organization.
You know, we're trying to 552 00:27:26,196 --> 00:27:29,796 Speaker 4: follow the same project of learning from our own track 553 00:27:29,836 --> 00:27:32,436 Speaker 4: record and history to make better decisions in the future 554 00:27:32,476 --> 00:27:36,116 Speaker 4: and hopefully help people even more. The groups that we 555 00:27:36,156 --> 00:27:38,796 Speaker 4: work with most, and I think the kinds of people 556 00:27:38,836 --> 00:27:41,516 Speaker 4: who are most drawn to what we're doing, whether it's 557 00:27:41,516 --> 00:27:46,236 Speaker 4: donors or practitioners, are people whose interest, I think, 558 00:27:46,396 --> 00:27:48,436 Speaker 4: you know, the idea that they have, that we have, 559 00:27:48,916 --> 00:27:52,556 Speaker 4: is to try to find the way to use charitable 560 00:27:52,596 --> 00:27:55,796 Speaker 4: dollars to accomplish as much good as possible. And if 561 00:27:55,796 --> 00:27:59,676 Speaker 4: that idea is preventing HIV in young children, then great. 562 00:28:00,036 --> 00:28:03,596 Speaker 4: And if we can do better, if we can distribute 563 00:28:03,676 --> 00:28:07,316 Speaker 4: oral rehydration solution to prevent deaths from diarrhea, or if 564 00:28:07,316 --> 00:28:12,556 Speaker 4: we can encourage testing and treatment to prevent cases 565 00:28:12,556 --> 00:28:16,436 Speaker 4: of tuberculosis in children, then great. What I ultimately care about, the 566 00:28:16,436 --> 00:28:19,196 Speaker 4: thing that's important to me, is just helping children, and 567 00:28:19,396 --> 00:28:22,196 Speaker 4: I'm not drawn to the specific cause or disease as 568 00:28:22,236 --> 00:28:24,236 Speaker 4: much as the outcome, which is trying to enable more 569 00:28:24,236 --> 00:28:26,716 Speaker 4: people to live long, healthy lives.
570 00:28:27,036 --> 00:28:33,316 Speaker 2: I feel like, you know, traditionally philanthropy was largely about 571 00:28:34,276 --> 00:28:38,076 Speaker 2: making donors feel good, and maybe it still is to 572 00:28:38,116 --> 00:28:40,956 Speaker 2: some degree, the nature of human nature being what it is. 573 00:28:41,516 --> 00:28:45,036 Speaker 2: But it seems like the growth of GiveWell and 574 00:28:45,076 --> 00:28:51,356 Speaker 2: of sort of research driven philanthropy more generally has coincided 575 00:28:51,436 --> 00:28:57,236 Speaker 2: with the long boom of Silicon Valley, right? And it 576 00:28:57,316 --> 00:28:59,556 Speaker 2: strikes me that the kind of people who get rich 577 00:28:59,636 --> 00:29:05,996 Speaker 2: in tech are more numerically driven, are more metrics oriented, 578 00:29:06,156 --> 00:29:09,596 Speaker 2: perhaps, than earlier generations of rich people, and that 579 00:29:09,596 --> 00:29:12,476 Speaker 2: that might be sort of part of what is going on, 580 00:29:12,596 --> 00:29:14,196 Speaker 2: part of the wind at your back, part of the 581 00:29:14,316 --> 00:29:18,156 Speaker 2: rise of research driven philanthropy. Do you buy that? 582 00:29:19,036 --> 00:29:21,396 Speaker 4: I think there's a kind of person in tech and 583 00:29:21,756 --> 00:29:25,556 Speaker 4: also in parts of finance, those are the two sectors 584 00:29:25,756 --> 00:29:29,356 Speaker 4: from which we draw most of our donors, who have, 585 00:29:30,076 --> 00:29:32,556 Speaker 4: I think, this perspective that they take on the world: 586 00:29:32,636 --> 00:29:36,396 Speaker 4: we know that there's a lot we can 587 00:29:36,476 --> 00:29:39,036 Speaker 4: be wrong about. We know there can be big differences 588 00:29:39,076 --> 00:29:42,516 Speaker 4: in the investments we make or the decisions we make.
589 00:29:42,596 --> 00:29:45,836 Speaker 4: As a company leader, we also know that we can 590 00:29:45,876 --> 00:29:47,556 Speaker 4: be wrong and we want to learn about how to 591 00:29:47,556 --> 00:29:49,396 Speaker 4: do it better. And I do think we see that 592 00:29:49,436 --> 00:29:53,876 Speaker 4: coming out of those industries and it's a big part 593 00:29:53,956 --> 00:29:56,996 Speaker 4: of what has helped us grow to the size that 594 00:29:57,036 --> 00:29:57,596 Speaker 4: we are today. 595 00:29:58,676 --> 00:30:01,036 Speaker 2: So you were mentioning that when you first started out 596 00:30:01,076 --> 00:30:04,596 Speaker 2: and you were calling up charities and saying, what evidence 597 00:30:04,636 --> 00:30:08,076 Speaker 2: do you have that you're actually helping people basically, and 598 00:30:08,076 --> 00:30:11,196 Speaker 2: they would say, how dare you? Who are you? Why 599 00:30:11,196 --> 00:30:13,916 Speaker 2: are you asking me this? I'm spending my life helping people. 600 00:30:15,476 --> 00:30:17,756 Speaker 2: What do they say now when you call them up? 601 00:30:17,756 --> 00:30:18,716 Speaker 2: Has that changed? 602 00:30:19,156 --> 00:30:22,476 Speaker 4: Very practically? It's changed for us because when I was 603 00:30:22,476 --> 00:30:24,676 Speaker 4: calling people up almost twenty years ago, I was offering 604 00:30:24,676 --> 00:30:27,636 Speaker 4: them a thousand bucks and you know, now we have 605 00:30:27,676 --> 00:30:29,796 Speaker 4: a lot of funding to give, and so that does 606 00:30:29,836 --> 00:30:30,836 Speaker 4: make them more responsive. 607 00:30:31,236 --> 00:30:33,876 Speaker 2: Oh, that's interesting. So there's a sort of pull. So basically, 608 00:30:33,956 --> 00:30:36,396 Speaker 2: because you're directing hundreds of millions of dollars a year, 609 00:30:37,716 --> 00:30:40,516 Speaker 2: organizations have an incentive to be more research based. 
610 00:30:40,716 --> 00:30:43,876 Speaker 4: Yeah, I mean, very fundamentally, you know, there's 611 00:30:43,876 --> 00:30:46,716 Speaker 4: a problem, so to speak, in the charitable market, where 612 00:30:46,756 --> 00:30:50,636 Speaker 4: the person deciding to open their wallet is not the 613 00:30:50,716 --> 00:30:55,156 Speaker 4: person who's ultimately receiving the service. So there's a disconnect 614 00:30:55,276 --> 00:30:59,156 Speaker 4: between the recipient and the giver. Whereas in the consumer 615 00:30:59,196 --> 00:31:02,316 Speaker 4: market that we're used to, you know, I purchase my 616 00:31:02,436 --> 00:31:04,316 Speaker 4: laptop and then I also use it and see how 617 00:31:04,316 --> 00:31:04,756 Speaker 4: good it is. 618 00:31:04,876 --> 00:31:06,716 Speaker 2: Yeah, and if it sucks, 619 00:31:06,716 --> 00:31:08,596 Speaker 2: that company goes out of business. 620 00:31:08,436 --> 00:31:10,476 Speaker 4: Out of business. But there's job security if you're great 621 00:31:10,516 --> 00:31:14,716 Speaker 4: at fundraising from donors but terrible at delivering a program; 622 00:31:14,796 --> 00:31:16,956 Speaker 4: no one might ever know. And so, I think 623 00:31:17,236 --> 00:31:22,876 Speaker 4: just concretely, GiveWell has helped support, in 624 00:31:22,916 --> 00:31:26,036 Speaker 4: a very small way, the creation of an incentive to 625 00:31:26,116 --> 00:31:29,836 Speaker 4: operate in a way that is focused on demonstrating impact, 626 00:31:30,076 --> 00:31:33,036 Speaker 4: because the dollars that we have to give, the dollars 627 00:31:33,076 --> 00:31:37,436 Speaker 4: that we have to influence, are going after that evidence 628 00:31:37,596 --> 00:31:40,516 Speaker 4: of strong impact. And I should say, of course, like, 629 00:31:40,596 --> 00:31:44,156 Speaker 4: we are just part of a larger and, I think, 630 00:31:44,236 --> 00:31:48,196 Speaker 4: ever growing ecosystem.
You see this in the academics who 631 00:31:48,516 --> 00:31:53,116 Speaker 4: launched the randomized controlled trial movement in economics, organizations like 632 00:31:53,916 --> 00:31:58,196 Speaker 4: Evidence Action, the Clinton Health Access Initiative, GiveDirectly. I 633 00:31:58,196 --> 00:32:00,956 Speaker 4: mean, it's a large and growing group of institutions, 634 00:32:00,996 --> 00:32:03,676 Speaker 4: even beyond the scope of just GiveWell, that are 635 00:32:03,876 --> 00:32:07,556 Speaker 4: operating in a way that is explicitly aiming to deliver 636 00:32:08,316 --> 00:32:11,916 Speaker 4: great results and demonstrate that those results are coming to fruition. 637 00:32:12,436 --> 00:32:14,636 Speaker 4: And I think that is just a massive change from 638 00:32:14,636 --> 00:32:16,116 Speaker 4: where we were twenty years ago. 639 00:32:16,756 --> 00:32:20,676 Speaker 2: So you mentioned GiveDirectly, and as it happens, 640 00:32:20,676 --> 00:32:23,516 Speaker 2: there's a sort of charitable giving project out of one 641 00:32:23,556 --> 00:32:26,716 Speaker 2: of the shows at Pushkin that gives money to GiveDirectly, 642 00:32:26,716 --> 00:32:30,636 Speaker 2: basically. And I'm curious about GiveWell's sort of 643 00:32:30,676 --> 00:32:33,516 Speaker 2: ongoing evaluation of GiveDirectly. Like, what do you think 644 00:32:33,556 --> 00:32:35,316 Speaker 2: of GiveDirectly's work, in 645 00:32:35,356 --> 00:32:39,356 Speaker 2: a quantitative, professional way? 646 00:32:39,236 --> 00:32:42,996 Speaker 4: Yeah, I think extremely, extremely highly of them. I've personally been a 647 00:32:42,996 --> 00:32:46,236 Speaker 4: GiveDirectly donor for many years, you know, continued to 648 00:32:46,236 --> 00:32:48,876 Speaker 4: give to them last year and will this year, because 649 00:32:48,876 --> 00:32:50,876 Speaker 4: I really love what they do.
I think it's just 650 00:32:50,876 --> 00:32:54,356 Speaker 4: so critical to say, you know, with some of our giving, 651 00:32:54,636 --> 00:32:58,556 Speaker 4: let's make sure that we're just supporting people to purchase, 652 00:32:58,876 --> 00:32:59,876 Speaker 4: you know, what they most want. 653 00:32:59,756 --> 00:33:05,396 Speaker 2: Still to come on the show, Ellie talks about 654 00:33:05,436 --> 00:33:07,916 Speaker 2: some of the most surprising things he's seen in his 655 00:33:08,076 --> 00:33:09,476 Speaker 2: nearly two decades 656 00:33:09,236 --> 00:33:10,156 Speaker 4: in the charity world. 657 00:33:20,276 --> 00:33:23,836 Speaker 2: What do you make of the fact that five thousand 658 00:33:23,916 --> 00:33:26,196 Speaker 2: dollars can save a child's life? 659 00:33:27,196 --> 00:33:30,076 Speaker 4: I think it is just an illustration of, on some 660 00:33:30,156 --> 00:33:33,916 Speaker 4: level, how unjust our world is, you know, potentially how 661 00:33:35,436 --> 00:33:38,396 Speaker 4: I think all of us, myself included, perhaps don't 662 00:33:38,436 --> 00:33:42,156 Speaker 4: really take it as seriously as we should, the kind of 663 00:33:42,196 --> 00:33:46,236 Speaker 4: impact that we can have overseas. But fundamentally, you know, 664 00:33:46,276 --> 00:33:49,396 Speaker 4: I think it shows that something is very broken in 665 00:33:50,356 --> 00:33:56,316 Speaker 4: our system for allocating resources globally, because it's very hard 666 00:33:56,316 --> 00:34:00,036 Speaker 4: to accept that it's possible to save someone's life for 667 00:34:00,076 --> 00:34:01,196 Speaker 4: five thousand dollars.
668 00:34:01,596 --> 00:34:05,676 Speaker 2: Yeah, I mean, with all the money, even with all 669 00:34:05,716 --> 00:34:10,116 Speaker 2: the money that people give away, like, why don't people 670 00:34:10,156 --> 00:34:13,556 Speaker 2: give enough money to buy bed nets for kids so 671 00:34:13,596 --> 00:34:15,956 Speaker 2: they don't get malaria, right? Like, there's some amount of money. 672 00:34:16,156 --> 00:34:20,716 Speaker 2: The more money people give, the less valuable each marginal 673 00:34:20,756 --> 00:34:24,436 Speaker 2: net would be, right? Like, why haven't people given enough 674 00:34:24,476 --> 00:34:28,036 Speaker 2: money to these programs to sort of give away all 675 00:34:28,076 --> 00:34:30,156 Speaker 2: the bed nets that you need to give away, and 676 00:34:30,196 --> 00:34:32,316 Speaker 2: give away all the malaria medicine that you'd need to 677 00:34:32,356 --> 00:34:36,156 Speaker 2: give away to stop kids from dying of malaria, at 678 00:34:36,236 --> 00:34:39,476 Speaker 2: least in these high intensity malaria areas where it's obvious 679 00:34:39,516 --> 00:34:41,676 Speaker 2: that kids are going to die of malaria every year? 680 00:34:43,036 --> 00:34:45,236 Speaker 4: I think, first of all, it's just worth noting how 681 00:34:45,316 --> 00:34:48,236 Speaker 4: much progress we have made globally in the last twenty 682 00:34:48,236 --> 00:34:53,556 Speaker 4: five years. The US government has given huge amounts to 683 00:34:54,436 --> 00:34:57,716 Speaker 4: a program called PEPFAR, focused on HIV, the President's Malaria 684 00:34:57,716 --> 00:35:01,956 Speaker 4: Initiative, focused on malaria, and has been instrumental in the 685 00:35:01,956 --> 00:35:05,236 Speaker 4: creation of the Global Fund, which focuses on HIV, TB, and malaria, 686 00:35:05,596 --> 00:35:08,396 Speaker 4: and GAVI, which focuses on immunizations.
So since the year 687 00:35:08,436 --> 00:35:12,756 Speaker 4: two thousand, funding going to global health problems 688 00:35:12,916 --> 00:35:16,796 Speaker 4: has gone up a huge amount. It has plateaued more recently, 689 00:35:16,836 --> 00:35:18,516 Speaker 4: but it's gone up a huge amount, and we see 690 00:35:18,956 --> 00:35:22,716 Speaker 4: a massive reduction in child mortality. So we're doing a 691 00:35:22,716 --> 00:35:25,316 Speaker 4: lot better today than we were in the recent past. 692 00:35:25,916 --> 00:35:28,556 Speaker 4: And then, I guess, like, fundamentally, I don't know why 693 00:35:29,196 --> 00:35:33,476 Speaker 4: people don't give more, or even give more to these charities, right? 694 00:35:33,556 --> 00:35:38,716 Speaker 4: It's more a question of direction. It's not even 695 00:35:38,756 --> 00:35:41,436 Speaker 4: why don't people give more money. It's like, if it's 696 00:35:41,516 --> 00:35:45,556 Speaker 4: really that easy to save a kid's life, like, we 697 00:35:45,596 --> 00:35:48,876 Speaker 4: want that number to go up, right, the cheaper 698 00:35:48,916 --> 00:35:49,956 Speaker 4: it is to save a kid's life. 699 00:35:49,956 --> 00:35:51,996 Speaker 2: I mean, it kind of cuts both ways, right? 700 00:35:52,076 --> 00:35:53,796 Speaker 2: On the one hand, it's like, well, great, we know 701 00:35:53,876 --> 00:35:56,716 Speaker 2: a thing that is helpful. But on the other hand, like, well, 702 00:35:56,796 --> 00:35:59,636 Speaker 2: let's buy all the bed nets. So it's not so easy, 703 00:35:59,716 --> 00:36:01,316 Speaker 2: do you know what I'm saying? Uh? 704 00:36:01,356 --> 00:36:04,356 Speaker 4: Completely. And yeah, I think that, you know, GiveWell 705 00:36:04,556 --> 00:36:06,956 Speaker 4: raises about three hundred million dollars a year.
A Give 706 00:34:01,596 --> 00:34:05,676 Speaker 4: Well that was raising a billion dollars a year, the 707 00:34:10,596 --> 00:34:14,156 Speaker 4: marginal dollars would be much less cost effective, 708 00:34:14,396 --> 00:34:16,036 Speaker 4: because we would have gone much further. 709 00:34:16,636 --> 00:34:18,596 Speaker 2: Weirdly, you want to get to a place where it's 710 00:34:18,676 --> 00:34:21,956 Speaker 2: more expensive to save a child's life. Like, the more 711 00:34:21,996 --> 00:34:24,316 Speaker 2: expensive it is, the less inequality there is in the world, 712 00:34:24,356 --> 00:34:25,996 Speaker 2: the more kids' lives we're saving. 713 00:34:26,276 --> 00:34:30,196 Speaker 4: Exactly, exactly, you know. And then I think Give 714 00:34:30,236 --> 00:34:33,636 Speaker 4: Well itself is an institution that has raised much more 715 00:34:33,676 --> 00:34:36,396 Speaker 4: money over the last fifteen years than was raised previously, 716 00:34:36,476 --> 00:34:38,836 Speaker 4: and I think it reflects more people giving. 717 00:34:38,996 --> 00:34:41,836 Speaker 4: Why aren't people giving more to these programs? I think 718 00:34:41,916 --> 00:34:49,076 Speaker 4: because, honestly, the suffering and the poverty of, say, 719 00:34:49,356 --> 00:34:52,836 Speaker 4: the poorest parts of Sub-Saharan Africa is something that 720 00:34:52,876 --> 00:34:56,796 Speaker 4: we are largely blind to in our day-to-day lives. 721 00:34:56,796 --> 00:35:02,196 Speaker 4: You know, we cover natural disasters when they occur, 722 00:35:02,956 --> 00:35:08,476 Speaker 4: but no one is covering literally the daily catastrophe of 723 00:35:09,236 --> 00:35:12,676 Speaker 4: child deaths due to infectious disease in Sub-Saharan Africa. 724 00:35:12,756 --> 00:35:15,916 Speaker 4: You know, very roughly one thousand children die every day 725 00:35:16,356 --> 00:35:19,756 Speaker 4: of malaria.
We know how to prevent it, and that's 726 00:37:19,796 --> 00:37:22,956 Speaker 4: not covered because, well, I guess I don't know 727 00:37:22,996 --> 00:37:24,956 Speaker 4: exactly why. That's a question for you, not for me. 728 00:37:25,116 --> 00:37:27,316 Speaker 4: But it's not covered. And I think because of that, 729 00:37:27,796 --> 00:37:31,236 Speaker 4: on some level, we're able to live as if it's 730 00:37:31,236 --> 00:37:34,116 Speaker 4: not really there, and that motivating force to get people 731 00:37:34,156 --> 00:37:37,196 Speaker 4: to see it and then act isn't happening. 732 00:37:38,156 --> 00:37:45,436 Speaker 2: You've been doing this now for fifteen years-ish. If 733 00:37:45,476 --> 00:37:50,916 Speaker 2: you go back to when you started, what's been surprising 734 00:37:50,956 --> 00:37:52,436 Speaker 2: to you? What has happened in a way you would 735 00:37:52,436 --> 00:37:53,156 Speaker 2: not have expected? 736 00:37:54,236 --> 00:37:59,076 Speaker 4: I was really surprised when we started at how strange 737 00:37:59,716 --> 00:38:02,956 Speaker 4: our questions seemed to the organizations that we were going to. 738 00:38:03,596 --> 00:38:09,236 Speaker 4: The question of how effective are your programs, what are 739 00:38:09,236 --> 00:38:12,076 Speaker 4: you accomplishing, and how do you know, seems like 740 00:38:12,116 --> 00:38:17,196 Speaker 4: a really obvious question. It's also really surprising to me 741 00:38:17,436 --> 00:38:20,196 Speaker 4: how much we've grown. When we started this, I think 742 00:38:20,236 --> 00:38:25,356 Speaker 4: we thought that we were just a couple 743 00:38:25,356 --> 00:38:28,956 Speaker 4: of guys who had this idiosyncratic interest in an approach 744 00:38:28,996 --> 00:38:31,956 Speaker 4: to charitable giving.
And when we talked to people who 745 00:38:31,996 --> 00:38:36,876 Speaker 4: worked in philanthropy, they reacted like we were nuts, 746 00:38:36,916 --> 00:38:39,116 Speaker 4: that no one would ever be into this, that this is 747 00:38:39,116 --> 00:38:42,836 Speaker 4: not what donors want: donors go to galas, and donors like stories 748 00:38:42,876 --> 00:38:47,556 Speaker 4: and their names on buildings. And wow, it's shocking. You know, 749 00:38:47,596 --> 00:38:50,276 Speaker 4: people are normal. People are just giving three 750 00:38:50,316 --> 00:38:53,316 Speaker 4: hundred million dollars a year to help people around the world, 751 00:38:53,396 --> 00:38:56,356 Speaker 4: and they're by and large anonymous, they're not getting their names 752 00:38:56,356 --> 00:38:59,556 Speaker 4: on buildings, and they're just, you know, following along, trying 753 00:38:59,556 --> 00:39:00,916 Speaker 4: to make as big a difference as they can in 754 00:39:00,956 --> 00:39:04,076 Speaker 4: the lives of people they'll never meet. And you know, on 755 00:39:04,076 --> 00:39:06,676 Speaker 4: some level maybe that makes sense, but that's also 756 00:39:06,716 --> 00:39:07,436 Speaker 4: really surprising. 757 00:39:08,156 --> 00:39:11,356 Speaker 2: It seems like the surprises are happy surprises. I 758 00:39:11,396 --> 00:39:13,356 Speaker 2: appreciate your time very much. It was great to talk 759 00:39:13,396 --> 00:39:15,476 Speaker 2: with you. Yeah, that was great to be here. Thanks 760 00:39:15,476 --> 00:39:20,076 Speaker 2: so much, Jacob. Ellie Hasenfeld is the co-founder and 761 00:39:20,196 --> 00:39:25,156 Speaker 2: CEO of GiveWell. The last conversation on the show is 762 00:39:25,196 --> 00:39:29,276 Speaker 2: with Nate Silver and Maria Konikova. Nate is a statistician, 763 00:39:29,716 --> 00:39:33,716 Speaker 2: Maria's a psychologist.
They are both writers, and together they 764 00:39:33,796 --> 00:39:37,716 Speaker 2: host a podcast called Risky Business. They are also both 765 00:39:37,876 --> 00:39:41,196 Speaker 2: professional poker players, and that's really why I wanted to 766 00:39:41,236 --> 00:39:44,076 Speaker 2: talk to them about charitable giving. One of the things 767 00:39:44,116 --> 00:39:46,956 Speaker 2: they do on their show is they talk about bringing 768 00:39:46,956 --> 00:39:50,236 Speaker 2: a poker mindset to the decision making of everyday life, 769 00:39:50,596 --> 00:39:53,156 Speaker 2: and so I wanted to hear from them how professional 770 00:39:53,156 --> 00:39:58,756 Speaker 2: gamblers think about giving money away. So, like, the core 771 00:39:58,836 --> 00:40:02,156 Speaker 2: idea of the show is making better decisions using this 772 00:40:02,316 --> 00:40:08,996 Speaker 2: expected value framework, right? In, like, one sentence, what's expected value? 773 00:40:09,196 --> 00:40:14,756 Speaker 3: Expected value is the net benefit you expect to 774 00:40:14,756 --> 00:40:20,556 Speaker 3: get, averaged over all the uncertain outcomes. Now, I guess 775 00:40:20,556 --> 00:40:24,196 Speaker 3: with charitable giving, maybe it's more deterministic, where we know, 776 00:40:24,356 --> 00:40:27,356 Speaker 3: for example, that mosquito nets in Africa have a high 777 00:40:27,436 --> 00:40:30,476 Speaker 3: return on investment. They save lives and prevent malaria 778 00:40:31,156 --> 00:40:32,836 Speaker 3: at a relatively low cost. There's not, like, a 779 00:40:32,916 --> 00:40:37,156 Speaker 3: random element there exactly, although there are always some implementation issues, 780 00:40:38,156 --> 00:40:39,996 Speaker 3: but really it's a framework about utility.
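[Editor's note: Nate's one-sentence definition, the net benefit averaged over all the uncertain outcomes, can be sketched in a few lines of Python. The bet below is entirely made up for illustration; it is not a number from the show.]

```python
# Expected value: the probability-weighted average of net benefit
# over all uncertain outcomes. Numbers below are invented examples.
def expected_value(outcomes):
    """outcomes: list of (probability, net_benefit) pairs whose probabilities sum to 1."""
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    return sum(p * benefit for p, benefit in outcomes)

# A hypothetical bet: 25% chance of winning $120, 75% chance of losing $20.
bet = [(0.25, 120.0), (0.75, -20.0)]
print(expected_value(bet))  # 0.25*120 - 0.75*20 = 15.0, a plus-EV bet
```

In poker terms, a "plus-EV" decision is simply one where this average comes out positive, even though any single outcome can still lose money.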
781 00:40:40,356 --> 00:40:42,396 Speaker 5: And I was just going to jump in a little bit 782 00:40:42,436 --> 00:40:46,876 Speaker 5: to say that, you know, we have the economic definition 783 00:40:46,916 --> 00:40:50,156 Speaker 5: of expected value, and then when you look at behavioral 784 00:40:50,236 --> 00:40:53,276 Speaker 5: economics and the way that people actually make decisions, you 785 00:40:53,356 --> 00:40:56,796 Speaker 5: realize that there's a lot of psychology involved as well, 786 00:40:57,076 --> 00:41:01,596 Speaker 5: and so calculating expected value is not as straightforward as 787 00:41:01,676 --> 00:41:05,236 Speaker 5: just kind of doing these dollar calculations, because, you know, 788 00:41:05,276 --> 00:41:08,116 Speaker 5: how do you put a dollar amount on how good 789 00:41:08,116 --> 00:41:10,676 Speaker 5: you feel after a decision, or how bad you feel, 790 00:41:10,996 --> 00:41:13,476 Speaker 5: or the regret that you might feel when you don't 791 00:41:13,516 --> 00:41:16,156 Speaker 5: take a decision. And when we're looking at kind of 792 00:41:16,196 --> 00:41:19,356 Speaker 5: the broader picture of expected value, you do have to 793 00:41:19,396 --> 00:41:21,916 Speaker 5: try to quantify that a little bit and try to 794 00:41:21,916 --> 00:41:24,716 Speaker 5: account for all of those different psychological factors that come 795 00:41:24,756 --> 00:41:25,636 Speaker 5: into play as well. 796 00:41:26,236 --> 00:41:28,316 Speaker 2: One of my favorite things about your show is when 797 00:41:28,316 --> 00:41:32,396 Speaker 2: you talk about the culture of professional poker players.
Basically, 798 00:41:32,396 --> 00:41:35,716 Speaker 2: you're both professional poker players and you live in this 799 00:41:35,876 --> 00:41:40,756 Speaker 2: universe where people treat money really differently, right? And there 800 00:41:40,756 --> 00:41:42,996 Speaker 2: are these two terms that come up a lot on 801 00:41:43,036 --> 00:41:48,556 Speaker 2: the show, two kinds of people: nits and degens. What's 802 00:41:48,596 --> 00:41:50,756 Speaker 2: a nit? I'll leave this one to you, Nate. 803 00:41:52,756 --> 00:41:55,676 Speaker 3: A nit is basically George Costanza, right? It's like a 804 00:41:56,396 --> 00:42:02,796 Speaker 3: neurotic, risk-averse, cheap... but more, someone who is 805 00:42:02,836 --> 00:42:07,156 Speaker 3: so neurotic that they aren't taking plus-EV bets. 806 00:42:07,276 --> 00:42:07,476 Speaker 2: Right. 807 00:42:07,556 --> 00:42:09,796 Speaker 3: They're too conservative for their own good when it 808 00:42:09,836 --> 00:42:14,956 Speaker 3: comes to playing poker hands, for example, and have a 809 00:42:14,956 --> 00:42:17,996 Speaker 3: low openness to experience perhaps, and can be annoying. 810 00:42:18,596 --> 00:42:22,316 Speaker 2: They're the ones who want to itemize the bill. Uh, 811 00:42:22,716 --> 00:42:25,356 Speaker 2: wait, I didn't eat the app. I shouldn't have 812 00:42:25,396 --> 00:42:28,916 Speaker 2: to pay for one third of the app. That's the, yeah. 813 00:42:29,036 --> 00:42:32,396 Speaker 3: Whereas a degen is someone who likes to gamble, is 814 00:42:32,596 --> 00:42:36,476 Speaker 3: risk tolerant, maybe to their own detriment, is freewheeling with 815 00:42:36,636 --> 00:42:39,676 Speaker 3: money and splashes it around. 816 00:42:39,996 --> 00:42:42,916 Speaker 2: Yes, and degen is short for degenerate gambler, but 817 00:42:43,116 --> 00:42:46,556 Speaker 2: in a loving way.
Right, that's my... usually, yeah. So, 818 00:42:47,156 --> 00:42:53,516 Speaker 2: in your experience, who is more generous, a nit or a degen? A degen? Absolutely? Oh yeah, 819 00:42:53,556 --> 00:42:57,076 Speaker 2: for sure. I don't think it's even close. You guys 820 00:42:57,116 --> 00:42:57,876 Speaker 2: give to charity? 821 00:42:59,916 --> 00:43:02,836 Speaker 5: I give to charitable causes that I believe in. So, 822 00:43:03,076 --> 00:43:07,516 Speaker 5: for instance, I gave a lot of money to Ukraine 823 00:43:07,956 --> 00:43:12,636 Speaker 5: when Russia invaded, and that is kind of how I 824 00:43:12,676 --> 00:43:16,196 Speaker 5: calculate my charitable giving. I understand that mosquito nets in 825 00:43:16,236 --> 00:43:19,036 Speaker 5: Africa are incredibly important. I have not given to malaria 826 00:43:19,116 --> 00:43:21,956 Speaker 5: because that is not something that, you know, I feel 827 00:43:21,996 --> 00:43:24,876 Speaker 5: strongly about. There are other people who feel strongly about that. 828 00:43:24,956 --> 00:43:27,116 Speaker 5: So for me, that's part of it. You know, I have 829 00:43:27,196 --> 00:43:30,516 Speaker 5: given to educational causes. You know, I give to things 830 00:43:30,596 --> 00:43:33,436 Speaker 5: that I have a connection with and that I feel 831 00:43:33,556 --> 00:43:34,876 Speaker 5: also are underfunded. 832 00:43:35,396 --> 00:43:38,076 Speaker 2: Maria, you mentioned giving to Ukraine, and, I mean, I 833 00:43:38,116 --> 00:43:40,516 Speaker 2: know you a little bit. I know your life 834 00:43:40,556 --> 00:43:43,156 Speaker 2: story seems connected to that, right? Tell me about your 835 00:43:43,156 --> 00:43:46,756 Speaker 2: connection to helping Ukraine defend itself against Russia. Yeah.
836 00:43:46,836 --> 00:43:50,836 Speaker 5: So my dad is Ukrainian and my mom is from Moscow, 837 00:43:51,236 --> 00:43:54,556 Speaker 5: and I was born in Moscow and then came to 838 00:43:54,596 --> 00:43:57,716 Speaker 5: the United States when I was four years old, and 839 00:43:57,956 --> 00:44:03,916 Speaker 5: have always been very anti the autocratic tendencies of Russia, 840 00:44:04,796 --> 00:44:08,396 Speaker 5: very anti-Putin. And I think, you know, when Putin 841 00:44:08,436 --> 00:44:11,956 Speaker 5: invaded Ukraine, to me, that was a no-brainer. 842 00:44:12,196 --> 00:44:14,036 Speaker 5: I think at this point Ukraine is one of the 843 00:44:14,036 --> 00:44:18,356 Speaker 5: only things standing between us and a third world war, basically, 844 00:44:18,396 --> 00:44:20,836 Speaker 5: the fact that they're able to resist him. So for 845 00:44:20,916 --> 00:44:25,636 Speaker 5: me it was incredibly personal. And so I'm all in 846 00:44:25,876 --> 00:44:26,956 Speaker 5: on the Zelensky camp. 847 00:44:27,876 --> 00:44:32,796 Speaker 3: Nate Silver, do you give money away? I'm having to reevaluate. 848 00:44:33,556 --> 00:44:34,836 Speaker 3: I mean, the short answer is I haven't. 849 00:44:34,996 --> 00:44:37,716 Speaker 2: Uh huh. Does either saying on a show that you 850 00:44:37,796 --> 00:44:40,676 Speaker 2: haven't given money away, or hearing Maria talk about giving 851 00:44:40,716 --> 00:44:44,036 Speaker 2: money away, like, honestly, do you think it makes 852 00:44:44,116 --> 00:44:45,996 Speaker 2: it any more likely that you'll give money away? Do you 853 00:44:46,036 --> 00:44:47,316 Speaker 2: think it's going to have any effect on you? 854 00:44:47,756 --> 00:44:51,356 Speaker 3: I'm generally not a person who is governed by guilt. 855 00:44:51,596 --> 00:44:53,516 Speaker 2: Ah. That's... I mean, I think I should.
856 00:44:54,196 --> 00:44:57,956 Speaker 3: There's, like, a lot of long-term financial planning that 857 00:44:57,956 --> 00:45:00,356 Speaker 3: gets put off. I mean, these are discussions that, you know, 858 00:45:00,396 --> 00:45:04,756 Speaker 3: in my household we've had, and so we're aware of 859 00:45:04,796 --> 00:45:07,236 Speaker 3: this question and what we want to do with our 860 00:45:07,276 --> 00:45:09,596 Speaker 3: money in the long term. I remember thinking about it actively, 861 00:45:09,716 --> 00:45:15,316 Speaker 3: but kind of things get short-circuited. I mean, look, clearly, 862 00:45:15,396 --> 00:45:17,356 Speaker 3: I think that, you know, in some abstract sense, you 863 00:45:17,356 --> 00:45:19,516 Speaker 3: are being selfish if you're not. If you have a 864 00:45:19,556 --> 00:45:21,676 Speaker 3: comfortable life, then you are being selfish to not give. 865 00:45:24,116 --> 00:45:25,516 Speaker 3: But it's easy to make excuses. 866 00:45:25,916 --> 00:45:29,436 Speaker 2: I appreciate your honesty. So, Nate, does that make you 867 00:45:29,476 --> 00:45:32,196 Speaker 2: a nit and Maria the degen? I never would 868 00:45:32,196 --> 00:45:34,956 Speaker 2: have guessed. Well, but I am. 869 00:45:35,156 --> 00:45:37,396 Speaker 3: I think Maria is also. I am generous in those 870 00:45:37,436 --> 00:45:40,236 Speaker 3: other ways, yeah, in things like tips, things like, 871 00:45:40,596 --> 00:45:44,436 Speaker 3: you know, picking up checks, even at fairly expensive meals 872 00:45:44,516 --> 00:45:47,196 Speaker 3: or something, for my friends. And so maybe psychologically that 873 00:45:47,236 --> 00:45:50,796 Speaker 3: feels more satisfying than the abstract charitable giving.
874 00:45:51,076 --> 00:45:53,716 Speaker 2: Yes, I mean, that is an interesting tension, right? Like, 875 00:45:53,796 --> 00:45:56,236 Speaker 2: it feels better to buy lunch for your friend than 876 00:45:56,316 --> 00:45:59,036 Speaker 2: to send the money off to somebody thousands of miles away, 877 00:45:59,116 --> 00:46:03,116 Speaker 2: even though obviously the marginal benefit of that, whatever, one 878 00:46:03,156 --> 00:46:06,636 Speaker 2: hundred bucks, fifty bucks, any bucks, is clearly greater 879 00:46:06,716 --> 00:46:09,836 Speaker 2: if you send it off thousands of miles away. Dinner with Jacob in New York would 880 00:46:11,156 --> 00:46:13,076 Speaker 2: be great. I was trying 881 00:46:13,076 --> 00:46:15,156 Speaker 2: to be a man of the people there. And when 882 00:46:15,196 --> 00:46:17,756 Speaker 2: are we getting dinner? I totally agree. 883 00:46:17,796 --> 00:46:20,956 Speaker 5: And yeah, and I also just want to 884 00:46:20,956 --> 00:46:23,836 Speaker 5: add a little bit to say that I think that 885 00:46:24,516 --> 00:46:28,156 Speaker 5: in this particular case, like, role models really do matter 886 00:46:28,196 --> 00:46:33,756 Speaker 5: when it comes to charitable giving tendencies, psychologically speaking. 887 00:46:34,956 --> 00:46:37,596 Speaker 5: And one of the things that has made me actually 888 00:46:37,956 --> 00:46:40,596 Speaker 5: kind of give more than I have in the past 889 00:46:40,676 --> 00:46:43,156 Speaker 5: is my parents, who don't have a lot of money 890 00:46:43,956 --> 00:46:47,716 Speaker 5: and, you know, are single income because my mom 891 00:46:48,476 --> 00:46:54,476 Speaker 5: no longer works, and they give a recurring donation every 892 00:46:54,516 --> 00:46:58,156 Speaker 5: single month to causes that they believe strongly in, including Ukraine.
893 00:46:58,756 --> 00:47:00,156 Speaker 5: And when I think about it, I'm like, you can't 894 00:47:00,156 --> 00:47:02,676 Speaker 5: afford to do this, and yet they do it. And 895 00:47:02,956 --> 00:47:06,276 Speaker 5: that, you know, I won't say guilts me into it, 896 00:47:06,476 --> 00:47:08,996 Speaker 5: but it makes me realize that, like, 897 00:47:09,436 --> 00:47:12,236 Speaker 5: you know, it is important to be donating to 898 00:47:12,356 --> 00:47:14,916 Speaker 5: causes where it actually makes a difference. Nate and I 899 00:47:15,036 --> 00:47:19,676 Speaker 5: talked a lot on our podcast about kind of donating 900 00:47:19,756 --> 00:47:23,716 Speaker 5: to political campaigns, and, you know, don't donate to the 901 00:47:23,756 --> 00:47:26,756 Speaker 5: presidential campaign, because they don't need the money. But, 902 00:47:27,076 --> 00:47:29,516 Speaker 5: you know, I think that that's kind of what 903 00:47:29,596 --> 00:47:32,156 Speaker 5: we're talking about in the end: where do you donate 904 00:47:32,316 --> 00:47:34,796 Speaker 5: in a way that your money actually makes an impact 905 00:47:34,916 --> 00:47:36,756 Speaker 5: right now? And it doesn't have to be millions of 906 00:47:36,836 --> 00:47:38,916 Speaker 5: dollars, right? I hope I have millions 907 00:47:38,916 --> 00:47:41,836 Speaker 5: to leave at the end of my life to good causes, 908 00:47:42,116 --> 00:47:45,076 Speaker 5: but, you know, there are causes where even a few 909 00:47:45,156 --> 00:47:48,436 Speaker 5: hundred dollars actually can make a huge difference. 910 00:47:48,556 --> 00:47:51,396 Speaker 3: I guess because there probably is some pleasure from it, right? 911 00:47:51,436 --> 00:47:53,916 Speaker 3: You kind of recognize the ephemeral nature of money, and 912 00:47:53,916 --> 00:47:55,396 Speaker 3: it's kind of more an emotional reaction. 913 00:47:55,516 --> 00:47:56,356 Speaker 2: And then when
914 00:47:58,116 --> 00:47:59,596 Speaker 3: I mean, thinking about charity, it's kind of the more 915 00:47:59,676 --> 00:48:01,916 Speaker 3: rational part of your brain. 916 00:48:01,996 --> 00:48:02,156 Speaker 2: Right. 917 00:48:02,236 --> 00:48:04,436 Speaker 3: You can make excuses along the lines of, well, maybe I'll just 918 00:48:04,436 --> 00:48:05,476 Speaker 3: give it away at the end of my life, and 919 00:48:05,476 --> 00:48:08,556 Speaker 3: I pay a lot of taxes, and, you know, figuring 920 00:48:08,596 --> 00:48:11,396 Speaker 3: out where to give is a discussion you have with your partner, 921 00:48:11,476 --> 00:48:14,036 Speaker 3: and that can get... you know, we have disagreements about that. 922 00:48:14,076 --> 00:48:16,556 Speaker 3: And so it's different than the kind of, oh, let's 923 00:48:16,636 --> 00:48:18,836 Speaker 3: get a nice bottle of wine and I'll 924 00:48:18,836 --> 00:48:21,076 Speaker 3: pick up dinner tonight because I know you had a 925 00:48:21,156 --> 00:48:22,716 Speaker 3: rough tournament series, or something like that. 926 00:48:22,956 --> 00:48:27,996 Speaker 2: Huh. I never thought of you as such an emotional guy. 927 00:48:28,316 --> 00:48:31,836 Speaker 2: So I'm curious in particular. So this show 928 00:48:32,316 --> 00:48:35,876 Speaker 2: is about giving away money, fundamentally, and, you know, we 929 00:48:35,916 --> 00:48:38,996 Speaker 2: can talk about, you know, maximizing EV and what charities 930 00:48:38,996 --> 00:48:41,556 Speaker 2: are good, but there's also just this more general idea 931 00:48:41,556 --> 00:48:44,836 Speaker 2: of pro-social behavior. Spending money on your friends is 932 00:48:44,956 --> 00:48:48,156 Speaker 2: part of this. And so I'm curious, what's your favorite 933 00:48:48,236 --> 00:48:50,956 Speaker 2: degen-dropping-a-bunch-of-money, on their friends, on 934 00:48:51,076 --> 00:48:52,956 Speaker 2: charity, on anything, story?
935 00:48:53,196 --> 00:48:56,316 Speaker 5: Yeah, well, this isn't degen-y, but I 936 00:48:56,356 --> 00:49:00,436 Speaker 5: think it's something that is quite important. The poker community 937 00:49:00,516 --> 00:49:03,876 Speaker 5: actually does give a shit ton of money to charity. 938 00:49:04,036 --> 00:49:06,796 Speaker 5: You know, those aren't the fun degen-y stories, but 939 00:49:06,796 --> 00:49:09,476 Speaker 5: I think, you know, I think it's important 940 00:49:09,476 --> 00:49:12,756 Speaker 5: to note that poker players actually are on top of this, 941 00:49:12,916 --> 00:49:15,356 Speaker 5: and there are a lot of poker players giving millions 942 00:49:15,596 --> 00:49:17,916 Speaker 5: to charity and matching charitable donations. 943 00:49:19,036 --> 00:49:20,836 Speaker 2: Thank you, guys, it was a delight to talk to you. 944 00:49:21,316 --> 00:49:26,756 Speaker 2: Thank you, thank you. So, just before we end the 945 00:49:26,756 --> 00:49:29,116 Speaker 2: show here, I want to just mention one last thing. 946 00:49:29,836 --> 00:49:33,196 Speaker 2: There was a moment in that last conversation when Nate 947 00:49:33,356 --> 00:49:36,676 Speaker 2: was talking about the reasons he hasn't given money, you 948 00:49:36,676 --> 00:49:39,316 Speaker 2: know, how giving money to charity is tied up with 949 00:49:39,356 --> 00:49:43,036 Speaker 2: all these other household financial planning decisions, et cetera, et cetera. 950 00:49:43,596 --> 00:49:45,436 Speaker 2: And I thought back to that thing that Lori Santos 951 00:49:45,596 --> 00:49:48,756 Speaker 2: said at the beginning of the show about friction, you 952 00:49:48,796 --> 00:49:51,916 Speaker 2: know, how friction and not knowing how to give or 953 00:49:51,956 --> 00:49:54,516 Speaker 2: who to give to winds up being this huge barrier.
954 00:49:54,996 --> 00:49:57,516 Speaker 2: And I thought about how Ellie Hasenfeld has spent all 955 00:49:57,516 --> 00:50:00,236 Speaker 2: this time trying to find charities that are very clearly 956 00:50:00,316 --> 00:50:03,156 Speaker 2: doing good, that you can just give money to and 957 00:50:03,316 --> 00:50:05,996 Speaker 2: feel good about. So to close out the show today, 958 00:50:06,716 --> 00:50:08,516 Speaker 2: and to fight against friction in my own life, I'm 959 00:50:08,516 --> 00:50:10,836 Speaker 2: going to go right now to that website that Lori 960 00:50:10,916 --> 00:50:18,516 Speaker 2: was talking about, giving multiplier dot org slash happiness lab, 961 00:50:19,796 --> 00:50:22,356 Speaker 2: and I'm gonna give fifty bucks. Is it the perfect 962 00:50:22,356 --> 00:50:24,876 Speaker 2: amount of money? I don't know. I'm just gonna do 963 00:50:24,956 --> 00:50:34,756 Speaker 2: it right now. Thanks very much to Nate Silver and 964 00:50:34,836 --> 00:50:39,036 Speaker 2: Maria Konikova, the hosts of Risky Business, Ellie Hasenfeld of GiveWell, 965 00:50:39,516 --> 00:50:41,956 Speaker 2: and to Lori Santos, the host of The Happiness Lab, 966 00:50:42,076 --> 00:50:44,836 Speaker 2: who got me thinking about Giving Tuesday in the first place. 967 00:50:46,076 --> 00:50:49,276 Speaker 2: Today's show was produced by Lucy Sullivan and Isabelle Carter, 968 00:50:49,916 --> 00:50:55,156 Speaker 2: edited by Sarah Nix, and engineered by Jake Gorsky. Special 969 00:50:55,156 --> 00:50:58,596 Speaker 2: thanks to Ryan Dilley, Farah Daygrunge, and Owen Miller. I'm 970 00:50:58,676 --> 00:51:01,956 Speaker 2: Jacob Goldstein, and I host the Pushkin show What's Your Problem? 971 00:51:01,996 --> 00:51:11,796 Speaker 2: Thanks for listening.