Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio.

Hey, and welcome to the podcast. I'm Josh Clark, and there's Charles W. Chuck Bryant, and Jerry's here, so that appropriately makes this Stuff You Should Know. That's right. Before we get going, we want to give a little special plug to our good friend John Hodgman and our buddy David Rees, because they got a season two of their awesome animated show Dicktown. Yeah, as in private detective dick. Yeah. I mean, it's cool enough to get one season of a show, but if you've gotten a second season and they're tossing you on FX these days, you've made it. So their show has finally made it, and it's well deserved too, because it's a pretty awesome cartoon. It is, it's very funny. It's actually live now. It premiered on March third at ten p.m. on FXX. You can watch it on Hulu, and the whole jam here is that John and I watched the whole first season. They're short episodes; the whole first season was less than two hours long, which really, like, makes a great case for just streaming the whole thing and laughing a lot in one night. But it's, uh, it's about two detectives, John Hunchman, John Hodgman, and David Purefoy, David Rees. And, uh, Hodgman is a private detective. He was a former boy detective, like an Encyclopedia Brown type, and Dave was his, uh, sort of his Bugs Meany, his nemesis in high school, and now he's his buddy and his sort of muscle and his driver, and they solve cases together. And, uh, season two I think is even bigger and weirder, and it's sort of Scooby-Doo. It's just a lot of fun, really, really fun show. Yeah, the first season they did nothing but solve children's mysteries, and they were humiliated by that. So they've kind of expanded now. They've resolved to be grown-ups, and they're solving grown-up mysteries for grown-ups now, which is really something else.
So yeah, like you said, you can stream the whole first season on Hulu, and you can catch the second season on FXX. I wasn't aware of the extra X. I don't take back what I originally said, it's still big time, but FXX. That's right. Uh, and it is rated PG-13, so if you're thirteen and up you should enjoy. It's got a few swear words, adult themes here and there, but it's great. It's a lot of fun. Happy for Hodgman and Rees. Happy, happy Hodgman. Happy, happy Rees. And I just like saying Dicktown. Sure, it's a great name for a great show. It is.

Should we talk about effective altruism? Yeah, I was gonna say, we're talking about that today, and this one, I don't know if you noticed a similarity, but this one really kind of ties into that Short Stuff that we, uh, we released before the end of the year about charitable giving. Did you notice that? I did. Although in that episode it was like, we're like, yeah, you know, find a charity that speaks to you, and maybe something that's local, or if you have animals, or if you had, you know, a family member with cancer. And this basically says don't do any of that, right. Uh, the only way you should give is by just kind of coldly calculating what would help a human the most on planet Earth.

Yes. So effective altruism is one of those movements. It's a pretty new movement; I think it really started in earnest around two thousand ten, um. And it's one of those movements that, like, elicits passion one way or another. It's a very polarizing idea if you just take it at its bare bones, which people love to do. And the reason why people love to take it at its bare bones, at its extremes, is because it is, at heart, a philosophical movement. It's rooted in utilitarianism, and utilitarianism is even more polarizing, and has been for centuries, than effective altruism is.
And I think if everybody would just move past the most extreme parts of it and just kind of took effective altruism at its most middle ground, where most of it seems to have accumulated and settled and where most of the work is being done, it would be really difficult to disagree with the ideas behind it. It's when you trot out Peter Singer and some of his most extreme views, or when you say, oh, it's all Silicon Valley billionaires, you know, um, when you just look at it like that, that's when people get all riled up and they're like, I hate effective altruism. If you really just kind of take it in a much more level-headed way, it's actually pretty sensible and pretty great, because at the end of the day, you're saving people's lives and you're figuring out how to save the most lives possible.

Yeah. I think anything that has some of its roots in philosophical movements and tech bros, it's a hard sell for a lot of people. Uh, but let's talk about a few things that it is, which is the idea that, uh, there's a lot of good that can be done with money, and if you can provide for yourself and your own basic needs, um, you should probably be giving to charity. Uh, you can take a cold, hard look at your finances by literal, strict financial calculations. If you were a person without kids making forty thousand dollars a year, you are in the ninety-seven-point-four percentile on planet Earth as far as your wealth goes. And you might think, I make forty thousand dollars a year and then I have taxes, and I really feel like people with a lot of money should give to charities; I really don't have enough to spare. The idea is that, no, you have some to spare. You can give a little bit, uh, like ten percent of your money, and still be in the top ninety-six percentile, and you can literally save human lives on planet Earth.
That's the big thing that they're trying to get across here: that, like, the money that you're giving is saving lives that otherwise would be crippled with disease or just not around, like, they would die if you didn't give this money. And because you are giving this money, those people are now living what are called quality-adjusted life years, where they're living an additional healthy year or more because of that intervention that you gave your money for. And yes, it's based on the premise that basically everyone living in the United States is rich compared to entire swaths of the rest of the world, and that basically anyone living in the United States can afford to give ten percent of their income and forego some clothes or some cars or something like that to help other people literally survive. And so right off the bat, we've reached levels of discomfort for the average person, especially the average American, that are really tough to deal with. And so that's the first challenge that effective altruists have: to kind of tamp down that overwhelming sense of guilt and responsibility and shame at not doing that, which immediately kind of crops up in people when they hear about this.

Yeah. So I think maybe let's talk a little bit about the history and some of the main organizations that are tackling this, and maybe, through that, what some of the founders describe as the core commitments. Like you said, it took hold in about two thousand ten, uh, and there's a group of organizations under what is now an umbrella organization called the Center for Effective Altruism, CEA. And, uh, it started off with philosophers Toby Ord and Will MacAskill founding a group called Giving What We Can, uh, self-defined as an international community of people committed to giving more and giving more effectively.
A couple of years later, MacAskill and a man named Benjamin Todd founded something called 80,000 Hours. The idea is that you might devote eighty thousand hours to a career, so when choosing a career, be very thoughtful about the impact that career has, for both good and evil. We'll get way more into all this, uh, and then, um, there's other, you know, sort of, not fringes and weird groups, but just on the outskirts, called The Life You Can Save, and then Animal Charity Evaluators, which, we'll get into how animals figure in. Um, but let's talk a little bit, I guess, about Will MacAskill and what he sees as, what he calls, the core commitments of EA.

So yeah, Will MacAskill, he's out of Oxford, and so is Toby Ord. And I first came across this, Chuck, when I was researching the End of the World podcast, and, like, I deeply admire Toby Ord on, like, a personal level. He actually walks the walk, and his whole family does. Like, they donate a significant portion of their family income to charity and, like, forego all sorts of stuff, and, like, he's literally trying to save the world. So, um, since then I'm, like, really kind of open to the ideas that come out of that guy's mouth. Um, and you mentioned The End of the World with Josh Clark, available wherever you can find your podcasts. Yes, your wonderful, heady, highly produced ten-part series. Thank you very much, that was nice of you. Wherein you tackle the existential risks of the universe? Yes. Okay, I just want to make sure it's the right one. One and the same. And I was not doing that to set you up for a plug. I was doing it, like, kind of as full disclosure that I'm probably a little less than objective on this one. Yeah, but you know, that's a great show and it's still out there. Just because it is, you know, a few years old now, it's very evergreen.
I think it is, at least in these times. Yeah, the world hasn't ended yet, so it's still evergreen. Exactly. Good point.

So, um, but I mentioned that in part as kind of full disclosure, um, that I think Toby Ord is just one of the greatest people walking the earth right now. But also Will MacAskill, who, I don't know, seems to be in lockstep with Toby too, and so he's kind of one of the founders of this movement. And he said that there's, um, four tenets. He wrote a two thousand eighteen paper, and so there's basically four tenets that form the core of effective altruism. One is maximizing the good, which we can all pretty much get on board with, like, you want to make as much good as possible for as many people as possible. The second is aligning your ideas, your contributions, with science, using, like, evidence-based, um, well, evidence to decide where you're going to put your donations, to use that to guide you rather than your heart. It's a big one, and it's a tough one for people to swallow. Another one is welfarism, whereby, by maximizing the good, you're improving the welfare of others; that's the definition of good in that sense of maximizing the good. And then the last one is impartiality. That's as hard for people to swallow, that's harder, I think, for people to swallow than science alignment, um, because what you're saying then, Chuck, is that every single person out there in the world equally deserves, um, your charitable contribution.

Yeah. And that's a big one, because I'm trying to find the number here of how much Americans give abroad. Where is that? Okay, here we go. Out of the, um, what is it, four hundred and seventy billion dollars that Americans donate? Yeah, I think it's four seventy-one billion. Twenty-five point nine billion of that went outside of America, to international affairs.
So it's a lot of money, but it's not a lot of money in the total pot. The idea for EA is to sort of shatter your way of thinking about, you know, trying to help the people in your city, or the people in your state or your country, and to look at every human life as having equal value. Yes, and not even just human life, but every life. Yeah, they include animals too, like you mentioned before, and we'll get into that a little more, um. But the key is that if every single person living on earth is equally important, and you're trying to maximize the help you can do, then from a strict EA perspective, you're wasting your money if you're an American and you're donating that money in America, because just by virtue of the value of a dollar, one dollar can do exponentially more good in other, like, developing, poverty-stricken areas of the world than it can here in the United States.

So that right there sets it up for critics of EA to point out that, well, wait a minute, wait a minute, are you saying that we shouldn't donate locally here at home, that we shouldn't save the animals in the animal shelter, that we shouldn't donate to your local food pantry, that you shouldn't donate to your church? And if you really back effective altruists into a corner, they would say, look, just speaking of maximizing your impact, and everybody around the world being equally important, no, you shouldn't be doing any of those things, and you certainly shouldn't be donating any money to your local museum or symphony or something like that. Yeah, and they say that with their head down, and they're kind of drawing on the floor with their foot. They're saying, like, yeah, that's kind of what we're saying. That's right. Yes, and that's really tough for people to swallow. It's just this huge jagged pill that they're asking people to swallow.
But if you can step back from it, what they're ultimately saying is, look, man, you want to do the most good with your charitable donations? Here's how to do it. Do you want to sit aside, do you want to feel good about it, or really do the good? Exactly. And that's what they're doing. That's the whole basis of effective altruism. They're saying, like, all of your charitable giving is for you. You're doing it for yourself, that's why you give. This takes that out of the equation and says, now you're giving to genuinely help somebody else.

All right, I think that's a great beginning. Maybe let's take a break now that everyone knows what this is and everyone is choking on their coffee because they just donated to their local neighborhood organization. Uh, and we'll come back and talk about some of the other, uh, philosophical founders right after this.

So a couple of people we should mention really quickly, because they're gonna come up. As far as organizations, we did not mention GiveWell yet. Uh, they were founded in two thousand seven; they're a big part of the EA movement. And then Facebook co-founder Dustin Moskovitz and his wife, is it Cary or Carrie Tuna? I'm going with Cari, I think. So it's C-A-R-I. Uh, so they have partnered up, um, to create Open Philanthropy. Um, phil-an-thropy, it sounds weird. I wanted to say philanthropy-loo. Too early in the episode for that. Uh, so they're big donors and big believers in the cause, um. And then another person you mentioned, well, first of all, you mentioned utilitarians, uh, and this philosophical movement, um, was developed by, and we've talked about them before, Jeremy Bentham and John Stuart Mill, but the idea is that people should do what causes the most happiness and relieves the most suffering. And the other guy you mentioned that's, uh, sort of controversial, I guess you could say, is Peter Singer. Uh,
he is an author and a philosopher and a TED talker who kind of, um, became, I don't know about famous, because a lot of people don't know any modern philosophers, but in these circles became famous from an idea, a thought experiment, in nineteen seventy-two, from his essay Famine, Affluence, and Morality, which is: you're going to work. You just bought some really expensive, great new shoes. You see a kid drowning in a shallow pond. Do you wade in there and ruin those new shoes and rescue the kid, and make yourself late for work? And, you know, people, if asked, would say, well, of course you do, you're not gonna let that kid drown. So the flip to that is, well, that's happening every day all over the world, and you're essentially saving your new shoes by letting these kids die. Yeah, you're buying those new shoes rather than donating that money to save a child's life. Morally speaking, it's the exact same thing.

And the essay, I read it last night, it's really good. Do you want to feel like a total piece of garbage for not doing enough in the world? Um, he basically goes on to destroy any argument about, well, that's a kid that you see in a pond, you're actually physically saving that kid. He's like, well, it's so easy to donate to help a child on the other side of the world right now that, for all intents and purposes, it's as easy as going into a pond to save them. These days it is easier. You don't even have to get wet. You're just calling in your credit card, basically, you know. Um, so he just destroys, like, any argument you could possibly have. And he is an extremist utilitarian philosopher, in that he's basically saying that giving money away to the point where you are just above the level of poverty of the people you're giving to, like really cutting into your luxuries to help other people,
not only is that a good thing if you do that, but not doing that is actually a morally bad thing. It's morally wrong to not do that. So he will really turn, like, the hot plate up under you, um, and it just really makes you feel uncomfortable. But he's saying, like, this is my philosophical argument, and it's pretty sound if you hear me out. And if you hear him out, it is pretty sound. Um, the problem is he's a utilitarian philosopher, and a very strict one too. And so, um, you can take that stuff to the nth degree, to some really terrible extremes, um, to where it becomes so anti-sentimental that, um, it actually can be nauseating sometimes. Like, strictly speaking, under a utilitarian view, this one's often trotted out: it is morally good to murder one person to harvest their organs to save the lives of five other people with the murdered person's organs. Technically speaking, through the utilitarian lens, that's maximizing the good in the world. The thing is, like, if that's what you're focusing on, and you're equating effective altruism's desire to get the most bang for your donation buck to murdering somebody to harvest their organs to save five people, you've just completely lost your way. Sure, you can win an argument against utilitarianism in that respect, but the fact that it's leveled and trained on this movement, this charitable philanthropy movement, is totally unfair, even though, yes, it is pretty much part and parcel with utilitarianism.

Yeah, Singer is a guy who, I think one of his philosophies is the journey of life, and that interrupting a life before it has goals, or after it's accomplished its goals, is okay.
Uh, so, you know, if you mention his name, a lot of people will point to this idea that he says things like it's okay to kill a disabled baby right after they're born in some cases, especially if it will lead to the birth of another infant with better prospects of a happy and productive life, or an older person who has already accomplished those goals, the idea being that the disabled baby doesn't have goals yet. Uh, you know, that's obviously some controversial stuff, and he's a hardliner and doubles down on this. But again, to sort of throw that in there, that has nothing to do with effective altruism. No, he wrote that paper, Famine, Affluence and Morality, which basically provides the general contours of the effective altruist movement, but it's not like he's the leading heartbeat of the movement or anything like that. It's not their bible or anything like that. No. And unfortunately he's an easy target that people can, like, point to, because the effective altruist movement has kind of taken some of his ideas, and they're like, oh yeah, you like Singer? Well, what about Singer arguing about this? It's like, that has nothing to do with effective altruism. He makes a really good, easy, easily obtained straw man that people like to pick on.

That's right. Uh, let's talk about numbers a little bit. We mentioned that in the United States, uh, four seventy-one billion dollars was donated in, um. About three hundred and twenty-four billion of that came from individuals, which is amazing. You know, those corporate guys are really pulling their weight. Yeah, no kidding. Uh, individuals. And that boils down to about a thousand dollars per person, uh, in the USA, which is not that much money if you think about it. And out of that, there are a couple of, um, pledges that EA endorses.
One called Giving What We Can, which is promising to give away ten percent of your lifetime income, and then another one called the Founders Pledge, where if you're a startup founder, you promise to give away a percentage of your eventual proceeds. Uh, and then there's also Try Giving, which is a temporary pledge to donate. And, you know, it's only about twelve years old; only about eight to ten thousand people have taken these pledges so far, right? Um, which is still, I mean, that's a decent amount of people, especially considering that most of the people involved in this movement are, um, high-earning, um, extremely educated people, so, like, ten percent of their income is going to add up to quite a bit over the course of their careers. And that's the thing, they're saying, I'm going to give this ten percent a year for my career.

And the reason why they really kind of targeted careers, um, that's part of 80,000 Hours. 80,000 Hours is this idea that we spend about eighty thousand hours working. So if you took that eighty thousand hours and figured out how to direct your energy the most effectively towards saving the world, um, you can really do some good just by virtue of having to have this career to support yourself. And so there's a couple of ways to do it. One is to have a job that you make as much money as you possibly can at, and then you donate as much as you comfortably can, and then maybe even then some, say ten percent. Or, some people donate, there's a NASA engineer named Brian Ottens who was profiled in The Washington Post, who said he specifically got the most stressful, high-earning job he could handle, um, in order to give away, I think, a quarter of his income. Right? And that's great. That's one way to do it.
But another way to do it is to say, okay, actually, I'm going to figure out something that I really love, but I'm going to adjust it so that it's going to have the most impact possible.

Yeah, I think it's interesting. Like, there are two ways to think about it. The first one that you were talking about, they call it earning to give. And, you know, the idea that if you are capable of getting, like, a really high-paying job in, like, the oil industry, with the idea that you're going to give most of that away, on the earning-to-give philosophy side of things, they're saying, yeah, go do that. It doesn't have to be a socially beneficial job. Make the most money you can and give it away. Uh, don't go get the job at the nonprofit, because there are tons of people that will go and get that job at the nonprofit, like, someone will fill that position. Um, 80,000 Hours doesn't, uh, they say that that's not the best way. Theirs is more the second one you mentioned, which is: don't take a job that causes a lot of harm. Being happy is part of being productive, and you don't have to go grind it out at a job you hate just because you make a lot of money so you can give it away. Like, make yourself happy. Don't take a job that causes harm. Do a job where you have, uh, a talent. Um, policymaking is one field. Media, I would argue that we have a job where, you know, we didn't know, but it turns out we have a talent for doing this, and we can leverage our voice, uh, and we occasionally do, to point out things that we think make a difference in the world and to mobilize people. Um, that's not the goal of our show, but we can dabble in that, which is great. Uh, that's not what we intended going into it, but I think we woke up one day and found that we had a lot of ears, so we could throw in episodes,
I think, that lead to good. Yeah, I agreed, which means we can shave a little off of that ten percent we're morally obligated to donate every year, right?

So, um, a good example of that, of, like, figuring out how to direct your career path more toward improving the world: um, on, I guess, the 80,000 Hours site, they profile a woman who wanted to become a doctor, and she did some research and said, um, well, this is cool, but most doctors in Australia treat Australians, who are, you know, relatively very well off and very healthy. And so instead she decided that she wanted to go into a different field of medicine, I think she went into, like, epidemiology, and figured out how to direct her interest in medicine towards getting vaccines out to market faster, to get them through the clinical trial process. And so she's not going to get to be a doctor, but she's gonna get to focus on medicine, and she's going to get to have the satisfaction that she's improving the world demonstrably through her job. And she might not donate a dime with that. I suspect she's probably going to, because she's on the 80,000 Hours website. Um, but even if she didn't, she's still figuring out how to use evidence, um, to make evidence-based decisions, to maximize the eighty thousand hours she's going to spend in her career to make the world a better place.

Right, because one of the ideas of EA, and a lot of, you know, the Charity Navigator and CharityWatch-like good websites that we endorse, uh, that we're not poo-pooing at all, but, um, they tend to focus a lot on, you know, how much goes to overhead, how much goes to the programs, which is good. But EA is like, no, what we want to see are data, literal scientific data, measurables, on how much return you're getting for that dollar.
And some charities do this and are a little more open about it, but they basically say, you know, every charity should say, here's how far your dollar goes and exactly what it does. And the charities, for the most part, say, come on, really nervously, when they're asked that, when they're told that they should be doing that, because they just don't. Part of the reason why is this is very expensive to run. Um, what effective altruists like to use is the gold standard, randomized controlled trials, where, basically, um, you know what UX testing is, user experience testing for, like, a website? So there's A/B testing, where you've got some people who are using your website and they're getting one banner ad, and the B testers are getting a totally different banner ad, and you just see which gets the most clicks. It's basically that, but for a charity, for the work that the charity is carrying out. Some group gets malaria nets, another group doesn't, and then you study which group had the best outcome. And then you could say, oh, well, these malaria nets increased these, um, these, uh, life-adjusted years by, you know, thirty percent, which means that it comes out to, um, you know, point five life-adjusted years, um, per dollar, compared to, you know, point two life-adjusted years for the control group. Ergo, we want to put our money into these groups that distribute malaria nets in Africa, because they are demonstrably saving more lives than groups that don't. Like, they want data like that, and you just don't get that with most charities. The good thing is that they're pushing charities to do that, because if you do care about that kind of thing, then, if you can come up with that kind of evidence, you can get these effective altruist dollars, and there's a lot of dollars coming from that group, even though it is relatively small.
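The per-dollar comparison described here is just arithmetic: take the extra healthy years an intervention produced relative to a control group, multiply by the number of people reached, divide by the program's cost, and fund whichever option yields more per dollar. Below is a minimal sketch of that calculation in Python; the function name and every figure in it are illustrative assumptions, not numbers from GiveWell or any real trial.

```python
# Illustrative sketch of comparing charities by quality-adjusted life years
# (QALYs) gained per dollar. All figures are hypothetical placeholders.

def qalys_per_dollar(extra_healthy_years: float, people_reached: int, program_cost: float) -> float:
    """Return QALYs gained per dollar spent.

    extra_healthy_years: average additional healthy years per person,
    measured against a control group in a randomized controlled trial.
    """
    return (extra_healthy_years * people_reached) / program_cost

# Hypothetical results from an A/B-style trial: one arm distributes malaria
# nets, the other runs a different intervention at the same cost.
nets = qalys_per_dollar(extra_healthy_years=0.5, people_reached=10_000, program_cost=1_000_000)
other = qalys_per_dollar(extra_healthy_years=0.2, people_reached=10_000, program_cost=1_000_000)

print(f"malaria nets : {nets:.4f} QALYs per dollar")   # 0.0050
print(f"other program: {other:.4f} QALYs per dollar")  # 0.0020

# An effective-altruist evaluator would direct donations to the option
# that produces the most QALYs per dollar.
best_name, _ = max([("malaria nets", nets), ("other program", other)], key=lambda pair: pair[1])
print("fund:", best_name)
```

In practice the quality weights, durations, and costs all carry uncertainty, which is part of why these trials are expensive to run, as noted above.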
Yeah, it is interesting, because, you know, in that example, if you were to just say on your website, uh, people with malaria nets fare better, dot dot dot, duh, right? Like, everyone knows that. But they really want to drill down and have that measurable, where they can point to a number and say, you know, this is the actual result. We all know malaria nets help, but maybe, I mean, maybe they think it speaks to people more. Um, it certainly speaks to them, but I guess they think it would speak to the masses, because these things cost money. I mean, that's one of the criticisms of these randomized controlled trials, is that they are sort of expensive, and, like, maybe that money should be used to actually donate instead of doing these trials. But they must think it speaks to people to have actual data like that.

Well, it speaks to them, because the way that you figure out how to maximize your money is to have data to look at to decide, rather than your heart. It makes sense that these are techies, because they're all about that data. Very much so. And there's some problems with that, with relying on that, there's some criticisms, I should say, but they're problems too. One is that there's a lot of stuff that you can't quite quantify in terms like that. Like, if you're saying, no, I want to see how many lives saved your work is producing per dollar, um, well, then, you know, the High Museum is going to be like, um, zero, we're saving zero lives. But that doesn't mean that they're not enriching or improving lives through the donations that they're receiving as an art museum, you know what I mean. Um, Livia, who helps us with this article, gives an example.
She's saying, like, you couldn't really do a randomized controlled trial for the nineteen sixty-three March on Washington that helped solidify the civil rights movement, um, and yet it'd be really hard to argue that that didn't have any major, like, effects on the world. So that's a big argument. And the other thing is that sometimes these randomized controlled trials, like, you can hold one in one part of the world one year and go to another part of the world the next year, and what should be the same is just not the same. And so if you're basing all of your charitable giving on these things, they better be, um, reproducible, or else what are you doing?

Yeah, I mean, you get why this is such a divisive thing and why it's such a hard sell to people, because people give with their hearts, generally. Uh, they give to causes they find personal to them, like I mentioned earlier, a family member with cancer, or a family member with MS, or, just, you know, name anything. Generally people, like, have a personal connection somehow which makes them want to give, and that's sort of, the heart of philanthropy has always been the heart. Uh, and it's a tough sell for EA to say, I'm sorry, you have to cut that out of there. Um, you know, it's a very subjective thing, too, what constitutes a problem, even, um, when it comes to the animal thing. Like, when people give to animal charities, they're generally giving to, you know, dogs and cats and stuff like that, um, these great organizations that do great work here in America. But the concentration, from the EA perspective, is factory-farmed animals, and that one percent of charitable spending in the US goes toward the suffering of farmed animals, and that's what we should be concentrating on because of the massive, massive scale. Again, to try and do the most good, you would look at, like, where the most animals are, and sadly, they're on farms.
Yeah, 577 00:33:29,240 --> 00:33:32,960 Speaker 1: I mean, just from sheer numbers. Um, you can make 578 00:33:33,000 --> 00:33:36,080 Speaker 1: a, you can make a case, utilitarianly speaking, that your 579 00:33:36,120 --> 00:33:39,360 Speaker 1: money would be better off spent improving the lives of 580 00:33:39,480 --> 00:33:42,800 Speaker 1: cows that are going to be slaughtered for beef, that 581 00:33:42,880 --> 00:33:45,240 Speaker 1: will still eventually be slaughtered for beef, but you can 582 00:33:45,280 --> 00:33:52,840 Speaker 1: improve their welfare during their lifetimes, and that technically is maximizing, um, 583 00:33:53,000 --> 00:33:55,560 Speaker 1: the impact of your dollar by reducing suffering, just because 584 00:33:55,560 --> 00:34:01,880 Speaker 1: there's so many more cows awaiting slaughter in the world than humans 585 00:34:01,960 --> 00:34:04,960 Speaker 1: that are dying in Africa. Yeah, that's a tough sell. 586 00:34:05,360 --> 00:34:07,920 Speaker 1: And I think this is where, like, this is where 587 00:34:07,920 --> 00:34:10,560 Speaker 1: it makes sense to just kind of like maintain a 588 00:34:10,600 --> 00:34:13,279 Speaker 1: certain amount of common sense, where it's like, yeah, man, 589 00:34:13,400 --> 00:34:15,920 Speaker 1: like if you really want to maximize your money, go 590 00:34:16,040 --> 00:34:18,399 Speaker 1: look at the EA sites. Go check out Eighty 591 00:34:18,480 --> 00:34:22,200 Speaker 1: Thousand Hours, um, like, like, get into this and actually 592 00:34:22,200 --> 00:34:24,680 Speaker 1: do that. But there's no one who's saying, like, but 593 00:34:24,760 --> 00:34:28,440 Speaker 1: if you give one dollar to that, to that local 594 00:34:28,480 --> 00:34:30,879 Speaker 1: symphony that you love, you're a sucker, you're a chump, 595 00:34:30,880 --> 00:34:33,600 Speaker 1: you're an idiot. Nobody's saying that. And so maybe it 596 00:34:33,680 --> 00:34:36,480 Speaker 1: doesn't have to be all or nothing one way or 597 00:34:36,560 --> 00:34:38,799 Speaker 1: the other, which seems to be the push and the pull, 598 00:34:38,840 --> 00:34:42,120 Speaker 1: and I think that's the issue here. Yeah, we should read 599 00:34:42,239 --> 00:34:46,719 Speaker 1: directly from Will MacAskill, um. He defends, uh, EA, 600 00:34:47,000 --> 00:34:50,680 Speaker 1: and he says this: effective altruism makes no claims about 601 00:34:50,680 --> 00:34:54,120 Speaker 1: what obligations of benevolence one has. Uh, nor does EA 602 00:34:54,280 --> 00:34:57,080 Speaker 1: claim that all ways of helping others are morally 603 00:34:57,080 --> 00:34:59,960 Speaker 1: permissible as long as they help others the most. Indeed, 604 00:35:00,400 --> 00:35:03,880 Speaker 1: there's a strong community norm against promoting or engaging in 605 00:35:03,960 --> 00:35:07,640 Speaker 1: activities that cause harm. So they flat out say, like, 606 00:35:08,000 --> 00:35:11,080 Speaker 1: the whole murder-someone-to-harvest-their-organs thing, like, we're 607 00:35:11,120 --> 00:35:14,040 Speaker 1: not down with that, that's not what we're about. Please 608 00:35:14,040 --> 00:35:17,759 Speaker 1: stop mentioning Peter Singer. Right, yeah, and he says it 609 00:35:17,800 --> 00:35:20,600 Speaker 1: doesn't require that I always sacrifice my own interest for 610 00:35:20,640 --> 00:35:23,200 Speaker 1: the good of others. And that's actually very contradictory of 611 00:35:23,239 --> 00:35:27,440 Speaker 1: Peter Singer's, um,
essay. He says, no, you're morally obligated 612 00:35:27,480 --> 00:35:29,520 Speaker 1: to do that, and if you don't, it's morally bad. 613 00:35:29,840 --> 00:35:32,120 Speaker 1: They're saying, like, no, let's, let's all just be reasonable. 614 00:35:32,239 --> 00:35:35,319 Speaker 1: They're like, yeah, we're philosophers, but you know, we can 615 00:35:35,360 --> 00:35:38,359 Speaker 1: also, like, think like normal human beings too. And that's 616 00:35:38,400 --> 00:35:40,000 Speaker 1: what we're trying to do. We're trying to take this 617 00:35:40,320 --> 00:35:43,960 Speaker 1: kind of philosophical view, um, based in science, based in evidence, 618 00:35:43,960 --> 00:35:47,840 Speaker 1: and try to direct money to get the biggest impact. Um. Yeah, 619 00:35:47,880 --> 00:35:49,920 Speaker 1: like you said, can we stop? Can we stop bringing 620 00:35:49,960 --> 00:35:54,040 Speaker 1: up Peter Singer, please? How about we take another break, 621 00:35:54,239 --> 00:35:57,759 Speaker 1: and, uh, we'll talk a little bit about, geez, what 622 00:35:57,800 --> 00:36:00,640 Speaker 1: else, long-termism and EA's impact right after 623 00:36:00,640 --> 00:36:22,960 Speaker 1: this. So, long-termism is part of the EA 624 00:36:23,040 --> 00:36:25,879 Speaker 1: movement, and this is the idea of, hey, let's 625 00:36:25,920 --> 00:36:28,960 Speaker 1: not just think about helping people now. If we really 626 00:36:29,000 --> 00:36:31,799 Speaker 1: want to maximize impact to help the most people, which 627 00:36:31,840 --> 00:36:34,640 Speaker 1: is at the core of our mission statement, we need 628 00:36:34,719 --> 00:36:37,200 Speaker 1: to think about the future, because there will be a 629 00:36:37,360 --> 00:36:41,600 Speaker 1: lot more people in the future to save. And, uh, 630 00:36:41,880 --> 00:36:46,239 Speaker 1: and so long-termism is really where your dollar is 631 00:36:46,280 --> 00:36:47,719 Speaker 1: going to go the most if you think about, like, 632 00:36:47,800 --> 00:36:51,160 Speaker 1: deep into the future even. Yeah, um, like, if, if 633 00:36:51,360 --> 00:36:54,040 Speaker 1: humanity just kind of hangs around planet Earth for another 634 00:36:54,080 --> 00:36:56,880 Speaker 1: billion or so years, which is entirely possible if we 635 00:36:56,920 --> 00:37:01,960 Speaker 1: can make it through the Great Filter, uh, um, uh, 636 00:37:02,239 --> 00:37:05,200 Speaker 1: there will be like quadrillions of human lives left to come.
637 00:37:06,160 --> 00:37:09,799 Speaker 1: And a lot of philosophers who think about this kind 638 00:37:09,800 --> 00:37:12,600 Speaker 1: of thing kind of make the, make the case, or 639 00:37:12,680 --> 00:37:15,440 Speaker 1: can make the case if they want to, that their 640 00:37:15,480 --> 00:37:19,839 Speaker 1: lives are probably going to be vastly more, um, enjoyable 641 00:37:19,840 --> 00:37:23,000 Speaker 1: than ours, just from the technology available and not having 642 00:37:23,040 --> 00:37:24,759 Speaker 1: to work, and all sorts of great stuff that's going 643 00:37:24,840 --> 00:37:28,239 Speaker 1: to come along. And so technically, just by virtue of 644 00:37:28,280 --> 00:37:30,799 Speaker 1: the fact that there's so many more of them, we 645 00:37:30,840 --> 00:37:35,160 Speaker 1: should technically be sacrificing our own stuff now for the 646 00:37:35,239 --> 00:37:38,680 Speaker 1: benefit of these generations and generations and generations of humans 647 00:37:38,680 --> 00:37:41,799 Speaker 1: to come that vastly outnumber the total number of humans 648 00:37:41,840 --> 00:37:44,280 Speaker 1: who have ever lived. Like, a hundred and eight billion humans 649 00:37:44,320 --> 00:37:47,560 Speaker 1: have ever lived. We're talking quadrillions of humans left to come. 650 00:37:48,080 --> 00:37:51,680 Speaker 1: That very much dovetails with the, um, the kind of 651 00:37:51,719 --> 00:37:54,560 Speaker 1: discomfort you can elicit from somebody who says that your 652 00:37:54,560 --> 00:37:58,040 Speaker 1: money is better spent relieving the suffering of cattle awaiting 653 00:37:58,040 --> 00:38:01,000 Speaker 1: slaughter than it is saving children's lives in Africa, you 654 00:38:01,000 --> 00:38:02,959 Speaker 1: know what I'm saying. Yeah. And they're not just talking 655 00:38:03,000 --> 00:38:06,359 Speaker 1: about climate change and, like, obviously that kind of existential risk. 656 00:38:06,400 --> 00:38:09,600 Speaker 1: They dabble in AI and stuff like that. And I 657 00:38:09,680 --> 00:38:12,399 Speaker 1: know that we don't need to go down that rabbit hole. 658 00:38:13,160 --> 00:38:14,360 Speaker 1: You should listen to the End of the World with 659 00:38:14,440 --> 00:38:17,799 Speaker 1: Josh Clark, the AI episode. I mean, that's all 660 00:38:17,800 --> 00:38:21,319 Speaker 1: about that, but it does have to do with that 661 00:38:21,360 --> 00:38:23,400 Speaker 1: kind of stuff. It's not just like we need to 662 00:38:23,400 --> 00:38:27,200 Speaker 1: save the planet so it's around in a billion years. Uh, 663 00:38:27,280 --> 00:38:30,600 Speaker 1: you know, they tackle like all kinds of existential risk basically. Yeah, 664 00:38:30,640 --> 00:38:32,680 Speaker 1: and they dedicate, like, a lot of these guys are 665 00:38:32,680 --> 00:38:35,719 Speaker 1: dedicating their careers to figuring out how to avoid existential 666 00:38:35,760 --> 00:38:38,720 Speaker 1: risk, because they've decided that that is the greatest threat 667 00:38:38,760 --> 00:38:42,759 Speaker 1: to the future, that would cut out any possibility of 668 00:38:42,760 --> 00:38:46,120 Speaker 1: those quadrillions of lives. So that's, that's literally 669 00:38:46,160 --> 00:38:49,359 Speaker 1: why they have dedicated themselves to thinking about and alleviating 670 00:38:49,400 --> 00:38:52,120 Speaker 1: these risks, because they're trying to save the future of 671 00:38:52,120 --> 00:38:55,640 Speaker 1: the human race.
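The long-termist numbers the hosts toss around are easier to see as a quick back-of-the-envelope calculation. Here is a minimal Python sketch, assuming a steady population of around ten billion people with eighty-year lifespans over another billion years; the only figure taken from the episode is the roughly one hundred and eight billion humans who have ever lived, and the other inputs are illustrative assumptions, not claims from the show.

```python
# Rough long-termist arithmetic. Only the ~108 billion "humans ever born" figure
# comes from the episode; the rest are illustrative assumptions.
humans_ever_born = 108e9      # rough historical estimate cited above
years_remaining = 1e9         # "hangs around planet Earth for another billion or so years"
steady_population = 10e9      # assumed average population over that span
lifespan_years = 80           # assumed average lifespan

future_lives = (years_remaining / lifespan_years) * steady_population
print(f"Future lives: {future_lives:.2e}")   # ~1.25e+17, i.e. over a hundred quadrillion
print(f"Versus everyone ever born: {future_lives / humans_ever_born:,.0f} to 1")
```

Even with much more conservative inputs, the future population dwarfs everyone who has ever lived, which is the whole force of the long-termist argument described above.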
Because they've decided that that is the 672 00:38:55,680 --> 00:38:59,120 Speaker 1: best way to maximize their careers for the most good, 673 00:38:59,560 --> 00:39:02,760 Speaker 1: which is just astounding if you stop and think about 674 00:39:02,800 --> 00:39:07,640 Speaker 1: what they're actually doing in real life. Uh, we mentioned 675 00:39:07,680 --> 00:39:10,760 Speaker 1: the kind of money that's involved, even though it's, um, 676 00:39:10,800 --> 00:39:13,000 Speaker 1: not a huge movement so far. I think we said 677 00:39:13,040 --> 00:39:16,520 Speaker 1: like somewhere around eight thousand people have made these pledges, I 678 00:39:16,560 --> 00:39:20,160 Speaker 1: think, overall. Uh. The co-founder of Eighty Thousand Hours, 679 00:39:20,160 --> 00:39:24,640 Speaker 1: Benjamin Todd, says about forty-six billion dollars is committed 680 00:39:24,640 --> 00:39:28,600 Speaker 1: to EA going forward, um. Like you said, a lot. 681 00:39:28,719 --> 00:39:31,080 Speaker 1: You know, it's because there are a lot of 682 00:39:31,160 --> 00:39:33,839 Speaker 1: rich people and tech people that are backing this thing. 683 00:39:33,920 --> 00:39:36,160 Speaker 1: So a lot of that money comes from people like 684 00:39:36,239 --> 00:39:40,880 Speaker 1: Dustin Moskovitz and, uh, Cari Tuna and Sam Bankman-Fried, 685 00:39:41,120 --> 00:39:44,680 Speaker 1: he's a cryptocurrency guy, so a lot of that 686 00:39:44,719 --> 00:39:46,680 Speaker 1: money comes from them. But they're, they're trying to just 687 00:39:47,239 --> 00:39:50,880 Speaker 1: raise awareness to get more and more regular people on 688 00:39:50,960 --> 00:39:53,560 Speaker 1: board, that, you know, if they have, you know, two 689 00:39:53,560 --> 00:39:56,799 Speaker 1: thousand dollars or three thousand dollars to give a year, 690 00:39:57,280 --> 00:39:59,920 Speaker 1: they're saying, I think they estimate that three thousand 691 00:40:00,080 --> 00:40:03,040 Speaker 1: to forty-five hundred bucks is like the amount of money it takes 692 00:40:03,080 --> 00:40:05,760 Speaker 1: to save a human life and to give them additional 693 00:40:05,840 --> 00:40:08,560 Speaker 1: quality years. Yeah, so if you cough up that much 694 00:40:08,600 --> 00:40:10,759 Speaker 1: and you direct it toward one of the charities that they've 695 00:40:10,760 --> 00:40:14,919 Speaker 1: identified as the most effective, um, through their sites, 696 00:40:14,960 --> 00:40:17,200 Speaker 1: like GiveWell is a place to go look for 697 00:40:17,200 --> 00:40:20,480 Speaker 1: charities like that that have been vetted by effective altruists, 698 00:40:20,520 --> 00:40:22,879 Speaker 1: you're literally saving the life of a child every year. 699 00:40:23,280 --> 00:40:25,799 Speaker 1: It's like you're saving a child from drowning in a 700 00:40:25,840 --> 00:40:28,400 Speaker 1: pond every single year, and all you're doing is 701 00:40:28,480 --> 00:40:31,840 Speaker 1: ruining your new shoes or, you know... it's interesting, new shoes, 702 00:40:31,880 --> 00:40:35,000 Speaker 1: but yeah, you're, you're ruining your really nice vacation that year, 703 00:40:35,360 --> 00:40:37,040 Speaker 1: right, because you, you know, you sent it to this one thing.
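The "you're saving a child every year" claim is just division, and the same arithmetic scales up to the committed funds mentioned above. A minimal sketch, assuming a cost per life saved at the top of the rough range the hosts cite; the dollar figures are illustrative, not precise claims.

```python
# Illustrative division only; cost-per-life estimates vary by charity and by year.
cost_per_life = 4_500        # assumed: roughly the top of the range cited above, in dollars
annual_donation = 4_500      # a donor who gives that much each year

print(f"Lives saved per year of giving: {annual_donation / cost_per_life:.1f}")  # 1.0

# Scaling the same arithmetic to the ~$46 billion Benjamin Todd says is committed to EA.
# Treat this as an upper bound, since much of that money targets long-termist causes
# rather than direct life-saving interventions.
committed_funds = 46e9
print(f"Lives that sum could save at this rate: {committed_funds / cost_per_life:,.0f}")
```

At these assumed rates, a single donor saves roughly one life per year of giving, and the committed pool works out to something on the order of ten million lives, which is the scale the movement is pointing at.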
704 00:40:37,040 --> 00:40:38,960 Speaker 1: I don't know where it came from, but the 705 00:40:39,000 --> 00:40:41,240 Speaker 1: idea of someone running into a burning building and pulling 706 00:40:41,239 --> 00:40:42,800 Speaker 1: a child out or a kid out of a pond, 707 00:40:43,280 --> 00:40:45,920 Speaker 1: they're, they're written up in the newspaper as a hero. But 708 00:40:46,200 --> 00:40:48,960 Speaker 1: you can, you can do that. You can save a 709 00:40:49,040 --> 00:40:52,480 Speaker 1: kid a year or more every year for the rest 710 00:40:52,480 --> 00:40:56,080 Speaker 1: of your life. Um, it's a little less dramatic. You're 711 00:40:56,080 --> 00:40:58,160 Speaker 1: not gonna have a newspaper, you're not gonna be above 712 00:40:58,160 --> 00:41:02,560 Speaker 1: the fold. It's, you know what I'm saying. But, uh, 713 00:41:02,800 --> 00:41:05,560 Speaker 1: that's, I mean, that's what EA is like all 714 00:41:05,600 --> 00:41:11,440 Speaker 1: about. It is the antithesis of that. Antithesis? Yeah, I 715 00:41:11,520 --> 00:41:14,040 Speaker 1: like it, you know what I mean, it's, it's the no-frills 716 00:41:14,320 --> 00:41:18,279 Speaker 1: version of antithesis. The thing is, too, I mean, 717 00:41:18,320 --> 00:41:23,920 Speaker 1: it's still very relative. Like, that sum is relatively a very 718 00:41:24,000 --> 00:41:27,920 Speaker 1: large amount, or a so-so sized amount, or not much of an amount, 719 00:41:27,920 --> 00:41:30,800 Speaker 1: depending on how much you, you make. And again, nobody 720 00:41:30,840 --> 00:41:33,200 Speaker 1: in the effective altruist movement is saying that you should 721 00:41:33,680 --> 00:41:37,640 Speaker 1: personally sacrifice unless you really want to, unless you're driven to. 722 00:41:38,080 --> 00:41:41,439 Speaker 1: But you're not morally required to personally sacrifice, to cough 723 00:41:41,520 --> 00:41:44,359 Speaker 1: up that amount, when it means you're, you're not going 724 00:41:44,400 --> 00:41:46,160 Speaker 1: to be able to eat for a week, or you're 725 00:41:46,160 --> 00:41:48,279 Speaker 1: not gonna be able to have a place to live. Like, 726 00:41:48,320 --> 00:41:51,239 Speaker 1: nobody's saying that, and nobody's being flip about the 727 00:41:51,320 --> 00:41:56,560 Speaker 1: idea that it isn't that much. What they're saying is it can 728 00:41:56,640 --> 00:41:59,200 Speaker 1: literally save a child's life. And if you stop and 729 00:41:59,239 --> 00:42:01,560 Speaker 1: look at your life and think that you could come 730 00:42:01,640 --> 00:42:03,920 Speaker 1: up with that, you could donate it to a certain 731 00:42:03,920 --> 00:42:06,840 Speaker 1: place that will go save a child's life in real life. 732 00:42:07,320 --> 00:42:11,200 Speaker 1: That's, that's what they're saying. Yeah, this, uh, this, this 733 00:42:11,239 --> 00:42:15,799 Speaker 1: would be a hard sell to Emily. I'm thinking 734 00:42:15,840 --> 00:42:20,120 Speaker 1: about our, our charity conversation we have every year, and 735 00:42:20,280 --> 00:42:23,360 Speaker 1: I'm trying to imagine myself saying, what if we don't 736 00:42:23,440 --> 00:42:25,839 Speaker 1: give to the local animal shelter and Neighbor in Need 737 00:42:25,920 --> 00:42:29,680 Speaker 1: like we usually do, and instead we do this. She 738 00:42:29,719 --> 00:42:32,120 Speaker 1: would just be like, uh, I see what you're saying, 739 00:42:32,160 --> 00:42:34,160 Speaker 1: but, but no, get, get out of my face with that.
740 00:42:34,760 --> 00:42:36,479 Speaker 1: But I mean you could be like, well, how about 741 00:42:36,520 --> 00:42:38,880 Speaker 1: we do both? You know? Exactly. So I think, I 742 00:42:38,920 --> 00:42:40,880 Speaker 1: think that's the thing. That's my take on it. Like, 743 00:42:41,800 --> 00:42:44,920 Speaker 1: we support CoEd, and like, that's, I have no 744 00:42:45,040 --> 00:42:47,960 Speaker 1: qualms about supporting CoEd even after doing all this research 745 00:42:48,000 --> 00:42:51,919 Speaker 1: and understanding effective altruism even more, no qualms whatsoever. I'm 746 00:42:51,960 --> 00:42:54,759 Speaker 1: sure that that money could be directed better to help 747 00:42:54,800 --> 00:42:57,680 Speaker 1: other people in other parts of the world. I still 748 00:42:57,840 --> 00:43:00,920 Speaker 1: think it's money well spent and it's helping people, and 749 00:43:00,960 --> 00:43:03,120 Speaker 1: I'm, I'm very happy with that. I think that's great. 750 00:43:03,200 --> 00:43:05,000 Speaker 1: And then I don't have any guilt or shame about 751 00:43:05,040 --> 00:43:08,160 Speaker 1: that at all. And because what you're saying at that point, 752 00:43:08,200 --> 00:43:11,560 Speaker 1: like with CoEd, it is an organization dedicated to 753 00:43:12,239 --> 00:43:15,600 Speaker 1: helping children in a, in a not very well off 754 00:43:15,640 --> 00:43:20,160 Speaker 1: country live better and longer lives, so like it essentially 755 00:43:20,239 --> 00:43:23,160 Speaker 1: is effective altruism in a way, except effective altruism is 756 00:43:23,200 --> 00:43:26,879 Speaker 1: like, no, no, no, no, no, no. The data says that 757 00:43:27,080 --> 00:43:31,879 Speaker 1: this one is, look at the numbers, this many points 758 00:43:31,880 --> 00:43:35,840 Speaker 1: better, and goes further. Like, they really, it's a, 759 00:43:36,000 --> 00:43:38,600 Speaker 1: it's a numbers and data game that makes it 760 00:43:38,640 --> 00:43:41,080 Speaker 1: tough for a lot of people to swallow, I think. Yeah, 761 00:43:41,280 --> 00:43:47,560 Speaker 1: it's anti-sentimentalism, basically, in the service of saving the 762 00:43:47,600 --> 00:43:51,080 Speaker 1: most lives possible. I know, it's, it's, it's interesting, and 763 00:43:51,080 --> 00:43:52,920 Speaker 1: it doesn't surprise me that it has its roots in 764 00:43:53,480 --> 00:43:57,319 Speaker 1: philosophy, because it is really a philosophical sort of, 765 00:43:57,640 --> 00:44:00,760 Speaker 1: uh, head scratcher at the end of the day. Yeah, 766 00:44:00,800 --> 00:44:04,239 Speaker 1: for sure, it's pretty interesting stuff and it really is. 767 00:44:04,400 --> 00:44:07,600 Speaker 1: I think it's, I think it's fascinating. Yeah. So there's, 768 00:44:07,680 --> 00:44:10,720 Speaker 1: I mean, there's a lot more to read, both criticisms 769 00:44:10,760 --> 00:44:18,240 Speaker 1: and, um, you know, pro-EA stuff. And seriously, 770 00:44:18,320 --> 00:44:21,240 Speaker 1: you could do worse than reading, um, Peter Singer's 771 00:44:22,120 --> 00:44:30,400 Speaker 1: essay, what is it called, Famine, Affluence... Famine, Affluence, and Morality. 772 00:44:30,800 --> 00:44:33,600 Speaker 1: It's like sixteen pages. It's a really quick read.
Um, 773 00:44:33,640 --> 00:44:35,799 Speaker 1: it's really good. So read that too, and just see 774 00:44:35,840 --> 00:44:38,000 Speaker 1: what you think, see what you think about yourself too, 775 00:44:38,160 --> 00:44:41,680 Speaker 1: and maybe take some time and examine, you know, um, 776 00:44:41,719 --> 00:44:44,799 Speaker 1: if you could give to some of these charities. Or 777 00:44:44,840 --> 00:44:47,239 Speaker 1: if you're not giving to charity at all, seriously, do 778 00:44:47,320 --> 00:44:51,919 Speaker 1: spend some time and see where you could make that change. Uh, 779 00:44:52,000 --> 00:44:54,560 Speaker 1: and since I said make that change and Chuck said yeah, 780 00:44:54,760 --> 00:44:56,600 Speaker 1: that means, of course, it's time for a listener mail. 781 00:45:00,200 --> 00:45:02,200 Speaker 1: This is follow-up to albinism. I knew we would 782 00:45:02,239 --> 00:45:05,319 Speaker 1: have someone who has albinism write in. I'm glad 783 00:45:05,360 --> 00:45:07,400 Speaker 1: we did. We know we have listeners out there. And 784 00:45:07,480 --> 00:45:10,920 Speaker 1: this is from Brett. Hey, guys, a longtime listener. I 785 00:45:10,960 --> 00:45:13,360 Speaker 1: have albinism, so I thought I'd throw in my perspective. 786 00:45:14,040 --> 00:45:15,960 Speaker 1: First off, I know you were struggling to decide how 787 00:45:16,000 --> 00:45:20,160 Speaker 1: to describe it, albino or albinism. My preference is using 788 00:45:20,160 --> 00:45:22,640 Speaker 1: the term albinism like you guys did, as to me, 789 00:45:22,680 --> 00:45:25,399 Speaker 1: it denotes a condition, while saying if someone or something 790 00:45:25,480 --> 00:45:28,239 Speaker 1: is albino, it feels like you're relegating them to a 791 00:45:28,239 --> 00:45:31,759 Speaker 1: different species. Being called albino always used to bug me 792 00:45:31,800 --> 00:45:35,160 Speaker 1: growing up, and that was usually because the kids 793 00:45:35,200 --> 00:45:36,960 Speaker 1: were trying to get a rise out of me. Fortunately, 794 00:45:37,520 --> 00:45:40,240 Speaker 1: I was a big kid, so it never really escalated 795 00:45:40,280 --> 00:45:45,359 Speaker 1: to physical bullying. I like this idea. Uh, like, 796 00:45:45,600 --> 00:45:49,479 Speaker 1: the kid with albinism who's like huge and someone says something, 797 00:45:49,480 --> 00:45:53,600 Speaker 1: they're like, excuse me, what did you just say? I 798 00:45:53,600 --> 00:45:57,240 Speaker 1: didn't say anything. Being a child of the seventies and eighties, 799 00:45:57,280 --> 00:45:59,360 Speaker 1: like you guys, it was pretty rough at times. 800 00:46:00,560 --> 00:46:03,799 Speaker 1: On the physical side, my eyes are very light sensitive. Uh, 801 00:46:03,840 --> 00:46:07,680 Speaker 1: they're blue. Anyway, while growing up, some of the 802 00:46:07,760 --> 00:46:10,280 Speaker 1: kids would keep asking me why my eyes were closed 803 00:46:10,719 --> 00:46:13,440 Speaker 1: when it was bright. Uh. And of course the low vision 804 00:46:13,440 --> 00:46:16,920 Speaker 1: comes into play as well. I'm considered legally blind, and 805 00:46:16,920 --> 00:46:19,960 Speaker 1: pretty much every other person with albinism I have met 806 00:46:20,000 --> 00:46:23,200 Speaker 1: has the same issue.
There were ways to adjust in school, 807 00:46:23,800 --> 00:46:28,319 Speaker 1: and ways they could assist me, with large print books, magnifiers, binoculars, 808 00:46:29,120 --> 00:46:31,440 Speaker 1: or the teachers simply letting me look at their slides 809 00:46:31,480 --> 00:46:35,839 Speaker 1: afterward and have more time with them. Uh. Yeah, that's great. 810 00:46:36,200 --> 00:46:38,239 Speaker 1: As for how people with albinism are portrayed in TV 811 00:46:38,360 --> 00:46:41,040 Speaker 1: and movies, I don't think being portrayed as a hitman 812 00:46:41,160 --> 00:46:43,959 Speaker 1: or even someone with magical powers bugs me as much 813 00:46:44,320 --> 00:46:46,080 Speaker 1: as the fact that I know that it was fake, 814 00:46:46,400 --> 00:46:48,400 Speaker 1: because it would be really hard to be a hitman 815 00:46:49,040 --> 00:46:52,799 Speaker 1: with the kind of eyesight that we have. I love 816 00:46:52,880 --> 00:46:55,640 Speaker 1: that, so practical. Uh, and Brett had a lot of 817 00:46:55,640 --> 00:46:58,000 Speaker 1: other great things to say, but that is from Brett, 818 00:46:58,600 --> 00:47:01,200 Speaker 1: a longtime listener. Thanks a lot, Brett, that 819 00:47:01,320 --> 00:47:04,959 Speaker 1: was great. Glad you wrote in. And, uh, yeah, thanks 820 00:47:05,000 --> 00:47:06,799 Speaker 1: a lot. If you want to be like Brett and 821 00:47:06,840 --> 00:47:08,719 Speaker 1: get in touch with us and say, hey, you guys 822 00:47:08,760 --> 00:47:11,560 Speaker 1: are pretty good, or hey, you guys could have done 823 00:47:11,560 --> 00:47:14,000 Speaker 1: a lot better, or hey, I'm mad at you guys, 824 00:47:14,440 --> 00:47:16,319 Speaker 1: or whatever you want to say, we would love to 825 00:47:16,360 --> 00:47:18,600 Speaker 1: hear from you. We can take it all, and you 826 00:47:18,600 --> 00:47:22,000 Speaker 1: can address it all to stuff podcast at iHeartRadio 827 00:47:22,080 --> 00:47:27,239 Speaker 1: dot com. Stuff You Should Know is a production of 828 00:47:27,239 --> 00:47:30,560 Speaker 1: iHeartRadio. For more podcasts from iHeartRadio, visit 829 00:47:30,600 --> 00:47:33,400 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 830 00:47:33,480 --> 00:47:34,760 Speaker 1: listen to your favorite shows.