Speaker 1: Uh, it could happen. Here is the podcast that you're listening to. I'm Robert Evans, the person that you're listening to, and one of the people who does this podcast. Boy, what a glorious introduction that was. Let me also introduce some human beings who you might know. First, we have Chris, and we have James. They are our correspondents in the field joining us today. Also joining us is James's Spanish Civil War-era Mosin-Nagant.

Speaker 1: Yep, that's right. Yeah, I'm very happy it's joining us. It's going to make contributions throughout the episode. It's an antique bolt-action rifle that served in three world wars, plus the current one. That's right. Yeah, and it's about to kick off this one now, which it might be. It might be two in the L column for the Mosin, the gun. Yeah, it's served a mixed bag.

Speaker 1: Anyway, we're recording this the day of the elections, so everybody's having a horrible one. I'm having a firearm. Yeah, I'm still hoping my TEC-9 comes in before Oregon votes on its next ballot measure. Anyway, today I wanted to talk a little bit about something that I've been thinking about kind of constantly, which is called effective altruism. The short end of this is that it is a style of thinking about charitable giving that Elon Musk in particular has recently highlighted as how he thinks about things. It's very popular with the billionaire set, who are deeply invested in getting people to think that they're saving the world, right, the folks who want to be seen as looking ahead and protecting the future of mankind and saving the world, but not doing it through things like paying, you know, more taxes and supporting, you know, less money being in politics and all that kind of jazz. Not anything that would actually harm their personal ability to exercise power.
Speaker 1: So it's gotten kind of attacked recently because it's associated with guys like Musk, and because he is markedly less popular now than he was, let's say, ten years ago. But yeah, I wanted to talk about it, because effective altruism is an actual movement. There are organizations that espouse this. There's hundreds of millions of dollars in charitable giving that gets handed out under the auspices of effective altruism, and as a heads up, most of it is fine. Most of it is charities to, like, get lead out of water and stuff. It's not like effective altruism is comprehensively some sort of scam by the wealthy. It's more of an honest theory about how charitable giving ought to work that has been adopted by the hyper-wealthy as justification for fucked-up shit and married to something called longtermism, which we will be talking about in a little bit.

Speaker 1: But I want to talk about where the concept of effective altruism comes from. If you read articles about this thing, most people who study it will say that it got started as a modern movement in nineteen seventy-one with an Australian philosopher named Peter Singer. Singer wrote an article titled Famine, Affluence and Morality. I think it was actually published in nineteen seventy-two, I don't know, one of the two, seventy-one or seventy-two. And the essay basically argued that there's no difference morally between your obligation to help a person dying on the street in front of your house and your obligation to help people far away. Like, a dude gets hit by a car in front of your house, you are not more morally obligated to help him than you are morally obligated to help people who are dying in Syria, you know. And obviously there's a version of truth to that, which is that we're all responsible for each other and internationalism is the only actual path away from the nightmare.
Speaker 1: And when we do things like ignore authoritarians massacring their people, it inevitably comes back to affect us and fuels the growth of an authoritarian nightmare domestically. That is very true. But also there's a fundamental silliness in it, because one reason why there is a moral difference between helping a person dying in the street in front of you and somebody who's in danger in, I don't know, southern China is that you can immediately help the person in front of your house, right? Like, if somebody gets hit, you have the ability to immediately render life-saving aid. It's actually quite difficult to help somebody who is, for example, getting shot at by the government in Tibet. Right? Not that you don't have a moral responsibility to that person, but your moral responsibility to actually immediately take action when somebody is bleeding out is higher than your responsibility to try to figure out how to help people in distant parts of the globe.

Speaker 1: This is more nuanced than I think a lot of, especially, like, rich assholes like to... I shouldn't say rich assholes. The problem with this is that this is the kind of revelation, when you start talking this way, that feeds really well into a fucking TED talk. It's a perfect fit for that, whereas the reality is a lot more nuanced. And number one, it's also like, well, the kind of help that you would render to somebody who's been hit by a car in front of your house is very different and requires really different resources than the kind of help you would give people in, say, again, like Syria, who are being murdered by their government.
Speaker 1: Right, if somebody gets hit by a car in front of your house, you run out with a fucking tourniquet and a kit and you call nine one one, right? Those are the resources that you can immediately use. If Bashar al-Assad is firing poison gas at protesters in, you know, Aleppo, well, your stop-the-bleed kit is not going to help with that one way or the other. Right, a very different set of resources is necessary. So it's foolish to compare them. Anyway, Singer did, and his essay was a big hit. It's often called like a sleeper hit for young people who were kind of getting into the, you know, charity industrial complex, or at least were considering it.

Speaker 1: Now, I found an interview with one named Julia Wise, who currently works at the Center for Effective Altruism. She started out as a social worker, to give you an idea of the kind of people who got into this. When she read the article she was a social worker, and she kind of fell in love with the concept. And when it started becoming a thing, in, like, the two thousands, it was, as she described it, quote, a bunch of philosophers and their friends, and nobody had a bunch of money. So when Singer put it out it was also more kind of a way of people debating how to think about charity, which is fine. People should always be exploring stuff like that. So I don't want to be going after Singer too... well, I do a little bit, because after his movement has a couple of decades to grow, Singer winds up doing a TED talk, and the TED talk winds up kind of electrifying a very specific chunk of the American tech set. And you can see, in some of the writing on this, the way in which his talking about sort of the morality of charity has gotten flattened over the years.
Speaker 1: Quote: Which is the better thing to do, to provide a guide dog to one blind American, or cure two thousand people of blindness in developing countries? Which is like, I don't know, both? There are resources to do both. Again, if you, for example, in the United States, were to tax the billionaire class and corporations a lot more, you could provide that blind person in the United States with free healthcare in a way that many countries do. And we could also continue or even expand charitable giving, maybe if we were to do stuff like spend less money on our military. Again, it's a false choice. But of course, the reason this choice is there is because they're thinking about helping people purely in the form of, like, noblesse oblige charity, right? They're thinking purely about things that get improved when rich people put money into them. Yeah, so obviously we should help, you know, one of these groups before the other because it's more effective, and yadda yadda yadda.

Speaker 1: Yeah, well, and I think that was one of the things. There's a second way you can look at the original sort of problem, of "we have this same ethical responsibility to someone who gets hit by a car and to somebody on the other side of the world." The other way you can look at that is: I don't care about what's happening to someone on the other side of the world, so I don't have to care about this person who gets hit by a car. And that seems like what people are doing. It's like, well, I don't really have to care about this person here, because there's someone over there. Yeah, like, I can see how this lines up with some of these bigger, like, meta-ethical kind of perspectives on what equality is and what your ethical obligations are.
Speaker 1: But then, yeah, it seems to just kind of be a very clear slippery slope to making kind of Malthusian excuses for doing fuck all. Right, that's where the story is heading. Oh, good.

Speaker 1: So, early two thousands, he does a TED talk, you know, the momentum around this idea starts to build, and it really gets a shot in the arm in two thousand thirteen with the work of an author named Eric Friedman. Friedman's book, which was new at the time, was called Reinventing Philanthropy: A Framework for More Effective Giving, and he kind of extends the arguments that Singer is making. One of the things that he does is contrast what St. Jude Children's Research Hospital is doing, like researching children's illnesses, the illnesses that kids suffer, and treatments for them, with the Malanje Provincial Hospital in Angola. He contrasts the patients who are being served at the different hospitals for life-threatening conditions and concludes, quote, I'd probably also be very angry at the donors who are continually funding St. Jude and leaving Malanje Provincial woefully under-resourced. Why are the patients of St. Jude so much more worthy of life? And like, yeah, what a ridiculous way to think about it.
Speaker 1: Children. It's fucking asinine. And the fact is that many of the people who are doing these fucking TED talks and contributing to this, like, global tech class are the same people who are making fucking millions of dollars off the pharmaceutical industry, which continues to neglect the diseases that people in the colonial periphery suffer from, because there's no profit in selling them drugs, and instead you're selling baldness cures to people in America, right? Like, yes, we can... I mean, if every single person who's gotten a TED talk had all of their wealth appropriated tomorrow, we could fund both of these hospitals. Exactly, yes. Yeah, the world would be better. It's fundamentally a kind of obscenity to look at pharmaceutical company CEOs making hundreds of millions and billions of dollars selling people often literal poison and jacking up the price of things like insulin, to look at these tech CEOs accumulating tens of billions of dollars, and to say, donations to this children's hospital are robbing an Angolan hospital, yes, so I won't be paying my tax. Yeah, why don't you go fuck yourself?

Speaker 1: And anyway, you can see who this appeals to, right? Like, the kind of people who love the Freakonomics books, which are bullshit, regressive statistics. Can I tell an economics story, please? Yeah, okay. So one of my professors at Chicago was a political science guy, or I guess public policy, and there's a thing the Freakonomics guy wrote where he was trying to prove that money doesn't actually influence politics.
Speaker 1: Yeah, and you know what, my professor wrote a paper about that. And again, this is a perfect example of how dumb this guy is, and this is how these guys think, right? When they go into a field, they go in thinking they already know everything and they can prove whatever they want. But the thing this guy doesn't understand, and this is the thing most people in the US do not understand about how Congress works, is that all of the shit that's happening on the floor of Congress, all of those votes, that is not real Congress, right? That is fake Congress. Nothing important actually happens there. All of the important stuff in Congress happens in committees. And so you can't figure out whether money is doing anything by measuring its effects on, like, votes on the floor, because floor votes are bullshit. By the time a floor vote happens, all the important policy stuff has already happened.

Speaker 1: And so he did this whole thing where, you know, he had this great metric called, like, oh God, it was called like the dairy cow coefficient, which was like measuring how someone should vote versus how the dairy vote was actually running. It turns out, you know, if you look at what these people do in committee, hey, look, it turns out lobby money is unbelievably effective. But this fucking guy... and this is something, this sort of distinction between Congress on the floor and Congress in committee, like, there's a president whose name I'm forgetting who has this famous line that Congress in committee is Congress at work, Congress on the floor is Congress at play, or something like that.
Speaker 1: Like, this is just basic shit that, if you know literally anything about how a field works, you cannot do. If you want a good breakdown of why the Freakonomics guy is full of shit, Michael Hobbes and Peter Shamshiri, I think is his last name, have a new podcast called If Books Could Kill, and they break down, with citations and everything, why everything in that book is horseshit. But the only thing I'll disagree with you on, Chris, is I don't think he's an idiot. I think he's very intelligent, and I think the thing that he's smart to do is he recognizes that there's a specific type of person, and engineers and programmers are very likely to be this type of person, who are kind of fundamentally oppositionally defiant. If people say, well, this is good or this is bad, they want to take the opposite stance. And if you can provide them a way to feel like they're enlightened and smart and actually looking at the data by doing it, then they'll take the opposite stance on stuff like "it's bad to let people buy elections" or "it's good to fund children's hospitals," just because somebody has made them feel smart for being an asshole. That's what the Freakonomics guy does. Malcolm Gladwell does a subtler version of it, as a general rule. And that's what fucking Friedman is doing in this book in two thousand thirteen.

Speaker 1: I found a good review of it in the Stanford Social Innovation Review that is pretty scathing, like surprisingly scathing considering it's written by a bunch of Stanford nerds: This approach amounts to little more than charitable imperialism, whereby my cause is just and yours, to one degree or another, is a waste of precious resources.
This approach is not informed giving. And I think that does a pretty good job of summarizing what I think is fucked up about it. There's another thing that's really messed up, which is one of the conclusions that they come to here. There's an organization called GiveWell that kind of gets formed as a result of the book Friedman writes, and they recommended not donating money to disaster assistance in the wake of the Japanese tsunami, and opposed disaster relief donations in general, because, quote, and this is from Friedman, most of those killed by disasters could not have been saved by donations. Which is, number one, like, donations are generally about rebuilding communities. It's not usually about saving lives. It's about, well, all of the infrastructure was destroyed and it must be rebuilt. But okay, guy.

Speaker 1: Well, it's annoying too, because it's not like there aren't good critiques of, like, specifically organizations like the Red Cross. Oh, it's all fucked up. Yes, yeah, and this critique is like the worst possible one of the critiques. Yeah, every single large charitable organization is fucked up. And if you go and talk to people on the ground, they will bitch. Like, if you go to fucking war zones, people bitch more about NGOs than the folks shooting at them, half the time. Yeah, they bitch about it being inefficient, about the stuff they're given being bad quality, or, like, nonsense just being handed out for the sake of handing it out, which is a thing that happens sometimes, and they bitch about well-paid aid workers staying in hotels and showing up for a couple of hours to do a photo op.
Speaker 1: There are also more incisive critiques. And, you know, that's not to say none of it's useful. Like, for example, as many complaints as people have, everyone I've known who has been in a place where Médecins Sans Frontières slash Doctors Without Borders has operated, while they have complaints about Doctors Without Borders, is like, it's good that there's more doctors here, we fucking need them. And, you know, it's like UNHCR. Plenty of things to complain about UNHCR. Also, every refugee camp I go to, people have fucking water filters in tents and shit because of UNHCR, which isn't nothing. It's a damn sight more than nothing, and it's a damn sight more than any of these longtermist motherfuckers are doing for people who are, I don't know, displaced by war.

Speaker 1: Yeah. And some of the things that they're doing are these very strange kinds of attempts to calculate and create markets for human life and human suffering, right? Which you see a lot. Like, I've worked in nonprofits, I've worked in disaster response, I've seen some of these things on the ground, and you see these bizarre fucking decisions being made by someone in an office who has likely never been on the ground in these situations. And it inevitably results, within these big organizations like the Red Cross and MSF but also on a governmental level, in people not having the autonomy to respond in a situation to reduce human suffering, and instead being told to do something which is supposedly evidence-based, based on someone who's looked at the wrong criteria and come to the wrong conclusion hundreds of miles away. And it's bureaucrats, right? It's like we've somehow managed to create the absolute worst possible nightmare system, where you have a bunch of government bureaucrats and then you also have a bunch of sort of private-sector ones.
We're watching a collision of different kinds of private-sector bureaucrats. You have your sort of NGO bureaucrats, and then you have these billionaires who are also just fucking bureaucrats, and all of them are just doing box-ticking, and we get just the absolute worst nightmare fusion of horrible bureaucracy and capitalism, which is a great way to run programs to have people not die. And so much of this comes from... the whole, like, Freakonomics thing strikes me as, like you said, reading the Wikipedia article about a subject, trying to find a way you can apply a market to it, and then positing that as the solution.

Speaker 1: The episodes we're dropping on Bastards, well, the week before this episode will air, are about why the rent is so damn high. And one of the complaints I have is that there's a specific class of media people for whom the only answer they will accept is that there's not enough multi-family zoning, which is just a part of why the rent is so damn high. Reducing it to just that ignores the price-fixing software that landlords use on tens of millions of Americans. It ignores shit like Airbnb. It ignores the fucking problems in the construction industry, the lingering effects of the two thousand eight crash. It's very frustrating, and these kinds of Freakonomics guys like to do the same thing. The fucking Freakonomics dude in particular, one of the things he got famous for is being like, you know, the dropping crime in the nineties, this unprecedented fall in crime, was due to abortion. Which zero, I will say again, zero people who are experts on the topic of crime in America agree with. What they will say is, actually, there's a shitload of different things that contributed to the declining crime, and there's a good chance that abortion had an impact.
A bigger impact was probably getting the lead out, like reducing environmental lead, although that gets overstated too. There's all sorts of different shit, including air conditioning, just the fact that, yeah, now more people have air conditioning, and guess when violence is highest? In the summer, when people are stuck around each other outside. And all sorts of stuff, like computer games, people having something else to do. But if you're gonna be doing TED-talk fucking public works philosophy, then it helps to just be able to make one big Malcolm Gladwell-style fucking reveal. Anyway, that's how all these people exist and how all of their morality is informed.

Speaker 1: After two thousand thirteen, Friedman is kind of followed up by this guy named William MacAskill, who is a Scottish philosopher, which, God, it's easy to get called a philosopher these days. And he is a personal friend of Elon Musk. When Musk's text messages got released as part of that court filing, some of them were with MacAskill, who was considering putting a bunch of money into buying Twitter. They ultimately decided not to, I think, because it seems like MacAskill just didn't trust that Musk had any sort of plan. So he is, I will say this, not an idiot, but he's wrong in ways that are deeply fucked up. And he wrote a book that is currently a bestseller, published in August, called What We Owe the Future. And the gist of this is that it's merging this kind of effective altruism with what's called longtermism, which is this argument that morally we have to consider the impact of our actions not just on people alive today but also on future people. Which is fine, there's actually a lot to that idea, but the way it always works out is: we can't pay attention to problems that people are suffering from now.
We have to work on saving the world from these bigger problems. And again, it's almost exclusively used as an argument for guys like Musk to say, well, we shouldn't tax billionaires out of existence, because, you know, I see with clarity the problems that we face, and the long-term solution is for me to be able to push for these specific things that I think are the only way to save humanity. Right. I'm getting ahead of myself a little bit here. Let's talk about MacAskill again. When he was at Oxford... he's an Oxford boy, James. Ha, yeah. He started a group called Giving What We Can in two thousand nine, and members were supposed to give away ten percent of what they earned to the most cost-effective charities possible. Which is fine, there's nothing wrong with that idea, basically. And it was supposed to be basically a lifelong promise, because you assume Oxford people, a lot of them are gonna wind up making very good money. You know, as we move into our careers, this will be a more and more influential kind of giving. But yeah, they dropped the ball. If they'd had me there, yeah, those meetings might have gone a little bit differently.

Speaker 1: Over time, though, he's kind of merged this... and again, the whole effective altruism movement, a lot of it does start reasonably, with people being like, are these charities we're donating to working? How can we make sure they're effective? What can we do to make giving work better? Which is, again, perfectly fine, but it very quickly gets married to this kind of longtermist thinking, and they focus, instead of on stuff like, for example, funding hospitals, on stuff like preventing an artificial intelligence from killing everybody, or sending people to distant planets, which are cool and sci-fi and everything, but also deeply unrealistic.
Speaker 1: I'll say it right now: our threat is not that an AI kills us all. There's certainly a threat that different kinds of artificial intelligences are used by authoritarians to make life worse for everybody. But, by the way, Peter Thiel is a big backer of effective altruism. He's one of the people building that fucking AI.

Speaker 1: This is the guy who wrote that thing about earning to give, right? Like, he was the guy who did that? Yeah, okay, I'm familiar. He promised to never take more than, like, thirty-one thousand dollars a year or something over the course of his life and to give the rest to charity. He gives all his book profits to charity, but he also runs an organization that is spending more and more on keeping its people comfortable, because I guess he doesn't have the money personally to spend. Anyway, I think there's some sketchy shit there. Yeah, this whole idea, and I'm sure we're gonna get to it today, right, it completely overlooks our moral obligation to agitate for structural change. It says, like, if you can become a billionaire through whatever bullshit, evil, fucking exploitative grift you can, and then give ninety percent of that away, you're still perpetuating a system in which one grifter gets rich and thousands of people die without fucking clean water. But that's okay, because you also donated some water filters or whatever. And it's not okay, and it makes me very angry, actually.

Speaker 1: Yeah, yeah, it makes me angry too. And it's one of those things, if you look at, like, here are all the charities that MacAskill and his organization are putting hundreds of millions of dollars into: they're not all bad. A lot of them are good, and I'm glad that money is going there. But there's always this strain of deeply unsettling logic running through it.
Speaker 1: Now, I want to quote from a Time article that I think, in a very subtle way, kind of has this guy's number. Quote: When I start thinking in practice, if you've got some things that look robustly good in both the short and the long term, that definitely makes you feel a lot better about something that is only good from a very long-term perspective, he says. This year, for example, he personally donated to the Lead Exposure Elimination Project, which aims to end childhood lead exposure, and the Atlas Fellowship, which supports talented high school students around the world to work on pressing problems. Not all issues are equally tractable, but MacAskill still cares about a range. When we met in Oxford, he expressed concern for the ongoing political crisis in Sri Lanka, though he admitted he probably wouldn't tweet about it. The answer, he believes, is to be honest about it. In philanthropy, big donors typically choose causes based on their personal passions, an ultra-subjectivist approach, MacAskill says, where everything is seemingly justifiable on the basis of doing some good. He doesn't think that's tenable. If you can save someone from drowning or ten people from dying in a burning building, what should you do, he proposes. It is not a morally appropriate response to say, well, I'm particularly passionate about drowning, so I'm going to save one person from drowning rather than the ten people from burning. And that's exactly the situation we find ourselves in.

Speaker 1: And like, no, it is not. That is nonsense, because, among other things, if you're a random person and you have a choice between saving someone from drowning or ten people from dying in a burning building, well, you actually probably don't, because saving people from drowning is a really difficult technical skill, which is why people usually die when they try to rescue folks who are drowning.
Speaker 1: Yeah, the guy who created Yu-Gi-Oh! died from drowning. It's really hard and dangerous. And also, so is rescuing people from a burning building, which is why we have firefighters. And guess what, a lot of firefighters may not be very good at saving people from drowning, because they have not trained for that. These are both problems, but they're different skills. But what if you instead spent that time buying some Tesla stock, and then sold it and invested in, I don't know, finding something that stops water from drowning people? None of the problems we have... I'm going to say right now, zero percent of the problems we have are the result of some sort of lifeguard-firefighter standing in between a burning building and, like, a yacht race gone wrong, going, oh god, no. He's just trying to do the trolley problem.

Speaker 1: It's funny that you talked about Sri Lanka too, because this is the perfect example of a political crisis that is completely intractable to all of these... like, none of these people donating to charities can do literally anything about that, because, you know, the crisis in Sri Lanka is both a sort of short-term crisis of these, like, utterly horrific, genocidal political elites, and also a sort of long-term crisis about the structural position of specific countries in the global colonial system. This is not something any of these people can solve. The only way any of these people could solve this is if the people of Sri Lanka just expropriated them.
But, you know, because Sri Lankans do not have access to this guy and, like, six guns, right, there's no way. He can just sort of sit there in his chair going, well, it's a crisis. Am I gonna tweet about it? I'm not gonna tweet about it. Yeah, I will simply talk to newspapers about not tweeting about it.

Speaker 1: What I would say is, here's the actual solution to the stupid problem this guy came up with. Well, if we were to tax all of the billionaires to the point that they weren't billionaires, and then put that into a massive new, like, works progress fund that, instead of just building national parks, provided rental assistance to millions of Americans in exchange for them learning how to fight fires, getting basic life-saving care training, and getting trained in things like that, so that they could deal with the consequences of climate change and be able to protect their communities effectively, and be incentivized to gain the actual technical skills that would allow them to protect people, well, then you would have more people capable of saving someone from a burning building or from drowning. But anyway, whatever. That's my pie-in-the-sky leftist solution to that: use funds taken from the rich in order to incentivize people to gain the skills that will allow them to protect their communities in the event of disasters. Anyway, whatever.

Speaker 1: So, over the last decade, all of this thinking has increasingly given way from a wonky theory on charitable giving by big-hearted, guilt-ridden millennial kids... and that's how this guy, MacAskill, is always framed in articles. In fact, I'm gonna scroll down here to my notes and find the section of the article to show you the way he gets fucking talked about in all of these. Quote:
Thirteen years ago, William 594 00:31:31,320 --> 00:31:34,320 Speaker 1: MacAskill found himself standing in the aisle of a grocery store, 595 00:31:34,560 --> 00:31:37,640 Speaker 1: agonizing over which breakfast cereal to buy. If he switched 596 00:31:37,640 --> 00:31:39,520 Speaker 1: to a cheaper brand for a year, could he put 597 00:31:39,560 --> 00:31:45,400 Speaker 1: aside enough money to save someone's life? Like, that's the, yeah, 598 00:31:45,400 --> 00:31:47,200 Speaker 1: that's sort of the engagement you have, where your engagement 599 00:31:47,200 --> 00:31:51,160 Speaker 1: with global poverty is in the fucking Cheerios aisle. Exactly, exactly. 600 00:31:53,520 --> 00:31:58,120 Speaker 1: And then, yeah, a Waitrose in Oxford, I'm sure. Like, no, fuck, sorry, 601 00:31:58,160 --> 00:32:01,560 Speaker 1: I'm so fucking angry at this, and it's clearly, 602 00:32:01,720 --> 00:32:03,800 Speaker 1: very clearly, I can see that this is going towards 603 00:32:03,800 --> 00:32:06,520 Speaker 1: an excuse for incredibly wealthy people paying fuck all in 604 00:32:06,600 --> 00:32:08,960 Speaker 1: taxes, because they claim that it's not an efficient way 605 00:32:09,000 --> 00:32:12,760 Speaker 1: to do things, and they completely ignore all these structural 606 00:32:12,880 --> 00:32:15,840 Speaker 1: things which have to exist for their effective altruism to 607 00:32:15,880 --> 00:32:20,520 Speaker 1: occur in the first place, right? Yeah, it's, um. Anyway, 608 00:32:20,600 --> 00:32:23,520 Speaker 1: this has effectively, like, over the years given way from 609 00:32:23,560 --> 00:32:26,560 Speaker 1: this, again, kind of wonky theory by guilty millennial 610 00:32:26,600 --> 00:32:29,560 Speaker 1: kids to this pop philosophy for the fintech set, because 611 00:32:29,840 --> 00:32:32,320 Speaker 1: that's how these guilt-ridden millennial kids wound up making 612 00:32:32,360 --> 00:32:35,640 Speaker 1: a bunch of money. Um. And yeah, that Time article 613 00:32:35,880 --> 00:32:38,880 Speaker 1: gives, like... I just want to read another quote from 614 00:32:38,880 --> 00:32:41,320 Speaker 1: it about one of the other guys who's involved in 615 00:32:41,320 --> 00:32:45,880 Speaker 1: putting a lot of money into MacAskill's organization. Quote: Mr 616 00:32:46,120 --> 00:32:49,280 Speaker 1: Bankman-Fried makes his donations through the FTX Foundation, 617 00:32:49,280 --> 00:32:51,120 Speaker 1: which has given away a hundred and forty million, of 618 00:32:51,120 --> 00:32:53,360 Speaker 1: which ninety million has gone through the group's Future Fund 619 00:32:53,440 --> 00:32:56,760 Speaker 1: towards long-term causes. Mr MacAskill and Mr Bankman-Fried's 620 00:32:56,800 --> 00:32:59,600 Speaker 1: relationship is an important piece in understanding the community's evolution 621 00:32:59,640 --> 00:33:01,520 Speaker 1: in recent years. The two men first met in 622 00:33:01,520 --> 00:33:03,560 Speaker 1: two thousand and twelve, when Mr Bankman-Fried was a 623 00:33:03,600 --> 00:33:07,080 Speaker 1: student at MIT with an interest in utilitarian philosophy. 624 00:33:07,160 --> 00:33:09,480 Speaker 1: Over lunch, Mr Bankman-Fried said that he was interested 625 00:33:09,520 --> 00:33:13,320 Speaker 1: in work related to animal welfare.
Mr MacAskill suggested 626 00:33:13,360 --> 00:33:15,520 Speaker 1: he might do more good by entering a high-earning 627 00:33:15,560 --> 00:33:18,320 Speaker 1: field and donating money to the cause than by working 628 00:33:18,320 --> 00:33:21,560 Speaker 1: for it directly. Mr Bankman-Fried contacted the Humane League 629 00:33:21,560 --> 00:33:23,640 Speaker 1: and other charities, asking if they would prefer his time 630 00:33:23,760 --> 00:33:26,239 Speaker 1: or donations based on his expected earnings if he went 631 00:33:26,280 --> 00:33:28,640 Speaker 1: to work in tech or finance. They opted for the money, 632 00:33:28,640 --> 00:33:31,640 Speaker 1: and he embarked on a remunerative career, eventually founding the 633 00:33:31,680 --> 00:33:40,600 Speaker 1: cryptocurrency exchange. First off, that guy absolutely did not call 634 00:33:40,640 --> 00:33:43,160 Speaker 1: any charities. Um, sorry, this was, uh, this was 635 00:33:43,200 --> 00:33:46,320 Speaker 1: from the Forbes article I used, not the Time article. Um. 636 00:33:46,360 --> 00:33:49,560 Speaker 1: First off, I don't believe that he did, but if he did, 637 00:33:49,560 --> 00:33:52,400 Speaker 1: it was something like, hey, I don't have any skills 638 00:33:52,600 --> 00:33:54,640 Speaker 1: or training, do you want money or do you want 639 00:33:54,680 --> 00:33:56,960 Speaker 1: me to volunteer? And they were like, who even is this kid? Like, 640 00:33:57,000 --> 00:33:59,560 Speaker 1: we don't need another asshole wandering around here 641 00:33:59,600 --> 00:34:04,040 Speaker 1: trying to touch the cats. Um, send us a check. Yeah. 642 00:34:04,160 --> 00:34:06,840 Speaker 1: And so instead of, I don't know, getting trained as 643 00:34:06,880 --> 00:34:09,040 Speaker 1: a vet tech or something where he would actually be 644 00:34:09,040 --> 00:34:12,319 Speaker 1: able to help animals, he founded a cryptocurrency exchange and 645 00:34:12,360 --> 00:34:14,840 Speaker 1: contributed to the burning of massive amounts of carbon that 646 00:34:14,840 --> 00:34:18,120 Speaker 1: will contribute to mass deforestation and the deaths of animals 647 00:34:18,160 --> 00:34:20,760 Speaker 1: around the world. That's good. I think that there's another 648 00:34:20,800 --> 00:34:22,960 Speaker 1: aspect of this which I think is sort of underexplored, 649 00:34:23,000 --> 00:34:26,080 Speaker 1: which is that utilitarianism is genuinely one of the greatest 650 00:34:26,080 --> 00:34:29,600 Speaker 1: evils humanity has ever created. Every bad decision anyone 651 00:34:29,600 --> 00:34:31,440 Speaker 1: has ever made, if you look behind it, you can 652 00:34:31,480 --> 00:34:33,680 Speaker 1: find a utilitarian. Like, it's the basis of the, 653 00:34:33,680 --> 00:34:39,080 Speaker 1: the basis of all economics. It's horrible, everything in the world. 654 00:34:41,120 --> 00:34:43,440 Speaker 1: It is an engine that allows rich people to feel 655 00:34:43,480 --> 00:34:46,080 Speaker 1: good about hurting poor people. That's what it is. 656 00:34:46,239 --> 00:34:48,240 Speaker 1: And that's what I think this all makes clear. 657 00:34:48,600 --> 00:34:51,759 Speaker 1: So the actual rhetoric from these people is always, like, 658 00:34:51,960 --> 00:34:53,640 Speaker 1: especially if you're just kind of encountering it out in 659 00:34:53,640 --> 00:34:55,400 Speaker 1: the wild.
It's hard to argue with a lot of 660 00:34:55,440 --> 00:34:57,160 Speaker 1: the time, because they'll be like, well, look, we need 661 00:34:57,200 --> 00:34:58,880 Speaker 1: to look at what's going to help the most people, 662 00:34:58,920 --> 00:35:01,000 Speaker 1: and that's why we're, you know, setting up... None of 663 00:35:01,000 --> 00:35:03,000 Speaker 1: this matters if we don't deal with this problem or 664 00:35:03,040 --> 00:35:07,239 Speaker 1: that problem. And it's tailor-made to sound profound 665 00:35:07,280 --> 00:35:09,240 Speaker 1: in, again, like, a TED talk or the website 666 00:35:09,239 --> 00:35:11,759 Speaker 1: for some charitable giving organization aimed at getting you to, 667 00:35:11,840 --> 00:35:14,239 Speaker 1: like, put ten percent of your income to longtermist 668 00:35:14,239 --> 00:35:17,520 Speaker 1: causes. But again, the fucked-up shit crusts kind 669 00:35:17,560 --> 00:35:19,880 Speaker 1: of around the edges for the most part, in lines 670 00:35:19,960 --> 00:35:22,960 Speaker 1: like these from a Time profile on MacAskill: The 671 00:35:23,000 --> 00:35:26,160 Speaker 1: first public protest against African American slavery was the sixteen 672 00:35:26,440 --> 00:35:30,160 Speaker 1: eighty-eight Germantown Quaker petition. Slavery was only... Yeah, 673 00:35:30,239 --> 00:35:33,240 Speaker 1: slavery was only abolished in the British Empire in eighteen thirty-three, 674 00:35:33,280 --> 00:35:35,920 Speaker 1: decades later in the US, and not until nineteen 675 00:35:36,000 --> 00:35:39,560 Speaker 1: sixty-two in Saudi Arabia. History encourages MacAskill to favor 676 00:35:39,560 --> 00:35:43,120 Speaker 1: gradual progress over revolution. Abolition, he says, is maybe 677 00:35:43,120 --> 00:35:45,400 Speaker 1: the single best moral change ever. It's certainly up there 678 00:35:45,440 --> 00:35:48,560 Speaker 1: with feminism, and they're extremely incremental. They don't seem that 679 00:35:48,600 --> 00:35:50,840 Speaker 1: way because we enormously shrink the past. But it's almost 680 00:35:50,840 --> 00:35:54,239 Speaker 1: three hundred years we're talking about. Um. That wasn't the 681 00:35:54,239 --> 00:35:56,640 Speaker 1: result of incremental change. It was the result of fighting against the 682 00:35:56,640 --> 00:36:01,439 Speaker 1: people who owned slaves, who fought viciously against any attempt to end slavery. Like, yeah, 683 00:36:01,840 --> 00:36:04,239 Speaker 1: it was a battle. It was a 684 00:36:04,280 --> 00:36:07,000 Speaker 1: series of, in fact, a series of revolutions in a 685 00:36:07,040 --> 00:36:10,240 Speaker 1: lot of cases, including, like, the Haitian Revolution and guys 686 00:36:10,280 --> 00:36:13,440 Speaker 1: like John Brown. There was shit like Bleeding Kansas. A 687 00:36:13,440 --> 00:36:15,840 Speaker 1: shitload of people died fighting in order to 688 00:36:15,960 --> 00:36:19,480 Speaker 1: end slavery. Like, yeah, there was a civil war, dude. What 689 00:36:19,520 --> 00:36:23,280 Speaker 1: do you call that? That's not incremental. A million people 690 00:36:23,400 --> 00:36:25,719 Speaker 1: shot each other to death, you know, and 691 00:36:25,719 --> 00:36:27,200 Speaker 1: as far as we can talk about sort of incremental 692 00:36:27,239 --> 00:36:29,839 Speaker 1: progress.
It's stuff like, okay, so, like, 693 00:36:29,960 --> 00:36:32,600 Speaker 1: the slaves in Haiti freed themselves by means of revolution 694 00:36:32,680 --> 00:36:34,600 Speaker 1: and then sent a bunch of guns and weapons to 695 00:36:34,800 --> 00:36:37,399 Speaker 1: people in Latin America so that their armies could march 696 00:36:37,440 --> 00:36:42,160 Speaker 1: through Latin America and end slavery. Like, many revolutions had to 697 00:36:42,239 --> 00:36:45,560 Speaker 1: occur to end slavery, because it was a powerful system 698 00:36:45,640 --> 00:36:48,200 Speaker 1: at the center of global capital that a lot of 699 00:36:48,360 --> 00:36:51,640 Speaker 1: entrenched and heavily armed interests were willing to die to maintain. 700 00:36:52,000 --> 00:36:56,000 Speaker 1: Which also is fun, because I bet, I 701 00:36:56,080 --> 00:36:58,000 Speaker 1: bet if you look through these people's supply chains, and this 702 00:36:58,040 --> 00:37:00,719 Speaker 1: is almost certainly true of Elon Musk's supply chain, like, 703 00:37:01,239 --> 00:37:05,920 Speaker 1: I mean, okay, Musk's supply chains in China, you can 704 00:37:06,000 --> 00:37:08,600 Speaker 1: have some kind of debate as to whether the kinds 705 00:37:08,600 --> 00:37:10,920 Speaker 1: of forced labor you're going to be encountering are slavery. 706 00:37:11,080 --> 00:37:13,920 Speaker 1: Like, I bet if you look at the 707 00:37:13,960 --> 00:37:15,960 Speaker 1: people who are effective altruists, you can find slavery in 708 00:37:15,960 --> 00:37:19,040 Speaker 1: their supply chains, and their arguments will be like, well, 709 00:37:19,160 --> 00:37:22,200 Speaker 1: I can't end slavery in my supply chain, because, I 710 00:37:22,280 --> 00:37:25,920 Speaker 1: guarantee it, they're in the tech industry, and, like, nobody 711 00:37:25,960 --> 00:37:28,920 Speaker 1: has a laptop or a smartphone without the use of 712 00:37:29,000 --> 00:37:32,600 Speaker 1: rare earth minerals that are, like, acquired via slavery. It's 713 00:37:32,640 --> 00:37:34,799 Speaker 1: the same thing if you're wearing clothes: you have 714 00:37:34,920 --> 00:37:37,920 Speaker 1: something that slavery was involved in, because the garment industry, 715 00:37:38,160 --> 00:37:42,000 Speaker 1: slavery is literally inextricable from it. Like, the company that 716 00:37:42,040 --> 00:37:45,120 Speaker 1: has tried the hardest to remove slavery from 717 00:37:45,160 --> 00:37:49,400 Speaker 1: their production line, Patagonia, um, still continually finds, like, oh no. 718 00:37:49,520 --> 00:37:54,239 Speaker 1: They're smart. Yeah, they're pretty good, I'm going out, but yeah, 719 00:37:53,800 --> 00:37:56,120 Speaker 1: they put a load of money into that shit and 720 00:37:56,120 --> 00:38:01,160 Speaker 1: they still... it is hard. Um. Anyway, um, I'm going 721 00:38:01,200 --> 00:38:04,080 Speaker 1: to read another fun quote from the Forbes article. Mr 722 00:38:04,120 --> 00:38:06,160 Speaker 1: Bankman-Fried said he expected to give away the bulk 723 00:38:06,160 --> 00:38:08,319 Speaker 1: of his fortune in the next ten to twenty years. If 724 00:38:08,320 --> 00:38:10,880 Speaker 1: you're worried about existential risks of a really bad pandemic, 725 00:38:10,920 --> 00:38:13,080 Speaker 1: you sort of can't stall on that, Mr Bankman-Fried 726 00:38:13,120 --> 00:38:15,600 Speaker 1: said in an interview. That is how his text messages 727 00:38:15,600 --> 00:38:17,880 Speaker 1: popped up among hundreds of others sent to Mr Musk.
728 00:38:18,400 --> 00:38:21,279 Speaker 1: Mr Bankman-Fried ultimately did not join Mr Musk's bid. 729 00:38:21,360 --> 00:38:23,400 Speaker 1: I don't know exactly what Elon's goals are going to 730 00:38:23,440 --> 00:38:25,720 Speaker 1: be with Twitter, Mr Bankman-Fried said in an interview. 731 00:38:25,880 --> 00:38:28,359 Speaker 1: There was a little bit of ambiguity there. He had 732 00:38:28,360 --> 00:38:30,719 Speaker 1: his hands full in the months that followed as cryptocurrency 733 00:38:30,760 --> 00:38:33,080 Speaker 1: prices crashed. The Twitter deal has been volatile in its 734 00:38:33,080 --> 00:38:35,200 Speaker 1: own way, with Mr Musk trying to back out before 735 00:38:35,239 --> 00:38:37,759 Speaker 1: recently announcing his intention to follow through after all. 736 00:38:38,000 --> 00:38:41,200 Speaker 1: In August, Mr Musk retweeted Mr MacAskill's book announcement to 737 00:38:41,239 --> 00:38:44,080 Speaker 1: his hundred and eight million followers with the observation, worth 738 00:38:44,120 --> 00:38:48,160 Speaker 1: reading, this is a close match to my philosophy. So 739 00:38:50,120 --> 00:38:54,319 Speaker 1: that's kind of the surface of where we are now. 740 00:38:54,600 --> 00:38:57,960 Speaker 1: Um, it is not... it doesn't quite get at all 741 00:38:58,000 --> 00:38:59,880 Speaker 1: of the things that are deeply fucked up. And for 742 00:39:00,040 --> 00:39:03,319 Speaker 1: that I wanted to quote from another article. Um, I 743 00:39:03,400 --> 00:39:06,960 Speaker 1: found it on Aeon, A E O N. It's an 744 00:39:07,040 --> 00:39:09,880 Speaker 1: essay by, uh, God, let me get the author here, 745 00:39:09,880 --> 00:39:11,960 Speaker 1: because it's quite good, about longtermism. It's 746 00:39:11,960 --> 00:39:15,240 Speaker 1: an essay called Against Longtermism by Émile P. Torres, 747 00:39:15,280 --> 00:39:19,960 Speaker 1: a PhD candidate at a university in Hanover in Germany, 748 00:39:20,200 --> 00:39:23,359 Speaker 1: uh, Leibniz Universität. I don't know, I feel silly every 749 00:39:23,400 --> 00:39:25,400 Speaker 1: time I try to say German words, so I'm not going 750 00:39:25,440 --> 00:39:27,919 Speaker 1: to try that hard. But the article is very good, 751 00:39:28,160 --> 00:39:31,800 Speaker 1: um, and it kind of gets at how this effective 752 00:39:31,840 --> 00:39:35,920 Speaker 1: altruism movement has merged with longtermism in 753 00:39:35,960 --> 00:39:40,719 Speaker 1: a way that specifically exists to buoy the interests of 754 00:39:40,840 --> 00:39:44,719 Speaker 1: wealthy authoritarians around the world. Quote: This has roots in 755 00:39:44,760 --> 00:39:47,359 Speaker 1: the work of Nick Bostrom, who founded the grandiosely named 756 00:39:47,400 --> 00:39:50,439 Speaker 1: Future of Humanity Institute, FHI, in two thousand five, 757 00:39:50,560 --> 00:39:53,640 Speaker 1: and Nick Beckstead, a research associate at FHI and a program 758 00:39:53,680 --> 00:39:56,840 Speaker 1: officer at Open Philanthropy. It has been defended most publicly 759 00:39:56,880 --> 00:39:59,400 Speaker 1: by the FHI philosopher Toby Ord, the author of The 760 00:39:59,400 --> 00:40:03,000 Speaker 1: Precipice: Existential Risk and the Future of Humanity.
Longtermism 761 00:40:03,080 --> 00:40:06,000 Speaker 1: is the primary research focus of both the Global Priorities 762 00:40:06,040 --> 00:40:09,160 Speaker 1: Institute, GPI, an FHI-linked organization directed by 763 00:40:09,239 --> 00:40:12,960 Speaker 1: Hilary Greaves, and the Forethought Foundation, run by William MacAskill, 764 00:40:13,120 --> 00:40:15,759 Speaker 1: who also holds positions at FHI and GPI. 765 00:40:16,160 --> 00:40:18,920 Speaker 1: Adding to the tangle of titles, names, institutes and acronyms, 766 00:40:18,960 --> 00:40:21,120 Speaker 1: longtermism is one of the main cause areas of 767 00:40:21,160 --> 00:40:24,239 Speaker 1: the so-called effective altruism movement, which was introduced by 768 00:40:24,360 --> 00:40:26,600 Speaker 1: Ord in around two thousand eleven and now boasts 769 00:40:26,600 --> 00:40:29,319 Speaker 1: of having a mind-boggling forty-six billion dollars in 770 00:40:29,360 --> 00:40:32,560 Speaker 1: committed funding. It is difficult to overstate how influential longtermism 771 00:40:32,640 --> 00:40:35,680 Speaker 1: has become. Karl Marx in eighteen forty-five declared 772 00:40:35,680 --> 00:40:38,160 Speaker 1: that the point of philosophy isn't merely to interpret the world 773 00:40:38,200 --> 00:40:40,640 Speaker 1: but to change it, and this is exactly what longtermists 774 00:40:40,640 --> 00:40:43,839 Speaker 1: have been doing with extraordinary success. Consider that Elon Musk, 775 00:40:43,880 --> 00:40:46,759 Speaker 1: who has cited and endorsed Bostrom's work, has donated one 776 00:40:46,760 --> 00:40:49,799 Speaker 1: point five million dollars to FHI through its sister organization, the 777 00:40:49,920 --> 00:40:53,120 Speaker 1: even more grandiosely named Future of Life Institute. This was 778 00:40:53,160 --> 00:40:56,600 Speaker 1: co-founded by the multimillionaire tech entrepreneur Jaan Tallinn, who, 779 00:40:56,640 --> 00:40:59,200 Speaker 1: as I recently noted, doesn't believe that climate change poses 780 00:40:59,200 --> 00:41:02,239 Speaker 1: an existential threat to humanity because of his adherence to 781 00:41:02,280 --> 00:41:06,320 Speaker 1: the longtermist ideology. Meanwhile, the billionaire libertarian and Donald 782 00:41:06,320 --> 00:41:08,800 Speaker 1: Trump supporter Peter Thiel, who once gave the keynote address 783 00:41:08,800 --> 00:41:11,719 Speaker 1: at an Effective Altruism conference, has donated large sums of 784 00:41:11,719 --> 00:41:15,040 Speaker 1: money to the Machine Intelligence Research Institute, whose mission is 785 00:41:15,080 --> 00:41:18,000 Speaker 1: to save humanity from superintelligent machines and is 786 00:41:18,040 --> 00:41:21,600 Speaker 1: deeply intertwined with longtermist values. Other organizations, such as 787 00:41:21,640 --> 00:41:24,040 Speaker 1: GPI and the Forethought Foundation, are funding 788 00:41:24,120 --> 00:41:26,640 Speaker 1: essay contests and scholarships in an effort to draw young 789 00:41:26,680 --> 00:41:29,080 Speaker 1: people into the community, while it's an open secret that 790 00:41:29,120 --> 00:41:31,760 Speaker 1: the Washington DC-based Center for Security 791 00:41:31,840 --> 00:41:34,880 Speaker 1: and Emerging Technology, CSET, aims to place longtermists 792 00:41:34,880 --> 00:41:38,120 Speaker 1: within high-level US government positions to shape national policy.
793 00:41:38,400 --> 00:41:41,279 Speaker 1: In fact, CSET was established by Jason Matheny, a 794 00:41:41,320 --> 00:41:43,400 Speaker 1: former research assistant at FHI who is now the 795 00:41:43,400 --> 00:41:46,239 Speaker 1: Deputy Assistant to US President Joe Biden for Technology and 796 00:41:46,320 --> 00:41:50,400 Speaker 1: National Security. Ord himself has, astonishingly for a philosopher, advised 797 00:41:50,440 --> 00:41:53,359 Speaker 1: the World Health Organization, the World Bank, the World Economic Forum, 798 00:41:53,400 --> 00:41:56,720 Speaker 1: the US National Intelligence Council, the UK Prime Minister's Office, 799 00:41:56,760 --> 00:41:59,240 Speaker 1: Cabinet Office, and Government Office for Science, and he recently 800 00:41:59,239 --> 00:42:02,000 Speaker 1: contributed to a report from the Secretary General of the United 801 00:42:02,080 --> 00:42:05,520 Speaker 1: Nations that specifically mentions longtermism. The short 802 00:42:05,560 --> 00:42:08,800 Speaker 1: answer is that elevating the fulfillment of humanity's supposed potential 803 00:42:08,800 --> 00:42:11,440 Speaker 1: above all else could non-trivially increase the probability that 804 00:42:11,480 --> 00:42:14,680 Speaker 1: actual people, those alive today and in the near future, suffer 805 00:42:14,719 --> 00:42:17,680 Speaker 1: extreme harms, even death. Consider that, as I noted elsewhere, the 806 00:42:17,719 --> 00:42:20,640 Speaker 1: longtermist ideology inclines its adherents to take an insouciant 807 00:42:20,680 --> 00:42:24,239 Speaker 1: attitude towards climate change. Why? Because even if climate change 808 00:42:24,239 --> 00:42:27,200 Speaker 1: causes island nations to disappear, triggers mass migrations, and kills 809 00:42:27,239 --> 00:42:29,759 Speaker 1: millions of people, it probably isn't going to compromise our 810 00:42:29,800 --> 00:42:32,480 Speaker 1: long-term potential over the coming trillions of years. If 811 00:42:32,480 --> 00:42:34,799 Speaker 1: one takes a cosmic view of the situation, even a 812 00:42:34,840 --> 00:42:38,000 Speaker 1: climate catastrophe that cuts the human population by seventy-five percent for 813 00:42:38,000 --> 00:42:40,279 Speaker 1: the next two millennia will, in the grand scheme of things, 814 00:42:40,440 --> 00:42:42,799 Speaker 1: be nothing more than a small blip, the equivalent of 815 00:42:42,800 --> 00:42:44,719 Speaker 1: a ninety-year-old man having stubbed his toe when 816 00:42:44,760 --> 00:42:49,160 Speaker 1: he was two. So this is evil, right? Like, this 817 00:42:49,239 --> 00:42:52,239 Speaker 1: is vicious and vile and cruel, and 818 00:42:52,280 --> 00:42:54,200 Speaker 1: it's one of those things... There's a book that I've 819 00:42:54,200 --> 00:42:56,399 Speaker 1: talked about on the show a couple of times, um, 820 00:42:56,440 --> 00:42:59,640 Speaker 1: that is quite popular, called The Ministry for the Future, um, 821 00:42:59,719 --> 00:43:01,319 Speaker 1: and it's a very good book, and one of, 822 00:43:01,360 --> 00:43:03,640 Speaker 1: like, the basic premise of it is that 823 00:43:04,080 --> 00:43:08,600 Speaker 1: climate change is finally addressed and the worst aspects of 824 00:43:08,600 --> 00:43:10,719 Speaker 1: it are dealt with and, like, begin to be 825 00:43:10,760 --> 00:43:14,600 Speaker 1: repaired because of the establishment of an organization called the 826 00:43:14,600 --> 00:43:17,719 Speaker 1: Ministry for the Future.
It's an international organization that exists to, 827 00:43:17,880 --> 00:43:21,279 Speaker 1: like, look out for the interests of unborn people and 828 00:43:21,520 --> 00:43:24,239 Speaker 1: animals and plant species. And part of how they do 829 00:43:24,280 --> 00:43:27,160 Speaker 1: this is by murdering billionaires in their beds, uh, and 830 00:43:27,360 --> 00:43:30,480 Speaker 1: blowing up planes to end international air travel, which is... 831 00:43:30,520 --> 00:43:32,880 Speaker 1: So there's a version... Like, again, the idea that, like, 832 00:43:33,719 --> 00:43:37,680 Speaker 1: we should be thinking about people and living creatures 833 00:43:37,680 --> 00:43:40,400 Speaker 1: who have not yet been born is reasonable, and the 834 00:43:40,440 --> 00:43:43,640 Speaker 1: reasonable conclusion of that is, and so we should deal 835 00:43:43,719 --> 00:43:47,520 Speaker 1: with things like climate change and stop, like, thoughtlessly degrading 836 00:43:47,520 --> 00:43:50,560 Speaker 1: our environment, so that people in the future will be 837 00:43:50,640 --> 00:43:54,279 Speaker 1: able to live a quality life. Um. The argument that 838 00:43:54,360 --> 00:43:56,839 Speaker 1: these longtermists are making is, no, that's foolish, because 839 00:43:56,880 --> 00:43:58,799 Speaker 1: in a trillion years, none of it will matter, and 840 00:43:58,840 --> 00:44:00,920 Speaker 1: I intend to be alive in a trillion years because I 841 00:44:00,960 --> 00:44:04,680 Speaker 1: will be an immortal machine-man billionaire forever. You know, 842 00:44:04,840 --> 00:44:08,560 Speaker 1: it's about these people. These people, like, you think about this: 843 00:44:08,920 --> 00:44:11,799 Speaker 1: if you believe this, the only, literally the only thing 844 00:44:12,239 --> 00:44:14,120 Speaker 1: that you should spend your time doing is trying to 845 00:44:14,120 --> 00:44:16,879 Speaker 1: dismantle every single nuclear weapon on the planet. Like, 846 00:44:16,880 --> 00:44:19,239 Speaker 1: you should be forming your own private armies to, 847 00:44:19,400 --> 00:44:22,440 Speaker 1: like, storm military bases to destroy nukes. And none of 848 00:44:22,480 --> 00:44:24,279 Speaker 1: them will ever fucking do this. All these people will 849 00:44:24,280 --> 00:44:26,480 Speaker 1: back candidates who, like, want to keep nuclear weapons. 850 00:44:26,480 --> 00:44:29,359 Speaker 1: All these people will back candidates who, like... 851 00:44:29,560 --> 00:44:32,120 Speaker 1: you know, I wonder how many of these people personally supported 852 00:44:32,200 --> 00:44:33,600 Speaker 1: dropping a nuke in the middle of Iraq in 853 00:44:33,640 --> 00:44:41,000 Speaker 1: two thousand four. Like, god. Yeah, anyway, this is probably... 854 00:44:41,880 --> 00:44:45,000 Speaker 1: that's probably enough. I wanted to... At some point, I 855 00:44:45,040 --> 00:44:47,879 Speaker 1: think we will be doing a more detailed look into 856 00:44:47,920 --> 00:44:50,160 Speaker 1: some of these people, and a more detailed look into 857 00:44:50,200 --> 00:44:53,520 Speaker 1: some of this. Maybe it's a Bastards episode, but this is 858 00:44:53,560 --> 00:44:56,160 Speaker 1: just getting more relevant.
And I wanted to give people... 859 00:44:56,280 --> 00:44:58,680 Speaker 1: I wanted to connect them with some, like, some 860 00:44:58,800 --> 00:45:02,840 Speaker 1: resources, particularly that article on Aeon about the 861 00:45:03,320 --> 00:45:09,000 Speaker 1: dangers of longtermism. And, uh, yeah, anyway, be advised: 862 00:45:09,120 --> 00:45:12,799 Speaker 1: this is what the fucking assholes who have spent, like... 863 00:45:12,880 --> 00:45:15,400 Speaker 1: Think about how many cool things the tech industry has 864 00:45:15,400 --> 00:45:19,239 Speaker 1: actually made in the last decade. It's not many, right? Like, 865 00:45:19,280 --> 00:45:22,239 Speaker 1: it's mostly been vaporware. Like, most of the different big 866 00:45:22,280 --> 00:45:24,480 Speaker 1: apps and stuff are all in the process of 867 00:45:24,480 --> 00:45:27,160 Speaker 1: collapsing right now. That's why the industry is falling apart 868 00:45:28,080 --> 00:45:31,160 Speaker 1: as we record this. In the metaverse. Yeah, that's right, 869 00:45:31,239 --> 00:45:34,680 Speaker 1: that's right. The legs. It's like you're sitting right next 870 00:45:34,680 --> 00:45:37,520 Speaker 1: to me, James, except you have no legs and 871 00:45:37,520 --> 00:45:44,880 Speaker 1: your mouth is open in an endless, wordless scream. Um. Finally, anyway, 872 00:45:44,920 --> 00:45:47,040 Speaker 1: that's what these assholes want to do, what they've done 873 00:45:47,040 --> 00:45:50,359 Speaker 1: to the Internet, sucking the vibrancy and the life and, 874 00:45:50,440 --> 00:45:54,680 Speaker 1: like, the freedom out of this incredible creation, and 875 00:45:54,760 --> 00:45:58,600 Speaker 1: turning it into, uh, an engine for sucking your personal 876 00:45:58,719 --> 00:46:01,520 Speaker 1: data out and marketing things to you and making you 877 00:46:01,600 --> 00:46:04,600 Speaker 1: angry all the time as much as possible, and convincing 878 00:46:04,640 --> 00:46:07,920 Speaker 1: your parents and grandparents that fucking Joe Biden has been 879 00:46:07,920 --> 00:46:11,600 Speaker 1: replaced by a lizard man. Um, like, the people who 880 00:46:11,640 --> 00:46:15,160 Speaker 1: did that, uh, now think that we can't take care 881 00:46:15,200 --> 00:46:17,720 Speaker 1: of people today because that would distract from our mission 882 00:46:17,760 --> 00:46:19,680 Speaker 1: to take care of people who have never been born, 883 00:46:19,719 --> 00:46:28,279 Speaker 1: a trillion years from now. Um, anyway, fuck them. It 884 00:46:28,360 --> 00:46:30,600 Speaker 1: Could Happen Here is a production of Cool Zone Media. 885 00:46:30,840 --> 00:46:33,520 Speaker 1: For more podcasts from Cool Zone Media, visit our website, 886 00:46:33,560 --> 00:46:35,680 Speaker 1: coolzonemedia dot com, or check us out on 887 00:46:35,719 --> 00:46:38,239 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you 888 00:46:38,320 --> 00:46:41,080 Speaker 1: listen to podcasts. You can find sources for It Could 889 00:46:41,080 --> 00:46:44,080 Speaker 1: Happen Here, updated monthly, at coolzonemedia dot com 890 00:46:44,160 --> 00:46:46,040 Speaker 1: slash sources. Thanks for listening.