Jacob Goldstein: Pushkin. One of the most interesting people in the crypto world right now is Sam Bankman-Fried. He's thirty years old, founded the crypto exchange FTX a few years back, and today he is worth around twenty billion dollars, according to Forbes. And yet he still lives with roommates and drives a Toyota Corolla. And maybe the most interesting thing about Sam: he's actually not that into crypto. He didn't get into it because he thinks bitcoin is going to replace the dollar, or that we're on the brink of some revolution in the very meaning of money. He got into crypto so that he could make as much money as possible and then give almost all of it away. So when Sam thinks about really big problems, he doesn't necessarily think about how much the price of bitcoin is falling, or that a big stablecoin fell apart a few weeks ago. He thinks about things like how to save humanity from extinction.

Sam Bankman-Fried: How many people will live if we play our cards right as a world? Like, in the ninetieth percentile outcome, how many people will live in the future? The answer is trillions, probably, maybe hundreds of trillions.

Jacob Goldstein: I'm Jacob Goldstein, and this is What's Your Problem, the show where entrepreneurs and engineers talk about how they're going to change the world once they solve a few problems. My guest today is Sam Bankman-Fried, and his problem is this: how do you save the world? Before we get to the interview, I just want to take a minute here and set up this one big idea, this really useful intellectual framework that drives almost everything Sam does. It's called expected value.

Sam Bankman-Fried: I try to use it a lot, because I think it sort of is the default correct way, in some senses, to calculate something. Like, if you're just trying to do a generic calculation, I think it's usually the right thing to use.

Jacob Goldstein: You could understand expected value by understanding how Sam decided to start his company, the crypto exchange FTX.
He was working as a trader, making millions of dollars, and when he thought about starting FTX, he knew there was a really good chance it might fail. It might ultimately be worth zero. But on the other hand, if it succeeded, it could be worth tens of billions of dollars. So here is a slightly oversimplified version of how you would use expected value in this case. Say Sam thought there was only a one percent chance that his exchange would be really successful, but that if it were really successful, it would make him twenty billion dollars. The expected value of starting the exchange is the probability of it being successful, one percent, times the value if that success happens, twenty billion dollars, which comes out to two hundred million dollars. A lot.
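(A quick aside to make the arithmetic concrete: here is a minimal sketch, in Python, of the expected-value calculation described above. The probabilities and dollar figures are the illustrative numbers from the conversation, not real estimates.)

```python
def expected_value(p_success, value_if_success, value_if_failure=0.0):
    # Probability-weighted average over the possible outcomes.
    return p_success * value_if_success + (1 - p_success) * value_if_failure

# The oversimplified FTX example: a 1% chance of a $20 billion outcome, otherwise roughly zero.
print(expected_value(0.01, 20e9))  # 200000000.0, i.e. $200 million

# Sam's later example: a 99% chance of failure, a 1% chance of a $1 billion company.
print(expected_value(0.01, 1e9))   # 10000000.0, i.e. $10 million
```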
Jacob Goldstein: So in twenty nineteen, Sam started FTX. And Sam told me there is a really important lesson here about expected value.

Sam Bankman-Fried: One of the sort of takeaways that often ends up coming from really thinking hard and critically about expected values is that you should go for it way more than is generally understood.

Jacob Goldstein: Go big. You should really go really big, even if you probably will fail and wind up with zero.

Sam Bankman-Fried: That's absolutely right. And I think one of the intuitions for why that's the case, for why I think going big is often the right thing to do: well, if you think about it, you know, you've got obviously a number of options available to you. Somewhere on the far right-hand side of distributions is, like, the best possible thing.

Jacob Goldstein: Meaning the really good outcomes.

Sam Bankman-Fried: That's right, the best possible thing you could imagine happening. And the best possible thing is probably really good. You know, it's probably orders of magnitude bigger than whatever you're sort of expecting to do.

Jacob Goldstein: Right, it's not a little better, it's wildly better. It's almost unimaginably better.

Sam Bankman-Fried: That's right. And if you're thinking about, well, if I found a company, how is it going to go? You know, you're probably thinking this might be a million-dollar company, right? But the right-hand tail of that distribution is going to be a billion-dollar company, and that's a thousand times bigger. And so in order for it to be justified to choose that decision, if you really do care linearly about money, if you really do think that getting that marginal dollar is worth a lot, you know, even once you already have a lot of money, then it should lead you to think that, you know, the best outcomes might be outcomes that have a ninety-nine percent chance of failure.

Jacob Goldstein: Right, because a ninety-nine percent chance of failure and a one percent chance of that billion is still, that's ten million, and that's a lot.

Sam Bankman-Fried: And so any time there is some non-zero, non-negligible chance of a really, really good outcome, those are times when you're going to be incentivized, more than seems natural probably, to choose extreme outcomes.

Jacob Goldstein: I feel like that maps, in different ways, both to the work you do for money and to your altruism, which obviously are tied up, right? They both seem very much rooted in those extreme outcomes. In the case of your work, it's an extremely large amount of money in a short amount of time, and in the case of the altruism, it's profoundly bad outcomes, like everybody dying. Right? Like, both of those are sort of the same in the kind of expected value universe: there are things we should probably think more about than seems intuitive if we're not using expected value.

Sam Bankman-Fried: That's exactly right. And, you know, when you think about, as you said, the altruism perspective, right: how many people will live if we play our cards right as a world? Like, in the ninetieth percentile outcome, how many people will live in the future? The answer is trillions, probably, maybe hundreds of trillions. Right? That's thousands of times more than the number of people who have ever lived.
And so that's just, that's a huge factor, right? Anything that we do that actually has impact on the whole future of the world is massively important.

Jacob Goldstein: It's kind of a ridiculous way to think, at some level, right? It just gets so big. Like, you're just some guy with a lot of money, at some level, talking about trillions of people and the whole future of humanity. Like, it gets weird, right?

Sam Bankman-Fried: It does get really weird. You should really stress-test these and think, like, okay, do I really believe this? Do I really actually think that there's compelling evidence that, you know, these numbers that I'm looking at are as big as I'm claiming they are, or am I kind of bullshitting myself on this? Like, you know, you should absolutely have some humility around that. But it's not totally implausible. And there are examples of people, people who we've heard of, who were very famous, and people who no one has ever heard of, who have had massive, massive impact on the world, who have had that massive multiplier. And so it's not totally implausible. And so, you know, I think that while we should absolutely have, you know, a healthy dose of humility towards extreme outcomes, we should also acknowledge that they can be real, and that often the highest expected value things are in fact pushing directly towards them.

Jacob Goldstein: Well, and in fact you did hit the extreme right tail of the distribution in work, right? You did just get implausibly rich in a ridiculously short period of time. So at least on that one it worked.

Sam Bankman-Fried: That's right. And I think that's certainly been a big update for me in the direction of, like, this stuff is plausible.

Jacob Goldstein: Uh huh. So the fact that you got so rich so fast in crypto, does it make, does it push your altruism toward, like: well, shit,
if I could make twenty billion dollars in three years, everybody on Earth could in fact die from a pandemic or from some out-of-control AI, and I should spend some of the money to try and reduce the probabilities of that? I mean, is it like that?

Sam Bankman-Fried: Yeah, it absolutely does. I think it absolutely does make me think, you know what, these really extreme outcomes are probably plausible, and they're probably plausible enough that I should be taking them really seriously. And that has pretty profound implications, I think, for what we should be doing.

Jacob Goldstein: We'll get to those profound implications, and to exactly where Sam is giving his money away, in just a minute.

Jacob Goldstein: That's the end of the ads. Now we're going back to the show. Sam told me he's given away about two hundred million dollars so far, which obviously is a lot, but it is also somewhere around one percent of what he plans to give away eventually. His giving has been broad: anti-poverty, animal welfare, healthcare. But he has started to focus on a few areas. One of the biggest is pandemic preparedness. That is a category that fits right into that expected value framework. You know, a low-probability but super deadly pandemic is worth spending a lot to prevent. Another place where he's been giving is politics. Sam was one of the biggest donors to President Biden's twenty twenty campaign. More recently, he donated over ten million dollars to support a candidate in a Democratic primary for a congressional seat in Oregon, largely because that candidate wanted to focus on pandemic preparedness. The primary was held just last week, and Sam's candidate lost by a lot.

Sam Bankman-Fried: I think that there are a lot of takeaways from it, and yeah, I think that I might do it again. I would do it a bit differently than last time. But, you know, fundamentally, I think it was a well-fought race. I think that, you know, he had a real shot.
And, you know, going back to the discussion of expected values, right: if you're donating to local races such that you think your candidates are, like, ninety-nine percent to win, you're almost certainly doing something wrong.

Jacob Goldstein: Because that person doesn't need your money.

Sam Bankman-Fried: Exactly. You should be donating such that you think that you have a pretty substantial chance of losing. And, you know, I still stand by that.

Jacob Goldstein: Do you expect you'll give a lot of money in the twenty twenty-four election cycle?

Sam Bankman-Fried: I would guess so. I don't know for sure. It's going to depend on who's running, but, you know, I would guess so.

Jacob Goldstein: Well, let's say Donald Trump runs for president. Would that cause you to probably give a lot of money to the person who's running against him?

Sam Bankman-Fried: That's a pretty decent guess. And, you know, I think that I'm going to be looking a lot less at, like, political party from that perspective, and a lot more at, you know, sane governance. Like, that is, at its core, the thing that I think I care the most about: governance. I think the United States has both a big opportunity and a big responsibility to the world, to shepherd the West in a powerful but responsible manner, and everything that we do there has massive, massive ripple effects on what the future looks like.

Jacob Goldstein: You've talked before about being surprised at how little money is in politics. It is quite small, the amount of money that is donated relative to how much money the government spends, right? Does that lead you to want to donate a lot? Does it follow from that that you'll probably donate a lot of money?

Sam Bankman-Fried: It follows that I might, in the end. That's basically what I think.
I think that, like, there is in some ways surprisingly little money in politics, given sort of the scope of its impact. Now, that doesn't necessarily mean there are good things to do by donating in politics. Like, it might be that, sure, but how are you actually going to do anything useful, you know? Maybe that's how it turns out, but maybe not.

Jacob Goldstein: I mean, it's not necessarily the case that more money can have a meaningful effect on the outcome.

Sam Bankman-Fried: That's right, it's not necessarily the case, but it certainly gestures a little bit in that direction.

Jacob Goldstein: I mean, I imagine you have some probability distribution in your mind of how much money you might give in the next election cycle. Like, give me some number.

Sam Bankman-Fried: I would guess north of one hundred million. And, you know, as for how much north of that, I don't know. It really does depend on what happens. Like, it's really dependent on exactly who's running where, for what. These are super contingent things. But yeah, I think that gives maybe some sense of what the sort of scale might be here.

Jacob Goldstein: More than one hundred million, sort of spread across many races and organizations, but towards the twenty twenty-four election. So if that's a floor, what's the ceiling? Like a billion? Might you give a billion?

Sam Bankman-Fried: Yeah, I think that's a decent, like, thing to look at. I mean, I would hate to say a hard ceiling, who knows what's going to happen between now and then, but as at least sort of a soft ceiling, I would say, yeah.

Jacob Goldstein: Okay, so the ballpark is like one hundred million-ish to a billion-ish.

Sam Bankman-Fried: With, again, a lot of caveats on this. And, you know, there's a world where it could end up being close to zero, if things just work out such that there isn't much I'm excited about.

Jacob Goldstein: That seems like a very low probability to me, based on what I know about you and the world.

Sam Bankman-Fried: Yeah, I think it's very low that it's actually going to end up being zero. That does seem pretty unlikely.

Jacob Goldstein: Yeah, a billion seems way more likely than zero to me.

Sam Bankman-Fried: I think that's likely right.

Jacob Goldstein: So I think the most anybody gave last time, if I have the right numbers, is two hundred and fifteen million. That's for the twenty twenty cycle, the last presidential cycle. It seems like you'll probably give more than that, based only on what you've told me.

Sam Bankman-Fried: I think it is eminently possible. I think that would not surprise me.

Jacob Goldstein: I've heard you say that, you know, at some point over the next few years, you hope to find opportunities where you can spend, give away, like, a billion dollars really quickly. What are some of the places you think that might happen? I mean, the twenty twenty-four election is clearly one. What are a few others?

Sam Bankman-Fried: I think pandemic prevention is potentially one of them. I think you look at, like, how much would it cost to, you know, really definitely prevent the next pandemic, or, you can never definitely prevent it, but to have a really good shot at it. I think you're probably talking tens of billions of dollars.

Jacob Goldstein: Which is crazy, that governments aren't spending that money, right?

Sam Bankman-Fried: That's right, that really should be governments spending it. And part of this might be working with governments.

Jacob Goldstein: Because it's not that much. Like, truly, if they could reduce the risk of a pandemic by half, say, for thirty billion dollars a year, like, do you actually think that, is that true? Or fifty billion?

Sam Bankman-Fried: I think something like that's probably true. I think something that's not too far off from that order of magnitude. And probably, you know, by the time you're talking about many tens of billions of dollars,
you know, that's something that you're probably going to need to have governments stepping in on. But, you know, I would be happy to throw in a fair bit to help facilitate that.

Jacob Goldstein: So I should actually, I should have asked this earlier, but to what extent are your political donations a pandemic prevention strategy?

Sam Bankman-Fried: So I think most of them have been, so far. And, you know, going forward, there may become other policy things, like AI policy, that end up being really important. And so it's not to say that pandemics are the only things that are ever going to matter to me policy-wise, but that has been the big one so far.

Jacob Goldstein: And is the idea there, like: tens of billions of dollars a year to significantly reduce the risk of another pandemic is not that much for the government, but it's more than you have, right? So you need a lever. You can't actually spend all of your money and meaningfully reduce the chances of another pandemic. And so if you can use political donations to elect candidates who want to spend money to prevent a pandemic, that works.

Sam Bankman-Fried: That's right.

Jacob Goldstein: So you're giving lots of money to political candidates. You're also doing a lot of work to, um, shape regulation of crypto in the US. Tell me about the overlap between those two things.

Sam Bankman-Fried: So most of the giving has not been done with crypto in mind. And I have been doing a ton of policy engagement on that, but that's mostly going to DC and talking with policymakers.

Jacob Goldstein: I mean, here's the narrow version of the question: is part of what you want from your political donations some particular outcome in crypto regulation?

Sam Bankman-Fried: That is not a big part of it.

Jacob Goldstein: I wanted to talk more about crypto with Sam, but our time was running short. I do promise to talk more about crypto on the show before too long, and Sam did have time for a quick lightning round. We'll have that lightning round in a minute.

Jacob Goldstein: Okay, let's get back to the show.
Jacob Goldstein: We're gonna close with the lightning round. Let me just, let's just do a lightning round: a few quick questions, and you can answer them fast. What's the least rational thing you do?

Sam Bankman-Fried: Least rational thing? I know I spend way too much time, like, aimlessly browsing my Facebook feed.

Jacob Goldstein: Is it true you still sleep on a bean bag chair? And if so, why?

Sam Bankman-Fried: I did last night. I do, many nights. Um, I find it, I kind of, I don't know, it's what I'm used to, is honestly just part of the answer there. It's what feels natural for me.

Jacob Goldstein: If everything goes well, what problem will you be trying to solve in five years?

Sam Bankman-Fried: I would say the details of what to prioritize for pandemic prevention funding, with, you know, institutes that have been set up and are online, a ton of capital going into it, and really great teams who are devoting themselves to building it out.

Jacob Goldstein: So the dream is you'll be, like, deep in the weeds figuring out how to prevent a pandemic. I've seen in other interviews that you're doing lots of different things during the interview. I couldn't actually tell if you were doing other things during this interview. But were you, and if so, what were you doing?

Sam Bankman-Fried: I was playing a game of Storybook Brawl.

Jacob Goldstein: Say the name of the game again?

Sam Bankman-Fried: Storybook Brawl.

Jacob Goldstein: How did you do?

Sam Bankman-Fried: I took second place out of eight; could have been worse. And I apologize, I do have to hop off.

Jacob Goldstein: Okay, last one. What's one piece of advice you'd give to somebody trying to solve a hard problem?

Sam Bankman-Fried: One piece of advice? I would say: just keep going. Just keep going, step by step, you know; try and solve it bit by bit, and eventually, hopefully, you'll get there.

Jacob Goldstein: Sam Bankman-Fried is the founder and CEO of FTX. Today's show was produced by Edith Russlo, edited by Robert Smith, and engineered by Amanda K. Wong. You can reach us at problem at Pushkin dot fm, or you can find me on Twitter at Jacob Goldstein.
I'm Jacob Goldstein, and I'll be back next week with another episode of What's Your Problem.