Pushkin.

In May of this year, I interviewed Sam Bankman-Fried for this show. Sam was a very young, very rich entrepreneur and philanthropist who ran a crypto exchange called FTX. And then last week, in a matter of just a few days, FTX collapsed and filed for bankruptcy. And you know, even to say that, even to say it collapsed and filed for bankruptcy in a few days, somehow amazingly kind of understates how fast and dramatic the fall of the company and of Sam Bankman-Fried was. Until last week, Sam Bankman-Fried had this persona of being like a good guy. He wasn't shilling some ridiculous future where crypto was going to solve all our problems. He was running a real business, an exchange for crypto. And all along, he said, his plan was to make as much money as possible and then give it all away. And it seemed to be working. He was only thirty, and already he'd given away hundreds of millions of dollars. Now the money is gone, customers and investors may never get paid back, and federal prosecutors are reportedly investigating both Sam and FTX.

This week, as all this was happening, I listened back to my interview with Sam from May, and there are parts that I definitely find embarrassing in light of the new news, moments when I should have challenged him more, been more skeptical. And for that matter, my whole framing of the show seems absurd, over the top. I mean, I called the show "Sam Bankman-Fried Wants to Save the World." Spoiler alert: he did not. And yet, and yet, I did find the interview really interesting to listen to again in light of the news. Some parts of it still seemed true, but in a really different way, and at least one thing Sam said seems very, very prescient.

I'm Jacob Goldstein, and this is What's Your Problem? But today, let's be honest, it's more of a what's-my-problem situation. My problem is this: I interviewed a guy who was not at all what he seemed to be, and I didn't quite see it.
So what we're gonna do for today's show is we're gonna replay the original interview, and at a few points in the interview, I'm going to pop in now with a quick update based on everything that has happened since. Now, here is my conversation with Sam Bankman-Fried as it aired back in May of this year.

Before we get to the interview, I just want to take a minute here and set up this one big idea, this really useful intellectual framework that drives almost everything Sam does. It's called expected value.

I try to use it a lot because I think it sort of is the default correct way, in some senses, to calculate something. It's like, if you're just trying to do a generic calculation, I think it's usually the right thing to use.

You can understand expected value by understanding how Sam decided to start his company, the crypto exchange FTX. He was working as a trader making millions of dollars, and when he thought about starting FTX, he knew there was a really good chance it might fail. It might ultimately be worth zero. But on the other hand, if it succeeded, it could be worth tens of billions of dollars. So here is a slightly oversimplified version of how you would use expected value in this case. Say Sam thought there was only a one percent chance that his exchange would be really successful, but that if it were really successful, it would make him twenty billion dollars. The expected value of starting the exchange is the probability of it being successful, one percent, times the value if that success happens, twenty billion dollars, which comes out to two hundred million dollars. A lot. So in twenty nineteen, Sam started FTX. And Sam told me there is a really important lesson here about expected value.

One of the sort of takeaways that often ends up coming from really thinking hard and critically about expected values is that you should go for it way more than is generally understood.

Go big. You should really go really big, even if you probably will fail and wind up with zero.
That's absolutely right. And I think one of the intuitions for why that's the case, for why I think going big is often the right thing to do: well, if you think about it, like, you know, you've got obviously a number of options available to you. Somewhere on the far right-hand side of the distribution is, like, the best possible thing.

Meaning the really good outcomes.

That's right, the best possible thing you could imagine happening. And the best possible thing is probably really good. You know, it's probably orders of magnitude bigger than whatever you're sort of expecting to do.

Right, it's not a little better, it's wildly better. It's almost unimaginably better.

That's right. And if you're thinking about, well, if I found a company, how is it going to go? You know, you're probably thinking this might be a million-dollar company, right? But the right-hand distribution, the right-hand tail of that, is going to be a billion-dollar company, and that's a thousand times bigger. And so in order for it to be justified to choose that decision, if you really do care linearly about money, if you really do think that getting that marginal dollar is worth a lot, you know, even once you already have a lot of money, then it should lead you to think that, you know, the best outcomes might be outcomes that have a ninety-nine percent chance of failure, right? Because a ninety-nine percent chance of failure and a one percent chance of that billion is still, that's ten million, and that's a lot. And so any time that, like, there is some nonzero and non-negligible chance of a really, really good outcome, those are times when you're gonna be incentivized, more than seems natural probably, to choose extreme outcomes.

Okay, that was Sam back in May, and listening back now, I have to say this part kind of holds up, but in a really different light. It's basically saying you should make massive, long-shot bets with a high probability of failure if the potential payoff is big enough.
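To make that arithmetic concrete, here is a minimal Python sketch of the expected-value calculation being described, using only the hypothetical numbers from the conversation: a one percent chance of a twenty-billion-dollar exchange, and Sam's one-percent-chance-of-a-billion-dollar-company example. The function and figures are illustrative, not actual FTX numbers.

```python
# Expected value of a long-shot bet: probability of success times the payoff if it succeeds.
# All numbers below are the hypothetical ones used in the conversation, not real figures.

def expected_value(p_success: float, payoff: float) -> float:
    """Return the expected value of a bet that pays `payoff` with probability
    `p_success` and zero otherwise."""
    return p_success * payoff

# Starting the exchange: a 1% chance it turns out to be worth $20 billion.
print(expected_value(0.01, 20_000_000_000))  # 200000000.0 -> $200 million

# Sam's smaller example: a 1% chance of building a $1 billion company.
print(expected_value(0.01, 1_000_000_000))   # 10000000.0 -> $10 million
```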
And you know, that's clearly what Sam was doing for the first part of his career, when the price of crypto was going up, and he kept making these big bets and having huge successes. Now, we don't know all the details of what happened at Sam's crypto exchange FTX before it collapsed, but remember, one of the key things that prosecutors are apparently looking into is that FTX allegedly loaned customer funds to a trading firm also controlled by Sam. Making that loan, deciding to make the loan, that is a kind of bet, right? It's a bet that the trading firm will make profitable trades or investments with the borrowed money, a bet that the market will go up, the money will be returned, and nobody will get in trouble. It's possible that Sam decided that was a bet that had positive expected value, and he went ahead and did it.

Okay, now let's get back to the original interview. At this point, we're talking about how expected value shapes Sam's thinking about philanthropy, and what he says is, when he's thinking about giving away money, he doesn't just think about the world today. He thinks about, like, the whole future of humanity.

How many people will live, if we play our cards right as a world, like, in the ninetieth percentile outcome, how many people will live in the future? The answer is trillions, probably, maybe hundreds of trillions, right? It's thousands of times more than the number of people who have ever lived. And so that's just, that's a huge factor, right? Anything that we do that actually has impact on the whole future of the world is massively important.

It's kind of a ridiculous way to think at some level, right? It just gets so big then. Like, you're just some guy with a lot of money at some level, right, talking about, like, trillions of people and the whole future of humanity. Like, it gets weird.

Right, it does get really weird. You should really stress test this and think, like, okay, do I really believe this?
Like, do I really actually think that there's compelling evidence that, like, you know, these numbers that I'm looking at are as big as I'm claiming they are, or am I kind of bullshitting myself on this? Like, you know, you should absolutely have some humility around that, but, but it's not totally implausible. And there are examples of people, people who we've heard of, who are very famous, and people who no one has ever heard of, who have had massive, massive impact on the world, who have had that massive multiplier. And so it's not totally implausible. And so, you know, I think that, like, while we should absolutely have, you know, a dose of humility towards extreme outcomes, you know, we should also acknowledge that they can be real, and that often the highest expected value things are in fact pushing directly towards them.

Well, and in fact, you did hit the extreme right tail of the distribution in your work, right? You did just get implausibly rich in a ridiculously short period of time. So at least on that one, it worked.

That's right. And I think it's, that's certainly, it's been a big update for me in the direction of, like, this stuff is plausible.

Uh huh. So the fact that you got so rich so fast in crypto, does it make, does it push your altruism toward, like, well, shit, if I could make twenty billion dollars in three years, everybody on Earth could in fact die from a pandemic or from some out-of-control AI, and I should spend some of the money to try and reduce the probabilities of that? I mean, is it like that?

Yeah, it absolutely does. I think it absolutely does make me think, you know what, like, these, you know, really extreme outcomes are probably plausible, and they're probably plausible enough that I should be taking them really seriously, you know. And that has pretty profound implications, I think, for what we should be doing.
So listening back to that section now, I do feel like there was a moment in there when I was starting to push in the right direction. It was when Sam started talking about the whole future of humanity thing and helping trillions of people, and I was like, come on, trillions of people, the whole future of humanity? That seems like a bit much. But Sam had a reasonable answer there. He said, yes, we should have some humility in thinking about things this big. But you know, look, he said, there are people who have had really, really profound effects on humanity, and why shouldn't I, Sam Bankman-Fried, try to be one of those people? And that made sense to me. It did seem reasonable, and I think that is part of what was so appealing and convincing about Sam. I think that is part of why I and so many other people really believed him and wanted to believe him. He seemed guided by this sort of straightforward logic. He seemed like a very smart, very reasonable person.

In a minute, we'll have more from the May interview about Sam's role as one of the biggest donors to Democratic political candidates.

Now, let's go back to the show as it originally aired in May of this year. Sam told me he's given away about two hundred million dollars so far, which obviously is a lot, but it is also somewhere around one percent of what he plans to give away eventually. His giving has been broad: anti-poverty, animal welfare, healthcare. But he has started to focus on a few areas. One of the biggest is pandemic preparedness. That is a category that fits right into that expected value framework. You know, a low-probability but super deadly pandemic is worth spending a lot to prevent. Another place where he's been giving is politics. Sam was one of the biggest donors to President Biden's twenty twenty campaign.
More recently, he donated over ten million dollars to support a candidate in a Democratic primary for a congressional seat in Oregon, largely because that candidate wanted to focus on pandemic preparedness. The primary was held just last week, and Sam's candidate lost by a lot.

I think that there are a lot of takeaways from it, and you know, I think that I might do it again. I would do it a bit differently than last time. But you know, fundamentally, I think it was a well-fought race. I think that, you know, he had a real shot. And you know, going back to this discussion of expected values, right, like, if you're donating to political races that you think your candidates are ninety-nine percent to win, you're almost certainly doing something wrong, because that person doesn't need your money.

Exactly.

You should be donating such that you think that you have a pretty substantial chance of losing. And you know, I for sure stand by that.

Do you expect you'll give a lot of money in the twenty twenty four election cycle?

I would guess so. I don't know for sure, it's going to depend on who's running, but you know, I would guess so.

Well, let's say Donald Trump runs for president. Would that cause you to probably give a lot of money to the person who's running against him?

That's, that's a pretty decent guess. And you know, I think that I'm going to be looking a lot less at, like, political party, um, from that perspective, and a lot more about, you know, uh, sane governance. Like, that is, you know, the thing that I think I care the most about: governing, governance. I think the United States has both a big opportunity and a big responsibility to the world to shepherd the West in a powerful but responsible manner, and that everything that we do there has massive, massive ripple effects on what the future looks like.

You've talked before about being surprised at how little money is in politics.
It is quite small, the amount of money that is donated, relative to how much money the government spends.

Right. Does that lead you to want to donate a lot? But does it follow from that that, like, you'll probably donate a lot of money?

It follows that I might. In the end, that's basically what I think, um. I think that, like, there are, in some ways, there's, you know, in some ways surprisingly little money in politics, given sort of the scope of its impact. Now, that doesn't necessarily mean there are good things to do, um, donating in politics, right? Like, it might be that, sure, but like, how are you actually going to do anything useful? You know, maybe that's how it turns out, but, but maybe not, you know.

I mean, it's not necessarily the case that more money can have a meaningful effect on the outcome.

That's right. It's not necessarily the case, but it certainly gestures a little bit in that direction.

I mean, I imagine you have some probability distribution in your mind of how much money you might give in the next election cycle. Like, give me some number.

I would guess north of one hundred million, um. And you know, as for how much north of that, I don't know, you know. It really does depend on what happens. Like, it's really dependent on exactly who's running where for what. Like, these, these are a super contingent thing. But, but yeah, I think that gives maybe some sense of what the, what the sort of, like, scale might be here.

More than one hundred million, sort of spread across many organizations, but towards the twenty twenty four election. So if that's a floor, what's the ceiling? Like a billion? Might you give a billion?
272 00:16:19,676 --> 00:16:22,516 Speaker 1: I think that's a decent life thing to look at 273 00:16:22,676 --> 00:16:25,476 Speaker 1: as a as a sort of like I mean, I 274 00:16:25,836 --> 00:16:27,836 Speaker 1: would hate to say like hard ceiling, because whom knows 275 00:16:27,876 --> 00:16:29,796 Speaker 1: what was going to happen between now and then, But 276 00:16:29,956 --> 00:16:32,596 Speaker 1: as like at least sort of a soft ceiling, I 277 00:16:32,636 --> 00:16:36,436 Speaker 1: would say, yeah, okay, So so the ballpark is like 278 00:16:36,916 --> 00:16:39,996 Speaker 1: one hundred million ish to a billion ish, with again 279 00:16:40,036 --> 00:16:42,076 Speaker 1: a lot of caveats on this, and you know, there's 280 00:16:42,076 --> 00:16:44,396 Speaker 1: a world WHI should end up being close to zero 281 00:16:44,556 --> 00:16:47,036 Speaker 1: if they're you know, if things just work out such 282 00:16:47,076 --> 00:16:49,756 Speaker 1: that there isn't Is there much I'm excited about? Like 283 00:16:49,956 --> 00:16:52,676 Speaker 1: that seems like a very low probability to me, based 284 00:16:52,716 --> 00:16:55,156 Speaker 1: on what I know about you in the world. Yeah, 285 00:16:55,196 --> 00:16:57,476 Speaker 1: it's I think it's I think it's very low that 286 00:16:57,476 --> 00:16:59,476 Speaker 1: it's actually gonna end up being zero. That does seem 287 00:16:59,476 --> 00:17:03,316 Speaker 1: pretty unlikely. Yeah, a billion seems way more likely than 288 00:17:03,436 --> 00:17:08,036 Speaker 1: zero to me. I think it's really right. Um, So 289 00:17:08,076 --> 00:17:10,716 Speaker 1: I think the most anybody gave last time, if I 290 00:17:10,756 --> 00:17:13,516 Speaker 1: have the right numbers is two hundred and fifteen million. 291 00:17:14,356 --> 00:17:17,516 Speaker 1: That's for the twenty twenty cycle, the last presidential cycle. 292 00:17:18,756 --> 00:17:21,276 Speaker 1: It seems like you'll probably give more than that. Based 293 00:17:21,316 --> 00:17:23,636 Speaker 1: only on what you've told me, I think it is 294 00:17:23,676 --> 00:17:28,396 Speaker 1: eminently possible that I think that that would not surprise me. 295 00:17:31,116 --> 00:17:33,516 Speaker 1: Just to pop in here again in the present in November. 296 00:17:34,556 --> 00:17:38,796 Speaker 1: Back in May, when this interview was published, this part 297 00:17:38,836 --> 00:17:41,156 Speaker 1: of the interview actually made a little bit of news. 298 00:17:41,756 --> 00:17:45,076 Speaker 1: NBC News and Politico both reported based on the interview 299 00:17:45,156 --> 00:17:48,476 Speaker 1: that Sam might give a billion dollars in the twenty 300 00:17:48,516 --> 00:17:52,716 Speaker 1: twenty four campaign. So that happened, and then some time passed, 301 00:17:53,076 --> 00:17:56,156 Speaker 1: and then in another interview just last month in October, 302 00:17:56,556 --> 00:17:59,196 Speaker 1: Sam walked this part of the interview back. He said 303 00:17:59,196 --> 00:18:01,236 Speaker 1: it was a dumb quote to give on his part, 304 00:18:01,676 --> 00:18:03,556 Speaker 1: And at the time when when I saw this news 305 00:18:03,676 --> 00:18:05,756 Speaker 1: last month, I thought it was just, you know, that 306 00:18:05,836 --> 00:18:10,116 Speaker 1: his philanthropic priorities had changed. 
But now, in light of what we know now, it seems like maybe Sam knew by October that he just wasn't going to have a billion dollars to give in the next presidential cycle.

One other thing pops out at me from this part of the interview, and that is the moment when Sam says his giving for the twenty twenty four election might be close to zero, close to nothing. And I just dismissed that idea out of hand. But of course, given what we know now, the situation now, it might be the most clearly true thing he told me in the whole interview.

Okay, we're gonna go back to the interview now. We're going to talk more about philanthropy, and also a little, but in retrospect not enough, about the relationship between his political donations and government regulation of the crypto industry.

I've heard you say that, you know, at some points over the next few years, you hope to find opportunities where you can spend, give away, like, a billion dollars really quickly. What are some of the places you think that might happen? I mean, the election, the twenty twenty four election, is clearly one. What are a few others?

I think pandemic prevention is potentially one of them. I think you look at, like, how much would it cost to, you know, really definitely prevent the next pandemic, or you can never definitely prevent it, but to have, you know, a really good shot at it. I think you're probably talking tens of billions of dollars.

Which is crazy, that governments aren't spending that money, right?

That's right, that really should be governments spending it. And part of this might be working with governments, because it's not that much.

Like, if truly they could reduce the risk of a pandemic by half, say, for thirty billion dollars a year, like, do you actually think that, is that true? Or fifty billion?

I think, I think something like that's probably true.
I think something that's, like, not too far off from that order of magnitude. And that probably, you know, by the time you're talking about, you know, many tens of billions of dollars, um, you know, that's something that you're probably going to need to have government stepping in on. But you know, I would be happy to throw in a fair bit to help facilitate that.

So I should actually, I should have asked this earlier, but to what extent are your political donations a pandemic prevention strategy?

So I think most of them have been so far. And you know, going forward, like, there may become other policy, you know, things like AI policy, that end up being really important. And so it's not to say that, like, pandemics are the only things that are ever going to matter to me policy-wise, but that has been the big one so far.

And is the idea there, like, tens of billions of dollars a year to significantly reduce the risk of another pandemic is not that much for the government, but it's more than you have, right? So you need a lever. You can't, you can't actually spend all of your money and meaningfully reduce the chances of another pandemic. And so if you can use political donations to elect candidates who want to spend money to prevent a pandemic, that works.

That's right.

So you're giving lots of money to political candidates. You're also doing a lot of work to shape regulation of crypto in the US. Tell me about the overlap between those two things.

So most of the giving has not been done with crypto in mind. And I have been doing a ton of policy engagement on that, but that's mostly going to DC and talking with policymakers.

I mean, here's the narrow version of the question. Is part of what you want from your political donations some particular outcome in crypto regulation?

That is not a big part of it.
That is where we left the main part of the interview back in May. And when we zoom out now and look at what happened at FTX, you know, what Sam did is still really hazy. The details will probably emerge over time. But for now, one big question that is really interesting to me is, was the basic altruism story Sam was telling true? Was he really trying to earn as much as he could to give it all away and help humanity, or was that whole story a lie? I don't think we know the answer to that for sure yet, but it is possible that, in spite of everything, that is really what he was after. That was really his intent. And if that is true, if he did mean to do great things in the world, and if he did do some of the bad things that have been alleged in order to do those good things, well, that is a very complex, very human story about good intentions and bad decisions and, ultimately, maybe hubris.

We'll be back in a minute with the Lightning Round.

Here is the Lightning Round with Sam Bankman-Fried as it originally ran back in May. Several of the answers play pretty differently now in light of the collapse of FTX.

Let me just, let's just do a lightning round, a few quick questions, and you can answer them fast. What's the least rational thing you do?

Least rational thing? Uh, I spend way too much time, like, aimlessly browsing my Facebook feed.

Is it true you still sleep on a bean bag chair? And if so, why?

I did last night. I do, do many nights. Um, it's, I find it, I kind of, I don't know, it's what I'm used to, is honestly just part of the answer there. It's like, it's what feels natural for me.

If everything goes well, what problem will you be trying to solve in five years?
I would say the details of how to, of what to prioritize for pandemic prevention funding, with, you know, institutes that have been set up and are, you know, online, and getting a ton of capital into it, and, you know, really great teams who are devoting themselves to building it out.

So the dream is you'll be, like, deep in the weeds figuring out how to prevent a pandemic. I've seen in other interviews you're doing lots of different things during the interview. I couldn't actually tell if you were doing other things during this interview. But were you, and if so, what were you doing?

I was playing a game of Storybook Brawl.

Say the name of the game again.

Storybook Brawl.

How did you do?

I took second place out of eight. Could have been worse. And I apologize, I do have to hop off.

Okay, last one. What's one piece of advice you'd give to somebody trying to solve a hard problem?

One piece of advice? I would say, just keep going. Just keep going, step by step, you know, try and solve it bit by bit, and you know, eventually, hopefully, you'll get there.

That was my interview from May with Sam Bankman-Fried. At the time, he was the founder and CEO of the crypto exchange FTX. Last week, FTX filed for bankruptcy and Sam resigned from the firm.

Today's show was edited by Robert Smith, produced by Edith Ruslo, and engineered by Amanda Ka Wong. I'm Jacob Goldstein, and we'll be back next week with another episode of What's Your Problem.