On February nineteenth, two thousand twenty-two, there was a mass shooting in Portland, Oregon. A right-wing extremist opened fire at people doing traffic support for a weekly racial justice march. He killed one person and injured several others. One of the people who was injured, a young woman, is currently paralyzed from the neck down. Her friends and family are raising money, not for immediate survival stuff, but for quality-of-life things, things to, you know, allow her to enjoy herself out in the world with people, given the fact that she's dealt with something, and is dealing with something, permanently life-changing. I wanted to highlight that they're already above their initial goal, but obviously anything we can give will help her and her family have a higher quality of life, and I think they deserve it. So go to GoFundMe: Portland Mass Shooting Paralyzed Survivor Fund. That's GoFundMe, Portland Mass Shooting Paralyzed Survivor Fund. If you've got some extra cash... you know, the season being what it is, I get that everybody's got a lot of expenses coming up.
But if a few folks have some extra bucks, I know it'll be appreciated. It will help her have a warmer season this year, and she certainly deserves it. So again: GoFundMe, Portland Mass Shooting Paralyzed Survivor Fund. Thank you.

Oh, God is dead, and you, Jamie Loftus, have killed him.

I did it. I did it.

You did it. You did it. No, God was a him, and Jamie killed him, hammered at the back of the head.

And people are going to be critical of that, but, you know, you don't know my story. And in my six-part miniseries, in which I'm played by Amanda Seyfried, you're going to start to see my side of the story. And she is definitely not going to jail for what I did.

That's good. Unlike Theranos founder Elizabeth Holmes, who we just found out has been sentenced to eleven point two five years in prison.
I'm kind of like... yeah, I have mixed, I have mixed feelings. Because it's like... people don't go... First of all, the prison industrial complex in general doesn't make anyone better. To the extent that there's value, in the present day, in putting people in prison, it's for people who are, like, a severe ongoing danger, and I don't see this making anything better. At the same time, I hate her, so it's not going to be the top injustice I get irate about today.

I do kind of like that her, after being exposed as an unrepentant criminal, she's like, oh, I think I'm just going to kind of be like a normie girl for a while and raise my kid. And you're like, Liz, it's too late. It's too late for that, Liz. You defrauded people with a fake medical device that led folks to get treatment for things they didn't have and ignore illnesses they did have, which is bad. You'd think that, before too many bodies hit the floor, that would have stopped you, but you didn't really care.
You applied Steve Jobs logic to something that was not just a silly box to keep in your pocket. Lizzie... you know, Lizzie... uh, Lizzie, again. Also, putting her in prison for, you know, probably nine years, when you consider all the other things, is, like, not gonna help anything. It'll just mean that that kid she has grows up without a mom for nine years, and that's not going to make... It's like... it's a huge moment for "many things can be true at once," and having to hold all of those truths and, um, still record an episode of Behind the Bastards.

I can tell you about this. Actually, this actually will be relevant to the episode.

But yes, please. Please, really? Yes. Okay, this is definitely not going to be relevant. So let's... let's... okay. There was a legal case I was thinking about today. Um, it was the Beanie Babies billionaire. Um, when he was taken to court for holding money in a Swiss bank account. Um, it was like a tax evasion charge.
So he was up for as many as five years in prison for tax evasion, and he got off with... I mean, he's a billionaire, he's never going to suffer a consequence, right? But, like, he ended up getting two years of probation on the grounds that it had been too publicly humiliating, so he didn't have to go to jail because he was too embarrassed. It was so embarrassing that he didn't have to go to jail.

What? That's absolutely fucking weird.

Anyways, I'm gonna, I'm gonna see how far this goes by committing murder and then having my pants fall down. And, like: well, Judge, look, yes, I did stab that man seven times, but... my underpants! Yeah, I peed myself, so... yeah, can we just zero this one out?

Man, I love the Beanie Babies story so much. I'm surrounded by my beans, feeling safe.

Wow, that's good. You do literally have one on your shoulder right now.

Yeah, just... just like I have this rifle next to me. I think that we both have our comfort objects at the ready.
Um, so, Jamie, speaking of Elizabeth Holmes: the person we're talking about today is going to be the next story like that. By this time next year, there is probably going to be an HBO documentary about this guy.

God. Maybe Taylor Kitsch could probably play him, actually, if he wanted to.

Really?

Taylor Kitsch played hot David Koresh in the Waco show.

Walked into that. Yeah, they'd have to give him, like, a belly suit or something. That's not anti-fat, just being accurate. But he could do it, maybe.

That is... I don't want to see this man ever again.

No? I'm mad. You don't want to see Taylor Kitsch again? You don't want to see those cum gutters again? Unbelievable. Goddamnit.

Plenty in this town, Robert. There's a million cum gutters. I don't need those.

Oh, that... that is true. But anyway... that is true of Los Angeles and no other city. Today we are talking about a guy who absolutely never comes: Sam Bankman-Fried.

Why, do you know this guy? Do you know this guy?

I don't know this guy.

You don't know this guy? You don't know this guy. Have you...
Have you caught any news in the last week about how, like, a massive cryptocurrency exchange has collapsed?

Plummeted? This is that guy?

Oh, this is the guy with his, like, polyamorous sex ring, that was running a big crypto bank in the Bahamas, and it all fell apart, and now billions of dollars are gone.

You've lost me again.

Oh, great. Okay, well... all right. This is still breaking. Um, because this is a Thanksgiving week episode, and we only do one episode on Thanksgiving, I needed a single one, so I just want to give everyone background on this guy. We may come back to this story, because there's a lot we don't fully understand about how he did what he did, and the degree to which...
But the gist is that this guy, this guy ran a trading service called Alameda Research, and a crypto exchange. An exchange is basically like a bank, right? It's a cross between a bank and, like, a trading platform. It was called FTX, which was one of the largest cryptocurrency exchanges in the world, and was also considered by most people, people who were just kind of on the outside looking in, to be the most stable and, like, ethical and legitimate, right? When everything collapsed earlier this year, right? You remember that?

I wasn't engaged in that. It's just that I get a lot of my news from journalists on Twitter, and they've been busy this week.

Yes. So when, when crypto, like, fell apart, when a lot of crypto fell apart earlier this year, and, like, a bunch of places went under, FTX was one of the ones that stayed stable, and was actually buying up a bunch of, like, failing crypto companies to try to, like, prop up the industry. They just collapsed, and, like, the value of everything has been plummeting for the last several days. It's a big disaster.
It is very... like, yeah. Okay, in words that will annoy me as little as possible, can you explain why FTX remained solvent and the others did not? Why did it outlast them?

Because they lied. Um, so they were, they were operating... I'm trying to think of the shortest way to describe it, and there will be more details to come, but at present it seems fair to say that it was a giant Ponzi scheme, where they were taking in money, promising unreasonable returns, using other investor money to gamble on stuff to try to provide... anyway. And it was, like, all the other guys: they were, they were dishonest.

Yeah. It is, it is very likely. What, what differentiates this is the scale, because this is very likely a financial crime on the level of what Bernie Madoff did. We are talking in the ten to twenty billion dollars stolen range.

That's a lot of money.

This is a serious financial crime.

I'm coming in on this pretty cold.
And so the other, the other kind of mass touchstone of this is that it has led to a class action lawsuit against Larry David, Shaquille O'Neal, Tom Brady, and a number of celebrities who were all in a Super Bowl ad for FTX.

I remember that ad. That was so embarrassing for my man Larry.

The lawsuit is basically: what you advertised was a high-dollar Ponzi scheme, and you guys were using your name recognition to sell unregistered securities. Which they were.

Which they were. Which they definitely were. Sorry, I just want to circle back to Shaquille. Shaquille O'Neal will put his name on anything. To the point where, to the point where... I worked at a haunted hayride this year, which we don't talk about, because it was a bad idea.

But the rival...

What idea? I guess what? I'm alive, bitch. Bitch, I lived. I lived to tell the tale. It's very unclear who is right on the side of "should I work at a haunted hayride or not?" I still haven't really landed on an answer. Point being: our closest rival, haunted-hayride-wise, was Shaqtoberfest.
It was a Shaq-themed haunted attraction in which the only Shaq-related thing was a gigantic inflatable Frankenstein that looked like Shaq.

Which does sound awesome. That sounds actually like the best time anyone's ever had.

Shaq will put his name on anything, including crypto and Halloween. What a, what a king. Um... probably not. I'm sure there's horrible things about Shaq that have come out; that seems almost unavoidable. Anyway, Sam Bankman-Fried is the guy behind this gigantic financial crime that is still unraveling as we do this episode. And I want to talk less about what happened on the exchange, because none of us want to talk about how somebody carries out the nuts and bolts of a cryptocurrency scam. I want to talk about, I want to talk about the social elements of the scam. I want to talk about how he conned the media, how he conned celebrities, and how he conned regulators. And I just want to talk about also the way some of these people talked and wrote about him, because there's a lot about... I don't know.
A week or so ago, we did an episode on the daily show we do, It Could Happen Here, about effective altruism, which is, in brief, the theory that, instead of trying to help people just because they need help, you should only help people after you consider the way to help people that is, like, the absolute most beneficial way for, like, the least amount of, you know, effort.

It's utilitarianism, right? How can I do the greatest good with the least resources?

And yeah. And it's, it's the way a lot of these, like... and it's merged with this kind of thinking toward what billionaire types call longtermism. And the gist of this is, like: it's not worthwhile for me to do stuff like pay taxes to have a society, or guarantee, like, universal health care.
Instead, I should make... instead, the most ethical thing that I should do is make as much money as I personally can, and then put that money into things that I believe will save the world, like research to stop AIs from killing everyone, and getting to Mars and shit. It's a way for billionaires... it's a way for billionaires and the other mega rich to justify, like, continuing to do exactly what they want and feel like they're saving the world.

Anyway, you know what the Beanie Babies billionaire did to improve the world? Made a lot of Beanie Babies, and then he bought the Four Seasons hotel and kept making Beanie Babies. He didn't do shit.

Well, you know, I'm fine with that compared to these guys, because they're... where they're pretending... Anyway, Sam Bankman-Fried is one of these guys. We're gonna get into that.
But we did this episode on It Could Happen Here where he was kind of a tangential character in this very unsettling and insidious movement that is behind guys like Elon Musk, who are claiming to be saving the world while just fucking over people. And then, like, four days after it came out, his entire life unraveled and his fortune disappeared overnight, because he was a giant con artist. What a treat.

It's very funny. So that's why we're talking about him right now. He's, like, thirty years old and looks like Mark Zuckerberg and David Dobrik's love child.

He's thirty years old, was Mark Zuckerberg and David Dobrik's love child... Look, I shouldn't call anyone a schlub, but he looks like a schlub.

I forget what David Dobrik looked like, because my brain protects myself. Respectfully: two villains. Two villains' love child.

Yeah. Um, anyway. So, uh, yeah. Sam Bankman-Fried was born in nineteen ninety-two on the campus of Stanford University, uh, continuing a long and proud tradition of absolutely nothing good ever coming from that hellhole. His parents are both extremely prominent Stanford professors.
His mother, Barbara, is a lawyer who clerked for the Second Circuit Court and graduated from Harvard. She founded Mind the Gap, a somewhat shady and mysterious Democratic fundraising group. I think it's shady in that people don't exactly know where all the money comes from, or, like, what their goals are.

Yeah. Yeah.

She also penned an essay in two thousand thirteen that the right wing is going nuts about, because she was basically arguing that, like, good and evil are less a factor in what people do than environmental factors, and, and all that stuff. Like, when people do things that are bad, it's more a product of their... And it was kind of, um... oh god, what's that fucking psychologist? It was like a Skinner-type argument, where it's like, well, if people have bad inputs in their youth, then that's going to determine... anyway. Um, I think that's funny, given what happens.

Um, I'm gonna guess she sucks at what she's... she, she sucks. Uh, and so does his dad, Joseph Bankman. Joseph is also a lawyer. He is a graduate of Yale.
His big claim to fame was developing a proposal for an overhaul of the California tax return system that would have filled out citizens' tax returns in advance. And, I don't know, I just said he sucked, but actually that sounds like a good thing. Um, I think that that's actually a cool thing to advocate for. The measure failed by one vote after heavy lobbying from Intuit, a tax prep software company.

Um, yeah. It kind of is... it's, it's total bullshit, because it's the thing everybody agrees with on paper, but nobody will actually fight the tax prep companies. Which is, like: hey, the IRS, like, knows more or less what I make, and, like, knows more or less what I owe. Why don't I just get a thing from them? Why do I have to go through this? Like... anyway, there's no need.

But it's like... other countries do it that way. We don't, though, because there has to be a convoluted system that's expensive, and where they can charge you if you make the tiniest mistake because you can't read size-one font.
And more to the point: I don't actually think the IRS is advocating to keep it a pain in the ass. I think it's these tax prep companies, because they have an entire industry based on charging people to do the thing that they have to do to avoid going to fucking prison. Um, anyway. I said he's an asshole, and I'm sure he is, but he was right about this, and I don't know what to say about that. Like all of us, Joseph is also a podcaster. He is the co-host of the Stanford Legal podcast, too.

And he's a nerd.

If it wasn't the holiday season, and I wasn't, like, getting ready for friends and family and all that good stuff, I would have listened to his podcast and we would probably be making fun of him. But you can do that on your own.

Goodness... doesn't it feel so horrible when you think of how many people do what we do, but they're the worst person you've ever heard of? It's so sad. It's embarrassing.
Yeah, it's, it's like... I don't know. I avoid self-identifying as a podcaster as it is. It's still not a... But then, on top of that, they're like, oh, like, what... I mean, you know, the only, you know, the only thing that I can compare it to is, like, when I started making a living as a writer fifteen years ago, and I would say that at, like, a... at, like, a party or something. Someone would ask, well, what do you do? And I'm like, well, I'm a writer. And, like, four other people would say, yeah, me too. And then you wind up listening to everybody's pitches for the novels that they're never going to finish. Um. So eventually I just started lying and saying that I still worked at...

Well, I... like, I mean, it's the same thing with, like, if you say you're a comedian at a party. You're...

God, no. Never, never identify as a comedian.

And: oh, me too! Do you? And I've done one whole open mic, and it was very offensive, and hey, it's a comedian's job to push boundaries, and oh, tell me your job. You know how Lenny Bruce...
Read that list of curse words? Will 329 00:18:23,760 --> 00:18:26,880 Speaker 1: I just do that with slurs here? Let me show you. Yeah, 330 00:18:26,880 --> 00:18:30,720 Speaker 1: you're like, yeah... Like, Lenny Bruce was not funny in 331 00:18:30,840 --> 00:18:35,160 Speaker 1: that period of his career, even a little. Anyways. Anyways, 332 00:18:35,160 --> 00:18:37,880 Speaker 1: our jobs, I mean, are embarrassing is what I'm saying. Anyway, 333 00:18:38,480 --> 00:18:41,639 Speaker 1: our jobs are indeed embarrassing. So, as you might guess 334 00:18:41,840 --> 00:18:45,400 Speaker 1: from all of that, Sam was born into what amounts 335 00:18:45,440 --> 00:18:50,159 Speaker 1: to America's, like, liberal aristocracy. He is a fucking coastal elite, right. 336 00:18:50,359 --> 00:18:54,080 Speaker 1: This kid grows up on the Stanford campus to Stanford professors. 337 00:18:54,320 --> 00:18:57,480 Speaker 1: One of his aunts teaches at Columbia University, is like 338 00:18:57,520 --> 00:19:01,320 Speaker 1: a professor there. So it seems like he has close family 339 00:19:01,320 --> 00:19:04,840 Speaker 1: connections to employees at Yale and at Harvard, as 340 00:19:04,840 --> 00:19:07,439 Speaker 1: well as Stanford, like his parents both went to Harvard, 341 00:19:07,480 --> 00:19:10,600 Speaker 1: I think it is, Kennedy. He is a... Yeah, he's, 342 00:19:10,680 --> 00:19:13,560 Speaker 1: he's that. He is... you do not get much 343 00:19:13,600 --> 00:19:18,640 Speaker 1: more of a rarefied, like, intellectual air. He's wearing linens around. 344 00:19:18,800 --> 00:19:21,119 Speaker 1: This is how I think about this. This is a 345 00:19:21,240 --> 00:19:23,800 Speaker 1: child who, at age eight, has strong opinions on 346 00:19:23,880 --> 00:19:29,280 Speaker 1: Immanuel Kant. Um, we'll cheer for him at the table? No, no, 347 00:19:29,680 --> 00:19:32,119 Speaker 1: we're about to get into that, Jamie Loftus.
I just 348 00:19:32,160 --> 00:19:35,040 Speaker 1: had, I just had a vision of a child sitting 349 00:19:35,160 --> 00:19:41,160 Speaker 1: at like a holiday dinner and saying "derivative," and then, amazing, wow, 350 00:19:41,359 --> 00:19:44,040 Speaker 1: he is really coming along, isn't he. Yeah, this is 351 00:19:44,080 --> 00:19:46,159 Speaker 1: a little kid that, when he, like, sits down at 352 00:19:46,160 --> 00:19:48,600 Speaker 1: the doctor's office, pulls out a fucking, I don't know, 353 00:19:48,680 --> 00:19:51,800 Speaker 1: Derrida or something book, just, just, just so you know, 354 00:19:51,920 --> 00:19:55,359 Speaker 1: just so you know, he knows fancy philosophers. God damn 355 00:19:55,359 --> 00:19:59,879 Speaker 1: it. A board book. So his parents raised him 356 00:19:59,880 --> 00:20:03,040 Speaker 1: and his brother to be utilitarians. Uh, one of the 357 00:20:03,119 --> 00:20:07,720 Speaker 1: articles about them... I was raised into SpongeBob. Okay, yeah, yeah. I was, 358 00:20:08,119 --> 00:20:13,399 Speaker 1: I was raised to hassle cows in our back forty. Um. 359 00:20:13,480 --> 00:20:16,720 Speaker 1: His parents' nights... so this article notes that nights around 360 00:20:16,760 --> 00:20:20,280 Speaker 1: the family dinner table often focused around debates about how 361 00:20:20,359 --> 00:20:23,480 Speaker 1: to do the greatest good for the greatest number of people. Um. 362 00:20:23,520 --> 00:20:25,840 Speaker 1: In later interviews (with, I'm gonna get to this guy 363 00:20:25,880 --> 00:20:29,040 Speaker 1: in a second, the absolute dick-ridingest journalist to 364 00:20:29,080 --> 00:20:32,639 Speaker 1: ever ride dick), Sam would claim that his most formative 365 00:20:32,680 --> 00:20:36,200 Speaker 1: moment came at age twelve, when he was weighing arguments 366 00:20:36,240 --> 00:20:46,960 Speaker 1: around the abortion debate.
So, first off, because, because not 367 00:20:47,040 --> 00:20:51,920 Speaker 1: only... no, but also, like, in the context of, like, where 368 00:20:51,920 --> 00:20:54,840 Speaker 1: would he have been doing this? I'm guessing at, like, 369 00:20:54,880 --> 00:20:57,160 Speaker 1: around the family table, or when they have the... all, 370 00:20:57,200 --> 00:20:59,480 Speaker 1: you know, everybody's got their brandy and he's drinking 371 00:20:59,600 --> 00:21:02,960 Speaker 1: some sort of fucking tea that's insufferable, and they're all 372 00:21:03,000 --> 00:21:06,760 Speaker 1: talking about people's rights as if it's like a fun 373 00:21:06,840 --> 00:21:09,800 Speaker 1: intellectual problem. Like, how to fix it? For them it 374 00:21:09,960 --> 00:21:12,280 Speaker 1: is, because whenever, wherever they land, the rules aren't going 375 00:21:12,320 --> 00:21:15,040 Speaker 1: to apply to them anyways. Yes, yes. Where does he, 376 00:21:15,200 --> 00:21:18,240 Speaker 1: where does he fall? Where does... That's a great question. 377 00:21:18,280 --> 00:21:20,639 Speaker 1: So I'm going to quote now from an article previously 378 00:21:20,720 --> 00:21:24,600 Speaker 1: published by Sequoia Capital and written by Adam Fisher, who 379 00:21:24,600 --> 00:21:27,119 Speaker 1: should never be allowed to write. They've since taken this article down. When 380 00:21:27,160 --> 00:21:31,000 Speaker 1: I say he's the dick-ridingest, like, fucking PR-flack journalist, 381 00:21:31,080 --> 00:21:37,120 Speaker 1: it's, it's, it's shameful. Quote: A rights-based theorist 382 00:21:37,200 --> 00:21:40,280 Speaker 1: might argue that there aren't really any discontinuous differences as 383 00:21:40,280 --> 00:21:43,040 Speaker 1: a fetus becomes a child, and thus fetus murder is 384 00:21:43,119 --> 00:21:47,400 Speaker 1: essentially child murder. The utilitarian argument compares the consequences of each.
385 00:21:47,560 --> 00:21:49,600 Speaker 1: The loss of an actual child's life, a life in 386 00:21:49,600 --> 00:21:51,960 Speaker 1: which a great deal of parental and societal resources have 387 00:21:52,000 --> 00:21:54,639 Speaker 1: been invested, is much more consequential than the loss of 388 00:21:54,640 --> 00:21:57,760 Speaker 1: a potential life in utero, and thus to a utilitarian, 389 00:21:57,920 --> 00:22:01,600 Speaker 1: abortion looks more like birth control than murder. SBF, that's 390 00:22:01,640 --> 00:22:05,240 Speaker 1: what they always call him, the kid Sam. SBF's application 391 00:22:05,280 --> 00:22:08,359 Speaker 1: of utilitarianism helped him resolve some nagging doubts he had 392 00:22:08,400 --> 00:22:11,240 Speaker 1: about the ethics of abortion. It made him feel comfortable 393 00:22:11,280 --> 00:22:13,960 Speaker 1: being pro choice, as his friends, family, and peers were. 394 00:22:14,119 --> 00:22:17,679 Speaker 1: He saw the essential rightness of his philosophical faith. So 395 00:22:17,760 --> 00:22:20,800 Speaker 1: that's very fucked up. That is, that is so deep. 396 00:22:20,920 --> 00:22:23,800 Speaker 1: Like, the term choice is used at the very end there, 397 00:22:23,800 --> 00:22:26,080 Speaker 1: but it's clear that, like, he's not thinking about this 398 00:22:26,160 --> 00:22:29,480 Speaker 1: in terms of, like, the actual value of human bodily autonomy. 399 00:22:29,680 --> 00:22:33,800 Speaker 1: That does not weigh in the utilitarian calculus for him whatsoever.
No, 400 00:22:34,440 --> 00:22:39,359 Speaker 1: that is, some would argue, my friend Robert. No, no, no, 401 00:22:40,680 --> 00:22:45,920 Speaker 1: and again, even... like, look, I shit reflexively sometimes on utilitarianism, 402 00:22:45,960 --> 00:22:49,399 Speaker 1: not because of the inherent value or disvalue of thinking 403 00:22:49,400 --> 00:22:51,080 Speaker 1: that way, but about the way it gets talked about 404 00:22:51,119 --> 00:22:53,760 Speaker 1: by these people. But like, if you're actually a utilitarian 405 00:22:54,840 --> 00:22:56,960 Speaker 1: and you care about the greatest good for the greatest 406 00:22:57,040 --> 00:23:00,560 Speaker 1: number of people, then bodily autonomy should factor into it, right? 407 00:23:00,680 --> 00:23:04,400 Speaker 1: Like, human bodily autonomy should be hugely important 408 00:23:04,400 --> 00:23:08,240 Speaker 1: to you. But no, that's not logical. All that matters 409 00:23:08,320 --> 00:23:11,560 Speaker 1: is, like, well, how many... if, if less resources 410 00:23:11,560 --> 00:23:13,520 Speaker 1: than this have been invested in the fetus, then it's 411 00:23:13,560 --> 00:23:17,080 Speaker 1: not a person, so abortion makes... That's fucking bullshit logic. 412 00:23:17,280 --> 00:23:21,040 Speaker 1: Fuck you, you're doing too much math. Stop. This isn't... 413 00:23:21,440 --> 00:23:24,040 Speaker 1: This isn't a math problem, Sam. Like, this is not 414 00:23:24,119 --> 00:23:27,600 Speaker 1: a fucking math problem. Not everything is a goddamn math problem. Robert, 415 00:23:27,640 --> 00:23:29,879 Speaker 1: he won't listen to you unless you call him SBF, and 416 00:23:29,920 --> 00:23:35,040 Speaker 1: I was like, sunscreen? Yeah, we'll get to that. They all 417 00:23:35,080 --> 00:23:37,960 Speaker 1: call him fucking SBF, and I hate it.
But also, 418 00:23:38,040 --> 00:23:40,000 Speaker 1: as this went on, I started using it more and 419 00:23:40,040 --> 00:23:41,920 Speaker 1: more, because it's a pain in the ass to type his whole 420 00:23:41,920 --> 00:23:45,800 Speaker 1: fucking last name out. Um. I... look, this is 421 00:23:45,840 --> 00:23:47,399 Speaker 1: one where I'm not going to give him a pass, 422 00:23:47,880 --> 00:23:49,840 Speaker 1: but I get it. If you write about this sucker 423 00:23:49,880 --> 00:23:54,320 Speaker 1: a lot, it does make it easier. So anyway, all 424 00:23:54,359 --> 00:23:56,640 Speaker 1: of this is very bad. Um, but you know what's 425 00:23:56,680 --> 00:24:00,760 Speaker 1: not bad, Jamie? The products and services that support 426 00:24:00,760 --> 00:24:05,320 Speaker 1: this podcast. That's not true. They are... that's... well, I've checked. 427 00:24:06,480 --> 00:24:08,359 Speaker 1: Hold on, I just ran a quick check on that, 428 00:24:08,440 --> 00:24:12,359 Speaker 1: and you can. But, but what about the greatest good 429 00:24:12,440 --> 00:24:15,159 Speaker 1: for the greatest number of people? And, and given that 430 00:24:15,320 --> 00:24:18,439 Speaker 1: I'm, I'm a people, so it works out pretty well, 431 00:24:19,000 --> 00:24:20,840 Speaker 1: it works out very well for me. I think that 432 00:24:20,920 --> 00:24:24,639 Speaker 1: if you actually have more advertising revenue, you will actually 433 00:24:24,640 --> 00:24:27,240 Speaker 1: build a really fast train like you've been promising me 434 00:24:27,280 --> 00:24:30,480 Speaker 1: you would. Yeah, I'm gonna, I'm gonna build the hyperloop. 435 00:24:30,680 --> 00:24:35,000 Speaker 1: Um, yeah, yeah. And I'm gonna promise you one thing.
436 00:24:35,400 --> 00:24:36,800 Speaker 1: It's going to kill a hell of a lot more 437 00:24:36,800 --> 00:24:40,600 Speaker 1: people than that Simpsons monorail did. And, and, 438 00:24:40,640 --> 00:24:42,720 Speaker 1: and look, not everyone is going to 439 00:24:42,800 --> 00:24:45,840 Speaker 1: hold you to task for that, but I am. Thank you, Jamie, 440 00:24:45,880 --> 00:24:50,240 Speaker 1: thank you for keeping me honest and ensuring that we, 441 00:24:50,240 --> 00:24:55,400 Speaker 1: we really make a memorable disaster. Look, I'm available anytime 442 00:24:55,440 --> 00:25:05,520 Speaker 1: I'm not visiting my friend Liz in jail. Um, uh, 443 00:25:05,560 --> 00:25:10,320 Speaker 1: we are back. What a good time. So Sam Bankman-Fried 444 00:25:10,400 --> 00:25:12,800 Speaker 1: is above all else a numbers guy, and I 445 00:25:12,800 --> 00:25:15,159 Speaker 1: guess as a kid he was a numbers kid. His 446 00:25:15,240 --> 00:25:18,480 Speaker 1: parents sent him to Crystal Springs Uplands, a fancy prep 447 00:25:18,520 --> 00:25:21,920 Speaker 1: school in Hillsborough, California. I looked through the website because 448 00:25:21,960 --> 00:25:23,480 Speaker 1: I wanted to make fun of it, but it just 449 00:25:23,560 --> 00:25:25,600 Speaker 1: kind of seems like a really fancy school. I don't know. 450 00:25:25,640 --> 00:25:27,880 Speaker 1: I'm sure it's a great place to get an education. 451 00:25:28,160 --> 00:25:30,119 Speaker 1: I will tell you this: 452 00:25:30,240 --> 00:25:32,560 Speaker 1: they devote a lot of screen resources to letting you 453 00:25:32,560 --> 00:25:34,280 Speaker 1: know that they are not racist and that most of 454 00:25:34,280 --> 00:25:38,200 Speaker 1: their students aren't white. Um.
455 00:25:38,560 --> 00:25:41,200 Speaker 1: They also have a French cinema class for sixth graders, 456 00:25:41,400 --> 00:25:44,440 Speaker 1: which is fine. But the cranky asshole in me that 457 00:25:44,560 --> 00:25:46,639 Speaker 1: still has a little piece of my soul raised by 458 00:25:46,720 --> 00:25:49,480 Speaker 1: right wing radio wants to say shit about it. I 459 00:25:49,680 --> 00:25:52,600 Speaker 1: was raised by left wing people, and I still 460 00:25:52,600 --> 00:25:56,400 Speaker 1: think that that's some loser shit, and I think that 461 00:25:56,400 --> 00:26:01,720 Speaker 1: that's fucking dorky and goofy, and, like... it's like, 462 00:26:02,240 --> 00:26:05,520 Speaker 1: you know, you meet... because you meet people like 463 00:26:05,560 --> 00:26:09,240 Speaker 1: that in the wild, and they're sometimes, and maybe 464 00:26:09,280 --> 00:26:12,280 Speaker 1: even often, very sweet people. But I'm like trying to 465 00:26:12,320 --> 00:26:14,760 Speaker 1: be like, oh, you know who Plankton is? And they're 466 00:26:14,800 --> 00:26:18,000 Speaker 1: like, no. But they've been watching French movies 467 00:26:18,040 --> 00:26:21,240 Speaker 1: since they were like seven, and I just don't expect 468 00:26:21,320 --> 00:26:24,199 Speaker 1: that. If I meet a 469 00:26:24,240 --> 00:26:29,040 Speaker 1: sixth grader with strong opinions about French cinema, like, I'm 470 00:26:29,080 --> 00:26:32,680 Speaker 1: just gonna leave. I'm gonna leave. I'm just gonna walk away. Robert, 471 00:26:32,800 --> 00:26:34,480 Speaker 1: that's really brave of you, to march out of a 472 00:26:34,480 --> 00:26:36,879 Speaker 1: conversation with an eleven year old. Like, I am not, 473 00:26:37,000 --> 00:26:40,080 Speaker 1: I'm not putting up with that shit. Absolutely not. I'm 474 00:26:40,280 --> 00:26:43,520 Speaker 1: leaving this sixth grade class.
My name is Robert Evans, 475 00:26:43,520 --> 00:26:48,840 Speaker 1: and I'm out of here. Goodbye. Go watch your Renoir. 476 00:26:49,040 --> 00:26:50,720 Speaker 1: Is that one of them? That sounds like one of them. 477 00:26:50,720 --> 00:26:54,240 Speaker 1: Claude Renoir? Is he a painter? Or... Renoir is a painter. 478 00:26:54,359 --> 00:26:55,800 Speaker 1: I know the one you're looking for, and I 479 00:26:55,840 --> 00:26:57,879 Speaker 1: was looking for it too, but I don't remember. But 480 00:26:58,000 --> 00:27:00,480 Speaker 1: guess... do you know who Plankton is? Of course I 481 00:27:00,520 --> 00:27:02,320 Speaker 1: know who Plankton is. And I also know that at 482 00:27:02,400 --> 00:27:05,000 Speaker 1: least one of the directors they study is a pedophile. 483 00:27:05,119 --> 00:27:09,040 Speaker 1: Just knowing a little bit about French cinema, that's, that's unavoidable. 484 00:27:11,600 --> 00:27:14,800 Speaker 1: So he does well. Again, the school is probably fine. 485 00:27:14,920 --> 00:27:18,879 Speaker 1: He does well at the school. Um, he was notably insular. 486 00:27:19,640 --> 00:27:23,160 Speaker 1: He avoided most of his classmates to play StarCraft, which 487 00:27:23,240 --> 00:27:27,119 Speaker 1: is good, and League of Legends, which objectively sucks. He 488 00:27:27,160 --> 00:27:29,320 Speaker 1: also played a lot of Magic: The Gathering. So I 489 00:27:29,359 --> 00:27:32,720 Speaker 1: am confident he did not get laid in high school. Um, 490 00:27:33,200 --> 00:27:38,919 Speaker 1: this is based on extensive personal experience. Like, I just, 491 00:27:39,000 --> 00:27:48,720 Speaker 1: I actually did some field research, about four years of it. Okay. Interesting. 492 00:27:49,920 --> 00:27:53,040 Speaker 1: For college, he was accepted to and attended MIT, 493 00:27:53,200 --> 00:27:57,720 Speaker 1: which marked off his family's elite North American university punch card.
Um, 494 00:27:57,760 --> 00:27:59,560 Speaker 1: they really hit them all. Now, now that they've got 495 00:27:59,560 --> 00:28:01,399 Speaker 1: an MIT kid in the family, they get a 496 00:28:01,400 --> 00:28:04,040 Speaker 1: free coffee. There used to be an MIT... 497 00:28:04,320 --> 00:28:06,239 Speaker 1: so when I was doing comedy in Boston, there was 498 00:28:06,280 --> 00:28:09,640 Speaker 1: like... MIT had like a secret 499 00:28:09,720 --> 00:28:13,120 Speaker 1: comedy club that was just for MIT students, 500 00:28:13,200 --> 00:28:18,639 Speaker 1: and it was awful. That's... oh god. Yeah, they paid you, okay, 501 00:28:18,680 --> 00:28:20,879 Speaker 1: but it was like, they're like, if you like knew 502 00:28:20,920 --> 00:28:24,240 Speaker 1: someone who like met someone who went to MIT 503 00:28:24,320 --> 00:28:26,159 Speaker 1: and they came to your shows, and they're like, 504 00:28:26,160 --> 00:28:29,119 Speaker 1: oh, we, we've, we did the math and you are 505 00:28:29,160 --> 00:28:31,280 Speaker 1: allowed to come to our... They called it their speakeasy. 506 00:28:31,400 --> 00:28:34,040 Speaker 1: I wonder if it's still around. It was so... 507 00:28:35,320 --> 00:28:38,840 Speaker 1: it was... I mean, it's not fun. Yeah, not a 508 00:28:38,880 --> 00:28:43,400 Speaker 1: fun crowd, I will say. But best of luck 509 00:28:43,680 --> 00:28:46,720 Speaker 1: to whatever was going on there. So he goes to 510 00:28:46,840 --> 00:28:48,720 Speaker 1: MIT. He goes to... he might have been at one 511 00:28:48,720 --> 00:28:51,120 Speaker 1: of those shows. He might have, because he joins 512 00:28:51,120 --> 00:28:54,840 Speaker 1: a fraternity there. Uh, and, and MIT... Well, Jamie, 513 00:28:55,080 --> 00:28:58,959 Speaker 1: it's an MIT-specific co-ed nerd fraternity. 514 00:28:59,200 --> 00:29:05,080 Speaker 1: It's Epsilon Theta. Um.
And here's how Adam Fisher, 515 00:29:05,200 --> 00:29:08,080 Speaker 1: the guy I hate, described them in his article, which 516 00:29:08,160 --> 00:29:11,600 Speaker 1: was bad. Quote: A co-ed fraternity of super geeks 517 00:29:11,720 --> 00:29:15,560 Speaker 1: similarly interested in Magic and video games. Thetans are fond 518 00:29:15,560 --> 00:29:19,160 Speaker 1: of debating math, physics, computer science, linguistics, philosophy, and logic 519 00:29:19,200 --> 00:29:23,000 Speaker 1: problems for fun at alcohol free parties. Now, I do 520 00:29:23,120 --> 00:29:24,600 Speaker 1: know a little bit about MIT, and I 521 00:29:24,640 --> 00:29:27,080 Speaker 1: know another thing these nerds often do is kill themselves 522 00:29:27,120 --> 00:29:29,800 Speaker 1: using nitrous oxide, because they will try to flood entire 523 00:29:29,920 --> 00:29:34,400 Speaker 1: rooms with nitrous, to do like a nitrous rager, and 524 00:29:34,520 --> 00:29:37,160 Speaker 1: kill themselves. It's a thing that happens. Look it up: 525 00:29:37,200 --> 00:29:40,880 Speaker 1: MIT nitrous deaths. Yeah. All I did 526 00:29:40,920 --> 00:29:47,240 Speaker 1: in college was drink too many Blue Moons. It's, uh, 527 00:29:47,280 --> 00:29:51,160 Speaker 1: it's, it's quite a thing. Um. So I don't know 528 00:29:51,320 --> 00:29:53,080 Speaker 1: what they were doing over there. I don't know, I 529 00:29:53,120 --> 00:29:54,840 Speaker 1: don't know. I don't know how much I believe that 530 00:29:54,840 --> 00:29:58,680 Speaker 1: they were always alcohol and drug free parties, because 531 00:29:58,680 --> 00:30:00,360 Speaker 1: if there's one thing I know about nerds, it's that 532 00:30:00,400 --> 00:30:03,040 Speaker 1: they do a shitload of drugs. I definitely didn't hear 533 00:30:03,080 --> 00:30:04,520 Speaker 1: of this one, because I went to a couple of 534 00:30:04,600 --> 00:30:09,080 Speaker 1: MIT frat parties and they were, um, not sober.
535 00:30:09,400 --> 00:30:11,760 Speaker 1: Fun fact: one of the MIT 536 00:30:11,880 --> 00:30:14,320 Speaker 1: frat parties I went to, for some reason, when 537 00:30:14,320 --> 00:30:16,040 Speaker 1: I was in college... and I would get really drunk, 538 00:30:16,080 --> 00:30:18,160 Speaker 1: I would always, I would like, I would, I liked 539 00:30:18,240 --> 00:30:21,360 Speaker 1: to, like, steal things from wherever I was. And so 540 00:30:21,480 --> 00:30:24,080 Speaker 1: I stole two critical pool balls from an MIT 541 00:30:24,080 --> 00:30:26,400 Speaker 1: frat house, and someone was able to trace it 542 00:30:26,400 --> 00:30:30,680 Speaker 1: back to me and they demanded their pool balls back. Um, 543 00:30:30,800 --> 00:30:35,560 Speaker 1: and, embarrassingly, I think I capitulated. I think I 544 00:30:35,560 --> 00:30:38,479 Speaker 1: did give them back. I shouldn't have. Wow. Wow. Actually, so 545 00:30:38,560 --> 00:30:41,080 Speaker 1: this one kid, I'm finding, died because he put 546 00:30:41,080 --> 00:30:43,360 Speaker 1: a bag over his head to inhale nitrous, which 547 00:30:43,400 --> 00:30:45,960 Speaker 1: is... the fuck, man? How did you get into 548 00:30:46,560 --> 00:30:49,320 Speaker 1: MIT? I don't know... I was like, 549 00:30:50,920 --> 00:30:53,760 Speaker 1: this guy, I don't know. Don't put bags, don't put 550 00:30:53,800 --> 00:30:57,680 Speaker 1: bags over your heads, kids. Um, yeah, I learned that 551 00:30:57,760 --> 00:31:00,640 Speaker 1: in public school, no less. So many better ways to 552 00:31:00,680 --> 00:31:05,320 Speaker 1: do whippets than putting a plastic bag over your head. Anyway, whatever. 553 00:31:05,960 --> 00:31:09,600 Speaker 1: Next: according to the popular Sam Bankman-Fried endorsed 554 00:31:09,680 --> 00:31:12,200 Speaker 1: version of the story, he pivoted towards an almost obsessive 555 00:31:12,240 --> 00:31:15,240 Speaker 1: devotion to ethics.
In his freshman year, he went vegan, 556 00:31:15,280 --> 00:31:18,640 Speaker 1: he organized a protest against factory farming, and he worried 557 00:31:18,720 --> 00:31:21,680 Speaker 1: obsessively over how he could change the world for the better. 558 00:31:22,200 --> 00:31:23,840 Speaker 1: And it was at this point that Sam met a 559 00:31:23,840 --> 00:31:27,560 Speaker 1: man who was going to change his life forever: William MacAskill. Um, 560 00:31:27,560 --> 00:31:28,880 Speaker 1: if you want to learn more about this guy, I 561 00:31:28,880 --> 00:31:30,760 Speaker 1: do recommend the episode of It Could Happen Here on 562 00:31:30,800 --> 00:31:35,600 Speaker 1: effective altruism. Um. This guy is today the top 563 00:31:35,640 --> 00:31:38,560 Speaker 1: pop philosopher of effective altruism and longtermism. Um, 564 00:31:38,600 --> 00:31:41,120 Speaker 1: he is in Elon Musk's text messages that we all 565 00:31:41,160 --> 00:31:44,880 Speaker 1: got as a result of the Twitter lawsuit. Um. At 566 00:31:44,880 --> 00:31:47,000 Speaker 1: this point he was also at MIT, and 567 00:31:47,240 --> 00:31:51,600 Speaker 1: he met with Sam at a cafe in Cambridge, Massachusetts, 568 00:31:51,600 --> 00:31:54,720 Speaker 1: where MacAskill... Which one? Which one? I don't know. 569 00:31:54,760 --> 00:31:58,440 Speaker 1: I'm sure it's out there somewhere, Jamie. Um, it's... I'm 570 00:31:58,440 --> 00:32:01,440 Speaker 1: sure you've gotten a hot dog there. Um, that's, that 571 00:32:01,560 --> 00:32:03,960 Speaker 1: seems likely. Now, any place this guy's going doesn't 572 00:32:04,000 --> 00:32:07,400 Speaker 1: have hot dogs. They're not ethical. They're famously unethical. So 573 00:32:07,680 --> 00:32:11,760 Speaker 1: you're probably right.
So MacAskill explained the concept of effective 574 00:32:11,760 --> 00:32:14,360 Speaker 1: altruism to him, which is again this idea that, like, 575 00:32:14,800 --> 00:32:17,960 Speaker 1: what matters is you should, like, think kind of coldly 576 00:32:18,040 --> 00:32:20,720 Speaker 1: and robotically about how you help, to make sure 577 00:32:20,720 --> 00:32:23,800 Speaker 1: that your charity money does the most that it can do. 578 00:32:24,200 --> 00:32:26,640 Speaker 1: But one of the big, like, arguments about it is 579 00:32:26,680 --> 00:32:29,080 Speaker 1: like, okay, well, what if you... you know, should 580 00:32:29,080 --> 00:32:32,120 Speaker 1: you save a drowning child instead of saving, like, three 581 00:32:32,200 --> 00:32:34,640 Speaker 1: kids from a burning building? And it's like, that's a 582 00:32:34,680 --> 00:32:37,479 Speaker 1: nonsense choice. Nobody's ever been presented with that choice at 583 00:32:37,480 --> 00:32:39,600 Speaker 1: any point in the history of the human race. It 584 00:32:39,720 --> 00:32:42,479 Speaker 1: is not a reasonable... that is not a... there's no 585 00:32:42,560 --> 00:32:47,920 Speaker 1: point to that ethical argument. You're not smart, just draft driver. Yeah, 586 00:32:48,200 --> 00:32:51,720 Speaker 1: it makes no sense. Yeah, it makes no fucking sense. 587 00:32:52,120 --> 00:32:54,200 Speaker 1: There's like bits of it that are reasonable, which is 588 00:32:54,240 --> 00:32:56,080 Speaker 1: that, like, well, you know, it makes sense to, like, 589 00:32:56,160 --> 00:32:58,880 Speaker 1: look at the best thing you can do financially, you know, 590 00:32:58,960 --> 00:33:01,600 Speaker 1: in terms of donating money, which is, you know, malaria prevention, 591 00:33:01,600 --> 00:33:03,480 Speaker 1: because it winds up being the most cost effective thing.
592 00:33:03,480 --> 00:33:05,480 Speaker 1: But it's like, okay, does that mean we shouldn't put 593 00:33:05,520 --> 00:33:08,400 Speaker 1: money into making the water in Flint, Michigan drinkable? And 594 00:33:08,520 --> 00:33:10,280 Speaker 1: a lot of these guys will say no, because that's 595 00:33:10,320 --> 00:33:12,360 Speaker 1: not the best use of money. And it's like, well, 596 00:33:12,520 --> 00:33:14,440 Speaker 1: we could do many things with money, especially if we 597 00:33:14,520 --> 00:33:18,920 Speaker 1: tax billionaires and put it towards rebuilding infrastructure. Um, a number, 598 00:33:18,920 --> 00:33:26,040 Speaker 1: a number of things can be done. Anyway, MacAskill, MacAskill 599 00:33:26,160 --> 00:33:29,480 Speaker 1: kind of pills this guy on effective altruism. He frames 600 00:33:29,480 --> 00:33:33,520 Speaker 1: it as a strategic investment whose success, um, was measured 601 00:33:33,560 --> 00:33:36,880 Speaker 1: in populations' worth of human lives. He estimated, using back 602 00:33:36,920 --> 00:33:39,320 Speaker 1: of the envelope math, that two thousand dollars could save 603 00:33:39,400 --> 00:33:41,720 Speaker 1: one life, and so a million dollars could save five 604 00:33:41,800 --> 00:33:44,520 Speaker 1: hundred people, a billion could save half a million, and 605 00:33:44,560 --> 00:33:48,280 Speaker 1: a trillion dollars could theoretically save half a billion lives. 606 00:33:48,920 --> 00:33:54,040 Speaker 1: Based on that totally legitimate math... People are math! It 607 00:33:54,080 --> 00:33:57,960 Speaker 1: all works out that way. Based, based on that absolutely 608 00:33:58,000 --> 00:34:00,959 Speaker 1: real math, the only ethical way for a genius like 609 00:34:01,080 --> 00:34:03,760 Speaker 1: Sam to use his time and talents is to become 610 00:34:03,800 --> 00:34:07,200 Speaker 1: the world's first trillionaire. And I'm gonna quote again from 611 00:34:07,240 --> 00:34:11,520 Speaker 1: that article that I hate.
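For what it's worth, the back-of-the-envelope math being mocked here is nothing more than linear scaling from a single assumed constant. A minimal sketch of the arithmetic as quoted (the two-thousand-dollars-per-life figure is MacAskill's rough estimate, not a verified number):

```python
# Sketch of the back-of-the-envelope scaling quoted in the episode.
# The $2,000-per-life figure is MacAskill's rough estimate, not a verified number.
COST_PER_LIFE_USD = 2_000

def lives_saved(donation_usd: float) -> float:
    """Purely linear scaling: lives saved = dollars donated / cost per life."""
    return donation_usd / COST_PER_LIFE_USD

# The figures from the pitch:
print(lives_saved(1_000_000))          # 500.0 ("five hundred people")
print(lives_saved(1_000_000_000))      # 500000.0 ("half a million")
print(lives_saved(1_000_000_000_000))  # 500000000.0 ("half a billion")
```

The point of the joke is that this really is the whole model: one assumed constant and a division, with no diminishing returns or uncertainty anywhere in it.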
SBF listened, nodding, as MacAskill 612 00:34:11,520 --> 00:34:14,239 Speaker 1: made his pitch. The earn-to-give logic was 613 00:34:14,320 --> 00:34:19,080 Speaker 1: airtight. It was, SBF realized, applied utilitarianism. Knowing what 614 00:34:19,160 --> 00:34:22,840 Speaker 1: he had to do, SBF simply said, yep, that makes sense. 615 00:34:23,239 --> 00:34:25,799 Speaker 1: But right there, between a bright yellow sunshade and the 616 00:34:25,840 --> 00:34:29,240 Speaker 1: crumb strewn red brick floor, SBF's purpose in life was set. 617 00:34:29,400 --> 00:34:32,200 Speaker 1: He was going to get filthy rich for charity's sake. 618 00:34:32,440 --> 00:34:36,280 Speaker 1: All the rest was merely execution risk. His course established, 619 00:34:36,320 --> 00:34:40,080 Speaker 1: MacAskill gave SBF one last navigational nudge to set him 620 00:34:40,120 --> 00:34:42,880 Speaker 1: on his way, suggesting that SBF get an internship at 621 00:34:42,960 --> 00:34:46,719 Speaker 1: Jane Street that summer. And so, for the good of man, 622 00:34:46,920 --> 00:34:50,160 Speaker 1: for the good of mankind, get in the finance industry 623 00:34:50,200 --> 00:34:55,560 Speaker 1: and gamble like a motherfucking asshole. God, I fucking 624 00:34:55,600 --> 00:35:01,920 Speaker 1: hate these people so much. "That makes sense"? Yeah, you know, look, 625 00:35:01,960 --> 00:35:05,799 Speaker 1: I think that I can't debate that logic. There's no 626 00:35:05,920 --> 00:35:08,600 Speaker 1: argument to be made about that logic, Jamie. I mean, 627 00:35:08,640 --> 00:35:10,200 Speaker 1: I know that this is the wrong person to be 628 00:35:10,239 --> 00:35:12,200 Speaker 1: turning on in this moment, but this is the dick-ridingest... 629 00:35:12,280 --> 00:35:16,040 Speaker 1: like, this is the... Jamie's like, he has 630 00:35:16,080 --> 00:35:22,680 Speaker 1: not begun to ride dick.
This is, this is, this 631 00:35:22,800 --> 00:35:26,160 Speaker 1: man's got no gag reflex. Hold on, Jamie, I'm going to 632 00:35:26,239 --> 00:35:28,400 Speaker 1: read you some passages from this that are going to 633 00:35:28,480 --> 00:35:33,239 Speaker 1: make you gag. It is unbearable. So, and it's like, 634 00:35:33,280 --> 00:35:35,360 Speaker 1: this article is like ten thousand fucking words. It took me 635 00:35:35,400 --> 00:35:37,360 Speaker 1: like an hour to get through this thing. It's massive. 636 00:35:37,719 --> 00:35:39,640 Speaker 1: He's like, maybe if I do it, he'll give 637 00:35:39,640 --> 00:35:42,359 Speaker 1: me a kiss on the mouth. Well, basically, this, this 638 00:35:42,440 --> 00:35:45,520 Speaker 1: was published by Sequoia, which is a massive, like, investment 639 00:35:45,960 --> 00:35:50,840 Speaker 1: fucking fund thing, on their website. A journalistic entity? Yeah, it 640 00:35:50,880 --> 00:35:53,000 Speaker 1: looks like that. It looks exactly like an article from 641 00:35:53,000 --> 00:35:55,279 Speaker 1: like Wired or something. Like, they clearly laid it out 642 00:35:55,320 --> 00:35:56,960 Speaker 1: like that. But I think it was done because they 643 00:35:57,000 --> 00:35:59,560 Speaker 1: put like two hundred million dollars into his company, so 644 00:35:59,600 --> 00:36:01,799 Speaker 1: they needed to justify it by making him look like 645 00:36:01,840 --> 00:36:05,319 Speaker 1: a genius. So it's like those sponsored articles you see occasionally. 646 00:36:05,360 --> 00:36:08,240 Speaker 1: I mean, it's more scary when they're on actual journalistic outlets, 647 00:36:08,280 --> 00:36:10,600 Speaker 1: and then there's just a little tag saying, like, hey, 648 00:36:10,640 --> 00:36:13,799 Speaker 1: this is sponsored by RuPaul's fracking farm or whatever the fuck, 649 00:36:14,920 --> 00:36:22,200 Speaker 1: and it's like, why? Why, RuPaul? Um... LGBT icons.
650 00:36:22,760 --> 00:36:26,640 Speaker 1: So yeah, Sam Bankman-Fried gets into finance, and he's 651 00:36:26,640 --> 00:36:29,000 Speaker 1: a very good trader. As... I mean, I have no 652 00:36:29,040 --> 00:36:31,600 Speaker 1: way to judge this, but Sequoia says he was a 653 00:36:31,640 --> 00:36:34,040 Speaker 1: good trader. Um, he was good at making a lot 654 00:36:34,040 --> 00:36:36,120 Speaker 1: of money for other people and also a lot of 655 00:36:36,120 --> 00:36:40,439 Speaker 1: money for himself. Besides that, he gave away fifty percent 656 00:36:40,520 --> 00:36:43,360 Speaker 1: of his income to his favorite charities, but those charities 657 00:36:43,360 --> 00:36:46,880 Speaker 1: were mostly the Center for Effective Altruism and 80,000 Hours, 658 00:36:46,880 --> 00:36:51,000 Speaker 1: which is also an effective altruism charity. Good money? That's 659 00:36:51,200 --> 00:36:54,040 Speaker 1: a great question, Jamie. Um, it allows guys like, like 660 00:36:54,120 --> 00:36:57,439 Speaker 1: MacAskill to live very well while also saying they only 661 00:36:57,480 --> 00:36:59,759 Speaker 1: take thirty thousand dollars in salaries and give away the 662 00:36:59,800 --> 00:37:02,719 Speaker 1: rest, because their lives are heavily subsidized by these organizations 663 00:37:02,760 --> 00:37:05,560 Speaker 1: that allow billionaires to pretend to be heroes. So, like, 664 00:37:05,840 --> 00:37:09,720 Speaker 1: so, like, charity. So, like, charity. Yeah. So he remains 665 00:37:09,760 --> 00:37:12,319 Speaker 1: there happily for years until two thousand seventeen, when he 666 00:37:12,360 --> 00:37:15,279 Speaker 1: begins to feel as if something is not right. Now, 667 00:37:15,480 --> 00:37:19,320 Speaker 1: spoilers: he is having a crisis, he is having a quarter life crisis. 668 00:37:19,360 --> 00:37:23,000 Speaker 1: And this kid absolutely is a con artist.
And what 669 00:37:23,080 --> 00:37:25,439 Speaker 1: I am giving you is the polished, press-friendly version 670 00:37:25,440 --> 00:37:27,799 Speaker 1: of the story for a guy whose entire life, as 671 00:37:27,800 --> 00:37:30,080 Speaker 1: far as I can tell, was one long setup 672 00:37:30,120 --> 00:37:32,400 Speaker 1: for an ambitious con. So when I say stuff like 673 00:37:32,440 --> 00:37:34,040 Speaker 1: he gave a lot of money to charity, because there's 674 00:37:34,080 --> 00:37:37,440 Speaker 1: like other charities he gives to, some of which sound reasonable, 675 00:37:37,480 --> 00:37:39,520 Speaker 1: but I have actually no evidence that he did. Like, 676 00:37:39,560 --> 00:37:41,560 Speaker 1: I have no evidence that he did, and I haven't 677 00:37:41,560 --> 00:37:44,520 Speaker 1: seen it. Um. So when I say stuff like he 678 00:37:44,640 --> 00:37:46,640 Speaker 1: felt like that, or when they say stuff like he 679 00:37:46,680 --> 00:37:50,239 Speaker 1: felt unfulfilled at Jane Street, that doesn't mean he actually did. 680 00:37:50,320 --> 00:37:52,680 Speaker 1: Because we are at present reliant on a lot of 681 00:37:52,680 --> 00:37:54,480 Speaker 1: reporting from back when this kid was the toast of 682 00:37:54,480 --> 00:37:57,000 Speaker 1: Wall Street. Now after his life fell apart and his 683 00:37:57,080 --> 00:37:58,759 Speaker 1: company crashed and it became clear that he was a 684 00:37:58,760 --> 00:38:01,239 Speaker 1: financial criminal, it also came out that the guy who 685 00:38:01,239 --> 00:38:03,800 Speaker 1: wrote The Big Short had been following him for six months. 686 00:38:03,920 --> 00:38:08,080 Speaker 1: So I suspect at some point, that's gonna 687 00:38:08,120 --> 00:38:09,640 Speaker 1: be fun. We're all going to be in for a 688 00:38:09,680 --> 00:38:12,919 Speaker 1: real treat when that book's out. I love.
I love when 689 00:38:12,920 --> 00:38:15,840 Speaker 1: you're like, and guess who is following him around? You're like, oh, 690 00:38:15,880 --> 00:38:19,040 Speaker 1: he's got Michael Lewis on his tail. It's like, and also, 691 00:38:19,320 --> 00:38:23,759 Speaker 1: if you're an investor, shouldn't, like, somebody involved in 692 00:38:23,800 --> 00:38:26,239 Speaker 1: one of these companies probably have been able to 693 00:38:26,239 --> 00:38:29,040 Speaker 1: find out, like, oh, hey, The Big Short guy's hanging 694 00:38:29,040 --> 00:38:31,160 Speaker 1: out with him, that probably means this is a giant 695 00:38:31,200 --> 00:38:33,440 Speaker 1: financial crime. That guy's not gonna just hang out with 696 00:38:33,480 --> 00:38:36,479 Speaker 1: a dude who's good at legally making money to write 697 00:38:36,520 --> 00:38:39,160 Speaker 1: about how good he is at making money legally, because 698 00:38:39,200 --> 00:38:42,200 Speaker 1: that's not Michael Lewis's beat. I'm kind of going 699 00:38:42,239 --> 00:38:44,640 Speaker 1: for like a change of pace this time, just gonna try 700 00:38:44,680 --> 00:38:47,640 Speaker 1: to see a guy who's like doing something right. Yeah, Michael, 701 00:38:47,880 --> 00:38:50,840 Speaker 1: aren't you the guy who only writes about financial crimes 702 00:38:50,880 --> 00:38:55,799 Speaker 1: on like a gigantic scale. No, no, I think this 703 00:38:55,880 --> 00:38:59,120 Speaker 1: is, he's just probably just trying to network. Yeah, it's 704 00:38:59,200 --> 00:39:03,800 Speaker 1: just just just really like this guy's attitude towards altruism. 705 00:39:03,840 --> 00:39:08,359 Speaker 1: So anyway, um, yeah, anyway, here's how that, again, very 706 00:39:08,360 --> 00:39:11,840 Speaker 1: dick writing PR flack motherfucker wrote about what happens next. 707 00:39:12,680 --> 00:39:16,160 Speaker 1: He was, he realized, too secure.
SBF's mind had been 708 00:39:16,200 --> 00:39:19,320 Speaker 1: trained almost from birth to calculate. As a schoolboy, the 709 00:39:19,440 --> 00:39:23,400 Speaker 1: hedonic calculus of utilitarianism, and he's trying to maximize the 710 00:39:23,520 --> 00:39:27,960 Speaker 1: utility function, measured in utils, of course. During 711 00:39:28,000 --> 00:39:31,320 Speaker 1: his teenage ga— I know, I know, that's a sentence. 712 00:39:32,920 --> 00:39:37,640 Speaker 1: Does he say "of course"? Uh no, no, I said 713 00:39:37,640 --> 00:39:40,640 Speaker 1: that. He does say, he does say, he does say 714 00:39:40,640 --> 00:39:46,240 Speaker 1: "of course", yes, measured, of course, of course. Suck my ass. 715 00:39:46,360 --> 00:39:51,600 Speaker 1: Unbelievable. During his teenage gaming years, his mathematical 716 00:39:51,640 --> 00:39:55,000 Speaker 1: abilities allowed him to sharpen his tactics and win. And 717 00:39:55,080 --> 00:39:58,200 Speaker 1: of course, every trade SBF ever made at Jane was 718 00:39:58,239 --> 00:40:00,880 Speaker 1: the subject of a risk-reward calculation, all of 719 00:40:00,880 --> 00:40:04,240 Speaker 1: it boiled down to expected value. The formula is fairly simple. 720 00:40:04,360 --> 00:40:06,719 Speaker 1: If the amount won multiplied by the probability of winning 721 00:40:06,760 --> 00:40:08,920 Speaker 1: a bet is greater than the amount lost multiplied by 722 00:40:08,920 --> 00:40:11,200 Speaker 1: the probability of losing a bet, then you go for it. 723 00:40:11,280 --> 00:40:14,640 Speaker 1: Irrespective of units, utils, euros, dollars, we're all subject to 724 00:40:14,640 --> 00:40:17,920 Speaker 1: the same reckoning. But at Jane, SBF absorbed another 725 00:40:18,000 --> 00:40:21,759 Speaker 1: trading principle. He learned to be risk neutral.
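The expected-value rule quoted from the Sequoia piece can be written out in a few lines. This is just an illustration of the quoted formula with made-up numbers, not anything from the episode:

```python
def take_the_bet(amount_won, p_win, amount_lost):
    """The quoted rule: take the bet when the amount won times the
    probability of winning exceeds the amount lost times the
    probability of losing."""
    p_lose = 1 - p_win
    return amount_won * p_win > amount_lost * p_lose

# Hypothetical bet: win $100 with 60% probability, lose $100 otherwise.
# 100 * 0.6 = 60 beats 100 * 0.4 = 40, so "you go for it".
print(take_the_bet(100, 0.6, 100))  # True
```

Flip the probability to 30% and the same rule says to pass, since the expected loss outweighs the expected gain.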
In simple terms, 726 00:40:21,800 --> 00:40:24,560 Speaker 1: a trader given a choice between a sure fifty dollars 727 00:40:24,560 --> 00:40:27,040 Speaker 1: and a fifty percent chance at a hundred dollars must be 728 00:40:27,080 --> 00:40:29,680 Speaker 1: agnostic if they want to maximize the expected value of 729 00:40:29,680 --> 00:40:32,319 Speaker 1: earnings over a lifetime. Those who prefer the sure win 730 00:40:32,400 --> 00:40:34,880 Speaker 1: are risk averse and those who would rather gamble are 731 00:40:35,040 --> 00:40:37,680 Speaker 1: risk lovers. But both risk lovers and the risk averse 732 00:40:37,680 --> 00:40:40,279 Speaker 1: are suckers equally, because over the long run they lose 733 00:40:40,280 --> 00:40:43,160 Speaker 1: out to the risk neutral who take both deals without prejudice. 734 00:40:43,600 --> 00:40:45,960 Speaker 1: That makes no sense. That makes no sense at all, 735 00:40:45,960 --> 00:40:47,680 Speaker 1: because like you, you're you're assuming you have to like 736 00:40:47,800 --> 00:40:50,960 Speaker 1: choose between one? Can you just take both? Is that 737 00:40:51,040 --> 00:40:52,759 Speaker 1: like the, is that the offer? Because it seems like 738 00:40:52,840 --> 00:40:55,920 Speaker 1: the whole thought experiment is about choosing between one. None 739 00:40:55,960 --> 00:40:59,240 Speaker 1: of this makes very much sense. I had a brain 740 00:40:59,280 --> 00:41:01,080 Speaker 1: hemorrhage in the middle of all of that, and then I 741 00:41:01,120 --> 00:41:04,279 Speaker 1: was thinking, I couldn't stop thinking about, do you think, 742 00:41:04,280 --> 00:41:08,640 Speaker 1: the utilitarians, how do utilitarians feel about kissing with tongue? 743 00:41:08,680 --> 00:41:12,239 Speaker 1: Do you think, um, how many utils does it take 744 00:41:12,320 --> 00:41:14,319 Speaker 1: to kiss with tongue? Or or don't waste your fire.
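The "sure fifty dollars versus a fifty percent shot at a hundred" comparison boils down to the two gambles having the same expected value, which is all "risk neutral" means here. A minimal sketch with the numbers from the passage:

```python
def expected_value(outcomes):
    """Expected value of a gamble given (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

sure_thing = [(1.0, 50)]             # a guaranteed $50
coin_flip = [(0.5, 100), (0.5, 0)]   # a 50% chance at $100, else nothing

# Both come out to $50, so a strictly EV-maximizing trader is
# indifferent between them; consistently preferring either one
# is what the article calls being risk averse or a risk lover.
print(expected_value(sure_thing), expected_value(coin_flip))  # 50.0 50.0
```

The "suckers" framing in the quote only follows if you repeat the choice many times and care about nothing but the long-run average, which is the assumption the hosts are pushing back on.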
745 00:41:14,880 --> 00:41:17,920 Speaker 1: Let me write the equation out, Jamie, and try to 746 00:41:17,920 --> 00:41:20,520 Speaker 1: try to sketch out the math on— I think they 747 00:41:20,560 --> 00:41:22,520 Speaker 1: wouldn't be into it. I think they would be like, well, 748 00:41:22,520 --> 00:41:29,160 Speaker 1: what's the point. Yeah, that's, that seems real. Five utils. Sorry, James, Jamie, 749 00:41:29,160 --> 00:41:32,520 Speaker 1: I gotta continue. I don't know, but we 750 00:41:32,600 --> 00:41:37,400 Speaker 1: have to read another quote here. Here, SBF realized, was 751 00:41:37,440 --> 00:41:40,160 Speaker 1: the rub: when he applied this principle to his own life, 752 00:41:40,239 --> 00:41:42,719 Speaker 1: he came up short. There was little chance he'd get 753 00:41:42,800 --> 00:41:45,600 Speaker 1: himself fired from Jane Street. Thus, the decision to stick 754 00:41:45,640 --> 00:41:47,920 Speaker 1: with Jane was a risk averse preference. It was the 755 00:41:48,080 --> 00:41:52,160 Speaker 1: logical equivalent of being offered a choice between fifty dollars 756 00:41:52,160 --> 00:41:54,960 Speaker 1: and a fifty percent chance of a hundred dollars and saying give me 757 00:41:55,080 --> 00:41:58,560 Speaker 1: President Grant. SBF was risk neutral on behalf of Jane Street, 758 00:41:58,600 --> 00:42:00,759 Speaker 1: but not, he realized, for his own life. To be 759 00:42:00,800 --> 00:42:03,759 Speaker 1: fully rational about maximizing his income on behalf of the poor, 760 00:42:04,040 --> 00:42:06,600 Speaker 1: he should apply his trading principles across the board. He 761 00:42:06,640 --> 00:42:09,000 Speaker 1: had to find a risk neutral career path, which, if 762 00:42:09,000 --> 00:42:11,839 Speaker 1: we strip away the trader jargon, actually means he needed 763 00:42:11,840 --> 00:42:15,680 Speaker 1: to take on a lot more risk. Oh Jamie, you 764 00:42:15,719 --> 00:42:18,160 Speaker 1: need to hear this, boy. We're driving it away.
I've 765 00:42:18,160 --> 00:42:21,759 Speaker 1: been asleep for six minutes. Come on. Which, if we 766 00:42:21,800 --> 00:42:24,640 Speaker 1: strip away the trader jargon, actually means he felt he 767 00:42:24,680 --> 00:42:26,560 Speaker 1: needed to take on a lot more risk in the 768 00:42:26,560 --> 00:42:29,520 Speaker 1: hopes of becoming part of the global elite. The math 769 00:42:29,600 --> 00:42:33,480 Speaker 1: couldn't be clearer. Very high risk multiplied by dynastic wealth 770 00:42:33,760 --> 00:42:37,040 Speaker 1: trumps low risk multiplied by mere rich guy wealth. To 771 00:42:37,120 --> 00:42:39,120 Speaker 1: do the most good for the world, SBF needed to 772 00:42:39,120 --> 00:42:41,520 Speaker 1: find a path on which he'd be a coin toss 773 00:42:41,560 --> 00:42:46,200 Speaker 1: away from going totally bust. So he has to be risk neutral, 774 00:42:46,200 --> 00:42:48,520 Speaker 1: but that means taking a lot of risk, because the 775 00:42:48,560 --> 00:42:50,800 Speaker 1: most risk is the only way to become the wealthiest 776 00:42:50,880 --> 00:42:53,280 Speaker 1: person in the world. And only by becoming the wealthiest 777 00:42:53,280 --> 00:42:57,080 Speaker 1: person in the world can you avoid risk. You get it, Jamie. Yeah, 778 00:42:57,160 --> 00:42:59,560 Speaker 1: And that's the most ethical thing you can do. That's 779 00:42:59,640 --> 00:43:02,759 Speaker 1: the, really the most logical ethical way to live. So 780 00:43:02,960 --> 00:43:05,040 Speaker 1: do you think that they kissed with tongue or not? 781 00:43:05,840 --> 00:43:09,680 Speaker 1: I mean, I think what he's saying is, in 782 00:43:09,760 --> 00:43:12,560 Speaker 1: order to avoid the risk of catching an STD you 783 00:43:12,640 --> 00:43:15,279 Speaker 1: have to take on a job as the bathroom attendant 784 00:43:15,400 --> 00:43:19,960 Speaker 1: at a brothel.
Um Ah, that's the risk averse 785 00:43:20,719 --> 00:43:26,440 Speaker 1: or risk neutral presentation. Frantically rubbing my final brain cells together, 786 00:43:26,520 --> 00:43:31,080 Speaker 1: trying to make heads or tails of that, and it is. 787 00:43:31,239 --> 00:43:35,520 Speaker 1: It is howling clown shit. It is absolutely sparking nonsense. 788 00:43:35,960 --> 00:43:39,239 Speaker 1: It's like a vortex of bullshit to be like, 789 00:43:39,520 --> 00:43:42,520 Speaker 1: so anyways, it's really clear, and the math couldn't be 790 00:43:42,560 --> 00:43:44,839 Speaker 1: clearer, that he has to be the most richest guy 791 00:43:45,000 --> 00:43:48,400 Speaker 1: or everyone is going to die, like actually really urgent. 792 00:43:50,400 --> 00:43:53,080 Speaker 1: You know, it's one of those things because I have 793 00:43:53,200 --> 00:43:56,439 Speaker 1: I have known a number of rich guys in my life, um, 794 00:43:56,680 --> 00:44:00,200 Speaker 1: and some of them are— anyway, there's 795 00:44:00,239 --> 00:44:03,200 Speaker 1: two kinds. There's people. There's people who were poor at 796 00:44:03,200 --> 00:44:06,880 Speaker 1: one point, and some of those people are unhinged and 797 00:44:06,960 --> 00:44:09,560 Speaker 1: some of them still remember being poor enough to talk 798 00:44:09,600 --> 00:44:12,640 Speaker 1: like normal people. And then there's people like this, who, 799 00:44:12,760 --> 00:44:16,600 Speaker 1: Sam was never rich as a kid, but he lived 800 00:44:16,600 --> 00:44:20,800 Speaker 1: in this rarefied air where finance and like the concept 801 00:44:20,880 --> 00:44:23,840 Speaker 1: of worrying about money or his economic status was not 802 00:44:23,920 --> 00:44:26,000 Speaker 1: a thing, because everyone around him when he was a 803 00:44:26,080 --> 00:44:29,160 Speaker 1: kid was so high status.
And he like, he's just 804 00:44:29,320 --> 00:44:31,920 Speaker 1: lived in this, it's not even a, it's not even 805 00:44:31,960 --> 00:44:34,520 Speaker 1: a bubble. He grew up on a different planet. Like 806 00:44:34,600 --> 00:44:36,799 Speaker 1: the world does not exist to him the same way 807 00:44:36,800 --> 00:44:39,399 Speaker 1: it does to everyone else. And so he's like, that's 808 00:44:39,440 --> 00:44:42,480 Speaker 1: the only way you can talk about things, in this way. Um, 809 00:44:42,760 --> 00:44:45,600 Speaker 1: there's a sort of sinister thing where you're talking about 810 00:44:46,080 --> 00:44:50,359 Speaker 1: everything like it is very— My thinking is, it's like it's 811 00:44:50,400 --> 00:44:53,080 Speaker 1: so easy for him and people like this to think 812 00:44:53,080 --> 00:44:55,719 Speaker 1: of other people as theoreticals because they've never had a 813 00:44:55,719 --> 00:44:59,479 Speaker 1: problem before. So it's all a game of chance because 814 00:45:00,200 --> 00:45:03,719 Speaker 1: he's never met a person, right. Right, he has never known a 815 00:45:03,880 --> 00:45:08,080 Speaker 1: human being. Just Stanford professors, exactly, just Stanford professors and 816 00:45:08,160 --> 00:45:10,880 Speaker 1: problems in his video games. So yeah, and not that 817 00:45:10,960 --> 00:45:14,800 Speaker 1: all nerds. I mean, I'm not, I'm I'm afraid of nerds, 818 00:45:14,880 --> 00:45:17,240 Speaker 1: you know, people who identify as nerds coming into my mentions. 819 00:45:17,239 --> 00:45:19,040 Speaker 1: I'm not saying that that's everybody, but I'm saying like 820 00:45:19,280 --> 00:45:22,560 Speaker 1: he's not socialized like a person, and that this 821 00:45:22,680 --> 00:45:26,719 Speaker 1: has to do with class. Yeah, God damn it. Um.
822 00:45:26,719 --> 00:45:30,000 Speaker 1: So the next thing that SBF did after deciding he 823 00:45:30,040 --> 00:45:32,680 Speaker 1: had to quit Jane Street, um, is start pondering how 824 00:45:32,719 --> 00:45:35,239 Speaker 1: he might change the world in a way that minimized 825 00:45:35,280 --> 00:45:38,760 Speaker 1: his risk. By maximizing his risk, or some shit. Anyway, 826 00:45:38,920 --> 00:45:42,279 Speaker 1: as he told it, he considered four career fields, and 827 00:45:42,360 --> 00:45:44,799 Speaker 1: this is— these are his notes on the four things 828 00:45:44,840 --> 00:45:48,600 Speaker 1: he might do after being a trader. Number one, journalism: 829 00:45:48,840 --> 00:45:52,439 Speaker 1: low pay, but a massively outsized impact potential. Number two, 830 00:45:52,640 --> 00:45:55,840 Speaker 1: running for office, or maybe just being an advisor. Number 831 00:45:55,840 --> 00:45:59,160 Speaker 1: three, working for the movement; EA, effective altruism, 832 00:45:59,160 --> 00:46:03,000 Speaker 1: needs people. Number four, starting a startup, but what exactly? 833 00:46:03,320 --> 00:46:05,960 Speaker 1: Number five, bumming around the Bay Area for a month 834 00:46:06,040 --> 00:46:09,360 Speaker 1: or so just to see what happens. Now again, it 835 00:46:09,400 --> 00:46:12,239 Speaker 1: sounds like a bad Bumble date. Like it's like, so 836 00:46:12,360 --> 00:46:14,960 Speaker 1: what do you do? It's like, oh, well, um, so 837 00:46:15,160 --> 00:46:17,800 Speaker 1: start up maybe, but what is a start up? Really? 838 00:46:17,960 --> 00:46:21,560 Speaker 1: What is a startup? So again, he spent years working 839 00:46:21,560 --> 00:46:23,520 Speaker 1: in finance, he's got plenty of money. He came from 840 00:46:23,520 --> 00:46:25,520 Speaker 1: the Bay Area, so five was an option, and it's 841 00:46:25,560 --> 00:46:27,839 Speaker 1: one he took.
Um, and by the way, like as 842 00:46:27,840 --> 00:46:30,080 Speaker 1: a general rule, if you decide to quit your job 843 00:46:30,360 --> 00:46:32,520 Speaker 1: and you have the financial ability to putter 844 00:46:32,560 --> 00:46:34,279 Speaker 1: around for a month or two and think things through, 845 00:46:34,560 --> 00:46:37,840 Speaker 1: not a bad idea, um. But Sam is going to 846 00:46:37,880 --> 00:46:40,600 Speaker 1: do this in the worst way possible. He eventually hits 847 00:46:40,680 --> 00:46:44,040 Speaker 1: upon his great next idea, which is to make a 848 00:46:44,040 --> 00:46:47,200 Speaker 1: shitload of money in crypto. Now, when he quits Jane 849 00:46:47,280 --> 00:46:49,440 Speaker 1: Street, it's two thousand seventeen, and if you guys can 850 00:46:49,480 --> 00:46:52,799 Speaker 1: remember back that far, that's the first big winter when 851 00:46:52,840 --> 00:46:56,160 Speaker 1: cryptocurrency boomed, like kind of all throughout the last quarter 852 00:46:56,320 --> 00:46:59,799 Speaker 1: or so of two thousand seventeen, bitcoin was just 853 00:47:00,040 --> 00:47:02,959 Speaker 1: racking up like massive rises. Ether had a big rise 854 00:47:03,040 --> 00:47:07,839 Speaker 1: too, it just kind of went up early. A lot of 855 00:47:07,880 --> 00:47:09,960 Speaker 1: bitcoin nerds who people had been making fun of 856 00:47:09,960 --> 00:47:12,759 Speaker 1: for years became overnight multimillionaires. And this was kind of 857 00:47:12,760 --> 00:47:16,680 Speaker 1: the first point at which normal people started to think, shit, 858 00:47:16,760 --> 00:47:18,239 Speaker 1: maybe I should get into this, maybe I can make 859 00:47:18,239 --> 00:47:20,120 Speaker 1: a lot of money. Right. This is, this is the 860 00:47:20,120 --> 00:47:22,040 Speaker 1: thing that blew up bitcoin. And there was a crash 861 00:47:22,080 --> 00:47:24,960 Speaker 1: after this, but it recovered, yada, yada, yada.
Sam was 862 00:47:25,000 --> 00:47:27,000 Speaker 1: savvy enough to look at this and know that these 863 00:47:27,120 --> 00:47:30,680 Speaker 1: moments, where a lot of funny money 864 00:47:30,719 --> 00:47:33,359 Speaker 1: is on the table but there's no regulation, um, and 865 00:47:33,440 --> 00:47:35,799 Speaker 1: regular people have started to get interested because they think 866 00:47:35,800 --> 00:47:38,040 Speaker 1: they might get rich, this is the point at which 867 00:47:38,040 --> 00:47:40,759 Speaker 1: an unethical person can make the absolute most money in 868 00:47:40,760 --> 00:47:43,680 Speaker 1: a financial market. Right. And you can be unethical, to 869 00:47:43,760 --> 00:47:45,799 Speaker 1: quote one of the greats, you can be unethical and 870 00:47:45,840 --> 00:47:47,759 Speaker 1: still be legal. That's the way I live my life. 871 00:47:48,520 --> 00:47:50,080 Speaker 1: And you know who else lives their life that way? 872 00:47:50,160 --> 00:47:54,120 Speaker 1: Jamie Loftus. You! And you're tossing to the ads. Yeah, that's right, 873 00:47:54,239 --> 00:47:59,799 Speaker 1: I am indeed, baby. Uh, here we go, all right 874 00:48:05,960 --> 00:48:12,799 Speaker 1: and we're back. So, J-Loft. Yes, it's actually J-Lo. 875 00:48:12,960 --> 00:48:15,680 Speaker 1: It's the first. I'm the first person to use that, 876 00:48:16,000 --> 00:48:20,320 Speaker 1: and it's really starting to catch on. Oh, um, Affleck's 877 00:48:20,400 --> 00:48:22,359 Speaker 1: calling me, one sec, Jamie, let me take this call. 878 00:48:22,640 --> 00:48:26,680 Speaker 1: Oh, that's my boyfriend. Now he's just weeping outside of 879 00:48:26,680 --> 00:48:29,480 Speaker 1: a Dunkin' Donuts and pocket dialed me again, normal normal 880 00:48:29,560 --> 00:48:34,759 Speaker 1: Ben stuff? Am I right? Ben?
Look sometimes I meet, 881 00:48:34,880 --> 00:48:37,680 Speaker 1: I meet up with my friend Ben at the Atwater 882 00:48:37,800 --> 00:48:42,160 Speaker 1: Dunks and I really, I really set him straight. It's nice. Yeah, 883 00:48:42,480 --> 00:48:45,640 Speaker 1: Ben Affleck, sober for years and looks like he's hungover 884 00:48:45,680 --> 00:48:49,759 Speaker 1: in every single photograph. What a king, man. I know, 885 00:48:50,000 --> 00:48:52,040 Speaker 1: I do, I do. I have a lot of— There's 886 00:48:52,080 --> 00:48:54,399 Speaker 1: just a something about him that warms my heart. It's 887 00:48:54,400 --> 00:48:58,279 Speaker 1: his back tattoo. You love his back tattoos. It's just, 888 00:48:58,440 --> 00:49:02,160 Speaker 1: that takes courage, you know, moral courage. I think he's 889 00:49:02,320 --> 00:49:05,640 Speaker 1: amazing. I think, to have that back 890 00:49:05,719 --> 00:49:11,320 Speaker 1: tattoo and still get to marry Jennifer Lopez. I think, 891 00:49:11,400 --> 00:49:15,520 Speaker 1: I think, put his name on the Vietnam Memorial in 892 00:49:15,640 --> 00:49:17,840 Speaker 1: honor of the courage that it took to get that, 893 00:49:17,840 --> 00:49:21,120 Speaker 1: that back tattoo. Really is braver than the troops. And 894 00:49:21,160 --> 00:49:23,640 Speaker 1: he did for a while hold, hold the status of 895 00:49:23,719 --> 00:49:27,839 Speaker 1: most divorced man in America. Um, but that is not— 896 00:49:28,120 --> 00:49:32,759 Speaker 1: he fixed it. But the contest, I was going 897 00:49:32,800 --> 00:49:36,560 Speaker 1: to say, they did, they did get married at a plantation. 898 00:49:36,880 --> 00:49:38,879 Speaker 1: I don't know. Yeah, that, I mean, that's all, that's 899 00:49:38,880 --> 00:49:41,960 Speaker 1: all very fucked up. But I do feel the contest 900 00:49:42,000 --> 00:49:44,319 Speaker 1: for most divorced man.
Did you ever watch Dragon Ball 901 00:49:44,400 --> 00:49:48,080 Speaker 1: Z as a kid, Jamie? I didn't, no. I didn't, no. Well, 902 00:49:48,120 --> 00:49:50,600 Speaker 1: there's this thing that Dragon Ball Z would always do, 903 00:49:50,680 --> 00:49:52,680 Speaker 1: this thing where like you have this guy and he's 904 00:49:52,719 --> 00:49:54,960 Speaker 1: like the most badass person ever that everybody has to 905 00:49:54,960 --> 00:49:56,680 Speaker 1: figure out how to fight. And it's this big problem 906 00:49:56,680 --> 00:49:59,320 Speaker 1: because this is the most terrifying thing in the universe. 907 00:49:59,560 --> 00:50:01,680 Speaker 1: And the next season, like there's something that's like a 908 00:50:01,719 --> 00:50:04,359 Speaker 1: thousand times scarier. It's just this like power creep kind 909 00:50:04,360 --> 00:50:06,319 Speaker 1: of thing. I feel like we've all been dealing with 910 00:50:06,360 --> 00:50:09,240 Speaker 1: that with a divorced guy, because divorced guy Ben Affleck 911 00:50:09,560 --> 00:50:13,879 Speaker 1: now doesn't even register on the divorced guy scale. Living 912 00:50:13,920 --> 00:50:17,600 Speaker 1: in the age of Kanye, and he's really divorced, and 913 00:50:17,680 --> 00:50:21,160 Speaker 1: look out, because Tom Brady is about to hit the World Trade. 914 00:50:21,239 --> 00:50:25,680 Speaker 1: Tom Brady is gonna go Super Saiyan divorced. That 915 00:50:25,840 --> 00:50:31,240 Speaker 1: is going to— a third divorced man has hit the World Trade. 916 00:50:29,000 --> 00:50:38,000 Speaker 1: It's incredible. So back to Sam Bankman-Fried. He is 917 00:50:38,080 --> 00:50:40,879 Speaker 1: just, he has just decided to get into crypto. Now 918 00:50:40,920 --> 00:50:42,840 Speaker 1: the first way to make money in crypto.
That, that 919 00:50:42,880 --> 00:50:44,520 Speaker 1: occurred to him because he spends a bunch of time 920 00:50:44,560 --> 00:50:47,040 Speaker 1: looking into the market and he finds, he sees that 921 00:50:47,280 --> 00:50:49,800 Speaker 1: there's this thing called, I think it's like the kimchi 922 00:50:51,160 --> 00:50:53,759 Speaker 1: premium or something like that, they came up with some weird, 923 00:50:53,840 --> 00:50:56,240 Speaker 1: kind of racist name for it, which is basically bitcoin 924 00:50:56,320 --> 00:50:58,120 Speaker 1: is worth a lot more in Japan and Korea than 925 00:50:58,160 --> 00:50:59,919 Speaker 1: it is in the United States. Right, it's like worth 926 00:51:00,040 --> 00:51:02,920 Speaker 1: fifteen grand in Japan and Korea and it's like ten 927 00:51:03,000 --> 00:51:05,600 Speaker 1: grand in the United States, something like that. Why is 928 00:51:05,680 --> 00:51:09,880 Speaker 1: that? There's a variety of complicated factors. Basically there's a 929 00:51:09,880 --> 00:51:12,319 Speaker 1: bunch of different laws around banking and who can and 930 00:51:12,360 --> 00:51:15,759 Speaker 1: cannot hold accounts and execute trades in those areas. That 931 00:51:15,880 --> 00:51:18,960 Speaker 1: leads to this premium. Because, like, normally, with a premium 932 00:51:19,000 --> 00:51:20,920 Speaker 1: like that, if the markets were kind of accessible to 933 00:51:20,960 --> 00:51:23,960 Speaker 1: each other, and bitcoin's worth fifteen grand in Asia and 934 00:51:24,040 --> 00:51:26,440 Speaker 1: ten grand in the US, then you buy bitcoin in 935 00:51:26,440 --> 00:51:28,200 Speaker 1: the US and you sell it in Asia and you 936 00:51:28,200 --> 00:51:33,680 Speaker 1: get free money, right, um. Very obvious, um, if you can.
937 00:51:33,840 --> 00:51:35,920 Speaker 1: But you can't do that, because you can't get access 938 00:51:35,960 --> 00:51:38,040 Speaker 1: to— as an American, you can't, like, get a Chinese 939 00:51:38,040 --> 00:51:40,480 Speaker 1: account in order to buy bitcoin there, right, um, or 940 00:51:40,480 --> 00:51:42,920 Speaker 1: a Korean account or a Japanese account. There's all these 941 00:51:42,960 --> 00:51:47,760 Speaker 1: laws and they can't. VPN? No, you can't. You cannot 942 00:51:47,800 --> 00:51:50,400 Speaker 1: do that, and no one can figure out how to 943 00:51:50,440 --> 00:51:54,560 Speaker 1: do it, how as a Westerner to sell bitcoin over 944 00:51:54,680 --> 00:51:57,359 Speaker 1: in these parts of Asia and get that premium, right, 945 00:51:57,400 --> 00:51:59,680 Speaker 1: and like just get a bunch of free cash. Sam 946 00:52:00,080 --> 00:52:02,520 Speaker 1: figures out a way to do it, um, which is 947 00:52:02,560 --> 00:52:05,239 Speaker 1: basically like picking up a pile of free money. Right, 948 00:52:05,440 --> 00:52:07,720 Speaker 1: if you're buying bitcoin for ten grand and other people 949 00:52:07,719 --> 00:52:10,480 Speaker 1: want to pay fifteen grand for it, you're just making cash, 950 00:52:10,640 --> 00:52:13,640 Speaker 1: right, um. And primarily the way he does that is 951 00:52:13,680 --> 00:52:16,360 Speaker 1: through friends in the effective altruism community who are like 952 00:52:16,520 --> 00:52:18,920 Speaker 1: placed in banks and stuff over in Asia, who like 953 00:52:19,200 --> 00:52:21,319 Speaker 1: help him figure out how to do this. I'm not 954 00:52:21,360 --> 00:52:24,160 Speaker 1: going to go into details. They— I mean, this fucking 955 00:52:24,160 --> 00:52:27,560 Speaker 1: dick writing article spends a long time explaining how he 956 00:52:27,640 --> 00:52:30,240 Speaker 1: figures this out, and it might even have been legal.
957 00:52:30,280 --> 00:52:31,799 Speaker 1: He may not have broken the law to do this, 958 00:52:31,840 --> 00:52:33,640 Speaker 1: although it's kind of hard to know, because all of 959 00:52:33,680 --> 00:52:38,040 Speaker 1: this is complicated finance gibberish, um. By the way, honestly, 960 00:52:38,080 --> 00:52:40,759 Speaker 1: I'm not getting a word of this. Finance guys call 961 00:52:40,840 --> 00:52:43,239 Speaker 1: this an arbitrage, and it's basically the idea that if 962 00:52:43,280 --> 00:52:46,160 Speaker 1: you can, if there's a resource that's worth a bunch 963 00:52:46,239 --> 00:52:48,360 Speaker 1: more money in one place than it is in another place, 964 00:52:48,640 --> 00:52:50,320 Speaker 1: you buy it where it's cheap, and you sell it 965 00:52:50,360 --> 00:52:52,920 Speaker 1: where it's expensive, and you get free money. Right, that 966 00:52:53,000 --> 00:53:00,719 Speaker 1: makes sense? Yeah? Yeah. If, if my drug dealer sells 967 00:53:00,800 --> 00:53:06,040 Speaker 1: me ketamine for like forty bucks a gram, right, and, 968 00:53:06,760 --> 00:53:10,080 Speaker 1: I don't know, at a house party you're hanging out 969 00:53:10,120 --> 00:53:13,000 Speaker 1: at a couple of blocks away, somebody says that 970 00:53:13,000 --> 00:53:16,040 Speaker 1: they'd pay seventy dollars a gram for ketamine, you can 971 00:53:16,080 --> 00:53:18,839 Speaker 1: make thirty free bucks by taking the ketamine you bought 972 00:53:18,840 --> 00:53:22,600 Speaker 1: for forty bucks and selling it at that other house party. Right. Okay, 973 00:53:22,680 --> 00:53:24,880 Speaker 1: so that was you speaking to me in terms that 974 00:53:24,920 --> 00:53:27,879 Speaker 1: I would understand. But I do think I got it. Yeah, yeah, 975 00:53:28,160 --> 00:53:31,719 Speaker 1: ketamine makes everything make sense. So he just gave it 976 00:53:31,760 --> 00:53:34,080 Speaker 1: to me.
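The arbitrage being described, whether it's the kimchi premium or the ketamine version, is the same arithmetic: buy where the price is low, sell where it's high, pocket the spread. A sketch using the rough prices mentioned in the episode (real kimchi-premium trades also had to clear fees, transfer limits, and currency conversion, which this ignores):

```python
def arbitrage_profit(buy_price, sell_price, quantity, fees=0.0):
    """Profit from buying in the cheap market and selling in the
    expensive one, minus any transaction costs."""
    return (sell_price - buy_price) * quantity - fees

# Roughly the 2017 kimchi premium: ~$10k in the US vs ~$15k in Korea.
print(arbitrage_profit(10_000, 15_000, quantity=1))  # 5000.0 per coin
# The ketamine version: bought at $40 a gram, sold at $70 a gram.
print(arbitrage_profit(40, 70, quantity=1))          # 30.0
```

The catch, as the episode notes, is that the spread only persists because most people can't legally execute both legs of the trade; anyone who can is "picking up a pile of free money" until the markets reconnect.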
In Robert-speak. I got it, I got it. 977 00:53:34,120 --> 00:53:36,239 Speaker 1: He makes, he makes a shitload of money doing this, 978 00:53:36,320 --> 00:53:39,120 Speaker 1: and he decides to roll this money into a company 979 00:53:39,200 --> 00:53:41,719 Speaker 1: which will allow him to hire employees to gamble with 980 00:53:41,760 --> 00:53:45,000 Speaker 1: cryptocurrency at scale, to try and find different fucked up 981 00:53:45,000 --> 00:53:46,560 Speaker 1: little areas like this where they can make a bunch 982 00:53:46,560 --> 00:53:49,880 Speaker 1: of money by executing trades. He picks members of the 983 00:53:49,920 --> 00:53:53,920 Speaker 1: EA community as his first employees, including Caroline Ellison, 984 00:53:54,040 --> 00:53:56,600 Speaker 1: a former coworker at Jane Street who we 985 00:53:56,640 --> 00:53:59,319 Speaker 1: will be talking about in a little bit, and Nishad Singh, 986 00:53:59,400 --> 00:54:03,720 Speaker 1: a former Facebook employee. Industrial scale dick writer Adam Fisher 987 00:54:03,800 --> 00:54:06,719 Speaker 1: lets you know that Singh is an incredible, almost impossibly 988 00:54:06,760 --> 00:54:09,920 Speaker 1: good human being by describing him this way. He often 989 00:54:09,920 --> 00:54:12,520 Speaker 1: wears a T-shirt with the words Compassionate to the 990 00:54:12,560 --> 00:54:16,239 Speaker 1: core printed in diminutive all lower case font over his heart. 991 00:54:17,120 --> 00:54:19,960 Speaker 1: Do you, do you think that this guy, this writer, 992 00:54:20,239 --> 00:54:22,400 Speaker 1: like, just— at this point he's on like a 993 00:54:22,480 --> 00:54:27,200 Speaker 1: bucking bronco on SBF's dick. Like, it's just, absolutely, he is, 994 00:54:27,440 --> 00:54:30,359 Speaker 1: he is, he is fifty percent this guy's dick by weight. 995 00:54:30,560 --> 00:54:33,120 Speaker 1: It is unbelievable. What does he think is going to 996 00:54:33,239 --> 00:54:36,200 Speaker 1: happen for him?
He's got to get paid by Sequoia 997 00:54:36,280 --> 00:54:38,879 Speaker 1: to write a ten thousand word article that they then 998 00:54:38,960 --> 00:54:41,200 Speaker 1: pull from their website when it becomes clear this man 999 00:54:41,320 --> 00:54:45,720 Speaker 1: is a massive financial criminal. He's like, no, my greatest work! 1000 00:54:47,719 --> 00:54:50,680 Speaker 1: It's extremely funny. No, look, I don't know Singh. But 1001 00:54:50,760 --> 00:54:53,120 Speaker 1: based on the description that guy gives, I am convinced 1002 00:54:53,160 --> 00:54:55,279 Speaker 1: he's murdered a child with his bare hands. And that 1003 00:54:55,440 --> 00:54:58,160 Speaker 1: is my head canon for this man. Um. No one 1004 00:54:58,200 --> 00:55:02,160 Speaker 1: else would wear that shirt. So these, these EA 1005 00:55:02,360 --> 00:55:06,239 Speaker 1: nerds all form a trading firm called Alameda, um, and 1006 00:55:06,760 --> 00:55:09,160 Speaker 1: in doing so, they came down on one side of 1007 00:55:09,280 --> 00:55:12,319 Speaker 1: probably the biggest split within the crypto community.
See, the 1008 00:55:12,360 --> 00:55:15,000 Speaker 1: core of the idea that's not bad that exists within 1009 00:55:15,040 --> 00:55:19,400 Speaker 1: cryptocurrency is that centralized, state controlled money 1010 00:55:19,600 --> 00:55:23,839 Speaker 1: like, has problems, right, you know, that there's things 1011 00:55:23,840 --> 00:55:26,520 Speaker 1: about that that are bad, um. And it could be 1012 00:55:26,600 --> 00:55:28,879 Speaker 1: cool and useful to be able to separate the money 1013 00:55:28,960 --> 00:55:30,880 Speaker 1: from the state if you could do that right, if 1014 00:55:30,920 --> 00:55:32,920 Speaker 1: you could do that in a way that reduced the 1015 00:55:32,960 --> 00:55:36,160 Speaker 1: state's power to, like, you know, just lock down the 1016 00:55:36,160 --> 00:55:38,959 Speaker 1: bank accounts of dissidents and stuff like that. There's, there's 1017 00:55:39,000 --> 00:55:41,839 Speaker 1: cool benefits to it. And what... I just had an idea: 1018 00:55:41,920 --> 00:55:43,520 Speaker 1: what if we did that, but then we gave all 1019 00:55:43,560 --> 00:55:46,799 Speaker 1: the money to one guy? Well, that's kind of what 1020 00:55:46,880 --> 00:55:51,440 Speaker 1: keeps happening. Um. So. But also, like, SBF's on the 1021 00:55:51,440 --> 00:55:54,920 Speaker 1: other side of this argument, right, because obviously most of 1022 00:55:54,960 --> 00:55:58,399 Speaker 1: the actual benefits of a truly decentralized online currency are 1023 00:55:58,480 --> 00:56:00,440 Speaker 1: just, you can buy drugs with it over the internet. 1024 00:56:00,560 --> 00:56:03,000 Speaker 1: But still, that is a real value. People do in 1025 00:56:03,080 --> 00:56:07,120 Speaker 1: fact buy drugs using cryptocurrency, and that's fine, um. And 1026 00:56:07,160 --> 00:56:12,200 Speaker 1: the committed ideological crypto people tend to keep their money 1027 00:56:12,239 --> 00:56:15,520 Speaker 1: offline in a wallet only they can access, right.
So 1028 00:56:15,560 --> 00:56:18,200 Speaker 1: you basically, you have like a hard drive that has 1029 00:56:18,280 --> 00:56:20,720 Speaker 1: all of your crypto on it, and that, that only 1030 00:56:20,760 --> 00:56:23,200 Speaker 1: touches the Internet when you plug that into your computer 1031 00:56:23,280 --> 00:56:25,520 Speaker 1: and you use it to make a transaction, right, and 1032 00:56:25,600 --> 00:56:29,040 Speaker 1: otherwise it's completely offline, and so people can't just take 1033 00:56:29,080 --> 00:56:33,719 Speaker 1: it from you, right. Right, those are the smart people. This 1034 00:56:33,800 --> 00:56:36,200 Speaker 1: is a pain in the ass though, right? Like, keeping 1035 00:56:36,200 --> 00:56:38,000 Speaker 1: it in this thing, like, there's all these security steps, you 1036 00:56:38,040 --> 00:56:40,280 Speaker 1: can lose your password, people actually do lose their money 1037 00:56:40,280 --> 00:56:44,800 Speaker 1: this way too. But anyway, it's, it's... there's a measure 1038 00:56:44,840 --> 00:56:47,160 Speaker 1: to which it, it makes sense and is secure, but 1039 00:56:47,320 --> 00:56:49,080 Speaker 1: most people don't want to go through that pain in 1040 00:56:49,120 --> 00:56:52,160 Speaker 1: the ass, so they put all of their cryptocurrency 1041 00:56:52,239 --> 00:56:55,879 Speaker 1: in what are effectively crypto banks, these exchanges, and these 1042 00:56:55,880 --> 00:56:58,640 Speaker 1: exchanges are places like Mt. Gox, which a few years 1043 00:56:58,640 --> 00:57:01,000 Speaker 1: ago all of the money got stolen from, and 1044 00:57:01,040 --> 00:57:03,759 Speaker 1: FTX, which Sam Bankman-Fried makes, which also all 1045 00:57:03,800 --> 00:57:06,400 Speaker 1: of the money gets stolen from. Right. So the people 1046 00:57:06,400 --> 00:57:08,839 Speaker 1: who are like, no, you shouldn't do what Sam is doing.
1047 00:57:08,880 --> 00:57:11,560 Speaker 1: You shouldn't make an exchange because that's not decentralized, and 1048 00:57:11,560 --> 00:57:14,120 Speaker 1: we like this because it's decentralized. They are the ones 1049 00:57:14,160 --> 00:57:17,800 Speaker 1: who get robbed less often, because, because they're a little 1050 00:57:17,840 --> 00:57:21,480 Speaker 1: smarter, um. And that's part of the point. These exchanges, 1051 00:57:21,600 --> 00:57:26,040 Speaker 1: they're meant for people who don't see crypto as, like, well, 1052 00:57:26,080 --> 00:57:28,520 Speaker 1: I, I want to fight the state by removing my 1053 00:57:28,600 --> 00:57:31,600 Speaker 1: money from the banking system. They're for people who are 1054 00:57:31,640 --> 00:57:34,520 Speaker 1: like, I want to try to get rich quick by gambling, right, 1055 00:57:34,760 --> 00:57:37,160 Speaker 1: and that's why those people are also the most vulnerable 1056 00:57:37,200 --> 00:57:40,560 Speaker 1: to scams. Um. However, anytime you explain crypto 1057 00:57:40,640 --> 00:57:43,680 Speaker 1: to me, even though I know I need to understand 1058 00:57:43,720 --> 00:57:45,880 Speaker 1: it for the context of the episode, I just feel 1059 00:57:45,880 --> 00:57:48,400 Speaker 1: like I'm in a corner at a house party and 1060 00:57:48,440 --> 00:57:51,080 Speaker 1: I'm holding a clammy Bud Light and it's mostly empty, 1061 00:57:51,120 --> 00:57:54,440 Speaker 1: and you're like, just one more thing though. Because, you know, 1062 00:57:54,560 --> 00:57:56,320 Speaker 1: I mean, the important thing to understand is that what 1063 00:57:56,440 --> 00:58:00,440 Speaker 1: Sam has done... So crypto is, like, unregulated, right? 1064 00:58:00,520 --> 00:58:03,560 Speaker 1: It is detached from any state. Okay, which is why 1065 00:58:03,640 --> 00:58:06,600 Speaker 1: people like it. Which is why people like it.
Yes. 1066 00:58:06,760 --> 00:58:08,840 Speaker 1: Sam has... what Sam has done is come in, and, 1067 00:58:08,880 --> 00:58:11,080 Speaker 1: he's not the first person to do this, and said, 1068 00:58:11,360 --> 00:58:13,360 Speaker 1: I have built a place where you can keep all 1069 00:58:13,400 --> 00:58:16,000 Speaker 1: of your crypto and you can trade it with other 1070 00:58:16,080 --> 00:58:18,440 Speaker 1: crypto to try to make money the same way people 1071 00:58:18,440 --> 00:58:22,560 Speaker 1: do with the stock market, right? Where it is more secure? No, 1072 00:58:23,240 --> 00:58:26,160 Speaker 1: because here's the thing, Jamie. He says it's secure. But 1073 00:58:26,240 --> 00:58:28,520 Speaker 1: here's the thing. So you know how the banking system, 1074 00:58:28,560 --> 00:58:31,080 Speaker 1: how banks used to just go bust and everyone would 1075 00:58:31,080 --> 00:58:33,720 Speaker 1: lose their money, and it caused the Great Depression? You 1076 00:58:33,760 --> 00:58:36,520 Speaker 1: get that part of the history of finance, right? Oh yeah, 1077 00:58:36,560 --> 00:58:39,440 Speaker 1: one of the characters from Titanic killed themselves over that. 1078 00:58:39,560 --> 00:58:42,440 Speaker 1: Exactly, exactly. So that was a big problem, and we 1079 00:58:42,520 --> 00:58:48,320 Speaker 1: developed a bunch of regulations so that wouldn't happen. So what Sam 1080 00:58:48,360 --> 00:58:50,160 Speaker 1: has done is he's built a bank that has none 1081 00:58:50,160 --> 00:58:55,880 Speaker 1: of that, so that people can gamble on the internet. 1082 00:58:53,360 --> 00:58:58,040 Speaker 1: So yeah, got it. Exactly. That's all you need to understand, 1083 00:58:58,120 --> 00:59:00,200 Speaker 1: is that Sam has built a big unregulated 1084 00:59:00,280 --> 00:59:05,080 Speaker 1: bank for people to gamble with. Um. Yeah, anyway, um, 1085 00:59:05,120 --> 00:59:07,000 Speaker 1: and this is, this is... it is...
It could not 1086 00:59:07,080 --> 00:59:09,520 Speaker 1: have been clearer that this was a Ponzi scheme. In 1087 00:59:09,560 --> 00:59:12,360 Speaker 1: twenty eighteen they put out an advertisement to investors, and 1088 00:59:12,400 --> 00:59:14,400 Speaker 1: I'm going to read it right now. I think you'll 1089 00:59:14,440 --> 00:59:20,640 Speaker 1: be like... We offer one investment product, annualized fixed rate loans. 1090 00:59:20,920 --> 00:59:23,000 Speaker 1: We can accept both fiat and crypto and can pay 1091 00:59:23,040 --> 00:59:27,200 Speaker 1: interest denominated in either. Um. These loans have no downside. 1092 00:59:27,320 --> 00:59:30,400 Speaker 1: We guarantee full payment of the principal and interest, enforceable under 1093 00:59:30,480 --> 00:59:33,080 Speaker 1: US law and established by all parties' legal counsel. We 1094 00:59:33,120 --> 00:59:35,560 Speaker 1: are extremely confident we will pay this amount in the 1095 00:59:35,640 --> 00:59:38,320 Speaker 1: unlikely case where we lose more than two percent. Anyway. Again, 1096 00:59:38,880 --> 00:59:42,640 Speaker 1: I'm not a finance expert. Neither are you, Jamie. Banks 1097 00:59:42,720 --> 00:59:47,640 Speaker 1: offer like three percent return on, like, a fucking, uh... 1098 00:59:47,680 --> 00:59:49,200 Speaker 1: If you're, like, putting a pile of cash in a 1099 00:59:49,240 --> 00:59:52,360 Speaker 1: bank and you're getting three percent back, you're doing okay. 1100 00:59:53,520 --> 00:59:57,880 Speaker 1: This is nonsense that nobody, no one can guarantee. It's not 1101 00:59:58,040 --> 01:00:01,480 Speaker 1: a real product. I was pretty struck by their use 1102 01:00:01,520 --> 01:00:05,000 Speaker 1: of the language, which, like, we're pretty confident we're gonna 1103 01:00:05,040 --> 01:00:06,880 Speaker 1: be able to pay it... It just 1104 01:00:06,920 --> 01:00:09,480 Speaker 1: sounds like someone who is not actually very confident.
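The guaranteed-return pitch in that ad can be sanity-checked with trivial compound-interest arithmetic. A minimal sketch, assuming a hypothetical deposit and comparing the ordinary ~3 percent bank rate mentioned above against the ad's promised rate (the 2018 Alameda pitch was widely reported as guaranteeing 15 percent annualized; both figures are used here purely for illustration):

```python
# Why a "guaranteed, no-downside" high rate is a red flag: compare ten
# years of ~3% compounding (roughly what the episode says a bank pays)
# against the promised 15%. The $10,000 deposit is hypothetical.
principal = 10_000

bank = principal * (1.03 ** 10)     # ordinary savings-style growth
promised = principal * (1.15 ** 10) # the "no downside" promised rate

print(round(bank))      # 13439
print(round(promised))  # 40456
```

A return that far above the risk-free rate, guaranteed with "no downside," is exactly the profile of promise that regulated institutions are legally barred from making.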
And 1105 01:00:09,520 --> 01:00:11,400 Speaker 1: also, if this is... if you are, if you are 1106 01:00:11,480 --> 01:00:14,040 Speaker 1: investing in a market, in the stock market, right, and 1107 01:00:14,120 --> 01:00:17,400 Speaker 1: someone says this investment has no downside, it's impossible for 1108 01:00:17,440 --> 01:00:20,560 Speaker 1: it to fail, that person's lying to you and breaking 1109 01:00:20,640 --> 01:00:24,520 Speaker 1: the law, because it could and often does. Oh right, 1110 01:00:24,560 --> 01:00:27,200 Speaker 1: because then you're just, you're, you're making an almost guaranteed 1111 01:00:27,200 --> 01:00:30,120 Speaker 1: false promise. Yes. You cannot do... you cannot say this 1112 01:00:30,240 --> 01:00:33,080 Speaker 1: stock can't go down. Right. If you're a stockbroker, you 1113 01:00:33,080 --> 01:00:36,000 Speaker 1: cannot tell a client it is impossible for your investment 1114 01:00:36,040 --> 01:00:39,680 Speaker 1: in this company to fail, because that would be a crime. Um. 1115 01:00:39,720 --> 01:00:45,360 Speaker 1: But with crypto... crypto, that's uncharted territory. You can 1116 01:00:45,440 --> 01:00:48,280 Speaker 1: do anything. This is also a Ponzi scheme, right? What, 1117 01:00:48,280 --> 01:00:51,240 Speaker 1: what, what fucking Madoff was doing is he had 1118 01:00:51,280 --> 01:00:54,200 Speaker 1: this investment portfolio that was, I forget the exact numbers, 1119 01:00:54,280 --> 01:00:57,720 Speaker 1: but it was promising an unbelievably high return and guaranteeing 1120 01:00:57,720 --> 01:00:59,680 Speaker 1: that people would get it, right. And what he was 1121 01:00:59,720 --> 01:01:02,600 Speaker 1: doing is, as new people put money into the investment, 1122 01:01:02,840 --> 01:01:06,240 Speaker 1: he was using their money to pay the old investors, 1123 01:01:06,240 --> 01:01:08,840 Speaker 1: so that nobody noticed that things were fucking up.
But 1124 01:01:08,880 --> 01:01:11,840 Speaker 1: eventually new people stopped putting money into the thing, and 1125 01:01:11,840 --> 01:01:13,480 Speaker 1: it all fell apart, and a lot of people lost 1126 01:01:13,520 --> 01:01:18,120 Speaker 1: billions of dollars, right? Like, yeah, he was also using 1127 01:01:18,160 --> 01:01:20,439 Speaker 1: a lot of that money to live, you know, an 1128 01:01:20,480 --> 01:01:24,000 Speaker 1: incredibly lavish rich guy life. Anyway. Sorry, I only understand 1129 01:01:24,080 --> 01:01:27,160 Speaker 1: financial concepts when Selena Gomez breaks the fourth wall and 1130 01:01:27,200 --> 01:01:30,360 Speaker 1: explains it to me. I'm doing my best to be 1131 01:01:30,480 --> 01:01:35,439 Speaker 1: your Selena Gomez. But do you know who Selena Gomez is? Yeah, 1132 01:01:35,480 --> 01:01:38,960 Speaker 1: she's that chick from the thing, Nailed It. She's, she's 1133 01:01:39,040 --> 01:01:40,880 Speaker 1: kind of in the middle of a fun scandal right 1134 01:01:40,920 --> 01:01:44,760 Speaker 1: now where she's in a feud with someone who donated 1135 01:01:44,760 --> 01:01:49,040 Speaker 1: her a kidney. What? Which is a very funny online 1136 01:01:49,080 --> 01:01:54,560 Speaker 1: feud, feud with that person, because she... Okay, if you 1137 01:01:54,680 --> 01:01:57,440 Speaker 1: asked... Robert, this is something I can explain to you. 1138 01:01:58,320 --> 01:02:02,160 Speaker 1: So, so what happened was Selena Gomez needed a 1139 01:02:02,200 --> 01:02:06,040 Speaker 1: kidney donated. Her close friend, who works in the industry, 1140 01:02:06,040 --> 01:02:07,320 Speaker 1: I don't know who it was, but I guess she's 1141 01:02:07,320 --> 01:02:09,920 Speaker 1: like sort of famous, gave her a kidney a couple 1142 01:02:09,920 --> 01:02:12,640 Speaker 1: of years ago.
Then Selena Gomez turns around a couple 1143 01:02:12,640 --> 01:02:14,320 Speaker 1: of weeks ago and says, I have no friends in 1144 01:02:14,360 --> 01:02:18,800 Speaker 1: the industry except Taylor Swift. In comes her kidney donor being like, oh, 1145 01:02:18,880 --> 01:02:22,440 Speaker 1: that's interesting, because I'm one kidney lighter... like, I'm 1146 01:02:22,480 --> 01:02:25,640 Speaker 1: not quoting it, but... and then, and then Selena, instead 1147 01:02:25,680 --> 01:02:29,400 Speaker 1: of apologizing, Selena Gomez is like, oh, sorry I didn't 1148 01:02:29,480 --> 01:02:32,320 Speaker 1: thank every person I've ever met in my life. Yeah, 1149 01:02:32,320 --> 01:02:35,680 Speaker 1: but I mean, Selena, she did give you a kidney. Yeah, 1150 01:02:36,960 --> 01:02:39,120 Speaker 1: that's a special kind of friend. You guys weren't just 1151 01:02:39,240 --> 01:02:43,800 Speaker 1: like drinking buddies. She literally gave you a kidney. Um, 1152 01:02:43,880 --> 01:02:45,840 Speaker 1: and she did, she did the thing that you 1153 01:02:45,880 --> 01:02:48,680 Speaker 1: would use as, like, a joking description of someone who's 1154 01:02:48,720 --> 01:02:52,439 Speaker 1: done a lot for you, to, like, like, hyperbolically say 1155 01:02:52,440 --> 01:02:55,480 Speaker 1: that you owed them a lot, you know, 1156 01:02:55,960 --> 01:02:58,520 Speaker 1: and implying that Taylor Swift has done more for you 1157 01:02:58,640 --> 01:03:02,360 Speaker 1: than your kidney donor. So, I just think it's the 1158 01:03:02,360 --> 01:03:04,640 Speaker 1: funniest feud of all time. Alright, we were talking about 1159 01:03:04,680 --> 01:03:09,360 Speaker 1: cryptocurrency. We weren't. Um, we were talking about... let's, 1160 01:03:09,600 --> 01:03:14,240 Speaker 1: let's get back to cryptocurrency.
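The Madoff mechanics described a minute ago, before the kidney tangent, reduce to one move: statements grow on paper while payouts quietly come out of new deposits. A toy sketch of that bookkeeping, with every name, rate, and amount invented for illustration:

```python
# Toy Ponzi bookkeeping: promised "returns" accrue only on statements;
# the fund's real cash is just the sum of deposits. All names and
# numbers here are invented.
promised_rate = 0.10  # hypothetical promised 10% per period

balances = {}  # what each investor believes they hold
cash = 0.0     # what the fund actually holds

def deposit(name, amount):
    global cash
    balances[name] = balances.get(name, 0.0) + amount
    cash += amount  # the only real money that ever comes in

def accrue():
    # Everyone's statement grows; no actual money is earned.
    for name in balances:
        balances[name] *= 1 + promised_rate

deposit("alice", 100)
accrue()
deposit("bob", 100)  # Bob's cash is what would cover Alice's "gains"
accrue()

on_paper = sum(balances.values())
print(round(on_paper, 2), round(cash, 2))  # 231.0 200.0
```

The gap between the paper total and real cash only grows, which is why the whole thing holds together exactly as long as withdrawals stay smaller than new deposits.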
So to talk about what 1161 01:03:14,440 --> 01:03:19,320 Speaker 1: came next after establishing Alameda, um, I'm going to quote 1162 01:03:19,320 --> 01:03:22,880 Speaker 1: again from that Sequoia write-up. At this point, mid 1163 01:03:22,880 --> 01:03:25,880 Speaker 1: two thousand nineteen, SBF decided to double down again and 1164 01:03:25,920 --> 01:03:29,200 Speaker 1: scratch his own itch. He would bet Alameda's multimillion dollar 1165 01:03:29,280 --> 01:03:32,040 Speaker 1: trading profits on a new venture, a trading exchange called 1166 01:03:32,120 --> 01:03:35,240 Speaker 1: FTX. It would combine Coinbase's solid, 1167 01:03:35,360 --> 01:03:38,320 Speaker 1: regulation loving approach with the kinds of derivatives being offered 1168 01:03:38,360 --> 01:03:40,560 Speaker 1: by Binance and others. He only gave himself a 1169 01:03:40,600 --> 01:03:44,000 Speaker 1: twenty percent chance of success, but in his mind, SBF needed 1170 01:03:44,040 --> 01:03:47,040 Speaker 1: extreme risk to maximize the expected value of his lifetime 1171 01:03:47,040 --> 01:03:49,960 Speaker 1: earnings, and therefore the good his earn to give strategy 1172 01:03:50,040 --> 01:03:51,960 Speaker 1: could do. The fact that he was, by his own 1173 01:03:52,040 --> 01:03:55,160 Speaker 1: lights, overwhelmingly likely to fail was beside the point. The 1174 01:03:55,200 --> 01:03:58,160 Speaker 1: point was this: when SBF multiplied out the billions of 1175 01:03:58,200 --> 01:04:00,840 Speaker 1: dollars a year a successful cryptocurrency exchange could 1176 01:04:00,840 --> 01:04:04,280 Speaker 1: throw off by his self assessed chance of successfully building one, 1177 01:04:04,480 --> 01:04:07,480 Speaker 1: the number was still huge. That's the expected value.
And 1178 01:04:07,520 --> 01:04:09,480 Speaker 1: if you live your life according to the same principles 1179 01:04:09,520 --> 01:04:11,960 Speaker 1: by which you trade an asset, there's only one way forward: 1180 01:04:12,160 --> 01:04:15,080 Speaker 1: you calculate the expected values, then aim for the largest one, 1181 01:04:15,160 --> 01:04:18,160 Speaker 1: because in one, but just one, alternate future universe, everything 1182 01:04:18,200 --> 01:04:21,439 Speaker 1: works out fabulously. To maximize your expected value, you must 1183 01:04:21,440 --> 01:04:23,880 Speaker 1: aim for it and then march blindly forth, acting as 1184 01:04:23,880 --> 01:04:26,480 Speaker 1: if the fabulously lucky SBF of the future 1185 01:04:26,520 --> 01:04:29,720 Speaker 1: can reach into the other parallel universes and compensate the 1186 01:04:29,760 --> 01:04:33,240 Speaker 1: failson SBFs for their losses. It sounds crazy, perhaps 1187 01:04:33,280 --> 01:04:36,720 Speaker 1: even selfish, but it's not. It's math. It follows the 1188 01:04:36,720 --> 01:04:40,160 Speaker 1: principle of risk neutrality. Yes, it actually is crazy. That's 1189 01:04:40,160 --> 01:04:44,800 Speaker 1: not math. I'm sorry, that's actually not math, but a 1190 01:04:44,920 --> 01:04:49,000 Speaker 1: lot of words all at once. That is like, oh 1191 01:04:49,040 --> 01:04:52,240 Speaker 1: my god, the spiraling logic of this. You are using 1192 01:04:52,720 --> 01:04:55,680 Speaker 1: hundreds of words and high minded bullshit rhetoric to be 1193 01:04:55,760 --> 01:05:00,440 Speaker 1: like, gambling is the best way to make money. He said 1194 01:05:00,680 --> 01:05:05,080 Speaker 1: fabulously three times in one sentence. Yeah, man. You know who 1195 01:05:05,120 --> 01:05:08,440 Speaker 1: I've heard this basic argument from? My friends, drunk in 1196 01:05:08,520 --> 01:05:11,640 Speaker 1: Las Vegas, explaining what they're trying to play at the craps tables.
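The "expected value" arithmetic that Sequoia quote leans on is just probability times payoff. A minimal sketch: the 20 percent chance is SBF's self-assessed figure from the quote above, and the payoff number is a purely hypothetical stand-in for "billions of dollars a year":

```python
# Expected value = probability of success x payoff if it succeeds.
# The 20% is SBF's self-assessed chance from the Sequoia quote; the
# payoff figure is invented for illustration.
p_success = 0.20
payoff_billions = 10  # hypothetical annual payoff if FTX works

ev = p_success * payoff_billions
print(ev)  # 2.0 "expected" billions, even though failure is 4x as likely
```

"Risk neutrality" is the move of treating that 2.0 as if it were a sure thing and betting everything on it, rather than weighting the 80-percent-likely outcome where you end up with nothing; that is the step being mocked here.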
1197 01:05:12,360 --> 01:05:14,760 Speaker 1: Why they don't want me to leave the little horsey game? 1198 01:05:15,280 --> 01:05:17,240 Speaker 1: This is something. Yeah, someone's like, no, don't leave the 1199 01:05:17,240 --> 01:05:20,160 Speaker 1: Sex and the City slot machine. And you're... You want 1200 01:05:20,160 --> 01:05:23,000 Speaker 1: to, you want to hear my mathematical thinking on gambling, 1201 01:05:23,080 --> 01:05:25,720 Speaker 1: Jamie Loftus? Because this will make more sense than anything 1202 01:05:25,760 --> 01:05:28,040 Speaker 1: that happens in the Sequoia article. Okay. When I go 1203 01:05:28,120 --> 01:05:30,560 Speaker 1: to Las Vegas, I find me the penny slots where 1204 01:05:30,560 --> 01:05:33,120 Speaker 1: I can see the most waitresses walking around with those 1205 01:05:33,160 --> 01:05:35,040 Speaker 1: little trays that they have the drinks and stuff on. 1206 01:05:35,520 --> 01:05:38,600 Speaker 1: Then I sit down at them, and I don't start 1207 01:05:38,640 --> 01:05:40,440 Speaker 1: to play until one gets close to me, and then 1208 01:05:40,560 --> 01:05:42,400 Speaker 1: I press the button as soon as she walks past, 1209 01:05:42,480 --> 01:05:45,040 Speaker 1: and I, like, catch her attention, and then I get 1210 01:05:45,040 --> 01:05:47,680 Speaker 1: a free drink. And the way that it works out 1211 01:05:47,840 --> 01:05:50,720 Speaker 1: is that as long as I can get more free 1212 01:05:50,760 --> 01:05:53,680 Speaker 1: drinks than I'm spending at the penny slots, and mostly 1213 01:05:53,720 --> 01:05:55,640 Speaker 1: I'm just reading a book and hiding it while I'm 1214 01:05:55,680 --> 01:05:57,200 Speaker 1: at the penny slots and only press it when the 1215 01:05:57,240 --> 01:06:00,520 Speaker 1: waitress gets near, like the other Vegas guys, I can 1216 01:06:00,600 --> 01:06:03,360 Speaker 1: drink effectively for free.
Right, it works out to be 1217 01:06:03,400 --> 01:06:06,000 Speaker 1: like twenty five cents a drink if you're really smart 1218 01:06:06,040 --> 01:06:09,400 Speaker 1: about it. That is my financial advice to all of you. 1219 01:06:09,880 --> 01:06:12,400 Speaker 1: I do, I do respect that. How you make it 1220 01:06:12,440 --> 01:06:14,720 Speaker 1: even better: I used to go with a bag, because 1221 01:06:14,760 --> 01:06:16,480 Speaker 1: when I was poor, what I would do is I 1222 01:06:16,480 --> 01:06:18,640 Speaker 1: would go to Vegas once every couple of years, usually 1223 01:06:18,640 --> 01:06:20,680 Speaker 1: for work, and I would get all the free drinks, 1224 01:06:20,680 --> 01:06:22,560 Speaker 1: which came in glasses a lot of the time, and 1225 01:06:22,560 --> 01:06:24,520 Speaker 1: I would keep the glasses, and so I was able 1226 01:06:24,560 --> 01:06:28,400 Speaker 1: to furnish my apartment with stolen Las Vegas glasses. Love 1227 01:06:28,480 --> 01:06:32,520 Speaker 1: stealing glasses. I've stolen my fair share of glasses. And I 1228 01:06:32,560 --> 01:06:34,120 Speaker 1: would get up to the floors where there were the 1229 01:06:34,200 --> 01:06:36,479 Speaker 1: nicer hotel rooms, and I would find all the people 1230 01:06:36,520 --> 01:06:39,280 Speaker 1: who set out their plates and stuff from, like, room service, 1231 01:06:39,280 --> 01:06:41,120 Speaker 1: and I would just take those and take them back 1232 01:06:41,160 --> 01:06:46,360 Speaker 1: to my house. I respect that. I'm trying to think... 1233 01:06:46,400 --> 01:06:48,480 Speaker 1: I was like, I don't have a real system for... Well, actually, 1234 01:06:48,600 --> 01:06:50,280 Speaker 1: when I go to Vegas, I always stay at the 1235 01:06:50,280 --> 01:06:52,440 Speaker 1: hotel with a roller coaster on top. And here's what 1236 01:06:52,520 --> 01:06:56,000 Speaker 1: I do: I go on the roller coaster once, sometimes twice.
1237 01:06:56,320 --> 01:06:59,320 Speaker 1: Then I go see one show. Usually it's horrible. The 1238 01:06:59,400 --> 01:07:02,400 Speaker 1: last time it was to see the Backstreet Boys. And guess what 1239 01:07:02,480 --> 01:07:04,080 Speaker 1: I found out? One of the Backstreet Boys is in 1240 01:07:04,240 --> 01:07:07,200 Speaker 1: QAnon. And then I got bumped up from... this, 1241 01:07:08,200 --> 01:07:11,560 Speaker 1: it was a real... The point I want to make, Jamie, 1242 01:07:11,680 --> 01:07:13,920 Speaker 1: is that what I just described to you, my Vegas 1243 01:07:13,960 --> 01:07:18,080 Speaker 1: strategy, has made me infinitely more money in net 1244 01:07:18,120 --> 01:07:21,400 Speaker 1: profit than Samuel Bankman-Fried is actually going to make 1245 01:07:21,440 --> 01:07:29,040 Speaker 1: in crypto. Oh, fun foreshadowing. Um. So, never hang 1246 01:07:29,080 --> 01:07:31,760 Speaker 1: out near waitresses, though, because he's probably afraid of women. 1247 01:07:31,800 --> 01:07:34,000 Speaker 1: He's probably afraid of them. But I am fine with 1248 01:07:34,040 --> 01:07:36,920 Speaker 1: asking women for a free drink as long as I'm 1249 01:07:36,920 --> 01:07:39,560 Speaker 1: paying at the penny slots, you know, so it's not weird. Wow. 1250 01:07:39,720 --> 01:07:44,280 Speaker 1: Feminist icon, feminist icon Robert Evans, getting those, getting those 1251 01:07:44,320 --> 01:07:47,360 Speaker 1: shitty Vegas Irish coffees. They come in the glasses I 1252 01:07:47,400 --> 01:07:51,480 Speaker 1: want. They're nasty, but I like the glasses they used 1253 01:07:51,520 --> 01:07:53,400 Speaker 1: to come in. They do. I know, but then 1254 01:07:53,440 --> 01:07:56,480 Speaker 1: there's the consequence of having to drink what's inside. Well, 1255 01:07:56,520 --> 01:07:59,360 Speaker 1: you know, Jamie, that's why they call me a hero. Um.
1256 01:08:00,000 --> 01:08:02,720 Speaker 1: But the Mahatma Gandhi of the West, they call me. 1257 01:08:02,800 --> 01:08:05,959 Speaker 1: The Jesus Christ of podcasting. These are all things people 1258 01:08:06,440 --> 01:08:11,440 Speaker 1: say. If you're gonna lie, make it realistic, James. So, James... Anyway, 1259 01:08:11,520 --> 01:08:17,040 Speaker 1: let's continue. Um. I'm like, yeah, I think it's hard 1260 01:08:17,080 --> 01:08:20,040 Speaker 1: to justify being risk averse on your own personal 1261 01:08:20,120 --> 01:08:22,400 Speaker 1: impact, SBF told me when I quizzed him about it, 1262 01:08:22,600 --> 01:08:25,280 Speaker 1: unless you're doing it for personal reasons. In other words, 1263 01:08:25,439 --> 01:08:27,880 Speaker 1: it's selfish not to go for broke if you're planning 1264 01:08:27,880 --> 01:08:30,599 Speaker 1: on giving it all away in the end. Anyway, again, 1265 01:08:30,960 --> 01:08:36,000 Speaker 1: just clown shit. So, all of this is a con. Spoilers, 1266 01:08:36,000 --> 01:08:37,479 Speaker 1: so you don't have to think that much about it: 1267 01:08:37,680 --> 01:08:40,720 Speaker 1: in a recent series of text messages with a Vox journalist, 1268 01:08:40,800 --> 01:08:43,640 Speaker 1: after his entire exchange exploded and everyone found out he 1269 01:08:43,680 --> 01:08:47,320 Speaker 1: was a financial criminal, Sam Bankman-Fried more or less 1270 01:08:47,360 --> 01:08:50,360 Speaker 1: admitted that everything he'd had to say about effective altruism 1271 01:08:50,560 --> 01:08:52,760 Speaker 1: was a con meant to get people to trust him 1272 01:08:52,760 --> 01:08:54,439 Speaker 1: and invest in his company.
And I'm gonna read the 1273 01:08:54,520 --> 01:08:57,320 Speaker 1: texts... yeah, I'm gonna read the texts to you, between 1274 01:08:57,360 --> 01:08:59,360 Speaker 1: him and this journalist, who, by the way, he put 1275 01:08:59,400 --> 01:09:01,800 Speaker 1: money into Vox, so he helped fund this journalist. 1276 01:09:03,840 --> 01:09:05,800 Speaker 1: So the ethics stuff, this is the journalist: so the 1277 01:09:05,800 --> 01:09:08,640 Speaker 1: ethics stuff, mostly a front? People will like you if you 1278 01:09:08,680 --> 01:09:10,200 Speaker 1: win and hate you if you lose, and that's how 1279 01:09:10,200 --> 01:09:13,599 Speaker 1: it all really works. Sam: Yeah, I mean, that's not 1280 01:09:13,720 --> 01:09:16,320 Speaker 1: all of it, but it's a lot. The worst quadrant 1281 01:09:16,479 --> 01:09:19,719 Speaker 1: is sketchy and lose. The best is win plus question 1282 01:09:19,760 --> 01:09:23,320 Speaker 1: mark question mark question mark. Clean plus lose is bad but 1283 01:09:23,400 --> 01:09:27,280 Speaker 1: not terrible. He also misspells terrible, but whatever. The journalist 1284 01:09:27,280 --> 01:09:29,439 Speaker 1: then replies: you were really good at talking about 1285 01:09:29,439 --> 01:09:31,360 Speaker 1: ethics for someone who kind of saw it all as 1286 01:09:31,360 --> 01:09:34,479 Speaker 1: a game with winners and losers. Yeah, hehe. I 1287 01:09:34,560 --> 01:09:38,360 Speaker 1: had to be. It's what reputations are made of, to 1288 01:09:38,439 --> 01:09:40,960 Speaker 1: some extent. I feel bad for those who get fucked 1289 01:09:41,040 --> 01:09:43,559 Speaker 1: by it, by this dumb game we woke Westerners play, 1290 01:09:43,600 --> 01:09:45,799 Speaker 1: where we all say the right shibboleths and so everyone 1291 01:09:45,840 --> 01:09:48,639 Speaker 1: likes us.
And that's the actual truth here, right? That's 1292 01:09:48,640 --> 01:09:50,640 Speaker 1: the thing that's honest about it. It's like, man, it 1293 01:09:50,760 --> 01:09:53,479 Speaker 1: was all of this talk for everyone. That's the entire... 1294 01:09:53,560 --> 01:09:57,760 Speaker 1: this MacAskill motherfucker. The only reason his effective altruism thing 1295 01:09:57,800 --> 01:10:00,639 Speaker 1: exists as a funded thing is as a fucking shibboleth 1296 01:10:00,680 --> 01:10:03,479 Speaker 1: for billionaires who don't want to pay taxes and 1297 01:10:03,520 --> 01:10:06,040 Speaker 1: want to let the world crumble around them while sucking 1298 01:10:06,080 --> 01:10:07,800 Speaker 1: as much value out of the working class as they 1299 01:10:07,840 --> 01:10:09,840 Speaker 1: can, and want to pretend like they're heroes at the 1300 01:10:09,880 --> 01:10:12,160 Speaker 1: same time, so that people ride their dicks in articles 1301 01:10:12,160 --> 01:10:18,160 Speaker 1: like that fucking Sequoia piece. That absolutely fucking ridiculous text 1302 01:10:18,320 --> 01:10:21,479 Speaker 1: actually is, like, a very important document. It's incredibly important. It's 1303 01:10:21,560 --> 01:10:24,719 Speaker 1: deeply crucial. I hate that there's such a crucial document 1304 01:10:24,760 --> 01:10:28,120 Speaker 1: that also includes hehe in it. Yeah, there's nothing to 1305 01:10:28,120 --> 01:10:31,200 Speaker 1: be done about that one, Jamie. Look, there's... it doesn't 1306 01:10:31,240 --> 01:10:34,400 Speaker 1: feel good, but you know, our previous most important 1307 01:10:34,439 --> 01:10:36,679 Speaker 1: document to that effect was an IM that said 1308 01:10:36,680 --> 01:10:39,000 Speaker 1: ha ha. So a text with hehe is kind 1309 01:10:39,040 --> 01:10:41,920 Speaker 1: of the logical progression. That is fascinating.
Do you have 1310 01:10:42,000 --> 01:10:44,880 Speaker 1: any, like, insight into, like, why he would so freely 1311 01:10:44,920 --> 01:10:47,920 Speaker 1: admit that now? I don't, actually. One of two things 1312 01:10:48,000 --> 01:10:51,000 Speaker 1: has to be happening, because, again, the spoilers: this all 1313 01:10:51,040 --> 01:10:53,719 Speaker 1: falls apart. His exchange goes from worth thirty two billion 1314 01:10:53,760 --> 01:10:57,080 Speaker 1: dollars to worth basically zero dollars in the space of... 1315 01:10:57,120 --> 01:11:01,160 Speaker 1: in my head, anytime you say, like, like, twenty four hours... 1316 01:11:01,160 --> 01:11:05,040 Speaker 1: this happens, his net worth falls in a day. Um, 1317 01:11:05,200 --> 01:11:07,599 Speaker 1: like, is it all because they realized that all 1318 01:11:07,640 --> 01:11:09,519 Speaker 1: of the money was gone? That he'd been taking money 1319 01:11:09,520 --> 01:11:11,640 Speaker 1: from one business and using it to gamble in another, and 1320 01:11:11,680 --> 01:11:13,760 Speaker 1: also to pay him and his friends. And all of 1321 01:11:13,800 --> 01:11:16,160 Speaker 1: the money that the investors had put in, when they 1322 01:11:16,160 --> 01:11:19,000 Speaker 1: tried to withdraw it, like, their money that was 1323 01:11:19,040 --> 01:11:20,880 Speaker 1: supposed to be in there on paper... none of the 1324 01:11:20,920 --> 01:11:25,360 Speaker 1: money existed, because, again, he had stolen it. Um. Anyway, 1325 01:11:25,640 --> 01:11:28,080 Speaker 1: the context of this article makes it clear that he 1326 01:11:28,120 --> 01:11:31,040 Speaker 1: felt like, I don't know, whatever. This was all a 1327 01:11:31,120 --> 01:11:34,439 Speaker 1: confidence game, right? That's the key. All of this could work 1328 01:11:34,479 --> 01:11:36,559 Speaker 1: because people were looking at their balance sheet: 1329 01:11:36,640 --> 01:11:38,800 Speaker 1: I'm making money, I'm making money.
The returns are great. 1330 01:11:39,000 --> 01:11:42,120 Speaker 1: And that money existed on paper until they tried to 1331 01:11:42,160 --> 01:11:44,439 Speaker 1: take it out, because then it actually wasn't there, because 1332 01:11:44,439 --> 01:11:47,960 Speaker 1: he had already frittered it away. It's a confidence game, 1333 01:11:48,160 --> 01:11:51,040 Speaker 1: and, you know, that's the way it 1334 01:11:51,160 --> 01:11:53,960 Speaker 1: actually worked. We also know, and this is all still 1335 01:11:53,960 --> 01:11:55,600 Speaker 1: coming out, so I'm not gonna get too much into it, 1336 01:11:55,600 --> 01:11:57,880 Speaker 1: but we know that he had FTX loan himself, 1337 01:11:57,960 --> 01:12:01,599 Speaker 1: Sam Bankman-Fried, about a billion dollars. Like, his paper 1338 01:12:01,720 --> 01:12:05,160 Speaker 1: value was like two billion, but he gave himself basically 1339 01:12:05,240 --> 01:12:08,639 Speaker 1: a billion dollars in other people's money. Um, although he 1340 01:12:08,720 --> 01:12:11,160 Speaker 1: may have gambled that away. It's really unclear how much 1341 01:12:11,160 --> 01:12:15,160 Speaker 1: money he actually has liquid at the moment. Um, do 1342 01:12:15,160 --> 01:12:19,400 Speaker 1: we think he has any? I have no idea. Either 1343 01:12:19,520 --> 01:12:22,520 Speaker 1: he was, like, actually a gambling addict and a narcissist 1344 01:12:22,560 --> 01:12:24,280 Speaker 1: and he really did lose it all, or this was 1345 01:12:24,320 --> 01:12:26,759 Speaker 1: a con from the beginning, knowing it would all collapse, 1346 01:12:26,760 --> 01:12:28,160 Speaker 1: and he got as much as he could out of it, 1347 01:12:28,200 --> 01:12:30,720 Speaker 1: and he's going to wind up someplace without extradition, right? Like, 1348 01:12:30,800 --> 01:12:32,640 Speaker 1: that was the goal. If you asked me a 1349 01:12:32,680 --> 01:12:34,479 Speaker 1: couple of years ago, I would have said it's the latter.
1350 01:12:34,560 --> 01:12:36,040 Speaker 1: But I feel like the last couple of years have 1351 01:12:36,080 --> 01:12:39,080 Speaker 1: demonstrated so often that, like, people are just straight up 1352 01:12:39,120 --> 01:12:41,040 Speaker 1: not smart and don't have a plan, and it is 1353 01:12:41,080 --> 01:12:44,479 Speaker 1: all a narcissist shell game. It's very unclear 1354 01:12:45,080 --> 01:12:46,760 Speaker 1: at the moment. I'm going to read you some things 1355 01:12:46,800 --> 01:12:48,479 Speaker 1: at the end here and you can kind of make 1356 01:12:48,520 --> 01:12:52,200 Speaker 1: up your own mind. Anyway, whatever. So after, you know, starting FTX, the 1357 01:12:52,200 --> 01:12:54,519 Speaker 1: company moves to Hong Kong and then the Bahamas, and 1358 01:12:54,560 --> 01:12:57,639 Speaker 1: they buy this very, like, thirty-nine 1359 01:12:57,640 --> 01:12:59,920 Speaker 1: million dollar mansion that he lives in with his friends, 1360 01:13:00,200 --> 01:13:03,720 Speaker 1: using FTX tokens, which is, like, internal cash that his 1361 01:13:03,800 --> 01:13:07,000 Speaker 1: company issues based on the perceived value of the company. 1362 01:13:07,360 --> 01:13:11,040 Speaker 1: How many rooms is that? I don't know. It's a shitload, 1363 01:13:11,280 --> 01:13:13,600 Speaker 1: and they're able to buy it because everything's, like, on 1364 01:13:13,640 --> 01:13:16,160 Speaker 1: paper. On paper, they've gone from nothing to worth 1365 01:13:16,160 --> 01:13:18,559 Speaker 1: thirty-two billion dollars in like a year or two.
1366 01:13:19,080 --> 01:13:22,280 Speaker 1: And these, these idiots, like, property owners in the Bahamas, 1367 01:13:22,280 --> 01:13:24,840 Speaker 1: are like, well, clearly, the best thing we could do 1368 01:13:25,000 --> 01:13:27,879 Speaker 1: is sell this building for the fake money they created 1369 01:13:27,920 --> 01:13:29,920 Speaker 1: for their own company that they tell us is worth 1370 01:13:29,920 --> 01:13:34,840 Speaker 1: a lot of money. Sure, this is worth it. Um. So, 1371 01:13:35,520 --> 01:13:39,400 Speaker 1: since everything collapsed, some people that SBF had, like, reached 1372 01:13:39,400 --> 01:13:42,639 Speaker 1: out to as early investors have commented about why they 1373 01:13:42,680 --> 01:13:45,120 Speaker 1: didn't invest in this company in the early days, and 1374 01:13:45,160 --> 01:13:47,280 Speaker 1: for most of them, it's because, like, what they could see 1375 01:13:47,280 --> 01:13:49,679 Speaker 1: of his investments, the tens of millions that he promised 1376 01:13:49,680 --> 01:13:53,360 Speaker 1: to charities and the long positions in risky cryptocurrency companies, 1377 01:13:53,400 --> 01:13:55,120 Speaker 1: didn't make sense. They would look at, like, the things 1378 01:13:55,120 --> 01:13:57,880 Speaker 1: he was buying with the company assets and the things 1379 01:13:57,880 --> 01:13:59,840 Speaker 1: that, like, he was investing in, and be like, well, 1380 01:13:59,840 --> 01:14:02,200 Speaker 1: there's no way he could have that kind of liquidity 1381 01:14:02,240 --> 01:14:05,040 Speaker 1: if this is a legitimate exchange, right? He can't have 1382 01:14:05,240 --> 01:14:08,080 Speaker 1: that kind of cash on hand. It doesn't make any sense. 1383 01:14:08,320 --> 01:14:12,240 Speaker 1: Rich fucking assholes always telling on themselves like that. That's funny.
1384 01:14:12,360 --> 01:14:14,320 Speaker 1: And what's funny is that, like, I found one of 1385 01:14:14,320 --> 01:14:16,240 Speaker 1: these guys who was like, yeah, I didn't invest because I 1386 01:14:16,240 --> 01:14:18,519 Speaker 1: could tell it was a con, and then was like, 1387 01:14:18,560 --> 01:14:20,680 Speaker 1: but I didn't tell anybody because I didn't want to 1388 01:14:20,720 --> 01:14:30,519 Speaker 1: get yelled at. That's... unfortunately, I do see myself in that 1389 01:14:30,720 --> 01:14:32,840 Speaker 1: statement a little bit, where I was like, oh, someone 1390 01:14:32,880 --> 01:14:35,720 Speaker 1: might, someone might send me a rude text. I 1391 01:14:35,760 --> 01:14:40,880 Speaker 1: guess I won't say this. Yeah. Oh my god. It's very, like, 1392 01:14:41,520 --> 01:14:44,519 Speaker 1: goofy bullshit. I don't... I'll tell you one thing, Jamie. 1393 01:14:44,560 --> 01:14:49,000 Speaker 1: He's a spineless guy who still has his fucking money. Wow, 1394 01:14:49,040 --> 01:14:54,599 Speaker 1: you're in love with him? That's... yeah, you know, whatever. 1395 01:14:54,680 --> 01:15:00,200 Speaker 1: It's the finance industry. They're all ghouls, from what we 1396 01:15:00,280 --> 01:15:04,960 Speaker 1: can actually tell now. Because, again, this fucking Captain Dick Rider, 1397 01:15:05,200 --> 01:15:08,880 Speaker 1: the bad writer, like, I have to read you another 1398 01:15:08,920 --> 01:15:11,240 Speaker 1: quote that I didn't have in my script to give 1399 01:15:11,280 --> 01:15:14,280 Speaker 1: you an idea of just how much he fucking, how 1400 01:15:14,360 --> 01:15:20,080 Speaker 1: much he loves SBF. Here's a quote, um, from him.
1401 01:15:20,320 --> 01:15:24,639 Speaker 1: Uh, okay, yeah, this is, this is actually, Jamie, oh boy, 1402 01:15:26,000 --> 01:15:28,559 Speaker 1: this is when, this is when, this 1403 01:15:28,640 --> 01:15:31,360 Speaker 1: is when the writer, Captain Dick Rider, is hanging out 1404 01:15:31,400 --> 01:15:34,560 Speaker 1: with him in the Bahamas at his office. Quote. And 1405 01:15:34,640 --> 01:15:38,519 Speaker 1: he's like, what if we kissed? "Sensing an opportunity for connection, 1406 01:15:38,600 --> 01:15:41,040 Speaker 1: I chip in with my own two satoshis," which is 1407 01:15:42,120 --> 01:15:45,000 Speaker 1: "two cents," but as a bitcoin term. Anyway, I'm going to 1408 01:15:45,200 --> 01:15:48,559 Speaker 1: walk into the ocean. Wow. Okay. "I don't pay any 1409 01:15:48,560 --> 01:15:51,040 Speaker 1: attention to social media. Not because I have any moral 1410 01:15:51,120 --> 01:15:53,639 Speaker 1: case against it, I say, but because, for me, reading 1411 01:15:53,640 --> 01:15:55,519 Speaker 1: books is the highest-bandwidth way I know to 1412 01:15:55,560 --> 01:15:58,120 Speaker 1: get quality information into my brain, which just craves the 1413 01:15:58,160 --> 01:16:01,000 Speaker 1: stimulation. I'm addicted to reading, which explains how I ended 1414 01:16:01,040 --> 01:16:04,760 Speaker 1: up being a writer." "Oh yeah," says SBF, 1415 01:16:04,800 --> 01:16:07,160 Speaker 1: "I would never read a book." I'm not sure what 1416 01:16:07,280 --> 01:16:10,479 Speaker 1: to say. I hate them both. "I'm 1417 01:16:10,520 --> 01:16:13,040 Speaker 1: not sure what to say. 1418 01:16:13,120 --> 01:16:15,200 Speaker 1: I've read a book a week for my entire adult life, 1419 01:16:15,240 --> 01:16:17,840 Speaker 1: and I have written three of my own. I'm very 1420 01:16:17,840 --> 01:16:20,040 Speaker 1: skeptical of books."
"I don't want to say no book 1421 01:16:20,080 --> 01:16:22,240 Speaker 1: is ever worth reading, but I actually do believe something 1422 01:16:22,240 --> 01:16:24,960 Speaker 1: pretty close to that," explains SBF. "I think if you 1423 01:16:24,960 --> 01:16:26,720 Speaker 1: wrote a book, you fucked up, and it should have 1424 01:16:26,800 --> 01:16:32,280 Speaker 1: been a six-paragraph blog post." That guy is writing an 1425 01:16:32,360 --> 01:16:37,920 Speaker 1: article that he will not read. It's like two guys 1426 01:16:38,000 --> 01:16:40,839 Speaker 1: whose heads are made out of rocks, just, like, clunking 1427 01:16:40,840 --> 01:16:45,160 Speaker 1: against each other repeatedly. Just wait, Jamie, Jamie. It 1428 01:16:45,240 --> 01:16:47,599 Speaker 1: has... it hasn't gotten as bad as it's going to get. 1429 01:16:49,000 --> 01:16:52,519 Speaker 1: "Whatever the case," I hate books, "whatever the case, I 1430 01:16:52,560 --> 01:16:55,240 Speaker 1: find myself sad for the man. And it occurs to 1431 01:16:55,280 --> 01:16:58,160 Speaker 1: me that my reaction is exactly what might be expected 1432 01:16:58,200 --> 01:17:00,639 Speaker 1: from a beta in the brave new world crypto 1433 01:17:00,800 --> 01:17:05,320 Speaker 1: is creating." Whoa. How can you write that and not 1434 01:17:05,520 --> 01:17:08,719 Speaker 1: leap off the top of the building? He literally self- 1435 01:17:08,760 --> 01:17:12,320 Speaker 1: identified as a beta. Also, just, like, what I 1436 01:17:12,439 --> 01:17:14,280 Speaker 1: love about this is, just, like, we have to 1437 01:17:14,320 --> 01:17:16,519 Speaker 1: take it, we have to take it as given that 1438 01:17:16,600 --> 01:17:18,640 Speaker 1: crypto is creating a brave new world and none of 1439 01:17:18,680 --> 01:17:21,240 Speaker 1: us has any choice in that. It's inevitable, it's unstoppable.
1440 01:17:21,240 --> 01:17:24,200 Speaker 1: It's going to dominate everything, which is, like, capitulating to 1441 01:17:24,280 --> 01:17:28,880 Speaker 1: our... Yeah. So, again, he wonders, what does 1442 01:17:28,920 --> 01:17:31,439 Speaker 1: he think? Quote: "Wouldn't someone with IQ points 1443 01:17:31,479 --> 01:17:34,920 Speaker 1: to spare realize that dismissing books, all books, as essentially worthless 1444 01:17:34,960 --> 01:17:37,439 Speaker 1: might rile a writer? Was he playing with me? Is 1445 01:17:37,479 --> 01:17:40,320 Speaker 1: this fun? Is this humor? I'm satisfied with my meta 1446 01:17:40,360 --> 01:17:43,200 Speaker 1: analysis until I realize that one can always increment the 1447 01:17:43,280 --> 01:17:45,880 Speaker 1: level of strategic play in this sort of game. It's 1448 01:17:45,880 --> 01:17:48,080 Speaker 1: like poker. Level one is just thinking about how to 1449 01:17:48,120 --> 01:17:50,479 Speaker 1: strengthen your own hand. Level two is thinking about what 1450 01:17:50,520 --> 01:17:52,800 Speaker 1: your opponent's hand is. Level three is thinking about what 1451 01:17:52,840 --> 01:17:55,120 Speaker 1: your opponent thinks your hand is, and so on. And 1452 01:17:55,160 --> 01:17:58,240 Speaker 1: since SBF is obviously a genius, I should simply assume 1453 01:17:58,320 --> 01:18:00,960 Speaker 1: that, compared with me, SBF will always be playing at 1454 01:18:01,040 --> 01:18:03,320 Speaker 1: level n plus one, which makes my 1455 01:18:03,400 --> 01:18:06,439 Speaker 1: analysis of the intent behind SBF's books-are-for-losers idea 1456 01:18:06,640 --> 01:18:11,040 Speaker 1: spiral into infinity and crash like a computer." Is that 1457 01:18:11,120 --> 01:18:14,719 Speaker 1: how you write about me in your diary?
No, Jamie, 1458 01:18:14,800 --> 01:18:17,799 Speaker 1: because, do you know what the only ethical... speaking of ethics, 1459 01:18:17,840 --> 01:18:21,000 Speaker 1: you know what, if you think this way about conversations, 1460 01:18:21,240 --> 01:18:24,240 Speaker 1: the ethical thing to do is fill your pockets with 1461 01:18:24,360 --> 01:18:29,679 Speaker 1: rocks and walk into the ocean. Wow, you're wolfing this guy. Yeah, 1462 01:18:29,840 --> 01:18:34,360 Speaker 1: I absolutely am. Him and Sam Bankman-Fried, I could 1463 01:18:34,360 --> 01:18:37,800 Speaker 1: not hate these people more. I mean, they're both... there's 1464 01:18:38,200 --> 01:18:41,920 Speaker 1: never been two wronger people having a conversation. And, 1465 01:18:41,920 --> 01:18:44,240 Speaker 1: of course, it came out. He's just like, well, obviously 1466 01:18:44,280 --> 01:18:46,320 Speaker 1: he's a genius, we have to assume that, because he 1467 01:18:46,360 --> 01:18:48,080 Speaker 1: became a billionaire. And it's like, no, he was 1468 01:18:48,120 --> 01:18:51,599 Speaker 1: never a billionaire. That's his defense: that's math he did 1469 01:18:52,000 --> 01:18:55,240 Speaker 1: on a balance sheet. Like, we've now gotten access, because he 1470 01:18:55,280 --> 01:18:57,719 Speaker 1: had to, like, go through the bankruptcy and step down. 1471 01:18:57,840 --> 01:19:00,000 Speaker 1: So, like, there's a caretaker trying to get people's money 1472 01:19:00,000 --> 01:19:01,840 Speaker 1: out of the company, and we know shit, like, we've 1473 01:19:01,880 --> 01:19:04,639 Speaker 1: seen the Excel files where they kept their financial records, 1474 01:19:04,840 --> 01:19:07,760 Speaker 1: and it's him being like, this is basically bullshit, like, 1475 01:19:07,800 --> 01:19:09,920 Speaker 1: sorry about this, we fucked this up, we weren't actually 1476 01:19:10,000 --> 01:19:13,280 Speaker 1: keeping records here: the company balance sheet.
There was no 1477 01:19:13,360 --> 01:19:17,880 Speaker 1: accounting department. Like, when they 1478 01:19:17,920 --> 01:19:20,439 Speaker 1: would spend tens and hundreds of millions of dollars 1479 01:19:20,479 --> 01:19:23,040 Speaker 1: on things, they would just message each other on Signal 1480 01:19:23,080 --> 01:19:26,200 Speaker 1: about it to get approval, and they had disappearing messages 1481 01:19:26,240 --> 01:19:28,559 Speaker 1: on, so there's actually, like, no record of a lot 1482 01:19:28,600 --> 01:19:31,080 Speaker 1: of the accounting they did. At one point they hired an 1483 01:19:31,160 --> 01:19:34,160 Speaker 1: outside accounting firm to handle the accounting of this thirty- 1484 01:19:34,160 --> 01:19:37,120 Speaker 1: two-billion-dollar company, and the firm they hired 1485 01:19:37,120 --> 01:19:41,440 Speaker 1: bills itself as the only metaverse-based accounting firm. 1486 01:19:41,560 --> 01:19:46,479 Speaker 1: Um, the accounting firm for this thirty-two-billion- 1487 01:19:46,520 --> 01:19:50,280 Speaker 1: dollar company exists entirely within a crypto-themed video game 1488 01:19:50,400 --> 01:19:56,440 Speaker 1: called Decentraland. Um. It is so stupid, everything 1489 01:19:56,479 --> 01:19:59,400 Speaker 1: you're saying. Okay, so this is really, this is bad, 1490 01:19:59,439 --> 01:20:03,960 Speaker 1: and this feels bad to hear. Um, I have a question. Yes, 1491 01:20:04,080 --> 01:20:08,360 Speaker 1: Jamie. Um, who is... I guess, because I 1492 01:20:08,400 --> 01:20:11,280 Speaker 1: am kind of back in Lizzy Holmes Land, because 1493 01:20:11,320 --> 01:20:14,639 Speaker 1: that whole... I mean, like, departments that should be, 1494 01:20:14,720 --> 01:20:17,880 Speaker 1: you know, taking care of shit don't even exist, which 1495 01:20:18,080 --> 01:20:24,760 Speaker 1: is very Theranos-adjacent to me. Theranosian, if you will.
1496 01:20:24,920 --> 01:20:27,720 Speaker 1: As people who have both written books, we can just make 1497 01:20:27,720 --> 01:20:30,559 Speaker 1: stuff up. Yeah, of course, yeah, that's mostly what writing 1498 01:20:30,560 --> 01:20:33,040 Speaker 1: a book is. For sure. I view myself as the 1499 01:20:33,080 --> 01:20:35,479 Speaker 1: beta of this conversation, and so I feel comfortable asking: 1500 01:20:35,680 --> 01:20:41,559 Speaker 1: Robert, um, who is ultimately affected by this? Like, what 1501 01:20:41,760 --> 01:20:47,080 Speaker 1: is the, like, trickle-down of this? What happens? Um, 1502 01:20:47,240 --> 01:20:53,280 Speaker 1: is it just other rich assholes, or does it affect regular 1503 01:20:53,320 --> 01:20:57,760 Speaker 1: people? Probably. There will presumably be some... 1504 01:20:58,280 --> 01:21:00,719 Speaker 1: here's the thing, and here's why there's that lawsuit against, 1505 01:21:00,760 --> 01:21:02,840 Speaker 1: like, Larry David and all those other guys who appeared 1506 01:21:02,880 --> 01:21:06,680 Speaker 1: in the FTX ad. Yeah, presumably a bunch of 1507 01:21:06,720 --> 01:21:09,800 Speaker 1: regular people got suckered into putting their money on FTX, 1508 01:21:09,840 --> 01:21:13,040 Speaker 1: and those people have probably lost some money. That said, 1509 01:21:13,640 --> 01:21:16,400 Speaker 1: for the most part, it's fine, because most of the 1510 01:21:16,400 --> 01:21:20,599 Speaker 1: people who lost money are, like, gamblers who probably suck 1511 01:21:20,680 --> 01:21:23,400 Speaker 1: as much as this guy did. Um. And it's one 1512 01:21:23,400 --> 01:21:25,360 Speaker 1: of those things. There's just an article, I think in the 1513 01:21:25,400 --> 01:21:28,519 Speaker 1: Financial Times, where someone's like, actually... and I think they 1514 01:21:28,520 --> 01:21:31,000 Speaker 1: actually had a good point.
We shouldn't regulate the crypto 1515 01:21:31,040 --> 01:21:33,800 Speaker 1: industry, because if we regulate it, it will be brought in 1516 01:21:33,840 --> 01:21:36,559 Speaker 1: closer to the actual financial industry as it exists, and 1517 01:21:36,600 --> 01:21:39,120 Speaker 1: banks will put more investments into crypto, and it will 1518 01:21:39,160 --> 01:21:41,280 Speaker 1: get seen as, like, legitimate and backed by the state. 1519 01:21:41,360 --> 01:21:44,360 Speaker 1: And so when these con men destroy tens of billions of 1520 01:21:44,400 --> 01:21:46,800 Speaker 1: dollars overnight and cause panic in the industry, it will 1521 01:21:46,840 --> 01:21:49,559 Speaker 1: affect the real economy. And right now it doesn't seem 1522 01:21:49,640 --> 01:21:51,960 Speaker 1: to. And, like, yeah, I guess that is, kind of, that's 1523 01:21:52,000 --> 01:21:54,280 Speaker 1: not a bad point. Maybe we just let it die 1524 01:21:54,360 --> 01:21:57,080 Speaker 1: on its own. I don't know. Um. And it seems... 1525 01:21:57,120 --> 01:22:01,360 Speaker 1: well, we'll know more. So, yeah, we 1526 01:22:01,400 --> 01:22:03,639 Speaker 1: will continue to learn more. One of the things that's 1527 01:22:03,640 --> 01:22:06,280 Speaker 1: funny about this is that Sequoia, this investment firm, 1528 01:22:06,400 --> 01:22:08,960 Speaker 1: put like two hundred million dollars into the company, which 1529 01:22:09,000 --> 01:22:11,120 Speaker 1: they have all written off now. They're accepting it as 1530 01:22:11,160 --> 01:22:15,439 Speaker 1: a total loss. They're going full "that girl" on this.
Okay. Now, 1531 01:22:15,680 --> 01:22:19,120 Speaker 1: when you hear this very serious investment firm put two 1532 01:22:19,240 --> 01:22:22,880 Speaker 1: hundred million dollars into this business, you probably assume, Jamie, wow, 1533 01:22:22,920 --> 01:22:25,639 Speaker 1: I bet he had a good pitch, right? I actually... 1534 01:22:25,680 --> 01:22:27,960 Speaker 1: I don't know, if we're talking Theranos... I actually 1535 01:22:27,960 --> 01:22:30,439 Speaker 1: don't think that it is. It's not disqualifying to 1536 01:22:30,439 --> 01:22:33,120 Speaker 1: have a dogshit pitch. You know who can 1537 01:22:33,160 --> 01:22:36,639 Speaker 1: make this clear for us? Captain Dick Rider. Thanks. 1538 01:22:37,439 --> 01:22:40,439 Speaker 1: SBF told Sequoia about the so-called super app: "I 1539 01:22:40,439 --> 01:22:42,080 Speaker 1: want FTX to be a place where you can do 1540 01:22:42,080 --> 01:22:44,719 Speaker 1: anything you want with your next dollar. You can buy bitcoin, 1541 01:22:44,840 --> 01:22:47,000 Speaker 1: you can send money in whatever currency to any friend 1542 01:22:47,040 --> 01:22:49,320 Speaker 1: anywhere in the world. You can buy a banana. You 1543 01:22:49,360 --> 01:22:51,320 Speaker 1: can do anything you want with your money 1544 01:22:51,320 --> 01:22:58,720 Speaker 1: from inside FTX. Suddenly, the chat..." Yeah, I 1545 01:22:58,760 --> 01:23:00,920 Speaker 1: don't know, I feel like I can do anything I 1546 01:23:00,960 --> 01:23:04,519 Speaker 1: want with my debit card. Like, I've never run into 1547 01:23:04,560 --> 01:23:06,720 Speaker 1: a thing I wanted to buy and been like, ah, 1548 01:23:06,760 --> 01:23:11,760 Speaker 1: I cannot. No, how do I even buy drugs with it? 1549 01:23:11,800 --> 01:23:14,360 Speaker 1: By going to an ATM.
If I were 1550 01:23:14,400 --> 01:23:16,400 Speaker 1: to be a person who buys drugs, which I'm not, 1551 01:23:17,080 --> 01:23:18,679 Speaker 1: I could go to an ATM and take 1552 01:23:18,680 --> 01:23:21,840 Speaker 1: out cash and purchase... Support your local drug dealer and 1553 01:23:22,000 --> 01:23:27,160 Speaker 1: banana vendor, for crying out loud. So he gives this banana 1554 01:23:27,200 --> 01:23:30,760 Speaker 1: pitch. Quote: "Suddenly, the chat window on Sequoia's side of 1555 01:23:30,760 --> 01:23:34,879 Speaker 1: the Zoom lights up with partners freaking out. I love this founder, 1556 01:23:34,960 --> 01:23:37,400 Speaker 1: typed one partner. I am a ten out of ten, 1557 01:23:37,560 --> 01:23:42,760 Speaker 1: pinged another. Yes, exclamation point, exclamation point, exclamation point, exclaimed 1558 01:23:42,760 --> 01:23:45,799 Speaker 1: a third. What Sequoia was reacting to was the scale 1559 01:23:45,800 --> 01:23:48,720 Speaker 1: of SBF's vision. It wasn't a story about how we 1560 01:23:48,840 --> 01:23:51,920 Speaker 1: might use fintech in the future, or crypto, or 1561 01:23:51,920 --> 01:23:53,960 Speaker 1: a new kind of bank. It was a vision about 1562 01:23:53,960 --> 01:23:57,000 Speaker 1: the future of money itself, with a total addressable market 1563 01:23:57,080 --> 01:23:59,800 Speaker 1: of every person on the entire planet. I sat ten feet 1564 01:24:00,200 --> 01:24:03,559 Speaker 1: from him..." And I know, it's... These people are just 1565 01:24:03,960 --> 01:24:07,840 Speaker 1: actually like three executives doing lines of coke and one 1566 01:24:07,880 --> 01:24:12,559 Speaker 1: swallowing a banana with the peel still on. They're like, yes, yep. 1567 01:24:13,720 --> 01:24:18,320 Speaker 1: Oh my god.
Okay, so it's... I mean, which does 1568 01:24:18,400 --> 01:24:20,599 Speaker 1: kind of continue with the trend of, like, you know, 1569 01:24:20,640 --> 01:24:23,960 Speaker 1: SBF has never had a problem or, 1570 01:24:24,120 --> 01:24:27,440 Speaker 1: like, anything to overcome. Like, if it's this easy 1571 01:24:27,520 --> 01:24:30,599 Speaker 1: for him to con people into shit, like, of course 1572 01:24:30,600 --> 01:24:33,639 Speaker 1: you would have a god complex. You've never, you've never 1573 01:24:33,680 --> 01:24:39,519 Speaker 1: been told no. Yep, yep, yep, yep, yep, yep. It's cool. 1574 01:24:39,680 --> 01:24:43,040 Speaker 1: So what Sequoia was... Yeah, sorry, I have to continue 1575 01:24:43,080 --> 01:24:45,000 Speaker 1: this fucking quote. And next he's going to talk 1576 01:24:45,040 --> 01:24:47,479 Speaker 1: about a person who works at Sequoia and is in 1577 01:24:47,479 --> 01:24:49,760 Speaker 1: the room for this. "I sat ten feet from him, 1578 01:24:49,760 --> 01:24:53,200 Speaker 1: and I walked over thinking, oh shit, that was really good," 1579 01:24:53,880 --> 01:24:56,719 Speaker 1: remembers Arora. "And it turns out that fucker was playing 1580 01:24:56,800 --> 01:24:59,880 Speaker 1: League of Legends through the entire meeting." And this is 1581 01:25:00,040 --> 01:25:02,439 Speaker 1: framed, in the article and in all the coverage before 1582 01:25:02,439 --> 01:25:05,280 Speaker 1: everything fell apart, as, like, so awesome, he's so cool. 1583 01:25:05,560 --> 01:25:08,200 Speaker 1: This writer talks a bunch about how, like, Sam never 1584 01:25:08,240 --> 01:25:10,760 Speaker 1: stops playing video games. When he's talking to this writer, 1585 01:25:10,840 --> 01:25:13,839 Speaker 1: when he's having corporate meetings, he's playing video games, basically 1586 01:25:13,920 --> 01:25:16,640 Speaker 1: a hundred percent of the time.
Um, and this is 1587 01:25:16,640 --> 01:25:19,439 Speaker 1: always mentioned, like, he's always working, he's always in the office, 1588 01:25:19,439 --> 01:25:21,519 Speaker 1: he sleeps in a beanbag chair at his desk. And 1589 01:25:21,520 --> 01:25:24,880 Speaker 1: it's like, no, dude, he's not always working. He's conning you. 1590 01:25:25,160 --> 01:25:28,320 Speaker 1: And he plays video games all the time and pretends 1591 01:25:28,360 --> 01:25:31,719 Speaker 1: that that's a fucking job. Um, which is... great 1592 01:25:31,760 --> 01:25:34,920 Speaker 1: con, good for you. "Always working" and "always 1593 01:25:35,000 --> 01:25:39,640 Speaker 1: there" are two very different flavors of things happening. Exactly. Um, it's all 1594 01:25:39,760 --> 01:25:41,760 Speaker 1: part of the fucking con. So is the fact that 1595 01:25:41,800 --> 01:25:44,639 Speaker 1: he always wore, like, ratty old athletic shorts 1596 01:25:44,680 --> 01:25:47,960 Speaker 1: and, like, a wrinkled T-shirt. Because, like, if 1597 01:25:48,000 --> 01:25:50,320 Speaker 1: you are a young man in the tech industry, that 1598 01:25:50,360 --> 01:25:52,960 Speaker 1: makes people think you're a genius, right? Because geniuses 1599 01:25:53,280 --> 01:25:56,920 Speaker 1: dress like shit. Yeah, he's like, now, if I bring 1600 01:25:56,960 --> 01:26:00,719 Speaker 1: a real stink in there, people are gonna, like, jack 1601 01:26:00,800 --> 01:26:05,360 Speaker 1: up my already fake IQ score about twenty points. Yeah, 1602 01:26:05,560 --> 01:26:07,559 Speaker 1: and it's... you know who else dresses like shit? 1603 01:26:08,040 --> 01:26:13,840 Speaker 1: The guy I used to buy weed from. Yeah, yeah. 1604 01:26:14,400 --> 01:26:17,160 Speaker 1: You know who also...
You know who wears the same 1605 01:26:17,160 --> 01:26:20,679 Speaker 1: outfit as Sam Bankman-Fried? My old buddy who once 1606 01:26:20,720 --> 01:26:22,639 Speaker 1: at a party got into an argument with a guy 1607 01:26:22,680 --> 01:26:25,240 Speaker 1: and broke fifteen bones in his face, because they were 1608 01:26:25,280 --> 01:26:31,639 Speaker 1: both drinking. Not a corporate genius. I do like that 1609 01:26:31,680 --> 01:26:34,280 Speaker 1: shared aesthetic, where it's like, really, the difference is the 1610 01:26:34,280 --> 01:26:37,960 Speaker 1: pet snake. That is... that's how you 1611 01:26:38,000 --> 01:26:46,080 Speaker 1: truly can sniff out the fucker. He's the 1612 01:26:46,080 --> 01:26:48,559 Speaker 1: same kind of person as Elizabeth Holmes, and he was, 1613 01:26:48,640 --> 01:26:50,920 Speaker 1: again, he's a confidence man, and this gets us into 1614 01:26:50,960 --> 01:26:53,680 Speaker 1: the Larry David shit. Because the thing about being a 1615 01:26:53,680 --> 01:26:56,599 Speaker 1: confidence man is that, as long as people are convinced 1616 01:26:56,600 --> 01:26:59,080 Speaker 1: that their money is safe, and most of them 1617 01:26:59,120 --> 01:27:01,360 Speaker 1: don't try to pull it out, then you can keep 1618 01:27:01,400 --> 01:27:04,440 Speaker 1: the con going, and you can keep the fake numbers increasing, 1619 01:27:04,520 --> 01:27:06,519 Speaker 1: and everyone will think you're richer, and you can 1620 01:27:06,560 --> 01:27:09,080 Speaker 1: actually get real money out of this. So one of 1621 01:27:09,120 --> 01:27:11,360 Speaker 1: the things that he did is he would pour shitloads 1622 01:27:11,360 --> 01:27:15,120 Speaker 1: of money into sponsorship deals and into other ventures to 1623 01:27:15,160 --> 01:27:18,519 Speaker 1: make his company seem legit.
One way he did this... Yeah. 1624 01:27:18,560 --> 01:27:21,160 Speaker 1: He spent seventeen and a half million through FTX to 1625 01:27:21,240 --> 01:27:24,000 Speaker 1: sponsor the athletic teams at UC Berkeley. Uh, he 1626 01:27:24,080 --> 01:27:26,759 Speaker 1: launched a twenty-million-dollar ad campaign with Tom Brady 1627 01:27:26,800 --> 01:27:29,160 Speaker 1: and Gisele Bündchen. He offered NFTs 1628 01:27:29,200 --> 01:27:32,439 Speaker 1: at Coachella. Uh huh. And he spent a hundred and 1629 01:27:32,479 --> 01:27:34,840 Speaker 1: thirty-five million dollars on the naming rights for the 1630 01:27:34,840 --> 01:27:37,639 Speaker 1: Miami Heat's home arena. And the purpose, this is all 1631 01:27:37,680 --> 01:27:40,080 Speaker 1: to build confidence, right? You see, it's the 1632 01:27:40,680 --> 01:27:42,200 Speaker 1: same thing Crypto dot Com did, by the 1633 01:27:42,200 --> 01:27:43,880 Speaker 1: way, with the arena. And I want to say, I 1634 01:27:43,920 --> 01:27:46,479 Speaker 1: was like, I feel like the Crypto dot Com Arena 1635 01:27:46,560 --> 01:27:49,400 Speaker 1: is not long for this world. And you should... I 1636 01:27:49,520 --> 01:27:52,479 Speaker 1: don't recognize that as an ad. Is my money safe 1637 01:27:52,520 --> 01:27:55,040 Speaker 1: in this thing that's a bank but not a bank? 1638 01:27:55,040 --> 01:27:58,479 Speaker 1: Well, their name is on the arena, so it's probably legit. 1639 01:27:59,160 --> 01:28:01,519 Speaker 1: Here's the thing, it's like, yeah, now, walking 1640 01:28:01,520 --> 01:28:04,120 Speaker 1: into the Crypto dot Com Arena, I couldn't feel 1641 01:28:04,160 --> 01:28:07,680 Speaker 1: less safe. I couldn't feel less secure. Like, when it 1642 01:28:07,720 --> 01:28:10,040 Speaker 1: was the Staples Center, I'm like, oh, you know what's 1643 01:28:10,080 --> 01:28:13,479 Speaker 1: never going to go out of style? Little notebooks.
People 1644 01:28:13,560 --> 01:28:17,479 Speaker 1: have been using staples since we were cavemen, I assume. So, yes. 1645 01:28:18,240 --> 01:28:21,040 Speaker 1: Anyway, just, like, the worst fucking name on earth. But 1646 01:28:21,240 --> 01:28:25,120 Speaker 1: it infuriates me, these people. So, do I have a 1647 01:28:25,200 --> 01:28:28,679 Speaker 1: dog in the fight? I do. In his interview with Vox, 1648 01:28:28,880 --> 01:28:32,640 Speaker 1: Sam basically admits this, albeit in a slightly careful way. 1649 01:28:32,960 --> 01:28:36,200 Speaker 1: Journalist: So, FTX technically wasn't gambling with their money. 1650 01:28:36,280 --> 01:28:38,120 Speaker 1: FTX had just loaned their money to Alameda, 1651 01:28:38,160 --> 01:28:40,920 Speaker 1: who had gambled with their money and lost it. 1652 01:28:41,160 --> 01:28:43,000 Speaker 1: And you didn't realize it was a big deal because 1653 01:28:43,000 --> 01:28:45,599 Speaker 1: you didn't realize how much money it was. Sam responds: 1654 01:28:45,880 --> 01:28:48,960 Speaker 1: And also, I thought Alameda had enough collateral to reasonably 1655 01:28:49,000 --> 01:28:51,720 Speaker 1: cover it. The journalist says: I get how you could have 1656 01:28:51,760 --> 01:28:53,679 Speaker 1: gotten away with it, but I guess that seems sketchy 1657 01:28:53,680 --> 01:28:56,120 Speaker 1: even if you get away with it. Sam: It was 1658 01:28:56,160 --> 01:29:00,760 Speaker 1: never the intention. Sometimes life creeps up on you. Um, 1659 01:29:00,960 --> 01:29:05,679 Speaker 1: so literally "life comes at you fast." So Sam's 1660 01:29:05,880 --> 01:29:08,679 Speaker 1: worth tops out at around twenty-two billion dollars on paper. 1661 01:29:08,920 --> 01:29:12,080 Speaker 1: In reality, neither Alameda nor FTX had ever taken 1662 01:29:12,080 --> 01:29:14,519 Speaker 1: in even close to that much money.
The valuation was 1663 01:29:14,560 --> 01:29:18,479 Speaker 1: based entirely on nonsense calculations that were themselves based on 1664 01:29:18,600 --> 01:29:21,360 Speaker 1: lies from FTX's extremely cooked books. There's a 1665 01:29:21,400 --> 01:29:23,479 Speaker 1: lot more about this than we're getting into. People are 1666 01:29:23,520 --> 01:29:25,800 Speaker 1: still finding this all out. There's one thing I should 1667 01:29:25,800 --> 01:29:29,200 Speaker 1: probably read, which is, so you know, when his company collapsed, 1668 01:29:29,200 --> 01:29:31,519 Speaker 1: he had to step down from running it, right? And 1669 01:29:31,600 --> 01:29:34,000 Speaker 1: because a lot of money is still in there and 1670 01:29:34,040 --> 01:29:36,479 Speaker 1: a lot of, like, investments are still tied up in that, 1671 01:29:36,880 --> 01:29:40,400 Speaker 1: they put a guy in charge of the company, right? 1672 01:29:40,479 --> 01:29:43,759 Speaker 1: And there's, like, there's specific dudes in the business world, 1673 01:29:44,280 --> 01:29:47,280 Speaker 1: um, whose, like, job is to come in when a 1674 01:29:47,320 --> 01:29:50,400 Speaker 1: company fucks up like this and, like, try and get 1675 01:29:50,439 --> 01:29:53,400 Speaker 1: as much money back out for the shareholders as possible, 1676 01:29:53,439 --> 01:29:56,639 Speaker 1: to minimize the bleeding. So they kind of have, like, 1677 01:29:56,680 --> 01:29:59,880 Speaker 1: a, like how they had in old Hollywood, they have 1678 01:30:00,040 --> 01:30:02,240 Speaker 1: like a fixer guy. Yeah, yeah, yeah, they have. And 1679 01:30:02,360 --> 01:30:05,280 Speaker 1: this fixer guy specifically, he is the guy 1680 01:30:05,439 --> 01:30:08,519 Speaker 1: that they brought in, um, with Enron. He took 1681 01:30:08,560 --> 01:30:10,920 Speaker 1: over Enron after it fell apart in order to 1682 01:30:10,960 --> 01:30:13,640 Speaker 1: try and, like, minimize the damage from it.
So he 1683 01:30:13,720 --> 01:30:15,479 Speaker 1: is the guy who got brought in to deal with, 1684 01:30:15,520 --> 01:30:19,679 Speaker 1: like, the fact that this massive fucking crime happened with Enron. 1685 01:30:19,960 --> 01:30:23,479 Speaker 1: Um, one sec. My, my favorite, um, my favorite Enron 1686 01:30:23,560 --> 01:30:29,360 Speaker 1: memory, not that you asked, um, was the Women 1687 01:30:29,439 --> 01:30:32,439 Speaker 1: of Enron Playboy spread that I got to archive during 1688 01:30:32,479 --> 01:30:35,880 Speaker 1: my time there. Oh god. Boy, did those Enron 1689 01:30:35,960 --> 01:30:40,559 Speaker 1: girl bosses have their... there. So that's second only to 1690 01:30:40,600 --> 01:30:43,240 Speaker 1: Women of Seven-Eleven, which is my actual favorite spread. 1691 01:30:43,400 --> 01:30:47,240 Speaker 1: Continue. This company has about a million creditors, 1692 01:30:47,280 --> 01:30:50,799 Speaker 1: so about a million people possibly lost their entire investment 1693 01:30:50,800 --> 01:30:52,679 Speaker 1: in this company, which is a stunning amount of people 1694 01:30:52,720 --> 01:30:56,320 Speaker 1: to take money from, um. And again, we're probably, we 1695 01:30:56,360 --> 01:30:58,599 Speaker 1: are probably looking at like five or ten billion dollars 1696 01:30:58,760 --> 01:31:01,519 Speaker 1: stolen, something like that. Kind of unclear the exact amount. 1697 01:31:01,680 --> 01:31:04,120 Speaker 1: But anyway, this is what the guy wrote in the Delaware 1698 01:31:04,160 --> 01:31:07,120 Speaker 1: bankruptcy court filing. This is the guy who 1699 01:31:07,560 --> 01:31:10,719 Speaker 1: took over Enron after it became 1700 01:31:10,760 --> 01:31:12,920 Speaker 1: clear that the whole company was a criminal enterprise. This 1701 01:31:12,960 --> 01:31:16,400 Speaker 1: is what that guy wrote about Sam's company.
Oh no, 1702 01:31:17,040 --> 01:31:19,840 Speaker 1: never in my career have I seen such a complete 1703 01:31:19,880 --> 01:31:23,000 Speaker 1: failure of corporate controls and such a complete absence of 1704 01:31:23,040 --> 01:31:27,559 Speaker 1: trustworthy financial information as occurred here. From compromised systems integrity 1705 01:31:27,600 --> 01:31:31,240 Speaker 1: and faulty regulatory oversight abroad to the concentration of control 1706 01:31:31,240 --> 01:31:34,920 Speaker 1: in the hands of a very small group of inexperienced, unsophisticated, 1707 01:31:34,960 --> 01:31:40,160 Speaker 1: and potentially compromised individuals, this situation is unprecedented. Again, that's 1708 01:31:40,200 --> 01:31:44,880 Speaker 1: the guy who took over Enron. Literally like Bill 1709 01:31:45,000 --> 01:31:48,280 Speaker 1: Clinton dropping a Notes app post, being like, never have I 1710 01:31:48,320 --> 01:31:54,360 Speaker 1: seen a man cheat on his wife more. And like, well, 1711 01:31:54,360 --> 01:31:57,000 Speaker 1: I will say it's different. This guy did not commit 1712 01:31:57,040 --> 01:31:58,840 Speaker 1: any of the Enron crimes, right? This is the guy 1713 01:31:58,880 --> 01:32:02,479 Speaker 1: who came in and saw them all. This guy, after he's 1714 01:32:02,520 --> 01:32:08,719 Speaker 1: brought in, it becomes clear that... "never in my career." 1715 01:32:09,040 --> 01:32:11,680 Speaker 1: This guy. Yeah, it's probably best to look at this 1716 01:32:11,720 --> 01:32:13,519 Speaker 1: guy as like an EMT. When it becomes 1717 01:32:13,520 --> 01:32:16,320 Speaker 1: clear that a company that has a shitload of money 1718 01:32:16,320 --> 01:32:20,160 Speaker 1: in it and is central in the economy has collapsed 1719 01:32:20,200 --> 01:32:23,240 Speaker 1: because people broke the law, he comes in to minimize 1720 01:32:23,280 --> 01:32:26,360 Speaker 1: the damage.
But he was not working at Enron previously, right? Like, 1721 01:32:26,400 --> 01:32:28,439 Speaker 1: it's not, he's not trying to, like, stop anyone from 1722 01:32:28,479 --> 01:32:31,200 Speaker 1: getting in trouble. He's trying to minimize how many people 1723 01:32:31,200 --> 01:32:34,920 Speaker 1: are hurt by this. Anyway. Yeah, um, I just want 1724 01:32:34,920 --> 01:32:37,400 Speaker 1: to make clear, like, that guy's, anyway, whatever. He's not, 1725 01:32:37,520 --> 01:32:39,880 Speaker 1: he's not the En... he is not the Enron-er. 1726 01:32:39,960 --> 01:32:42,479 Speaker 1: He did not make Enron bad. He just was 1727 01:32:42,520 --> 01:32:44,960 Speaker 1: there afterwards and was like, this company is even worse. 1728 01:32:45,280 --> 01:32:47,839 Speaker 1: He bought the issue of Playboy, and then... 1729 01:32:48,240 --> 01:32:50,519 Speaker 1: So I should... god, there's, yeah, we're, we're getting close 1730 01:32:50,560 --> 01:32:53,519 Speaker 1: to done. I should note that before everything collapsed, Sam, again, 1731 01:32:53,560 --> 01:32:56,160 Speaker 1: he's into effective altruism. He promised 1732 01:32:56,200 --> 01:32:58,439 Speaker 1: to donate like a couple of hundred million dollars to 1733 01:32:58,520 --> 01:33:01,280 Speaker 1: these EA causes. A lot less than that actually 1734 01:33:01,280 --> 01:33:04,280 Speaker 1: wound up going out. Some of them were good, but a 1735 01:33:04,320 --> 01:33:06,799 Speaker 1: lot of it was, like... So he made a huge 1736 01:33:06,880 --> 01:33:09,120 Speaker 1: point that, like, one of his major 1737 01:33:09,200 --> 01:33:12,559 Speaker 1: priorities was pandemic prevention. Right, we have to stop the 1738 01:33:12,600 --> 01:33:14,439 Speaker 1: next pandemic. I'm gonna put as much money as I 1739 01:33:14,439 --> 01:33:18,519 Speaker 1: can to pandemic prevention. That's the best effective altruist thing 1740 01:33:18,560 --> 01:33:21,760 Speaker 1: that I can do.
Um, to talk about how well 1741 01:33:21,800 --> 01:33:23,759 Speaker 1: that actually worked, I want to quote from the Washington 1742 01:33:23,840 --> 01:33:27,200 Speaker 1: Post here. FTX-backed projects ranged from twelve 1743 01:33:27,200 --> 01:33:30,280 Speaker 1: million dollars to champion a California ballot initiative to strengthen 1744 01:33:30,320 --> 01:33:34,080 Speaker 1: public health programs and detect emerging virus threats (amid lackluster support, 1745 01:33:34,080 --> 01:33:37,559 Speaker 1: the measure was punted to 2024), to investing more than eleven 1746 01:33:37,560 --> 01:33:40,920 Speaker 1: million on the unsuccessful congressional primary campaign of an Oregon 1747 01:33:40,920 --> 01:33:43,840 Speaker 1: biosecurity expert, and even a hundred and fifty thousand 1748 01:33:43,880 --> 01:33:47,320 Speaker 1: dollar grant to help Moncef Slaoui, the scientific advisor to 1749 01:33:47,360 --> 01:33:52,080 Speaker 1: the Trump administration's Operation Warp Speed vaccine accelerator, write his memoir. 1750 01:33:52,680 --> 01:33:54,599 Speaker 1: So that sounds like a giant waste of money, right? 1751 01:33:54,720 --> 01:33:57,280 Speaker 1: That sounds like, even if it was 1752 01:33:57,320 --> 01:33:59,120 Speaker 1: like good, it sounds like, 1753 01:33:59,120 --> 01:34:01,960 Speaker 1: even if the goals are good, like, well, the ballot 1754 01:34:01,960 --> 01:34:04,519 Speaker 1: measure failed, like it got punted, so it's not like 1755 01:34:04,560 --> 01:34:08,080 Speaker 1: it worked. Um, and it gets worse, because SBF's fund 1756 01:34:08,120 --> 01:34:10,240 Speaker 1: also put a lot of money, like five million dollars, 1757 01:34:10,240 --> 01:34:12,599 Speaker 1: into ProPublica. And ProPublica, they've done a lot 1758 01:34:12,680 --> 01:34:17,880 Speaker 1: of cool stuff.
They also published an extremely flawed investigation 1759 01:34:18,240 --> 01:34:22,200 Speaker 1: that backed the lab leak hypothesis. Um, I'm gonna quote the 1760 01:34:22,360 --> 01:34:25,120 Speaker 1: L.A. Times and their analysis of this deeply flawed 1761 01:34:25,160 --> 01:34:27,680 Speaker 1: piece of reporting. And also, you know, we've got 1762 01:34:27,680 --> 01:34:29,519 Speaker 1: some notes for the L.A. Times as well. Yes, 1763 01:34:29,720 --> 01:34:32,680 Speaker 1: nobody's perfect. The L.A. Times called it a 1764 01:34:32,720 --> 01:34:35,479 Speaker 1: train wreck, noting the article is based heavily on Chinese 1765 01:34:35,560 --> 01:34:39,000 Speaker 1: language documents that appear to have been mistranslated and misinterpreted, 1766 01:34:39,040 --> 01:34:42,160 Speaker 1: according to Chinese language experts who have piled on via 1767 01:34:42,200 --> 01:34:45,120 Speaker 1: social media since its publication. It also takes as gospel 1768 01:34:45,200 --> 01:34:47,560 Speaker 1: a report by a rump group of Republican congressional 1769 01:34:47,560 --> 01:34:50,360 Speaker 1: staff members asserting that the pandemic was more likely than 1770 01:34:50,400 --> 01:34:53,080 Speaker 1: not the result of a research-related incident. And 1771 01:34:53,160 --> 01:34:55,639 Speaker 1: the fact that ProPublica published this has, 1772 01:34:55,720 --> 01:34:58,400 Speaker 1: like, provided a shitload of fuel to the "it was 1773 01:34:58,439 --> 01:35:01,559 Speaker 1: all a fucking lab leak from China, it's China's fault" 1774 01:35:02,160 --> 01:35:07,640 Speaker 1: Republicans. Yeah, Sam Bankman-Fried funded that shit. Um, 1775 01:35:07,720 --> 01:35:10,880 Speaker 1: and god, yeah, basically none of the shit he was 1776 01:35:10,920 --> 01:35:13,000 Speaker 1: putting money into that was supposed to be good 1777 01:35:13,400 --> 01:35:16,439 Speaker 1: really worked, and a lot of it was...
So another 1778 01:35:16,560 --> 01:35:18,280 Speaker 1: thing the right is doing right now is they're talking 1779 01:35:18,320 --> 01:35:20,960 Speaker 1: about, he was like the number one or number two 1780 01:35:21,000 --> 01:35:26,719 Speaker 1: donor to Democrats during the midterm elections. Seth MacFarlane. Yeah, yeah. 1781 01:35:26,800 --> 01:35:29,479 Speaker 1: But none of his donations worked. And also, he gave, 1782 01:35:29,520 --> 01:35:31,599 Speaker 1: it was like thirty two million he gave to the Dems. 1783 01:35:31,640 --> 01:35:34,040 Speaker 1: He gave like twenty four million to the Republicans. And 1784 01:35:34,040 --> 01:35:37,000 Speaker 1: the reason he was giving this money: number one, there 1785 01:35:37,000 --> 01:35:40,680 Speaker 1: were some, like, pro pandemic response candidates he wanted to back, 1786 01:35:40,840 --> 01:35:44,080 Speaker 1: most of whom didn't, you know, do well. But also, 1787 01:35:44,240 --> 01:35:46,920 Speaker 1: like, a lot of the money was towards Republican and 1788 01:35:46,960 --> 01:35:50,320 Speaker 1: Democratic candidates who were going to be part of the 1789 01:35:50,320 --> 01:35:52,439 Speaker 1: regulation of the crypto industry, because he wanted to have 1790 01:35:52,439 --> 01:35:54,360 Speaker 1: a seat at the table and push regulations in a 1791 01:35:54,400 --> 01:36:01,400 Speaker 1: way that, yeah, that would enable... that is... yeah, exactly. Um. 1792 01:36:01,479 --> 01:36:04,600 Speaker 1: So anyway, in general, nothing at Alameda or FTX 1793 01:36:04,800 --> 01:36:07,280 Speaker 1: was as it seemed.
And that dick writing the Sequoia 1794 01:36:07,400 --> 01:36:11,160 Speaker 1: article, Caroline Ellison, the CEO of Alameda, he talks 1795 01:36:11,200 --> 01:36:14,040 Speaker 1: about her a bit, and, like, he frames her as, 1796 01:36:14,040 --> 01:36:17,000 Speaker 1: like, this quintessential innocent nerd girl, plucky and ethical and 1797 01:36:17,080 --> 01:36:19,519 Speaker 1: optimistic, to show, like, these are the kind of, you know, 1798 01:36:19,880 --> 01:36:22,519 Speaker 1: smart young Gen Z kids that are, you know, building 1799 01:36:22,800 --> 01:36:25,600 Speaker 1: this, this great company. And, like, she showed up in 1800 01:36:25,680 --> 01:36:27,960 Speaker 1: LARP gear to meet with Sam and talk about the 1801 01:36:28,000 --> 01:36:32,479 Speaker 1: future of their great financial enterprise. And she's an effective altruist. Um. 1802 01:36:32,640 --> 01:36:34,720 Speaker 1: Since everything fell apart, it's come out that she and 1803 01:36:34,760 --> 01:36:37,080 Speaker 1: Sam were dating each other and possibly other members of 1804 01:36:37,120 --> 01:36:42,559 Speaker 1: the company. Um. And also, people have found her, here 1805 01:36:42,600 --> 01:36:46,799 Speaker 1: we go, people have found her Tumblr, um, and boy 1806 01:36:47,000 --> 01:36:50,679 Speaker 1: is she sketchy as hell. Um. I'm gonna quote from 1807 01:36:50,680 --> 01:36:55,080 Speaker 1: a report on her Tumblr activity on Decrypt dot com. Boy, howdy. 1808 01:36:55,360 --> 01:36:57,840 Speaker 1: When I first started my first foray into poly, I 1809 01:36:57,880 --> 01:36:59,719 Speaker 1: thought of it as a radical break from my trad 1810 01:36:59,760 --> 01:37:03,439 Speaker 1: past, the account wrote in February. But tbh, I've 1811 01:37:03,479 --> 01:37:05,800 Speaker 1: come to decide that the only acceptable style of poly 1812 01:37:05,880 --> 01:37:09,320 Speaker 1: is best characterized as something like imperial Chinese harem.
The 1813 01:37:09,360 --> 01:37:12,040 Speaker 1: account went on to detail how polyamorous dynamics should 1814 01:37:12,080 --> 01:37:16,160 Speaker 1: ideally function as a cutthroat market of sexual competition and subjugation. 1815 01:37:16,560 --> 01:37:19,840 Speaker 1: None of this non-hierarchical bullshit, the account elaborated. 1816 01:37:19,960 --> 01:37:22,040 Speaker 1: Everyone should have a ranking of their partners, people should 1817 01:37:22,040 --> 01:37:23,439 Speaker 1: know where they fall in the ranking, and there should 1818 01:37:23,439 --> 01:37:28,080 Speaker 1: be vicious power struggles for the ranks. God, it gets worse, Jamie. 1819 01:37:28,360 --> 01:37:32,320 Speaker 1: The Ellison-linked account also demonstrated a substantial preoccupation with 1820 01:37:32,560 --> 01:37:36,360 Speaker 1: HBD, or human biodiversity, an online euphemism for the 1821 01:37:36,400 --> 01:37:42,000 Speaker 1: discredited fields of race science and eugenics popularized by the right. Oh, 1822 01:37:42,040 --> 01:37:44,559 Speaker 1: oh boy. Give me one more paragraph, and then we 1823 01:37:44,560 --> 01:37:48,280 Speaker 1: can talk about this, Jamie. Ellison has for years vocalized 1824 01:37:48,280 --> 01:37:51,320 Speaker 1: her die-hard obsession with Harry Potter, and in one post 1825 01:37:51,439 --> 01:37:54,439 Speaker 1: her affiliated Tumblr account tied her love of online character 1826 01:37:54,520 --> 01:37:57,559 Speaker 1: quizzes to her penchant for sorting Indians by 1827 01:37:57,600 --> 01:38:01,719 Speaker 1: their caste, which she presumed to indicate genetic distinction. Oh 1828 01:38:01,760 --> 01:38:06,759 Speaker 1: my god. Even J. K. Rowling wasn't thinking something 1829 01:38:06,800 --> 01:38:10,040 Speaker 1: that fucked up, and that's really saying something. That 1830 01:38:10,160 --> 01:38:16,519 Speaker 1: is amazing. Astonishing.
Tumblr is exclusively for Sherlock fan fiction 1831 01:38:16,600 --> 01:38:19,840 Speaker 1: and things that trigger my eating disorder. That's it. I 1832 01:38:19,880 --> 01:38:24,599 Speaker 1: cannot, like... Oh, that's so dark. It's... I almost can't 1833 01:38:24,600 --> 01:38:27,719 Speaker 1: fathom it. Right. And again, there's so much. We probably 1834 01:38:27,760 --> 01:38:29,360 Speaker 1: will do a follow up at some point, because, like, 1835 01:38:29,600 --> 01:38:31,639 Speaker 1: the fact that, the fact that it was this easy 1836 01:38:31,720 --> 01:38:34,120 Speaker 1: for, like, some fucking crypto rags, they're the ones who 1837 01:38:34,120 --> 01:38:37,599 Speaker 1: reported on it, to find her Tumblr where she talks 1838 01:38:37,640 --> 01:38:40,240 Speaker 1: about race science makes me think these guys were all 1839 01:38:40,240 --> 01:38:42,240 Speaker 1: probably into a lot more fucked up shit than they 1840 01:38:42,320 --> 01:38:48,360 Speaker 1: let on. Yeah. So we'll see, we'll see. I just... 1841 01:38:50,560 --> 01:38:54,120 Speaker 1: we don't have the hour for me to decompress the 1842 01:38:54,120 --> 01:38:56,640 Speaker 1: way I need to after hearing that sentence specifically. I 1843 01:38:57,040 --> 01:39:01,080 Speaker 1: need a cigarette. I'm going to start smoking today, and 1844 01:39:01,160 --> 01:39:04,599 Speaker 1: I hope she's fucking happy. Oh my god, that is, 1845 01:39:04,720 --> 01:39:12,800 Speaker 1: that is a brutal place to land.
Hopefully all 1846 01:39:12,840 --> 01:39:15,759 Speaker 1: of their money's gone, but they probably squirreled away millions 1847 01:39:15,760 --> 01:39:18,280 Speaker 1: and stuff for themselves. Although at least one of the 1848 01:39:18,360 --> 01:39:20,800 Speaker 1: articles I've read says that, like, his net worth is 1849 01:39:20,880 --> 01:39:24,759 Speaker 1: effectively zero now. But I don't think anyone actually knows 1850 01:39:24,880 --> 01:39:27,439 Speaker 1: what his net worth is right now, like, and how 1851 01:39:27,520 --> 01:39:30,840 Speaker 1: much he got. Like, his company went from being valued at 1852 01:39:30,880 --> 01:39:33,320 Speaker 1: thirty two billion dollars to, most recently, there's something like 1853 01:39:33,320 --> 01:39:37,880 Speaker 1: six fifty dollars in actual assets left. Um. But I 1854 01:39:37,920 --> 01:39:40,320 Speaker 1: also kind of think he and the others probably have 1855 01:39:40,640 --> 01:39:44,800 Speaker 1: millions or tens of millions that they set aside, um, 1856 01:39:44,840 --> 01:39:48,800 Speaker 1: in shady ways, for themselves. Yeah, I mean, well, that 1857 01:39:48,880 --> 01:39:53,920 Speaker 1: does sound like what the Beanie Babies guy would do, 1858 01:39:53,960 --> 01:39:57,040 Speaker 1: and that is my yardstick for morality. That is so, 1859 01:39:57,280 --> 01:40:01,439 Speaker 1: um, that, that is very, very scary to, uh, to consider. 1860 01:40:02,280 --> 01:40:04,920 Speaker 1: I feel like guys like that, the thing is, 1861 01:40:05,400 --> 01:40:07,920 Speaker 1: I mean, I don't know, whatever. You hear about, like, 1862 01:40:07,960 --> 01:40:12,080 Speaker 1: it is so karmically satisfying to know that someone like 1863 01:40:12,280 --> 01:40:16,720 Speaker 1: SBF can, can be completely bottomed out and, like, destroyed 1864 01:40:16,800 --> 01:40:19,040 Speaker 1: by something like this.
But the thing is, like, when, 1865 01:40:19,080 --> 01:40:22,680 Speaker 1: when rich guys like that lose everything, they just come 1866 01:40:22,760 --> 01:40:26,960 Speaker 1: up with worse, more hateful ideas and then come back. 1867 01:40:27,160 --> 01:40:29,240 Speaker 1: And that is, like, always what kind of scares me 1868 01:40:29,280 --> 01:40:34,920 Speaker 1: about that. It is so funny. Um. I don't know, Jamie, 1869 01:40:34,920 --> 01:40:36,640 Speaker 1: I don't know what the solution is. That's not, like, 1870 01:40:36,680 --> 01:40:39,960 Speaker 1: I want him to have money again. I just, you know, what, 1871 01:40:40,240 --> 01:40:43,400 Speaker 1: how can you make someone... say less? Again, I think, 1872 01:40:43,439 --> 01:40:49,559 Speaker 1: the podcaster's dilemma. Look, here's, here's where I'll land on this. 1873 01:40:50,000 --> 01:40:53,320 Speaker 1: You know, I don't think, I don't think people should 1874 01:40:53,360 --> 01:40:56,920 Speaker 1: be thrown into cages, generally, unless there's literally no other 1875 01:40:56,960 --> 01:40:59,439 Speaker 1: way to stop them from harming folks, and I 1876 01:40:59,479 --> 01:41:02,240 Speaker 1: don't think that's the case with these people. So instead, 1877 01:41:04,840 --> 01:41:09,200 Speaker 1: I think the actual solution is to close from the 1878 01:41:09,240 --> 01:41:13,920 Speaker 1: outside all of the doors to the fucking rich 1879 01:41:14,000 --> 01:41:18,760 Speaker 1: person apartment complex they occupy in that Bahamas development, lock 1880 01:41:18,800 --> 01:41:21,840 Speaker 1: it from the outside, and once a week drop in 1881 01:41:21,920 --> 01:41:26,040 Speaker 1: food and necessities via a helicopter, and never let them 1882 01:41:26,080 --> 01:41:29,320 Speaker 1: leave or use the internet again.
Let them all just be 1883 01:41:29,439 --> 01:41:31,919 Speaker 1: with their friends in their, in their weird little compound, 1884 01:41:31,920 --> 01:41:35,200 Speaker 1: going increasingly insane with their Chinese harem shit. I don't know, 1885 01:41:35,240 --> 01:41:37,360 Speaker 1: I guess that's another kind of prison. But if we 1886 01:41:37,439 --> 01:41:40,960 Speaker 1: film it, we can make money. But then SBF 1887 01:41:41,000 --> 01:41:43,799 Speaker 1: may have to face his worst fear, which is reading 1888 01:41:43,800 --> 01:41:47,799 Speaker 1: a book. Yeah. I guess, like, like, my serious answer 1889 01:41:47,800 --> 01:41:49,240 Speaker 1: to what do you do with people like this is: 1890 01:41:49,320 --> 01:41:53,040 Speaker 1: you stop them from ever being able to have access 1891 01:41:53,080 --> 01:41:57,280 Speaker 1: to money again, or start companies again, and hopefully eventually 1892 01:41:57,360 --> 01:42:01,040 Speaker 1: they find something to do that actually helps human beings 1893 01:42:01,120 --> 01:42:04,120 Speaker 1: and is, like, like, of any kind of use. 1894 01:42:04,479 --> 01:42:06,920 Speaker 1: Like, working at a grocery store, that's a 1895 01:42:07,000 --> 01:42:10,000 Speaker 1: real benefit. People need to get food, and people need, 1896 01:42:10,160 --> 01:42:14,320 Speaker 1: like, that's a respectable, honest way to make a living. Um. 1897 01:42:14,400 --> 01:42:16,559 Speaker 1: And if any of these people were to get a 1898 01:42:16,640 --> 01:42:19,160 Speaker 1: job working in a Safeway, they would be providing 1899 01:42:19,360 --> 01:42:22,400 Speaker 1: an infinitely greater benefit to the human race than they 1900 01:42:22,400 --> 01:42:25,240 Speaker 1: could ever have performed. Perhaps a bit more of that, 1901 01:42:25,240 --> 01:42:29,320 Speaker 1: that ethical side of the effective altruism you're looking for. I'll 1902 01:42:29,320 --> 01:42:31,400 Speaker 1: be honest.
I did not come to the recording today 1903 01:42:31,400 --> 01:42:34,080 Speaker 1: with a solution for the prison industrial complex. But I don't, 1904 01:42:34,280 --> 01:42:37,639 Speaker 1: but I still don't like it. Um, I still don't 1905 01:42:37,720 --> 01:42:45,240 Speaker 1: love it. Um. Yeah, yeah, I have no solution. Um. 1906 01:42:45,280 --> 01:42:48,880 Speaker 1: But you know what I do have, Jamie. What, your plugs? 1907 01:42:50,200 --> 01:42:53,120 Speaker 1: No you don't, I have those. Well, you have them, 1908 01:42:53,120 --> 01:42:56,280 Speaker 1: but I'm letting you have them. Oh my god. I mean, 1909 01:42:57,320 --> 01:42:59,479 Speaker 1: I mean, I could probably do them. If nobody wants 1910 01:42:59,520 --> 01:43:02,040 Speaker 1: to take this job, I'll do it. I can, I'll 1911 01:43:02,040 --> 01:43:04,519 Speaker 1: do them. Sophie, do you want to do them? Yeah, 1912 01:43:04,560 --> 01:43:08,479 Speaker 1: I mean, you can pre-order Jamie's book, um, and 1913 01:43:08,479 --> 01:43:13,040 Speaker 1: that, that is linked in her Instagram bio. Um. 1914 01:43:13,160 --> 01:43:17,320 Speaker 1: You can follow her on Instagram at Jamie Christ Superstar, 1915 01:43:17,680 --> 01:43:19,400 Speaker 1: and you can follow her on Twitter at Jamie Loftus 1916 01:43:19,520 --> 01:43:22,639 Speaker 1: Help, if Twitter is still around. Uh. She has a 1917 01:43:22,640 --> 01:43:26,639 Speaker 1: podcast that she co-hosts with Caitlin Durante called The Bechdel Cast. 1918 01:43:27,040 --> 01:43:30,200 Speaker 1: You should listen to her many limited run series, including 1919 01:43:30,200 --> 01:43:33,840 Speaker 1: her most recent run, which is Ghost Church, which Sophie 1920 01:43:33,920 --> 01:43:37,240 Speaker 1: produced, along with The Bechdel Cast and everything, every podcast 1921 01:43:37,320 --> 01:43:40,400 Speaker 1: on the planet. This has now become a plug for me. Um, 1922 01:43:40,520 --> 01:43:44,000 Speaker 1: do I get everything, Jamie?
Yeah, that's exactly what I 1923 01:43:44,000 --> 01:43:48,120 Speaker 1: would have said, except worse. Yeah. So pre, pre, pre-order 1924 01:43:48,280 --> 01:43:53,519 Speaker 1: Raw Dog. Yeah, thanks, Sophie. Yeah, pre-order Raw Dog. 1925 01:43:53,560 --> 01:43:56,839 Speaker 1: If you listened to the hot dog episode of Bastards 1926 01:43:56,840 --> 01:43:59,439 Speaker 1: and didn't like it, you did like it. Now buy 1927 01:43:59,479 --> 01:44:02,639 Speaker 1: the book. Yeah, legally you did like it. And if 1928 01:44:02,680 --> 01:44:06,599 Speaker 1: you disagree with that statement, we will send the CIA 1929 01:44:06,840 --> 01:44:10,040 Speaker 1: to kill your family. Yeah. And, and let's make 1930 01:44:10,040 --> 01:44:14,439 Speaker 1: sure to attribute that quote to Robert Evans specifically. And 1931 01:44:14,520 --> 01:44:20,080 Speaker 1: speaking specifically, Robert Evans specifically, Margaret Killjoy specifically, and 1932 01:44:20,160 --> 01:44:23,880 Speaker 1: myself will be doing a Behind the Bastards virtual livestream 1933 01:44:23,920 --> 01:44:27,560 Speaker 1: show on December eighth. You can get tickets at 1934 01:44:27,960 --> 01:44:33,160 Speaker 1: moment dot co slash BtB. Yeah. Also, I have a 1935 01:44:33,200 --> 01:44:37,400 Speaker 1: Substack now, because Twitter's not doing great. So, that 1936 01:44:37,479 --> 01:44:42,080 Speaker 1: makes me so sad. Why? I like writing things. I 1937 01:44:42,080 --> 01:44:45,439 Speaker 1: got to write a thing last night. Write more things, 1938 01:44:46,400 --> 01:44:49,360 Speaker 1: better for me than being on Twitter all the goddamn time. Yeah, 1939 01:44:49,640 --> 01:44:52,120 Speaker 1: you can find it at shatterzone dot substack 1940 01:44:52,240 --> 01:44:54,880 Speaker 1: dot com. Go there, and I will be writing. I'll 1941 01:44:54,880 --> 01:44:57,240 Speaker 1: try to write something every week.
Maybe I won't. Maybe 1942 01:44:57,240 --> 01:45:01,040 Speaker 1: all of this will fall apart, be lost in time, 1943 01:45:01,120 --> 01:45:04,040 Speaker 1: like tears in rain. Or maybe you'll get a 1944 01:45:04,040 --> 01:45:05,800 Speaker 1: new thing from me every week. There's no way to 1945 01:45:05,840 --> 01:45:08,479 Speaker 1: know. Every time... half the time, when I get something 1946 01:45:08,520 --> 01:45:10,880 Speaker 1: from someone's Substack, because I have subscribed to quite a few, 1947 01:45:11,000 --> 01:45:12,960 Speaker 1: but every time I get a message from someone's Substack, 1948 01:45:13,000 --> 01:45:16,200 Speaker 1: it's always like, I'm so sorry, and, like, I didn't notice. 1949 01:45:16,439 --> 01:45:18,599 Speaker 1: Just give me the content, I signed up to hear 1950 01:45:18,640 --> 01:45:22,720 Speaker 1: about it. Yeah, yeah, it's fine. Look, we all... I don't know. 1951 01:45:23,240 --> 01:45:25,880 Speaker 1: I'm meaning to write more stuff as opposed to just 1952 01:45:25,920 --> 01:45:29,439 Speaker 1: tweeting shitposts. So maybe it will happen, maybe it won't. 1953 01:45:29,479 --> 01:45:35,840 Speaker 1: There's no way to know. Perfect. Bye! Behind the Bastards 1954 01:45:35,840 --> 01:45:38,479 Speaker 1: is a production of Cool Zone Media. For more from 1955 01:45:38,479 --> 01:45:42,160 Speaker 1: Cool Zone Media, visit our website coolzonemedia dot com, 1956 01:45:42,320 --> 01:45:44,639 Speaker 1: or check us out on the iHeartRadio app, 1957 01:45:44,720 --> 01:45:47,160 Speaker 1: Apple Podcasts, or wherever you get your podcasts.