1 00:00:01,800 --> 00:00:05,640 Speaker 1: Welcome to the movies you liked as an adolescent and 2 00:00:05,680 --> 00:00:08,920 Speaker 1: are now ashamed of shame cast. I'm Robert Evans and 3 00:00:08,960 --> 00:00:14,480 Speaker 1: today in the seat of eternal self hatred Jamie Loftus. Jamie, 4 00:00:14,760 --> 00:00:18,040 Speaker 1: you have just admitted, prior to the show starting, that 5 00:00:18,160 --> 00:00:22,400 Speaker 1: you once loved the Dana Carvey vehicle Master of Disguise. 6 00:00:22,800 --> 00:00:24,240 Speaker 1: What do you have to say for yourself? 7 00:00:24,960 --> 00:00:27,960 Speaker 2: I feel fucking sick with myself, Robert. I haven't been 8 00:00:27,960 --> 00:00:31,639 Speaker 2: able to sleep in the twenty years since its release. 9 00:00:32,200 --> 00:00:35,120 Speaker 2: In my defense, it came out on my birthday, which 10 00:00:35,120 --> 00:00:37,120 Speaker 2: I feel like had a lot to do with why 11 00:00:37,159 --> 00:00:39,920 Speaker 2: I considered it my favorite movie. I felt a kinship 12 00:00:39,960 --> 00:00:40,199 Speaker 2: with it. 13 00:00:40,520 --> 00:00:44,240 Speaker 1: Yeah, birthdays are like a performance enhancing drug for movies 14 00:00:44,280 --> 00:00:45,440 Speaker 1: that you see when you're eleven. 15 00:00:46,520 --> 00:00:48,520 Speaker 2: Absolutely true, the blood. 16 00:00:48,240 --> 00:00:50,879 Speaker 1: Doping of positive movie memories. 17 00:00:51,360 --> 00:00:54,280 Speaker 2: And furthermore, it's the most famous children's movie that was 18 00:00:54,280 --> 00:00:57,240 Speaker 2: shooting on nine eleven. And so, thank you, 19 00:00:57,360 --> 00:01:02,480 Speaker 2: I always felt like it would have been disloyal to 20 00:01:02,560 --> 00:01:07,080 Speaker 2: my country to say a word against the Master of Disguise, 21 00:01:07,520 --> 00:01:11,039 Speaker 2: particularly the Turtle Turtle scene.
However, you know, I think 22 00:01:11,080 --> 00:01:15,000 Speaker 2: the movie certainly doesn't hold up and I feel fucking 23 00:01:15,120 --> 00:01:16,960 Speaker 2: sick with myself every single day. 24 00:01:17,680 --> 00:01:22,120 Speaker 1: Yeah, I wonder, because famously Dana Carvey was dressed as 25 00:01:22,160 --> 00:01:24,840 Speaker 1: the Turtle Man when those planes hit those towers, and 26 00:01:24,959 --> 00:01:30,319 Speaker 1: James Cameron was twenty thousand feet below sea level exploring 27 00:01:30,360 --> 00:01:32,360 Speaker 1: the bottom of the ocean, and I wonder if they 28 00:01:32,400 --> 00:01:35,200 Speaker 1: ever cross paths at like a Hollywood event and started 29 00:01:35,200 --> 00:01:37,280 Speaker 1: talking about nine eleven. So what were you up 30 00:01:37,319 --> 00:01:38,160 Speaker 1: to on that day? 31 00:01:39,920 --> 00:01:43,240 Speaker 2: And then Mark Wahlberg just leans in apropos of nothing 32 00:01:43,280 --> 00:01:43,880 Speaker 2: and is like. 33 00:01:44,040 --> 00:01:47,920 Speaker 1: If I was there, I would have stopped it. Oh, Jamie, 34 00:01:48,080 --> 00:01:51,280 Speaker 1: you know, I can honestly say nine eleven was 35 00:01:51,280 --> 00:01:53,080 Speaker 1: the first time I felt like this country let me 36 00:01:53,200 --> 00:01:58,080 Speaker 1: down, because it delayed the release of the seminal Tim 37 00:01:58,080 --> 00:02:02,560 Speaker 1: Allen film Big Trouble, which, it sure, it sure did, 38 00:02:02,640 --> 00:02:05,960 Speaker 1: features a classic Patrick Warburton performance, by the way. It 39 00:02:06,280 --> 00:02:09,720 Speaker 1: sure do. Goddamn right. You see his ass, everybody. If 40 00:02:09,800 --> 00:02:12,600 Speaker 1: you want to see Patrick Warburton's ass, it's in that movie. 41 00:02:13,240 --> 00:02:17,079 Speaker 2: He is so underrated. I absolutely love, love that man. 42 00:02:17,320 --> 00:02:18,640 Speaker 1: A total of talent.
43 00:02:20,840 --> 00:02:24,000 Speaker 3: Friends, just to say, this is Behind the Ba, this. 44 00:02:23,919 --> 00:02:27,560 Speaker 1: Is Behind the Bastards, a podcast about none of the 45 00:02:27,560 --> 00:02:32,200 Speaker 1: things that we were talking about. Uh, and today, actually, Jamie, 46 00:02:32,360 --> 00:02:35,080 Speaker 1: I've got you back in the hot seat, back in 47 00:02:35,160 --> 00:02:38,120 Speaker 1: the office, which is more of an ephemeral feeling than 48 00:02:38,160 --> 00:02:40,280 Speaker 1: a physical space, making. 49 00:02:40,080 --> 00:02:43,119 Speaker 2: Me answer for my sins right off the jump. 50 00:02:43,000 --> 00:02:46,720 Speaker 1: Yeah, by talking again about our friend Sam Bankman-Fried, 51 00:02:47,560 --> 00:02:50,160 Speaker 1: who you and I chatted about right after his life 52 00:02:50,200 --> 00:02:53,560 Speaker 1: collapsed last year, and I feel like we should do an update. 53 00:02:54,480 --> 00:02:56,800 Speaker 2: I think we should too, because I'll be honest, I 54 00:02:57,000 --> 00:03:01,480 Speaker 2: have done truly everything I can to avoid knowing more 55 00:03:01,520 --> 00:03:04,080 Speaker 2: about him, so I would say I know basically nothing 56 00:03:04,120 --> 00:03:05,480 Speaker 2: about him since we last spoke. 57 00:03:05,720 --> 00:03:08,400 Speaker 1: Yeah, it's, it's amazing because normally, you know, I'm an 58 00:03:08,440 --> 00:03:11,320 Speaker 1: empathetic being. Like, Sam has a face that I've just 59 00:03:11,360 --> 00:03:13,399 Speaker 1: always wanted to hit from the first time I saw 60 00:03:13,400 --> 00:03:16,760 Speaker 1: a picture of him.
And normally when somebody goes through 61 00:03:16,960 --> 00:03:19,639 Speaker 1: this much shit, when like their life is this ruined, right, 62 00:03:20,080 --> 00:03:22,720 Speaker 1: like, I might feel a little less like hitting 63 00:03:22,800 --> 00:03:25,080 Speaker 1: them because the world has hit them, but I still 64 00:03:25,160 --> 00:03:26,920 Speaker 1: kind of want to sock him in the fucking jaw 65 00:03:28,440 --> 00:03:30,079 Speaker 1: every time I see this guy. 66 00:03:32,320 --> 00:03:35,840 Speaker 2: Let me just check, has his face changed? 67 00:03:36,440 --> 00:03:39,000 Speaker 1: He wears suits sometimes now when he goes to court. 68 00:03:39,040 --> 00:03:41,720 Speaker 1: He's not wearing the basketball shorts. 69 00:03:41,200 --> 00:03:44,640 Speaker 2: Helpful. And, well, oh, that's true. God, yeah, that's like 70 00:03:44,720 --> 00:03:50,240 Speaker 2: two different versions of an embarrassing, desperate way to present. Okay, 71 00:03:50,280 --> 00:03:51,880 Speaker 2: he's wearing a, he's wearing a suit 72 00:03:51,880 --> 00:03:54,640 Speaker 1: now. Yeah, he's wearing a suit now. It looks like shit, 73 00:03:54,840 --> 00:03:58,040 Speaker 1: but whatever. Of course, I try not to judge people 74 00:03:58,040 --> 00:04:01,640 Speaker 1: on how they look unless that's part of the con, 75 00:04:01,800 --> 00:04:04,560 Speaker 1: and Sam is a guy for whom dressing like 76 00:04:04,600 --> 00:04:09,240 Speaker 1: a slob was always part of his, his, like, tech bro genius, 77 00:04:09,280 --> 00:04:13,160 Speaker 1: you know, persona that he was putting on, like it 78 00:04:13,440 --> 00:04:19,039 Speaker 1: fooled everybody. A mug shot? Oh yeah, yeah, I 79 00:04:19,040 --> 00:04:20,640 Speaker 1: believe there's a mugshot of him out at 80 00:04:20,680 --> 00:04:24,320 Speaker 1: this point, certainly from when he was in the Bahamas.
So 81 00:04:24,360 --> 00:04:27,120 Speaker 1: when we last left our buddy Sam in November of 82 00:04:27,120 --> 00:04:29,200 Speaker 1: twenty twenty two, he had been arrested in the Bahamas 83 00:04:29,200 --> 00:04:31,880 Speaker 1: and extradited to the United States, where he was charged 84 00:04:31,920 --> 00:04:35,000 Speaker 1: with so many financial crimes that he might theoretically spend 85 00:04:35,040 --> 00:04:36,880 Speaker 1: more than one hundred and fifteen years in prison. 86 00:04:37,680 --> 00:04:38,000 Speaker 2: Wow. 87 00:04:38,040 --> 00:04:41,280 Speaker 1: Now, in the days and months since, a lot has happened, 88 00:04:41,279 --> 00:04:43,159 Speaker 1: and a lot more has come out about how the 89 00:04:43,160 --> 00:04:46,680 Speaker 1: former crypto mogul behaved before and after his fall. I 90 00:04:46,720 --> 00:04:49,320 Speaker 1: want to start with some of the latter information, because 91 00:04:49,320 --> 00:04:51,800 Speaker 1: by far the most entertaining story to drop as a 92 00:04:51,800 --> 00:04:55,279 Speaker 1: result of these serried legal filings against Sam is that 93 00:04:55,600 --> 00:04:58,960 Speaker 1: he was using the nonprofit arm of FTX to attempt 94 00:04:59,000 --> 00:05:00,960 Speaker 1: to buy a sovereign nation he could use as an 95 00:05:00,960 --> 00:05:04,719 Speaker 1: apocalypse shelter. That is by far the funniest story that's 96 00:05:04,800 --> 00:05:08,279 Speaker 1: dropped about these guys in the days and months since. 97 00:05:09,000 --> 00:05:10,400 Speaker 2: What was the plan there? 98 00:05:10,960 --> 00:05:16,440 Speaker 1: Oh, that's a great question, Jamie. So let's talk about 99 00:05:16,480 --> 00:05:20,360 Speaker 1: the island of Nauru. It's an island in the southwest Pacific.
100 00:05:20,360 --> 00:05:22,599 Speaker 1: I think it's about twenty one hundred miles away from 101 00:05:22,600 --> 00:05:25,279 Speaker 1: the coast of Australia, which, given the fact that Australia 102 00:05:25,360 --> 00:05:27,159 Speaker 1: is really out in the middle of nowhere, is pretty 103 00:05:27,200 --> 00:05:32,080 Speaker 1: close to Australia. It is presently the world's smallest island nation. 104 00:05:32,279 --> 00:05:34,600 Speaker 1: It's got a population of about twelve thousand or so, 105 00:05:34,760 --> 00:05:38,279 Speaker 1: not a ton of people. And as an incredibly tiny country, 106 00:05:38,320 --> 00:05:40,760 Speaker 1: one of its primary assets is simply the fact that 107 00:05:40,880 --> 00:05:44,320 Speaker 1: it is a sovereign nation, because there's things that countries 108 00:05:44,360 --> 00:05:47,080 Speaker 1: can do that nothing else can do, like issue certain 109 00:05:47,160 --> 00:05:49,800 Speaker 1: kinds of passports and visas and do certain kinds of 110 00:05:49,800 --> 00:05:52,440 Speaker 1: things with banking, right. So, if you're a really tiny 111 00:05:52,560 --> 00:05:56,080 Speaker 1: country that doesn't have like a shitload of natural resources, 112 00:05:56,360 --> 00:05:59,360 Speaker 1: one thing you can export is the benefits of your sovereignty 113 00:05:59,440 --> 00:06:03,040 Speaker 1: to, say, really rich people who, who might want certain 114 00:06:03,200 --> 00:06:05,560 Speaker 1: things that you can do as a country. 115 00:06:05,680 --> 00:06:07,599 Speaker 2: The plan is coming together. 116 00:06:08,480 --> 00:06:11,960 Speaker 1: So there's a number of ways in which Nauru has 117 00:06:12,000 --> 00:06:14,120 Speaker 1: kind of taken advantage of this to get by.
One 118 00:06:14,160 --> 00:06:16,840 Speaker 1: of them is that they've sort of sold access to 119 00:06:16,880 --> 00:06:20,680 Speaker 1: their land to Australia to use, so that Australia has 120 00:06:20,760 --> 00:06:23,839 Speaker 1: used them for years as an offshore processing center for 121 00:06:23,880 --> 00:06:27,920 Speaker 1: asylum seekers. I think this stopped most recently in twenty nineteen, 122 00:06:28,000 --> 00:06:29,680 Speaker 1: but there's been a couple of waves of this, and 123 00:06:29,720 --> 00:06:32,560 Speaker 1: it was not a pleasant place, right. Conditions were so 124 00:06:32,720 --> 00:06:36,320 Speaker 1: brutal in sort of the offshore processing center on Nauru 125 00:06:36,480 --> 00:06:39,880 Speaker 1: from twenty twelve to twenty nineteen that several residents carried 126 00:06:39,920 --> 00:06:44,039 Speaker 1: out like deadly forms of protest, sewing their own lips shut 127 00:06:44,120 --> 00:06:46,760 Speaker 1: or lighting themselves on fire as a protest of the 128 00:06:46,800 --> 00:06:52,000 Speaker 1: conditions they were facing. Pretty ugly scene. In the late 129 00:06:52,080 --> 00:06:54,600 Speaker 1: nineteen nineties, kind of prior to this period, Nauru was 130 00:06:54,640 --> 00:06:58,440 Speaker 1: the chief money laundering location for the emerging Russian oligarch class. 131 00:06:59,000 --> 00:07:00,960 Speaker 1: They helped a lot of these, all the oligarchs you've 132 00:07:01,000 --> 00:07:03,279 Speaker 1: heard about in the context of Putin, launder about 133 00:07:03,320 --> 00:07:05,960 Speaker 1: seventy billion dollars in ill-gotten funds during the 134 00:07:06,000 --> 00:07:07,839 Speaker 1: early stages of the Russian Federation. 135 00:07:08,720 --> 00:07:10,320 Speaker 2: It has a good Bastards cameo. 136 00:07:10,520 --> 00:07:14,120 Speaker 1: Oh yeah, no, Nauru's, Nauru is adjacent to a whole 137 00:07:14,120 --> 00:07:18,119 Speaker 1: lot of shitty people.
Great, yeah, here, it's a lovely place. 138 00:07:18,840 --> 00:07:22,040 Speaker 1: Nauru was also designated a money laundering state by the US 139 00:07:22,080 --> 00:07:24,600 Speaker 1: Treasury in two thousand and two, which led to sanctions, 140 00:07:24,600 --> 00:07:26,800 Speaker 1: which I think is probably why they moved to like 141 00:07:26,920 --> 00:07:29,920 Speaker 1: letting Australia offshore migrants there for a while. And since 142 00:07:29,960 --> 00:07:33,920 Speaker 1: Australia stopped doing that in twenty nineteen, Sam Bankman-Fried 143 00:07:34,000 --> 00:07:36,720 Speaker 1: and his fellow effective altruists felt like they might have 144 00:07:36,720 --> 00:07:39,320 Speaker 1: had an opportunity there, right. Like, Nauru's kind of looking 145 00:07:39,320 --> 00:07:42,160 Speaker 1: for some new cash flow. They're looking for a sovereign 146 00:07:42,280 --> 00:07:44,080 Speaker 1: nation to do some things for. 147 00:07:44,840 --> 00:07:48,360 Speaker 2: Well, it's an opportunity to be effective. 148 00:07:48,280 --> 00:07:53,360 Speaker 1: Yeah, yeah, to be effectively altruistic towards yourself. Specifically, Sam 149 00:07:53,480 --> 00:07:56,280 Speaker 1: and his brother, Gabriel Bankman-Fried is actually the guy 150 00:07:56,400 --> 00:08:02,520 Speaker 1: sort of like organizing this attempted endeavor, using FTX's charitable 151 00:08:02,800 --> 00:08:05,960 Speaker 1: donations arm, and their goal was to purchase the entire 152 00:08:06,000 --> 00:08:09,640 Speaker 1: island in order to construct what Gabe called a bunker 153 00:08:09,760 --> 00:08:12,640 Speaker 1: shelter that would be used to quote, ensure that most 154 00:08:12,640 --> 00:08:16,120 Speaker 1: effective altruists survive in the event that between fifty percent 155 00:08:16,160 --> 00:08:18,240 Speaker 1: and ninety nine point nine nine percent of the 156 00:08:18,240 --> 00:08:20,840 Speaker 1: world population perish in a catastrophe.
157 00:08:21,760 --> 00:08:25,000 Speaker 2: Jesus. And that's like a pretty common, this, I feel 158 00:08:25,040 --> 00:08:27,679 Speaker 2: like the Gabe of the situation is a very common 159 00:08:27,880 --> 00:08:33,200 Speaker 2: character in Bastards, like, just the devious brother. I mean, 160 00:08:33,320 --> 00:08:35,760 Speaker 2: I hate the bastard most of all, but I really 161 00:08:35,800 --> 00:08:38,640 Speaker 2: detest the devious brother as well. There's just, it just 162 00:08:38,720 --> 00:08:41,439 Speaker 2: reeks of insecurity. Get your own grift, man. 163 00:08:41,360 --> 00:08:43,719 Speaker 1: Especially since they're framing it not as, look, you know, 164 00:08:43,760 --> 00:08:45,360 Speaker 1: when you got like a guy like Peter Thiel, right, 165 00:08:45,400 --> 00:08:47,800 Speaker 1: and everybody knows Peter Thiel's got like an evil rich 166 00:08:47,840 --> 00:08:50,080 Speaker 1: guy bunker to wait out the end of the world 167 00:08:50,120 --> 00:08:52,400 Speaker 1: if it happens, and like, fuck Peter Thiel, but at 168 00:08:52,480 --> 00:08:55,160 Speaker 1: least, Peter Thiel's not pretending I have a bunker 169 00:08:55,200 --> 00:08:57,880 Speaker 1: to, like, save the world by putting aside just the 170 00:08:57,920 --> 00:08:59,640 Speaker 1: best people. He's like, no, I'm a giant piece of 171 00:08:59,640 --> 00:09:02,440 Speaker 1: shit and I'm going to save myself if things go wrong. Okay, 172 00:09:02,720 --> 00:09:05,520 Speaker 1: like, fuck Peter Thiel, but at least it's honest. They're 173 00:09:05,559 --> 00:09:08,240 Speaker 1: framing it as, my blood bunker for just boys. Yeah, 174 00:09:08,760 --> 00:09:12,520 Speaker 1: it's me and my blood boys.
Gabe is like, no, 175 00:09:12,640 --> 00:09:14,880 Speaker 1: we have to, in the event there's an apocalypse, we 176 00:09:14,960 --> 00:09:17,600 Speaker 1: have to save all the EAs because they're the best people, 177 00:09:17,880 --> 00:09:20,200 Speaker 1: and that's what's best for the world, that's what we'll do. 178 00:09:20,559 --> 00:09:23,440 Speaker 1: We're utilitarians, right? The greatest good for the greatest number 179 00:09:23,440 --> 00:09:25,520 Speaker 1: of people is to save all of the best people, 180 00:09:25,720 --> 00:09:28,319 Speaker 1: which is me and my friends, the other finance kids 181 00:09:28,480 --> 00:09:30,800 Speaker 1: who call themselves effective altruists so they don't have to 182 00:09:30,800 --> 00:09:33,040 Speaker 1: feel bad for the fact that all they do is 183 00:09:33,080 --> 00:09:36,040 Speaker 1: play the stock market like every other piece of shit. Anyway. 184 00:09:36,240 --> 00:09:37,680 Speaker 2: So lucky that they all met each other. 185 00:09:38,040 --> 00:09:41,760 Speaker 1: So lucky, all the good people. I would love, I 186 00:09:41,800 --> 00:09:45,320 Speaker 1: would love, honestly, like, look, when the strike is over, 187 00:09:45,559 --> 00:09:47,920 Speaker 1: somebody at a network, bring me on. I will write 188 00:09:47,960 --> 00:09:52,040 Speaker 1: you a banger fucking script about an apocalypse where just 189 00:09:52,120 --> 00:09:54,360 Speaker 1: the EA guys are left in their bunker trying to 190 00:09:54,400 --> 00:09:55,400 Speaker 1: figure out society. 191 00:09:55,520 --> 00:10:01,040 Speaker 2: Okay, another incentive to end the strike. Would fucking rip. 192 00:10:01,160 --> 00:10:03,360 Speaker 1: Yeah, yeah, we could do, we could do quite a tale. 193 00:10:03,400 --> 00:10:05,439 Speaker 1: We could have some fun with this one, Jamie.
194 00:10:07,000 --> 00:10:10,400 Speaker 2: I don't, I think that there would be some effective altruists, 195 00:10:10,840 --> 00:10:13,600 Speaker 2: uh, like, ridiculous enough to do cameos. 196 00:10:13,800 --> 00:10:16,320 Speaker 1: Oh, I believe we could get, we could get fucking 197 00:10:16,360 --> 00:10:20,000 Speaker 1: William MacAskill in there, no problem, bring his Scottish ass 198 00:10:20,040 --> 00:10:20,520 Speaker 1: on board. 199 00:10:20,679 --> 00:10:22,440 Speaker 2: Yeah, vain little perverts. 200 00:10:23,240 --> 00:10:26,320 Speaker 1: Oh, and they're all, they're all, I'm gonna be honest, 201 00:10:26,520 --> 00:10:28,679 Speaker 1: kind of stupid, so I bet we could trick him. 202 00:10:28,720 --> 00:10:32,560 Speaker 1: Like, you don't have journalistic ethics with an HBO show. Yeah, 203 00:10:32,600 --> 00:10:34,880 Speaker 1: we could just say we're bringing them on for an interview. 204 00:10:35,120 --> 00:10:37,560 Speaker 1: We could like film around him, like in, what was that 205 00:10:37,640 --> 00:10:46,240 Speaker 1: fucking movie with, uh? Oh yeah, Steve Martin, where they, 206 00:10:46,960 --> 00:10:49,600 Speaker 1: where they have to film the fake movie around, uh, 207 00:10:49,640 --> 00:10:50,960 Speaker 1: what is it, Chris Rock? Shit. 208 00:10:51,840 --> 00:10:54,880 Speaker 2: Oh, it's the, it's, oh. 209 00:10:55,960 --> 00:10:58,000 Speaker 1: See, now you're, now, now, now it's, I know, it's 210 00:10:58,040 --> 00:11:00,160 Speaker 1: driving me nuts. We have to figure this out. Uh. 211 00:11:00,440 --> 00:11:02,160 Speaker 2: I rewatched it recently. 212 00:11:02,679 --> 00:11:03,319 Speaker 1: It holds up. 213 00:11:04,280 --> 00:11:06,800 Speaker 2: It does hold up. It's like one of the best movies. 214 00:11:07,520 --> 00:11:09,280 Speaker 1: Bowfinger. Fucking dope. 215 00:11:09,400 --> 00:11:09,800 Speaker 2: Bowfinger.
216 00:11:09,960 --> 00:11:12,400 Speaker 1: We could do a Bowfinger with William MacAskill, where 217 00:11:12,440 --> 00:11:15,360 Speaker 1: he thinks he's getting interviewed for a documentary and we're 218 00:11:15,400 --> 00:11:18,400 Speaker 1: really making him the bad guy of our HBO series. 219 00:11:20,240 --> 00:11:22,760 Speaker 1: Bring in Steve Martin too, fuck it. He's still, he's still 220 00:11:22,800 --> 00:11:23,120 Speaker 1: got it. 221 00:11:23,840 --> 00:11:26,199 Speaker 2: Yeah, he's got good politics. He'd be great. 222 00:11:26,520 --> 00:11:31,360 Speaker 1: Yeah. Uh, glad, glad we remembered it. Watch Bowfinger, guys. 223 00:11:31,400 --> 00:11:34,200 Speaker 2: It holds up truly. If you take anything away from today, 224 00:11:34,280 --> 00:11:36,560 Speaker 2: it really does. I was shocked at how well it held up. 225 00:11:36,640 --> 00:11:42,480 Speaker 1: Yeah, startlingly good movie. Pretty good, like, Scientology joke. So, 226 00:11:42,960 --> 00:11:47,240 Speaker 1: Gabriel Bankman-Fried ran FTX's charitable donations wing, which included 227 00:11:47,280 --> 00:11:49,720 Speaker 1: a ton of money for what many people have characterized 228 00:11:49,760 --> 00:11:53,440 Speaker 1: as political bribes. I'm not saying that he was bribing 229 00:11:53,480 --> 00:11:55,480 Speaker 1: politicians for FTX. I'm saying that's what a lot of 230 00:11:55,480 --> 00:11:58,400 Speaker 1: people have characterized what he was doing as. Now, that 231 00:11:58,559 --> 00:12:00,480 Speaker 1: much has been known for a while. But it was 232 00:12:00,520 --> 00:12:03,800 Speaker 1: not until the current management of what remains of FTX 233 00:12:03,920 --> 00:12:07,040 Speaker 1: sued Sam, because again there's new management that's trying to 234 00:12:07,080 --> 00:12:09,280 Speaker 1: recover as much money as possible and they're throwing Sam 235 00:12:09,320 --> 00:12:11,600 Speaker 1: under the bus.
Because why wouldn't they. And that's how 236 00:12:11,640 --> 00:12:13,520 Speaker 1: all this stuff got revealed, because they have all of 237 00:12:13,640 --> 00:12:17,000 Speaker 1: FTX's internal communications. So they found a bunch of shit, 238 00:12:17,480 --> 00:12:20,240 Speaker 1: like that Sam was having Gabriel try to buy the 239 00:12:20,280 --> 00:12:23,959 Speaker 1: island of Nauru. Now, it is unclear how serious their 240 00:12:23,960 --> 00:12:26,800 Speaker 1: attempt to buy this island was. A representative of the 241 00:12:26,840 --> 00:12:29,880 Speaker 1: island's government has been like, no, no, no, we were 242 00:12:29,880 --> 00:12:32,360 Speaker 1: never putting our island up for sale. This was never a 243 00:12:32,360 --> 00:12:35,520 Speaker 1: thing that was going to happen. And maybe that is 244 00:12:35,559 --> 00:12:38,240 Speaker 1: the case, maybe they were just being idiots fucking around. 245 00:12:38,520 --> 00:12:41,320 Speaker 1: But in a memo between Gabriel and an FTX officer, 246 00:12:41,360 --> 00:12:44,800 Speaker 1: the discussion was centered around the idea of buying the island, 247 00:12:44,840 --> 00:12:47,120 Speaker 1: of being in control of it as a sovereign country, 248 00:12:47,440 --> 00:12:50,040 Speaker 1: not just purchasing land, which is cool. 249 00:12:50,160 --> 00:12:51,920 Speaker 2: I mean, that's a lot of work for a bit. 250 00:12:52,080 --> 00:12:54,400 Speaker 2: Not that I put it past him, but I'm just like, 251 00:12:54,559 --> 00:12:55,839 Speaker 2: it's not sounding like. 252 00:12:56,040 --> 00:12:59,559 Speaker 1: Yeah, no, I think, I don't put it past them. It's 253 00:12:59,600 --> 00:13:03,840 Speaker 1: possible that Nauru, whatever government official is talking, is telling 254 00:13:03,840 --> 00:13:06,400 Speaker 1: the truth, they were never considering selling the island.
But 255 00:13:06,440 --> 00:13:09,280 Speaker 1: it's also possible that Gabe et al. believed they could 256 00:13:09,320 --> 00:13:10,679 Speaker 1: buy the island, right. 257 00:13:10,880 --> 00:13:14,079 Speaker 2: Right, they may in fact be that dumb. 258 00:13:14,080 --> 00:13:18,120 Speaker 1: Yeah, yeah, yeah. Now, there were discussions between these FTX 259 00:13:18,160 --> 00:13:22,280 Speaker 1: guys about using Nauru as a base for human genetic experimentation. 260 00:13:23,480 --> 00:13:25,520 Speaker 1: You get the feeling that their goal was to 261 00:13:25,520 --> 00:13:30,480 Speaker 1: create modified posthuman godlike bodies for their fellow effective altruists 262 00:13:30,480 --> 00:13:34,200 Speaker 1: so that they live forever and dominate mankind after the 263 00:13:34,240 --> 00:13:36,520 Speaker 1: collapse as undying immortals. 264 00:13:36,880 --> 00:13:43,360 Speaker 2: That just sent the ugliest photoshopped image, like, it was involuntary, 265 00:13:43,440 --> 00:13:50,439 Speaker 2: how quickly the, like, no neck, huge chest photoshops, super. 266 00:13:50,200 --> 00:13:55,679 Speaker 1: So absolutely, absolutely. Hey, yeah, it sounds like a mix 267 00:13:55,760 --> 00:13:59,920 Speaker 1: between that one video game with the big robot dinosaur 268 00:14:00,320 --> 00:14:03,800 Speaker 1: and those sci fi books by that guy who wound 269 00:14:03,880 --> 00:14:08,319 Speaker 1: up being real anti Muslim. But yeah, oh yeah. 270 00:14:08,120 --> 00:14:09,520 Speaker 2: That would be a lot of guys. I wonder 271 00:14:09,559 --> 00:14:10,040 Speaker 2: which guy.
272 00:14:10,320 --> 00:14:14,720 Speaker 1: Oh, it's, I think, Ilium and Olympos, okay, I forget 273 00:14:14,760 --> 00:14:16,760 Speaker 1: the name of the author, but the premise of the 274 00:14:16,760 --> 00:14:19,280 Speaker 1: book is that like in the future, a bunch of 275 00:14:19,400 --> 00:14:22,120 Speaker 1: rich guys turn themselves into gods and decide to like 276 00:14:22,240 --> 00:14:25,160 Speaker 1: recreate the Trojan War with themselves as the Greek gods, 277 00:14:25,240 --> 00:14:28,560 Speaker 1: and they like resurrect a bunch of dead archaeologists to 278 00:14:28,600 --> 00:14:31,160 Speaker 1: make sure that they get the details right. It's, it's fun, 279 00:14:31,280 --> 00:14:33,840 Speaker 1: it's, it's, it's quite a series, except for the weird 280 00:14:33,880 --> 00:14:39,200 Speaker 1: moments of bigotry. Oh, that good stuff. Dan Simmons, I think, 281 00:14:39,320 --> 00:14:43,480 Speaker 1: is the author. Anyway. So, yeah, they're talking about, we 282 00:14:43,560 --> 00:14:47,320 Speaker 1: want to create a human genetic experimentation base, and they're like, 283 00:14:47,480 --> 00:14:50,440 Speaker 1: we want to figure out what the sensible regulations around 284 00:14:50,520 --> 00:14:53,080 Speaker 1: human genetic enhancement are, but we also want to build 285 00:14:53,080 --> 00:14:57,480 Speaker 1: a lab. And while they're talking about this, Gabriel adds cryptically, 286 00:14:57,720 --> 00:14:59,920 Speaker 1: probably there are other things it's useful to do with 287 00:15:00,200 --> 00:15:03,000 Speaker 1: a sovereign country. I don't know what he means by that, 288 00:15:03,200 --> 00:15:07,600 Speaker 1: but yeah, probably, huh. So that could be as banal 289 00:15:07,760 --> 00:15:10,320 Speaker 1: as just, like, money laundering, which Nauru obviously has quite 290 00:15:10,320 --> 00:15:13,520 Speaker 1: a history of, or issuing things like passports.
But given 291 00:15:13,560 --> 00:15:16,040 Speaker 1: the fact that all these dummies are permanently poisoned by 292 00:15:16,080 --> 00:15:18,680 Speaker 1: a mixture of sci fi fandoms and weird futurist cults, 293 00:15:18,800 --> 00:15:20,440 Speaker 1: I think it's safe to say we all dodged a 294 00:15:20,440 --> 00:15:22,360 Speaker 1: bullet by the fact that they never got too far 295 00:15:22,520 --> 00:15:26,520 Speaker 1: in this scheme. Now, my favorite thing about this whole 296 00:15:26,600 --> 00:15:29,120 Speaker 1: idea is how dumb it is on its face. There 297 00:15:29,160 --> 00:15:32,720 Speaker 1: are some countries that could act as really good apocalypse 298 00:15:32,720 --> 00:15:34,920 Speaker 1: shelters for the super rich, right. Switzerland is one. A 299 00:15:34,920 --> 00:15:37,840 Speaker 1: lot of rich people have their apocalypse shelters in Switzerland. 300 00:15:38,160 --> 00:15:41,200 Speaker 1: New Zealand is another. But the problem with that is that, 301 00:15:41,240 --> 00:15:44,640 Speaker 1: like, Switzerland and New Zealand are both functional states. Obviously, 302 00:15:44,680 --> 00:15:47,280 Speaker 1: if you're a billionaire there, you can have some outsized influence, 303 00:15:47,320 --> 00:15:49,440 Speaker 1: but you're not just gonna run everything, because there's other 304 00:15:49,600 --> 00:15:53,440 Speaker 1: interests and, like, a functional system of government in place 305 00:15:53,520 --> 00:15:56,200 Speaker 1: in all of those places. I think Sam and them 306 00:15:56,240 --> 00:15:58,480 Speaker 1: were hoping that since Nauru's small enough, they could just 307 00:15:58,600 --> 00:16:01,920 Speaker 1: utterly dominate the government. But they ignored the fact that 308 00:16:02,040 --> 00:16:05,000 Speaker 1: it's like one of the worst places imaginable to have 309 00:16:05,200 --> 00:16:08,400 Speaker 1: as an apocalypse shelter.
For one thing, the island does 310 00:16:08,440 --> 00:16:10,880 Speaker 1: not grow much food, which means it has to import 311 00:16:10,800 --> 00:16:13,080 Speaker 1: ninety percent of what it needs to sustain its 312 00:16:13,240 --> 00:16:15,400 Speaker 1: very small population. 313 00:16:15,560 --> 00:16:17,480 Speaker 2: It's all good, we have so much money. 314 00:16:17,960 --> 00:16:22,280 Speaker 1: It's okay, we'll just keep importing it. Yeah. It also 315 00:16:22,360 --> 00:16:25,120 Speaker 1: has very little fresh water, and most of its infrastructure 316 00:16:25,160 --> 00:16:27,400 Speaker 1: is on the coast and vulnerable to both rising sea 317 00:16:27,480 --> 00:16:30,560 Speaker 1: levels and hurricanes. It is very close to the bottom 318 00:16:30,600 --> 00:16:32,600 Speaker 1: of the list of places that you would want to have as a shelter. 319 00:16:34,200 --> 00:16:38,760 Speaker 1: So very funny. All these people are silly. Now, Jamie, 320 00:16:39,360 --> 00:16:43,160 Speaker 1: the main reason current FTX is revealing all this is 321 00:16:43,200 --> 00:16:46,520 Speaker 1: that they are suing the old management of the company, 322 00:16:46,600 --> 00:16:49,400 Speaker 1: i.e., Sam, to try and reclaim a billion or 323 00:16:49,440 --> 00:16:52,920 Speaker 1: so dollars they argue was funneled illegally into nonsense like 324 00:16:52,960 --> 00:16:55,440 Speaker 1: this and into the pockets of Bankman-Fried and his 325 00:16:55,520 --> 00:16:59,160 Speaker 1: lieutenants in the months before FTX collapsed due to insolvency. 326 00:17:00,080 --> 00:17:03,640 Speaker 1: The suit against Sam also includes some more confounding lines, described 327 00:17:03,640 --> 00:17:05,719 Speaker 1: in this paragraph from an article on the suit by 328 00:17:05,760 --> 00:17:09,359 Speaker 1: crypto news site Decrypt.
The lawsuit further says that the 329 00:17:09,359 --> 00:17:12,679 Speaker 1: projects run by the FTX Foundation were frequently misguided and 330 00:17:12,720 --> 00:17:16,720 Speaker 1: sometimes dystopian. These included a three hundred thousand 331 00:17:16,720 --> 00:17:19,200 Speaker 1: dollar grant to an individual to write a book about 332 00:17:19,200 --> 00:17:22,480 Speaker 1: how to figure out what humans' utility function is, as 333 00:17:22,520 --> 00:17:24,800 Speaker 1: well as a four hundred thousand dollar grant to an 334 00:17:24,920 --> 00:17:28,840 Speaker 1: entity that posted YouTube videos related to rationalist and effective 335 00:17:28,840 --> 00:17:34,399 Speaker 1: altruism material, including videos on grabby aliens. Now, does that 336 00:17:34,440 --> 00:17:37,320 Speaker 1: all seem like nonsense to you, Jamie? 337 00:17:37,600 --> 00:17:40,000 Speaker 2: I don't know, Robert, let's hear them out about the 338 00:17:40,160 --> 00:17:41,280 Speaker 2: grabby aliens. 339 00:17:41,480 --> 00:17:44,520 Speaker 1: Don't worry, I'm going to explain all of this horseshit 340 00:17:44,680 --> 00:17:44,960 Speaker 1: to you. 341 00:17:45,320 --> 00:17:50,439 Speaker 2: Uh, my god, what kind of Rick and Morty ass nonsense? 342 00:17:50,600 --> 00:17:53,600 Speaker 1: It is some Rick and Morty ass nonsense. It's much 343 00:17:53,680 --> 00:17:55,960 Speaker 1: dumber than anything in Rick and Morty, because at least 344 00:17:56,000 --> 00:17:58,800 Speaker 1: some of the people there understand story structure. 345 00:17:59,440 --> 00:18:02,160 Speaker 2: Unlike the EAs. Please understand that it's a joke. 346 00:18:02,160 --> 00:18:05,320 Speaker 1: But it's a bit.
Yes, yeah, so if you aren't 347 00:18:05,440 --> 00:18:08,520 Speaker 1: terminally adhered to one of the stupidest subcultures in the 348 00:18:08,560 --> 00:18:11,359 Speaker 1: broader tech sphere, that probably does seem like nonsense, and 349 00:18:11,440 --> 00:18:13,840 Speaker 1: it is. But let's start with the bit about paying 350 00:18:13,880 --> 00:18:16,720 Speaker 1: someone three hundred grand to write about what a human's 351 00:18:16,840 --> 00:18:20,479 Speaker 1: utility function is. Now, yes, what is a utility function? 352 00:18:20,720 --> 00:18:25,080 Speaker 1: Great question. In economics, a utility function is the measure 353 00:18:25,119 --> 00:18:28,719 Speaker 1: of welfare or satisfaction of a consumer as a function 354 00:18:28,760 --> 00:18:32,200 Speaker 1: of the consumption of real goods like food. In simple terms, 355 00:18:32,200 --> 00:18:34,840 Speaker 1: it's a way of describing the satisfaction or other benefits 356 00:18:34,880 --> 00:18:38,680 Speaker 1: gained by consuming a specific resource. This is important to 357 00:18:38,800 --> 00:18:41,399 Speaker 1: rational choice theory, which is a theory that states that 358 00:18:41,480 --> 00:18:45,680 Speaker 1: individuals use rational calculations to make rational choices to achieve 359 00:18:45,720 --> 00:18:49,359 Speaker 1: outcomes aligned with their own objectives.
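The textbook idea described above can be sketched in a few lines: an economic utility function just maps quantities consumed to a satisfaction score, usually with diminishing marginal returns. A minimal toy version in Python; the square-root form and all numbers here are invented for illustration, not taken from the episode:

```python
import math

def utility(units_of_food: float) -> float:
    """Toy utility function with diminishing marginal returns:
    each extra unit consumed adds less satisfaction than the last."""
    return math.sqrt(units_of_food)

# The first unit of food is worth far more than the hundredth.
gain_from_first = utility(1) - utility(0)         # 1.0
gain_from_hundredth = utility(100) - utility(99)  # roughly 0.05
print(gain_from_first > gain_from_hundredth)      # True
```

Any concave function would make the same point; the shape, not the formula, is what economists care about here.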
Now, most people who 360 00:18:49,400 --> 00:18:52,280 Speaker 1: aren't economists think that talking about the economy this way 361 00:18:52,320 --> 00:18:55,480 Speaker 1: is silly, because people are not in fact rational actors, 362 00:18:55,480 --> 00:18:58,120 Speaker 1: and in fact, we make shitty decisions all the time, 363 00:18:58,200 --> 00:19:01,840 Speaker 1: guided by misinformation or pressure that causes us to inaccurately 364 00:19:01,840 --> 00:19:04,920 Speaker 1: interpret the potential value of something like a college education 365 00:19:05,320 --> 00:19:07,520 Speaker 1: versus the cost of, say, student loans. 366 00:19:07,680 --> 00:19:11,960 Speaker 2: Right, one could argue that Sam Bankman-Fried and co. 367 00:19:12,119 --> 00:19:14,840 Speaker 2: are a great example, a wonderful example, of the 368 00:19:14,880 --> 00:19:20,560 Speaker 2: irrational. Speaker 1: But that's economics. This is not an economics podcast. 369 00:19:20,560 --> 00:19:22,879 Speaker 1: I'm not an economist. And the way that Bankman-Fried 370 00:19:22,960 --> 00:19:26,000 Speaker 1: and his fellow EAs talk about utility functions is not 371 00:19:26,160 --> 00:19:28,800 Speaker 1: the same as how economists talk about it. Right, So 372 00:19:28,840 --> 00:19:30,960 Speaker 1: when economists are talking about this, it's part of how 373 00:19:30,960 --> 00:19:34,320 Speaker 1: to kind of figure out why people might make rational 374 00:19:34,400 --> 00:19:37,679 Speaker 1: choices by understanding, like, the value of, sort of, the ranked 375 00:19:38,119 --> 00:19:40,600 Speaker 1: preference they give to certain things that they might 376 00:19:40,640 --> 00:19:44,520 Speaker 1: expend resources on. Broadly speaking, when Bankman-Fried and EAs 377 00:19:44,560 --> 00:19:48,440 Speaker 1: talk about utility functions, what they mean is something even 378 00:19:48,520 --> 00:19:51,040 Speaker 1: more abstract.
And I'm going to quote from a summary 379 00:19:51,040 --> 00:19:53,760 Speaker 1: from a write up on effective altruism dot org, 380 00:19:53,800 --> 00:19:55,760 Speaker 1: a website you should avoid at all costs. 381 00:19:56,160 --> 00:19:56,440 Speaker 1: Quote. 382 00:19:56,480 --> 00:19:58,840 Speaker 2: I'm really not looking forward to finding out how this 383 00:19:58,960 --> 00:20:01,640 Speaker 2: somehow relates to the grabby aliens. 384 00:20:02,000 --> 00:20:05,280 Speaker 1: I'm sorry, in a very dumb way, Jamie. 385 00:20:05,359 --> 00:20:09,120 Speaker 1: Okay, good. Quote: EAs and rationalists love dropping the term 386 00:20:09,200 --> 00:20:12,159 Speaker 1: in every conversation. Using the term utility function can be 387 00:20:12,560 --> 00:20:15,760 Speaker 1: immensely helpful when aiming to maximize positive impact or do 388 00:20:15,880 --> 00:20:18,959 Speaker 1: the most good. The concept of a utility function provides 389 00:20:19,000 --> 00:20:22,360 Speaker 1: a systematic way to quantify and compare the potential benefits 390 00:20:22,359 --> 00:20:25,399 Speaker 1: of different actions, thus helping to guide decision making towards 391 00:20:25,400 --> 00:20:28,840 Speaker 1: the most effective outcomes. By representing values, goals, or beneficial 392 00:20:28,880 --> 00:20:32,560 Speaker 1: outcomes numerically, utility functions allow for a structured comparison and 393 00:20:32,640 --> 00:20:35,879 Speaker 1: prioritization of actions. If, for example, your goal is to 394 00:20:35,920 --> 00:20:39,720 Speaker 1: alleviate global suffering, you could assign values to different charitable 395 00:20:39,760 --> 00:20:43,520 Speaker 1: actions based on their estimated impact, thus creating a utility function.
396 00:20:43,800 --> 00:20:46,399 Speaker 1: This function can then guide you to allocate your resources 397 00:20:46,440 --> 00:20:48,479 Speaker 1: like time or money, where they will generate the greatest 398 00:20:48,560 --> 00:20:52,639 Speaker 1: utility or good. Now, that just seems like you're saying 399 00:20:53,000 --> 00:20:55,359 Speaker 1: you should try to figure out how your money's gonna 400 00:20:55,400 --> 00:20:58,080 Speaker 1: be spent best right before you spend it. But that's 401 00:20:58,119 --> 00:21:00,280 Speaker 1: not actually what they're saying. What they are doing here 402 00:21:00,359 --> 00:21:03,080 Speaker 1: is, they are... like, a utility function 403 00:21:03,119 --> 00:21:05,639 Speaker 1: in this context is a way of assigning a number that 404 00:21:05,720 --> 00:21:08,399 Speaker 1: you have made up. There is no objective value to 405 00:21:08,440 --> 00:21:11,040 Speaker 1: this number. There's no rigor to this. You are making 406 00:21:11,119 --> 00:21:15,320 Speaker 1: up a number to determine the value of spending your 407 00:21:15,359 --> 00:21:18,320 Speaker 1: money in certain ways. And you are doing this so 408 00:21:18,359 --> 00:21:20,639 Speaker 1: that whatever it is you want to do with your money, 409 00:21:21,000 --> 00:21:24,960 Speaker 1: you can justify numerically as the scientifically best way to 410 00:21:25,000 --> 00:21:28,359 Speaker 1: spend your money. So, you see, you can 411 00:21:28,560 --> 00:21:31,520 Speaker 1: argue in this way by assigning these values in whatever wonky 412 00:21:31,520 --> 00:21:34,480 Speaker 1: way you want. Now, me paying taxes to fund roads 413 00:21:34,560 --> 00:21:37,000 Speaker 1: and a healthcare system is a shitty use of my 414 00:21:37,080 --> 00:21:39,960 Speaker 1: money because it doesn't optimize this thing that I consider 415 00:21:40,000 --> 00:21:42,160 Speaker 1: to be of higher long term value.
And the thing 416 00:21:42,160 --> 00:21:44,760 Speaker 1: that's of higher long term value to me is spending 417 00:21:44,800 --> 00:21:47,680 Speaker 1: money on fucking space travel research so that I can 418 00:21:47,720 --> 00:21:50,720 Speaker 1: be a demigod on Mars. Right, that's best for human 419 00:21:50,760 --> 00:21:53,800 Speaker 1: beings in the long term. So in utilitarian speak, you know, 420 00:21:53,840 --> 00:21:55,560 Speaker 1: the greatest good for the greatest number of people 421 00:21:55,560 --> 00:21:58,160 Speaker 1: is getting to Mars as opposed to feeding starving people 422 00:21:58,240 --> 00:22:00,600 Speaker 1: right now. You know, the utility function of getting to 423 00:22:00,640 --> 00:22:02,560 Speaker 1: Mars is much higher, so that's where our money ought 424 00:22:02,560 --> 00:22:02,720 Speaker 1: to go. 425 00:22:03,400 --> 00:22:05,679 Speaker 2: Well, it's all, yeah, it just, like, comes down to, 426 00:22:05,800 --> 00:22:09,760 Speaker 2: like, the great... the greatest good is me, hah, smiling 427 00:22:09,800 --> 00:22:10,239 Speaker 2: a little bit. 428 00:22:10,480 --> 00:22:15,600 Speaker 1: Yeah, exactly. There's literally some writing on that in that vein. 429 00:22:15,720 --> 00:22:18,359 Speaker 1: So it is through math like this that EAs are 430 00:22:18,400 --> 00:22:20,280 Speaker 1: able to look at a world where millions face 431 00:22:20,359 --> 00:22:22,800 Speaker 1: death by famine or disease or rising sea levels and 432 00:22:22,840 --> 00:22:25,080 Speaker 1: say the best way to help the planet is for 433 00:22:25,200 --> 00:22:28,040 Speaker 1: us to become finance bros and then spend our money 434 00:22:28,040 --> 00:22:31,920 Speaker 1: investing in AI companies or whatever.
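The "math" being mocked above reduces to a very small procedure: assign each possible action a number you picked yourself, then choose the action with the biggest number. A toy sketch; the action names and utility values are invented here purely to show that whoever picks the numbers picks the winner:

```python
def best_action(utilities: dict[str, float]) -> str:
    """Return the action with the highest assigned utility.
    The procedure looks rigorous; the inputs are arbitrary."""
    return max(utilities, key=utilities.get)

# Numbers chosen, conveniently, so the preferred outcome wins.
made_up_utilities = {
    "feed starving people this year": 1_000.0,
    "fund my Mars demigod project": 1_000_000.0,
}
print(best_action(made_up_utilities))  # fund my Mars demigod project
```

The argmax step is perfectly sound math; the critique in the episode is aimed entirely at where the input numbers come from.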
The fundamental selfishness of 435 00:22:31,960 --> 00:22:33,960 Speaker 1: this whole community is made clear when you read the 436 00:22:34,080 --> 00:22:36,960 Speaker 1: essays these people write on their websites, like LessWrong, 437 00:22:37,400 --> 00:22:42,040 Speaker 1: a blog founded by self-declared AI expert, yeah, Eliezer Yudkowsky. 438 00:22:42,080 --> 00:22:45,719 Speaker 1: Yudkowsky is a rationalist, which is a related subculture to 439 00:22:45,760 --> 00:22:48,439 Speaker 1: the EAs. There's a lot of bleed over. Sam and 440 00:22:48,440 --> 00:22:51,960 Speaker 1: a lot of his people were rationalists or rationalist-adjacent, 441 00:22:52,520 --> 00:22:56,040 Speaker 1: and yeah. To give you an idea of how these 442 00:22:56,040 --> 00:22:58,400 Speaker 1: people talk about utility functions, I'm going to read an 443 00:22:58,400 --> 00:23:00,880 Speaker 1: excerpt from an article on this website titled Your 444 00:23:00,960 --> 00:23:05,560 Speaker 1: Utility Function is Your Utility Function, by David Udell. I've 445 00:23:05,560 --> 00:23:08,800 Speaker 1: been thinking a lot lately about exactly how altruistic I am. 446 00:23:08,880 --> 00:23:10,800 Speaker 1: The truth is that I'm not sure. I care a 447 00:23:10,800 --> 00:23:13,119 Speaker 1: lot about not dying, and about my girlfriend and family 448 00:23:13,160 --> 00:23:15,840 Speaker 1: and friends not dying, and about all of humanity not dying, 449 00:23:16,040 --> 00:23:18,320 Speaker 1: and about all life on this planet not dying too. 450 00:23:18,640 --> 00:23:21,360 Speaker 1: And I care about the glorious transhuman future and all that, 451 00:23:21,640 --> 00:23:24,640 Speaker 1: and the ten to the fiftieth power or whatever possible 452 00:23:24,680 --> 00:23:27,280 Speaker 1: good future lives hanging in the balance. And I care 453 00:23:27,320 --> 00:23:30,680 Speaker 1: about some of these things disproportionately to their apparent moral magnitude.
454 00:23:30,720 --> 00:23:32,919 Speaker 1: But what I care about is what I care about. 455 00:23:33,200 --> 00:23:35,480 Speaker 1: Rationality is the art of getting more of what you want, 456 00:23:35,680 --> 00:23:38,720 Speaker 1: whatever that is, of systematized winning by your own lights. 457 00:23:38,960 --> 00:23:41,080 Speaker 1: You will totally fail in that art if you bulldoze 458 00:23:41,080 --> 00:23:43,359 Speaker 1: your values in a desperate effort to fit in or 459 00:23:43,400 --> 00:23:45,920 Speaker 1: to be a good person, or to act in the way 460 00:23:45,920 --> 00:23:47,879 Speaker 1: your model of society seems to ask you to. 461 00:23:48,440 --> 00:23:50,520 Speaker 1: So you see what he's saying, if you read between 462 00:23:50,560 --> 00:23:52,960 Speaker 1: the lines there: the most rational thing for me 463 00:23:53,000 --> 00:23:54,919 Speaker 1: to do is whatever makes me feel 464 00:23:54,680 --> 00:23:56,960 Speaker 2: best, whatever gives me a little smile. 465 00:23:57,200 --> 00:24:00,399 Speaker 1: Don't let people shame you for spending your resources, you know, 466 00:24:00,680 --> 00:24:04,760 Speaker 1: entirely on yourself and your own whims. Like, you're actually 467 00:24:04,800 --> 00:24:06,040 Speaker 1: a hero if you do that. 468 00:24:06,680 --> 00:24:10,240 Speaker 2: Oh, it is so fascinating to me to watch, like, 469 00:24:11,160 --> 00:24:13,600 Speaker 2: I don't know. It seems like this, like, self-conscious 470 00:24:14,080 --> 00:24:21,760 Speaker 2: reflex where they feel the need to define rational by 471 00:24:21,800 --> 00:24:25,959 Speaker 2: something that is more closely aligned with reality and then 472 00:24:26,000 --> 00:24:30,120 Speaker 2: immediately be like, but what that actually means is... yeah, 473 00:24:30,280 --> 00:24:31,640 Speaker 2: that's sad.
474 00:24:32,359 --> 00:24:34,840 Speaker 1: The core of it is always being able to say that, like, well, 475 00:24:35,000 --> 00:24:37,440 Speaker 1: if you suggest that, number one, I have a responsibility 476 00:24:37,480 --> 00:24:40,240 Speaker 1: to other people, and that that responsibility is to some 477 00:24:40,359 --> 00:24:42,280 Speaker 1: extent out of my hands, which is what we all 478 00:24:42,320 --> 00:24:45,800 Speaker 1: say when we're in a society, right? I don't have kids. 479 00:24:46,040 --> 00:24:48,280 Speaker 1: I don't have a choice not to spend some of 480 00:24:48,320 --> 00:24:51,320 Speaker 1: the significant amount of money I pay in taxes educating 481 00:24:51,359 --> 00:24:54,600 Speaker 1: other people's kids. Now, I'm not a complete piece of shit, 482 00:24:54,760 --> 00:24:58,520 Speaker 1: so I'm fine with that because, like, I've done very 483 00:24:58,560 --> 00:25:01,760 Speaker 1: well for myself, and kids need educations. That's just a 484 00:25:01,880 --> 00:25:05,359 Speaker 1: nice way for the world to work. But these people 485 00:25:05,440 --> 00:25:07,720 Speaker 1: are more of the feeling that, like, no, I should 486 00:25:07,760 --> 00:25:11,199 Speaker 1: do whatever is possible to avoid paying for, you know, 487 00:25:11,240 --> 00:25:13,560 Speaker 1: a public school system.
And in fact, I'm often going 488 00:25:13,600 --> 00:25:16,280 Speaker 1: to advocate for some sort of, like, weird voucher-based 489 00:25:16,320 --> 00:25:20,280 Speaker 1: system that allows me to not fund public schools because 490 00:25:20,320 --> 00:25:22,320 Speaker 1: the greatest good for the greatest number of people is for 491 00:25:22,359 --> 00:25:25,080 Speaker 1: me to ensure that, like, me and my rich kid 492 00:25:25,119 --> 00:25:28,000 Speaker 1: friends all get to send our kids to special schools 493 00:25:28,040 --> 00:25:31,199 Speaker 1: where, like, what, you know, fuck anybody else. Like, I 494 00:25:31,200 --> 00:25:34,120 Speaker 1: don't have any other responsibility for the broader population. All 495 00:25:34,160 --> 00:25:38,080 Speaker 1: that actually matters is, like, me maximizing, you know, my 496 00:25:38,200 --> 00:25:41,400 Speaker 1: own personal happiness. But I still want to feel 497 00:25:41,400 --> 00:25:43,520 Speaker 1: like I'm a hero for doing it, right? If I 498 00:25:43,560 --> 00:25:46,760 Speaker 1: avoid paying taxes in order to, like, spend all of 499 00:25:46,800 --> 00:25:49,600 Speaker 1: my money investing in OpenAI so that I can, 500 00:25:49,720 --> 00:25:52,240 Speaker 1: like, take people's jobs away, like, I want to feel 501 00:25:52,280 --> 00:25:55,040 Speaker 1: like a hero for doing that, because what I'm arguing 502 00:25:55,119 --> 00:25:56,920 Speaker 1: is that it's best for, you know, the people 503 00:25:57,040 --> 00:25:58,720 Speaker 1: five hundred years from now, and there's going to be 504 00:25:58,720 --> 00:26:00,640 Speaker 1: more of them then than there are today. So I'm 505 00:26:00,640 --> 00:26:03,760 Speaker 1: a hero for, like, getting rich off of this company today. 506 00:26:04,119 --> 00:26:06,960 Speaker 2: You know, yes, well, because there's, first of all, there's 507 00:26:07,200 --> 00:26:09,760 Speaker 2: definitely going to be people in five hundred years.
508 00:26:10,119 --> 00:26:14,480 Speaker 1: Sure, for sure, definitely, Garrett. If these EAs get their way, 509 00:26:14,600 --> 00:26:16,000 Speaker 1: sure, yeah. 510 00:26:15,720 --> 00:26:18,600 Speaker 2: Yeah, yeah. If we are doing the most good... they're... 511 00:26:19,320 --> 00:26:22,119 Speaker 2: that's so fucking exhausting. I mean, it does, like, your 512 00:26:22,160 --> 00:26:26,199 Speaker 2: Peter Thiel example. Not that, you know, doing evil is 513 00:26:26,359 --> 00:26:29,200 Speaker 2: good in any capacity, but the most exhausting kind of 514 00:26:29,280 --> 00:26:32,800 Speaker 2: evil is the one that also insists on you validating 515 00:26:32,880 --> 00:26:35,560 Speaker 2: that it's not actually evil all the time, that it 516 00:26:35,640 --> 00:26:38,600 Speaker 2: deserves it. Shut the fuck up and ruin my life 517 00:26:38,920 --> 00:26:39,640 Speaker 2: or don't. 518 00:26:39,960 --> 00:26:41,840 Speaker 1: It's the same thing with a lot of these fucking 519 00:26:41,960 --> 00:26:44,560 Speaker 1: right wing media grifters. Where, like, it's not enough for 520 00:26:44,600 --> 00:26:47,119 Speaker 1: them to be rich, it's not enough for them to, 521 00:26:47,240 --> 00:26:50,639 Speaker 1: like, get their way politically. They feel like 522 00:26:50,720 --> 00:26:54,800 Speaker 1: they're, like, ethically owed being cool 523 00:26:55,000 --> 00:26:57,640 Speaker 1: and respected, and it's the same deal. Like, these people 524 00:26:57,680 --> 00:27:01,360 Speaker 1: are all finance ghouls.
They are all, like, actively fighting 525 00:27:01,440 --> 00:27:04,760 Speaker 1: to avoid paying taxes and to be able to concentrate 526 00:27:04,840 --> 00:27:07,320 Speaker 1: ever more power in an ever smaller number of people, 527 00:27:07,680 --> 00:27:11,080 Speaker 1: to destroy the lives of, you know, artists and people 528 00:27:11,080 --> 00:27:14,000 Speaker 1: who are, like, working folks in order to, like, make 529 00:27:14,080 --> 00:27:16,840 Speaker 1: more short-term profits. This is all what they actually 530 00:27:16,920 --> 00:27:20,960 Speaker 1: care about personally, but they want to feel like Gandhi 531 00:27:21,040 --> 00:27:25,000 Speaker 1: while they do it, right? Because what they're doing is guaranteeing... 532 00:27:25,040 --> 00:27:27,359 Speaker 1: what they'll argue, Jamie, is that, like, well, you know, 533 00:27:27,960 --> 00:27:30,040 Speaker 1: this may hurt this company, you know, me getting involved 534 00:27:30,080 --> 00:27:31,560 Speaker 1: in this company. Sure, we may destroy a lot of 535 00:27:31,640 --> 00:27:33,960 Speaker 1: jobs in the short term, but by doing so, we'll 536 00:27:33,960 --> 00:27:35,920 Speaker 1: be able to make sure that the AI we build 537 00:27:35,920 --> 00:27:39,359 Speaker 1: that eventually becomes our god is one that cares about 538 00:27:39,359 --> 00:27:41,360 Speaker 1: the future of humanity, and that's better for the most 539 00:27:41,400 --> 00:27:42,359 Speaker 1: people in the long run. 540 00:27:43,040 --> 00:27:47,200 Speaker 2: Yes, and history should remember me as the greatest man 541 00:27:47,280 --> 00:27:49,240 Speaker 2: to ever live. And people will, you know, that'll be 542 00:27:49,320 --> 00:27:52,960 Speaker 2: on television someday when my computer is writing every television show. 543 00:27:53,240 --> 00:27:57,520 Speaker 1: Fucking hate these people. So yeah, it's, it's cool stuff.
544 00:27:57,560 --> 00:28:01,520 Speaker 1: And I think that's the fundamental selfishness of these people, because 545 00:28:01,560 --> 00:28:04,960 Speaker 1: all that effective altruism and rationalism are really about 546 00:28:05,000 --> 00:28:08,399 Speaker 1: is creating a made-up system of numbers to 547 00:28:08,720 --> 00:28:13,720 Speaker 1: justify you pursuing your own benefit as, like, science, right, 548 00:28:13,760 --> 00:28:15,400 Speaker 1: as, like, scientifically rational. 549 00:28:15,920 --> 00:28:18,639 Speaker 2: This is really... it's not as if that doesn't, you know, 550 00:28:18,760 --> 00:28:22,919 Speaker 2: like, math problems to serve a very small group's self-interest, 551 00:28:22,960 --> 00:28:28,359 Speaker 2: it's not as if that doesn't exist outside of this circle. 552 00:28:28,560 --> 00:28:34,399 Speaker 2: But it's just, like, bizarre how uniquely, like, they lack 553 00:28:34,720 --> 00:28:38,760 Speaker 2: any sort of self-awareness, or, I don't know, it's 554 00:28:38,840 --> 00:28:41,400 Speaker 2: just they're so fucking annoying, is what I'm trying to say. 555 00:28:41,520 --> 00:28:44,640 Speaker 1: That's very true, Jamie. And yes, this is all really 556 00:28:44,680 --> 00:28:47,560 Speaker 1: clear when you look at how the movement, the EA 557 00:28:47,560 --> 00:28:50,440 Speaker 1: movement, treated Sam Bankman-Fried before and after his fall 558 00:28:50,480 --> 00:28:53,480 Speaker 1: from grace. If effective altruism can be said to have 559 00:28:53,520 --> 00:28:56,040 Speaker 1: a pope, and it can, because all of these Silicon 560 00:28:56,120 --> 00:29:00,480 Speaker 1: Valley philosophical movements are just Kirkland-brand Catholicism, its pope is 561 00:29:00,520 --> 00:29:04,520 Speaker 1: Will MacAskill, an Oxford moral philosopher and co-founder of 562 00:29:04,560 --> 00:29:08,000 Speaker 1: the Centre for Effective Altruism.
When FTX collapsed and Sam 563 00:29:08,040 --> 00:29:10,080 Speaker 1: got arrested, he was quick to put out a statement 564 00:29:10,080 --> 00:29:12,640 Speaker 1: of outrage. I don't know which emotion is stronger, my 565 00:29:12,760 --> 00:29:15,280 Speaker 1: utter rage at Sam and others for causing such harm 566 00:29:15,320 --> 00:29:18,040 Speaker 1: to so many people, or my sadness and self-hatred 567 00:29:18,080 --> 00:29:21,800 Speaker 1: for falling for this deception. Now, the only reason I 568 00:29:21,840 --> 00:29:24,480 Speaker 1: would hesitate to call this horseshit, Jamie, is that horseshit, 569 00:29:24,560 --> 00:29:28,040 Speaker 1: by virtue of being inanimate waste, possesses a fundamental honesty 570 00:29:28,080 --> 00:29:29,640 Speaker 1: that MacAskill is incapable of. 571 00:29:31,240 --> 00:29:34,120 Speaker 2: Yeah, fuck yeah, Robert, that's the bitchiest thing I've heard 572 00:29:34,160 --> 00:29:35,160 Speaker 2: you say in a while. 573 00:29:35,280 --> 00:29:38,920 Speaker 1: Thank you, thank you. He and SBF met back 574 00:29:38,960 --> 00:29:42,719 Speaker 1: at MIT when Sam was an undergrad. MacAskill convinced him 575 00:29:42,760 --> 00:29:46,240 Speaker 1: that he could maximize his impact in humanitarian causes by 576 00:29:46,280 --> 00:29:48,640 Speaker 1: earning to give, you know, making as much money as 577 00:29:48,640 --> 00:29:50,840 Speaker 1: possible so that he can give it away in a 578 00:29:51,360 --> 00:29:54,240 Speaker 1: way that presumably will help the world. Now, when Sam 579 00:29:54,320 --> 00:29:57,560 Speaker 1: ultimately launched Alameda Research, it was an EA project from 580 00:29:57,640 --> 00:30:01,200 Speaker 1: the start, staffed by Sam's friends in the community. One 581 00:30:01,240 --> 00:30:04,080 Speaker 1: software engineer told Time, almost everyone who came on in 582 00:30:04,080 --> 00:30:06,200 Speaker 1: those early days was an EA.
They were there for 583 00:30:06,240 --> 00:30:10,800 Speaker 1: EA reasons, says Naia Bouscal, a former software engineer at Alameda. 584 00:30:10,880 --> 00:30:12,800 Speaker 1: That was the pitch we gave people. This is an 585 00:30:12,840 --> 00:30:13,400 Speaker 1: EA thing. 586 00:30:14,280 --> 00:30:17,560 Speaker 2: I know, I'm once again thinking about sports video games, 587 00:30:17,600 --> 00:30:20,680 Speaker 2: and I know it's not sports video games. 588 00:30:20,240 --> 00:30:22,560 Speaker 1: It is, it is basically a sports video game. 589 00:30:22,680 --> 00:30:27,480 Speaker 3: I just had a flashback to the, yeah, the interview you did. 590 00:30:28,440 --> 00:30:32,160 Speaker 2: I was thinking, okay, it's true, it's true. 591 00:30:32,320 --> 00:30:35,200 Speaker 1: In the early days, Sam's pitch was that fifty percent 592 00:30:35,240 --> 00:30:38,040 Speaker 1: of the company's profits would be donated to EA causes, 593 00:30:38,240 --> 00:30:40,680 Speaker 1: and the initial round of investing that got the company 594 00:30:40,720 --> 00:30:44,600 Speaker 1: off the ground was funded entirely by rich EA types.
Publicly, 595 00:30:44,680 --> 00:30:47,840 Speaker 1: MacAskill talked about Sam like he was the 596 00:30:47,840 --> 00:30:51,880 Speaker 1: EA messiah, probably because FTX's Future Fund provided a huge 597 00:30:51,880 --> 00:30:54,840 Speaker 1: amount of support for his movement. And in just nine months 598 00:30:54,840 --> 00:30:58,080 Speaker 1: of twenty twenty two, the Future Fund, run by Nick Beckstead, 599 00:30:58,160 --> 00:31:02,160 Speaker 1: a moral philosopher who used FTX money to support various causes, 600 00:31:02,360 --> 00:31:04,560 Speaker 1: gave more than one hundred and sixty million dollars in 601 00:31:04,640 --> 00:31:09,040 Speaker 1: other people's money, funneled through FTX, to effective altruism, including 602 00:31:09,080 --> 00:31:12,080 Speaker 1: thirty-three million dollars to organizations that MacAskill had a 603 00:31:12,120 --> 00:31:16,840 Speaker 1: direct interest in. So that's why MacAskill spoke so positively 604 00:31:16,880 --> 00:31:19,200 Speaker 1: about Sam, which is made even more fucked up 605 00:31:19,200 --> 00:31:21,880 Speaker 1: when you realize that, like, other people in the EA 606 00:31:21,920 --> 00:31:25,880 Speaker 1: movement had started warning MacAskill about Sam Bankman-Fried and 607 00:31:25,920 --> 00:31:28,320 Speaker 1: about him being a con man as early as twenty eighteen. 608 00:31:29,000 --> 00:31:30,800 Speaker 1: And I'm going to quote again from Time. This is 609 00:31:30,840 --> 00:31:34,479 Speaker 1: about the very start of Alameda Research. Within months, 610 00:31:34,560 --> 00:31:36,840 Speaker 1: the good karma of the venture dissipated in a series 611 00:31:36,840 --> 00:31:39,280 Speaker 1: of internal clashes, many details of which have not been 612 00:31:39,320 --> 00:31:42,480 Speaker 1: previously reported. Some of the issues were personal. Bankman-Fried 613 00:31:42,520 --> 00:31:46,160 Speaker 1: could be dictatorial, according to one former colleague.
Three former 614 00:31:46,160 --> 00:31:49,800 Speaker 1: Alameda employees told Time he had inappropriate romantic relationships with 615 00:31:49,840 --> 00:31:53,720 Speaker 1: his subordinates. Early Alameda executives also believed he had reneged 616 00:31:53,720 --> 00:31:56,040 Speaker 1: on an equity arrangement that would have left Bankman-Fried 617 00:31:56,080 --> 00:31:58,520 Speaker 1: with forty percent control of the firm, according to a 618 00:31:58,560 --> 00:32:01,960 Speaker 1: document reviewed by Time. Instead, according to two people with 619 00:32:02,200 --> 00:32:04,840 Speaker 1: knowledge of the situation, he had registered himself as sole 620 00:32:04,920 --> 00:32:10,040 Speaker 1: owner of Alameda. So basically, Sam has unethically taken control 621 00:32:10,080 --> 00:32:12,000 Speaker 1: of the firm and all of the money invested in it. 622 00:32:12,080 --> 00:32:14,080 Speaker 1: And he is, like, fucking his subordinates. 623 00:32:14,480 --> 00:32:16,520 Speaker 2: It's like, and he's also a sex pest. 624 00:32:16,760 --> 00:32:17,920 Speaker 1: He's also a sex pest. 625 00:32:17,920 --> 00:32:20,960 Speaker 2: The least surprising thing I've ever heard in my entire life. 626 00:32:20,680 --> 00:32:24,720 Speaker 1: Taking money from people's what are effectively bank accounts on 627 00:32:24,840 --> 00:32:27,040 Speaker 1: FTX, and he's using it to prop up the value 628 00:32:27,040 --> 00:32:30,160 Speaker 1: of different crypto tokens on Alameda to make shit look 629 00:32:30,240 --> 00:32:32,560 Speaker 1: legit, which is money laundering, right, that's all. It's 630 00:32:32,560 --> 00:32:34,360 Speaker 1: like theft and money laundering. It's fraud, you know. 631 00:32:35,560 --> 00:32:37,520 Speaker 2: Or is it the future, Robert? Remember that? 632 00:32:37,840 --> 00:32:41,720 Speaker 1: I mean, yeah, remember what Larry David taught us all. 633 00:32:42,640 --> 00:32:45,520 Speaker 2: I know, there were some really unforgivable.
634 00:32:45,760 --> 00:32:49,000 Speaker 1: Uh, it's pretty funny, it is. Of all, it is 635 00:32:49,160 --> 00:32:51,600 Speaker 1: like I am on team, Like I can't be angry 636 00:32:51,600 --> 00:32:55,640 Speaker 1: at Larry David because honestly, if Larry David advertises a 637 00:32:55,680 --> 00:32:59,400 Speaker 1: financial product and you decide, he seems like a good 638 00:32:59,440 --> 00:33:04,120 Speaker 1: guy to this seems like a credible company to me, 639 00:33:04,600 --> 00:33:06,400 Speaker 1: I don't know. That's a little bit on you, right, 640 00:33:06,680 --> 00:33:07,680 Speaker 1: that's a little bit on you. 641 00:33:08,520 --> 00:33:11,000 Speaker 2: Oh Els was in that commercial there was that was 642 00:33:11,040 --> 00:33:13,640 Speaker 2: a that was a damning commercial. 643 00:33:13,120 --> 00:33:15,680 Speaker 1: Those Yeah, it is. It is. I still say a restaurant, 644 00:33:15,760 --> 00:33:18,720 Speaker 1: Larry David. But that's for a variety of reasons. 645 00:33:20,200 --> 00:33:22,680 Speaker 2: So I'm curious what your other reasons are. But that's 646 00:33:22,720 --> 00:33:23,440 Speaker 2: for another day. 647 00:33:23,600 --> 00:33:27,040 Speaker 1: Yeah, that's for a different day. So this caused so 648 00:33:27,360 --> 00:33:29,360 Speaker 1: again all of this shit. That is why this is 649 00:33:29,440 --> 00:33:33,200 Speaker 1: like everything that was obvious in Alameda in twenty eighteen. 650 00:33:34,120 --> 00:33:36,280 Speaker 1: This is all the stuff that he gets arrested for 651 00:33:36,360 --> 00:33:39,520 Speaker 1: in twenty twenty two. And when they become aware of this, 652 00:33:39,640 --> 00:33:42,200 Speaker 1: of the fact that he's fucking is subordinates and money laundering, 653 00:33:42,600 --> 00:33:46,240 Speaker 1: a bunch of EA people start raising alarm bells. 
In 654 00:33:46,280 --> 00:33:49,600 Speaker 1: twenty eighteen, several Alameda executives try to force him out 655 00:33:49,640 --> 00:33:53,440 Speaker 1: of the company and accuse him of gross negligence. Sam 656 00:33:53,520 --> 00:33:55,880 Speaker 1: wins the power struggle, though, and so most of the 657 00:33:55,920 --> 00:33:59,600 Speaker 1: EA management team and half of the company resign. Now, 658 00:34:00,120 --> 00:34:02,240 Speaker 1: you could see this as, like, well, 659 00:34:02,240 --> 00:34:04,520 Speaker 1: maybe those effective altruist types were just in it for 660 00:34:04,560 --> 00:34:06,760 Speaker 1: the altruism, and once they saw Sam was shady, they 661 00:34:06,800 --> 00:34:09,600 Speaker 1: packed up. And that's true for those individual people. You know, 662 00:34:09,800 --> 00:34:11,960 Speaker 1: there's some decent people who just got caught up in 663 00:34:12,000 --> 00:34:16,080 Speaker 1: the movement, and they clearly had some degree of moral integrity. 664 00:34:16,360 --> 00:34:18,720 Speaker 1: But the broader effective altruism... 665 00:34:18,280 --> 00:34:22,880 Speaker 2: Movement? Incredibly low there, including, unbelievably, Will MacAskill. 666 00:34:22,960 --> 00:34:26,799 Speaker 1: Its pope never disaffiliated from Sam. MacAskill was talking 667 00:34:26,840 --> 00:34:30,120 Speaker 1: about Sam like the second fucking coming up until late 668 00:34:30,200 --> 00:34:34,200 Speaker 1: twenty twenty two. And in exchange for laundering Sam's reputation, 669 00:34:34,719 --> 00:34:37,600 Speaker 1: Sam sent tens of millions of dollars, one hundred and 670 00:34:37,600 --> 00:34:40,560 Speaker 1: sixty million, to EA causes in twenty twenty two alone.
671 00:34:40,840 --> 00:34:44,520 Speaker 1: And that's why MacAskill maintained movement ties with Bankman-Fried. 672 00:34:44,920 --> 00:34:47,479 Speaker 1: Quote. In the weeks leading up to that April twenty 673 00:34:47,520 --> 00:34:50,480 Speaker 1: eighteen confrontation with Bankman-Fried, and in the months that followed, 674 00:34:50,640 --> 00:34:53,120 Speaker 1: Mac Aulay and others, who was one of the executives that left, 675 00:34:53,160 --> 00:34:56,880 Speaker 1: warned MacAskill, Beckstead and Karnofsky about her co-founder's alleged 676 00:34:56,920 --> 00:35:00,400 Speaker 1: duplicity and unscrupulous business ethics, according to four people with 677 00:35:00,480 --> 00:35:04,920 Speaker 1: knowledge of those discussions. Bouscal recalled speaking to Mac Aulay immediately 678 00:35:04,920 --> 00:35:08,200 Speaker 1: after one of Mac Aulay's conversations with MacAskill in late twenty eighteen. 679 00:35:08,560 --> 00:35:12,160 Speaker 1: Will basically took Sam's side, said Bouscal, who recalls waiting 680 00:35:12,200 --> 00:35:14,480 Speaker 1: with Mac Aulay in the Stockholm airport while she was on 681 00:35:14,520 --> 00:35:18,120 Speaker 1: the phone. Will basically threatened her, Bouscal recalls. I remember 682 00:35:18,160 --> 00:35:20,480 Speaker 1: my impression being that Will was taking a pretty hostile 683 00:35:20,520 --> 00:35:22,480 Speaker 1: stance here and he was just believing Sam's side of 684 00:35:22,520 --> 00:35:25,600 Speaker 1: the story, which made no sense to me. So Will, 685 00:35:25,800 --> 00:35:29,759 Speaker 1: again, perfectly willing to, like, throw down against, you know, 686 00:35:29,800 --> 00:35:32,600 Speaker 1: the more honest people in his movement and personally threaten 687 00:35:32,640 --> 00:35:34,840 Speaker 1: them in order to keep the money flowing to his 688 00:35:34,960 --> 00:35:37,160 Speaker 1: fancy cause so that he can, right.
689 00:35:37,200 --> 00:35:39,800 Speaker 2: And then the second that it becomes, you know, PR 690 00:35:39,920 --> 00:35:43,319 Speaker 2: inconvenient to keep the association, there should be some, maybe 691 00:35:43,360 --> 00:35:45,799 Speaker 2: there is, like, a term for that, of just, like, 692 00:35:46,080 --> 00:35:49,279 Speaker 2: even the cadence of what you read earlier, of just 693 00:35:49,280 --> 00:35:53,800 Speaker 2: like the disingenuousness of like, oh, I had no idea. 694 00:35:53,880 --> 00:35:56,640 Speaker 1: Like you're like, okay, no, you knew damn well what 695 00:35:56,760 --> 00:36:00,120 Speaker 1: was going on, and you were 696 00:36:00,120 --> 00:36:03,600 Speaker 1: willing to continue pretending he was a good guy and 697 00:36:03,640 --> 00:36:06,680 Speaker 1: aligned with your movement as long as the money kept flowing. 698 00:36:06,520 --> 00:36:09,600 Speaker 2: Right, no. Like, yeah, it no longer serves your best 699 00:36:09,719 --> 00:36:12,120 Speaker 2: interests to stand by him. 700 00:36:12,440 --> 00:36:17,799 Speaker 1: Yeah, so, yeah, anyway. So, Sam, we 701 00:36:17,840 --> 00:36:21,080 Speaker 1: don't actually know if he's completely bankrupt. We know he 702 00:36:21,120 --> 00:36:23,759 Speaker 1: took about a billion in payments and loans from FTX. 703 00:36:24,080 --> 00:36:26,160 Speaker 1: He claims not to really have any of that money, 704 00:36:26,200 --> 00:36:28,719 Speaker 1: and that he's working on getting what assets he 705 00:36:28,800 --> 00:36:31,239 Speaker 1: does have to, like, try to make a few more 706 00:36:31,280 --> 00:36:34,920 Speaker 1: of the investors that they had whole. That said, we 707 00:36:35,000 --> 00:36:36,840 Speaker 1: know that a big chunk of the money that he 708 00:36:36,880 --> 00:36:39,920 Speaker 1: made was funneled into real estate in his parents' names. 709 00:36:41,200 --> 00:36:43,759 Speaker 1: So that's fun.
Speaking of his parents, one of the 710 00:36:43,800 --> 00:36:46,440 Speaker 1: big early mysteries of his case was that when he 711 00:36:46,520 --> 00:36:49,240 Speaker 1: gets out on a two hundred and fifty million dollar bond, 712 00:36:49,760 --> 00:36:52,200 Speaker 1: his parents are signed on to the bail agreement with 713 00:36:52,239 --> 00:36:55,600 Speaker 1: their house as collateral, but there were also two mystery 714 00:36:55,960 --> 00:36:58,600 Speaker 1: co-signers. And their house, we'll be talking about this 715 00:36:58,640 --> 00:37:01,000 Speaker 1: in a second, is on the Stanford campus. The 716 00:37:01,040 --> 00:37:04,360 Speaker 1: two mystery co-signers were Larry Kramer, a family friend 717 00:37:04,400 --> 00:37:08,399 Speaker 1: and the former dean of Stanford, and Andreas Paepcke, who 718 00:37:08,480 --> 00:37:11,239 Speaker 1: signed a two hundred thousand dollar bond. Paepcke is a 719 00:37:11,280 --> 00:37:14,360 Speaker 1: senior research scientist at Stanford and an advisor to several 720 00:37:14,440 --> 00:37:20,440 Speaker 1: Valley startups. So Stanford is very invested, primarily because of 721 00:37:20,440 --> 00:37:23,640 Speaker 1: who Sam's parents are, in this case, which is interesting 722 00:37:23,680 --> 00:37:24,840 Speaker 1: to me and. 723 00:37:24,960 --> 00:37:27,760 Speaker 2: Okay, because that was, I mean, that is my question. Outside 724 00:37:27,760 --> 00:37:32,239 Speaker 2: of his parents, I guess, realistically, what is in it 725 00:37:32,440 --> 00:37:35,200 Speaker 2: for them by taking that risk? 726 00:37:35,880 --> 00:37:38,920 Speaker 1: I think it's actually just that these people are close 727 00:37:38,960 --> 00:37:42,080 Speaker 1: with his parents, who are professors at Stanford and deeply 728 00:37:42,160 --> 00:37:43,480 Speaker 1: tied into that community.
729 00:37:44,440 --> 00:37:46,360 Speaker 2: I can, I mean, I can very much see, like, 730 00:37:47,440 --> 00:37:50,640 Speaker 2: bougie parents with influence being like, Sam's just a boy. 731 00:37:50,800 --> 00:37:53,400 Speaker 2: Anyone could have made this mistake. He thought he was 732 00:37:53,440 --> 00:37:55,440 Speaker 2: doing the right thing. He got mixed in with the wrong crowd. 733 00:37:56,120 --> 00:37:59,560 Speaker 1: This has to just be, like, some sort of crazy 734 00:37:59,600 --> 00:38:02,680 Speaker 1: mistake, because they can't imagine it. He just, it's so dumb 735 00:38:02,719 --> 00:38:05,319 Speaker 1: and blatant. Like, all he did was rob people in 736 00:38:05,480 --> 00:38:08,520 Speaker 1: order to, like, gamble, right? Like, fundamentally, there's not a 737 00:38:08,560 --> 00:38:12,160 Speaker 1: difference between, like, him taking people's money, claiming that he's 738 00:38:12,400 --> 00:38:15,160 Speaker 1: got a sure stock tip, and then gambling at Vegas. 739 00:38:15,239 --> 00:38:17,799 Speaker 1: Like, legally, there's no difference between that and what he did. 740 00:38:18,920 --> 00:38:22,680 Speaker 1: But they can't, because his parents are, like, ethicists, basically, 741 00:38:23,360 --> 00:38:26,280 Speaker 1: at Stanford, and I don't think any of the people 742 00:38:26,320 --> 00:38:29,439 Speaker 1: they're social with can imagine that his crimes were 743 00:38:29,480 --> 00:38:32,799 Speaker 1: that venal and foolish. But you know whose crimes are 744 00:38:32,840 --> 00:38:34,400 Speaker 1: not venal and foolish? 745 00:38:34,080 --> 00:38:37,680 Speaker 2: Jamie. Oh, tell me, Robert. 746 00:38:37,960 --> 00:38:40,920 Speaker 1: The products and services that support our podcasts. 747 00:38:41,320 --> 00:38:44,560 Speaker 2: Uh, never encountered a product I didn't love. 748 00:38:44,880 --> 00:38:59,959 Speaker 1: No, that's exactly right. That's exactly right. And we are back.
749 00:39:00,800 --> 00:39:04,080 Speaker 1: So initially, the reason why these other Stanford people are 750 00:39:04,239 --> 00:39:08,000 Speaker 1: secret signers on to this bail agreement is because they 751 00:39:08,000 --> 00:39:10,239 Speaker 1: were afraid that they would be attacked because of how 752 00:39:10,280 --> 00:39:12,760 Speaker 1: angry people are at Sam. And there was a reason 753 00:39:12,840 --> 00:39:15,440 Speaker 1: for this. Shortly after he was sent to house arrest at 754 00:39:15,520 --> 00:39:18,600 Speaker 1: his parents' home, someone drove their car into a barricade 755 00:39:18,640 --> 00:39:23,400 Speaker 1: set up outside of their house on the Stanford campus. Now, 756 00:39:23,920 --> 00:39:26,440 Speaker 1: as I stated earlier, both the elder Bankman-Frieds are 757 00:39:26,480 --> 00:39:29,640 Speaker 1: former Stanford professors and their home is on campus, which 758 00:39:29,640 --> 00:39:33,000 Speaker 1: has created issues for the school. The university still will 759 00:39:33,000 --> 00:39:35,840 Speaker 1: not officially acknowledge that one of the world's most famous 760 00:39:35,840 --> 00:39:39,680 Speaker 1: accused felons currently resides in their prestigious walled academic garden. 761 00:39:40,040 --> 00:39:41,680 Speaker 1: Come on, Stanford. 762 00:39:41,880 --> 00:39:45,480 Speaker 2: Like, really fumble shit constantly. 763 00:39:45,680 --> 00:39:49,760 Speaker 1: Oh yeah, yeah, it's absurd. It's because they're not really 764 00:39:49,800 --> 00:39:50,360 Speaker 1: that smart. 765 00:39:50,440 --> 00:39:54,279 Speaker 2: Well, yes, this again, another great case for it. 766 00:39:54,280 --> 00:39:56,680 Speaker 2: It's also, like, I don't know, just the arguing, like, well, 767 00:39:56,880 --> 00:39:59,799 Speaker 2: his parents are, like, ethicists, how could he be fucked up? 768 00:40:00,120 --> 00:40:03,160 Speaker 2: Have you ever met someone raised by ethicists? No offense.
769 00:40:04,600 --> 00:40:06,239 Speaker 1: People in the people. 770 00:40:09,800 --> 00:40:13,440 Speaker 2: Not to call out anyone I love, but it's like, they know, they 771 00:40:13,520 --> 00:40:15,080 Speaker 2: know the language. 772 00:40:15,239 --> 00:40:18,960 Speaker 1: Yeah, yeah. So one Washington Post article I found 773 00:40:19,040 --> 00:40:22,400 Speaker 1: noted Stanford Law School didn't respond to requests for comment 774 00:40:22,480 --> 00:40:24,880 Speaker 1: when asked whether they could confirm a rumor that a nearby 775 00:40:24,920 --> 00:40:27,760 Speaker 1: student co-op had attacked the Bankman-Fried home with eggs. 776 00:40:28,000 --> 00:40:32,080 Speaker 1: Stanford campus police did not respond. Socially, however, Bankman-Fried 777 00:40:32,160 --> 00:40:34,920 Speaker 1: is a source of deep fascination. There are party flyers 778 00:40:34,960 --> 00:40:38,120 Speaker 1: with his likeness. He's a punchline in campus comedy sketches. 779 00:40:38,360 --> 00:40:42,040 Speaker 1: Students ride their bikes by on dates. The campus community 780 00:40:42,080 --> 00:40:44,840 Speaker 1: is well aware he's there. An annotated map locating the 781 00:40:44,840 --> 00:40:48,360 Speaker 1: Bankman-Fried home was posted on a student-only social network. 782 00:40:49,239 --> 00:40:51,840 Speaker 2: Okay, I'll be honest. If someone asked you out and 783 00:40:51,880 --> 00:40:54,600 Speaker 2: they're like, here's my concept for a first date, we're 784 00:40:54,600 --> 00:40:59,480 Speaker 2: gonna go watch Sam, Samuel Bankman. We're getting. 785 00:40:59,200 --> 00:41:02,120 Speaker 1: We're getting married that night, Jamie. I was. 786 00:41:02,040 --> 00:41:06,400 Speaker 2: Like, we're going, like, bring a flask, watch one of 787 00:41:06,880 --> 00:41:12,040 Speaker 2: history's stupidest villains, and then fall for each other in a 788 00:41:12,080 --> 00:41:12,560 Speaker 2: parking lot.
789 00:41:12,640 --> 00:41:16,759 Speaker 1: Absolutely, Jamie, that's the dream. That's the dream. None 790 00:41:16,800 --> 00:41:19,840 Speaker 1: of this. That's effective altruism right there. That's the greatest 791 00:41:19,840 --> 00:41:21,480 Speaker 1: good for the greatest number of people. 792 00:41:21,880 --> 00:41:24,840 Speaker 2: Sam could stand to learn a thing or two from these. 793 00:41:25,800 --> 00:41:32,040 Speaker 1: These theoretical Stanford students. Yes. So, again, 794 00:41:32,160 --> 00:41:35,000 Speaker 1: Sam's parents are both well respected teachers and experts in 795 00:41:35,040 --> 00:41:39,360 Speaker 1: different fields of ethics. Both were recruited to the university 796 00:41:39,400 --> 00:41:41,839 Speaker 1: in the nineteen eighties, and they almost immediately hooked up. 797 00:41:42,160 --> 00:41:44,640 Speaker 1: Barbara Fried made a name for herself as, she's 798 00:41:44,680 --> 00:41:47,680 Speaker 1: like a philosopher, basically. Her big thing was she wrote 799 00:41:47,680 --> 00:41:50,880 Speaker 1: a paper dissecting the ethics of the trolley problem. Whereas 800 00:41:51,000 --> 00:41:54,200 Speaker 1: Joe Bankman is a finance ethics guy. He writes a 801 00:41:54,280 --> 00:41:58,000 Speaker 1: lot about, like, again, like, not breaking the law with 802 00:41:58,120 --> 00:42:01,160 Speaker 1: finance shit. It seems to be a big focus of his. 803 00:42:02,120 --> 00:42:04,279 Speaker 2: This is a very obvious thing to say, but I 804 00:42:04,360 --> 00:42:09,520 Speaker 2: do, in defense of the Bankman family, I do appreciate 805 00:42:09,640 --> 00:42:13,399 Speaker 2: when someone really rolls with their last name. I think 806 00:42:13,440 --> 00:42:16,799 Speaker 2: that that is a very fun quality. This is 807 00:42:16,840 --> 00:42:18,560 Speaker 2: an example of it not working out.
I think 808 00:42:18,600 --> 00:42:21,879 Speaker 2: it's also equally bizarre when someone has a last name 809 00:42:21,880 --> 00:42:24,719 Speaker 2: where you're like, why are you not doing that job? 810 00:42:25,120 --> 00:42:27,879 Speaker 2: For example, yeah, maybe I've told you this before. It's 811 00:42:27,880 --> 00:42:30,320 Speaker 2: one of my favorite facts I've learned in my life. 812 00:42:30,880 --> 00:42:38,960 Speaker 2: My childhood dentist's name was doctor Vagenis, and that man, nevertheless, 813 00:42:39,520 --> 00:43:43,200 Speaker 2: glorious, insisted on going into teeth. For some reason, I 814 00:42:43,320 --> 00:42:46,719 Speaker 2: found it infuriating. I'm like, just roll with it, man, 815 00:42:46,960 --> 00:42:48,080 Speaker 2: that's your last name. 816 00:42:49,920 --> 00:42:53,160 Speaker 1: ANOD needs to go after this guy. No, I'm sorry. 817 00:42:53,200 --> 00:42:55,080 Speaker 1: I know you don't know how to do this job, 818 00:42:55,080 --> 00:42:57,759 Speaker 1: but that's your life now. People will trust you until 819 00:42:57,800 --> 00:43:00,520 Speaker 1: you figure it out. You know, 820 00:43:01,040 --> 00:43:06,600 Speaker 1: speaking of nominative determinism, Joe Bankman would be it. 821 00:43:06,680 --> 00:43:09,759 Speaker 1: I wouldn't trust Joe Bankman as a drug dealer, but 822 00:43:09,840 --> 00:43:12,759 Speaker 1: I would, I would, I would trust Johnny Cocaine as 823 00:43:12,800 --> 00:43:18,080 Speaker 1: a financial expert, like, like, as a stockbroker. Johnny Cocaine, yeah, 824 00:43:18,160 --> 00:43:20,319 Speaker 1: let him invest my money. He knows what he's doing. 825 00:43:21,840 --> 00:43:24,360 Speaker 2: I just, you know, say what you will about Joe Bankman, 826 00:43:24,480 --> 00:43:27,040 Speaker 2: and I'm sure you should. I know nothing about this man, 827 00:43:27,440 --> 00:43:29,880 Speaker 2: but at least he got into the right business.
828 00:43:30,080 --> 00:43:34,160 Speaker 1: Yeah, he does, he does. Now, people talk with, 829 00:43:34,280 --> 00:43:36,920 Speaker 1: like, awe about this guy, like, he's so ethical, he's 830 00:43:36,920 --> 00:43:39,479 Speaker 1: such, like, a decent man. He thinks so much about 831 00:43:39,520 --> 00:43:42,400 Speaker 1: doing the right thing. When the Washington Post is, like, 832 00:43:42,440 --> 00:43:45,520 Speaker 1: giving examples of noteworthy things in his past, one of the. 833 00:43:45,440 --> 00:43:47,480 Speaker 2: You must talk about Ted Lasso. 834 00:43:48,000 --> 00:43:50,200 Speaker 1: In two thousand and two, he wrote a tongue in 835 00:43:50,280 --> 00:43:53,600 Speaker 1: cheek suggestion on how to avoid a Major League Baseball strike, 836 00:43:54,120 --> 00:43:56,400 Speaker 1: and his solution in this paper that's supposed to be 837 00:43:56,440 --> 00:43:59,640 Speaker 1: a joke was to levy taxes on teams and players 838 00:43:59,640 --> 00:44:02,480 Speaker 1: that could only be avoided if the players 839 00:44:02,520 --> 00:44:05,080 Speaker 1: donated money to charity or the teams agreed to sell 840 00:44:05,160 --> 00:44:09,279 Speaker 1: nickel hot dogs to Giants fans. Now, I don't know what 841 00:44:09,360 --> 00:44:12,279 Speaker 1: the joke is here, but it apparently tore it up 842 00:44:12,320 --> 00:44:16,000 Speaker 1: among upper middle class Ivy League finance academics. They all 843 00:44:16,040 --> 00:44:17,680 Speaker 1: talk about this like it's very funny. 844 00:44:18,520 --> 00:44:23,200 Speaker 2: Wait, it comes up again? Like, remember when he said. 845 00:44:22,320 --> 00:44:25,279 Speaker 1: It was in the Washington Post article about this guy, 846 00:44:25,440 --> 00:44:27,319 Speaker 1: as, like, look at, look at this. This is 847 00:44:27,360 --> 00:44:29,719 Speaker 1: like a noteworthy moment from his career. This, like, bad 848 00:44:29,840 --> 00:44:32,680 Speaker 1: joke that he made.
But I guess that's what fucking 849 00:44:32,800 --> 00:44:34,360 Speaker 1: Stanford people find funny. 850 00:44:34,520 --> 00:44:39,600 Speaker 2: Imagine. Maybe it's hilarious that, yeah, I don't get it. 851 00:44:39,719 --> 00:44:43,560 Speaker 2: Christ. Maybe Stanford people were just like, I think it's hilarious, 852 00:44:43,600 --> 00:44:45,759 Speaker 2: just, like, the word hot dog. They're like, oh, poor 853 00:44:45,800 --> 00:44:47,879 Speaker 2: people food. Like, I don't know. 854 00:44:48,200 --> 00:44:52,000 Speaker 1: Now, everything you find about these people is, like, wow, 855 00:44:52,160 --> 00:44:54,160 Speaker 1: it's, like, their friends talking about, like, it's 856 00:44:54,200 --> 00:44:56,680 Speaker 1: so shocking that this could happen. You know, these were 857 00:44:56,719 --> 00:44:58,920 Speaker 1: like the best people we knew. They were so concerned 858 00:44:58,960 --> 00:45:02,279 Speaker 1: about ethics, raised their sons like little adults, and they 859 00:45:02,320 --> 00:45:05,960 Speaker 1: were always talking about utilitarianism. How could this have gone wrong? 860 00:45:06,440 --> 00:45:09,480 Speaker 1: I don't know. I feel, when I read anecdotes about them, 861 00:45:09,520 --> 00:45:12,239 Speaker 1: like it's pretty obvious why it went wrong. And to 862 00:45:12,320 --> 00:45:14,160 Speaker 1: kind of make that point, Jamie, here's a quote from 863 00:45:14,160 --> 00:45:18,000 Speaker 1: an excellent write up by Puck News. Quote. Bankman, who 864 00:45:18,000 --> 00:45:20,280 Speaker 1: once boasted to a friend that his father had dutifully 865 00:45:20,320 --> 00:45:23,560 Speaker 1: recorded every cash receipt, wrote three casebooks on tax 866 00:45:23,600 --> 00:45:26,399 Speaker 1: shelters and tax evasion, becoming one of the country's leading 867 00:45:26,440 --> 00:45:29,040 Speaker 1: experts on the subject.
One of Bankman's law students in 868 00:45:29,080 --> 00:45:31,920 Speaker 1: those early years was Peter Thiel, who later told Bankman 869 00:45:31,960 --> 00:45:34,640 Speaker 1: that his tax law class was his most valuable because 870 00:45:34,640 --> 00:45:36,719 Speaker 1: he was able to put a lot of his Facebook 871 00:45:36,760 --> 00:45:39,560 Speaker 1: stock in an IRA. As Bankman would later recall on 872 00:45:39,600 --> 00:45:42,719 Speaker 1: a podcast, this modest feat of financial engineering would save 873 00:45:42,880 --> 00:45:47,200 Speaker 1: Thiel more than a billion dollars. So, ethics, Jamie. Avoiding 874 00:45:47,239 --> 00:45:50,080 Speaker 1: a billion dollars in taxes so that Peter Thiel can 875 00:45:50,120 --> 00:45:54,400 Speaker 1: spend it giving Joe Swastika Coat money to write New 876 00:45:54,480 --> 00:45:58,719 Speaker 1: York Times columns on racism. Hooray, I love ethics. 877 00:46:00,239 --> 00:46:03,400 Speaker 2: Shit, yeah. And here I was, a clown, 878 00:46:03,600 --> 00:46:05,920 Speaker 2: a fool, about to be like, well, everyone wants to 879 00:46:05,960 --> 00:46:09,320 Speaker 2: rebel against their parents. That's probably why he's a cartoon villain. 880 00:46:09,400 --> 00:46:12,279 Speaker 1: When he talks ethics, it's about not getting busted for being 881 00:46:12,280 --> 00:46:14,800 Speaker 1: a piece of shit with money. That's what it seems 882 00:46:14,880 --> 00:46:17,319 Speaker 1: like to me. I'm not an expert on ethics. I am 883 00:46:17,320 --> 00:46:21,360 Speaker 1: an expert on being a piece of shit, though. So, yeah, but. 884 00:46:21,200 --> 00:46:23,120 Speaker 2: You're a far different piece of shit. 885 00:46:23,239 --> 00:46:24,040 Speaker 1: Thank you, Jamie. 886 00:46:24,040 --> 00:46:27,880 Speaker 2: Can I celebrate that about you? I like that, 887 00:46:28,440 --> 00:46:31,759 Speaker 2: but I mean, I'm not.
I guess, shocked that that 888 00:46:32,000 --> 00:46:37,840 Speaker 2: is exceedingly ethical to the Stanford crowd. But wow, damning. 889 00:46:38,600 --> 00:46:42,279 Speaker 1: So anyway, Barbara Fried, being a good liberal, was horrified 890 00:46:42,280 --> 00:46:44,480 Speaker 1: by the Trump election and chose to fight back by 891 00:46:44,480 --> 00:46:48,000 Speaker 1: founding a political fundraising group, Mind the Gap, which 892 00:46:48,080 --> 00:46:51,000 Speaker 1: was extremely successful during the Trump years and is rumored 893 00:46:51,000 --> 00:46:53,600 Speaker 1: to have acted as the model for FTX's own political 894 00:46:53,640 --> 00:46:57,080 Speaker 1: donation machine. Both of Sam's parents have seen their reputations 895 00:46:57,120 --> 00:46:59,200 Speaker 1: suffer with his arrest, and I'm going to continue with 896 00:46:59,239 --> 00:47:02,440 Speaker 1: a quote from Puck. Official property records show that Joe 897 00:47:02,480 --> 00:47:04,719 Speaker 1: Bankman and Barbara Fried were the named owners of a 898 00:47:04,760 --> 00:47:08,000 Speaker 1: sixteen point four million dollar beachside vacation home in Old 899 00:47:08,000 --> 00:47:11,399 Speaker 1: Fort Bay, part of a broader real estate portfolio owned 900 00:47:11,400 --> 00:47:14,680 Speaker 1: by FTX and senior executives totaling hundreds of millions of dollars. 901 00:47:14,960 --> 00:47:16,760 Speaker 1: They may have stayed there while working with the company 902 00:47:16,800 --> 00:47:19,640 Speaker 1: sometime over the last year, Sam said, though he denied 903 00:47:19,680 --> 00:47:21,960 Speaker 1: knowing any details about the three hundred million dollars worth 904 00:47:22,000 --> 00:47:24,360 Speaker 1: of real estate that FTX and his parents bought in 905 00:47:24,400 --> 00:47:25,000 Speaker 1: the Bahamas. 906 00:47:26,000 --> 00:47:28,520 Speaker 2: Oh, okay, so they knew about absolutely everything in the.
907 00:47:29,160 --> 00:47:31,360 Speaker 1: Yeah, it sounds like it, sounds like. Now, Joe and 908 00:47:31,400 --> 00:47:33,200 Speaker 1: Barbara have said that they've been working to return the 909 00:47:33,200 --> 00:47:37,120 Speaker 1: property to the company for some time. Working to. Joe Bankman, 910 00:47:37,200 --> 00:47:39,480 Speaker 1: in particular, has hardly been a passive observer in his 911 00:47:39,560 --> 00:47:41,719 Speaker 1: son's scandal and may now be exposed to some legal 912 00:47:41,800 --> 00:47:45,000 Speaker 1: risk himself. Bankman interviewed and hired the first lawyers for 913 00:47:45,040 --> 00:47:48,040 Speaker 1: Alameda Research back in twenty seventeen, and effectively served as 914 00:47:48,160 --> 00:47:51,120 Speaker 1: FTX's first attorney. He handled the inbound that came and 915 00:47:51,160 --> 00:47:53,799 Speaker 1: made the resulting introduction that helped FTX raise one hundred 916 00:47:53,840 --> 00:47:56,600 Speaker 1: and thirty million from his former law student, private equity 917 00:47:56,600 --> 00:47:59,920 Speaker 1: mogul Orlando Bravo. He spent his free time on FTX's charitable 918 00:48:00,000 --> 00:48:02,719 Speaker 1: and regulatory efforts, and was ultimately in the room before 919 00:48:02,760 --> 00:48:05,440 Speaker 1: Sam made the fateful decision to sign the documents that 920 00:48:05,480 --> 00:48:10,799 Speaker 1: declared Chapter eleven. So they seem very involved and shady themselves. 921 00:48:11,320 --> 00:48:14,239 Speaker 1: I don't buy, oh, they're so innocent, their son just broke, 922 00:48:14,320 --> 00:48:16,480 Speaker 1: you know, made a mistake or whatever. They 923 00:48:16,520 --> 00:48:19,360 Speaker 1: all just didn't think that this was criminal, because the 924 00:48:19,400 --> 00:48:22,520 Speaker 1: people whose money they were taking were poor, and they're fucking 925 00:48:22,600 --> 00:48:26,000 Speaker 1: Stanford brats. Like, I have no respect for them.
I 926 00:48:26,040 --> 00:48:29,160 Speaker 1: hope they lose their fancy Stanford house. They heard a 927 00:48:29,160 --> 00:48:29,400 Speaker 1: lot of it. 928 00:48:29,600 --> 00:48:33,759 Speaker 2: It's like, yeah, particularly because, I mean, it sounds like 929 00:48:33,840 --> 00:48:36,279 Speaker 2: this was their M.O. from the start anyways. So 930 00:48:36,360 --> 00:48:39,040 Speaker 2: why would we now think that they would be above 931 00:48:39,239 --> 00:48:41,840 Speaker 2: this behavior? If they're like, oh no, here's a way, like, 932 00:48:42,200 --> 00:48:44,760 Speaker 2: I don't know. It seems like their definition of ethics 933 00:48:44,880 --> 00:48:46,960 Speaker 2: is things you can technically get away with. 934 00:48:47,120 --> 00:48:49,520 Speaker 1: Yeah, you think it's not illegal for Peter Thiel to 935 00:48:49,560 --> 00:48:52,080 Speaker 1: get this billion dollars that he doesn't pay taxes on 936 00:48:52,200 --> 00:48:57,240 Speaker 1: because, you know, fuckery. Anyway, mister Bankman and missus Fried 937 00:48:57,320 --> 00:49:00,680 Speaker 1: have now joined the expanding cast of disgraced Stanford affiliates. 938 00:49:00,840 --> 00:49:05,640 Speaker 1: This includes recent university president Marc Tessier-Lavigne, currently accused 939 00:49:05,680 --> 00:49:08,120 Speaker 1: of manipulating images on research papers in a way that 940 00:49:08,200 --> 00:49:12,440 Speaker 1: is equivalent to falsifying lab data for Alzheimer's research. Obviously, 941 00:49:12,480 --> 00:49:16,480 Speaker 1: there's also Bastards Pod alumni, the Theranos lady, another famous 942 00:49:16,520 --> 00:49:19,760 Speaker 1: Stanford disgrace. And then there's the fact that they're alumni. 943 00:49:19,360 --> 00:49:21,719 Speaker 2: King, my favorite Stanford disgrace.
944 00:49:21,960 --> 00:49:24,480 Speaker 1: Yeah, and then there's all their alumni who have created 945 00:49:24,480 --> 00:49:27,080 Speaker 1: companies or helped run them that have shattered the foundations 946 00:49:27,120 --> 00:49:29,880 Speaker 1: of our democracy in pursuit of a quick buck. Stanford's 947 00:49:29,880 --> 00:49:33,080 Speaker 1: current reputation is so grimy that a Washington Post article 948 00:49:33,080 --> 00:49:36,360 Speaker 1: on SBF's associations with the school ends with these lines, 949 00:49:36,400 --> 00:49:39,880 Speaker 1: and this is very funny, Jamie. Adrian Daub, a Stanford 950 00:49:39,920 --> 00:49:42,920 Speaker 1: professor of comparative literature and German studies and the author 951 00:49:42,960 --> 00:49:45,920 Speaker 1: of What Tech Calls Thinking, sees an encouraging sign in 952 00:49:45,960 --> 00:49:49,480 Speaker 1: Stanford being only peripherally involved in the Bankman-Fried scandal. 953 00:49:49,719 --> 00:49:51,480 Speaker 1: That might not have been the case ten years ago, 954 00:49:51,560 --> 00:49:54,359 Speaker 1: he notes, when the Silicon Valley hype machine operated at 955 00:49:54,360 --> 00:49:56,520 Speaker 1: more of a fever pitch than it does today. Other 956 00:49:56,560 --> 00:49:59,560 Speaker 1: than his physical location, it's actually not that connected to 957 00:49:59,640 --> 00:50:02,080 Speaker 1: us, for once, Daub said. In that way, it's a 958 00:50:02,120 --> 00:50:05,680 Speaker 1: sign of progress and also a little bit melancholy. Stanford 959 00:50:05,719 --> 00:50:07,480 Speaker 1: was a place where the future was shaped, so it's 960 00:50:07,520 --> 00:50:10,520 Speaker 1: quite possible that's not happening anymore, that it's happening in 961 00:50:10,520 --> 00:50:12,839 Speaker 1: the Bahamas now and only comes to Palo Alto once 962 00:50:12,880 --> 00:50:18,040 Speaker 1: it gets indicted. That's so funny.
963 00:50:18,960 --> 00:50:22,000 Speaker 2: I'm hung up on "other than its physical location." 964 00:50:22,320 --> 00:50:24,160 Speaker 1: Yeah, other than the fact that he's here, he's not 965 00:50:24,320 --> 00:50:25,000 Speaker 1: very involved. 966 00:50:25,560 --> 00:50:27,839 Speaker 2: That's a big one, babe. That's a big one. 967 00:50:27,920 --> 00:50:29,440 Speaker 1: But it does seem significant to me. 968 00:50:30,640 --> 00:50:33,080 Speaker 2: No, if they're happy, I'm happy. That's great. 969 00:50:33,239 --> 00:50:34,440 Speaker 1: It's funny. 970 00:50:34,120 --> 00:50:38,560 Speaker 2: We have to return to Elizabeth Holmes at some 971 00:50:38,600 --> 00:50:42,760 Speaker 2: point, because I'm very uniquely interested in her rebranding as Liz. 972 00:50:43,680 --> 00:50:48,240 Speaker 1: Yeah, genius, because now I've forgotten her horrible crimes. 973 00:50:48,400 --> 00:50:50,799 Speaker 2: Yeah, I forgot about all of her crimes once she 974 00:50:50,840 --> 00:50:54,400 Speaker 2: had a cool and relatable name and had cool, relatable babies. 975 00:50:54,520 --> 00:50:58,000 Speaker 1: Well, it's like Henry Kissinger started going by Alan. 976 00:50:58,880 --> 00:51:03,200 Speaker 2: That's true. We can't take away that she scammed Henry Kissinger. Yeah, 977 00:51:03,760 --> 00:51:07,560 Speaker 2: she scammed people out of their lives. But, you know, yeah, 978 00:51:07,600 --> 00:51:09,480 Speaker 2: the Henry Kissinger thing we cannot forget. 979 00:51:09,920 --> 00:51:12,560 Speaker 1: I say we give her six months off as a 980 00:51:12,600 --> 00:51:18,160 Speaker 1: result of the Kissinger shit. So this brings 981 00:51:18,239 --> 00:51:21,080 Speaker 1: us to the subject of what precisely Sam Bankman-Fried 982 00:51:21,120 --> 00:51:22,799 Speaker 1: has been up to in the nine months or so 983 00:51:22,840 --> 00:51:25,319 Speaker 1: since his fall from grace.
The short answer is that 984 00:51:25,400 --> 00:51:28,239 Speaker 1: he has not had a wonderful time. In January, a 985 00:51:28,239 --> 00:51:30,760 Speaker 1: month or so after he was granted bail under house arrest, 986 00:51:31,000 --> 00:51:33,720 Speaker 1: the Southern District of New York accused him of inappropriately 987 00:51:33,800 --> 00:51:37,799 Speaker 1: contacting former FTX employees in order to influence their 988 00:51:37,800 --> 00:51:40,840 Speaker 1: testimony on his case. Sam tried to frame all of 989 00:51:40,880 --> 00:51:43,319 Speaker 1: this as part of his ill advised apology tour that 990 00:51:43,400 --> 00:51:45,920 Speaker 1: he embarked on last year in the lag period between 991 00:51:46,000 --> 00:51:48,520 Speaker 1: FTX collapsing and his formal charges. 992 00:51:48,920 --> 00:51:51,320 Speaker 2: Yeah, did he hit the Notes app? What happened? 993 00:51:51,440 --> 00:51:53,719 Speaker 1: Oh, he's, like, calling them. Actually, I'm going to read 994 00:51:53,719 --> 00:51:56,399 Speaker 1: a quote from Puck in order to describe how he's 995 00:51:56,440 --> 00:51:59,680 Speaker 1: illegally contacting people. On December twelfth, the same day 996 00:51:59,680 --> 00:52:02,520 Speaker 1: he was arrested in the Bahamas, Bankman-Fried emailed FTX 997 00:52:02,560 --> 00:52:06,600 Speaker 1: bankruptcy CEO John J. Ray the Third, offering potentially pertinent 998 00:52:06,640 --> 00:52:10,800 Speaker 1: information concerning future opportunities and financing for FTX and its creditors, 999 00:52:11,040 --> 00:52:13,560 Speaker 1: and asked to work constructively with Ray in the Chapter 1000 00:52:13,600 --> 00:52:17,160 Speaker 1: eleven team to do what's best for customers.
No response. Then, 1001 00:52:17,200 --> 00:52:19,920 Speaker 1: after his extradition, the crypto mogul sent another email to 1002 00:52:20,040 --> 00:52:22,960 Speaker 1: Ray on December thirtieth, in which he offered advice on accessing 1003 00:52:23,000 --> 00:52:26,319 Speaker 1: Alameda funds. Still no response. Then, while being summoned to 1004 00:52:26,320 --> 00:52:29,480 Speaker 1: court in New York, SBF tried Ray again on January second. 1005 00:52:29,760 --> 00:52:31,640 Speaker 1: Mister Ray, I know things haven't gotten off on the 1006 00:52:31,719 --> 00:52:33,600 Speaker 1: right foot, but I really do want to be helpful. 1007 00:52:33,840 --> 00:52:35,799 Speaker 1: As I'm guessing you've heard, I'm in NYC for the 1008 00:52:35,800 --> 00:52:37,600 Speaker 1: next day. I'd love to meet up while I'm here, 1009 00:52:37,680 --> 00:52:40,040 Speaker 1: even if just to say hi. Ray chose not to take 1010 00:52:40,120 --> 00:52:41,960 Speaker 1: him up on this offer. And— 1011 00:52:41,920 --> 00:52:42,720 Speaker 2: He has no shame. 1012 00:52:42,920 --> 00:52:44,920 Speaker 1: He's reached out to several people, and it's always like, 1013 00:52:44,960 --> 00:52:47,399 Speaker 1: I just want to help, you know, get as much 1014 00:52:47,440 --> 00:52:49,960 Speaker 1: money as we can for the customers. I just want to, 1015 00:52:50,120 --> 00:52:52,759 Speaker 1: you know, help you deal with the confusing aspects of this. 1016 00:52:52,880 --> 00:52:55,840 Speaker 1: But it's like, you're not supposed to be talking to 1017 00:52:56,000 --> 00:52:58,880 Speaker 1: people, when you're in Sam's situation, who were involved in 1018 00:52:58,960 --> 00:53:02,000 Speaker 1: the company like this, because they're probably going to testify 1019 00:53:02,040 --> 00:53:02,480 Speaker 1: against you. 1020 00:53:03,360 --> 00:53:07,319 Speaker 2: I don't feel convinced that he understands that.
I don't know, 1021 00:53:07,360 --> 00:53:09,600 Speaker 2: he's talking like a fucking spam email. 1022 00:53:09,880 --> 00:53:13,880 Speaker 1: Yeah, he really is. And in general, Sam has opted 1023 00:53:13,920 --> 00:53:16,000 Speaker 1: to take all of the actions under house arrest that 1024 00:53:16,040 --> 00:53:19,040 Speaker 1: are likeliest to cause stress ulcers in his lawyers. In 1025 00:53:19,080 --> 00:53:22,759 Speaker 1: addition to repeatedly contacting FTX employees, he decided to start 1026 00:53:22,760 --> 00:53:26,319 Speaker 1: a substack where he planned to explain how FTX collapsed. 1027 00:53:26,360 --> 00:53:28,719 Speaker 1: And it's like, there's that famous line from The Wire, are 1028 00:53:28,760 --> 00:53:31,400 Speaker 1: you taking notes on a criminal conspiracy? But in this 1029 00:53:31,440 --> 00:53:34,440 Speaker 1: case, it's like, are you publishing blog posts 1030 00:53:34,480 --> 00:53:38,080 Speaker 1: about your criminal conspiracy after being indicted? 1031 00:53:39,600 --> 00:53:43,239 Speaker 2: It's all about growing an audience, Robert. It's all 1032 00:53:43,280 --> 00:53:44,440 Speaker 2: a part of the play. 1033 00:53:44,719 --> 00:53:47,200 Speaker 1: Yeah, it's like if Al Capone had started a New 1034 00:53:47,280 --> 00:53:49,799 Speaker 1: York Times column on having people machine gunned in 1035 00:53:49,840 --> 00:53:52,720 Speaker 1: alleys after he'd gone away to fucking Alcatraz. 1036 00:53:53,520 --> 00:53:57,359 Speaker 2: Like, hear me out, you guys. It actually makes way 1037 00:53:57,440 --> 00:53:59,160 Speaker 2: more sense when I explain it. 1038 00:53:59,760 --> 00:54:02,200 Speaker 1: Yeah, uh, so anyway. He only writes, like I think, 1039 00:54:02,280 --> 00:54:05,080 Speaker 1: two posts before he gives up, because they're bad. 1040 00:54:05,120 --> 00:54:07,080 Speaker 1: He's bad at blogging. I tried to read them. He's 1041 00:54:07,160 --> 00:54:07,919 Speaker 1: a dog shit writer.
1042 00:54:08,600 --> 00:54:11,239 Speaker 2: Well, can I— I mean, not to be aggressive, but 1043 00:54:11,320 --> 00:54:12,880 Speaker 2: that's most substack people. 1044 00:54:13,040 --> 00:54:18,920 Speaker 1: Except, by the way, for my substack. Not you, Robert. No, 1045 00:54:19,040 --> 00:54:19,479 Speaker 1: I know. 1046 00:54:19,440 --> 00:54:21,880 Speaker 2: You have more than two, but I'm just saying the 1047 00:54:22,000 --> 00:54:27,360 Speaker 2: average friend of mine that harangues me into, you know, 1048 00:54:27,440 --> 00:54:34,440 Speaker 2: these damn emails, it's two and then they're done anyways, 1049 00:54:34,480 --> 00:54:36,560 Speaker 2: but no, I'm probably going to start one. Now I'm 1050 00:54:36,600 --> 00:54:38,720 Speaker 2: just being insecure. 1051 00:54:37,960 --> 00:54:40,440 Speaker 1: Which I want you to do. And I think you'll 1052 00:54:40,480 --> 00:54:42,760 Speaker 1: beat Sam Bankman Freed, no, no question. 1053 00:54:43,280 --> 00:54:45,280 Speaker 2: Your substack is great. 1054 00:54:45,280 --> 00:54:45,799 Speaker 1: Thank you, sir. 1055 00:54:46,120 --> 00:54:47,520 Speaker 2: Your substack is great. 1056 00:54:47,760 --> 00:54:50,760 Speaker 1: Thank you, Jamie. This has been wonderful for my ego. 1057 00:54:52,360 --> 00:54:54,239 Speaker 2: Those are two that I actually read. 1058 00:54:54,880 --> 00:54:59,080 Speaker 1: So Sam's is bad, though, and it's bad, like— 1059 00:54:59,160 --> 00:55:01,959 Speaker 1: part of what he's doing is, like, he's 1060 00:55:02,000 --> 00:55:05,319 Speaker 1: been charged with a bunch of specific crimes, right, and 1061 00:55:05,840 --> 00:55:08,560 Speaker 1: the posts that he puts up, he does not acknowledge 1062 00:55:08,600 --> 00:55:11,000 Speaker 1: any of the charges against him. He doesn't, like, defend 1063 00:55:11,120 --> 00:55:13,759 Speaker 1: himself from them.
Instead, he lays out a bunch of 1064 00:55:13,800 --> 00:55:17,440 Speaker 1: misleading and arcane spreadsheets to try and, like, argue that 1065 00:55:18,160 --> 00:55:20,839 Speaker 1: the company shouldn't have collapsed the way it did, and 1066 00:55:20,840 --> 00:55:23,800 Speaker 1: that he, like, didn't realize— like, why he didn't realize 1067 00:55:23,840 --> 00:55:26,279 Speaker 1: it was so bad. What he's doing, it's the same thing 1068 00:55:26,320 --> 00:55:28,960 Speaker 1: as, like, the EA shit we opened the episode with, 1069 00:55:29,080 --> 00:55:32,440 Speaker 1: right? This throwing out, like, confusing piles of numbers in 1070 00:55:32,520 --> 00:55:35,640 Speaker 1: order to distract people. Right? Like, this is just like chaff. 1071 00:55:35,760 --> 00:55:37,440 Speaker 1: You know, that's what he's doing, is he's throwing out 1072 00:55:37,520 --> 00:55:40,239 Speaker 1: chaff in the form of a bunch of poorly formatted spreadsheets. 1073 00:55:41,800 --> 00:55:43,840 Speaker 1: They don't convince anyone that he's innocent. 1074 00:55:43,960 --> 00:55:45,600 Speaker 2: I was going to say, but it sounds like it also— 1075 00:55:46,320 --> 00:55:47,240 Speaker 2: It did not work. 1076 00:55:47,760 --> 00:55:49,839 Speaker 1: It did not work at all. You should not do 1077 00:55:49,880 --> 00:55:52,680 Speaker 1: this if you are being indicted for numerous financial crimes, 1078 00:55:52,719 --> 00:55:57,640 Speaker 1: bt dubs. So in early February, the judge overseeing Sam's 1079 00:55:57,640 --> 00:56:00,800 Speaker 1: case forbade him from using encrypted messaging apps like Signal, 1080 00:56:00,840 --> 00:56:03,640 Speaker 1: because he was so frequently trying to talk to other 1081 00:56:03,680 --> 00:56:07,080 Speaker 1: people who were part of this case with him in secret, 1082 00:56:07,239 --> 00:56:10,000 Speaker 1: which is illegal.
He also got in trouble because he 1083 00:56:10,080 --> 00:56:13,120 Speaker 1: was caught using a VPN, which could have potentially allowed 1084 00:56:13,200 --> 00:56:16,160 Speaker 1: him to hide his communications. Sam argued he was just 1085 00:56:16,320 --> 00:56:20,440 Speaker 1: using a VPN to access his international NFL Game Pass account. 1086 00:56:20,640 --> 00:56:23,879 Speaker 2: So I was like, did he just say he was 1087 00:56:23,960 --> 00:56:26,520 Speaker 2: trying to watch Canadian Netflix? Yeah, that would be 1088 00:56:26,640 --> 00:56:28,200 Speaker 2: fucking classic. Officer— 1089 00:56:28,400 --> 00:56:30,480 Speaker 1: I just had a lot of shit to torrent, homie. 1090 00:56:30,560 --> 00:56:34,879 Speaker 1: Like, you get it, my man. He has since been 1091 00:56:34,920 --> 00:56:38,520 Speaker 1: limited to a normal flip phone due to his repeated 1092 00:56:38,520 --> 00:56:42,120 Speaker 1: inability to abide by his bail conditions. Now, some might 1093 00:56:42,200 --> 00:56:44,960 Speaker 1: note that Sam has already gotten more second chances than 1094 00:56:44,960 --> 00:56:48,400 Speaker 1: most accused criminals get with their bail conditions. It seems 1095 00:56:48,440 --> 00:56:50,920 Speaker 1: accurate to say that the leniency he has received gave 1096 00:56:51,000 --> 00:56:53,279 Speaker 1: him reason to feel as if he could act with impunity, 1097 00:56:53,520 --> 00:56:55,880 Speaker 1: which is why a couple of weeks ago he leaked 1098 00:56:55,880 --> 00:56:58,799 Speaker 1: his ex girlfriend's diary to The New York Times. 1099 00:56:58,960 --> 00:57:02,120 Speaker 2: Which is— take me through the witch's— 1100 00:57:03,160 --> 00:57:05,479 Speaker 3: That wasn't what I thought you were going to say. 1101 00:57:05,719 --> 00:57:08,240 Speaker 1: I know, I know, nobody thought it was gonna head here.
1102 00:57:08,280 --> 00:57:11,960 Speaker 2: You know, I mean, although it seemed like we 1103 00:57:11,960 --> 00:57:15,239 Speaker 2: were due for another spiteful action against a woman for 1104 00:57:15,320 --> 00:57:16,320 Speaker 2: seemingly no reason. 1105 00:57:16,400 --> 00:57:22,040 Speaker 1: Oh, absolutely, absolutely. Now, I will say I don't like 1106 00:57:22,120 --> 00:57:25,160 Speaker 1: this woman either. Caroline Ellison is, I think, a pretty 1107 00:57:25,160 --> 00:57:25,880 Speaker 1: shitty person. 1108 00:57:26,520 --> 00:57:29,480 Speaker 2: She was the— we spoke about her last time, right? Yeah, yeah. 1109 00:57:29,200 --> 00:57:33,040 Speaker 1: She— unpleasant lady. She was the former CEO of Alameda. 1110 00:57:33,160 --> 00:57:36,720 Speaker 1: I've been fucking listening to this podcast called Spellcaster, which 1111 00:57:36,800 --> 00:57:40,000 Speaker 1: is like a Wondery podcast about Sam Bankman Freed. I 1112 00:57:40,000 --> 00:57:42,320 Speaker 1: don't like it. The woman who does it was 1113 00:57:42,400 --> 00:57:45,200 Speaker 1: at a bachelorette party with Caroline Ellison 1114 00:57:45,280 --> 00:57:48,040 Speaker 1: right before the charges dropped and was like, Oh, she's 1115 00:57:48,080 --> 00:57:50,280 Speaker 1: so smart, she's so— and she repeats the same bullshit 1116 00:57:50,280 --> 00:57:52,800 Speaker 1: everyone says about Sam. They were, like, geniuses. 1117 00:57:52,840 --> 00:57:55,720 Speaker 1: And it's like, no, they just, like, blew out a 1118 00:57:55,760 --> 00:57:58,480 Speaker 1: bunch of numbers you didn't understand and convinced you they 1119 00:57:58,520 --> 00:58:01,680 Speaker 1: were smart because they said numbers. Right? Like, there's nothing 1120 00:58:01,720 --> 00:58:03,720 Speaker 1: these people have done that is smart.
1121 00:58:04,200 --> 00:58:06,680 Speaker 2: With situations like that, I'm like, I guess I appreciate 1122 00:58:06,720 --> 00:58:10,800 Speaker 2: the disclosure, but, like, why the fuck were you hired 1123 00:58:10,840 --> 00:58:11,760 Speaker 2: to do this show? 1124 00:58:11,960 --> 00:58:15,440 Speaker 1: Well, it's because big media is just as tiny 1125 00:58:15,440 --> 00:58:18,320 Speaker 1: and insular a world full of rich people as finance, 1126 00:58:18,760 --> 00:58:20,920 Speaker 1: and in fact, a lot of the same families have 1127 00:58:21,200 --> 00:58:25,320 Speaker 1: people at the Times and people at fucking investment banks, 1128 00:58:25,320 --> 00:58:30,440 Speaker 1: which is why here at Cool Zone Media we exclusively 1129 00:58:30,520 --> 00:58:33,160 Speaker 1: hire people who used to sell ketamine on their college 1130 00:58:33,160 --> 00:58:35,560 Speaker 1: campuses in order to get by. You know, that's 1131 00:58:35,600 --> 00:58:38,280 Speaker 1: the Cool Zone guarantee. Or Adderall. 1132 00:58:38,880 --> 00:58:43,440 Speaker 2: You know, and I appreciate that you made an exception 1133 00:58:43,560 --> 00:58:45,560 Speaker 2: for some of us. You didn't have to be 1134 00:58:45,680 --> 00:58:48,600 Speaker 2: good at it. You just had to try. No. 1135 00:58:48,840 --> 00:58:50,640 Speaker 1: In fact, we will not hire you if you were 1136 00:58:50,680 --> 00:58:54,560 Speaker 1: good at selling drugs on campus. Why did you even apply 1137 00:58:54,680 --> 00:58:59,040 Speaker 1: to this media company? Mediocre part time campus drug dealers, that's 1138 00:58:59,080 --> 00:59:03,200 Speaker 1: our— that's our hiring pool. Yeah, that's our, like— 1139 00:59:03,280 --> 00:59:04,840 Speaker 1: I don't know, whatever. I don't know the names of 1140 00:59:04,920 --> 00:59:06,200 Speaker 1: enough fancy news outlets. 1141 00:59:05,840 --> 00:59:08,200 Speaker 3: Really not that big of a stretch, let's be honest.
1142 00:59:09,400 --> 00:59:10,480 Speaker 2: I'm fucking proud of that. 1143 00:59:11,600 --> 00:59:21,720 Speaker 1: So Caroline Ellison, former CEO of Alameda and also Sam's 1144 00:59:21,760 --> 00:59:25,919 Speaker 1: on again, off again beau— she immediately turns— 1145 00:59:25,160 --> 00:59:27,160 Speaker 2: I like you using the term beau, but continue. 1146 00:59:27,160 --> 00:59:27,480 Speaker 1: Why not? 1147 00:59:27,640 --> 00:59:31,760 Speaker 2: That's— what, should I use boo? I kind of 1148 00:59:31,840 --> 00:59:32,400 Speaker 2: like boo. 1149 00:59:33,520 --> 00:59:36,320 Speaker 3: I was like, I was like, that doesn't really make sense. 1150 00:59:36,200 --> 00:59:40,480 Speaker 1: But they were co-boos. So she immediately turned state's 1151 00:59:40,480 --> 00:59:42,960 Speaker 1: witness and admitted guilt for her share of the illegal 1152 00:59:43,000 --> 00:59:47,240 Speaker 1: activities committed by Alameda, and she apparently, as part 1153 00:59:47,600 --> 00:59:50,680 Speaker 1: of immediately rolling, handed over her diary. I 1154 00:59:51,080 --> 00:59:53,000 Speaker 1: think that's how they got her diaries— it was part of 1155 00:59:53,040 --> 00:59:56,600 Speaker 1: the terms of, like, the— Yeah, so it gets introduced 1156 00:59:56,600 --> 00:59:59,400 Speaker 1: into evidence, which obviously Sam, I think, will get access 1157 00:59:59,440 --> 01:00:00,960 Speaker 1: to as a result of that, because that's the way 1158 01:00:01,000 --> 01:00:03,600 Speaker 1: discovery works. I believe that's how he got her diary. 1159 01:00:04,000 --> 01:00:06,440 Speaker 2: Was she also a Stanford head? 1160 01:00:07,880 --> 01:00:09,520 Speaker 1: I don't think she went to— I think her 1161 01:00:09,520 --> 01:00:12,840 Speaker 1: parents were professors at MIT. Huh. Wow. 1162 01:00:12,920 --> 01:00:15,560 Speaker 2: Yeah, losers over there. Yeah, yeah. 1163 01:00:15,400 --> 01:00:19,120 Speaker 1: Yeah, just Hobo University right there.
1164 01:00:19,520 --> 01:00:22,400 Speaker 2: Until it happens to me, I love when someone's diary 1165 01:00:22,480 --> 01:00:24,960 Speaker 2: is introduced into evidence. And that brings me back to 1166 01:00:25,000 --> 01:00:29,040 Speaker 2: Elizabeth Holmes yet again, when, like, her creepy 1167 01:00:29,080 --> 01:00:33,440 Speaker 2: little sexts with Sunny Balwani— ooh. 1168 01:00:32,840 --> 01:00:36,680 Speaker 1: Some of the worst— some of the very worst sexts. 1169 01:00:37,000 --> 01:00:40,160 Speaker 2: That is maybe the moment that I felt closest to her. 1170 01:00:40,280 --> 01:00:42,640 Speaker 2: That's when she— that's when Liz almost got me, because 1171 01:00:42,680 --> 01:00:45,360 Speaker 2: she was sending walls of text to this guy and 1172 01:00:45,400 --> 01:00:48,320 Speaker 2: then he was sending back, Okay. And it's like, you— 1173 01:00:48,400 --> 01:00:52,560 Speaker 1: know what? Brutal. No, no, she deserves— she deserved a 1174 01:00:52,560 --> 01:00:55,480 Speaker 1: man like Jeff Bezos, who would call her the most 1175 01:00:55,560 --> 01:00:59,600 Speaker 1: unsettling nickname I've ever heard, you know. But at least 1176 01:00:59,600 --> 01:01:01,680 Speaker 1: he was boning an alive girl. 1177 01:01:03,720 --> 01:01:10,920 Speaker 3: That's right, that's right. I think we— we, 1178 01:01:10,920 --> 01:01:15,040 Speaker 3: we as a collective, blocked that out intentionally. 1179 01:01:15,200 --> 01:01:18,000 Speaker 1: Yeah, it's very funny. It does make it clear that 1180 01:01:18,120 --> 01:01:23,160 Speaker 1: he's not a robot, because, like, nobody— nobody fakes that. 1181 01:01:23,160 --> 01:01:26,120 Speaker 1: That's evidence that he feels something. What he 1182 01:01:26,160 --> 01:01:30,640 Speaker 1: feels is off putting, it's frightening, it's, like, profoundly unsettling. 1183 01:01:30,680 --> 01:01:32,120 Speaker 1: But he does feel something.
1184 01:01:32,240 --> 01:01:36,000 Speaker 2: But unfortunately, yeah, ChatGPT could have outdone that in 1185 01:01:36,080 --> 01:01:37,560 Speaker 2: terms of sounding like a person. 1186 01:01:38,160 --> 01:01:44,040 Speaker 1: Absolutely. Yeah. So anyway, Sam gets access to her diary 1187 01:01:44,080 --> 01:01:46,360 Speaker 1: one way or the other. Uh, and then he hands 1188 01:01:46,360 --> 01:01:48,800 Speaker 1: her diary to The New York Times so that they 1189 01:01:48,800 --> 01:01:51,600 Speaker 1: can write an article about it. Now, that is unethical 1190 01:01:51,600 --> 01:01:55,160 Speaker 1: as fuck and possibly illegal. The prosecution has asked that 1191 01:01:55,240 --> 01:01:59,000 Speaker 1: he be jailed, that his bail be revoked, because 1192 01:01:59,040 --> 01:02:02,200 Speaker 1: of what he did. Sam's— this is still going on 1193 01:02:02,360 --> 01:02:04,960 Speaker 1: as we talk. I'll record a little update if he 1194 01:02:05,000 --> 01:02:07,480 Speaker 1: does go to jail as a result of this. Hey everyone, 1195 01:02:07,640 --> 01:02:10,480 Speaker 1: Robert here. Just wanted to update you that, since we 1196 01:02:10,560 --> 01:02:14,080 Speaker 1: recorded this episode a couple of days before you're hearing it, 1197 01:02:14,440 --> 01:02:18,560 Speaker 1: Sam Bankman Freed was remanded to custody. He is incarcerated 1198 01:02:18,560 --> 01:02:21,480 Speaker 1: now, and he will remain in jail after violating his 1199 01:02:21,480 --> 01:02:26,160 Speaker 1: bail conditions, until his trial in October at least, and 1200 01:02:26,280 --> 01:02:29,960 Speaker 1: possibly well beyond that, depending on how the charges and 1201 01:02:30,080 --> 01:02:34,160 Speaker 1: sentencing and all that stuff go.
I should note that 1202 01:02:34,600 --> 01:02:37,320 Speaker 1: kind of the most recent story after that is that 1203 01:02:37,800 --> 01:02:40,040 Speaker 1: his lawyers requested that he be allowed to have his 1204 01:02:40,440 --> 01:02:44,080 Speaker 1: ADHD medication and depression medication, which he ran out of 1205 01:02:44,200 --> 01:02:47,720 Speaker 1: soon after being taken into custody. The judge has ordered 1206 01:02:47,760 --> 01:02:50,440 Speaker 1: that he be given that medication. Obviously, I'm always in 1207 01:02:50,480 --> 01:02:53,720 Speaker 1: favor of people who are incarcerated having access to medication. 1208 01:02:55,240 --> 01:02:57,840 Speaker 1: If anyone's interested, I don't actually think putting Sam in 1209 01:02:57,920 --> 01:03:00,760 Speaker 1: jail's going to do much good. I'm a little bit 1210 01:03:00,800 --> 01:03:03,640 Speaker 1: more mixed on this than I normally am, just because 1211 01:03:03,760 --> 01:03:08,160 Speaker 1: of the case of Adam Neumann, the WeWork guy 1212 01:03:08,200 --> 01:03:11,640 Speaker 1: who got off scot free from his giant financial crimes 1213 01:03:11,680 --> 01:03:15,160 Speaker 1: and is now starting another giant grift company that will 1214 01:03:15,160 --> 01:03:17,640 Speaker 1: probably fuck with a bunch of other people's lives. But 1215 01:03:18,080 --> 01:03:20,200 Speaker 1: I do kind of think it's unlikely that we're going 1216 01:03:20,240 --> 01:03:23,280 Speaker 1: to get much benefit out of this. That said, I 1217 01:03:23,320 --> 01:03:26,400 Speaker 1: don't really feel for Sam. He had many, many chances 1218 01:03:26,440 --> 01:03:29,120 Speaker 1: not to be in this situation, and he fucked all 1219 01:03:29,160 --> 01:03:32,440 Speaker 1: of them up. So, you know, fuck the guy.
Sam's 1220 01:03:32,560 --> 01:03:35,120 Speaker 1: lawyers have argued he was not attempting to discredit a 1221 01:03:35,120 --> 01:03:38,280 Speaker 1: witness, but just to respond to a toxic 1222 01:03:38,400 --> 01:03:41,720 Speaker 1: media environment, which he says unfairly portrays him as a villain. 1223 01:03:43,040 --> 01:03:45,680 Speaker 1: And I guess we're part of that toxic media environment. 1224 01:03:45,720 --> 01:03:49,680 Speaker 1: Although, Sam, free tip here: handing your ex girlfriend's diary 1225 01:03:49,720 --> 01:03:51,640 Speaker 1: to The New York Times is a bad way to 1226 01:03:51,680 --> 01:03:55,680 Speaker 1: seem like not the villain. That's kind of villain behavior, homie. 1227 01:03:55,240 --> 01:03:56,920 Speaker 1: Hate to tell you. 1228 01:03:57,720 --> 01:04:00,680 Speaker 2: Again, but it's like— again, you can imagine his 1229 01:04:01,080 --> 01:04:06,480 Speaker 2: like, doofy loser fucking logic of, like, no, officer, I 1230 01:04:06,560 --> 01:04:09,400 Speaker 2: was just being a petty bitch. Is that the law? 1231 01:04:09,560 --> 01:04:11,360 Speaker 2: And you're like, yeah, it is. Yes. 1232 01:04:11,280 --> 01:04:15,720 Speaker 1: Yes, it is, actually, sir. Bruh. So, humorously enough, that 1233 01:04:15,880 --> 01:04:18,120 Speaker 1: is the legal argument his lawyers are making. And they 1234 01:04:18,560 --> 01:04:20,360 Speaker 1: kind of have a point, because they're like, look, if 1235 01:04:20,400 --> 01:04:23,200 Speaker 1: you read the New York Times article based on her diary, 1236 01:04:23,480 --> 01:04:26,040 Speaker 1: he seems like a piece of shit, so clearly we 1237 01:04:26,040 --> 01:04:31,000 Speaker 1: weren't trying to influence the prosecution. And, like, they do 1238 01:04:31,120 --> 01:04:33,240 Speaker 1: have a point, because he does come across as the 1239 01:04:33,280 --> 01:04:35,840 Speaker 1: bad guy in that article that he made happen.
So 1240 01:04:35,920 --> 01:04:37,880 Speaker 1: that's funny. He comes off— 1241 01:04:37,760 --> 01:04:40,760 Speaker 2: as the bad guy in most things. 1242 01:04:41,360 --> 01:04:43,720 Speaker 1: Yeah. I'm gonna quote from some of that New York 1243 01:04:43,720 --> 01:04:46,920 Speaker 1: Times coverage here. Mister Bankman Freed and Miss Ellison had 1244 01:04:47,000 --> 01:04:50,960 Speaker 1: also started an unsteady romantic relationship, with 1245 01:04:51,080 --> 01:04:55,000 Speaker 1: multiple breakups and reconciliations. At times, Miss Ellison worried that 1246 01:04:55,120 --> 01:04:57,760 Speaker 1: mister Bankman Freed thought she wasn't good enough. When he 1247 01:04:57,840 --> 01:05:00,280 Speaker 1: was around, she wrote in a February twenty twenty two 1248 01:05:00,320 --> 01:05:02,960 Speaker 1: Google document, she had an instinct to shrink and 1249 01:05:03,000 --> 01:05:06,360 Speaker 1: become smaller and quieter and defer to others. After one split, 1250 01:05:06,400 --> 01:05:09,480 Speaker 1: Miss Ellison cut off communication with mister Bankman Freed. I 1251 01:05:09,520 --> 01:05:11,959 Speaker 1: felt pretty hurt slash rejected, she wrote in the April twenty 1252 01:05:12,000 --> 01:05:15,040 Speaker 1: twenty two Google document. Not giving you the contact you 1253 01:05:15,040 --> 01:05:17,000 Speaker 1: wanted felt like the only way I could regain a 1254 01:05:17,040 --> 01:05:20,600 Speaker 1: sense of power. Miss Ellison was compensated far less generously 1255 01:05:20,640 --> 01:05:23,200 Speaker 1: than other top executives at FTX and Alameda, though it's 1256 01:05:23,240 --> 01:05:26,160 Speaker 1: unclear whether she was aware of it. According to court filings, 1257 01:05:26,160 --> 01:05:28,880 Speaker 1: the exchange's founders and other key employees received three point 1258 01:05:28,960 --> 01:05:31,960 Speaker 1: two billion dollars in payouts and loans.
Of that total, 1259 01:05:32,000 --> 01:05:34,600 Speaker 1: six million went to Miss Ellison, compared with five hundred 1260 01:05:34,600 --> 01:05:37,880 Speaker 1: and eighty seven million for mister Singh, FTX's head of engineering, 1261 01:05:37,920 --> 01:05:40,120 Speaker 1: and two hundred and forty six million for mister Wang, 1262 01:05:40,400 --> 01:05:43,240 Speaker 1: one of the founders. Mister Bankman Freed received two point 1263 01:05:43,280 --> 01:05:49,800 Speaker 1: two billion. So Miss Ellison is definitely not innocent here. She 1264 01:05:49,920 --> 01:05:52,919 Speaker 1: has admitted guilt in this case, but the reporting makes 1265 01:05:52,960 --> 01:05:54,919 Speaker 1: it seem as if her main role was to act 1266 01:05:54,920 --> 01:05:57,360 Speaker 1: as a patsy. Sam knew she was in love with 1267 01:05:57,440 --> 01:05:59,800 Speaker 1: him and deeply insecure, so he put her in charge 1268 01:05:59,840 --> 01:06:01,760 Speaker 1: of Alameda so that he could use it as part 1269 01:06:01,760 --> 01:06:04,120 Speaker 1: of his grift to manipulate the value of his 1270 01:06:04,160 --> 01:06:08,600 Speaker 1: crypto empire using customer funds. And basically, the six 1271 01:06:08,680 --> 01:06:11,280 Speaker 1: million he gave her, which is a tiny fraction of, 1272 01:06:11,320 --> 01:06:14,400 Speaker 1: like, the three billion they funneled to executives— that's like 1273 01:06:15,200 --> 01:06:17,360 Speaker 1: him paying her to be a smoke screen. Right? She's 1274 01:06:17,400 --> 01:06:19,800 Speaker 1: not an equal partner in this enterprise.
And one of 1275 01:06:19,840 --> 01:06:21,720 Speaker 1: the things that had happened right before this fell apart 1276 01:06:21,800 --> 01:06:24,200 Speaker 1: is he had stopped paying attention to her 1277 01:06:24,200 --> 01:06:27,760 Speaker 1: and Alameda in order to start throwing money through another 1278 01:06:27,800 --> 01:06:30,320 Speaker 1: crypto exchange run by a woman he was fucking now, 1279 01:06:30,480 --> 01:06:32,240 Speaker 1: that he had, like— so it seems like this was 1280 01:06:32,280 --> 01:06:33,160 Speaker 1: a pattern for him. 1281 01:06:33,160 --> 01:06:37,840 Speaker 4: Stop, stop. What? I'm not doing this. I'm just talking 1282 01:06:37,880 --> 01:06:40,800 Speaker 4: about what he did. I just really— just— it's— well, 1283 01:06:40,800 --> 01:06:43,640 Speaker 4: it's bad. He's a bastard. That's why we're talking about him. 1284 01:06:43,800 --> 01:06:46,480 Speaker 2: I just really— I just really need— I don't know 1285 01:06:46,520 --> 01:06:48,760 Speaker 2: who needs to hear this, but we just really need 1286 01:06:48,800 --> 01:06:50,600 Speaker 2: people to stop fucking Sam Bankman. 1287 01:06:50,760 --> 01:06:56,000 Speaker 1: That's— this is very bad, gross behavior. 1288 01:06:56,400 --> 01:06:58,840 Speaker 2: It's gross and it's bad for the world. 1289 01:06:59,680 --> 01:07:03,880 Speaker 1: Yeah, yeah, so, you know, fuck this guy. One bit 1290 01:07:03,880 --> 01:07:06,280 Speaker 1: of schadenfreude I can give you all is that, according 1291 01:07:06,320 --> 01:07:09,840 Speaker 1: to Puck News, Sam's present situation is so unpleasant that 1292 01:07:09,880 --> 01:07:12,120 Speaker 1: he considers his trips across the country to go to 1293 01:07:12,160 --> 01:07:14,480 Speaker 1: court in New York the highlight of his life now, 1294 01:07:14,640 --> 01:07:16,520 Speaker 1: because he gets to, like, go out in the street.
1295 01:07:16,560 --> 01:07:19,040 Speaker 1: He's surrounded by lawyers and private security, so it's like 1296 01:07:19,080 --> 01:07:21,760 Speaker 1: he's got an entourage again. He gets to travel. This 1297 01:07:21,920 --> 01:07:24,400 Speaker 1: is like the closest he gets to feeling like when 1298 01:07:24,440 --> 01:07:27,680 Speaker 1: he used to be a billionaire. So that's kind of fun. 1299 01:07:29,360 --> 01:07:32,000 Speaker 1: The downside is— from the perspective of an SBF hater, 1300 01:07:32,080 --> 01:07:35,200 Speaker 1: the downside is that recently one of his charges was dismissed: 1301 01:07:35,480 --> 01:07:38,720 Speaker 1: the campaign finance violation. This was not due to him 1302 01:07:38,720 --> 01:07:41,640 Speaker 1: being innocent, but due to some legal weirdness involving the 1303 01:07:41,720 --> 01:07:45,400 Speaker 1: letter of the extradition agreement the US signed with the Bahamas. Basically, 1304 01:07:45,480 --> 01:07:48,680 Speaker 1: when we put together the agreement that they'd extradite him, 1305 01:07:48,960 --> 01:07:51,280 Speaker 1: that was not on the document. So the feds had 1306 01:07:51,280 --> 01:07:53,479 Speaker 1: to, like, drop the charge in order to not deal 1307 01:07:53,480 --> 01:07:56,560 Speaker 1: with a bunch of other bullshit. It's a technicality, but 1308 01:07:56,640 --> 01:07:59,120 Speaker 1: it means that his brother, Gabe, and several members of 1309 01:07:59,160 --> 01:08:02,400 Speaker 1: the philanthropic arm at FTX probably will not be charged for 1310 01:08:02,640 --> 01:08:06,120 Speaker 1: very likely committing crimes. And I say they likely committed 1311 01:08:06,120 --> 01:08:11,000 Speaker 1: crimes because FTX executive Nishad Singh already pled guilty this 1312 01:08:11,080 --> 01:08:15,840 Speaker 1: spring to participating in a straw donor scheme.
So he is— yeah, 1313 01:08:15,880 --> 01:08:18,760 Speaker 1: and he pled guilty before they dropped this charge, which— 1314 01:08:18,760 --> 01:08:21,200 Speaker 1: he's got to feel like an asshole for doing, because 1315 01:08:21,240 --> 01:08:23,360 Speaker 1: now he is going to get punished for that, even 1316 01:08:23,360 --> 01:08:25,800 Speaker 1: though SBF is no longer being charged for it. 1317 01:08:26,000 --> 01:08:30,000 Speaker 2: Wow, that's— I mean, look, sometimes you do the ethical, 1318 01:08:30,280 --> 01:08:33,200 Speaker 2: altruistic thing and it comes back to bite you in 1319 01:08:33,240 --> 01:08:33,719 Speaker 2: the ass. 1320 01:08:33,720 --> 01:08:37,720 Speaker 1: What do you do? Yeah, what are you going to do? Hey, 1321 01:08:37,720 --> 01:08:40,320 Speaker 1: everyone, just wanted to note that since we recorded this, 1322 01:08:40,479 --> 01:08:43,160 Speaker 1: the prosecution has noted that they will be seeking to 1323 01:08:43,320 --> 01:08:47,519 Speaker 1: add those charges back on that were dropped. So it's 1324 01:08:47,560 --> 01:08:50,439 Speaker 1: possible that both Sam and other members of his inner 1325 01:08:50,439 --> 01:08:53,479 Speaker 1: circle will be charged with all that stuff. We just 1326 01:08:53,520 --> 01:08:55,640 Speaker 1: really kind of don't know at this point. But I 1327 01:08:55,680 --> 01:08:58,680 Speaker 1: do want to note that the prosecution is at least saying, hey, like, 1328 01:08:58,960 --> 01:09:03,320 Speaker 1: despite this little mess up, we are not just giving 1329 01:09:03,400 --> 01:09:06,479 Speaker 1: up on this charge. So heads up, that could 1330 01:09:06,600 --> 01:09:12,400 Speaker 1: change in the future. Well, Jamie, yeah, how you doing? 1331 01:09:13,479 --> 01:09:15,720 Speaker 2: I'm really well. I have a question for you. 1332 01:09:16,680 --> 01:09:17,800 Speaker 1: I have an answer for you. 1333 01:09:18,479 --> 01:09:22,120 Speaker 2: Well, I sure fucking hope so. No.
My question is 1334 01:09:22,240 --> 01:09:25,920 Speaker 2: that I'm curious: what do you see happening here? What 1335 01:09:26,040 --> 01:09:28,280 Speaker 2: feels plausible to you at this time? 1336 01:09:29,080 --> 01:09:31,160 Speaker 1: You know, I've been seeing a lot of people be like, oh, 1337 01:09:31,160 --> 01:09:32,880 Speaker 1: he's gonna get off, he's gonna get off, he's got 1338 01:09:32,880 --> 01:09:36,280 Speaker 1: too many connections, too many, you know, people who he 1339 01:09:36,320 --> 01:09:38,599 Speaker 1: could roll on. I don't think he has many people 1340 01:09:38,680 --> 01:09:41,680 Speaker 1: he could really roll on. I don't think he was, like, 1341 01:09:42,000 --> 01:09:45,160 Speaker 1: especially since these finance charges have been dropped, I don't 1342 01:09:45,160 --> 01:09:48,479 Speaker 1: know that I really think he's got the savvy to 1343 01:09:48,640 --> 01:09:53,040 Speaker 1: have, like, a guy like John McAfee. I believe John 1344 01:09:53,120 --> 01:09:55,960 Speaker 1: McAfee killed himself. I don't believe there's anything shady there. 1345 01:09:56,040 --> 01:09:57,720 Speaker 1: I know a lot about the guy. It makes sense 1346 01:09:57,760 --> 01:10:01,080 Speaker 1: to me that when his fucking running finally stopped, he 1347 01:10:01,080 --> 01:10:05,120 Speaker 1: would do that. But McAfee probably did have some dirt 1348 01:10:05,160 --> 01:10:07,519 Speaker 1: on some people. He was that kind of cunning, right? 1349 01:10:08,040 --> 01:10:10,479 Speaker 1: I wouldn't be shocked if John McAfee had put together 1350 01:10:10,520 --> 01:10:13,160 Speaker 1: some dirt on some people, right. I don't think Sam 1351 01:10:13,200 --> 01:10:15,479 Speaker 1: Bankman-Fried is that cunning. I don't think he was 1352 01:10:15,560 --> 01:10:18,400 Speaker 1: like smart enough to have dirt on anyone who could 1353 01:10:18,560 --> 01:10:21,280 Speaker 1: get him out of this situation.
I think there's a 1354 01:10:21,320 --> 01:10:23,760 Speaker 1: pretty good chance he does hard time. I think he 1355 01:10:23,840 --> 01:10:26,120 Speaker 1: fucked with too many people, he fucked with the money, 1356 01:10:26,160 --> 01:10:28,400 Speaker 1: and he fucked with it in too dumb of a way. 1357 01:10:28,479 --> 01:10:29,479 Speaker 1: So I think he's screwed. 1358 01:10:29,880 --> 01:10:32,559 Speaker 2: Okay, that was my instinct as well, because I feel 1359 01:10:32,600 --> 01:10:34,599 Speaker 2: like he doesn't even have, I mean, he doesn't have 1360 01:10:34,680 --> 01:10:38,400 Speaker 2: any sort of, like, I think sometimes with these types 1361 01:10:38,880 --> 01:10:41,120 Speaker 2: you get some sort of press narrative that it's like 1362 01:10:41,160 --> 01:10:44,519 Speaker 2: they're playing four-D chess, and like, even if that's 1363 01:10:44,560 --> 01:10:47,439 Speaker 2: not entirely true, there's like a media narrative that sticks 1364 01:10:47,520 --> 01:10:50,240 Speaker 2: to them that makes them seem more plausible. But I 1365 01:10:50,320 --> 01:10:53,160 Speaker 2: just feel like everything that's, like, all of his actions 1366 01:10:53,200 --> 01:10:55,840 Speaker 2: and all of the media surrounding him, except for a 1367 01:10:55,960 --> 01:10:59,320 Speaker 2: very, very small amount, seems to reinforce the fact that 1368 01:10:59,360 --> 01:11:05,040 Speaker 2: he's completely incompetent and malicious in every way. 1369 01:11:05,160 --> 01:11:10,040 Speaker 1: Yeah, yeah, I would say that. Well, good. 1370 01:11:11,560 --> 01:11:14,960 Speaker 2: God, I mean, not that I, you know, I don't know. 1371 01:11:15,000 --> 01:11:17,160 Speaker 2: I mean, it seems like he's fucked. I certainly hope 1372 01:11:17,160 --> 01:11:17,719 Speaker 2: he's fucked. 1373 01:11:18,600 --> 01:11:21,160 Speaker 1: I hope he's fucked. I hate him. I think he's 1374 01:11:21,200 --> 01:11:25,519 Speaker 1: a gross person.
I hope Will MacAskill goes away, or 1375 01:11:25,560 --> 01:11:28,000 Speaker 1: gets like eaten by a large fish would 1376 01:11:28,040 --> 01:11:30,719 Speaker 1: be my pick, if I got it. If, like, God 1377 01:11:30,800 --> 01:11:31,720 Speaker 1: is like, what do you want me to do to 1378 01:11:31,720 --> 01:11:33,720 Speaker 1: Will MacAskill, I'm like, you, you remember that thing you 1379 01:11:33,720 --> 01:11:36,000 Speaker 1: did with a whale back in the day? What if 1380 01:11:36,000 --> 01:11:38,400 Speaker 1: he didn't get out? What if a whale just eats him? 1381 01:11:38,479 --> 01:11:38,640 Speaker 4: You know? 1382 01:11:39,120 --> 01:11:41,880 Speaker 2: And then God'd be like, oh, amazing. I love playing 1383 01:11:41,920 --> 01:11:42,240 Speaker 2: the hits. 1384 01:11:42,400 --> 01:11:45,719 Speaker 1: I love it, great pitch. You know what, Robert, I'm gonna 1385 01:11:45,720 --> 01:11:48,240 Speaker 1: give you that HBO series you've been asking for. 1386 01:11:48,640 --> 01:11:51,120 Speaker 2: Robert, you were an amazing collaborator. 1387 01:11:52,120 --> 01:11:56,799 Speaker 1: Oh yeah, me and God, co-creators of my HBO series. 1388 01:11:57,120 --> 01:11:59,320 Speaker 1: I am hoping, if the strike goes on, they take 1389 01:11:59,360 --> 01:12:02,240 Speaker 1: my reality show pitch, Super Soaker Full of Piss, which 1390 01:12:02,280 --> 01:12:06,200 Speaker 1: I really think has some path, Jamie, I mean premise. 1391 01:12:07,600 --> 01:12:10,120 Speaker 2: No, go ahead, tell me the premise of Super Soaker 1392 01:12:10,160 --> 01:12:11,880 Speaker 2: Full of Piss, Robert. I'm ready. 1393 01:12:12,000 --> 01:12:14,599 Speaker 1: So I'm in a van.
I filled a super soaker 1394 01:12:14,640 --> 01:12:18,080 Speaker 1: with my piss, and I drive around, like, Rodeo Drive, 1395 01:12:18,320 --> 01:12:20,479 Speaker 1: and I get out when I see someone who looks 1396 01:12:20,520 --> 01:12:22,880 Speaker 1: famous and I soak them with a super soaker full 1397 01:12:22,920 --> 01:12:25,600 Speaker 1: of piss, and then we film it and then I 1398 01:12:25,680 --> 01:12:26,639 Speaker 1: leave very quickly. 1399 01:12:27,760 --> 01:12:29,400 Speaker 2: Okay, I mean I'm on board with that. 1400 01:12:29,720 --> 01:12:31,160 Speaker 1: Yeah, I think it's a great, I think it's good. 1401 01:12:31,320 --> 01:12:34,640 Speaker 1: You know, I will get, you know, I don't know. 1402 01:12:35,160 --> 01:12:37,280 Speaker 2: I would be kind of pro, like, if you could 1403 01:12:37,320 --> 01:12:39,960 Speaker 2: get, I think the soundtrack is going to be really 1404 01:12:40,120 --> 01:12:42,160 Speaker 2: key there. Like, I think if you could give me 1405 01:12:42,200 --> 01:12:44,880 Speaker 2: some, like, Jock Jams situation, like. 1406 01:12:45,680 --> 01:12:50,120 Speaker 1: This is exciting. All live, all live editions of Blink 1407 01:12:50,160 --> 01:12:53,439 Speaker 1: one eighty two songs, dude, you know, because they, they 1408 01:12:53,439 --> 01:12:55,759 Speaker 1: are one of the worst live bands that ever played. 1409 01:12:55,800 --> 01:13:00,400 Speaker 1: So it's really just upsetting to the, to the viewer. 1410 01:13:00,479 --> 01:13:02,120 Speaker 1: That's the goal here, and you know. 1411 01:13:02,160 --> 01:13:05,160 Speaker 2: Given who Blink one eighty two, like, rolls with these days, 1412 01:13:05,320 --> 01:13:08,559 Speaker 2: you may in fact run into Travis on Rodeo Drive. 1413 01:13:08,720 --> 01:13:10,519 Speaker 2: Oh yeah, utial full. 1414 01:13:10,320 --> 01:13:14,120 Speaker 1: Poet right in the face, just, just a full load 1415 01:13:14,160 --> 01:13:16,280 Speaker 1: of it, you know, just to say.
1416 01:13:16,640 --> 01:13:19,599 Speaker 3: I hate this idea because it means that some fucking 1417 01:13:19,680 --> 01:13:23,000 Speaker 3: celebrity will murder you and then you'll be gone, and 1418 01:13:23,000 --> 01:13:23,840 Speaker 3: then I'll be sad. 1419 01:13:24,280 --> 01:13:27,200 Speaker 1: I'm gonna be honest, I'm not great at recognizing celebrities, 1420 01:13:27,200 --> 01:13:29,400 Speaker 1: so I'm, anytime I could just see someone in a 1421 01:13:29,439 --> 01:13:32,280 Speaker 1: suit and spray him with the piss. Yeah, Jamie, I 1422 01:13:33,160 --> 01:13:38,600 Speaker 1: know we can sell. No, we just roll down the 1423 01:13:38,600 --> 01:13:42,040 Speaker 1: streets and see Nick Jonas, Robert, spray, spray, spray, get the 1424 01:13:42,040 --> 01:13:42,880 Speaker 1: fuck out of here. 1425 01:13:43,200 --> 01:13:46,759 Speaker 2: Wow, Nick Jonas, Robert. 1426 01:13:46,920 --> 01:13:49,960 Speaker 1: That's right, that's right, that's right. I'm going to rely 1427 01:13:50,040 --> 01:13:52,200 Speaker 1: on Jamie to recognize him though. 1428 01:13:52,120 --> 01:13:54,120 Speaker 3: Oh my god, I did see that, Jamie. 1429 01:13:54,840 --> 01:13:58,240 Speaker 2: Yeah, all right, well, the Jonas Brothers have their own 1430 01:13:58,360 --> 01:14:02,120 Speaker 2: vanity popcorn brand now, isn't that something? These are the 1431 01:14:02,160 --> 01:14:04,280 Speaker 2: amazing things I can teach you, Robert. 1432 01:14:06,040 --> 01:14:09,120 Speaker 1: You've already taught me so much about hot dogs through 1433 01:14:09,160 --> 01:14:11,160 Speaker 1: your best-selling book Raw Dog. 1434 01:14:11,760 --> 01:14:13,799 Speaker 2: Wow. Perfect trans pivot. 1435 01:14:13,840 --> 01:14:14,320 Speaker 1: You're welcome. 1436 01:14:14,439 --> 01:14:19,040 Speaker 2: A little spicy plug though. Gorgeous, gorgeous plug. Hey, it's 1437 01:14:19,040 --> 01:14:22,000 Speaker 2: never too late to start reading about hot dogs.
It's 1438 01:14:22,040 --> 01:14:23,920 Speaker 2: never too late, never learning. 1439 01:14:24,280 --> 01:14:30,160 Speaker 1: Reading about hot dogs, and also America. Fascinating story. Raw 1440 01:14:30,200 --> 01:14:32,280 Speaker 1: Dog, find it wherever books are found. 1441 01:14:33,840 --> 01:14:37,160 Speaker 2: Yeah, well, thank you so much. I truly, I was. 1442 01:14:37,400 --> 01:14:41,280 Speaker 2: I mean, as you know, I did a Bastards episode 1443 01:14:41,360 --> 01:14:44,120 Speaker 2: about hot dogs as I was writing that book so 1444 01:14:44,160 --> 01:14:45,920 Speaker 2: that I would remain focused. 1445 01:14:46,800 --> 01:14:47,800 Speaker 1: And it's the best way. 1446 01:14:48,479 --> 01:14:52,799 Speaker 2: And I have heard that, with the subject of that episode, George, 1447 01:14:52,840 --> 01:14:57,280 Speaker 2: the hot dog eating community is actively protecting 1448 01:14:57,320 --> 01:14:59,679 Speaker 2: him from its existence. He does not know it exists. 1449 01:15:00,040 --> 01:15:03,680 Speaker 2: He does not know about the book. Everyone in his life is 1450 01:15:03,880 --> 01:15:07,920 Speaker 2: really actively trying. Like, every hot dog eater, or many 1451 01:15:08,320 --> 01:15:11,320 Speaker 2: eaters I talked to, were like, yeah, no, we know 1452 01:15:11,360 --> 01:15:13,439 Speaker 2: about the Bastards episode and we know about the book, 1453 01:15:13,479 --> 01:15:15,880 Speaker 2: but we really don't want George to know about it. 1454 01:15:15,920 --> 01:15:17,320 Speaker 2: I was like, okay, fair enough. 1455 01:15:17,240 --> 01:15:21,800 Speaker 1: Beautiful, Jamie. That made me feel great. You can 1456 01:15:21,840 --> 01:15:24,320 Speaker 1: sign up for this show and all other Cool Zone 1457 01:15:24,360 --> 01:15:28,920 Speaker 1: shows ad-free at Cooler Zone Media. That's for Apple subscribers. 1458 01:15:28,960 --> 01:15:31,200 Speaker 1: We are working on the Android option.
You can find 1459 01:15:31,280 --> 01:15:34,320 Speaker 1: my novel After the Revolution, by typing After the Revolution 1460 01:15:34,439 --> 01:15:37,760 Speaker 1: into whatever book buying site you use, or just walk 1461 01:15:37,880 --> 01:15:41,680 Speaker 1: into a bookstore and demand it from the manager at 1462 01:15:41,720 --> 01:15:47,760 Speaker 1: sword point. Anyway, Goodbye, Goodbye Bye. 1463 01:15:48,920 --> 01:15:51,639 Speaker 3: Behind the Bastards is a production of cool Zone Media. 1464 01:15:52,000 --> 01:15:55,280 Speaker 3: For more from cool Zone Media, visit our website coolzonemedia 1465 01:15:55,439 --> 01:15:58,679 Speaker 3: dot com, or check us out on the iHeartRadio app, 1466 01:15:58,720 --> 01:16:01,120 Speaker 3: Apple Podcasts, or where you get your podcasts.