1 00:00:02,520 --> 00:00:06,760 Speaker 1: Media. Oh, wow, Sophie, I don't know. I don't know 2 00:00:06,800 --> 00:00:08,800 Speaker 1: if we can put that on the air. I mean, 3 00:00:08,840 --> 00:00:11,240 Speaker 1: that's actionable threats against a sitting. 4 00:00:12,600 --> 00:00:12,879 Speaker 2: Wow. 5 00:00:13,280 --> 00:00:16,000 Speaker 1: I mean, just Sophie, that's it's just dangerous to be 6 00:00:16,040 --> 00:00:18,800 Speaker 1: saying stuff like that in this day and age. I know, 7 00:00:19,520 --> 00:00:23,360 Speaker 1: let's distract the audience and the federal agents listening in 8 00:00:23,480 --> 00:00:27,760 Speaker 1: by bringing on our guest today who definitely doesn't say 9 00:00:27,800 --> 00:00:32,720 Speaker 1: stuff like that. Langston Kerman, Langston, do you condemn Sophie's statements, 10 00:00:32,800 --> 00:00:35,000 Speaker 1: assuming they ever get out in unedited form? 11 00:00:35,320 --> 00:00:37,960 Speaker 2: I was. I was tempted to hang up the Zoom 12 00:00:38,040 --> 00:00:42,600 Speaker 2: right away. It was so inflammatory that I said, I 13 00:00:42,640 --> 00:00:43,519 Speaker 2: can't be a part of this. 14 00:00:43,880 --> 00:00:46,760 Speaker 1: No. No, it's time to bring peace to the country. 15 00:00:46,840 --> 00:00:47,040 Speaker 2: You know. 16 00:00:47,240 --> 00:00:49,360 Speaker 1: That's that's what we all need to be focusing on. 17 00:00:49,320 --> 00:00:52,480 Speaker 2: Is peace. We got to get back to what we were, 18 00:00:52,600 --> 00:00:54,360 Speaker 2: which was normal and peace. 19 00:00:54,200 --> 00:00:56,680 Speaker 1: Was normal and good. Everyone knows things used to be 20 00:00:56,720 --> 00:01:02,360 Speaker 1: good in this country for everybody, right, that's chilling for all. Yeah, 21 00:01:02,520 --> 00:01:05,600 Speaker 1: things were so great, And I think what'll get us 22 00:01:05,640 --> 00:01:09,000 Speaker 1: back to that is talking about puppies. Right.
Everybody loves 23 00:01:09,000 --> 00:01:12,920 Speaker 1: a good puppy, right, puppies. Puppies are wonderful, and there's 24 00:01:12,959 --> 00:01:18,360 Speaker 1: been some really what is he talking about? God, we're 25 00:01:18,360 --> 00:01:20,880 Speaker 1: gonna talk about puppy mills. No, don't worry, this is 26 00:01:20,880 --> 00:01:22,960 Speaker 1: a fun one, everybody. We're gonna have a good time 27 00:01:23,000 --> 00:01:24,800 Speaker 1: this week. Like it is a guy I think is 28 00:01:24,840 --> 00:01:26,440 Speaker 1: a piece of shit, but it's gonna be fun. 29 00:01:26,800 --> 00:01:26,920 Speaker 3: YEA. 30 00:01:27,000 --> 00:01:29,880 Speaker 2: I will say before you even go, one of the 31 00:01:29,880 --> 00:01:33,880 Speaker 2: only taboos that exists in film and television is murdering dogs. Yes, 32 00:01:35,640 --> 00:01:38,040 Speaker 2: and I'm getting so scared before we even start. 33 00:01:38,080 --> 00:01:41,480 Speaker 1: I know nothing, No dogs are provably harmed yet and 34 00:01:42,000 --> 00:01:43,840 Speaker 1: in the making of this story, although I do have 35 00:01:43,840 --> 00:01:47,880 Speaker 1: to specify yet, but there is like a dog like 36 00:01:47,920 --> 00:01:50,880 Speaker 1: creature involved. Because if you if you spend any time online, 37 00:01:50,880 --> 00:01:52,639 Speaker 1: have you been on social media or just been watching 38 00:01:52,640 --> 00:01:54,640 Speaker 1: the news. I think probably close to one hundred percent 39 00:01:54,640 --> 00:01:57,120 Speaker 1: of our audience caught this. 
There was a big story 40 00:01:57,160 --> 00:01:59,480 Speaker 1: a couple of weeks back about how this company brought 41 00:01:59,520 --> 00:02:02,880 Speaker 1: back the dire wolf, right, which is an extinct kind 42 00:02:02,920 --> 00:02:05,720 Speaker 1: of wolf, and they did it using some I think 43 00:02:05,720 --> 00:02:09,280 Speaker 1: we can call it Jurassic Park style machinations, right, Like 44 00:02:09,600 --> 00:02:12,799 Speaker 1: that's what everyone thought of, And this is all the 45 00:02:12,840 --> 00:02:16,520 Speaker 1: work of a of an actual like science like bioscience 46 00:02:16,560 --> 00:02:20,880 Speaker 1: startup called Colossal Biosciences, which is just by name, a 47 00:02:20,960 --> 00:02:23,840 Speaker 1: company that could not sound more like it belonged in 48 00:02:23,840 --> 00:02:26,919 Speaker 1: a Michael Crichton novel if they just called it InGen, right, 49 00:02:27,080 --> 00:02:28,480 Speaker 1: like it's it's amazing. 50 00:02:28,800 --> 00:02:31,600 Speaker 2: It really feels like a fourth grader was, like. 51 00:02:31,639 --> 00:02:35,560 Speaker 1: I got it, like when I was in fourth grade 52 00:02:35,639 --> 00:02:37,120 Speaker 1: trying to rewrite Jurassic Park. 53 00:02:37,240 --> 00:02:40,639 Speaker 2: Yes, okay, okay, we'll get sued. 54 00:02:40,800 --> 00:02:43,360 Speaker 1: That's fine, it's fine, it's fine. Michael Crichton's not going 55 00:02:43,400 --> 00:02:50,239 Speaker 1: to sue a four year old Colossal Science original. Yeah, 56 00:02:50,360 --> 00:02:58,960 Speaker 1: so it'll publish this. So most news coverage of this 57 00:02:59,000 --> 00:03:04,000 Speaker 1: whole Dire Wolf thing kinda casually accepted the PR claims 58 00:03:04,040 --> 00:03:07,799 Speaker 1: being made by Colossal and its co-founder slash scientific spokesman, 59 00:03:08,240 --> 00:03:12,560 Speaker 1: doctor George Church.
Time published an article with the title 60 00:03:12,840 --> 00:03:15,680 Speaker 1: The Return of the Dire Wolf, and it's it's as 61 00:03:15,800 --> 00:03:18,680 Speaker 1: hype an article as it could possibly be, And on 62 00:03:18,720 --> 00:03:20,959 Speaker 1: the front of it, there's a photo of a very 63 00:03:21,120 --> 00:03:24,920 Speaker 1: charismatic looking wolf as that I mean, that's a beautiful wolf, right, 64 00:03:25,000 --> 00:03:30,080 Speaker 1: that charm that that is a screen ready wolf. 65 00:03:30,639 --> 00:03:33,520 Speaker 2: You can tell you can tell that wolf knows how 66 00:03:33,560 --> 00:03:36,360 Speaker 2: the business works. That's not a wolf that you gotta like, 67 00:03:36,520 --> 00:03:38,520 Speaker 2: uh put a caretaker on, you. 68 00:03:38,520 --> 00:03:41,080 Speaker 1: know, you gotta. They said that they put some dire 69 00:03:41,080 --> 00:03:43,240 Speaker 1: wolf genes into this wolf. I think they might have 70 00:03:43,280 --> 00:03:45,320 Speaker 1: stuck one or two Tom Cruise genes in there, because 71 00:03:45,320 --> 00:03:47,240 Speaker 1: that wolf knows where the camera is, right. 72 00:03:47,560 --> 00:03:49,960 Speaker 2: That wolf, That wolf does its own stunts. 73 00:03:50,520 --> 00:03:54,600 Speaker 1: Wolf does its own stunts, right. Obviously it's a good 74 00:03:54,600 --> 00:03:57,640 Speaker 1: looking wolf. No one's throwing shade against these animals here. 75 00:03:57,680 --> 00:04:02,000 Speaker 1: They're gorgeous, but they're not wolves, right. That's kind of 76 00:04:02,040 --> 00:04:04,440 Speaker 1: where we're starting here. It gets much more fucked up 77 00:04:04,480 --> 00:04:07,800 Speaker 1: than that. Dire Wolves were a very real species of 78 00:04:07,840 --> 00:04:10,480 Speaker 1: wolf which roamed the Americas.
They were found in parts 79 00:04:10,520 --> 00:04:13,440 Speaker 1: of both North and South America from the late Pleistocene 80 00:04:13,520 --> 00:04:16,600 Speaker 1: to the early Holocene period, which is a span 81 00:04:16,720 --> 00:04:20,760 Speaker 1: of somewhere over one hundred thousand years. Its bite force, like, 82 00:04:20,800 --> 00:04:23,520 Speaker 1: when these animals lived, they had a stronger bite force 83 00:04:23,680 --> 00:04:27,440 Speaker 1: than any known modern wolves, so they were pretty formidable. 84 00:04:27,960 --> 00:04:30,320 Speaker 1: But the reputation for them being like the size of 85 00:04:30,400 --> 00:04:33,880 Speaker 1: horses is something they largely accrued via Game of Thrones 86 00:04:33,920 --> 00:04:37,440 Speaker 1: because dire wolves were around the same size as the 87 00:04:37,520 --> 00:04:40,359 Speaker 1: largest modern wolves, a little bigger but we're talking like 88 00:04:40,480 --> 00:04:43,240 Speaker 1: ten or twenty pounds heavier than like a Yukon wolf 89 00:04:43,279 --> 00:04:44,080 Speaker 1: on average. 90 00:04:44,400 --> 00:04:49,640 Speaker 2: And there was a sub Right, there's a pretty extensive 91 00:04:49,680 --> 00:04:54,400 Speaker 2: collection of dire wolf bones in the La Brea Tar Pits Museum. 92 00:04:54,640 --> 00:04:57,960 Speaker 2: Oh right, and boy was I disappointed when I saw 93 00:04:58,000 --> 00:04:59,120 Speaker 2: how little those. 94 00:04:58,880 --> 00:05:02,000 Speaker 1: Bones were, right, he gave a throat shit. 95 00:05:02,160 --> 00:05:04,240 Speaker 2: Yeah, man, I walked in there and I said, I'm 96 00:05:04,240 --> 00:05:07,000 Speaker 2: gonna see the biggest wolf that's that ever was. And 97 00:05:07,080 --> 00:05:11,920 Speaker 2: those wolves look like schnauzers. They're tiny little dogs as 98 00:05:11,920 --> 00:05:13,159 Speaker 2: far as I'm concerned.
99 00:05:13,480 --> 00:05:15,720 Speaker 1: Yeah, because you I mean, like, the average size of 100 00:05:15,720 --> 00:05:17,520 Speaker 1: a dire wolf was like one hundred and fifty pounds, 101 00:05:17,520 --> 00:05:20,559 Speaker 1: which is like a good sized canid. But like, I've 102 00:05:20,600 --> 00:05:23,479 Speaker 1: known dogs bigger than that. I've known some two hundred 103 00:05:23,960 --> 00:05:28,000 Speaker 1: pound mastiffs, right like, and they're not that big on average. Yeah. No, 104 00:05:29,120 --> 00:05:31,800 Speaker 1: so yeah, again, these are they're bite in terms of 105 00:05:31,800 --> 00:05:34,360 Speaker 1: bite force, very formidable animals, but they're not huge. 106 00:05:34,480 --> 00:05:34,600 Speaker 2: Like. 107 00:05:34,640 --> 00:05:36,760 Speaker 1: That's stuff that George R. R. Martin put in his book, 108 00:05:36,800 --> 00:05:38,599 Speaker 1: because George R. R. Martin knows how to make a 109 00:05:38,600 --> 00:05:41,599 Speaker 1: book cool. Right, Yeah, you gotta zhuzh up reality a 110 00:05:41,600 --> 00:05:44,800 Speaker 1: little bit, you know, especially if it's a fantasy novel. 111 00:05:45,000 --> 00:05:46,880 Speaker 2: He knows how to make a book cool, and he 112 00:05:46,960 --> 00:05:49,920 Speaker 2: knows how to make a hat cool. He's got cool hats, 113 00:05:50,200 --> 00:05:52,280 Speaker 2: cool books and scarves. 114 00:05:52,160 --> 00:05:55,440 Speaker 1: Cool hats, cool books, and he's he's achieved. I have 115 00:05:55,520 --> 00:05:58,320 Speaker 1: a lot of respect for George. He's achieved every writer's dream, 116 00:05:58,360 --> 00:06:01,400 Speaker 1: which is to never have to write again, right like that, 117 00:06:01,400 --> 00:06:04,160 Speaker 1: that's what we're all shooting for. So just live in 118 00:06:04,200 --> 00:06:07,480 Speaker 1: a lighthouse and never finish your series, I say, as 119 00:06:07,520 --> 00:06:12,920 Speaker 1: I'm two years overdue with my novel.
But the name 120 00:06:13,000 --> 00:06:15,679 Speaker 1: of the species is presumably the major thing that inspired 121 00:06:15,720 --> 00:06:16,120 Speaker 1: George R. 122 00:06:16,160 --> 00:06:16,360 Speaker 2: Martin. 123 00:06:16,400 --> 00:06:18,840 Speaker 1: It's just a cool name, dire wolf, Like there's it's 124 00:06:19,160 --> 00:06:21,240 Speaker 1: dire wolves have been in D and D before George 125 00:06:21,240 --> 00:06:23,200 Speaker 1: put them in his books, right because it's just a 126 00:06:23,240 --> 00:06:25,039 Speaker 1: cool thing to call a wolf. It's like, yeah, that 127 00:06:25,080 --> 00:06:30,360 Speaker 1: sounds like a scarier wolf to me. And Colossal Biosciences 128 00:06:30,880 --> 00:06:34,320 Speaker 1: knowing and being primarily this is a company that describes 129 00:06:34,360 --> 00:06:37,120 Speaker 1: themselves as being in gene sciences. They're in the PR 130 00:06:37,240 --> 00:06:39,800 Speaker 1: business as much as anything else, and they made the 131 00:06:39,839 --> 00:06:43,159 Speaker 1: wise decision to rely heavily on the popularity of the 132 00:06:43,200 --> 00:06:46,240 Speaker 1: Game of Thrones books and TV shows to act as 133 00:06:46,320 --> 00:06:50,680 Speaker 1: advertising for their quote unquote dire wolf right. And in 134 00:06:50,720 --> 00:06:54,000 Speaker 1: fact this is even written into that fawning Time coverage. 135 00:06:54,040 --> 00:06:57,240 Speaker 1: And here's a quote from that article.
Relying on deft 136 00:06:57,279 --> 00:07:02,120 Speaker 1: genetic engineering and ancient preserved DNA, Colossal scientists deciphered the 137 00:07:02,160 --> 00:07:05,200 Speaker 1: dire wolf genome, rewrote the genetic code of the common 138 00:07:05,200 --> 00:07:07,960 Speaker 1: gray wolf to match it, and, using domestic dogs as 139 00:07:08,000 --> 00:07:11,560 Speaker 1: surrogate mothers, brought Romulus, Remus and their sister, two month 140 00:07:11,560 --> 00:07:15,160 Speaker 1: old Khaleesi into the world during three separate births last 141 00:07:15,160 --> 00:07:17,960 Speaker 1: fall and this winter, effectively for the first time, de 142 00:07:18,080 --> 00:07:21,679 Speaker 1: extincting a line of beasts whose live gene pool vanished long ago. 143 00:07:22,280 --> 00:07:24,920 Speaker 1: Time met the males. Khaleesi was not present due to 144 00:07:24,920 --> 00:07:27,040 Speaker 1: her young age, at a fenced field in a US 145 00:07:27,080 --> 00:07:29,920 Speaker 1: wildlife facility on March twenty fourth, on the condition that 146 00:07:29,960 --> 00:07:32,600 Speaker 1: their location remain a secret to protect the animals from 147 00:07:32,640 --> 00:07:37,320 Speaker 1: prying eyes. Now, naming a dire wolf after a character 148 00:07:37,360 --> 00:07:39,120 Speaker 1: in the books who had nothing to do with dire 149 00:07:39,160 --> 00:07:42,080 Speaker 1: wolves was by far the cringiest possible choice here. 150 00:07:41,800 --> 00:07:43,920 Speaker 4: Missed opportunity. 151 00:07:44,240 --> 00:07:45,880 Speaker 1: There were dire wolves with names. 152 00:07:46,360 --> 00:07:48,760 Speaker 2: There were a lot of Starks. They could have just 153 00:07:48,840 --> 00:07:51,520 Speaker 2: gone down the Stark lineage they didn't have to go 154 00:07:51,560 --> 00:07:52,000 Speaker 2: to Khaleesi. 155 00:07:52,360 --> 00:07:54,640 Speaker 1: Literally, she had nothing to do with the wolves.
Did 156 00:07:54,680 --> 00:07:56,080 Speaker 1: she even meet any of those? 157 00:07:57,120 --> 00:07:58,440 Speaker 2: A dragon lady? 158 00:07:58,840 --> 00:08:01,880 Speaker 5: She met the Yes, that. 159 00:08:01,960 --> 00:08:05,440 Speaker 1: She met Okay, she met Jon's maybe okay, Ghost, right, Okay, 160 00:08:05,560 --> 00:08:07,360 Speaker 1: so maybe one we. 161 00:08:07,800 --> 00:08:08,600 Speaker 4: don't talk about that. 162 00:08:08,680 --> 00:08:11,200 Speaker 1: Last season, we don't talk about that. I got really 163 00:08:11,240 --> 00:08:12,200 Speaker 1: sad for a second. 164 00:08:13,360 --> 00:08:16,200 Speaker 2: But it's the bummer, folks, should you should do that 165 00:08:16,280 --> 00:08:18,520 Speaker 2: season as one of the Bastards from Yeah, that's. 166 00:08:18,400 --> 00:08:21,960 Speaker 1: what we're working on, a six-parter, but having the animals. 167 00:08:22,000 --> 00:08:24,440 Speaker 1: They also had the animals pose with George R. R. Martin 168 00:08:24,480 --> 00:08:26,400 Speaker 1: as part of like the press tour, and that was 169 00:08:26,440 --> 00:08:28,760 Speaker 1: a particular choice. First off, look at this which again 170 00:08:29,000 --> 00:08:31,000 Speaker 1: no shade on George. I want to hold a wolf 171 00:08:31,000 --> 00:08:36,160 Speaker 1: pup right, absolutely, yeah, Like it looks cuddly as hell. 172 00:08:36,520 --> 00:08:39,280 Speaker 1: But that's also kind of part of the problem because 173 00:08:39,280 --> 00:08:42,520 Speaker 1: like that wolf is actively yawning, right, Like, it seems 174 00:08:42,559 --> 00:08:45,520 Speaker 1: pretty chill to be there. And it's interesting to me 175 00:08:45,920 --> 00:08:48,840 Speaker 1: because if the information given to the team at Time 176 00:08:49,000 --> 00:08:53,160 Speaker 1: by Colossal Biosciences was accurate, there's no way this photo 177 00:08:53,200 --> 00:08:55,920 Speaker 1: should exist.
Here's what Time claimed right at the start 178 00:08:55,960 --> 00:08:59,240 Speaker 1: of their article. The angelic exuberance puppies exhibit in the 179 00:08:59,240 --> 00:09:02,000 Speaker 1: presence of humans, trotting up for hugs, belly rubs, kisses, 180 00:09:02,080 --> 00:09:05,000 Speaker 1: is completely absent. They keep their distance, retreating if a 181 00:09:05,040 --> 00:09:07,760 Speaker 1: person approaches. Even one of the handlers who raised 182 00:09:07,800 --> 00:09:10,120 Speaker 1: them from birth can only get so close before Romulus 183 00:09:10,120 --> 00:09:13,360 Speaker 1: and Remus flinch and retreat. This isn't domestic canine behavior. 184 00:09:13,400 --> 00:09:16,400 Speaker 1: This is wild lupine behavior. The pups are wolves. Not 185 00:09:16,440 --> 00:09:19,040 Speaker 1: only that they're dire wolves, which means they have cause 186 00:09:19,080 --> 00:09:23,000 Speaker 1: to be lonely, and again just genetically they're not dire wolves. 187 00:09:23,000 --> 00:09:26,199 Speaker 1: But also, why is George cuddling that animal? Then if 188 00:09:26,200 --> 00:09:28,800 Speaker 1: you can't, you're either forcing the animal into 189 00:09:28,840 --> 00:09:32,400 Speaker 1: a situation that makes it distinctly uncomfortable, or the animal 190 00:09:32,440 --> 00:09:34,520 Speaker 1: looks like it's yawning, so maybe they're just not as 191 00:09:34,600 --> 00:09:36,000 Speaker 1: wolfy as you're pretending. 192 00:09:36,160 --> 00:09:38,480 Speaker 2: Yeah, you're really trying to weave a story here. And 193 00:09:38,520 --> 00:09:41,920 Speaker 2: that's that's a nice dog. That seems like a really polite, 194 00:09:42,120 --> 00:09:43,000 Speaker 2: uh, sweet dog. 195 00:09:43,040 --> 00:09:46,120 Speaker 1: Shit, it looks like a husky. Yeah, I don't know man. 196 00:09:46,240 --> 00:09:49,439 Speaker 4: And frankly, George, why are you not finishing the books?
197 00:09:49,880 --> 00:09:51,000 Speaker 2: Well, he's got a lot of looks. 198 00:09:51,360 --> 00:09:53,080 Speaker 1: I'm not going to give him shit for that. Again, 199 00:09:53,280 --> 00:09:56,400 Speaker 1: I also haven't, And if I have the chance to 200 00:09:56,400 --> 00:09:58,680 Speaker 1: cuddle a wolf rather than spend another day working on 201 00:09:58,760 --> 00:10:00,800 Speaker 1: my novel, I would be attached to that wolf. 202 00:10:00,880 --> 00:10:02,640 Speaker 4: So fucking fast. 203 00:10:03,320 --> 00:10:05,960 Speaker 1: The man owns a lighthouse, how you expect him to 204 00:10:05,960 --> 00:10:09,640 Speaker 1: finish a book? He's finished other books. I'll give him 205 00:10:09,640 --> 00:10:11,720 Speaker 1: shit for his involvement in this company, though. 206 00:10:11,960 --> 00:10:13,880 Speaker 2: That's cool to make that much money and be like, 207 00:10:13,960 --> 00:10:15,000 Speaker 2: I'm a buy a lighthouse. 208 00:10:15,160 --> 00:10:16,120 Speaker 1: I'm gonna get a lighthouse. 209 00:10:16,280 --> 00:10:19,400 Speaker 2: Yeah, of course I'm not sleeping where normal people sleep anymore. 210 00:10:19,800 --> 00:10:21,080 Speaker 2: I've got a different thing going on. 211 00:10:21,480 --> 00:10:27,600 Speaker 1: No, I'm going to recreate that great Robert Pattinson movie. 212 00:10:27,480 --> 00:10:31,600 Speaker 2: Where everybody was fine at the end, everyone? No. 213 00:10:31,600 --> 00:10:36,040 Speaker 1: No, really good movie with a good ending. So the 214 00:10:36,080 --> 00:10:38,679 Speaker 1: fact that there's this photo of George R. R. Martin with 215 00:10:38,679 --> 00:10:41,040 Speaker 1: one of these dire wolves makes a lot more sense 216 00:10:41,040 --> 00:10:43,319 Speaker 1: when you learn a few things about both the company 217 00:10:43,360 --> 00:10:46,360 Speaker 1: behind these animals and the actual science behind the project itself. 218 00:10:46,360 --> 00:10:49,280 Speaker 1: For one thing, George R.
R. Martin is an investor in 219 00:10:49,360 --> 00:10:53,800 Speaker 1: Colossal Biosciences and also an advisor to the company, which, 220 00:10:54,320 --> 00:10:57,800 Speaker 1: advisor in what? George R. R. Martin's a number of things: 221 00:10:57,840 --> 00:11:01,640 Speaker 1: he's not a scientist, he's not a geneticist. He's not 222 00:11:01,679 --> 00:11:03,640 Speaker 1: an expert in real dire wolves. 223 00:11:03,440 --> 00:11:05,120 Speaker 2: Because of sunglasses. 224 00:11:05,679 --> 00:11:11,000 Speaker 1: He has sunglasses, like he invented fake dire wolves for his novels. 225 00:11:11,040 --> 00:11:14,200 Speaker 1: I don't understand, Like, under what circumstances would he be 226 00:11:14,280 --> 00:11:17,920 Speaker 1: an advisor to this company doing genetics work. That's like 227 00:11:18,000 --> 00:11:21,520 Speaker 1: if they hired the guy who played doctor Alan Grant 228 00:11:21,600 --> 00:11:24,320 Speaker 1: to advise a company cloning dinosaurs. It's like, well, but 229 00:11:25,040 --> 00:11:29,079 Speaker 1: he doesn't really know anything about dinosaurs. 230 00:11:30,000 --> 00:11:34,240 Speaker 2: Yeah, he's actually he doesn't even speak with that accent. 231 00:11:34,720 --> 00:11:35,480 Speaker 2: He's pretending. 232 00:11:35,840 --> 00:11:38,920 Speaker 1: Yeah, Like it's like bringing Jeff Goldblum onto the project. Well, 233 00:11:38,960 --> 00:11:41,200 Speaker 1: you know, if you if you're trying to like, I 234 00:11:41,280 --> 00:11:44,199 Speaker 1: just don't think he has the expertise. Nothing against Jeff. 235 00:11:44,520 --> 00:11:46,560 Speaker 2: If you want to flirt with the dinosaur. 236 00:11:47,080 --> 00:11:48,600 Speaker 1: People, yes, bring Jeff in. 237 00:11:48,760 --> 00:11:52,400 Speaker 2: Yeah you want them dinosaurs horny as hell, get Jeff.
238 00:11:52,440 --> 00:11:54,800 Speaker 2: But otherwise you gotta yeah, he could do that shit 239 00:11:55,000 --> 00:11:56,960 Speaker 2: you gotta talk to a scientist you might. 240 00:11:56,880 --> 00:11:59,160 Speaker 1: want to bring in, like Robert Bakker or someone if 241 00:11:59,160 --> 00:12:03,400 Speaker 1: he's still alive. But anyway, An article by Michael Hiltzik 242 00:12:03,440 --> 00:12:06,880 Speaker 1: for the Los Angeles Times explains how Martin is 243 00:12:07,040 --> 00:12:08,560 Speaker 1: being credited as an advisor. 244 00:12:08,600 --> 00:12:08,800 Speaker 2: Here. 245 00:12:09,280 --> 00:12:11,600 Speaker 1: He's named as a co author on a technical paper 246 00:12:11,600 --> 00:12:14,720 Speaker 1: the company published as a non peer reviewed preprint describing 247 00:12:14,720 --> 00:12:17,280 Speaker 1: its de-extinction effort. The text credits him with the 248 00:12:17,320 --> 00:12:19,960 Speaker 1: review and editing of the paper's text, among thirty six 249 00:12:20,040 --> 00:12:23,400 Speaker 1: other credited co authors in that category. So he's one 250 00:12:23,440 --> 00:12:26,320 Speaker 1: of thirty six people who helped copy edit an article. 251 00:12:26,840 --> 00:12:31,240 Speaker 2: Yeah, this is okay to your point, this is just PR. 252 00:12:31,400 --> 00:12:35,200 Speaker 1: This is just PR. First off, there's everything else. Thirty 253 00:12:35,280 --> 00:12:37,360 Speaker 1: six people to edit an article. 254 00:12:37,640 --> 00:12:40,160 Speaker 2: And they didn't let those other thirty five people hold 255 00:12:40,200 --> 00:12:42,880 Speaker 2: that dog. And that's so fucked up, no, no, just George. 256 00:12:42,880 --> 00:12:46,360 Speaker 1: They just pressed George in there for that. Yeah, anyway, 257 00:12:46,400 --> 00:12:48,920 Speaker 1: to kind of enforce the point I made earlier, these wolves, 258 00:12:48,960 --> 00:12:52,200 Speaker 1: while very cute, are not dire wolves.
There's some genes 259 00:12:52,400 --> 00:12:55,800 Speaker 1: that they found while sequencing dire wolf genetics that have been 260 00:12:55,840 --> 00:12:59,240 Speaker 1: put into a normal wolf. But that doesn't It's kind 261 00:12:59,280 --> 00:13:03,000 Speaker 1: of like how like some people have some Neanderthal DNA 262 00:13:03,440 --> 00:13:07,800 Speaker 1: in there, but they're not themselves Neanderthals, right, They're just 263 00:13:07,880 --> 00:13:08,440 Speaker 1: they're people. 264 00:13:08,600 --> 00:13:12,400 Speaker 2: You know, that's gotta be a tough thing to figure 265 00:13:12,400 --> 00:13:14,520 Speaker 2: out for yourself. Though that I got a little bit 266 00:13:14,559 --> 00:13:16,000 Speaker 2: of that in there, because yeah. 267 00:13:16,040 --> 00:13:18,400 Speaker 1: You get some DNA from a species we wiped out. 268 00:13:18,679 --> 00:13:20,960 Speaker 2: Yeah you can kind of see it. And then then 269 00:13:21,200 --> 00:13:22,679 Speaker 2: that bums you out where you're like. 270 00:13:22,679 --> 00:13:32,480 Speaker 1: Oh sure, like Jon Hamm, I assume right. Yeah, So 271 00:13:32,720 --> 00:13:36,559 Speaker 1: Colossal Biosciences is not actually in the de-extinction business. 272 00:13:36,559 --> 00:13:40,559 Speaker 1: They are in the modifying animals genetically in ways that 273 00:13:40,720 --> 00:13:43,360 Speaker 1: in some cases hadn't been done before business, and that 274 00:13:43,559 --> 00:13:48,000 Speaker 1: is interesting, but it's not de-extinction. And so they 275 00:13:48,040 --> 00:13:51,600 Speaker 1: are doing stuff. They are doing something new and something 276 00:13:51,640 --> 00:13:54,440 Speaker 1: that is in some ways very interesting, but it's not 277 00:13:54,559 --> 00:13:57,200 Speaker 1: what they're claiming they're doing.
So I can't call this 278 00:13:57,280 --> 00:14:00,840 Speaker 1: a straight up con right, because they did make an 279 00:14:00,880 --> 00:14:04,760 Speaker 1: animal that didn't exist before, but it's also not a 280 00:14:04,840 --> 00:14:08,160 Speaker 1: dire wolf, and they're not de-extincting anything. And I 281 00:14:08,160 --> 00:14:11,120 Speaker 1: think the evidence shows they are massively inflating what they 282 00:14:11,160 --> 00:14:14,560 Speaker 1: and their technology can do in order to win VC funding. 283 00:14:15,320 --> 00:14:18,000 Speaker 1: The whole explanation as to why will take a while, 284 00:14:18,040 --> 00:14:20,280 Speaker 1: but I'm going to start by talking about the claims 285 00:14:20,280 --> 00:14:23,320 Speaker 1: that first brought the company public attention back in September 286 00:14:23,360 --> 00:14:26,119 Speaker 1: of twenty twenty one. A whole spate of almost identical 287 00:14:26,200 --> 00:14:30,320 Speaker 1: articles dropped announcing the creation of Colossal Biosciences, and their 288 00:14:30,360 --> 00:14:33,360 Speaker 1: plan to clone a woolly mammoth by twenty twenty seven, 289 00:14:33,520 --> 00:14:36,880 Speaker 1: so we got two more years before we see mammoths, right. 290 00:14:36,800 --> 00:14:39,720 Speaker 2: We gettin' woolly mammoths back, y'all, this is exciting. 291 00:14:39,680 --> 00:14:42,200 Speaker 1: Very soon, like probably right around the same time we 292 00:14:42,280 --> 00:14:44,840 Speaker 1: get Severance season three. You know, we're lucky. 293 00:14:45,240 --> 00:14:47,760 Speaker 2: You know what, it's about the same year. What sucks about 294 00:14:47,760 --> 00:14:52,239 Speaker 2: woolly mammoths too is they also are not bigger than elephants. 295 00:14:52,600 --> 00:14:53,160 Speaker 1: No.
296 00:14:53,160 --> 00:14:55,400 Speaker 2: No, I thought this whole time that woolly mammoths were 297 00:14:55,440 --> 00:14:58,560 Speaker 2: like these giant beasts that we would never be able 298 00:14:58,600 --> 00:15:01,280 Speaker 2: to see again. And they're like smaller than the average 299 00:15:01,360 --> 00:15:02,560 Speaker 2: African elephant. 300 00:15:02,880 --> 00:15:02,960 Speaker 3: No. 301 00:15:03,040 --> 00:15:05,760 Speaker 1: And again, it's one of those things, whenever people start 302 00:15:05,760 --> 00:15:07,440 Speaker 1: to think about, oh, it's it's a bummer, we've missed 303 00:15:07,440 --> 00:15:10,760 Speaker 1: all the coolest animals that existed. The largest thing 304 00:15:10,840 --> 00:15:13,480 Speaker 1: to ever exist on Earth is still around, swims in 305 00:15:13,560 --> 00:15:15,360 Speaker 1: the sea, and we're currently killing them. 306 00:15:17,040 --> 00:15:19,840 Speaker 4: Okay, something something you missed in this story now that 307 00:15:19,960 --> 00:15:22,880 Speaker 4: you've brought up the woolly mammoth is uh. Part of 308 00:15:22,920 --> 00:15:26,760 Speaker 4: the investors for this company are like a 309 00:15:26,640 --> 00:15:29,600 Speaker 1: I'm getting to it. I'm getting to that, so don't spoil 310 00:15:29,640 --> 00:15:31,240 Speaker 1: it. I was like, how do I know this? 311 00:15:31,560 --> 00:15:32,560 Speaker 1: Why do I know this 312 00:15:32,680 --> 00:15:32,920 Speaker 4: shit? 313 00:15:33,160 --> 00:15:36,280 Speaker 1: We're getting to the other investors in this fucking company. 314 00:15:36,320 --> 00:15:40,680 Speaker 1: Don't worry God. But here's a here's a representative example 315 00:15:40,760 --> 00:15:44,800 Speaker 1: from like the press explosion around this woolly mammoth claim.
316 00:15:44,840 --> 00:15:48,120 Speaker 1: So this is a CNBC article: Lab-grown woolly mammoths 317 00:15:48,120 --> 00:15:51,040 Speaker 1: could walk the Earth in six years if geneticist's new 318 00:15:51,080 --> 00:15:54,000 Speaker 1: startup succeeds. This is published in twenty twenty one. And 319 00:15:54,040 --> 00:15:57,280 Speaker 1: the geneticist that they're discussing, and the guy the article 320 00:15:57,400 --> 00:16:01,000 Speaker 1: is based around interviewing is co-founder of Colossal Biosciences, 321 00:16:01,080 --> 00:16:04,160 Speaker 1: doctor George Church, who claimed that he'd had the idea 322 00:16:04,320 --> 00:16:07,640 Speaker 1: kicking around for years and research supports this fact. He's 323 00:16:07,680 --> 00:16:09,920 Speaker 1: been pushing this idea in one form or another for 324 00:16:10,000 --> 00:16:12,320 Speaker 1: like a decade or more, but that he'd just been 325 00:16:12,360 --> 00:16:15,440 Speaker 1: given fifteen million dollars in seed funding and a company 326 00:16:15,480 --> 00:16:19,400 Speaker 1: had been established with serial entrepreneur Ben Lamm as CEO. 327 00:16:19,800 --> 00:16:23,400 Speaker 1: And we will talk about Ben Lamm more in part two. Church, 328 00:16:23,480 --> 00:16:26,560 Speaker 1: though, doctor George Church, is a real doctor and his 329 00:16:26,680 --> 00:16:29,720 Speaker 1: credentials are impeccable on paper. And just to state, this 330 00:16:29,760 --> 00:16:33,200 Speaker 1: guy's kind of a weird case where he's exaggerating a 331 00:16:33,240 --> 00:16:35,560 Speaker 1: lot and I think you could even argue lying about 332 00:16:35,600 --> 00:16:39,680 Speaker 1: some things. But he's a real scientist with some very 333 00:16:39,720 --> 00:16:41,440 Speaker 1: impressive achievements behind him. 334 00:16:41,640 --> 00:16:44,760 Speaker 2: And I think it's important for us to say that 335 00:16:45,520 --> 00:16:50,480 Speaker 2: scientists can be both legitimate and liars.
Oh, no shit, right, yeah, yes, 336 00:16:50,600 --> 00:16:54,040 Speaker 2: we often I think conflate it somehow, where scientists are 337 00:16:54,160 --> 00:16:57,240 Speaker 2: like these moral beings that exist above us all and no, 338 00:16:57,400 --> 00:17:00,440 Speaker 2: they can be liars and also really smart and capable people. 339 00:17:00,600 --> 00:17:00,760 Speaker 3: Right. 340 00:17:00,920 --> 00:17:03,160 Speaker 1: It's like you could be a great science fiction author 341 00:17:03,200 --> 00:17:07,440 Speaker 1: and racist as fuck, right, like those two things have existed, 342 00:17:07,760 --> 00:17:10,240 Speaker 1: or like Isaac Asimov where you're like, wow, what a 343 00:17:10,320 --> 00:17:14,080 Speaker 1: genius and also sex pest, right, like those things do 344 00:17:14,280 --> 00:17:17,760 Speaker 1: not conflict whatsoever. You know, it might have helped him, 345 00:17:17,800 --> 00:17:23,520 Speaker 1: might have helped him. Who knows. So, George Church's credentials. 346 00:17:23,520 --> 00:17:25,879 Speaker 1: I'm not calling Church a sex pest. Although he has some 347 00:17:25,920 --> 00:17:28,480 Speaker 1: shady involvement with people that we'll talk about, none of 348 00:17:28,480 --> 00:17:31,800 Speaker 1: it involves accusations of his specific behavior, just his 349 00:17:32,000 --> 00:17:36,280 Speaker 1: choice of company. Anyway, his academic credentials. Anyway, he is 350 00:17:36,320 --> 00:17:39,879 Speaker 1: the, oh, where this episode ends. Uh, you know, 351 00:17:40,359 --> 00:17:41,840 Speaker 1: I'm not going to give you a hint, but you're 352 00:17:41,840 --> 00:17:43,639 Speaker 1: you're going to be psyched. You're not gonna be psyched, 353 00:17:43,640 --> 00:17:47,520 Speaker 1: You're gonna be bummed.
He is the Robert Winthrop Professor 354 00:17:47,520 --> 00:17:50,560 Speaker 1: of Genetics at Harvard Medical School and a faculty member 355 00:17:50,600 --> 00:17:54,800 Speaker 1: at the Wyss Institute for Biologically Inspired Engineering, also at Harvard. 356 00:17:54,920 --> 00:17:59,040 Speaker 1: So you know, that alone, pretty big achievement. 357 00:17:58,640 --> 00:18:00,760 Speaker 2: Right, and he still holds those positions. 358 00:18:00,840 --> 00:18:04,600 Speaker 1: Yes, yes, as far as I'm aware. Yes. Church has 359 00:18:04,600 --> 00:18:07,520 Speaker 1: his name on more than one hundred patents. And you know, 360 00:18:07,640 --> 00:18:09,720 Speaker 1: some of those are things where like maybe he got 361 00:18:09,760 --> 00:18:11,760 Speaker 1: on there because he helped someone else, but a lot 362 00:18:11,800 --> 00:18:14,600 Speaker 1: of them are because he contributed really significant work to 363 00:18:14,640 --> 00:18:19,439 Speaker 1: those patents. He started the Personal Genome Project. And he 364 00:18:19,480 --> 00:18:23,239 Speaker 1: has also helped found more than twenty companies. Now, that 365 00:18:23,320 --> 00:18:25,080 Speaker 1: last claim was the first thing I read about him 366 00:18:25,080 --> 00:18:26,919 Speaker 1: that made me wonder, like, okay, is there something like 367 00:18:26,960 --> 00:18:29,600 Speaker 1: shady here? Because twenty companies is too many. No honest 368 00:18:29,640 --> 00:18:32,520 Speaker 1: man has founded more than twenty companies. You're doing some 369 00:18:32,600 --> 00:18:33,760 Speaker 1: fucked up shit, right. 370 00:18:34,359 --> 00:18:37,120 Speaker 2: You gotta focus, big man. That's a few too many companies. 371 00:18:37,520 --> 00:18:41,680 Speaker 1: That's too many companies.
And then, when I read about 372 00:18:41,720 --> 00:18:45,320 Speaker 1: the actual claims Colossal, through Church, was making about why 373 00:18:45,400 --> 00:18:48,200 Speaker 1: cloning mammoths was not just like a cool thing to do, 374 00:18:48,320 --> 00:18:52,520 Speaker 1: but like necessary for conservation, I went fully over the edge. 375 00:18:52,560 --> 00:18:54,000 Speaker 1: This is when I was like, okay, I got to 376 00:18:54,000 --> 00:18:55,760 Speaker 1: dig more into this guy. There's gotta be something fucked 377 00:18:55,800 --> 00:18:58,080 Speaker 1: up here. And he has made claims like this quote. 378 00:18:58,720 --> 00:19:01,600 Speaker 1: This is a quote from the article: proponents of the project, 379 00:19:01,680 --> 00:19:04,680 Speaker 1: and they're talking about Church, say rewilding the Arctic with 380 00:19:04,720 --> 00:19:07,639 Speaker 1: woolly mammoths could slow global warming by slowing the melting 381 00:19:07,680 --> 00:19:12,320 Speaker 1: of the permafrost where methane is currently trapped. That's not true. 382 00:19:13,000 --> 00:19:16,200 Speaker 1: Now, it has something to do with them stomping down 383 00:19:16,359 --> 00:19:18,840 Speaker 1: the ground to stop, like, trees from growing up so 384 00:19:18,880 --> 00:19:21,760 Speaker 1: the permafrost stays. But like, that's... if theoretically there were 385 00:19:21,800 --> 00:19:26,240 Speaker 1: a massive, healthy mammoth population, it might do that. We're 386 00:19:26,280 --> 00:19:29,600 Speaker 1: not... number one, they're not talking about making mammoths. 387 00:19:29,600 --> 00:19:32,359 Speaker 1: They're talking about modifying African elephants, as a spoiler for 388 00:19:32,400 --> 00:19:34,480 Speaker 1: what will be in part two.
But also, like, that's 389 00:19:34,520 --> 00:19:37,119 Speaker 1: just not a feasible place for this project to end, with, 390 00:19:37,160 --> 00:19:41,560 Speaker 1: like, massive herds of mammoths clomping across the tundra, say, 391 00:19:41,720 --> 00:19:45,600 Speaker 1: to fix... Also, no amount of mammoths is going to 392 00:19:45,640 --> 00:19:48,040 Speaker 1: fix global warming in its current... Like, the problem is 393 00:19:48,080 --> 00:19:51,000 Speaker 1: not just there's too many trees in Siberia. There's other 394 00:19:51,080 --> 00:19:51,840 Speaker 1: shit going on. 395 00:19:52,000 --> 00:19:54,680 Speaker 2: If that's the approach you're taking, you're missing the mark 396 00:19:54,760 --> 00:19:57,320 Speaker 2: quite a bit. Yeah, I think mammoth... mammoths can't be 397 00:19:57,359 --> 00:19:59,800 Speaker 2: our first start at fixing the problem, for sure. 398 00:20:00,000 --> 00:20:02,679 Speaker 1: I think a lack of mammoths is the primary reason 399 00:20:02,840 --> 00:20:08,040 Speaker 1: this is a problem. Yeah. Further shady factoids about the 400 00:20:08,040 --> 00:20:12,120 Speaker 1: business include the fact that it is a for profit enterprise. Now, 401 00:20:12,160 --> 00:20:14,399 Speaker 1: Ben Lamm, who's his co founder and the CEO, was 402 00:20:14,480 --> 00:20:17,840 Speaker 1: quick to tell CNBC none of our investors are focused 403 00:20:17,880 --> 00:20:21,320 Speaker 1: on monetizing right now, which is great. But then you 404 00:20:21,359 --> 00:20:23,800 Speaker 1: read about who those investors are, and you wonder... I 405 00:20:24,400 --> 00:20:27,320 Speaker 1: don't know if I believe that, because investors in Colossal 406 00:20:27,359 --> 00:20:30,240 Speaker 1: outside of George R. R. Martin include self help grifter Tony 407 00:20:30,320 --> 00:20:33,800 Speaker 1: Robbins and Winklevoss Capital. Oh, we got Winklevi. 408 00:20:34,080 --> 00:20:36,080 Speaker 1: The Winklevoss twins.
409 00:20:36,080 --> 00:20:39,520 Speaker 2: Wow, Winklevi, that's crazy. 410 00:20:39,840 --> 00:20:42,040 Speaker 1: They'd only be involved in a real project. 411 00:20:42,480 --> 00:20:44,600 Speaker 2: This is like when a bunch of celebrities open up 412 00:20:44,640 --> 00:20:46,919 Speaker 2: an ice cream store, right? Like, why do y'all know 413 00:20:46,960 --> 00:20:47,400 Speaker 2: each other? 414 00:20:47,520 --> 00:20:49,720 Speaker 1: Oh, one of you's moving coke and you guys need 415 00:20:49,760 --> 00:20:51,440 Speaker 1: a way to launder that shit. What. 416 00:20:51,520 --> 00:20:55,760 Speaker 2: is this relationship that somehow fostered naturally 417 00:20:55,280 --> 00:20:59,080 Speaker 1: between... Uh huh. This is... something's wrong here. I'm missing something. 418 00:21:00,080 --> 00:21:03,000 Speaker 1: And yes, there are some famous TikTokers involved as well, 419 00:21:03,040 --> 00:21:05,679 Speaker 1: and some other celebrities who should not be involved in 420 00:21:05,680 --> 00:21:09,840 Speaker 1: a biosciences company we'll talk about later. So by the 421 00:21:09,840 --> 00:21:12,520 Speaker 1: time I read about the Winklevoss twins being involved, I 422 00:21:12,560 --> 00:21:16,280 Speaker 1: was fully on team fuck these people. Winklevi. The Winklevi. 423 00:21:17,520 --> 00:21:20,600 Speaker 1: But that's not enough to actually, like, justify accusing a 424 00:21:20,640 --> 00:21:23,719 Speaker 1: person and their company of being bastards, right? Like, 425 00:21:23,960 --> 00:21:27,200 Speaker 1: even I wouldn't do that. So I looked deeper, and boy howdy, 426 00:21:27,280 --> 00:21:30,560 Speaker 1: did I find some shit.
Before we get into everything 427 00:21:30,600 --> 00:21:33,160 Speaker 1: that's fucked up about this company, a lot of what's 428 00:21:33,160 --> 00:21:35,919 Speaker 1: fucked up here is actually about doctor George Church, and 429 00:21:35,960 --> 00:21:38,959 Speaker 1: talking about what this guy's done and where he's come from. 430 00:21:39,000 --> 00:21:42,000 Speaker 1: Because this is a story of, like, a great scientist 431 00:21:42,400 --> 00:21:45,440 Speaker 1: who makes some choices that I would argue put him 432 00:21:45,440 --> 00:21:50,440 Speaker 1: into a series of very unethical situations because there's money 433 00:21:50,440 --> 00:21:52,920 Speaker 1: in it, right. That's what I think is going on here. 434 00:21:52,960 --> 00:21:56,040 Speaker 1: But I'm just gonna read you his bio. George McDonald 435 00:21:56,160 --> 00:21:59,040 Speaker 1: Church was born on August twenty eighth, nineteen fifty four, 436 00:21:59,119 --> 00:22:02,480 Speaker 1: on MacDill Air Force Base in Tampa, Florida. He grew 437 00:22:02,560 --> 00:22:05,720 Speaker 1: up in Clearwater, which is the capital of Scientology. But I can't 438 00:22:05,760 --> 00:22:07,439 Speaker 1: hold that against him. He's got no ties that I 439 00:22:07,440 --> 00:22:09,640 Speaker 1: found to the organization. He just grew up there. 440 00:22:10,520 --> 00:22:11,479 Speaker 2: He just got lucky. 441 00:22:11,560 --> 00:22:15,320 Speaker 1: He just got lucky, thank God. Yeah, he's near the Flag Base. 442 00:22:15,440 --> 00:22:18,600 Speaker 1: I think that's the big base in Clearwater. His family 443 00:22:18,640 --> 00:22:21,760 Speaker 1: life was somewhat chaotic by most people's standards, as he 444 00:22:21,840 --> 00:22:24,159 Speaker 1: laid out in an interview with the Harvard Gazette: I 445 00:22:24,240 --> 00:22:26,760 Speaker 1: had three fathers as my mother remarried.
The first one 446 00:22:26,800 --> 00:22:29,000 Speaker 1: lasted about eight months post birth, and he was an 447 00:22:29,040 --> 00:22:32,080 Speaker 1: Air Force pilot, a pretty colorful character. I knew him 448 00:22:32,119 --> 00:22:33,880 Speaker 1: off and on through the years up until his death. 449 00:22:34,119 --> 00:22:35,640 Speaker 1: He was the sort of father that a young boy 450 00:22:35,640 --> 00:22:38,879 Speaker 1: would admire because he wasn't tied down by actual responsibilities. 451 00:22:39,119 --> 00:22:42,119 Speaker 1: That was Stu MacDonald. He was called Barefoot Stu. He 452 00:22:42,200 --> 00:22:44,439 Speaker 1: was inducted into the Water Ski Hall of Fame. He 453 00:22:44,480 --> 00:22:46,760 Speaker 1: wasn't a terrific athlete. I mean, obviously he was a 454 00:22:46,800 --> 00:22:49,080 Speaker 1: pretty good one, but his real contribution to the sport, 455 00:22:49,119 --> 00:22:51,080 Speaker 1: which was relevant to me, was that he was a 456 00:22:51,080 --> 00:22:54,199 Speaker 1: good communicator. He was the first ABC Wide World of 457 00:22:54,240 --> 00:22:57,520 Speaker 1: Sports color commentator. He was also just generally charismatic. He 458 00:22:57,600 --> 00:23:00,640 Speaker 1: was a male model. He worked in film and television as well. 459 00:23:01,640 --> 00:23:05,119 Speaker 1: Right, right. So, and this guy, he's also primarily a 460 00:23:05,160 --> 00:23:08,520 Speaker 1: communicator now, right. And he's like, he's very old now, 461 00:23:08,520 --> 00:23:10,800 Speaker 1: but he's like a handsome kind of old. Like, he's 462 00:23:10,840 --> 00:23:13,320 Speaker 1: the... you would cast him to be like the old 463 00:23:13,440 --> 00:23:15,960 Speaker 1: king in, like, a fucking new Robin Hood movie, like, 464 00:23:16,040 --> 00:23:18,399 Speaker 1: comes back at the end, right. Like, he is that 465 00:23:18,520 --> 00:23:21,879 Speaker 1: kind of old guy.
He was my birth dad, but 466 00:23:21,920 --> 00:23:24,440 Speaker 1: I don't think he really influenced me that much intellectually. 467 00:23:24,600 --> 00:23:27,200 Speaker 1: My second father was a lawyer and had the least influence. 468 00:23:27,359 --> 00:23:30,040 Speaker 1: Third dad was a physician who had two pretty important roles. 469 00:23:30,240 --> 00:23:32,399 Speaker 1: He sent me away to school, to an awesome high school. 470 00:23:32,440 --> 00:23:34,560 Speaker 1: Both my stepbrother and I went away at roughly the 471 00:23:34,600 --> 00:23:36,440 Speaker 1: same time. It might have just been to get the 472 00:23:36,440 --> 00:23:38,080 Speaker 1: young teenage boys out of the house, but in my 473 00:23:38,200 --> 00:23:40,840 Speaker 1: case it was very good. It was a liberal East 474 00:23:40,880 --> 00:23:43,400 Speaker 1: Coast school, Andover, which is where the Bushes went. 475 00:23:44,000 --> 00:23:46,680 Speaker 1: I don't know if I'd call that liberal. But... and 476 00:23:47,119 --> 00:23:49,800 Speaker 1: Harvard chemistry professor George Whitesides and a bunch of other 477 00:23:49,840 --> 00:23:52,119 Speaker 1: interesting people. And the other thing he did was just 478 00:23:52,200 --> 00:23:54,959 Speaker 1: being a physician. I could look at his medical technology 479 00:23:54,960 --> 00:23:58,720 Speaker 1: and somehow be enthralled by it. And doctor dad is 480 00:23:58,760 --> 00:24:00,840 Speaker 1: where he gets the last name Church, so that definitely 481 00:24:00,840 --> 00:24:03,119 Speaker 1: seems to be the guy he primarily considers to have 482 00:24:03,160 --> 00:24:05,520 Speaker 1: been his father. And that summary of events does kind 483 00:24:05,520 --> 00:24:07,800 Speaker 1: of smooth over a couple of things, including what seems 484 00:24:07,800 --> 00:24:10,320 Speaker 1: to have been a difficult start for George at school.
485 00:24:10,560 --> 00:24:13,159 Speaker 1: He's always very bright, but he has learning disabilities. He 486 00:24:13,200 --> 00:24:16,159 Speaker 1: had to repeat the ninth grade as a result. George 487 00:24:16,160 --> 00:24:19,240 Speaker 1: has claimed in recent years to have dyslexia and narcolepsy, 488 00:24:19,359 --> 00:24:22,800 Speaker 1: OCD, and ADD, all of which, he says... it's a 489 00:24:22,840 --> 00:24:25,840 Speaker 1: lot of stuff. He says they were all mild, but 490 00:24:26,040 --> 00:24:28,520 Speaker 1: it made me feel different, right. And so he became 491 00:24:28,600 --> 00:24:31,160 Speaker 1: kind of desperate in grade school not to stand out 492 00:24:31,280 --> 00:24:33,760 Speaker 1: or get attention, right. Like, he doesn't want to seem weird, 493 00:24:33,800 --> 00:24:36,520 Speaker 1: which is a pretty normal way for kids to feel 494 00:24:36,560 --> 00:24:37,080 Speaker 1: in school. 495 00:24:37,320 --> 00:24:39,680 Speaker 2: So far, it's the thing I've related to him most 496 00:24:39,720 --> 00:24:42,120 Speaker 2: on. Right, I get that. Yeah, I get it. That 497 00:24:42,440 --> 00:24:48,320 Speaker 2: I connect to. Saying you reinvented wolves is a different conversation. 498 00:24:47,840 --> 00:24:53,119 Speaker 1: But the wolf thing, though, you've very rarely claimed 499 00:24:53,119 --> 00:24:53,720 Speaker 1: that, Langston. 500 00:24:54,840 --> 00:24:58,080 Speaker 2: It's almost never come up in our conversations, at least. 501 00:24:57,920 --> 00:25:02,480 Speaker 1: Yeah, seldom. Prior to going to Andover, Church attended both 502 00:25:02,520 --> 00:25:05,840 Speaker 1: public and Catholic schools, but had bad experiences in both systems. 503 00:25:05,880 --> 00:25:08,359 Speaker 1: He just says the schools in Florida weren't very good. Again, 504 00:25:08,520 --> 00:25:14,520 Speaker 1: I don't have trouble seeing that.
Despite his difficulty with academics, 505 00:25:14,600 --> 00:25:16,880 Speaker 1: he was a voracious reader and good at self directed 506 00:25:16,960 --> 00:25:19,719 Speaker 1: learning when he was interested in something. He built an 507 00:25:19,760 --> 00:25:22,440 Speaker 1: analog computer when he was ten, and when he started 508 00:25:22,440 --> 00:25:25,080 Speaker 1: at Andover they had a time share computing program with 509 00:25:25,200 --> 00:25:28,000 Speaker 1: nearby Dartmouth College, so he was able to spend time 510 00:25:28,040 --> 00:25:30,720 Speaker 1: on a computer before most people his age did, which 511 00:25:30,760 --> 00:25:33,560 Speaker 1: is, like... you see similar stories with, like, a lot 512 00:25:33,600 --> 00:25:35,520 Speaker 1: of the Bill Gates and Steve Jobs types, a lot of 513 00:25:35,560 --> 00:25:38,440 Speaker 1: the guys who were, like, around this age and also 514 00:25:38,560 --> 00:25:40,360 Speaker 1: wound up being major tech players. 515 00:25:40,520 --> 00:25:44,760 Speaker 2: Right, yeah, it was such rare technology, yeah, to access 516 00:25:44,840 --> 00:25:45,440 Speaker 2: at that age. 517 00:25:45,480 --> 00:25:45,520 Speaker 1: That. 518 00:25:45,640 --> 00:25:48,320 Speaker 2: I bet if you really were able to invest 519 00:25:48,359 --> 00:25:52,920 Speaker 2: the time and energy, you advanced a chess piece so far, yes, 520 00:25:53,200 --> 00:25:54,320 Speaker 2: for yourself.
521 00:25:54,040 --> 00:25:56,200 Speaker 1: And he was generally kind of a rich kid, right? 522 00:25:56,280 --> 00:25:58,120 Speaker 1: Which is, you know, the case... even though, you 523 00:25:58,119 --> 00:26:00,160 Speaker 1: know, he goes through a couple of dads, 524 00:26:00,240 --> 00:26:03,439 Speaker 1: this last one is very comfortable financially, and as 525 00:26:03,440 --> 00:26:05,879 Speaker 1: a result he gets this opportunity, and as a result, 526 00:26:05,960 --> 00:26:08,760 Speaker 1: then, his story sounds less like a lot of big 527 00:26:08,840 --> 00:26:12,120 Speaker 1: science guys and more like a lot of tech startup dudes, right? 528 00:26:12,320 --> 00:26:14,640 Speaker 1: Like, that's the kind of background this dude has. 529 00:26:14,680 --> 00:26:18,160 Speaker 2: They always talk about how, like, Bill Gates started Microsoft 530 00:26:18,200 --> 00:26:20,560 Speaker 2: in his garage, but it's like, oh, you had an 531 00:26:20,680 --> 00:26:27,120 Speaker 2: entire garage. Most people have to store things and park 532 00:26:27,160 --> 00:26:29,240 Speaker 2: cars in there. You were just like, no, the garage 533 00:26:29,320 --> 00:26:30,040 Speaker 2: is a workspace. 534 00:26:30,280 --> 00:26:35,119 Speaker 1: Yeah, that's right. Speaking of garages, if you want to 535 00:26:35,160 --> 00:26:38,240 Speaker 1: afford a garage, I don't know, I can't help you, 536 00:26:38,320 --> 00:26:39,920 Speaker 1: but you can buy these products. 537 00:26:40,160 --> 00:26:42,359 Speaker 2: Oh nice, this is a beautiful segue. 538 00:26:42,960 --> 00:26:51,840 Speaker 1: Yeah, we're back. Okay. So, so far it's been a 539 00:26:51,880 --> 00:26:54,200 Speaker 1: pretty similar story to a lot of tech guys.
And 540 00:26:54,440 --> 00:26:56,399 Speaker 1: George has a story about how, when he was 541 00:26:56,440 --> 00:26:58,040 Speaker 1: like ten or so, he goes to the New York 542 00:26:58,080 --> 00:27:00,439 Speaker 1: World's Fair, and that has a huge impact on him. 543 00:27:00,440 --> 00:27:02,959 Speaker 1: He gets to see very early touch screens, which are 544 00:27:03,000 --> 00:27:06,199 Speaker 1: obviously a precursor to a real technology, and also a 545 00:27:06,240 --> 00:27:09,440 Speaker 1: lot of fake future technology, like personal jet packs and 546 00:27:09,480 --> 00:27:12,080 Speaker 1: stuff that... like, I mean, there's technically some jet packs, 547 00:27:12,080 --> 00:27:13,680 Speaker 1: but it's not what we thought it would be, right? 548 00:27:13,800 --> 00:27:15,359 Speaker 1: We're not flying around in those things. 549 00:27:15,400 --> 00:27:17,440 Speaker 2: You end up like, I thought it was gonna be 550 00:27:17,560 --> 00:27:19,600 Speaker 2: rocket man shit, and it's not that at all. 551 00:27:20,040 --> 00:27:22,280 Speaker 1: No, no. Right up to the fact that 552 00:27:22,320 --> 00:27:24,960 Speaker 1: I thought we'd all be shooting Nazis. And it turns 553 00:27:25,000 --> 00:27:28,400 Speaker 1: out we chose other things to do with Nazis, which, 554 00:27:29,720 --> 00:27:30,400 Speaker 1: some of us think... 555 00:27:30,440 --> 00:27:33,119 Speaker 2: They're a better hang than we anticipated. 556 00:27:33,560 --> 00:27:35,960 Speaker 1: Certainly a better hang than the rocket man, the Rocketeer, 557 00:27:36,000 --> 00:27:42,119 Speaker 1: thought. So... but yeah, a lot of the 558 00:27:42,160 --> 00:27:44,880 Speaker 1: tech he sees is also just stuff that, like, never happened, right? 559 00:27:45,000 --> 00:27:47,359 Speaker 1: And George would later say, quote, it didn't take too 560 00:27:47,440 --> 00:27:50,199 Speaker 1: long for me to become disillusioned.
Not only was it 561 00:27:50,280 --> 00:27:52,399 Speaker 1: not like that in Florida, it probably wasn't even like 562 00:27:52,440 --> 00:27:54,480 Speaker 1: that in New York once they shut down the World's Fair. 563 00:27:54,720 --> 00:27:56,399 Speaker 1: And it might not ever be like that if I 564 00:27:56,440 --> 00:27:58,480 Speaker 1: didn't do something about it. So I sort of felt 565 00:27:58,520 --> 00:28:01,040 Speaker 1: like if I want that, I have to do it. And 566 00:28:01,080 --> 00:28:03,200 Speaker 1: you can take two things out of this. Either he 567 00:28:03,280 --> 00:28:06,280 Speaker 1: realizes the World's Fair is largely, like, a PR thing 568 00:28:06,520 --> 00:28:08,640 Speaker 1: and most of this stuff isn't coming, or at least 569 00:28:08,640 --> 00:28:11,080 Speaker 1: not coming anytime soon, and it's like, well, then I'm going 570 00:28:11,119 --> 00:28:13,120 Speaker 1: to get into... I'm going to become a cutting edge 571 00:28:13,160 --> 00:28:16,560 Speaker 1: scientist to try to make this future real. Or maybe 572 00:28:16,560 --> 00:28:19,040 Speaker 1: what he learns is, like, wow, it's really easy to 573 00:28:19,080 --> 00:28:20,840 Speaker 1: lie to a lot of people about what you can 574 00:28:20,880 --> 00:28:23,560 Speaker 1: do and, like, get money, and maybe that's... maybe that's 575 00:28:23,600 --> 00:28:25,680 Speaker 1: a lot easier than inventing the future. 576 00:28:26,640 --> 00:28:30,919 Speaker 2: I think spotting a grift is real profitable real fast. 577 00:28:31,280 --> 00:28:35,560 Speaker 1: It can be, yeah. Now, maybe, to be fair, maybe 578 00:28:35,600 --> 00:28:38,120 Speaker 1: both of those things hit him, because he does get 579 00:28:38,160 --> 00:28:41,800 Speaker 1: into some real making the future shit. At first, Church 580 00:28:41,800 --> 00:28:44,840 Speaker 1: wound up attending summer courses in quantum physics.
At MIT, 581 00:28:45,200 --> 00:28:48,920 Speaker 1: he gets into crystallography, which I don't really understand but 582 00:28:49,040 --> 00:28:52,440 Speaker 1: is important science. And he describes this as showing him, 583 00:28:52,480 --> 00:28:55,360 Speaker 1: quote, the intersection of computers and biology, which is going 584 00:28:55,400 --> 00:28:58,040 Speaker 1: to be like a constant source of fascination for him. 585 00:28:58,400 --> 00:29:00,479 Speaker 1: Now, he does still have issues in school. He has 586 00:29:00,520 --> 00:29:03,640 Speaker 1: to repeat his first year of graduate school, and depending 587 00:29:03,680 --> 00:29:06,160 Speaker 1: on where you find him interviewed, this is explained differently. 588 00:29:06,200 --> 00:29:08,400 Speaker 1: I found one article that just said, well, he was 589 00:29:08,520 --> 00:29:12,200 Speaker 1: just taking so many classes, too many classes, so that 590 00:29:12,440 --> 00:29:14,560 Speaker 1: he couldn't graduate, and he was just, like, too interested 591 00:29:14,600 --> 00:29:16,960 Speaker 1: in doing too many different things, and it just, like... 592 00:29:17,520 --> 00:29:20,840 Speaker 1: graduating kind of slipped by him. And that's not really accurate. 593 00:29:21,080 --> 00:29:23,680 Speaker 1: The way he explained it in this Harvard student paper 594 00:29:23,720 --> 00:29:25,960 Speaker 1: is different from that, but it's also kind of weird. 595 00:29:26,560 --> 00:29:29,040 Speaker 1: Sometimes I could get away with barely going to classes. 596 00:29:29,120 --> 00:29:31,600 Speaker 1: Other times, like in organic chemistry, I loved it so much 597 00:29:31,680 --> 00:29:33,480 Speaker 1: I did every single problem set in the back of 598 00:29:33,520 --> 00:29:35,920 Speaker 1: each chapter. They didn't even assign any of them. I 599 00:29:35,960 --> 00:29:37,680 Speaker 1: did them all.
It was a full year course, and 600 00:29:37,720 --> 00:29:39,880 Speaker 1: I think I finished the book, including all the problems 601 00:29:39,880 --> 00:29:42,400 Speaker 1: in it, by halfway through the fall semester. That was 602 00:29:42,440 --> 00:29:44,640 Speaker 1: pretty typical. But I guess the reason I did it 603 00:29:44,640 --> 00:29:46,440 Speaker 1: in two years was that I was cheap, 604 00:29:46,480 --> 00:29:48,640 Speaker 1: money wise. Like a lot of teenagers, I didn't want 605 00:29:48,640 --> 00:29:51,160 Speaker 1: to keep being a burden on my parents. Steve Jobs 606 00:29:51,200 --> 00:29:53,880 Speaker 1: dropped out of college because he was worried about his parents' finances. 607 00:29:53,920 --> 00:29:55,960 Speaker 1: I did not. I didn't draw it out. I just 608 00:29:56,000 --> 00:29:58,440 Speaker 1: finished early. I also think I had this feeling that 609 00:29:58,480 --> 00:29:59,720 Speaker 1: if I took four years to do it, I would 610 00:29:59,720 --> 00:30:02,120 Speaker 1: probably flunk out, so it would be better to finish fast. 611 00:30:02,320 --> 00:30:04,040 Speaker 1: That turned out to be true. At about the three 612 00:30:04,040 --> 00:30:05,880 Speaker 1: and a half year point, I did flunk out, but 613 00:30:05,880 --> 00:30:07,920 Speaker 1: out of graduate school. And you see, that doesn't make 614 00:30:07,960 --> 00:30:09,840 Speaker 1: sense, how he's like, well, I graduated early so I 615 00:30:09,840 --> 00:30:11,520 Speaker 1: wouldn't be a burden to my parents, but actually I 616 00:30:11,560 --> 00:30:14,200 Speaker 1: flunked out after three and a half years. I was like, well, 617 00:30:14,440 --> 00:30:15,960 Speaker 1: I don't understand what you're saying.
618 00:30:16,280 --> 00:30:19,280 Speaker 2: And I think, if we're going back, that really speaks 619 00:30:19,320 --> 00:30:24,080 Speaker 2: to both the passionate learner and the grifter working 620 00:30:24,360 --> 00:30:26,560 Speaker 2: in sort of synchronicity. 621 00:30:26,800 --> 00:30:29,000 Speaker 1: Right, if you'll forgive me, Langston, he has two 622 00:30:29,080 --> 00:30:30,240 Speaker 1: wolves inside him. 623 00:30:32,280 --> 00:30:35,880 Speaker 2: The man contains two wolves. Oh yeah, one DIYer, one 624 00:30:35,920 --> 00:30:37,240 Speaker 2: pretty much a regular. 625 00:30:36,880 --> 00:30:39,200 Speaker 1: One trying to sell you a way. 626 00:30:40,040 --> 00:30:41,160 Speaker 4: I don't. I don't. 627 00:30:40,960 --> 00:30:44,360 Speaker 1: Forgive it. Now, the way he describes this other times 628 00:30:44,400 --> 00:30:47,520 Speaker 1: is that he didn't even realize he'd flunked out of 629 00:30:47,600 --> 00:30:50,520 Speaker 1: graduate school because he was so excited about the crystallography work 630 00:30:50,520 --> 00:30:53,840 Speaker 1: his advisor had him doing. And his advisor was like, hey man, 631 00:30:54,000 --> 00:30:56,520 Speaker 1: you're... you're actually flunking, you know, you're going to... you've 632 00:30:56,560 --> 00:30:59,640 Speaker 1: got to, like... you're getting kicked out. And he hired 633 00:30:59,680 --> 00:31:02,640 Speaker 1: him as a technician, but was like, you can't just 634 00:31:02,760 --> 00:31:05,280 Speaker 1: keep doing this, you have to reapply to graduate school 635 00:31:05,320 --> 00:31:09,400 Speaker 1: somewhere else. And Church eventually reapplies to Harvard and describes 636 00:31:09,440 --> 00:31:11,920 Speaker 1: himself as being shocked at getting in because he'd flunked 637 00:31:11,920 --> 00:31:14,680 Speaker 1: out of Duke, but he had also gotten accepted to 638 00:31:14,720 --> 00:31:17,120 Speaker 1: Harvard before he went to Duke.
And anyway, whatever, he 639 00:31:17,120 --> 00:31:20,440 Speaker 1: gets accepted. Did some stuff happen behind the scenes with 640 00:31:20,480 --> 00:31:22,120 Speaker 1: his dad and Harvard? I don't know. It may just 641 00:31:22,240 --> 00:31:23,800 Speaker 1: have been that he'd been accepted before. 642 00:31:23,960 --> 00:31:26,840 Speaker 2: I bet having rich parents and a 643 00:31:26,960 --> 00:31:30,520 Speaker 2: nice little parachute probably helped him figure that out. 644 00:31:30,720 --> 00:31:33,720 Speaker 1: And he's got this professor who's probably going to bat 645 00:31:33,760 --> 00:31:35,520 Speaker 1: for him too, because he is good at some things. 646 00:31:35,520 --> 00:31:38,160 Speaker 1: But anyway, how he dropped out, and exactly why, 647 00:31:38,280 --> 00:31:41,080 Speaker 1: is, like, a little bit different every time I read it, 648 00:31:41,880 --> 00:31:44,520 Speaker 1: which always kind of, like, raises my grifter hackles just 649 00:31:44,520 --> 00:31:48,120 Speaker 1: a skosh. But maybe I'm missing some things. At any rate, 650 00:31:48,160 --> 00:31:50,000 Speaker 1: he gets into Harvard, and he does better here. He 651 00:31:50,000 --> 00:31:53,440 Speaker 1: gets into chemistry and genomics sequencing, which is what he does 652 00:31:53,480 --> 00:31:57,840 Speaker 1: his thesis on. His nineteen eighty five PhD from Harvard, 653 00:31:57,880 --> 00:32:00,720 Speaker 1: per a write up on edge dot org, quote, included the 654 00:32:00,760 --> 00:32:05,800 Speaker 1: first methods for direct genome sequencing, molecular multiplexing, and bar coding. 655 00:32:06,040 --> 00:32:10,320 Speaker 1: These led to the first genome sequence, the pathogen Helicobacter pylori, 656 00:32:10,360 --> 00:32:13,480 Speaker 1: in nineteen ninety four.
His innovations have contributed to nearly 657 00:32:13,520 --> 00:32:16,920 Speaker 1: all next generation DNA sequencing methods and companies. And as 658 00:32:17,000 --> 00:32:18,800 Speaker 1: far as I can tell, and I even, like, reached 659 00:32:18,840 --> 00:32:21,040 Speaker 1: out to a friend of a friend who's in this field, 660 00:32:21,360 --> 00:32:24,640 Speaker 1: that is accurate. He is a legitimately, like, foundational mind 661 00:32:24,720 --> 00:32:28,960 Speaker 1: in modern genome sequencing. His work has been massively influential 662 00:32:29,040 --> 00:32:32,640 Speaker 1: in, like, specifically personal genome sequencing. He didn't invent 663 00:32:32,680 --> 00:32:35,960 Speaker 1: genome sequencing, right, but when we, like, first started sequencing, 664 00:32:35,960 --> 00:32:38,040 Speaker 1: it cost billions of dollars to do that the first time. 665 00:32:38,040 --> 00:32:41,000 Speaker 1: And he's a major reason why individuals can do it, 666 00:32:41,240 --> 00:32:42,600 Speaker 1: and why you can do it for, I think it's 667 00:32:42,640 --> 00:32:44,360 Speaker 1: like seven hundred and fifty bucks, to get your genome 668 00:32:44,400 --> 00:32:47,719 Speaker 1: sequenced now, right. Like, he is a big part of 669 00:32:47,720 --> 00:32:49,760 Speaker 1: that process, right, and not to put it all 670 00:32:49,800 --> 00:32:53,080 Speaker 1: down to just him either. But his role is substantial 671 00:32:53,200 --> 00:32:56,040 Speaker 1: and this is meaningful, important science, right, and I'm 672 00:32:56,200 --> 00:32:57,640 Speaker 1: not going to try to take that away from 673 00:32:57,680 --> 00:33:00,880 Speaker 1: him or pretend otherwise. This does not seem to be exaggerated. 674 00:33:01,120 --> 00:33:04,720 Speaker 1: Other aspects of his achievements will be, 675 00:33:04,840 --> 00:33:07,920 Speaker 1: right.
A write up on him in 676 00:33:08,000 --> 00:33:12,600 Speaker 1: Popular Science by Jeneen Interlandi summarizes: scientists are now using 677 00:33:12,680 --> 00:33:16,360 Speaker 1: personal genome sequencing to identify intractable diseases such as 678 00:33:16,360 --> 00:33:18,760 Speaker 1: cancer and schizophrenia, and doctors are beginning to use it 679 00:33:18,760 --> 00:33:21,160 Speaker 1: to identify genetic mutations that cause rare and, 680 00:33:21,240 --> 00:33:27,040 Speaker 1: until now, undiagnosable illnesses. So Church becomes a PhD. Seems 681 00:33:27,080 --> 00:33:30,320 Speaker 1: like he's in there doing some good work, doing some 682 00:33:30,360 --> 00:33:33,440 Speaker 1: good work. He initiates the Personal Genome Project at Harvard 683 00:33:33,440 --> 00:33:35,680 Speaker 1: in two thousand and five, with the goal of sequencing 684 00:33:35,720 --> 00:33:38,840 Speaker 1: and publicizing the complete genomes and medical records of one 685 00:33:38,880 --> 00:33:43,280 Speaker 1: hundred thousand volunteers to further research into personalized medicine. And 686 00:33:43,320 --> 00:33:46,920 Speaker 1: all this is great, but there's also, even in just 687 00:33:46,960 --> 00:33:49,719 Speaker 1: this... you could be like, well, these people are volunteering, 688 00:33:49,760 --> 00:33:51,760 Speaker 1: so maybe it's cool, but, like, there is some potentially 689 00:33:51,920 --> 00:33:57,360 Speaker 1: troubling privacy stuff about publicizing everybody's genomes, you know. I 690 00:33:57,400 --> 00:34:00,000 Speaker 1: think we've all thought about that more in the last 691 00:34:00,040 --> 00:34:01,400 Speaker 1: couple of years than before. 692 00:34:01,480 --> 00:34:03,960 Speaker 2: Maybe... my genome's pretty private to me.
693 00:34:05,520 --> 00:34:07,480 Speaker 1: I know a lot of people who use those 694 00:34:07,520 --> 00:34:10,080 Speaker 1: 23andMe companies that are like, actually, I kind 695 00:34:10,080 --> 00:34:12,000 Speaker 1: of wish I hadn't done that now, knowing what they 696 00:34:12,040 --> 00:34:12,760 Speaker 1: do with the data. 697 00:34:12,960 --> 00:34:16,600 Speaker 2: Right, yeah, they can, like, they can refuse a mortgage 698 00:34:16,680 --> 00:34:20,880 Speaker 2: because you've published this, and now they're like, oh, we 699 00:34:21,239 --> 00:34:23,600 Speaker 2: think you're going to have diabetes, and diabetes means you 700 00:34:23,600 --> 00:34:27,160 Speaker 2: won't be able to pay thirty years' worth of, right, mortgage, 701 00:34:27,239 --> 00:34:28,560 Speaker 2: so naw again. 702 00:34:29,080 --> 00:34:30,640 Speaker 1: And it's one of those things you're not necessarily a 703 00:34:30,680 --> 00:34:33,320 Speaker 1: bastard for, like, being in a science that is used 704 00:34:33,360 --> 00:34:36,560 Speaker 1: by corporations, and that isn't fundamentally evil but gets used 705 00:34:36,560 --> 00:34:38,759 Speaker 1: in some shady ways. But kind of what this does 706 00:34:38,760 --> 00:34:41,480 Speaker 1: show is, I don't really think he often thinks about 707 00:34:41,719 --> 00:34:44,759 Speaker 1: the negative applications of what he's involved in. That is 708 00:34:44,760 --> 00:34:47,160 Speaker 1: going to be kind of a through line with George Church, 709 00:34:47,600 --> 00:34:50,320 Speaker 1: as we'll talk about later. But that article in Popular 710 00:34:50,360 --> 00:34:53,040 Speaker 1: Science continues: more so than any other scientist in his field, 711 00:34:53,080 --> 00:34:55,359 Speaker 1: he is helping to forge a new kind of biology, 712 00:34:55,640 --> 00:34:58,759 Speaker 1: one less geared towards studying DNA than harnessing it for 713 00:34:58,840 --> 00:35:01,239 Speaker 1: our own aims.
This is where the fucked up shit 714 00:35:01,400 --> 00:35:03,560 Speaker 1: starts to kind of come in, which is, his attitude is, 715 00:35:03,640 --> 00:35:08,200 Speaker 1: DNA is no different from, you know, a computer chip, right, 716 00:35:08,280 --> 00:35:10,480 Speaker 1: and we shouldn't think of it as different than that 717 00:35:10,600 --> 00:35:14,319 Speaker 1: in terms of allowing us to build new technologies. And 718 00:35:15,200 --> 00:35:18,239 Speaker 1: I can understand to some extent that attitude, but it 719 00:35:18,320 --> 00:35:22,480 Speaker 1: is also different. It's not, it's not a computer chip. 720 00:35:22,800 --> 00:35:27,880 Speaker 2: Like, I really get nervous whenever the language starts dehumanizing 721 00:35:28,000 --> 00:35:31,839 Speaker 2: human experiences. Like, right, there has to be some 722 00:35:32,280 --> 00:35:35,279 Speaker 2: attachment to what it means to be a person for 723 00:35:35,360 --> 00:35:40,279 Speaker 2: this to remain healthy, normal, applicable in a way that 724 00:35:40,360 --> 00:35:44,960 Speaker 2: isn't just you scamming us into something much more scary and evil. 725 00:35:45,440 --> 00:35:48,319 Speaker 1: Right, yeah. And that's the thing is, like, there's a 726 00:35:48,360 --> 00:35:51,200 Speaker 1: degree to which, if you're just talking purely logically, right, 727 00:35:52,000 --> 00:35:53,600 Speaker 1: there's a degree to which you can be like, well, 728 00:35:53,640 --> 00:35:55,560 Speaker 1: I guess it makes sense to say, like, you know, 729 00:35:55,960 --> 00:35:59,360 Speaker 1: if I'm open to the idea of, like, genetically editing 730 00:35:59,400 --> 00:36:03,280 Speaker 1: people to make them, you know, more resilient to diseases 731 00:36:03,360 --> 00:36:05,520 Speaker 1: or something.
Maybe it makes a little sense to think 732 00:36:05,520 --> 00:36:08,480 Speaker 1: of it as a technology in that way, but the 733 00:36:08,640 --> 00:36:12,560 Speaker 1: line from that to thinking about the people and other 734 00:36:12,719 --> 00:36:18,640 Speaker 1: living beings you create as just smartphones... like, how do 735 00:36:18,680 --> 00:36:22,120 Speaker 1: you, what stops you? What guard rails are you building 736 00:36:22,120 --> 00:36:23,959 Speaker 1: in to stop that? If this is how you're looking 737 00:36:24,000 --> 00:36:25,360 Speaker 1: at it, where are the guard rails? 738 00:36:25,480 --> 00:36:29,839 Speaker 2: Yeah, people turn into Minesweeper real fast. Exactly, 739 00:36:29,920 --> 00:36:33,480 Speaker 2: you're not dealing with bodies anymore in that sense. 740 00:36:33,600 --> 00:36:35,800 Speaker 1: Yeah, and that's not great. And that is kind of 741 00:36:35,840 --> 00:36:39,280 Speaker 1: where we're headed here. So Church's success led to Harvard 742 00:36:39,320 --> 00:36:41,920 Speaker 1: funding the establishment of his lab. He has like a 743 00:36:42,040 --> 00:36:44,320 Speaker 1: lab that is funded by Harvard and has been for 744 00:36:44,400 --> 00:36:47,600 Speaker 1: quite some time, and he brings in, you know, minds 745 00:36:47,640 --> 00:36:49,960 Speaker 1: that excite him and hires them and basically pays them 746 00:36:49,960 --> 00:36:52,280 Speaker 1: to, like, fuck around and try to figure shit out. 747 00:36:52,800 --> 00:36:55,760 Speaker 1: And he uses this... like, this is both a valid 748 00:36:56,360 --> 00:36:59,200 Speaker 1: thing to do in terms of science, but it's also... 749 00:36:59,480 --> 00:37:02,600 Speaker 1: he uses it as, like, an incubator for startup ideas. 750 00:37:02,640 --> 00:37:06,480 Speaker 1: Like, once people do stuff that shows promise, he'll often 751 00:37:06,600 --> 00:37:09,520 Speaker 1: spin what they're doing off into a company.
Per an 752 00:37:09,560 --> 00:37:12,080 Speaker 1: article in Popular Science: The result is that his lab 753 00:37:12,120 --> 00:37:14,440 Speaker 1: manages to be both one of Harvard's top producers and 754 00:37:14,480 --> 00:37:17,879 Speaker 1: a well known receiving center for science's misfit toys. There's 755 00:37:17,880 --> 00:37:21,279 Speaker 1: an artist encoding Wikipedia entries into apple genomes to create 756 00:37:21,320 --> 00:37:24,280 Speaker 1: a literal tree of knowledge, and an insurance industry refugee 757 00:37:24,280 --> 00:37:26,800 Speaker 1: who fled his office job over a decade ago, worked 758 00:37:26,840 --> 00:37:29,919 Speaker 1: several months for free while teaching himself biochemistry, and now 759 00:37:29,960 --> 00:37:33,320 Speaker 1: serves as co head of the lab. So again, that 760 00:37:33,840 --> 00:37:35,919 Speaker 1: sounds kind of cool, potentially. 761 00:37:35,560 --> 00:37:38,280 Speaker 2: You know, ah, yes, what's the tree of knowledge? 762 00:37:38,560 --> 00:37:39,920 Speaker 1: What is that supposed to do? 763 00:37:41,360 --> 00:37:44,160 Speaker 2: Yeah, that's cool. I get... I don't know, it sounds awesome. 764 00:37:44,200 --> 00:37:45,480 Speaker 1: It sounds like an art project. 765 00:37:45,600 --> 00:37:46,480 Speaker 2: Yeah. 766 00:37:46,520 --> 00:37:49,120 Speaker 1: The article... we'll talk a little bit about, like, DNA coding. 767 00:37:49,120 --> 00:37:52,560 Speaker 1: That's actually... there's some science there. But the article quotes 768 00:37:52,600 --> 00:37:54,720 Speaker 1: a former student of Church's who founded a genetic 769 00:37:54,760 --> 00:37:57,960 Speaker 1: engineering screening company that looks for inherited diseases. And he said, 770 00:37:58,040 --> 00:37:59,640 Speaker 1: we always joke that the only thing you need to 771 00:37:59,640 --> 00:38:02,840 Speaker 1: do to get in George's lab is show up. There is zero organization.
772 00:38:03,320 --> 00:38:06,279 Speaker 1: His style is just to let things happen. Mostly, you 773 00:38:06,360 --> 00:38:08,839 Speaker 1: have the constant sense that exciting things are happening or 774 00:38:08,880 --> 00:38:10,560 Speaker 1: about to happen, and if you miss out on it, 775 00:38:10,600 --> 00:38:11,960 Speaker 1: you have only yourself to blame. 776 00:38:12,600 --> 00:38:13,080 Speaker 2: Wow. 777 00:38:13,520 --> 00:38:16,680 Speaker 1: Yeah, and yeah, that's kind of the tech industry FOMO, 778 00:38:17,120 --> 00:38:18,640 Speaker 1: you know, PR thing in a nutshell. 779 00:38:19,719 --> 00:38:23,200 Speaker 2: That's also just not a good thing to, to sort of, 780 00:38:23,239 --> 00:38:25,640 Speaker 2: I guess, have as, like, what people know about you: 781 00:38:25,680 --> 00:38:27,839 Speaker 2: that, like, yeah, we show up, he'll give you a job. 782 00:38:27,920 --> 00:38:30,640 Speaker 1: He'll give you a job, yeah. Yeah, yeah, 783 00:38:31,480 --> 00:38:35,120 Speaker 1: he's looking for bodies who are fucking around. Now, in 784 00:38:35,160 --> 00:38:38,319 Speaker 1: the mid aughts, Church participated in kind of his first 785 00:38:38,360 --> 00:38:41,600 Speaker 1: noteworthy project after the genome sequencing stuff that we'll talk about, 786 00:38:41,640 --> 00:38:45,640 Speaker 1: which is a project to actually encode digital data made 787 00:38:45,680 --> 00:38:49,000 Speaker 1: of text into binary code and then transfer that into 788 00:38:49,120 --> 00:38:54,960 Speaker 1: genetic code, thus using DNA to store digital data.
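[Editor's note: the text-to-binary-to-bases pipeline described here can be sketched in a few lines. This is a toy illustration, not Church's actual 2012 encoding, which reportedly used one bit per base, with a choice of two bases per bit, to avoid problematic sequences; the fixed two-bits-per-base mapping below is the theoretical maximum density.]

```python
# Toy sketch of the pipeline: text -> bytes -> bit string -> DNA bases.
# The 2-bit-per-base mapping here is illustrative, not Church's real scheme.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(text: str) -> str:
    """Turn text into a DNA base string (4 bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> str:
    """Reverse the mapping: bases -> bits -> bytes -> text."""
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

dna = encode("Regenesis")
print(dna)          # 36 bases: four bases per character
print(decode(dna))  # round-trips back to "Regenesis"
```

The point of the round trip is that the DNA string is just another serialization of the bytes; the hard part in practice is synthesizing and reading the physical molecule, not the mapping.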
In 789 00:38:55,080 --> 00:38:58,600 Speaker 1: his case, this was also a marketing stunt, because the 790 00:38:58,640 --> 00:39:03,080 Speaker 1: thing that he stored in DNA was like seventy billion 791 00:39:03,160 --> 00:39:06,200 Speaker 1: copies of the book that he had written with another 792 00:39:06,239 --> 00:39:07,960 Speaker 1: guy that was just about to come out, right? So 793 00:39:08,000 --> 00:39:11,239 Speaker 1: he does this as a PR move, right, and it's 794 00:39:11,280 --> 00:39:14,400 Speaker 1: a brilliant PR move. The book Regenesis was about to 795 00:39:14,440 --> 00:39:16,640 Speaker 1: hit shelves, and suddenly there's all these articles about how 796 00:39:16,680 --> 00:39:20,319 Speaker 1: he stored seventy billion copies into, like, a dot of 797 00:39:20,440 --> 00:39:23,680 Speaker 1: DNA no larger than, like, a fucking period on a 798 00:39:23,680 --> 00:39:25,520 Speaker 1: piece of paper... that he was able to store so 799 00:39:25,520 --> 00:39:26,960 Speaker 1: many copies. Like, isn't that amazing? 800 00:39:27,080 --> 00:39:27,279 Speaker 2: Right? 801 00:39:28,520 --> 00:39:30,880 Speaker 1: And it is. It is pretty interesting, right, like, that 802 00:39:30,960 --> 00:39:34,000 Speaker 1: he synthesized a strand of DNA, replicated it, and, like, 803 00:39:34,520 --> 00:39:36,680 Speaker 1: put it onto a scrap of paper, and it contained 804 00:39:36,840 --> 00:39:40,520 Speaker 1: real data, right? This, in fact, was so interesting that 805 00:39:40,600 --> 00:39:43,160 Speaker 1: it got him an appearance on the Colbert Report, right, 806 00:39:44,160 --> 00:39:46,239 Speaker 1: where he pulled out the paper scrap, which was like 807 00:39:46,280 --> 00:39:48,640 Speaker 1: the size and shape of a fortune cookie slip, and 808 00:39:48,680 --> 00:39:51,719 Speaker 1: showed it off to everybody.
And this got representatives of 809 00:39:51,719 --> 00:39:55,920 Speaker 1: different companies who, like, archive films and other stuff reaching 810 00:39:55,920 --> 00:39:57,879 Speaker 1: out to him, because they were like, oh shit, you know. 811 00:39:59,000 --> 00:40:00,719 Speaker 2: Maybe this is a paper. 812 00:40:02,160 --> 00:40:06,600 Speaker 1: Yes, yes, yes it is. We should work with him. Yeah, 813 00:40:06,719 --> 00:40:11,160 Speaker 1: he's... it is funny. Like, again, I don't know that 814 00:40:11,200 --> 00:40:13,480 Speaker 1: this is exactly how a scientist should be putting out 815 00:40:13,480 --> 00:40:16,839 Speaker 1: a discovery of this magnitude, apparently, but it's also... it's 816 00:40:16,840 --> 00:40:18,640 Speaker 1: one of those things that's both, like, really cool and 817 00:40:18,719 --> 00:40:21,880 Speaker 1: interesting and somewhat less impressive than it sounds when you 818 00:40:21,960 --> 00:40:25,400 Speaker 1: drill into what's actually happening here, right? Because, like, the... 819 00:40:25,800 --> 00:40:28,440 Speaker 1: like, that article by Interlandi, like, makes it seem like... 820 00:40:28,560 --> 00:40:30,799 Speaker 1: and this is obviously, like, a proof of concept for 821 00:40:30,840 --> 00:40:33,520 Speaker 1: something that could be potentially a huge deal for, like, 822 00:40:33,640 --> 00:40:37,279 Speaker 1: data storage. And that's not entirely untrue, but it's, it's 823 00:40:37,320 --> 00:40:40,920 Speaker 1: not totally accurate either. The article goes on to summarize 824 00:40:40,960 --> 00:40:43,560 Speaker 1: the book that he co wrote with Ed Regis, who... 825 00:40:43,960 --> 00:40:46,360 Speaker 1: it's weird to me that his co author's name is 826 00:40:46,520 --> 00:40:49,000 Speaker 1: Ed Regis, because Ed Regis is also a character in 827 00:40:49,040 --> 00:40:53,240 Speaker 1: the book Jurassic Park. He was the public relations manager 828 00:40:53,280 --> 00:40:56,800 Speaker 1: for InGen in the novel.
He doesn't wind up in 829 00:40:56,880 --> 00:40:58,480 Speaker 1: the film. It's just really weird to me that his 830 00:40:58,560 --> 00:40:59,800 Speaker 1: co author on this book has that name. 831 00:41:00,600 --> 00:41:03,040 Speaker 2: Yeah, but I bet he gets eaten. He sounds like 832 00:41:03,080 --> 00:41:03,640 Speaker 2: he got eaten. 833 00:41:04,160 --> 00:41:06,600 Speaker 1: Oh yeah, man, that motherfucker gets the hell eaten out 834 00:41:06,600 --> 00:41:09,560 Speaker 1: of him. Yes, yeah, if I am remembering right, he's 835 00:41:09,600 --> 00:41:11,520 Speaker 1: one of the ones who gets eaten like a son 836 00:41:11,560 --> 00:41:12,960 Speaker 1: of a bitch. I think he gets replaced with the 837 00:41:13,040 --> 00:41:18,040 Speaker 1: lawyer in the movie. So the book Regenesis that Church 838 00:41:18,080 --> 00:41:21,319 Speaker 1: writes with Ed Regis quote envisioned the future this 839 00:41:21,360 --> 00:41:24,640 Speaker 1: new biology could bring, one in which bacteria fuels cars 840 00:41:24,640 --> 00:41:27,560 Speaker 1: and commercial jets, and humans are immune to cancer. It 841 00:41:27,600 --> 00:41:29,520 Speaker 1: may sound like science fiction, or at least like a 842 00:41:29,520 --> 00:41:32,360 Speaker 1: litany of over hyped pipe dreams that science so often sells. 843 00:41:32,600 --> 00:41:35,120 Speaker 1: But George Church's pipe dreams have an uncanny record of 844 00:41:35,200 --> 00:41:38,800 Speaker 1: becoming reality. And I'd say this is the fundamental lie 845 00:41:38,840 --> 00:41:41,719 Speaker 1: about George that keeps getting repeated and spread by a 846 00:41:41,800 --> 00:41:45,960 Speaker 1: too credulous media.
The man makes constant, wild and almost 847 00:41:46,000 --> 00:41:49,040 Speaker 1: impossible claims about what's going to happen in the future, 848 00:41:49,400 --> 00:41:51,279 Speaker 1: and then people will be like, yeah, it sounds nuts, 849 00:41:51,280 --> 00:41:53,880 Speaker 1: but his crazy dreams have become reality before, so we 850 00:41:53,880 --> 00:41:57,719 Speaker 1: should take him seriously. And we shouldn't, because while Church 851 00:41:57,800 --> 00:42:00,480 Speaker 1: contributed massively to the science of gene sequencing, 852 00:42:00,719 --> 00:42:03,520 Speaker 1: at no point were his ambitions in that field a 853 00:42:03,600 --> 00:42:06,040 Speaker 1: pipe dream. No one was ever like, no one can 854 00:42:06,080 --> 00:42:08,240 Speaker 1: do what you're trying to do, George. You can't personally 855 00:42:08,239 --> 00:42:11,640 Speaker 1: sequence the human genome. Scientists had been doing that, right? 856 00:42:11,760 --> 00:42:14,520 Speaker 1: There were teams of people who had figured out aspects 857 00:42:14,520 --> 00:42:17,080 Speaker 1: of this before he got into the field. And while 858 00:42:17,120 --> 00:42:20,680 Speaker 1: what he figured out how to do was really meaningful, nobody was like, 859 00:42:20,880 --> 00:42:23,600 Speaker 1: this will never get done. It was more like, well, 860 00:42:23,640 --> 00:42:25,120 Speaker 1: someone's got to figure this out, and he was the one 861 00:42:25,120 --> 00:42:27,160 Speaker 1: who figured it out. I'm not saying this is not impressive, 862 00:42:27,160 --> 00:42:29,160 Speaker 1: but it was never a pipe dream, right, 863 00:42:29,600 --> 00:42:31,960 Speaker 1: not by the time he got into it. And 864 00:42:32,320 --> 00:42:34,800 Speaker 1: the stuff he's talking about in this book, like altering 865 00:42:34,880 --> 00:42:37,719 Speaker 1: human biology to make us immune to cancer, that is 866 00:42:37,760 --> 00:42:41,360 Speaker 1: a pipe dream.
There's no evidence that will ever be possible, 867 00:42:41,719 --> 00:42:45,959 Speaker 1: in part because cancer is a bunch of different diseases. There's 868 00:42:46,000 --> 00:42:48,440 Speaker 1: never going to be a single thing that renders you 869 00:42:48,520 --> 00:42:52,000 Speaker 1: immune to cancer, unless you start uploading people to the cloud, 870 00:42:52,080 --> 00:42:54,239 Speaker 1: which is also probably not possible. 871 00:42:54,880 --> 00:42:58,280 Speaker 2: I also get really nervous when the science includes both 872 00:42:58,480 --> 00:43:03,400 Speaker 2: car technology and cancer elimination. That feels like, wait a minute, 873 00:43:03,719 --> 00:43:06,560 Speaker 2: you got to focus, big dog. Both of those things 874 00:43:06,600 --> 00:43:08,960 Speaker 2: can't be true from your single discovery. 875 00:43:09,280 --> 00:43:12,040 Speaker 1: Yeah. It's like if you, like, you know, you're a 876 00:43:12,080 --> 00:43:14,480 Speaker 1: Hollywood actor who's, like, starting to go bald, and you 877 00:43:14,520 --> 00:43:17,120 Speaker 1: go in for, like, Turkish hair transplants, and the doctor's like, 878 00:43:17,320 --> 00:43:19,920 Speaker 1: hey man, you want a new liver? Like, 879 00:43:20,440 --> 00:43:23,120 Speaker 1: I got one. I'd be like, wait, wait, wait, I don't... 880 00:43:23,440 --> 00:43:26,560 Speaker 2: I actually don't know. I prefer to keep the one 881 00:43:26,560 --> 00:43:28,319 Speaker 2: I have. I know it sucks, but I'll keep it. 882 00:43:28,360 --> 00:43:30,239 Speaker 1: I just came in here to get the Joel McHale, man. 883 00:43:30,239 --> 00:43:35,000 Speaker 1: I really was not interested in a new organ. Never 884 00:43:35,080 --> 00:43:37,719 Speaker 1: have hair transplants worked out better for a man. My god. 885 00:43:37,880 --> 00:43:40,600 Speaker 2: Oh yeah, his are, his are low and they're strong. 886 00:43:40,680 --> 00:43:46,160 Speaker 1: I really respect it. The Mona Lisa of hair.
So 887 00:43:46,440 --> 00:43:48,640 Speaker 1: even when it comes to the cool things doctor Church 888 00:43:48,760 --> 00:43:51,080 Speaker 1: actually did, like store his book in DNA, and I 889 00:43:51,120 --> 00:43:55,200 Speaker 1: do think that's a cool idea, the practical reality behind 890 00:43:55,280 --> 00:43:57,560 Speaker 1: it is a lot less exciting than the hype. Now, 891 00:43:57,560 --> 00:43:59,239 Speaker 1: before we bust that, I want to show you a 892 00:43:59,320 --> 00:44:03,280 Speaker 1: video of Church presenting the exciting promise of DNA storage 893 00:44:03,320 --> 00:44:05,720 Speaker 1: in a video that was part of the promotional campaign 894 00:44:05,760 --> 00:44:08,439 Speaker 1: for his book. And he's being interviewed here with one 895 00:44:08,440 --> 00:44:10,719 Speaker 1: of his colleagues for this encoding project. 896 00:44:11,480 --> 00:44:13,879 Speaker 4: And yes, he does look exactly like I thought he would. 897 00:44:14,000 --> 00:44:15,040 Speaker 1: Yeah, no, he does. 898 00:44:15,760 --> 00:44:22,319 Speaker 3: The density is remarkably high, as little as one bit 899 00:44:22,440 --> 00:44:28,880 Speaker 3: per base, one base per cubic nanometer, and so we 900 00:44:28,920 --> 00:44:31,960 Speaker 3: can store on the order of almost a zettabyte in 901 00:44:32,000 --> 00:44:35,240 Speaker 3: a gram of DNA, a milliliter volume. 902 00:44:35,800 --> 00:44:38,560 Speaker 1: The theoretical density of DNA is such that you could 903 00:44:38,600 --> 00:44:41,400 Speaker 1: store the total world information, which is one point eight 904 00:44:41,440 --> 00:44:44,320 Speaker 1: zettabytes, at least in twenty eleven, in about four grams 905 00:44:44,320 --> 00:44:45,120 Speaker 1: of DNA. 906 00:44:46,040 --> 00:44:51,960 Speaker 3: And it leverages rapidly improving next generation reading and writing 907 00:44:52,000 --> 00:44:52,480 Speaker 3: of DNA.
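[Editor's note: those density figures roughly check out. A back-of-the-envelope sketch, assuming an average nucleotide mass of about 325 daltons and the theoretical two bits per base; the clip quotes one bit per base, which would halve the per-gram figure.]

```python
# Back-of-the-envelope check of the DNA storage density claims in the clip.
# ~325 Da per nucleotide and 2 bits per base are assumptions for the estimate.
AVOGADRO = 6.022e23        # molecules per mole
DALTONS_PER_BASE = 325.0   # approximate average mass of one DNA nucleotide
BITS_PER_BASE = 2          # theoretical maximum: 4 bases = 2 bits each

bases_per_gram = AVOGADRO / DALTONS_PER_BASE         # ~1.9e21 bases
bytes_per_gram = bases_per_gram * BITS_PER_BASE / 8  # ~4.6e20 bytes
zettabytes_per_gram = bytes_per_gram / 1e21          # ~0.46 ZB per gram

grams_for_world = 1.8 / zettabytes_per_gram          # ~3.9 g for 1.8 ZB
print(f"{zettabytes_per_gram:.2f} ZB per gram")
print(f"{grams_for_world:.1f} grams for 1.8 ZB")
```

About half a zettabyte per gram, and roughly four grams for the 2011 "total world information" figure, which matches the claim in the clip to within rounding.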
908 00:44:53,880 --> 00:44:56,440 Speaker 4: He looks like he'd be friends with Stockton Rush. 909 00:44:56,560 --> 00:44:58,680 Speaker 1: He does look like he'd be friends with Stockton Rush. 910 00:44:58,719 --> 00:45:01,240 Speaker 1: I think he's a lot smarter than Stockton, though, although 911 00:45:01,239 --> 00:45:04,080 Speaker 1: that is a very low bar, because Stockton was really 912 00:45:04,160 --> 00:45:11,200 Speaker 1: dumb. Old Stockton turned-into-paste Rush. He 913 00:45:11,239 --> 00:45:13,040 Speaker 1: does have the vibes of a guy that gets eaten 914 00:45:13,080 --> 00:45:14,960 Speaker 1: by his own dinosaurs. But I don't think that's going 915 00:45:15,000 --> 00:45:16,880 Speaker 1: to happen to him either, which is really tragic. That 916 00:45:16,920 --> 00:45:17,879 Speaker 1: actually does bum me out. 917 00:45:18,080 --> 00:45:22,120 Speaker 2: It definitely felt like late stage James Cameron, you know 918 00:45:22,120 --> 00:45:25,080 Speaker 2: what I mean? Like, it felt like, you're telling me 919 00:45:25,120 --> 00:45:28,120 Speaker 2: about the Avatar technology. This movie still sucks. 920 00:45:28,160 --> 00:45:34,080 Speaker 1: So yeah, yeah. So what he says here isn't technically wrong. 921 00:45:34,400 --> 00:45:36,920 Speaker 1: Like, that's all technically accurate about what you could do 922 00:45:36,960 --> 00:45:39,920 Speaker 1: with DNA, but it doesn't mean that DNA is currently, 923 00:45:40,040 --> 00:45:41,560 Speaker 1: or will be in any kind of timeframe, a good 924 00:45:41,600 --> 00:45:45,080 Speaker 1: way to store data. Now, obviously there's a need for 925 00:45:45,160 --> 00:45:48,000 Speaker 1: a much better way to store data. Digital data storage 926 00:45:48,120 --> 00:45:51,600 Speaker 1: is not forever and has a lot of problems, and, like, 927 00:45:51,960 --> 00:45:55,200 Speaker 1: it's just a really bad way to long term 928 00:45:55,360 --> 00:45:59,480 Speaker 1: protect human knowledge.
And obviously, like, paper is actually in 929 00:45:59,480 --> 00:46:01,640 Speaker 1: some ways better if you're storing it in, like, the 930 00:46:01,719 --> 00:46:04,840 Speaker 1: right conditions. Like, it will degrade less than digital data 931 00:46:04,880 --> 00:46:08,360 Speaker 1: over a long enough timeframe. But there's obvious problems with paper, 932 00:46:08,480 --> 00:46:10,160 Speaker 1: right? And other things, like, if you've got, like, 933 00:46:10,400 --> 00:46:14,480 Speaker 1: a climate sealed place to store books versus some hard drives, 934 00:46:14,480 --> 00:46:17,400 Speaker 1: those hard drives will break on a faster timeframe, assuming 935 00:46:17,400 --> 00:46:21,120 Speaker 1: you managed to keep that place, you know, properly maintained 936 00:46:21,160 --> 00:46:24,839 Speaker 1: and whatnot. So we do need ways that are 937 00:46:24,960 --> 00:46:28,200 Speaker 1: much more space efficient, because also the amount of data 938 00:46:28,280 --> 00:46:31,120 Speaker 1: humanity is producing, you know, especially since we have projects 939 00:46:31,160 --> 00:46:33,919 Speaker 1: like the Large Hadron Collider going, there's so much data being 940 00:46:33,920 --> 00:46:36,799 Speaker 1: made, and storing it is a problem, right? Like, 941 00:46:36,840 --> 00:46:40,239 Speaker 1: you need these massive facilities in order to even store 942 00:46:40,280 --> 00:46:42,359 Speaker 1: a lot of this stuff.
So these are issues that 943 00:46:42,440 --> 00:46:45,320 Speaker 1: we have, right? And DNA, and the fact that you 944 00:46:45,360 --> 00:46:48,120 Speaker 1: could store data with such density in it, could be 945 00:46:48,160 --> 00:46:50,480 Speaker 1: a solution to aspects of it, but it's kind of 946 00:46:50,560 --> 00:46:52,840 Speaker 1: framed a lot as, like, and in the future, 947 00:46:53,160 --> 00:46:55,799 Speaker 1: Netflix will keep all its data in, like, DNA drives, 948 00:46:55,920 --> 00:46:57,920 Speaker 1: like, everyone, like, everything will be stored that way. 949 00:46:57,960 --> 00:47:01,279 Speaker 1: And that probably is never going to happen. I can't 950 00:47:01,280 --> 00:47:03,479 Speaker 1: say definitely, because there's a lot we don't 951 00:47:03,480 --> 00:47:06,080 Speaker 1: know about how this technology would work. But 952 00:47:06,400 --> 00:47:08,680 Speaker 1: the shit we haven't figured out yet is really significant. 953 00:47:08,680 --> 00:47:11,680 Speaker 1: For example, there's a high error rate when you write 954 00:47:11,760 --> 00:47:15,200 Speaker 1: data to DNA currently, and since it's really easy to 955 00:47:15,280 --> 00:47:18,600 Speaker 1: fuck up writing the data, the current best practice is 956 00:47:18,600 --> 00:47:22,560 Speaker 1: to store multiple redundant copies of each piece of information 957 00:47:22,680 --> 00:47:24,880 Speaker 1: so you have some that are right, which is, like... 958 00:47:24,960 --> 00:47:27,120 Speaker 1: he puts seventy billion copies of that book on, like, 959 00:47:27,160 --> 00:47:29,120 Speaker 1: a dot, right? Like, that's, that's kind of what we're 960 00:47:29,120 --> 00:47:31,279 Speaker 1: talking about here. You store a shitload of copies of 961 00:47:31,320 --> 00:47:34,959 Speaker 1: something, because, you know, scientists don't even know how 962 00:47:35,080 --> 00:47:37,080 Speaker 1: many redundant backups we need yet.
963 00:47:37,120 --> 00:47:37,279 Speaker 3: Right. 964 00:47:37,320 --> 00:47:39,240 Speaker 1: I found a study where they're just trying to figure 965 00:47:39,280 --> 00:47:41,920 Speaker 1: that out, like, okay, what is the actual best practice 966 00:47:41,920 --> 00:47:44,920 Speaker 1: for the actual number of different redundant copies to store, 967 00:47:45,120 --> 00:47:47,440 Speaker 1: because we really haven't locked that down yet. 968 00:47:47,600 --> 00:47:50,719 Speaker 2: So all of those, all of those books, there's like 969 00:47:50,760 --> 00:47:54,040 Speaker 2: six of those books that are right, and then seventy 970 00:47:54,120 --> 00:47:58,680 Speaker 2: billion others that are, like, just mid, likely shit books. 971 00:47:58,920 --> 00:48:00,880 Speaker 1: I don't know if it's that, but, like, the... 972 00:48:01,120 --> 00:48:02,719 Speaker 1: I think the problem is, like, we don't actually know 973 00:48:02,920 --> 00:48:07,000 Speaker 1: how many we should be doing, right? We're still figuring 974 00:48:07,040 --> 00:48:08,800 Speaker 1: that part of it out. And then there's a separate 975 00:48:08,800 --> 00:48:11,239 Speaker 1: issue of, like, okay, well, you've got that on 976 00:48:11,280 --> 00:48:13,840 Speaker 1: this dot, but you can't... like, that dot's not connected 977 00:48:13,880 --> 00:48:16,000 Speaker 1: to a computer. Like, sure, the data is there, but 978 00:48:16,080 --> 00:48:18,560 Speaker 1: how would you access and store it and use it 979 00:48:18,800 --> 00:48:21,279 Speaker 1: if you wanted to, right? Like, could you get that 980 00:48:21,320 --> 00:48:24,480 Speaker 1: on a Kindle easily? And the answer is no, right? 981 00:48:25,440 --> 00:48:28,640 Speaker 1: I found an article on DNA data storage written by 982 00:48:29,120 --> 00:48:32,080 Speaker 1: Nithil Krishnaj that lays out some of the other practical 983 00:48:32,120 --> 00:48:34,880 Speaker 1: issues inherent to doing this for any practical reason.
Quote: 984 00:48:35,160 --> 00:48:38,360 Speaker 1: DNA has horrendously slow read and write speeds, so it 985 00:48:38,440 --> 00:48:40,680 Speaker 1: is not ideal for real time storage, and activities like 986 00:48:40,680 --> 00:48:44,000 Speaker 1: streaming video and gaming definitely won't be viable at this time. 987 00:48:44,239 --> 00:48:46,680 Speaker 1: As a result, DNA data storage loses some of its 988 00:48:46,760 --> 00:48:48,960 Speaker 1: versatility, and as of now, it would only work best 989 00:48:48,960 --> 00:48:52,200 Speaker 1: for long term storage. It's also not rewritable. Once you 990 00:48:52,320 --> 00:48:55,200 Speaker 1: encode data into DNA, there's no way of making changes 991 00:48:55,239 --> 00:48:58,359 Speaker 1: to your data without redoing the encoding process. There's also 992 00:48:58,400 --> 00:49:01,279 Speaker 1: no random access functionality, which means you can't access a 993 00:49:01,280 --> 00:49:03,680 Speaker 1: certain part of the data without decoding all of it. 994 00:49:04,200 --> 00:49:07,000 Speaker 1: And this is still, like, interesting and potentially a way... again, 995 00:49:07,040 --> 00:49:09,600 Speaker 1: you could have a bunch of different places where all 996 00:49:09,640 --> 00:49:11,960 Speaker 1: of the data we've, you know, made up to a 997 00:49:11,960 --> 00:49:14,799 Speaker 1: certain point is stored on DNA somewhere, and that would 998 00:49:14,880 --> 00:49:18,799 Speaker 1: potentially allow future people to access a lossless version of it, 999 00:49:18,840 --> 00:49:21,480 Speaker 1: and in a way that might be really helpful. But 1000 00:49:21,520 --> 00:49:23,720 Speaker 1: we're not talking about something that's going to alter daily 1001 00:49:23,800 --> 00:49:26,120 Speaker 1: life in its current form, and maybe not ever on 1002 00:49:26,280 --> 00:49:28,640 Speaker 1: any timeframe any of us will see, because it's just 1003 00:49:28,719 --> 00:49:29,960 Speaker 1: not practical, right?
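[Editor's note: the redundancy practice described above, store many copies and vote out the write errors, can be sketched as follows. The one percent error rate and fifteen copies are invented numbers for illustration, not real synthesis statistics.]

```python
# Toy demo of redundant-copy storage: each copy of a strand picks up random
# write errors, but a base-by-base majority vote across copies recovers the
# original. ERROR_RATE and the copy count are made-up numbers for the demo.
import random

random.seed(42)
BASES = "ACGT"
ERROR_RATE = 0.01  # hypothetical chance each base is written incorrectly

def noisy_copy(strand: str) -> str:
    """Write one copy, flipping each base to a wrong one with ERROR_RATE."""
    return "".join(
        random.choice([w for w in BASES if w != b]) if random.random() < ERROR_RATE else b
        for b in strand
    )

def majority_decode(copies: list[str]) -> str:
    """At each position, take whichever base appears most often."""
    return "".join(
        max(BASES, key=lambda b: column.count(b))
        for column in zip(*copies)
    )

original = "".join(random.choice(BASES) for _ in range(1000))
copies = [noisy_copy(original) for _ in range(15)]
print(majority_decode(copies) == original)  # recovers the original strand
```

With a one percent error rate, nearly every individual copy is corrupted somewhere, yet the vote across fifteen copies is essentially always right; the open question mentioned above is how few copies you can get away with.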
1004 00:49:30,000 --> 00:49:33,799 Speaker 2: Yeah, Netflix isn't going to exist when this is actually a 1005 00:49:33,920 --> 00:49:38,840 Speaker 1: thing. Right, yes, yes. Like, at some point in the future, 1006 00:49:38,920 --> 00:49:40,799 Speaker 1: maybe they'll figure out all this stuff, but that is 1007 00:49:40,840 --> 00:49:43,440 Speaker 1: not any kind of time frame anybody should be, like, 1008 00:49:43,760 --> 00:49:46,399 Speaker 1: waiting for, right? Again, not to say this isn't an 1009 00:49:46,440 --> 00:49:50,280 Speaker 1: interesting or a potential use. It's just this practice 1010 00:49:50,320 --> 00:49:52,400 Speaker 1: of, like, this is the future of data storage, and it's like, 1011 00:49:52,440 --> 00:49:57,239 Speaker 1: well, maybe in like a couple hundred years. Yeah. Now, 1012 00:49:57,320 --> 00:49:59,680 Speaker 1: there's also something in that video that I find creepy 1013 00:49:59,719 --> 00:50:02,279 Speaker 1: as an aside, which is that doctor Church proposes one 1014 00:50:02,400 --> 00:50:04,920 Speaker 1: use of this technology would be to create permanent records 1015 00:50:04,960 --> 00:50:08,480 Speaker 1: of the brain activity of a human being, and I 1016 00:50:08,560 --> 00:50:10,040 Speaker 1: just don't like the way he says this. 1017 00:50:11,480 --> 00:50:14,399 Speaker 3: Or you could imagine other huge data sources, like all 1018 00:50:14,440 --> 00:50:17,200 Speaker 3: the neuronal firings in the brain, which could be encoded 1019 00:50:17,200 --> 00:50:19,640 Speaker 3: into the DNA, and again you could do selective 1020 00:50:19,680 --> 00:50:21,400 Speaker 3: reading of that as needed. 1021 00:50:22,680 --> 00:50:26,680 Speaker 1: Yikes. I don't love that. Now, he's saying, like, well, 1022 00:50:26,719 --> 00:50:29,560 Speaker 1: you could do some really groundbreaking medical studies if you 1023 00:50:29,600 --> 00:50:32,560 Speaker 1: had access to this much data.
And sure, but when 1024 00:50:32,560 --> 00:50:36,000 Speaker 1: you talk about making perfect records of a human's brain activity, 1025 00:50:36,160 --> 00:50:38,160 Speaker 1: you're also getting into the kind of territory where I'm like, 1026 00:50:38,200 --> 00:50:40,880 Speaker 1: I want to immediately hear what you think about the 1027 00:50:40,920 --> 00:50:45,719 Speaker 1: potential for surveillance and violation of privacy, right? Like, you 1028 00:50:45,840 --> 00:50:47,520 Speaker 1: kind of have to bring that up right away. You 1029 00:50:47,560 --> 00:50:49,919 Speaker 1: can't just be interested in the technology here. 1030 00:50:50,440 --> 00:50:53,080 Speaker 2: It seems like you want to download some information from 1031 00:50:53,120 --> 00:50:56,920 Speaker 2: people that maybe they didn't want to give you. That's 1032 00:50:56,960 --> 00:50:59,160 Speaker 2: some nasty work there, doctor Jege. 1033 00:50:59,400 --> 00:51:01,920 Speaker 1: I'm a little, and I promise you we're just 1034 00:51:01,960 --> 00:51:04,360 Speaker 1: getting into all this. Well, theoretically, there's stuff about this 1035 00:51:04,440 --> 00:51:07,640 Speaker 1: that could be wrong or he's exaggerating. The actual fucked 1036 00:51:07,719 --> 00:51:11,160 Speaker 1: up stuff starts right about now, right, because when we're 1037 00:51:11,160 --> 00:51:13,759 Speaker 1: talking about like this is a technology that could be 1038 00:51:13,840 --> 00:51:18,280 Speaker 1: good or could have some major problematic ethical, you know, implications, 1039 00:51:18,520 --> 00:51:21,160 Speaker 1: you want to know the scientist working on the technology 1040 00:51:21,200 --> 00:51:24,320 Speaker 1: that could have fucked up ethical implications has a strong 1041 00:51:24,600 --> 00:51:28,480 Speaker 1: history of personal ethics.
Right, And this brings me to 1042 00:51:28,560 --> 00:51:31,600 Speaker 1: doctor Church's history with our old friend of the pod, 1043 00:51:32,000 --> 00:51:38,680 Speaker 1: Jeffrey Epstein. Jeffrey Epstein, that's the monster at the end 1044 00:51:38,680 --> 00:51:42,680 Speaker 1: of this book. Okay, at the end of this episode, 1045 00:51:43,320 --> 00:51:47,080 Speaker 1: Church to Jeffrey Epstein, Yes. 1046 00:51:46,960 --> 00:51:48,080 Speaker 4: I need money. 1047 00:51:49,680 --> 00:51:55,799 Speaker 1: Surprise! He emerges. Literally, he is the Thin White Duke of 1048 00:51:55,840 --> 00:51:58,680 Speaker 1: evil scientists. That's Jeffrey Epstein. 1049 00:51:59,239 --> 00:52:02,160 Speaker 2: That's really cool. Oh that's like a really exciting plot 1050 00:52:02,200 --> 00:52:04,560 Speaker 2: twist that you don't see coming. I love this. 1051 00:52:04,880 --> 00:52:06,800 Speaker 1: It's like the end of the second or third Kingsman 1052 00:52:06,840 --> 00:52:09,560 Speaker 1: movie when Hitler comes out of nowhere, like, oh, there 1053 00:52:09,600 --> 00:52:10,680 Speaker 1: we go, there we go. 1054 00:52:11,000 --> 00:52:14,120 Speaker 2: Oh, you guys really built that along the 1055 00:52:13,920 --> 00:52:27,560 Speaker 1: way. Great, dropping him in like Thanos. So some sources I've 1056 00:52:27,600 --> 00:52:31,040 Speaker 1: seen say that Church and Epstein's relationship started in 1057 00:52:31,040 --> 00:52:33,760 Speaker 1: two thousand and five. I've heard Church claim two thousand 1058 00:52:33,760 --> 00:52:35,560 Speaker 1: and six, but people have said that he was receiving 1059 00:52:35,640 --> 00:52:38,440 Speaker 1: funding from Epstein as far back as two thousand and five.
1060 00:52:38,880 --> 00:52:42,000 Speaker 1: It may just be that his labs started receiving unrestricted 1061 00:52:42,040 --> 00:52:45,680 Speaker 1: funding from Epstein before they met, and I will remind 1062 00:52:45,680 --> 00:52:47,400 Speaker 1: you here they were receiving funding from two thousand and 1063 00:52:47,440 --> 00:52:49,239 Speaker 1: five to two thousand and seven. Epstein was convicted in 1064 00:52:49,239 --> 00:52:52,040 Speaker 1: two thousand and eight of sex trafficking. Although that's not 1065 00:52:52,040 --> 00:52:54,160 Speaker 1: the end of their relationship. But let's talk about those 1066 00:52:54,160 --> 00:52:56,719 Speaker 1: first couple of years. Now. At that point in his 1067 00:52:56,760 --> 00:52:59,040 Speaker 1: career, two thousand and five, right, he is, he's just 1068 00:52:59,080 --> 00:53:02,480 Speaker 1: started the Personal Genome Project. His primary focus and 1069 00:53:02,520 --> 00:53:04,600 Speaker 1: the thing that he's most famous for is his work 1070 00:53:04,680 --> 00:53:07,919 Speaker 1: on like gene sequencing and gene editing. You know, he's 1071 00:53:07,960 --> 00:53:10,759 Speaker 1: into both of these things. An article for The New 1072 00:53:10,840 --> 00:53:14,560 Speaker 1: York Times that discusses Church's and other scientists' associations with 1073 00:53:14,640 --> 00:53:17,719 Speaker 1: Epstein described doctor Church in this period as quote a 1074 00:53:17,760 --> 00:53:20,879 Speaker 1: molecular engineer who has worked to identify genes that could 1075 00:53:20,880 --> 00:53:26,080 Speaker 1: be altered to create superior humans. Oh oh uh oh oh, boy, 1076 00:53:27,440 --> 00:53:28,080 Speaker 1: don't like that. 1077 00:53:28,800 --> 00:53:31,160 Speaker 2: Superior humans. That's a trigger word for me.
1078 00:53:31,320 --> 00:53:35,000 Speaker 1: Yeah, yeah, boy, has anyone ever said those two words 1079 00:53:35,040 --> 00:53:36,960 Speaker 1: and not been doing something horrifying. 1080 00:53:38,960 --> 00:53:39,560 Speaker 2: Yikes. 1081 00:53:39,760 --> 00:53:43,399 Speaker 1: So, doctor Church was an early pioneer for the use 1082 00:53:43,440 --> 00:53:47,080 Speaker 1: of CRISPR to edit human genes, and one of his 1083 00:53:47,280 --> 00:53:50,160 Speaker 1: ambitions was, and is, to create a method of gene 1084 00:53:50,160 --> 00:53:53,360 Speaker 1: therapy to, in his words, knock out both copies of 1085 00:53:53,440 --> 00:53:56,239 Speaker 1: your CCR five gene, which is the AIDS receptor, and 1086 00:53:56,280 --> 00:53:58,359 Speaker 1: then put them back in your body. Then you can't 1087 00:53:58,360 --> 00:54:01,120 Speaker 1: get AIDS anymore because the virus can't enter your cells. And hey, 1088 00:54:01,880 --> 00:54:05,799 Speaker 1: that sounds fine. AIDS is bad; stopping people from being 1089 00:54:05,800 --> 00:54:09,360 Speaker 1: able to get it, lovely. The issue is that Church's 1090 00:54:09,360 --> 00:54:12,719 Speaker 1: ambitions don't stop here, and Epstein was not drawn to 1091 00:54:12,800 --> 00:54:16,200 Speaker 1: Church's life work for anything as humanitarian as stopping a virus. 1092 00:54:16,640 --> 00:54:18,840 Speaker 1: I have found a couple stories of how Church and 1093 00:54:18,840 --> 00:54:22,040 Speaker 1: Epstein actually met for the first time. Church has claimed 1094 00:54:22,040 --> 00:54:25,239 Speaker 1: that he was connected to Epstein first, and he says, 1095 00:54:25,239 --> 00:54:28,200 Speaker 1: I don't know which, either through the chairman of Harvard's 1096 00:54:28,200 --> 00:54:33,600 Speaker 1: psych department or through his literary agent, John Brockman.
Sure, buddy, 1097 00:54:33,760 --> 00:54:36,800 Speaker 1: I feel like I remember how I first met Jeffrey Epstein, 1098 00:54:36,920 --> 00:54:37,960 Speaker 1: but maybe I'm wrong. 1099 00:54:38,680 --> 00:54:41,160 Speaker 2: But that speaks to his multitudes. 1100 00:54:42,160 --> 00:54:42,640 Speaker 1: Man right. 1101 00:54:42,880 --> 00:54:44,719 Speaker 2: It could either be a money man or it could 1102 00:54:44,719 --> 00:54:48,960 Speaker 2: be an academic man. But one of them, yeah, introduced. 1103 00:54:48,520 --> 00:54:51,600 Speaker 1: Me, and both of them are implicated in some sketchy 1104 00:54:51,600 --> 00:54:56,080 Speaker 1: Epstein stuff. To be clear, we were all there. But 1105 00:54:56,280 --> 00:54:59,080 Speaker 1: in another interview, Church seemed to suggest that Epstein probably 1106 00:54:59,080 --> 00:55:02,360 Speaker 1: reached out to him because Epstein was friendly and working 1107 00:55:02,400 --> 00:55:06,759 Speaker 1: with a biologist and mathematician named Martin Nowak. Church and 1108 00:55:06,760 --> 00:55:10,399 Speaker 1: Nowak had worked together on various applications of CRISPR to 1109 00:55:10,520 --> 00:55:13,480 Speaker 1: edit genes. Per an article in Stat by Sharon Begley, 1110 00:55:13,920 --> 00:55:16,920 Speaker 1: at the get-togethers with Nowak, Church said Epstein seemed 1111 00:55:16,920 --> 00:55:20,200 Speaker 1: interested in the science of life's origins and mathematically modeling 1112 00:55:20,239 --> 00:55:24,120 Speaker 1: the evolution of viruses, cancer cells, and life itself. Epstein 1113 00:55:24,200 --> 00:55:26,359 Speaker 1: did not leave much of an impression on him. 1114 00:55:26,440 --> 00:55:29,200 Speaker 1: Church said the meetings weren't really about Jeffrey. They were 1115 00:55:29,239 --> 00:55:32,200 Speaker 1: about the scientists who were talking with each other.
Normally, 1116 00:55:32,320 --> 00:55:34,880 Speaker 1: expectations are low for people who sit in on meetings 1117 00:55:34,920 --> 00:55:38,080 Speaker 1: far outside their field of expertise, so he's kind of like, well, 1118 00:55:38,120 --> 00:55:40,479 Speaker 1: it was mostly just the scientists talking, and Jeffrey didn't 1119 00:55:40,480 --> 00:55:42,479 Speaker 1: really know much, and when he talked, it didn't really 1120 00:55:42,680 --> 00:55:44,040 Speaker 1: make an impression as a result. 1121 00:55:44,160 --> 00:55:44,399 Speaker 3: Right. 1122 00:55:45,120 --> 00:55:48,640 Speaker 1: And if that's the truth, which I have trouble believing 1123 00:55:48,680 --> 00:55:51,959 Speaker 1: because their relationship goes on after this. But if that's 1124 00:55:52,000 --> 00:55:54,360 Speaker 1: the truth, then all Church did was take this guy's 1125 00:55:54,440 --> 00:55:56,680 Speaker 1: money, who was not convicted of a crime yet, and 1126 00:55:56,760 --> 00:55:59,480 Speaker 1: show up at some dinners to talk about science. And 1127 00:55:59,520 --> 00:56:02,799 Speaker 1: that wouldn't be so bad, right. And in fact, there 1128 00:56:02,800 --> 00:56:05,560 Speaker 1: are some people who got some funding from Epstein and 1129 00:56:05,600 --> 00:56:08,000 Speaker 1: were not involved in the sketchy stuff, because he funded 1130 00:56:08,040 --> 00:56:09,759 Speaker 1: a lot of guys and they didn't all go to 1131 00:56:09,800 --> 00:56:12,640 Speaker 1: his parties or have sex with teenagers. 1132 00:56:12,760 --> 00:56:12,960 Speaker 5: Right. 1133 00:56:14,360 --> 00:56:15,560 Speaker 1: And I'm not saying Church did. 1134 00:56:15,840 --> 00:56:18,800 Speaker 2: He made enough money to buy an island. You can't 1135 00:56:18,800 --> 00:56:21,880 Speaker 2: do that with only sex pests, some people. Right, he 1136 00:56:21,920 --> 00:56:24,440 Speaker 2: had to be on some version of the up and up.
1137 00:56:24,640 --> 00:56:26,920 Speaker 1: There's some people who were involved with him who have 1138 00:56:27,000 --> 00:56:28,400 Speaker 1: been tarnished unfairly. 1139 00:56:28,600 --> 00:56:28,759 Speaker 3: Right. 1140 00:56:28,880 --> 00:56:31,279 Speaker 1: I'm also not saying that Church is tarnished unfairly here, 1141 00:56:31,680 --> 00:56:33,680 Speaker 1: because I don't think he is. However, I would be 1142 00:56:33,680 --> 00:56:35,719 Speaker 1: remiss if I did not read a different description of 1143 00:56:35,719 --> 00:56:38,400 Speaker 1: the dinner parties and events that Epstein held for scientists 1144 00:56:38,480 --> 00:56:41,080 Speaker 1: around this time. Maybe these are a different set of 1145 00:56:41,120 --> 00:56:44,800 Speaker 1: parties than the ones Church attended, although they include people 1146 00:56:44,840 --> 00:56:47,120 Speaker 1: he's listed as his friends. So I'm gonna quote from 1147 00:56:47,120 --> 00:56:50,200 Speaker 1: the New York Times here. The Harvard cognitive psychologist Steven 1148 00:56:50,200 --> 00:56:53,160 Speaker 1: Pinker said he was invited by colleagues, including Martin Nowak, 1149 00:56:53,760 --> 00:56:56,920 Speaker 1: a Harvard professor of mathematics and biology, and the theoretical 1150 00:56:56,960 --> 00:57:00,520 Speaker 1: physicist Lawrence Krauss, to salons and coffee klatches at which 1151 00:57:00,520 --> 00:57:03,600 Speaker 1: mister Epstein would hold court. On multiple occasions, starting in 1152 00:57:03,600 --> 00:57:07,080 Speaker 1: the early two thousands, mister Epstein told scientists and businessmen 1153 00:57:07,120 --> 00:57:09,480 Speaker 1: about his ambitions to use his New Mexico ranch as 1154 00:57:09,520 --> 00:57:12,320 Speaker 1: a base where women would be inseminated with his sperm 1155 00:57:12,320 --> 00:57:14,480 Speaker 1: and would give birth to his babies.
According to two 1156 00:57:14,560 --> 00:57:17,760 Speaker 1: award winning scientists and an advisor to large companies and 1157 00:57:17,800 --> 00:57:21,040 Speaker 1: wealthy individuals, all of whom mister Epstein told about it, 1158 00:57:21,040 --> 00:57:23,480 Speaker 1: it was not a secret. The advisor, for example, said 1159 00:57:23,520 --> 00:57:25,439 Speaker 1: he was told about the plans not only by mister 1160 00:57:25,480 --> 00:57:28,439 Speaker 1: Epstein at a gathering at his Manhattan townhouse, but also 1161 00:57:28,440 --> 00:57:30,920 Speaker 1: by at least one prominent member of the business community. 1162 00:57:31,160 --> 00:57:33,960 Speaker 1: One of the scientists said mister Epstein divulged his idea 1163 00:57:34,000 --> 00:57:35,560 Speaker 1: in two thousand and one at a dinner at the 1164 00:57:35,560 --> 00:57:38,600 Speaker 1: same townhouse. The other recalled mister Epstein discussing it with 1165 00:57:38,680 --> 00:57:40,560 Speaker 1: him at a two thousand and six conference that he 1166 00:57:40,600 --> 00:57:43,480 Speaker 1: hosted in Saint Thomas in the Virgin Islands. Once, at 1167 00:57:43,480 --> 00:57:46,320 Speaker 1: a dinner at mister Epstein's mansion on Manhattan's Upper East Side, 1168 00:57:46,480 --> 00:57:49,000 Speaker 1: mister Lanier, and he's talking about Jaron Lanier, said that 1169 00:57:49,000 --> 00:57:50,760 Speaker 1: he talked to a scientist who told him that mister 1170 00:57:50,800 --> 00:57:52,959 Speaker 1: Epstein's goal was to have twenty women at a time 1171 00:57:53,000 --> 00:57:55,959 Speaker 1: impregnated at his thirty three thousand square foot Zorro Ranch 1172 00:57:55,960 --> 00:58:01,000 Speaker 1: in a tiny town outside of Santa Fe. Whoa. Cool. 1173 00:58:01,760 --> 00:58:05,440 Speaker 2: It is pretty impressive to find out that Jeffrey Epstein 1174 00:58:05,600 --> 00:58:08,320 Speaker 2: is somehow more of a piece of shit than I thought.
1175 00:58:09,080 --> 00:58:11,920 Speaker 2: I was like, ah, he's just a monster. I don't 1176 00:58:11,960 --> 00:58:14,720 Speaker 2: think he's like a super monster, god. 1177 00:58:14,800 --> 00:58:17,200 Speaker 1: No, no, no, I don't think he's got a baby ranch. 1178 00:58:17,320 --> 00:58:20,040 Speaker 1: Oh yeah, he's got a baby ranch. Yeah, or he 1179 00:58:20,120 --> 00:58:21,400 Speaker 1: tried to have a baby ranch. 1180 00:58:21,480 --> 00:58:21,680 Speaker 2: Now. 1181 00:58:22,360 --> 00:58:25,280 Speaker 1: Stat News, to their credit, did ask doctor Church after 1182 00:58:25,280 --> 00:58:29,000 Speaker 1: Epstein's death about Epstein's eugenics baby ranch, being like, you're 1183 00:58:29,040 --> 00:58:33,600 Speaker 1: working in like gene editing people, and Epstein wanted to 1184 00:58:33,640 --> 00:58:36,040 Speaker 1: do this. Did he talk to you about this? Because 1185 00:58:36,040 --> 00:58:38,320 Speaker 1: you guys knew each other when he was talking about this, 1186 00:58:38,520 --> 00:58:41,880 Speaker 1: right? Now, I have no proof either way. For his part, 1187 00:58:41,920 --> 00:58:45,240 Speaker 1: doctor Church said, I never heard anything about it, although 1188 00:58:45,280 --> 00:58:47,320 Speaker 1: he went on to say, and I find this curious, 1189 00:58:47,600 --> 00:58:49,640 Speaker 1: I'd have thought that I would have been involved in 1190 00:58:49,680 --> 00:58:52,919 Speaker 1: that kind of conversation. But it didn't tend to go 1191 00:58:53,000 --> 00:58:56,080 Speaker 1: in that direction. But also, I think people tend to 1192 00:58:56,120 --> 00:58:59,880 Speaker 1: behave themselves around me. That's a weird thing to say. 1193 00:58:59,720 --> 00:59:01,600 Speaker 4: What a, what a strange little guy.
1194 00:59:01,720 --> 00:59:05,000 Speaker 1: Honestly, bro, if someone asks you whether or not you're 1195 00:59:05,040 --> 00:59:08,960 Speaker 1: involved in Jeffrey Epstein's baby ranch, you end the statement 1196 00:59:09,000 --> 00:59:11,320 Speaker 1: with I never heard anything about it. Oh, yeah. 1197 00:59:11,120 --> 00:59:12,760 Speaker 2: You don't have to be like, I would have liked 1198 00:59:12,760 --> 00:59:13,600 Speaker 2: to talk to him about it. 1199 00:59:13,720 --> 00:59:15,600 Speaker 1: Yeah, I would have. I'm kind of offended he didn't 1200 00:59:15,600 --> 00:59:16,000 Speaker 1: bring me. 1201 00:59:15,960 --> 00:59:19,960 Speaker 2: In, but also, sounds awesome. I just didn't talk 1202 00:59:20,000 --> 00:59:20,360 Speaker 2: to him. 1203 00:59:20,480 --> 00:59:22,480 Speaker 1: But also, then, when you say it didn't tend to 1204 00:59:22,520 --> 00:59:24,680 Speaker 1: go in that direction, well, tend doesn't mean never. Does 1205 00:59:24,680 --> 00:59:27,400 Speaker 1: that mean sometimes it kind of did? Like, what are 1206 00:59:27,400 --> 00:59:31,480 Speaker 1: you saying? You seem like a man who's precise with 1207 00:59:31,520 --> 00:59:34,840 Speaker 1: his language. I don't know why you're phrasing it this way. 1208 00:59:34,920 --> 00:59:37,560 Speaker 2: Yeah, Jeffrey is like, you know, I want to start 1209 00:59:37,560 --> 00:59:40,560 Speaker 2: a baby ranch, and he's like, huh? He's like, huh? Nothing, nothing. 1210 00:59:40,560 --> 00:59:41,800 Speaker 1: I thought he was going to bring it up again, 1211 00:59:41,840 --> 00:59:42,320 Speaker 1: but he didn't. 1212 00:59:42,320 --> 00:59:45,200 Speaker 2: You know, never mind, never mind. I thought you'd be 1213 00:59:45,200 --> 00:59:45,760 Speaker 2: cool about it. 1214 00:59:45,800 --> 00:59:47,840 Speaker 1: I thought you were cool. Yeah. It's like Jeffrey being 1215 00:59:47,880 --> 00:59:50,360 Speaker 1: like, who wants a coke? What? Nothing. I didn't say anything. Nothing.
1216 00:59:50,960 --> 00:59:51,600 Speaker 1: I don't do coke. 1217 00:59:51,720 --> 00:59:51,919 Speaker 2: Yeah. 1218 00:59:53,000 --> 00:59:56,240 Speaker 1: Sober, sober Jeffrey Epstein, that's what they call me. I'm 1219 00:59:56,240 --> 00:59:59,760 Speaker 1: gonna go to the bathroom for like fifteen minutes. I'm gonna 1220 00:59:59,760 --> 01:00:01,360 Speaker 1: come out really excited. I love pen. 1221 01:00:01,800 --> 01:00:03,600 Speaker 2: We're gonna talk business. I like it. 1222 01:00:03,760 --> 01:00:06,280 Speaker 1: I'm gonna look like Robin Williams in nineteen eighty five 1223 01:00:06,320 --> 01:00:08,120 Speaker 1: when I step out of that thing. It's nothing weird. 1224 01:00:09,480 --> 01:00:12,280 Speaker 1: So perhaps that is the truth, that he had nothing 1225 01:00:12,320 --> 01:00:15,080 Speaker 1: to do with this. Friends and colleagues of doctor Church 1226 01:00:15,160 --> 01:00:18,400 Speaker 1: expressed surprise when, after Epstein's death, the years of close 1227 01:00:18,440 --> 01:00:22,200 Speaker 1: connections between Epstein and Church were made public. One associate 1228 01:00:22,280 --> 01:00:25,200 Speaker 1: pointed out that Church even brought a philosopher into his 1229 01:00:25,280 --> 01:00:29,080 Speaker 1: Harvard lab to flag potential bioethics issues in experiments, and 1230 01:00:29,120 --> 01:00:31,640 Speaker 1: that he teaches a research ethics class, which is uncommon 1231 01:00:31,760 --> 01:00:34,440 Speaker 1: for a scientist in his field working at his level, 1232 01:00:34,680 --> 01:00:36,280 Speaker 1: and so they're like, well, it's weird to me that 1233 01:00:36,280 --> 01:00:38,240 Speaker 1: he would have any relationship with Epstein, because I have 1234 01:00:38,280 --> 01:00:40,920 Speaker 1: always considered him one of the people most concerned with ethics 1235 01:00:40,920 --> 01:00:44,800 Speaker 1: in our field.
And again, to emphasize, for legal 1236 01:00:44,840 --> 01:00:47,640 Speaker 1: and moral reasons, there is no evidence that Church was 1237 01:00:47,640 --> 01:00:51,040 Speaker 1: working on any kind of eugenics baby project for Epstein, 1238 01:00:51,560 --> 01:00:54,760 Speaker 1: not that there would be, because Jeffrey Epstein didn't publish 1239 01:00:54,920 --> 01:00:57,560 Speaker 1: all the details about everyone he was involved with, with everything. 1240 01:00:58,280 --> 01:01:00,440 Speaker 1: We just know he talked about this plan during several 1241 01:01:00,480 --> 01:01:03,120 Speaker 1: coffee klatches and other events with his pet scientists, and 1242 01:01:03,160 --> 01:01:07,080 Speaker 1: that Church was at similar events. Doctor Church claims that 1243 01:01:07,160 --> 01:01:10,200 Speaker 1: working with Epstein at all was an ethical lapse, but 1244 01:01:10,320 --> 01:01:13,920 Speaker 1: not entirely his fault. He points out that universities are 1245 01:01:13,920 --> 01:01:16,880 Speaker 1: supposed to vet donors before they meet with faculty, and 1246 01:01:16,920 --> 01:01:19,760 Speaker 1: he told Stat, my understanding is this vetting is the 1247 01:01:19,800 --> 01:01:22,320 Speaker 1: responsibility of the development office, which is yet another reason 1248 01:01:22,360 --> 01:01:24,720 Speaker 1: why scientists are a little more relaxed. They feel they 1249 01:01:24,720 --> 01:01:27,240 Speaker 1: have administrators who, in theory, do the difficult job of 1250 01:01:27,280 --> 01:01:31,280 Speaker 1: figuring out who's legit. So, yeah, I'm just a little guy. 1251 01:01:31,680 --> 01:01:34,360 Speaker 1: How could I be expected to think about this sort 1252 01:01:34,360 --> 01:01:35,720 Speaker 1: of thing? That's someone else's job. 1253 01:01:35,960 --> 01:01:39,960 Speaker 2: And now, and now he's picking who introduced him.
Right, 1254 01:01:39,960 --> 01:01:42,880 Speaker 2: Previously he didn't know who introduced him, huh, it was 1255 01:01:42,920 --> 01:01:43,840 Speaker 2: Harvard the whole time. 1256 01:01:43,920 --> 01:01:48,160 Speaker 1: It was Harvard. Yeah. Now, he added that scientists, quote, 1257 01:01:48,200 --> 01:01:51,600 Speaker 1: myself included, are not very good at screening or judging 1258 01:01:51,680 --> 01:01:54,200 Speaker 1: human beings. Right, that's just like, ah, we're all just 1259 01:01:54,320 --> 01:01:56,720 Speaker 1: kind of, you know, bad at people. You know, it's 1260 01:01:56,720 --> 01:02:00,000 Speaker 1: not really our strong suit. And to be fair, also, first, 1261 01:02:00,240 --> 01:02:01,960 Speaker 1: I just don't believe that for Church, because he's an 1262 01:02:02,000 --> 01:02:05,240 Speaker 1: incredibly skilled public relations expert. I think he's very good 1263 01:02:05,280 --> 01:02:08,240 Speaker 1: with people, right, and he's probably very good at judging 1264 01:02:08,320 --> 01:02:12,600 Speaker 1: people, because that's what he does. Anyway, to be fair 1265 01:02:12,640 --> 01:02:14,880 Speaker 1: to Church, he went on to make a good point 1266 01:02:14,880 --> 01:02:17,080 Speaker 1: in that Stat interview that almost does sound like a 1267 01:02:17,160 --> 01:02:19,919 Speaker 1: mea culpa. He states that a lot of scientists working 1268 01:02:19,960 --> 01:02:23,760 Speaker 1: on cutting edge projects with important applications feel what he 1269 01:02:23,880 --> 01:02:27,280 Speaker 1: described as an exceptionalism, which is a sense that anything 1270 01:02:27,320 --> 01:02:29,720 Speaker 1: they do is okay if the work is important enough. 1271 01:02:29,760 --> 01:02:32,960 Speaker 1: This is almost like a precursor to like effective altruism 1272 01:02:33,280 --> 01:02:33,960 Speaker 1: type feelings. 1273 01:02:34,040 --> 01:02:34,200 Speaker 2: Right.
1274 01:02:34,280 --> 01:02:37,320 Speaker 1: He predates that, but I don't think he's wrong here. 1275 01:02:37,360 --> 01:02:38,840 Speaker 1: I do think that's a thing that a lot of 1276 01:02:38,880 --> 01:02:42,680 Speaker 1: scientists working in important fields feel, which is that, like, well, 1277 01:02:42,880 --> 01:02:44,480 Speaker 1: if I have to do something a little fucked up 1278 01:02:44,560 --> 01:02:49,680 Speaker 1: to further this research with incredibly important, like, implications, it's 1279 01:02:49,760 --> 01:02:52,760 Speaker 1: worth it. And he cited the case of a Nobel laureate, 1280 01:02:52,800 --> 01:02:56,200 Speaker 1: a biologist named Sydney Brenner, who took fifteen million dollars 1281 01:02:56,240 --> 01:02:59,640 Speaker 1: from Philip Morris to fund a biology institute, and Sydney's 1282 01:02:59,680 --> 01:03:02,880 Speaker 1: argument was that, like, look, if Big Tobacco keeps this money, 1283 01:03:02,880 --> 01:03:05,600 Speaker 1: they'll use it for something worse than I will using 1284 01:03:05,640 --> 01:03:09,320 Speaker 1: it for science. Which is like an arguable point, but 1285 01:03:09,400 --> 01:03:11,560 Speaker 1: also, like, well, Big Tobacco's gonna give you that money 1286 01:03:11,600 --> 01:03:14,760 Speaker 1: because it's a write-off, and like, they're gonna, they're expecting something. 1287 01:03:14,800 --> 01:03:18,320 Speaker 1: They're expecting something from it, right, aren't they, Sydney? Are 1288 01:03:18,360 --> 01:03:20,000 Speaker 1: you giving them anything? Are you sure? 1289 01:03:20,360 --> 01:03:20,560 Speaker 3: Right? 1290 01:03:21,000 --> 01:03:24,640 Speaker 2: And also, there you picked the worst guy, you know 1291 01:03:24,640 --> 01:03:26,800 Speaker 2: what I mean? Like, it's not like you got a 1292 01:03:26,920 --> 01:03:31,080 Speaker 2: comparable, uh, space to be putting this money into.
1293 01:03:31,360 --> 01:03:33,640 Speaker 1: It's one thing if, like, Walmart needed a tax write-off, 1294 01:03:33,640 --> 01:03:35,880 Speaker 1: so they funded this, like, medical thing I was doing, 1295 01:03:35,960 --> 01:03:38,680 Speaker 1: and, like, you know, Walmart's a sketchy corporation, but also, 1296 01:03:38,760 --> 01:03:40,400 Speaker 1: like, the science is good. It's like, no, this is 1297 01:03:40,440 --> 01:03:44,920 Speaker 1: the tobacco industry. Yeah, their product is literally killing more 1298 01:03:44,960 --> 01:03:46,560 Speaker 1: people every year than World War Two. 1299 01:03:47,000 --> 01:03:50,040 Speaker 2: They win, they win the murder game. 1300 01:03:50,520 --> 01:03:53,280 Speaker 1: You did it. It's just a little different, yeah, you know. 1301 01:03:55,280 --> 01:03:58,160 Speaker 1: But anyway, there is an argument about, like, well, how, 1302 01:03:58,240 --> 01:04:00,640 Speaker 1: and obviously I'm in the advertising business, there's always an 1303 01:04:00,680 --> 01:04:04,200 Speaker 1: argument, how many moral compromises should you make to fund 1304 01:04:04,240 --> 01:04:05,120 Speaker 1: something valuable? 1305 01:04:05,200 --> 01:04:05,360 Speaker 2: Right? 1306 01:04:05,400 --> 01:04:08,080 Speaker 1: And the answer isn't none. You know, this is capitalism. 1307 01:04:08,320 --> 01:04:11,920 Speaker 1: I would say Big Tobacco's, maybe, like, fifteen million dollars 1308 01:04:11,920 --> 01:04:14,320 Speaker 1: of Philip Morris money, is maybe a step beyond that. 1309 01:04:14,480 --> 01:04:18,120 Speaker 1: But, you know, people feel differently, you know. Like, 1310 01:04:18,280 --> 01:04:20,960 Speaker 1: should, is it fine to advertise vaping? I don't 1311 01:04:21,000 --> 01:04:27,120 Speaker 1: know, whatever.
What's less arguable is that after Epstein was charged, convicted, 1312 01:04:27,160 --> 01:04:30,080 Speaker 1: and sentenced in two thousand and eight, doctor Church continued 1313 01:04:30,080 --> 01:04:33,920 Speaker 1: his association with the, by this point, known sex criminal. Right, 1314 01:04:33,960 --> 01:04:36,960 Speaker 1: So, two thousand five to seven, we don't know if 1315 01:04:36,960 --> 01:04:38,800 Speaker 1: he was involved in the weird eugenics stuff. We know 1316 01:04:38,840 --> 01:04:42,000 Speaker 1: he's taking Epstein's money, but Epstein's not a known criminal, right, 1317 01:04:42,320 --> 01:04:46,160 Speaker 1: he could have been, you know, kind of innocent. He 1318 01:04:46,360 --> 01:04:52,080 Speaker 1: continues associating with Epstein repeatedly after he is convicted as 1319 01:04:52,080 --> 01:04:55,120 Speaker 1: a sex criminal in two thousand and eight, and that's 1320 01:04:55,560 --> 01:04:58,280 Speaker 1: crossing a line for me. Yeah. At one point it's. 1321 01:04:58,120 --> 01:05:01,320 Speaker 4: Like, I fear this man just lacks common sense. Nope. No, 1322 01:05:01,480 --> 01:05:05,640 Speaker 4: you should have known. This was an active decision to associate 1323 01:05:05,680 --> 01:05:08,360 Speaker 4: with one of the world's biggest monsters. 1324 01:05:08,600 --> 01:05:10,960 Speaker 1: You're, you're making a choice here, brother. 1325 01:05:11,840 --> 01:05:14,280 Speaker 2: He was like, nah, Jeffrey's awesome to me, and I'm 1326 01:05:14,280 --> 01:05:15,720 Speaker 2: gonna keep, I'm gonna keep hanging. 1327 01:05:16,160 --> 01:05:19,520 Speaker 1: Yeah. So, when Church's book ReGenesis came out in twenty twelve, 1328 01:05:19,600 --> 01:05:22,400 Speaker 1: it elevated his profile, and Epstein seems to have gotten 1329 01:05:22,400 --> 01:05:24,400 Speaker 1: back in touch with him soon after.
And this would 1330 01:05:24,400 --> 01:05:26,800 Speaker 1: have been, you know, after Epstein finished doing his quote 1331 01:05:26,880 --> 01:05:29,600 Speaker 1: unquote time, which doesn't really, that was not 1332 01:05:29,720 --> 01:05:33,200 Speaker 1: time by normal people's standards, right, like his slap on 1333 01:05:33,280 --> 01:05:35,680 Speaker 1: the wrist wasn't even a slap on the wrist, right. 1334 01:05:35,920 --> 01:05:36,520 Speaker 4: Yeah. 1335 01:05:36,560 --> 01:05:39,280 Speaker 1: And it's not clear to me when they got back 1336 01:05:39,280 --> 01:05:42,160 Speaker 1: in touch, or if they ever got out of touch 1337 01:05:42,400 --> 01:05:44,480 Speaker 1: after two thousand and seven. I don't even know that. 1338 01:05:45,240 --> 01:05:47,880 Speaker 1: Whatever the case, doctor Church has posted a public online 1339 01:05:47,920 --> 01:05:50,720 Speaker 1: calendar every year since nineteen ninety nine, and it shows 1340 01:05:50,720 --> 01:05:53,120 Speaker 1: that he had six separate phone calls or meetings with 1341 01:05:53,160 --> 01:05:57,800 Speaker 1: Epstein in twenty fourteen. Stat News writes, sample entry: June 1342 01:05:57,840 --> 01:06:01,160 Speaker 1: twenty first, twenty fourteen, lunch with Jeffrey Epstein, twelve to 1343 01:06:01,240 --> 01:06:04,520 Speaker 1: one thirty, Martin Nowak's institute. And that's a lot of 1344 01:06:04,560 --> 01:06:08,120 Speaker 1: times to talk to Jeffrey Epstein, right? That's a lot 1345 01:06:08,160 --> 01:06:10,000 Speaker 1: of... that's a long lunch too. 1346 01:06:10,200 --> 01:06:14,840 Speaker 2: Twelve to one. You guys were chatting. You were pushing it, huh.
1347 01:06:14,920 --> 01:06:18,240 Speaker 1: When interviewed after Epstein's death, doctor Church admitted to meeting 1348 01:06:18,240 --> 01:06:23,200 Speaker 1: Epstein several times each year since twenty fourteen, and Stat 1349 01:06:23,240 --> 01:06:25,560 Speaker 1: was like, didn't you hear that he'd been convicted of 1350 01:06:25,600 --> 01:06:28,520 Speaker 1: all those sex crimes? And, like, you're a father and 1351 01:06:28,600 --> 01:06:31,080 Speaker 1: a grandfather. Did it not skeeve you out to be 1352 01:06:31,200 --> 01:06:34,919 Speaker 1: involved with this guy? And Church replied, I did read 1353 01:06:34,920 --> 01:06:38,720 Speaker 1: a couple of news articles like ten years back, quote, 1354 01:06:38,880 --> 01:06:40,720 Speaker 1: but they weren't clear enough for me to know if 1355 01:06:40,760 --> 01:06:45,000 Speaker 1: there was a serious problem. Now, I should note here 1356 01:06:45,080 --> 01:06:47,280 Speaker 1: that reporting in two thousand and eight alleged that Epstein 1357 01:06:47,280 --> 01:06:54,680 Speaker 1: had received massages from teenage girls. You didn't know. You 1358 01:06:54,720 --> 01:06:57,200 Speaker 1: didn't know? Huh. You're a researcher. 1359 01:06:58,080 --> 01:07:01,960 Speaker 2: Like, that's a real... When we say teenage, how 1360 01:07:02,000 --> 01:07:02,720 Speaker 2: are we talking? 1361 01:07:02,880 --> 01:07:05,960 Speaker 1: Yeah, well, we're talking right now. When he was asked if 1362 01:07:06,000 --> 01:07:08,640 Speaker 1: he felt Epstein had paid his debt to society, Stat's like, 1363 01:07:08,720 --> 01:07:11,160 Speaker 1: did you think he'd, like, paid his debt to society 1364 01:07:11,240 --> 01:07:13,480 Speaker 1: after two thousand and eight and deserved a second chance? 1365 01:07:13,920 --> 01:07:16,640 Speaker 1: And I kind of, I really respect Stat for sitting 1366 01:07:16,680 --> 01:07:18,560 Speaker 1: down with this guy and kind of drilling him on this.
1367 01:07:19,160 --> 01:07:21,800 Speaker 1: Church responded with what I would call a non answer. 1368 01:07:21,840 --> 01:07:24,520 Speaker 1: So they're like, hey, so is it that you thought, 1369 01:07:24,600 --> 01:07:28,400 Speaker 1: you know, he'd made good, that everything was okay now? 1370 01:07:28,480 --> 01:07:30,600 Speaker 1: And he said, as far as I know, people just 1371 01:07:30,600 --> 01:07:35,440 Speaker 1: didn't have that conversation, but it should have. So, okay, 1372 01:07:35,720 --> 01:07:39,120 Speaker 1: let's break that down. He's asked, do you think that 1373 01:07:39,240 --> 01:07:41,800 Speaker 1: after two thousand and eight Epstein had paid his debt 1374 01:07:41,840 --> 01:07:44,800 Speaker 1: to society? And he said, as far as I know, 1375 01:07:45,440 --> 01:07:50,040 Speaker 1: people, not me, didn't have that conversation, but it should have. 1376 01:07:50,600 --> 01:07:54,480 Speaker 2: It's it, I guess the people. 1377 01:07:55,640 --> 01:07:57,720 Speaker 1: Your grammar should be better than that, man. But like, 1378 01:07:57,760 --> 01:07:58,880 Speaker 1: what do you what do you mean? 1379 01:08:00,320 --> 01:08:03,240 Speaker 2: He's like, I'm not going to answer for myself. Yeah 1380 01:08:03,360 --> 01:08:06,960 Speaker 2: you yeah, you the single body of people that exist 1381 01:08:07,040 --> 01:08:09,880 Speaker 2: around me should have had the conversation with Jeffrey. 1382 01:08:09,520 --> 01:08:12,680 Speaker 1: Webs okay man. Now, he went on to add in 1383 01:08:12,680 --> 01:08:16,320 Speaker 1: that interview, I would think I would think to like 1384 01:08:16,439 --> 01:08:19,880 Speaker 1: that people's reputation is multi dimensional and multi year. It 1385 01:08:19,920 --> 01:08:22,080 Speaker 1: takes a long time to build up but also to 1386 01:08:22,200 --> 01:08:25,360 Speaker 1: tear down, and Stat notes.
He was speaking generally and 1387 01:08:25,400 --> 01:08:28,960 Speaker 1: about himself, as in like, this shouldn't destroy my reputation 1388 01:08:29,200 --> 01:08:31,439 Speaker 1: because like, I've done other things. But it's kind of 1389 01:08:31,479 --> 01:08:34,439 Speaker 1: hard not to read that as I'm talking about Epstein too, like, well, 1390 01:08:34,520 --> 01:08:37,040 Speaker 1: he's a complicated guy. He's got other stuff that he's 1391 01:08:37,080 --> 01:08:38,920 Speaker 1: done besides the sex crimes. 1392 01:08:39,120 --> 01:08:41,000 Speaker 2: Yeah. 1393 01:08:41,040 --> 01:08:44,840 Speaker 1: Great, love an answer like that from our ethics man 1394 01:08:44,880 --> 01:08:49,479 Speaker 1: working on brain reading. For what it's worth, George Church 1395 01:08:49,520 --> 01:08:53,200 Speaker 1: did ultimately apologize for taking Epstein's money in a twenty 1396 01:08:53,320 --> 01:08:56,559 Speaker 1: nineteen interview, although you want to guess what he blamed 1397 01:08:56,560 --> 01:08:57,960 Speaker 1: his lapse in judgment on. 1398 01:08:59,280 --> 01:09:04,240 Speaker 2: Oh no, no, no, no, no, just tell us there's 1399 01:09:04,280 --> 01:09:07,559 Speaker 2: no way. Nerd tunnel vision. There we go. 1400 01:09:07,920 --> 01:09:09,720 Speaker 1: I'm just too much of a nerd to have a 1401 01:09:09,760 --> 01:09:11,000 Speaker 1: problem with sex crimes. 1402 01:09:11,560 --> 01:09:12,960 Speaker 2: WHOA, I didn't even notice. 1403 01:09:13,000 --> 01:09:15,839 Speaker 1: Plus, you know, you know how it goes. You're watching 1404 01:09:15,920 --> 01:09:19,800 Speaker 1: Star Wars, friends, trafficking teenage girls around the world. It 1405 01:09:19,960 --> 01:09:20,639 Speaker 1: just happens. 1406 01:09:21,120 --> 01:09:26,040 Speaker 2: Yeah, No, there is this this, there's again.
It's just 1407 01:09:26,040 --> 01:09:29,479 Speaker 2: this this want to like cutify themselves out of like 1408 01:09:29,560 --> 01:09:32,880 Speaker 2: the human experience. It's like, I'm just a nerdy, little 1409 01:09:33,400 --> 01:09:36,040 Speaker 2: cutie boy. I I didn't even notice that bad things 1410 01:09:36,120 --> 01:09:38,800 Speaker 2: were happening. Like, you know, you're a grown man who's 1411 01:09:38,840 --> 01:09:42,799 Speaker 2: trying to manipulate genes. This isn't You're not a sweetheart 1412 01:09:42,840 --> 01:09:43,160 Speaker 2: at all. 1413 01:09:43,680 --> 01:09:47,920 Speaker 1: Yeah, yeah, it's it's it's just very it's it's great stuff. 1414 01:09:48,200 --> 01:09:51,920 Speaker 1: Good work, good work, doctor Church. Now that's pretty bad, 1415 01:09:52,040 --> 01:09:55,559 Speaker 1: ending on Epstein in part one, not ideal. It gets 1416 01:09:55,560 --> 01:09:59,160 Speaker 1: so much worse in part two. There's so much eugenics coming. 1417 01:09:59,400 --> 01:10:02,720 Speaker 1: There's so much fucked up shit on the way. I 1418 01:10:02,720 --> 01:10:05,879 Speaker 1: am so excited to tell you the rest of this story, Langston. 1419 01:10:06,320 --> 01:10:10,640 Speaker 1: But first let's talk about you. You know, what's your 1420 01:10:10,640 --> 01:10:11,280 Speaker 1: favorite color? 1421 01:10:12,040 --> 01:10:13,639 Speaker 2: Favorite color is coral. 1422 01:10:14,040 --> 01:10:17,360 Speaker 1: Coral. I honestly didn't call that, Okay, I didn't actually 1423 01:10:17,360 --> 01:10:19,240 Speaker 1: call anything. I had no idea what your favorite color 1424 01:10:19,280 --> 01:10:19,640 Speaker 1: would be. 1425 01:10:19,760 --> 01:10:21,800 Speaker 2: Yeah, I think for years I used to say blue 1426 01:10:21,920 --> 01:10:26,320 Speaker 2: to protect myself from yeah, from my own insecurities. But 1427 01:10:26,360 --> 01:10:28,479 Speaker 2: then as the same I had to be honest and say.
1428 01:10:28,439 --> 01:10:29,280 Speaker 1: Not a safe place. 1429 01:10:29,439 --> 01:10:32,800 Speaker 2: My favorite color is nuanced and slightly effeminated. 1430 01:10:32,880 --> 01:10:34,920 Speaker 1: I guess I have been looking for a pair of 1431 01:10:35,000 --> 01:10:37,360 Speaker 1: coral shorts for the summer. It does seem like a nice, 1432 01:10:37,520 --> 01:10:40,599 Speaker 1: nice short color, you know. Yeah, yeah, yeah, my brother 1433 01:10:40,640 --> 01:10:42,720 Speaker 1: wears a lot of coral. It's a good color. We're 1434 01:10:42,720 --> 01:10:45,360 Speaker 1: all getting over our insecurities here, you know, just like 1435 01:10:45,560 --> 01:10:48,759 Speaker 1: George Church got over his insecurities about his friend Jeffrey Epstein. 1436 01:10:49,479 --> 01:10:51,439 Speaker 2: He's like, you know what, no, not like, I can 1437 01:10:51,520 --> 01:10:52,040 Speaker 2: get past this. 1438 01:10:52,600 --> 01:10:54,880 Speaker 1: Yeah, we can get past this, and we can get 1439 01:10:54,920 --> 01:10:56,840 Speaker 1: past the part where we talk about George Church to 1440 01:10:56,880 --> 01:10:59,599 Speaker 1: talk about your pluggables. What are they? Oh? 1441 01:10:59,760 --> 01:11:03,360 Speaker 2: You can listen to my podcast. It's called My Momma 1442 01:11:03,439 --> 01:11:05,719 Speaker 2: Told Me. I do it with my friend David Gborie, 1443 01:11:05,720 --> 01:11:11,679 Speaker 2: who is also an alumnus of this gorgeous podcast. And yeah, 1444 01:11:11,760 --> 01:11:16,080 Speaker 2: we talk about conspiracy theories, specifically black conspiracy theories, and 1445 01:11:16,120 --> 01:11:19,479 Speaker 2: it's really fun and silly, and we do not do nearly 1446 01:11:19,520 --> 01:11:21,519 Speaker 2: as effective research as you do, Robert.
1447 01:11:23,040 --> 01:11:26,480 Speaker 1: My only hope is that George Church gets integrated into 1448 01:11:26,720 --> 01:11:30,040 Speaker 1: a series of conspiracy theories about Jeffrey Epstein because everyone 1449 01:11:30,080 --> 01:11:33,439 Speaker 1: else who was tied to him has been And look, 1450 01:11:33,520 --> 01:11:35,240 Speaker 1: you know, are the all of those accurate? 1451 01:11:35,400 --> 01:11:35,519 Speaker 3: No? 1452 01:11:35,880 --> 01:11:39,800 Speaker 1: Are they all fun? Yes? And George miss out. 1453 01:11:40,000 --> 01:11:43,000 Speaker 2: You know, I will say there was that era on 1454 01:11:43,560 --> 01:11:47,200 Speaker 2: what is now X but formerly Twitter where they were 1455 01:11:47,240 --> 01:11:50,120 Speaker 2: just making up lists of the people who were on 1456 01:11:50,160 --> 01:11:53,240 Speaker 2: the flight logs, and it was always a funny list. 1457 01:11:53,680 --> 01:11:56,679 Speaker 2: It never failed, no matter who wrote it, the right, 1458 01:11:56,800 --> 01:12:00,960 Speaker 2: the left, the sickos, the imagineers, they were always funny 1459 01:12:00,960 --> 01:12:01,719 Speaker 2: lists of people. 1460 01:12:01,880 --> 01:12:04,320 Speaker 1: Oh yeah. One of my favorite things about that is 1461 01:12:04,360 --> 01:12:07,240 Speaker 1: just like, you know, maybe that's sketchy that uh fucking 1462 01:12:07,320 --> 01:12:09,719 Speaker 1: Eddie Murphy wound up in there, But I could also 1463 01:12:09,760 --> 01:12:12,040 Speaker 1: like it's a perfect thing for an Eddie Murphy movie 1464 01:12:12,040 --> 01:12:13,960 Speaker 1: where he just finds out he's been on this sex 1465 01:12:14,040 --> 01:12:18,800 Speaker 1: criminals plan a bunch of time, Like I'll watch that 1466 01:12:18,920 --> 01:12:20,040 Speaker 1: ninety minute comedy. 
1467 01:12:20,600 --> 01:12:22,760 Speaker 2: Like there were a few people on some of those 1468 01:12:22,760 --> 01:12:24,600 Speaker 2: flight logs where I was like, I don't think they 1469 01:12:24,680 --> 01:12:26,280 Speaker 2: knew what they got on the plane for. 1470 01:12:26,479 --> 01:12:28,599 Speaker 1: Yeah, you might have just been going to a thing 1471 01:12:28,680 --> 01:12:31,519 Speaker 1: with him, right, going to some sort of conference. 1472 01:12:31,200 --> 01:12:32,599 Speaker 2: Somebody said get on the PJ. 1473 01:12:33,000 --> 01:12:36,599 Speaker 1: Yeah, yeah, yea, I'll get in a private jet, sure. Yeah, 1474 01:12:37,360 --> 01:12:39,800 Speaker 1: it has. I will say one thing that I have 1475 01:12:40,000 --> 01:12:43,479 Speaker 1: learned as a result of this, because previously, before I 1476 01:12:43,520 --> 01:12:45,960 Speaker 1: knew anything about Jeffrey Epstein, if some rich guy had been like, Hey, 1477 01:12:46,040 --> 01:12:47,519 Speaker 1: we're going to pay you to go to a conference. 1478 01:12:47,560 --> 01:12:49,280 Speaker 1: You want to ride my private jet, I probably would 1479 01:12:49,280 --> 01:12:51,840 Speaker 1: have been like, yeah, fuck man, that sounds dope. You know, 1480 01:12:52,080 --> 01:12:54,479 Speaker 1: fucking twenty five year old me probably wouldn't have had 1481 01:12:54,520 --> 01:12:57,280 Speaker 1: the wherewithal to be like, I don't know, but now 1482 01:12:57,560 --> 01:12:58,439 Speaker 1: absolutely not. 1483 01:12:59,000 --> 01:13:02,240 Speaker 2: Someone asked me the other day if I would get 1484 01:13:02,280 --> 01:13:06,799 Speaker 2: on that Trump plane, the one that Qatar gave him. 1485 01:13:06,880 --> 01:13:10,040 Speaker 2: It's like, for the story alone, I kind of think 1486 01:13:10,000 --> 01:13:12,800 Speaker 1: I have to. Yeah, I know that one. Yes, yes, yes, 1487 01:13:13,320 --> 01:13:15,160 Speaker 1: that's justifiable for journalism.
1488 01:13:15,360 --> 01:13:17,760 Speaker 2: Yeah, I just got to ride this wave and deal 1489 01:13:17,800 --> 01:13:18,800 Speaker 2: with the fallout later. 1490 01:13:19,000 --> 01:13:21,000 Speaker 4: Yeah, I'm good, I'm staying home. 1491 01:13:21,400 --> 01:13:23,040 Speaker 2: You would stay at home, You wouldn't touch it. 1492 01:13:23,240 --> 01:13:24,920 Speaker 4: No, I'm good, I'm staying home. 1493 01:13:25,320 --> 01:13:27,080 Speaker 1: But no, folks, the lesson here is that if a 1494 01:13:27,200 --> 01:13:29,240 Speaker 1: rich guy wants to fly you and pay you to 1495 01:13:29,280 --> 01:13:32,799 Speaker 1: speak at some sort of weird conference, tell him first 1496 01:13:32,880 --> 01:13:37,679 Speaker 1: class from a real airline, right, you know, it's nice enough, 1497 01:13:38,240 --> 01:13:41,200 Speaker 1: and no one can be like, what if that was 1498 01:13:41,200 --> 01:13:44,200 Speaker 1: that Delta flight like implicating you in crime? The only 1499 01:13:44,280 --> 01:13:48,320 Speaker 1: thing that Delta flight implicates you in is crashing at Newark. Right, Sorry, 1500 01:13:48,320 --> 01:13:49,479 Speaker 1: that's that's bad. 1501 01:13:51,160 --> 01:13:53,719 Speaker 2: That was They said they'll do better. They said they'll 1502 01:13:53,720 --> 01:13:54,120 Speaker 2: do better. 1503 01:13:54,200 --> 01:13:56,280 Speaker 1: I said they'll do better. Everyone's got to do better, 1504 01:13:57,800 --> 01:13:58,160 Speaker 1: all right. 1505 01:13:58,240 --> 01:13:59,280 Speaker 4: That's the episode. 1506 01:13:59,680 --> 01:14:02,320 Speaker 1: Yeah. 1507 01:14:03,320 --> 01:14:06,040 Speaker 5: Behind the Bastards is a production of cool Zone Media. 
1508 01:14:06,400 --> 01:14:09,680 Speaker 5: For more from cool Zone Media, visit our website Coolzonemedia 1509 01:14:09,840 --> 01:14:13,040 Speaker 5: dot com, or check us out on the iHeartRadio app, 1510 01:14:13,120 --> 01:14:16,280 Speaker 5: Apple podcasts, or wherever you get your podcasts. Behind the 1511 01:14:16,280 --> 01:14:20,040 Speaker 5: Bastards is now available on YouTube, new episodes every 1512 01:14:19,840 --> 01:14:21,040 Speaker 4: Wednesday and Friday. 1513 01:14:21,360 --> 01:14:22,080 Speaker 1: Subscribe to our 1514 01:14:22,160 --> 01:14:25,800 Speaker 4: channel, YouTube dot com slash at Behind the Bastards