Speaker 1: We are defined in large part by the genomes that we happen to come to the table with. So what does it mean when we can edit our own genomes? What does this have to do with viruses, or copy-pasting, or whether we are going to modify the story of our own species? This is Inner Cosmos, and I'm David Eagleman. I'm a neuroscientist and author at Stanford, and in these episodes we sail deeply into our three-pound universe to uncover some of the most surprising aspects of our lives. Today's episode is about the remarkable situation we find ourselves in, which is that we now know how to read our biological inheritance. Now, this is very easy to take for granted, because for most of us this has been true for our whole lives. But it's a very recent ability for our species. It only began in the second half of the last century. In my postdoctoral fellowship, I worked with Francis Crick, who was the co-discoverer of the structure of DNA. In April of nineteen fifty-three, he and James Watson published a paper in Nature, and in just over a page they proposed the double helix structure of DNA, explaining how genetic information is stored and copied. Now, there's something about this paper that always makes me tear up when I read it, because the insight changed our world and almost certainly the future of our species. What they realized is that DNA is made of two complementary strands wound around each other, and the bases are paired, which just means A on one strand always links with T on the other, and C with G. And this gives a mechanism for making a Xerox copy of the whole thing, because you just unwind the two strands and then each serves as the template for sticking on new bases in the right spots.
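That pairing rule is simple enough to run as code. Here is a minimal Python sketch (an illustration with an invented sequence, ignoring strand directionality and the real enzymology):

```python
# Watson-Crick pairing: A <-> T, C <-> G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def template_copy(strand: str) -> str:
    """Assemble the complementary strand base by base against a template."""
    return "".join(COMPLEMENT[base] for base in strand)

original = "ATGCGTAC"              # arbitrary example sequence
partner = template_copy(original)  # "TACGCATG"

# Unwind the pair, copy each strand against itself, and the original returns:
assert template_copy(partner) == original
```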
Speaker 1: But what I want to emphasize is how new this is. Nineteen fifty-three isn't that long ago. World War Two was over, Eisenhower was president of the US, and by this point we already had automatic transmission in cars and color television and microwave ovens. But we had no idea why your ears look like your father's ears, or your eyes look like your mother's eyes. Just imagine this: before the discovery of the DNA code, how difficult it was to understand how inheritance actually happens biologically. People floated all kinds of wacky hypotheses about it, like maybe each sperm cell contained a super tiny embryo. But everyone knew these models didn't work. And even while people were driving cars and watching TV and microwaving, they knew that inheritance was a total unknown in science. And then that all changed with that very short paper that laid the foundation for modern genetics and molecular biology, and ultimately technologies like gene editing. And so that puts us where we are now. For as long as animals have walked this earth, we have been shaped by forces beyond our control: by the slow hand of evolution, by the accidents of mutation, by the blind winnowing of natural selection. The twisting helix of our DNA has been sculpted by nature's chisel. But now, for the first time, we are holding the chisel in our own hands. Just in the last nanosecond of evolutionary time, we now command gene editing technologies, things like CRISPR and single base pair editing tools and epigenetic editing tools and tools yet to be imagined, and these give us the power to rewrite the code of life itself. We can correct genetic disorders, we can eliminate inherited diseases, and we can presumably even enhance ourselves, pushing beyond the biological boundaries that we would have recently assumed are fixed. The rules of genetic inheritance were once immutable, but now they are revisable. So obviously, with this power come deep questions. If we can edit our genetic destiny, what should we choose to become? Do we cure only what ails us? Or do we optimize ourselves, enhancing features like intelligence and strength and longevity? How close are we to doing any of this? Does anything happen to the meaning of human struggle when suffering can be edited away? Will we remain the same species once we begin sculpting ourselves? And who decides? Is it the scientists at the lab bench, the policymakers in government, the parent holding their newborn in their arms? What are the ethical and social and existential questions of putting our genetic future in human hands? So in today's episode, we're going to step into the frontier of gene editing. What does it mean to be human when we are no longer bound by the limits of our biology? What stories will future generations tell about the choices that we make? Now the code of life is no longer written in stone, so what will we write? So I called my friend Trevor Martin, who is the co-founder and CEO of Mammoth Biosciences, which is based here in the San Francisco Bay Area. They are building the next generation of CRISPR products for editing the genome. If you've never heard of CRISPR or aren't sure what it is, hang tight, because we'll get to that in a minute. But the quick preview is: how do you build a platform to read and write the code of life? How do you make tools so you get something like a word processor for the genome, where you say, look, I just want to hit Control-X to cut a piece of the genome, or Control-V to paste it, and Control-F to find some sequence inside the genome.
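That word-processor picture can be made concrete with a toy sketch on plain strings (invented sequences; this illustrates the interface idea only, nothing about the actual biochemistry):

```python
genome = "AAACCCGGGTTT"  # stand-in genome

def find(genome: str, motif: str) -> int:
    """Control-F: index of the first occurrence of a motif, or -1 if absent."""
    return genome.find(motif)

def cut(genome: str, motif: str) -> tuple[str, str]:
    """Control-X: remove the first occurrence; return (remaining, clipboard)."""
    i = genome.find(motif)
    if i < 0:
        return genome, ""
    return genome[:i] + genome[i + len(motif):], motif

def paste(genome: str, fragment: str, at: int) -> str:
    """Control-V: splice a fragment in at the given position."""
    return genome[:at] + fragment + genome[at:]

print(find(genome, "GGG"))             # 6
remaining, clipboard = cut(genome, "CCC")
print(paste(remaining, clipboard, 0))  # CCCAAAGGGTTT
```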
Speaker 1: So here's my conversation with Trevor Martin.

Speaker 1: CRISPR technology is some of the most amazing technology we have, but it wasn't actually invented by humans. It was merely discovered by us. So tell us about that. Tell us about CRISPR.

Speaker 2: Yeah. So, similar to how we have immune systems, bacteria and small microbes, just tiny little organisms, have to defend themselves against invasion as well, typically by viruses, actually, much like us. And one of the ways that they evolved to do this is this thing called CRISPR, which has become famous, of course, for its ability to do genetic editing. But fundamentally, nature has used these CRISPR-type systems to protect themselves from viruses, very similar to how our adaptive immune system protects us from viruses.

Speaker 1: And we've kind of ripped that out...

Speaker 2: ...of nature and done a ton of engineering on top of it to turn it into these technologies that can do genomic engineering. But that's kind of fundamentally where it came from. And I think it's this beautiful example of leveraging billions of years of evolution and combining that with human ingenuity and a lot of hard work from a lot of scientists.

Speaker 1: So let me get this straight. CRISPR sits in, let's say, a single-cell organism; a virus injects some DNA, and it's got to figure out, hey, that's not mine, and then it cuts it up.

Speaker 2: Exactly. So there's a bunch of complicated steps there. First, it has to recognize, hey, that's not mine. Then it has to cut it up. And then actually there's a third step, which is, hey, I should remember this came in, and I want to make sure I protect against it in the future. And it's actually funny, that's where the name CRISPR itself comes from, that idea of "I want to protect against it in the future." So CRISPR is actually an acronym. It stands for clustered regularly interspaced short palindromic repeats, and this is actually kind of the memory of the cell that the CRISPR systems use to say, hey, these are viruses that have invaded me before, and I want to make sure they don't do it again.

Speaker 1: And this happens in unicellular organisms. That's so incredible. Okay. And so you started this company, Mammoth, with the idea to take these CRISPR systems and improve on what nature has done, or search for other ways of doing it. So give me a sense of how you do that.

Speaker 2: Yeah. So probably the most famous CRISPR system is this thing called Cas9. And this was what one of my co-founders, Jennifer Doudna, won the Nobel Prize for a few years ago: the work of really characterizing and developing this into a genomic engineering technology. And Cas9 was just one example, from a certain class of bacteria, of this type of CRISPR technology. And one of the fundamental insights of Mammoth is that actually there are all sorts of CRISPR technologies out there, because these are present in all sorts of microorganisms: bacteria, even large viruses, archaea. And our insight was, hey, we should look through all of these alternative versions of CRISPR that are not Cas9 and do a lot of work to develop those, and that could actually have a huge benefit for building genomic medicines, for building diagnostics, for improving agriculture. And that was kind of a fundamental insight that is maybe obvious today, but at the time people were so focused on Cas9 that people weren't really looking beyond that.

Speaker 1: So you're looking for things that have already been discovered by nature elsewhere, and you're looking for versions of that that do what you want in terms of gene editing.

Speaker 2: So the trick there is that you use nature as a starting point. And the unfortunate truth is that when you take these things and you rip them out of nature, actually usually they don't work at all, but they give you a starting point. And one of the fundamental insights we had was that it's not enough just to go into nature and say, ah, okay, what other alternative CRISPRs are there? You have to do that, and then you have to do a ton of engineering. And we actually have a whole floor of our building that has these liquid handling robots, which is what microfluidic handling is; they're just running tens of thousands of experiments at a time, and you just kind of grind away, and you can use the latest AI techniques, combining them with the latest in that handling. And it's only with the combination of all of that, with this kind of usually very wacky natural starting point. That's the secret sauce; one or the other is not enough. You have to really have both together. And that was the unique insight as well.

Speaker 1: Oh great, okay. And so you're looking for these smaller systems. And what's the reason to have them smaller?

Speaker 2: Yeah. So one of the giant challenges of the field is how do you deliver these systems to the cells that need them the most. So when you and I think about genetic medicine, what comes to mind, it's going to be things like Alzheimer's, Parkinson's, Huntington's, you know, these really debilitating disorders where they can be basically a death sentence, and it's something that's kind of known in your genome from birth. And the big problem, though, has been that what the genetic medicine field, the CRISPR field, has been focused on is a tiny subset of diseases that are typically not that. And that's amazing for those patients that have diseases that are, like, blood disorders or certain liver disorders, and there's amazing progress in this field. For example, there's now an approved therapy using CRISPR for sickle cell disease and beta thalassemia. Those are overlooked diseases with underrepresented populations, and that's a huge win for the field. But the promise of genetic medicine is not just blood disorders and liver disorders. The promise is to go to any cell in the body and do any kind of edit. And that really is what guides us at Mammoth. That means you need to go to the muscle, you need to go to the brain, you need to get to the heart. And that has been a gigantic challenge, and it's because Cas9 is actually a big protein. So obviously it's small relative to us, like all proteins, but it's really, really big on a kind of molecular scale. And one of the things that we did is we said, hey, could we create a CRISPR system that's not just a little bit smaller than Cas9, but way smaller? And that resulted in a thing we called NanoCas. And that was really exciting, and there was a lot of skepticism in the field, which is great for our patents; people were like, oh, these will never work. And it required a ton of work, and a lot of these robots doing a lot of work over time with our scientists. And what's cool, though, is that now we actually have data showing that in monkeys, which is a really high bar, it's not, you know, cell lines or mice, we actually get extremely good editing, either equivalent to or better than Cas9, with these really, really tiny systems. And these really tiny systems, unlike Cas9, can actually be delivered anywhere in the body, so that's a sea change in terms of what's possible.

Speaker 1: And they can be delivered by a virus, for example.

Speaker 2: Yeah. So a classic way that you can deliver to muscle or brain, for example, would be with a thing called AAV, which is another acronym; it stands for adeno-associated virus. And one of the big limitations of this is that it has a very strict size limit. You can think of it as like a semi trailer truck where you can only fit so much in the back, and Cas9 is just way too big to fit inside it. But these NanoCas-style systems don't just fit; they actually have a ton of room to spare.
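To put rough numbers on that truck analogy, here is a toy payload check. The capacity below is the commonly cited approximate single-AAV limit; the cassette sizes are illustrative assumptions for a nuclease plus promoter and guide, not Mammoth's actual figures:

```python
AAV_CAPACITY_BP = 4700  # approximate packaging limit of a single AAV, in base pairs

# Illustrative, assumed cassette sizes (nuclease + promoter + guide):
cassettes_bp = {
    "SpCas9-style cassette": 5400,           # over the limit: the classic problem
    "compact NanoCas-style cassette": 2600,  # hypothetical figure
}

for name, size in cassettes_bp.items():
    room = AAV_CAPACITY_BP - size
    if room >= 0:
        print(f"{name}: {size} bp, fits with {room} bp to spare")
    else:
        print(f"{name}: {size} bp, over the limit by {-room} bp")
```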
Speaker 2: And the room to spare is really important as well, because when you're a scientist, you can start to think really creatively about how to use it. And one of the key ways that we use it is to fit in the machinery to do different types of edits. So people may have heard of things like base editing or writing or epigenetic editing. These are all techniques that take the fundamental CRISPR system and say, hey, what if, instead of editing the genome as this word document and only being able to delete sentences, we could add a paragraph? What if we could spell-check a word? What if we could italicize a sentence? These are all kind of different types of edits you can do in the genome, and they require delivering even more machinery. And that means that Mammoth has really been the only company that's been able to deliver not just anywhere in the body, but also any kind of edit anywhere in the body. And I think you have to do both of those if you really want to address all genetic disease.

Speaker 1: And any kind of edit means reading or writing?

Speaker 2: Yeah. So we also had a lot of work we did on diagnostics as well, and during the pandemic we actually got emergency use authorization for a CRISPR-based COVID test. So we're super proud of the work we did there. The focus of the company today is very much on the writing side, but definitely I think there's huge potential on the reading side as well.

Speaker 1: Give us an example of the reading side.

Speaker 2: Yeah. So basically there, instead of using the CRISPR systems to change the DNA, you can use the CRISPR systems to send out some signal that a certain sequence is present. So, say, hey, I found the word "potato," and it'll glow green if it finds potato, and it won't show any color if it doesn't. And that's a very powerful kind of concept, and it means you could do really low-cost, high-accuracy molecular testing, and that's something that we're very bullish on long term. But as a company, obviously you have to focus on a certain area, and already, you know, trying to tackle all genetic diseases is hardly a limited focus to begin with. So that's kind of where we're focusing our efforts.

Speaker 1: Right. So let me come back to something you said. As far as the writing goes, you can write single base pairs, you can write something longer, you can write whole genes or collections of genes?

Speaker 2: Yeah. You could insert an entire gene. You could change a single base pair out of all the billions of base pairs in your genome. And that's an important philosophical point as well, because I think in biotech we often get so enamored with technology that we always think in terms of, ah, this is a base editing technology, or this is a gene writing technology, or this is a double-strand break technology. If you're a patient, you don't care, right? I have a disease, and I don't care how you're doing it. I just want you to cure or treat my disease; in our case, we can actually cure it. And I think that's where our philosophy is: we'll actually develop many techniques, all the techniques, and actually be able to deliver them. And then for any disease, we might try a couple of different ways of doing it. We might say, ah, there's a way to do it by base editing, there's a way to do it by epigenetic editing; we're not sure which one's going to be the best, but we'll try both and see which one actually works well. And that's a very different philosophy from typical biotech, where you try and create, like, ten companies, where each company is doing a different method. And I think that's all fine and well in some ways, maybe also from, like, how do you maximize investor involvement in different companies. But from a long-term company-building perspective, and from a patient perspective, I think that's very much the wrong way to go about it.

Speaker 1: Yeah. So tell us about diseases in the brain, and what you think is the future of those. Maybe give me a sense of three years, ten years, where we'll be with that.

Speaker 2: Yeah. So to start with a specific example, one that I think is frankly a condemnation of the genetic medicine space is Huntington's disease. So this is a disease that was mapped, oh my god, not just decades ago, like half a century ago, like, I think in the late eighties. And they mapped it with microsatellites, I believe, on giant gels in the lab. Right, this is, like, you know, not pre-computer, it's like Wexler, right? Yeah, but, you know, very, very early technologies. And we have understood, you know, fundamentally kind of the genetic basis of Huntington's for a very long time, and still today people die from this every single year. And it's a horrible disease where, basically, if you have a certain genomic sequence, then, you know, typically in your thirties, you'll have the accumulation of a certain protein and you'll pass away. And it's infuriating, honestly, that we understand the genetic cause of this and we can do nothing about it. So I think the genetic medicine space, and CRISPR in particular, will have arrived, and will have really, I think, delivered a hallmark of what the true potential is, the day the last Huntington's patient dies.

Speaker 1: Agreed. When is that, do you think?

Speaker 2: Yeah, so, I think it's definitely within sight. I'm not going to give a specific...

Speaker 1: Three years, or five, or ten?

Speaker 2: Yeah, it's probably not three, but it better not be ten. Okay, yeah, I think in general you can kind of see the steps that you need to take, and it's a matter of walking down the path.

Speaker 1: Got it. And what has been the problem? Given that we have CRISPR-Cas technology, and that we have known the gene for Huntington's, and it's monogenic, what has been the holdup?

Speaker 2: Yeah, so I think there are a lot of things you have to think about. One is what type of edit you are going to do. Like, can you just knock out the whole huntingtin gene, or do you have to think more creatively about modifying it in a more subtle way? Then the second one is, can you actually get the editing machinery to the cells of interest? Can you actually get this into the brain? And those are two of the most obvious problems that I think we've really thought deeply about at Mammoth, in the general sense, not just for Huntington's but for any brain disease. And I think that's where the technologies we've developed, like these ultracompact systems, and having all these different editing modalities, can make a huge difference, potentially.

Speaker 1: Got it. And so it sounds like you've got the "can we get it to the right cells" part, it sounds like you've got a bead on that. But as far as what edit to make, is that something you're experimenting with?

Speaker 2: Well, I think that's definitely something where we have a lot of great ideas about, like, different types of edits, and that's where we're very unique, because we can actually try different methods. We can say, hey, this is our hypothesis, prove it out or not, and not have a hammer where everything else has to look like a nail, which is very, very classic, not just in biotech but in deep tech in general. And, you know, the classic startup advice of find the problem first and then figure out the solution, there are a lot of reasons why that doesn't work in deep tech, right? Like, sometimes you really do have to kind of build the thing and then figure out what the best application is. But that being said, the more you can mitigate that, that's a very powerful idea: to get away from, you know, squinting at everything until it's a nail.

Speaker 1: Yeah. Let me make sure I understand. What's the way that you can test out these different hypotheses? Do you have an animal model of Huntington's?

Speaker 2: Yeah, so, not speaking specifically about Huntington's, but just generally in terms of different diseases: some diseases you'll have an animal model that's really good, and that means you can actually try it out, like in mice or maybe even monkeys, and really get a lot of confidence. For others, honestly, you don't have anything, maybe there are no animal models, so you have to try it out in cell lines and then kind of make your best guess about what's going to work. So it's very varied, I'd say, across different diseases and across different tissues, but you want to have as many shots on goal as possible, I think, because then when you go to humans, maybe you'll find something surprising. Like, you only had cell line work, and then it doesn't work in humans. And if you only have one technique, you're kind of out of luck. But if you have, you know, a backup, or a backup to the backup, that means you can actually go in and, you know, do something for patients after that.

Speaker 1: Got it. So that gives us a good sense of what the challenge is with something like Huntington's. Now, Huntington's is one gene: if you've got it, you're getting Huntington's. But what about other diseases, whether it's Alzheimer's or schizophrenia or whatever, that are polygenic and can involve lots of genes? Does that make the problem exponentially harder?

Speaker 2: Yeah, so that's a really, really good question. So, kind of building on your point about monogenic disease, the diseases where there's a single gene that causes it: there are, depending on how you count, let's say about four thousand of those that are kind of well understood, and that means you have your work cut out for you just in monogenic disease, and, you know, there's a lot of work to be done there. And the one thing I'll mention before moving on to the polygenic is that one of the beauties of CRISPR technology is that, unlike a lot of previous things that have happened in biotech, the first therapy you build with a CRISPR technology is the hardest, and then the second one gets easier, and the third one gets easier, and the fourth one gets easier. And that's very different from a lot of things, like, you know, small molecule development, where for every drug you kind of go back to the drawing board and you're like, okay, well, I've got to go through the whole process again, and, obviously, I've learned something, but I'm not going to shorten the process for the second small molecule, the third small molecule I make. And with CRISPR, that's very, very different, because you're using the same technology and you're switching out this thing that's called a guide RNA. You can kind of think of it as: you go to Google and you type into the search engine, and the guide RNA is what you're typing. So it's very facile to switch these things out.
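The search-engine analogy maps onto code directly: the protein machinery is the fixed engine, and retargeting is just swapping the query string. A simplified sketch (exact matching only; real guide targeting also requires an adjacent PAM site and tolerates some mismatches):

```python
def guide_matches(genome: str, guide: str) -> list[int]:
    """Every position where the guide pairs exactly with the genome."""
    n = len(guide)
    return [i for i in range(len(genome) - n + 1) if genome[i:i + n] == guide]

genome = "TTTACGGACTTTACG"              # invented sequence
print(guide_matches(genome, "TTTACG"))  # [0, 9]
# Retargeting the system is the same engine with a new query:
print(guide_matches(genome, "GGAC"))    # [5]
```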
Speaker 2: And I think that means that even though there are, you know, four thousand monogenic diseases, I think we have a real shot at tackling them all, because the first one is the hardest and it only gets easier from there. And that's very different from classic biotech.

Speaker 1: Now, how about the polygenic diseases?

Speaker 2: Right. So on the polygenic side, I think the main challenge there is, in an exciting way, going to become what to edit. So there are limitations to what we call multiplex editing. Right now, I think the state of the art would be, you know, you could reasonably do maybe three to five edits in one go, depending on which lab you want to take a cue from, and there's a lot of progress that can be made there, of course. But even for these things where you're trying to edit multiple genes, it's often very unclear. Even if you want to edit five things, and you said, hey, I can go edit five things, which five you should edit can be very, very tricky, schizophrenia being a classic example, even things like type two diabetes. And there I think there's a lot of progress that could potentially be made in terms of mapping these diseases and really understanding even, are these edits additive? Like, if I edit five things, am I getting the full benefit of every edit? Or maybe is there a sequence of edits that's going to be more beneficial if I do them in a certain kind of cohort together? And these are very complicated statistical questions, right, and it's very non-obvious what the answer is for many of these diseases. And that's where I think there's a lot of additional statistical work that could be done.
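One way to see why "are these edits additive" is a genuinely hard statistical question: if edits interact, the benefit of a combination is not the sum of the individual benefits. A toy model with invented numbers (the effect sizes and the interaction term are made up for illustration):

```python
# Hypothetical per-edit effect sizes and one interaction (epistasis) term.
effect = {"edit_A": 2.0, "edit_B": 1.5, "edit_C": 0.8}
interaction = {frozenset({"edit_A", "edit_B"}): -1.2}

def predicted_benefit(chosen: set[str]) -> float:
    total = sum(effect[e] for e in chosen)
    for pair, delta in interaction.items():
        if pair <= chosen:      # both interacting edits are present
            total += delta      # the combination deviates from the plain sum
    return total

print(predicted_benefit({"edit_A"}))            # 2.0
print(predicted_benefit({"edit_A", "edit_B"}))  # 2.3, not 3.5
print(predicted_benefit({"edit_A", "edit_C"}))  # 2.8, additive for this pair
```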
I 480 00:25:25,359 --> 00:25:29,000 Speaker 2: think these are relatively non controversial things where of course, 481 00:25:29,040 --> 00:25:32,639 Speaker 2: if you have an ability to cure people, yeah, you 482 00:25:32,960 --> 00:25:35,240 Speaker 2: probably should, or at least I feel very strongly personally 483 00:25:35,280 --> 00:25:38,160 Speaker 2: that you should. And then it gets into the realm 484 00:25:38,200 --> 00:25:40,480 Speaker 2: of I think, you know, people like to think about, oh, well, 485 00:25:40,600 --> 00:25:43,200 Speaker 2: what about things that are not necessarily diseases, but maybe 486 00:25:43,200 --> 00:25:46,600 Speaker 2: you want to improve, like whether that's uh, you know, kids, 487 00:25:46,640 --> 00:25:48,760 Speaker 2: athleticism or intelligentism and intelligence. 488 00:25:48,760 --> 00:25:49,760 Speaker 1: These are the classic ones. 489 00:25:49,800 --> 00:25:53,600 Speaker 2: And as someone that did a lot of work on 490 00:25:53,640 --> 00:25:56,520 Speaker 2: the kind of genetic side of the equation, I think 491 00:25:56,680 --> 00:26:00,000 Speaker 2: one thing that's often lost here is that there's no intelligence. 492 00:26:00,400 --> 00:26:02,880 Speaker 2: Just to be clear, Like the gains you can get 493 00:26:02,920 --> 00:26:05,119 Speaker 2: even from doing a lot of it, it's on the 494 00:26:05,119 --> 00:26:07,639 Speaker 2: intelligence side are kind of shockingly minimal. 495 00:26:08,640 --> 00:26:11,800 Speaker 1: Agreed. Although, although if we fast forward twenty years and 496 00:26:11,880 --> 00:26:15,320 Speaker 1: you've figured out, hey, polygenetically, here's some very clever AI 497 00:26:15,400 --> 00:26:17,600 Speaker 1: way to test this and try that, maybe we'll find 498 00:26:17,640 --> 00:26:18,520 Speaker 1: out Oh it's. 499 00:26:18,600 --> 00:26:20,919 Speaker 2: Yeah, it's definitely definitely possible. I think there's a bit 500 00:26:20,960 --> 00:26:23,080 Speaker 2: of a holy war in terms of, you know, the 501 00:26:23,200 --> 00:26:26,200 Speaker 2: environmental component versus the innate genetic component. 502 00:26:25,920 --> 00:26:27,840 Speaker 1: But the innate genetics doesn't hurt right. 503 00:26:27,960 --> 00:26:30,640 Speaker 2: Right, So let's put that aside for a second and say, okay, 504 00:26:30,720 --> 00:26:33,159 Speaker 2: let's just same. There's been progress made, and we have 505 00:26:33,280 --> 00:26:35,240 Speaker 2: some better understanding maybe of at least what are the 506 00:26:35,280 --> 00:26:38,720 Speaker 2: best possible dits you could make. I think there it's 507 00:26:38,720 --> 00:26:40,919 Speaker 2: going to be really interesting because if you zoom out, 508 00:26:41,400 --> 00:26:44,000 Speaker 2: I feel like this is, first of all, this is 509 00:26:44,000 --> 00:26:46,640 Speaker 2: a question beyond any individual and beyond any company. It's 510 00:26:46,640 --> 00:26:49,000 Speaker 2: really kind of a society level question where there's you know, 511 00:26:49,119 --> 00:26:52,480 Speaker 2: religious and you know, ethical and kind of personal and 512 00:26:52,520 --> 00:26:56,680 Speaker 2: of course corporate kind of viewpoints here. But I think 513 00:26:56,840 --> 00:26:59,320 Speaker 2: you've seen this in other deep tech areas. You see 514 00:26:59,320 --> 00:27:02,639 Speaker 2: it with AI right now, every country is going to 515 00:27:02,760 --> 00:27:05,320 Speaker 2: have kind of a different view on this. 
And I 516 00:27:05,359 --> 00:27:08,280 Speaker 2: think the really interesting thing when you start thinking about 517 00:27:08,320 --> 00:27:12,159 Speaker 2: human biology, which we all share, of course, is that 518 00:27:12,560 --> 00:27:16,320 Speaker 2: these decisions very much are not in isolation, right, And 519 00:27:16,600 --> 00:27:18,840 Speaker 2: it does make you wonder if one country is more 520 00:27:19,000 --> 00:27:21,639 Speaker 2: willing to kind of go down some of these paths 521 00:27:21,680 --> 00:27:25,160 Speaker 2: that other countries might find less ethical, does that create 522 00:27:25,160 --> 00:27:28,359 Speaker 2: an imperative for other countries just to fall along to 523 00:27:28,440 --> 00:27:31,520 Speaker 2: stay competitive. And I don't know what The answer to 524 00:27:31,560 --> 00:27:34,960 Speaker 2: that is, but I think that's the part that seems 525 00:27:35,000 --> 00:27:38,520 Speaker 2: like it might be most complicated, honestly, is different countries 526 00:27:38,560 --> 00:27:41,119 Speaker 2: will come to different conclusions, and there's definitely been a 527 00:27:41,160 --> 00:27:42,840 Speaker 2: lot of work to try and come to like international 528 00:27:42,880 --> 00:27:46,600 Speaker 2: consensus around these things. But in general, I think that's 529 00:27:46,640 --> 00:27:48,520 Speaker 2: going to be the trickiest pressure is that even if 530 00:27:48,600 --> 00:27:50,520 Speaker 2: let's say in the United States, we make, you know, 531 00:27:50,600 --> 00:27:53,159 Speaker 2: certain decisions around this is the line, We're not going 532 00:27:53,240 --> 00:27:56,159 Speaker 2: to do edits for intelligence, but we are going to 533 00:27:56,240 --> 00:27:58,880 Speaker 2: do edits for anything that's you know, classified as a disease. 534 00:28:00,040 --> 00:28:03,200 Speaker 2: Maybe another country decides, hey, actually I want to give 535 00:28:03,240 --> 00:28:08,159 Speaker 2: my population super charging powers to whatever extent I can, 536 00:28:08,200 --> 00:28:09,639 Speaker 2: and maybe it's a minimal extent, but we're going to 537 00:28:09,680 --> 00:28:12,400 Speaker 2: try our vest I think that creates a pretty interesting 538 00:28:12,520 --> 00:28:15,399 Speaker 2: situation geopolitically about like how do you handle that? 539 00:28:15,880 --> 00:28:19,640 Speaker 1: Yeah, agreed, once we're a species that knows how to 540 00:28:20,000 --> 00:28:24,639 Speaker 1: modify our code. Yes, So the geopolitical things, what do 541 00:28:24,680 --> 00:28:26,560 Speaker 1: you think it means just in terms of what it 542 00:28:26,640 --> 00:28:30,240 Speaker 1: means to be a human? How does that change? I mean, 543 00:28:30,320 --> 00:28:32,600 Speaker 1: let's say we're thinking fifty years in the future here, 544 00:28:34,840 --> 00:28:37,520 Speaker 1: and you get to choose everything about your kids and 545 00:28:38,200 --> 00:28:42,160 Speaker 1: perhaps yourself in some ways. But how does that change society? 546 00:28:42,440 --> 00:28:44,560 Speaker 2: Yeah, I think it brings to mind one of my 547 00:28:44,560 --> 00:28:49,520 Speaker 2: favorite movies, Gatica. I'm sure you've seen it. 
548 00:28:49,680 --> 00:28:51,800 Speaker 1: I have seen it, but I thought that was it 549 00:28:51,840 --> 00:28:53,960 Speaker 1: was silly in the sense that, just as a reminder 550 00:28:53,960 --> 00:28:58,480 Speaker 1: to the listeners, your genes predispose you to some particular 551 00:28:58,600 --> 00:29:00,960 Speaker 1: career in that movie, and if you have these genes, 552 00:29:00,960 --> 00:29:03,600 Speaker 1: are going to be this kind of janitor or whatever 553 00:29:03,640 --> 00:29:08,080 Speaker 1: as opposed to the astronaut. And of course, with the 554 00:29:08,160 --> 00:29:12,840 Speaker 1: nature nurture debate, that's totally dead because it's always both so, 555 00:29:13,040 --> 00:29:14,920 Speaker 1: but yeah, tell me why it reminded you of Ghika. 556 00:29:15,000 --> 00:29:16,880 Speaker 2: Yeah, because the part there was most salient to me, 557 00:29:17,000 --> 00:29:20,120 Speaker 2: I think was this idea that let's say one child 558 00:29:20,320 --> 00:29:22,760 Speaker 2: was kind of quote unquote natural born, the other child 559 00:29:22,920 --> 00:29:25,880 Speaker 2: was given these certain advantages to whatever degree at birth, 560 00:29:26,320 --> 00:29:29,480 Speaker 2: and more to the point, being natural born was just 561 00:29:29,520 --> 00:29:31,920 Speaker 2: a disadvantage you weren't allowed to apply to be like 562 00:29:31,920 --> 00:29:34,760 Speaker 2: an astronaut, like doors were just closed to you, basically, 563 00:29:35,360 --> 00:29:38,480 Speaker 2: And I think that is a very kind of concerning 564 00:29:39,040 --> 00:29:40,520 Speaker 2: world to live. I don't want to live in that 565 00:29:40,560 --> 00:29:43,960 Speaker 2: world in all honesty and I think you could maybe 566 00:29:44,040 --> 00:29:47,080 Speaker 2: have like some rationalist argument about why maybe you should 567 00:29:47,120 --> 00:29:49,360 Speaker 2: do that, but I think just morally and ethically, that 568 00:29:49,520 --> 00:29:52,200 Speaker 2: feels just really horrible to me. And the whole point 569 00:29:52,200 --> 00:29:54,560 Speaker 2: of the movie was that actually, no matter how good 570 00:29:54,680 --> 00:29:57,280 Speaker 2: the sciences at that point, there's just certain factors that 571 00:29:58,640 --> 00:30:01,479 Speaker 2: make that a silly choice, just the resilience of the 572 00:30:01,480 --> 00:30:05,840 Speaker 2: individual and their ability to overcome the challenges they had genetically. 573 00:30:05,920 --> 00:30:08,000 Speaker 2: And I think that's a very that's a message that 574 00:30:08,080 --> 00:30:10,520 Speaker 2: really resonates with me, because I think even fifty years 575 00:30:10,520 --> 00:30:12,280 Speaker 2: from now, there's gonna be many things we don't understand 576 00:30:12,280 --> 00:30:14,440 Speaker 2: about our biology. And I think if you try and 577 00:30:14,480 --> 00:30:17,240 Speaker 2: overrationalize these things and shut doors because like oh, someone 578 00:30:17,280 --> 00:30:19,760 Speaker 2: had this genotype and not that one, just inherently you're 579 00:30:19,800 --> 00:30:21,600 Speaker 2: going to be missing things and you're not going to 580 00:30:21,640 --> 00:30:23,120 Speaker 2: be actually as rational as you think. 581 00:30:23,400 --> 00:30:25,640 Speaker 1: Well, that's true, but maybe one hundred and two years 582 00:30:25,680 --> 00:30:28,120 Speaker 1: from now will be less bad at that. It would 583 00:30:28,160 --> 00:30:30,800 Speaker 1: be just by Devil's advocate. 
It would be like saying, 584 00:30:32,160 --> 00:30:36,400 Speaker 1: you know, athletes on anabolic steroids won't ever be 585 00:30:36,400 --> 00:30:39,000 Speaker 1: better than natural athletes. But they will be; they can 586 00:30:39,040 --> 00:30:41,240 Speaker 1: lift more and so on. Well, yeah, now there's gonna 587 00:30:41,240 --> 00:30:46,600 Speaker 1: be the Enhanced Olympics, I suppose. Yeah, exactly. But 588 00:30:46,600 --> 00:30:50,840 Speaker 1: that doesn't affect the career choices like in Gattaca. 589 00:30:51,760 --> 00:30:53,880 Speaker 2: Yeah, I think where I get 590 00:30:54,280 --> 00:30:57,800 Speaker 2: squeamish is when it comes down to, like, the individual choice, 591 00:30:57,840 --> 00:31:00,280 Speaker 2: because going on the athletics example, you can choose to 592 00:31:00,680 --> 00:31:03,120 Speaker 2: take performance-enhancing drugs or not, right, and that either 593 00:31:03,120 --> 00:31:05,680 Speaker 2: shuts or opens doors for you. I think the thing 594 00:31:05,680 --> 00:31:07,280 Speaker 2: that's more human to me is you don't choose how 595 00:31:07,280 --> 00:31:09,520 Speaker 2: you're born, right? That's a choice that's made for you 596 00:31:09,680 --> 00:31:11,880 Speaker 2: in a lot of different ways. And I think that's 597 00:31:11,960 --> 00:31:15,000 Speaker 2: where the type of world that I don't want to 598 00:31:15,040 --> 00:31:16,960 Speaker 2: live in is where things that are truly outside of 599 00:31:16,960 --> 00:31:20,200 Speaker 2: your control, in terms of, like, your genome before 600 00:31:20,240 --> 00:31:23,200 Speaker 2: you're even born, determine kind of how you live. And 601 00:31:23,240 --> 00:31:28,360 Speaker 2: that's not something I can reconcile personally. 602 00:31:29,120 --> 00:31:31,920 Speaker 1: I mean, interestingly, we're already in that situation, right? Which 603 00:31:31,960 --> 00:31:34,680 Speaker 1: is that there's a genetic lottery and you show up 604 00:31:34,680 --> 00:31:38,400 Speaker 1: in the world with advantages or disadvantages. But now it's 605 00:31:38,440 --> 00:31:40,640 Speaker 1: just a matter of whether your parents made the right 606 00:31:40,920 --> 00:31:43,480 Speaker 1: payments and edited. 607 00:31:43,520 --> 00:31:45,440 Speaker 2: Well, and I think that's an important point as well. 608 00:31:46,440 --> 00:31:47,960 Speaker 2: I don't want to live in a world where there's 609 00:31:48,000 --> 00:31:52,360 Speaker 2: a massive stratification between the rich and the poor, not 610 00:31:52,480 --> 00:31:54,560 Speaker 2: just from, like, a starting point of, okay, these 611 00:31:54,560 --> 00:31:56,600 Speaker 2: are the opportunities available to you in terms of schooling, 612 00:31:56,600 --> 00:31:58,480 Speaker 2: but, oh, this is the starting point for you in 613 00:31:58,560 --> 00:32:02,080 Speaker 2: terms of your genome. Yeah, and we 614 00:32:02,120 --> 00:32:05,440 Speaker 2: talk about the inheritance of wealth; well, you literally inherit 615 00:32:05,520 --> 00:32:08,320 Speaker 2: your genes. Yes. So that's something that could be a 616 00:32:08,440 --> 00:32:13,040 Speaker 2: very durable advantage in some ways if it's not, you know, 617 00:32:13,120 --> 00:32:15,160 Speaker 2: something that we think about very deeply before we go 618 00:32:15,200 --> 00:32:16,800 Speaker 2: down that path. If we go down that path.
619 00:32:17,000 --> 00:32:19,880 Speaker 1: So following up on that, can we change features in 620 00:32:20,040 --> 00:32:22,560 Speaker 1: adults? As in, you make a change, again in the 621 00:32:22,600 --> 00:32:25,080 Speaker 1: future, where you decide, hey, now I want to do, 622 00:32:25,240 --> 00:32:28,440 Speaker 1: like, the equivalent of performance-enhancing drugs, and do that 623 00:32:28,520 --> 00:32:29,200 Speaker 1: at any age? 624 00:32:29,440 --> 00:32:32,320 Speaker 2: Yeah. And I think that's kind of where most 625 00:32:32,360 --> 00:32:34,760 Speaker 2: of the focus is now. Like, well, first of all, 626 00:32:34,760 --> 00:32:37,080 Speaker 2: for disease, of course. You know, let's say someone was 627 00:32:37,080 --> 00:32:39,840 Speaker 2: born with Huntington's and now they're in their twenties; like, 628 00:32:39,920 --> 00:32:42,719 Speaker 2: you want to help them out and really, you know, 629 00:32:42,720 --> 00:32:45,120 Speaker 2: prevent them from having any problems in their thirties. I 630 00:32:45,160 --> 00:32:46,720 Speaker 2: think for a lot of diseases, one of the key 631 00:32:46,760 --> 00:32:49,280 Speaker 2: questions you have to answer is, like, when do you 632 00:32:49,320 --> 00:32:51,520 Speaker 2: have to intervene? For some diseases, maybe it could be 633 00:32:51,600 --> 00:32:54,560 Speaker 2: very early in life. Maybe there are certain biological processes that 634 00:32:54,680 --> 00:32:56,920 Speaker 2: just take place early in life and you need to 635 00:32:56,960 --> 00:32:58,840 Speaker 2: do the edit before then, or else maybe you have 636 00:32:58,880 --> 00:33:01,160 Speaker 2: to come up with some other way of reversing the disease. 637 00:33:01,480 --> 00:33:05,440 Speaker 2: Fortunately, it seems like for maybe most diseases you 638 00:33:05,440 --> 00:33:08,080 Speaker 2: can edit in adulthood and that can actually have a 639 00:33:08,280 --> 00:33:10,880 Speaker 2: very material effect on the disease. But that isn't going 640 00:33:10,960 --> 00:33:12,120 Speaker 2: to necessarily always be true. 641 00:33:12,320 --> 00:33:15,120 Speaker 1: Okay, got it. And so coming back to this conversation 642 00:33:15,200 --> 00:33:20,320 Speaker 1: about what society will become, it may be that some 643 00:33:20,360 --> 00:33:22,640 Speaker 1: of it is not an issue that you're born with 644 00:33:22,680 --> 00:33:24,040 Speaker 1: and you have to deal with, but that you make 645 00:33:24,080 --> 00:33:27,479 Speaker 1: a choice, just like anabolic steroids, that people are doing 646 00:33:27,480 --> 00:33:30,040 Speaker 1: it that way. Do you suppose that people are going 647 00:33:30,080 --> 00:33:34,680 Speaker 1: to be looking for things that involve longevity as one 648 00:33:34,680 --> 00:33:36,040 Speaker 1: of their first aims with this? 649 00:33:36,760 --> 00:33:38,640 Speaker 2: Yeah, I mean, obviously there's a huge amount of interest 650 00:33:38,680 --> 00:33:41,480 Speaker 2: in longevity. I have a personal interest in living a 651 00:33:41,520 --> 00:33:44,520 Speaker 2: healthy life for a long time, anyway. I think I 652 00:33:44,560 --> 00:33:46,720 Speaker 2: definitely fall more on the spectrum of, like, health span 653 00:33:46,880 --> 00:33:48,840 Speaker 2: versus life span. If I'm living to be two hundred, 654 00:33:48,920 --> 00:33:52,400 Speaker 2: but I'm, you know, decrepit and can't remember my kids, 655 00:33:52,520 --> 00:33:55,120 Speaker 2: that's not a life I personally want to live.
But 656 00:33:55,120 --> 00:33:56,720 Speaker 2: if I live to a very healthy one hundred and 657 00:33:56,760 --> 00:33:58,640 Speaker 2: ten and, you know, I die in my sleep, 658 00:33:58,720 --> 00:34:02,440 Speaker 2: that sounds great to me. So I think longevity is 659 00:34:02,440 --> 00:34:05,440 Speaker 2: definitely an area where gene editing could play a huge 660 00:34:05,480 --> 00:34:07,720 Speaker 2: role, in terms of, you know, what are the processes: 661 00:34:07,840 --> 00:34:09,920 Speaker 2: is it, like, you know, some sort of mutation accumulation in 662 00:34:10,000 --> 00:34:13,080 Speaker 2: certain areas that's causing these cells to age in 663 00:34:13,120 --> 00:34:15,360 Speaker 2: the way they are, and could we reverse that with 664 00:34:15,600 --> 00:34:18,800 Speaker 2: genetic editing approaches? I think that is a very reasonable 665 00:34:18,840 --> 00:34:23,640 Speaker 2: and potentially promising line of research. I think that, in general, 666 00:34:24,480 --> 00:34:27,000 Speaker 2: some of the biggest longevity things we can do today, though, 667 00:34:27,120 --> 00:34:31,720 Speaker 2: are, like, you know, heart disease, cancer obviously 668 00:34:31,760 --> 00:34:33,440 Speaker 2: being a big one, and those are of course directly 669 00:34:33,800 --> 00:34:36,560 Speaker 2: addressable by genetic editing as well. And it goes to 670 00:34:36,640 --> 00:34:39,920 Speaker 2: interesting questions of, like, should we edit millions of people 671 00:34:39,960 --> 00:34:42,080 Speaker 2: in the US to reduce their risk of heart disease 672 00:34:42,160 --> 00:34:44,680 Speaker 2: by thirty percent? That could save tons and tons of lives. 673 00:34:45,120 --> 00:34:48,360 Speaker 2: It's a relatively large intervention for them, so obviously 674 00:34:48,400 --> 00:34:50,920 Speaker 2: you'd have questions about safety. So far, CRISPR seems to 675 00:34:50,960 --> 00:34:54,000 Speaker 2: be very, very safe, at least based on the trials 676 00:34:54,040 --> 00:34:56,440 Speaker 2: in humans so far. So that's something we could do. 677 00:34:56,600 --> 00:34:58,400 Speaker 2: Is that something we want to do as a society? 678 00:34:58,719 --> 00:35:00,000 Speaker 1: What would be the pros and cons? 679 00:35:00,360 --> 00:35:03,520 Speaker 2: So the pro would be you reduce, let's say, heart 680 00:35:03,560 --> 00:35:07,080 Speaker 2: attack and all-cause mortality, essentially, in a huge 681 00:35:07,120 --> 00:35:10,480 Speaker 2: segment of the population. The con would be there's going 682 00:35:10,520 --> 00:35:12,799 Speaker 2: to be some expense associated with that. Do we want 683 00:35:12,800 --> 00:35:14,959 Speaker 2: to pay that expense as a society? Maybe it saves 684 00:35:15,040 --> 00:35:17,440 Speaker 2: us money in the long term, because both people are 685 00:35:17,440 --> 00:35:19,440 Speaker 2: more productive and you're also spending less on kind of 686 00:35:19,480 --> 00:35:20,880 Speaker 2: end-stage care in the hospital. 687 00:35:21,800 --> 00:35:24,000 Speaker 1: And you'd see that as a government program that's paid 688 00:35:24,040 --> 00:35:26,120 Speaker 1: for, as opposed to individuals saying, hey, I'm going to 689 00:35:26,120 --> 00:35:26,480 Speaker 1: pay for this?
690 00:35:26,600 --> 00:35:28,640 Speaker 2: Well, I think that's a very interesting question as well, 691 00:35:28,640 --> 00:35:31,560 Speaker 2: and I think you could see stratification there, of maybe, 692 00:35:31,600 --> 00:35:34,480 Speaker 2: you know, frankly, wealthier individuals choosing to get a lot 693 00:35:34,480 --> 00:35:38,200 Speaker 2: of these types of technologies, whereas, you know, is that 694 00:35:38,239 --> 00:35:41,040 Speaker 2: available to everyone? That's the kind of choice for society. 695 00:35:41,320 --> 00:35:42,800 Speaker 2: And then of course the big one would be safety. 696 00:35:42,800 --> 00:35:45,239 Speaker 2: You only want to do that if it's extremely safe, right, 697 00:35:45,280 --> 00:35:49,200 Speaker 2: because you're trying to prevent disease kind of in aggregate, 698 00:35:49,960 --> 00:35:52,880 Speaker 2: so it needs to be extraordinarily safe for each individual. 699 00:36:10,280 --> 00:36:12,719 Speaker 1: Do you think, if you're interested in longevity, do you 700 00:36:12,719 --> 00:36:15,399 Speaker 1: think you're going to see this in your lifetime, where 701 00:36:15,480 --> 00:36:18,720 Speaker 1: there are edits that can be made, let's say forty 702 00:36:18,760 --> 00:36:20,719 Speaker 1: years from now, where you say, hey, this is great, 703 00:36:20,760 --> 00:36:22,960 Speaker 1: I'm going to live longer, I'm going to expand this? 704 00:36:23,520 --> 00:36:26,000 Speaker 2: Yeah, it's an interesting question. I wouldn't be surprised if 705 00:36:26,000 --> 00:36:29,320 Speaker 2: there are things that can maybe get you ten to twenty 706 00:36:29,320 --> 00:36:32,560 Speaker 2: percent further along. Are we going to see, like, a doubling? 707 00:36:33,280 --> 00:36:35,400 Speaker 2: I mean, that'd be great, sure. I'm not going to 708 00:36:35,400 --> 00:36:40,640 Speaker 2: root against it. I would be surprised, frankly. We've 709 00:36:40,719 --> 00:36:43,200 Speaker 2: run a natural experiment: there are billions of humans on Earth 710 00:36:43,239 --> 00:36:46,600 Speaker 2: with all sorts of genomes. No one seems to live 711 00:36:46,640 --> 00:36:49,200 Speaker 2: over one hundred and ten, one hundred and twenty, let's say. 712 00:36:49,880 --> 00:36:52,920 Speaker 2: That doesn't mean, you know, it's not possible, but I 713 00:36:52,960 --> 00:36:55,200 Speaker 2: think that one of the areas where we'll 714 00:36:55,200 --> 00:36:57,800 Speaker 2: see way more progress is, like, health span within 715 00:36:57,840 --> 00:36:59,960 Speaker 2: that period. So, like, if you can live to an 716 00:37:00,040 --> 00:37:02,640 Speaker 2: extraordinarily healthy one hundred and ten where you're hiking mountains and 717 00:37:02,680 --> 00:37:05,520 Speaker 2: you're having fun with your kids, that's already a huge 718 00:37:05,560 --> 00:37:08,520 Speaker 2: extension; that's like a twofold extension of life relative 719 00:37:08,560 --> 00:37:11,080 Speaker 2: to what a lot of people really kind of practically experience. 720 00:37:12,120 --> 00:37:14,440 Speaker 2: And I think GLP-1s and things like that, the 721 00:37:14,480 --> 00:37:18,440 Speaker 2: obesity drugs from Lilly and Novo and others, I 722 00:37:18,440 --> 00:37:23,319 Speaker 2: think that that's also kind of shown that there 723 00:37:23,360 --> 00:37:26,440 Speaker 2: can be these kind of giant mass markets in biotech.
724 00:37:26,520 --> 00:37:28,440 Speaker 2: And longevity is the other one that everyone always kind 725 00:37:28,480 --> 00:37:30,120 Speaker 2: of thinks about, but there are others as well. You can 726 00:37:30,200 --> 00:37:34,400 Speaker 2: imagine things that mitigate, like, alcohol consumption, or, you know, 727 00:37:34,440 --> 00:37:35,880 Speaker 2: other areas. I don't think GLP-1s are 728 00:37:35,880 --> 00:37:37,600 Speaker 2: going to be the only kind of trillion-dollar drug 729 00:37:37,640 --> 00:37:39,960 Speaker 2: in biotech, but I think it's the first, and I 730 00:37:40,000 --> 00:37:42,800 Speaker 2: think hopefully that kind of inspires everyone else in biotech, 731 00:37:43,040 --> 00:37:45,360 Speaker 2: including us at Mammoth, to really think more broadly and 732 00:37:45,400 --> 00:37:47,880 Speaker 2: more fundamentally about, okay, what are the things we can 733 00:37:47,920 --> 00:37:51,520 Speaker 2: do that actually change society for the better overall, not 734 00:37:51,600 --> 00:37:54,280 Speaker 2: just necessarily for specific... you've got to start with specific 735 00:37:54,320 --> 00:37:56,719 Speaker 2: things and knock out rare diseases. But I think in 736 00:37:56,760 --> 00:37:58,680 Speaker 2: terms of, like, the long-term vision of Mammoth, I 737 00:37:58,680 --> 00:38:00,560 Speaker 2: think that becomes very, very exciting. 738 00:38:01,760 --> 00:38:05,040 Speaker 1: And I've been throwing out numbers like three years, five years, ten. 739 00:38:05,239 --> 00:38:09,920 Speaker 1: But what do you see, given your incredibly specific view on 740 00:38:10,040 --> 00:38:12,160 Speaker 1: what is happening right now, what you're capable of doing? 741 00:38:13,520 --> 00:38:15,920 Speaker 1: What is the time scale that we're talking about? 742 00:38:16,400 --> 00:38:19,080 Speaker 2: Yeah, so, I mean, going back to the beginning of 743 00:38:19,120 --> 00:38:22,920 Speaker 2: the conversation: already, today, people are being treated for sickle 744 00:38:22,920 --> 00:38:24,520 Speaker 2: cell disease using CRISPR. 745 00:38:24,360 --> 00:38:25,200 Speaker 1: That's today. 746 00:38:25,960 --> 00:38:28,800 Speaker 2: I think over the next five years, for the tissues 747 00:38:28,840 --> 00:38:31,760 Speaker 2: that are kind of most easily accessible, like blood and liver, 748 00:38:32,800 --> 00:38:34,960 Speaker 2: people are going to be shocked at how much progress 749 00:38:35,040 --> 00:38:38,279 Speaker 2: is made at curing rare diseases in those areas. And 750 00:38:38,320 --> 00:38:39,840 Speaker 2: these are kind of things where it's like, it happens 751 00:38:39,840 --> 00:38:42,160 Speaker 2: slowly and then it happens all at once. It's very 752 00:38:42,200 --> 00:38:45,160 Speaker 2: much like that. And then I think over the next 753 00:38:45,160 --> 00:38:49,560 Speaker 2: ten years, you're going to, because of, you know, Mammoth, be 754 00:38:49,600 --> 00:38:53,080 Speaker 2: able to see us knocking out diseases in, like, muscle 755 00:38:53,120 --> 00:38:54,920 Speaker 2: and brain and all the other tissues of the body. 756 00:38:55,400 --> 00:38:58,040 Speaker 2: And I think it's going to take time, and there's 757 00:38:58,080 --> 00:38:58,359 Speaker 2: a lot 758 00:38:58,239 --> 00:38:59,440 Speaker 1: of diseases to go through.
759 00:38:59,719 --> 00:39:01,760 Speaker 2: But I think if we're sitting here a couple decades 760 00:39:01,760 --> 00:39:04,760 Speaker 2: from now and people are still suffering from rare disease, 761 00:39:04,760 --> 00:39:07,040 Speaker 2: we've done something incredibly wrong; like, things have gone off 762 00:39:07,080 --> 00:39:09,960 Speaker 2: the rails somewhere, and that's insane. Like, to live in 763 00:39:10,000 --> 00:39:12,680 Speaker 2: a world where genetic disease is really a thing of 764 00:39:12,719 --> 00:39:16,359 Speaker 2: the past would be transformative for those patients, but also, 765 00:39:16,400 --> 00:39:20,080 Speaker 2: I think, for society generally. And I think once you've 766 00:39:20,080 --> 00:39:23,719 Speaker 2: done that, that's when, in parallel, because now you're 767 00:39:23,719 --> 00:39:25,400 Speaker 2: showing the safety, now you're showing that this is an 768 00:39:25,400 --> 00:39:29,200 Speaker 2: effective technology, I think you can start to dream even bigger. 769 00:39:29,239 --> 00:39:30,640 Speaker 2: You can start to think about, like, you know, how 770 00:39:30,680 --> 00:39:34,000 Speaker 2: do you go after reducing cardiovascular risk? How do you 771 00:39:34,000 --> 00:39:37,400 Speaker 2: go after even more esoteric things, like thinking about health span? 772 00:39:37,600 --> 00:39:41,640 Speaker 2: Like, there are certain people that have short sleep, and they 773 00:39:41,680 --> 00:39:43,800 Speaker 2: actually only need to sleep, like, three to four hours 774 00:39:43,800 --> 00:39:45,319 Speaker 2: a night. Like, what if we could all do that? 775 00:39:45,680 --> 00:39:45,799 Speaker 1: Right? 776 00:39:45,880 --> 00:39:49,560 Speaker 2: Like, just really thinking about how do we transform society 777 00:39:49,600 --> 00:39:53,920 Speaker 2: for the better in kind of more aggregate ways. That's really, 778 00:39:54,000 --> 00:39:56,120 Speaker 2: long term, where I get very excited. 779 00:39:55,880 --> 00:40:00,680 Speaker 1: Unlike traditional medicine, which addresses symptoms of disease, gene editing 780 00:40:00,680 --> 00:40:04,680 Speaker 1: technology in general could change really fundamental aspects of who 781 00:40:04,719 --> 00:40:07,040 Speaker 1: we are. And so what do you think this does 782 00:40:07,080 --> 00:40:09,480 Speaker 1: to personal identity? 783 00:40:09,600 --> 00:40:12,280 Speaker 2: Yeah, I think, to your point, this is a very different 784 00:40:12,320 --> 00:40:15,560 Speaker 2: way of doing medicine. Like, often medicine is treating symptoms, 785 00:40:15,560 --> 00:40:18,759 Speaker 2: often very effectively, but really treating symptoms, whereas this is 786 00:40:18,800 --> 00:40:23,320 Speaker 2: going into the core, into your DNA, and really solving 787 00:40:23,360 --> 00:40:24,440 Speaker 2: the problem at its root. 788 00:40:24,640 --> 00:40:26,840 Speaker 1: Right. But I mean not just helping with disease, but 789 00:40:27,040 --> 00:40:30,560 Speaker 1: enhancing humans, changing who we are. So, yeah, what does 790 00:40:30,560 --> 00:40:33,439 Speaker 1: that do to our sense of personal identity?
791 00:40:33,520 --> 00:40:38,359 Speaker 2: Yeah, I think it's definitely something where we have a deep 792 00:40:38,640 --> 00:40:41,719 Speaker 2: kind of personal identity, obviously, in terms of how we think, 793 00:40:41,800 --> 00:40:46,440 Speaker 2: how we act, how we dress. And I think long term, 794 00:40:46,480 --> 00:40:49,160 Speaker 2: just truly thinking, like, one hundred years here, maybe we 795 00:40:49,200 --> 00:40:51,560 Speaker 2: start to think more of our genomes as our clothing, right? 796 00:40:51,640 --> 00:40:54,200 Speaker 2: Like, we think of our clothing as representing our personal identity, 797 00:40:54,640 --> 00:40:56,640 Speaker 2: and that's fine, and that's something people embrace, and people 798 00:40:56,640 --> 00:40:58,680 Speaker 2: have all sorts of different styles and things they do. 799 00:40:59,400 --> 00:41:01,319 Speaker 2: But right now, that's not very much how we think 800 00:41:01,320 --> 00:41:04,200 Speaker 2: of our genomes. But if genomes become malleable, and they 801 00:41:04,200 --> 00:41:06,640 Speaker 2: become something where you really do have this option of 802 00:41:07,000 --> 00:41:10,320 Speaker 2: kind of choosing what things you want to keep or change, 803 00:41:10,840 --> 00:41:13,759 Speaker 2: maybe we start to see that more as kind of an 804 00:41:14,320 --> 00:41:17,360 Speaker 2: extension of ourselves. Not core in the sense of 805 00:41:17,360 --> 00:41:19,440 Speaker 2: it can never change, but core in the sense of: 806 00:41:19,520 --> 00:41:21,640 Speaker 2: I want that to reflect who I am as an individual, 807 00:41:21,680 --> 00:41:23,000 Speaker 2: and I have the choice of how that looks. 808 00:41:23,160 --> 00:41:25,120 Speaker 1: Oh, that's great. Yeah, in the same way that someone 809 00:41:25,200 --> 00:41:26,640 Speaker 1: might go off and get blonde hair for a while, 810 00:41:26,640 --> 00:41:28,520 Speaker 1: and red hair, and dark hair, and change the color of their 811 00:41:28,560 --> 00:41:31,279 Speaker 1: eyes or whatever. There are ways of doing this stuff now, 812 00:41:31,360 --> 00:41:34,399 Speaker 1: and there can be better ways, or more direct ways, 813 00:41:34,400 --> 00:41:36,080 Speaker 1: of doing it. What do you suppose that means in 814 00:41:36,160 --> 00:41:39,279 Speaker 1: terms of motivation and earning something? Like, for example, I 815 00:41:39,320 --> 00:41:41,440 Speaker 1: go to the gym and I lift weights. It's really hard. 816 00:41:42,520 --> 00:41:43,960 Speaker 1: But if I could go and... 817 00:41:44,320 --> 00:41:47,360 Speaker 2: Sure, you get your myostatin gene blasted and you just 818 00:41:47,440 --> 00:41:49,960 Speaker 2: have huge muscles from sitting at home on the couch. 819 00:41:50,600 --> 00:41:53,120 Speaker 2: I think that's an interesting question at the society level, 820 00:41:53,160 --> 00:41:55,799 Speaker 2: because why do we care about muscles? Why do we 821 00:41:55,840 --> 00:41:58,440 Speaker 2: care about other things? Like, at other times in society, 822 00:41:58,640 --> 00:42:01,239 Speaker 2: having pale skin has been something that people value, 823 00:42:01,280 --> 00:42:03,120 Speaker 2: I mean, still today in many places, and it's because, oh, 824 00:42:03,280 --> 00:42:04,560 Speaker 2: you don't have to go work in the farms, and 825 00:42:04,560 --> 00:42:06,560 Speaker 2: that means you're a high-class individual with a lot 826 00:42:06,560 --> 00:42:09,600 Speaker 2: of money.
Or in some societies, even today and in 827 00:42:09,640 --> 00:42:12,759 Speaker 2: the past, being very overweight was a sign of high 828 00:42:12,760 --> 00:42:16,839 Speaker 2: status, because you're not starving. And I think that what 829 00:42:16,880 --> 00:42:18,840 Speaker 2: that could mean, and this is just, like, you know, 830 00:42:19,080 --> 00:42:25,879 Speaker 2: purely hypothesizing, is maybe being healthy remains an important thing, 831 00:42:25,920 --> 00:42:27,719 Speaker 2: but it's not a reflector of status in the same 832 00:42:27,719 --> 00:42:29,640 Speaker 2: way it is today, which would be awesome. Like, what 833 00:42:30,400 --> 00:42:34,040 Speaker 2: if actually everyone is walking around super healthy? Yeah, amazing. 834 00:42:34,920 --> 00:42:37,520 Speaker 2: But maybe that means that there's less of a status 835 00:42:37,520 --> 00:42:39,480 Speaker 2: differentiator of, oh, this person has time to go to 836 00:42:39,520 --> 00:42:42,279 Speaker 2: the gym and they work very hard, so it'll be something else. 837 00:42:42,480 --> 00:42:45,319 Speaker 2: Because we're humans, right? We love, frankly, it seems, to 838 00:42:45,360 --> 00:42:47,680 Speaker 2: stratify ourselves based on status. So some new thing will 839 00:42:47,680 --> 00:42:48,880 Speaker 2: come up and it'll be like, oh, they have a 840 00:42:48,880 --> 00:42:51,680 Speaker 2: blue hat, that's the sign of, you know, high status, 841 00:42:51,760 --> 00:42:53,480 Speaker 2: and we could pass judgment on whether that's good 842 00:42:53,560 --> 00:42:56,240 Speaker 2: or bad. But I think that's probably what would happen: 843 00:42:56,800 --> 00:42:59,440 Speaker 2: it just means that a society is better 844 00:42:59,480 --> 00:43:01,520 Speaker 2: off overall because everyone's very healthy and has the right 845 00:43:01,560 --> 00:43:05,200 Speaker 2: amount of muscle, but there's a different status game being 846 00:43:05,200 --> 00:43:07,319 Speaker 2: played in terms of, you know, when you go to 847 00:43:07,480 --> 00:43:08,800 Speaker 2: the club on a Friday or whatever. 848 00:43:13,560 --> 00:43:17,840 Speaker 1: That was Trevor Martin, co-founder and CEO of Mammoth Biosciences. 849 00:43:18,239 --> 00:43:22,279 Speaker 1: We talked about the emerging tools that allow us to 850 00:43:22,520 --> 00:43:26,840 Speaker 1: edit life at its most fundamental level. The conversation, and 851 00:43:26,880 --> 00:43:29,839 Speaker 1: the tools he's building, allow us to look at 852 00:43:29,880 --> 00:43:32,920 Speaker 1: the near future, in which we can slice out chunks 853 00:43:32,960 --> 00:43:37,720 Speaker 1: of the genome, or rewrite individual letters, or even fine-tune 854 00:43:37,800 --> 00:43:41,719 Speaker 1: the expression of genes without altering the sequence. And 855 00:43:41,800 --> 00:43:46,440 Speaker 1: eventually this gives us finer and finer control over our 856 00:43:46,560 --> 00:43:50,640 Speaker 1: biological destiny, in ways that we're only beginning to understand. 857 00:43:51,120 --> 00:43:53,719 Speaker 1: Because these technologies are not only going to give us 858 00:43:53,760 --> 00:43:58,120 Speaker 1: the chance to fix mutations; they're going to expand what 859 00:43:58,320 --> 00:44:01,919 Speaker 1: is possible. They will redefine the relationship between who 860 00:44:01,960 --> 00:44:05,680 Speaker 1: we are and who we might become.
Where we stand 861 00:44:05,719 --> 00:44:08,960 Speaker 1: now is in a field of question marks that no 862 00:44:09,200 --> 00:44:13,880 Speaker 1: single thinker can answer alone. Questions about the line between 863 00:44:14,120 --> 00:44:21,160 Speaker 1: therapy and enhancement, between responsibility and hubris, between embracing the 864 00:44:21,200 --> 00:44:26,000 Speaker 1: malleability of life and respecting everything we don't understand about it. 865 00:44:26,840 --> 00:44:29,640 Speaker 1: Only one thing seems certain at this point: we are 866 00:44:29,680 --> 00:44:34,479 Speaker 1: no longer just passengers on evolution's very long and winding road. 867 00:44:35,040 --> 00:44:39,480 Speaker 1: We are taking the wheel, and with that power comes 868 00:44:39,520 --> 00:44:43,520 Speaker 1: a great responsibility: the weight of choices that are going 869 00:44:43,560 --> 00:44:47,640 Speaker 1: to ripple through generations, the weight of choices that will 870 00:44:47,680 --> 00:44:51,680 Speaker 1: shape the genetic landscape of those who come after us. 871 00:44:52,200 --> 00:44:54,360 Speaker 1: In the end, our role is going to be to 872 00:44:54,400 --> 00:44:58,080 Speaker 1: learn how to annotate the book of life with great care, 873 00:44:58,960 --> 00:45:03,759 Speaker 1: to correct its most tragic errors while preserving the poetry 874 00:45:03,880 --> 00:45:07,319 Speaker 1: of its imperfections. We now see the genome not as 875 00:45:07,360 --> 00:45:11,400 Speaker 1: a finished book, but instead as a draft in progress, 876 00:45:11,840 --> 00:45:16,479 Speaker 1: and that compels us to constantly ask ourselves: how much 877 00:45:16,520 --> 00:45:20,240 Speaker 1: of the story should we change? What kind of story 878 00:45:20,560 --> 00:45:23,239 Speaker 1: do we want to tell? And how can we be 879 00:45:23,840 --> 00:45:31,920 Speaker 1: the most careful custodians of the possibilities? Go to Eagleman 880 00:45:31,960 --> 00:45:35,040 Speaker 1: dot com slash podcast for more information and to find 881 00:45:35,120 --> 00:45:38,960 Speaker 1: further reading. Send me an email at podcast at eagleman 882 00:45:39,040 --> 00:45:42,440 Speaker 1: dot com with questions or discussion, and check out and 883 00:45:42,480 --> 00:45:45,719 Speaker 1: subscribe to Inner Cosmos on YouTube for videos of each 884 00:45:45,760 --> 00:45:51,120 Speaker 1: episode and to leave comments. Until next time, I'm David Eagleman, 885 00:45:51,320 --> 00:46:04,560 Speaker 1: and this is Inner Cosmos.