Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. Welcome to this week's program, where we're going to talk about gene editing, designer babies, and the future of science. These are all topics of importance all the time, but they've become more pressing since the announcement that a Chinese scientist had actually used CRISPR-Cas9, the leading cutting-edge editing technology, to edit the genomes of two embryos to ensure that they would not be susceptible to the HIV virus. This was good for the embryos, but it wasn't necessary scientifically, and it's led to lots of intense discussion about whether the era of designer babies was too close and needed to be headed off by regulation. To discuss these and closely related issues, we're super fortunate to have with us George Church. George is the Robert Winthrop Professor of Genetics at Harvard Medical School. He also teaches at MIT. He's been a central actor in the development of the technologies of gene editing and in applying them to the creation of new genomes. He sometimes, in fact, calls himself a genome engineer. George, thank you so much for joining us.

George Church: It's a great pleasure to be here, Noah.

Noah Feldman: Let's just start with the headline, in order to make sense of what it really means and whether there are dangers associated with it of the kind that many imagine. We read in the paper that He Jiankui, a Chinese scientist, says, or has it said about him, that he used CRISPR-Cas9 to engineer the DNA of a couple of babies, and that they were then subsequently carried to term. That's what appears to be new here. The Chinese government's not very happy with him, and he has been under house arrest and radio silence more or less since.
From the standpoint of the state of CRISPR technology, was there anything remarkable about the accomplishment, assuming it was accomplished, or is it a relatively trivial step compared to the scientific advances that had already been in place?

George Church: Yeah, I would think similar things had been done by other groups, including groups in the United States and Oregon on human embryos, and even more amazing things had been done on other mammalian embryos, including altering dozens of genes. So here the claim was only the altering of one gene. You know, there are some interesting questions about choice of genes. One of the things that's challenging in this field is picking a gene that even a few other people would agree is a good choice of gene, because, you know, a fair number of genes could be fixed by other methods. You have to be in an in vitro fertilization, or IVF, clinic to do this at all, and if you're in that clinic, you could fix it with just IVF preimplantation genetic testing, PGT.

Noah Feldman: So when it comes to the question of choosing the target of the intervention, what are the right criteria, or what do you see as the main criteria people would consider? I mean, you've mentioned one: could you get there by some other means, could you get there without changing the germ line? But what are other criteria that you think would be relevant here?

George Church: So in addition to whether you could get there without altering the germ line, there's also the criterion of whether it is a net positive for that person, or, in a public health sense, on average a net positive, and that's environmentally dependent. If you're in an environment where everybody gets HIV and dies young, that's a very different environment than one where nobody gets HIV. And China is not ground zero for the highest level of HIV in the world. It is highly stigmatized there, but, you know, places in Africa have a higher incidence of it and would be better candidates.
Another one is, you know, cost is a consideration if you want something that's going to be of true public health benefit, because a disease like this impacts a lot of impoverished individuals. The alternatives are drugs or safe sex; there are a few alternatives, none of which are working perfectly worldwide, because a million people die per year. It's about two percent of all human deaths. So to claim that this is a solved problem is just as naive as saying that this is the only way to solve the problem.

Noah Feldman: Right, it's not a solved problem in real-world descriptive terms, even though in principle it could be solvable. So what about the criterion that asks whether something is presently an immediate threat to either the embryo or the local population, as opposed to offering some background improvement for the potential person who might come to be created, or for the environment more generally? Does that seem like a meaningful difference to you? And just to show why I'm asking that question: it goes to the broader societal fear about what are sometimes called designer babies, you know, babies who might have their genomes edited to give them some advantage, which could be a health advantage to begin with, but could also be other kinds of advantages, whether in aesthetics, or intelligence, or athletics, or what have you.

George Church: I think this is a very interesting entanglement of concerns. So the concerns are, number one, that we will create a monoculture, where everyone is the same, the way that you might have, you know, many square kilometers of identical crops. The second is that it will be inequitably distributed, in other words that some people have it and some people don't. That's a very different concern, because it's not that you don't want anybody to have it; it's that you want everybody to have it, okay.
A third one is you could make a mistake: it could be a very popular fad, but it was a mistake that has long-term consequences, either for the people who had it or the people who didn't get it. There could be stigmatization issues in either direction.

Noah Feldman: Yeah, it could be bad not to have had the intervention, or to have had it.

George Church: Absolutely. And there are a few more things that get tangled up here. Sometimes it's phrased as blonde hair, blue eyes, which doesn't make any sense at all. You know, that's not a public health threat. It's unlikely to have long-term consequences even if it were a monoculture. It's not a monoculture that threatens society.

Noah Feldman: But it's not an accident that people use that as the example. What they're invoking when they say that is they're associating genetic engineering with the eugenics movement of the eighteen eighties and nineties and the early twentieth century, which was not, by any stretch of the imagination, popular only in Germany, but all over the world, including very much the US.

George Church: Yeah, the United States kept doing it long after World War Two. I think it was into the early nineteen seventies.

Noah Feldman: And eugenics, though this wasn't the only line of eugenic thinking at all, married itself up, at least in the European context, with now long discredited racialized theories. That's what people mean, I think, when they say that.

George Church: Absolutely, absolutely. And furthermore, there are all kinds of body image issues, shaming and so forth, that it conjures up, some of which are racially independent. You know, within a race you'll have, you know, obesity shaming and many other features that determine your status in society, your socioeconomic status, and hence your health status. So even though it's not health explicitly, it has ramifications in that direction. So some of those can be addressed. You know, equitable distribution: we could bring down the price, as we have for various technologies. The smallpox vaccine has made that disease extinct, and so it's basically free now.
So there are technologies whose cost comes down pretty quickly toward zero, so you can take some of these things, not necessarily off the table, but you can put them aside in a separate category. But another one, and it falls a bit under equitable distribution, is that you could create people with something that is not necessarily racial; it's just a new capability, one that I believe the human population at present doesn't have. And enhancement is something that we should feel very familiar with, because we are enormously enhanced relative to our ancestors. We're not actually fearful of enhancement, as far as I can tell.

Noah Feldman: But do you mean incrementally enhanced? I mean, if you look at, I don't know, the fastest hundred-meter dash times, they're getting faster, though presumably they're also approaching a natural limit.

George Church: But I can beat a hundred-meter dash runner any day in my jet.

Noah Feldman: But that's a capacity enhancement that is not embodied. We have developed tools that enable us to do things, but they're not quite the same as being incorporated. It's an interesting question why we treat them differently, because they are heritable in a certain sense, culturally.

George Church: Culturally inheritable. But culture is an inheritance I consider in many ways more threatening and more rapid than DNA-based inheritance. You know, it's much more likely that my daughter will have my cell phone than that she will have my facial features, thankfully. So we get hung up on this sort of DNA obsession. If we actually knew less science, oddly, we might draw less of a line between these. Right, if we didn't know how we inherited these things, if we didn't really understand culture and technology and DNA, they would kind of look similar.

Noah Feldman: Oh, then it's your fault for helping us understand it so well. Sorry. Well, you're making a really interesting observation that culture has a wide range of effects. It's in some broad sense heritable.
But I want to add that it's also subject to the full range, nearly every one, of the serious risks that you described with respect to genetic editing. In fact, in many cases, the way we come up with our fear is we look at the distortions or the injustices produced by culture and we say, well, oh boy, you know, genetic editing might have the same effect. So societal inequity: check on that. There are also these off-target effects, which we haven't yet gotten to, but that's, broadly speaking, the idea that if we edit one thing, there will be unforeseen effects somewhere else in the genome. Check on that: in culture, all the time, we improve one thing and we make other things much worse. So unintended consequences: check on that. Go ahead.

George Church: Yeah, I mean, the early adopters sometimes are more exposed to the unintended consequences, the off-targets, and so they'd essentially be signing up with their extra dollars to become the first guinea pigs.

Noah Feldman: So it seems like there's some meaningful debate within the scientific community about how concerned we should be about particular desirable genetic edits. Assume we're avoiding a disease, that there's no other way to avoid it, that we fit your other criteria. There seems to be some disagreement between one camp of scientists who say, because we don't know for sure what the off-target effects might be of a given intervention, we should proceed extremely slowly and carefully, and another camp of scientists, again this is a continuum, who seem to say, we can measure off-target effects like we can measure anything else, and if we reach the point that off-target effects are less likely to occur from a genetic edit than they are in nature, then it's absurd to be so worried about it. And I'm fascinated by this question, because as a layman I have no idea how to go about answering it.
George Church: Right. Well, probably the most glib answer to it is that this is the responsibility of the Food and Drug Administration in the United States and its equivalents in other countries. And it happens with every single technology to which people are exposed, certainly the medical technologies, whether medical devices or small-molecule drugs or protein drugs. Every category has off-target effects, which are physiological. So you need to do as much as you can theoretically, followed by as much as you can with animal models or human cells in culture, and then you move on very cautiously to one person to do phase one toxicity, and then efficacy, and then make sure that it's really ready for prime time, and you scale up the size of the cohort cautiously so you expose the minimum number of people to risk, for good theoretical reasons. So this combination of theory and cautious testing is what protects us from all our new technologies.

Noah Feldman: Is the timescale particularly challenging in the case of genetic editing? Because you're editing an embryo, but it might be that the off-target effect doesn't express itself, typically, until much later in life, so that it might take much longer to find out what the potential off-target effects are.

George Church: Absolutely. You know, preventive medicine is a nice buzzword, but it's very hard to develop powerful preventive medicines, because you're testing them on people that are healthy. And it's not restricted to embryo editing; it's fetal surgery with actual microscalpels and things, and morning-sickness drugs like thalidomide, which failed with disastrous effects. Extreme caution is required when you're dealing with preventive medicine. For example, if you take chemotherapy as an adult woman, you could be affecting your germ line in ways that you also won't see until your babies are born, and maybe their babies are born.

Noah Feldman: Right. You used some interesting words there which maybe are characteristic of you, although I don't think of them as characteristic of you, namely the words "extreme caution."
So I want to ask you about your hedge about the FDA. Of course, institutionally, we assign those decisions in our democracy to the FDA, but that doesn't necessarily tell us what criteria the FDA ought to apply, or, more to the point, how cautious the FDA should be, or how risk-taking the FDA should be. I mean, real human beings, ideally scientists and statisticians, sitting in the FDA have to make these decisions on the basis of what data is known. So I guess what I wanted to ask you is, if I'm right that there is this continuum of how much risk we should take in the scientific community around gene editing, where do you fall on that continuum? The words "extreme caution" sound like they put you among the "really, let's not rush this" side.

George Church: I tend to fall on the... I worry about everything, and I want everybody else to. I don't want to reassure people, necessarily. I'm not in a big rush for most things, even aging reversal, even though I'm... I'm not in a big rush, though, so don't worry about it. I'm not in a big rush. But that said, I don't fall into the camp of we should put as many barriers in the way as possible so that it never happens, or be so vague about it that we can never be satisfied as to what the criteria are for letting it go forward. And it is a continuum, and I think the FDA does an admirable job of prioritizing. So if you have a very serious disease that is going to kill you tomorrow, they have a different threshold than if you have a healthy baby who has every expectation of being healthy, and you're going to give it something that might extend its life by ten years, eighty years from now. That barrier: it's very hard to get permission to do that study.
Noah Feldman: And rightly so. There's a line of thought that says, using the FDA as one pole of what you describe as pretty admirable cost-benefit weighing, and then, on the other side, looking at China, where at least in one instance there seems to have been a lot of risk-taking, it says: look, it's all well and good to say that the FDA should be cautious, but out there in the world, outside the reach of more cautious and maybe typically democratic (although that doesn't have to follow) governments, we're going to get lots of innovation, lots of risk-taking, and we'll fall behind if we listen to the FDA or we allow the FDA to be cautious. Or, alternatively, it says we ought to be very aggressively going out and trying to export our own limitations by pressuring foreign governments to make sure that they crack down to a greater extent. How do you think about that phenomenon? Science is increasingly globalized; as prices come down, as scientists train across borders, there's no in-principle reason that a country with a reasonable scientific infrastructure can't make all kinds of innovations under a very different regulatory environment.

George Church: Well, first of all, I would say that the Chinese FDA, the CFDA, is very similar. I think they have a similar risk profile. I think their government, whether you want to call it hyper-capitalistic or less than perfectly democratic, it doesn't really matter. The point is they are capable of cracking down better than we do, and often do. So in the United States it's very hard to interfere with industry, not because they run Congress or anything, but because there's an obsession with freedom. And the Chinese government can crack down on things that are unsafe in ways that are very difficult to do otherwise. So, for example, cleaning up Beijing for the Olympics: that's something that I don't think we could have done. And they're very technocratic too.
I mean, I'd heard that a huge fraction, maybe eighty percent, of their top politicians have degrees in science or engineering, and not because they're failed scientists or engineers, but because as politicians they went back and got those degrees, because they felt it was important for understanding the issues. That is so far from where we are. We're more in a culture of "if I need some facts, I'll give them to you," right.

Noah Feldman: You know, this is a question that I've been very curious to ask you today. To be a cutting-edge researcher, as you are, it's also, perhaps "expected" is too strong a word, but not surprising, when one also holds lots of patents and starts companies that develop those patents, or that try to develop those patents. So the entrepreneurial side seems to coexist very naturally, easily, almost normatively, with being a leading scientist. How do you think about that relationship? Because that's been an important part of your practice.

George Church: So if you have an invention, where you've gone from the vast cloud of scientific discovery to something that you think might be useful, it's not sufficient to write up an academic ivory-tower paper and hope for the best, because nobody's going to use it. You know, sometimes there's some concern that capitalism in that form will infect us, you know, that it affects us by manipulating us, or that a market scientist-entrepreneur cannot be as pure as the scientist who has received grants and does basic research. I think a lot of people fear that.

Noah Feldman: Yeah, and they fear that even after the scientist steps away and goes back to the ivory tower, a company that has been created has a life of its own, and it will be motivated to use marketing and advertising and lobbying to give us our opinions rather than listen to us.
George Church: And I think these are all valid things. On the other hand, if you're talking about companies going rogue, there are certain challenges to that, meaning that the CEO reports to the board of directors and to the stockholders in general, and so doing something blatantly irresponsible would result in a huge public relations problem, your stock's value would crash, and so forth. While an academic with tenure, who has acted responsibly, can go off and do whatever they want; they don't really report to anybody. So there's a danger, in other words, also in having the pure scientist who can do whatever he wants or she wants and is not responsible at all.

Noah Feldman: So let's take this back to the question of the future of CRISPR, or other similar editing technologies. In terms of the state of the science now, how credible is it that existing editing technologies could, if they were allowed to by regulators, actually make interventions at a population level in a significant way? Are there actually traits, enough traits that matter, that are addressable by editing one or two genes, or a handful of genes, to actually make an impact? How much of this is science-fiction fantasy, and how much of this is within the reach of reality?

George Church: So there's an interesting phenomenon. I would say a huge fraction of my colleagues would reassure you that human genetics is so complex that we are so far away that we don't need to worry about it. I think that is false. For some of them, I think that is false reassurance. So we have, sort of, the textbook cases of complex genetics, for things like height, right? Thousands of genes involved, possibly all the genes are involved, and they each have a very tiny effect, along with hundreds of environmental effects. You know, why is it that over the generations we've gotten taller and taller? It's better nutrition. So this seems hard to predict, much less to manipulate.
Noah Feldman: And when your colleagues, just to flesh out the argument for listeners, when they say there's nothing to worry about, they often do use height, and they say, well, look, if we know right now that there are four or five hundred genes involved in producing just a tiny percentage of the variance, right, and thousands to get you to the whole variety, we're never, or at least not in the foreseeable future, going to be able to edit all of those different locations. So therefore, you know, don't worry your pretty little head about it, or that's what they say to me.

George Church: So there are three things wrong with that. One is, there are single genes that are so impactful that even though in the natural population they hardly add anything to it, in the medical situation they have a huge impact, in particular for height. There are about seven different medical situations where physicians and patients want to do something about height. It isn't necessarily because they're genetically defective; it's some other reason. And the answer is one gene. One gene product, which is somatotropin, sometimes called growth hormone.

Noah Feldman: To me, that just really nails the counterargument. So the counterargument is: the geneticists who don't want you to worry say there are so many different genes involved naturally, right. And you say, well, that might be true naturally, but that doesn't mean there isn't an intervention we could make exactly on a particular gene that would actually have the effect of overshadowing...

George Church: Overshadowing the natural variations.

Noah Feldman: So, in other words, it's a mistake to reason from natural variation to the capacities of technological intervention.

George Church: Exactly. And this is not just an argument; this is not hypothetical, because there are seven different so-called diseases, or medical treatments, that are approved and are in routine use that involve this one gene, somatotropin. Second, they'll say, oh, we'll have unintended consequences. Well, the fact is, in medicine we always have unintended consequences.
We almost always deal with them. We just say the benefits outweigh the risks.

Noah Feldman: So that's an ethical argument that they make. The first argument is "don't worry, because it's not going to be practical to engage in these kinds of systematic edits." This second argument is the ethical concern that says we don't know, there might be off-target effects. And you're saying there always are, and we're accustomed to that, and we balance the costs and the benefits.

George Church: The distinction between practical and ethical is blurred very often, because you'll say it's impractical because there are these off-targets. The thing is, it is totally practical, because almost everything has off-targets, and maybe you take a second drug to deal with it, or you stratify the population; there are all sorts of tricks you can use. So "don't worry, it's impractical" and "there might be off-target effects": when you hear those two arguments, those are the grains of salt you should take them with, okay, because they're pretty big grains. And the third one is that the technology is so far from standing still, I mean, it is growing exponentially, that even if you say we can't do it today, it could be, you know, somewhere between minutes and years, not centuries. They often use the word centuries. It's incredible; there are no centuries anymore. So, for example, my lab's personal record, pretty much the world record, for editing genes pre-CRISPR, or around the time of CRISPR, was two: we could edit two genes, and we were very proud of ourselves; we edited more than one. Just literally two years later, we had edited sixty-two genes simultaneously to make a pig that didn't have any viruses, retroviruses, in its genome, you know. Just a year or two after that, we have now edited thirteen thousand genes simultaneously in a mammal, in human cells.

Noah Feldman: Okay, human cells, not a human. Important to note.

George Church: So the point is, this is moving very quickly. What's great is we're having these conversations well in advance.
But what used to be well in advance... We started talking about designer babies in the early days of recombinant DNA and in vitro fertilization, sort of in the early seventies. That was well in advance. But the conversations we're having now are not forty years in advance. They're more like a decade.

Noah Feldman: When I look at the overall picture that you describe, you know, the extraordinary change, the speed of change, I wonder whether the institutions that we've talked about before, the FDA or the Chinese FDA, are really strong enough and capable enough to resist the temptations that are going to come with the tremendous technological capacities that are being developed. I mean, if we look at our history, we don't have a great history of using institutions, regulatory institutions, to block the cutting edge. The cutting edge usually tends to emerge. And that's partly, I think, because of the power of capital, you know, which you've mentioned here. It's also partly that once something is possible, there will be people advocating for it. If you have a rare disease, you will say, look, it's not that it would be unethical to try to cure this disease; it would be unethical not to try to cure this disease. So can we hold back against the potential risks here? You know, we say now, well, of course we would never do this to give people purely aesthetic advantages, but in reality maybe we would.

George Church: It is true that capitalism can distort the FDA and its ilk around the world, but I think there is still a feedback mechanism that's bigger than all of this, which is that if it really doesn't work, or isn't cost-effective or desirable, we will stop using it. My worry is more about what happens if it doesn't work, and by "work" I mean work for society, benefit society in the long term. And there are examples. So first, what do FDAs around the world not do? They do safety and efficacy, but they don't necessarily do equitable distribution or very long-term risk.
They have long-term risks in mind, but the way they've set up the structure is that if you can convince them on a short-term risk basis that you've done all the experiments, you get to test in phase four, which is basically selling it. And then, once capitalism kicks in, it's very hard to regulate. So perfect examples of this are nutritional supplements and stem cells. Those managed to get into the marketplace without full FDA approval, and now they're bigger than the rest of the pharmaceutical industry, and so they're hard to regulate.

Noah Feldman: But a bad example would be the opioid crisis. I mean, it was just about greed.

George Church: It's a good one. So this is something where, if you create something that's sufficiently addictive, people will try it and then they become lobbyists. They vote; they vote with their wallet, which is in many ways more powerful than voting with ballots. That said, there will be a feedback system. If any country manages to get a business model, or a country policy model, that prevents this, they will win, because the nations that can't control their opioid crisis become impoverished, mismanaged, et cetera. They become the new...

Noah Feldman: Can I push back on that? I mean, everything we've talked about until now, I'm the purest layperson, but now, when we talk about feedback and governments, I have something to say. And here's what I would want to say. We've got lots of examples where societies inflict huge costs on themselves, but the costs are not so great that they turn them into, as you would say, third-world countries, because they have other tremendous advantages. So tobacco would be a really good example.

George Church: Yeah.

Noah Feldman: You know, I remember my grandfather, who was trained in pharmacy school in the nineteen twenties, describing a film that they were all shown of the horrible effects on the human lung of tobacco smoking. And then the tobacco industry literally came, bought up all the copies of those films, destroyed them, and blocked that from, you know, from being public.
No, it wasn't the same as a causal relationship to cancer, necessarily, but they knew the health effects were absolutely terrible, comparable to what you got by working in a mine. So, you know, there's an example of a total failure of a regulatory institution.

George Church: Absolutely.

Noah Feldman: And the handful of countries that were able to regulate smoking to a slightly greater degree did not enjoy any great advantage. In fact, there's even an argument, it's a perverse argument, but the tobacco companies did make it at one point, that it was good for the economy that people smoked, because you didn't spend a lot of money on their late-in-life healthcare, because they died of lung cancer. So it's not always the case that really bad social effects impoverish a country.

George Church: That's correct. So I was just saying that there is a feedback mechanism. Do we need other things? We probably do. But if we think that voluntary moratoria can do something more powerful than capitalism, that I think is a little naive. We need to think of all the ways we can do surveillance, for example; that's one of the things I've advocated. Rather than being falsely reassured, let's have strong surveillance. You know, in the topic today of these CRISPR babies, there were plenty of people in the world, both in China and the United States, who knew about it, but they weren't reporting it. We need to encourage whistleblowers, and we need to optimize surveillance. I mean, that's one form of surveillance. The other form of surveillance is computer surveillance. So we should have computers looking through all the orders that are relevant to synthetic biology broadly, and maybe other technologies as well, to see if people are doing things for which they do not have a license.
Noah Feldman: So the privacy concerns, in your view, are just outweighed by the tremendous dangers associated with a lack of surveillance?

George Church: So I think that everybody who practices synthetic biology and its close relatives has given up some of their privacy rights, in the same sense that when you get a driver's license, you give up certain rights. There are radars that are looking to see what you're doing. You can get pulled over if you're weaving around, and you can get an alcohol test, and they can check your age, and so forth. You have to have surveillance.

Noah Feldman: And we do have more resistance, typically, to those forms of surveillance in the US than they do, for example, in Europe. Here we think, you know, it would be a terrible violation of our civil liberties if, when we got onto the highway, they checked the time, and when we got off the highway they checked the time, and saw whether we were speeding in between. You know, if you did it on the Mass Pike, there would be a riot here in Massachusetts. But in Europe there are speed cameras everywhere, the fines are enormous; you are meant to feel surveilled when you drive.

George Church: Right, you do feel surveilled when you drive there, and that's pro-social, that's desirable. And that's why I think when we create boogeymen outside the United States, we're completely ignoring how we have a huge fraction of the world's billionaires, many of whom are very pro-libertarian: "I can do whatever I want." I think it's much more likely that the rogues are going to be us.

Noah Feldman: George, thank you so much for your time, and thanks for sharing your ideas. I'm not sure I emerged less worried than when I started, but you've told me that worry is healthy, so I guess I've got plenty of it. Thank you very much.

George Church: Thank you. Thank you.
Noah Feldman: I asked George Church to come on the show because I wanted to get a better sense than I had about just how worried we should be about the potential advent of designer babies. When George first started talking, I started to feel better. I began to think that maybe the FDA was a reasonable institution for regulating the possible bad effects of designer babies. And George was also reassuring about the thought that the Chinese government is also interested in blocking progress from going too quickly or getting out of hand. But as our conversation proceeded, I started to get more and more nervous. In particular, George was pretty convincing in suggesting that those scientists who say we don't have to worry about gene editing leading to designer babies soon are actually completely wrong about that. Then, as he began to describe the potential effects of companies and capitalism on the development of science, I began to think that maybe there were going to be economic pressures driving us in the direction of greater and greater innovation and greater and greater risk-taking. And finally, when we circled all the way back to those same regulatory agencies, George seemed to think that there was some possibility that in the long run we would make mistakes, but that we would be self-correcting. And there he and I parted paths, because I'm deeply afraid that we're not that good at fixing our mistakes, that there are too many examples of our regulatory regimes breaking down. And we ended with George saying that he thinks that if anyone's going to go rogue, it's going to be us, right here in the United States. And that, my friends, leaves me more nervous than anything else.

Deep Background is brought to you by Pushkin Industries. Our producer is Lydia Jean Kott, with engineering by Jason Gambrell and Jason Roskowski. Our showrunner is Sophie McKibben. Our theme music is composed by Luis Guerra. Special thanks to the Pushkin brass: Malcolm Gladwell, Jacob Weisberg, and Mia Lobel.
I'm Noah Feldman. You can follow me on Twitter @NoahRFeldman. This is Deep Background.