1 00:00:00,720 --> 00:00:03,559 Speaker 1: On this episode of Newt's World. A few years ago, 2 00:00:03,680 --> 00:00:08,280 Speaker 1: with my colleague Emmanuelle Charpentier, I invented a new technology 3 00:00:08,520 --> 00:00:13,119 Speaker 1: for editing genomes. It's called CRISPR-Cas9. The CRISPR 4 00:00:13,160 --> 00:00:17,079 Speaker 1: technology allows scientists to make changes to the DNA in 5 00:00:17,160 --> 00:00:21,119 Speaker 1: cells that could allow us to cure genetic disease. You 6 00:00:21,239 --> 00:00:23,919 Speaker 1: might be interested to know that the CRISPR technology came 7 00:00:23,960 --> 00:00:27,240 Speaker 1: about through a basic research project that was aimed at 8 00:00:27,280 --> 00:00:31,880 Speaker 1: discovering how bacteria fight viral infections. Bacteria have to deal 9 00:00:31,960 --> 00:00:34,240 Speaker 1: with viruses in their environment, and we can think about 10 00:00:34,240 --> 00:00:37,360 Speaker 1: a viral infection like a ticking time bomb. A bacterium 11 00:00:37,360 --> 00:00:40,360 Speaker 1: has only a few minutes to defuse the bomb before 12 00:00:40,400 --> 00:00:44,199 Speaker 1: it gets destroyed. So many bacteria have in their cells 13 00:00:44,240 --> 00:00:47,720 Speaker 1: an adaptive immune system called CRISPR that allows them to 14 00:00:47,760 --> 00:00:51,360 Speaker 1: detect viral DNA and destroy it. Part of the CRISPR 15 00:00:51,479 --> 00:00:55,120 Speaker 1: system is a protein called Cas9 that's able to 16 00:00:55,200 --> 00:01:00,240 Speaker 1: seek out and cut and eventually degrade viral DNA in 17 00:01:00,280 --> 00:01:02,840 Speaker 1: a specific way.
And it was through our research to 18 00:01:02,960 --> 00:01:06,560 Speaker 1: understand the activity of this protein Cas9 that we 19 00:01:06,680 --> 00:01:10,160 Speaker 1: realized that we could harness its function as a genetic 20 00:01:10,360 --> 00:01:16,000 Speaker 1: engineering technology, a way for scientists to delete or insert 21 00:01:16,160 --> 00:01:20,920 Speaker 1: specific bits of DNA into cells with incredible precision. That 22 00:01:20,959 --> 00:01:24,080 Speaker 1: would offer opportunities to do things that really haven't been 23 00:01:24,120 --> 00:01:35,399 Speaker 1: possible in the past. My guest today is Walter Isaacson, 24 00:01:35,880 --> 00:01:38,760 Speaker 1: who's been a friend for many, many years. He's the 25 00:01:38,800 --> 00:01:43,160 Speaker 1: bestselling author of biographies about Leonardo da Vinci, Steve Jobs, 26 00:01:43,280 --> 00:01:47,600 Speaker 1: Albert Einstein, and Benjamin Franklin. Some have called him the biographer 27 00:01:47,600 --> 00:01:51,760 Speaker 1: of geniuses, and now he has added Jennifer Doudna to 28 00:01:51,960 --> 00:01:55,920 Speaker 1: that list with his New York Times bestseller The Code Breaker: 29 00:01:56,520 --> 00:02:00,280 Speaker 1: Jennifer Doudna, Gene Editing, and the Future of the Human Race. 30 00:02:00,800 --> 00:02:03,360 Speaker 1: It is a gripping account of how Nobel Prize winner 31 00:02:03,440 --> 00:02:07,680 Speaker 1: Jennifer Doudna and her colleagues launched a revolution that will 32 00:02:07,720 --> 00:02:11,280 Speaker 1: allow us to cure diseases, fend off viruses, and have 33 00:02:11,440 --> 00:02:22,880 Speaker 1: healthier babies. Walter, thank you for joining me. It's great 34 00:02:22,919 --> 00:02:26,200 Speaker 1: to talk with you. Let's start at the beginning. What 35 00:02:26,400 --> 00:02:29,040 Speaker 1: led you to decide to become a chronicler of unique 36 00:02:29,360 --> 00:02:33,840 Speaker 1: and genius personalities?
I think that humans can affect the 37 00:02:33,919 --> 00:02:37,120 Speaker 1: course of history. You and I have both been history professors, 38 00:02:37,120 --> 00:02:39,560 Speaker 1: and we know that sometimes in the academy people think 39 00:02:39,639 --> 00:02:44,040 Speaker 1: it's all these grand economic forces or whatever. But I 40 00:02:44,200 --> 00:02:46,560 Speaker 1: like to believe that every now and then, a 41 00:02:46,639 --> 00:02:50,240 Speaker 1: creative person comes along and they're able to change things, 42 00:02:50,320 --> 00:02:53,760 Speaker 1: whether it's Steve Jobs or Henry Kissinger or Leonardo 43 00:02:53,800 --> 00:02:56,800 Speaker 1: da Vinci or Ben Franklin. To me, I want to inspire 44 00:02:56,840 --> 00:03:00,640 Speaker 1: people by saying, here's how people affect history. And Jennifer 45 00:03:00,720 --> 00:03:04,600 Speaker 1: Doudna is the latest subject because she's leading the third 46 00:03:04,600 --> 00:03:07,880 Speaker 1: great revolution of our times in terms of science. The 47 00:03:07,919 --> 00:03:11,440 Speaker 1: first was a physics revolution from Einstein that gave us 48 00:03:11,440 --> 00:03:14,440 Speaker 1: the atom bomb and space travel. The second was a 49 00:03:14,720 --> 00:03:18,600 Speaker 1: revolution in information technology with Steve Jobs and others that 50 00:03:18,720 --> 00:03:21,359 Speaker 1: gave us personal computers. And now we're going to be 51 00:03:21,440 --> 00:03:25,000 Speaker 1: able to treat our own molecules as microchips. We're going 52 00:03:25,040 --> 00:03:28,400 Speaker 1: to be able to reprogram the genetic code in our body, 53 00:03:28,680 --> 00:03:31,560 Speaker 1: and that's going to have more consequences, I think, than 54 00:03:31,639 --> 00:03:36,960 Speaker 1: even the information technology revolution. So when you began looking 55 00:03:37,520 --> 00:03:40,240 Speaker 1: at your next big book, what drew you to her?
56 00:03:40,520 --> 00:03:43,160 Speaker 1: There were many different ways to tell the story, and you 57 00:03:43,240 --> 00:03:47,880 Speaker 1: picked a very unique way. Yes, I had been interested 58 00:03:48,080 --> 00:03:50,280 Speaker 1: in all the people who were involved in the life 59 00:03:50,320 --> 00:03:54,680 Speaker 1: science revolution, and it had a really colorful cast of characters, 60 00:03:54,720 --> 00:03:57,480 Speaker 1: all of whom are in the book, from George Church 61 00:03:57,520 --> 00:04:01,360 Speaker 1: and Feng Zhang to Emmanuelle Charpentier. But the more 62 00:04:01,520 --> 00:04:04,360 Speaker 1: I met with Jennifer Doudna, the more I realized that 63 00:04:04,400 --> 00:04:07,040 Speaker 1: she was able to be a narrative thread that would 64 00:04:07,040 --> 00:04:10,760 Speaker 1: tie together all the themes I wanted, starting with the 65 00:04:10,840 --> 00:04:14,240 Speaker 1: discovery of the structure of DNA, which she reads about 66 00:04:14,280 --> 00:04:18,000 Speaker 1: as a young child and becomes fascinated by the structure 67 00:04:18,000 --> 00:04:20,440 Speaker 1: of molecules, and also by the role of somebody named 68 00:04:20,520 --> 00:04:25,320 Speaker 1: Rosalind Franklin, who hasn't gotten that much acclaim in history, 69 00:04:25,680 --> 00:04:28,320 Speaker 1: but taught Jennifer that a woman could be a scientist. 70 00:04:28,640 --> 00:04:32,159 Speaker 1: And then she goes and she understands that RNA, rather 71 00:04:32,160 --> 00:04:35,400 Speaker 1: than DNA, is the more important molecule in our bodies, 72 00:04:35,839 --> 00:04:38,760 Speaker 1: and she discovers how RNA can be a guide to 73 00:04:38,800 --> 00:04:41,200 Speaker 1: cut up our genes, but it can also be a 74 00:04:41,240 --> 00:04:44,960 Speaker 1: messenger to give us vaccines against the coronavirus.
It's also 75 00:04:45,080 --> 00:04:47,840 Speaker 1: how life began on this planet, because she discovers how 76 00:04:47,960 --> 00:04:52,000 Speaker 1: RNA can replicate itself, thus becoming the first molecule that 77 00:04:52,080 --> 00:04:56,719 Speaker 1: becomes a living organism. And finally, she starts wrestling with 78 00:04:56,800 --> 00:05:02,239 Speaker 1: the moral issues of gene editing. So I write narrative history, 79 00:05:02,440 --> 00:05:04,919 Speaker 1: you do it as well. I've read your books, and 80 00:05:05,120 --> 00:05:10,160 Speaker 1: narrative history usually benefits from having a central character that 81 00:05:10,240 --> 00:05:15,160 Speaker 1: we can follow, that personalizes what becomes a journey of 82 00:05:15,279 --> 00:05:19,200 Speaker 1: discovery and a journey of invention. And so the more 83 00:05:19,320 --> 00:05:21,400 Speaker 1: I read about all the people and met all the 84 00:05:21,440 --> 00:05:24,040 Speaker 1: people in the life sciences, I said, let me use 85 00:05:24,160 --> 00:05:28,159 Speaker 1: Jennifer Doudna as my central character. I'm curious for a second, 86 00:05:28,600 --> 00:05:31,159 Speaker 1: if we can take a brief detour on behalf of 87 00:05:31,200 --> 00:05:32,880 Speaker 1: those of us who don't know nearly as much as 88 00:05:32,920 --> 00:05:35,360 Speaker 1: you do. Would you walk us, just for a minute, through 89 00:05:35,400 --> 00:05:38,880 Speaker 1: the difference between DNA, which is more popularly understood, and 90 00:05:39,000 --> 00:05:42,839 Speaker 1: RNA? How should the average person understand those two? Yeah, 91 00:05:43,000 --> 00:05:45,080 Speaker 1: DNA is the famous molecule, the one that gets on 92 00:05:45,200 --> 00:05:48,320 Speaker 1: magazine covers and becomes a metaphor for the DNA in 93 00:05:48,320 --> 00:05:51,280 Speaker 1: our society or whatever it may be.
But like a 94 00:05:51,279 --> 00:05:54,520 Speaker 1: lot of famous siblings, it doesn't really do much work. 95 00:05:54,600 --> 00:05:57,800 Speaker 1: It just sits in the nucleus of our cells and it 96 00:05:57,880 --> 00:06:02,880 Speaker 1: curates information, and all it does is guard our genetic information. 97 00:06:03,800 --> 00:06:07,200 Speaker 1: RNA actually is the sibling that does real work. It 98 00:06:07,279 --> 00:06:10,880 Speaker 1: actually makes products. And so what RNA does is it 99 00:06:10,920 --> 00:06:14,200 Speaker 1: goes and takes a little bit of the information from 100 00:06:14,279 --> 00:06:17,560 Speaker 1: the DNA and moves to the region in our cell 101 00:06:17,640 --> 00:06:21,599 Speaker 1: where we manufacture proteins, and it oversees the manufacturing 102 00:06:21,640 --> 00:06:24,400 Speaker 1: of proteins, whether it's a hair follicle or a fingernail, 103 00:06:24,480 --> 00:06:27,240 Speaker 1: or a hormone or an enzyme or whatever it's supposed 104 00:06:27,279 --> 00:06:31,039 Speaker 1: to be in our body. And so RNA is always 105 00:06:31,080 --> 00:06:33,680 Speaker 1: in motion. It can act as a guide, it can 106 00:06:33,760 --> 00:06:37,200 Speaker 1: act as a messenger telling us, as in the RNA 107 00:06:37,360 --> 00:06:41,719 Speaker 1: vaccines of Pfizer and Moderna, what proteins to build. But 108 00:06:41,839 --> 00:06:45,880 Speaker 1: they both use a four letter code to say, here's 109 00:06:45,960 --> 00:06:48,920 Speaker 1: all the information you need to build a human, or 110 00:06:49,040 --> 00:06:52,880 Speaker 1: for that matter, any other organism. In our case, that's three 111 00:06:53,040 --> 00:06:57,520 Speaker 1: billion pairs of those letters in DNA, and RNA gets 112 00:06:57,560 --> 00:07:00,720 Speaker 1: to transcribe it, using a very similar code, to say, okay, 113 00:07:00,800 --> 00:07:03,720 Speaker 1: let me go do something with that information.
When you're 114 00:07:03,760 --> 00:07:05,920 Speaker 1: going down this road and you're beginning to learn all 115 00:07:05,920 --> 00:07:09,880 Speaker 1: this, what attracted you to Doudna as your 116 00:07:09,920 --> 00:07:13,760 Speaker 1: sort of storyline? I think she's a decent person who 117 00:07:13,800 --> 00:07:16,760 Speaker 1: cares about the ethical implications. Like all of the characters 118 00:07:16,760 --> 00:07:20,120 Speaker 1: I've written about, she stands with one foot in the 119 00:07:20,200 --> 00:07:23,600 Speaker 1: humanities and one foot in the sciences. That's what Steve 120 00:07:23,680 --> 00:07:26,360 Speaker 1: Jobs did, that's what Ben Franklin did, that's what Leonardo 121 00:07:26,440 --> 00:07:30,280 Speaker 1: da Vinci's Vitruvian Man drawing is all about. And so 122 00:07:30,360 --> 00:07:34,960 Speaker 1: she can help connect us to the technologies that we have. Secondly, 123 00:07:35,040 --> 00:07:39,720 Speaker 1: she was very creative. She was curious about basic science. 124 00:07:39,760 --> 00:07:42,600 Speaker 1: She enters into this field not trying to make an 125 00:07:42,600 --> 00:07:45,880 Speaker 1: invention that will edit our genes or give us vaccines. 126 00:07:46,600 --> 00:07:49,000 Speaker 1: She enters into it because she's curious about why do 127 00:07:49,120 --> 00:07:53,840 Speaker 1: bacteria have clustered repeated sequences, which we call CRISPRs. And 128 00:07:53,920 --> 00:07:56,000 Speaker 1: so she and a group of other scientists were just 129 00:07:56,080 --> 00:07:59,560 Speaker 1: doing it driven by pure curiosity.
But one day, when 130 00:07:59,560 --> 00:08:03,160 Speaker 1: she figured out that here are the components of the system 131 00:08:03,520 --> 00:08:07,200 Speaker 1: that can guide a scissors in our cell, an enzyme 132 00:08:07,200 --> 00:08:09,920 Speaker 1: in our cell, to cut our genes, she has an 133 00:08:09,960 --> 00:08:13,320 Speaker 1: aha moment, and she says, oh, I can repurpose that. 134 00:08:13,480 --> 00:08:16,240 Speaker 1: I can re-engineer it to be a system that 135 00:08:16,400 --> 00:08:22,360 Speaker 1: edits our own genetic code. And in doing that she 136 00:08:22,520 --> 00:08:26,080 Speaker 1: helps develop the whole notion of how you use CRISPR 137 00:08:26,320 --> 00:08:30,080 Speaker 1: almost as an engineering device. This is my language, not yours. 138 00:08:30,200 --> 00:08:33,480 Speaker 1: But it's almost like she's figuring out the engineering that 139 00:08:33,720 --> 00:08:38,880 Speaker 1: enables us to reshape the relationships between RNA and DNA. 140 00:08:39,280 --> 00:08:41,240 Speaker 1: And is that a reasonable way to think of it? 141 00:08:41,920 --> 00:08:44,760 Speaker 1: I think what she's doing is she's using the power 142 00:08:44,880 --> 00:08:50,080 Speaker 1: of RNA to edit the genes in our body, because 143 00:08:50,280 --> 00:08:55,559 Speaker 1: RNA knows how to target something. It's a good targeting tool.
144 00:08:55,920 --> 00:08:59,200 Speaker 1: Here's a piece of genetic code I want to change. 145 00:08:59,520 --> 00:09:04,000 Speaker 1: And she's able to engineer a system that bacteria have used 146 00:09:04,120 --> 00:09:08,440 Speaker 1: for a billion years, which is, when bacteria get attacked 147 00:09:08,440 --> 00:09:10,880 Speaker 1: by viruses, they take a little mug shot of the 148 00:09:10,960 --> 00:09:13,640 Speaker 1: virus that attacks them, and they keep a little bit 149 00:09:13,640 --> 00:09:16,560 Speaker 1: of the genetic code of that virus in their own 150 00:09:16,800 --> 00:09:21,120 Speaker 1: clustered repeated sequences known as CRISPRs, and so they can 151 00:09:21,320 --> 00:09:26,400 Speaker 1: use the RNA to target that virus if it attacks again. Now, 152 00:09:26,400 --> 00:09:28,640 Speaker 1: that's pretty useful in this day and age when we 153 00:09:28,720 --> 00:09:32,120 Speaker 1: keep getting attacked by viruses. But she also made it 154 00:09:32,240 --> 00:09:35,040 Speaker 1: useful because she said, oh, if we can use RNA as 155 00:09:35,080 --> 00:09:39,000 Speaker 1: a targeting tool when it sees a piece of genetic 156 00:09:39,040 --> 00:09:41,880 Speaker 1: code it doesn't like, we can make that into an 157 00:09:42,000 --> 00:09:46,240 Speaker 1: editing tool that cuts and even pastes genetic code we 158 00:09:46,400 --> 00:09:49,800 Speaker 1: don't like or do like. So in principle, for example, 159 00:09:50,400 --> 00:09:54,120 Speaker 1: you can use CRISPR, at least in theory, to begin 160 00:09:54,240 --> 00:09:59,640 Speaker 1: targeting, say, cancer-causing genes, and to eliminate that which 161 00:09:59,720 --> 00:10:04,000 Speaker 1: is dangerous? Absolutely, and that's such a wonderful promise, and 162 00:10:04,000 --> 00:10:07,800 Speaker 1: it is not just potential. Last year in Mississippi, I'll 163 00:10:07,800 --> 00:10:11,280 Speaker 1: pick one example.
A woman named Victoria Gray, who suffered 164 00:10:11,280 --> 00:10:14,679 Speaker 1: her whole life from sickle cell, is cured by CRISPR. 165 00:10:15,000 --> 00:10:17,920 Speaker 1: And sickle cell is, out of the three billion letters 166 00:10:17,920 --> 00:10:21,200 Speaker 1: in your body, if you get one letter wrong, it 167 00:10:21,280 --> 00:10:24,560 Speaker 1: switches from one letter to the other letter, then you're 168 00:10:24,600 --> 00:10:27,040 Speaker 1: going to have sickled cells, meaning your blood cells are 169 00:10:27,040 --> 00:10:30,719 Speaker 1: going to be crumpled up. And so now, with this 170 00:10:30,880 --> 00:10:34,520 Speaker 1: CRISPR gene editing tool, we're able to fix that one 171 00:10:34,640 --> 00:10:37,480 Speaker 1: letter mutation. We can do it, as you just said, 172 00:10:37,600 --> 00:10:41,440 Speaker 1: on cancer, because one of the problems we have using immunotherapy, 173 00:10:42,040 --> 00:10:44,920 Speaker 1: meaning using our own immune system to fight cancer cells, 174 00:10:45,280 --> 00:10:47,880 Speaker 1: is cancer is pretty tricky and wily, and it knows 175 00:10:47,920 --> 00:10:52,280 Speaker 1: how to turn off our immune system. Well, using CRISPR, 176 00:10:52,320 --> 00:10:53,800 Speaker 1: you can make it so it doesn't have the key 177 00:10:53,840 --> 00:10:56,720 Speaker 1: to turn off our immune system. And it has been 178 00:10:56,840 --> 00:11:00,840 Speaker 1: used to fight congenital blindness up in Portland, and in 179 00:11:00,880 --> 00:11:03,560 Speaker 1: a couple of other places there are clinical trials with that. 180 00:11:03,960 --> 00:11:07,600 Speaker 1: So the easiest use of CRISPR, by far the least 181 00:11:07,600 --> 00:11:11,280 Speaker 1: controversial use of CRISPR, is in living patients who have 182 00:11:11,480 --> 00:11:16,640 Speaker 1: very simple bad diseases that are caused by simple genetic mutations.
183 00:11:16,679 --> 00:11:21,439 Speaker 1: And there are almost seven thousand of those, meaning muscular dystrophy, 184 00:11:21,520 --> 00:11:26,880 Speaker 1: cystic fibrosis, Tay-Sachs, Huntington's, sickle cell, many others. So 185 00:11:27,000 --> 00:11:31,840 Speaker 1: from the standpoint of the breakthrough she created, she's sort 186 00:11:31,880 --> 00:11:35,600 Speaker 1: of the engineer who provides the tools for the engineers, 187 00:11:36,160 --> 00:11:40,480 Speaker 1: right? Right. Once you've created this tool, the CRISPR gene 188 00:11:40,640 --> 00:11:45,040 Speaker 1: editing tool, the great thing about it is that it's all 189 00:11:45,240 --> 00:11:49,520 Speaker 1: done in code. So you can code the RNA to 190 00:11:49,640 --> 00:11:53,640 Speaker 1: go snip up something that involves sickle cell, but somebody 191 00:11:53,640 --> 00:11:57,000 Speaker 1: else can recode it and say, do me this gene 192 00:11:57,040 --> 00:12:02,079 Speaker 1: that causes congenital blindness in cells. And so there are 193 00:12:02,160 --> 00:12:06,400 Speaker 1: now hundreds of researchers around the world, and there are 194 00:12:06,400 --> 00:12:10,960 Speaker 1: these conferences called CRISPR conferences where they meet and they say, 195 00:12:11,040 --> 00:12:15,040 Speaker 1: let me use this tool, and I'm going to reprogram 196 00:12:15,080 --> 00:12:17,760 Speaker 1: it to fight cancer. I'm going to reprogram it to 197 00:12:17,840 --> 00:12:23,560 Speaker 1: fight muscular dystrophy. But it's not just inventing a cure 198 00:12:23,679 --> 00:12:28,400 Speaker 1: for a disease. She invented a platform, a tool that 199 00:12:28,559 --> 00:12:31,280 Speaker 1: others can use to cure and recode it to cure 200 00:12:31,320 --> 00:12:34,600 Speaker 1: many diseases.
When she wins the Nobel Prize with her 201 00:12:34,640 --> 00:12:39,880 Speaker 1: research partner Emmanuelle Charpentier this past October, the Nobel Committee 202 00:12:39,880 --> 00:12:43,680 Speaker 1: says it's not just for one discovery, it's for bringing 203 00:12:43,760 --> 00:12:48,280 Speaker 1: science into a whole new era. They are rewriting the 204 00:12:48,320 --> 00:12:51,560 Speaker 1: code of life. So you put your finger on it exactly: 205 00:12:51,679 --> 00:12:54,959 Speaker 1: they've not just made a discovery, they've made a platform 206 00:12:55,280 --> 00:12:58,760 Speaker 1: from which there'll be thousands of discoveries in the next 207 00:12:58,840 --> 00:13:18,199 Speaker 1: decade or two. You know, on the one hand, it 208 00:13:18,240 --> 00:13:21,320 Speaker 1: sounds almost like science fiction. On the other hand, if 209 00:13:21,320 --> 00:13:24,480 Speaker 1: I'm understanding correctly, yogurt played a role in all this, 210 00:13:25,040 --> 00:13:28,280 Speaker 1: and two French food scientists, one of them actually working in 211 00:13:28,320 --> 00:13:30,760 Speaker 1: Wisconsin, played a role. I mean, how 212 00:13:30,800 --> 00:13:34,720 Speaker 1: does one of the most ancient of human foods feed 213 00:13:34,720 --> 00:13:39,400 Speaker 1: back into the most modern of scientific breakthroughs? Well, as I said, 214 00:13:39,440 --> 00:13:44,439 Speaker 1: it comes from a simple curiosity about nature, which involves bacteria. 215 00:13:44,880 --> 00:13:48,120 Speaker 1: Bacteria have been fighting viruses for a billion years, and 216 00:13:48,200 --> 00:13:51,400 Speaker 1: they developed this system to use an RNA guide to 217 00:13:51,559 --> 00:13:54,520 Speaker 1: chop up a virus that attacks them. Well, name the 218 00:13:54,600 --> 00:13:58,800 Speaker 1: industry that most depends on healthy bacteria. Well, that's the 219 00:13:58,920 --> 00:14:02,440 Speaker 1: cheese and yogurt industry.
It's four billion dollars a year of 220 00:14:02,520 --> 00:14:08,160 Speaker 1: creating starter cultures. Starter cultures are bacteria that take milk 221 00:14:08,200 --> 00:14:11,720 Speaker 1: and turn it into cheese, or turn it into yogurt. 222 00:14:12,120 --> 00:14:14,480 Speaker 1: And you know, you've been in government and you know 223 00:14:14,960 --> 00:14:20,640 Speaker 1: this interplay between basic science and applied science. And so 224 00:14:20,680 --> 00:14:24,120 Speaker 1: what happens is we have the basic scientists who are 225 00:14:24,160 --> 00:14:27,000 Speaker 1: looking at what the heck are bacteria doing with all 226 00:14:27,040 --> 00:14:31,080 Speaker 1: these repeated sequences in their DNA, and then you get 227 00:14:31,120 --> 00:14:34,760 Speaker 1: a group of yogurt scientists who work for Danisco, a 228 00:14:34,800 --> 00:14:37,800 Speaker 1: couple of them, and they have a history of twenty 229 00:14:37,920 --> 00:14:41,000 Speaker 1: years of the genetic sequences of all the bacteria that's 230 00:14:41,000 --> 00:14:43,640 Speaker 1: been used to make yogurt. And they got a lot 231 00:14:43,640 --> 00:14:46,640 Speaker 1: of money too, because Danisco is really, really eager to 232 00:14:46,720 --> 00:14:50,040 Speaker 1: spend money to protect its yogurt cultures. And so they 233 00:14:50,080 --> 00:14:54,360 Speaker 1: go back through all twenty years of the history of 234 00:14:54,400 --> 00:14:59,240 Speaker 1: these bacteria genetic sequences and they discover that, oh, I 235 00:14:59,360 --> 00:15:02,040 Speaker 1: get it. Every time a new type of virus hits 236 00:15:02,040 --> 00:15:05,360 Speaker 1: our yogurt culture, the next year we see a sequence 237 00:15:05,760 --> 00:15:08,480 Speaker 1: that has this little clustered repeat with some of the 238 00:15:08,560 --> 00:15:11,920 Speaker 1: virus's genetic code.
So that was one of the initial 239 00:15:12,000 --> 00:15:16,520 Speaker 1: discoveries, that this was a virus-fighting trick that bacteria used, 240 00:15:16,680 --> 00:15:19,320 Speaker 1: and so they patented it. And that's why, if you 241 00:15:19,440 --> 00:15:22,280 Speaker 1: eat your yogurt, and a lot of people are against GMOs 242 00:15:22,360 --> 00:15:25,280 Speaker 1: or whatever, but then don't eat yogurt or cheese. That's 243 00:15:25,320 --> 00:15:28,920 Speaker 1: how we begin with gene editing, so that the yogurt 244 00:15:28,920 --> 00:15:32,400 Speaker 1: culture doesn't get killed by viruses. This is all sort 245 00:15:32,440 --> 00:15:36,080 Speaker 1: of wild. I mean, here you have things that had 246 00:15:36,080 --> 00:15:40,440 Speaker 1: evolved in nature that created a framework for things that we 247 00:15:40,440 --> 00:15:43,800 Speaker 1: were now trying to evolve in laboratories. And you can 248 00:15:43,840 --> 00:15:46,120 Speaker 1: go back and you can look at the way in 249 00:15:46,200 --> 00:15:50,160 Speaker 1: which nature itself in some areas had evolved, and it's 250 00:15:50,280 --> 00:15:53,440 Speaker 1: very striking to me. Part of what happens, and I 251 00:15:53,560 --> 00:15:56,160 Speaker 1: think this is one of the most complex parts of 252 00:15:56,680 --> 00:16:02,040 Speaker 1: how science occurs, is you have great individual scientists, but 253 00:16:02,120 --> 00:16:06,000 Speaker 1: they're inevitably part of a web of discovery that involves 254 00:16:06,080 --> 00:16:10,080 Speaker 1: many other people, and may involve people who were working 255 00:16:10,080 --> 00:16:13,840 Speaker 1: on projects that had no relationship to what they ultimately affected.
256 00:16:14,040 --> 00:16:16,080 Speaker 1: I mean, aren't you sort of fascinated by the way 257 00:16:16,120 --> 00:16:20,360 Speaker 1: both the formal team and the informal network come together 258 00:16:20,640 --> 00:16:24,520 Speaker 1: when you get these great breakthroughs? Absolutely, and what you 259 00:16:24,640 --> 00:16:29,000 Speaker 1: have is basic science doing this sort of dance with 260 00:16:29,080 --> 00:16:32,840 Speaker 1: applied scientists, and that takes people from around the world. 261 00:16:33,120 --> 00:16:37,520 Speaker 1: It takes a lot of collaboration. Jennifer Doudna collaborated with 262 00:16:37,560 --> 00:16:40,040 Speaker 1: the French scientist Emmanuelle Charpentier. They worked with 263 00:16:40,120 --> 00:16:43,960 Speaker 1: the yogurt makers I told you about, Rodolphe Barrangou and Philippe Horvath, 264 00:16:44,480 --> 00:16:49,640 Speaker 1: and they had a team that helped create this CRISPR tool. 265 00:16:49,920 --> 00:16:53,640 Speaker 1: But also competition is involved. There was another team at 266 00:16:53,720 --> 00:16:56,640 Speaker 1: the Broad Institute of MIT and Harvard led by Eric 267 00:16:56,720 --> 00:17:00,760 Speaker 1: Lander, who you know, and Feng Zhang, one of his scientists, 268 00:17:00,800 --> 00:17:03,480 Speaker 1: and they're competing to say, how can we use this 269 00:17:03,640 --> 00:17:06,359 Speaker 1: in humans? And they're in a battle still over the 270 00:17:06,400 --> 00:17:09,360 Speaker 1: patent rights for how do we use this tool to 271 00:17:09,520 --> 00:17:12,800 Speaker 1: edit human genes. And they're in a race to win 272 00:17:12,840 --> 00:17:15,640 Speaker 1: the prizes. It ends up that Jennifer Doudna, my main 273 00:17:15,760 --> 00:17:20,040 Speaker 1: character, and her partner won the Nobel Prize, but Feng Zhang 274 00:17:20,240 --> 00:17:23,680 Speaker 1: and his team at MIT have also won prizes.
So 275 00:17:23,760 --> 00:17:27,280 Speaker 1: you have the competition that's spurred by patents and prizes 276 00:17:27,600 --> 00:17:31,520 Speaker 1: and the priorities of publication and getting the fame for 277 00:17:31,640 --> 00:17:36,560 Speaker 1: doing things, but you also have the collaboration and collegiality 278 00:17:37,000 --> 00:17:40,880 Speaker 1: of hundreds of scientists around the world making little discoveries. 279 00:17:41,160 --> 00:17:45,400 Speaker 1: I sometimes think that patents are a little bit outmoded, 280 00:17:45,800 --> 00:17:48,480 Speaker 1: because they have to find one inventor or one clear 281 00:17:48,560 --> 00:17:51,919 Speaker 1: set of inventors when there may be a hundred people 282 00:17:52,000 --> 00:17:55,000 Speaker 1: or a thousand people who contributed. That's probably too 283 00:17:55,080 --> 00:17:59,399 Speaker 1: complicated to write into patent law. But certainly prizes like 284 00:17:59,520 --> 00:18:02,639 Speaker 1: the Nobel can only be given to three people. I 285 00:18:02,720 --> 00:18:05,440 Speaker 1: think that distorts science a bit, because it makes it 286 00:18:05,720 --> 00:18:08,879 Speaker 1: so you're not sharing your information as much as you would. 287 00:18:09,320 --> 00:18:12,159 Speaker 1: The interesting thing to me, though, is when coronavirus struck, 288 00:18:12,600 --> 00:18:16,520 Speaker 1: both the team at MIT and then Jennifer Doudna's team 289 00:18:16,600 --> 00:18:21,440 Speaker 1: at Berkeley both turned their attention to coronavirus, and they 290 00:18:21,520 --> 00:18:24,760 Speaker 1: quit trying to patent the use of whatever they discovered. 291 00:18:24,920 --> 00:18:28,439 Speaker 1: They just put it in the public domain instantly on 292 00:18:28,720 --> 00:18:33,000 Speaker 1: Internet servers, not in peer-reviewed journals, to say, here's 293 00:18:33,000 --> 00:18:37,040 Speaker 1: what we've discovered this week. Anybody fighting COVID can use it.
294 00:18:37,280 --> 00:18:40,520 Speaker 1: So I think maybe the pandemic helped us right that 295 00:18:40,680 --> 00:18:45,159 Speaker 1: balance between the competition that occurs and the cooperation that 296 00:18:45,240 --> 00:18:50,120 Speaker 1: occurs in basic science research. I'm curious, because you emphasized 297 00:18:50,119 --> 00:18:53,360 Speaker 1: it in your work on Steve Jobs, that Jobs himself 298 00:18:53,400 --> 00:18:57,320 Speaker 1: thought that creating teams was sort of central to why 299 00:18:57,320 --> 00:19:01,199 Speaker 1: Apple had worked. How did Doudna go about creating the 300 00:19:01,280 --> 00:19:06,240 Speaker 1: team that ultimately had such great accomplishments? Right. When I 301 00:19:06,240 --> 00:19:09,640 Speaker 1: asked Steve Jobs what was his greatest discovery, I thought 302 00:19:09,680 --> 00:19:14,280 Speaker 1: he'd say the iPhone or the Macintosh, but he said, no. Building 303 00:19:14,320 --> 00:19:17,320 Speaker 1: products is hard, but what's really hard is building a 304 00:19:17,359 --> 00:19:20,280 Speaker 1: team that can continue to build good products. And he 305 00:19:20,359 --> 00:19:24,920 Speaker 1: built a team like Franklin Roosevelt, from your history studies, 306 00:19:24,960 --> 00:19:28,880 Speaker 1: and to some extent Lincoln, from Doris Kearns Goodwin's Team 307 00:19:28,920 --> 00:19:32,520 Speaker 1: of Rivals. He liked to build teams of people that had creative tension. 308 00:19:32,840 --> 00:19:34,959 Speaker 1: He liked to have teams of people that were always 309 00:19:35,000 --> 00:19:38,840 Speaker 1: clashing with each other, because he thought that spurred creativity. 310 00:19:39,080 --> 00:19:43,600 Speaker 1: It challenged people's thinking. But there's no one correct way 311 00:19:43,680 --> 00:19:48,200 Speaker 1: to build teams.
And Doudna was much more collegial. Whenever 312 00:19:48,200 --> 00:19:51,679 Speaker 1: somebody was going to join one of the companies she founded, 313 00:19:51,960 --> 00:19:55,119 Speaker 1: or join the team at her Berkeley lab, or become 314 00:19:55,160 --> 00:19:58,880 Speaker 1: a postdoc studying under her, she would invite them 315 00:19:58,920 --> 00:20:02,040 Speaker 1: in and say, talk to everybody in the lab, talk 316 00:20:02,080 --> 00:20:04,000 Speaker 1: to everybody in the company, and then they would all 317 00:20:04,080 --> 00:20:06,480 Speaker 1: meet to make sure the person fit in. And so, 318 00:20:06,560 --> 00:20:09,040 Speaker 1: aren't you losing some of the value of creative 319 00:20:09,040 --> 00:20:12,200 Speaker 1: tension if you do that? And she says, maybe so. 320 00:20:12,400 --> 00:20:16,840 Speaker 1: But different leaders form teams in different ways, and my 321 00:20:16,960 --> 00:20:21,359 Speaker 1: way is to emphasize working together really closely, staying up 322 00:20:21,400 --> 00:20:25,680 Speaker 1: all night together to hack on a problem, and trusting everybody 323 00:20:25,680 --> 00:20:28,440 Speaker 1: else on the team. And you can look at Ben Franklin, 324 00:20:28,520 --> 00:20:31,720 Speaker 1: the book I did on him, and he helped build 325 00:20:31,760 --> 00:20:34,440 Speaker 1: that team of founders that was probably the greatest team 326 00:20:34,520 --> 00:20:38,760 Speaker 1: ever built, which is, you needed to have a man 327 00:20:38,760 --> 00:20:42,240 Speaker 1: of great rectitude like George Washington. You needed to have 328 00:20:42,359 --> 00:20:45,800 Speaker 1: really smart people like Jefferson and Madison. You needed to 329 00:20:45,840 --> 00:20:50,879 Speaker 1: have passionate people like Samuel Adams and his cousin John Adams.
330 00:20:51,240 --> 00:20:54,720 Speaker 1: But you also needed a Ben Franklin who would be 331 00:20:54,760 --> 00:20:58,000 Speaker 1: the glue that could bring them all together and sort of, 332 00:20:58,600 --> 00:21:19,399 Speaker 1: you know, wisely help reduce the tensions. I'm curious. One 333 00:21:19,400 --> 00:21:22,320 Speaker 1: of the things I admire about your work is that there's always 334 00:21:22,359 --> 00:21:27,159 Speaker 1: a sense of ethical concern and a sense of values 335 00:21:27,200 --> 00:21:30,840 Speaker 1: beyond just the story. And with CRISPR, you get, I think, 336 00:21:30,880 --> 00:21:35,280 Speaker 1: really into the cutting edge of an almost godlike 337 00:21:35,359 --> 00:21:38,200 Speaker 1: power that on the one hand can be really good, 338 00:21:39,080 --> 00:21:41,480 Speaker 1: but on the other hand could be really, really dangerous. 339 00:21:42,240 --> 00:21:43,960 Speaker 1: How, in your own mind, as you've read all this, 340 00:21:44,040 --> 00:21:46,520 Speaker 1: and as you've interviewed these people and been in their 341 00:21:46,600 --> 00:21:49,199 Speaker 1: labs and watched them, how do you work through the 342 00:21:49,280 --> 00:21:53,159 Speaker 1: ethics of CRISPR, whether it ends up being used in 343 00:21:53,400 --> 00:21:56,399 Speaker 1: a lab in Wuhan or it ends up being used at 344 00:21:56,600 --> 00:22:00,719 Speaker 1: a great hospital, saving lots of lives? When Jennifer Doudna invented this 345 00:22:00,840 --> 00:22:04,439 Speaker 1: technology with her team, she had a nightmare, and it 346 00:22:04,520 --> 00:22:06,439 Speaker 1: was that she gets called into a room and somebody 347 00:22:06,480 --> 00:22:08,400 Speaker 1: wants to know how to use it, and when that 348 00:22:08,440 --> 00:22:12,520 Speaker 1: person looks up, it's Hitler.
And so she has trouble 349 00:22:12,640 --> 00:22:16,240 Speaker 1: sleeping for weeks after that, and she starts gathering scientists 350 00:22:16,280 --> 00:22:20,840 Speaker 1: and religious leaders and political leaders and ethicists from around 351 00:22:20,840 --> 00:22:24,719 Speaker 1: the world, including China, where somebody I know was helping launch part 352 00:22:24,720 --> 00:22:28,879 Speaker 1: of the National Academy there, and the Royal Society in England, 353 00:22:28,920 --> 00:22:32,120 Speaker 1: the Europeans and the Americans and Canadians, and they say, 354 00:22:32,160 --> 00:22:35,919 Speaker 1: how are we going to use this tool properly? Because 355 00:22:36,320 --> 00:22:40,240 Speaker 1: we talked earlier about the clearly great stuff you can do, 356 00:22:40,400 --> 00:22:44,000 Speaker 1: like taking a kid with sickle cell and saying, okay, 357 00:22:44,160 --> 00:22:47,399 Speaker 1: you no longer have sickle cells. But you could also 358 00:22:47,960 --> 00:22:52,840 Speaker 1: make hereditary edits and give your children traits. And 359 00:22:52,920 --> 00:22:55,160 Speaker 1: you might say, well, that's a good idea, let's give 360 00:22:55,200 --> 00:22:57,440 Speaker 1: them the trait, you know, not to have sickle cell, 361 00:22:57,600 --> 00:23:00,879 Speaker 1: and that makes some sense. But in China a doctor 362 00:23:00,960 --> 00:23:04,879 Speaker 1: did it and made inheritable edits in early-stage embryos 363 00:23:04,960 --> 00:23:07,959 Speaker 1: so that you create designer children. And there was a 364 00:23:08,040 --> 00:23:12,040 Speaker 1: lot of upset because that was crossing a line. It's 365 00:23:12,080 --> 00:23:15,199 Speaker 1: almost like, you know, Adam and Eve biting into the 366 00:23:15,240 --> 00:23:18,920 Speaker 1: apple or Prometheus snatching fire from the gods.
If you're 367 00:23:18,960 --> 00:23:21,040 Speaker 1: going to snatch those powers, you're gonna have to be 368 00:23:21,080 --> 00:23:24,560 Speaker 1: careful of how you use that tree of knowledge. Well, 369 00:23:24,640 --> 00:23:28,080 Speaker 1: on second thought, what that Chinese scientist did was edit the 370 00:23:28,200 --> 00:23:31,439 Speaker 1: children so they didn't have the receptor for HIV, the 371 00:23:31,560 --> 00:23:34,480 Speaker 1: virus that causes AIDS. So you have to say, well, 372 00:23:34,520 --> 00:23:38,399 Speaker 1: maybe after this pandemic, using CRISPR to make sure we 373 00:23:38,400 --> 00:23:41,400 Speaker 1: don't have virus receptors, as long as it's safe, which 374 00:23:41,400 --> 00:23:44,520 Speaker 1: it wasn't when he did it, and we know it's effective, 375 00:23:44,920 --> 00:23:47,280 Speaker 1: maybe that's a good idea. But what would you think 376 00:23:47,280 --> 00:23:48,960 Speaker 1: about saying, all right, I want my kids to be 377 00:23:49,000 --> 00:23:52,840 Speaker 1: eight inches taller? That really doesn't advantage all of humanity. 378 00:23:52,880 --> 00:23:56,200 Speaker 1: If everybody were eight inches taller, we wouldn't be better off, 379 00:23:56,240 --> 00:23:59,760 Speaker 1: you know, especially given the size of airline seats these days. 380 00:24:00,400 --> 00:24:04,000 Speaker 1: So it's only an advantage if you're wealthy and you 381 00:24:04,000 --> 00:24:06,880 Speaker 1: can buy eight inches for your kid, and then your 382 00:24:06,960 --> 00:24:09,320 Speaker 1: kid's better off. And that gets us back to the 383 00:24:09,400 --> 00:24:12,960 Speaker 1: science fiction. Brave New World is exactly about that. We 384 00:24:13,119 --> 00:24:19,400 Speaker 1: create a genetic elite, with genetic inequalities that people can pay for.
Well, 385 00:24:19,480 --> 00:24:24,360 Speaker 1: whatever you believe about the free market system, you'd probably 386 00:24:24,400 --> 00:24:28,760 Speaker 1: pause before you'd say, well, let's take the inequalities we have 387 00:24:28,840 --> 00:24:32,119 Speaker 1: and allow people to encode them, so certain families have 388 00:24:32,160 --> 00:24:37,000 Speaker 1: an aristocracy. You know, they're taller, and they have higher 389 00:24:37,080 --> 00:24:40,600 Speaker 1: IQ points and all these things. You could have a 390 00:24:40,720 --> 00:24:44,760 Speaker 1: dystopian society if we got to that Brave New World 391 00:24:44,840 --> 00:24:48,560 Speaker 1: where different parts of our species were genetically elite or 392 00:24:48,560 --> 00:24:51,680 Speaker 1: not. That's what the movie Gattaca is about. Well, part 393 00:24:51,720 --> 00:24:54,840 Speaker 1: of what intrigues me is that we don't fully 394 00:24:54,960 --> 00:25:02,200 Speaker 1: understand why certain adaptive patterns occur. So, for example, there 395 00:25:02,240 --> 00:25:06,159 Speaker 1: may well have been a positive outcome from having sickle 396 00:25:06,200 --> 00:25:11,760 Speaker 1: cell in equatorial Africa, probably involving malaria. And it'll be 397 00:25:11,760 --> 00:25:14,480 Speaker 1: interesting, because I think, as smart as we are now 398 00:25:14,480 --> 00:25:18,000 Speaker 1: as a species, we still tend to ask pretty linear 399 00:25:18,119 --> 00:25:24,680 Speaker 1: questions and not necessarily understand all the permutations that make 400 00:25:24,720 --> 00:25:28,520 Speaker 1: it more dangerous to do certain things.
I think, for example, 401 00:25:28,880 --> 00:25:32,440 Speaker 1: of this whole issue of biological warfare, which I've been involved 402 00:25:32,440 --> 00:25:35,040 Speaker 1: with for over twenty years. In a species which 403 00:25:35,040 --> 00:25:42,119 Speaker 1: has now invented hydrogen bombs and which has created extraordinary 404 00:25:42,200 --> 00:25:46,920 Speaker 1: levels of control capability through information, it's sort of hard 405 00:25:46,960 --> 00:25:50,439 Speaker 1: to say the next one is worse, but there is 406 00:25:50,480 --> 00:25:54,680 Speaker 1: something, it seems to me, a little more frightening about 407 00:25:54,720 --> 00:25:59,359 Speaker 1: a biological weapon than about these other things. As you 408 00:25:59,400 --> 00:26:03,200 Speaker 1: were going through these labs and you were learning about 409 00:26:03,200 --> 00:26:06,760 Speaker 1: the capabilities and the possibilities, to what extent were you also 410 00:26:06,840 --> 00:26:11,600 Speaker 1: just jarred by this question of Shelley's Frankenstein, or a 411 00:26:11,720 --> 00:26:15,440 Speaker 1: terrorist who pays somebody a lot of money to develop 412 00:26:15,960 --> 00:26:19,520 Speaker 1: a specific capacity and then unleashes it, or threatens 413 00:26:19,560 --> 00:26:23,879 Speaker 1: to unleash it unless their demands are met? Absolutely. I was worried 414 00:26:23,920 --> 00:26:27,520 Speaker 1: about it in the coronavirus pandemic. Whatever the source of 415 00:26:27,560 --> 00:26:30,200 Speaker 1: that virus, and I don't think we fully know yet, 416 00:26:30,720 --> 00:26:34,240 Speaker 1: it at least raises the specter that people can create, using 417 00:26:34,400 --> 00:26:39,560 Speaker 1: genetic or biological tools, very bad weapons.
The largest 418 00:26:39,680 --> 00:26:43,760 Speaker 1: funder right now of research at Jennifer Doudna's lab and 419 00:26:44,000 --> 00:26:47,520 Speaker 1: at the lab at the Broad Institute on uses of CRISPR 420 00:26:47,720 --> 00:26:52,160 Speaker 1: is DARPA, the Defense Department's Advanced Research Projects Agency, and 421 00:26:52,200 --> 00:26:58,200 Speaker 1: they've created, and Jennifer's team helped create, something called anti-CRISPR, 422 00:26:58,520 --> 00:27:01,960 Speaker 1: which you can figure out from just what it says, and 423 00:27:02,080 --> 00:27:05,240 Speaker 1: it means, just like you can have ballistic missile systems, 424 00:27:05,800 --> 00:27:09,080 Speaker 1: you can have anti-ballistic missile systems. So the military 425 00:27:09,119 --> 00:27:12,679 Speaker 1: gets it quite well. And so one of the worries 426 00:27:12,720 --> 00:27:15,680 Speaker 1: I would have with gene editing is not so much 427 00:27:15,680 --> 00:27:19,880 Speaker 1: that you gene-edit a human, though that's something that's possible. 428 00:27:20,320 --> 00:27:24,040 Speaker 1: It's that you gene-edit a whole wave of mosquitoes with 429 00:27:24,160 --> 00:27:27,320 Speaker 1: a gene drive that would make them able to carry 430 00:27:27,480 --> 00:27:33,160 Speaker 1: incredibly dangerous pathogens and unleash them on a population. I mean, 431 00:27:33,160 --> 00:27:37,240 Speaker 1: that's a pretty simple biological weapon that could be pretty destructive, 432 00:27:37,560 --> 00:27:40,160 Speaker 1: and so anti-CRISPR is a way to make sure 433 00:27:40,200 --> 00:27:44,639 Speaker 1: those mosquitoes' CRISPR systems, the edits of those, get turned off.
434 00:27:44,920 --> 00:27:48,919 Speaker 1: But Vladimir Putin, at a youth conference a 435 00:27:48,960 --> 00:27:52,080 Speaker 1: couple of years ago, was extolling the virtues of 436 00:27:52,119 --> 00:27:55,920 Speaker 1: this tool that Doudna had helped create, and he said, 437 00:27:56,000 --> 00:28:00,320 Speaker 1: CRISPR may someday be able to make us create soldiers 438 00:28:00,320 --> 00:28:04,199 Speaker 1: that are impervious to radiation, they don't get radiation sickness, 439 00:28:04,400 --> 00:28:07,840 Speaker 1: and they may be impervious to pain or even impervious 440 00:28:07,960 --> 00:28:11,840 Speaker 1: to fear. So you can see what an autocrat or 441 00:28:11,920 --> 00:28:15,560 Speaker 1: a person interested in military uses could think of doing. 442 00:28:16,040 --> 00:28:19,600 Speaker 1: I do think that it's a possibility that it can 443 00:28:19,680 --> 00:28:24,400 Speaker 1: become the subject for cooperation. I take heart that the 444 00:28:24,520 --> 00:28:27,879 Speaker 1: Chinese are the ones pretty far ahead in this game. 445 00:28:28,359 --> 00:28:33,240 Speaker 1: They've done it in cancer, and after one of their scientists, 446 00:28:33,320 --> 00:28:36,360 Speaker 1: a kind of young scientist in one of the provinces, 447 00:28:37,040 --> 00:28:41,680 Speaker 1: edited embryos, they decided it was against international and 448 00:28:41,760 --> 00:28:45,280 Speaker 1: their own national regulations. He's put on trial and he's 449 00:28:45,320 --> 00:28:48,480 Speaker 1: in jail for three years, and they started coming to 450 00:28:48,600 --> 00:28:52,000 Speaker 1: meetings with their American counterparts, saying, all right, what 451 00:28:52,120 --> 00:28:54,640 Speaker 1: are the limits and how are we going to have 452 00:28:54,880 --> 00:29:00,520 Speaker 1: detection to make sure people don't make inheritable genetic edits?
So, 453 00:29:00,600 --> 00:29:02,960 Speaker 1: as you know from studies of the Cold War and 454 00:29:03,000 --> 00:29:06,720 Speaker 1: anything else, we're going to be in a struggle with China. 455 00:29:06,800 --> 00:29:10,040 Speaker 1: And when Tony Blinken and Jake Sullivan met their counterparts 456 00:29:10,040 --> 00:29:12,360 Speaker 1: in Alaska a couple of weeks ago, there was a 457 00:29:12,440 --> 00:29:15,960 Speaker 1: long list of things they fought about. But even Kissinger, 458 00:29:16,080 --> 00:29:19,000 Speaker 1: when he was doing détente with Russia, said, well, 459 00:29:19,000 --> 00:29:22,080 Speaker 1: maybe here's a list that we can try to cooperate on, 460 00:29:22,640 --> 00:29:26,920 Speaker 1: and that would be cultural and scientific exchanges. And at 461 00:29:27,000 --> 00:29:29,680 Speaker 1: the moment we're now doing that with CRISPR. Not that 462 00:29:29,720 --> 00:29:33,120 Speaker 1: I necessarily trust all other countries, but I think it's 463 00:29:33,160 --> 00:29:36,800 Speaker 1: good to be discussing with China and the Europeans and 464 00:29:36,920 --> 00:29:40,040 Speaker 1: the British, and hopefully someday the Russians will come in 465 00:29:40,120 --> 00:29:43,560 Speaker 1: on it, ways that we're going to control genetic editing. 466 00:29:44,120 --> 00:29:45,800 Speaker 1: I have to ask you, because you sort of remind 467 00:29:45,800 --> 00:29:49,000 Speaker 1: me of a wonderful gourmet shopper who goes through this 468 00:29:49,560 --> 00:29:53,280 Speaker 1: grocery store of life, and you pluck Benjamin Franklin out, 469 00:29:53,280 --> 00:29:56,520 Speaker 1: and then you pluck out, you know, somebody else. In 470 00:29:56,560 --> 00:30:01,960 Speaker 1: almost every case, they're unique personalities with fascinating life stories 471 00:30:02,320 --> 00:30:05,959 Speaker 1: who've had a significant impact on history.
So I have 472 00:30:06,040 --> 00:30:08,720 Speaker 1: to ask, what part of the grocery store are you 473 00:30:08,800 --> 00:30:12,160 Speaker 1: drifting towards now as you think about your next great project? 474 00:30:13,000 --> 00:30:18,040 Speaker 1: That's a good question, because I was surprised how amazing 475 00:30:18,080 --> 00:30:21,760 Speaker 1: it was to stumble across the grocery aisle that involved 476 00:30:21,800 --> 00:30:26,040 Speaker 1: the life sciences, something I had not thought about too 477 00:30:26,160 --> 00:30:28,480 Speaker 1: much before. And I thought this was going to be 478 00:30:28,520 --> 00:30:31,480 Speaker 1: a rich and interesting grocery aisle, to use your metaphor, 479 00:30:31,960 --> 00:30:37,800 Speaker 1: filled with amazing characters and colorful things. And then when 480 00:30:37,800 --> 00:30:40,800 Speaker 1: I studied CRISPR and the coronavirus, I realized I'd been 481 00:30:40,880 --> 00:30:47,040 Speaker 1: understating how amazing this field is. I'm interested in science 482 00:30:47,080 --> 00:30:52,680 Speaker 1: and technology, but I'm particularly, like you, interested in people. 483 00:30:53,760 --> 00:30:56,920 Speaker 1: I'll call them the Ben Franklins and Leonardo da Vincis 484 00:30:57,240 --> 00:31:00,720 Speaker 1: who try to connect everything. They are polymaths. I mean, 485 00:31:00,760 --> 00:31:03,440 Speaker 1: I started out at a news magazine where I would 486 00:31:03,480 --> 00:31:06,040 Speaker 1: write about music one week, medicine the next week, 487 00:31:06,040 --> 00:31:09,040 Speaker 1: foreign policy the next, and business the fourth week. 488 00:31:09,560 --> 00:31:12,760 Speaker 1: So I became interested in different fields.
So I'm gonna 489 00:31:12,960 --> 00:31:18,640 Speaker 1: try to find somebody who, like Leonardo da Vinci did, 490 00:31:18,680 --> 00:31:22,560 Speaker 1: and maybe Aristotle did, tries to connect 491 00:31:22,600 --> 00:31:28,239 Speaker 1: all the fields, because patterns ripple across nature. And if 492 00:31:28,280 --> 00:31:31,560 Speaker 1: you see how spirals work when water is running past a rock, 493 00:31:31,680 --> 00:31:34,120 Speaker 1: as Leonardo did as a kid, you're going to end 494 00:31:34,200 --> 00:31:37,680 Speaker 1: up understanding the math of spirals and then maybe painting 495 00:31:37,680 --> 00:31:40,440 Speaker 1: the curls of the Mona Lisa. And so if you're 496 00:31:40,480 --> 00:31:45,080 Speaker 1: interested in everything from anatomy to art to zoology, you 497 00:31:45,160 --> 00:31:47,240 Speaker 1: have a feel for those patterns. So I'm looking for 498 00:31:47,360 --> 00:31:52,160 Speaker 1: great, what I would call, renaissance people, in the sense 499 00:31:52,200 --> 00:31:54,760 Speaker 1: that was the original meaning of a renaissance person. The 500 00:31:54,840 --> 00:32:00,440 Speaker 1: question was, in part, was knowledge sufficiently limited, and at 501 00:32:00,480 --> 00:32:05,080 Speaker 1: the same time being infused with enough new things being 502 00:32:05,080 --> 00:32:08,800 Speaker 1: imported from the Arab world and being imported from the 503 00:32:08,880 --> 00:32:11,880 Speaker 1: rediscovery of the Greek and Roman world, that you were suddenly 504 00:32:11,920 --> 00:32:14,240 Speaker 1: at this intersection. And I think part of what you've 505 00:32:14,240 --> 00:32:18,560 Speaker 1: proven is that in every century you can find people 506 00:32:19,200 --> 00:32:22,880 Speaker 1: who have common characteristics and who leave behind a bigger 507 00:32:22,920 --> 00:32:27,000 Speaker 1: footprint than you would imagine.
Yeah, it happens in Florence, 508 00:32:27,160 --> 00:32:31,440 Speaker 1: as you say, in the late fourteen hundreds, when Constantinople 509 00:32:31,600 --> 00:32:34,080 Speaker 1: is falling and people from the Arab world are coming 510 00:32:34,360 --> 00:32:37,840 Speaker 1: and bringing amazing things like algebra, and I think we 511 00:32:38,040 --> 00:32:41,200 Speaker 1: sometimes forget how amazing algebra is if you want to 512 00:32:41,240 --> 00:32:44,960 Speaker 1: understand this world. Or people coming in from Germany and 513 00:32:45,000 --> 00:32:49,440 Speaker 1: bringing the printing press that Gutenberg has just invented, and 514 00:32:49,560 --> 00:32:52,520 Speaker 1: people coming in from Asia on the Silk Road and 515 00:32:52,600 --> 00:32:55,760 Speaker 1: other places, and it all fuses together in Florence, and 516 00:32:55,840 --> 00:32:58,920 Speaker 1: you have these cradles of creativity throughout history. I would 517 00:32:58,920 --> 00:33:03,640 Speaker 1: put Philadelphia in the seventeen seventies in that category, because, 518 00:33:04,240 --> 00:33:07,480 Speaker 1: unlike other places in the colonies, it wasn't a Puritan 519 00:33:07,560 --> 00:33:11,320 Speaker 1: theocracy like Boston, nor was it like Virginia. It had all 520 00:33:11,360 --> 00:33:15,800 Speaker 1: sorts of people in it. It had Anglicans and Protestants 521 00:33:15,880 --> 00:33:18,880 Speaker 1: and Catholics and slaves and freed slaves, and Jews and 522 00:33:19,040 --> 00:33:24,000 Speaker 1: Moravians and loyalists and anti-monarchists and everything else. And 523 00:33:24,040 --> 00:33:26,960 Speaker 1: that ferment allows a Ben Franklin and a group 524 00:33:26,960 --> 00:33:31,440 Speaker 1: of founders to come together, creating the greatest document ever written, 525 00:33:31,520 --> 00:33:34,320 Speaker 1: which is our Constitution.
So you have to look for 526 00:33:34,360 --> 00:33:39,040 Speaker 1: those cradles of creativity where just different ideas, as you 527 00:33:39,080 --> 00:33:42,800 Speaker 1: put it, you know, come together into a ferment. Yeah. 528 00:33:42,840 --> 00:33:46,920 Speaker 1: Peter Drucker once wrote that the key to genius wasn't 529 00:33:46,960 --> 00:33:50,560 Speaker 1: how smart you were or how hard you worked, it 530 00:33:50,680 --> 00:33:53,440 Speaker 1: was the size of the question you ask, and that 531 00:33:53,560 --> 00:33:56,320 Speaker 1: people who ask a certain size question sort of have 532 00:33:56,360 --> 00:33:59,080 Speaker 1: to grow into genius status just to find the answer. 533 00:33:59,680 --> 00:34:01,880 Speaker 1: You know, this happened to Jennifer Doudna, my main 534 00:34:02,040 --> 00:34:05,600 Speaker 1: character, in the nineteen nineties. She was a graduate student, 535 00:34:05,640 --> 00:34:10,000 Speaker 1: and most of the men in biology then were on 536 00:34:10,080 --> 00:34:13,120 Speaker 1: the Human Genome Project, which was sort of an alpha 537 00:34:13,120 --> 00:34:16,200 Speaker 1: male project, you know, the people involved from Craig Venter 538 00:34:16,280 --> 00:34:20,000 Speaker 1: to Eric Lander to Francis Collins, and so there were 539 00:34:20,000 --> 00:34:22,360 Speaker 1: a couple of women who said, all right, I'm not 540 00:34:22,400 --> 00:34:24,400 Speaker 1: going to run where the soccer ball is. I'm going 541 00:34:24,480 --> 00:34:26,640 Speaker 1: to play the rest of the field.
And they're the 542 00:34:26,680 --> 00:34:29,840 Speaker 1: ones who focus not on sequencing the DNA, but on 543 00:34:29,960 --> 00:34:32,360 Speaker 1: figuring out the role of the RNA, which, as we 544 00:34:32,400 --> 00:34:35,000 Speaker 1: said at the very beginning, turns out, if you're going 545 00:34:35,040 --> 00:34:36,360 Speaker 1: to make a vaccine, if you're going to make a 546 00:34:36,440 --> 00:34:39,600 Speaker 1: gene editing tool, the RNA is the molecule that you 547 00:34:39,680 --> 00:34:43,960 Speaker 1: want to understand. And she's doing it with a professor 548 00:34:44,000 --> 00:34:48,200 Speaker 1: at Harvard named Jack Szostak, and she's just in 549 00:34:48,239 --> 00:34:53,440 Speaker 1: the minute details of exactly the shape and the structure 550 00:34:53,440 --> 00:34:57,600 Speaker 1: of RNA and how maybe that allows RNA to edit 551 00:34:57,719 --> 00:35:01,520 Speaker 1: itself and to create copies of itself. And 552 00:35:01,640 --> 00:35:06,040 Speaker 1: Jack Szostak says, all right, but what's the big picture? 553 00:35:06,320 --> 00:35:09,719 Speaker 1: What's the big idea? I mean, this is fine, but what 554 00:35:09,800 --> 00:35:12,560 Speaker 1: are you pursuing? And she has a few answers. He 555 00:35:12,600 --> 00:35:15,440 Speaker 1: brushes those aside. He says, no, the big question 556 00:35:15,680 --> 00:35:19,920 Speaker 1: is, how did life begin? And nobody's figured it out yet.
557 00:35:20,200 --> 00:35:24,280 Speaker 1: But if you can discover how RNA can come together 558 00:35:24,320 --> 00:35:27,399 Speaker 1: with just a pool of, really, four chemicals in some 559 00:35:27,520 --> 00:35:31,239 Speaker 1: primordial stew four billion years ago, and then you can 560 00:35:31,320 --> 00:35:35,280 Speaker 1: show how, just in that mix of chemicals four billion 561 00:35:35,440 --> 00:35:40,520 Speaker 1: years ago, it learned to replicate itself, then you've helped 562 00:35:40,520 --> 00:35:45,560 Speaker 1: discover the secret of how life began. And indeed, that's 563 00:35:45,640 --> 00:35:48,960 Speaker 1: now our theory, called the RNA World, of how life 564 00:35:49,000 --> 00:35:51,799 Speaker 1: began on the planet. And Jennifer Doudna told me 565 00:35:51,840 --> 00:35:55,759 Speaker 1: it taught her two related things, which is, always look 566 00:35:55,840 --> 00:35:59,480 Speaker 1: at the big picture, and number two, never forget that 567 00:35:59,520 --> 00:36:02,680 Speaker 1: God is in the details. So it's a tiny detail 568 00:36:02,920 --> 00:36:05,920 Speaker 1: of how that molecule folds and twists, but it's the 569 00:36:05,960 --> 00:36:09,759 Speaker 1: big picture that drives you. That's amazing. Well, listen, you're 570 00:36:09,760 --> 00:36:14,359 Speaker 1: such a polymath yourself, and you have been so extraordinarily 571 00:36:14,360 --> 00:36:17,560 Speaker 1: productive, that we're going to list all of your books 572 00:36:17,600 --> 00:36:20,319 Speaker 1: on our show page, and people could actually go through 573 00:36:20,360 --> 00:36:23,600 Speaker 1: the University of Isaacson. To have read your books should 574 00:36:23,640 --> 00:36:26,920 Speaker 1: be the equivalent of getting a graduate degree. Well, let 575 00:36:26,920 --> 00:36:30,200 Speaker 1: me turn the compliment back: you, too, are a polymath.
576 00:36:30,360 --> 00:36:33,400 Speaker 1: I never cease to be amazed, in the thirty or 577 00:36:33,440 --> 00:36:36,439 Speaker 1: forty years in which our paths have crossed, or you've come 578 00:36:36,520 --> 00:36:39,799 Speaker 1: to conferences, or I've been with you, that you have 579 00:36:39,920 --> 00:36:43,040 Speaker 1: some new ideas about everything at all times. And that's 580 00:36:43,040 --> 00:36:46,160 Speaker 1: what we should aspire to do. It keeps our minds nimble, 581 00:36:46,239 --> 00:36:51,040 Speaker 1: it keeps our minds open, and it reminds us that curiosity, 582 00:36:51,320 --> 00:36:56,040 Speaker 1: just pure curiosity, is the key to creativity. I think 583 00:36:56,080 --> 00:36:59,360 Speaker 1: that's right. I've also tried to follow the principle of 584 00:37:00,840 --> 00:37:03,719 Speaker 1: asking questions large enough that it's worth the effort to 585 00:37:03,800 --> 00:37:07,600 Speaker 1: learn them, and that turns out to be an endless process, because, 586 00:37:07,640 --> 00:37:09,879 Speaker 1: as you well know from your own experience, the world 587 00:37:10,040 --> 00:37:14,520 Speaker 1: is amazingly complex and constantly evolving. You and your life 588 00:37:14,560 --> 00:37:18,640 Speaker 1: have reflected that, both professionally and geographically, and I think, 589 00:37:18,640 --> 00:37:22,240 Speaker 1: as somebody said, probably mine has too. So it's always 590 00:37:22,280 --> 00:37:25,760 Speaker 1: great to be with you. And it's remarkable to watch 591 00:37:25,800 --> 00:37:30,320 Speaker 1: your career and your ability to keep absorbing new things 592 00:37:30,400 --> 00:37:32,880 Speaker 1: and keep teaching the rest of us new things. So 593 00:37:33,280 --> 00:37:35,879 Speaker 1: I'm very grateful you took this time to be with us. 594 00:37:36,320 --> 00:37:39,000 Speaker 1: It's always good to be with somebody who stays curious.
595 00:37:39,120 --> 00:37:44,920 Speaker 1: Thank you, Newt. Thank you to my guest, Walter Isaacson. 596 00:37:45,440 --> 00:37:48,279 Speaker 1: You can read more about Jennifer Doudna and get a 597 00:37:48,280 --> 00:37:51,719 Speaker 1: link to order The Code Breaker on our show page at 598 00:37:51,800 --> 00:37:54,840 Speaker 1: newtsworld dot com. Newt's World is produced by Gingrich 599 00:37:54,880 --> 00:37:59,640 Speaker 1: three sixty and iHeartMedia. Our executive producer is Debbie Myers, 600 00:38:00,040 --> 00:38:04,280 Speaker 1: our producer is Garnsey Sloan, and our researcher is Rachel Peterson. 601 00:38:04,880 --> 00:38:07,840 Speaker 1: The artwork for the show was created by Steve Penley. 602 00:38:08,360 --> 00:38:11,799 Speaker 1: Special thanks to the team at Gingrich three sixty. If 603 00:38:11,800 --> 00:38:14,560 Speaker 1: you've been enjoying Newt's World, I hope you'll go to Apple 604 00:38:14,600 --> 00:38:18,080 Speaker 1: Podcasts and both rate us with five stars and give 605 00:38:18,160 --> 00:38:20,799 Speaker 1: us a review, so others can learn what it's all 606 00:38:20,800 --> 00:38:24,920 Speaker 1: about. Right now, listeners of Newt's World can sign up 607 00:38:24,920 --> 00:38:28,960 Speaker 1: for my three free weekly columns at Gingrich three sixty 608 00:38:29,440 --> 00:38:34,239 Speaker 1: dot com slash newsletter. I'm Newt Gingrich. This is Newt's World.