Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. Today I'm speaking to one of the most influential and extraordinary scientists I know, Eric Lander. Eric is the president and founding director of the Broad Institute of MIT and Harvard. He's a geneticist, a molecular biologist, a mathematician by original training, and he's also the host of a brand new Pushkin podcast called Brave New Planet. Eric has been at the epicenter of a great transformation in biology, and indeed of science, that's taken place over the last thirty years: a transformation focused most fundamentally on what can be done with the gathering of greater and greater amounts of data about biological systems, including the human body. These developments are crucial to how science is being done every day, and they're absolutely crucial as well to how science has responded to COVID-19. It's a thrill to have Eric on the podcast.

Eric, thank you so much for joining us.
I want to start with the role you've played in what has really been a transformative period in the history of modern science, a period which in certain ways is being reflected in the cutting-edge developments happening every day that we all care about most in science, including, indirectly, in the context of the vaccines for COVID-19. That's a period in which big data has come to be a fundamental, some would say the fundamental, tool for solving our biggest scientific problems. You came across this already when you were working on the Human Genome Project, and then it's been your central focus in your creation of the Broad. Now, I know this is a big-picture question, but I wonder if you would just blow a few bars, if you would, for the audience, about how this transformation in the way we think about what science is and how it works has come home.

Wow, what a small question to start with, Noah.
I think there's 35 00:02:33,076 --> 00:02:37,316 Speaker 1: a kind of high school cartoon version of what science 36 00:02:37,476 --> 00:02:43,796 Speaker 1: is which runs something like this, Scientists make a hypothesis 37 00:02:43,876 --> 00:02:45,956 Speaker 1: and then they set up a test, and then they 38 00:02:46,076 --> 00:02:49,556 Speaker 1: test their hypothesis and see if it holds up or 39 00:02:49,596 --> 00:02:54,116 Speaker 1: they should reject it. I think it misses a fundamental 40 00:02:54,196 --> 00:02:59,996 Speaker 1: question where do hypotheses come from? Anyway, And many of 41 00:03:00,036 --> 00:03:04,076 Speaker 1: the most important hypotheses come from just looking at the world. 42 00:03:04,676 --> 00:03:07,996 Speaker 1: And what's happened in the last twenty thirty years in 43 00:03:07,996 --> 00:03:13,556 Speaker 1: biology is we've been able to take almost entirely unbiased 44 00:03:13,716 --> 00:03:16,956 Speaker 1: looks at parts of biology. The human genome was the 45 00:03:16,996 --> 00:03:21,356 Speaker 1: first great example of that three billion letters of human instructions, 46 00:03:21,356 --> 00:03:26,796 Speaker 1: and rather than diving in with a hypothesis I think 47 00:03:26,876 --> 00:03:32,236 Speaker 1: this gene causes cystic fibrosis or some other aspect of 48 00:03:32,236 --> 00:03:37,476 Speaker 1: a disease, you can ask questions like, amongst every gene 49 00:03:37,516 --> 00:03:41,396 Speaker 1: in the genome, which ones show an inheritance pattern that 50 00:03:41,436 --> 00:03:46,756 Speaker 1: matches up with cystic fibrosis, And instead of being limited 51 00:03:46,876 --> 00:03:51,956 Speaker 1: to hypothesis driven science, you can do what some people 52 00:03:52,356 --> 00:03:56,756 Speaker 1: you know call hypothesis free science. 
We're never free of hypotheses, but the limits are removed so much that it's changed the way we approach biology. It's sort of big data, but the way I really think about it is that life keeps notes in its lab notebook, the genome, and in this generation we've got access to that lab notebook and we can read it. The questions we can ask are only limited by our creativity.

Translating that into practical terms: how, for example, today, when scientists confronted SARS-CoV-2 and said, okay, let's figure this out and then let's find something to do about it, how did their approach differ from the way it would have looked, say, twenty-five or thirty years ago, or, in this case, four years ago actually?

So let me give you an example. There is something called the Human Cell Atlas. It's probably this generation's successor to the Human Genome Project. The Human Genome Project was reading out all the genetic information in a human. The Human Cell Atlas is reading out what are all the cells in the human body, by asking which of the twenty thousand genes are turned on in this cell or that cell.
It became possible, only about eight years ago, to start reading on a single-cell basis the genes that are turned on, and by how much, in first dozens of cells, then thousands of cells, then millions of cells, and eventually it'll be billions of cells. An international program to create that catalog, describing all possible cells, their expression patterns of genes, and the states they find themselves in, just caught fire around the world. So then SARS-CoV-2 comes along, and people ask: what cells might SARS-CoV-2 infect? Well, it was known that the previous coronavirus, SARS, infected cells that had a particular gene active; it goes by the name ACE2. And people quickly figured out that the same gene was used as the receptor for the new coronavirus. Then you wanted to ask the question: what's every cell in the body that expresses the receptor for the new coronavirus? Well, in ancient times, meaning four years ago, one would have had to have done thousands of experiments to figure that out.
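(A toy illustration of the lookup in question: once you have a table of which genes are expressed in which cell types, finding every cell type that expresses a receptor gene such as ACE2 is a simple query rather than thousands of experiments. The cell types, genes, and expression values below are invented for this sketch; they are not real Human Cell Atlas data.)

```python
# Invented table: cell type -> mean expression level per gene.
EXPRESSION = {
    "nasal_goblet":   {"ACE2": 2.1, "TMPRSS2": 1.4},
    "lung_AT2":       {"ACE2": 1.7, "TMPRSS2": 0.9},
    "gut_enterocyte": {"ACE2": 3.0, "TMPRSS2": 2.2},
    "neuron":         {"ACE2": 0.0, "TMPRSS2": 0.1},
}

def cell_types_expressing(gene, threshold=0.5):
    """Return the cell types whose mean expression of `gene` exceeds `threshold`."""
    return sorted(ct for ct, levels in EXPRESSION.items()
                  if levels.get(gene, 0.0) > threshold)

# Which cell types might the virus be able to infect?
print(cell_types_expressing("ACE2"))  # ['gut_enterocyte', 'lung_AT2', 'nasal_goblet']
```

The real analyses run the same kind of query over expression profiles from millions of individual cells.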
Instead, the scientific community came together and, in a matter of two weeks, sifted through all the data about which genes are on and off in which cell types to say: oh yes, here are the different cell types that might be infectable. And you know, one of the things they found was the smell receptors in your nose. They made that discovery, and then three days later people reported that people were losing their sense of smell when infected by the virus. Bingo. The idea that we have lookup tables (pretty good now, but much, much better in the coming years) for every cell in our body is going to mean that for any process, whether it's a virus that's infecting us or some other physiological process, we're going to be able to look up signatures for it. It doesn't solve all problems; it doesn't make a disease go away. But maybe it cuts a year or more off the work. I guess an older example would be chemistry before there was a periodic table of the elements.
After there was a periodic table of the elements, it didn't solve all of chemistry, but no chemist thought about chemistry the same way once the periodic table was in evidence.

Do the mRNA vaccines that are in development, and that have so far been tested in what looks like a very promising way to address SARS-CoV-2 infection, themselves owe something to these revolutions in biology? My impression is that they do.

Oh yeah, of course. How did anybody find this SARS-CoV-2 virus so quickly? You know, establishing what was the virus behind AIDS, HIV, took quite a long time. Even establishing some of these other, more recent novel viruses that have appeared has taken a long time. Now, with kind of a hypothesis-free, brute-force sequencing approach, you could take cells from infected patients, sequence the genes that are getting expressed, and say: whoa, are there any genes here that aren't supposed to be in humans? And when you sift the genes you see in the cells against the genes you expect in the cells, the difference kind of turns out to be this virus that's infecting you.
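(The sifting described here, subtracting what you expect from what you see, can be sketched in miniature. A real pipeline aligns sequencing reads against the human reference genome; in this toy version, membership in a small invented set of "host" 8-mers stands in for that alignment step, and all sequences are made up.)

```python
# Invented "human" 8-mers standing in for the host reference genome.
HOST_KMERS = {"ATGCCGTA", "CCGTATTA", "GGATCCAA"}
K = 8

def non_host_reads(reads):
    """Keep only reads containing no k-mer from the host reference set."""
    def looks_host(read):
        return any(read[i:i + K] in HOST_KMERS for i in range(len(read) - K + 1))
    return [r for r in reads if not looks_host(r)]

sample = [
    "ATGCCGTATTA",   # explained by the host genome
    "TTTTGCGCAGGA",  # matches nothing known: candidate viral sequence
    "GGATCCAATTCG",  # explained by the host genome
]
print(non_host_reads(sample))  # ['TTTTGCGCAGGA']
```

Whatever survives the subtraction is the lead to follow up: the candidate novel genome.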
You didn't need to go in with the hypothesis that it was a coronavirus, or that you were looking for a particular kind of gene. You could go in saying: I think there's a virus; let's look for what novel stuff is getting expressed. So it's a great example of the ability to do discovery science by looking at the big picture in a comprehensive way. These kinds of global views of biology have become possible, and they depend on people generating data, making those data freely available, on a whole new generation of scientists growing up who live to analyze those kinds of data, and on incredible creativity. Because you'd think, oh, there's a big pile of data, you just crunch it through the computer. But given a big pile of information, you can ask a thousand different questions, and it depends on the perspective you're bringing. So I'm just in awe of the creativity that people bring to these questions. I think at the beginning people thought, oh, all these data are boring. But of course they're an invitation for an incredible diversity of intellectual questions.

This is a philosophical question.
But do you think that the changes we've seen in these recent decades count not just as an evolution in the way biology is done, but in fact as a revolution? Especially with respect to being able to do something different than saying, here's our hypothesis, now let's search; instead being able to look at a huge body of data and say, well, what are the associations? What are the associations across the genome? Let's drill down and see which ones turn out to be real and which ones turn out not to be real. At that point you can form a hypothesis, but, as it were, the hypothesis comes from the associations.

Yeah. I think people can fight over whether you should choose the word revolution, but it seems to me anything that radically changes your perspective deserves to be called a revolution. There's this idea that we used to have, that we'd be good at guessing what are the processes responsible for cystic fibrosis, or heart disease, or schizophrenia, you know, just a mystery locked in people's brains.
The idea that we're supposed to guess that, based on some prior knowledge of biology, was, I think, doing biology with your hands tied behind your back. The idea that we could range across the genome and ask whether there are genetic variations anywhere more frequently found in people who have schizophrenia: well, it turned out that instead of finding one gene, it's pointed now to two hundred and sixty-five genes across the genome that play roles. No amount of making hypotheses based on prior knowledge could possibly have produced that picture. And biologists at some point argued over this new way of thinking, and now, as usual, they've just absorbed it into the way biologists think. But looking back, it was a shift that took place in the eighties and nineties and really took hold in the first decade of this century. You know, intellectual shifts like that are quite amazing. I still run a laboratory, and one of my students, at some lab meeting a couple of years ago, was listening to a description, and he just turned and he said: before you knew the sequence of the genome, how did you do anything?
Of course lots got done, but it was a sign of what intellectual progress is: it so completely infects the way you think about things that you assume it's been in the woodwork forever.

We'll be right back.

Eric, I want to ask you about how the institutions that enable science to be done have been changing as a consequence of the epochal change in the cognitive part of science that you're describing. The institute that you're the founding director of, the Broad Institute, is a great example. It's tremendously influential throughout the areas of science it touches upon, and it touches upon many areas of science. It's extremely well funded. It's not subordinated to the universities it's affiliated with, Harvard and MIT, the way a traditional institute might have been. And, to be blunt, it's enormously powerful in the world of science. How coincidental is the evolution of the Broad to the developments you're talking about? I mean, had you started the Broad and this not been the way science was done, would it have been as significant?
And similarly, had you not started the Broad, but had these scientific developments occurred, could they have occurred with the same speed and efficiency?

Starting the Broad Institute was a response to these scientific changes. The Broad was an answer to these new intellectual possibilities. But the word that you didn't mention, which I think is at the heart of all of it, is collaboration. We use the word collaboration lightly, as if doing anything together means we're collaborating with somebody. No, no: there's a sense of deep collaboration that underlies this new era of science. So, you know, what the Broad really is is an intellectual meeting ground. What the Broad is about is a collaborative spirit that is, I think, necessary to take full advantage of where science is going right now. Everything else follows from that.

I totally buy that collaboration is kind of the special sauce. It's also true, though, that once there is a site of collaboration, that site of collaboration can become tremendously empowered. I mean, think of a trading zone in ancient civilizations, or even in modern civilizations.
If you find a spot where cultures converge and collaboration becomes possible, that place becomes enriched and becomes powerful. You know, Venice might be the classic example: a kind of crossroads of East and West at a certain moment in history, and then tremendous wealth and power accrue in Venice. And I guess what I'm wondering about is this: as the resources needed to do biology at the highest level have become greater and greater, doesn't that empower the institutions that are at the center, that are the collaboration crossroads, that can raise the funds? And then everyone really wants to get a part of the action, which seems justifiable and good for science. I'm not objecting to this; I'm just trying to describe it.

No, I think biology is just much more diverse than that. There are so many different ways to do biology. As much as I champion the rise of the ability to generate large data and learn from it, it's one lens on biology.
It's probably fair to say that ninety-five percent of biological discovery today is still going on in traditional laboratory structures, and they are dramatically empowered by the fact that the five percent of places that are really into let's-generate-large-data-and-analyze-it put the data freely out there, and they use it. For the traditional biological laboratory, which remains an incredibly powerful model, it becomes a solid foundation that saves tons of work. I think what this has done is allow different approaches to grow up and interact, and it will continue to change. I don't see anything close to a monopoly of approach or a monopoly of power in modern biology, because biology is way too rich for that.

One of the consistent themes in your fascinating career, Eric, is that you've been constantly in touch with government, as an advisor to it or interacting with it. You've also been part of the broader field of academic science, always centrally. And you've also touched on the private sector, the corporate part, which is one of the three legs, as it were, of contemporary science.
How do you see the relationship between those different moving parts changing? We all still very much have the COVID vaccine race in our minds, in which you've had government playing some role, you've had the private sector, you've had academic centers. So that might be a concrete example to use, although feel free to use others too.

You're describing a model that was laid out in the closing months of World War II. Very famously, Franklin Roosevelt, a couple of weeks after his reelection in nineteen forty-four, wrote to his science advisor, Vannevar Bush, saying: boy, this science and technology stuff has been pretty helpful in bringing the war in at least a successful direction and, eventually, to a conclusion; how can it make a big difference in peacetime? And Bush wrote this report that is known as a foundational text, Science, the Endless Frontier, and it laid out, and eventually shaped, a world in which we have three pieces: we have government, we have academia, we have industry.
And there's this virtuous cycle. Bush said in this report that scientific discovery, which during the war had been going on in government labs, should go on in universities, and it should go on in the context of training the next generation. Government should fund academia to make basic knowledge, fundamental knowledge, and make it broadly available. Industry is then able to pick up that knowledge and turn it into private goods, private products. And this virtuous cycle (I've written about it, and I've called it this miracle machine) is something that the United States perfected before, and better than, any other country: how these pieces work off each other. As an academic, I understand that our goal is to create knowledge and make it broadly available. But I also know that it's never going to complete its mission if it doesn't get to patients, and so academia has to work with industry, and government has to think about what its policies on funding should be.
But understanding the balance of those three partners in trying to create social products, to make society healthier, wealthier, and more secure: I think it's really important to think about that.

You depict a somewhat rosy picture there. And I think for those of us just coming out of (we're not quite out of it yet) a Trump administration, viewing an administration that was deeply skeptical of science in a whole range of ways, from climate ultimately to the epidemiologists' advice on how to handle COVID, and which then finally circled back to "but we're doing such a great job because we're facilitating the emergence of vaccines," it left a lot of us skeptical about the productive role that government can play. And apart from that, a lot of us have independent skepticism of the tremendous power of private companies. Is there anything you would say that's more critical about the relationship, rather than that it is still the miracle machine it was depicted as being in the rosy good old days?

Miracle machines need a lot of tending and repair and care. We don't always get that right.
I think, you know, we'll stand back with enough distance and see what things got done wrong, what lessons we can learn, and what things got done right. I'm thinking at a broader level: do we have the model right? And I think the model surely needs repair at many levels, but I don't think it's fundamentally wrong. I think more fundamental science is getting done in industry; that's really important. I think, yes, we've seen instances where the government has engaged in denying obvious scientific facts. Not a good thing. But I don't think there's any reason to think that that's a permanent condition. I would point to a constant revision of this model. Science does not sit on its laurels; it's always edgy, and as soon as Vannevar Bush laid out his early model, there were efforts to rethink it and change it. If you ask me, do I think we're in for a period when there's both enormous need and enormous opportunity to rethink those things? Yes, absolutely.
But the fundamental idea that 326 00:21:04,156 --> 00:21:07,396 Speaker 1: this is one of the great engines of producing progress 327 00:21:07,476 --> 00:21:10,836 Speaker 1: for society I very much agree with, and that makes 328 00:21:10,836 --> 00:21:13,956 Speaker 1: it worth the trouble of figuring out how it needs 329 00:21:13,956 --> 00:21:18,236 Speaker 1: to be fixed and changed and improved. Eric, you've done 330 00:21:18,276 --> 00:21:21,996 Speaker 1: something pretty unusual for someone in your august position, which 331 00:21:22,036 --> 00:21:25,836 Speaker 1: is that you started a podcast, Brave New Planet, I 332 00:21:25,836 --> 00:21:28,796 Speaker 1: should say for disclosure purposes, produced by Pushkin, which also 333 00:21:28,876 --> 00:21:32,196 Speaker 1: produces this show. Tell me why. I have to say, 334 00:21:32,196 --> 00:21:34,476 Speaker 1: I mean, as someone who's relatively new to podcasting, I figure 335 00:21:34,516 --> 00:21:36,836 Speaker 1: I'm learning a little bit every day, making two mistakes 336 00:21:36,836 --> 00:21:40,116 Speaker 1: for every step forward. I was kind of heartened to 337 00:21:40,236 --> 00:21:41,756 Speaker 1: hear that you were doing it because I know that 338 00:21:41,796 --> 00:21:43,396 Speaker 1: whatever you do, you do to the highest standard. And 339 00:21:43,396 --> 00:21:45,796 Speaker 1: sure enough, it's a great podcast. But why did 340 00:21:45,836 --> 00:21:49,236 Speaker 1: you decide to devote significant time to that pursuit? Well, 341 00:21:49,436 --> 00:21:53,076 Speaker 1: it's actually related to what we were just talking about. I 342 00:21:53,116 --> 00:21:57,436 Speaker 1: really do believe in this compact between science and society, 343 00:21:58,076 --> 00:22:01,436 Speaker 1: and I do think it's frayed in certain ways. 
And 344 00:22:01,516 --> 00:22:05,516 Speaker 1: so for me, Brave New Planet, which is seven episodes 345 00:22:05,636 --> 00:22:08,796 Speaker 1: that try to take on really hard problems where I 346 00:22:08,876 --> 00:22:12,916 Speaker 1: don't know what the answer is, deep fakes, for example, 347 00:22:13,596 --> 00:22:17,396 Speaker 1: solar geoengineering. Should we engineer the Earth's atmosphere to mitigate 348 00:22:17,436 --> 00:22:23,436 Speaker 1: climate change? Lethal autonomous weapons? Should we have killer robots? Biases? 349 00:22:23,476 --> 00:22:27,956 Speaker 1: And predictive algorithms? And new technology is called gene drives 350 00:22:27,996 --> 00:22:31,876 Speaker 1: to reshape species in nature. And what Brave New Planet 351 00:22:31,956 --> 00:22:37,636 Speaker 1: is about is smart, thoughtful, passionate people trying to grapple 352 00:22:37,716 --> 00:22:41,396 Speaker 1: with what should we do? People who agree on the facts, 353 00:22:42,036 --> 00:22:45,916 Speaker 1: agree on the societal goals, and then don't agree on 354 00:22:45,956 --> 00:22:49,716 Speaker 1: the solutions because they're hard. Brave New Planet as the 355 00:22:49,756 --> 00:22:55,396 Speaker 1: tagline utopia or dystopia. It's up to us, and I 356 00:22:55,436 --> 00:22:57,476 Speaker 1: think that's right. I mean, there are gonna be a 357 00:22:57,476 --> 00:23:02,076 Speaker 1: lot of consequential decisions about science and technology that if 358 00:23:02,076 --> 00:23:06,436 Speaker 1: we make wise choices, could leave us a lot better off, 359 00:23:06,516 --> 00:23:09,036 Speaker 1: and if we don't make wise choices could leave us 360 00:23:09,036 --> 00:23:12,196 Speaker 1: a lot worsolve. Brave New Planet was an invitation for 361 00:23:12,236 --> 00:23:14,316 Speaker 1: that kind of a conversation and I hope we're gonna 362 00:23:14,356 --> 00:23:16,996 Speaker 1: have a lot more of it. 
Eric, last question, 363 00:23:17,076 --> 00:23:20,916 Speaker 1: and it derives from that tagline: utopia or dystopia, it's 364 00:23:20,956 --> 00:23:23,276 Speaker 1: up to us. There's a theme that I sense in 365 00:23:23,356 --> 00:23:26,316 Speaker 1: your thinking about your work over the last decades 366 00:23:26,636 --> 00:23:28,516 Speaker 1: and in the direction where things are going now. I 367 00:23:28,556 --> 00:23:32,476 Speaker 1: see you as on the whole an extremely optimistic, positive person. 368 00:23:32,836 --> 00:23:34,636 Speaker 1: But I actually wonder if that might be one of 369 00:23:34,636 --> 00:23:37,636 Speaker 1: the many secrets to your extraordinary success, that you're looking 370 00:23:37,636 --> 00:23:39,836 Speaker 1: optimistically at what can come next, and how change can 371 00:23:39,876 --> 00:23:42,196 Speaker 1: be productive, and how institutions can be evolved to make 372 00:23:42,236 --> 00:23:46,316 Speaker 1: them better. And yet that word dystopia is still lingering 373 00:23:46,356 --> 00:23:49,796 Speaker 1: there. You're not saying it 374 00:23:49,796 --> 00:23:53,996 Speaker 1: will be dystopic, but it suggests a much more potentially worrisome picture 375 00:23:54,436 --> 00:23:56,676 Speaker 1: of how our world is evolving, and particularly coming out 376 00:23:56,716 --> 00:24:01,196 Speaker 1: of extraordinary technological innovations, of which big data biology is 377 00:24:01,236 --> 00:24:05,236 Speaker 1: only one. 
So how do you think about the big 378 00:24:05,316 --> 00:24:08,396 Speaker 1: picture risks that we face really as a society or 379 00:24:08,396 --> 00:24:10,916 Speaker 1: as a civilization, which you know, at least when you 380 00:24:10,956 --> 00:24:13,876 Speaker 1: go to Silicon Valley you hear a lot of smart 381 00:24:13,876 --> 00:24:17,356 Speaker 1: people with a big stake in the future, very very 382 00:24:17,396 --> 00:24:22,036 Speaker 1: worried about what their technologies are capable of producing. Look, 383 00:24:22,076 --> 00:24:25,716 Speaker 1: I am an optimist, but I am a very realistic optimist. 384 00:24:26,276 --> 00:24:30,356 Speaker 1: I'm not a Pollyanna who thinks everything works out well automatically. You 385 00:24:30,436 --> 00:24:33,036 Speaker 1: have to work really hard to make sure that you 386 00:24:33,156 --> 00:24:39,276 Speaker 1: get the upsides. So, as a realistic optimist, every one 387 00:24:39,276 --> 00:24:44,876 Speaker 1: of the episodes of Brave New Planet starts with all 388 00:24:44,916 --> 00:24:47,676 Speaker 1: of the upsides that could come from something, and then 389 00:24:47,796 --> 00:24:51,276 Speaker 1: pivots part way through the show to exactly the same 390 00:24:51,396 --> 00:24:57,516 Speaker 1: question: what could possibly go wrong? And it then unfolds 391 00:24:57,636 --> 00:25:01,356 Speaker 1: layer after layer of how things can go off the rails, 392 00:25:01,756 --> 00:25:05,196 Speaker 1: and in some cases are going off the rails. I 393 00:25:05,356 --> 00:25:10,516 Speaker 1: try to be just completely clear eyed. There aren't quick fixes, 394 00:25:11,036 --> 00:25:15,076 Speaker 1: but we do need fixes, and so that balance is 395 00:25:15,076 --> 00:25:17,516 Speaker 1: something. You're right: I could not do what I do 396 00:25:17,916 --> 00:25:22,316 Speaker 1: without being deeply optimistic, and I couldn't do it responsibly 397 00:25:22,956 --> 00:25:27,356 Speaker 1: without being deeply realistic. 
Eric, thank you for the extraordinary 398 00:25:27,356 --> 00:25:29,156 Speaker 1: work that you have been doing, that you're still doing, 399 00:25:29,236 --> 00:25:30,996 Speaker 1: that you're going to continue to do. And thanks for 400 00:25:31,076 --> 00:25:33,196 Speaker 1: coming in and talking about it with me on Deep Background. 401 00:25:33,236 --> 00:25:42,276 Speaker 1: Thank you, a real pleasure. Noah, take care. Eric Lander 402 00:25:42,396 --> 00:25:46,276 Speaker 1: is not only a tremendously influential figure in science. He's 403 00:25:46,316 --> 00:25:50,396 Speaker 1: also a very, very talented science explainer, and I was 404 00:25:50,436 --> 00:25:52,556 Speaker 1: thrilled that we had that on display here in the 405 00:25:52,596 --> 00:25:56,196 Speaker 1: podcast as he talked about the transformation in biology, the 406 00:25:56,236 --> 00:26:00,156 Speaker 1: emergence of what some have called hypothesis free science, 407 00:26:00,156 --> 00:26:03,516 Speaker 1: which is really about forming hypotheses in new ways. For 408 00:26:03,556 --> 00:26:07,676 Speaker 1: scientists and for me, listening to him describe these developments, 409 00:26:07,836 --> 00:26:11,676 Speaker 1: it is really extraordinary to hear just how basic they've become 410 00:26:12,036 --> 00:26:16,156 Speaker 1: to how biologists can respond to real time crises like 411 00:26:16,356 --> 00:26:20,876 Speaker 1: the crisis caused by SARS-CoV-2. 
At the same time, 412 00:26:21,076 --> 00:26:26,516 Speaker 1: all transformations in scientific institutions drive changes in how power 413 00:26:26,836 --> 00:26:30,876 Speaker 1: is deployed and how power functions. And although Eric preferred 414 00:26:30,876 --> 00:26:33,716 Speaker 1: to emphasize the ways that power is spread out across 415 00:26:33,756 --> 00:26:37,356 Speaker 1: the biological community, it's also true that big data science 416 00:26:37,396 --> 00:26:42,116 Speaker 1: inevitably has some concentrating effects on those people, places, and 417 00:26:42,196 --> 00:26:45,716 Speaker 1: institutions where the best and most cutting edge techniques can 418 00:26:45,756 --> 00:26:48,916 Speaker 1: be found and consolidated, and Eric has been at the 419 00:26:48,996 --> 00:26:52,756 Speaker 1: center of that development as well. Finally, I was very 420 00:26:52,796 --> 00:26:56,956 Speaker 1: struck by the carefulness with which Eric addresses the problem 421 00:26:57,036 --> 00:27:00,196 Speaker 1: of whether science is on the whole today bringing us 422 00:27:00,196 --> 00:27:03,356 Speaker 1: in a better direction, or has the capacity to throw 423 00:27:03,436 --> 00:27:07,076 Speaker 1: us into dystopia. I think it's highly significant that for 424 00:27:07,236 --> 00:27:12,036 Speaker 1: his podcast Brave New Planet, Eric is exploring specifically technologies 425 00:27:12,076 --> 00:27:16,516 Speaker 1: and scientific developments that have the capacity to go very, 426 00:27:16,636 --> 00:27:20,196 Speaker 1: very wrong. He's a realist about how to try to 427 00:27:20,236 --> 00:27:22,756 Speaker 1: fix those developments because they can't be put back in 428 00:27:22,836 --> 00:27:25,796 Speaker 1: the bottle. I tend to agree with that instinct towards realism. 
429 00:27:26,636 --> 00:27:29,596 Speaker 1: Yet he knows that we cannot do the right thing 430 00:27:29,916 --> 00:27:32,916 Speaker 1: unless we think hard about what the right thing to do 431 00:27:33,316 --> 00:27:37,356 Speaker 1: in fact is. In this moment, we're feeling, or about 432 00:27:37,396 --> 00:27:41,676 Speaker 1: to feel, tremendous gratitude to science for what it's accomplished 433 00:27:41,716 --> 00:27:44,876 Speaker 1: with respect to a vaccine for SARS-CoV-2. Maybe 434 00:27:44,876 --> 00:27:47,036 Speaker 1: we're not quite there yet, but it's entirely possible that 435 00:27:47,076 --> 00:27:50,236 Speaker 1: we will be very grateful to science relatively soon. That 436 00:27:50,356 --> 00:27:54,796 Speaker 1: gratitude will also call for common sense analysis of what 437 00:27:54,876 --> 00:27:57,876 Speaker 1: the risks are and the downsides are of trusting the 438 00:27:57,916 --> 00:28:01,396 Speaker 1: scientific community too much in terms of its response to 439 00:28:01,476 --> 00:28:05,516 Speaker 1: a range of the most difficult social, political, and health 440 00:28:05,596 --> 00:28:09,276 Speaker 1: problems facing us today. Until the next time I speak to you, 441 00:28:09,796 --> 00:28:13,716 Speaker 1: be careful, be safe, and be well. Deep Background is 442 00:28:13,716 --> 00:28:17,196 Speaker 1: brought to you by Pushkin Industries. Our producer is Lydia Jean Kott, 443 00:28:17,436 --> 00:28:20,716 Speaker 1: our engineer is Martin Gonzalez, and our showrunner is Sophie 444 00:28:20,756 --> 00:28:24,596 Speaker 1: Crane McKibben. Theme music by Luis Guerra at Pushkin. Thanks 445 00:28:24,596 --> 00:28:28,596 Speaker 1: to Mia Lobel, Julia Barton, Heather Fain, Carly Migliori, Maggie Taylor, 446 00:28:28,756 --> 00:28:31,556 Speaker 1: Eric Sandler, and Jacob Weisberg. You can find me on 447 00:28:31,596 --> 00:28:34,316 Speaker 1: Twitter at NoahRFeldman. 
I also write a column for 448 00:28:34,316 --> 00:28:37,876 Speaker 1: Bloomberg Opinion, which you can find at bloomberg dot com slash Feldman. 449 00:28:38,516 --> 00:28:41,876 Speaker 1: To discover Bloomberg's original slate of podcasts, go to Bloomberg 450 00:28:41,916 --> 00:28:44,796 Speaker 1: dot com slash podcasts, and if you liked what you 451 00:28:44,876 --> 00:28:48,036 Speaker 1: heard today, please write a review or tell a friend. This 452 00:28:48,276 --> 00:28:49,156 Speaker 1: is Deep Background.