Pushkin.

Eric Lander: Welcome to Brave New Planet. My name is Eric Lander. I'm your host for this new podcast. In the upcoming episodes, we'll explore frontiers of science and technology that are both exciting and challenging. Artificial intelligence that's unleashing artistic creativity, but also enabling deepfakes that undermine truth and democracy. A plan to modify the Earth's atmosphere to hold off climate change, which might buy time or might keep us from solving the real problem. A new genetic technology for engineering nature that might prevent malaria but might get out of hand. Computer algorithms that can diagnose diseases, tell companies who to hire, and tell judges how to sentence, but might also automate our human biases. And we'll ask whether it's time to turn war over to robots. We'll look at the amazing upsides, and also ask what could possibly go wrong.

But for this short first episode, I thought I'd better explain who I am and why I'm here. I'm a scientist. I grew up in New York City, a product of the public schools, where I fell in love with math. Not long after I did my PhD in mathematics, I became captivated by genetics. By luck, that was just a few years before biologists began dreaming about reading out the complete human genetic code, all three billion letters of DNA. I got involved in this crazy idea called the Human Genome Project, and became one of the leaders of that international collaboration. As the work neared its end, I helped launch a research institute, the Broad Institute of MIT and Harvard, that I still lead today, to apply that same collaborative spirit to help a new generation of remarkable scientists propel the understanding and treatment of human diseases.

Outside the laboratory, I've always cared a lot about how science affects the world. For eight years, I also co-chaired the President's Council of Advisors on Science and Technology for the Obama White House.
We got a chance to wrestle with some of the nation's biggest opportunities and problems, from energy to influenza, cybersecurity to aging. And through it all, I've continued to teach intro biology to MIT students, who every year restore my faith in the future. But that future is on the line today in ways it's never been before. The decisions we make, or don't make, will affect us all for generations to come.

So to kick off Brave New Planet, I wanted to talk with someone who has boundless curiosity and might help us think about what's at stake. I reached out to Malcolm Gladwell. He's the author of books like Talking to Strangers and Outliers, and the host of the podcast Revisionist History. To my delight, he agreed to join us.

Malcolm Gladwell, welcome to Brave New Planet.

Malcolm Gladwell: Thank you, Eric. I am delighted to be here and fascinated to learn more about your new podcast. But before we get into it, I wanted to talk a little bit about you. I'm very curious about how your initial ideas about science were formed, and what led you into this world in the first place.

Eric Lander: You know, looking back, growing up in New York as a kid in the nineteen sixties, it was pretty amazing. I was raised by my mom, and we didn't have a lot of money, but she figured out everything free and everything cheap that you could do, dragging us to museums. She let us stay home and watch all the space launches. And then nineteen sixty-four, sixty-five was the World's Fair, the New York City World's Fair, and that was amazing. There was the Unisphere, the big globe that was the symbol of the World's Fair. It's the thing that gets destroyed in Men in Black at the end of the movie. And there was a guy in a jetpack who flew over the Unisphere, and they told us we would all be going to work in jetpacks. I still am waiting for my jetpack. But there were all these other things.
There was the GE Carousel of Progress, where they were singing "There's a great, big, beautiful tomorrow." And there was Bell Labs premiering the picture phone, and DuPont with its wonderful world of chemistry, which I think became the slogan "better living through chemistry." And they had this time capsule that they're going to dig up in five thousand years. And it was also the era of Star Trek, where everybody, all races and genders, we were going to go out together and boldly go. And so it was a period of infinite possibility, of amazing optimism about what you could do in the world. There were obviously lots of tensions bubbling under the surface, but as a seven- or eight-year-old kid I wasn't aware of any of that yet. And so I think I was formed in this world that thought science was going to be incredibly important to progress, and the world was just going to become a better and better place.

Malcolm Gladwell: I'm amazed, in your description of that, at how frictionless your access to knowledge as a kid seems to have been. I mean, all this stuff is on your doorstep, like the New York World's Fair. I don't know what subway you're taking, or maybe a couple of buses, but the point is, you don't have to have a lot of money. It cost, what, a dollar for kids to get in?

Eric Lander: Yeah, and a dollar was worth more then. But still, you know, my mom dragged us fourteen times to the World's Fair. She was amazing.

Malcolm Gladwell: Fourteen times. I love it. You remember how many times?

Eric Lander: Oh yeah.

Malcolm Gladwell: So you have this coming out of your childhood, you inherit this sense of infinite possibility. What happens to that sense as you get older? Is it still there?

Eric Lander: Well, it's interesting. I mean, where does that sense of infinite possibility come from? It's no accident that the nineteen sixties are like that.
It's after World War Two, where science played a big role in winning the war: radar, early computers, penicillin, and of course atomic bombs. And the US makes this decision after the war that science is going to be a cornerstone of society going forward, that we're going to fund science at universities and we're going to train people, and we'll have this virtuous cycle of public knowledge producing technologies and cures and companies. And then, of course, nine months after I'm born, Sputnik goes up. I don't remember it, but I know it was exactly nine months after I was born. And everything goes into overdrive. Science is central to the survival of the country. We pass laws for science education, we start the space race to get a man on the moon first.

So this is the world that shaped that New York City of the nineteen sixties: the sense that science was going to discover what's true, technology was going to figure out what's possible, and society reaps the benefits. And there was that assumption, that idea of the scientific method: you figure out what's true by evidence, not authority. It's all about honesty, not advocacy. There was a set of assumptions that underlay that world, where this country really bought into the ideas of science, and it paid off amazing dividends through the rest of the twentieth century. You think about all the things that happened from what had to have been a bit of a crazy bet. The US didn't invest in science a lot before the war. But afterwards you look at the things that start happening. You get polio vaccines, measles and smallpox vaccines, and we eradicate smallpox.
And you get computer technology and the Internet and GPS systems, and Google search comes out of a National Science Foundation grant. And then molecular biology and gene cloning and this Human Genome Project I got very involved in, and onward and onward and onward. And you get these industries that grow up around all these things. So I think it paid off in huge ways.

And what's interesting is, it's only accelerated since then in terms of the science, even though there are a lot of tensions around the science. You look at the last two decades or so and you start seeing, you know, artificial intelligence: we can translate languages by computer and spot lung cancers by computer, and people are making self-driving cars and quantum computers. And in biology you have new therapies for cancer, immunotherapies, and this Human Genome Project, where we spent three billion dollars to read one genome; it now costs a couple hundred bucks to read a genome. There are technologies that let you edit genomes, like CRISPR. And it just keeps going and going, and I think, if anything, it's accelerating, in terms of the science and the impact on society.

Malcolm Gladwell: I'm just curious, Eric: you've chosen to do this now. Is there a reason why now? I mean, you could have done this five years ago, you could have done this five years from now. So there's something compelling you at this moment in time.

Eric Lander: Yeah, absolutely. It is so glaringly obvious that we are going to need science to solve a lot of the problems ahead. Climate change: it's in our face right now. It was theoretical to people, but after the last three or four years, it's so obvious we're gonna have to solve that. The pandemic that we're living through right now: it's clear there's no way to solve pandemics without a lot more science than we've brought to bear on it so far. And then I think about things like Alzheimer's.
It's going to be costing us a trillion dollars a year as the US ages, and we really don't know how to do anything other than support people. So we've got to come up with solutions, and those solutions have got to come from science. I could go on and on. And so it seemed like a moment when we just had to decide: are we going to rally a world together around science? It's Matt Damon's famous line from the movie The Martian, when he's stranded on Mars and he knows nobody's coming back for him soon, and he says, I guess there's only one option. I'm going to have to science the shit out of this. We're gonna have to do that right now. And I think the last several years have made that so apparent. We really have to reconnect science and society, because that connection is going to be so important going forward.

Malcolm Gladwell: If you were an eight-year-old today, what do you think your eight-year-old self would think? Is it the same sense of infinite possibility?

Eric Lander: I think there's a frustration right now, a sense that we're not doing all we can with this, and there are a lot of reasons for it. The bright shininess of science would discover the truth, and technologies would show us what's possible, and society would reap the benefits. That simple social compact is getting frayed in some ways, and there are a lot of different ways it's getting frayed. It's not surprising that after sixty years things would begin to get a little tattered. But you know, if you think about it, science sometimes conflicts with economic interests. We began to see that companies might start attacking science because they really don't like the answers. I think the first case was when it became clear that cigarettes caused cancer, and tobacco companies decided to pay people to put up smokescreens and question the evidence. And then, of course, climate change is where we see it most today, where despite massive amounts of evidence, people who don't like the solutions just deny it.
You know, we've got massive wildfires in California, and we had wildfires a couple of years ago above the Arctic Circle. We've got glaciers retreating, powerful hurricanes, the last six years the hottest years in history, and people just say, well, you know, nothing to see here, that's all just a fluke. And I think that ability to not even have to engage is one tension that we're trying to deal with right now: is there a shared assumption that we have to deal with the evidence?

But then there are other things. I think, on science's side, sometimes science just seems to overpromise, and sometimes probably does overpromise. There are people who say, wait, wait a second, I thought you were curing cancer. How come cancer's not cured yet? I think going overboard on promises, and not giving people a sense of the fact that, yeah, science is amazing, but it takes a while to deliver, can backfire. And I think there are also people who just want answers to things, and science doesn't have answers for them, and they've got to go seek them somewhere.

Malcolm Gladwell: Are these kinds of things what led you to want to do a podcast?

Eric Lander: Well, exactly. You know, I think this compact between science and society is so important. It's getting tattered, and we have to do something about it, and that requires drawing more people into hard problems in science. We can't ignore the fact that the problems are hard. Sometimes, instead of these bright shiny futures that we talk about, we really get dystopian outcomes. Things really can go wrong. The better living through chemistry can turn into toxic waste. The Internet that's supposed to give us all the world's information can bring us disinformation. Social media that's supposed to bring us together can tear us apart.
I know you're really interested in these topics yourself, and I think there are times when there have just been failures of imagination to think about what could possibly go wrong. Whatever the compact was in the nineteen sixties, where you kind of left it to the scientists and the politicians to work it all out, we're now going to need everybody on board. We're gonna need to open this up to more people and recognize that scientists might have a lot of answers about the science, but we're not going to have all the answers about how it should be applied in the world. And that was really the heart of the podcast. If we really are now all the stewards of a brave new planet, then we've got to figure out how to draw everybody in on that and open it up.

Malcolm Gladwell: Would you still describe yourself as an optimist?

Eric Lander: Oh, I am a tremendous optimist, in spite of lots of evidence to the contrary, because in the end, I really don't see any alternative. So I am a very realistic optimist. I think we're gonna need to fight for truth. It's not a gimme like it might have been a half century ago. I think we are gonna need to bring everybody in to help make decisions about how we should use this. It's not going to be easy. And, you know, there's nothing about the podcast that's advocating specific answers to anything. It's really meant to model smart, thoughtful, passionate people struggling with: what do we do with our future? And so it's meant to invite people into that, because, in my most optimistic self, that's how we make it through: we all work together to struggle through hard problems that have amazing upsides, maybe big downsides, and together we make it through. And you know, these are things that are just too big to fit in a tweet. And I'm really interested, I know you are too, in ideas that are too big to fit in a tweet but will end up shaping the future in a big way. And so that was the heart of the podcast.

Malcolm Gladwell: Has there been a moment, maybe not doing this podcast, but in your dealings with other scientists in fields not your own, is there ever a moment when you're in over your head, where you say, I have no idea what these guys are talking about?

Eric Lander: Oh yeah, oh yeah, frequently. There were occasional times I was really deeply an expert in the subject, but in many, many other things, you know, I'm not an expert. And so I think what I had to learn was the right kind of scientific humility to ask dumb questions and say, what exactly does that mean? And you can learn a lot from that.

Malcolm Gladwell: Yeah, yeah. That actually is a lovely place to end. I'm in total agreement with you. I think the key to being an effective journalist, explainer, podcast host, whatever, is the willingness to ask really dumb questions.

Eric Lander: Indeed.

Malcolm Gladwell: Thank you so much. I cannot tell you how excited I am to listen to this, and how delighted I am that you, of all people, have decided to tackle this subject. It is greatly needed.

Eric Lander: Oh, Malcolm, thanks so much for helping us kick off this first episode.

Malcolm Gladwell: Thank you, Eric.

Eric Lander: And to all of you listeners: come join us for episode two, where we'll hear President Richard Nixon console a grieving nation about the tragic outcome of America's failed mission to land a man on the Moon.

Richard Nixon (deepfake): Good evening, my fellow Americans. Fate has ordained that the men who went to the Moon to explore in peace will stay on the Moon to rest in peace. These brave men, Neil Armstrong and Edwin Aldrin, know that there is no hope for their recovery.

Eric Lander: Deepfakes. Next time, on Brave New Planet.

Brave New Planet is a co-production of the Broad Institute of MIT and Harvard, Pushkin Industries, and the Boston Globe, with support from the Alfred P. Sloan Foundation. Our show is produced by Rebecca Lee Douglas with Mary Dow. Theme song composed by Ned Porter, mastering and sound design by James Garver, fact-checking by Joseph Fridman, and a Stitt and Enchant. Special thanks to Christine Heenan and Rachel Roberts at Clarendon Communications; to Lee McGuire, Kristen Zarelli, and Justine Levin-Allerhand at the Broad; to Mia Lobel and Heather Fain at Pushkin; and to Eli and Edythe Broad, who made the Broad Institute possible. This is Brave New Planet. I'm Eric Lander.