Eric Lander: Pushkin. You're listening to Brave New Planet, a podcast about amazing new technologies that could dramatically improve our world, or, if we don't make wise choices, could leave us a lot worse off. Utopia or dystopia. It's up to us.

On Saturday, November seventh, twenty twenty, hundreds of millions of people finally got an answer to a question that had consumed them for more than eight weeks of balloting and four days of vote counting: who would lead the United States of America for the next four years? At eleven twenty-four a.m. Eastern Time, CNN called the presidential election for Joe Biden and his running mate, Kamala Harris. Within twenty minutes, every major network followed suit. The race was over. But even as one question was answered, another still loomed large. Would America now finally be able to move forward and tackle the hard problems facing the country and the world?

My name is Eric Lander, and I'm the host of Brave New Planet. When we began planning the seven episodes of Brave New Planet more than a year ago, I never imagined we'd be wrapping up in the days just after a presidential election. We'd originally planned to complete and release this series in spring twenty twenty, but, as with so many things, those plans were upended by the pandemic. Somehow, though, the timing turned out to be fitting. Brave New Planet is about amazing science and technology that also poses hard challenges. But it's also about how we're going to need to come together and work together to make wise choices in many areas: yes, scientific problems, from the current pandemic to climate change, but also societal problems, from economic security to racial justice. Brave New Planet has tried to show smart, thoughtful, passionate people who agree on the facts and even agree on the societal goals, but who disagree on solutions. Nonetheless, they grapple with complex problems, argue with respect, occasionally even change their minds, and make some progress even where there are no easy answers.
To my mind, it's the only path forward. Brave New Planet's mission is to invite everyone into these conversations.

So today's big question: what's it going to take to do more of this as a society, to find common ground on goals and argue productively about solutions? As I thought about this question, it occurred to me that scientists aren't the only people who spend their days gathering information to try to help society solve problems. Journalists do too. So I thought that a conversation between a scientist and a journalist about the common challenges we face might be enlightening. I reached out to journalist Niala Boodhoo. Niala has worked for Reuters, the Miami Herald, and in public radio, where she's hosted shows on WBEZ Chicago. Now she's the host of Axios Today, a new daily morning news podcast. Niala Boodhoo, welcome to Brave New Planet.

Niala Boodhoo: Hi, Eric, thank you so much for having me. It's a pleasure to be here.

Eric Lander: Oh, it's great to have you. So, Niala, I'd love to start with how scientists and journalists connect with the public. At least in science, I think there's often a real problem with humility and trust. You know, for example, when scientists talk about what we have to do to make progress on problems, one of the first things people suggest is more science education. Now, don't get me wrong, I'm not opposed to more science education. I teach. I love it. But I think there's an underlying assumption there that the problem is that people are just ignorant, that if they just got more science education, they'd know the facts or accept the facts and fall in line with the solutions. And I don't think that's the right place to start. I mean, scientists do spend their days swimming around in facts, but I don't think that's a reason to be looking down on people. I don't think there's ever a reason to be looking down on people. It's not a good posture. We might have been able to get away with it in the science of the nineteen fifties and sixties.
You know, the authority of the scientist in the white lab coat or something. But scientists don't have a monopoly on the insights that are going to matter. I think, you know, we have to go in feeling we've got something really important to contribute, but it's only part of the puzzle. Yeah, I'm wondering, Niala, do you see that same issue in journalism?

Niala Boodhoo: Well, I think it's the same thing you said, right? So you said that, with scientists in the nineteen fifties or sixties or whatever, there's this idea that scientists were sort of the end-all, be-all of information. Journalists were like that too, right? We used to think that we were in charge of broadcasting out the information to people, and I think, certainly in my career as a journalist, we've seen that shift with the advent of social media and the way that information flows. Journalists play a role. I think journalists play a very important role in moderating, in sifting through, in amplifying voices that don't otherwise have an opportunity to be heard. But we are not the source of it. I mean, we do not broadcast information out to people anymore. And I think when you talk about trust, which is a really important thing that comes up in journalism, do people trust what we do? I think a major reason why a lot of people don't trust what journalists do is because of the way that we go about doing it, and, I think, not having a humble attitude. Now, I will say, when you look at the data about this, people don't trust the media. And first of all, I will start with the premise that I'm not quite sure what "the media" is. So I always say that first; that's kind of my first question to everyone: what is this "media"? I'm not sure what you mean. And when you break it down, I think people who have relationships, for example, with local journalists, those institutions score very high. People trust those local institutions, local journalists, as accurate and credible sources of information.
I think when you look at the national level, that's where you see more of a breakdown. What do you think about that question of trust? I wonder how important that is as well when we're thinking about journalism and, in your case, science.

Eric Lander: Oh boy. So look, trust, I think, is the next layer up over humility. You've got to come in with a humble attitude. But what do we mean by "trust the scientists"? There are maybe two kinds of trust that are worth distinguishing. There's this kind of blind trust, that nineteen fifties, nineteen sixties thing of "just defer to me as the scientist because I know better than you." I think there's actually instead a different kind of trust, and I might call it earned trust. Earned trust is: if I'm the scientist or the doctor, I'm going to be transparent about the evidence we have. I'll tell you why I believe things. And, every bit as important, I'm going to be transparent about what we don't know. I don't trust people who don't say "I don't know" some of the time. I don't trust people who can't explain to me why they believe the things they believe. So I think we are shifting, and maybe it's true for journalism as well, but certainly in science, to the idea that people should be asking questions, they should be probing. And if scientists should bring doubt about other people's results and evidence, why shouldn't the general public bring doubt? But again, it's worth distinguishing two kinds of doubt. There's the cynical doubt: I just don't trust this science stuff. You know, the diet studies keep contradicting each other, or you can't trust science because they can't make up their minds. And I think that's a very cynical, nihilistic kind of doubt. I think there's a kind of doubt that I would love to see more of, which is empowered doubt: I'm not going to believe you until you give me the evidence. Show me hard evidence.
So that's the kind of empowered doubt that, you know, we want to have, because that gets people properly at the table as peers in this thing. I fantasize about, you know, how the FDA might go through its drug approval process for coronavirus vaccines. Just this week, Pfizer issued a press release saying it had really positive results from its vaccine trial, and the press release didn't have a lot of details, which, you know, some people noted, and they're going to have to come forward with those details. But I'm imagining: how do we get the country involved in the drug approval process? And so you can imagine something like a Reddit AMA, where, you know, the country is sending in questions and folks on the other side, maybe both the drug company and the FDA, are trying to answer them. Now, I don't know all the facts. I'm just making this up, so don't take these numbers to be exactly right, but it might go something like this. The company starts by saying, well, we ran a clinical trial with forty thousand people, and half got the vaccine and half got a placebo, and then we waited until ninety-five people had gotten infected and shown symptoms. And we looked, and we found that ninety out of those ninety-five people were people who got the placebo, and only five of them were people who got the vaccine. And so it looks like the vaccine is doing a pretty good job of protecting people. But then people will write in and they'll say, okay, well, tell me, what do you know about elderly people, people over seventy, did they get protection? What about men? What about people who have serious health complications? How long is this protection going to last? And are there side effects? Some of the time the answers are going to be: we just don't know. We haven't got enough data yet; we haven't run long enough to see how long protection might last.
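A side note on the arithmetic in the example above: the figures, forty thousand participants and ninety-five symptomatic cases split ninety to five between the placebo and vaccine arms, are numbers Eric explicitly makes up for illustration, not Pfizer's actual trial data. The minimal sketch below simply applies the standard definition of vaccine efficacy, one minus the ratio of infection rates in the vaccine and placebo arms, to those hypothetical numbers.

```python
# Hypothetical numbers from the example above (explicitly made up for
# illustration in the conversation, not Pfizer's actual trial data).
placebo_group = 20_000   # half of the 40,000 trial participants
vaccine_group = 20_000   # the other half
cases_placebo = 90       # symptomatic infections in the placebo arm
cases_vaccine = 5        # symptomatic infections in the vaccine arm

# Standard definition of vaccine efficacy: one minus the risk ratio
# between the vaccine arm and the placebo arm.
risk_placebo = cases_placebo / placebo_group
risk_vaccine = cases_vaccine / vaccine_group
efficacy = 1 - risk_vaccine / risk_placebo

print(f"Estimated vaccine efficacy: {efficacy:.1%}")  # roughly 94%
```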
I think people are smart, and they can take the information, what we know and what we don't know, and make decisions based on that. So I think that's really the foundation of trust. Earned trust is to be direct and transparent about what we know and what we don't know.

Niala Boodhoo: Yeah. So do you think you then build that credibility and trust with the government regulator by having, for example, the CDC or the FDA be incredibly transparent about the whole process?

Eric Lander: Well, the government is here to represent the people, and it's got to do that job in a way that actually works. Given the tensions around all these things, and the skepticism and conflicts that have occurred, I think the more transparent we can be, the more we earn trust.

Niala Boodhoo: So I think transparency is one thing, but then there's also the actual message and the knowledge, because I think oftentimes we tend to see this as a binary choice: either it has to be simple and easy to understand, or we're going to get the full information and it's complex. This is inherently the problem, I think, with science communication, and this is something we struggle with as journalists. How do you distill something down, in my case, for someone who is just hearing it, so they're not even going to read it, they just hear it? How much can they really take in at that point?

Eric Lander: Well, it's interesting. I think Axios talks about sort of Smart Brevity.

Niala Boodhoo: Yeah, that's a thing.

Eric Lander: So I think communication is a really important thing, and in general, science has not mastered the art of communication. Putting things in such complete detail that they're incomprehensible is not very helpful. I don't know how often you take the package insert out of a drug and read that big, thin piece of paper, when you unfold it and look at all of the background data on the drug. But I bet, you know, maybe that's about as often as you read the click-through license on a piece of software.
Niala Boodhoo: Actually, you know what I was going to say? My mother's a pharmacist, so I just ask her. And actually, that, I think, is the key, right? I ask someone who I know has the knowledge and who I trust, and I think she will distill it down for me.

Eric Lander: So your mother plays the role of good scientific communication and good journalistic communication. And the problem is most people don't have your mother. And so how do we manage to get things clear, without pulling the wool over anybody's eyes, without oversimplifying? Albert Einstein famously said, and it's one of the things I quote very often, that everything should be made as simple as possible, but not simpler. It's about finding that happy medium. There is nothing about this vaccine approval, or many other things, that can't be explained clearly without oversimplifying. I think communicating with honesty and clarity is the heart of it. And I'll say, the one leg up I feel like I have is that at MIT I teach freshmen. Freshmen hold your feet to the fire. They want to know, but they want it clearly. And so I think this is something we all have to aspire to if we're going to get a country that's involved in making wise decisions, whether journalistically or scientifically.

So, Niala, let's turn to this question of bringing people together. Many people feel like they just want to give up on the prospect of bringing people together. Everybody's in their tribes. Okay, maybe, but this isn't going to work in the long run. So how do we find common ground, or at least find meeting ground where we can meet and talk with each other? Because I do think most people, deep down, do want the same things. They want their family to be secure. They would like to have a healthy planet, you know, a healthier life for themselves, more peace. I was struck in the election coverage that there were instances where people tried not to go head on, saying "I want to convince you to vote for my candidate," but instead to ask: what's bothering you?
What's on your mind? What are you worried about? By listening and establishing what the common goals are, we may be able to circle back and say, okay, if that's the goal, what are the ways we might get there? Now, I realize I may seem like a hopeless optimist here, and it's not that I'm unrealistic about it. It's just that I don't see anything else that works other than trying to find that kind of meeting ground amongst people, and any kind of change has to start by finding something that's shared. So I don't know what your experience has been with this.

Niala Boodhoo: I think that what I have found as a journalist, and this kind of goes back again to communication, but I think it also goes back to this idea of humility, is that language is really important here, because I think that the way you frame something tells people how to think about it. So, for example, as a journalist, when I am interviewing someone, I always ask them a question, which seems like a very simple thing. But actually, if you listen to a lot of journalists when they're interviewing people, they don't ask them questions. They make statements, or they say, "Tell me about something." Well, if you tell someone to tell you about something, they're going to tell you about something.

Eric Lander: Oh, that is so interesting, because I hadn't actually processed before that "tell me about something" is not really asking a question.

Niala Boodhoo: And so this is my pet peeve as a broadcast journalist and as a host: you should never say "tell me about something" to someone. You shouldn't, because you can always ask it as a question, and I actually think our brains hear that differently and process it differently. And I think that's just one example of how language can be so important when we're thinking about this. And of course we parse every word, you know, as journalists, and as a broadcast journalist, and our podcast is not live, so we literally do parse every word.
And I wonder, for you, how you've seen language be important, especially as you think about Brave New Planet, right? And when I think about it, you have this idea, and I want to ask you about it, this whole idea of stewards of the brave new planet. That's an interesting choice of word that you have, stewards.

Eric Lander: It's a very deliberate one. "Stewards of the brave new planet" was chosen very intentionally. I think across the political spectrum, from religious conservatives to very progressive people, there is some shared sense of stewardship. In Western religion, the idea that people are stewards of the planet, you know, that's fundamental and biblical. We all feel like we have an obligation to be, and want to be, stewards of this planet. And so it dawned on me one day that this was a word that we didn't have to argue about. And if we have the common mission of being stewards, we can now have a serious discussion about how we can be the best stewards. But we start by being on the same side.

Niala Boodhoo: And so let's be optimistic and say that we have established a common ground and that we're working on building trust. We've been talking about the pandemic, and you've had some really practical solutions for that. Because, I would say, as a journalist I am an optimist, but I'm always a skeptical journalist, and I remain very concerned about our ability as a country to unite around the science of the pandemic.

Eric Lander: I share your concern. We all should be very concerned about it, and therefore work hard to try to overcome it.

Niala Boodhoo: But then when we think about other issues that are just as big, arguably bigger, like climate change, I wonder, how do we do that?

Eric Lander: Well, I think that's a great example to think about, climate change. We went through a long period of time when the argument was: is the climate changing? I think we've largely moved past that. The question now is: what do we do about it?
What worries me is how many people feel overwhelmed, pessimistic, that there's no prospect of doing anything without wrecking the economy and dramatically changing daily life, you know, banning hamburgers and airplanes. I think it's provoked many people across the whole political spectrum to just throw up their hands. I think it's terrible. We don't want people to feel fatalistic and pessimistic and overwhelmed. You know, the ultimate answer to climate change is actually pretty straightforward. The only thing that will work in the long run is to make renewable energy that's cheaper than fossil fuels. The minute that happens, the market will move to renewables on its own. So the answer has to be innovation. It's just: how do you get that innovation? Now, we've already seen a lot of progress. The cost of solar energy and wind energy has been dropping dramatically. In some places, it's cheaper than burning oil. Now, we still need a lot more: better battery storage and better electrification. But there's every reason to think we can do it. So the national goal ought to be for America to lead the world in inventing and producing and selling new energy technologies. And, you know, that way, addressing climate change and promoting economic growth don't really have to be in conflict. There's actually a great historical example. One of the reasons America became the leader in semiconductors and computers is that the government created huge incentives for the semiconductor industry way back in the nineteen fifties. The military bought huge quantities of semiconductors even when they were too expensive to be commercially viable. They called it pump priming. So on climate change, I think we have our incentives completely backward right now, and I think most Americans could get together around the idea of using incentives to unleash American innovation.
Niala Boodhoo: How much do you think that inertia, for lack of a better word, whether we're thinking of big things like changes in technology and innovation with climate change, but also, really, more on the individual level, people feeling overwhelmed and pessimistic and sort of resigned, how much of that do you think results from the way that we communicate? And by that I'm talking primarily about social media.

Eric Lander: I think that's a significant issue. Looking back, there was a time when I think most Americans thought America could do anything it put its mind to. I don't think people feel that as much as they should, but there was a sense, not that long ago, that we could tackle any challenge. I don't think the kinds of wars that people get into over social media, and the takedowns, I don't think they're really conducive to letting people have big aspirations. I think there are amazing things we can get done. Look at what's gotten done over the last fifty years, everything that's been able to be transformed. We can still do that, because I see this as something where people on the left and people on the right both know that it's true. And, you know, some may come from a market orientation, some may come from a research orientation, but we know we've pulled things like this off in the past. And so I'd like to reorient the discussion.

Niala Boodhoo: So on that note, if we're thinking about the stewards who are listening, what is your final advice, or what are your tips for them?

Eric Lander: Do something. It doesn't matter what. Go make a curriculum for schools on some topic that you care about or that we talked about in the program. Go organize a discussion. Go talk to, you know, a local legislator about it. I think the key is to start. The point is, if you feel pessimistic, if you feel overwhelmed, if you feel paralyzed, that's terrible. Do something. Something will lead to something else.
Now, no one person changes the whole world, but together, changing our attitude, believing that we can make change, that is really important. It's the basis of science. When people set out to try to cure cancer, they say, oh my god, that goal is so huge, how am I going to do it? And yet scientists, step by step, take a piece of the problem and make progress against it. And so we go from the nineteen seventies, when nobody had a clue what cancer was about, to people understanding, oh, cancer is caused by genetic mutations, and then discovering, oh, sometimes we can make drugs that block the effects of those mutations, oh, we can harness the immune system to make therapies. You know, in any given week, any given month, you might feel pessimistic because you don't really see progress. But if you step back and look over the course of a decade or two, it's breathtaking how much progress can happen. I think science and society are pretty similar in this regard. You can take on huge challenges, and if enough people are moving things forward, we end up making a big difference.

Niala Boodhoo: Well, thank you. I'm glad that we found common ground, and I appreciate so much that you were willing to sit down and talk to me about all of these things. It's an honor. I appreciate it. Thank you.

Eric Lander: Well, thank you, Niala. It's been great to talk. And to all the listeners out there, I hope you'll check out Niala's podcast, Axios Today.

So there you have it, stewards of the brave new planet. It really is time to choose our future. There are so many amazing opportunities ahead, and so many challenges to getting this right. We can't just throw up our hands and leave it to others to decide. We, all of us, have a responsibility to make sure that we make wise choices. It's going to take a lot. It's going to take a commitment to renewing the compact between science and society and to following the evidence. It's going to take humility.
Science is an amazingly powerful way to create new possibilities, but we also have to ask what could possibly go wrong. It's going to take trust and doubt, not blind trust, not cynical doubt. It's going to take earned trust and empowered doubt, where anyone can raise questions and we're all transparent about what we know and what we don't know. And it's going to take engagement from everyone: government, universities, scientific agencies, corporations, unions, faith groups, student organizations, and NGOs, all willing to debate in good faith about hard questions. I'm an optimist, but a realistic optimist. It's going to take a lot of work, but what's the alternative? And getting this right has great rewards. I'm committed, and I hope you are too. I look forward to continuing the conversation. Utopia or dystopia, it really is up to us. Thank you for listening.

Brave New Planet is a co-production of the Broad Institute of MIT and Harvard, Pushkin Industries, and the Boston Globe, with support from the Alfred P. Sloan Foundation. Our show is produced by Rebecca Lee Douglas, with Mary Dooe. Theme song composed by Ned Porter. Mastering and sound design by James Garver. Fact-checking by Joseph Fridman, and a Stitt and Enchant. Special thanks to Christine Heenan and Rachel Roberts at Clarendon Communications; to Lee McGuire, Kristen Zarelli, and Justine Levin-Allerhand at the Broad; to Mia Lobel and Heather Fain at Pushkin; and to Eli and Edythe Broad, who made the Broad Institute possible. This is Brave New Planet. I'm Eric Lander.