Noah Feldman: Pushkin. From Pushkin Industries, this is Deep Background, the show where we explore the stories behind the stories in the news. I'm Noah Feldman. This May, the United Nations released a report, based on thousands of scientific studies, saying that a million species are at risk of extinction. It said that humans were altering the natural world at a, quote, "unprecedented" pace. This is something that Elizabeth Kolbert has been reporting on for years. She's a staff writer at The New Yorker and the author of the 2015 Pulitzer Prize-winning book The Sixth Extinction, which is all about biodiversity loss as a result of the human impact on the environment. Elizabeth, thank you very much for being here with us. I want to ask you about biodiversity from a range of perspectives, and I want to start with the question of what biodiversity is. It's a kind of catchy phrase, and we all feel bad when we hear that biodiversity is endangered. But what do we mean when we actually say "biodiversity"?

Elizabeth Kolbert: Well, that's a good question.
I think that, in its most basic sense, biodiversity refers to the variety of life on Earth, and a sort of shorthand is often how many species there are on Earth. And the fact of the matter is, we don't know how many species there are on Earth.

Noah Feldman: They like to use that estimated number of eight million.

Elizabeth Kolbert: Right, yeah. There are all sorts of numbers that get thrown around, because we've only named and identified around a million species, so people have to extrapolate from that based on the rate at which you're discovering new species, things like that.

Noah Feldman: We do know, don't we, that however many species there are, a whole bunch of them are insects, right? When people use that eight million number, they tend to say five and a half million are insects. And even if you accept that as a total projection, they're saying that substantially more than half of the species in our biodiversity measure are in fact insects.
Elizabeth Kolbert: Yes, definitely. The vast majority of species on Earth are invertebrates, animals without a backbone, and of those, the biggest group is insects. So when we talk about the number of species on Earth, no matter what it is, as you point out, or even what order of magnitude it is, it's likely that a great, great proportion of that will be insects.

Noah Feldman: So can I ask a follow-on question? It's a little philistinish, I fear, but I actually think it matters, so I want to ask you about it. And it's this: when we're talking about these vast numbers and then we talk about decline, what is the thing that we're supposed to be so panicked about? If we were to go from eight million species, and I understand it's not a real number, to seven million species, what is it that's inherently so worrisome about that observation on its own?

Elizabeth Kolbert: Well, I think there are a number of different ways to answer that question. And the first one that comes to mind is that if you're eliminating a million species, that's an indicator that something very, very serious is going on.
And if you're a living organism, like a human, you might wonder: why is that happening, and how is it going to impact my species? Because it's pretty unlikely that you're getting rid of, let's just say, even an eighth of the species on Earth with no impacts. Now, if you want to look at it as, what does this mean to me, what do those million species mean to me right now? I think the answer that you could give is: you just don't know. Human life, though most of us live most of the time in a sort of manufactured habitat, is still absolutely, vitally dependent on the biological world, on biogeochemical cycles, which we're screwing around with very, very dramatically right now, and that's why we're seeing these very high extinction rates. And which species we actually depend on, which species are absolutely crucial to human life, we don't know. But you wouldn't want to screw around with it to the point that you find out and realize, oh, that was the one that was really crucial.

Noah Feldman: So can I follow on that too, though?
I hear the argument, and it sounds like an argument from uncertainty. It says something like: that's a lot of species, and anytime you're losing a high percentage of what's out there in the world, things could go terribly awry, and we just don't know how that might be the case. So why not try to mitigate and avoid the things that are causing this decline in species? If that's the argument, tell me first if I'm getting it right.

Elizabeth Kolbert: Well, I could offer you another argument, and that would simply be an ethical argument, I suppose. We obviously have the ability to eliminate a lot of species. What gives us the right, as it were, to do that?

Noah Feldman: I'd like to talk about both of those, if it's okay with you. They both seem super important. The first is a kind of human-centric argument: we should care about these features of our world because we might really be in trouble if we don't.
The second is an argument from morality or from ethics, not human-focused in the same way, that says: well, we shouldn't assume that just because we're the humans, we have the right to do all these things that we have the capacity to. So I think those are both super interesting and important, and I'd love to talk about both of them. On the uncertainty argument, this seems to me kind of different than the argument with respect to, say, climate change, which is the other very pressing environmental issue of our moment. There, there's overwhelming evidence that rising temperatures are going to have transformative, and transformatively bad, effects on huge numbers of human beings in the pretty near, foreseeable future. And if you're looking for clarion calls to action, that sounds like a pretty powerful one. It's human-focused, and it says things are getting bad in the following set of ways. With respect to biodiversity, isn't the uncertainty argument slightly mitigated by the history of big extinctions?
I mean, you call your book The Sixth Extinction because there were five massive prior extinctions, and those all took place, and they were caused by disasters, typically, of various kinds. Of course, they led to big changes in the nature of the biodiversity that was out there, but we don't know for sure that we would be the dinosaurs, as it were. And the question then becomes: we're balancing, at the human level, a whole bunch of competing interests. The main one, as far as I can make out, is the size of the population, because the report suggests that the most significant of the various things leading to the decline in biodiversity is just how many people we have, and growing food for them, and eating animals, and fishing for fish, and the range of other features that come with having so many people. So if we're trading off human values, like the population, against uncertainty, isn't that sort of different than the climate change context, where you know that we're in a lot of trouble, and we're in a lot of trouble very, very soon?
Elizabeth Kolbert: Well, I want to say, if you accept the idea that we're causing a mass extinction, a spasm of extinction potentially on the level of these mass extinctions of the past, the most recent of which, it's believed, was caused by an asteroid impact: that's the End-Cretaceous extinction, which did in not just the dinosaurs, which were a dominant form of life on land, but also lots of other major groups of organisms.

Noah Feldman: And also opened the door, not totally coincidentally, for our little rabbit-like, rodent-like ancestors.

Elizabeth Kolbert: Yes, absolutely. But even if you were a bird, or a small, kind of shrew-like mammal that lived through the End-Cretaceous extinction and eventually did give rise to us, I don't think that the End-Cretaceous extinction was an event you would have wanted to live through. It's not the kind of world you would have wanted to bequeath to your shrew-like children.
I think that the defining characteristic of a mass extinction is a lot of bad shit going down, and cascading effects that affect all groups. So yes, some made it through, absolutely. But to take that kind of view sixty-six million years later, it's easy to sort of blithely say, well, that doesn't look so bad from a distance of sixty-six million years.

Noah Feldman: But I think it looks bad. That's not the argument. But go on.

Elizabeth Kolbert: Okay. But also to say, well, you pay your money, you take your chances, we're the dominant organism now, we might come through this extinction event still the dominant organism... I don't think any biologist on the planet would say that would be a good bet to make.

Noah Feldman: But that's not, to be completely candid, what I think is the strongest version of the argument. The strongest version of the argument is something like this: we're involved in a world of constant trade-offs. We have a lot of people on the Earth.
Those people need to eat, and even if all they're eating is fruits and vegetables, they need agriculture in order to do that. If we could significantly change the degree to which we're relying on agriculture and other things to raise food, if we could do that in some way that would still let us affordably feed all these people that we have, that would be one thing. But we don't actually have the full capacity to do that yet.

Elizabeth Kolbert: Well, that's a hard one. I mean, you could say you're damned if you do, you're damned if you don't. We rely very, very heavily on the health of our soils, on pollination services that are delivered for free by lots of those insects that we were talking about. It's not clear that you can do without those and feed eight billion, going on nine billion, people.

Noah Feldman: Totally fair point, but it's a trade-off situation. Reducing population substantially, which was the view of all thoughtful, educated people a century ago? We don't think that's ethical anymore. We're beyond that kind of eugenics phase.
We accept the world's population for what it is, and so then we're in a world of trade-offs, and how do we make those trade-offs against uncertainty? The extinction at the Cretaceous-Tertiary boundary that we were talking about was an asteroidal extinction; it wasn't human-caused. This one is human-caused, and to some extent it's the result of humans making decisions that are designed to serve the interests of humans. They may be the wrong decisions, we may be getting it totally wrong, but they are decisions, for example, with respect to growing things that humans have made the judgment are necessary to keep us going. You're trading off, on the one hand, supporting human life against, on the other hand, the uncertainty of the consequences of the decline.

Elizabeth Kolbert: Yeah, and I think that is exactly what we're doing. We're doing it unconsciously, and I also want to say we are quite possibly doing it very unwisely. What looks like a good trade-off in the moment, right now in 2019: is that a good trade-off for our kids?
Is that a good trade-off for our kids' kids? I mean, these are questions that, unfortunately, from a purely scientific, biological, and probably even ethical view, are unanswerable, but I think they have to be factored in. One other point that I want to make, and I think it's pretty important: as we eat higher and higher, and as more and more of us eat higher and higher, as it were, on the food chain, we're using more and more calories to produce our own caloric intake. So I don't think it's at all true that land-use decisions are straightforward. They're very, very complicated, and they have to do with not just feeding eight billion people, but how we're feeding eight billion people.

Noah Feldman: For sure, I don't dispute that for a moment. I wanted to turn now to this ethical argument that you raised, which is different from the one where we're playing dice with our future. It's the argument that, as you put it, humans have the capacity to have an unprecedented effect on the environment.
That's why a lot of people think we're already in the Anthropocene, an age defined geologically as the time that humans are having an impact on the Earth. But, as you put it, we don't have the right to do that. I don't know if you hold that view, but I'm actually really curious to hear whether you do, and if so, to hear a little bit more about why you think that.

Elizabeth Kolbert: Well, I mean, obviously humans are one species out of, let's just say, eight million, right? I think the history of human consciousness, as it were, has been a kind of widening circle of our concern. And I think that we may look back at this particular moment, when we kind of blithely did in a lot of species, and a lot of species, I want to say, that are very, very close relatives of ours. I mean, orangutans, gorillas, chimps are all very highly threatened right now. So we are doing in species with a very high level of consciousness.
You can get into the question of whether we make a distinction, whether this ethical argument even allows for a distinction, between conscious creatures and unconscious creatures, and perhaps it doesn't. But I should point out that we are sort of doing in our evolutionarily very closest relatives, and I don't think that humanity, whatever becomes of us, is going to look back on this well. That's going to look like a crime, not necessarily crimes against humanity, but crimes against humanity's closest relatives.

Noah Feldman: I'm super fascinated by this line of thought.
I always have been, and a lot of environmentalism takes some part in this kind of mode of thinking: that we ought not to have this effect on the world, not just because we're causing pain, not just because we're eventually having terrible effects, as you said, killing off our closest cousins, but more broadly, that somehow we shouldn't be so focused on human consciousness. And I guess my question, and this is something I've always struggled with myself, so it's not like I think there's some magic bullet answer to it, is: what about the observation that, although of course we do damage far out of proportion to all other species, no other species seems to exercise this kind of concern for making sure that they take into account the interests of other species, or even other individuals of their own species? That just seems like another kind of, you know, human solipsism.
To be honest, I mean, this is meant to say that what would be solipsistic would be assuming that, because humans uniquely have the ability to think of the interests of others, therefore it must be that the correct ethical way to be is to think of the interests of others. The question is meant to come from the other side.

Elizabeth Kolbert: Well, I mean, yeah, that's one way to look at it. But I think that, once again, the history of the last five hundred years or whatever, from turning from a geocentric to a heliocentric world, should be sort of questioning, trying to decenter, humans. And we are faced with this really interesting situation, and Lord knows, if I had the answer to this, I would be happy to impart it. I don't, but I certainly think that all of these things are worth questioning pretty profoundly at this particular moment in time. And so we're faced with two really interesting, divergent trends, as it were, here. One is this, as I say, decentering of people: we're descended from a common ancestor with chimps.
The Sun does not revolve around the Earth; the Earth revolves around the Sun. We're one planet among millions and potentially billions, so we're pretty small specks in the universe, as it were. But then, also, one of the lessons of the last generation or so has been: wow, we are having a really significant impact on this one planet that we as humans call home, and on everything that shares it with us, and that shares an evolutionary history with us that we now know goes back almost four billion years. So those are pretty heavy numbers in both directions. They sort of point in conflicting directions, and the fact that we have not found a satisfactory way of working our way through them is unfortunate, because this damage is permanent. You know, when you get rid of a species, you have permanently cut off its evolutionary possibility. So we're really screwing with the evolution of life. And once again, this goes way beyond our kids and our kids' kids. It goes to the future of all life on Earth, forever.
I love the way you put that 311 00:17:38,276 --> 00:17:41,276 Speaker 1: that there's a deep contradiction between two two different insights, 312 00:17:41,316 --> 00:17:44,036 Speaker 1: one that we're just another species and the other that, boy, 313 00:17:44,076 --> 00:17:46,876 Speaker 1: we're not just another species. We're having this disproportionate impact 314 00:17:47,236 --> 00:17:50,596 Speaker 1: when it comes to screwing with evolution. To use your phrase, um, 315 00:17:51,196 --> 00:17:53,476 Speaker 1: are you a believer that that's always a bad thing? 316 00:17:53,516 --> 00:17:55,316 Speaker 1: I mean, I assume that you're in favor of all 317 00:17:55,396 --> 00:17:57,996 Speaker 1: kinds of ways that we tweak evolution that make us 318 00:17:58,076 --> 00:18:01,276 Speaker 1: better off as as humans, from you know, from antibiotics 319 00:18:01,676 --> 00:18:05,316 Speaker 1: on down. You know, I'm not I don't want to say, like, 320 00:18:05,756 --> 00:18:09,356 Speaker 1: you know, screwing with evolution is some you know, you're 321 00:18:09,436 --> 00:18:12,116 Speaker 1: right as you say, every time you know, you um, 322 00:18:13,036 --> 00:18:15,556 Speaker 1: you know, step on that you know, aunt or whatever 323 00:18:15,676 --> 00:18:18,596 Speaker 1: you've on some level, obviously everything is always you know, 324 00:18:18,676 --> 00:18:22,276 Speaker 1: screwing with you know, other organisms all the time. But 325 00:18:22,356 --> 00:18:27,636 Speaker 1: I think we very rarely knowingly and consciously and happily 326 00:18:28,636 --> 00:18:32,156 Speaker 1: cut off entire limbs of the evolutionary tree. And that 327 00:18:32,196 --> 00:18:36,356 Speaker 1: sort of brings me back to our closest relatives. I mean, 328 00:18:37,036 --> 00:18:41,316 Speaker 1: there were a lot of human cousins around at one point. 
329 00:18:41,356 --> 00:18:43,596 Speaker 1: We know more and more about different, you know, sort 330 00:18:43,596 --> 00:18:48,556 Speaker 1: of human species or subspecies. The Neanderthals, the Denisovans, there are 331 00:18:48,636 --> 00:18:52,036 Speaker 1: doubtless others out there that no longer exist, quite probably 332 00:18:52,036 --> 00:18:55,356 Speaker 1: because of us. And we are also looking at getting 333 00:18:55,436 --> 00:18:57,956 Speaker 1: rid of great apes. If you just look at the 334 00:18:57,956 --> 00:19:00,956 Speaker 1: future of the evolution of consciousness, right, if we consider 335 00:19:00,996 --> 00:19:04,516 Speaker 1: those to be the animals of the highest, you know, consciousness, 336 00:19:04,876 --> 00:19:09,396 Speaker 1: we are potentially creating a world where consciousness, you know, 337 00:19:09,436 --> 00:19:11,876 Speaker 1: will not arise again, or will not arise for a 338 00:19:11,996 --> 00:19:14,396 Speaker 1: very, very, very long time, and will, you know, take a 339 00:19:14,476 --> 00:19:16,956 Speaker 1: very, very different form, quite likely, at that point. But, 340 00:19:16,996 --> 00:19:19,276 Speaker 1: just so I understand, are you saying that if 341 00:19:19,316 --> 00:19:23,716 Speaker 1: those other higher primates continued to exist, and in 342 00:19:23,796 --> 00:19:26,676 Speaker 1: larger numbers, that there would be the possibility 343 00:19:26,916 --> 00:19:31,076 Speaker 1: of a repeat of the evolutionary process in which consciousness, 344 00:19:31,076 --> 00:19:34,436 Speaker 1: at least of the human type, would evolve another 345 00:19:34,636 --> 00:19:37,276 Speaker 1: time around? No, I'm not necessarily saying that, although 346 00:19:37,316 --> 00:19:39,836 Speaker 1: that is certainly possible.
I'm just saying that 347 00:19:40,436 --> 00:19:43,156 Speaker 1: if you consider, you know, our great ape, you know, 348 00:19:43,196 --> 00:19:45,716 Speaker 1: cousins to be conscious, and I certainly think that most, 349 00:19:45,796 --> 00:19:50,796 Speaker 1: you know, animal behaviorists would, that consciousness itself would continue 350 00:19:50,796 --> 00:19:53,276 Speaker 1: to exist. There would be other species of apes that 351 00:19:53,316 --> 00:19:57,596 Speaker 1: would eventually arise. They would not necessarily be humans. They 352 00:19:57,596 --> 00:19:59,796 Speaker 1: would be other things. They would be, you know, they 353 00:19:59,796 --> 00:20:03,876 Speaker 1: wouldn't necessarily have any more or less consciousness, you 354 00:20:03,876 --> 00:20:06,996 Speaker 1: know, than a gorilla does now, let's just say. But, um, 355 00:20:07,356 --> 00:20:10,196 Speaker 1: that would be a possibility, that would be an evolutionary possibility that 356 00:20:10,236 --> 00:20:12,356 Speaker 1: was open. If you do away with all the species 357 00:20:12,356 --> 00:20:14,396 Speaker 1: of great apes, as we are, you know, pretty much 358 00:20:14,436 --> 00:20:16,636 Speaker 1: on our way to doing right now, then you're just 359 00:20:16,676 --> 00:20:21,876 Speaker 1: cutting off that evolutionary possibility. And I'm simply using this 360 00:20:21,956 --> 00:20:25,516 Speaker 1: example once again of apes because, I think, 361 00:20:26,276 --> 00:20:29,916 Speaker 1: you know, there are a lot of other evolutionary branches 362 00:20:30,196 --> 00:20:33,276 Speaker 1: that are being, you know, cut off. A whole genus, 363 00:20:33,356 --> 00:20:36,276 Speaker 1: let us say, is disappearing, or there's only one species 364 00:20:36,356 --> 00:20:39,356 Speaker 1: left in the genus and it's disappearing.
I mentioned apes 365 00:20:39,396 --> 00:20:42,396 Speaker 1: because I think people have an emotional reaction to them, 366 00:20:42,396 --> 00:20:44,516 Speaker 1: and if they thought of a world without great apes, 367 00:20:44,876 --> 00:20:48,996 Speaker 1: and they think of that evolutionary possibility being closed off, 368 00:20:49,836 --> 00:20:54,396 Speaker 1: that would have some, you know, emotional connection that, you know, 369 00:20:54,436 --> 00:20:56,676 Speaker 1: if I tell you about this, you know, genus of, well, 370 00:20:56,796 --> 00:20:59,076 Speaker 1: I was just out in Nevada, and I, you know, 371 00:20:59,996 --> 00:21:03,276 Speaker 1: met, saw, I don't know what word you'd use, 372 00:21:03,396 --> 00:21:06,036 Speaker 1: encountered, you know, a fish that's the last fish 373 00:21:06,036 --> 00:21:09,396 Speaker 1: in its genus, because we've been sucking the water 374 00:21:09,396 --> 00:21:12,636 Speaker 1: out of the Nevada desert, so all the other species 375 00:21:12,636 --> 00:21:15,316 Speaker 1: in this genus are gone. I don't think that has 376 00:21:15,356 --> 00:21:18,356 Speaker 1: as much, you know, kind of connection to people, but 377 00:21:18,396 --> 00:21:21,396 Speaker 1: it's another way in which, you know, we are 378 00:21:21,556 --> 00:21:25,596 Speaker 1: cutting off that evolutionary possibility. Can I ask 379 00:21:25,636 --> 00:21:29,236 Speaker 1: about what we can do? I mean, what are the 380 00:21:29,276 --> 00:21:35,556 Speaker 1: real-world doable things that could substantially reduce the harm 381 00:21:35,636 --> 00:21:37,876 Speaker 1: that we're doing to biodiversity that's out there in 382 00:21:37,876 --> 00:21:43,996 Speaker 1: the world? Well, I think that, you know, the basic answer, 383 00:21:44,036 --> 00:21:47,876 Speaker 1: probably the first-order answer, gets back to 384 00:21:47,916 --> 00:21:51,116 Speaker 1: that question of land use that we were discussing.
I mean, 385 00:21:51,156 --> 00:21:57,796 Speaker 1: if we did not cut down the rest of the Amazon, 386 00:21:57,916 --> 00:22:01,076 Speaker 1: cut down the rest of the rainforest in the Congo, 387 00:22:01,276 --> 00:22:06,996 Speaker 1: you know, these big, big, still extant and, you know, 388 00:22:07,156 --> 00:22:12,596 Speaker 1: relatively intact, what are called biodiversity hotspots. The tropics 389 00:22:12,716 --> 00:22:15,956 Speaker 1: are where most of the diversity of the world exists. 390 00:22:16,036 --> 00:22:20,236 Speaker 1: There are, you know, historical reasons for that. The tropics 391 00:22:20,356 --> 00:22:22,516 Speaker 1: have had a fairly stable climate for a long 392 00:22:22,556 --> 00:22:26,716 Speaker 1: time and haven't had the ice ages and the temperature fluctuating, 393 00:22:26,836 --> 00:22:29,716 Speaker 1: all things which are bad for a species' continuation. Right? 394 00:22:29,996 --> 00:22:33,156 Speaker 1: There's just been a long history of life evolving under 395 00:22:33,156 --> 00:22:36,236 Speaker 1: relatively stable conditions. You know, that's one theory, 396 00:22:36,236 --> 00:22:38,956 Speaker 1: and it's a pretty popular theory. That's 397 00:22:38,956 --> 00:22:41,156 Speaker 1: why there's so much biodiversity in the tropics right now, 398 00:22:41,876 --> 00:22:45,076 Speaker 1: as opposed to in temperate zones like, you know, where 399 00:22:45,316 --> 00:22:47,796 Speaker 1: you and I live, where there was ice 400 00:22:47,836 --> 00:22:51,716 Speaker 1: covering this area until, you know, twelve thousand years 401 00:22:51,716 --> 00:22:54,916 Speaker 1: ago or so. So obviously everything that's repopulated where I 402 00:22:54,996 --> 00:22:58,436 Speaker 1: live in western Mass has arrived, re-arrived, in 403 00:22:58,476 --> 00:23:02,156 Speaker 1: the last twelve thousand years.
So if we, you know, 404 00:23:02,196 --> 00:23:04,796 Speaker 1: sort of got together, as it were, and decided, okay, 405 00:23:04,836 --> 00:23:07,596 Speaker 1: let's leave those parts of the world as intact 406 00:23:07,636 --> 00:23:12,796 Speaker 1: as possible, that would be one way to really probably 407 00:23:12,836 --> 00:23:17,076 Speaker 1: make a pretty big difference, not to stop these losses 408 00:23:17,236 --> 00:23:19,556 Speaker 1: by any stretch of the imagination, because there are all sorts 409 00:23:19,556 --> 00:23:22,836 Speaker 1: of other factors at work, including climate change, which that 410 00:23:23,236 --> 00:23:26,356 Speaker 1: report you alluded to, you know, talked about becoming 411 00:23:26,396 --> 00:23:28,796 Speaker 1: a major driver of extinction. It probably is not yet 412 00:23:28,836 --> 00:23:31,316 Speaker 1: a major driver of extinction, but it's looming very, 413 00:23:31,396 --> 00:23:34,996 Speaker 1: very large. So another, you know, point that I guess 414 00:23:35,036 --> 00:23:37,636 Speaker 1: I would make, getting back to this question: you know, 415 00:23:37,716 --> 00:23:40,116 Speaker 1: climate change is something we can all see as a 416 00:23:40,156 --> 00:23:43,876 Speaker 1: threat to, you know, life as we know it for humans. 417 00:23:43,956 --> 00:23:46,356 Speaker 1: A lot of the things that we would try to 418 00:23:46,436 --> 00:23:51,396 Speaker 1: do for our own good would also probably, although not necessarily, help other species, 419 00:23:51,996 --> 00:23:54,236 Speaker 1: because there are ways to combat climate change that would 420 00:23:54,276 --> 00:23:58,916 Speaker 1: have a very, you know, very significant land use change component 421 00:23:58,956 --> 00:24:01,556 Speaker 1: that would probably not be good for other species but 422 00:24:01,676 --> 00:24:04,836 Speaker 1: might be good for people. So that's another potential trade 423 00:24:04,876 --> 00:24:07,596 Speaker 1: off there.
But it's also possible that there are ways 424 00:24:07,596 --> 00:24:10,756 Speaker 1: to mitigate climate change that would have a big impact 425 00:24:11,196 --> 00:24:13,076 Speaker 1: on the rest of the species of the planet too. 426 00:24:13,156 --> 00:24:16,356 Speaker 1: So in some senses, we're not 427 00:24:16,436 --> 00:24:18,876 Speaker 1: pitting our interests against those of other species, but we actually 428 00:24:18,916 --> 00:24:22,276 Speaker 1: have a confluence of interests there. Elizabeth, you said something 429 00:24:22,316 --> 00:24:25,636 Speaker 1: that really grabbed me when you said that if we 430 00:24:25,716 --> 00:24:31,116 Speaker 1: could protect these tropical biodiversity hotspots, and you mentioned the 431 00:24:31,156 --> 00:24:34,556 Speaker 1: Amazon rainforest and the rainforest in the Congo, that we 432 00:24:34,596 --> 00:24:38,796 Speaker 1: would go some significant way towards reducing our attack on biodiversity. 433 00:24:38,796 --> 00:24:41,036 Speaker 1: And I have to say, you know, just by thinking 434 00:24:41,036 --> 00:24:42,876 Speaker 1: about this issue a little bit and reading your book,
435 00:24:42,876 --> 00:24:46,276 Speaker 1: I don't think I sufficiently took on board until you 436 00:24:46,356 --> 00:24:49,396 Speaker 1: just said it that, in a way, this all may 437 00:24:49,436 --> 00:24:51,196 Speaker 1: be more doable, or much of this may be more 438 00:24:51,276 --> 00:24:55,276 Speaker 1: doable, than we tend to think when we face major 439 00:24:55,996 --> 00:24:59,996 Speaker 1: environmental challenges, in the sense that, although there are obviously 440 00:25:00,076 --> 00:25:04,836 Speaker 1: very serious practical, logistical, and political challenges associated with reconfiguring 441 00:25:04,876 --> 00:25:08,636 Speaker 1: institutions to protect indigenous people, who often are the people 442 00:25:08,636 --> 00:25:14,836 Speaker 1: who live in these biodiversity hotspot zones, nevertheless, if you 443 00:25:14,836 --> 00:25:18,636 Speaker 1: could achieve a lot of this by protecting a handful 444 00:25:18,636 --> 00:25:22,636 Speaker 1: of identifiable areas on Earth, that seems a lot more 445 00:25:22,756 --> 00:25:28,036 Speaker 1: doable than sort of a global transformation in either population 446 00:25:28,116 --> 00:25:32,196 Speaker 1: numbers or in the way that food is grown, or 447 00:25:32,196 --> 00:25:34,156 Speaker 1: even in human diet. I mean, there may be lots 448 00:25:34,156 --> 00:25:35,876 Speaker 1: of great reasons to do all of those things, and I 449 00:25:35,876 --> 00:25:39,436 Speaker 1: don't dispute that, but when I think about the biodiversity challenge, 450 00:25:39,716 --> 00:25:42,036 Speaker 1: when you read the UN report, or at least 451 00:25:42,156 --> 00:25:44,916 Speaker 1: its summary, it sounds as though, oh my goodness, you 452 00:25:44,956 --> 00:25:46,916 Speaker 1: know, this is happening, it's inevitable, there's nothing that we 453 00:25:46,956 --> 00:25:50,476 Speaker 1: could really do substantially about it that would actually be doable.
454 00:25:50,516 --> 00:25:52,396 Speaker 1: But this seems a little more doable. This seems within 455 00:25:52,436 --> 00:25:55,716 Speaker 1: the realm of the reasonable, and it also seems like 456 00:25:55,716 --> 00:25:59,436 Speaker 1: it could be done without imposing tremendous costs on the 457 00:25:59,476 --> 00:26:02,316 Speaker 1: rest of the world. Is that slightly more optimistic take 458 00:26:02,356 --> 00:26:04,516 Speaker 1: that I just took away from what you said at 459 00:26:04,516 --> 00:26:07,236 Speaker 1: all credible? Or do you actually feel the pessimism that 460 00:26:07,276 --> 00:26:09,996 Speaker 1: I previously felt and that I think I took away 461 00:26:09,996 --> 00:26:12,316 Speaker 1: from your book, if you'll forgive my saying so? Well, 462 00:26:12,356 --> 00:26:15,036 Speaker 1: I guess I hover somewhere between the two. I mean, 463 00:26:15,516 --> 00:26:18,116 Speaker 1: Ed Wilson has a fairly recent book called Half- 464 00:26:18,196 --> 00:26:21,596 Speaker 1: Earth, where he sort of proposes putting aside, you know, 465 00:26:21,756 --> 00:26:24,476 Speaker 1: half of the land on Earth for other species. 466 00:26:24,516 --> 00:26:27,476 Speaker 1: And that sounds, you know, like an awful lot. But, 467 00:26:27,796 --> 00:26:30,756 Speaker 1: you know, when you look at these pretty intact places 468 00:26:30,796 --> 00:26:35,356 Speaker 1: like the boreal forest of Canada, the Amazon, the tropical 469 00:26:35,436 --> 00:26:40,196 Speaker 1: rainforest in parts of Africa, you could imagine it. 470 00:26:40,236 --> 00:26:43,676 Speaker 1: I think you can imagine, as you say, doing that. 471 00:26:43,796 --> 00:26:46,636 Speaker 1: Now, that requires us to think totally differently about the 472 00:26:46,636 --> 00:26:52,996 Speaker 1: world. It requires, you know, transnational agreements.
Unfortunately, you know, 473 00:26:53,076 --> 00:26:59,596 Speaker 1: there's now a body, and in fact, the very awkwardly acronymed IPBES, yes, 474 00:26:59,996 --> 00:27:03,356 Speaker 1: was, you know, sort of designed to be the Inter- 475 00:27:03,396 --> 00:27:07,236 Speaker 1: governmental Panel on Climate Change for biodiversity, to have some 476 00:27:07,356 --> 00:27:10,996 Speaker 1: group that was looking at this from an international perspective. Now, 477 00:27:11,036 --> 00:27:13,436 Speaker 1: do they have any clout? You know, that's a very, 478 00:27:13,636 --> 00:27:16,436 Speaker 1: very good question. Well, is there any talk of an 479 00:27:16,476 --> 00:27:20,396 Speaker 1: international treaty regime, for example, comparable to or connected to 480 00:27:20,396 --> 00:27:23,116 Speaker 1: the climate change treaty regime? Which, you know, it's a 481 00:27:23,196 --> 00:27:25,396 Speaker 1: topic for another day, of course, the great challenges that 482 00:27:25,476 --> 00:27:28,556 Speaker 1: even that treaty regime faces. But is there any movement 483 00:27:28,596 --> 00:27:32,076 Speaker 1: out there for a kind of treaty regime that would say, look, yes, 484 00:27:32,156 --> 00:27:36,716 Speaker 1: substantial wealth transferred to the countries that house these biodiversity 485 00:27:36,716 --> 00:27:42,716 Speaker 1: hotspots, in exchange for much stricter protection of those zones, 486 00:27:43,876 --> 00:27:48,796 Speaker 1: with international monitoring if necessary, and guarantees not to turn, 487 00:27:48,876 --> 00:27:52,236 Speaker 1: or allow that land to be turned, into farmland, 488 00:27:52,516 --> 00:27:54,636 Speaker 1: and again, paid for. I mean, this would be more 489 00:27:54,636 --> 00:27:56,796 Speaker 1: expensive for the local people. And I'm the last person 490 00:27:56,836 --> 00:27:59,356 Speaker 1: to want an arrangement that disadvantages the local people
and 491 00:27:59,396 --> 00:28:02,676 Speaker 1: says, well, because biodiversity hotspots are where you live, 492 00:28:02,716 --> 00:28:04,396 Speaker 1: you have to continue to live in poverty. That seemed 493 00:28:04,396 --> 00:28:06,116 Speaker 1: to me, personally at least, exactly the wrong way to 494 00:28:06,156 --> 00:28:08,676 Speaker 1: go about it. But you can imagine a relatively, as 495 00:28:08,756 --> 00:28:12,916 Speaker 1: these things go, straightforward treaty arrangement where the rest of 496 00:28:12,916 --> 00:28:16,476 Speaker 1: the world pays and gives advantages to the countries that 497 00:28:16,516 --> 00:28:20,316 Speaker 1: are the hosts of the hotspots, and they make certain commitments. 498 00:28:20,836 --> 00:28:23,116 Speaker 1: Is anybody talking about that? I mean, it seems logical 499 00:28:23,156 --> 00:28:25,556 Speaker 1: in light of what you've said. Well, there is 500 00:28:25,956 --> 00:28:29,476 Speaker 1: a companion treaty to the Framework Convention on Climate Change. 501 00:28:29,476 --> 00:28:34,476 Speaker 1: It's the Convention on Biological Diversity. It doesn't get nearly 502 00:28:34,476 --> 00:28:37,516 Speaker 1: as much press, but it does exist. I admit, I've never 503 00:28:37,556 --> 00:28:39,756 Speaker 1: heard of it until this moment, so you're right, it doesn't. 504 00:28:39,756 --> 00:28:42,716 Speaker 1: So go on, and what does it say? You know, 505 00:28:42,916 --> 00:28:46,436 Speaker 1: it is one of these... it has very high 506 00:28:46,516 --> 00:28:52,116 Speaker 1: aspirations for how we're going to, you know, preserve biological diversity, 507 00:28:52,196 --> 00:28:54,916 Speaker 1: but it has, you know, no teeth.
I think it 508 00:28:54,956 --> 00:28:59,596 Speaker 1: would be a platform on which you could work out international 509 00:28:59,916 --> 00:29:02,596 Speaker 1: transfers of wealth, as you say, but as far 510 00:29:02,636 --> 00:29:05,596 Speaker 1: as I know, those are not happening now. One way 511 00:29:05,676 --> 00:29:09,436 Speaker 1: in which that is happening is with, you know, these 512 00:29:09,436 --> 00:29:12,956 Speaker 1: sort of forest credits that people are using in 513 00:29:13,076 --> 00:29:17,956 Speaker 1: connection with, you know, climate change, right, as offsets, climate offsets. 514 00:29:17,956 --> 00:29:20,116 Speaker 1: If you don't cut down your rainforest, we 515 00:29:20,116 --> 00:29:21,716 Speaker 1: will pay you, and that kind of, sort of, 516 00:29:21,796 --> 00:29:25,916 Speaker 1: counts as a climate change mitigation effort. So that is 517 00:29:25,956 --> 00:29:28,996 Speaker 1: one arena, you know, where there is money changing hands, 518 00:29:28,996 --> 00:29:30,836 Speaker 1: probably not a great deal of money, but a certain 519 00:29:30,836 --> 00:29:33,956 Speaker 1: amount of money changing hands. I want to sort of 520 00:29:34,036 --> 00:29:37,596 Speaker 1: just close by asking if you see there being some 521 00:29:37,996 --> 00:29:42,036 Speaker 1: spots for hopefulness. If you don't, that's totally fair. 522 00:29:42,476 --> 00:29:46,116 Speaker 1: Um, you know, I'm reacting to the possible moves 523 00:29:46,196 --> 00:29:48,876 Speaker 1: of treaty and regulation, because if you can identify a 524 00:29:48,876 --> 00:29:52,476 Speaker 1: handful of big bad actors, it's humanly possible to pressure them. 525 00:29:52,516 --> 00:29:54,356 Speaker 1: I'm not saying it always works, but at 526 00:29:54,396 --> 00:29:57,756 Speaker 1: least there's a path forward that you can imagine, you know, following.
527 00:29:58,196 --> 00:29:59,796 Speaker 1: You can name and shame, you can put pressure, you 528 00:29:59,796 --> 00:30:02,476 Speaker 1: can try to produce a regulatory regime, and those, you know, 529 00:30:02,516 --> 00:30:04,876 Speaker 1: again, there are no guarantees, but they seem a little hopeful. 530 00:30:05,196 --> 00:30:06,996 Speaker 1: But are there other points that you would think of, 531 00:30:07,036 --> 00:30:10,596 Speaker 1: as, you know, gee, this looks like a way forward, this 532 00:30:10,636 --> 00:30:14,956 Speaker 1: looks like a possible success or a replicable success? Well, 533 00:30:14,996 --> 00:30:17,716 Speaker 1: I think in theory there are, but all of the 534 00:30:17,756 --> 00:30:22,356 Speaker 1: problems that we're looking at, environmental problems, cannot be solved 535 00:30:22,876 --> 00:30:26,756 Speaker 1: without some kind of international framework, or, not even solved, 536 00:30:26,756 --> 00:30:29,196 Speaker 1: I'm not even talking about solving them, cannot even be mitigated 537 00:30:29,796 --> 00:30:33,636 Speaker 1: without some kind of very, very serious international cooperation 538 00:30:33,756 --> 00:30:39,036 Speaker 1: and, as you suggest, probably transfer of technology, transfer of wealth. 539 00:30:39,636 --> 00:30:42,316 Speaker 1: But we are not willing to do that right now, 540 00:30:42,356 --> 00:30:44,996 Speaker 1: and we are moving in exactly the opposite direction.
We 541 00:30:45,036 --> 00:30:49,236 Speaker 1: have a moment of resurgent nationalism. When we look 542 00:30:49,276 --> 00:30:52,036 Speaker 1: back at this moment, which I unfortunately believe 543 00:30:52,116 --> 00:30:56,156 Speaker 1: will be looked at as a moment of just extraordinary 544 00:30:56,636 --> 00:31:02,276 Speaker 1: heedlessness and madness and consigning, you know, future generations to 545 00:31:02,356 --> 00:31:05,756 Speaker 1: some pretty bad stuff that we didn't have to consign 546 00:31:05,796 --> 00:31:08,916 Speaker 1: them to had we gotten our act together earlier, or 547 00:31:09,396 --> 00:31:13,516 Speaker 1: right now. I think that the fact that we are 548 00:31:13,676 --> 00:31:17,556 Speaker 1: seeing this resurgent nationalism at a moment when international cooperation, 549 00:31:17,596 --> 00:31:21,076 Speaker 1: you know, really couldn't be more pressing, will that be 550 00:31:21,116 --> 00:31:23,156 Speaker 1: looked at as a coincidence, or will that be looked 551 00:31:23,196 --> 00:31:27,116 Speaker 1: at as, you know, part of this human package 552 00:31:27,236 --> 00:31:29,996 Speaker 1: of not being willing to face up to the facts 553 00:31:30,036 --> 00:31:35,516 Speaker 1: until it's, you know, screamingly too late? Well, we humans 554 00:31:35,516 --> 00:31:39,196 Speaker 1: are better at collective action than any other species, and 555 00:31:39,276 --> 00:31:41,076 Speaker 1: we can do a lot more with it, and we 556 00:31:41,076 --> 00:31:43,156 Speaker 1: can do a lot more bad things with our collective 557 00:31:43,156 --> 00:31:47,996 Speaker 1: action, than any other species. And so 558 00:31:48,116 --> 00:31:51,236 Speaker 1: our distinctive features as a species are definitely at 559 00:31:51,276 --> 00:31:54,436 Speaker 1: the heart of the challenge we face with biodiversity. Thank 560 00:31:54,436 --> 00:32:04,076 Speaker 1: you so much. Thanks for having me.
Elizabeth Kolbert offers 561 00:32:04,156 --> 00:32:08,516 Speaker 1: us an extremely cogent and extremely depressing account of our 562 00:32:08,516 --> 00:32:13,756 Speaker 1: current moment. The problems we face are clearly international, and 563 00:32:13,796 --> 00:32:17,116 Speaker 1: if we want to avoid massive extinctions, they demand solutions. 564 00:32:17,916 --> 00:32:20,836 Speaker 1: It's not like we don't have the technology of international 565 00:32:20,916 --> 00:32:25,396 Speaker 1: governance available. We know what that technology is. It's called treaties. 566 00:32:25,836 --> 00:32:28,796 Speaker 1: It's called treaties with teeth that can actually be enforced 567 00:32:29,196 --> 00:32:34,196 Speaker 1: and that will make governments preserve the environment. Unfortunately, we 568 00:32:34,236 --> 00:32:38,556 Speaker 1: are also in a moment of profound skepticism, via nationalism, 569 00:32:38,796 --> 00:32:42,156 Speaker 1: of exactly the kind of international cooperation that would be 570 00:32:42,316 --> 00:32:45,476 Speaker 1: necessary if we were actually going to take on the 571 00:32:45,516 --> 00:32:50,356 Speaker 1: problem of the collapse of biodiversity. What comes next, I guess, 572 00:32:50,596 --> 00:32:54,716 Speaker 1: is nothing good, unless we're able to turn things around 573 00:32:55,316 --> 00:32:59,636 Speaker 1: and begin to re-examine the tools that actually let 574 00:32:59,716 --> 00:33:03,116 Speaker 1: governments compel each other to do things to preserve the environment. 575 00:33:05,716 --> 00:33:08,636 Speaker 1: Deep Background is brought to you by Pushkin Industries. Our 576 00:33:08,676 --> 00:33:11,756 Speaker 1: producer is Lydia Jean Kott, with engineering by Jason Gambrell 577 00:33:11,956 --> 00:33:15,716 Speaker 1: and Jason Rostkowski. Our showrunner is Sophie McKibbon.
Our theme 578 00:33:15,796 --> 00:33:18,596 Speaker 1: music is composed by Luis Guerra. Special thanks to the 579 00:33:18,596 --> 00:33:22,716 Speaker 1: Pushkin brass: Malcolm Gladwell, Jacob Weisberg, and Mia Lobel. I'm 580 00:33:22,756 --> 00:33:25,156 Speaker 1: Noah Feldman. You can follow me on Twitter at Noah 581 00:33:25,316 --> 00:33:27,756 Speaker 1: R Feldman. This is Deep Background.