1 00:00:00,080 --> 00:00:03,160 Speaker 1: Hey, quick announcement, everyone. We have just joined TikTok, so 2 00:00:03,240 --> 00:00:05,240 Speaker 1: head over there and follow us to see videos of 3 00:00:05,320 --> 00:00:20,000 Speaker 1: Daniel asking and answering science questions. All right, enjoy the pod. Hey, Kelly, 4 00:00:20,160 --> 00:00:22,760 Speaker 1: is a mosquito technically a parasite? 5 00:00:22,840 --> 00:00:25,720 Speaker 2: Well, you maybe don't realize the can of worms you've 6 00:00:25,720 --> 00:00:28,320 Speaker 2: opened up if you go to a parasitology meeting. This 7 00:00:28,360 --> 00:00:31,680 Speaker 2: is something that we actually fight about. But I'll just 8 00:00:31,720 --> 00:00:34,040 Speaker 2: cut to the chase, because I'm sure what you wanted 9 00:00:34,159 --> 00:00:36,920 Speaker 2: was a short answer. And I would say that 10 00:00:36,960 --> 00:00:38,280 Speaker 2: they are micro predators. 11 00:00:38,720 --> 00:00:42,800 Speaker 1: What, predators? You're saying that these bloodsuckers don't just make 12 00:00:42,840 --> 00:00:45,240 Speaker 1: me itch, they've turned me into their prey. 13 00:00:45,680 --> 00:00:48,120 Speaker 2: They do. And you know, unlike parasites, they don't have 14 00:00:48,280 --> 00:00:51,839 Speaker 2: like durable, long-lasting interactions with your body. They just 15 00:00:51,920 --> 00:00:54,160 Speaker 2: kind of take a meal and then they run off. 16 00:00:55,000 --> 00:00:56,800 Speaker 1: Does that mean that I don't have to feel bad 17 00:00:56,800 --> 00:00:57,840 Speaker 1: when I swat one of them? 18 00:00:57,960 --> 00:01:00,800 Speaker 2: I don't think they feel bad when they're drinking your blood. 19 00:01:02,320 --> 00:01:04,200 Speaker 1: All right, well, what if I was like going to 20 00:01:04,360 --> 00:01:06,840 Speaker 1: kill all of the mosquitoes, then should I feel bad? 21 00:01:07,080 --> 00:01:07,360 Speaker 3: Oh? 22 00:01:07,400 --> 00:01:09,760 Speaker 2: I feel like now we're getting into like philosophy. This 23 00:01:09,880 --> 00:01:12,040 Speaker 2: is like a twisted version of the trolley problem. 24 00:01:13,080 --> 00:01:15,760 Speaker 1: Well, you know, if I could pull a lever to 25 00:01:16,080 --> 00:01:19,119 Speaker 1: have that train kill every single mosquito, I would do it, 26 00:01:19,360 --> 00:01:21,680 Speaker 1: even if it saved nobody's lives, even if it just 27 00:01:21,720 --> 00:01:23,080 Speaker 1: saved us from some itches. 28 00:01:24,720 --> 00:01:27,360 Speaker 2: You don't need philosophy. You know the answer: you go 29 00:01:27,480 --> 00:01:28,119 Speaker 2: with your gut. 30 00:01:43,720 --> 00:01:46,600 Speaker 1: Hi. I'm Daniel. I'm a particle physicist and a professor 31 00:01:46,640 --> 00:01:50,160 Speaker 1: at UC Irvine and a deep, deep hater of mosquitoes. 32 00:01:50,480 --> 00:01:53,880 Speaker 2: I'm Kelly Weinersmith. I'm an adjunct professor at Rice University, 33 00:01:53,920 --> 00:01:56,360 Speaker 2: and you know, I'm also a deep, deep hater of mosquitoes. 34 00:01:56,640 --> 00:01:59,200 Speaker 1: I thought that as a parasitologist, you were like the 35 00:01:59,240 --> 00:02:01,760 Speaker 1: biggest advocate for the most hated species on 36 00:02:01,840 --> 00:02:02,320 Speaker 1: the planet.
37 00:02:02,360 --> 00:02:05,480 Speaker 2: I'm an advocate for some parasites, but, you know, mosquitoes kill 38 00:02:05,520 --> 00:02:07,240 Speaker 2: a lot of humans, and I don't think that we 39 00:02:07,320 --> 00:02:09,400 Speaker 2: really know what would happen if you took them out 40 00:02:09,440 --> 00:02:12,400 Speaker 2: of the ecosystems. Maybe they play a role that we 41 00:02:12,440 --> 00:02:14,760 Speaker 2: don't know about. But I feel like if you eradicated 42 00:02:14,800 --> 00:02:17,280 Speaker 2: all of them and then no one got malaria anymore, 43 00:02:17,720 --> 00:02:19,840 Speaker 2: we could probably find some way to fill in the 44 00:02:19,880 --> 00:02:22,600 Speaker 2: ecological niches that they were leaving empty. I think it 45 00:02:22,639 --> 00:02:23,919 Speaker 2: would be worth it to kill them all. 46 00:02:24,040 --> 00:02:26,839 Speaker 1: Oh, I think I know what purpose mosquitoes serve. They 47 00:02:26,840 --> 00:02:28,760 Speaker 1: serve to limit people's happiness. 48 00:02:28,960 --> 00:02:32,200 Speaker 2: You feel like there's some mechanism on this earth where 49 00:02:32,200 --> 00:02:34,960 Speaker 2: that's like a thing that needs to happen? I thought 50 00:02:35,000 --> 00:02:35,920 Speaker 2: that's what Twitter was for. 51 00:02:36,960 --> 00:02:39,679 Speaker 1: They call it X now, Kelly. They call it X. 52 00:02:39,720 --> 00:02:40,960 Speaker 2: Oh, I'm sorry, I'm so behind. 53 00:02:41,040 --> 00:02:46,119 Speaker 1: Let's eliminate both of them, the mosquito of the Internet. Well, 54 00:02:46,160 --> 00:02:49,560 Speaker 1: welcome to the podcast Daniel and Jorge Explain the Universe, 55 00:02:49,680 --> 00:02:52,720 Speaker 1: in which we dive deep into the joys of philosophy 56 00:02:52,840 --> 00:02:56,679 Speaker 1: and physics and cosmology and think about everything that's out 57 00:02:56,680 --> 00:02:59,000 Speaker 1: there in the universe. We want to understand how it 58 00:02:59,080 --> 00:03:01,560 Speaker 1: all works. We want to make sense of the universe. 59 00:03:01,720 --> 00:03:03,880 Speaker 1: We want to boil down all of those frothing 60 00:03:03,960 --> 00:03:07,080 Speaker 1: quantum particles into a story that fits into your mind, 61 00:03:07,080 --> 00:03:10,840 Speaker 1: that clicks together and goes, ah, I understand how it works. 62 00:03:11,040 --> 00:03:13,320 Speaker 2: If only all questions were that straightforward. 63 00:03:14,800 --> 00:03:16,880 Speaker 1: Well, physics at least has a goal: to 64 00:03:17,000 --> 00:03:19,360 Speaker 1: tell you a story that makes sense to you, to 65 00:03:19,600 --> 00:03:23,239 Speaker 1: incorporate into your mind a mental model of the whole universe. 66 00:03:23,600 --> 00:03:25,840 Speaker 1: We don't dare to do that with chemistry, and definitely 67 00:03:25,880 --> 00:03:28,840 Speaker 1: not with biology, but sometimes we can take a tiny 68 00:03:28,880 --> 00:03:31,640 Speaker 1: little sliver of physics and download it into your brain. 69 00:03:32,080 --> 00:03:34,400 Speaker 1: My normal friend and co-host Jorge can't be here today, 70 00:03:34,440 --> 00:03:36,160 Speaker 1: but I'm very excited to be here with you, Kelly. 71 00:03:36,240 --> 00:03:38,080 Speaker 2: I am super excited to be here with you.
And 72 00:03:38,160 --> 00:03:40,480 Speaker 2: while I'm always super excited to be here with you, 73 00:03:40,600 --> 00:03:43,040 Speaker 2: I'm particularly excited that we're a little bit more in 74 00:03:43,120 --> 00:03:47,840 Speaker 2: my niche today, talking about, you know, ecology and species conservation, 75 00:03:48,080 --> 00:03:49,400 Speaker 2: and it's going to be a good time. 76 00:03:49,600 --> 00:03:52,440 Speaker 1: That's right, because we usually talk about real science on 77 00:03:52,480 --> 00:03:56,760 Speaker 1: the podcast, how the universe actually works. No, oh no, 78 00:03:57,200 --> 00:03:58,720 Speaker 1: this is not a biology slam. 79 00:03:58,800 --> 00:04:02,240 Speaker 2: You are going the way of the mosquitoes, Daniel. 80 00:04:02,320 --> 00:04:06,280 Speaker 1: Oh no, no, you totally misunderstood. No, no, let me finish. 81 00:04:06,640 --> 00:04:08,640 Speaker 1: I was going to say that usually we talk about 82 00:04:08,680 --> 00:04:11,960 Speaker 1: real science, but today we're talking about science fiction, not 83 00:04:12,160 --> 00:04:14,800 Speaker 1: that biology is not real science. I think you're 84 00:04:14,800 --> 00:04:16,000 Speaker 1: a little too defensive. 85 00:04:17,440 --> 00:04:20,839 Speaker 2: Maybe I'm a little sensitive. I feel like physicists often 86 00:04:20,880 --> 00:04:24,159 Speaker 2: look down on biology. I saw a talk by Freeman 87 00:04:24,240 --> 00:04:27,119 Speaker 2: Dyson where he was certainly doing that. Anyway, I'll stop 88 00:04:27,120 --> 00:04:29,039 Speaker 2: being so sensitive. 89 00:04:29,120 --> 00:04:31,479 Speaker 1: No, fair point. Physicists have been guilty of that in the past. I 90 00:04:31,480 --> 00:04:34,000 Speaker 1: think it was Rutherford who said all science is either physics 91 00:04:34,080 --> 00:04:36,880 Speaker 1: or stamp collecting, and I definitely do not agree with that. 92 00:04:37,200 --> 00:04:41,159 Speaker 1: There is glory in chemistry and wondrous questions in biology. 93 00:04:41,360 --> 00:04:43,720 Speaker 1: But today we are stepping beyond the bounds of all 94 00:04:43,800 --> 00:04:47,239 Speaker 1: kinds of science into the worlds of science fiction, because 95 00:04:47,279 --> 00:04:49,440 Speaker 1: one of the roles of science fiction is to think 96 00:04:49,480 --> 00:04:52,120 Speaker 1: about the boundaries of science. What's beyond it? What other 97 00:04:52,240 --> 00:04:55,039 Speaker 1: universes could we live in? What are the consequences of 98 00:04:55,080 --> 00:04:58,200 Speaker 1: the technology that we develop? If science keeps barreling forward, 99 00:04:58,200 --> 00:04:59,920 Speaker 1: does it change the way we live and what it's 100 00:05:00,160 --> 00:05:01,880 Speaker 1: like to be a human being and the choices that 101 00:05:01,880 --> 00:05:03,600 Speaker 1: we have to make? And I think that there's a 102 00:05:03,680 --> 00:05:06,400 Speaker 1: very close connection between the work of scientists and the 103 00:05:06,440 --> 00:05:09,320 Speaker 1: imagination of science fiction. So we have a series of 104 00:05:09,320 --> 00:05:12,599 Speaker 1: episodes on this podcast where we read a science fiction novel, 105 00:05:12,800 --> 00:05:15,000 Speaker 1: talk about the science in it, and then interview the 106 00:05:15,080 --> 00:05:18,359 Speaker 1: author about how they wrote it and why they decided 107 00:05:18,400 --> 00:05:20,960 Speaker 1: to build their science fiction universe the way they did.
108 00:05:21,240 --> 00:05:22,960 Speaker 2: And most of the episodes that you and I have 109 00:05:23,080 --> 00:05:26,119 Speaker 2: done together with sci fi writers have been about being 110 00:05:26,160 --> 00:05:28,919 Speaker 2: moved into a world that's totally unlike our own, and 111 00:05:28,960 --> 00:05:31,000 Speaker 2: so they sort of build this brand new universe 112 00:05:31,040 --> 00:05:33,040 Speaker 2: and you get to enjoy living in it for a while. 113 00:05:33,320 --> 00:05:35,160 Speaker 2: What's so exciting about today is that it's more 114 00:05:35,240 --> 00:05:38,120 Speaker 2: near term and you're left thinking, oh, gosh, is this 115 00:05:38,200 --> 00:05:40,720 Speaker 2: going to be us in a few decades? And so 116 00:05:40,760 --> 00:05:42,760 Speaker 2: it's a little bit different than what we usually do, 117 00:05:42,800 --> 00:05:44,120 Speaker 2: and I really enjoyed it. 118 00:05:44,200 --> 00:05:46,120 Speaker 1: This is a wonderful book we'll be talking about today. 119 00:05:46,400 --> 00:05:50,599 Speaker 1: It's called Venomous Lumpsucker, a novel by Ned Beauman, 120 00:05:50,960 --> 00:05:54,120 Speaker 1: and it asks a really intriguing question about the nature of extinction: 121 00:05:54,520 --> 00:05:56,800 Speaker 1: what price we should have to pay to drive a 122 00:05:56,880 --> 00:06:00,560 Speaker 1: species extinct, whether we should care about species going extinct. 123 00:06:00,720 --> 00:06:03,680 Speaker 1: For example, does the mosquito deserve to live? 124 00:06:03,960 --> 00:06:06,560 Speaker 2: I don't think that's something that he talked about in particular. 125 00:06:06,600 --> 00:06:08,960 Speaker 2: But I did appreciate that there was a whole sort 126 00:06:09,000 --> 00:06:11,599 Speaker 2: of speech about how we should be thinking about parasites 127 00:06:11,640 --> 00:06:14,279 Speaker 2: and conservation, and that was very refreshing to me. 128 00:06:14,480 --> 00:06:16,720 Speaker 1: Yeah, it was fascinating. I thought about you as I 129 00:06:16,760 --> 00:06:19,040 Speaker 1: was reading this book. Today on the podcast, we'll be 130 00:06:19,120 --> 00:06:28,400 Speaker 1: covering the science fiction universe of Venomous Lumpsucker. 131 00:06:28,600 --> 00:06:29,400 Speaker 2: What a great name. 132 00:06:30,600 --> 00:06:33,960 Speaker 1: Everybody I tell about this book invariably says, the what? 133 00:06:34,360 --> 00:06:36,479 Speaker 1: Are you serious? Who would name their book that? 134 00:06:36,839 --> 00:06:38,919 Speaker 2: But I also like totally bought it. Like when I 135 00:06:38,960 --> 00:06:42,240 Speaker 2: first read the title, I thought, ah, I thought I 136 00:06:42,279 --> 00:06:44,960 Speaker 2: knew about a lot of the weird fish, but venomous 137 00:06:45,000 --> 00:06:48,159 Speaker 2: lumpsucker? I can totally buy that there's a fish called the venomous lumpsucker. 138 00:06:48,240 --> 00:06:50,320 Speaker 2: But I was wrong. But anyway, he came up with 139 00:06:50,360 --> 00:06:51,640 Speaker 2: a really glorious creature. 140 00:06:51,880 --> 00:06:54,200 Speaker 1: He really did. So let's tell people the setting of the book, 141 00:06:54,240 --> 00:06:56,479 Speaker 1: and then we'll dig into what the story is. The 142 00:06:56,520 --> 00:06:59,479 Speaker 1: book is set somewhere in the near future on Earth, 143 00:06:59,520 --> 00:07:01,760 Speaker 1: and it's very much set in our universe.
This is 144 00:07:01,760 --> 00:07:03,480 Speaker 1: not the kind of book where they invent all sorts 145 00:07:03,520 --> 00:07:05,719 Speaker 1: of new physics, and the universe is very different and 146 00:07:05,760 --> 00:07:08,120 Speaker 1: we have faster-than-light travel. This basically takes place 147 00:07:08,160 --> 00:07:11,400 Speaker 1: on Earth, in Europe, in about fifteen years, and it's 148 00:07:11,440 --> 00:07:14,640 Speaker 1: facing the question of how do we cope with this 149 00:07:14,800 --> 00:07:18,040 Speaker 1: massive extinction event? So many species, so many little beetles 150 00:07:18,320 --> 00:07:21,280 Speaker 1: are going extinct every day. What should we do about it? 151 00:07:21,280 --> 00:07:23,840 Speaker 1: What can we do about it? And this book paints 152 00:07:23,840 --> 00:07:27,200 Speaker 1: this specific picture about how society might handle it. 153 00:07:27,440 --> 00:07:28,960 Speaker 2: And I thought he did a really nice job of 154 00:07:29,040 --> 00:07:31,520 Speaker 2: creating a world that you could imagine, like if we 155 00:07:31,600 --> 00:07:35,080 Speaker 2: take all the wrong turns between now and like fifteen 156 00:07:35,120 --> 00:07:38,240 Speaker 2: to twenty, maybe thirty years from now, we could be there. 157 00:07:38,280 --> 00:07:40,440 Speaker 2: And so, you know, right now we use carbon credits, 158 00:07:40,560 --> 00:07:43,040 Speaker 2: and companies can buy carbon credits, you know, to essentially 159 00:07:43,560 --> 00:07:46,760 Speaker 2: pay for the right to release more carbon into the atmosphere. 160 00:07:47,000 --> 00:07:49,600 Speaker 2: Here he's created extinction credits, where if you're going to 161 00:07:49,680 --> 00:07:52,040 Speaker 2: start some big new project that's going to result in 162 00:07:52,080 --> 00:07:54,400 Speaker 2: the extinction of some species, you can sort of pay 163 00:07:54,440 --> 00:07:57,280 Speaker 2: for that with extinction credits. And you know, there's like 164 00:07:57,360 --> 00:08:00,240 Speaker 2: a sliding scale for how many extinction credits you need 165 00:08:00,240 --> 00:08:02,760 Speaker 2: depending on some characteristics of the animal that you're going 166 00:08:02,800 --> 00:08:05,840 Speaker 2: to have go extinct. But like I bought that this 167 00:08:05,920 --> 00:08:08,720 Speaker 2: is a path we could go down in a couple decades. 168 00:08:08,760 --> 00:08:09,640 Speaker 2: What about you, Daniel? 169 00:08:09,680 --> 00:08:13,560 Speaker 1: I thought it was both inspired and creative, and also very realistic. 170 00:08:14,040 --> 00:08:16,160 Speaker 1: We often have trouble figuring out like how do we 171 00:08:16,200 --> 00:08:18,440 Speaker 1: solve a problem, and when we can't figure out what 172 00:08:18,480 --> 00:08:20,920 Speaker 1: direction to go in, we basically just turn it over 173 00:08:20,960 --> 00:08:23,880 Speaker 1: to capitalism. We're like, can we financialize this? Can we 174 00:08:23,920 --> 00:08:26,480 Speaker 1: incentivize people to do the right thing by making it 175 00:08:26,560 --> 00:08:29,960 Speaker 1: expensive to do the wrong thing? And I feel like 176 00:08:30,000 --> 00:08:32,000 Speaker 1: that's sort of clever, like turn it over to the 177 00:08:32,000 --> 00:08:34,319 Speaker 1: free market. But it also feels like sort of an 178 00:08:34,360 --> 00:08:38,000 Speaker 1: abdication of our responsibility.
But then again, we can never 179 00:08:38,040 --> 00:08:40,480 Speaker 1: really decide on how to do anything, so it's better 180 00:08:40,520 --> 00:08:43,360 Speaker 1: to do something than nothing, I suppose. But this was 181 00:08:43,400 --> 00:08:47,880 Speaker 1: really uncomfortable to read about, this like financialization of extinction. 182 00:08:48,520 --> 00:08:50,720 Speaker 1: It really reminds me of like putting a price on 183 00:08:50,800 --> 00:08:52,920 Speaker 1: a human life. You know, when the government has to 184 00:08:52,920 --> 00:08:55,600 Speaker 1: make decisions about like how much to spend on things, 185 00:08:55,720 --> 00:08:58,199 Speaker 1: or should a company have to install seat belts? They 186 00:08:58,200 --> 00:09:00,600 Speaker 1: do so if the price of the seat belt is less 187 00:09:00,600 --> 00:09:02,840 Speaker 1: than the expected loss of human life. You know, you 188 00:09:02,880 --> 00:09:05,679 Speaker 1: have to calculate, like, oh, a human life is ten million dollars. 189 00:09:06,080 --> 00:09:08,360 Speaker 1: Makes me wonder, like, well, if I had ten million dollars, 190 00:09:08,400 --> 00:09:11,199 Speaker 1: could I buy a murder credit to like kill somebody 191 00:09:11,240 --> 00:09:13,280 Speaker 1: and then give that money to the family and be 192 00:09:13,360 --> 00:09:15,439 Speaker 1: like I bought the right to kill your husband, right? 193 00:09:15,640 --> 00:09:18,040 Speaker 1: That seems terrible. This is just sort of the same 194 00:09:18,080 --> 00:09:19,319 Speaker 1: thing on a larger scale. 195 00:09:19,600 --> 00:09:22,440 Speaker 2: Yeah, I mean, I do think humans, for better or worse, 196 00:09:22,520 --> 00:09:26,240 Speaker 2: feel more comfortable doing that with non-human animals, and 197 00:09:26,320 --> 00:09:28,199 Speaker 2: like we get more uncomfortable, you know, if you're talking 198 00:09:28,240 --> 00:09:32,240 Speaker 2: about like, well, this chimpanzee really made me angry. Like, people would, 199 00:09:32,559 --> 00:09:34,280 Speaker 2: you know, maybe make you pay more to take the 200 00:09:34,320 --> 00:09:37,720 Speaker 2: chimpanzee out than, like, the stink bug that lives on 201 00:09:37,760 --> 00:09:40,199 Speaker 2: your curtain or something. But yeah, no, I agree, these 202 00:09:40,240 --> 00:09:44,240 Speaker 2: things are complicated and uncomfortable. And I thought he also 203 00:09:44,240 --> 00:09:46,800 Speaker 2: did a really nice job of sort of like weaving 204 00:09:46,920 --> 00:09:49,960 Speaker 2: in the way that even our best intentions can get 205 00:09:50,000 --> 00:09:53,320 Speaker 2: corrupted by things, like, you know, the market doesn't always 206 00:09:53,360 --> 00:09:55,320 Speaker 2: do what we think it's gonna do, even though we 207 00:09:55,360 --> 00:09:57,520 Speaker 2: like maybe should have been able to anticipate the two 208 00:09:57,600 --> 00:10:01,400 Speaker 2: thousand and eight, you know, financial crisis, because what were 209 00:10:01,400 --> 00:10:03,760 Speaker 2: we doing with the banking system and housing back then? 210 00:10:04,080 --> 00:10:06,480 Speaker 2: But we didn't.
And so, you know, things don't go 211 00:10:06,679 --> 00:10:09,880 Speaker 2: exactly the way the characters thought they were going to 212 00:10:09,960 --> 00:10:11,880 Speaker 2: go with these extinction credits and how they're going to 213 00:10:11,920 --> 00:10:14,800 Speaker 2: pay out monetarily, and you know, this stuff doesn't always 214 00:10:14,800 --> 00:10:16,200 Speaker 2: work the way we think it will, and so this 215 00:10:16,240 --> 00:10:18,000 Speaker 2: is sort of a story about things going kind of 216 00:10:18,000 --> 00:10:18,760 Speaker 2: horribly awry. 217 00:10:20,160 --> 00:10:22,439 Speaker 1: Yeah, it is sort of a cautionary tale, and I 218 00:10:22,440 --> 00:10:25,080 Speaker 1: thought it was super thoughtful and creative. There are so 219 00:10:25,080 --> 00:10:26,720 Speaker 1: many things that happened in the book that surprised me. 220 00:10:26,920 --> 00:10:28,240 Speaker 1: And then as soon as I thought about it, I thought, 221 00:10:28,240 --> 00:10:31,240 Speaker 1: you know what, that's totally realistic. Like, that's exactly what 222 00:10:31,360 --> 00:10:33,199 Speaker 1: could happen. And to me, that's the best kind of 223 00:10:33,240 --> 00:10:36,439 Speaker 1: science fiction: somebody who's like creatively thought about the consequences 224 00:10:36,440 --> 00:10:38,640 Speaker 1: of some new technology. And you know, the way in 225 00:10:38,720 --> 00:10:42,760 Speaker 1: the book lobbyists and special interests add like loopholes and 226 00:10:42,880 --> 00:10:45,480 Speaker 1: exceptions and they end up like driving down the cost 227 00:10:45,520 --> 00:10:48,680 Speaker 1: of extinction credits to make it like horribly cheap to 228 00:10:48,800 --> 00:10:52,320 Speaker 1: send some caterpillar off to its final demise. I thought 229 00:10:52,320 --> 00:10:55,000 Speaker 1: that was very realistic. Another aspect of the book which 230 00:10:55,000 --> 00:10:58,120 Speaker 1: is super fascinating is the role of technology. He thinks 231 00:10:58,160 --> 00:11:01,040 Speaker 1: about what it really means for a species to become 232 00:11:01,120 --> 00:11:04,559 Speaker 1: extinct when you can record it, when you can record 233 00:11:04,600 --> 00:11:07,600 Speaker 1: its genome and its behavior and get samples of it, 234 00:11:07,640 --> 00:11:10,040 Speaker 1: and if it's possible to bring a species back, is 235 00:11:10,080 --> 00:11:11,079 Speaker 1: it still extinct? 236 00:11:11,200 --> 00:11:13,160 Speaker 2: And this is a topic that's like near and dear 237 00:11:13,240 --> 00:11:16,000 Speaker 2: to my heart. You know, the positive implications that we 238 00:11:16,240 --> 00:11:18,679 Speaker 2: think will happen when we create a technology. So, you know, 239 00:11:18,800 --> 00:11:21,559 Speaker 2: right now people are working on de-extinction, like can 240 00:11:21,600 --> 00:11:24,280 Speaker 2: you bring back the mammoth and put it back in, 241 00:11:24,320 --> 00:11:26,920 Speaker 2: you know, the permafrost habitats in Russia to 242 00:11:26,960 --> 00:11:29,200 Speaker 2: try to make the habitat what it once was, which 243 00:11:29,200 --> 00:11:30,880 Speaker 2: would be better for all of the other species that 244 00:11:30,880 --> 00:11:33,280 Speaker 2: were there.
But you know, a lot of these technologies, 245 00:11:33,360 --> 00:11:37,400 Speaker 2: even if they were envisioned with only positive implications, the 246 00:11:37,480 --> 00:11:40,080 Speaker 2: way they get rolled out can often have some pretty 247 00:11:40,080 --> 00:11:44,320 Speaker 2: negative implications. So here you see they're working on, you know, 248 00:11:44,840 --> 00:11:47,720 Speaker 2: figuring out ways to store biological information so that you 249 00:11:47,760 --> 00:11:50,560 Speaker 2: can not only bring back sort of members of a species, 250 00:11:50,600 --> 00:11:54,120 Speaker 2: but you could even bring back specific individuals. And I think, 251 00:11:54,240 --> 00:11:57,880 Speaker 2: you know, I don't know what the original plan was 252 00:11:58,120 --> 00:12:00,240 Speaker 2: in the book for the people who made these technologies, 253 00:12:00,280 --> 00:12:03,440 Speaker 2: but I can certainly imagine on Earth people having really 254 00:12:03,520 --> 00:12:08,720 Speaker 2: good intentions creating that technology, and then it makes extinction blurry. 255 00:12:08,840 --> 00:12:12,120 Speaker 2: Like, is, you know, the beetle that you study really 256 00:12:12,160 --> 00:12:14,839 Speaker 2: extinct if everything that you need to bring it back 257 00:12:15,040 --> 00:12:17,520 Speaker 2: is in a computer and you could recreate it in 258 00:12:17,600 --> 00:12:20,199 Speaker 2: a lab at some point? So, you know, these technologies 259 00:12:20,200 --> 00:12:22,120 Speaker 2: that are meant to help but, like, can get used 260 00:12:22,160 --> 00:12:24,440 Speaker 2: in the wrong way and really mess up incentives are 261 00:12:24,720 --> 00:12:26,439 Speaker 2: to me a fascinating topic that I felt like he 262 00:12:26,480 --> 00:12:27,680 Speaker 2: handled really well in the book. 263 00:12:27,800 --> 00:12:30,800 Speaker 1: Yeah, and lots of really interesting questions that seem initially 264 00:12:30,800 --> 00:12:33,600 Speaker 1: like they have obvious answers, like when is a species extinct? 265 00:12:33,600 --> 00:12:36,199 Speaker 1: You might say, well, when there are no more living individuals, right? 266 00:12:36,240 --> 00:12:38,160 Speaker 1: But then 267 00:12:38,200 --> 00:12:41,520 Speaker 1: he walks you through these arguments in a really thoughtful way, like, well, 268 00:12:41,559 --> 00:12:43,960 Speaker 1: if there are a few living individuals but they can't 269 00:12:43,960 --> 00:12:46,280 Speaker 1: reproduce, because it's too small a group or they can 270 00:12:46,360 --> 00:12:50,480 Speaker 1: only live in zoos, then is that really somehow less 271 00:12:50,480 --> 00:12:53,840 Speaker 1: extinct than another species where there are no living individuals 272 00:12:53,880 --> 00:12:56,240 Speaker 1: but we have the capacity to make more because of 273 00:12:56,280 --> 00:12:58,800 Speaker 1: our technology and we could bring them back? Which species 274 00:12:58,880 --> 00:13:02,640 Speaker 1: really is more extinct in that case? So it's very persuasive. 275 00:13:02,679 --> 00:13:05,400 Speaker 1: It really changed my mind on a lot of these tricky questions. 276 00:13:05,520 --> 00:13:07,720 Speaker 2: Yeah, the book tackles a lot of difficult questions. 277 00:13:08,040 --> 00:13:09,520 Speaker 2: So what is the story about, Daniel?
278 00:13:09,720 --> 00:13:12,400 Speaker 1: Yeah, so it's not just like here's a future Earth 279 00:13:12,440 --> 00:13:15,000 Speaker 1: where everything is going wrong. It tells the story, and 280 00:13:15,040 --> 00:13:17,040 Speaker 1: it's from the point of view of a biologist who's 281 00:13:17,080 --> 00:13:21,200 Speaker 1: being asked to assess whether a particular species is intelligent, because, 282 00:13:21,200 --> 00:13:23,600 Speaker 1: as you said, it costs more extinction credits in this 283 00:13:23,720 --> 00:13:26,880 Speaker 1: book to kill something if it's intelligent, which I guess 284 00:13:26,920 --> 00:13:29,600 Speaker 1: makes sense, but also it feels really icky. And her 285 00:13:29,720 --> 00:13:33,000 Speaker 1: job is to assess whether the venomous lumpsucker is an 286 00:13:33,000 --> 00:13:36,080 Speaker 1: intelligent species or not. And it turns out she has 287 00:13:36,120 --> 00:13:38,440 Speaker 1: her own stake in this game. She wants it to 288 00:13:38,440 --> 00:13:41,800 Speaker 1: be intelligent for her own reasons. And in the book, 289 00:13:41,960 --> 00:13:44,800 Speaker 1: some corporation comes along and accidentally kills off all of 290 00:13:44,800 --> 00:13:47,760 Speaker 1: the venomous lumpsuckers. Or do they? And then these places 291 00:13:47,800 --> 00:13:51,600 Speaker 1: where people store these species, these biobanks, get hacked, 292 00:13:51,640 --> 00:13:54,320 Speaker 1: and the whole question of whether species are extinct becomes 293 00:13:54,400 --> 00:13:57,480 Speaker 1: much more fuzzy and questionable. It's a really exciting sort 294 00:13:57,520 --> 00:13:59,480 Speaker 1: of thriller that takes you through this world. 295 00:13:59,720 --> 00:14:02,120 Speaker 2: Yeah, it's a totally fascinating world. Can you tell me 296 00:14:02,120 --> 00:14:04,440 Speaker 2: a little bit more about some of the science that 297 00:14:04,600 --> 00:14:07,560 Speaker 2: was sort of created or extrapolated for the story in particular? 298 00:14:07,720 --> 00:14:09,640 Speaker 1: Yeah, I wanted to ask you about it, actually, because 299 00:14:09,640 --> 00:14:12,080 Speaker 1: a lot of this stuff is biological. I mean, the 300 00:14:12,080 --> 00:14:15,959 Speaker 1: core technological innovations that exist in this world are 301 00:14:16,000 --> 00:14:20,080 Speaker 1: the ability to preserve a species, to in principle de-extinctify it, 302 00:14:20,280 --> 00:14:22,440 Speaker 1: and they do, for example, genome sequencing. Of course you've 303 00:14:22,400 --> 00:14:24,440 Speaker 1: got to store the DNA, but they also do deep 304 00:14:24,480 --> 00:14:27,480 Speaker 1: scans of the animals and they watch their behaviors, et cetera, 305 00:14:27,520 --> 00:14:29,440 Speaker 1: et cetera. And as you said, this is something people 306 00:14:29,480 --> 00:14:32,080 Speaker 1: are actually working on now. So it made me wonder, like, 307 00:14:32,320 --> 00:14:35,080 Speaker 1: is it possible today or in the near future to 308 00:14:35,320 --> 00:14:38,000 Speaker 1: actually do this, to bring a species back from extinction? 309 00:14:38,760 --> 00:14:41,800 Speaker 1: Or what would you need in order to make that possible? 310 00:14:42,200 --> 00:14:44,640 Speaker 2: So there are people who are way smarter than me 311 00:14:44,640 --> 00:14:47,680 Speaker 2: who would say that the answer is definitely yes, we 312 00:14:47,760 --> 00:14:51,400 Speaker 2: can bring species back from extinction.
But to be honest, 313 00:14:51,720 --> 00:14:54,080 Speaker 2: I'm skeptical that we're going to bring back the exact 314 00:14:54,080 --> 00:14:56,960 Speaker 2: same thing, and maybe that doesn't matter. So, for example, 315 00:14:57,000 --> 00:14:59,640 Speaker 2: they're working on, as I mentioned, bringing back the woolly mammoth, 316 00:14:59,720 --> 00:15:01,840 Speaker 2: and different groups are doing this in different ways. So, 317 00:15:01,880 --> 00:15:04,680 Speaker 2: for example, one group has, I think, the 318 00:15:04,760 --> 00:15:08,680 Speaker 2: genome from some elephant species, and then they're taking what 319 00:15:08,720 --> 00:15:11,920 Speaker 2: they've been able to get from mammoths that have been 320 00:15:11,960 --> 00:15:14,240 Speaker 2: like frozen in permafrost, what they've been able to extract 321 00:15:14,280 --> 00:15:17,440 Speaker 2: out of their genome, and they're tinkering with an elephant genome 322 00:15:17,480 --> 00:15:19,400 Speaker 2: to try to make it look like a mammoth genome. 323 00:15:19,760 --> 00:15:22,160 Speaker 2: But then there are all sorts of, like, you know, maternal 324 00:15:22,160 --> 00:15:24,320 Speaker 2: effects that are missing, so that mammoth would have to 325 00:15:24,400 --> 00:15:27,080 Speaker 2: be gestated, given the science that we have now, 326 00:15:27,200 --> 00:15:29,800 Speaker 2: in the body of an elephant. And how 327 00:15:29,840 --> 00:15:33,760 Speaker 2: does that, like, hormonal environment, you know, differ? And then 328 00:15:33,800 --> 00:15:36,520 Speaker 2: I think, you know, some elephants and maybe some mammoths 329 00:15:36,560 --> 00:15:39,760 Speaker 2: like eat feces after they're born to get the right microbiome. 330 00:15:39,840 --> 00:15:42,920 Speaker 2: And so now you're not getting like a mammoth microbiome. 331 00:15:42,960 --> 00:15:45,640 Speaker 2: You're getting like an elephant microbiome. And is that enough? 332 00:15:45,760 --> 00:15:47,200 Speaker 1: Did you say eat feces? 333 00:15:47,520 --> 00:15:49,680 Speaker 2: Sure did. Yeah, this is biology, man. We get to 334 00:15:49,720 --> 00:15:51,320 Speaker 2: feces in about five minutes. 335 00:15:51,680 --> 00:15:53,440 Speaker 1: You're saying it's not a mammoth if it hasn't 336 00:15:53,440 --> 00:15:55,080 Speaker 1: eaten mammoth poop when it was a baby. 337 00:15:55,160 --> 00:15:57,320 Speaker 2: You know, I wouldn't say that that's the line, yes 338 00:15:57,440 --> 00:15:59,840 Speaker 2: or no, but I would say that, like, you know, 339 00:16:00,000 --> 00:16:03,360 Speaker 2: to some extent these differences build up. And so, yes, 340 00:16:03,440 --> 00:16:06,880 Speaker 2: you have a mammoth, but the mammoth is now placed 341 00:16:06,920 --> 00:16:09,840 Speaker 2: into an ecosystem that varies dramatically from what it was 342 00:16:09,880 --> 00:16:13,280 Speaker 2: in before. Its social interactions might be different, because will 343 00:16:13,280 --> 00:16:15,520 Speaker 2: they act the same in a different environment? And is 344 00:16:15,520 --> 00:16:18,320 Speaker 2: there something about their development that's missing that's going to 345 00:16:18,400 --> 00:16:21,160 Speaker 2: change the way they interact with each other? And so, like, yes, 346 00:16:21,200 --> 00:16:24,280 Speaker 2: you have a mammoth, but you don't have the mammoth 347 00:16:24,320 --> 00:16:27,360 Speaker 2: you used to have.
And the extent to which that matters, 348 00:16:27,760 --> 00:16:29,240 Speaker 2: I don't know. Maybe it's enough to just have a 349 00:16:29,280 --> 00:16:32,000 Speaker 2: mammoth back in the environment and that does some good. 350 00:16:32,400 --> 00:16:34,400 Speaker 2: But I think these things are complicated. And as far 351 00:16:34,400 --> 00:16:37,920 Speaker 2: as MRI scans and, like, connectomes, so that you can 352 00:16:37,920 --> 00:16:41,120 Speaker 2: bring back a human who is exactly the same as 353 00:16:41,200 --> 00:16:44,080 Speaker 2: the person you love who just passed away, I think 354 00:16:44,120 --> 00:16:46,440 Speaker 2: we are way more than decades away from that. But 355 00:16:46,560 --> 00:16:48,160 Speaker 2: I'm sure there are a lot of people who disagree 356 00:16:48,200 --> 00:16:50,200 Speaker 2: with me. But it seems like when we first got 357 00:16:50,200 --> 00:16:52,040 Speaker 2: the human genome, we were like, there's so much we're 358 00:16:52,040 --> 00:16:53,480 Speaker 2: gonna be able to do with it. And then we 359 00:16:53,480 --> 00:16:55,360 Speaker 2: were like, oh, well, not really, because actually it's much 360 00:16:55,400 --> 00:16:57,520 Speaker 2: more complicated. And that always seems to be the answer. 361 00:16:57,640 --> 00:17:00,000 Speaker 1: A friend of mine just completed the connectome of 362 00:17:00,160 --> 00:17:03,440 Speaker 1: the fruit fly, which has a tiny number of neurons compared 363 00:17:03,480 --> 00:17:06,399 Speaker 1: to humans, and it took forever, and it seems 364 00:17:06,440 --> 00:17:08,960 Speaker 1: like we're never going to get to the connectome of 365 00:17:09,000 --> 00:17:11,040 Speaker 1: the human brain. But you raise a lot of really 366 00:17:11,080 --> 00:17:13,720 Speaker 1: interesting questions that I think touch on the deeper issue 367 00:17:13,760 --> 00:17:15,800 Speaker 1: of like what does it really mean for a species 368 00:17:15,840 --> 00:17:19,080 Speaker 1: to be extinct. It's not really just about the individuals. 369 00:17:19,080 --> 00:17:22,760 Speaker 1: It's about their entire environment and can they survive and propagate? 370 00:17:23,320 --> 00:17:26,960 Speaker 1: And that requires much more than just the actual bodies, right? 371 00:17:27,000 --> 00:17:29,919 Speaker 1: It requires the parents and the poop and everything around 372 00:17:29,960 --> 00:17:32,120 Speaker 1: them and all this kind of stuff. And I think 373 00:17:32,119 --> 00:17:35,120 Speaker 1: that's one reason why all of the efforts so far 374 00:17:35,359 --> 00:17:38,600 Speaker 1: in the real world to de-extinctify have focused on 375 00:17:38,680 --> 00:17:42,159 Speaker 1: things that are near relatives to existing species. Like, you 376 00:17:42,200 --> 00:17:44,800 Speaker 1: could have a mammoth baby maybe born to an elephant, 377 00:17:44,800 --> 00:17:47,160 Speaker 1: and that's giving you something that's close to a mammoth. 378 00:17:47,800 --> 00:17:50,560 Speaker 1: Or you can have like an extinct rat be born 379 00:17:50,640 --> 00:17:53,040 Speaker 1: from existing rats, these kinds of things.
I don't think 380 00:17:53,040 --> 00:17:55,840 Speaker 1: it would be possible, for example, to de-extinctify a species 381 00:17:56,000 --> 00:17:59,360 Speaker 1: that was very distant from anything that is currently alive, 382 00:18:00,119 --> 00:18:02,440 Speaker 1: like a dinosaur, although maybe, I guess, you could grow 383 00:18:02,480 --> 00:18:03,560 Speaker 1: one in a big alligator. 384 00:18:03,600 --> 00:18:05,119 Speaker 2: I don't know, yeah. Like, I don't know how you 385 00:18:05,200 --> 00:18:08,720 Speaker 2: de-extinctify a trilobite or something, for example. And maybe 386 00:18:08,760 --> 00:18:10,960 Speaker 2: the question is, like, you know, if you could 387 00:18:11,080 --> 00:18:13,080 Speaker 2: de-extinct it, but it could only live in a zoo 388 00:18:13,200 --> 00:18:15,399 Speaker 2: because you've like destroyed all of its habitat and the 389 00:18:15,520 --> 00:18:18,120 Speaker 2: things that it needed don't exist anymore, 390 00:18:18,960 --> 00:18:21,040 Speaker 2: what kind of a life is that? And I'm sure 391 00:18:21,080 --> 00:18:24,159 Speaker 2: people would dramatically differ in their answer to that question. 392 00:18:24,480 --> 00:18:25,200 Speaker 2: And so there you go. 393 00:18:25,280 --> 00:18:27,359 Speaker 1: Well, one of the fascinating things about the book Venomous 394 00:18:27,440 --> 00:18:30,600 Speaker 1: Lumpsucker is that he talks about the influence of this technology 395 00:18:30,680 --> 00:18:34,320 Speaker 1: on decision making, and if it's possible to bring species 396 00:18:34,359 --> 00:18:36,879 Speaker 1: back from the dead, then doesn't that make it less bad 397 00:18:36,920 --> 00:18:38,840 Speaker 1: to make them extinct? It sort of makes the 398 00:18:38,920 --> 00:18:42,160 Speaker 1: question like fuzzier now, because what does extinct really mean? 399 00:18:42,359 --> 00:18:43,880 Speaker 1: You know, it's sort of like saying, oh, I can 400 00:18:44,000 --> 00:18:46,160 Speaker 1: upload you to the cloud, so what does it matter 401 00:18:46,200 --> 00:18:48,080 Speaker 1: if I murder you? Like, well, I still don't really 402 00:18:48,080 --> 00:18:50,240 Speaker 1: want to get murdered, even if I'm backed up. 403 00:18:50,960 --> 00:18:52,880 Speaker 2: Well, I mean, I guess there's also like nobody wants 404 00:18:52,920 --> 00:18:55,960 Speaker 2: the physical pain of being murdered, and many layers of 405 00:18:56,000 --> 00:18:58,639 Speaker 2: complication in all of these questions. 406 00:18:58,640 --> 00:19:02,320 Speaker 1: Many reasons to not be murdered, by Kelly Weinersmith. 407 00:19:03,640 --> 00:19:06,960 Speaker 2: Speaking of commercialization, let's take a quick break for a word 408 00:19:07,000 --> 00:19:21,560 Speaker 2: from our sponsors. And we're back. 409 00:19:21,600 --> 00:19:21,879 Speaker 3: All right. 410 00:19:21,920 --> 00:19:23,960 Speaker 1: Well, we thoroughly enjoyed this book. We thought it was 411 00:19:24,080 --> 00:19:27,439 Speaker 1: very thoughtful, very interesting, very creative, but also very, very funny. 412 00:19:27,440 --> 00:19:29,840 Speaker 1: I laughed out loud many times while reading this book. 413 00:19:29,920 --> 00:19:33,440 Speaker 2: Okay, so, without further ado, let's bring Ned Beauman onto 414 00:19:33,440 --> 00:19:33,840 Speaker 2: the show. 415 00:19:34,160 --> 00:19:36,480 Speaker 1: Well then, it's my pleasure to welcome to the program today 416 00:19:36,600 --> 00:19:39,080 Speaker 1: Ned Beauman. Ned, thank you very much for joining us today.
417 00:19:39,240 --> 00:19:40,080 Speaker 3: Thanks for having me. 418 00:19:40,240 --> 00:19:42,280 Speaker 1: So tell us a little bit about yourself. How did 419 00:19:42,280 --> 00:19:44,119 Speaker 1: you get into science fiction writing? 420 00:19:44,359 --> 00:19:48,360 Speaker 3: Well, this is my fifth novel, but it's my first 421 00:19:48,440 --> 00:19:51,080 Speaker 3: real science fiction novel. I think it was inevitable that 422 00:19:51,119 --> 00:19:54,920 Speaker 3: I would write one eventually, because I read pretty much 423 00:19:55,000 --> 00:19:58,920 Speaker 3: nothing but science fiction when I was growing up, and 424 00:19:58,960 --> 00:20:02,720 Speaker 3: then kind of moved over into more mainstream literary fiction, 425 00:20:02,800 --> 00:20:05,920 Speaker 3: but continued to read science fiction. And to be honest, 426 00:20:05,920 --> 00:20:07,800 Speaker 3: I always felt, like, you know, it was a genre 427 00:20:07,920 --> 00:20:11,440 Speaker 3: that I appreciated but wasn't necessarily up to it myself, because 428 00:20:11,480 --> 00:20:15,240 Speaker 3: I think it requires quite a specific set of skills. 429 00:20:15,640 --> 00:20:19,080 Speaker 3: But eventually, you know, I tried my hand at various 430 00:20:19,119 --> 00:20:21,960 Speaker 3: other things, and I had done it a bit. I'd published a 431 00:20:22,040 --> 00:20:24,400 Speaker 3: couple of science fiction short stories, and then with this one, 432 00:20:24,600 --> 00:20:27,119 Speaker 3: I thought, okay, I'll give it a shot, put everything 433 00:20:27,119 --> 00:20:29,439 Speaker 3: into it. Yeah, and that was how I ended up 434 00:20:29,480 --> 00:20:31,320 Speaker 3: with Venomous Lumpsucker. 435 00:20:31,520 --> 00:20:33,520 Speaker 2: So you noted that you need to have a certain 436 00:20:33,560 --> 00:20:36,520 Speaker 2: set of skills to write science fiction. How did you 437 00:20:36,800 --> 00:20:38,960 Speaker 2: get those skills yourself? So, you know, 438 00:20:39,000 --> 00:20:41,680 Speaker 2: the biology in this book is fantastic, 439 00:20:41,800 --> 00:20:45,719 Speaker 2: speaking as a biologist. Did you like pull out biology textbooks? 440 00:20:45,760 --> 00:20:48,040 Speaker 2: What was the process of trying to blend science fiction 441 00:20:48,160 --> 00:20:50,520 Speaker 2: with all of the appropriate science fact? 442 00:20:50,480 --> 00:20:54,800 Speaker 3: Well, all of my books have been quite research-heavy, 443 00:20:55,400 --> 00:20:58,360 Speaker 3: you know. For instance, my second book had a lot 444 00:20:58,400 --> 00:21:03,480 Speaker 3: about the avant-garde theater in the Weimar era. You know, 445 00:21:03,080 --> 00:21:08,000 Speaker 3: I don't think researching science is inherently any harder than 446 00:21:08,040 --> 00:21:10,240 Speaker 3: researching that kind of thing, at least until you get 447 00:21:10,240 --> 00:21:13,960 Speaker 3: into the really confusing stuff. When I say a specific 448 00:21:14,040 --> 00:21:18,359 Speaker 3: set of skills, I mean more trying to kind of 449 00:21:18,440 --> 00:21:27,240 Speaker 3: paint a plausible and internally consistent future world without leaving 450 00:21:28,240 --> 00:21:33,600 Speaker 3: huge gaps and blind spots.
And I have always so 451 00:21:33,680 --> 00:21:36,480 Speaker 3: admired the science fiction novelists who were good at that. 452 00:21:36,840 --> 00:21:39,880 Speaker 3: And with this book, I was very conscious it's set 453 00:21:40,240 --> 00:21:43,359 Speaker 3: fifteen to twenty years in the future, and, A, I 454 00:21:43,359 --> 00:21:46,560 Speaker 3: don't specify the exact date, which makes it easier, and 455 00:21:46,640 --> 00:21:49,560 Speaker 3: also, B, I think fifteen to twenty years is kind 456 00:21:49,600 --> 00:21:53,080 Speaker 3: of the easiest place to put it, because it's not 457 00:21:53,400 --> 00:21:58,280 Speaker 3: so soon that you can get refuted in all your 458 00:21:58,320 --> 00:22:02,320 Speaker 3: predictions really quickly, but it's also not so far away 459 00:22:02,320 --> 00:22:07,000 Speaker 3: that you really have to make some big calls about, 460 00:22:07,040 --> 00:22:10,120 Speaker 3: like, what's going to change and where things are going 461 00:22:10,160 --> 00:22:13,520 Speaker 3: to go. So I think I was doing it sort 462 00:22:13,520 --> 00:22:16,800 Speaker 3: of on easy mode in that respect, but that was 463 00:22:16,840 --> 00:22:20,520 Speaker 3: the challenge. Whereas the science stuff, yeah, you know, at 464 00:22:20,520 --> 00:22:22,800 Speaker 3: this point, I'm just kind of used to being a 465 00:22:22,840 --> 00:22:27,919 Speaker 3: dilettante, and with each book I stroll through some new 466 00:22:28,240 --> 00:22:31,160 Speaker 3: area of research, and I didn't really find it any 467 00:22:31,160 --> 00:22:33,919 Speaker 3: harder than any of the stuff I've researched in the past. 468 00:22:34,160 --> 00:22:36,560 Speaker 1: Well, before we dive into the details of your book, 469 00:22:36,640 --> 00:22:39,320 Speaker 1: we like to ask the same questions to every author 470 00:22:39,400 --> 00:22:42,120 Speaker 1: to sort of put them on the spectrum of science fiction. 471 00:22:42,359 --> 00:22:45,600 Speaker 1: So here are some generic science fiction questions, not specifically about 472 00:22:45,600 --> 00:22:48,480 Speaker 1: your book. So the first one is: do you think 473 00:22:48,520 --> 00:22:52,840 Speaker 1: that Star Trek-style transporters kill you and clone you, 474 00:22:53,320 --> 00:22:56,320 Speaker 1: or do you think they actually transport your atoms somewhere else? 475 00:22:56,800 --> 00:23:03,080 Speaker 3: Well, I studied philosophy as an undergraduate and then later read this 476 00:23:03,119 --> 00:23:07,000 Speaker 3: book called Reasons and Persons by Derek Parfit, which one 477 00:23:07,760 --> 00:23:12,880 Speaker 3: reviewer actually noticed as an influence on this book, by coincidence, 478 00:23:12,920 --> 00:23:16,120 Speaker 3: which I hadn't really consciously thought about, but in hindsight, yeah, 479 00:23:16,160 --> 00:23:19,440 Speaker 3: I think a lot of those ideas had implanted themselves 480 00:23:20,040 --> 00:23:24,800 Speaker 3: in my head. And I think Parfit's answer would 481 00:23:24,880 --> 00:23:31,760 Speaker 3: be that you need to start thinking about personhood in 482 00:23:31,760 --> 00:23:36,399 Speaker 3: a way which doesn't have such strict boundaries. You have 483 00:23:36,440 --> 00:23:38,439 Speaker 3: to think about a person as being a kind of 484 00:23:39,680 --> 00:23:45,760 Speaker 3: soft entity which doesn't begin or end in a specific place.
485 00:23:46,480 --> 00:23:49,160 Speaker 3: And if you look at things that way, then it's 486 00:23:49,280 --> 00:23:53,720 Speaker 3: legitimate to say, is the person who beams down to 487 00:23:53,800 --> 00:23:59,400 Speaker 3: the planet me? Sort of. That person is semi-continuous 488 00:24:00,119 --> 00:24:03,840 Speaker 3: with me, not continuous to the extent that we normally 489 00:24:03,840 --> 00:24:07,040 Speaker 3: think of ourselves as being continuous. And yeah, I think 490 00:24:07,080 --> 00:24:09,159 Speaker 3: Parfit would say that's okay; there doesn't have to be 491 00:24:09,200 --> 00:24:11,280 Speaker 3: a strict binary answer to that. So I think that's 492 00:24:11,320 --> 00:24:13,800 Speaker 3: probably what I would go with, because I really respect 493 00:24:14,440 --> 00:24:15,840 Speaker 3: Parfit's conception of the world. 494 00:24:16,080 --> 00:24:18,280 Speaker 1: So it is you as long as you redefine you 495 00:24:18,400 --> 00:24:20,440 Speaker 1: to be whatever ends up on the other side 496 00:24:20,480 --> 00:24:21,440 Speaker 1: of the transporter. 497 00:24:21,680 --> 00:24:23,520 Speaker 3: Yeah, I think it's fair enough to say it's sort 498 00:24:23,560 --> 00:24:25,520 Speaker 3: of you, in many ways, pretty much you. 499 00:24:25,800 --> 00:24:30,240 Speaker 2: That's a great answer. So another tech question: what tech 500 00:24:30,320 --> 00:24:33,720 Speaker 2: in science fiction would you most like to see become 501 00:24:33,720 --> 00:24:34,320 Speaker 2: a reality? 502 00:24:34,440 --> 00:24:36,199 Speaker 3: I think the science fiction story that has had the 503 00:24:36,200 --> 00:24:39,720 Speaker 3: most influence on me in terms of my sort of 504 00:24:39,800 --> 00:24:43,480 Speaker 3: personality and outlook is this story called Reasons to Be 505 00:24:43,600 --> 00:24:48,240 Speaker 3: Cheerful by Greg Egan, and that is a story about 506 00:24:48,280 --> 00:24:56,199 Speaker 3: a guy who gets a brain tumor which affects his 507 00:24:57,040 --> 00:25:00,000 Speaker 3: ability to take pleasure in things, which kind of flattens 508 00:25:00,119 --> 00:25:03,600 Speaker 3: his ability to take pleasure in things. And well, I 509 00:25:03,640 --> 00:25:05,479 Speaker 3: have no choice but to spoil the ending, but I don't 510 00:25:05,480 --> 00:25:08,960 Speaker 3: think it really spoils it. They eventually develop a device 511 00:25:09,600 --> 00:25:14,400 Speaker 3: which allows him to adjust how much pleasure he takes 512 00:25:14,480 --> 00:25:18,680 Speaker 3: in different things, so he's able to ask, not, do 513 00:25:18,760 --> 00:25:21,919 Speaker 3: I like this, or do I find that beautiful, but, 514 00:25:22,040 --> 00:25:24,359 Speaker 3: do I want to like this? Do I want to 515 00:25:24,400 --> 00:25:27,720 Speaker 3: find this beautiful? What is it that it would be 516 00:25:27,840 --> 00:25:34,679 Speaker 3: most convenient or positive for me to take pleasure in? 517 00:25:34,920 --> 00:25:37,920 Speaker 3: And he's able to adjust it on that basis. I've 518 00:25:37,960 --> 00:25:42,000 Speaker 3: always felt that would be so good, that would make 519 00:25:42,040 --> 00:25:44,359 Speaker 3: it so much easier for us to adjust to the world. 520 00:25:44,760 --> 00:25:47,480 Speaker 2: What would you change about your response to the world 521 00:25:47,560 --> 00:25:48,880 Speaker 2: first if you had this device?
522 00:25:49,080 --> 00:25:52,200 Speaker 3: Well, again, this kind of comes up in the novel, 523 00:25:52,760 --> 00:25:56,680 Speaker 3: and I'm sure again the novel was kind of influenced 524 00:25:56,720 --> 00:26:00,679 Speaker 3: in an unconscious way by this story. But basically, one of 525 00:26:00,680 --> 00:26:03,359 Speaker 3: the two main characters of Venomous Lumpsucker is this guy 526 00:26:04,280 --> 00:26:07,639 Speaker 3: who's a real foodie, but because of the effects of 527 00:26:07,680 --> 00:26:12,320 Speaker 3: climate change in fifteen or twenty years, most foods don't 528 00:26:12,359 --> 00:26:16,880 Speaker 3: really taste of anything anymore. So he has to take 529 00:26:16,920 --> 00:26:19,840 Speaker 3: this pill, which means he doesn't care whether a meal 530 00:26:19,960 --> 00:26:24,760 Speaker 3: is good or not, which is sort of the more 531 00:26:24,880 --> 00:26:30,360 Speaker 3: destructive version of what I'm talking about. The better alternative 532 00:26:30,480 --> 00:26:33,800 Speaker 3: would be, you know, for him to go, well, what 533 00:26:34,040 --> 00:26:37,359 Speaker 3: is still available to me? I'm going to decide that 534 00:26:37,440 --> 00:26:40,600 Speaker 3: I will love that, and then I'll be perfectly adjusted 535 00:26:40,640 --> 00:26:44,000 Speaker 3: to the world that I actually have, as opposed to 536 00:26:44,440 --> 00:26:46,359 Speaker 3: the world that I would like to be in. I 537 00:26:46,359 --> 00:26:48,000 Speaker 3: mean, there are probably a lot of more profound ways 538 00:26:48,040 --> 00:26:49,760 Speaker 3: you could use something like that. But that's probably what 539 00:26:49,840 --> 00:26:52,280 Speaker 3: I would do, at least to start off with, because, 540 00:26:52,320 --> 00:26:55,080 Speaker 3: you know, I am quite a snob about like food 541 00:26:55,320 --> 00:27:00,159 Speaker 3: and fabrics and all that kind of thing. You know, 542 00:27:00,440 --> 00:27:04,280 Speaker 3: imagine if you could just take just as much pleasure 543 00:27:04,320 --> 00:27:07,720 Speaker 3: in cheap polyester as in cashmere, or you could take 544 00:27:07,840 --> 00:27:11,760 Speaker 3: just as much pleasure in a protein bar as in, 545 00:27:12,640 --> 00:27:16,200 Speaker 3: you know, a delicious meal. Oh, I mean, that's another one. 546 00:27:16,280 --> 00:27:18,480 Speaker 3: Like, I'm trying to be vegan, not very successfully. I 547 00:27:18,520 --> 00:27:20,639 Speaker 3: would love, you know, I would just adjust it so that 548 00:27:20,680 --> 00:27:24,040 Speaker 3: I didn't even want meat anymore and enjoyed chickpeas way 549 00:27:24,119 --> 00:27:27,040 Speaker 3: more than I ever used to enjoy prosciutto, if I could. 550 00:27:27,400 --> 00:27:28,280 Speaker 2: That's going to be tough. 551 00:27:28,760 --> 00:27:32,399 Speaker 1: Chickpeas are delicious. I'm definitely pro chickpea on this question. 552 00:27:32,560 --> 00:27:34,640 Speaker 3: I'm pro chickpea. But as soon as you start 553 00:27:34,640 --> 00:27:37,600 Speaker 3: eating vegan, you find yourself eating chickpeas like seven times 554 00:27:37,600 --> 00:27:39,200 Speaker 3: a day, and it's too much. 555 00:27:40,119 --> 00:27:42,199 Speaker 1: No, there are so many kinds of beans out there. You 556 00:27:42,200 --> 00:27:45,480 Speaker 1: should get into indigenous kinds of beans.
We're members of 557 00:27:45,480 --> 00:27:47,600 Speaker 1: the Rancho Gordo Bean Club, and we get this shipment 558 00:27:47,680 --> 00:27:50,680 Speaker 1: of heirloom beans every month. It's wonderful. Anyway, a big 559 00:27:50,680 --> 00:27:53,159 Speaker 1: fan of Greg Egan over here. Love his stories, so 560 00:27:53,280 --> 00:27:56,080 Speaker 1: thoughtful and creative. Last generic question before we dive into 561 00:27:56,119 --> 00:27:59,639 Speaker 1: the book is: what's your personal answer to the Fermi paradox? 562 00:28:00,119 --> 00:28:03,160 Speaker 1: Why haven't aliens visited us? Or have they? 563 00:28:03,640 --> 00:28:08,600 Speaker 3: Oh, yeah, I don't. I don't have a great answer 564 00:28:08,640 --> 00:28:12,800 Speaker 3: to that. I mean, I don't see a strong reason to 565 00:28:12,960 --> 00:28:20,960 Speaker 3: believe that they have. I'm not particularly convinced by any 566 00:28:21,000 --> 00:28:26,760 Speaker 3: of those hypotheses about how they know we're here and 567 00:28:26,840 --> 00:28:32,359 Speaker 3: they're watching and they've chosen not to visit us or interfere. 568 00:28:32,960 --> 00:28:37,399 Speaker 3: The answer to it that kind of gripped me with 569 00:28:37,560 --> 00:28:43,160 Speaker 3: the most, like, cold, implacable grip as soon as I 570 00:28:43,240 --> 00:28:47,920 Speaker 3: heard it is just the idea that all advanced civilizations 571 00:28:48,000 --> 00:28:51,560 Speaker 3: have actually destroyed themselves one way or another, you know, 572 00:28:51,600 --> 00:28:54,200 Speaker 3: before they leave their solar systems, if not before they 573 00:28:54,320 --> 00:28:59,920 Speaker 3: leave their planets. So it could be that. But then 574 00:29:00,080 --> 00:29:04,960 Speaker 3: you'd think somebody would have got as far as, you know, 575 00:29:05,160 --> 00:29:10,720 Speaker 3: self-replicating probes, von Neumann machines, or whatever. So I really 576 00:29:10,720 --> 00:29:12,560 Speaker 3: don't know why we haven't had any of those. I 577 00:29:12,600 --> 00:29:13,640 Speaker 3: can't explain that at all. 578 00:29:13,680 --> 00:29:16,720 Speaker 2: Right, so let's start jumping into Venomous Lumpsucker. I love 579 00:29:16,840 --> 00:29:20,120 Speaker 2: this book. So, I'm an ecologist, 580 00:29:20,240 --> 00:29:23,400 Speaker 2: and so topics like climate change and the massive extinction 581 00:29:23,440 --> 00:29:25,760 Speaker 2: event that we're living through right now are 582 00:29:25,800 --> 00:29:28,280 Speaker 2: near and dear to my heart. What fascinates you 583 00:29:28,320 --> 00:29:30,680 Speaker 2: about these themes? Why did you decide that you wanted 584 00:29:30,680 --> 00:29:33,200 Speaker 2: to write a book around the topic of extinction? 585 00:29:33,600 --> 00:29:37,400 Speaker 3: Well, it's a combination of, you know, on the one hand, 586 00:29:37,720 --> 00:29:42,200 Speaker 3: I am very concerned about the climate, and I love animals, 587 00:29:42,360 --> 00:29:46,320 Speaker 3: and a lot of the sentiments in the book about 588 00:29:46,360 --> 00:29:52,200 Speaker 3: how thinking about animals being driven extinct is so painful 589 00:29:52,240 --> 00:29:54,760 Speaker 3: you can't even bear it, like, some of that is 590 00:29:54,800 --> 00:29:57,080 Speaker 3: an exaggeration of how I feel.
But then on the 591 00:29:57,120 --> 00:30:00,760 Speaker 3: other hand, like I said, I studied philosophy, and I'm 592 00:30:00,840 --> 00:30:06,840 Speaker 3: often frustrated by the way we have so many surface 593 00:30:06,920 --> 00:30:10,520 Speaker 3: level debates about things which go round and round in 594 00:30:10,600 --> 00:30:13,360 Speaker 3: circles and never get anywhere, and I always just 595 00:30:13,360 --> 00:30:17,120 Speaker 3: think, this needs some real philosophy applied to it, and 596 00:30:17,720 --> 00:30:20,760 Speaker 3: the question of extinction is really one of those, you know, 597 00:30:20,840 --> 00:30:27,600 Speaker 3: because most people basically seem to agree that it's bad 598 00:30:27,680 --> 00:30:33,080 Speaker 3: if a species goes extinct, but obviously there's no consensus 599 00:30:33,160 --> 00:30:38,600 Speaker 3: on what we are willing to pay or sacrifice to 600 00:30:38,640 --> 00:30:42,080 Speaker 3: prevent that happening. That's not one of those questions you 601 00:30:42,120 --> 00:30:48,960 Speaker 3: can answer just by people sort of vaguely, you know, 602 00:30:49,080 --> 00:30:52,200 Speaker 3: talking past each other about how they feel about it. 603 00:30:52,240 --> 00:30:55,760 Speaker 3: I really think if we're going to talk about how 604 00:30:55,880 --> 00:30:58,400 Speaker 3: much do we really care about preventing extinction, you have 605 00:30:58,520 --> 00:31:03,000 Speaker 3: to look at it harshly and ask, well, why is 606 00:31:03,040 --> 00:31:06,320 Speaker 3: it bad if a species goes extinct? How much do 607 00:31:06,360 --> 00:31:09,600 Speaker 3: we or should we care? Why is the species valuable? 608 00:31:10,040 --> 00:31:13,600 Speaker 3: Why should we prevent it? And you have to look 609 00:31:13,640 --> 00:31:18,440 Speaker 3: at those philosophically instead of just relying on intuition and 610 00:31:18,600 --> 00:31:22,640 Speaker 3: assumptions and so on. So I thought that would be 611 00:31:22,680 --> 00:31:28,240 Speaker 3: an interesting basis for a novel: to start, not offering answers, 612 00:31:28,240 --> 00:31:32,120 Speaker 3: but at least asking some questions that I felt 613 00:31:32,320 --> 00:31:35,880 Speaker 3: needed to be asked, that weren't being asked in a 614 00:31:35,920 --> 00:31:38,560 Speaker 3: more serious philosophical way, about this issue. 615 00:31:38,680 --> 00:31:40,880 Speaker 1: I totally agree, and I know that in your answer 616 00:31:40,960 --> 00:31:43,840 Speaker 1: you gave sort of two questions. One was, what price 617 00:31:43,920 --> 00:31:46,160 Speaker 1: are we willing to pay? And the other was, how 618 00:31:46,240 --> 00:31:48,480 Speaker 1: much should we care? For me, one of the most 619 00:31:48,520 --> 00:31:50,440 Speaker 1: interesting things about the book was that it seemed to 620 00:31:50,520 --> 00:31:54,000 Speaker 1: sort of sound a warning about attempts to legislate and 621 00:31:54,120 --> 00:31:58,160 Speaker 1: financialize decision making. I've often heard economists say things like, 622 00:31:58,600 --> 00:32:00,479 Speaker 1: it's good to put a price on things, even if 623 00:32:00,480 --> 00:32:03,160 Speaker 1: it's the wrong price. Do you think that there's a 624 00:32:03,400 --> 00:32:07,320 Speaker 1: danger in trying to assign a monetary value to moral 625 00:32:07,400 --> 00:32:11,320 Speaker 1: choices, like a human life or the existence of a species?
626 00:32:11,440 --> 00:32:13,160 Speaker 1: Is that the right way for us as a society 627 00:32:13,200 --> 00:32:14,280 Speaker 1: to balance these things? 628 00:32:14,600 --> 00:32:18,480 Speaker 3: I mean, I don't think it's intrinsically immoral to do that. 629 00:32:18,600 --> 00:32:21,320 Speaker 3: You know, if you work in the government, you have 630 00:32:21,360 --> 00:32:24,040 Speaker 3: to operate, at least in this country, on the basis 631 00:32:24,080 --> 00:32:29,280 Speaker 3: of what are called QALYs, quality-adjusted life years, 632 00:32:29,320 --> 00:32:33,640 Speaker 3: and you have to decide, is it worth buying this 633 00:32:33,720 --> 00:32:37,200 Speaker 3: treatment for a rare cancer? And then you have to think, well, 634 00:32:37,920 --> 00:32:43,720 Speaker 3: how many people will live how many extra years? 635 00:32:43,800 --> 00:32:47,440 Speaker 3: And you have to put a number on that stuff. So, 636 00:32:47,840 --> 00:32:50,200 Speaker 3: you know, I find it very frustrating when people are like, 637 00:32:50,680 --> 00:32:53,560 Speaker 3: we can't have bureaucrats putting a price on human life 638 00:32:53,640 --> 00:32:56,080 Speaker 3: or whatever, when I think you have to. That's the 639 00:32:56,080 --> 00:32:58,800 Speaker 3: only way you can make trade-offs under, you know, 640 00:33:00,120 --> 00:33:04,440 Speaker 3: relative scarcity. But on the other hand, when the reason 641 00:33:04,480 --> 00:33:07,400 Speaker 3: you're trying to put a price on something is because 642 00:33:07,440 --> 00:33:13,960 Speaker 3: you're saying, well, a price signal is the only signal 643 00:33:14,000 --> 00:33:17,200 Speaker 3: that the free market really understands, so the reason we're 644 00:33:17,200 --> 00:33:19,840 Speaker 3: putting a price on it is so that we 645 00:33:19,880 --> 00:33:23,440 Speaker 3: can plug it into the free market and then pull 646 00:33:23,520 --> 00:33:27,840 Speaker 3: a few levers and then allow the free market to 647 00:33:27,920 --> 00:33:32,280 Speaker 3: work its magic and solve this problem for us. Again, 648 00:33:32,320 --> 00:33:34,560 Speaker 3: I don't think that's inherently immoral. It's just, one 649 00:33:34,560 --> 00:33:36,200 Speaker 3: of the things I'm saying in the book is, it's 650 00:33:36,240 --> 00:33:39,440 Speaker 3: not going to work, because the thing that the free 651 00:33:39,440 --> 00:33:45,640 Speaker 3: market is good at is routing around any impediments to profit, 652 00:33:46,520 --> 00:33:50,280 Speaker 3: and the free market, the reason it works, is it's, 653 00:33:51,720 --> 00:33:56,120 Speaker 3: you know, a collaboration of millions of very intelligent people 654 00:33:57,240 --> 00:34:02,760 Speaker 3: all working together to solve this problem, where the problem 655 00:34:03,200 --> 00:34:06,040 Speaker 3: is, someone is stopping us from making enough money. And 656 00:34:06,320 --> 00:34:10,719 Speaker 3: if, opposed to them, you only have a handful of 657 00:34:10,960 --> 00:34:15,239 Speaker 3: kind of well-meaning people in government, then the free 658 00:34:15,280 --> 00:34:19,640 Speaker 3: market is always going to outsmart the people in government. 659 00:34:19,880 --> 00:34:23,080 Speaker 3: So that's why it's a danger.
So that's why I 660 00:34:23,120 --> 00:34:28,520 Speaker 3: think it's dangerous to put a price on it, because 661 00:34:29,320 --> 00:34:31,560 Speaker 3: that price is meant to be a kind of, 662 00:34:31,800 --> 00:34:36,440 Speaker 3: you know, essentially a translation of it into free-market language, and you 663 00:34:36,480 --> 00:34:38,719 Speaker 3: don't necessarily want it in that language, because once you 664 00:34:38,760 --> 00:34:40,240 Speaker 3: give it to them, you never get it back. 665 00:34:40,600 --> 00:34:42,920 Speaker 2: And what is the role of the individual in how 666 00:34:42,960 --> 00:34:45,480 Speaker 2: these things all play out? You know, like, I recently 667 00:34:45,520 --> 00:34:47,719 Speaker 2: purchased something, a new purse the other day, made 668 00:34:47,760 --> 00:34:50,200 Speaker 2: out of billboards, and I felt so great because I'm 669 00:34:50,239 --> 00:34:53,480 Speaker 2: reusing something. But, like, maybe I didn't need that new 670 00:34:53,520 --> 00:34:57,160 Speaker 2: purse. So with these, you know, credits, 671 00:34:57,200 --> 00:34:59,600 Speaker 2: and this, you know, telling people that your company is 672 00:34:59,640 --> 00:35:01,640 Speaker 2: greener than another, like, to what extent is it still 673 00:35:01,680 --> 00:35:04,640 Speaker 2: the individual's responsibility when we have all these ways of 674 00:35:04,640 --> 00:35:07,640 Speaker 2: making ourselves feel better that may not actually be doing anything? 675 00:35:08,480 --> 00:35:10,520 Speaker 3: Yeah, I don't know. I mean, I really see both 676 00:35:10,560 --> 00:35:13,759 Speaker 3: sides of this, because on the one hand, you 677 00:35:13,840 --> 00:35:18,440 Speaker 3: often hear people saying the emphasis on individual responsibility for 678 00:35:18,520 --> 00:35:23,880 Speaker 3: climate change is just a way of distracting from the 679 00:35:23,920 --> 00:35:28,360 Speaker 3: fact that we need enormous structural changes at the level 680 00:35:28,400 --> 00:35:32,000 Speaker 3: of governments and megacorporations to make any real difference. 681 00:35:32,120 --> 00:35:35,600 Speaker 3: And, you know, I think it is literally the case 682 00:35:35,880 --> 00:35:41,400 Speaker 3: that, you know, polluters, via their think tanks and lobbyists 683 00:35:41,440 --> 00:35:44,560 Speaker 3: and astroturf operations, have tried to move the climate change 684 00:35:44,920 --> 00:35:49,920 Speaker 3: conversation towards people recycling their bottles or whatever, because it 685 00:35:50,600 --> 00:35:52,560 Speaker 3: kind of changes the terms of it, which makes it 686 00:35:52,600 --> 00:35:59,400 Speaker 3: easier for them to avoid these demands. But on the 687 00:35:59,440 --> 00:36:03,080 Speaker 3: other hand, I'm always very conscious that my carbon footprint, 688 00:36:03,280 --> 00:36:06,719 Speaker 3: as, like, an affluent Northern European, is many times that 689 00:36:06,880 --> 00:36:11,800 Speaker 3: of the, you know, median global person, and that also 690 00:36:11,920 --> 00:36:17,239 Speaker 3: does put me in a difficult moral position. But then 691 00:36:17,280 --> 00:36:19,359 Speaker 3: also I feel relatively smug about that.
The whole thing 692 00:36:19,520 --> 00:36:23,759 Speaker 3: is, like: I don't drive, I don't have children, I've 693 00:36:23,800 --> 00:36:26,920 Speaker 3: basically given up flying, and, like I said, I'm trying 694 00:36:26,960 --> 00:36:30,320 Speaker 3: to be vegan, and I live in five hundred square feet, 695 00:36:31,239 --> 00:36:33,640 Speaker 3: so, like, it's pretty easy for me to look down 696 00:36:33,680 --> 00:36:37,400 Speaker 3: on other people. I also think looking down at other 697 00:36:37,440 --> 00:36:40,759 Speaker 3: people for climate reasons is bad and not helpful, but 698 00:36:42,560 --> 00:36:44,880 Speaker 3: it does make it easy for me to say that 699 00:36:44,920 --> 00:36:48,560 Speaker 3: individual responsibility is important, because if you look at my 700 00:36:48,640 --> 00:36:52,040 Speaker 3: individual responsibilities, I come out looking pretty good, I think, 701 00:36:52,080 --> 00:36:55,319 Speaker 3: although I do buy quite a lot of clothes. Of course, 702 00:36:55,360 --> 00:36:57,120 Speaker 3: the answer is we have to do both, like, we 703 00:36:57,560 --> 00:37:01,839 Speaker 3: have to have governments making huge changes, and then also, realistically, 704 00:37:01,880 --> 00:37:03,799 Speaker 3: in the future, all of us individually are going to 705 00:37:03,800 --> 00:37:07,560 Speaker 3: have to make changes in our lives as well, because 706 00:37:08,160 --> 00:37:12,319 Speaker 3: if all six or seven billion people on Earth live 707 00:37:12,440 --> 00:37:17,040 Speaker 3: like affluent Northern Europeans, that won't work. But we also 708 00:37:17,160 --> 00:37:21,640 Speaker 3: can't ask the majority of the global population to maintain 709 00:37:21,719 --> 00:37:25,360 Speaker 3: a lower standard of living than we have, because there's no 710 00:37:25,400 --> 00:37:27,960 Speaker 3: reason for that. So we are going to have to 711 00:37:28,000 --> 00:37:30,960 Speaker 3: smooth things out in some way. So I don't know, 712 00:37:31,040 --> 00:37:32,719 Speaker 3: but yeah, I think, you know, we have to do both. 713 00:37:32,719 --> 00:37:35,600 Speaker 1: Of course. I think it's really fascinating, though, the moral implications 714 00:37:35,600 --> 00:37:38,360 Speaker 1: of turning things into costs. If I'm willing to 715 00:37:38,400 --> 00:37:42,080 Speaker 1: pay more for a banana that's very environmentally expensive, does 716 00:37:42,120 --> 00:37:44,760 Speaker 1: that, like, make it okay that I'm eating this banana 717 00:37:44,760 --> 00:37:46,719 Speaker 1: because I've paid for it? Or, like, in the world 718 00:37:46,840 --> 00:37:50,000 Speaker 1: you've constructed, if I want a specific view from my condo, 719 00:37:50,080 --> 00:37:52,160 Speaker 1: and I know that building a condo there meant some 720 00:37:52,200 --> 00:37:54,520 Speaker 1: caterpillar had to go extinct, but hey, I'm willing to 721 00:37:54,520 --> 00:37:57,560 Speaker 1: pay another ten k for that condo, does that, like, 722 00:37:57,640 --> 00:38:00,760 Speaker 1: absolve me of responsibility? Or am I just, like, ceding 723 00:38:00,880 --> 00:38:04,120 Speaker 1: the responsibility for this choice to the algorithm of free-market 724 00:38:04,120 --> 00:38:05,080 Speaker 1: capitalism?
725 00:38:05,600 --> 00:38:12,719 Speaker 3: So there is this attitude that offsets are dangerous because 726 00:38:13,520 --> 00:38:19,160 Speaker 3: they simply, you know, shunt the damage to someone else, 727 00:38:19,200 --> 00:38:22,800 Speaker 3: and they relieve the pressure to actually make real changes, 728 00:38:23,800 --> 00:38:28,040 Speaker 3: and we need that pressure. I don't really agree with that, 729 00:38:28,080 --> 00:38:32,640 Speaker 3: you know. Obviously, the premise of offsets is that the 730 00:38:32,680 --> 00:38:39,400 Speaker 3: free market is good at finding the most efficient method 731 00:38:39,760 --> 00:38:43,040 Speaker 3: and time and place to accomplish something. And if the 732 00:38:43,080 --> 00:38:46,400 Speaker 3: thing we want to accomplish is, you know, not emit 733 00:38:46,800 --> 00:38:50,040 Speaker 3: one hundred tons of carbon, then we might as 734 00:38:50,080 --> 00:38:52,439 Speaker 3: well do that in the most efficient time and place 735 00:38:52,480 --> 00:38:55,120 Speaker 3: and by the most efficient method. You know, I don't 736 00:38:55,160 --> 00:39:00,480 Speaker 3: think there's any reason why we can't smooth that out. 737 00:39:01,480 --> 00:39:04,560 Speaker 3: But, you know, as I write about in the book, 738 00:39:04,840 --> 00:39:09,640 Speaker 3: the whole offset idea, since its inception and in every 739 00:39:09,680 --> 00:39:15,960 Speaker 3: implementation of it, has been extremely bedeviled by loopholes 740 00:39:16,040 --> 00:39:21,160 Speaker 3: and corruption and fraud and lies and so on. So 741 00:39:21,200 --> 00:39:24,600 Speaker 3: in practice it hasn't really worked. But in principle I 742 00:39:24,640 --> 00:39:29,960 Speaker 3: don't see anything wrong with it. You know, the 743 00:39:30,040 --> 00:39:33,880 Speaker 3: fact that, is it Coldplay who were like, our tours 744 00:39:33,880 --> 00:39:36,200 Speaker 3: are going to be carbon neutral, and some of the 745 00:39:36,239 --> 00:39:39,680 Speaker 3: way we're going to do that is with offsets? If the 746 00:39:39,719 --> 00:39:43,360 Speaker 3: offsets are real, I think that's good, I think. 747 00:39:43,680 --> 00:39:45,399 Speaker 3: It's good if the offsets are real. 748 00:39:45,520 --> 00:39:49,160 Speaker 3: But the problem is, again, because the free market is 749 00:39:49,200 --> 00:39:52,839 Speaker 3: so nimble and devious, a fake offset is always going 750 00:39:52,880 --> 00:39:55,600 Speaker 3: to be more profitable than a real one, so most 751 00:39:55,640 --> 00:39:57,799 Speaker 3: of the offsets will turn out to be fake. But 752 00:39:58,239 --> 00:40:02,319 Speaker 3: if we could make more of them real, great. But the 753 00:40:02,320 --> 00:40:04,640 Speaker 3: free market is cleverer than us, so I don't think 754 00:40:04,640 --> 00:40:05,360 Speaker 3: that will ever happen. 755 00:40:06,400 --> 00:40:08,840 Speaker 2: Yeah, these things are complicated and it all depends on 756 00:40:08,840 --> 00:40:12,040 Speaker 2: their implementation, which sort of leads to the next question. 757 00:40:12,239 --> 00:40:16,000 Speaker 2: So technology is an important feature of the book, and 758 00:40:16,360 --> 00:40:20,200 Speaker 2: in the book they're working on the technology to maybe 759 00:40:20,400 --> 00:40:23,600 Speaker 2: be able to bring individual people back after they've died, 760 00:40:23,680 --> 00:40:27,400 Speaker 2: and then a whole species back after they've gone extinct.
761 00:40:27,520 --> 00:40:29,160 Speaker 2: And so, you know, this sort of ties in with 762 00:40:29,239 --> 00:40:31,200 Speaker 2: the extinction credits: you don't have to feel quite as 763 00:40:31,239 --> 00:40:34,040 Speaker 2: bad if you think you can bring an animal back eventually. Also, 764 00:40:34,600 --> 00:40:37,720 Speaker 2: so, to you, what is the thing that makes extinction 765 00:40:38,320 --> 00:40:41,319 Speaker 2: so terrible? Like, if we still have it as a 766 00:40:41,360 --> 00:40:43,279 Speaker 2: backup on one of our computers and we can maybe 767 00:40:43,280 --> 00:40:45,600 Speaker 2: bring it back one day, does that make it less 768 00:40:45,680 --> 00:40:48,600 Speaker 2: bad, because maybe it's not completely gone? So what do 769 00:40:48,640 --> 00:40:51,400 Speaker 2: you think about the role of technology in extinction, and 770 00:40:51,440 --> 00:40:52,960 Speaker 2: when is a species really extinct? 771 00:40:53,080 --> 00:40:55,560 Speaker 3: Yeah, as I write about in the book, in principle, 772 00:40:55,600 --> 00:40:58,879 Speaker 3: we could get to a point where we have all 773 00:40:58,880 --> 00:41:01,200 Speaker 3: of these threatened species in biobanks, and then 774 00:41:01,239 --> 00:41:04,760 Speaker 3: in the future we could bring them back. But will 775 00:41:04,840 --> 00:41:07,400 Speaker 3: we ever bring them back? I just don't think we will. 776 00:41:07,480 --> 00:41:11,800 Speaker 3: I can see us bringing back woolly mammoths and stuff, 777 00:41:12,440 --> 00:41:16,399 Speaker 3: but the vast majority of the species going extinct every 778 00:41:16,480 --> 00:41:21,240 Speaker 3: year are kind of very obscure rainforest beetles or whatever, 779 00:41:22,280 --> 00:41:24,160 Speaker 3: and I just don't think we ever will bring those 780 00:41:24,200 --> 00:41:29,440 Speaker 3: back, because who is going to pay for that, and 781 00:41:29,480 --> 00:41:31,920 Speaker 3: who is going to keep them alive once they're brought back, 782 00:41:32,040 --> 00:41:34,920 Speaker 3: and, you know, where is that going to happen, and 783 00:41:35,000 --> 00:41:37,480 Speaker 3: so on. So I think the fact that we could 784 00:41:37,600 --> 00:41:40,919 Speaker 3: doesn't mean that we will. We probably won't, which means 785 00:41:40,960 --> 00:41:44,200 Speaker 3: we shouldn't put ourselves in that position of being like, well, 786 00:41:44,239 --> 00:41:46,399 Speaker 3: we've still got them, so we could still bring them back, 787 00:41:46,400 --> 00:41:49,919 Speaker 3: so they're not really extinct. But then when you start 788 00:41:49,960 --> 00:41:58,840 Speaker 3: asking whether this kind of potential resurrected beetle is a 789 00:41:58,960 --> 00:42:01,520 Speaker 3: kind of ersatz version of the real thing, that's when 790 00:42:01,520 --> 00:42:08,360 Speaker 3: you do start to, like, wander into this fuzzier territory. 791 00:42:08,480 --> 00:42:13,000 Speaker 3: You know, is there something inherently valuable about a beetle 792 00:42:13,320 --> 00:42:21,839 Speaker 3: that has continuously lived in the habitat in which it evolved, and, 793 00:42:22,520 --> 00:42:29,360 Speaker 3: as it were, the kind of community and ecosystem role 794 00:42:30,200 --> 00:42:34,520 Speaker 3: of that species within the, you know, broader web of 795 00:42:34,600 --> 00:42:41,680 Speaker 3: species has continually existed from the first moment it evolved?
796 00:42:42,520 --> 00:42:50,439 Speaker 3: Is that more valuable than, hypothetically, the species being brought 797 00:42:50,480 --> 00:42:53,360 Speaker 3: back in a zoo in the future? Well, it seems 798 00:42:53,400 --> 00:42:56,239 Speaker 3: to me that it is. But it is harder then 799 00:42:56,280 --> 00:43:01,719 Speaker 3: to say, well, why? It doesn't really seem to affect anyone. 800 00:43:02,960 --> 00:43:06,719 Speaker 3: It doesn't make anyone's life better. Even if we're very 801 00:43:06,760 --> 00:43:10,280 Speaker 3: invested in this beetle existing somewhere in the world, whose 802 00:43:10,360 --> 00:43:15,400 Speaker 3: life is better because this, you know, beetle has continuously existed? 803 00:43:15,600 --> 00:43:20,520 Speaker 3: It is like caring deeply about your table being a 804 00:43:20,600 --> 00:43:24,160 Speaker 3: real antique instead of a fake antique. If you're very 805 00:43:24,200 --> 00:43:26,839 Speaker 3: into antiques, then of course you care about that. But 806 00:43:26,880 --> 00:43:30,239 Speaker 3: why should anyone else care about that? In particular, why 807 00:43:30,239 --> 00:43:34,640 Speaker 3: should anyone else pay costs or give things up because 808 00:43:34,840 --> 00:43:38,319 Speaker 3: you care about that? That's a niche interest. It does 809 00:43:38,360 --> 00:43:43,160 Speaker 3: seem to me that it would be nice not to 810 00:43:43,200 --> 00:43:46,719 Speaker 3: eradicate this beetle and simply have it in a biobank 811 00:43:46,800 --> 00:43:51,120 Speaker 3: and clone it later. But, you know, that's not how 812 00:43:51,160 --> 00:43:53,719 Speaker 3: politics works. You can't say to people, well, we all 813 00:43:53,760 --> 00:43:55,759 Speaker 3: have to agree to do this because I think that 814 00:43:55,800 --> 00:43:58,400 Speaker 3: would be nice. So I think that's where philosophy comes in. 815 00:43:58,440 --> 00:44:00,920 Speaker 3: That's where you have to start thinking, well, I 816 00:44:00,960 --> 00:44:03,520 Speaker 3: have reasons for thinking it would be nice, and once 817 00:44:03,560 --> 00:44:06,680 Speaker 3: we dig into the reasons, maybe you would start to 818 00:44:06,719 --> 00:44:08,439 Speaker 3: agree with me too. But then, of course, the danger 819 00:44:08,560 --> 00:44:10,600 Speaker 3: is once you start digging into the reasons, the reverse 820 00:44:10,640 --> 00:44:12,480 Speaker 3: could happen. It could be that I start thinking, well, 821 00:44:12,480 --> 00:44:15,120 Speaker 3: actually I don't even care anymore now that I've looked 822 00:44:15,120 --> 00:44:18,960 Speaker 3: at it, you know, really harshly. I don't care. I 823 00:44:18,960 --> 00:44:21,920 Speaker 3: think there actually are more important things. The other thing 824 00:44:21,920 --> 00:44:24,200 Speaker 3: I talked about in the book is that knowing that 825 00:44:24,280 --> 00:44:29,000 Speaker 3: this technology is there sort of takes the pressure off. 826 00:44:29,040 --> 00:44:32,040 Speaker 3: It's going to make us more lackadaisical because we have 827 00:44:32,160 --> 00:44:36,319 Speaker 3: a plan B. I think there is something to that, 828 00:44:36,360 --> 00:44:38,840 Speaker 3: but, you know, I don't think that's a 829 00:44:38,920 --> 00:44:42,400 Speaker 3: reason not to build biobanks or whatever.
Better to have 830 00:44:42,440 --> 00:44:44,960 Speaker 3: them in case we need them than not to have 831 00:44:45,040 --> 00:44:46,480 Speaker 3: them out of a fear that they would make us 832 00:44:46,520 --> 00:44:47,320 Speaker 3: lazy or whatever. 833 00:44:47,640 --> 00:44:50,759 Speaker 1: I think it's fascinating the way having biobanks or the 834 00:44:50,800 --> 00:44:54,720 Speaker 1: ability to resurrect a species makes extinction itself less terrible, 835 00:44:54,760 --> 00:44:57,759 Speaker 1: because it's the irreversibility of extinction that really gives it 836 00:44:57,760 --> 00:44:59,960 Speaker 1: its moral drama. Sort of reminds me of your answer to 837 00:45:00,040 --> 00:45:03,600 Speaker 1: the question about teleporters. Like, if I murdered somebody, it's 838 00:45:03,640 --> 00:45:05,960 Speaker 1: actually less terrible to murder them if I knew I 839 00:45:06,000 --> 00:45:08,359 Speaker 1: could just recreate them somewhere else, and then I'd say, like, look, 840 00:45:08,400 --> 00:45:11,800 Speaker 1: according to, you know, novelist Ned Beauman, you still exist, 841 00:45:11,840 --> 00:45:13,719 Speaker 1: and you're still you, even if I murdered you and 842 00:45:13,760 --> 00:45:14,319 Speaker 1: recreated you. 843 00:45:14,719 --> 00:45:17,080 Speaker 3: Yeah, I think that's a great analogy, actually, because again 844 00:45:17,880 --> 00:45:20,000 Speaker 3: I talk about this in the book. Yeah, the question 845 00:45:20,040 --> 00:45:23,839 Speaker 3: of whether something is extinct or not extinct, it's simplistic 846 00:45:24,000 --> 00:45:28,080 Speaker 3: to make that a binary. You know, extinction is arguably 847 00:45:28,200 --> 00:45:31,319 Speaker 3: not a clear-cut enough concept that you can use 848 00:45:31,360 --> 00:45:35,440 Speaker 3: it in that way. It might be more helpful to 849 00:45:35,520 --> 00:45:40,160 Speaker 3: start talking about species being sort of extinct-ish, although, 850 00:45:40,200 --> 00:45:45,560 Speaker 3: again, in the book I ask, is that gonna 851 00:45:46,440 --> 00:45:52,359 Speaker 3: sort of expand our sense of how worried we might 852 00:45:52,440 --> 00:45:54,720 Speaker 3: need to be about a species, or, on the contrary, 853 00:45:55,239 --> 00:45:59,120 Speaker 3: is it gonna let us relax when we shouldn't be 854 00:46:00,520 --> 00:46:02,080 Speaker 3: relaxing about it? 855 00:46:02,880 --> 00:46:05,160 Speaker 1: Well, then let me play the philosophical game of making 856 00:46:05,160 --> 00:46:08,920 Speaker 1: it more personal. Say we could scan you and resuscitate 857 00:46:09,000 --> 00:46:11,440 Speaker 1: you or recreate you later on. Would you want that 858 00:46:11,520 --> 00:46:13,400 Speaker 1: to happen? And would that make it less bad for 859 00:46:13,440 --> 00:46:14,400 Speaker 1: somebody to murder you? 860 00:46:14,640 --> 00:46:17,600 Speaker 3: Well, again, reading loads of Greg Egan when I was 861 00:46:17,600 --> 00:46:20,080 Speaker 3: younger has been a huge influence on my thinking about this, 862 00:46:20,120 --> 00:46:22,759 Speaker 3: because he writes more interestingly than anyone else I've ever 863 00:46:22,800 --> 00:46:25,160 Speaker 3: read about what it would be like to be an 864 00:46:25,239 --> 00:46:31,520 Speaker 3: uploaded consciousness.
And, you know, of course, if you end 865 00:46:31,600 --> 00:46:34,440 Speaker 3: up living on a computer, then you might live for, 866 00:46:35,800 --> 00:46:38,399 Speaker 3: well, you might live for another million or billion 867 00:46:38,480 --> 00:46:42,800 Speaker 3: years, and at that point you have complete freedom 868 00:46:42,920 --> 00:46:49,080 Speaker 3: to alter yourself. So, the person at the end 869 00:46:49,120 --> 00:46:54,600 Speaker 3: of the billion years, who's been radically kind of expanded and 870 00:46:54,800 --> 00:47:00,880 Speaker 3: altered and perhaps merged or split into two or whatever, 871 00:47:01,320 --> 00:47:04,799 Speaker 3: is that the same person as the person who was uploaded? 872 00:47:06,320 --> 00:47:08,480 Speaker 3: Once again, I think it's preposterous to give a straight 873 00:47:08,600 --> 00:47:11,280 Speaker 3: yes or no answer. You have to say, well, there's 874 00:47:11,320 --> 00:47:16,600 Speaker 3: some degree of continuity, in it being the same-ish person, 875 00:47:17,920 --> 00:47:20,440 Speaker 3: but I don't know. So, you know, that's why I 876 00:47:20,480 --> 00:47:23,560 Speaker 3: always think it's a bit kind of vapid to say, 877 00:47:24,040 --> 00:47:26,320 Speaker 3: would you want to be immortal or not, because 878 00:47:26,480 --> 00:47:29,680 Speaker 3: clearly the person who's there at the end of eternity 879 00:47:30,680 --> 00:47:34,000 Speaker 3: is only in certain ways continuous with the person who 880 00:47:34,120 --> 00:47:36,200 Speaker 3: was there at the beginning of it. Like, is that 881 00:47:36,280 --> 00:47:41,799 Speaker 3: person any more similar to you than your father is 882 00:47:41,880 --> 00:47:45,000 Speaker 3: similar to you, or whatever? So when I think, would 883 00:47:45,000 --> 00:47:47,160 Speaker 3: I want to live forever, would that be terrifying, I 884 00:47:47,160 --> 00:47:49,560 Speaker 3: always think, well, I don't think living forever is possible, 885 00:47:49,600 --> 00:47:52,200 Speaker 3: because the person at the end of forever is only 886 00:47:52,280 --> 00:47:55,400 Speaker 3: partly you. All of that said, my answer basically is no, 887 00:47:55,640 --> 00:47:59,879 Speaker 3: I think seventy to ninety years is ample. I really 888 00:48:00,040 --> 00:48:03,920 Speaker 3: don't feel any need or desire to live several hundred 889 00:48:04,160 --> 00:48:07,960 Speaker 3: or several thousand more. And also, you know, one of 890 00:48:07,960 --> 00:48:10,840 Speaker 3: the things Greg Egan writes about, and again it's 891 00:48:11,120 --> 00:48:13,240 Speaker 3: sort of referenced in a distant way in the book, 892 00:48:13,440 --> 00:48:17,400 Speaker 3: is, like, that's a lot of time to go nuts. Basically, 893 00:48:17,480 --> 00:48:21,360 Speaker 3: that's a lot of time to become obsessed with the 894 00:48:21,400 --> 00:48:28,920 Speaker 3: wrong thing or to start valuing the wrong things. And obviously, 895 00:48:29,200 --> 00:48:32,719 Speaker 3: if you're in this position where you can sort of 896 00:48:33,000 --> 00:48:36,160 Speaker 3: edit yourself, then that can really turn into a spiral.
897 00:48:36,360 --> 00:48:38,839 Speaker 3: Like, if you spend a week thinking there's nothing more 898 00:48:38,840 --> 00:48:41,879 Speaker 3: important than this thing that I've just got into, then 899 00:48:41,880 --> 00:48:46,560 Speaker 3: maybe you think, well, I'm going to edit myself so 900 00:48:46,600 --> 00:48:49,800 Speaker 3: I'm more committed to this thing that I've just got into. 901 00:48:50,600 --> 00:48:54,160 Speaker 3: And then the person that you've become who's more committed 902 00:48:54,200 --> 00:48:56,000 Speaker 3: to it thinks, well, I've got to become even more 903 00:48:56,000 --> 00:49:00,000 Speaker 3: committed to it. So you start editing your own consciousness 904 00:49:00,200 --> 00:49:02,719 Speaker 3: so that you become more and more into this specific thing, 905 00:49:03,080 --> 00:49:05,279 Speaker 3: and then you can never get out of it, and 906 00:49:05,320 --> 00:49:10,239 Speaker 3: then you're just there for eternity, kind of shriveling up 907 00:49:10,440 --> 00:49:16,320 Speaker 3: into this monomaniacal computer consciousness. And, you know, I'm already 908 00:49:16,400 --> 00:49:20,080 Speaker 3: way too into Monster Hunter World for my Xbox. Like, 909 00:49:20,200 --> 00:49:22,399 Speaker 3: I dread to think how much I could get into 910 00:49:22,440 --> 00:49:25,719 Speaker 3: it if I had complete control over my own consciousness 911 00:49:25,719 --> 00:49:28,000 Speaker 3: and was going to live a billion years. So no, basically, 912 00:49:28,040 --> 00:49:31,120 Speaker 3: I think it's safer to die of old age. But I 913 00:49:31,160 --> 00:49:34,400 Speaker 3: wish the best to anyone who's getting uploaded, and I 914 00:49:34,440 --> 00:49:37,239 Speaker 3: completely think that's possible and they will be the same 915 00:49:37,239 --> 00:49:39,320 Speaker 3: person, at least in the short term, so I encourage 916 00:49:39,360 --> 00:49:41,520 Speaker 3: people to try it out, but it's not for me. 917 00:49:42,040 --> 00:49:45,400 Speaker 2: So, speaking of long-term planning, what are your thoughts: 918 00:49:45,440 --> 00:49:49,560 Speaker 2: are we going to eventually avert this extinction disaster 919 00:49:49,800 --> 00:49:52,280 Speaker 2: at some point? Like, what do you think our prospects 920 00:49:52,320 --> 00:49:55,480 Speaker 2: are for humanity in the next one hundred or one 921 00:49:55,520 --> 00:49:56,240 Speaker 2: thousand years? 922 00:49:56,480 --> 00:49:58,319 Speaker 3: Well, again, this is why I didn't set the book 923 00:49:58,360 --> 00:50:01,280 Speaker 3: any further in the future. I know people become furious 924 00:50:01,320 --> 00:50:04,200 Speaker 3: when this is said. I do think there is at 925 00:50:04,280 --> 00:50:08,600 Speaker 3: least a possibility that when we build an AI that's 926 00:50:08,640 --> 00:50:11,480 Speaker 3: like a million times more intelligent than any human being, 927 00:50:11,920 --> 00:50:13,919 Speaker 3: the AI will come up with something that we didn't 928 00:50:13,920 --> 00:50:17,000 Speaker 3: come up with. Like, I do think that could happen. 929 00:50:17,120 --> 00:50:19,520 Speaker 3: I don't think we should rely on that happening. And 930 00:50:19,640 --> 00:50:23,040 Speaker 3: if that doesn't happen, I don't think it is looking 931 00:50:23,440 --> 00:50:27,400 Speaker 3: very good.
I actually listened to a different podcast recently 932 00:50:27,440 --> 00:50:31,640 Speaker 3: with Peter Watts, the Canadian science fiction novelist, who's really 933 00:50:31,640 --> 00:50:36,560 Speaker 3: brilliant and also famous for his pessimism, and his take 934 00:50:36,680 --> 00:50:41,040 Speaker 3: on it is that even with a lot of geoengineering, 935 00:50:42,280 --> 00:50:47,080 Speaker 3: so much climate change has already been locked into the oceans 936 00:50:47,280 --> 00:50:52,960 Speaker 3: and so forth that we can avert the very worst, maybe, 937 00:50:53,000 --> 00:50:56,080 Speaker 3: but it's already too late to avert the 938 00:50:57,000 --> 00:51:01,480 Speaker 3: almost-as-bad, and the almost-as-bad definitely involves 939 00:51:02,040 --> 00:51:06,720 Speaker 3: a lot of ecosystems being absolutely devastated and a huge 940 00:51:06,880 --> 00:51:11,160 Speaker 3: chunk of the biodiversity of the Earth just going away, 941 00:51:11,680 --> 00:51:15,920 Speaker 3: probably before we have the opportunity to scan and preserve 942 00:51:15,960 --> 00:51:18,360 Speaker 3: it all. But then, you know, you've got to have 943 00:51:18,400 --> 00:51:21,160 Speaker 3: a certain amount of intellectual humility about this stuff. Like, 944 00:51:21,320 --> 00:51:23,319 Speaker 3: every ten years you look at the graph and it's 945 00:51:23,400 --> 00:51:26,319 Speaker 3: like, the graph is not where it's supposed to be. 946 00:51:26,400 --> 00:51:28,759 Speaker 3: Like, sometimes it's worse or sometimes it's better, like the 947 00:51:28,800 --> 00:51:31,560 Speaker 3: whole thing about renewable energy having gone down in price 948 00:51:31,600 --> 00:51:34,160 Speaker 3: ninety-seven percent or whatever it is over the 949 00:51:34,200 --> 00:51:38,560 Speaker 3: past decade. So I really can't say. It'd be nice 950 00:51:38,560 --> 00:51:40,719 Speaker 3: if AI saved us, but I do want to emphasize, 951 00:51:41,080 --> 00:51:42,919 Speaker 3: I don't think we should, like, sit back and wait 952 00:51:42,960 --> 00:51:44,840 Speaker 3: for that to happen. It would be best if that was 953 00:51:44,920 --> 00:51:46,959 Speaker 3: only the emergency plan and we came up with something 954 00:51:46,960 --> 00:51:47,800 Speaker 3: better in the meantime. 955 00:51:47,880 --> 00:51:50,719 Speaker 1: All right, we have lots more hard philosophical questions for 956 00:51:50,760 --> 00:52:06,439 Speaker 1: Ned, but first we have to take a quick break. Okay, 957 00:52:06,480 --> 00:52:08,799 Speaker 1: we're back and we are chatting with Ned Beauman, the 958 00:52:08,840 --> 00:52:11,600 Speaker 1: author of Venomous Lumpsucker. Well, I'd love to hear a 959 00:52:11,640 --> 00:52:13,960 Speaker 1: little bit more about your writing process. You said you 960 00:52:13,960 --> 00:52:16,280 Speaker 1: did a lot of research. Why did you decide to 961 00:52:16,360 --> 00:52:19,399 Speaker 1: invent a fictional species for your book whereas the rest 962 00:52:19,440 --> 00:52:21,800 Speaker 1: of it seems to follow the rules of our universe?
963 00:52:22,360 --> 00:52:27,440 Speaker 3: Well, the book had to be premised on a highly 964 00:52:27,719 --> 00:52:32,120 Speaker 3: intelligent species, and most of the highly intelligent species that 965 00:52:32,200 --> 00:52:38,840 Speaker 3: we know about are fairly well publicized, so the fact 966 00:52:39,080 --> 00:52:45,760 Speaker 3: of whether they are endangered or extinct is a fact 967 00:52:45,760 --> 00:52:47,839 Speaker 3: in the world that people know, which would have made 968 00:52:47,840 --> 00:52:52,920 Speaker 3: it very hard to fictionalize. So I had to 969 00:52:52,960 --> 00:52:58,319 Speaker 3: come up with a fictional intelligent species that could plausibly 970 00:52:58,440 --> 00:53:04,600 Speaker 3: have remained obscure. And I didn't really feel like it 971 00:53:04,640 --> 00:53:07,239 Speaker 3: could be a mammal, because if you look at the 972 00:53:07,320 --> 00:53:10,799 Speaker 3: class of mammals, there aren't actually that many. Like, there 973 00:53:10,800 --> 00:53:15,239 Speaker 3: really aren't that many mammals, especially in Europe, and if 974 00:53:15,280 --> 00:53:19,200 Speaker 3: there was an intelligent mammal, we would have heard about it. 975 00:53:19,239 --> 00:53:21,120 Speaker 3: I mean, apart from the ones that we obviously already 976 00:53:21,120 --> 00:53:24,120 Speaker 3: know about, I really don't think there are any very 977 00:53:24,239 --> 00:53:27,920 Speaker 3: intelligent mammals that just nobody has noticed yet. That didn't 978 00:53:27,920 --> 00:53:30,960 Speaker 3: feel realistic to me. So I made it a fish, 979 00:53:31,040 --> 00:53:36,480 Speaker 3: because there are so many fish, and fish intelligence is 980 00:53:36,560 --> 00:53:41,560 Speaker 3: still pretty understudied, so it was just about credible to me, 981 00:53:42,080 --> 00:53:46,120 Speaker 3: and hopefully to the reader, that there could be this 982 00:53:46,320 --> 00:53:49,600 Speaker 3: fish that was really special, but we just hadn't really 983 00:53:49,600 --> 00:53:53,080 Speaker 3: been paying any attention, and it had maybe gone extinct 984 00:53:53,080 --> 00:53:55,880 Speaker 3: without anyone really noticing. And the other advantage of a 985 00:53:55,920 --> 00:53:59,239 Speaker 3: fish is that fish are hard to find. Like, if 986 00:53:59,239 --> 00:54:02,280 Speaker 3: it's a bird, you can just set up cameras or whatever. 987 00:54:02,480 --> 00:54:04,080 Speaker 3: I mean, if you care enough about it, you can 988 00:54:04,080 --> 00:54:06,080 Speaker 3: just set up loads of cameras. But if something is, 989 00:54:06,120 --> 00:54:10,000 Speaker 3: obviously, in the ocean, then, you know, it's very dark 990 00:54:10,040 --> 00:54:14,239 Speaker 3: down there, so it's easier to believe that you could 991 00:54:14,280 --> 00:54:18,400 Speaker 3: have a quest for this species that didn't just entail, 992 00:54:19,480 --> 00:54:22,200 Speaker 3: well, we, you know, send up one hundred drones with 993 00:54:22,239 --> 00:54:23,160 Speaker 3: cameras to look for it. 994 00:54:23,360 --> 00:54:24,880 Speaker 2: So when I was reading the book, it sort of 995 00:54:24,880 --> 00:54:28,319 Speaker 2: reminded me of some George Saunders short stories that I've read, 996 00:54:28,400 --> 00:54:30,600 Speaker 2: like, it's sort of, like, wild and out there, and, 997 00:54:30,640 --> 00:54:32,920 Speaker 2: oh my gosh, what are these people thinking?
But at 998 00:54:32,920 --> 00:54:35,520 Speaker 2: the end you're left pondering all these big questions about 999 00:54:35,520 --> 00:54:38,400 Speaker 2: society and humanity. And clearly I'm no literary critic, so 1000 00:54:38,480 --> 00:54:40,319 Speaker 2: I've done a horrible job of describing all of this. 1001 00:54:40,440 --> 00:54:44,560 Speaker 2: But outside of, like, your science fiction influences, who 1002 00:54:44,600 --> 00:54:47,920 Speaker 2: are your, like, straight fiction influences, other than Egan? 1003 00:54:48,080 --> 00:54:50,279 Speaker 3: I mean, I love George Saunders, but I wouldn't really 1004 00:54:50,320 --> 00:54:53,719 Speaker 3: say he's an influence on me, partly because, like, Saunders, 1005 00:54:53,800 --> 00:54:56,239 Speaker 3: as I think he himself talks about, is ultimately, 1006 00:54:56,280 --> 00:55:02,239 Speaker 3: like, very concerned with human feeling and human kindness and 1007 00:55:02,280 --> 00:55:04,200 Speaker 3: stuff like that, and, like, I'm not interested in that 1008 00:55:04,280 --> 00:55:06,960 Speaker 3: kind of thing at all. Like, that's not what 1009 00:55:07,000 --> 00:55:10,279 Speaker 3: I write novels about. So there's 1010 00:55:10,280 --> 00:55:14,080 Speaker 3: a limit to how much I can take from him. 1011 00:55:14,520 --> 00:55:19,480 Speaker 3: So, influences from outside science fiction, well, it's funny, any 1012 00:55:19,520 --> 00:55:22,160 Speaker 3: of the names I would mention, I don't know how 1013 00:55:22,280 --> 00:55:25,800 Speaker 3: much you would see of them in this book. Well, actually, 1014 00:55:26,400 --> 00:55:31,120 Speaker 3: Graham Greene is one. You know, Greene's novels are all 1015 00:55:31,200 --> 00:55:39,200 Speaker 3: about putting kind of tortured people into terrible moral situations, 1016 00:55:39,440 --> 00:55:43,239 Speaker 3: and I think that was definitely an influence on what 1017 00:55:43,320 --> 00:55:46,359 Speaker 3: happens to Resaint in this book. And actually, now 1018 00:55:46,440 --> 00:55:50,439 Speaker 3: that I think about it, when she talks about Catholics and, 1019 00:55:52,680 --> 00:55:55,920 Speaker 3: you know, how thorny their theology is, I think I 1020 00:55:56,080 --> 00:56:00,440 Speaker 3: almost put kind of Catholics from a Graham Greene novel, 1021 00:56:00,920 --> 00:56:04,719 Speaker 3: or Catholic Graham Greene readers or whatever, in there. So that's definitely 1022 00:56:04,760 --> 00:56:07,640 Speaker 3: in there. I don't know, other than that. You know, 1023 00:56:07,680 --> 00:56:09,720 Speaker 3: I'm not going to say I have transcended my influences 1024 00:56:09,840 --> 00:56:12,560 Speaker 3: or anything, but I would say that my earlier novels 1025 00:56:13,280 --> 00:56:18,839 Speaker 3: were very much a patchwork of influences and pastiches and 1026 00:56:19,280 --> 00:56:22,560 Speaker 3: even direct quotes, and I would happily go, well, this 1027 00:56:22,600 --> 00:56:25,560 Speaker 3: bit is from this person, and this bit is from 1028 00:56:25,640 --> 00:56:30,680 Speaker 3: this person. But I don't know.
By the point of 1029 00:56:30,680 --> 00:56:34,200 Speaker 3: this novel, I'm still, like, totally in the shadow of 1030 00:56:34,239 --> 00:56:37,479 Speaker 3: all my influences, but I think I've at least found 1031 00:56:37,520 --> 00:56:41,240 Speaker 3: my own style and preoccupations, to the point 1032 00:56:41,239 --> 00:56:44,200 Speaker 3: that I wouldn't say about this novel, well, this novel 1033 00:56:44,280 --> 00:56:47,240 Speaker 3: is simply this writer and this writer and this writer 1034 00:56:48,800 --> 00:56:51,239 Speaker 3: mashed together, in the way that I would have with 1035 00:56:51,320 --> 00:56:52,000 Speaker 3: the early ones. 1036 00:56:52,760 --> 00:56:54,640 Speaker 1: So the book is really thoughtful, but I also want 1037 00:56:54,680 --> 00:56:57,439 Speaker 1: our listeners to appreciate, like, how funny it is on 1038 00:56:57,480 --> 00:57:00,120 Speaker 1: the page, and part of that just comes from, you know, 1039 00:57:00,200 --> 00:57:03,359 Speaker 1: your particular turns of phrase. And as I was reading it, 1040 00:57:03,400 --> 00:57:05,200 Speaker 1: I was struck by this one word which I 1041 00:57:05,239 --> 00:57:07,120 Speaker 1: had to look up, and I'm going to ask you 1042 00:57:07,160 --> 00:57:09,160 Speaker 1: to give us, like, a useful definition of it, because 1043 00:57:09,160 --> 00:57:11,799 Speaker 1: I need to know it in context. What exactly is 1044 00:57:11,920 --> 00:57:12,879 Speaker 1: an argy-bargy? 1045 00:57:15,040 --> 00:57:18,480 Speaker 3: Well, the thing is, as with any word like that, 1046 00:57:19,000 --> 00:57:21,680 Speaker 3: if there was an easier way of saying it that 1047 00:57:21,720 --> 00:57:24,000 Speaker 3: meant the same thing, then I would have used that. Like, 1048 00:57:24,160 --> 00:57:27,840 Speaker 3: I think, I'm pretty sure, I remember having to think, like, 1049 00:57:28,000 --> 00:57:34,920 Speaker 3: what is a one-word or one-phrase expression for 1050 00:57:35,560 --> 00:57:38,320 Speaker 3: what I am trying to talk about here? And I 1051 00:57:38,360 --> 00:57:41,080 Speaker 3: think it took me a while to get to argy-bargy, because 1052 00:57:41,600 --> 00:57:44,000 Speaker 3: argy-bargy is not a word that I would normally use 1053 00:57:44,040 --> 00:57:46,840 Speaker 3: in conversation. It's probably a word that I had never 1054 00:57:47,320 --> 00:57:51,160 Speaker 3: written out in my life before. It's not a word 1055 00:57:51,160 --> 00:57:55,200 Speaker 3: that you hear come up that much, but it is 1056 00:57:55,240 --> 00:57:57,880 Speaker 3: one of those English phrases with a specific meaning: 1057 00:57:57,960 --> 00:58:10,680 Speaker 3: it's some combination of, sort of, fuss, commotion, disputation, hassle, argument, 1058 00:58:11,240 --> 00:58:14,560 Speaker 3: you know, all those kinds of things, but none of 1059 00:58:14,560 --> 00:58:19,919 Speaker 3: them quite capture it.
And then, if I remember rightly, 1060 00:58:20,120 --> 00:58:23,160 Speaker 3: it comes up when, you know, most of the book 1061 00:58:23,240 --> 00:58:25,919 Speaker 3: is about, like, Australians and Europeans in Europe, but that bit 1062 00:58:26,160 --> 00:58:29,640 Speaker 3: is an English character talking about something that happened in England, 1063 00:58:30,440 --> 00:58:33,680 Speaker 3: and it's in an England which has kind of gone backwards, 1064 00:58:34,200 --> 00:58:41,000 Speaker 3: so it felt appropriate there to use a quite old-fashioned, quaint, 1065 00:58:41,360 --> 00:58:43,800 Speaker 3: very English word. 1066 00:58:45,160 --> 00:58:48,280 Speaker 1: But, for example, is this something a married couple might 1067 00:58:48,400 --> 00:58:51,440 Speaker 1: do when they're, you know, disagreeing about whose turn it 1068 00:58:51,480 --> 00:58:53,960 Speaker 1: is to have to do the dishes? Or is this 1069 00:58:54,040 --> 00:58:56,840 Speaker 1: something kids do? Is this a description of kids' arguments on 1070 00:58:56,880 --> 00:59:00,720 Speaker 1: the playground? I'm just lacking a concrete, like, 1071 00:59:00,880 --> 00:59:02,080 Speaker 1: understanding of what it means. 1072 00:59:02,320 --> 00:59:04,800 Speaker 3: If you said, like, oh, I had a bit of 1073 00:59:04,960 --> 00:59:08,560 Speaker 3: argy-bargy with the wife or whatever, that would sound 1074 00:59:08,720 --> 00:59:17,040 Speaker 3: condescending, or at least kind of inappropriately jovial, because it 1075 00:59:17,360 --> 00:59:29,160 Speaker 3: slightly implies a sort of annoying, somewhat inconsequential obstacle or 1076 00:59:30,000 --> 00:59:34,760 Speaker 3: friction that you just have to get past. You would 1077 00:59:34,760 --> 00:59:38,560 Speaker 3: have to say, like, during last night's argy-bargy, my 1078 00:59:38,680 --> 00:59:42,880 Speaker 3: wife expressed some very real concerns, which I listened to 1079 00:59:43,000 --> 00:59:45,920 Speaker 3: and took on board. Like, that simply wouldn't be compatible. 1080 00:59:48,640 --> 00:59:51,760 Speaker 2: What about, your kid for the one thousandth time didn't 1081 00:59:51,800 --> 00:59:54,360 Speaker 2: put their underwear in the hamper, and you had a 1082 00:59:54,360 --> 00:59:56,920 Speaker 2: bit of an argy-bargy with them about it? Would 1083 00:59:56,920 --> 00:59:59,960 Speaker 2: that be appropriate? Like, it is sort of inconsequential. 1084 01:00:00,400 --> 01:00:05,480 Speaker 3: Oh, again, it's so hard to articulate why, but 1085 01:00:05,560 --> 01:00:11,480 Speaker 3: it doesn't have that sort of kind of intimate, interpersonal context. 1086 01:00:11,760 --> 01:00:14,520 Speaker 3: I think it applies more to something that happens at work, 1087 01:00:14,920 --> 01:00:18,080 Speaker 3: or, I'm kind of imagining, I don't know, this is 1088 01:00:18,080 --> 01:00:27,800 Speaker 3: a random example, like if a policeman tells someone to 1089 01:00:28,000 --> 01:00:31,640 Speaker 3: move their bike or something. You know, our policemen don't 1090 01:00:31,640 --> 01:00:34,480 Speaker 3: carry guns, so I'm imagining, like, a slightly more benign 1091 01:00:34,560 --> 01:00:36,600 Speaker 3: version of that than might happen elsewhere in the world. 1092 01:00:36,680 --> 01:00:39,800 Speaker 3: I mean, I do think it implies two people who 1093 01:00:39,880 --> 01:00:44,280 Speaker 3: don't really know each other kind of snapping at each other, 1094 01:00:45,240 --> 01:00:51,560 Speaker 3: not really succeeding in communicating.
But ultimately it doesn't matter 1095 01:00:51,600 --> 01:00:53,920 Speaker 3: and it may as well never have happened. 1096 01:00:56,400 --> 01:00:58,000 Speaker 2: Oh, so like everything on the internet. 1097 01:00:58,640 --> 01:01:03,480 Speaker 3: Yeah, but no, not really. I 1098 01:01:03,520 --> 01:01:05,880 Speaker 3: know it makes it sound like argy-bargy is as 1099 01:01:06,160 --> 01:01:10,920 Speaker 3: hard a word to define as personhood or extinction. And 1100 01:01:10,960 --> 01:01:14,040 Speaker 3: when I'm thinking about personhood or extinction, I am thinking 1101 01:01:14,040 --> 01:01:18,040 Speaker 3: about, like, you know, Wittgenstein famously said no one 1102 01:01:18,040 --> 01:01:21,640 Speaker 3: can define a game. A game is just a kind 1103 01:01:21,680 --> 01:01:25,960 Speaker 3: of tangle of associated things. So that's why it's slightly 1104 01:01:26,040 --> 01:01:28,439 Speaker 3: misguided whenever we try and define any word, because any word 1105 01:01:28,480 --> 01:01:31,640 Speaker 3: basically is a tangle of associated, semi-continuous things. And 1106 01:01:31,680 --> 01:01:34,640 Speaker 3: I think personhood is definitely like that, and I think, unfortunately, 1107 01:01:34,720 --> 01:01:37,160 Speaker 3: argy-bargy is like that. Like, that's why I'm so 1108 01:01:37,200 --> 01:01:41,320 Speaker 3: struggling to define it. It's so English, so contextual, 1109 01:01:42,520 --> 01:01:47,040 Speaker 3: and so hard to pin down exactly. It does have 1110 01:01:47,160 --> 01:01:54,280 Speaker 3: some implication of, like, bureaucracy, misunderstanding, someone trying to exert 1111 01:01:54,440 --> 01:02:00,560 Speaker 3: authority, maybe a vague sense of impending physical scuffle, 1112 01:02:00,720 --> 01:02:02,640 Speaker 3: but the scuffle doesn't quite happen. 1113 01:02:03,280 --> 01:02:04,600 Speaker 1: This sounds like a faculty meeting. 1114 01:02:04,840 --> 01:02:07,880 Speaker 3: Yeah, but a faculty meeting would be unlikely to rise 1115 01:02:08,080 --> 01:02:10,200 Speaker 3: to argy-bargy in that way. I don't know. But 1116 01:02:10,560 --> 01:02:12,720 Speaker 3: this is also maybe why I never use this word, 1117 01:02:12,960 --> 01:02:14,480 Speaker 3: because it's so hard to grasp. 1118 01:02:14,800 --> 01:02:17,040 Speaker 1: Well, I think it's delicious how difficult it is to 1119 01:02:17,120 --> 01:02:19,760 Speaker 1: understand where the meanings of words are. Maybe we'll find 1120 01:02:19,800 --> 01:02:23,840 Speaker 1: somewhere a philosophy thesis on the topic of the argy-bargy. 1121 01:02:23,880 --> 01:02:26,280 Speaker 1: But thanks very much for joining us today and digging 1122 01:02:26,320 --> 01:02:28,960 Speaker 1: into these tricky questions. We really enjoyed the book and 1123 01:02:29,000 --> 01:02:30,880 Speaker 1: we really enjoyed our conversation with you. Thank you. 1124 01:02:31,080 --> 01:02:32,720 Speaker 3: Yeah, thanks a lot for having me. 1125 01:02:32,920 --> 01:02:34,440 Speaker 1: And before we let you go, can you tell us 1126 01:02:34,440 --> 01:02:36,600 Speaker 1: anything about your upcoming projects or your next book?
1127 01:02:36,880 --> 01:02:43,200 Speaker 3: I have started another novel, which is about how the 1128 01:02:43,240 --> 01:02:49,080 Speaker 3: most evil institution in world history, which has existed for 1129 01:02:49,320 --> 01:02:55,720 Speaker 3: hundreds of years, is still in operation and thriving just 1130 01:02:56,000 --> 01:02:59,880 Speaker 3: west of London. But it is too early to reveal 1131 01:03:00,200 --> 01:03:05,000 Speaker 3: what that institution is. But people are welcome to guess. 1132 01:03:05,360 --> 01:03:08,560 Speaker 1: Wonderful, sounds delicious. We look forward to seeing it. All right, 1133 01:03:08,600 --> 01:03:10,920 Speaker 1: thanks very much for coming on the program. All right, 1134 01:03:10,960 --> 01:03:14,360 Speaker 1: so that was a super fun conversation with Ned. I'm 1135 01:03:14,400 --> 01:03:17,280 Speaker 1: glad that he wouldn't consider that conversation an argy-bargy. 1136 01:03:18,600 --> 01:03:21,720 Speaker 2: I cannot wait to use that word on Zach, 1137 01:03:22,120 --> 01:03:26,439 Speaker 2: because he, like, loves old English stuff, and I can't 1138 01:03:26,440 --> 01:03:28,080 Speaker 2: wait to see if he knows what that word means. 1139 01:03:28,080 --> 01:03:30,120 Speaker 2: And I am going to use that word like five 1140 01:03:30,200 --> 01:03:32,480 Speaker 2: or six times a day until I personally feel like 1141 01:03:32,520 --> 01:03:34,200 Speaker 2: I know where it belongs in my life. 1142 01:03:34,440 --> 01:03:36,800 Speaker 1: Well, I hope it doesn't cause any argy-bargies. I'm 1143 01:03:36,840 --> 01:03:39,520 Speaker 1: going to use it on my brother, who moved to 1144 01:03:39,560 --> 01:03:41,560 Speaker 1: the UK and might have heard it and actually have, 1145 01:03:41,800 --> 01:03:45,480 Speaker 1: like, a native understanding of it while remembering his American roots, 1146 01:03:45,480 --> 01:03:47,080 Speaker 1: so perhaps he can translate it for me. 1147 01:03:47,280 --> 01:03:49,520 Speaker 2: Oh, yeah, here's hoping. Keep me posted. 1148 01:03:50,160 --> 01:03:52,320 Speaker 1: Here's hoping. All right. Well, we had a lot of 1149 01:03:52,320 --> 01:03:54,440 Speaker 1: fun reading this book and talking to the author and 1150 01:03:54,480 --> 01:03:56,680 Speaker 1: talking to you about it, so I highly recommend the 1151 01:03:56,680 --> 01:03:59,640 Speaker 1: book Venomous Lumpsucker by Ned Beauman. Go ahead and 1152 01:03:59,720 --> 01:04:02,040 Speaker 1: get it, read it, enjoy it. Thanks very much, Kelly, 1153 01:04:02,080 --> 01:04:03,920 Speaker 1: for reading this with me and talking about it. 1154 01:04:04,000 --> 01:04:05,840 Speaker 2: Thanks for having me. You were right when you said 1155 01:04:05,880 --> 01:04:07,600 Speaker 2: you read that passage and it made you think of me. 1156 01:04:08,080 --> 01:04:10,440 Speaker 2: This was the perfect book for me. I enjoyed it 1157 01:04:10,480 --> 01:04:11,800 Speaker 2: so much. Thanks for the invite. 1158 01:04:11,920 --> 01:04:14,880 Speaker 1: All right, thanks everybody for listening, and tune in next time. 1159 01:04:20,040 --> 01:04:22,920 Speaker 1: For more science and curiosity, come find us on social 1160 01:04:22,960 --> 01:04:27,920 Speaker 1: media, where we answer questions and post videos. We're on Twitter, Discord, Insta, 1161 01:04:27,960 --> 01:04:31,200 Speaker 1: and now TikTok. And remember that Daniel and Jorge Explain 1162 01:04:31,280 --> 01:04:35,280 Speaker 1: the Universe is a production of iHeartRadio.
For more podcasts 1163 01:04:35,280 --> 01:04:39,920 Speaker 1: from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever 1164 01:04:40,000 --> 01:04:42,120 Speaker 1: you listen to your favorite shows.