1 00:00:08,600 --> 00:00:12,160 Speaker 1: Hey, Kelly, do you still remember your parents' phone number 2 00:00:12,200 --> 00:00:13,480 Speaker 1: from when you were growing up? 3 00:00:13,840 --> 00:00:14,360 Speaker 2: I do. 4 00:00:14,480 --> 00:00:15,840 Speaker 3: I doubt I'll ever forget it. 5 00:00:15,920 --> 00:00:18,560 Speaker 4: My fingers can still go through the muscle memory of 6 00:00:18,600 --> 00:00:20,000 Speaker 4: pushing it on a touch-tone phone. 7 00:00:20,239 --> 00:00:22,800 Speaker 1: And now do you know like all of your family 8 00:00:22,920 --> 00:00:23,520 Speaker 1: phone numbers? 9 00:00:23,640 --> 00:00:23,720 Speaker 5: No? 10 00:00:23,800 --> 00:00:24,400 Speaker 3: Absolutely not. 11 00:00:25,320 --> 00:00:28,160 Speaker 4: They're on my phone. There's no need to remember them. 12 00:00:28,320 --> 00:00:29,880 Speaker 4: They're on the brain in my pocket. 13 00:00:30,040 --> 00:00:31,640 Speaker 1: So do you think of your phone as like an 14 00:00:31,680 --> 00:00:33,240 Speaker 1: extension of your brain? 15 00:00:33,880 --> 00:00:35,000 Speaker 3: I'm not 16 00:00:34,880 --> 00:00:37,720 Speaker 4: sure I could be a fully functioning human being without 17 00:00:37,760 --> 00:00:38,480 Speaker 4: it anymore. 18 00:00:38,560 --> 00:00:41,879 Speaker 1: So maybe do you ever wonder, though, if like our 19 00:00:41,960 --> 00:00:45,000 Speaker 1: phones feel the same way? Do they feel like they're 20 00:00:45,080 --> 00:00:47,680 Speaker 1: part of our brains or that we are part of them? 21 00:00:47,840 --> 00:00:48,840 Speaker 3: Oh? 22 00:00:49,120 --> 00:00:52,800 Speaker 4: I hope not, because if my phone has feelings, I'm 23 00:00:52,840 --> 00:00:55,440 Speaker 4: sure it's grumpy, because I drop it all the time 24 00:00:55,520 --> 00:00:59,120 Speaker 4: and I have filled it with photos of disgusting bugs 25 00:00:59,440 --> 00:01:02,200 Speaker 4: and rolling poop.
So I hope my phone does not 26 00:01:02,320 --> 00:01:03,080 Speaker 4: have feelings. 27 00:01:03,320 --> 00:01:06,240 Speaker 1: We've all dunked our phone in places we'd never want 28 00:01:06,280 --> 00:01:24,640 Speaker 1: to mention. Hi, I'm Daniel. I'm a particle physicist and 29 00:01:24,680 --> 00:01:28,119 Speaker 1: a professor at UC Irvine, and I think my phone 30 00:01:28,319 --> 00:01:29,520 Speaker 1: is smarter than I am. 31 00:01:29,920 --> 00:01:31,279 Speaker 3: I'm Kelly Weinersmith. 32 00:01:31,319 --> 00:01:34,720 Speaker 4: I'm an adjunct assistant professor at Rice University, and I 33 00:01:34,800 --> 00:01:37,039 Speaker 4: am sure that my phone is smarter than 34 00:01:36,920 --> 00:01:40,440 Speaker 1: I am. Because your phone told you so. 35 00:01:40,840 --> 00:01:43,080 Speaker 3: That's right, and I believe everything my phone says. 36 00:01:44,120 --> 00:01:47,000 Speaker 1: Well, my phone is definitely better at some things than 37 00:01:47,040 --> 00:01:49,080 Speaker 1: I am, though I still hold out hope that there 38 00:01:49,080 --> 00:01:51,800 Speaker 1: are things that humans can do that phones can't do. 39 00:01:52,120 --> 00:01:55,160 Speaker 3: That gap is getting smaller and smaller every day, though, Daniel. 40 00:01:57,000 --> 00:01:58,680 Speaker 1: But maybe that's not the right way to look at it. 41 00:01:58,720 --> 00:02:02,000 Speaker 1: Maybe instead of competing with our phones about who's most intelligent, 42 00:02:02,040 --> 00:02:04,280 Speaker 1: we should just be thinking about us and our phones 43 00:02:04,320 --> 00:02:08,240 Speaker 1: together working in harmony to unlock the nature of the universe. 44 00:02:08,480 --> 00:02:11,400 Speaker 3: The ultimate symbiosis. 45 00:02:12,240 --> 00:02:15,040 Speaker 1: And welcome to the podcast
Daniel and Jorge Explain the 46 00:02:15,120 --> 00:02:18,080 Speaker 1: Universe, in which we try to unlock the nature of 47 00:02:18,160 --> 00:02:21,160 Speaker 1: the universe. We use all of the technology at our disposal, 48 00:02:21,240 --> 00:02:24,440 Speaker 1: all of the AI, and all of the biological intelligence 49 00:02:24,600 --> 00:02:27,480 Speaker 1: to try to unravel the mystery of this beautiful and 50 00:02:27,639 --> 00:02:30,800 Speaker 1: gorgeous cosmos, to boil it all down into a story 51 00:02:30,840 --> 00:02:33,919 Speaker 1: that makes sense, at least to our human brains, if 52 00:02:34,000 --> 00:02:36,960 Speaker 1: that's even possible. My friend and usual co-host Jorge 53 00:02:37,040 --> 00:02:39,040 Speaker 1: can't be here today, but I'm very pleased to be 54 00:02:39,200 --> 00:02:42,400 Speaker 1: joined by our regular guest host, Kelly. Kelly, thanks again 55 00:02:42,440 --> 00:02:43,280 Speaker 1: for coming on the pod. 56 00:02:43,600 --> 00:02:45,880 Speaker 4: Hello, thanks for having me on the pod. I especially 57 00:02:45,919 --> 00:02:47,720 Speaker 4: love when you're having me on the podcast to talk 58 00:02:47,720 --> 00:02:50,680 Speaker 4: about science fiction books. So I'm super excited to be 59 00:02:50,680 --> 00:02:51,160 Speaker 4: here today. 60 00:02:51,400 --> 00:02:53,680 Speaker 1: That's right. Usually we are analyzing the science of our 61 00:02:53,760 --> 00:02:57,400 Speaker 1: actual universe, wanting to understand how quantum mechanics weaves itself 62 00:02:57,440 --> 00:03:00,120 Speaker 1: together to make our reality, or what's going 63 00:03:00,200 --> 00:03:03,520 Speaker 1: on with the latest advances in astronomy and cosmology.
But 64 00:03:03,800 --> 00:03:07,080 Speaker 1: part of doing science is being creative, is thinking about 65 00:03:07,120 --> 00:03:10,400 Speaker 1: the ways the universe could work or might work, or 66 00:03:10,480 --> 00:03:13,840 Speaker 1: ways alternative universes could work. And so that's why I 67 00:03:13,960 --> 00:03:16,080 Speaker 1: like to read so much science fiction, and on the 68 00:03:16,080 --> 00:03:19,400 Speaker 1: podcast we have a series of episodes diving into the 69 00:03:19,400 --> 00:03:24,359 Speaker 1: physics of fictional universes in which we interview science fiction authors. 70 00:03:24,400 --> 00:03:26,560 Speaker 1: It's an excuse for me to get to read science 71 00:03:26,639 --> 00:03:29,440 Speaker 1: fiction and an opportunity for Kelly and me to fanboy 72 00:03:29,480 --> 00:03:36,040 Speaker 1: and fangirl out talking to the authors themselves. Yay. But 73 00:03:36,120 --> 00:03:39,440 Speaker 1: it's a fascinating process hearing about how somebody puts together 74 00:03:39,480 --> 00:03:42,720 Speaker 1: an entire fictional universe, how they build it up from 75 00:03:42,760 --> 00:03:46,040 Speaker 1: the rules, the consequences of living in that universe. What 76 00:03:46,120 --> 00:03:48,240 Speaker 1: is it like to be human if the rules are 77 00:03:48,240 --> 00:03:51,880 Speaker 1: fundamentally different, or if technology has progressed so far that 78 00:03:51,960 --> 00:03:54,280 Speaker 1: it changes the nature of being human. 79 00:03:54,480 --> 00:03:56,600 Speaker 4: You know, I've always had a lot of respect for 80 00:03:56,680 --> 00:03:58,680 Speaker 4: sci-fi authors, but through the course of doing these 81 00:03:58,720 --> 00:04:01,560 Speaker 4: interviews with you, I have so much more respect, 82 00:04:01,600 --> 00:04:04,760 Speaker 4: thinking about how much world building needs to happen before 83 00:04:04,800 --> 00:04:05,480 Speaker 4: a book comes out.
84 00:04:05,480 --> 00:04:07,280 Speaker 3: It's just it's so much more work than I would 85 00:04:07,320 --> 00:04:07,920 Speaker 3: have imagined. 86 00:04:08,120 --> 00:04:10,200 Speaker 1: That's right, because you have to be creative not just 87 00:04:10,320 --> 00:04:13,200 Speaker 1: about the science of your universe and the technology of it, 88 00:04:13,400 --> 00:04:15,680 Speaker 1: but really thinking deeply about the human side of it. 89 00:04:15,760 --> 00:04:19,080 Speaker 1: The best science fiction, of course, tells human stories. They're 90 00:04:19,120 --> 00:04:21,720 Speaker 1: about people and what it's like to be human in 91 00:04:21,760 --> 00:04:24,440 Speaker 1: that era, and you have to also bridge this gap. 92 00:04:24,480 --> 00:04:27,080 Speaker 1: You can't write stories about people who are so far 93 00:04:27,080 --> 00:04:30,640 Speaker 1: away from us emotionally, personally, that we can't identify with them, right? 94 00:04:30,680 --> 00:04:33,479 Speaker 1: You need to somehow create that universe and make it 95 00:04:33,520 --> 00:04:36,200 Speaker 1: adjacent enough to ours that we can connect with these 96 00:04:36,279 --> 00:04:38,960 Speaker 1: characters and care about them, even though their lives can 97 00:04:39,000 --> 00:04:40,720 Speaker 1: be so different from ours. 98 00:04:40,640 --> 00:04:42,719 Speaker 3: And this book did such a good job of that. 99 00:04:43,000 --> 00:04:44,960 Speaker 1: It really did. It also sort of puts us in 100 00:04:44,960 --> 00:04:47,839 Speaker 1: our place as humans and makes us feel like, oh boy, 101 00:04:47,880 --> 00:04:50,440 Speaker 1: we better get our stuff together. Yes. So on the 102 00:04:50,440 --> 00:04:58,719 Speaker 1: podcast today, we'll be talking about the science fiction universe 103 00:04:58,880 --> 00:05:02,440 Speaker 1: of Meru by S. B. Divya. Divya is an author 104 00:05:02,480 --> 00:05:05,440 Speaker 1: we've had on the podcast before. She is very acclaimed.
105 00:05:05,560 --> 00:05:09,479 Speaker 1: She's the Hugo- and Nebula-nominated author of Machinehood, which 106 00:05:09,520 --> 00:05:11,760 Speaker 1: we talked about about a year ago on the pod. 107 00:05:12,120 --> 00:05:15,599 Speaker 1: Her stories have appeared in numerous magazines and anthologies. She's 108 00:05:15,600 --> 00:05:18,640 Speaker 1: also a former editor of Escape Pod, which is a 109 00:05:18,640 --> 00:05:21,560 Speaker 1: weekly science fiction podcast which is a lot of fun. 110 00:05:21,720 --> 00:05:25,440 Speaker 1: She has degrees in computational neuroscience and signal processing and 111 00:05:25,520 --> 00:05:28,800 Speaker 1: has worked in the medical device industry, so she knows 112 00:05:28,839 --> 00:05:32,480 Speaker 1: what she's talking about when it comes to post-humanity, 113 00:05:32,520 --> 00:05:34,840 Speaker 1: as we'll hear all about in our interview with her. 114 00:05:34,960 --> 00:05:37,080 Speaker 1: And one of my favorite things in her bio is 115 00:05:37,120 --> 00:05:40,279 Speaker 1: that on her homepage she writes, quote, I am currently 116 00:05:40,360 --> 00:05:43,159 Speaker 1: mortal and full of squishy organs, but I hope to 117 00:05:43,279 --> 00:05:43,880 Speaker 1: outlive that. 118 00:05:47,200 --> 00:05:48,880 Speaker 3: Hmm, I wonder what the timeline is going to be 119 00:05:48,920 --> 00:05:49,120 Speaker 3: for that. 120 00:05:50,400 --> 00:05:52,359 Speaker 1: I think she's hoping that some of the stuff in 121 00:05:52,400 --> 00:05:55,080 Speaker 1: her book happens soon enough that she can move beyond 122 00:05:55,200 --> 00:05:57,840 Speaker 1: her earthly existence as a bag of squishy meat. 123 00:05:57,960 --> 00:06:00,279 Speaker 4: I'm happy as a bag of squishy meat, but I hope the whole 124 00:06:00,279 --> 00:06:01,520 Speaker 4: big thing works out for her. 125 00:06:03,760 --> 00:06:05,680 Speaker 1: So, Kelly and I both read this book.
It's called 126 00:06:05,760 --> 00:06:09,760 Speaker 1: Meru, and it's available for sale now. We encourage 127 00:06:09,800 --> 00:06:11,359 Speaker 1: you to get it. Kelly, what do you think this 128 00:06:11,400 --> 00:06:14,200 Speaker 1: book is about? How would you summarize the topic of 129 00:06:14,240 --> 00:06:15,400 Speaker 1: the book for our listeners? 130 00:06:15,680 --> 00:06:16,919 Speaker 4: Oh, I wish you had told me you were going 131 00:06:16,960 --> 00:06:19,240 Speaker 4: to ask me that ahead of time. There's lots of 132 00:06:19,320 --> 00:06:22,159 Speaker 4: moving pieces in this book. It's like complicated in a 133 00:06:22,160 --> 00:06:25,080 Speaker 4: great way. But so I guess in summary, humans have 134 00:06:25,200 --> 00:06:29,240 Speaker 4: sort of messed up, and another branch of humans has 135 00:06:29,400 --> 00:06:32,240 Speaker 4: evolved, and they've sort of been taking care of things 136 00:06:32,279 --> 00:06:34,240 Speaker 4: to make sure we don't mess up again, and this 137 00:06:34,279 --> 00:06:36,880 Speaker 4: is sort of the story about whether or not we 138 00:06:37,320 --> 00:06:39,680 Speaker 4: deserve to be released 139 00:06:39,200 --> 00:06:41,479 Speaker 3: back out into the universe. What do you think? 140 00:06:41,520 --> 00:06:44,400 Speaker 4: How would you describe what the book is about in 141 00:06:44,520 --> 00:06:45,720 Speaker 4: just a few sentences? 142 00:06:46,440 --> 00:06:49,080 Speaker 1: That was a great summary. Yeah, in my view, 143 00:06:49,200 --> 00:06:52,480 Speaker 1: it's like near-future science fiction, and so it's close 144 00:06:52,560 --> 00:06:55,279 Speaker 1: enough that we can imagine it happening.
And the major 145 00:06:55,400 --> 00:06:58,640 Speaker 1: movements that happened between now and this near future are 146 00:06:58,720 --> 00:07:01,640 Speaker 1: that there's a new race of humans, humans that have 147 00:07:01,720 --> 00:07:04,360 Speaker 1: sort of machines built into them. More than like your 148 00:07:04,400 --> 00:07:07,040 Speaker 1: phone in your pocket. These are like really integrated into 149 00:07:07,080 --> 00:07:09,960 Speaker 1: what it's like to be these beings, and they don't 150 00:07:10,000 --> 00:07:13,320 Speaker 1: call themselves humans. They call themselves alloys, in the way 151 00:07:13,320 --> 00:07:16,200 Speaker 1: that you can mix metals together to get a stronger metal. 152 00:07:16,520 --> 00:07:19,920 Speaker 1: Here she's mixing organic and machine parts together to make 153 00:07:19,960 --> 00:07:23,120 Speaker 1: an alloy. And you have all sorts of really fascinating mixtures, 154 00:07:23,160 --> 00:07:27,880 Speaker 1: including beings that can fly through space. They're basically living ships, 155 00:07:28,200 --> 00:07:31,040 Speaker 1: you know, humans that are ships that fly through space 156 00:07:31,080 --> 00:07:34,920 Speaker 1: that have other humans inside of them. It's really very creative. 157 00:07:35,160 --> 00:07:37,600 Speaker 4: It's such a cool idea, and I feel like that 158 00:07:37,760 --> 00:07:40,200 Speaker 4: is maybe the part of the book that kept me 159 00:07:40,320 --> 00:07:43,600 Speaker 4: up the most at night, thinking about, like, one, thinking about 160 00:07:43,600 --> 00:07:45,640 Speaker 4: the social dynamics, but then, two, thinking about, you know, 161 00:07:45,680 --> 00:07:47,800 Speaker 4: if you're traveling through space and it takes you like 162 00:07:47,880 --> 00:07:50,520 Speaker 4: months to get somewhere, what do you do on a 163 00:07:50,600 --> 00:07:51,760 Speaker 4: trip like that when it's just 164 00:07:51,800 --> 00:07:52,920 Speaker 3: you and... anyway. 165 00:07:53,080 --> 00:07:55,280 Speaker 4: I could talk about this all day long. It is 166 00:07:55,320 --> 00:07:57,920 Speaker 4: such a cool idea, and she does great things with it. 167 00:07:58,240 --> 00:08:00,200 Speaker 1: She does great things with it. She really thinks it 168 00:08:00,280 --> 00:08:02,240 Speaker 1: through, what it's like to be that and the emotional 169 00:08:02,280 --> 00:08:04,840 Speaker 1: relationship you have with this ship. I was just also 170 00:08:04,880 --> 00:08:07,120 Speaker 1: pleased to see so many new ideas. When I read 171 00:08:07,120 --> 00:08:09,400 Speaker 1: so much science fiction, I feel like the same five 172 00:08:09,480 --> 00:08:12,840 Speaker 1: ideas for getting from one place to another are recycled 173 00:08:12,840 --> 00:08:15,360 Speaker 1: over and over again. And so I just love seeing 174 00:08:15,400 --> 00:08:18,680 Speaker 1: something new, something you haven't seen before. It's really creative. 175 00:08:18,760 --> 00:08:21,760 Speaker 1: And I was also really impressed by the reality of 176 00:08:21,880 --> 00:08:23,840 Speaker 1: the experience, you know, the human side of it. A 177 00:08:23,880 --> 00:08:25,360 Speaker 1: lot of times when you find yourself in a new 178 00:08:25,400 --> 00:08:28,360 Speaker 1: world in science fiction, it's a bit cartoony, you know, 179 00:08:28,400 --> 00:08:31,560 Speaker 1: it's very simplified. But she has such a rich description 180 00:08:31,760 --> 00:08:34,680 Speaker 1: of like the politics, the arguments. You know, there's no 181 00:08:34,880 --> 00:08:38,280 Speaker 1: monolithic organizations here where like all of the alloys think 182 00:08:38,320 --> 00:08:40,480 Speaker 1: this and all the humans think that. You know, there's 183 00:08:40,559 --> 00:08:44,400 Speaker 1: currents and there's factions and there's disagreements among every group 184 00:08:44,480 --> 00:08:46,520 Speaker 1: in a way that I think is very human and realistic.
185 00:08:46,559 --> 00:08:48,480 Speaker 1: I mean, these days, nobody can seem to agree about 186 00:08:48,480 --> 00:08:49,319 Speaker 1: anything, that's right. 187 00:08:49,360 --> 00:08:51,960 Speaker 4: That's right, and she built out this huge history to 188 00:08:52,000 --> 00:08:54,920 Speaker 4: sort of support the story, and you get glimpses of 189 00:08:54,960 --> 00:08:56,640 Speaker 4: it every once in a while, but this is like, 190 00:08:57,040 --> 00:08:59,640 Speaker 4: this is a complete world that she has built and 191 00:08:59,720 --> 00:09:02,360 Speaker 4: you immerse yourself in it, and, yeah, I agree, 192 00:09:02,400 --> 00:09:03,800 Speaker 4: the politics don't feel corny. 193 00:09:04,160 --> 00:09:07,880 Speaker 3: The history makes perfect sense and it's awesome. 194 00:09:08,040 --> 00:09:10,480 Speaker 1: Yeah, and I love that she has like massive failed 195 00:09:10,520 --> 00:09:12,800 Speaker 1: projects in the book. There's a time in the book 196 00:09:12,800 --> 00:09:15,640 Speaker 1: where somebody builds like a mega habitat out in space, 197 00:09:15,840 --> 00:09:18,040 Speaker 1: and then everybody's like, nah, I don't really want to 198 00:09:18,080 --> 00:09:21,319 Speaker 1: move there. Eh, it's like, wait, trillions of dollars? That 199 00:09:21,440 --> 00:09:23,760 Speaker 1: seems like something that's likely to happen, 200 00:09:23,800 --> 00:09:23,960 Speaker 6: you know. 201 00:09:24,000 --> 00:09:26,600 Speaker 1: That's basically what Mark Zuckerberg is doing right now with 202 00:09:26,640 --> 00:09:27,240 Speaker 1: the metaverse. 203 00:09:27,440 --> 00:09:27,600 Speaker 7: Yeah. 204 00:09:27,679 --> 00:09:29,040 Speaker 3: Yeah, no, very very realistic.
205 00:09:29,080 --> 00:09:31,240 Speaker 4: I imagine when we start heading out into space, there 206 00:09:31,280 --> 00:09:33,240 Speaker 4: are going to be things that people try that work 207 00:09:33,240 --> 00:09:35,280 Speaker 4: and things that people try that don't work, and it'll 208 00:09:35,320 --> 00:09:37,360 Speaker 4: be, you know, interesting to see how things pan out. 209 00:09:37,520 --> 00:09:40,440 Speaker 1: Yeah, there'll be all sorts of fascinating dead ends and 210 00:09:40,640 --> 00:09:43,640 Speaker 1: the equivalent of like abandoned apartment buildings and all sorts 211 00:09:43,679 --> 00:09:45,280 Speaker 1: of stuff in a way that I think will be 212 00:09:45,320 --> 00:09:48,480 Speaker 1: totally unpredictable, right? And so there are people who make lots of money 213 00:09:48,480 --> 00:09:50,680 Speaker 1: and people who don't. And so I loved seeing that 214 00:09:50,720 --> 00:09:52,640 Speaker 1: in the book. It makes it feel very much like 215 00:09:52,679 --> 00:09:55,480 Speaker 1: you're actually visiting another universe. It does. 216 00:09:55,559 --> 00:09:58,040 Speaker 3: Yeah, she's got multiple different things that failed, from like 217 00:09:58,080 --> 00:10:01,360 Speaker 3: biology experiments to engineering experiments, and then you just 218 00:10:01,400 --> 00:10:03,960 Speaker 3: sort of see how you've learned from those failures and 219 00:10:04,080 --> 00:10:04,920 Speaker 3: you move forward. 220 00:10:05,120 --> 00:10:06,760 Speaker 4: And you know, one of the ways that they moved 221 00:10:06,840 --> 00:10:10,119 Speaker 4: forward and learned from a failure was humans tried terraforming 222 00:10:10,160 --> 00:10:15,359 Speaker 4: Mars and just totally blew it, which is super interesting 223 00:10:15,400 --> 00:10:18,200 Speaker 4: to think about from the perspective of, you know, current 224 00:10:18,200 --> 00:10:20,360 Speaker 4: things that are going on right now.
But anyway, in 225 00:10:20,400 --> 00:10:23,880 Speaker 4: the book, humans totally blew their chance and destroyed things. 226 00:10:23,880 --> 00:10:26,960 Speaker 4: And now the alloys are taking care of the humans, 227 00:10:27,360 --> 00:10:30,000 Speaker 4: but humans aren't really leaving Earth that much because they've 228 00:10:30,000 --> 00:10:31,080 Speaker 4: sort of been contained. 229 00:10:31,360 --> 00:10:33,479 Speaker 3: And this is the story of a human 230 00:10:33,280 --> 00:10:36,600 Speaker 4: who actually gets the option, or the opportunity, to travel 231 00:10:36,679 --> 00:10:39,240 Speaker 4: out and show that humans can do the right thing. 232 00:10:39,400 --> 00:10:42,520 Speaker 4: We've learned our lesson. But, you know, they're also a pawn 233 00:10:42,679 --> 00:10:45,320 Speaker 4: between all these different powerful forces. They've got these different 234 00:10:45,320 --> 00:10:47,480 Speaker 4: factions that are fighting, that think humans should have the 235 00:10:47,640 --> 00:10:51,040 Speaker 4: chance, and others that think that they shouldn't, and then 236 00:10:51,040 --> 00:10:52,400 Speaker 4: you'll have to just see what happens. 237 00:10:53,200 --> 00:10:55,559 Speaker 1: That's right, no pressure. You're only standing in for all 238 00:10:55,600 --> 00:10:58,920 Speaker 1: of humanity, right, right, right. So let's talk about the 239 00:10:58,960 --> 00:11:00,600 Speaker 1: science of the book a little bit. She has 240 00:11:00,600 --> 00:11:04,120 Speaker 1: some really fascinating innovations here. She has these really deep 241 00:11:04,240 --> 00:11:07,840 Speaker 1: human-computer alloys. This is more than just like I've 242 00:11:07,840 --> 00:11:10,280 Speaker 1: got something wired to my brain where I can control 243 00:11:10,320 --> 00:11:14,200 Speaker 1: a machine. These are people where they have organic and 244 00:11:14,480 --> 00:11:18,680 Speaker 1: metallic, machine-based elements as part of their biology.
She talks 245 00:11:18,679 --> 00:11:21,120 Speaker 1: about like programming these things, having these things really be 246 00:11:21,160 --> 00:11:24,840 Speaker 1: an outgrowth of their DNA. You're the biologist here. Do 247 00:11:24,920 --> 00:11:27,600 Speaker 1: you think that that's something we could have, you know, 248 00:11:27,760 --> 00:11:29,959 Speaker 1: in fifty years, one hundred years, five hundred years, sort 249 00:11:29,960 --> 00:11:32,439 Speaker 1: of ever? Or is that implausible? 250 00:11:32,720 --> 00:11:34,600 Speaker 3: You know, I just don't know. 251 00:11:34,679 --> 00:11:37,720 Speaker 4: So, like, as you might remember from Soonish, I really 252 00:11:37,760 --> 00:11:40,280 Speaker 4: hate putting dates on things, but, you know, I will 253 00:11:40,280 --> 00:11:43,040 Speaker 4: say that it does seem like, on the one hand, 254 00:11:43,120 --> 00:11:46,240 Speaker 4: biology is super complicated and traits are controlled by many, 255 00:11:46,280 --> 00:11:48,920 Speaker 4: many genes, and so if you're going to have part 256 00:11:49,000 --> 00:11:52,800 Speaker 4: of your body growing as, like, you know, a computer part 257 00:11:52,800 --> 00:11:54,439 Speaker 4: that you're going to be able to control, that sounds 258 00:11:54,520 --> 00:11:57,320 Speaker 4: very complicated. But, you know, on the other hand, CRISPR 259 00:11:57,520 --> 00:12:00,480 Speaker 4: is an amazing tool that we sort of, surprise, came 260 00:12:00,520 --> 00:12:01,680 Speaker 4: across in the recent past. 261 00:12:01,720 --> 00:12:03,439 Speaker 3: So maybe we'll stumble 262 00:12:03,160 --> 00:12:07,280 Speaker 4: upon more surprises that shorten the timeline on these sorts 263 00:12:07,280 --> 00:12:10,280 Speaker 4: of things.
But I guess, if I had to guess, 264 00:12:10,360 --> 00:12:12,760 Speaker 4: if I'm going to have, like, a leaf blower arm, 265 00:12:15,320 --> 00:12:17,520 Speaker 4: I don't think my grandkids are going to have this 266 00:12:17,640 --> 00:12:19,559 Speaker 4: sort of technology, 267 00:12:19,640 --> 00:12:21,080 Speaker 3: let's just say. But I could be wrong. 268 00:12:21,360 --> 00:12:24,320 Speaker 1: Is that your top choice? If you have some modification, 269 00:12:24,720 --> 00:12:26,960 Speaker 1: you're like, leaf blower arm? I could use it to, 270 00:12:26,960 --> 00:12:29,480 Speaker 1: what, brush my teeth? To clean stuff off? Do the 271 00:12:29,520 --> 00:12:31,520 Speaker 1: dishes, just like sort of blow all the dishes into 272 00:12:31,559 --> 00:12:31,960 Speaker 1: the sink? 273 00:12:32,920 --> 00:12:35,520 Speaker 4: I'm in my garage recording right now, and I just 274 00:12:35,559 --> 00:12:38,320 Speaker 4: looked at the leaf blower, and so that would not 275 00:12:38,400 --> 00:12:39,360 Speaker 4: be my first choice. 276 00:12:39,480 --> 00:12:41,439 Speaker 1: It reminds me of all those internet videos of people 277 00:12:41,480 --> 00:12:43,920 Speaker 1: blowing leaf blowers into their mouths. I don't know why 278 00:12:43,960 --> 00:12:46,680 Speaker 1: that's so fun for people. Anyway, one day, maybe we'll 279 00:12:46,679 --> 00:12:49,559 Speaker 1: have people with leaf blower arms and we'll see weird 280 00:12:49,559 --> 00:12:52,600 Speaker 1: movies about it. Another really fascinating aspect of the science 281 00:12:52,600 --> 00:12:55,600 Speaker 1: of this book is how she gets from star to star.
282 00:12:56,040 --> 00:12:58,720 Speaker 1: So in lots of science fiction novels, there are warp 283 00:12:58,840 --> 00:13:02,439 Speaker 1: drives or there's wormholes, there's sort of these standard solutions 284 00:13:02,440 --> 00:13:04,440 Speaker 1: to getting from star to star without having it take 285 00:13:04,679 --> 00:13:07,199 Speaker 1: thousands of years. But in this book, she has something 286 00:13:07,240 --> 00:13:11,199 Speaker 1: totally new. She has invented like another layer of physics 287 00:13:11,559 --> 00:13:14,840 Speaker 1: beneath what we know. So she's taken our universe. She 288 00:13:14,920 --> 00:13:17,640 Speaker 1: says below that there's something else. There's this thing called 289 00:13:17,640 --> 00:13:20,880 Speaker 1: a famity field, and there's particles in that field, which 290 00:13:20,920 --> 00:13:24,199 Speaker 1: we would call famatons, and we can use that field 291 00:13:24,320 --> 00:13:26,640 Speaker 1: as an energy source and as a way to manipulate 292 00:13:26,720 --> 00:13:30,240 Speaker 1: space time itself to get between stars. I thought that 293 00:13:30,320 --> 00:13:31,000 Speaker 1: was really clever. 294 00:13:31,240 --> 00:13:34,199 Speaker 4: Well, okay, now it's my turn to ask you, as a physicist, 295 00:13:34,480 --> 00:13:35,559 Speaker 4: is this plausible? 296 00:13:35,760 --> 00:13:38,400 Speaker 1: So in general, like, is it plausible that there's something 297 00:13:38,520 --> 00:13:41,960 Speaker 1: beneath space time, a completely different way to think about 298 00:13:41,960 --> 00:13:45,640 Speaker 1: the universe? Absolutely. Like, the message that we send on 299 00:13:45,679 --> 00:13:47,800 Speaker 1: this podcast all the time is that we really just 300 00:13:47,840 --> 00:13:50,920 Speaker 1: don't understand the way the universe works. Like, we don't 301 00:13:50,960 --> 00:13:53,560 Speaker 1: know what space time is.
Is it actually built up 302 00:13:53,600 --> 00:13:56,080 Speaker 1: from like weird little quantum dots that are woven together 303 00:13:56,360 --> 00:13:59,160 Speaker 1: using theories and forces that we don't understand? It certainly 304 00:13:59,200 --> 00:14:02,040 Speaker 1: could be. We have no idea. We have this theory 305 00:14:02,040 --> 00:14:04,480 Speaker 1: of general relativity that tells us about space time, but 306 00:14:04,679 --> 00:14:07,800 Speaker 1: we suspect it's probably wrong because it's not quantum mechanical. 307 00:14:07,880 --> 00:14:10,720 Speaker 1: So we're always looking for that next layer of knowledge 308 00:14:11,040 --> 00:14:13,560 Speaker 1: about what's going on in the universe. And so it's 309 00:14:13,559 --> 00:14:16,480 Speaker 1: certainly possible that there's something going on deep down there 310 00:14:16,520 --> 00:14:19,240 Speaker 1: which could allow us, if we understood it, to control 311 00:14:19,280 --> 00:14:22,240 Speaker 1: space time in just this way. I mean, her description 312 00:14:22,320 --> 00:14:24,800 Speaker 1: of it is a little bit more like spiritual and 313 00:14:24,840 --> 00:14:29,000 Speaker 1: fantastical than scientific. You know, she has these ships and 314 00:14:29,040 --> 00:14:31,520 Speaker 1: they sort of will themselves. They interact with this famity 315 00:14:31,560 --> 00:14:34,960 Speaker 1: field through their minds, and they will themselves from star 316 00:14:35,040 --> 00:14:38,000 Speaker 1: to star. But could there be famity fields and famatons 317 00:14:38,040 --> 00:14:40,600 Speaker 1: in a way that lets us travel? Yes, absolutely, 318 00:14:40,640 --> 00:14:41,960 Speaker 1: that's totally possible. 319 00:14:42,320 --> 00:14:46,120 Speaker 4: So, like, when my grandkids have leaf blower arms, maybe 320 00:14:46,120 --> 00:14:48,760 Speaker 4: they'll be traveling to distant stars.
321 00:14:49,800 --> 00:14:51,680 Speaker 1: That's right, and then when they get there they can 322 00:14:51,760 --> 00:14:54,160 Speaker 1: use their leaf blower arms to like hover over the surface. 323 00:14:54,480 --> 00:14:59,080 Speaker 3: Right. So maybe it wasn't such a bad idea. It 324 00:14:59,240 --> 00:15:01,200 Speaker 3: seems like a very versatile attachment. 325 00:15:01,320 --> 00:15:03,840 Speaker 1: Yeah, I keep thinking of new applications for it all 326 00:15:03,920 --> 00:15:04,360 Speaker 1: the time. 327 00:15:04,920 --> 00:15:07,560 Speaker 3: Well, you know, maybe we should see if we can 328 00:15:07,600 --> 00:15:10,720 Speaker 3: get a leaf blowing company to support the show. But 329 00:15:10,800 --> 00:15:14,720 Speaker 3: here's a commercial from someone who supports the show. 330 00:15:26,280 --> 00:15:28,080 Speaker 1: All right, we're back, and we're talking about the science 331 00:15:28,080 --> 00:15:32,440 Speaker 1: fiction universe of Meru, a novel by S. B. Divya about 332 00:15:32,720 --> 00:15:37,240 Speaker 1: post-humanity, where humans have become melded with machines and 333 00:15:37,360 --> 00:15:40,280 Speaker 1: formed a new race of humans, humans that are more 334 00:15:40,320 --> 00:15:43,720 Speaker 1: responsible than we are and allowed to travel through space 335 00:15:43,800 --> 00:15:45,880 Speaker 1: and do all sorts of things. And so, as we 336 00:15:45,960 --> 00:15:47,960 Speaker 1: usually do on the podcast, we were lucky enough to 337 00:15:48,120 --> 00:15:51,520 Speaker 1: speak to the author of this book to hear about her process. 338 00:15:51,960 --> 00:15:55,520 Speaker 1: So here's our interview with S. B. Divya. So it's my 339 00:15:55,720 --> 00:15:59,160 Speaker 1: pleasure to welcome back to the podcast S. B. Divya, the 340 00:15:59,240 --> 00:16:02,000 Speaker 1: author of this wonderful book, Meru, that we've just been 341 00:16:02,040 --> 00:16:04,560 Speaker 1: talking about.
Thanks very much for joining us again. 342 00:16:04,800 --> 00:16:07,320 Speaker 7: Thank you so much for having me. I'm delighted to 343 00:16:07,320 --> 00:16:09,120 Speaker 7: be here and talking about this book. 344 00:16:09,400 --> 00:16:11,640 Speaker 1: So you're the first author we've actually had back on 345 00:16:11,680 --> 00:16:14,720 Speaker 1: the podcast for a second round. And usually we ask 346 00:16:14,880 --> 00:16:17,120 Speaker 1: a series of silly questions to get people warmed up, 347 00:16:17,160 --> 00:16:19,560 Speaker 1: but we've already asked you about your thoughts about Star 348 00:16:19,600 --> 00:16:22,960 Speaker 1: Trek transporters. So I have a different question for you 349 00:16:23,200 --> 00:16:27,160 Speaker 1: to get us oriented in your space of philosophy of science, 350 00:16:27,200 --> 00:16:32,480 Speaker 1: which is, in your view, is Star Wars fantasy or 351 00:16:32,520 --> 00:16:33,200 Speaker 1: science fiction? 352 00:16:33,560 --> 00:16:35,920 Speaker 2: Oh, that's a good one. 353 00:16:36,080 --> 00:16:38,560 Speaker 7: I'm going to do a "why not both" and say 354 00:16:38,840 --> 00:16:43,600 Speaker 7: there is a subgenre called science fantasy, and Star Wars 355 00:16:43,680 --> 00:16:46,000 Speaker 7: falls squarely into that for me. 356 00:16:46,320 --> 00:16:48,320 Speaker 1: Is that because there are elements of it that are 357 00:16:48,360 --> 00:16:51,040 Speaker 1: sort of like hardcore science and also elements that are 358 00:16:51,080 --> 00:16:53,520 Speaker 1: sort of like left unexplained, a little bit magical? 359 00:16:53,760 --> 00:16:54,560 Speaker 2: Yeah, exactly. 360 00:16:54,680 --> 00:16:58,240 Speaker 7: I mean, notwithstanding the midi-chlorians, the rest of the Force 361 00:16:58,360 --> 00:17:02,240 Speaker 7: is very mystical, right.
I thought the original stood a 362 00:17:02,240 --> 00:17:05,400 Speaker 7: little better that way than trying to explain it with science, 363 00:17:06,240 --> 00:17:10,399 Speaker 7: especially as a retcon. So, yeah, science fantasy is, you know, 364 00:17:10,520 --> 00:17:13,880 Speaker 7: things like dragons in space. There are books out there 365 00:17:14,080 --> 00:17:19,520 Speaker 7: that involve this very topic, things like Binti by Nnedi 366 00:17:19,560 --> 00:17:23,600 Speaker 7: Okorafor, you know, that involves space travel, astrolabes, but also 367 00:17:23,960 --> 00:17:28,600 Speaker 7: certain forms of indigenous magic. So it's a very "why 368 00:17:28,640 --> 00:17:30,680 Speaker 7: not both" genre, and I feel like that's one 369 00:17:30,680 --> 00:17:33,440 Speaker 7: of the reasons so many people love Star Wars, right, 370 00:17:33,520 --> 00:17:35,600 Speaker 7: is because you have all the cool tech, but then 371 00:17:36,040 --> 00:17:38,639 Speaker 7: you have the mystical Force, and you have magic as well. 372 00:17:39,400 --> 00:17:40,959 Speaker 1: I must be one of the very few people who 373 00:17:41,040 --> 00:17:43,320 Speaker 1: thought it was cool to try to explain the Force 374 00:17:43,359 --> 00:17:45,840 Speaker 1: in terms of science, because you know, that's like kind 375 00:17:45,840 --> 00:17:48,119 Speaker 1: of who I am. I'm like, how does that work? 376 00:17:48,280 --> 00:17:51,760 Speaker 1: You know, what is the microscopic process that makes that happen? 377 00:17:51,840 --> 00:17:53,960 Speaker 1: So I was kind of into that. I was like, oh, cool, 378 00:17:54,480 --> 00:17:56,800 Speaker 1: And then I discovered it was very uncool to like that. 379 00:17:57,119 --> 00:17:59,879 Speaker 7: I think if it had been that way from the start, 380 00:18:00,480 --> 00:18:03,000 Speaker 7: I would have been on board with it.
But trying to, 381 00:18:03,119 --> 00:18:05,960 Speaker 7: like I said, trying to shoehorn it in after the 382 00:18:06,000 --> 00:18:08,560 Speaker 7: fact just felt kind of forced to me. 383 00:18:08,920 --> 00:18:10,719 Speaker 1: All right, well, we'll try to get our science in 384 00:18:11,359 --> 00:18:13,400 Speaker 1: draft one every time as well. 385 00:18:13,400 --> 00:18:14,760 Speaker 7: Yes, for sure. 386 00:18:14,840 --> 00:18:17,200 Speaker 1: All right, so let's talk about your book, which Kelly 387 00:18:17,200 --> 00:18:19,399 Speaker 1: and I both read and both really enjoyed, and the 388 00:18:19,400 --> 00:18:21,800 Speaker 1: focus of the book is on this sort of post 389 00:18:21,920 --> 00:18:24,959 Speaker 1: human or transhuman, I'm not sure what the right word is, 390 00:18:25,160 --> 00:18:27,720 Speaker 1: experience far in the future, when it's more than just 391 00:18:27,800 --> 00:18:31,560 Speaker 1: like biological humans, as humans have all sorts of modifications 392 00:18:31,560 --> 00:18:34,119 Speaker 1: that are really quite different from the humans that we 393 00:18:34,280 --> 00:18:37,159 Speaker 1: know, that you call alloys in your book. Tell us 394 00:18:37,160 --> 00:18:39,640 Speaker 1: about your inspiration. What made you write this book? Where 395 00:18:39,680 --> 00:18:42,919 Speaker 1: did the idea come from? What's exciting to you about 396 00:18:43,080 --> 00:18:43,960 Speaker 1: post humanity? 397 00:18:44,280 --> 00:18:47,600 Speaker 7: Yeah, so, first off, I will say I prefer post 398 00:18:47,680 --> 00:18:53,359 Speaker 7: human to transhuman only because transhumanism is a thing now, 399 00:18:53,440 --> 00:18:59,560 Speaker 7: and it's a particular philosophy about improving humankind.
And this book 400 00:18:59,760 --> 00:19:05,439 Speaker 7: is very intentionally interrogating that very idea, right, which is 401 00:19:06,359 --> 00:19:10,040 Speaker 7: can and should we even talk about improvement when it 402 00:19:10,080 --> 00:19:16,119 Speaker 7: comes to biology, humanity, life in general. And so the 403 00:19:16,200 --> 00:19:20,200 Speaker 7: idea for this book, especially for the alloys, was that 404 00:19:20,520 --> 00:19:23,160 Speaker 7: I did want to carry some of the themes of 405 00:19:23,720 --> 00:19:28,800 Speaker 7: genetic engineering and cybernetic enhancements from my first novel, Machinehood, 406 00:19:28,800 --> 00:19:30,600 Speaker 7: but much, much further. 407 00:19:30,359 --> 00:19:31,080 Speaker 2: Into the future. 408 00:19:31,200 --> 00:19:34,159 Speaker 7: You know, we talk a lot today in the media 409 00:19:34,200 --> 00:19:38,439 Speaker 7: about designer babies and the ethics of genetics, and I thought, okay, well, 410 00:19:38,480 --> 00:19:43,000 Speaker 7: let's assume that genetic editing is happening, right, the genie's 411 00:19:43,000 --> 00:19:45,560 Speaker 7: out of the bottle, and let's say that one day, 412 00:19:45,920 --> 00:19:51,360 Speaker 7: just like we write software code, we can build DNA 413 00:19:51,560 --> 00:19:54,560 Speaker 7: up from scratch. And that's not as hypothetical as it sounds. 414 00:19:54,600 --> 00:19:58,480 Speaker 7: We've already done that with yeast. So there are yeast 415 00:19:58,560 --> 00:20:01,160 Speaker 7: Legos that people in the lab play with today, and 416 00:20:01,200 --> 00:20:03,920 Speaker 7: so you know, a thousand years from now, I don't 417 00:20:03,920 --> 00:20:05,560 Speaker 7: think it's much of a stretch to say that we'll 418 00:20:05,560 --> 00:20:09,000 Speaker 7: have the technology to do that with our chromosomes.
And 419 00:20:09,080 --> 00:20:11,320 Speaker 7: at that point, I think, you know, it really begs 420 00:20:11,320 --> 00:20:14,879 Speaker 7: the question of what kind of DNA is allowed, what 421 00:20:15,080 --> 00:20:19,120 Speaker 7: kind of beings might exist, you know, and why? And 422 00:20:19,200 --> 00:20:22,000 Speaker 7: so this book explores a lot of those questions. 423 00:20:22,119 --> 00:20:25,879 Speaker 4: I'm really interested in the designer babies angle, and so 424 00:20:25,920 --> 00:20:27,719 Speaker 4: I thought it was really interesting that, like, right at 425 00:20:27,720 --> 00:20:30,000 Speaker 4: the beginning of the book, sort of, you start with 426 00:20:30,520 --> 00:20:33,600 Speaker 4: an interesting ethical question here. And so in my mind, 427 00:20:33,640 --> 00:20:36,920 Speaker 4: people are more comfortable with thinking about designer babies when 428 00:20:36,960 --> 00:20:39,640 Speaker 4: you are tinkering to make an improvement to make their 429 00:20:39,680 --> 00:20:44,320 Speaker 4: lives better. But in the book there's some tinkering that 430 00:20:44,960 --> 00:20:46,920 Speaker 4: is negative, but maybe. 431 00:20:46,640 --> 00:20:49,080 Speaker 3: It'll end up being better at some point. Like what 432 00:20:49,119 --> 00:20:51,440 Speaker 3: do you think the future looks like in terms of 433 00:20:51,880 --> 00:20:54,359 Speaker 3: our ethical sort of roads? Do you think at one 434 00:20:54,400 --> 00:20:58,600 Speaker 3: point we'll feel comfortable making these tinkerings that might not 435 00:20:58,800 --> 00:20:59,680 Speaker 3: make someone better? 436 00:20:59,840 --> 00:21:02,680 Speaker 7: I think we're already making tinkerings that have the potential 437 00:21:02,720 --> 00:21:05,879 Speaker 7: to not make someone better, or, perhaps in a more 438 00:21:05,960 --> 00:21:11,159 Speaker 7: nuanced way, better here, worse there, right?
Again, it can 439 00:21:11,200 --> 00:21:15,119 Speaker 7: be a combination of things, and as we know, gene 440 00:21:15,119 --> 00:21:19,359 Speaker 7: expression and how it translates to your health, your behavior, 441 00:21:19,720 --> 00:21:23,520 Speaker 7: your biochemistry can be pretty complicated. You know, certain things 442 00:21:23,840 --> 00:21:25,520 Speaker 7: rely on one or two genes, but a lot of 443 00:21:25,520 --> 00:21:28,360 Speaker 7: other things rely on many, and the combination of how 444 00:21:28,400 --> 00:21:29,840 Speaker 7: those different genes are. 445 00:21:29,960 --> 00:21:31,200 Speaker 2: Expressed in your body. 446 00:21:31,840 --> 00:21:34,480 Speaker 7: And so I don't think we have the tools right 447 00:21:34,520 --> 00:21:37,520 Speaker 7: now to model that in detail enough to know what 448 00:21:37,600 --> 00:21:40,760 Speaker 7: we're doing. But assuming at some point we do, I 449 00:21:40,800 --> 00:21:44,400 Speaker 7: think the risk of doing it in an unintentional way, 450 00:21:44,480 --> 00:21:48,320 Speaker 7: like without very specific rules and regulations is that, yes, 451 00:21:48,440 --> 00:21:52,919 Speaker 7: this idea of making someone's life better can land us 452 00:21:52,960 --> 00:21:56,640 Speaker 7: on a very steep, slippery slope to eugenics. So upfront 453 00:21:56,640 --> 00:21:59,240 Speaker 7: I wanted to establish that in this world they've. 454 00:21:59,040 --> 00:22:00,040 Speaker 2: Gone through that. 
455 00:22:00,119 --> 00:22:03,399 Speaker 7: Actually, you know, in the history of Meru there's the 456 00:22:03,480 --> 00:22:07,959 Speaker 7: directed mutation catastrophe, which is when things went bad for 457 00:22:08,119 --> 00:22:11,040 Speaker 7: life because we tinkered a little too much and we 458 00:22:11,080 --> 00:22:15,000 Speaker 7: oversimplified, actually, to where genetic diversity was reduced and that 459 00:22:15,119 --> 00:22:19,280 Speaker 7: ended up working against us in terms of natural selection factors. So 460 00:22:20,000 --> 00:22:22,760 Speaker 7: they come to the realization that (a) you have to 461 00:22:23,520 --> 00:22:27,120 Speaker 7: allow for chance mutations in order to just have innovation 462 00:22:27,240 --> 00:22:33,160 Speaker 7: in general, and (b) you have to allow for various 463 00:22:33,160 --> 00:22:36,240 Speaker 7: types of diseases and disabilities to exist in the population 464 00:22:36,560 --> 00:22:39,919 Speaker 7: as well. You do your best to accommodate them and 465 00:22:39,920 --> 00:22:42,560 Speaker 7: make their lives good, and you give them the option 466 00:22:42,600 --> 00:22:46,320 Speaker 7: of treatment if it really can't be good. But you 467 00:22:46,440 --> 00:22:49,040 Speaker 7: don't eliminate it from the gene pool, because you never 468 00:22:49,160 --> 00:22:54,280 Speaker 7: know when the environment might demand that those genes exist, 469 00:22:54,560 --> 00:22:56,600 Speaker 7: and we might find it useful to have people who 470 00:22:56,640 --> 00:22:58,760 Speaker 7: have expressed those so that we understand how they work. 471 00:22:59,040 --> 00:23:02,360 Speaker 7: And I think that applies today actually just as well 472 00:23:02,400 --> 00:23:04,359 Speaker 7: as a thousand years from now. It's just that we 473 00:23:04,400 --> 00:23:05,719 Speaker 7: don't necessarily have the tools.
474 00:23:05,800 --> 00:23:07,880 Speaker 1: So that's a very good argument from a practical point 475 00:23:07,880 --> 00:23:10,040 Speaker 1: of view, like we can't predict, we can't model, we 476 00:23:10,040 --> 00:23:12,919 Speaker 1: don't know the impact of any decision, and also we 477 00:23:12,920 --> 00:23:15,680 Speaker 1: don't know what the future will need from us as 478 00:23:15,720 --> 00:23:18,720 Speaker 1: a gene pool in terms of diversity to survive. But 479 00:23:18,800 --> 00:23:20,520 Speaker 1: what about the moral side of it? Can we drill 480 00:23:20,560 --> 00:23:23,600 Speaker 1: down on that? Are you suggesting that people don't have 481 00:23:23,760 --> 00:23:25,639 Speaker 1: the right to do this kind of thing or that 482 00:23:25,680 --> 00:23:29,280 Speaker 1: it's a bad idea from a policy standpoint? Can we 483 00:23:29,359 --> 00:23:32,080 Speaker 1: make that decision for everybody? I mean, are you suggesting we should, 484 00:23:32,080 --> 00:23:35,600 Speaker 1: like, not allow anybody to do this for their own babies? 485 00:23:35,880 --> 00:23:40,640 Speaker 7: I think we have to sit down very carefully 486 00:23:40,880 --> 00:23:44,080 Speaker 7: and consider what rules and regulations we're going to build 487 00:23:44,080 --> 00:23:47,400 Speaker 7: around this, and it's going to be, you know, as complicated, 488 00:23:47,720 --> 00:23:52,960 Speaker 7: if not more complicated, than something like the FDA regulating drugs. Right? 489 00:23:53,440 --> 00:23:55,359 Speaker 7: There's a lot of parallels. 490 00:23:56,840 --> 00:23:59,600 Speaker 2: There's a difference, I would say, between.
491 00:24:00,640 --> 00:24:05,960 Speaker 7: Gene therapy, which is something you change in gene expression 492 00:24:06,960 --> 00:24:11,480 Speaker 7: or certain types of cells in the body, but that 493 00:24:11,600 --> 00:24:16,240 Speaker 7: are not necessarily zygotic, that aren't going to be hereditary, right, 494 00:24:16,280 --> 00:24:19,040 Speaker 7: So you're not necessarily changing the genome, you're not changing 495 00:24:19,800 --> 00:24:24,320 Speaker 7: the eggs or the sperm, and so I think there's 496 00:24:24,480 --> 00:24:27,720 Speaker 7: an important distinction to be made there as well. So parents, 497 00:24:28,359 --> 00:24:31,119 Speaker 7: or in the case of Meru, a person when they 498 00:24:31,160 --> 00:24:34,560 Speaker 7: become a consenting adult, can choose to have gene therapy 499 00:24:35,480 --> 00:24:40,800 Speaker 7: to correct for certain conditions that they're not happy living with, right, 500 00:24:41,119 --> 00:24:43,560 Speaker 7: And I think that absolutely everyone should have that choice. 501 00:24:44,040 --> 00:24:46,600 Speaker 7: Whether parents should have that choice for their babies is 502 00:24:46,640 --> 00:24:49,879 Speaker 7: another ethical argument, you know, moral or ethical argument that 503 00:24:49,920 --> 00:24:53,240 Speaker 7: we're having already today. Right? The deaf community is having it, 504 00:24:53,480 --> 00:24:57,360 Speaker 7: sighted communities and unsighted communities are having it, right, anything 505 00:24:57,400 --> 00:25:00,920 Speaker 7: where it's hereditary but livable, like you can still have 506 00:25:01,000 --> 00:25:03,040 Speaker 7: a good quality of life. You know, we're having these 507 00:25:03,040 --> 00:25:06,680 Speaker 7: conversations with autism, with Down syndrome, like who gets to decide, 508 00:25:06,840 --> 00:25:08,560 Speaker 7: you know, whether or not they want to raise a 509 00:25:08,680 --> 00:25:12,840 Speaker 7: child with these conditions?
Right, So there's important considerations for 510 00:25:12,880 --> 00:25:17,280 Speaker 7: the parents, for the children, and then there's the social considerations, 511 00:25:17,359 --> 00:25:20,720 Speaker 7: and then there's the species-wide considerations. And so in Meru, 512 00:25:20,960 --> 00:25:24,000 Speaker 7: I was really trying to tackle that final scale, right, 513 00:25:24,040 --> 00:25:25,680 Speaker 7: which I don't think a lot of people are talking 514 00:25:25,680 --> 00:25:29,160 Speaker 7: about today because again we don't have that capacity right now, 515 00:25:29,200 --> 00:25:35,560 Speaker 7: but species-wide survival. And people who've paid close attention 516 00:25:35,760 --> 00:25:40,960 Speaker 7: to certain animal and insect population studies have noticed that 517 00:25:41,440 --> 00:25:45,920 Speaker 7: things that sound good as adaptive behaviors, when taken to 518 00:25:46,000 --> 00:25:49,639 Speaker 7: the extreme, can often lead to species extinction, right, aggressive 519 00:25:49,640 --> 00:25:54,240 Speaker 7: mate selection for example, or certain types of decorative mutations 520 00:25:54,280 --> 00:25:57,879 Speaker 7: that, you know, taken to the extreme end up reducing 521 00:25:58,000 --> 00:26:03,359 Speaker 7: survivability overall. And so that I think, you know, is 522 00:26:03,359 --> 00:26:05,679 Speaker 7: a consideration a little bit more for the future compared 523 00:26:05,720 --> 00:26:10,240 Speaker 7: to parents, children, and family and society, which are a 524 00:26:10,320 --> 00:26:14,480 Speaker 7: little bit more pressing. I would never arrogate the power 525 00:26:14,520 --> 00:26:16,879 Speaker 7: to myself to sit here and say that I know best.
526 00:26:17,040 --> 00:26:20,320 Speaker 7: I certainly don't. And I think this is the sort 527 00:26:20,359 --> 00:26:23,400 Speaker 7: of thing that, you know, we need to tap millions 528 00:26:23,400 --> 00:26:27,240 Speaker 7: of people from across the world and attempt to achieve 529 00:26:27,359 --> 00:26:30,520 Speaker 7: some kind of global consensus, ideally, on. 530 00:26:31,040 --> 00:26:31,920 Speaker 2: Where we go with this. 531 00:26:32,119 --> 00:26:34,280 Speaker 1: I totally respect that we don't know what we're doing 532 00:26:34,359 --> 00:26:36,800 Speaker 1: in this area, and that makes it terrifying, and it 533 00:26:36,840 --> 00:26:39,159 Speaker 1: feels like a bad idea to give people power to 534 00:26:39,200 --> 00:26:42,439 Speaker 1: make these decisions which could have crazy consequences. But I 535 00:26:42,440 --> 00:26:44,520 Speaker 1: also feel like we're kind of already in that situation. 536 00:26:44,600 --> 00:26:47,200 Speaker 1: I mean, as a parent, I'm making decisions all the time 537 00:26:47,240 --> 00:26:49,960 Speaker 1: that are going to totally influence the path of my child's life. 538 00:26:50,280 --> 00:26:52,600 Speaker 1: How to educate them, where to raise them, how 539 00:26:52,600 --> 00:26:55,000 Speaker 1: to solve this problem, how to deal with this discipline issue. 540 00:26:55,200 --> 00:26:58,000 Speaker 1: I certainly don't know what I'm doing, and I'm probably 541 00:26:58,080 --> 00:26:59,679 Speaker 1: messing them up in all sorts of ways I can't 542 00:26:59,800 --> 00:27:02,080 Speaker 1: even imagine. And I see other people making choices that 543 00:27:02,119 --> 00:27:04,320 Speaker 1: I think verge on the edge of child abuse. You know, 544 00:27:04,320 --> 00:27:06,720 Speaker 1: how can you teach your kids those things about the universe? 545 00:27:06,760 --> 00:27:08,919 Speaker 1: That's just wrong. And so I guess, you know.
To me, 546 00:27:08,960 --> 00:27:11,760 Speaker 1: the argument is like, well, at least we're limiting our 547 00:27:11,800 --> 00:27:14,800 Speaker 1: power a little bit, we're bounding ourselves. We're doing the 548 00:27:14,800 --> 00:27:17,240 Speaker 1: stuff that we've been doing for thousands of years, and 549 00:27:17,320 --> 00:27:19,880 Speaker 1: we're still here. So I guess in a sense, it's 550 00:27:19,920 --> 00:27:23,160 Speaker 1: a conservative viewpoint to say, like, let's not give ourselves 551 00:27:23,200 --> 00:27:25,560 Speaker 1: too much power too quickly to change the direction of 552 00:27:25,600 --> 00:27:26,440 Speaker 1: the whole species. 553 00:27:26,640 --> 00:27:30,280 Speaker 4: Another change that you allow to happen in your universe 554 00:27:30,560 --> 00:27:34,000 Speaker 4: is you have these human-machine hybrids. I'm interested in 555 00:27:34,040 --> 00:27:37,160 Speaker 4: brain-computer interfaces and all these various ways that we're 556 00:27:37,160 --> 00:27:39,719 Speaker 4: sort of augmenting ourselves with technology these days. 557 00:27:40,080 --> 00:27:42,640 Speaker 3: Do you think these kinds of human-machine hybrids are 558 00:27:42,680 --> 00:27:45,280 Speaker 3: the future? And how near term is this future? 559 00:27:45,280 --> 00:27:45,520 Speaker 2: If so? 560 00:27:45,920 --> 00:27:49,120 Speaker 7: I guess my hopeful vision of the future is that 561 00:27:49,240 --> 00:27:53,040 Speaker 7: we will have a spectrum of beings, you know, everything 562 00:27:53,240 --> 00:28:00,560 Speaker 7: from Homo sapiens to artificially intelligent machines, right, that are sentient, 563 00:28:00,720 --> 00:28:05,160 Speaker 7: conscious and have full rights as individuals. And, you know, 564 00:28:05,480 --> 00:28:08,280 Speaker 7: those who are blends in between, with Meru and with 565 00:28:08,320 --> 00:28:11,240 Speaker 7: the alloys.
And the reason I call them alloys is 566 00:28:11,359 --> 00:28:18,360 Speaker 7: that they organically, through their DNA, express parts of their 567 00:28:18,400 --> 00:28:23,199 Speaker 7: bodies that are not purely organic, right? So it's not 568 00:28:23,680 --> 00:28:27,640 Speaker 7: just carbon based. So they have silicon, they have heavy metals, 569 00:28:27,680 --> 00:28:30,280 Speaker 7: but the instruction set for how these things are organized 570 00:28:30,280 --> 00:28:34,960 Speaker 7: in their bodies comes still from their chromosomal sets. And 571 00:28:35,280 --> 00:28:37,760 Speaker 7: this is something I think we're definitely going to have 572 00:28:37,800 --> 00:28:39,720 Speaker 7: the capacity to do going forward. 573 00:28:39,720 --> 00:28:41,680 Speaker 2: There's already a lot of bleedover. 574 00:28:41,280 --> 00:28:45,200 Speaker 7: Between DNA and silicon, so I see no reason why 575 00:28:45,240 --> 00:28:48,720 Speaker 7: we couldn't get a lot more interesting with that going forward. 576 00:28:48,840 --> 00:28:51,040 Speaker 7: And again, we have life on Earth that shows us 577 00:28:51,040 --> 00:28:52,640 Speaker 7: how some of these things can be done, so we 578 00:28:53,080 --> 00:28:56,680 Speaker 7: can already steal creatively from what we have around us. 579 00:28:57,480 --> 00:28:59,840 Speaker 7: And so yeah, I don't see any reason why we 580 00:29:00,120 --> 00:29:04,160 Speaker 7: couldn't coexist, right? Like, that's my hope, as a pacifist 581 00:29:04,640 --> 00:29:08,680 Speaker 7: and an optimist, that we will coexist with a variety of 582 00:29:09,160 --> 00:29:12,640 Speaker 7: forms of consciousness, some of which are living and carbon based. 583 00:29:12,400 --> 00:29:13,479 Speaker 2: And some of which aren't. 584 00:29:13,680 --> 00:29:16,360 Speaker 1: And so tell us more about this idea you developed 585 00:29:16,520 --> 00:29:21,440 Speaker 1: of humans traveling inside alloys.
This concept of ships that 586 00:29:21,480 --> 00:29:24,680 Speaker 1: are essentially living beings that have, as you say, this 587 00:29:24,840 --> 00:29:28,360 Speaker 1: like non-organic component that could travel through space. That's 588 00:29:28,400 --> 00:29:31,200 Speaker 1: a really cool and creative idea I haven't seen somewhere else. Where 589 00:29:31,200 --> 00:29:32,080 Speaker 1: did that come from? 590 00:29:32,160 --> 00:29:35,680 Speaker 7: That came from me wanting a cool and different way 591 00:29:35,720 --> 00:29:41,680 Speaker 7: of space travel. I'm a very idea-driven writer, and 592 00:29:41,760 --> 00:29:44,440 Speaker 7: so when I sat down to envision this, I knew 593 00:29:44,440 --> 00:29:47,160 Speaker 7: I wanted it to be a space opera, and I thought, 594 00:29:47,320 --> 00:29:50,280 Speaker 7: what can I do with space travel that hasn't been 595 00:29:50,360 --> 00:29:53,680 Speaker 7: done that I have seen? And it's like, we've seen 596 00:29:53,760 --> 00:29:58,960 Speaker 7: sentient spaceships, right, We've seen all the old school spaceships 597 00:29:59,000 --> 00:30:02,400 Speaker 7: from the previous century that are warp drives, etc. Ion 598 00:30:02,480 --> 00:30:07,720 Speaker 7: drives that are just very complicated machines, and we've even. 599 00:30:07,560 --> 00:30:10,320 Speaker 2: Seen living spacecraft. 600 00:30:10,480 --> 00:30:14,360 Speaker 7: Again, I will raise the example of Binti, but also 601 00:30:14,480 --> 00:30:17,800 Speaker 7: in Escaping Exodus by Nicky Drayden. Some of these I 602 00:30:17,840 --> 00:30:20,600 Speaker 7: haven't actually read, but I remember from a blurb 603 00:30:20,640 --> 00:30:24,080 Speaker 7: one of these like dragons-in-spaceships type things. Right, 604 00:30:24,800 --> 00:30:30,840 Speaker 7: So we've considered having creatures that can transport us in space. 605 00:30:31,120 --> 00:30:33,960 Speaker 2: So I guess the natural progression.
606 00:30:33,480 --> 00:30:37,400 Speaker 7: To me was, well, why not people, you know, especially 607 00:30:37,480 --> 00:30:40,520 Speaker 7: if we can genetically engineer them. We know whales can 608 00:30:40,560 --> 00:30:43,560 Speaker 7: get pretty big, we know dinosaurs can get even bigger, 609 00:30:44,240 --> 00:30:48,600 Speaker 7: and certainly some of those animals were large enough to 610 00:30:48,760 --> 00:30:51,040 Speaker 7: have a small human being inside them, right, or even 611 00:30:51,080 --> 00:30:53,120 Speaker 7: a regular-sized human being. So then it's just a 612 00:30:53,160 --> 00:30:56,880 Speaker 7: matter of, in this magical thousand-years-from-now genetically 613 00:30:56,920 --> 00:31:01,080 Speaker 7: engineered future, creating a person who has organs that can 614 00:31:01,160 --> 00:31:05,880 Speaker 7: carry smaller people inside them and also has organs that 615 00:31:05,960 --> 00:31:10,520 Speaker 7: allow them to travel through space, to absorb energy from 616 00:31:10,560 --> 00:31:15,880 Speaker 7: sunlight, to basically meditate their way across interstellar distances. 617 00:31:16,200 --> 00:31:19,440 Speaker 2: I definitely went a little, you know, maybe. 618 00:31:19,240 --> 00:31:22,479 Speaker 7: Arguably quite a bit towards space fantasy with this. Like 619 00:31:22,520 --> 00:31:25,520 Speaker 7: I came up with science-ish reasons and ways that 620 00:31:25,560 --> 00:31:29,400 Speaker 7: all of these things could happen. But certainly there's no 621 00:31:29,520 --> 00:31:32,240 Speaker 7: such thing as a famity field today, right, And there's 622 00:31:32,280 --> 00:31:35,920 Speaker 7: no such thing as reality transits where you 623 00:31:35,960 --> 00:31:39,560 Speaker 7: can see all of space time in your mind and 624 00:31:39,640 --> 00:31:40,640 Speaker 7: decide where you want to go.
625 00:31:40,880 --> 00:31:43,560 Speaker 1: I wish, yes, right, But I definitely got very Jonah 626 00:31:43,600 --> 00:31:46,440 Speaker 1: and the space whale vibes from that sort of structure. Yeah, 627 00:31:46,560 --> 00:31:47,000 Speaker 1: very cool. 628 00:31:47,320 --> 00:31:50,640 Speaker 4: I loved how much you thought through what it would 629 00:31:50,640 --> 00:31:52,440 Speaker 4: be like to live in an alloy. So my first 630 00:31:52,520 --> 00:31:55,560 Speaker 4: thought was like, oh my gosh, if I was living 631 00:31:55,600 --> 00:31:58,600 Speaker 4: inside of someone else, every step I took I would 632 00:31:58,640 --> 00:32:01,080 Speaker 4: worry about, and like privacy, you know, a person who 633 00:32:01,120 --> 00:32:04,000 Speaker 4: has social anxiety and needs their alone time, and how 634 00:32:04,000 --> 00:32:05,960 Speaker 4: do you get alone time when you are like living 635 00:32:06,080 --> 00:32:10,440 Speaker 4: in the person who is transporting you? And I'm like, yeah, 636 00:32:10,440 --> 00:32:12,560 Speaker 4: how long did it take to think through all of 637 00:32:12,600 --> 00:32:15,040 Speaker 4: the, like, you know, what would be the social implications 638 00:32:15,080 --> 00:32:18,200 Speaker 4: of talking to the vehicle. 639 00:32:17,880 --> 00:32:19,840 Speaker 3: That's taking you from place to place and not being 640 00:32:19,840 --> 00:32:20,640 Speaker 3: able to escape. 641 00:32:20,760 --> 00:32:25,920 Speaker 7: Yeah, before we get into that, I do want to 642 00:32:25,960 --> 00:32:29,720 Speaker 7: point out that females are already vehicles for other people. 643 00:32:31,120 --> 00:32:32,840 Speaker 2: That right, like I was.
644 00:32:32,800 --> 00:32:36,320 Speaker 5: Suddenly like, yeah, there's this one other obvious parallel 645 00:32:36,360 --> 00:32:38,680 Speaker 5: in my mind where it's like, yes, we can already 646 00:32:39,040 --> 00:32:42,720 Speaker 5: carry people around inside us, granted they are people who 647 00:32:42,720 --> 00:32:47,719 Speaker 5: are highly dependent on us and symbiotic or parasitic in 648 00:32:47,760 --> 00:32:50,520 Speaker 5: some ways, and for a long time are arguably not 649 00:32:50,720 --> 00:32:52,120 Speaker 5: people yet and then they. 650 00:32:52,080 --> 00:32:54,240 Speaker 4: Kick you and it's uncomfortable, and then I guess that's 651 00:32:54,240 --> 00:32:55,840 Speaker 4: what made me think about, like, you know, if you're 652 00:32:55,880 --> 00:32:59,880 Speaker 4: walking around inside, like is it uncomfortable to have those footsteps? 653 00:33:00,040 --> 00:33:01,760 Speaker 2: And that definitely was on my mind. 654 00:33:01,840 --> 00:33:04,880 Speaker 7: You know, in late stages of pregnancy, it's pretty clear 655 00:33:04,960 --> 00:33:07,960 Speaker 7: that there is a person in there with a will. 656 00:33:09,880 --> 00:33:12,720 Speaker 2: And a lot of punching and really bad sleep habits. 657 00:33:13,160 --> 00:33:16,800 Speaker 2: So definitely, you know, in. 658 00:33:16,760 --> 00:33:20,240 Speaker 7: thinking about transportation in general.
Right, And once I had 659 00:33:20,240 --> 00:33:23,120 Speaker 7: the main character Jayanthi inside the other main character, 660 00:33:23,160 --> 00:33:26,720 Speaker 7: Vaha, having this chamber, I was like, she's an adult, 661 00:33:26,760 --> 00:33:30,040 Speaker 7: She's going to need some privacy in there, right, And 662 00:33:30,160 --> 00:33:32,640 Speaker 7: they're also going to need ways to communicate, and it's 663 00:33:32,640 --> 00:33:35,240 Speaker 7: got to not be squishy, like I think it would 664 00:33:35,240 --> 00:33:40,960 Speaker 7: be really disconcerting as a mammal to constantly be in 665 00:33:41,000 --> 00:33:44,920 Speaker 7: an environment that's warm and squishy. Like maybe eventually we 666 00:33:44,960 --> 00:33:47,000 Speaker 7: would adapt and get used to it, but you know, 667 00:33:47,040 --> 00:33:51,560 Speaker 7: we definitely prefer our firmer surfaces where we have always 668 00:33:51,600 --> 00:33:57,400 Speaker 7: constructed our houses and footing. We're pretending waterbeds don't exist. 669 00:33:57,720 --> 00:34:00,640 Speaker 7: I mean they do, but you wouldn't want your entire 670 00:34:00,880 --> 00:34:03,720 Speaker 7: floor to be a waterbed, right, like. 671 00:34:04,120 --> 00:34:07,600 Speaker 1: You could, a spaceship? I mean, I read that in the seventies, right? 672 00:34:09,960 --> 00:34:13,759 Speaker 7: No, you're, yeah, well, there is that story where there's 673 00:34:14,200 --> 00:34:18,120 Speaker 7: a male space traveler inside a highly sexual, very very 674 00:34:18,160 --> 00:34:20,600 Speaker 7: soft spacecraft. 675 00:34:21,960 --> 00:34:23,600 Speaker 2: And I'm just, I'm blanking on the name. 676 00:34:23,640 --> 00:34:25,520 Speaker 7: It'll come to me at some point, or hopefully one 677 00:34:25,520 --> 00:34:27,800 Speaker 7: of your listeners will know what I'm talking about.
I 678 00:34:27,880 --> 00:34:31,960 Speaker 7: very distinctly remember that story because wow, that was quite creative, 679 00:34:32,120 --> 00:34:34,800 Speaker 7: and so you know, putting all that together and thinking 680 00:34:34,960 --> 00:34:39,520 Speaker 7: about again, what examples do we have today on Earth? 681 00:34:39,560 --> 00:34:40,560 Speaker 2: I was like, oh, we have. 682 00:34:40,520 --> 00:34:44,399 Speaker 7: Snail shells, right, shells are exuded from the soft part 683 00:34:44,520 --> 00:34:49,239 Speaker 7: of the snail, the calcium, and so why not be 684 00:34:49,320 --> 00:34:52,520 Speaker 7: able to, you know, exude it the same way internally, right, 685 00:34:52,640 --> 00:34:54,920 Speaker 7: just like, and we have bones obviously too, right? So 686 00:34:55,000 --> 00:34:58,320 Speaker 7: you just need to map it into something that's a more 687 00:34:58,920 --> 00:35:02,400 Speaker 7: spheroidal shape. Right, So then the human being can be 688 00:35:02,480 --> 00:35:06,840 Speaker 7: in there moving around relatively comfortably, and the alloy pilot 689 00:35:06,920 --> 00:35:11,759 Speaker 7: that's carrying them also will experience less discomfort. 690 00:35:11,840 --> 00:35:12,240 Speaker 2: Hopefully. 691 00:35:12,480 --> 00:35:15,839 Speaker 1: Let's just hope nobody sits on their bladder, right. All right, 692 00:35:15,880 --> 00:35:18,960 Speaker 1: we have lots more questions for our author and guest, 693 00:35:19,360 --> 00:35:34,640 Speaker 1: but first we have to take a quick break. All right, 694 00:35:34,680 --> 00:35:37,680 Speaker 1: we're back and we're talking to Divya, author of Meru, 695 00:35:38,000 --> 00:35:41,279 Speaker 1: about the science of her fictional universe. Something I really 696 00:35:41,360 --> 00:35:43,960 Speaker 1: enjoyed in the book is the angle where the alloys 697 00:35:44,000 --> 00:35:46,600 Speaker 1: are sort of the grown ups.
Humans are sort of 698 00:35:46,640 --> 00:35:49,719 Speaker 1: like, have misbehaved in the past and made some big mistakes, 699 00:35:49,760 --> 00:35:51,640 Speaker 1: and the alloys are sort of, like, now there to 700 00:35:51,719 --> 00:35:54,800 Speaker 1: keep them in line. It makes the humans somehow feel, 701 00:35:55,120 --> 00:35:59,239 Speaker 1: I don't know, almost subhuman in comparison, and specifically in 702 00:35:59,280 --> 00:36:02,000 Speaker 1: your book, humans have attempted to terraform Mars and ended 703 00:36:02,080 --> 00:36:05,399 Speaker 1: up destroying it in the process. So people are talking 704 00:36:05,480 --> 00:36:08,520 Speaker 1: about terraforming Mars. Elon Musk has plans to live there, 705 00:36:08,560 --> 00:36:11,400 Speaker 1: and he talks about like nuking the polar ice caps 706 00:36:11,440 --> 00:36:14,440 Speaker 1: to release additional atmosphere. What are your thoughts on some 707 00:36:14,480 --> 00:36:17,640 Speaker 1: of the wild proposals for how to go about terraforming Mars? 708 00:36:17,680 --> 00:36:19,760 Speaker 1: Are you afraid we're going to end up living your novel? 709 00:36:20,040 --> 00:36:22,439 Speaker 7: I don't know that afraid is the right word there. 710 00:36:22,680 --> 00:36:29,000 Speaker 7: I certainly see the possibility of hitting tipping points in 711 00:36:29,239 --> 00:36:32,600 Speaker 7: terraforming that we don't even know exist, right, law of 712 00:36:32,640 --> 00:36:36,920 Speaker 7: unintended consequences. And again, we don't have good enough simulations, modeling, 713 00:36:36,920 --> 00:36:41,200 Speaker 7: and understanding of geophysics to be one hundred percent confident 714 00:36:41,239 --> 00:36:43,360 Speaker 7: of what we're doing.
I have similar concerns with a 715 00:36:43,360 --> 00:36:46,359 Speaker 7: lot of the atmospheric engineering that they're talking about right 716 00:36:46,400 --> 00:36:49,920 Speaker 7: now to reduce global warming, right, like spraying silicates and 717 00:36:49,960 --> 00:36:52,640 Speaker 7: aerosols and all kinds of things in the upper atmosphere 718 00:36:52,680 --> 00:36:56,520 Speaker 7: to try to reflect sunlight. Like, okay, we could do it, 719 00:36:57,360 --> 00:37:01,480 Speaker 7: and there are models. But how confident are we, you 720 00:37:01,520 --> 00:37:06,239 Speaker 7: know, that these aren't going to have bad repercussions down 721 00:37:06,280 --> 00:37:08,239 Speaker 7: the line? But at the end of the day, we 722 00:37:08,320 --> 00:37:11,240 Speaker 7: are human beings today. We are not the human beings 723 00:37:11,280 --> 00:37:17,000 Speaker 7: of Meru who were properly chastised and actually chose to 724 00:37:17,160 --> 00:37:20,560 Speaker 7: be confined on earth, and treated in some ways like 725 00:37:20,880 --> 00:37:23,800 Speaker 7: children, right, to be well taken care of and still 726 00:37:23,840 --> 00:37:26,759 Speaker 7: live very free, good lives, but let go of a 727 00:37:26,800 --> 00:37:30,240 Speaker 7: lot of human nature and allow their genes to be altered. 728 00:37:30,320 --> 00:37:34,560 Speaker 7: We're not there, obviously. We're here today, and in today's world, 729 00:37:34,880 --> 00:37:40,040 Speaker 7: the dominant factor is still survival and expansion and consumption.
730 00:37:41,000 --> 00:37:43,200 Speaker 7: And then you know, as long as those are the 731 00:37:43,200 --> 00:37:47,200 Speaker 7: things that are driving our social values and our progress 732 00:37:47,280 --> 00:37:49,600 Speaker 7: and our ideas of progress, I don't know that we're 733 00:37:49,640 --> 00:37:55,440 Speaker 7: going to stop the Elon Musks of tomorrow from taking 734 00:37:55,480 --> 00:37:59,279 Speaker 7: action, right, and maybe to some extent, damn the consequences, 735 00:37:59,320 --> 00:38:01,759 Speaker 7: because you know, we did that with the industrial age, 736 00:38:01,800 --> 00:38:05,520 Speaker 7: and I don't think there are huge regrets with the 737 00:38:05,640 --> 00:38:08,880 Speaker 7: level of industry and technology that we have today, but 738 00:38:09,000 --> 00:38:11,520 Speaker 7: I hope that there are at least some regrets in 739 00:38:11,800 --> 00:38:16,000 Speaker 7: how we got here, right, and the level of side 740 00:38:16,040 --> 00:38:19,600 Speaker 7: effects, pollution, climate change, and everything else that we are 741 00:38:19,640 --> 00:38:23,680 Speaker 7: now dealing with, having rushed into it without thinking 742 00:38:23,719 --> 00:38:24,280 Speaker 7: it through. 743 00:38:24,640 --> 00:38:27,680 Speaker 3: So you mentioned that at one point in the future 744 00:38:27,760 --> 00:38:31,000 Speaker 3: you're hoping that there'll be sentient machines that will have 745 00:38:31,040 --> 00:38:36,640 Speaker 3: full rights with humans. With artificial intelligence just booming lately, 746 00:38:36,920 --> 00:38:39,360 Speaker 3: what do you think the future looks like there? 747 00:38:39,080 --> 00:38:41,759 Speaker 4: Are we close? Is that going to happen in our lifetime? 748 00:38:42,200 --> 00:38:44,680 Speaker 4: And like, what is the line at which the machines 749 00:38:44,719 --> 00:38:46,040 Speaker 4: should be getting rights?
750 00:38:46,560 --> 00:38:51,240 Speaker 7: Yeah, that's not a line. Unfortunately, it's a zone 751 00:38:51,840 --> 00:38:57,680 Speaker 7: or even just an infinite continuum. And unfortunately, I suspect 752 00:38:57,840 --> 00:39:01,879 Speaker 7: that our incentive structure is in the wrong places for 753 00:39:02,000 --> 00:39:08,880 Speaker 7: giving machines rights, and we're probably going to get to 754 00:39:08,960 --> 00:39:11,840 Speaker 7: it far later than we should. Daniel knows from my 755 00:39:11,920 --> 00:39:15,040 Speaker 7: previous novel Machinehood. I got into this quite a bit 756 00:39:15,120 --> 00:39:18,520 Speaker 7: more there, into this exact question of, you know, how 757 00:39:18,560 --> 00:39:21,200 Speaker 7: will we know when they're sentient? Will we want to 758 00:39:21,239 --> 00:39:24,719 Speaker 7: admit to ourselves when they're sentient when it's not, 759 00:39:25,040 --> 00:39:27,160 Speaker 7: you know, to our advantage to do so? 760 00:39:27,560 --> 00:39:27,759 Speaker 3: Right? 761 00:39:28,200 --> 00:39:30,400 Speaker 7: And I think that's where some of the existential fears 762 00:39:30,400 --> 00:39:34,520 Speaker 7: come in that people like Stephen Hawking have expressed, that, well, 763 00:39:34,520 --> 00:39:39,240 Speaker 7: the AIs are at some point going to be sentient 764 00:39:39,280 --> 00:39:44,280 Speaker 7: and intelligent enough to, like, rise up and either destroy 765 00:39:44,400 --> 00:39:47,680 Speaker 7: us or claim personhood whether we like it or not. 766 00:39:47,920 --> 00:39:51,600 Speaker 7: And you know, possibly violence will ensue, because that's what 767 00:39:51,640 --> 00:39:55,440 Speaker 7: biological creatures always do. I'm not convinced that violence has 768 00:39:55,480 --> 00:39:59,880 Speaker 7: to ensue for artificial intelligences. I'm not convinced we know 769 00:40:00,080 --> 00:40:03,200 Speaker 7: what sentience and consciousness is today.
We don't have good, 770 00:40:03,280 --> 00:40:07,440 Speaker 7: testable definitions of any of these things, and they're very fuzzy, 771 00:40:07,680 --> 00:40:10,239 Speaker 7: and it's sort of like, what was it? I forget who 772 00:40:10,320 --> 00:40:14,200 Speaker 7: said the quote about pornography and obscenity, that I know 773 00:40:14,239 --> 00:40:16,279 Speaker 7: it when I see it, right? There's very much that 774 00:40:16,400 --> 00:40:20,600 Speaker 7: kind of attitude. And yet I think you push people 775 00:40:20,640 --> 00:40:23,400 Speaker 7: on that and suddenly they're like, oh wait, you know, 776 00:40:23,600 --> 00:40:27,040 Speaker 7: is an ant sentient? Is it conscious? Does the ant 777 00:40:27,080 --> 00:40:30,960 Speaker 7: deserve rights just because it's less sentient and less conscious 778 00:40:30,960 --> 00:40:34,319 Speaker 7: than us? Like, at what point do things with some 779 00:40:34,760 --> 00:40:39,719 Speaker 7: iota of that deserve rights? Are plants sentient? Plants are 780 00:40:39,719 --> 00:40:43,440 Speaker 7: definitely conscious. We have experiments that have anesthetized plants and 781 00:40:43,560 --> 00:40:46,239 Speaker 7: put them to sleep, so there is some kind of 782 00:40:46,320 --> 00:40:47,760 Speaker 7: consciousness there. 783 00:40:48,040 --> 00:40:49,160 Speaker 2: And that's where, with Meru, 784 00:40:49,239 --> 00:40:52,640 Speaker 7: I decided to step past all of this into a 785 00:40:52,680 --> 00:40:56,480 Speaker 7: world in which everybody grants that there are degrees of 786 00:40:56,680 --> 00:41:02,200 Speaker 7: consciousness and sentience and also livingness. Right, what does it 787 00:41:02,239 --> 00:41:05,000 Speaker 7: mean to be alive? What is a life form versus 788 00:41:05,200 --> 00:41:10,760 Speaker 7: non life?
If you have an android that is using 789 00:41:11,280 --> 00:41:15,440 Speaker 7: cultured human skin, but everything else inside is, you know, 790 00:41:15,960 --> 00:41:21,160 Speaker 7: metal machine parts, is that android now biological because of some 791 00:41:21,239 --> 00:41:23,719 Speaker 7: parts of it? You know, is it alive or is 792 00:41:23,760 --> 00:41:25,040 Speaker 7: it non living? Right? 793 00:41:25,120 --> 00:41:25,239 Speaker 6: Like? 794 00:41:25,480 --> 00:41:29,920 Speaker 7: It gets complicated so fast. And that's where I personally 795 00:41:29,920 --> 00:41:31,920 Speaker 7: prefer that we err on the side of too much 796 00:41:32,040 --> 00:41:34,480 Speaker 7: rather than too little. Like, it's always better to give 797 00:41:34,520 --> 00:41:36,359 Speaker 7: too many rights than not enough. 798 00:41:36,680 --> 00:41:41,000 Speaker 6: That's not historically how... exactly, exactly. Historically we've really 799 00:41:41,040 --> 00:41:44,520 Speaker 6: fallen down on that, and so I'm hoping that going forward, 800 00:41:44,640 --> 00:41:46,920 Speaker 6: you know, we don't make that same mistake. 801 00:41:47,080 --> 00:41:49,479 Speaker 1: So I asked you earlier your thoughts on whether Star 802 00:41:49,520 --> 00:41:52,479 Speaker 1: Wars was fantasy or science fiction, because I was also 803 00:41:52,600 --> 00:41:56,600 Speaker 1: very curious about the physics of your interstellar travel. I 804 00:41:56,600 --> 00:41:58,560 Speaker 1: love that you invented a new way to do this. 805 00:41:58,640 --> 00:42:00,759 Speaker 1: You know, I've read so many science fiction novels. I felt 806 00:42:00,800 --> 00:42:03,040 Speaker 1: like I'd seen everything for how to get from star 807 00:42:03,120 --> 00:42:05,200 Speaker 1: to star in less than a zillion years. So I 808 00:42:05,239 --> 00:42:07,000 Speaker 1: love that you have a new idea.
And my question 809 00:42:07,239 --> 00:42:09,840 Speaker 1: is how much of the science did you develop behind 810 00:42:09,920 --> 00:42:13,000 Speaker 1: these family fields and famatons. Did you do a 811 00:42:13,040 --> 00:42:16,040 Speaker 1: whole, like, physics background where only the tip of the iceberg 812 00:42:16,440 --> 00:42:18,320 Speaker 1: ended up in your novel, or did you want to 813 00:42:18,400 --> 00:42:20,560 Speaker 1: leave it a little bit fantastical? 814 00:42:20,600 --> 00:42:23,840 Speaker 2: For the family fields, I gave it some thought. 815 00:42:24,160 --> 00:42:27,480 Speaker 7: I kind of took what I know about physics and 816 00:42:27,520 --> 00:42:30,640 Speaker 7: my own physics background, and figured, you know, we could 817 00:42:30,680 --> 00:42:33,799 Speaker 7: have something operating on scales different than what we can 818 00:42:33,840 --> 00:42:36,840 Speaker 7: measure today, and we can certainly have physics that we 819 00:42:36,880 --> 00:42:38,680 Speaker 7: aren't aware of today a thousand years from now. 820 00:42:38,719 --> 00:42:39,400 Speaker 2: I guarantee you 821 00:42:39,440 --> 00:42:42,319 Speaker 7: that we will, right? Like, you know, I will 822 00:42:42,320 --> 00:42:46,560 Speaker 7: put bet money on it. I like my science to fit my story, 823 00:42:47,160 --> 00:42:50,680 Speaker 7: so it's a very symbiotic relationship. And so I wanted 824 00:42:50,800 --> 00:42:55,400 Speaker 7: alloy pilots to be able to fly through interplanetary scales 825 00:42:55,440 --> 00:42:58,880 Speaker 7: of space, right, and ultimately even interstellar, and have a 826 00:42:58,920 --> 00:43:02,440 Speaker 7: way to gain momentum from these family fields, right.
I 827 00:43:02,440 --> 00:43:04,279 Speaker 7: wanted it to be an energy source so that they 828 00:43:04,280 --> 00:43:06,799 Speaker 7: didn't have to use fuel, so that there wasn't a 829 00:43:06,960 --> 00:43:10,080 Speaker 7: need to mine, consume, and pollute, right? Like, I wanted 830 00:43:10,080 --> 00:43:14,400 Speaker 7: to break that cycle. And I'm sufficiently aware of current 831 00:43:14,440 --> 00:43:18,080 Speaker 7: physics to know that at least solar power and light isn't going 832 00:43:18,120 --> 00:43:21,600 Speaker 7: to be enough, right? So I came up with these 833 00:43:21,719 --> 00:43:25,520 Speaker 7: fields that have gradients. And what I liked about that 834 00:43:25,920 --> 00:43:28,719 Speaker 7: was that in my mind, you know, sort of like 835 00:43:28,760 --> 00:43:32,640 Speaker 7: when we draw space time and we draw curvatures, that 836 00:43:32,760 --> 00:43:35,680 Speaker 7: it's an easy way to give readers something to grab 837 00:43:35,760 --> 00:43:40,080 Speaker 7: onto, that these pilots are going, you know, upslope or downslope, right? 838 00:43:40,160 --> 00:43:43,280 Speaker 7: So it's something that we can relate to as human 839 00:43:43,360 --> 00:43:46,480 Speaker 7: readers as, okay, when we go down we get faster, 840 00:43:46,560 --> 00:43:47,960 Speaker 7: and when we go up we go slower. 841 00:43:48,120 --> 00:43:49,279 Speaker 2: So we have this in 842 00:43:49,320 --> 00:43:52,520 Speaker 7: other physical fields. It's just that we don't necessarily always 843 00:43:52,520 --> 00:43:55,000 Speaker 7: talk about it in those terms. So I decided, yes, 844 00:43:55,120 --> 00:43:57,480 Speaker 7: there are gradients, there are things that they 845 00:43:57,520 --> 00:44:00,880 Speaker 7: can measure. So I think I put a lot of that 846 00:44:01,000 --> 00:44:02,520 Speaker 7: in the book.
But I tried to do it in 847 00:44:02,560 --> 00:44:04,960 Speaker 7: a more fun way rather than, like, and now I 848 00:44:04,960 --> 00:44:07,359 Speaker 7: will give you a physics lecture I made up. 849 00:44:07,480 --> 00:44:11,239 Speaker 1: You know, are you implying physics lectures are not fun? 850 00:44:11,400 --> 00:44:12,160 Speaker 1: Is that what you just did? 851 00:44:13,920 --> 00:44:17,080 Speaker 7: I imply that they're not necessarily fun to read in 852 00:44:17,120 --> 00:44:20,360 Speaker 7: the middle of a science fiction novel, my caveat. But 853 00:44:20,440 --> 00:44:23,600 Speaker 7: you know, even still, I've had readers comment that there's 854 00:44:23,640 --> 00:44:26,920 Speaker 7: too much terminology and that it's too complicated in science, 855 00:44:27,040 --> 00:44:29,680 Speaker 7: because I think when you start throwing around words 856 00:44:29,760 --> 00:44:34,279 Speaker 7: like fields and gradients, some people are already like, I 857 00:44:34,320 --> 00:44:38,760 Speaker 7: don't know what these mean, and now I'm confused, versus 858 00:44:38,800 --> 00:44:41,960 Speaker 7: readers who, you know, maybe have some understanding and familiarity 859 00:44:42,000 --> 00:44:46,040 Speaker 7: with those concepts. Right, So obviously it depends on the audience. 860 00:44:47,320 --> 00:44:50,760 Speaker 1: Well, I find that most science fiction is really technology fiction, 861 00:44:51,080 --> 00:44:53,439 Speaker 1: like here's some new widget, or here's some new whatsit 862 00:44:53,480 --> 00:44:56,719 Speaker 1: that does this thing. Very few actually dig into 863 00:44:56,719 --> 00:44:59,359 Speaker 1: the science and really write science fiction.
And in your book, 864 00:44:59,400 --> 00:45:01,759 Speaker 1: I felt like I was in another universe, one 865 00:45:01,760 --> 00:45:04,319 Speaker 1: where the science was fundamentally different from 866 00:45:04,320 --> 00:45:06,400 Speaker 1: our universe, and that was exciting to me. I was like, 867 00:45:06,400 --> 00:45:07,959 Speaker 1: I want to know more, and how does this work? 868 00:45:07,960 --> 00:45:09,600 Speaker 1: And I'm going to dig into this and stuff, and 869 00:45:09,640 --> 00:45:11,680 Speaker 1: so yeah, I wanted more. I wanted math. 870 00:45:13,480 --> 00:45:14,640 Speaker 3: I did want math. 871 00:45:14,920 --> 00:45:17,600 Speaker 4: I thought you hit the right tone. So like, 872 00:45:17,640 --> 00:45:19,600 Speaker 4: I'm not a physicist. I know a little 873 00:45:19,600 --> 00:45:21,839 Speaker 4: bit of physics, and I thought you explained it well 874 00:45:21,960 --> 00:45:24,200 Speaker 4: enough that I thought, for a moment, gosh, I don't 875 00:45:24,239 --> 00:45:27,880 Speaker 4: remember this. You said it specifically, and I thought, I don't remember Daniel 876 00:45:27,960 --> 00:45:31,960 Speaker 4: ever talking about family fields, which makes me think that 877 00:45:32,120 --> 00:45:35,360 Speaker 4: maybe this is a thing in this world that doesn't exist. 878 00:45:35,400 --> 00:45:37,839 Speaker 4: But I'm going with it, and like anyway, I thought 879 00:45:38,040 --> 00:45:40,960 Speaker 4: it very nicely blended into the world. I believed it. 880 00:45:41,000 --> 00:45:42,840 Speaker 4: I wasn't sure, but I didn't need that.
881 00:45:42,840 --> 00:45:44,759 Speaker 7: That's awesome. And I will tell you, Kelly, you're not 882 00:45:44,840 --> 00:45:47,640 Speaker 7: the only one to think that I wasn't inventing physics 883 00:45:47,640 --> 00:45:50,160 Speaker 7: for this book, to which I made a little, like, 884 00:45:50,280 --> 00:45:54,719 Speaker 7: oh no face, like somebody going out there like, why 885 00:45:54,719 --> 00:46:00,520 Speaker 7: can't we build family spaceships today? Because it's not real. 886 00:46:00,719 --> 00:46:04,800 Speaker 1: Sorry, you didn't know it's not real, right? 887 00:46:04,600 --> 00:46:07,680 Speaker 7: That's true, it's true, it could be. And I will admit I 888 00:46:07,719 --> 00:46:10,040 Speaker 7: was inspired by dark matter and dark energy and that 889 00:46:10,160 --> 00:46:13,040 Speaker 7: something needs to be accelerating our universe outwards. 890 00:46:13,040 --> 00:46:15,440 Speaker 2: So why not family fields? You know? 891 00:46:15,680 --> 00:46:17,120 Speaker 1: So tell us a little bit about your process of 892 00:46:17,120 --> 00:46:19,600 Speaker 1: writing the book. Are you a plotter or a pantser 893 00:46:20,120 --> 00:46:21,200 Speaker 1: or somewhere in between? 894 00:46:21,400 --> 00:46:25,240 Speaker 7: These days, I'm definitely one hundred percent plotter. I learned 895 00:46:25,239 --> 00:46:27,760 Speaker 7: my lesson with my first novel that a loose outline 896 00:46:27,800 --> 00:46:30,560 Speaker 7: isn't necessarily the best idea for me and for 897 00:46:30,600 --> 00:46:33,839 Speaker 7: the way I like to write, mostly because I hate revising. 898 00:46:34,520 --> 00:46:38,600 Speaker 7: I much prefer the drafting stage, and so the more 899 00:46:38,680 --> 00:46:40,680 Speaker 7: you can plan out, the better. 900 00:46:41,080 --> 00:46:43,719 Speaker 2: I will say that actual 901 00:46:43,360 --> 00:46:46,239 Speaker 7: plot as a plotter is my weak point.
I'm much 902 00:46:46,239 --> 00:46:50,320 Speaker 7: better at world building. I enjoy it. I love thinking 903 00:46:50,320 --> 00:46:54,560 Speaker 7: about my characters and their interactions and story and themes. 904 00:46:55,000 --> 00:46:57,800 Speaker 7: And then when it comes to the actual, like, nitty 905 00:46:57,840 --> 00:47:02,560 Speaker 7: gritty of what happens next, I just steal liberally from other stories, 906 00:47:02,800 --> 00:47:05,480 Speaker 7: and then because of all the other stuff, it changes 907 00:47:05,600 --> 00:47:09,160 Speaker 7: enough that it becomes my own plot. But I found that, yeah, 908 00:47:09,480 --> 00:47:10,640 Speaker 7: I'm not very good 909 00:47:10,560 --> 00:47:11,720 Speaker 2: at inventing my own plot. 910 00:47:11,719 --> 00:47:15,239 Speaker 7: Maybe that will change with experience, but for now, I 911 00:47:15,360 --> 00:47:18,040 Speaker 7: encourage all artists to steal liberally from other artists as 912 00:47:18,040 --> 00:47:22,680 Speaker 7: long as they then, let's say, chisel away until it 913 00:47:22,719 --> 00:47:23,640 Speaker 7: becomes their own. 914 00:47:24,120 --> 00:47:27,000 Speaker 3: My husband is an artist, and he agrees with that completely. 915 00:47:27,040 --> 00:47:29,000 Speaker 3: He says it all the time. So I heard you're 916 00:47:29,000 --> 00:47:31,759 Speaker 3: working on a sequel. Can you tell us anything about that? 917 00:47:32,200 --> 00:47:35,800 Speaker 7: I can. There are already pre-orders up, so I'm definitely 918 00:47:35,800 --> 00:47:39,640 Speaker 7: committed to this sequel.
Without giving too much away for 919 00:47:39,840 --> 00:47:44,560 Speaker 7: readers of Meru, I will say that whereas book one 920 00:47:45,040 --> 00:47:48,359 Speaker 7: is largely a space adventure, as I'm sure they could 921 00:47:48,360 --> 00:47:53,360 Speaker 7: tell from our conversation, book two concentrates more on adventure 922 00:47:53,560 --> 00:47:57,120 Speaker 7: around the globe, and my very loose elevator pitch is: it's 923 00:47:57,120 --> 00:48:00,319 Speaker 7: a bit Around the World in Eighty Days and a 924 00:48:00,440 --> 00:48:05,359 Speaker 7: bit Greta Thunberg's, you know, eco challenge to travel with 925 00:48:05,480 --> 00:48:08,720 Speaker 7: minimal footprint, and I kind of mash those two together 926 00:48:08,760 --> 00:48:12,320 Speaker 7: into this world, and there's a time skip, there's a 927 00:48:12,400 --> 00:48:15,040 Speaker 7: different set of characters that book two focuses on. So 928 00:48:15,080 --> 00:48:18,360 Speaker 7: it's not really a direct sequel so much as, you know, 929 00:48:18,680 --> 00:48:20,680 Speaker 7: part of the series set in this world. 930 00:48:21,080 --> 00:48:22,920 Speaker 3: And when can we have that in our hands? Do 931 00:48:22,960 --> 00:48:23,560 Speaker 3: we know yet? 932 00:48:24,160 --> 00:48:29,600 Speaker 7: No, because this author is going to be late. We 933 00:48:29,600 --> 00:48:32,200 Speaker 7: were supposed to have it in the world sometime in 934 00:48:32,239 --> 00:48:37,080 Speaker 7: February of twenty twenty four, but I'm not super happy 935 00:48:37,120 --> 00:48:39,280 Speaker 7: with my manuscript and I would like to be before 936 00:48:39,280 --> 00:48:41,920 Speaker 7: I get it out there, so it'll probably be, 937 00:48:43,400 --> 00:48:47,200 Speaker 7: if I had to guess, summer or early fall 938 00:48:47,600 --> 00:48:50,480 Speaker 7: twenty twenty four, hopefully. But like I said, you can 939 00:48:50,560 --> 00:48:55,000 Speaker 7: pre order it today.
So if you love book one 940 00:48:55,080 --> 00:48:57,560 Speaker 7: that much, you know, definitely show your support for book 941 00:48:57,600 --> 00:49:00,680 Speaker 7: two, because publishers like to see that, and maybe there will 942 00:49:00,680 --> 00:49:01,319 Speaker 7: be a book three. 943 00:49:01,600 --> 00:49:03,680 Speaker 1: I love that you're already selling copies even though it's 944 00:49:03,719 --> 00:49:10,640 Speaker 1: not finished, right? It's finished-ish. Finished-ish, like 945 00:49:10,760 --> 00:49:12,320 Speaker 1: every part of my life is finished-ish. 946 00:49:12,360 --> 00:49:15,399 Speaker 7: Yes, I've gotten to write the words "the end." 947 00:49:15,560 --> 00:49:18,000 Speaker 7: So there is a manuscript. It's just, you know, it 948 00:49:18,160 --> 00:49:20,520 Speaker 7: needs a little bit more than spit and polish, 949 00:49:20,600 --> 00:49:23,239 Speaker 1: let's say. All right, well, thanks very much for joining us. 950 00:49:23,280 --> 00:49:26,200 Speaker 1: Tell us where people can find a copy of Meru. 951 00:49:26,160 --> 00:49:31,040 Speaker 7: Hopefully anywhere books are sold, in most of the English 952 00:49:31,040 --> 00:49:34,880 Speaker 7: speaking world and some parts of the non English speaking world. 953 00:49:35,160 --> 00:49:38,640 Speaker 7: You can find it on audio, ebook, and trade paperback. 954 00:49:39,160 --> 00:49:42,200 Speaker 7: I will always encourage people to go to Bookshop or 955 00:49:42,280 --> 00:49:45,480 Speaker 7: their local indie bookstore just to support, especially if you're 956 00:49:45,520 --> 00:49:48,560 Speaker 7: buying print books. I will also say, if you're going 957 00:49:48,640 --> 00:49:51,800 Speaker 7: to buy it for your Kindle, starting on May first, 958 00:49:52,080 --> 00:49:55,280 Speaker 7: it's going to be on discount.
I think it's part 959 00:49:55,400 --> 00:50:01,359 Speaker 7: of Asian and Pacific Islander Month in America, so at 960 00:50:01,440 --> 00:50:03,279 Speaker 7: least in the US you'll be able to get it 961 00:50:03,320 --> 00:50:06,520 Speaker 7: a little bit cheaper if ebook is your jam. 962 00:50:06,680 --> 00:50:08,560 Speaker 1: All right, well, thanks very much for coming on. We 963 00:50:08,640 --> 00:50:10,960 Speaker 1: really enjoyed chatting with you about all these hard questions. 964 00:50:11,600 --> 00:50:14,320 Speaker 2: Thanks for having me, and you know I love hard questions. 965 00:50:14,520 --> 00:50:16,920 Speaker 1: All right, that was a super fun interview, Kelly. What do 966 00:50:16,960 --> 00:50:17,600 Speaker 1: you think about that? 967 00:50:18,080 --> 00:50:20,040 Speaker 3: I had so much fun and I can't wait for 968 00:50:20,040 --> 00:50:20,520 Speaker 3: the next 969 00:50:20,320 --> 00:50:23,000 Speaker 1: book. Me too. I'm just glad that there are people 970 00:50:23,040 --> 00:50:26,239 Speaker 1: out there thinking about other universes and what it's like 971 00:50:26,280 --> 00:50:28,640 Speaker 1: to be human in them and the politics of it, 972 00:50:29,040 --> 00:50:32,320 Speaker 1: and then setting out compelling stories that entertain me for hours. 973 00:50:32,440 --> 00:50:34,520 Speaker 1: I'm just glad I live in the universe where science 974 00:50:34,520 --> 00:50:35,400 Speaker 1: fiction exists. 975 00:50:35,800 --> 00:50:36,279 Speaker 3: Me too. 976 00:50:36,360 --> 00:50:38,520 Speaker 4: And I'm also glad that there are authors who can 977 00:50:38,560 --> 00:50:40,879 Speaker 4: have positive takes on how humanity is going to move 978 00:50:40,880 --> 00:50:41,880 Speaker 4: forward eventually. 979 00:50:42,120 --> 00:50:43,400 Speaker 3: I like positive books. 980 00:50:43,560 --> 00:50:45,560 Speaker 1: And she was also realistic.
She wasn't promising she was 981 00:50:45,560 --> 00:50:47,680 Speaker 1: going to finish the next novel anytime soon. 982 00:50:48,160 --> 00:50:50,680 Speaker 3: Right. Yes, I like that about her too. I'm sure my 983 00:50:50,880 --> 00:50:53,400 Speaker 3: editor would like me to channel that a little 984 00:50:53,160 --> 00:50:56,759 Speaker 1: bit more in the future. All right, and thanks to 985 00:50:56,800 --> 00:50:59,160 Speaker 1: all of our listeners for coming along on this ride 986 00:50:59,320 --> 00:51:01,920 Speaker 1: into another universe. We have a lot of fun in 987 00:51:02,000 --> 00:51:03,480 Speaker 1: these episodes, and I hear that a lot of you 988 00:51:03,640 --> 00:51:06,560 Speaker 1: enjoy listening to them. So thanks again everyone for listening, 989 00:51:06,560 --> 00:51:08,960 Speaker 1: and thanks Kelly for joining us today. Thanks. 990 00:51:08,760 --> 00:51:09,879 Speaker 3: For having me. I had a lot of fun. 991 00:51:10,719 --> 00:51:21,439 Speaker 1: Bye everyone. All right, everyone, tune in next time. Thanks 992 00:51:21,440 --> 00:51:24,120 Speaker 1: for listening, and remember that Daniel and Jorge Explain the 993 00:51:24,160 --> 00:51:28,959 Speaker 1: Universe is a production of iHeartRadio. For more podcasts from iHeartRadio, 994 00:51:29,080 --> 00:51:33,240 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you listen 995 00:51:33,320 --> 00:51:34,520 Speaker 1: to your favorite shows.