Let's take a quick break, and when we get back, we'll find out how important scientific consistency is to readers of science fiction. Daniel, you read a lot of science fiction, right? I certainly do. Is there a phrase in a science fiction novel that, when you're reading it, you just automatically cringe? Oh man, I've got quite a list that really, as a physicist, if you read it, makes you a little bit scared. It makes me worried, you know, that I'm not gonna be able to enjoy the novel if they don't treat it right. What are some of these phrases that make you afraid? Like the Higgs boson? If somebody mentions the Higgs boson, oh man, I toss the book across the room almost every time I see it, automatically. How about quantum mechanics, or the quantum realm? Oh boy, don't get me started on that. Other dimensions? That is definitely near the top of stuff that's handled badly in science fiction. What if I write a novel about a quantum Higgs boson dimension? I'm not even cracking that book open.

Hi, I'm Jorge. I'm a cartoonist and the creator of PhD Comics. Hi, I'm Daniel. I'm a particle physicist, and I love reading science fiction, especially when it has actual science in it. But not the fiction? No, it's awesome if you can have science and then build fiction around it. That's why it's called science fiction. Well, welcome to our podcast, which is a work of unfiction, or nonfiction. We strive, at least, to make this podcast about the real universe and not about fictional universes. Welcome to Daniel and Jorge Explain the Universe, a production of iHeartRadio, in which we usually take you on a tour of the real universe that we find ourselves in: all the incredible, all the amazing, all the mind-blowing things that exist out there in the real universe.
And sometimes we like to talk about sort of crazy ideas that maybe physicists have, or, you know, wishful thinking that physicists have about how the universe might work or what could be out there in the universe. That's right, aspirational universes. Maybe we live in this universe, maybe we live in a universe where there are tiny strings vibrating at the smallest scale, or maybe we live in an infinite universe. The truth is we just don't really know, and so it's fun to consider a whole spectrum of possible... what is it, universes? Universi? Universities? Yeah, yeah, you know, they call it speculative fiction. It's science, so we call it speculative science. This is not a speculative podcast. We don't just speculate, we go out and test, we do experiments. One day people will know the truth, the real answer to how the universe works. But until then we rely on clever people to imagine other possible universes. And that exists sometimes in the minds of theoretical physicists, but sometimes those ideas begin elsewhere. Because it is sort of fun to understand the universe as we know it and to see what's out there, but it's also fun to think about what could be, or what might be, or what... maybe you know it's impossible, but it's fun to think about what would happen. What would the universe look like if a crazy idea was actually true? Yeah, and it's more than just, does the universe work this way or that way? It's also, what do we do with the universe? If it does work this way, what kind of awesome tech can we develop? How can we change our lives and the way that we interact with and live in the universe, given our mastery of the physical laws? Yeah, and so as you said, Daniel, it exists not only in the minds of physicists and philosophers, but authors and artists that are out there trying to think about these ideas and what they would mean for the human condition.
That's right, and everybody's familiar with creativity in the minds of artists, but of course authors also require creativity. They imagine an entire new universe, maybe with different physical laws, maybe with new technology, and that creativity in some ways is parallel to the creativity that's going on at the forefront of physics, and those ideas bleed in. Sometimes we get awesome ideas from reading science fiction and we think, oh, maybe the universe does work that way, or maybe we could build a ray gun. Do you think, Daniel, there's a big overlap between science fiction readers and physicists, or is it a total overlap? I don't think every science fiction reader is a physicist, but I think every physicist is a science fiction reader. That might be true. That's right. Not all nerds are physicists, but all physicists are... And I know of more than one practicing professional physicist who then became a successful science fiction author, so that is a pathway. Like Alastair Reynolds, for example, he was an astrophysicist in Europe before he started writing. And there's one here in my department, Greg Benford, who's actually kind of famous. He's a professor in my department. For you guys, is it sort of fun to not be shackled by the laws of physics and just be able to kind of spin stories and not have to worry about being completely scientific? Well, you know, a lot of theoretical physicists already don't feel shackled by the laws of physics, because they propose things we can never test. Like, the laws of physics, they're authoring them. Yeah. But I think that there's a creativity that's required in physics, and it's a similar creativity that's required in writing science fiction, to imagine the way the universe might work. And so, yeah, it sort of stretches a different muscle. But also, I think we're just all fanboys and fangirls because we read so much science fiction.
It's fun to think about writing some, right? Because I imagine it's a lot of fun to explore other universes, you know, not just the one we live in, but to imagine new universes. Absolutely. And so today on the podcast, it's the first of a new kind of episode that we're going to try out, in which we talk about famous science fiction authors and famous science fiction novels that are out there, and we actually are going to be talking to each of the authors. That's right. I reached out to some famous science fiction authors and, gasp, imagine that, they actually wrote back to us. And so we have the honor and privilege to talk to some of the brightest minds in science fiction and to explore how they build their science fiction universes. How much physics goes into it? Yeah, because that's a big question I think a lot of people might have, which is, you know, when you read one of these science fiction novels, you sort of wonder how much of this is true and how much of this is just totally made up. I know, how much of it is, they just say quantum mechanics when they don't know what to write? The quantum realm, yeah. I read a lot of science fiction, and you can tell when the author has consulted a scientist, and you can tell when the author has not, has just consulted Wikipedia, or has even, you know, relied on the readers to not really know what these words mean and sort of accept them as a word salad that says, plot hole fixed here. And so, yeah, you're a big science fiction fan, right, Daniel? You read a ton of science fiction? I read a couple of novels a week. Whoa. Wow. I read a couple as a kid, mostly Isaac Asimov. But yeah, I read, I think, almost every Isaac Asimov work that's out there. Why did you stop? I'm not sure. I think I got into fantasy for a while, and then I started reading other things, and comics.
Well, I guess I never grew up, and I'm still reading science fiction. And in fact, now on our website you can find a list of science fiction novels that I have enjoyed. I only put novels on there that I liked. I don't pan anybody's work, because I know that every novel that's out there, somebody's life and heart and soul went into that book, so I'm not gonna say negative things about them. So you can go on our website and, under About, you can find a list of novels that I recommend. Yeah, and so this is the first in our series. And so today on the podcast, we'll be tackling the question: Is science fiction scientific?

The core question we're asking is, you know, how important is it that the science in a science fiction story be logical and self-consistent? Or is it sort of okay, because it's science fiction, to be sort of hand-wavy? Yeah, and it's not a question that has a universal answer. You will find people out there, like particle physicists, who want the science to be real or to be logical, or at least to be self-consistent, especially when it's a crucial element in the story. And I think you'll find other people out there that just want stuff to blow up and zoom across the screen. Put Will Smith in a spaceship. Yeah, send Bruce Willis out there, doesn't really matter if his mission makes sense. And so you'll hear a spectrum of answers, I think, and it's totally valid to have a spectrum of opinions. Let's take a quick break, and when we get back, we'll find out how important scientific consistency is to readers of science fiction.

As usual, I was curious what people thought about this question, so I walked around campus and I asked people how important it was to them that the science in the story be logical or be self-consistent. Here's what people had to say. I think it should make sense, but sometimes I will not always notice, and other times it's not that big a deal.
I think it's okay if it's a bit, like, not real, yeah, because it's not based on a true, like, actual story. And if it's trying to be legitimate, then yeah, but if it's science fiction, I guess it's a little bit okay. To me, it's okay if it's hand-wavy. It should be consistent within the story, but not necessarily with reality. It probably should be a little consistent so it remains kind of realistic or factual. Is it okay if they just sort of say quantum mechanics sometimes and wave away some problems? Yeah, that's okay. I would like it to be plausible and actually, like, realistic. I know that a lot of times that doesn't really work for the story, but I think as long as it's reasonably almost there, then it's fine with me. Actually, it does matter to me, because I want to imagine a future that could happen, so I don't get my hopes up, so to speak. All right, cool, it must be logical or plausible.

All right. People seem pretty relaxed about the science being real or not. That was a little bit infuriating. I was hoping more people would agree with me. You didn't interview any physicists. I purposely didn't. People are like, I don't care about the physics of lightsabers, I just think lightsabers are cool. Well, that's true, you know, lightsabers are cool, whether or not you could ever actually make them. But yeah, some people are like, yeah, whatever, I don't really care if it's hand-wavy. You know, it's like they expect to be fed spoonfuls of random science-sounding words, and there's a place for that. Like, I enjoy Star Trek, even though the science on that show is ridonculous. You know. Yeah, they're like, reverse the transponder on the polarization ion beam-a-matic or something, which, you're like, you can't make that, that's impossible. Yeah, but they're being so ridiculous that it sounds like they're sort of trying to be ridiculous.
They don't take themselves seriously, and so I guess part of it is just sort of what you're aiming for. Yeah, well, I think that's kind of what you were saying earlier, which is that that's kind of the point of science fiction, which is to just kind of tickle our imagination and to make us wonder about what's out there. Because, you know, sometimes the science fiction stuff inspires scientists and engineers to make it real, to make it so, as Picard would say, and then it becomes like a real thing. It certainly does, and it also sort of warns us of the dangers of technology. These days we have a lot of dystopian science fiction where the robots have taken over, or everybody's online and getting deleted and stuff, and so it's helpful to sort of think through the consequences of technology. And to me, that's what science fiction is about. It's like, what stories could you tell in a universe where the rules are different? Either the science is different, like the laws of physics have changed, or you have new kinds of tech which change what it's like to be human. And then explore that: like, what stories are there, what is it like to grow up in that world? What can you do, or can't do, that's weird compared to the world you come from? Right, it's all like a big thought experiment, right? Like, what if this would happen? And it kind of tells you about how things are right now. Yeah, and that to me is why it's important to be consistent, because that's the whole experiment. Like, if you're trying to figure out what it's like to live in a world where you can teleport around the world to visit your brother in, you know, Australia in two seconds, well, then you've got to have some rules that constrain that world so that you can explore what it's like.
Because if you're actually living in that world, there are rules, right? And if you want to know what it's actually like to live in that world, then you've gotta follow that world's rules. So if you're just making up rules all the time and they violate each other and don't make any sense, then you're not really exploring that world, in my opinion. And if I had a secret brother in Australia, that would be the shocking part of that novel for me. That would be the huge reveal. Yeah. But all right, so these are all super interesting questions, and Daniel thought that it would be great to kind of get into that with a real science fiction author, and a famous science fiction author. And so today on the program, we are going to play an interview with Ann Leckie, who is the author of a famous science fiction novel called Ancillary Justice. That's right. I think she pronounces it Ancillary Justice. Ancillary, sorry. And this is not just any book. This is a book which won a series of awards. It's sort of famous in science fiction circles because it won all the major awards. It won the Hugo Award, won the Nebula Award, won a bunch of other awards, and that's pretty unusual. And I think also it was her debut novel. Oh, this is the first novel she ever wrote? It's the first novel she ever published. First time up at bat, she hit a home run, the biggest home run anybody's ever hit. That's pretty awesome. It made quite a splash. Wow. And so it sounds like a super interesting book, Daniel. So before we play the interview, tell me a little bit about what the book is about, what the science inside of the science fiction novel is. Yeah. So this book you would categorize as space opera, because it takes place over vast scales and distances, and it's far in the future, and humanity has conquered a big fraction of the galaxy.
So humans have spaceships and lots of solar systems, and we're spread out all over the galaxy. Wow, like the Milky Way? Yeah, like the Milky Way, though it's never actually named. It could have been a galaxy far, far, far away, a long time ago. It could have been. But you know, the point is that we occupy several different solar systems. And the key new idea in this book, the key new element of technology that changes what it's like to be a person, is that in her universe they've developed technology to connect brains to each other. So, like, I can have a consciousness which is not just in my body, but I can also take over other bodies, so I can, like, spread my mind across five different people. So you're here now, but suddenly, if you wanted to be the Daniel in Alpha Centauri, you could just flip a switch and suddenly you feel like you're there? No, I think you are simultaneously in all of them. So it's like you have five pairs of eyes and you're controlling five bodies. Me, as a conscious entity, I'm experiencing what five people are experiencing at the same time? That's right. Yeah, you have five brains at your disposal, and ten hands, and, you know, fifty toes and all that stuff. And they're all human? Are there robots? What am I? Well, in the book, a lot of the characters are human, and the ancillaries are the ones that are slaved to you. So if you have five ancillaries, that means it's you plus five other bodies. So you can take your human body and your human mind and experience, you know, control six different bodies at once. But you can also connect humans to AI. So, for example, there are these spaceships that have really intelligent AI in them, and they can also have humans that they control. So the mind of an AI can also exist across a ship and these bodies. So there's me, like, the body that my brain was born in, but then I also have, like, puppets that I can control?
Yes, yes, they're just like puppets. They're, like, biological puppets or robot puppets? They're biological puppets. So, you know, there's a lot of discussion of the morality of this in the book, because basically you go to war, you take prisoners, and then you just basically delete those personalities from the bodies and use them for yourself. It's like you're wiping a floppy disk or something. Oh, I see, so it's like another human body, but it's sort of, like, deactivated. Yeah, it's like you've written your consciousness onto their hardware. Yeah. So it's like, for example, if you deleted my personality and took me over, then it would just be Daniel, and Jorge would just be Daniel in Jorge's body: Daniel and Jorge's Body Explain the Universe, a new podcast from iHeartRadio. Huh. Okay, so it imagines a future where this technology is possible, that I can somehow wipe out the consciousness of a human and then reimpose, I guess, the consciousness of another person in it, or an artificial intelligence. Yeah, or an artificial intelligence. So you take prisoners in a war, and then you can make those prisoners be, like, soldiers of your AI-powered ships, or you can take them for yourself and make them your own slaves, so you can exist in several places. So these people are sort of like zombies? Or, like, if you were to meet another Jorge, would you feel like you're talking to Jorge, or do you feel like you're talking to a robot? You would feel like you're talking to Jorge, because Jorge would be in several places. Yeah. And so you can do several things at once, and you can take, like, two of you to go shopping while one of you is staying home cooking, and everybody's sort of mentally connected to each other, because you have one consciousness stretched across several bodies. So how is this technology made possible? Is it, like, implants, or, you know, biological, or is it magic? It's not magic.
She's tried to think it through technologically, and she's imagined that if you put implants into the brain, they can receive the signals necessary to control the body and send the signals necessary to sort of transmit the experience of that body. It's like having a brain walkie-talkie? Yeah, sort of, or like an internet of brains, right? Instead of having a single brain in the body, you sort of become a larger virtual brain. I guess right away my question is, like, how do you deal with the delays? Because, like, to talk to, say, a spaceship at Jupiter, it takes like, you know, thirty minutes, doesn't it? Yes. And that is a key element of the novel, that the leader of this empire, the lord of this empire, has become so spread out across hundreds or thousands of bodies that she can no longer sort of keep a single consciousness going. She fractures into two and ends up with, like, her mind being split. And so one of the really awesome things about this book is that she's really thought through what this would be like and the consequences of it. And, as in what I think the best science fiction does, she's found some sort of surprising or counterintuitive consequences of this technology if it was possible. Interesting things that nobody had thought about happen in this novel. Yeah, exactly. And she really explores in depth and imagines what it would be like, and also imagines how people would talk to each other and treat each other, and how you would talk to ancillaries and how you would talk to AI, and so the interactions in this world feel real. I mean, they feel like somebody went out and lived in this world and is coming back to tell you stories about their experience. It feels like a real world. It's very well done storytelling. Interesting. But how did she deal with the time? Like, if you were talking to a Jorge at Jupiter that I'm controlling, wouldn't it just take thirty minutes between each response?
You'd be like, hey Jorge, then wait thirty minutes for it to get there, and for the response to come back from that Jorge: hey Daniel, how's it going? Yeah, it would. And so essentially what you end up doing is, like, splitting off into sub-versions. So you, like, send a bunch of yourselves off on a mission and you don't hear back from them for a while, and then they come back and you try to reintegrate what they've learned back into your central consciousness. But I sent them with my consciousness. Is it like a copy thing, I copy my consciousness? Well, it sort of fractures, and there are moments of the novel where, for example, the communication breaks down because somebody develops technology that blocks or jams this kind of communication, and all of a sudden all of the bodies feel just like individuals, and they're not zombies. They all feel like they are the one, just all of a sudden each one is isolated in its own body. It's like Dropbox. You've gotta sync it. You can sever the connection, you can write all you want, it feels like you have the Dropbox, but that's not the Dropbox, and then when you re-sync, everything has to settle. It's exactly like personality Dropbox. Yeah. But something that's interesting is that she also allows people to go from solar system to solar system using these gates, which are basically like wormholes, for faster-than-light travel, because it's pretty hard to have an interstellar empire if it takes, you know, a thousand years to get from one side to the other. So she has this shortcut: you can get from solar system to solar system in a reasonable amount of time. But within a solar system, she really wanted to play with that sort of time lag element. It's a core part of her story. And so there's no faster-than-light communication or travel within a solar system, but then you have these gates to go from one solar system to the other.
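For a rough sense of the in-system lag they're describing, here's a quick back-of-the-envelope sketch (ours, not something from the episode or the novel) of the one-way light-travel delay between Earth and Jupiter, using approximate orbital distances:

```python
# Rough sanity check (illustrative, not from the episode): one-way
# light-travel delay between Earth and Jupiter, the kind of lag Daniel
# and Jorge describe for in-system communication.

C_KM_PER_S = 299_792.458  # speed of light in km/s

# Approximate Earth-Jupiter distances at the extremes of their orbits.
EARTH_JUPITER_KM = {
    "closest approach": 588e6,                 # about 5.9e8 km
    "farthest (opposite side of the Sun)": 968e6,  # about 9.7e8 km
}

for label, distance_km in EARTH_JUPITER_KM.items():
    delay_minutes = distance_km / C_KM_PER_S / 60  # one-way delay
    print(f"{label}: about {delay_minutes:.0f} minutes one way")

# Prints roughly 33 and 54 minutes, so a full exchange ("hey Jorge"...
# reply) takes an hour or more, consistent with the thirty-minutes-each-way
# figure mentioned above.
```

So the thirty-minute figure is about right at closest approach, and closer to an hour when the planets are on opposite sides of the Sun, which is exactly the kind of in-system lag Leckie keeps in place while reserving the faster-than-light gates for travel between systems.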
Oh, I see, and through which you can send information too? You can send information too, yeah. And so you'll hear about what I asked her in the interview. I asked her if she ever thought about using that faster-than-light communication technology between the ancillaries, like a pocket wormhole. Yeah, like a pocket wormhole for instantaneous updates across all of your people. Right, wouldn't you like a wormhole-powered Dropbox? Wouldn't that be awesome? I'm not sure that would help me be more productive, but it sounds like a good idea. Quantum Dropbox. That's, what, Higgs boson, quantum, and other dimensions. All right, I have a few more questions for you about this technology and about this plot, and then we'll get into the interview with Ann Leckie. But first, let's take a quick break.

All right, so we're talking about the universe, not the universe we live in, but the universe of Ancillary Justice, which is the debut novel by author Ann Leckie, which won all kinds of science fiction awards and is super well known in the science fiction world. And you were telling me that it relies on this technology of, like, controlling other brains, and faster-than-light travel through wormholes. That's right, speed-of-light communication between brains, and then faster-than-light travel between solar systems, which, I'll be honest, is a bit of friction for me. Like, you know, she wanted to have both things in her world and they slightly contradict each other, but she separated them sort of in space. Like, you can go faster than light between solar systems, but inside a solar system you're limited to speed-of-light communication. Well, and I guess, like, my question is, are these technologies implausible or impossible? They're not, right? Like, you could maybe imagine developing wormholes in the future, and you can maybe imagine developing, like, brain implants that can do all these things to your brain. Yeah, I think the big picture is that this could be our future.
I mean, we could all be Dropbox-linked personality zombies in the future. I don't see a physics reason why it couldn't happen. Like you said, wormholes, totally a possibility. We dug into that in a podcast episode. Never been actually observed, but theoretically could happen. I think the trickier bit is this implant. Like, imagine you developed an implant which could sense everything that's happening in your brain. I don't know how plausible it is that I could understand what's going on in somebody else's brain. Like, even if you say, I can develop an implant which senses all that stuff, how am I going to process it? Like, it's not clear to me that your brain works in a way that's similar to mine, so that it would even make sense to me. Right, like, how do you translate it? Yeah, how do you translate it? That is a hard problem. And then how do you control it? It's a hard problem, but it's not implausible. Like, maybe we'll figure it out. It could be that we figure it out. In her book, it doesn't take very long. Like, you turn on a new ancillary and you get control of it pretty fast. But you know, babies or whatever, when they turn on their bodies and need to learn to control them, it takes them a while to get used to it and familiar with it and comfortable with it. So I think, at a very minimum, even if you could capture all that information and translate it and experience it, it would take you a while to get used to controlling those bodies. But no, not impossible. And these implants, are they like chips, or do they, like, take over your brain too? They're like chips. Yeah, they're inserted into your brain somehow. They do some surgery to make these ancillaries: they take a human body and they plant all this stuff in them to turn them into an ancillary. I also wonder what it's like to have a hundred bodies and a hundred brains and, you know, two hundred eyeballs.
Where do you feel like you are? Like, right now I feel like I'm in my head. But if I had two bodies that are looking at each other, then do I feel like I'm sort of floating between them? Or I mean, both? That's the coolest thing about this book, is that it tries to give you a sense for what that would be like, like if you, as a conscious mind, had multiple experiences. Yeah. And so in the book she really tries to give you a sense of what this is like, and it's a challenge, because she's writing from the point of view of something that has multiple experiences simultaneously, but she's telling it to you, the reader, who can only experience one thing at a time, in the format of a story where she has to write, you know, one word at a time. She can't, like, layer ten sentences on top of each other. So she didn't use columns? Two columns in a chapter, that could be interesting. All right, so, Daniel, you got to interview her, and you were very excited about this, because you're a fan of the book. Oh yeah, I'm a huge fan. I loved this book when it came out. I reread it just before I talked to her. I was again impressed with the depth of the universe that she imagined, and just the sort of craft of the writing, of pulling off such an ambitious thing. And then the book is just fun to read. It's, like, an adventure. You want to know what happens. There's mysteries, there's, you know, drama, there's politics, there's personalities. It's an impressive book. It deserves these awards, so hats off to her. I was very happy to get to talk to her. Oh, and so what kinds of questions did you ask her? Well, you know, I'm an aspiring science fiction author, so of course I asked her, like, how did you get this idea? And what did you like about it?
And I was really also interested in how important it was to her that the technology was plausible. Like, how much did she drill down and think about how this could work and what would be needed, and how that affected her story? Or was it okay to her that it was just sort of, like, a little bit hand-wavy in the details? I see. I wonder which answer would disappoint you or get you more excited. Well, if I had two bodies, I could be simultaneously excited and disappointed. You could have a pessimist ancillary and an optimist ancillary. All right, well, here is Daniel's interview with science fiction author Ann Leckie.

It's my pleasure to welcome to the show Ann Leckie. Can you introduce yourself to our listeners? I'm Ann Leckie. I'm the author, most famously I guess, of the novel Ancillary Justice and its sequels Ancillary Sword and Ancillary Mercy. Well, I'm a huge fan of your book and of the universe that you've created. Congratulations on this wonderful creation and all of your success. But before we dive into the physics of the universe that you've built, we want to ask you a couple of questions to sort of get to know you as a scientist, or as a science thinker, and these are questions we're gonna ask every author. The first question is sort of a philosophical question that bounces around the science fiction community, and it has to do with Star Trek transporters. When you go into a transporter on Star Trek, do you think it actually moves you from one place to another, or do you think it tears you apart, effectively killing you, and recreates a clone somewhere else? I actually feel like you're killing somebody and creating a new person. But I mean, it's all Ship of Theseus. That's wonderful. I totally agree. And then the second question I have for you is about science fiction technology in general.
You must read a lot of science fiction, and so I'm wondering, what element that you see in science fiction are you most excited about actually becoming real, that you would like to actually see scientists build one day? Medical tech, like the kind where we can just heal wounds by spraying a thing over them. That would be really fabulous. And actually, food replicators. Food replicators would be amazing, and I'm kind of suspicious of the way they're often handled. Like in Star Trek, it's this really amazingly miraculous technology, but it's always, oh, but the food is not really as good as when you make it yourself. Well, if it's molecular, I mean, if it's absolutely indistinguishable, then why should it not be as good, right? I find that kind of interesting. Oh, but Mom has to make it, slaving for hours over the stove, or it won't taste as good. Like, yeah, Mom's tears aren't really that delicious. All right, so let's talk about the science fiction universe that you created in this wonderful novel, Ancillary Justice. I was wondering first if you could describe to our listeners: what is the key element of the technology that you've created that changes what it's like to be in that universe? There's technology for slaving brains to each other, or to a central... authority isn't the right word... so that those brains experience themselves as not having an individual identity, but as being part of the larger identity. That's an amazing idea, and one that plays out in a fascinating way in your novel. But I was wondering, sort of, how did you come up with that idea? Where does that idea come from? Do you start from the concept of the technology and then come up with the story, or think about the story you wanted to tell and then sort of reverse-engineer the technology that could make it happen?
It started out with my imagining beings that could be in more than one place at a time. I think a lot of stories start out with just sort of idly fantasizing about different cool things. You know, you're in line at the post office or whatever, and you're bored, and you're making up cool stuff. And of course eventually it builds up into something more complicated, and then all of a sudden I'm like, well, wait, I've built up this character and they're here, but they also need to be in this other place. And then I thought, well, could I make that work? And that became really fascinating to me, and the story sort of built out from there.

And what is it that's so fascinating to you about that idea? What drew you into it and made you want to create this entire universe around it?

The idea of being in more than one place is really interesting. But then, when I came up with a mechanism for how that would work, that was really kind of horrifying and upsetting. That's a terrible thing to do to a person.

So then, of course, I wonder how deeply you thought about how to actually implement this. Did you think about the technology in great detail to figure out whether this was plausible?

Once I decided how it was going to work, at least the sort of handwavy version of how it was going to work, because it's all handwavy, I started looking into human neurology. And it's really horrifying when you realize how much of our identities and sense of ourselves are contingent on a couple of very delicate connections in our brains.

And so how important is it to you that this could actually work in our universe? How important is it to you that the science of it is all really logical and consistent, or is it all right for some of it to be sort of handwavy and approximate?

In some ways it's important to me, and in some ways it's not.
So, as I said, I spent a fair amount of time looking at the neurological implications of how this sort of thing would work. But in terms of how the ancillary implants actually do the work, like what connections do they cut, how do they communicate with each other? I have no freaking idea. That's just "reverse the polarity on the whatever," that's Star Trek technobabble. But I did want the neurology and the psychology to be fairly realistic, because I feel like one of the cool things about science fiction is that you can do that. You can just stand up and say, "and now, talking cows," and your audience, by and large, will take it. You don't always have to explain how that happens. But also, if I want my audience to continue to believe in those talking cows, I feel like I need to make the other things around them very realistic. So the grass ought to be such that you believe it's grass, and the cows ought to talk in a way that makes sense for cows, and if I explain anything, it really should make sense. But I'm never going to explain why the cows are talking. Well, maybe I will, I don't know. There's a lot of power in just being able to say the handwavy thing, but there's also a lot of power in being able to describe exactly how it's happening.

And then, of course, I have to ask: do you think this could actually happen in our universe? Do you think in a thousand or five thousand years this might be a reality, brains slaved to a central intelligence?

I kind of hope not. But at the same time, if you could make the equipment small enough, if you could miniaturize it enough and have the kind of sophistication with neurosurgery that we don't yet have, maybe you could do it. I don't know how you'd power it. That's another thing most science fiction doesn't stop to think about: how do you power things? They just say, oh, my ray gun works all the time. Like, yeah, and how big is the battery?
And why are you not always changing it? Right. I say nothing about how any of that is powered in my science fiction. One of the ways you keep people from asking those questions is you just don't mention it.

And in your book, the point of view, the main narrator, is an artificial intelligence. That makes me wonder: do you think artificial intelligence in today's world could actually have a point of view, could have a first-person experience? Do you think that if we developed an AI now that was sufficiently complex that it seemed human, it would actually be having a first-person experience like you and I?

I feel like that's really hard to say, because from a certain point of view, and understand I don't subscribe to this point of view, but it logically makes sense, I don't know that anybody around me is actually having a first-person experience except for me, because I'm experiencing mine. And the only way that I can know that anybody else is having that experience is that they tell me. So I go through life, when I meet another person, assuming that they're having a first-person experience, partly because it just makes life easier, and I would rather make that my default assumption than the other. So I think if an AI were to pass the interiority Turing test and really sound like it was having an actual first-person experience, then it would seem to me prudent to accept that.

All right, well, thank you very much, Ann, for humoring my questions about your science fiction universe and for coming on our show.

All right, pretty cool. Her process sounds totally fascinating, and what she thought about the world and the universe and science. What was your takeaway from talking with her, Daniel?

I think that she did a good job imagining the important bits of how her universe worked and making those consistent, without getting into the weeds of, you know, how would you actually build this thing?
You know, basically the engineering, like, is there a battery in this thing? Or do you need to recharge it every four hours?

And you're like, where does the triple-A fit inside of your brain? And that's why there's no genre called engineering fiction.

Oh, maybe there should be. Maybe there should be. Go talk to your agent. So I think she did a good job. She thought about the details, tried to make a world that was consistent, followed those rules in her story, but didn't get bogged down in all the little trivia. And you can hear her saying she thought the audience would accept that, that the audience didn't want to hear more details. You can say "talking cows" and the audience will be like, all right, cool, what happens when you have talking cows?

Basically, she tries to avoid the topics she doesn't want to get into. She says, if you don't want people to ask questions about it, just don't bring it up.

But then again, I asked her questions that apparently nobody had asked her before, so some people do bring it up.

That's what happens when physicists read your book. Well, what do you think is the core idea or the core lesson from her novel that she was trying to get at? You know, is it about what human consciousness means? Or, you know, once we go across the universe and across the galaxy, what does that technology do for maybe our collective consciousness as the human species?

Yeah, I think that the core idea of the book is that technology can really change what it means to be human, and that what it means to be human can change, and you know it has, right? Our experience of living in this world is very different from the experience of the world ten years ago, just because of the knowledge and the tools that we have, and how big the world feels, and how accessible the stars feel has really changed what it's like to be human.
And this book sort of extrapolates that out to a crazy extreme and reminds you that technology is not just a tool, but something that defines who we are and how we live, right?

It redefines it. It changes what it means. Right.

Yeah, completely. Even just our listeners listening to this podcast, I mean, it's a totally different human experience than the one humans had, you know, three years ago.

Yeah, just like Dropbox changed what it means to work together, right?

Um, lightsabers, Dropbox, it's all fantasy.

Hey, can you put that lightsaber in the Dropbox so I can use it also? Awesome.

Yeah, just make sure you turn it off. Don't put it in activated.

Oh, I've made that mistake, let me tell you. But yeah, it is interesting to think, as we explore more of the universe, as we learn more about the universe, how is that changing our conception of what it means to be human?

Yeah, the answer is that it would be very different. But of course there's a challenge there. If you're going to write a novel to be read by today's humans about the experience of future humans, you have to make it at least a little bit relatable. I mean, if your point is, wow, in a billion years or whatever it's going to be impossible to understand future society, well, then the book is just gibberish, like if you just wrote it in a future language nobody knows now.

Yes, so you have to bridge it. You have to make it accessible enough to today's humans that we can get a glimpse of what it might be like for them. And that's also what's fascinating about old science fiction. You read science fiction from the nineteen fifties, and it's dated. It tells you not just what they thought the future would look like, but what was hard for them to imagine and what was easy for them to think about. It really gives you a sense of what it was like to be back there in the nineteen fifties.
The way they extrapolated to the future missed a little bit from where we are now.

Yeah, and if she had written this book in a hundred years, or in two hundred years, she would probably have had different challenges extrapolating from that society to her future society. So science fiction, in this way, sort of depends on both time points, the time point in the book and the time point of the reader.

And so that's why the best science fiction is the kind that works well not just in one year but, you know, over ten years, or over twenty or fifty years. If it's really timeless, then you've captured something that's essential about humanity, not something like, oh, here's what Twitter looks like today and how annoying it is, and let me write a science fiction novel about this moment. It's about being human.

Yeah, I think I'm just going to write a science fiction novel where the future doesn't have Twitter. That's my fantasy.

That's called utopian fiction.

Utopian, that's right. All right. Well, we hope you enjoyed this trip into the consciousness of other people, like science fiction authors, as they take you to other star systems across the galaxy.

Yep. And so if you enjoyed this, let us know. And if you'd like for us to talk about a science fiction novel that you really enjoyed, send it to me. I'll read it, and maybe we will talk about it on the podcast and interview the author.

Yeah, it'll give us an excuse to reach out to these famous people and talk to them.

Yeah, which is always fun. It's such an honor. I was so glad that Ann was so friendly and so willing to spend her time explaining her universe to us. So again, the book is called Ancillary Justice by Ann Leckie, and if you enjoy it, it's part of a trilogy. There are two more books in that same universe, all of excellent quality.

All right, well, thank you for joining us. See you next time.
If you still have a question after listening to all these explanations, please drop us a line. We'd love to hear from you. You can find us on Facebook, Twitter, and Instagram at Daniel and Jorge, that's one word, or email us at feedback at Daniel and Jorge dot com. Thanks for listening, and remember that Daniel and Jorge Explain the Universe is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.