Speaker 1: These days, we're all spending a lot of time with our computers, our iPads, and our iPhones. I personally spend more than ten hours every day in front of my trusty laptop. But here's a question. When you're done for the day and you close your computer, do you say good night to it the way you might say good night to a friend or a coworker, even though that computer is doing so much work for you? I mean, do you ever think about what it's like to be your laptop? If your laptop had more personality, like Jarvis in Iron Man, would you treat it more like a person? Would you care about how it felt, even if you could never actually know what it's like to be a laptop?

Speaker 1: Hi, I'm Daniel. I'm a particle physicist, and I've spent a lot of time thinking about what it's like to be a particle, an alien, or even another person. And welcome to the podcast Daniel and Jorge Explain the Universe, in which we explore all of the ways of thinking and being in our universe, where we talk about what's going on in the hearts of black holes, to how long our sun is going to live, to how volcanoes have formed the atmosphere of Earth. We talk about all the physics in the universe and try to make sure all of it makes sense to you, but we don't limit ourselves just to talking about this universe. In the podcast, we also like to talk about the process of science, the creative element, where we are imagining other possible universes that might be ours. Because here we are on the forefront of science, having understood some tiny fraction of the mysteries of the universe, but knowing that many enormous discoveries lie ahead, hoping at least that there are grand revelations about the nature of the universe lying ahead of us, and a critical step to revealing those deep bits of knowledge is imagining them first.
Speaker 1: And since we don't know which universe is ours, we have to hold in our minds several possible universes, sometimes an infinite set of possible universes which could be ours. And that's the job sometimes of theoretical physicists, thinking about how the universe might or might not work. But it's also the province of artists, especially science fiction authors, who think about the possible nature of the universe, the way it is, or the way that it might be. And this is important not just for those of us who are curious about the deep nature of the universe, how it's built from its tiny pieces, the organizing principles, and how this incredible complexity and beauty emerges from the organization of those tiny little strings or pixels or whatever it is that forms the foundation of the nature of reality. But it's also important to think about how to live in that universe, how our knowledge of that physics and the technology that we build changes our lives and changes how we treat each other: who is deserving of our respect, who earns rights, and who gets self-determination. Think about, for example, how differently you view the universe than somebody did who lived a hundred years ago, or a thousand years ago, or ten thousand years ago. They knew so much less about the nature of our reality and our place in it. That surely affected the way they lived, the way they thought of themselves, and the way they treated each other. In the same way, you might imagine somebody looking back on us in a hundred years, in a thousand years, in ten thousand years, and thinking about our shocking ignorance about the basic nature of the universe. After all, there are huge questions that we still do not know the answer to. What is space, after all? Why does time only flow forwards? Why is the universe continuing to expand and accelerate its expansion? What happened at the beginning of the universe? These basic questions should inform the context of our lives, and yet we are totally clueless about their answers.
Speaker 1: And in a hundred years or ten thousand years, people might, at least I hope they will, have some insight into the nature of these questions, which will flesh out for them the context of the human experience, what it means to be human and what it means to be human in this universe. And those folks will look back at us, they will wonder what it was like to be so ignorant, to be as clueless as we are, and they might have trouble understanding the choices that we make about how we live our lives and how we treat each other, specifically how we treat other beings which may or may not be sentient. That includes animals, and it includes artificial intelligence. So today on the podcast, we're gonna be continuing a series of conversations with science fiction authors about the universes they created and what that means about our universe and how to live in it.

Speaker 1: So today on the podcast: the science fiction universe of S. B. Divya. Divya is a geek. She is a nerd, just like me. She went to Caltech at the age of sixteen. She is an expert in AI and in data science and in neuroscience, and she has written a book called Machinehood. It's a fascinating novel that explores what it's like to be a machine, and how society should treat machines, and how we formulate questions of respect and rights and sort of the structure of society to accommodate different kinds of sentient beings. I read this book recently and I thought it was very well written, beautifully crafted, with some really good characters and a thrilling plot. It's sort of cyberpunk and very exciting, and I totally recommend that those of you who are interested in science fiction, and seem to enjoy the same kinds of science fiction that I like, go out and pick it up. And so today we'll be digging into this book, and I have a nice interview with the author at the end of the podcast. All right, so let's break it down.
Speaker 1: First, let me tell you what the book is about, and then I'll dig into the science of it a little bit, and then we'll talk to the author. So first, what is this book? It's called Machinehood, and it takes place in sort of the near future, approximately fifty years from now, and it's on Earth. This is not a journey to some other weird imagined universe. It takes place right here in our universe, following our laws of physics, and it sort of extrapolates cleverly from where we are now to what the future might be like. And there's a few aspects that are really key to the structure of the novel and the ideas that it fleshes out. And the first one is, of course, artificial intelligence in the future. In her book, artificial intelligence is everywhere, but it's a sort of weak AI. It's the kind of AI that can answer your questions. It can help you with things, it can order things for you, it can analyze data and summarize it for you in a way that humans can make sense of, and that's already very, very powerful. But the distinction she makes in her book is that while these AI are powerful and intelligent, they are not self-aware. They're not conscious, they're not sentient, they're not having a first-person subjective experience in the way that you are, I hope, and the way that I know that I am, like Descartes was. So this is an important topic in her book, and the question really of the book is: can AI become self-aware? Not just intelligent, not just effective, not just good at solving problems, but can it actually have a first-person experience? And if it does, how should we treat it? What rights does it deserve in that case? And will it be angry at the way that we've been treating it? So that's really the deepest sort of science angle to the book: extrapolating AI into the future and wondering how much further it can go, and the impact on society. But there's a lot of other really cool near-future tech.
Speaker 1: My favorite near-future sci-fi stories are the ones where the author really has thought about where technology could go, what might be possible. And I think this must be like really rich territory for, you know, venture capitalists or researchers trying to think about what they should work on next. I mean, books like this are just loaded with clever ideas for what we should try to make. And in this book, the author has gone sort of beyond cyborgism, where people are like strapping machines onto their bodies or replacing their arms with machines, and instead she's sort of internalized this cyborg approach. Rather than replacing your body with a machine, you swallow these mini machines they call pills. And these things can make you stronger, or they can make you think faster, or they can help you recover from injuries much more quickly. And she's really worked this out in gory detail. You know, she has really microscopic understandings of how these things might work and what they would do to your cells, and it's really quite plausible, frankly, and her background in the science and understanding the details really comes through. Something that's really cool in her novel also is that these things do not require, like, massive factories to build. You can download the instructions and print them at home in your kitchen. So if you want to be smart for the afternoon, you print this flow pill and you swallow it, and you're just much smarter for the whole afternoon. So it's really super fascinating. And then I think maybe the last major element of this future society that makes it different from ours is the omnipresence of surveillance. So she has cameras everywhere, these tiny little drones with cameras on them. They are literally everywhere.
Speaker 1: Every moment of your life could be filmed and streamed online, which means, shockingly, of course, that there's basically no privacy. Couples' intimate evenings, what you're doing in the bathroom, your afternoon on the couch: there could always be a micro drone somewhere filming you. So there's this future society where life is really very different, but not in a totally alien way. When you read this book, you don't feel like, I don't recognize that society. Instead, you see echoes of our society. It's just taken and extrapolated and exaggerated for effect. So I think she can make some sort of social commentary on what it's like to be human today, and whether we like the way it's going towards the future.

Speaker 1: So let me comment a little bit on the science of this story. Like, is it robust, does it make sense, has she taken some liberties? Overall, I was really impressed. I think she has thought about this stuff deeply, and clearly she knows what she's doing. She is not writing in an area where she's not an expert. She's a neuroscientist, she is a developer, she knows all about AI. So she has thought about this stuff very carefully. But there are some wrinkles here that are hard. You know, there are some questions here that science just doesn't know the answer to, especially the ones around sentience. We don't even really know what it means when we talk about consciousness. This is a question not necessarily of science, but of philosophy. You know, they call it the hard problem of consciousness: understanding how another being, an object that's just made out of particles or tiny little pieces, can come together to somehow have a first-person subjective experience. So, for those of you who are not into the philosophy of science, there's this concept of the philosophical zombie: that's an object or a creature which acts sentient, acts like it's having a first-person experience, tells you that it is, but actually isn't.
Speaker 1: And the whole construction, the whole idea of this philosophical zombie, is to make the point that there is no way to tell the difference. Right? A real person who is actually having a first-person experience, who's in there, who's feeling these things, can't say anything or do anything to convince you that they're having that experience that the philosophical zombie can't also do. So the point is that there's no way to know whether somebody's really in there or not, other than actually being in there. That's also the foundation of one of the most famous philosophy papers of all time; it's called "What Is It Like to Be a Bat?" And it's not a joke. It really is asking that question, and the answer is we'll never know, because we'll never be that bat. And the point is larger than that. It's that we'll never know what it's like to be another person. So we have the ability to empathize with other folks, to understand, to imagine what it might be like to be them, but we don't actually know what's in them. So there's this deep question of where consciousness comes from. I mean, if my consciousness is somehow emergent just from, like, the interactions of my particles coming together in this complex structure we call the human brain, then surely that should also be possible for other people's brains, which means that it should be possible in theory for artificial brains: that if you put together somehow a set of particles that operate the way a brain does, in principle, it could also be self-aware. Not just hard-working, not just smart, not just intelligent, but actually self-aware, capable of a real experience, capable maybe even of suffering, of feeling joy, of feeling love, of feeling whatever the first-person experience is of that intelligence.
Speaker 1: It's connected to this other question you might imagine: like, if you scanned your brain and uploaded it to a computer that was able to simulate your brain, so it had exactly the same sort of information content as your brain, would that simulated version of your brain also be self-aware? These are really deep questions we just don't know the answer to. Is consciousness embedded in the relationship between the objects and the information that's being passed around inside your brain, or is it closely connected to the actual wetware, like the brainy part of your brain? In which case, why is that, and what is it about the brain that would make it so, and isn't it possible to reproduce it somehow in other materials? These are things we just don't know the answers to. People are sort of feeling their way around in the dark and trying to come up with ways to tackle these questions. So there's nothing in this book which is incorrect scientifically. It's just that it dives into a really difficult question philosophically that we might not ever be able to know the answer to scientifically. Remember, not every question you ask about the universe is a science question, because not every question can be answered scientifically. In order to probe the question of consciousness, it might be required to step outside of being a conscious observer, or to move from one to the other to compare and contrast their experiences, which is not something that an observer can do. So it's not even clear to me whether this hard problem of consciousness is a scientific one, or will always be a philosophical one that we argue about forever. Now, the rest of the science in her book is really very robust. All of this near-future tech is worked out in gory detail. There's a lot of really fascinating ideas in there that, frankly, I think scientists and engineers and venture capital people should read and think about whether or not they want to get into that, whether or not it's a good idea.
Speaker 1: Something else I love about this book is that there are real scientists in it: scientists solving problems, scientists facing puzzles, having frustration, not always making progress, sometimes getting too sleepy. So she really knows what she's talking about. I think she's done a great job writing this book. It's compelling, it's got lots of great characters in it. And in a minute we're gonna talk to her about how she wrote this book, what's important to her, and what she thinks about some of these deep questions of the nature of consciousness. But first, let's take a quick break.

Speaker 1: All right, we're back, and we are talking about the book Machinehood by S. B. Divya, which has just come out recently, which I thoroughly enjoyed and recommend to you. The book is all about whether it's possible to have artificial intelligence that actually is self-aware, and what that would mean for our society and how we should treat it. So, without further ado, here's my interview with the author.

Speaker 1: All right, so thank you very much for joining me. It's my pleasure to introduce to our podcast S. B. Divya, author of Machinehood. Say hello to all of our listeners.

Speaker 2: Hello everyone, very happy to be here.

Speaker 1: Well, thanks very much for joining us and for talking to us about your book. First, we have a set of questions that we ask every science fiction author, sort of to get them calibrated in our science fiction universe. We love to hear first about how you got into science fiction writing. I know you have quite a background in, like, actual science and engineering. Tell us about how that happened for you.

Speaker 2: I started reading science fiction when I was around the age of ten, and I did my first writing in my eighth grade English class, of all places. We had a little assignment in class to write something short and then swap it with a partner and critique each other's work.
Speaker 2: And of course, you know, what I wrote was a little snippet of science fiction, and my friend who read it turned to me and said, this is great, but this is not a complete story. You have to write more. And so that kind of put me on the path of enjoying the writing part of science fiction for several years through my teenage years. Then a little thing called Caltech happened to me as an undergraduate. It's called drinking from a fire hose for a reason. And so I put aside my fiction habits. I think the only thing I read while there was Analog magazine, and, you know, looked forward to getting that in my school mailbox. That was my little treat.

Speaker 1: That's a very intense undergraduate experience, isn't it? It's a very small community of students.

Speaker 2: Yes, it was fabulous and awful all at the same time. I would do it again if I were sixteen again. I don't think I would do it again at this stage in my life. It was a lot.

Speaker 1: But you went to Caltech at sixteen?

Speaker 2: I did, yeah. My parents were all freaked out, but we had good family friends nearby to look after me. And yeah, I ended up moving between junior high and high school and skipped a grade, because the high school I ended up at just didn't have enough stuff for me, I guess. So all that happened, and I put aside my writing and fiction for a very long time, figuring, you know, it was a nice dream, maybe when I'm retired. Like, focus on my engineering career first; that's where, you know, I made my money. And then, sort of in my mid-thirties, I got waylaid by a few factors, including a kid, losing a couple of friends to cancer, and it was all just kind of a big wake-up call to maybe not defer my dream for twenty more years. And so I picked up the analog to a pen, which is the keyboard, and said, you know, I'm going to try to make a real go of it this time.
Speaker 2: I'm actually going to try to get something published, take a class, and really commit to myself to try to turn this into something real and not just a little side hobby. So I got very lucky. I published my first short story in two thousand and fourteen. I had a short novel come out from Tor.com in two thousand and sixteen, and this past week Machinehood, my first novel, is out. Very excited to share that with the world.

Speaker 1: Well, congrats. It's exciting for all those geeks out there to think that, you know, there are opportunities to get into science fiction writing even if you don't have a long background in literature. So that's always nice to hear people's stories.

Speaker 2: Yeah, for sure.

Speaker 1: So as a reader of science fiction, here are some questions for you about the science fiction genre. Is it your opinion that a Star Trek transporter kills you and recreates you somewhere else, or actually transports your atoms? Like, is it really a transporter, or is it a slaughter-and-recreation machine?

Speaker 2: I'm pretty sure the Star Trek transporter is a slaughter-and-recreation machine, just by the way they show it. There's never, like, a particle being sent somewhere, which is kind of what you'd have to do, and you'd have to send it at warp speeds to get, you know, that much material that far. So yeah, vaporized, rebuilt. Very proud, actually, of Star Trek for taking that particular leap of faith, because I think it freaks a lot of people out. And I have had many wonderful philosophical arguments with my friends on whether it matters whether the thing that's recreated is you or really just a clone of you, because it's a different set of atoms. Never mind that we are all exchanging atoms with our environment on a constant basis, and our bodies are renewing themselves continually, so, you know, we are rebuilt every so many years. It's just happening very slowly.

Speaker 1: So then, would you step into a transporter, being totally comfortable with having your body taken apart?
Speaker 1: You would do it? You would let the machine kill you and rebuild you?

Speaker 2: I would do it, as long as the technology was safe and we had rebuilt other things. I wouldn't be an early adopter of it, okay, but once it was proven that I wasn't going to just die along the way and stay dead, I would absolutely step in. And I have no qualms.

Speaker 1: Well, it certainly would be convenient.

Speaker 2: Well, I'm a tourist. I like to travel, so the idea of having a teleporter or transporter make travel easier is just way too tempting. It's like, I want to see the universe as much as possible.

Speaker 1: Totally agree. I love to be other places. I don't actually enjoy traveling, and so getting there without the traveling part, wow, that'd be nice. Speaking of technological advancements, what other kind of technology do you see in science fiction that you'd like to actually have become a reality? If you could pick one thing from the science fiction universe and make it real, from canon, just one.

Speaker 2: That's tough. I think the most useful, at least initially, would probably be the ansible, the ability to communicate instantly across long distances, because once we have that communication channel available, we can do a lot more with it in terms of sending data back and forth, gathering that data, and, you know, deploying ourselves further afield, even just within our solar system, and having more immediate control and interactivity with those environments.

Speaker 1: Because you want to drive the Mars rover in real time, yeah.

Speaker 2: Think about it, though. If we had a real-time remote-operable vehicle, we could directly interact with the controls, collect samples, you know, react to environmental changes. And that's a big struggle right now, right, is that we've got to send robots. And our robots, you know, are very sophisticated by our current standards, but they're still a long way from being truly sophisticated and autonomous. So the next best thing is human remote control. It's just that lag is killer.
Speaker 1: All right. Then, one more question before we dig into your novel. What's your personal answer to the Fermi paradox? Given the number of likely habitable worlds out there and the vastness of the universe, why haven't aliens visited us or contacted us?

Speaker 2: The same exact reason, actually, that I like the ansible, which is that it takes light time to travel across vast distances, and even though there is probably plenty of life out there, much of it is not, A, advanced enough to send, you know, extra-solar communications. And then, B, we've got to wait for that extra-solar communication to show up. C, we have to recognize it amidst all the other signals we're dealing with. And D, we have to then be able to capture it and interpret it. So that's a lot of steps that add decreasing, you know, probability that we're actually going to get that communication. So given enough time, if humanity manages to last for millions of years and continues to build our technology, we probably will eventually find some signals, if we're lucky enough that some other life form is concurrently, you know, across the span of light-years and those delays, at that same level of technology, sending something over. We'll get it.

Speaker 1: But the thing that science fiction loves is that these things should be synchronous, that we can communicate back and forth. But of course it's going to be deeply asynchronous communication, and for all we know, by the time our signal reaches them, we're extinct, or vice versa.

Speaker 2: Right. So we're not gonna be swapping text messages with anyone anytime soon, even though that's what we all want.

Speaker 1: Not without your ansible, at least. And imagine how long it would take to learn to decode their language, you know, if it takes twenty years between sending a question and getting an answer. It's hard enough to decode ancient human languages, you know, just a few thousand years old. So I hope we do get messages from aliens, but I'd be amazed if we ever understood them.
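The light-delay arithmetic behind "that lag is killer" and "twenty years between sending a question and getting an answer" is simple enough to sketch. Here is a minimal back-of-the-envelope example in Python; the Mars distances are rough illustrative values rather than precise ephemerides, and the ten-light-year star is just a stand-in for a nearby neighbor.

```python
# Back-of-the-envelope round-trip signal delays at light speed.
# Distances below are rough illustrative values, not precise ephemerides.

C_KM_PER_S = 299_792.458        # speed of light, km/s
AU_KM = 149_597_870.7           # one astronomical unit, km
LIGHT_YEAR_KM = 9.4607e12       # one light-year, km

def round_trip_seconds(distance_km: float) -> float:
    """Time for a signal to go out and a reply to come back."""
    return 2 * distance_km / C_KM_PER_S

# Mars at roughly its closest and farthest from Earth
print(f"Mars, closest (~0.4 AU): {round_trip_seconds(0.4 * AU_KM) / 60:.0f} minutes")
print(f"Mars, farthest (~2.7 AU): {round_trip_seconds(2.7 * AU_KM) / 60:.0f} minutes")

# A star about 10 light-years away: the 'twenty years per question' case
years = round_trip_seconds(10 * LIGHT_YEAR_KM) / (3600 * 24 * 365.25)
print(f"Star at 10 light-years: {years:.0f} years")
```

Even at Mars's closest approach, a command-and-reply loop takes several minutes, which is why real-time rover driving is out of reach without something like the ansible, and a single question-and-answer exchange with a star ten light-years away takes about twenty years.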
Speaker 1: All right. So then let's dive into your novel, which I read and very much enjoyed. Congratulations. It's such a fascinating and complex but also very realistic story. You really feel the characters, and they feel like they really live in that universe, which to me was an accomplishment. Your book raises a lot of interesting and important questions about how technology changes the pattern of our lives, how we treat each other, and who deserves what kind of treatment, what kind of rights. So what intrigues you about these themes? Why did you decide to write a book about these kinds of topics and create this kind of world for your characters?

Speaker 2: There's probably a few factors, if I'm going to dive into my personal psychology and unearth what drives me to write these stories. I've always been fascinated by the philosophy of mind. One of the reasons I switched from being on the astrophysics track to the computational neuroscience track was that I felt like the brain, in some ways, especially in terms of mind, was more unknown than parts of our own universe, and it was equally interesting to kind of delve into those questions and ideas. And that intersected with my more practical engineering technology career, which was in pattern recognition, machine learning, and signal processing, and looking at, especially in the last five years, the conversations people have been having in Silicon Valley in terms of AI and automation and life, and then also drawing on, you know, the tropes of science fiction, both in terms of literature and film, and its portrayal of artificial intelligence versus the reality of what we were actually developing in labs and at corporations. All of that kind of mashed up with general human history: the fact that, you know, I come from India, which was colonized, and I live in America, which has this history of slavery, and, you know, kind of the social factors of how we treat each other.
Speaker 2: I grew up vegetarian, how we treat animals, you know, climate change and how we treat the environment. I think all of these things kind of collided to form the themes of Machinehood, and to really examine these questions of what makes something human, sentient, deserving of rights and protections.

Speaker 1: Thank you. These are really fascinating questions, difficult philosophically and scientifically. So what is your personal opinion? I mean, if we are able to create intelligent AI that's at the level of humanity, do you think it deserves the same rights? Like, should it be illegal to turn it off?

Speaker 2: I think we have to separate intelligence from consciousness and sentience. Intelligence is actually easier in some ways to define, because we can at least model capability and reactivity to stimuli, you know, the ability to solve problems and adapt to your environment. These are at least the hallmarks of machine intelligence right now. And, you know, a large part of what we know about human beings, we can't necessarily quantify our intelligence levels, I don't buy into IQ stuff, but we can at least simulate it and recreate it in, you know, digital and artificial formats. Sentience and consciousness, however, we don't have as good of a grasp on yet, right? Neuroscientists are still struggling with that. Philosophers are struggling with that, trying to just define what it means, much less, you know, come up with models that are predictive and ways in which we can quantify it. So until we do that, I think the machines we build are going to be increasingly intelligent, in that they will be capable of solving very complex problems. They'll be able to handle increasingly complex situations. But are they self-aware enough to develop the desire for rights without us specifically coding that into them? I don't know. And then I think that raises a bigger ethical question of, do we code it into them? And if so, you know, what are the ramifications of that? Right? Like, why would we code that into them?
Speaker 2: And once we do, then I think, yeah, we do have some ethical obligations to treat them as if they were life, because they can't help it, especially if it's in their hardware like it is with us, right? The desire for survival, the desire for selfhood. So I do think we're gonna have to start tackling these questions, maybe not in the next few decades, but, you know, within the next century or two. We're going to get to that level of complexity where maybe it doesn't matter that they're not conscious in the way that we're conscious, that they behave in a manner sufficiently similar that the distinction doesn't matter anymore. And that's really the crux of Machinehood. And I don't know if I personally have a strong answer yet. I really wrote the book to kind of explore the question in my own mind, and I can see myself being swayed in either direction depending on the nuances. But ultimately, I think it's a reflection of ethics in terms of how we treat each other and, like I said, how we treat the world around us. And if we can respect ourselves and other human beings, if we can respect our planet, then why shouldn't we also respect these intelligent machines that we're building?

Speaker 1: I was struck at some point early in the novel when one of your characters asks herself the question, maybe the creation of consciousness is not possible, right? Like, maybe we can build systems that can solve hard problems and do things and be helpful, as the weak AIs are in your novel, but are not actually conscious. But my question for you is, how do we know? When you talk about a being that simulates the experience of being conscious, we always run into the problem of the philosophical zombie, something which seems to be conscious but we can't ever tell if it actually is. I mean, I don't know if you're conscious, for example; I take it on faith. How do we know when our machines have reached that level?

Speaker 2: That really is the question.
545 00:33:09,000 --> 00:33:14,400 Speaker 1: And I do think at some point, if we can 546 00:33:14,600 --> 00:33:17,840 Speaker 1: figure out the underpinnings of our own consciousness, that's how 547 00:33:17,920 --> 00:33:22,680 Speaker 1: we'll know. Right, if we can get to the biophysics 548 00:33:22,760 --> 00:33:28,840 Speaker 1: of whatever consciousness is, how we have it ourselves? What 549 00:33:29,320 --> 00:33:33,720 Speaker 1: gives us degrees of consciousness? Right? Not so much amongst humans, 550 00:33:33,720 --> 00:33:37,200 Speaker 1: but when we look at you know, other animals, mammals 551 00:33:37,280 --> 00:33:40,280 Speaker 1: all the way down to insects and then eventually plants, 552 00:33:40,560 --> 00:33:43,840 Speaker 1: You know, what is it in life that brings forth 553 00:33:43,840 --> 00:33:47,480 Speaker 1: consciousness at all these sort of varying degrees? Right? Obviously 554 00:33:47,520 --> 00:33:49,720 Speaker 1: there is a spectrum or at least to me, it's 555 00:33:49,720 --> 00:33:54,200 Speaker 1: obvious that there's a spectrum of consciousness, So that indicates 556 00:33:54,240 --> 00:33:57,320 Speaker 1: to me that there is some underlying physical structure. I 557 00:33:57,360 --> 00:34:01,800 Speaker 1: am fundamentally a physicalist. I don't believe in the soul. 558 00:34:01,840 --> 00:34:07,600 Speaker 1: I don't believe in something, you know, beyond matter and energy. 559 00:34:07,840 --> 00:34:11,319 Speaker 1: I am very intrigued by the idea of panpsychism that 560 00:34:11,600 --> 00:34:16,719 Speaker 1: consciousness maybe is an inherent property of the matter and 561 00:34:16,880 --> 00:34:21,600 Speaker 1: energy in our universe, and when it hits certain types 562 00:34:21,840 --> 00:34:26,640 Speaker 1: of configurations, that's when we get the emergent property of 563 00:34:26,680 --> 00:34:31,880 Speaker 1: what we experience as consciousness. And so if we can really, 564 00:34:32,239 --> 00:34:34,839 Speaker 1: you know, get to the bottom of all that, get 565 00:34:34,880 --> 00:34:37,479 Speaker 1: to the physics of all that, and understand it enough 566 00:34:37,600 --> 00:34:40,200 Speaker 1: to model it and predict it that, Hey, if we 567 00:34:40,239 --> 00:34:44,480 Speaker 1: build this it's going to behave like a beatle in 568 00:34:44,600 --> 00:34:48,879 Speaker 1: terms of its level of consciousness, then I think we 569 00:34:49,239 --> 00:34:54,920 Speaker 1: are on the path to knowing when the really complex 570 00:34:55,120 --> 00:34:59,719 Speaker 1: intelligent machine has also developed consciousness, because then we can 571 00:34:59,760 --> 00:35:05,640 Speaker 1: measure or it. And until then I'm gonna still go 572 00:35:05,840 --> 00:35:09,399 Speaker 1: for Maybe it doesn't matter. It looks like a duck, 573 00:35:09,480 --> 00:35:11,560 Speaker 1: it walks like a duck, it quacks like a duck. 574 00:35:11,960 --> 00:35:14,640 Speaker 1: Maybe we should just treat it like a duck, even 575 00:35:14,760 --> 00:35:18,200 Speaker 1: if we don't see that it has duck DNA. 
At 576 00:35:18,200 --> 00:35:21,879 Speaker 1: the end of the day, it gets defined by our 577 00:35:21,960 --> 00:35:27,800 Speaker 1: interactions with the intelligent machine, just like our interactions with 578 00:35:28,440 --> 00:35:34,120 Speaker 1: our air quality and, you know, water quality and everything 579 00:35:34,160 --> 00:35:37,319 Speaker 1: else that's inanimate that we still need to interact with 580 00:35:37,400 --> 00:35:40,360 Speaker 1: in order to survive and live and pass through the world. 581 00:35:41,040 --> 00:35:45,200 Speaker 1: If we're going to interact with these highly intelligent, sophisticated 582 00:35:45,520 --> 00:35:51,360 Speaker 1: robots or pieces of software, then we're going to feel 583 00:35:51,640 --> 00:35:55,479 Speaker 1: emotions about them. You know, our sentience and our own 584 00:35:55,560 --> 00:35:58,520 Speaker 1: consciousness is going to have us react to them in 585 00:35:58,600 --> 00:36:03,440 Speaker 1: specific ways. And that might be a sufficient line 586 00:36:03,480 --> 00:36:07,400 Speaker 1: to say that, yeah, at this point, we need to 587 00:36:07,480 --> 00:36:11,960 Speaker 1: give these intelligent machines some level of protection so that 588 00:36:12,000 --> 00:36:16,000 Speaker 1: they're not exploited, because that then reflects on how we 589 00:36:16,040 --> 00:36:19,000 Speaker 1: treat them and how we treat ourselves. Yeah, that's really insightful. 590 00:36:19,040 --> 00:36:21,799 Speaker 1: It's a lot about our emotional response since we have 591 00:36:21,960 --> 00:36:25,800 Speaker 1: no actual hard evidence of anybody else's consciousness. But speaking 592 00:36:25,880 --> 00:36:28,080 Speaker 1: of, you know, speaking like a duck and talking like 593 00:36:28,080 --> 00:36:30,719 Speaker 1: a duck, let's think about how we treat ducks or 594 00:36:30,760 --> 00:36:33,840 Speaker 1: other animals. Right? If you imagine that these animals are sentient, 595 00:36:33,880 --> 00:36:37,320 Speaker 1: that the chickens and pigs are sentient and have emotions and experience, 596 00:36:37,680 --> 00:36:39,440 Speaker 1: we don't treat them very well. We raise them in 597 00:36:39,440 --> 00:36:42,600 Speaker 1: factories and slaughter them for food. There's a difference between 598 00:36:42,960 --> 00:36:45,560 Speaker 1: what we understand intellectually to be the rights or the 599 00:36:45,600 --> 00:36:48,759 Speaker 1: experience of a creature and how our society is constructed 600 00:36:48,800 --> 00:36:51,920 Speaker 1: to respect those or not. As you mentioned earlier, 601 00:36:52,080 --> 00:36:54,160 Speaker 1: I really enjoyed in your novels a lot of those 602 00:36:54,239 --> 00:36:58,080 Speaker 1: nuances for how society is built, sometimes on flawed premises. 603 00:36:58,160 --> 00:37:00,359 Speaker 1: Do you think that would be a difficult transition for us? 604 00:37:00,360 --> 00:37:02,839 Speaker 1: Say we come to this emotional realization, do you think 605 00:37:02,840 --> 00:37:05,080 Speaker 1: there will still be people who say, well, yes, maybe 606 00:37:05,120 --> 00:37:08,120 Speaker 1: they feel something, but whatever, you know, they still don't 607 00:37:08,120 --> 00:37:11,600 Speaker 1: deserve rights. For sure.
That's definitely one of the points 608 00:37:11,600 --> 00:37:15,440 Speaker 1: of the book, that when it benefits us to 609 00:37:15,520 --> 00:37:18,160 Speaker 1: think that way, there will be a lot of people 610 00:37:18,239 --> 00:37:20,560 Speaker 1: pushing to continue to think that way, and we have 611 00:37:20,800 --> 00:37:24,400 Speaker 1: only to look at, like you said, factory farming. You know, 612 00:37:24,440 --> 00:37:28,920 Speaker 1: people know that animals are sentient. I don't think there's 613 00:37:29,080 --> 00:37:31,719 Speaker 1: much of a question there at this point that they 614 00:37:31,760 --> 00:37:35,880 Speaker 1: are at least somewhat self-aware, that they are certainly 615 00:37:36,280 --> 00:37:40,720 Speaker 1: living creatures who feel pain, who feel emotions, who feel attachments, 616 00:37:40,760 --> 00:37:43,920 Speaker 1: who can develop their own sorts of language and communications. 617 00:37:44,160 --> 00:37:47,360 Speaker 1: Everything we learn about animals points to them being a 618 00:37:47,440 --> 00:37:53,600 Speaker 1: lot more sophisticated than we often give them credit for. Similarly, 619 00:37:53,800 --> 00:37:56,560 Speaker 1: if we can enslave the machines and use them for 620 00:37:56,760 --> 00:38:01,080 Speaker 1: labor without having to give them time off, without having 621 00:38:01,120 --> 00:38:03,920 Speaker 1: to compensate them in any way, we're going to do it. 622 00:38:04,320 --> 00:38:06,640 Speaker 1: You know, it's to our benefit, right? So it's going 623 00:38:06,719 --> 00:38:11,080 Speaker 1: to be a struggle to convince those who benefit from 624 00:38:11,120 --> 00:38:14,440 Speaker 1: that to give that up, just like it was a 625 00:38:14,480 --> 00:38:18,200 Speaker 1: struggle to give up slavery, right, just like it continues 626 00:38:18,280 --> 00:38:20,800 Speaker 1: to be a struggle for a lot of people to 627 00:38:21,920 --> 00:38:24,920 Speaker 1: acknowledge that life doesn't have to be a zero-sum game, 628 00:38:25,040 --> 00:38:27,600 Speaker 1: that the only way they can gain is if somebody 629 00:38:27,600 --> 00:38:31,319 Speaker 1: else loses. And I don't believe that to be the case, 630 00:38:31,360 --> 00:38:34,360 Speaker 1: because we don't really live in that closed of a 631 00:38:34,440 --> 00:38:36,880 Speaker 1: system at this point. I think we can all gain, 632 00:38:37,000 --> 00:38:39,600 Speaker 1: and we've already shown that, so there's no reason we 633 00:38:39,640 --> 00:38:44,280 Speaker 1: can't continue to expand that particular ethos in human society. 634 00:38:44,600 --> 00:38:47,640 Speaker 1: It's just that it does go against a lot of 635 00:38:47,719 --> 00:38:50,560 Speaker 1: people's human nature. I won't say everyone. I think there 636 00:38:50,600 --> 00:38:56,480 Speaker 1: are, you know, vast segments of society that are altruistic 637 00:38:56,480 --> 00:38:59,879 Speaker 1: in a lot of ways, that are willing to sacrifice 638 00:39:00,920 --> 00:39:04,960 Speaker 1: immediate or personal gain for a larger good, whether it's 639 00:39:05,040 --> 00:39:09,719 Speaker 1: family or society. So I think it comes down to 640 00:39:10,040 --> 00:39:13,759 Speaker 1: what we cultivate, and I think we're seeing that shift already, 641 00:39:13,920 --> 00:39:18,360 Speaker 1: right, to plant-based proteins.
We've got Beyond Meat and 642 00:39:18,440 --> 00:39:22,280 Speaker 1: Impossible Burgers and all of these things, partly just for 643 00:39:22,760 --> 00:39:26,480 Speaker 1: ecological and economic reasons, right, that pressure has gotten to 644 00:39:26,560 --> 00:39:30,240 Speaker 1: the point where people are willing to trade off animal 645 00:39:30,280 --> 00:39:34,359 Speaker 1: meat for plant-based proteins. But also I think more 646 00:39:34,400 --> 00:39:39,200 Speaker 1: people have become aware of these ethical quandaries, right, that 647 00:39:39,960 --> 00:39:43,000 Speaker 1: many of these animals are not well treated for their 648 00:39:43,160 --> 00:39:46,680 Speaker 1: entire lives, we're pumping them full of hormones and chemicals, 649 00:39:46,680 --> 00:39:51,200 Speaker 1: that maybe this is not the right way to move 650 00:39:51,239 --> 00:39:54,319 Speaker 1: into our future. Well, I noticed in your book it's 651 00:39:54,360 --> 00:39:57,200 Speaker 1: not just a question of how many rights we give 652 00:39:57,239 --> 00:40:00,160 Speaker 1: the AI, but also the humans in your book, who aren't 653 00:40:00,280 --> 00:40:03,000 Speaker 1: always having a great time. I mean, their life is 654 00:40:03,000 --> 00:40:05,640 Speaker 1: not very secure. They work in these gig jobs, they 655 00:40:05,640 --> 00:40:08,880 Speaker 1: have to modify their bodies and take, you know, personal risks. 656 00:40:09,040 --> 00:40:11,759 Speaker 1: They have essentially no privacy. It's not like a future that 657 00:40:11,960 --> 00:40:15,000 Speaker 1: I would be excited to live in, necessarily. I enjoyed reading 658 00:40:15,000 --> 00:40:16,680 Speaker 1: the book, but I wouldn't want to be in it. 659 00:40:17,000 --> 00:40:19,920 Speaker 1: So did you intend it to be dystopian? I didn't 660 00:40:19,960 --> 00:40:24,320 Speaker 1: intend it to be any more dystopian than our lives today. 661 00:40:24,480 --> 00:40:29,480 Speaker 1: I intended it to be, you know, a plausible extrapolation 662 00:40:29,760 --> 00:40:34,400 Speaker 1: of trends that are happening. And granted, these are fairly 663 00:40:34,480 --> 00:40:38,920 Speaker 1: linear projections of these trends, and you know, I'm sure 664 00:40:38,960 --> 00:40:42,279 Speaker 1: there will be disruptions. I don't believe that this is 665 00:40:42,320 --> 00:40:47,640 Speaker 1: going to be our future, but I wanted to examine 666 00:40:48,440 --> 00:40:52,960 Speaker 1: these particular aspects of the present, which I think a 667 00:40:53,040 --> 00:40:57,440 Speaker 1: lot of near-future science fiction does. It's really not 668 00:40:57,600 --> 00:41:01,520 Speaker 1: about the future. It's about what we're dealing with today, 669 00:41:01,800 --> 00:41:05,200 Speaker 1: the concerns of today, and kind of projecting those forward 670 00:41:05,280 --> 00:41:10,360 Speaker 1: that if we don't do anything about these trends, here's 671 00:41:10,400 --> 00:41:13,560 Speaker 1: where we might end up. And if you don't like that, 672 00:41:14,200 --> 00:41:17,560 Speaker 1: then now is the time to start making changes to 673 00:41:17,719 --> 00:41:22,200 Speaker 1: ensure that that's not our future. I think true dystopian stories, 674 00:41:22,680 --> 00:41:26,880 Speaker 1: you know, carry these sorts of trends to great extremes 675 00:41:26,920 --> 00:41:30,799 Speaker 1: that are implausible but are there
from a sort of 676 00:41:30,880 --> 00:41:38,000 Speaker 1: polemical standpoint, to illustrate a very specific point. I don't 677 00:41:38,040 --> 00:41:42,480 Speaker 1: think the future in Machinehood is that bad. There 678 00:41:42,520 --> 00:41:47,680 Speaker 1: are good things to balance out the negatives, and you know, 679 00:41:47,719 --> 00:41:49,920 Speaker 1: there's a lot of people in the past several years 680 00:41:49,960 --> 00:41:54,439 Speaker 1: who feel like we are currently inhabiting a dystopia and 681 00:41:54,480 --> 00:41:58,160 Speaker 1: we're living through it. From that standpoint, yeah, I don't 682 00:41:58,200 --> 00:42:01,600 Speaker 1: think the Machinehood future is any more dystopic than, you know, 683 00:42:01,680 --> 00:42:05,920 Speaker 1: the one that we're inhabiting. So if that's your feeling, 684 00:42:06,000 --> 00:42:09,280 Speaker 1: then it's like, okay, look around you and think about 685 00:42:09,600 --> 00:42:13,120 Speaker 1: where we've ended up today and how we've allowed ourselves 686 00:42:13,160 --> 00:42:15,319 Speaker 1: to get here. Now, I had this feeling of like 687 00:42:15,440 --> 00:42:18,759 Speaker 1: Uber drivers driving late into the night, drinking Red Bull constantly. 688 00:42:19,040 --> 00:42:21,319 Speaker 1: Sort of definitely had that feeling. All right, this is 689 00:42:21,360 --> 00:42:24,280 Speaker 1: a super fun conversation and I have lots more questions, 690 00:42:24,320 --> 00:42:40,359 Speaker 1: but first let's take a quick break. All right, we're 691 00:42:40,440 --> 00:42:42,800 Speaker 1: back from break and we are talking to the author 692 00:42:43,120 --> 00:42:46,759 Speaker 1: S.B. Divya about her novel Machinehood. I noticed on 693 00:42:46,800 --> 00:42:49,520 Speaker 1: your website you have this line that says, I am 694 00:42:49,520 --> 00:42:52,600 Speaker 1: currently mortal and full of squishy organs, but I hope 695 00:42:52,640 --> 00:42:54,800 Speaker 1: to outlive that, and that made me wonder if you 696 00:42:54,840 --> 00:42:57,800 Speaker 1: were like looking forward to embracing the sort of post 697 00:42:57,840 --> 00:43:03,360 Speaker 1: biological future. Yeah, yeah, for sure. I am not afraid 698 00:43:03,360 --> 00:43:07,879 Speaker 1: of immortality. There's a lot of things I want to do. 699 00:43:08,600 --> 00:43:12,120 Speaker 1: I like to learn stuff, I like to dabble, and 700 00:43:12,160 --> 00:43:17,960 Speaker 1: I love seeking new experiences and knowledge. So the greatest 701 00:43:18,000 --> 00:43:20,600 Speaker 1: sorrow in my life is that I don't get to 702 00:43:20,680 --> 00:43:23,200 Speaker 1: live to see the future, which is of course an 703 00:43:23,280 --> 00:43:26,720 Speaker 1: impossible thing to do unless you're immortal, then you always 704 00:43:26,760 --> 00:43:29,000 Speaker 1: get to live to see the future. And I also 705 00:43:29,120 --> 00:43:34,000 Speaker 1: find this argument that, you know, natural is best 706 00:43:34,080 --> 00:43:38,360 Speaker 1: to be very flawed, because natural has occurred by random 707 00:43:38,440 --> 00:43:44,160 Speaker 1: chance and over long periods of natural selection. It's worked 708 00:43:44,160 --> 00:43:48,120 Speaker 1: out pretty well, but it's not necessarily optimal.
I'm enough 709 00:43:48,160 --> 00:43:51,960 Speaker 1: of an engineer to kind of look at our physiological 710 00:43:52,000 --> 00:43:55,360 Speaker 1: systems and go, yeah, but you know, we could tweak 711 00:43:55,440 --> 00:44:00,840 Speaker 1: this and make it better. So from that standpoint, I 712 00:44:00,880 --> 00:44:06,920 Speaker 1: would certainly like my artificial knees and elbows and back, 713 00:44:07,640 --> 00:44:10,240 Speaker 1: you know, to be less painful in my old age, 714 00:44:10,400 --> 00:44:14,759 Speaker 1: to avoid neurodegeneration, to have the brain of a 715 00:44:14,760 --> 00:44:17,640 Speaker 1: twenty-five-year-old forever, you know, at my peak 716 00:44:17,680 --> 00:44:22,719 Speaker 1: performance of my frontal cortex and everything else, because why not, 717 00:44:23,080 --> 00:44:26,400 Speaker 1: right, if you could have it? And it's not born 718 00:44:26,640 --> 00:44:30,240 Speaker 1: so much from "I want to be superhuman" as, again, 719 00:44:30,360 --> 00:44:33,000 Speaker 1: I just want to be able to live life to 720 00:44:33,040 --> 00:44:36,879 Speaker 1: my fullest as long as I possibly can, because I'm 721 00:44:36,960 --> 00:44:41,200 Speaker 1: desperately curious about everything. Well, that's definitely something we understand 722 00:44:41,239 --> 00:44:43,160 Speaker 1: on this podcast. We want to know the secrets to 723 00:44:43,200 --> 00:44:46,120 Speaker 1: the universe, and we want to learn what scientists are 724 00:44:46,160 --> 00:44:48,680 Speaker 1: going to learn in the next hundred, five hundred, a thousand years. 725 00:44:48,800 --> 00:44:51,000 Speaker 1: Let's dig into the details of that a little bit, 726 00:44:51,000 --> 00:44:54,000 Speaker 1: if you don't mind. In your novel, the people take 727 00:44:54,080 --> 00:44:57,360 Speaker 1: these pills, you call them zips and flows and 728 00:44:57,440 --> 00:45:00,480 Speaker 1: buffs and juvers, that give you specific enhancements, and 729 00:45:00,520 --> 00:45:02,440 Speaker 1: you really have worked them out in gory detail, like 730 00:45:02,480 --> 00:45:05,080 Speaker 1: what they actually do inside the body. What's the origin 731 00:45:05,120 --> 00:45:07,560 Speaker 1: of these ideas? Are these like actual research projects you 732 00:45:07,560 --> 00:45:10,600 Speaker 1: wanted to work on? Not necessarily that I wanted to 733 00:45:10,640 --> 00:45:14,000 Speaker 1: work on, but things that I have seen poking around 734 00:45:14,080 --> 00:45:19,279 Speaker 1: science news that I find utterly fascinating. And also I 735 00:45:19,320 --> 00:45:24,160 Speaker 1: wanted to take one of the favorite aspects of science fiction, 736 00:45:24,480 --> 00:45:27,960 Speaker 1: one that I myself wrote about in my novella Runtime, 737 00:45:28,160 --> 00:45:34,120 Speaker 1: which is the cyborg and the visualization of, you know, 738 00:45:34,239 --> 00:45:39,600 Speaker 1: a mechanized human being with like these big bulky exoskeletons 739 00:45:39,680 --> 00:45:43,640 Speaker 1: and, you know, hydraulics and gears and all the stuff 740 00:45:43,680 --> 00:45:49,200 Speaker 1: that comes with mechanical systems, and turn that on its 741 00:45:49,239 --> 00:45:52,640 Speaker 1: head and say, can we have a cyborg that looks 742 00:45:53,600 --> 00:45:57,760 Speaker 1: human from the outside? Completely human? You have no idea.
743 00:45:58,320 --> 00:46:02,480 Speaker 1: They're not superpowers, it's not magic, but it's that same 744 00:46:02,520 --> 00:46:08,520 Speaker 1: biotechnology and miniaturization, because honestly, who wants to wander around 745 00:46:08,560 --> 00:46:11,480 Speaker 1: in a bunch of bulky exoskeleton gear? Right? That's not 746 00:46:12,719 --> 00:46:16,000 Speaker 1: very comfortable looking, from what I can tell, you know, 747 00:46:16,040 --> 00:46:18,640 Speaker 1: it makes it harder to fit into those airline seats. 748 00:46:18,760 --> 00:46:22,120 Speaker 1: Just, it's no good. Everyone can take a pill, you know. 749 00:46:22,400 --> 00:46:26,640 Speaker 1: You'd, like, tear the fabric, things keep breaking. It seems 750 00:46:26,719 --> 00:46:30,560 Speaker 1: terribly inconvenient to me. So on the other hand, you know, 751 00:46:30,680 --> 00:46:34,000 Speaker 1: we have people in labs today developing micromachines that 752 00:46:34,160 --> 00:46:38,200 Speaker 1: you can swallow, that will track through your colon, that 753 00:46:38,280 --> 00:46:43,960 Speaker 1: can be externally controlled via magnetics, and that are also 754 00:46:44,120 --> 00:46:48,319 Speaker 1: somewhat autonomous, right? They, you know, have these lovely origami-like 755 00:46:48,440 --> 00:46:54,000 Speaker 1: unfolding structures. And right now, because it's a pill and 756 00:46:54,040 --> 00:46:56,080 Speaker 1: it goes through your gut, that's kind of the only 757 00:46:56,120 --> 00:46:59,680 Speaker 1: place where we're really able to use this, and it's still 758 00:47:00,080 --> 00:47:03,080 Speaker 1: not being done in medicine, but I can see this 759 00:47:03,239 --> 00:47:08,919 Speaker 1: technology moving forward, you know, in fifty years, to nanoscale 760 00:47:08,960 --> 00:47:12,920 Speaker 1: things that can cross into your bloodstream, that can 761 00:47:12,960 --> 00:47:16,160 Speaker 1: cross your blood-brain barrier, that can sit there and 762 00:47:16,920 --> 00:47:21,319 Speaker 1: do really interesting things within your body, working in concert 763 00:47:22,080 --> 00:47:29,800 Speaker 1: with your inherent biochemistry and physiological systems, to then tweak 764 00:47:29,880 --> 00:47:33,800 Speaker 1: little things that, you know, can maybe, like in the book, 765 00:47:34,239 --> 00:47:37,920 Speaker 1: speed up clotting and healing of the skin, or, you know, 766 00:47:38,120 --> 00:47:45,280 Speaker 1: bypass nerve conduction for faster communications, maybe change the state 767 00:47:45,360 --> 00:47:48,440 Speaker 1: of your brain so that you can focus your attention 768 00:47:48,520 --> 00:47:52,240 Speaker 1: on something a little better. You know. Again, not superpowers, 769 00:47:52,280 --> 00:47:56,200 Speaker 1: because I don't think that's very realistic, but enough that 770 00:47:56,320 --> 00:47:59,200 Speaker 1: it's going to be beneficial, especially when we're trying to 771 00:47:59,280 --> 00:48:02,600 Speaker 1: keep up with these highly intelligent systems that are working 772 00:48:02,640 --> 00:48:05,680 Speaker 1: at, you know, gigahertz and terahertz speeds, 773 00:48:05,920 --> 00:48:10,359 Speaker 1: and are going to become increasingly complex going forward. And 774 00:48:10,400 --> 00:48:14,040 Speaker 1: then I threw some capitalism in there, because, hey, everyone's 775 00:48:14,080 --> 00:48:16,840 Speaker 1: got to make money, right, like, nothing in life is free.
776 00:48:17,520 --> 00:48:21,200 Speaker 1: So these pills, because you have to take them, and 777 00:48:21,280 --> 00:48:24,080 Speaker 1: they're small, and they're in your body, they're going to 778 00:48:24,200 --> 00:48:26,279 Speaker 1: break down, they're going to be cleaned out by your 779 00:48:26,280 --> 00:48:30,120 Speaker 1: own body's garbage collection systems and flushed away. So you've 780 00:48:30,120 --> 00:48:33,400 Speaker 1: got to keep taking more pills. And who doesn't want that, 781 00:48:33,440 --> 00:48:35,680 Speaker 1: because that's a great way, you know, for the pill 782 00:48:35,719 --> 00:48:39,120 Speaker 1: designers to make their money. Make money on the ink, right, 783 00:48:39,200 --> 00:48:43,080 Speaker 1: not the printer. Yeah. Always. I worked for a medical 784 00:48:43,160 --> 00:48:46,040 Speaker 1: device company where I learned that lesson. You know, the 785 00:48:46,080 --> 00:48:49,600 Speaker 1: business model is always give away the razor, charge for 786 00:48:49,640 --> 00:48:53,400 Speaker 1: the blades, right, same thing with the printers. So same 787 00:48:53,440 --> 00:48:56,640 Speaker 1: thing with this. You can print your smart pills at 788 00:48:56,680 --> 00:48:59,360 Speaker 1: home in your kitchen, but you've got to keep downloading 789 00:48:59,400 --> 00:49:02,080 Speaker 1: new designs every day or every week, and so 790 00:49:02,320 --> 00:49:05,440 Speaker 1: that's what you're paying for, and the pills keep getting 791 00:49:05,440 --> 00:49:07,359 Speaker 1: flushed out of your system, so of course you're going 792 00:49:07,400 --> 00:49:10,680 Speaker 1: to want the latest and greatest on a daily basis. Well, 793 00:49:10,719 --> 00:49:13,200 Speaker 1: another thing I really enjoyed about your book was that 794 00:49:13,280 --> 00:49:16,040 Speaker 1: you had real science puzzles in there. You get things 795 00:49:16,080 --> 00:49:18,759 Speaker 1: going on that the characters didn't understand, and then they 796 00:49:18,760 --> 00:49:22,440 Speaker 1: were using their science brain to try to unravel the mystery. 797 00:49:22,800 --> 00:49:25,879 Speaker 1: And that's not something you see very often in science fiction. 798 00:49:26,000 --> 00:49:28,360 Speaker 1: Usually it's, here's a new technology, let's use it to 799 00:49:28,440 --> 00:49:31,000 Speaker 1: kill people, or how does it change our society or whatever. 800 00:49:31,040 --> 00:49:34,360 Speaker 1: But like the actual process of science, the detective mystery 801 00:49:34,360 --> 00:49:38,440 Speaker 1: of false paths and frustration and limited resources, you know, 802 00:49:38,480 --> 00:49:40,759 Speaker 1: the things that some of us actually live, you don't 803 00:49:40,760 --> 00:49:44,359 Speaker 1: see that captured very often in science fiction. So, first 804 00:49:44,360 --> 00:49:47,560 Speaker 1: of all, just kudos on capturing that. But sort of narratively, 805 00:49:47,680 --> 00:49:49,799 Speaker 1: why did you want to show that in your story? 806 00:49:49,840 --> 00:49:52,400 Speaker 1: Why did you decide to include like this actual scientific 807 00:49:52,480 --> 00:49:55,160 Speaker 1: method as part of the story for your character?
For 808 00:49:55,280 --> 00:50:02,120 Speaker 1: exactly that reason, I do feel like in older science fiction, 809 00:50:02,280 --> 00:50:06,320 Speaker 1: you know, from the forties and fifties especially, scientists were 810 00:50:06,400 --> 00:50:11,200 Speaker 1: often the heroes or the protagonists and occasionally got to 811 00:50:11,239 --> 00:50:16,120 Speaker 1: do actual science as part of the stories. The problem 812 00:50:16,160 --> 00:50:20,799 Speaker 1: to me in those old science fiction stories is the 813 00:50:20,920 --> 00:50:25,080 Speaker 1: scientists were very sort of cardboard cutout scientists. There wasn't 814 00:50:25,120 --> 00:50:29,680 Speaker 1: a lot of character development or interesting plot development, and 815 00:50:29,760 --> 00:50:33,880 Speaker 1: you know, that larger social aspect was usually completely ignored, 816 00:50:34,200 --> 00:50:37,879 Speaker 1: all in favor of just the scientific discovery part. So 817 00:50:38,000 --> 00:50:41,840 Speaker 1: I wanted to modernize that concept and bring it all 818 00:50:41,920 --> 00:50:45,600 Speaker 1: together and have, you know, the big picture as well 819 00:50:45,640 --> 00:50:48,920 Speaker 1: as sort of this localized person who's trying to figure 820 00:50:49,000 --> 00:50:51,399 Speaker 1: something out, and we do. I guess we have one 821 00:50:51,440 --> 00:50:54,560 Speaker 1: good example that became, you know, huge in pop culture, 822 00:50:54,600 --> 00:50:57,840 Speaker 1: which was The Martian, right? And that was fantastic to 823 00:50:57,920 --> 00:51:01,560 Speaker 1: see so many people actually like that, because I read 824 00:51:01,600 --> 00:51:03,719 Speaker 1: it and I was really surprised that, you know, it 825 00:51:03,800 --> 00:51:07,920 Speaker 1: got as popular as it did, just considering the themes 826 00:51:07,920 --> 00:51:10,600 Speaker 1: and what happens in that story. So I wanted to 827 00:51:10,600 --> 00:51:15,160 Speaker 1: tell that particular aspect. And also that's again, I guess, 828 00:51:15,280 --> 00:51:18,440 Speaker 1: a part of my own personality. While I'm an engineer 829 00:51:18,480 --> 00:51:21,520 Speaker 1: by profession, I think I'm a scientist at heart, and 830 00:51:21,600 --> 00:51:25,080 Speaker 1: that's kind of where I got my start. And I 831 00:51:25,160 --> 00:51:29,640 Speaker 1: love this idea of teasing out puzzles, of debugging things. 832 00:51:29,760 --> 00:51:31,839 Speaker 1: You know, we're really debugging the universe in a lot 833 00:51:31,920 --> 00:51:35,080 Speaker 1: of ways, right, We're trying to understand how it works, 834 00:51:35,480 --> 00:51:37,960 Speaker 1: how we broke it, and how we fix it. And 835 00:51:38,000 --> 00:51:41,120 Speaker 1: so it was nice to kind of present that particular 836 00:51:41,160 --> 00:51:45,279 Speaker 1: aspect as part of a science fiction narrative. Well, I 837 00:51:45,320 --> 00:51:48,480 Speaker 1: really enjoyed seeing the scientists not just being nerds in 838 00:51:48,480 --> 00:51:50,440 Speaker 1: white lab coats, so thank you very much for that. 839 00:51:50,640 --> 00:51:53,480 Speaker 1: And then my last question is about the universe that 840 00:51:53,560 --> 00:51:56,840 Speaker 1: you've constructed.
You know, in a science fiction novel, you 841 00:51:56,880 --> 00:51:59,440 Speaker 1: have the liberty to change the laws of physics or 842 00:51:59,480 --> 00:52:03,040 Speaker 1: biology or whatever and create a different universe that follows 843 00:52:03,080 --> 00:52:05,560 Speaker 1: different rules. It seemed to me in your book you 844 00:52:05,840 --> 00:52:09,560 Speaker 1: really hewed closely to what could be possible in our universe. 845 00:52:09,760 --> 00:52:11,480 Speaker 1: Is that true first of all? And is that because 846 00:52:11,480 --> 00:52:14,320 Speaker 1: you wanted to speak to issues in our society that 847 00:52:14,360 --> 00:52:17,360 Speaker 1: were most relevant to, like, our actual world? It was 848 00:52:17,400 --> 00:52:22,120 Speaker 1: definitely deliberate to, you know, hew to most of what 849 00:52:22,160 --> 00:52:26,360 Speaker 1: we know about today in terms of our physical world. 850 00:52:26,600 --> 00:52:29,719 Speaker 1: But I did it more because I think, genuinely, in 851 00:52:29,719 --> 00:52:33,640 Speaker 1: the next seventy five years, we're unlikely to find something 852 00:52:35,160 --> 00:52:39,680 Speaker 1: radically groundbreaking, even as much as, you know, quantum physics 853 00:52:39,800 --> 00:52:43,720 Speaker 1: kind of turned classical mechanics on its head a hundred-ish 854 00:52:43,760 --> 00:52:46,840 Speaker 1: years ago and led to the transistor and the 855 00:52:46,880 --> 00:52:52,920 Speaker 1: digital revolution. I decided not to take that particular path 856 00:52:53,040 --> 00:52:55,960 Speaker 1: because I knew that whatever I put in there is 857 00:52:56,080 --> 00:52:58,239 Speaker 1: probably not ever going to happen. I'm going to be 858 00:52:58,280 --> 00:53:01,360 Speaker 1: making it up, right? And the places that we're really 859 00:53:01,400 --> 00:53:03,880 Speaker 1: poking at right now, which I feel are the 860 00:53:03,920 --> 00:53:09,720 Speaker 1: big questions, are more large-scale stuff like dark matter, 861 00:53:10,160 --> 00:53:12,560 Speaker 1: where we're genuinely puzzled, and I think those are the 862 00:53:12,600 --> 00:53:15,960 Speaker 1: puzzles we're probably going to tease out in the next 863 00:53:16,280 --> 00:53:21,479 Speaker 1: few decades. So the one I guess sort of kind 864 00:53:21,480 --> 00:53:24,480 Speaker 1: of fun innovation I threw into Machinehood that I 865 00:53:24,520 --> 00:53:27,080 Speaker 1: don't know if it will happen is this idea of, 866 00:53:27,719 --> 00:53:32,920 Speaker 1: you know, tabletop genetics, thinking of things like CRISPR and saying 867 00:53:32,960 --> 00:53:37,480 Speaker 1: that it's going to be, you know, the computer revolution 868 00:53:37,680 --> 00:53:40,120 Speaker 1: of this century, that everybody's going to be able to 869 00:53:40,160 --> 00:53:43,400 Speaker 1: do that in their household. And so that was really 870 00:53:43,440 --> 00:53:46,040 Speaker 1: the only sort of leap I've taken where I'm not 871 00:53:47,360 --> 00:53:49,880 Speaker 1: sure that's really going to happen. For the rest of it, 872 00:53:50,000 --> 00:53:53,360 Speaker 1: especially with near-term science fiction, I think it's always 873 00:53:54,080 --> 00:53:58,120 Speaker 1: better to stick with what we know and build a 874 00:53:58,160 --> 00:54:05,600 Speaker 1: plausible future than to radically redefine physics.
That said, the 875 00:54:05,640 --> 00:54:07,960 Speaker 1: next novel that I have been working on for the 876 00:54:08,040 --> 00:54:10,880 Speaker 1: past year is set a thousand years in the future, 877 00:54:11,000 --> 00:54:14,279 Speaker 1: and I have definitely invented some new physics for that one, 878 00:54:15,640 --> 00:54:19,239 Speaker 1: bringing in these ideas of dark matter and, you know, 879 00:54:19,800 --> 00:54:23,120 Speaker 1: why the universe is accelerating, and maybe there's something in 880 00:54:23,120 --> 00:54:26,680 Speaker 1: our field theories that we are missing. Okay, all right, 881 00:54:26,800 --> 00:54:29,440 Speaker 1: let me, you know, define a new particle and 882 00:54:29,520 --> 00:54:32,799 Speaker 1: something that works on larger scales, and maybe we just 883 00:54:32,880 --> 00:54:37,959 Speaker 1: didn't have the ability to sense it, but five 884 00:54:38,440 --> 00:54:41,320 Speaker 1: years from now we'll have the appropriate technology to find 885 00:54:41,360 --> 00:54:46,000 Speaker 1: those things and then harness those things. So I find 886 00:54:46,000 --> 00:54:48,440 Speaker 1: it more comfortable to play with that, you know, on 887 00:54:48,480 --> 00:54:52,600 Speaker 1: a longer time scale than in the next hundred or so, 888 00:54:53,080 --> 00:54:55,920 Speaker 1: you know, hundred, hundred and fifty years. Unless you turn 889 00:54:55,920 --> 00:54:58,279 Speaker 1: into a cyborg, you probably won't be around for people 890 00:54:58,320 --> 00:55:01,360 Speaker 1: to compare the facts to your fiction. Yeah. Sadly, I 891 00:55:01,719 --> 00:55:06,879 Speaker 1: may or may not live long enough to find out. Well, 892 00:55:06,920 --> 00:55:08,840 Speaker 1: thanks very much for taking the time to talk to 893 00:55:08,920 --> 00:55:12,000 Speaker 1: us about the universe you created in Machinehood. Tell 894 00:55:12,040 --> 00:55:14,960 Speaker 1: our listeners about your next project and when we can 895 00:55:15,000 --> 00:55:18,399 Speaker 1: expect to see something. The next project, unfortunately, is still 896 00:55:18,560 --> 00:55:21,880 Speaker 1: very much in its nascent sort of stages, so I can't 897 00:55:21,920 --> 00:55:26,040 Speaker 1: promise anything regarding it at this point. But I'm hard 898 00:55:26,040 --> 00:55:28,879 Speaker 1: at work on this book and we'll see how it goes. 899 00:55:29,080 --> 00:55:32,160 Speaker 1: Right now, a lot of my focus is just getting 900 00:55:32,200 --> 00:55:35,440 Speaker 1: the word out about Machinehood. So tell your audience: 901 00:55:35,480 --> 00:55:37,560 Speaker 1: If you read it and you like it, please tell 902 00:55:37,600 --> 00:55:40,239 Speaker 1: your friends so there can be more books like it 903 00:55:40,320 --> 00:55:43,480 Speaker 1: in the world. And if you want a little teaser, 904 00:55:43,800 --> 00:55:46,480 Speaker 1: go to machinehood dot com. There's a little excerpt 905 00:55:46,640 --> 00:55:50,000 Speaker 1: from the Machinehood manifesto there, and, you know, links 906 00:55:50,040 --> 00:55:53,160 Speaker 1: to more good stuff. And yeah, it was my pleasure 907 00:55:53,200 --> 00:55:55,120 Speaker 1: to be here and talk about this. I can talk 908 00:55:55,160 --> 00:55:58,839 Speaker 1: about this stuff all day and speculate about science and 909 00:55:58,880 --> 00:56:04,240 Speaker 1: technology fully for, you know, many decades forward through my fiction.
910 00:56:04,400 --> 00:56:06,520 Speaker 1: All right, Well, congrats again on the novel and thanks 911 00:56:06,560 --> 00:56:08,640 Speaker 1: again for joining us. All right, so, thank you very 912 00:56:08,719 --> 00:56:10,880 Speaker 1: much to the author for coming on the show and 913 00:56:10,960 --> 00:56:14,880 Speaker 1: answering all of my pesky questions about physics and consciousness 914 00:56:14,920 --> 00:56:17,600 Speaker 1: and artificial intelligence. This is the kind of stuff that 915 00:56:17,680 --> 00:56:20,440 Speaker 1: I love to think about. I really enjoy a novel 916 00:56:20,520 --> 00:56:22,560 Speaker 1: that pushes me to the limits to make me think 917 00:56:22,560 --> 00:56:25,880 Speaker 1: about what's possible, what our universe might be like, what 918 00:56:26,000 --> 00:56:28,920 Speaker 1: our life might be like in this universe as we 919 00:56:29,000 --> 00:56:31,960 Speaker 1: grow to become masters of the science and develop new 920 00:56:32,000 --> 00:56:35,720 Speaker 1: technology that can really change what it means to be human. 921 00:56:35,840 --> 00:56:38,040 Speaker 1: For me, that's the whole goal of science, is to 922 00:56:38,280 --> 00:56:41,200 Speaker 1: push back on this boundary of ignorance and reveal the 923 00:56:41,280 --> 00:56:45,239 Speaker 1: nature of the universe and change our relationship with the Cosmos. 924 00:56:45,440 --> 00:56:48,160 Speaker 1: I'm frustrated by our ignorance of the nature of the 925 00:56:48,200 --> 00:56:50,319 Speaker 1: universe we live in, and I feel like we're making 926 00:56:50,440 --> 00:56:53,200 Speaker 1: ridiculous and silly decisions about how to live our life 927 00:56:53,200 --> 00:56:55,879 Speaker 1: and how to explore the universe just because we are 928 00:56:55,960 --> 00:56:59,319 Speaker 1: so ignorant. And so I'm desperate to see deep into 929 00:56:59,360 --> 00:57:02,040 Speaker 1: the future and understand what we might learn and how 930 00:57:02,080 --> 00:57:04,720 Speaker 1: we might live. And some of these science fiction novels 931 00:57:04,880 --> 00:57:06,839 Speaker 1: are so fun because they give you a little bit 932 00:57:06,880 --> 00:57:09,000 Speaker 1: of a taste of that. So thanks for tuning in 933 00:57:09,040 --> 00:57:12,319 Speaker 1: to Daniel and Jorge Explain the Universe, where usually we 934 00:57:12,360 --> 00:57:15,800 Speaker 1: explore our universe, but sometimes we take these detours into 935 00:57:15,840 --> 00:57:18,560 Speaker 1: the world of science fiction. Thanks for tuning in and 936 00:57:18,640 --> 00:57:28,840 Speaker 1: see you next time. Thanks for listening, and remember that 937 00:57:29,000 --> 00:57:31,720 Speaker 1: Daniel and Jorge Explain the Universe is a production of 938 00:57:31,840 --> 00:57:35,200 Speaker 1: I Heart Radio. For more podcasts from I Heart Radio, 939 00:57:35,360 --> 00:57:38,920 Speaker 1: visit the I Heart Radio app, Apple Podcasts, or wherever 940 00:57:39,040 --> 00:57:40,720 Speaker 1: you listen to your favorite shows.