Welcome to Stuff to Blow Your Mind, from HowStuffWorks.com.

"The thoughts of all men arise from the darkness. If you are the movement of your soul, and the cause of that movement precedes you, then how could you ever call your thoughts your own? How could you be anything other than a slave to the darkness that comes before?"

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. Today's episode is mostly going to be an interview with the Canadian author R. Scott Bakker, best known for his Second Apocalypse saga. Robert, how would you characterize that?

Well, I think that quote I read there, from his first book in the series, The Darkness That Comes Before, sums it up rather nicely. I was first turned on to the work of R. Scott Bakker back in two thousand seven, and I hadn't read anything quite like it at the time, and I really haven't read anything quite like his work since then.
His first novel, The Darkness That Comes Before, casts the reader into a dark fantasy world and invokes the holy wars of our own world, Tolkien-esque evil, factional infighting worthy of Frank Herbert's Dune, and a deep philosophical core.

And we're actually going to be getting into not so much his science fiction and fantasy but a recent philosophy paper of his. That sounds kind of odd, though we will discuss his science fiction and fantasy a fair amount, so don't worry about that if you're a fan. But we're really going to be focusing on R. Scott Bakker's recent paper "On Alien Philosophy," published this year in the Journal of Consciousness Studies. Yes, you heard that right: alien philosophy. We will get back to that in just a little bit.

Yes, "On Alien Philosophy." This is a wonderful thought experiment and sort of a reverse engineering of human philosophy via the consideration of a fictional convergent alien species and what sort of philosophical systems they might create to make sense of their own existence.

Yeah, so in the past we've actually speculated about possible characteristics of alien life forms.
We did this in that episode "Grizzly Bears from Outer Space." Do you remember that?

Yeah, like, would aliens have eyes? Would they have hands? How large are intelligent life forms, generally? There was a paper back then that tried to use some statistical analysis to say, you know, it's really more likely, given certain planet sizes and gravity and stuff, that aliens are going to be pretty big. But this is always a tricky game, because if you are a reasonable person, you have to admit that if alien life exists somewhere out there, there's every chance that it could just totally defy your expectations. We don't always know what to expect.

We can't be certain about much, but it is always fun to play the game. Okay, if we just start with a few fairly safe assumptions, what can we deduce about what types of life are possible? And one of those fairly safe assumptions is that whatever alien life exists, it's going to be the product of evolution by natural selection.
I think it would be incredibly surprising if it turned out not to be the product of evolution by natural selection, meaning it's going to be a system that somehow encodes information, makes copies of itself through that information, and the copies survive and make their own copies at differential rates. And from these humble premises you can actually start to make a lot of interesting guesses about what types of life are more possible or more common than others. And so we've already tried this with biological and ecological traits in the past, but in this new paper R. Scott Bakker chases this phenotype problem really deep. He goes on to guess: what are the philosophical projects that intelligent alien life forms would wonder about? What big questions would they share with us? What hang-ups are they likely to dwell upon?

Yeah, it's a fascinating paper, and if you want to read it in full, I'll include a link to it on the landing page for this episode at Stuff to Blow Your Mind dot com.
And I should also point out that the first six books in the Second Apocalypse saga, including the first trilogy, are currently out there in just about any reading format you desire, and this summer R. Scott Bakker brings it all home. He's going to follow up last year's The Great Ordeal with The Unholy Consult, which is going to be out July eleventh from Overlook Press. And if you're not ready to commit to a multi-book series, his standalone novels Neuropath and Disciple of the Dog are out there as well.

Robert, I know you warned me about Neuropath.

Well, as we'll explore in the interview, Bakker himself warns everybody about Neuropath, so be sure to tune in for that section of the conversation.

Just a few quick notes about this episode. First up, this is a phone interview. We're in Atlanta, and Scott chatted with us from the wilds of Canada in the midst of a thunderstorm. So everything's not necessarily going to be as crisp as it usually is, but we think the content is definitely worth sticking around for.
Secondly, I bring up a couple of details from the Second Apocalypse saga that I think bear quick explanation. First, there's the Inchoroi. This is a hedonistic alien species that descends upon the world in ancient times. Undying, they're devoted to selfish indulgence, limitless pleasure, and a cataclysmic scheme to shield themselves from judgment. And then there are the Nonmen. This is an all-male race of humanoids that essentially serves as the waning elder race, the elves of Bakker's world, only far darker and interestingly inhuman in many respects, both in body and in mind. And finally, there's a tiny bit of cursing in this episode, but it's polite Canadian cursing, and we bleeped it all out for you.

All right, well, I'd say, without further delay, maybe we should get into our conversation with R. Scott Bakker.

Hey, Scott, welcome to Stuff to Blow Your Mind. Thanks for taking time out of your day to talk with us here about your new paper, "On Alien Philosophy," as well as your works of fiction, which I know have meant a lot to me personally over the years. I'm a big fan of the Second Apocalypse saga.
I loved Neuropath and Disciple of the Dog as well. So we want to welcome you to the show. And I believe Joe has the first question here, related to "On Alien Philosophy."

Well, actually, I mean, I guess we should start just by... Scott, is there anything you'd like to tell our audience about yourself? Just introduce yourself, and then we'll get to the meat of the paper.

I'm a farm boy who grew up in southwestern Ontario and ended up falling in love with The Lord of the Rings and Conan the Barbarian at a preposterously young age and just never grew up.

What's your opinion on the Milius movie?

It has to be the one written by Oliver Stone. Yeah, yeah.

All right, so we wanted to get into the idea of your paper about alien philosophy. Could you just start by, as succinctly as you can, explaining what your motive for writing this paper was and what your basic conclusions are?

Okay. So, I mean, I kind of backed into philosophy.
My original degree program at university was languages and literature, and it quickly became apparent to me that I was barking up the wrong tree. And so the question for me was basically how to transition my literature degree into something more philosophical. So I did a critical theory graduate degree, and in that case I found myself knee-deep in all these philosophical traditions, ten thousand different interpretations of the same bloody thing. I think the best way to put it is dismayed. There's just so much confusion. It seemed like there was so much obvious obfuscation. So I quickly came to the conclusion that something profoundly wrong lay at the heart of traditional philosophy, and I just endeavored to be as honest and as ruthlessly skeptical as I could and try to figure out what that something wrong was. And "On Alien Philosophy" is basically the answer I've been able to come up with after about twenty years of studying and researching the topic, both in formal university contexts and on my own.
And the idea, in a nutshell, is simply that we human beings did not evolve the capacity to reflect upon ourselves and our experience in any way, shape, or form that would allow us to answer theoretical questions regarding our experience. And when you really look at the question empirically, in terms of cognitive neuroscience, for instance, and you see the way in which the brain constantly windows information, constantly just selects a little bit of information from this process, a little information from that process, a little information from that process, you discover the brain really is a giant bottleneck machine that is bent on selecting only what it needs. And "On Alien Philosophy" gives us a sort of picture of how we can understand that bottleneck machine such that it explains the morass of traditional philosophy of mind, our, you know, epochal inability to figure out what the hell is going on inside our own heads.
So in the paper you make this distinction you just talked about: you mention the idea of causal cognition versus heuristic cognition, right? Basically, having a true understanding of the workings of things, in a sort of deep-information way, versus having a, you know, on-the-fly, good-enough understanding of things that gets us through everyday life, and we're constantly using the latter to try to get at the former. A question I sort of wonder about: the former, to me, sounds like it is best executed in science. If we're trying to get causal cognition, it's when we, you know, put these tools on our inquiries, like scientific investigatory tools. And so does that leave any room for philosophy? Do you think that there is really any good philosophy to be done, or should it really just be science, and everything outside of science is misapplying these heuristic models we have?

Yeah, I mean, I think there's definitely good philosophy. I mean, it's not the death of philosophy or theoretical speculation. I mean, there's no way to kill that.
What, for me, it signals the end of is a certain type of philosophizing, which just happens to be the majority of philosophizing since the ancient Greeks, which is the idea of actually using heuristic cognition to try to get to the truth of heuristic cognition: using these shortcuts, these ways of actually avoiding knowledge of what's actually going on to solve problems, as a means of getting at what's going on. That is the philosophy that's caused the bulk of our confusion today.

So how much of philosophy do you think is really just sort of backward-looking, ex post facto justification for our biases?

So the way I look at it is, we have this capacity to metacognize. We have these ancestral abilities to basically pick things from the stream of experience and think about them and change our behavioral responses to the world on the basis of them. So think about Christmas with your relatives. You know, everyone has that family member who says something that you just... oh, you've got to say something about it, but you know if you do, it's going to ruin the whole night.
And that's a great example of metacognition at work. You know you're going to say something, and then all of a sudden: wait a second, Scott, he's an [bleep]. Everyone knows he's an [bleep]. You don't need to tell everybody he's an [bleep]. If you tell everybody, well, then the whole night's ruined, right? Your reproductive chances go flying out the window. That sounds weird, doesn't it? But metacognition consists of a suite of practical tools. There's no way in the world that we evolved the metacognitive capacity we needed to do much more than a handful of practical things on the fly. And what philosophical reflection does is it basically takes those tools, repurposes them, and asks them to solve questions there's just simply no way they could possibly solve. But since we lack the ability to even see how little ability we have when it comes to metacognition, we have no sense of constraints or limits or whatever, and so we're confounded by this constant illusion of transparency.
It seems like: yes, if only I think hard enough on this; yes, if only I get my concepts arranged just right; yes, if only I abandon the metaphysics of presence; yes, if only I abandon the mirror of nature. On and on and on it goes. Everybody thinks that there's a way to use philosophical reflection to solve the problem of human experience.

So, Scott, in the paper, and the way you just described it now, you definitely do seem to be trying to use this idea of imagining what alien philosophy looks like in order to reverse engineer our own human philosophy, to better understand how we've arrived at our philosophical traditions. But I'm also just really interested in that speculative project: trying to literally imagine what an information-processing organism orbiting a faraway star might think about. What hard problems would they encounter, and how would they deal with them? And so I wonder if I could ask you just a few crazy kinds of questions that you don't even address in the paper.
I started off thinking, like, you know, could you imagine that there would be an alien Hegel or an alien Wittgenstein? But these are kind of stupid questions, because I think these philosophers necessarily exist in a tradition, if you know what I mean. Like, they're reacting to all the philosophers that came before them, and so it's sort of silly to try to imagine whether that history of philosophy would be recreated on other planets. But you can imagine other fields of philosophy and how they would be created. So I'm trying to think: what would alien meta-ethics look like? Is there anything that we could guess that would form the basis of how aliens would think about where their sense of right and wrong comes from?

But this is a really hard question to ask. I mean, part of the reason why I took so much care to underscore that, really, you know, plausibility of alien philosophy was all I was after...
It's just simply because asking the question of alien philosophy earnestly, you know, is really difficult, simply because you've got to assume so much convergence just to get off the ground. I mean, you have to assume that they use language in similar ways as we use language, for instance. So if you grant all that, if you grant the convergences, as I call them in the paper, then I think meta-ethics would look like a giant mess, the way meta-ethics looks in contemporary philosophy, just because when it comes to where we stand in these various super-complicated systems that nature has us pinned in, we just have no access to the information we need. So what we really have to rely on is basically these blind guesses, you know, these simple heuristics, these little tricks that have, for whatever reason, preserved our ancestors in the past. Now, these tricks are just simply, uh, they're literally invisible to us, even though they actually are the foundation of guiding our behavior through these super-complicated natural environments.
Now, any creature like the human is stuck in the exact same informational bind. They're stuck in a shallow information environment. They have these tools for picking out those handles, those features of the environment that actually help them get along, or get on, or get it on, but they actually have no way of isolating those simple heuristics as simple heuristics, no way of understanding where they stand in these superordinate systems. They're just blind, and even worse, they're blind to their blindness. So every time these aliens would attempt to solve ethical questions (what is ethics?), they're going to run into the same sets of illusions and problems that beset human philosophers. They're going to confuse the landscape of ethical concepts as being a sort of autonomous system, as being all there is. They're going to think that they've actually solved something, even though all it takes is another person to ask the exact same question to arrive, in some cases, at a radically different answer following what seemed to be very similar intuitions.
So those two things, I think, are going to strand them as effectively as we've been stranded by meta-ethics.

I can imagine a skeptical objection to what you're saying. That would be: okay, you're right that we are by nature shallow-information consumers. We don't have deep knowledge about the world; we just know enough to get along in a natural setting. But you can definitely look at the way we've leveraged the weak tools we do have to do amazing things in, say, science and technology. Like, we can build an International Space Station, and that has nothing to do with our ground-level survival and reproduction. We've just managed to sort of bootstrap some very basic survival tools up into incredible products. Why would you think that we couldn't do the same thing with these hard problems, like meta-ethics or natural philosophy or understanding the mind, that we do with science and technology?

Well, I mean, I actually think that, you know, a lot of the, say, you know, norms of science... I actually see these things as technology.
I mean, I see them as basically ways of, you know, spinning tools out of our own brains, if you will. I mean, the idea isn't that we can't use heuristic cognition in novel ways. I mean, I think we literally evolved to do exactly what philosophers do, which is simply trying to recast and repurpose our existing cognitive capacities in order to solve different kinds of problems. The problem with meta-ethics in particular is it doesn't really have anything to do with that process. So ethical thought has given us a lot of great ideas that we've been able to institutionalize in ways that have been tremendously beneficial to humanity. But there's no way of actually realizing that that's the case short of putting those tools into application. I mean, I would argue that the type of norm toolmaking that humans have done so fantastically well is actually a completely different process than the process that underwrites meta-ethical thought.
I mean, problems get solved 326 00:21:07,560 --> 00:21:12,320 Speaker 1: when we actually discover a new simple heuristic that allows 327 00:21:12,400 --> 00:21:15,080 Speaker 1: us to get along with each other in some way, 328 00:21:15,119 --> 00:21:20,000 Speaker 1: shape or form. But when you actually ask about the fundamental 329 00:21:20,200 --> 00:21:25,480 Speaker 1: nature of ethical thought, and you begin by looking at 330 00:21:25,560 --> 00:21:29,439 Speaker 1: all these terms, all this norm talk, as referring to 331 00:21:29,680 --> 00:21:34,239 Speaker 1: posits that actually play some role in some kind of 332 00:21:34,240 --> 00:21:39,639 Speaker 1: economy, a transcendental economy, or a normative economy, or an anomalous 333 00:21:39,720 --> 00:21:44,439 Speaker 1: economy, or an autonomous economy outside of the circuit of nature, 334 00:21:45,000 --> 00:21:48,680 Speaker 1: no one's gotten anywhere. No one's gotten through anything. I mean, 335 00:21:48,720 --> 00:21:53,720 Speaker 1: all the dilemmas that confronted the ancient Greeks are 336 00:21:54,119 --> 00:21:58,879 Speaker 1: still confronting modern philosophers today. And that's just simply because meta-ethics 337 00:22:00,280 --> 00:22:05,560 Speaker 1: constitutes an attempt to apply these simple heuristics not to 338 00:22:05,640 --> 00:22:10,480 Speaker 1: practical problems in the real world of politics and social interaction, 339 00:22:11,200 --> 00:22:14,640 Speaker 1: but to the theoretical problem of ethics itself. All right, 340 00:22:14,680 --> 00:22:16,560 Speaker 1: now it's time to take a quick break, and when 341 00:22:16,600 --> 00:22:19,639 Speaker 1: we come back, more of our conversation with R. Scott Bakker. 342 00:22:24,280 --> 00:22:25,840 Speaker 1: So maybe I've got one more and then I'm going 343 00:22:25,880 --> 00:22:28,879 Speaker 1: to throw back to Robert here. Uh, so, I wanted 344 00:22:28,880 --> 00:22:33,800 Speaker 1: to go to a footnote in your paper about alien philosophy.
Um, 345 00:22:33,880 --> 00:22:37,560 Speaker 1: you say that because we evolved in ecologies that didn't 346 00:22:37,600 --> 00:22:40,119 Speaker 1: give us access to deep information, you know, we just 347 00:22:40,200 --> 00:22:43,400 Speaker 1: knew enough to get along. You write, quote, My fear 348 00:22:43,480 --> 00:22:46,000 Speaker 1: is that the provision of this information is likely to 349 00:22:46,200 --> 00:22:49,760 Speaker 1: crash the effectiveness of many of our tools. A good 350 00:22:49,800 --> 00:22:53,560 Speaker 1: deal of my fiction is devoted to exploring different crash scenarios. 351 00:22:53,800 --> 00:22:55,639 Speaker 1: So I wondered if you could explain a little bit 352 00:22:55,720 --> 00:22:58,000 Speaker 1: more what you mean by the idea of a crash 353 00:22:58,080 --> 00:23:02,440 Speaker 1: scenario and how you've explored it. So, when you understand that 354 00:23:02,920 --> 00:23:06,359 Speaker 1: the bulk of our cognition is heuristic, that it really 355 00:23:06,400 --> 00:23:10,119 Speaker 1: relies on these kinds of guesses that we're making while 356 00:23:10,280 --> 00:23:13,919 Speaker 1: lacking any sort of deep understanding of the actual 357 00:23:14,720 --> 00:23:20,840 Speaker 1: physical structure of the environment around us, you realize that 358 00:23:20,840 --> 00:23:25,800 Speaker 1: that guesswork depends upon an invariant background. So a perfect 359 00:23:25,840 --> 00:23:30,520 Speaker 1: example would be AI, for instance. So, human beings, we 360 00:23:30,640 --> 00:23:37,000 Speaker 1: evolved to manage an unbelievable amount of complexity.
We evolved 361 00:23:37,040 --> 00:23:41,840 Speaker 1: to solve the most complicated systems that we know of 362 00:23:41,880 --> 00:23:46,359 Speaker 1: in the universe, basically each other, without actually knowing the 363 00:23:46,440 --> 00:23:50,119 Speaker 1: first thing about what's going on in brains or 364 00:23:51,200 --> 00:23:56,560 Speaker 1: ecologies or what have you. So heuristics solve problems 365 00:23:56,800 --> 00:24:01,000 Speaker 1: by taking things for granted in their environments. So that 366 00:24:01,119 --> 00:24:06,200 Speaker 1: means the only way cognition can properly function is if 367 00:24:06,320 --> 00:24:10,359 Speaker 1: those things that it takes for granted actually obtain in 368 00:24:10,440 --> 00:24:14,320 Speaker 1: its environments. So if you look at last year, 369 00:24:14,359 --> 00:24:18,480 Speaker 1: there was the first fatality ever attributed to a self-driving 370 00:24:18,600 --> 00:24:22,840 Speaker 1: car, and I think it was a Tesla, um, 371 00:24:22,920 --> 00:24:27,280 Speaker 1: and the unfortunate fellow was driving on an incline and 372 00:24:27,320 --> 00:24:31,520 Speaker 1: a truck was crossing his path. He was watching Harry 373 00:24:31,560 --> 00:24:36,919 Speaker 1: Potter on his dashboard. He had the autopilot on in his car. 374 00:24:37,760 --> 00:24:42,200 Speaker 1: Now, the autopilot actually read forward, and the white truck 375 00:24:42,640 --> 00:24:45,680 Speaker 1: crossed what was actually white sky. This is what they 376 00:24:45,680 --> 00:24:50,520 Speaker 1: think happened, and as a result, the truck's trailer cued 377 00:24:50,680 --> 00:24:54,719 Speaker 1: open space to the computer, and so the car just 378 00:24:54,800 --> 00:24:59,600 Speaker 1: drove right underneath the truck's trailer.
Um, that's exactly what 379 00:24:59,720 --> 00:25:04,520 Speaker 1: happens when the environment doesn't cooperate with a heuristic problem-solving 380 00:25:04,560 --> 00:25:09,919 Speaker 1: system. If that system cannot discriminate between white truck 381 00:25:10,400 --> 00:25:15,520 Speaker 1: and white sky, then it just sees sky. And so that's 382 00:25:15,640 --> 00:25:19,040 Speaker 1: actually a perfect example of a crash space, where you 383 00:25:19,080 --> 00:25:22,760 Speaker 1: have a cognitive system that requires the environment to be 384 00:25:22,920 --> 00:25:26,520 Speaker 1: a certain way in order to properly solve the problem. 385 00:25:26,560 --> 00:25:30,640 Speaker 1: When the environment is changed, or varies in a way 386 00:25:30,680 --> 00:25:33,760 Speaker 1: that it cannot accommodate, then you literally have a car 387 00:25:33,840 --> 00:25:38,400 Speaker 1: crash, in that case. Now, that obtains as much for 388 00:25:38,560 --> 00:25:42,040 Speaker 1: self-driving cars as it does for human social cognition. 389 00:25:42,720 --> 00:25:46,320 Speaker 1: So my big fear with AI generally is that you 390 00:25:46,359 --> 00:25:50,600 Speaker 1: and I evolved to solve each other, um, over literally 391 00:25:50,640 --> 00:25:55,280 Speaker 1: the history of life on this planet. We're enormously complicated, 392 00:25:55,880 --> 00:25:59,399 Speaker 1: and yet we're so finely attuned to one another that 393 00:25:59,480 --> 00:26:03,240 Speaker 1: we can make unbelievable predictions as to each other's behavior 394 00:26:03,720 --> 00:26:09,320 Speaker 1: and reliability and so forth.
Now, what happens when you 395 00:26:09,400 --> 00:26:14,240 Speaker 1: take that ecology and you start injecting all these little 396 00:26:15,000 --> 00:26:20,520 Speaker 1: artificial agencies that are literally designed to cue your heuristic 397 00:26:21,240 --> 00:26:24,680 Speaker 1: social-cognitive systems out of school, right, for some sort 398 00:26:24,680 --> 00:26:28,640 Speaker 1: of commercial advantage? Now, all of a sudden, you find 399 00:26:28,720 --> 00:26:33,000 Speaker 1: human beings using these systems that are exquisitely designed to 400 00:26:33,080 --> 00:26:36,240 Speaker 1: make sense of other human beings. Now we're using these 401 00:26:36,280 --> 00:26:41,040 Speaker 1: systems to make sense of machines that have literally been 402 00:26:41,080 --> 00:26:46,720 Speaker 1: designed to empty our wallets or, you know, sway our votes 403 00:26:47,040 --> 00:26:51,000 Speaker 1: or what have you. Now, that's a crash space, and 404 00:26:51,119 --> 00:26:55,919 Speaker 1: it's a tremendously significant crash space, because what it means 405 00:26:56,200 --> 00:26:58,719 Speaker 1: is that human beings are going to be less able to 406 00:26:58,880 --> 00:27:05,080 Speaker 1: trust their social cognitive systems, that suite of simple heuristics 407 00:27:05,119 --> 00:27:09,320 Speaker 1: we use to solve the monstrous complexity of each other. 408 00:27:09,960 --> 00:27:12,120 Speaker 1: We're going to be able to trust that less 409 00:27:12,160 --> 00:27:17,800 Speaker 1: and less and less moving forward, because the environment, the 410 00:27:17,880 --> 00:27:23,320 Speaker 1: invariant background that it's adapted to solve, no longer exists.
411 00:27:24,200 --> 00:27:28,000 Speaker 1: So really, what my big fear is is that we're 412 00:27:28,080 --> 00:27:33,160 Speaker 1: looking at the destruction of the human cognitive habitat, that 413 00:27:33,680 --> 00:27:37,080 Speaker 1: all this stuff that people are celebrating, Mark Zuckerberg, even 414 00:27:37,200 --> 00:27:42,760 Speaker 1: the President, uh, really is the beginning of the end of 415 00:27:43,600 --> 00:27:46,280 Speaker 1: the ability of human beings to make sense of each other, 416 00:27:46,800 --> 00:27:51,520 Speaker 1: the world, themselves, what have you. Robert, I know some 417 00:27:51,600 --> 00:27:53,680 Speaker 1: of these ideas come up in some questions you had 418 00:27:53,680 --> 00:27:56,920 Speaker 1: about the fiction, right? Oh, yes. Um, at this point 419 00:27:56,960 --> 00:27:58,960 Speaker 1: I want to jump ahead to a question I had 420 00:27:58,960 --> 00:28:00,719 Speaker 1: for later, but I think it fits in here. So, 421 00:28:01,280 --> 00:28:06,720 Speaker 1: your book Neuropath is a wonderfully, scientifically disturbing thriller, 422 00:28:07,760 --> 00:28:11,840 Speaker 1: a sort of near-future, neuroscientifically charged psychological thriller, for those 423 00:28:11,880 --> 00:28:14,680 Speaker 1: who haven't read it. Well, what you've just described here instantly 424 00:28:14,680 --> 00:28:17,399 Speaker 1: made me think of the semantic apocalypse. Is that the 425 00:28:17,440 --> 00:28:20,760 Speaker 1: same concept, or is that a related concept? That's the 426 00:28:20,800 --> 00:28:25,120 Speaker 1: same concept, that's the, that's the semantic apocalypse. I kind 427 00:28:25,119 --> 00:28:27,520 Speaker 1: of see it as having two stages to it. I 428 00:28:27,520 --> 00:28:31,159 Speaker 1: mean, the first stage is actually "On Alien Philosophy."
I mean, 429 00:28:31,160 --> 00:28:35,720 Speaker 1: the first stage is just clearing away all these philosophical 430 00:28:35,720 --> 00:28:40,760 Speaker 1: conceptions of meaning and, um, coming to understand the practical 431 00:28:41,160 --> 00:28:46,400 Speaker 1: nature of intentional cognition. And then the big problem 432 00:28:46,440 --> 00:28:50,480 Speaker 1: with the semantic apocalypse has to do with the slow degradation 433 00:28:50,720 --> 00:28:55,040 Speaker 1: of the environments that intentional cognition requires in order to 434 00:28:55,120 --> 00:28:59,160 Speaker 1: function reliably. Right, and it's the death of meaning in 435 00:28:59,280 --> 00:29:02,760 Speaker 1: every sense. When you say the intentional environment there, you 436 00:29:02,800 --> 00:29:07,040 Speaker 1: mean reasoning on the basis of inferring the intentions of others, 437 00:29:07,080 --> 00:29:09,840 Speaker 1: sort of what Daniel Dennett would call the intentional stance? 438 00:29:10,560 --> 00:29:14,600 Speaker 1: I'm actually greatly enjoying, uh, Dennett's book right now, his latest book. 439 00:29:14,960 --> 00:29:19,160 Speaker 1: There's no intentional stance in my position. Um, there are no, 440 00:29:19,560 --> 00:29:25,160 Speaker 1: uh, perspectives or anything like that. I mean, intentional cognition 441 00:29:25,360 --> 00:29:28,160 Speaker 1: just simply means basically all the machinery in your brain 442 00:29:28,800 --> 00:29:32,960 Speaker 1: that enables you to solve the behavior of other biological 443 00:29:33,000 --> 00:29:41,720 Speaker 1: life in your vicinity to, like, incredibly rarefied extents, right? Um, 444 00:29:41,760 --> 00:29:44,000 Speaker 1: and when you look at human 445 00:29:44,000 --> 00:29:47,600 Speaker 1: behavior in particular.
The problem with the intentional stance is 446 00:29:47,640 --> 00:29:50,560 Speaker 1: that it's actually an attempt to use intentional cognition to 447 00:29:50,720 --> 00:29:54,920 Speaker 1: theorize intentional cognition. And that's the big reason why Dennett 448 00:29:54,920 --> 00:29:58,440 Speaker 1: has such a hard time actually selling his position to, 449 00:29:59,120 --> 00:30:04,120 Speaker 1: uh, other philosophers of mind who want intentionality and meaning 450 00:30:04,120 --> 00:30:08,480 Speaker 1: to be a thing, to be something in the world. Uh, 451 00:30:09,320 --> 00:30:14,560 Speaker 1: I share much of Dennett's view, but I 452 00:30:14,600 --> 00:30:18,240 Speaker 1: don't see how the intentional stance has anything to do 453 00:30:19,000 --> 00:30:22,960 Speaker 1: with the parts I share. Your original question had to 454 00:30:23,000 --> 00:30:27,320 Speaker 1: do with, uh, just simply what I mean by intentional cognition, 455 00:30:27,880 --> 00:30:32,040 Speaker 1: and once again, it's simply the machinery in our head. 456 00:30:33,160 --> 00:30:35,720 Speaker 1: That's, you know, what gets taken out by a stroke. 457 00:30:36,560 --> 00:30:43,200 Speaker 1: That's what gets, you know, pathologically attenuated in cases of autism. 458 00:30:43,240 --> 00:30:48,440 Speaker 1: That's, you know, what gets degraded in dementia. You know, 459 00:30:48,760 --> 00:30:53,840 Speaker 1: that's where the action is. And, um, I think that's 460 00:30:53,880 --> 00:30:57,200 Speaker 1: all we need to really get a theoretical grasp 461 00:30:58,400 --> 00:31:01,600 Speaker 1: on what's going on with meaning.
In your Second Apocalypse saga, 462 00:31:01,720 --> 00:31:04,719 Speaker 1: an alien race known as the Inchoroi play a 463 00:31:04,720 --> 00:31:09,160 Speaker 1: crucial role. Did "On Alien Philosophy" in part stem from trying 464 00:31:09,160 --> 00:31:11,960 Speaker 1: to understand their perspective? And, yes or no, how does 465 00:31:12,000 --> 00:31:15,120 Speaker 1: one try to think like the People of Emptiness? How 466 00:31:15,160 --> 00:31:18,760 Speaker 1: does one try to think like an Inchoroi? Wow. I've 467 00:31:18,800 --> 00:31:22,720 Speaker 1: tried to actually get into their headspace on several occasions. 468 00:31:22,760 --> 00:31:25,800 Speaker 1: I find it really difficult. I mean, the problem is 469 00:31:25,800 --> 00:31:28,320 Speaker 1: that, you know, our own sort of experience of 470 00:31:28,360 --> 00:31:32,960 Speaker 1: our own first person is so unbelievably specific to our 471 00:31:33,000 --> 00:31:38,520 Speaker 1: biology and to our history that, um, as soon 472 00:31:38,560 --> 00:31:41,120 Speaker 1: as you try to actually think in an alien way, 473 00:31:41,120 --> 00:31:45,800 Speaker 1: thought ceases to make sense. I mean, "On Alien 474 00:31:45,800 --> 00:31:49,959 Speaker 1: Philosophy" is tied to the Inchoroi, not so 475 00:31:50,040 --> 00:31:56,560 Speaker 1: much as, uh, an exploration.
Okay, so the Inchoroi are not so 476 00:31:56,680 --> 00:32:00,720 Speaker 1: much, you know, the object of "On Alien Philosophy" as 477 00:32:00,960 --> 00:32:08,360 Speaker 1: they are kind of, uh, the cipher for the ultimate 478 00:32:08,440 --> 00:32:12,880 Speaker 1: significance of alien philosophy. Because the ultimate significance of alien 479 00:32:12,920 --> 00:32:17,760 Speaker 1: philosophy is that, you know, once we understand that thought 480 00:32:18,480 --> 00:32:23,920 Speaker 1: is ultimately material, physical, um, uh, and just simply part 481 00:32:24,360 --> 00:32:27,640 Speaker 1: of all the natural processes going on in our environment, 482 00:32:28,320 --> 00:32:34,600 Speaker 1: nothing ontologically extraordinary, then we realize that this boundary 483 00:32:34,640 --> 00:32:37,800 Speaker 1: that we're just creeping up to and are 484 00:32:37,920 --> 00:32:42,440 Speaker 1: getting ready to cross, where we have literally gained the 485 00:32:42,440 --> 00:32:50,920 Speaker 1: ability to control physical processes at unbelievably small scales, cellular scales, 486 00:32:51,920 --> 00:32:58,400 Speaker 1: at this point where our own biology is becoming technology... Um, 487 00:32:58,440 --> 00:33:05,160 Speaker 1: if "On Alien Philosophy" is right, then the Inchoroi are 488 00:33:06,000 --> 00:33:11,560 Speaker 1: one possible consequence of us crossing this boundary. So the 489 00:33:11,600 --> 00:33:18,320 Speaker 1: Inchoroi could possibly be us, in some respect, where 490 00:33:18,360 --> 00:33:22,920 Speaker 1: we've, you know, given up on any sort of normative ideals 491 00:33:23,000 --> 00:33:28,800 Speaker 1: of good or truth or what have you, and have literally 492 00:33:28,880 --> 00:33:36,160 Speaker 1: just simply collapsed into this bio-hedonism that is, to our 493 00:33:36,320 --> 00:33:41,560 Speaker 1: moral sensibilities, absolutely horrific. And that's, for me...
The 494 00:33:41,600 --> 00:33:47,280 Speaker 1: Inchoroi have always kind of represented the ugly consequences 495 00:33:47,720 --> 00:33:53,440 Speaker 1: of an alien philosophy, the fact that we are going 496 00:33:53,520 --> 00:33:56,520 Speaker 1: to climb out of this dream that we've had of 497 00:33:56,640 --> 00:34:00,400 Speaker 1: being exceptional in some way, and discover that we 498 00:34:00,480 --> 00:34:04,760 Speaker 1: are just simply material, and that we will just simply 499 00:34:04,840 --> 00:34:09,360 Speaker 1: chase fitness indicators, you know, via our technology, to the 500 00:34:09,400 --> 00:34:14,520 Speaker 1: point where we, um, become something that our present selves 501 00:34:14,520 --> 00:34:17,080 Speaker 1: can only be horrified by. Well, that's excellent, that's certainly 502 00:34:17,120 --> 00:34:21,000 Speaker 1: a horrifying thought, to think of ourselves as the Inchoroi. 503 00:34:21,520 --> 00:34:25,400 Speaker 1: So, in imagining alien philosophy, you've got these Convergians, who, 504 00:34:25,440 --> 00:34:30,040 Speaker 1: you know, have some amount of sufficient, uh, convergent 505 00:34:30,080 --> 00:34:33,080 Speaker 1: evolution with us. They're somewhat similar to how our 506 00:34:33,120 --> 00:34:37,400 Speaker 1: information processing works, at least. Um, is there any reason 507 00:34:37,440 --> 00:34:41,400 Speaker 1: to assume that, even if conscious cognition exists in Convergians, 508 00:34:41,520 --> 00:34:45,480 Speaker 1: that it would feel the same to them as conscious 509 00:34:45,520 --> 00:34:47,920 Speaker 1: cognition feels to us? I know that's kind of a 510 00:34:47,960 --> 00:34:51,040 Speaker 1: strange question to ask, because you can't even be sure 511 00:34:51,560 --> 00:34:55,560 Speaker 1: that another person's, you know, consciousness exists or feels the 512 00:34:55,600 --> 00:34:58,280 Speaker 1: same as yours.
But we tend to assume that feeling 513 00:34:58,320 --> 00:35:01,759 Speaker 1: like a human is, you know, sort of like one thing, 514 00:35:02,000 --> 00:35:05,080 Speaker 1: mostly. We could be wrong about that. But is it 515 00:35:05,120 --> 00:35:09,760 Speaker 1: possible to imagine radically different subjectivities, not just radically 516 00:35:09,800 --> 00:35:13,360 Speaker 1: different behavior, throughout the universe? I mean, I'd argue that 517 00:35:13,360 --> 00:35:15,279 Speaker 1: it's not impossible to do that, and I think it's 518 00:35:15,280 --> 00:35:17,640 Speaker 1: just a lot more complicated than people would give 519 00:35:17,680 --> 00:35:19,719 Speaker 1: it credence. And I think all you have to do 520 00:35:19,840 --> 00:35:22,680 Speaker 1: is look at the human case to see that that 521 00:35:23,360 --> 00:35:25,680 Speaker 1: has to be the case. Just think of the difference 522 00:35:26,000 --> 00:35:31,720 Speaker 1: between different philosophers in the phenomenological tradition, like Heidegger, Merleau-Ponty, 523 00:35:32,280 --> 00:35:38,279 Speaker 1: uh, Sartre, Husserl, Schutz, Gadamer. I mean, if you look 524 00:35:38,400 --> 00:35:42,920 Speaker 1: at all these different philosophers and all the different interpretations 525 00:35:43,239 --> 00:35:49,839 Speaker 1: they've given of the fundamental nature of experience, I mean, 526 00:35:50,160 --> 00:35:54,360 Speaker 1: some cases are similar, they certainly share similar commitments, but 527 00:35:54,400 --> 00:36:01,160 Speaker 1: there's a wild variance in how human subjective experience is described.
So, 528 00:36:01,280 --> 00:36:07,040 Speaker 1: given that we have difficulty even pinpointing what it's like 529 00:36:07,160 --> 00:36:09,839 Speaker 1: to be a human, the notion of being able to 530 00:36:10,080 --> 00:36:13,239 Speaker 1: understand what it's like to be an alien, it's got 531 00:36:13,239 --> 00:36:17,080 Speaker 1: to be that much more difficult. And so one of 532 00:36:17,080 --> 00:36:19,799 Speaker 1: the things I try to argue in that paper is 533 00:36:19,840 --> 00:36:22,320 Speaker 1: that, really, even if you can't say what it 534 00:36:22,360 --> 00:36:25,799 Speaker 1: would be like, what you could say is what kinds 535 00:36:25,800 --> 00:36:28,840 Speaker 1: of information they'd have access to, and what kinds of 536 00:36:28,880 --> 00:36:33,040 Speaker 1: information they wouldn't have access to, and what kinds of 537 00:36:33,080 --> 00:36:39,520 Speaker 1: problems they potentially might run into as to the ability to see 538 00:36:39,560 --> 00:36:43,280 Speaker 1: their own neglect structure, to understand what it was they were neglecting. 539 00:36:44,000 --> 00:36:47,239 Speaker 1: So, to give a strange answer to your question, I mean, 540 00:36:47,239 --> 00:36:51,040 Speaker 1: I think you can say things about the 541 00:36:51,200 --> 00:36:56,600 Speaker 1: shape of alien experience, even if you can't say much 542 00:36:56,640 --> 00:37:01,799 Speaker 1: of anything about the quality of that experience. Yeah. 543 00:37:01,880 --> 00:37:05,600 Speaker 1: One possible idea I had here was, I was wondering 544 00:37:05,640 --> 00:37:07,839 Speaker 1: if you think it would make a difference in the 545 00:37:07,920 --> 00:37:11,279 Speaker 1: evolution of an alien philosophy if it occurred in a 546 00:37:11,360 --> 00:37:15,520 Speaker 1: species that subjectively just had much less of a self-world 547 00:37:15,680 --> 00:37:18,360 Speaker 1: distinction than we have.
You know, we think of 548 00:37:18,400 --> 00:37:23,600 Speaker 1: ourselves as in the universe rather than being the universe. 549 00:37:23,680 --> 00:37:26,080 Speaker 1: But of course we are the universe. The universe is, 550 00:37:26,440 --> 00:37:29,040 Speaker 1: in small part, embodied in us and in the 551 00:37:29,080 --> 00:37:33,360 Speaker 1: brains that generate our consciousness. Um, we just don't 552 00:37:33,440 --> 00:37:36,319 Speaker 1: feel that way. Yet there are, you know, meditative 553 00:37:36,360 --> 00:37:39,960 Speaker 1: exercises; like, you can do meditation that's specifically aimed at 554 00:37:39,960 --> 00:37:42,200 Speaker 1: trying to get you into that state of mind where 555 00:37:42,200 --> 00:37:44,480 Speaker 1: you feel like you are the universe, you just are 556 00:37:44,560 --> 00:37:51,680 Speaker 1: the world you are experiencing. Right, right, apprehension exactly. 557 00:37:51,920 --> 00:37:54,600 Speaker 1: Uh, so, I wonder, you know, if you imagine 558 00:37:54,640 --> 00:37:59,759 Speaker 1: an alien species that doesn't naturally possess this self-world distinction, 559 00:38:00,120 --> 00:38:03,640 Speaker 1: that just feels that it is the universe, could we imagine 560 00:38:03,640 --> 00:38:07,520 Speaker 1: it would generate a radically different type of ontology of 561 00:38:07,560 --> 00:38:10,000 Speaker 1: the universe, of what it means to be, and all 562 00:38:10,040 --> 00:38:13,800 Speaker 1: of the socially derived philosophy, the meta-ethics and everything 563 00:38:13,840 --> 00:38:16,040 Speaker 1: like that? Yeah, and I think this is, I mean, 564 00:38:16,040 --> 00:38:19,040 Speaker 1: I think this is actually a fascinating question. I mean, 565 00:38:19,040 --> 00:38:21,399 Speaker 1: I've cracked my skull open against it a couple 566 00:38:21,400 --> 00:38:25,719 Speaker 1: of times now. And, um, I mean, what I think 567 00:38:25,920 --> 00:38:29,080 Speaker 1: is the case is this.
I think, I think metacognition, it 568 00:38:29,120 --> 00:38:32,640 Speaker 1: doesn't matter where you are in the universe, metacognition 569 00:38:32,800 --> 00:38:37,360 Speaker 1: is expensive. And if metacognition is expensive everywhere 570 00:38:37,600 --> 00:38:41,040 Speaker 1: you go in the universe, um, it is going to 571 00:38:41,160 --> 00:38:44,920 Speaker 1: be hard, very, very hard, for an alien species to 572 00:38:45,239 --> 00:38:50,040 Speaker 1: intuit its own continuity with its own environments. So, 573 00:38:51,000 --> 00:38:54,720 Speaker 1: I mean, they're going to be stuck relying on simple 574 00:38:54,719 --> 00:38:58,359 Speaker 1: heuristics in some way, shape or form. And if they 575 00:38:58,480 --> 00:39:05,160 Speaker 1: don't develop the metacognitive capacity to intuitively deduce, 576 00:39:05,880 --> 00:39:10,800 Speaker 1: you know, the fractionate, heuristic nature of their own metacognitive capacities, 577 00:39:11,320 --> 00:39:13,880 Speaker 1: they're going to be duped the same way we've been 578 00:39:13,960 --> 00:39:17,600 Speaker 1: duped into thinking that the information they're getting is sufficient 579 00:39:17,680 --> 00:39:23,600 Speaker 1: to draw conclusions. And think about human philosophy, how 580 00:39:23,680 --> 00:39:26,360 Speaker 1: strange it is. I mean, we've been asking the same 581 00:39:26,440 --> 00:39:33,120 Speaker 1: bloody questions for thousands of years. Thousands of years, no answers, 582 00:39:33,120 --> 00:39:38,719 Speaker 1: no answers. I mean, really, that's madness, isn't it? I mean, 583 00:39:38,960 --> 00:39:42,520 Speaker 1: doing the same thing and expecting a different result. I mean, 584 00:39:42,719 --> 00:39:45,400 Speaker 1: I know each philosopher tweaks something along the 585 00:39:45,400 --> 00:39:47,600 Speaker 1: way and thinks that tweak is going to give 586 00:39:47,640 --> 00:39:52,279 Speaker 1: them a different result. But it really is,
You know, 587 00:39:52,320 --> 00:39:54,799 Speaker 1: a paradigm for madness, if you think about 588 00:39:54,840 --> 00:40:00,520 Speaker 1: philosophy in those terms, traditional intentional philosophy, reflection on 589 00:40:00,560 --> 00:40:07,360 Speaker 1: the soul. Um, ah, I think that, yes, simply follows 590 00:40:07,880 --> 00:40:13,759 Speaker 1: from the biological expense of metacognition. So you can 591 00:40:13,800 --> 00:40:19,880 Speaker 1: have an alien species realize their continuity with nature, but 592 00:40:20,280 --> 00:40:24,279 Speaker 1: there will be some point in their past where they 593 00:40:24,320 --> 00:40:28,200 Speaker 1: assume the same kinds or similar kinds of exceptionalism as 594 00:40:28,239 --> 00:40:31,480 Speaker 1: we have, where they think that they're actually something apart, 595 00:40:31,960 --> 00:40:38,000 Speaker 1: simply because they lack the metacognitive ability to actually cognize 596 00:40:38,040 --> 00:40:44,040 Speaker 1: themselves as continuous with their environments. You see, I mean, 597 00:40:44,040 --> 00:40:50,400 Speaker 1: think about choice, for instance. There's just nothing more intuitively obvious to us than choices. 598 00:40:51,880 --> 00:40:57,680 Speaker 1: But they're just physically impossible when you consider them in 599 00:40:57,920 --> 00:41:04,240 Speaker 1: light of causal cognition. Um, like, the reason we believe 600 00:41:04,239 --> 00:41:05,880 Speaker 1: in choices is simply that we have no way of 601 00:41:05,880 --> 00:41:09,400 Speaker 1: intuiting any causal provenance for anything that happens inside of 602 00:41:09,400 --> 00:41:12,239 Speaker 1: our brains. And that will be the case for any 603 00:41:12,280 --> 00:41:15,840 Speaker 1: alien species whatsoever. So they're going to be blind to the 604 00:41:15,880 --> 00:41:21,680 Speaker 1: causal provenance that holds them continuous with the greater circuit 605 00:41:21,680 --> 00:41:24,960 Speaker 1: of nature.
So, and I think, you know, that 606 00:41:25,040 --> 00:41:28,560 Speaker 1: first move, the move that humans made, oh well, we 607 00:41:28,640 --> 00:41:33,239 Speaker 1: must be something apart from nature, we must be something exceptional, 608 00:41:33,680 --> 00:41:38,520 Speaker 1: something outside, transcendent, anomalous, autonomous, what have you, I think 609 00:41:38,560 --> 00:41:44,960 Speaker 1: that is going to be something any intelligent species, working 610 00:41:45,040 --> 00:41:49,480 Speaker 1: through the bugs of its metacognitive capacities, is going to 611 00:41:50,200 --> 00:41:54,160 Speaker 1: encounter. Awesome. Now, as we've been talking about the 612 00:41:54,200 --> 00:41:58,960 Speaker 1: crash scenarios and alien philosophy, uh, it's interesting to think 613 00:41:59,000 --> 00:42:03,080 Speaker 1: back on the books of the Second Apocalypse saga and, 614 00:42:03,120 --> 00:42:07,040 Speaker 1: you know, see how this is utilized throughout 615 00:42:07,080 --> 00:42:11,040 Speaker 1: the books, because this fantasy series has been the playground 616 00:42:11,040 --> 00:42:14,440 Speaker 1: for your ideas for so long here, and it's generally 617 00:42:14,440 --> 00:42:18,080 Speaker 1: classified as fantasy; you yourself classify it as fantasy. So I 618 00:42:18,120 --> 00:42:20,560 Speaker 1: wanted to ask: in advance of your recent, 619 00:42:20,760 --> 00:42:23,600 Speaker 1: uh, AMA on Reddit, you stated, quote, if 620 00:42:23,640 --> 00:42:26,799 Speaker 1: God is dead, then fantasy is his grave. Can you 621 00:42:26,840 --> 00:42:30,279 Speaker 1: expand on that for our audience? Um, yes, yes, sure. 622 00:42:30,600 --> 00:42:36,239 Speaker 1: I mean, the AMA organizers, um, uh, they give 623 00:42:36,280 --> 00:42:38,160 Speaker 1: you a fact sheet.
In the fact sheet, the first 624 00:42:38,160 --> 00:42:41,440 Speaker 1: thing the fact sheet recommends is that you come up 625 00:42:41,440 --> 00:42:45,239 Speaker 1: with something jaunty and light to introduce your 626 00:42:45,440 --> 00:42:50,160 Speaker 1: AMA. And I'm just such a contrarian, that was the first thing 627 00:42:50,200 --> 00:42:56,040 Speaker 1: that popped into my head. And, uh, but it was, Joe, 628 00:42:57,200 --> 00:43:02,279 Speaker 1: very serious at the same time. It's serious insofar as 629 00:43:02,360 --> 00:43:05,160 Speaker 1: it makes a reference, of course, to Nietzsche's famous claim 630 00:43:05,239 --> 00:43:08,520 Speaker 1: that God is dead, and that's usually taken as, uh, 631 00:43:09,160 --> 00:43:13,640 Speaker 1: an emblem for basically the way in which Enlightenment reason 632 00:43:14,280 --> 00:43:20,600 Speaker 1: collapses into nihilism. And, uh, so, in that sense, that 633 00:43:20,640 --> 00:43:24,520 Speaker 1: claim is very serious. If God is dead, then fantasy 634 00:43:24,640 --> 00:43:28,120 Speaker 1: is his grave. Um, if you see the death of 635 00:43:28,160 --> 00:43:35,920 Speaker 1: God as the problem of nihilism, then fantasy actually becomes 636 00:43:35,960 --> 00:43:39,200 Speaker 1: the greatest place in the world to try to understand 637 00:43:39,360 --> 00:43:44,759 Speaker 1: what the role of nihilism in contemporary society is. Fantasy 638 00:43:45,120 --> 00:43:50,000 Speaker 1: worlds are fantasy worlds because they resemble scriptural worlds. 639 00:43:50,000 --> 00:43:52,840 Speaker 1: The more your world looks like that of ancient India 640 00:43:53,120 --> 00:43:59,080 Speaker 1: or biblical Israel or Homeric Greece, the more readily your 641 00:43:59,120 --> 00:44:03,520 Speaker 1: world will be identified as fantasy. Why is that? Well, 642 00:44:04,160 --> 00:44:10,560 Speaker 1: in all these worlds, meaning is objective, morality is objective.
There's 643 00:44:10,600 --> 00:44:12,520 Speaker 1: a fact of the matter when it comes to right 644 00:44:12,600 --> 00:44:18,080 Speaker 1: or wrong. You know, intentions are objective. All 645 00:44:18,160 --> 00:44:24,520 Speaker 1: these things have objective existence, and that's what marks them 646 00:44:24,640 --> 00:44:29,160 Speaker 1: as being fantastic. Now, for me, that's crazy. For me, 647 00:44:30,080 --> 00:44:34,640 Speaker 1: that makes fantasy the canary in the coal mine. Fantasy 648 00:44:34,760 --> 00:44:39,760 Speaker 1: is where we can actually see meaning die in our culture. 649 00:44:40,800 --> 00:44:47,359 Speaker 1: We see all these worlds that science has rendered factually irrelevant, 650 00:44:48,040 --> 00:44:52,239 Speaker 1: you know, recreated over and over and over again for 651 00:44:52,360 --> 00:44:58,640 Speaker 1: the enjoyment and edification of millions of readers worldwide, 652 00:44:59,719 --> 00:45:03,960 Speaker 1: and that statement is just simply meant 653 00:45:04,000 --> 00:45:07,440 Speaker 1: to encapsulate that, you know, we can look at 654 00:45:07,480 --> 00:45:18,680 Speaker 1: fantasy not as a throwaway, escapist, culturally retrograde mode 655 00:45:18,800 --> 00:45:23,680 Speaker 1: of entertainment, you know, that's ideologically suspect in how many 656 00:45:23,680 --> 00:45:27,000 Speaker 1: ways. We can look at fantasy as actually the very 657 00:45:27,080 --> 00:45:35,200 Speaker 1: cutting edge of where human thought, nihilism, and 658 00:45:35,600 --> 00:45:38,880 Speaker 1: the fact of our own material nature come into conflict. 659 00:45:39,280 --> 00:45:43,520 Speaker 1: Fantasy is the spark, you know, when that flint 660 00:45:44,160 --> 00:45:47,759 Speaker 1: and that iron are struck.
So, Scott, if you see 661 00:45:47,800 --> 00:45:52,799 Speaker 1: fantasy embodying some of the meaning structures that we 662 00:45:52,960 --> 00:45:55,400 Speaker 1: used to get from mythology and religion, I wonder if 663 00:45:55,480 --> 00:46:00,360 Speaker 1: you see a kind of parallelism in modern fantasy 664 00:46:00,719 --> 00:46:05,400 Speaker 1: of the different types of meaning and ethical structures 665 00:46:05,400 --> 00:46:07,879 Speaker 1: that you would get from different ancient mythologies. One 666 00:46:07,880 --> 00:46:12,240 Speaker 1: thing that I think of is, like, the mythological world 667 00:46:12,320 --> 00:46:15,080 Speaker 1: that has a virtue ethic versus the one that has 668 00:46:15,080 --> 00:46:18,040 Speaker 1: a moral ethic. I don't know, you might consider 669 00:46:18,080 --> 00:46:21,920 Speaker 1: this like Greek mythology versus Buddhism or something, wherein 670 00:46:22,040 --> 00:46:25,600 Speaker 1: in one you have, you know, an ethic that's about being great, 671 00:46:25,840 --> 00:46:28,440 Speaker 1: and in another one an ethic that's about being good. 672 00:46:28,520 --> 00:46:31,640 Speaker 1: Do you see this paralleled in modern fantasy? I mean, 673 00:46:31,640 --> 00:46:37,120 Speaker 1: fantasy, like any other fictional platform in which to explore ideas, 674 00:46:37,719 --> 00:46:41,760 Speaker 1: just really sort of, you know, rips 675 00:46:41,760 --> 00:46:43,920 Speaker 1: the windows out and kicks the doors down, right? 676 00:46:44,000 --> 00:46:47,480 Speaker 1: I mean, you really can go almost any direction you want. 677 00:46:48,080 --> 00:46:53,920 Speaker 1: Stephen Shaviro, who's a fantastic cultural critic 678 00:46:53,960 --> 00:46:57,439 Speaker 1: who teaches at Wayne State University.
He has a book 679 00:46:57,480 --> 00:47:05,520 Speaker 1: called Discognition where he literally argues that fiction is, 680 00:47:06,760 --> 00:47:11,800 Speaker 1: or has become, the primary platform for being able to 681 00:47:11,840 --> 00:47:15,920 Speaker 1: explore philosophical ideas, for this very reason: the fact that 682 00:47:15,960 --> 00:47:21,600 Speaker 1: you can actually contrast a virtue ethics to a more 683 00:47:22,520 --> 00:47:29,960 Speaker 1: authoritarian ethics, the fact that you can ask questions 684 00:47:30,000 --> 00:47:34,120 Speaker 1: like, what is the meaning of life, without it just 685 00:47:34,160 --> 00:47:39,719 Speaker 1: simply collapsing into a joke, which is what happens outside 686 00:47:39,760 --> 00:47:43,640 Speaker 1: of fiction, it seems. I mean, if you think 687 00:47:43,640 --> 00:47:46,520 Speaker 1: of the meaning of life, what part of the bookstore do you 688 00:47:46,560 --> 00:47:49,319 Speaker 1: go to to discover the meaning of life? You don't 689 00:47:49,320 --> 00:47:50,960 Speaker 1: go to the science section, you don't go to the 690 00:47:50,960 --> 00:47:53,440 Speaker 1: philosophy section. You go to the New Age section. 691 00:47:54,400 --> 00:48:02,560 Speaker 1: Fiction gives you that freedom to actually show that, no, 692 00:48:02,840 --> 00:48:07,279 Speaker 1: I don't care what your a priori intuitions tell you, 693 00:48:07,680 --> 00:48:11,839 Speaker 1: we can actually build a plausible narrative as to, you know, 694 00:48:12,200 --> 00:48:18,000 Speaker 1: what would come of the death of meaning, for instance.
Um, 695 00:48:18,040 --> 00:48:24,040 Speaker 1: it just provides a platform that allows you to blow 696 00:48:24,200 --> 00:48:31,400 Speaker 1: past all of these philosophical constraints and explore things without 697 00:48:32,000 --> 00:48:36,799 Speaker 1: having to worry about being lectured by pedants, you know, 698 00:48:36,960 --> 00:48:40,640 Speaker 1: on your ignorance regarding this or that. All right, 699 00:48:40,640 --> 00:48:42,480 Speaker 1: we're gonna take a quick break, and when we come back, 700 00:48:42,520 --> 00:48:50,560 Speaker 1: we're gonna jump right back into the conversation. Now, Scott, 701 00:48:50,600 --> 00:48:53,520 Speaker 1: you've mentioned that, among other works, Frank Herbert's Dune 702 00:48:53,760 --> 00:48:57,480 Speaker 1: influenced your fantasy fiction. What's your take on Frank's 703 00:48:57,600 --> 00:49:00,719 Speaker 1: run of Dune books, and were there any lessons there 704 00:49:00,760 --> 00:49:03,440 Speaker 1: that informed your own world building and approach to a 705 00:49:03,520 --> 00:49:07,759 Speaker 1: multi-volume series? Yeah, this is a very good question, 706 00:49:07,800 --> 00:49:11,040 Speaker 1: I mean, because my experience reading Dune 707 00:49:11,480 --> 00:49:16,920 Speaker 1: probably has had the biggest influence 708 00:49:17,080 --> 00:49:24,960 Speaker 1: on my discipline writing these books over the 709 00:49:24,960 --> 00:49:30,080 Speaker 1: past decade. I mean, so my experience reading Dune 710 00:49:30,200 --> 00:49:35,160 Speaker 1: was, I read Dune, it blew me away, absolutely blew me away, 711 00:49:35,520 --> 00:49:37,360 Speaker 1: and so what do I do? I go 712 00:49:37,440 --> 00:49:39,799 Speaker 1: out and buy Children of Dune.
I bought 713 00:49:39,840 --> 00:49:45,240 Speaker 1: God Emperor and Dune Messiah, and I read them in order, 714 00:49:45,680 --> 00:49:53,680 Speaker 1: and God Emperor, okay, but he lost me there. And yeah, 715 00:49:54,239 --> 00:49:58,319 Speaker 1: I could just feel that, I just had 716 00:49:58,320 --> 00:50:02,360 Speaker 1: a feeling that he had kind of lost focus, 717 00:50:03,400 --> 00:50:07,840 Speaker 1: lost the spark that drove the original vision. And so 718 00:50:07,960 --> 00:50:11,360 Speaker 1: for me, reading Dune was sort of a process of 719 00:50:11,400 --> 00:50:14,920 Speaker 1: being disappointed in what I thought was his 720 00:50:15,200 --> 00:50:18,439 Speaker 1: commitment to the vision. And so when I set 721 00:50:18,480 --> 00:50:21,120 Speaker 1: out and started these books, I'll never forget. I mean, 722 00:50:21,440 --> 00:50:24,080 Speaker 1: The Darkness That Comes Before had come out, I think 723 00:50:24,160 --> 00:50:26,800 Speaker 1: it was two thousand four, two thousand three, and I found 724 00:50:26,840 --> 00:50:33,439 Speaker 1: myself having lunch with one of my idols, as 725 00:50:33,680 --> 00:50:37,960 Speaker 1: a younger man anyways, Guy Kay. And we're 726 00:50:38,560 --> 00:50:41,200 Speaker 1: in Greektown in Toronto, and he asked me about 727 00:50:41,280 --> 00:50:43,520 Speaker 1: my next book, and I told him it was a sequel, 728 00:50:43,880 --> 00:50:45,640 Speaker 1: and he said, well, how many sequels do you have planned? 729 00:50:45,680 --> 00:50:50,200 Speaker 1: And I said, well, I think nine. And he literally 730 00:50:50,280 --> 00:50:58,080 Speaker 1: said, the last thing the world needs is another multi-volume epic fantasy.
731 00:50:59,360 --> 00:51:02,680 Speaker 1: And no, my heart did not sink through the 732 00:51:02,760 --> 00:51:06,040 Speaker 1: bottom of my shoe. I said, no, I 733 00:51:06,080 --> 00:51:10,280 Speaker 1: think that what the world really needs is a multi- 734 00:51:10,360 --> 00:51:14,080 Speaker 1: volume epic fantasy that does not lose sight of its 735 00:51:14,200 --> 00:51:21,480 Speaker 1: vision, Guy. And so that's the thing, I mean. 736 00:51:22,360 --> 00:51:24,440 Speaker 1: And I was a young writer writing The Prince of 737 00:51:24,480 --> 00:51:27,160 Speaker 1: Nothing and The Warrior-Prophet, I mean, I was teaching 738 00:51:27,160 --> 00:51:31,439 Speaker 1: at the time, and The Thousandfold Thought even 739 00:51:31,520 --> 00:51:35,320 Speaker 1: more so, I really felt the pressure to actually write 740 00:51:35,320 --> 00:51:40,560 Speaker 1: those books inside of my delivery dates. And 741 00:51:40,600 --> 00:51:43,400 Speaker 1: I think both those books actually suffer for it, and 742 00:51:43,440 --> 00:51:46,479 Speaker 1: I think the vision suffered for it. 743 00:51:46,800 --> 00:51:49,360 Speaker 1: I still love the books, but I would love to 744 00:51:49,360 --> 00:51:52,560 Speaker 1: rewrite them as well. Moving into The Aspect-Emperor, 745 00:51:52,680 --> 00:51:56,600 Speaker 1: I decided, no, I mean, that's a problem.
I committed 746 00:51:56,640 --> 00:51:59,480 Speaker 1: to the vision in the very beginning, and that means 747 00:51:59,520 --> 00:52:01,560 Speaker 1: I just let the vision call the shots. 748 00:52:02,200 --> 00:52:05,600 Speaker 1: And so, you know, with The Judging Eye, then The 749 00:52:05,600 --> 00:52:08,320 Speaker 1: White-Luck Warrior, and then of course the final volume 750 00:52:08,360 --> 00:52:11,600 Speaker 1: of The Aspect-Emperor halving into two, that was just 751 00:52:11,680 --> 00:52:14,640 Speaker 1: exactly what happened, you know. Rather than worry about 752 00:52:14,680 --> 00:52:17,160 Speaker 1: what my agent would say or what my editors would say, 753 00:52:17,360 --> 00:52:20,799 Speaker 1: and, I hate to say it, my readers, I 754 00:52:20,880 --> 00:52:24,600 Speaker 1: thought, you know what, everyone will be better served if 755 00:52:24,760 --> 00:52:28,000 Speaker 1: the vision is better served, because really, that's what people 756 00:52:28,040 --> 00:52:31,280 Speaker 1: are signing on for. They want to follow this story, 757 00:52:31,600 --> 00:52:34,680 Speaker 1: one story, all the way through to the conclusion. They 758 00:52:34,680 --> 00:52:37,080 Speaker 1: don't want it to turn into a different story. They 759 00:52:37,120 --> 00:52:40,160 Speaker 1: don't want it to suffer because of my, you know... 760 00:52:40,360 --> 00:52:43,160 Speaker 1: I mean, I spent ten thousand hours with these characters. 761 00:52:43,800 --> 00:52:47,440 Speaker 1: I hate them at times. You know, readers spend 762 00:52:47,800 --> 00:52:50,359 Speaker 1: a few dozen hours with them, and they don't get 763 00:52:50,360 --> 00:52:54,800 Speaker 1: tired of them the way a writer does. And so, 764 00:52:54,840 --> 00:52:59,239 Speaker 1: really, all along, that experience reading Dune has really 765 00:52:59,239 --> 00:53:02,799 Speaker 1: sort of informed my decision making.
You know, if I'm 766 00:53:02,840 --> 00:53:06,000 Speaker 1: hating this character, I'll stop writing. I'm not going to 767 00:53:06,120 --> 00:53:08,560 Speaker 1: force it, right? No, I love that character, I'm 768 00:53:08,600 --> 00:53:10,759 Speaker 1: just tired of that character right now. I'm not going 769 00:53:10,840 --> 00:53:14,080 Speaker 1: to invent a new character, right, because I'd just be 770 00:53:14,120 --> 00:53:17,600 Speaker 1: freshening things up for me, the writer. My readers, 771 00:53:17,680 --> 00:53:19,920 Speaker 1: they don't want a new character. They just want to 772 00:53:19,960 --> 00:53:22,840 Speaker 1: see what happens to this character, this beloved character or 773 00:53:22,880 --> 00:53:26,839 Speaker 1: hated character or what have you. And, yeah, 774 00:53:27,000 --> 00:53:30,680 Speaker 1: so my whole M.O. kind of became one of 775 00:53:30,800 --> 00:53:33,520 Speaker 1: sit back, let the story tell itself the way it 776 00:53:33,680 --> 00:53:36,120 Speaker 1: needs to be told. And I think that's part of 777 00:53:36,120 --> 00:53:41,400 Speaker 1: the reason why I just feel so supremely confident in 778 00:53:41,440 --> 00:53:45,000 Speaker 1: these final two books. I know there's people who disagree, 779 00:53:45,040 --> 00:53:49,000 Speaker 1: there always is, but I really feel like, I 780 00:53:49,040 --> 00:53:52,239 Speaker 1: really feel like I accomplished the promise I made 781 00:53:52,239 --> 00:53:55,839 Speaker 1: to Kay all those years back in Greektown. Well, 782 00:53:55,960 --> 00:53:58,720 Speaker 1: I really enjoyed the most recent one, and I'm looking 783 00:53:58,960 --> 00:54:05,319 Speaker 1: forward very much to The Unholy Consult coming out this summer, correct? Yes, yes, yes, 784 00:54:05,640 --> 00:54:08,680 Speaker 1: hopefully you agree with me. Hopefully you 785 00:54:08,680 --> 00:54:11,719 Speaker 1: think it works.
It's a risky series, right? I mean, 786 00:54:12,360 --> 00:54:15,680 Speaker 1: it certainly is. I always feel it's a risky 787 00:54:15,680 --> 00:54:18,640 Speaker 1: scenario reading one of your books, because I don't know 788 00:54:18,760 --> 00:54:21,480 Speaker 1: to what extent you're going to damage my psyche a 789 00:54:21,520 --> 00:54:24,399 Speaker 1: little bit, you know, or force me... So much 790 00:54:24,400 --> 00:54:28,719 Speaker 1: fantasy takes one out of oneself, you know, and 791 00:54:28,760 --> 00:54:32,000 Speaker 1: it serves at times as just pure escapism, 792 00:54:32,080 --> 00:54:34,799 Speaker 1: but your work always manages to sort of do 793 00:54:35,000 --> 00:54:37,239 Speaker 1: both at the same time. So I'm able to 794 00:54:37,480 --> 00:54:41,719 Speaker 1: escape into a dark, rich imagined world, but then also 795 00:54:41,800 --> 00:54:45,400 Speaker 1: you're forcing me to ask, at times, troubling questions about myself 796 00:54:45,480 --> 00:54:49,279 Speaker 1: and about humanity in general. That's the goal. I mean, 797 00:54:49,280 --> 00:54:52,080 Speaker 1: what you just said there, that's writer's gold. 798 00:54:53,640 --> 00:54:55,600 Speaker 1: That's what I'm aiming for. What I'm always 799 00:54:55,640 --> 00:54:59,560 Speaker 1: trying to do is to try to immerse and 800 00:55:00,160 --> 00:55:03,560 Speaker 1: push out at the exact same time. That stuff, 801 00:55:04,160 --> 00:55:05,959 Speaker 1: it doesn't work for a lot of readers, it's true. 802 00:55:06,560 --> 00:55:08,840 Speaker 1: Now, you've mentioned in interviews before how you might have 803 00:55:08,840 --> 00:55:12,160 Speaker 1: written some aspects of Neuropath differently if you'd finished it 804 00:55:12,600 --> 00:55:15,520 Speaker 1: after the birth of your daughter.
Has becoming a father 805 00:55:15,640 --> 00:55:19,160 Speaker 1: affected the arc of the Second Apocalypse saga? Well, 806 00:55:19,200 --> 00:55:23,680 Speaker 1: I certainly hope not. I mean, with the Second Apocalypse saga, 807 00:55:24,239 --> 00:55:26,760 Speaker 1: the idea came to me when I was seventeen 808 00:55:26,840 --> 00:55:31,080 Speaker 1: years old, the basic narrative idea, and it's actually remarkably unchanged. 809 00:55:31,200 --> 00:55:33,360 Speaker 1: I mean, that's given the fact that it's, you know, 810 00:55:33,440 --> 00:55:38,920 Speaker 1: sort of soaked up so much of my education, basically, 811 00:55:39,040 --> 00:55:42,160 Speaker 1: you know, all those years in university, you know, studying literature, 812 00:55:42,200 --> 00:55:46,759 Speaker 1: and then all those years studying history, philosophy, and 813 00:55:46,760 --> 00:55:52,440 Speaker 1: then all these years now soaking up cognitive science. 814 00:55:52,480 --> 00:55:56,719 Speaker 1: It's still that narrative, which ends at the end 815 00:55:56,960 --> 00:56:01,360 Speaker 1: of The Unholy Consult, it's still the same. So, 816 00:56:01,440 --> 00:56:04,520 Speaker 1: for a couple of the 817 00:56:04,560 --> 00:56:07,719 Speaker 1: final scenes of The Unholy Consult, I even dug out 818 00:56:08,320 --> 00:56:13,200 Speaker 1: material that I had written almost twenty years ago to 819 00:56:13,360 --> 00:56:15,960 Speaker 1: rework and put into the end of the book, and 820 00:56:16,000 --> 00:56:17,959 Speaker 1: that was really exciting for me.
I mean, 821 00:56:18,719 --> 00:56:23,319 Speaker 1: the fact that, you know, when you aim at a destination, 822 00:56:23,360 --> 00:56:27,279 Speaker 1: at a far, far away destination, and when the 823 00:56:27,360 --> 00:56:31,920 Speaker 1: journey is sort of fraught with reversals and misdirection, 824 00:56:32,160 --> 00:56:35,560 Speaker 1: as this journey has been for me, when 825 00:56:35,600 --> 00:56:40,040 Speaker 1: you actually arrive at your destination... I've sometimes wondered if this 826 00:56:40,120 --> 00:56:42,440 Speaker 1: must be what a nuclear missile feels like when it 827 00:56:42,480 --> 00:56:48,000 Speaker 1: actually hits its target. I mean, it just doesn't seem 828 00:56:48,040 --> 00:56:50,320 Speaker 1: possible that you can actually arrive at the place you 829 00:56:50,360 --> 00:56:54,640 Speaker 1: set out to arrive. And I really feel as 830 00:56:54,640 --> 00:57:00,360 Speaker 1: though I've closed the deal when it comes to 831 00:57:00,440 --> 00:57:04,000 Speaker 1: that first seventeen-year-old idea. So there's more to come? 832 00:57:04,040 --> 00:57:07,600 Speaker 1: There's more to come yet. But I'm interested in something 833 00:57:07,640 --> 00:57:10,640 Speaker 1: you mentioned briefly in the paper, which is Susan Schneider's 834 00:57:10,680 --> 00:57:15,120 Speaker 1: idea that if we encounter alien intelligence, it's less likely 835 00:57:15,280 --> 00:57:18,920 Speaker 1: to be biological intelligence and more likely to be machine 836 00:57:18,960 --> 00:57:21,800 Speaker 1: intelligence, or what you might call, more broadly, post- 837 00:57:21,840 --> 00:57:26,680 Speaker 1: biological intelligence. So I assumed, based on the way 838 00:57:26,680 --> 00:57:28,520 Speaker 1: you mentioned it, that you agree with that. Would you 839 00:57:28,560 --> 00:57:31,080 Speaker 1: agree with that? And what do you think about it? Oh,
yeah, yeah, no, 840 00:57:31,200 --> 00:57:36,280 Speaker 1: I think the argument is pretty ironclad. 841 00:57:36,400 --> 00:57:38,600 Speaker 1: I mean, if you just look at where we stand 842 00:57:39,120 --> 00:57:42,360 Speaker 1: in terms of our ability to travel between the stars, 843 00:57:43,040 --> 00:57:46,040 Speaker 1: and where we stand in terms of our ability to 844 00:57:46,760 --> 00:57:55,800 Speaker 1: manipulate our biology and basically offload cognition onto 845 00:57:57,080 --> 00:58:03,680 Speaker 1: our machinic artifacts, I think it seems pretty clear that 846 00:58:03,840 --> 00:58:10,520 Speaker 1: the transition from biology to post-biology is actually something 847 00:58:10,560 --> 00:58:18,680 Speaker 1: that becomes technically feasible earlier than interstellar travel. 848 00:58:19,160 --> 00:58:21,160 Speaker 1: Another question that comes to mind, and this might 849 00:58:21,200 --> 00:58:24,520 Speaker 1: be just too specific, and I'm, you know, tugging at 850 00:58:24,600 --> 00:58:28,480 Speaker 1: something that's supposed to remain mysterious, but there's 851 00:58:28,520 --> 00:58:31,400 Speaker 1: a scene in, I believe it's The Judging Eye, where 852 00:58:31,640 --> 00:58:34,800 Speaker 1: the characters are walking through the ruins of the Nonmen, 853 00:58:35,440 --> 00:58:38,919 Speaker 1: and you do a wonderful job 854 00:58:38,960 --> 00:58:42,840 Speaker 1: explaining their artistic style, their use of sculpture. 855 00:58:43,240 --> 00:58:47,160 Speaker 1: And I believe it's mentioned that the Nonmen cannot 856 00:58:47,440 --> 00:58:51,480 Speaker 1: see paintings, and that's one of the reasons they 857 00:58:51,880 --> 00:58:55,680 Speaker 1: depend on sculpture. Can you elaborate on that 858 00:58:55,760 --> 00:58:57,240 Speaker 1: at all?
Or is that meant to be sort of 859 00:58:57,280 --> 00:59:01,960 Speaker 1: a mysterious cipher? You always want to distinguish the 860 00:59:02,080 --> 00:59:07,560 Speaker 1: various races and species that you create in 861 00:59:07,920 --> 00:59:12,720 Speaker 1: speculative fiction, and this notion of the Nonmen not 862 00:59:12,760 --> 00:59:15,840 Speaker 1: being able to see two-dimensional visual representations is 863 00:59:16,680 --> 00:59:20,400 Speaker 1: sort of a textural detail along those lines. 864 00:59:20,880 --> 00:59:24,040 Speaker 1: But it actually does have a rationale. I mean, the 865 00:59:24,080 --> 00:59:29,200 Speaker 1: idea is, just think of the cavemen at 866 00:59:29,480 --> 00:59:37,400 Speaker 1: Chauvet in France actually drawing, you know, their charcoal- 867 00:59:37,640 --> 00:59:41,040 Speaker 1: stained fingers across the cave wall for the first time 868 00:59:41,880 --> 00:59:44,880 Speaker 1: and realizing you can see shape in that. In the 869 00:59:44,880 --> 00:59:49,160 Speaker 1: experiment, it turns out, for humans, we can actually see 870 00:59:49,240 --> 00:59:56,120 Speaker 1: horses and bison and figures of humans given a very, 871 00:59:56,280 --> 01:00:00,400 Speaker 1: very small amount of visual information; a finger covered in charcoal 872 01:00:01,360 --> 01:00:04,400 Speaker 1: dragged across the cave wall is enough for us to 873 01:00:04,400 --> 01:00:07,960 Speaker 1: be able to recognize a lion or a horse. The 874 01:00:08,400 --> 01:00:11,480 Speaker 1: famous horses of Chauvet are a wonderful example. 875 01:00:12,520 --> 01:00:16,360 Speaker 1: For the Nonmen, their ability, you know, to cue 876 01:00:17,880 --> 01:00:22,120 Speaker 1: recognition of scenes
just simply requires a bit more information, 877 01:00:22,400 --> 01:00:26,840 Speaker 1: in particular it requires depth information, so they can see 878 01:00:26,880 --> 01:00:32,400 Speaker 1: representation the way we can. They just have difficulty with two 879 01:00:32,400 --> 01:00:38,120 Speaker 1: dimensions, just simply because the amount of information that 880 01:00:38,320 --> 01:00:42,440 Speaker 1: is given in a two-dimensional representation isn't 881 01:00:42,560 --> 01:00:47,120 Speaker 1: enough to actually cue the cognitive systems involved in recognizing 882 01:00:47,280 --> 01:00:52,320 Speaker 1: horses and tigers and what have you. So there's 883 01:00:53,240 --> 01:00:56,160 Speaker 1: a kind of... I mean, it's just one of 884 01:00:56,240 --> 01:00:58,840 Speaker 1: many ways in which, you know, my blind brain theory 885 01:00:59,760 --> 01:01:05,760 Speaker 1: has sort of nuanced the background and 886 01:01:05,880 --> 01:01:09,120 Speaker 1: the landscape of the novels. And the blind brain theory, 887 01:01:09,720 --> 01:01:11,800 Speaker 1: I don't know if you want to elaborate here, but 888 01:01:11,840 --> 01:01:13,840 Speaker 1: this is basically getting down to the 889 01:01:14,240 --> 01:01:16,640 Speaker 1: idea that the brain cannot perceive itself, and that's one 890 01:01:16,680 --> 01:01:20,320 Speaker 1: of the big stumbling blocks to understanding ourselves and our 891 01:01:20,320 --> 01:01:24,600 Speaker 1: place in the world. Right? Yeah, exactly.
It's just basically 892 01:01:25,200 --> 01:01:30,080 Speaker 1: the description I gave earlier, this notion of meta- 893 01:01:30,160 --> 01:01:34,560 Speaker 1: cognition being largely, almost utterly, blind as to the brain's 894 01:01:34,640 --> 01:01:39,400 Speaker 1: own operations, and, more importantly, being blind to 895 01:01:39,520 --> 01:01:42,760 Speaker 1: that blindness, and so being convinced that it actually can 896 01:01:42,840 --> 01:01:47,240 Speaker 1: see its own operations. And that's the explanation for why, 897 01:01:47,760 --> 01:01:51,600 Speaker 1: you know, we have this madness called philosophy, where we 898 01:01:51,680 --> 01:01:55,000 Speaker 1: simply ask the same questions over and over again, assuming 899 01:01:55,000 --> 01:01:58,840 Speaker 1: there are answers. I know in past 900 01:01:58,880 --> 01:02:02,640 Speaker 1: interviews you've referred to the ending of Neuropath, which, 901 01:02:02,880 --> 01:02:05,160 Speaker 1: no spoilers for anyone, but it's a particularly 902 01:02:05,200 --> 01:02:08,600 Speaker 1: bleak ending in many ways, and you've said 903 01:02:08,640 --> 01:02:11,160 Speaker 1: that you might have written that differently if you had 904 01:02:11,160 --> 01:02:14,040 Speaker 1: written it after the birth of your daughter. Yeah, I 905 01:02:14,080 --> 01:02:17,320 Speaker 1: almost certainly would have written it differently, almost certainly would have. 906 01:02:18,200 --> 01:02:21,040 Speaker 1: It's a hard book to read. I mean, I 907 01:02:21,080 --> 01:02:23,680 Speaker 1: had a fan, like, I don't recommend the book. Actually, 908 01:02:23,680 --> 01:02:26,720 Speaker 1: I've stopped recommending the book to people.
I've had good 909 01:02:26,720 --> 01:02:29,320 Speaker 1: friends email me, you know, who waited a couple of 910 01:02:29,360 --> 01:02:31,920 Speaker 1: years before reading the book, and read the book, and 911 01:02:31,960 --> 01:02:34,400 Speaker 1: they basically said, I wish I hadn't read this book. 912 01:02:35,160 --> 01:02:39,520 Speaker 1: I've been feeling depressed for weeks. I found it 913 01:02:39,560 --> 01:02:44,000 Speaker 1: depressing but fulfilling reading. I do occasionally recommend 914 01:02:44,000 --> 01:02:46,640 Speaker 1: it to people. I think you gave me a copy, 915 01:02:46,760 --> 01:02:52,200 Speaker 1: Rob. Now, speaking of parenthood and dark stories, 916 01:02:52,600 --> 01:02:54,600 Speaker 1: I assume reading plays a big role in your household. 917 01:02:55,400 --> 01:02:58,520 Speaker 1: What's the darkest children's book you've found yourself reading to 918 01:02:58,600 --> 01:03:01,840 Speaker 1: your daughter? And is there a particular genre book 919 01:03:02,000 --> 01:03:04,880 Speaker 1: you're looking forward to being able to read with her? Well, 920 01:03:05,000 --> 01:03:06,920 Speaker 1: I mean, I'm looking forward to being able to read 921 01:03:06,920 --> 01:03:09,120 Speaker 1: The Hobbit with her. I mean, The Hobbit was a 922 01:03:09,560 --> 01:03:14,480 Speaker 1: watershed read for me, and the idea of actually 923 01:03:14,480 --> 01:03:16,440 Speaker 1: being able to share that with my daughter makes me 924 01:03:16,440 --> 01:03:19,600 Speaker 1: feel giddy. I mean, I'm crazy about it, I'll put the balloons 925 01:03:19,600 --> 01:03:23,800 Speaker 1: and streamers up or something like that. But it's a 926 01:03:23,800 --> 01:03:27,840 Speaker 1: pretty dark book. But I mean, all the others, the Grimm 927 01:03:27,920 --> 01:03:30,960 Speaker 1: fairy tales, the Mother Goose fairy tales.
I mean, 928 01:03:30,960 --> 01:03:33,640 Speaker 1: we have a couple of collections and I read 929 01:03:33,680 --> 01:03:36,360 Speaker 1: them to my daughter, and my wife would prefer that 930 01:03:36,440 --> 01:03:39,240 Speaker 1: I not read them, just simply because they're so strange 931 01:03:39,440 --> 01:03:42,960 Speaker 1: and violent, right? I mean, Hansel and Gretel, 932 01:03:43,680 --> 01:03:51,480 Speaker 1: holy moly, what a story that is. And 933 01:03:51,800 --> 01:03:56,080 Speaker 1: I worry that, as a culture, we're really 934 01:03:56,120 --> 01:04:02,600 Speaker 1: losing faith in our children's ability to deal with darkness 935 01:04:02,840 --> 01:04:07,000 Speaker 1: and unsettling thoughts. And I actually think that that's a 936 01:04:07,680 --> 01:04:12,040 Speaker 1: huge social problem moving forward. I remember yearning 937 01:04:12,120 --> 01:04:13,640 Speaker 1: for that kind of stuff when I was a kid. 938 01:04:13,680 --> 01:04:16,120 Speaker 1: I remember when I was a little kid, I wanted 939 01:04:16,320 --> 01:04:20,400 Speaker 1: more dark. I was, like, not satisfied with how dark 940 01:04:20,440 --> 01:04:23,560 Speaker 1: and disturbing and weird the children's stories I was supplied with 941 01:04:23,640 --> 01:04:27,600 Speaker 1: were. Ask me what my favorite Bible story is. What 942 01:04:27,800 --> 01:04:33,400 Speaker 1: is your favorite Bible story? Sodom and Gomorrah. I 943 01:04:33,520 --> 01:04:38,480 Speaker 1: just thought that was the coolest story ever. I mean, 944 01:04:39,280 --> 01:04:43,960 Speaker 1: obviously it's not a cool story at all. Well, yeah, 945 01:04:43,960 --> 01:04:47,240 Speaker 1: and plus they have these illustrations of Lot's daughters seducing him 946 01:04:47,240 --> 01:04:49,560 Speaker 1: in the cave afterwards. I mean, it's a downright 947 01:04:49,640 --> 01:04:53,040 Speaker 1: creepy story.
But those are the kinds of things I liked. 948 01:04:53,080 --> 01:04:56,120 Speaker 1: And I mean, I grew up in a religious household. Those 949 01:04:56,160 --> 01:05:02,120 Speaker 1: are the things that caught my attention and held it. All right, 950 01:05:02,160 --> 01:05:04,240 Speaker 1: so one last question here, and this is just an 951 01:05:04,240 --> 01:05:08,080 Speaker 1: idle curiosity. So I've long wondered: are Aurang 952 01:05:08,160 --> 01:05:11,600 Speaker 1: and Aurax, the last living Inchoroi in your books, 953 01:05:11,920 --> 01:05:14,280 Speaker 1: are they at all a wink at Kang and Kodos 954 01:05:14,280 --> 01:05:16,920 Speaker 1: on The Simpsons? Not that they have anything to do 955 01:05:17,040 --> 01:05:19,160 Speaker 1: with each other except the fact that they are pairs 956 01:05:19,160 --> 01:05:24,960 Speaker 1: of aliens. Yeah, yeah, no, they're not. But it's pretty 957 01:05:25,200 --> 01:05:32,320 Speaker 1: hard to move forward now. Thank you for that little piece 958 01:05:32,320 --> 01:05:37,560 Speaker 1: of pollution there. I mean, well, maybe it just changes 959 01:05:37,600 --> 01:05:41,960 Speaker 1: The Simpsons. It elevates The Simpsons rather than... Yeah, as 960 01:05:42,000 --> 01:05:45,520 Speaker 1: if The Simpsons is the only immovable object in the universe. 961 01:05:45,560 --> 01:05:52,080 Speaker 1: I don't think it does, anyways. It's funny, like, I 962 01:05:52,160 --> 01:05:57,200 Speaker 1: stopped reading message board banter on my books and stuff, 963 01:05:57,640 --> 01:06:00,680 Speaker 1: just simply to keep the slate clean. 964 01:06:01,080 --> 01:06:04,320 Speaker 1: I just find, you know, people will 965 01:06:04,480 --> 01:06:06,880 Speaker 1: mention something to you and it'll be like, hey, that's 966 01:06:06,880 --> 01:06:14,600 Speaker 1: a great bloody idea, actually better than my idea, and, 967 01:06:15,320 --> 01:06:17,959 Speaker 1: I mean, it may feel that way.
Your own ideas 968 01:06:18,040 --> 01:06:21,560 Speaker 1: always feel old to me, right? Um. I just 969 01:06:21,680 --> 01:06:25,160 Speaker 1: try to shelter myself from that a bit so I 970 01:06:25,200 --> 01:06:28,640 Speaker 1: don't get too much interference when it comes to following 971 01:06:28,640 --> 01:06:30,920 Speaker 1: through the original vision that I went on and on about. 972 01:06:31,960 --> 01:06:33,880 Speaker 1: Oh yeah, I can imagine where you definitely want to 973 01:06:33,880 --> 01:06:37,000 Speaker 1: plow ahead as much as possible towards that original vision 974 01:06:37,040 --> 01:06:41,560 Speaker 1: without polluting it in any way, basically, through interaction. Yeah, I 975 01:06:41,560 --> 01:06:43,720 Speaker 1: mean the big thing is, like I say, you 976 01:06:43,760 --> 01:06:46,600 Speaker 1: spend ten thousand hours with these characters and this narrative 977 01:06:46,640 --> 01:06:52,040 Speaker 1: and stuff, and the tendency is to keep tinkering until you make it broke, right? 978 01:06:52,120 --> 01:06:54,480 Speaker 1: I mean, there's just so many ways in which an 979 01:06:54,480 --> 01:06:59,240 Speaker 1: author's exhaustion with the project ends up creeping into that project. 980 01:06:59,800 --> 01:07:02,560 Speaker 1: And for me, so much of it just comes 981 01:07:02,600 --> 01:07:07,320 Speaker 1: down to, uh, being mindful of my own exhaustion, I mean, 982 01:07:07,320 --> 01:07:10,480 Speaker 1: how I feel about my own material, and, you know, 983 01:07:10,640 --> 01:07:13,880 Speaker 1: making sure I'm in the right place while I'm working 984 01:07:13,920 --> 01:07:17,720 Speaker 1: on the books so that the books actually express, you know, 985 01:07:17,760 --> 01:07:19,959 Speaker 1: what I want them to. Well, on that note, 986 01:07:20,080 --> 01:07:23,160 Speaker 1: thanks again for chatting with us, for answering our questions.
987 01:07:23,720 --> 01:07:27,000 Speaker 1: Your books are all available right now, all the 988 01:07:27,040 --> 01:07:31,120 Speaker 1: books in the Second Apocalypse saga, Disciple of the Dog, Neuropath, 989 01:07:31,480 --> 01:07:33,320 Speaker 1: you know, for the brave, I guess, based on our 990 01:07:33,360 --> 01:07:37,160 Speaker 1: discussions of it here, and The Unholy Consult, the latest 991 01:07:37,160 --> 01:07:40,880 Speaker 1: book in the Second Apocalypse saga, coming out this summer. 992 01:07:41,040 --> 01:07:46,280 Speaker 1: This summer, yeah, and it's the culmination of what feels 993 01:07:46,320 --> 01:07:50,000 Speaker 1: like a lifelong quest for me now, so basically an, uh, 994 01:07:50,440 --> 01:07:56,400 Speaker 1: adolescent narrative that I've managed to pursue through, you know, 995 01:07:56,720 --> 01:08:02,600 Speaker 1: graduate degrees and job changes. Strangely enough, the world, the 996 01:08:02,640 --> 01:08:06,959 Speaker 1: world just keeps trying to make my horrific vision come true. 997 01:08:07,000 --> 01:08:12,080 Speaker 1: So it feels like it's only become more relevant, um, 998 01:08:12,160 --> 01:08:14,800 Speaker 1: now than it was back at the turn of 999 01:08:14,840 --> 01:08:18,519 Speaker 1: the millennium when I first got serious about publishing. And 1000 01:08:18,720 --> 01:08:20,559 Speaker 1: if anyone out there wants to follow you, you're 1001 01:08:20,720 --> 01:08:23,360 Speaker 1: on social media, but also your blog Three Pound 1002 01:08:23,439 --> 01:08:26,120 Speaker 1: Brain, correct, is a great way to sort of 1003 01:08:26,200 --> 01:08:31,000 Speaker 1: keep up with you regularly. Please feel free to pester me 1004 01:08:31,120 --> 01:08:34,840 Speaker 1: with questions and whatnot, or just simply shoot 1005 01:08:34,840 --> 01:08:36,719 Speaker 1: me an email. You'll find my email on my blog.
1006 01:08:37,680 --> 01:08:41,720 Speaker 1: But just google Three Pound Brain and Bakker and 1007 01:08:41,760 --> 01:08:47,400 Speaker 1: you'll find yourself at my profane doorstep soon enough. All right, 1008 01:08:47,479 --> 01:08:53,960 Speaker 1: thanks Scott. Thanks Scott. Thank you guys. So there you 1009 01:08:54,000 --> 01:08:56,280 Speaker 1: have it. Thanks again to R. Scott Bakker for taking 1010 01:08:56,320 --> 01:08:58,200 Speaker 1: time out of his day to chat with us about 1011 01:08:58,200 --> 01:09:01,400 Speaker 1: alien philosophy and his dark fantasy works. It was really 1012 01:09:01,880 --> 01:09:05,160 Speaker 1: a treat for me to finally chat with R. Scott 1013 01:09:05,200 --> 01:09:07,880 Speaker 1: Bakker and throw out a few questions here and there. I 1014 01:09:07,960 --> 01:09:11,080 Speaker 1: tried to limit my geeky questions in this interview, but 1015 01:09:11,160 --> 01:09:13,559 Speaker 1: I worked in one or two there. Again, the first 1016 01:09:13,560 --> 01:09:15,840 Speaker 1: six books in the Second Apocalypse saga are out there, 1017 01:09:15,960 --> 01:09:19,559 Speaker 1: and the next one, The Unholy Consult, comes out July 1018 01:09:19,680 --> 01:09:23,120 Speaker 1: eleventh from Overlook Press. If you want to learn more about 1019 01:09:23,120 --> 01:09:26,240 Speaker 1: R. Scott Bakker and follow him, his blog, Three 1020 01:09:26,240 --> 01:09:30,080 Speaker 1: Pound Brain, is rsbakker dot WordPress dot com. 1021 01:09:30,439 --> 01:09:33,519 Speaker 1: He's also online at rscottbakker dot com, and if 1022 01:09:33,560 --> 01:09:35,680 Speaker 1: you want to follow him on Twitter, he is The 1023 01:09:35,720 --> 01:09:39,160 Speaker 1: Devil's Chirp. That's just one word, of course: TheDevilsChirp.
1024 01:09:39,640 --> 01:09:41,559 Speaker 1: And finally, I want to drive home again that one 1025 01:09:41,560 --> 01:09:43,400 Speaker 1: of the things that I really love about his work 1026 01:09:43,479 --> 01:09:46,559 Speaker 1: is that I'm able to, on one hand, escape into 1027 01:09:46,720 --> 01:09:50,680 Speaker 1: just this wonderful dark fantasy world with all of 1028 01:09:50,680 --> 01:09:53,080 Speaker 1: its magic and intrigue, all of it just so 1029 01:09:53,600 --> 01:09:57,080 Speaker 1: perfectly created for the reader. And then at the same 1030 01:09:57,120 --> 01:09:59,960 Speaker 1: time he's throwing in all of these, uh, these thought 1031 01:10:00,080 --> 01:10:04,400 Speaker 1: provoking notions about human experience and identity, motivation and cognition. 1032 01:10:04,800 --> 01:10:07,519 Speaker 1: It really, you know, challenges you and forces 1033 01:10:07,560 --> 01:10:10,439 Speaker 1: you to look hard into the mirror. And 1034 01:10:10,479 --> 01:10:12,680 Speaker 1: I wanted to just share one more quick quote from him. 1035 01:10:12,720 --> 01:10:15,040 Speaker 1: This is from his book The Judging Eye, uh, that's 1036 01:10:15,040 --> 01:10:18,000 Speaker 1: in the Second Apocalypse saga. He says, I 1037 01:10:18,040 --> 01:10:21,120 Speaker 1: remember asking a wise man once, why do men fear 1038 01:10:21,160 --> 01:10:24,599 Speaker 1: the dark? Because darkness, he told me, is ignorance made visible. 1039 01:10:24,960 --> 01:10:27,960 Speaker 1: And do men despise ignorance? I asked. No, he said, 1040 01:10:28,000 --> 01:10:31,439 Speaker 1: they prize it above all things, all things, but only 1041 01:10:31,520 --> 01:10:35,759 Speaker 1: so long as it remains invisible. All right, and again, 1042 01:10:35,760 --> 01:10:37,960 Speaker 1: we'll throw all those links on the landing page for 1043 01:10:38,000 --> 01:10:40,440 Speaker 1: this episode at Stuff to Blow Your Mind dot com.
1044 01:10:40,479 --> 01:10:42,479 Speaker 1: And if you want to check out more of our work, 1045 01:10:42,720 --> 01:10:44,960 Speaker 1: that's where you'll also find it. You'll find blog posts, 1046 01:10:44,960 --> 01:10:47,400 Speaker 1: you'll find podcasts, you'll find videos, links out to our 1047 01:10:47,479 --> 01:10:51,320 Speaker 1: various social media accounts such as Facebook, Twitter, Tumblr, Instagram. 1048 01:10:51,479 --> 01:10:53,360 Speaker 1: We're on all those things. And if you want to 1049 01:10:53,400 --> 01:10:56,080 Speaker 1: make your ignorance visible to us, you can email us 1050 01:10:56,080 --> 01:11:09,360 Speaker 1: at blow the mind at how stuff works dot com. 1051 01:11:09,400 --> 01:11:11,960 Speaker 1: For more on this and thousands of other topics, 1052 01:11:11,960 --> 01:11:24,760 Speaker 1: visit how stuff works dot com.