Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Hey, welcome to Stuff to Blow Your Mind: Listener Mail. My name is Robert Lamb.

And I'm Joe McCormick, and today we're bringing you some of the messages that you've sent us over the past couple of weeks. So, Rob, would you mind if I kick things off by reading a message that we got about our Machine Lords of Barnard 68 episodes?

Okay, this first message comes from Aiden. Aiden writes: Hi, Robert and Joe. I just finished listening to your episode about post-biological intelligence. Towards the end of part two, you raised the question of whether such an intelligence would have something like emotions. This is a great question, and in the recent Vault episodes on invertebrate emotions you shared some ideas that could help to answer it. In the invertebrates episode, you focused on the ways that an internal emotional state could manifest as measurable behaviors. The example that I remember most clearly is the one about bees. After receiving a free sugary treat, bees will forage with a more optimistic bias, but after a simulated attack, they will forage with a more pessimistic bias. This change in behavior may be attributable to something like an emotion. Then Aiden opens a parenthesis and adds: As a pre-post-biological intelligence writing this based on memory, I may be missing something from the bee example. Feel free to add on or correct if you read it on air.

Well, nothing to add or correct so far; you're doing good.

Continuing: This got me thinking that maybe the same test could be applied to a machine intelligence, if we could observe it. Maybe a free burst of gamma ray energy would cause it to display some optimistic behaviors, while a destructive, unexpected supernova would cause it to show more pessimistic behaviors, whatever that might look like. However, such an intelligence would probably have some similarity to modern machine learning in the sense that it would extrapolate based on past data.
Optimism or pessimism would probably show up as an overreaction or underreaction to the stimulus, beyond what the cold calculation of an algorithm would predict. For example, if the machine knows that in the past a supernova or other negative stimulus has caused a ten percent impact on its systems, the cold, emotionless calculation of the data would suggest changing its behavior by ten percent. On the other hand, if the machine had some kind of emotional state like pessimism, maybe it would change its behavior by fifteen percent. That five percent difference could be the way to tease out the effect of emotions from the effect of adaptive behavior based on past experience. The caveat here is that the observer would need to know what the emotionless baseline is. Maybe this is a study that only an even greater machine intelligence could carry out on smaller, simpler ones. It was fun exploring the connection between these episodes, so thanks for another week of great podcasts. Best, Aiden.
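(As an aside, Aiden's proposed test reduces to a simple comparison: extrapolate the "cold calculation" baseline from past data, then measure how far the observed behavior change deviates from it. A minimal sketch of that idea in Python; the ten and fifteen percent figures come from the email, while the function name and structure are our own assumptions for illustration.)

```python
# Minimal sketch of Aiden's test: the residual between the observed
# response and the emotionless baseline is the candidate emotion-like
# bias. Names and structure are illustrative assumptions, not anything
# specified in the episode or the email.
def emotion_like_bias(past_impact: float, observed_change: float) -> float:
    """Deviation of the observed response from the emotionless baseline."""
    baseline = past_impact  # cold calculation: match the historical impact
    return observed_change - baseline

# The email's example: a 10% historical impact met with a 15% response.
print(f"{emotion_like_bias(0.10, 0.15):+.2f}")  # +0.05 -> pessimistic overreaction
print(f"{emotion_like_bias(0.10, 0.10):+.2f}")  # +0.00 -> no detectable bias
```

As the email itself notes, the whole thing hinges on knowing that emotionless baseline in the first place.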
Yeah, this is a good point. In that Invertebrate Emotions episode, we talked about the difficulties in separating out the different things that we would classify as emotions. Like, you might be able to regard an emotion like anger, in one sense, as an internal state that has a subjective felt quality to it: it feels like something to be angry. And in another sense, you could say anger is a set of externally observable behaviors that you see clustered together; they represent a certain number of biases in behavior that occur at the same time, maybe a quickness to physical aggression and other things that would correlate with anger. And so those are definitely separate things. I guess you would have to leave aside the question of whether it would feel like something to have an emotion, then, for a post-biological intelligence, the same way it would feel like something for us.

That comes back to the question we talked about in the episode of whether post-biological intelligences would actually be conscious, and one part of having a conscious experience is having internal emotional states that feel like something.

Yeah. The other part would just be: would they have these sorts of clusters of externally observable behaviors that are seen together and represent a certain disposition towards external stimuli?

Yeah, this is a good breakdown. I appreciate this.

All right, let's hear from another listener. This one comes to us from Chris. Chris writes: Hi, Robert, Joe, and Seth. Good day. Your recent two-part episode on the Machine Lords of Barnard has been a great listen. It is difficult to conceptualize how we could get to a post-biological point in the time scale that might be involved, but it is a fun thought experiment. The topic was particularly relevant to a short story by Cixin Liu that I am reading as part of an anthology titled To Hold Up the Sky. I had first read The Three-Body Problem after hearing you mention it many times on past episodes. I really enjoyed the way his sci-fi is written through a completely different lens than most Western readers are used to. Or maybe it's just me.

This is great. I did recommend The Three-Body Problem in a previous summer reading episode we did a few years back. I loved that novel, but I have not read any of Cixin Liu's short stories, so yeah, this is totally fresh to me.

Yeah, same here. I did the audiobook of that first one, The Three-Body Problem, but I haven't read any of the subsequent books in that series, or any of his shorter works.

Well, that is a dark forest that we should maybe both wander into.

They continue: The short story I'm referring to is titled Cloud of Poems.
It is a fascinating story involving an advanced race of dinosaurs called the Devouring Empire, who have recently joined the greater galactic civilization, coming to our solar system, enslaving humans and raising them as feedstock on their interstellar ship, and in particular a human named Yi Yi, who teaches classical Chinese literature to the feedlot humans to make them more tender. Now, this is where the post-biological life comes in. Another member of the greater galactic society comes into the solar system and is only referred to as, quote, "a god." There's a lot of backstory, but it is made clear that this is a being from a significantly more advanced race that has transformed itself into beings of pure energy with the ability to jump from one side of the Milky Way galaxy to the other. In this sci-fi world, it's explained that the level of a civilization is based on the number of dimensions it can access. The esteemed god's race can access eleven dimensions.

That's a lot of dimensions.

In the episode, you discussed how the motivations of a post-biological life form might be different from our own and whether they would have the same concerns as a living being, etcetera. And this story gets at that point, which I think is relevant. If a race evolved to a point where individuals transformed past a stage where a lifespan is no longer a concern, I think they would not continue to have the same desires and motivations as their previous biological selves. In this story, the esteemed god, who is an intergalactic art collector and researcher, is challenged by the human Yi Yi to become a better poet than the classical Chinese master Li Bai. The human's point is that even with all the technology that the esteemed god possesses, it cannot replicate the poetry of a human, because it does not possess the ability to understand the human spiritual realm.
Since this god is still an individual, albeit a being of pure energy, he takes the challenge, transforming himself into a human and attempting to surpass the poetry. Long story short, he fails. He then decides instead to build a quantum computer to write and record every possible poem using the Chinese alphabet, and to save them each on an atomic-level storage device, where each poem is stored on a single atom. It's then discussed that it will take ten to the hundred and seventy-second power atoms to store every possible combination, and that unfortunately there are only ten to the eightieth power atoms in the entire universe. I don't know how to check these numbers, so I am unsure if they actually match up to reality. Maybe Seth can help me out on that.

Let's hear it. Let's have Seth quickly, in real time, chime in and fact-check those numbers for us.

He's got to go count them.

Okay, he's gonna count. He'll come back in ten to the hundred and seventy-second power minutes.
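(For what it's worth, the story's numbers do hang together as orders of magnitude. If a poem is L characters long, each drawn from a vocabulary of V distinct characters, there are V^L possible poems. The email doesn't give the story's actual parameters, but assumed values like V = 10,000 and L = 43 reproduce the figure, since (10^4)^43 = 10^172, and roughly 10^80 atoms is the standard order-of-magnitude estimate for the observable universe. A quick Python check under those assumptions:)

```python
import math

# Back-of-envelope check of the poem-counting numbers in the story.
# V and L are assumptions chosen to reproduce the email's 10^172 figure.
V = 10_000   # assumed vocabulary of distinct Chinese characters
L = 43       # assumed poem length in characters

possible_poems = V ** L
print(f"possible poems ~ 10^{math.log10(possible_poems):.0f}")  # 10^172

# One poem per atom against ~10^80 atoms in the observable universe:
shortfall = math.log10(possible_poems) - 80
print(f"the universe is short by ~10^{shortfall:.0f} atoms")    # ~10^92
```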
So anyway, the esteemed god then decides to start building the quantum computer, and to do so, he will need to deconstruct the entire solar system, humans and the Devouring Empire included, to their constituent atoms, a mere ten to the fifty-seventh power, and he has no qualms with the destruction required to achieve those ends. This was an entirely too long email, but I did my best to summarize an astounding work by Cixin Liu. You should really just read it. But suffice it to say, I think a post-biological race that is descended from previously living beings would be extremely dangerous, especially if they have the power available to them, like the esteemed god of our story: willing to destroy entire races and solar systems with no second thought to achieve a goal. Again, great episode. Thanks for all you do bringing us listeners great content. Best, Chris.

Oh, thanks Chris. This sounds like a great read.

Yeah, I love the ideas explored in this. This is definitely an ideas author who toys with wonderful sci-fi concepts. I mean, based on the one novel I've read, that's kind of how it rolls out. There are a lot of individuals discussing, at times, very high-minded scientific topics, but with some other fun stuff thrown in as well, like real-world political concerns, Chinese mythology, etcetera.

Yeah, totally. And I will say, I don't want to spoil anything, but the book also has a really great science fiction weapon in it, something extremely counterintuitive, that comes in towards the end. I won't say any more than that. It's not a piece of wood with a nail in it, though.

But yeah, I really like this idea of a super-intelligent post-biological being being challenged to try to write poetry that transcends the great poets of Earth. The example in the story would be Li Bai. I think a lot of Western readers probably know his name spelled more like Li Po or Li Bo, often L-I P-O or L-I B-O, but he's a Tang dynasty era poet who's just a wonderful poet. And so it raises the question of, like, well, if this being is so intellectually past human beings, why would it be that it couldn't write poetry better than the poetry produced by the best of this biologically confined species on the planet Earth?

Well, I think there actually may be something plausible to that, because, you know, almost all great art is about, like, suffering in some way.
And so if you were to take a being that is just so powerful it essentially has no real wants or physical limitations, you can imagine how a being like that could have real trouble creating compelling narratives that would speak emotionally to the experience of beings who are limited like humans are.

Yeah, so then it has to take human form and try to create human art.

It has to be the word made flesh, right? It has to come down and become one of us.

Yeah, exactly. How else can it possibly understand it?

So, yeah, definitely. Thank you for the recommendation, Chris. I will have to look that story up.

All right. Well, Carney, our mail bot, has even more machine, AI, and robot based listener mail for us here today.

Yes, this next message is a response to part one of our episode about punishing machines. This comes from Karen. Karen says: Love the show. I'm just going to jump into this. I think that if we judge a robot able to commit a crime, the robot is also implicitly able to have a crime committed against it. But if a human or another robot victimizes a robot, what would the consequences be? Let's say that I commit a crime against a robot. I take the Amazon virtual assistant Alexa and throw the device on the floor, constituting an assault on Alexa. What happens next? What is justice to her? Is it based on the robot's programmed values, likes, dislikes, or goals? If so, then Alexa's greatest value is supporting Amazon, and her goal is to sell Amazon products. What do you do with that? Another silly example: think of the security robot who drove into a pond.

I like that you say "who," by the way.

Think of the security robot who drove into a pond and drowned itself, and people saw that happen. Would people who witnessed a robot getting damaged without stopping it be guilty of some sort of neglect? If they were, what would make it right to the robot?
If you made it this far, thanks for reading. And what do you think?

Well, Karen, we actually do talk about this a little bit in part two of that series, but I've been thinking about it more since we recorded that part two, and I think this raises some really good questions. So, a few distinctions in my thoughts about whether robots could be the victim of a crime. I would say, at the surface level, this depends, at least in my opinion, on whether the robot is conscious or not. And I think the standard assumption today is that machines like the Amazon Alexa, even complex machines, are not conscious. But in the future it might be hard to say. And this gets back, of course, to the hard problem of consciousness, which we talked about in the Machine Lords of Barnard 68 episodes. If we don't know what consciousness is and why it arises in the first place, even in biological brains like ours, it's going to be hard to judge whether a non-biological machine could ever be conscious or not. So this is just a big open question to me. I don't really come down on one side or the other. But with that huge caveat, I would say that I think if robots are ever able to be conscious, if for whatever reason we decide that yes, they are having an inner experience like we are, then I think the obvious implication would be that they have the right to be protected against harm just like anybody else. But for an Alexa or whatever, since I don't think anybody really has suspicions that Alexa is conscious or that Alexa has any kind of internal experience, I think harm done against an Alexa would really just be a property crime against its owner, like if you were to damage somebody's wheelbarrow or something.
But then there's another big complication that I'll throw in, which your email really made me think about: the possible brutalizing effect on society and on onlookers of tolerating crime against robots that appear to be conscious even though they're not. This is something that I'd take kind of seriously. So I'm imagining a scenario like this; see if this makes any sense to you, Rob. Imagine most people have still decided that, yeah, there's nothing that it's like to be a robot. Robots can't actually suffer, so they don't have inherent rights that we need to protect, because there's nothing that it's like to be them; they just don't care. But if you were to make a robot that convincingly acted out suffering when it was harmed, and then you just had lots of robots like this constantly being harmed in public view, like a humanoid-looking robot that could just sit there while somebody beat it with a stick, and it would scream in pain: something does seem very wrong about that just being stimuli that we are constantly exposed to and doing nothing about. You know, it almost seems like that would have a kind of horrible numbing effect on onlookers that would desensitize them to the real suffering of human beings and of animals.

Yeah. I guess, on a related note, I know in my household, with a child, we have stressed at times that, you know, even though Alexa or Amazon or Google or whatever you're talking to is not a real person, and we're very clear about that, you shouldn't talk mean to it. You should be nice when you address the robot. You shouldn't be unnecessarily angry or anything like that.
Likewise, we've had this discussion when one is playing against an AI in a game, certainly if it's like an enemy AI, but more specifically, you know, if it's something like an online Settlers of Catan situation where there's a fake human player. Like, you're not allowed to just say mean things to the non-human in the chat box, because it just sets a weird precedent, you know?

Right, and it's not because it would hurt the AI. It's because it trains you to behave that way, and eventually you may end up behaving that way towards somebody who could actually be harmed by it.

Yeah. So anyway, of course, that's a whole additional area: how do we treat virtual entities in simulated environments and in games?

Yeah, well, I guess this starts to sort of bleed over into the big, controversial question of whether video games that have violence in them train people to commit violence in the real world. And I'm certainly not taking a position on that. I don't have a strong opinion one way or another about that. Maybe we could look at the research more on that in the future. But I would say that if something is happening in physical space, and you're seeing people actually use physical violence against a robot, or be performatively verbally abusive to a robot, even in real physical space, and it's just being tolerated, something about that, at least intuitively to me, would seem to have a kind of deadening and very detrimental effect on the culture.

Right. But then again, we have to be open to the idea that there could conceivably be situations in which the robot's presence is intolerable.
Maybe it's not the robot's fault, but, like, say there's some sort of... I mean, to go back to the cigarette robot example from the first episode: if the cigarette bot can roll into your house uninvited, I mean, I think you should be able to kick it out of your house, right?

Right, sure. You know, we were not going to stand for the tyranny of cigarette bot.

Yeah. Maybe so. That's the other side of it. Maybe there are cases where it's actually good to appear to violate the rights of a robot, if that robot represents something really evil and bad that you want to demonstrate your disapproval against. I guess that actually did come up in the paper. They were talking about arguments that part of what juries sometimes want to do is just symbolically demonstrate moral opprobrium.

Yeah. Here's an idea. What if you could build up moral willpower by having a robot devil that actually sits on your shoulder and is constantly trying to tempt you to do evil, and so you just practice ignoring it all the time?

Yeah, I guess so. I could see that working. I'm not sure if I could. I mean, I can see people doing it. I don't know if it would work. Anyway, we'll let the listeners decide on that.

Here's another bit of listener mail for you. This comes to us from Jim in New Jersey: Robert and Joe, I would like to update the trolley car problem slightly for autonomous vehicles. It's not a choice of whether the car strikes the elderly couple or the mother pushing the baby stroller, inflicting the least possible harm. It's whether the car chooses to crash into a wall or tree, seriously injuring or killing the passengers, or to strike pedestrians, injuring or killing them while leaving the passengers mostly unscathed. But let's make it a bit more interesting.

And then Jim includes a few caveats here.
First of all, it's a choice between one passenger and a group of pedestrians, or it's a choice between a group in the vehicle and one pedestrian, or it's a choice between an equal number of people in the vehicle and an equal number of pedestrians. Does the car's decision favor passenger safety as the car moves from basic utility vehicles into more expensive luxury models? I don't know what these answers are or should be. There are no easy answers. Jim.

I think this raises a great point, Jim. And this is probably a response that came in after part one published but before part two, so we sort of address some of this in part two. But this raises a bunch of other permutations that we didn't get into. And one thing that these variations really highlight for me is a problem that we also did not really get into in the episode, which is making life-or-death decisions when the probabilities of the outcomes that you're trying to choose between are very uncertain. So we were talking about how an autonomous vehicle in reality is going to have to make trolley-car-type decisions all the time. But actually, what it's going to have to do is make a trolley problem decision where it's not one track with one person versus another track with multiple people. It's going to be lots of abstract branches of probabilities, like you have an x percent probability that someone will be injured or harmed on this track versus that track. And a lot of times those probabilities, even the ones the machine judges with the best information available to it, are just going to be wrong. So it's not just that the decision will have to be made, but that the decision will necessarily have to be made on incomplete information that could be very misguided. Just one example: if an autonomous vehicle is trying to make a split-second decision to minimize harm in an oncoming wreck,
it's going to have to make judgments like: how many people are in the other car? Well, it seems like that's something that's often going to be difficult or impossible to determine. Or: what's the probability that people will be injured or killed in certain collisions? I think, unfortunately, it just gets more and more difficult the more you try to get into the details on it.

Though, as we were saying in the episode a number of times, it's not like this is a scenario where the human driver naturally has an advantage. I mean, human drivers, I think, are often just making split-second decisions based on almost no reasoning whatsoever, just sort of instinctual jerky movements, though the individual human is not going to see their own driving that way.

Right. So yeah, it's a complicated situation.
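(To make that concrete: instead of one person versus five on fixed tracks, the vehicle is really comparing expected harms across maneuvers, where both the probabilities and the headcounts are shaky estimates. A toy sketch, with every number invented for illustration:)

```python
from dataclasses import dataclass

# Toy expected-harm comparison for a split-second maneuver choice.
# All numbers are invented; the point is how easily small estimation
# errors flip the "right" answer.
@dataclass
class Maneuver:
    name: str
    p_injury: float       # estimated chance anyone is seriously hurt
    people_at_risk: int   # estimated number of people exposed

    def expected_harm(self) -> float:
        return self.p_injury * self.people_at_risk

options = [
    Maneuver("swerve into wall", p_injury=0.50, people_at_risk=1),  # passenger
    Maneuver("brake straight",   p_injury=0.25, people_at_risk=3),  # pedestrians
]
print(min(options, key=lambda m: m.expected_harm()).name)  # swerve into wall (0.50 vs 0.75)

# Nudge one uncertain estimate and the decision flips:
options[1].p_injury = 0.15  # pedestrians' expected harm drops to 0.45
print(min(options, key=lambda m: m.expected_harm()).name)  # brake straight
```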
Okay, this next message is about dad jokes and a former Listener Mail episode. It comes from Mohammed. Mohammed just says: Hey guys, just listened to a bit in the Listener Mail episode about indicating sarcasm in text, and I wanted to point out another sarcasm indicator I see a lot on social media that I think works really well: alternating upper- and lowercase letters. It reads intuitively to me as mocking and sarcastic.

And that's the whole message, except Mohammed attaches a picture of a tattoo, one of those they-say, I-say memes. The parents say, "You'll regret that tattoo when you get older," and then the response is me, also saying "you'll regret that tattoo when you get older," but in alternating upper- and lowercase letters. And then the tattoo below that is a weird-looking SpongeBob SquarePants.

Yeah, I don't know. This looks fine, and it's not a Squidward tattoo, for crying out loud. It's SpongeBob.

That's a weird SpongeBob, but it's good.

I don't know the difference. What would be the deal if it was a Squidward tattoo? Does that have political significance?

No, no, it's just, like, Squidward... have you ever watched SpongeBob?

No, never a whole episode. I mean, I know what it is.

Well, I could explain it to you, but the best thing to do is just watch, and then you'll understand the context. But yes, Squidward is one of the other characters, and I guess there are Squidward tattoos out there. Maybe there are some great Squidward tattoos, but I feel like SpongeBob is the better choice.

I did not know we would get factionalism in the response to this, but this is good.

All right, let's get to some Weird House related listener mail here. This one comes to us from Chris. Offhand, I don't know if this is the same Chris as earlier.

A different Chris.

A lot of Chrises out there. All right, this Chris says: Hello, Rob and Joe. Well, we all have that first time. Tonight was the night I experienced Highlander. This is not related to any recent podcast specifically, but after searching and watching related Weird House Cinema shows on Prime, I'm being presented with quite the bevy of sci-fi and weird house options. So tonight I tried Highlander. It's not over yet, but Sean Connery is Egyptian but has a Scottish accent. We'll just roll with it, I guess. Onward. On another note, the dad joke episode was great. I'm a father to four children; the oldest is the same age as your son, Robert, and she has gotten to the point where she's trying her hand at making up jokes. A few land with her siblings or parents, but most are rather misunderstood. But she persists. As far as the way a captive audience reinforces our dad jokes, this is spot on. The wordplay jokes and fart innuendo just make for an easy target for the five-to-nine-year-old age group. Fortunately, I'm going to have around a decade of time with children in that age range, so I'll be well entrenched in the dad joke mindset.
So, one for the road then: Knock knock. Who's there? Who. Who who? What are you, an owl?

Yeah, that's the end of it. Regards, Chris.

I've encountered that one in the wild before. Now, as for Highlander: yeah, absolutely. Highlander is a wonderful and weird film that I think about quite a bit. And obviously we're big fans.

Huge fans. Enormous fans. We're princes of the universe.

Now, regarding the idea that Sean Connery is Egyptian in Highlander, I recall that's actually multiply confusing, because, if I'm not mistaken, it's Sean Connery not even attempting to mask his Scottish accent, but he's playing a guy who comes from Egypt originally, but his name is Juan Sánchez-Villalobos Ramírez, and I think he is supposed to have been more recently Spanish.

Right, yes. So, you know, I don't know. I guess there are different ways you could crack that apart. I mean, the obvious answer is that Sean Connery is not going to do an accent. He's going to do Sean Connery's accent. So he's going to be Scottish in any and everything, whether he's playing an ancient Egyptian immortal or, like, a Russian submarine captain. You're getting the same accent. But I don't know, you could, I guess, say that if you live long enough, perhaps you're fluid enough to move through different cultures; you also move through different languages, and you move through different accents. I don't know how you wind up there. But I guess you could look at it this way: Ramírez is traveling in Scotland at the time, in that movie; that's how he encounters MacLeod. Therefore, perhaps he's just shifted into Scottish mode. And indeed, we never see him in that film out of the Scottish context anyway.

That's right. Yeah, he's a worldly wanderer in the Highlands. He's like Brian Cox in Braveheart.

Yeah. So I don't know. I'm gonna say it makes perfect sense.
What?

It makes perfect sense within the context of Highlander one, I guess. In Highlander II, he's a ghost, so he's just set in whatever he was where he died.

You know what, I'm going to take it a step further. It works perfectly in Highlander II as well.

I don't know if I agree that he's a ghost in Highlander II. Isn't... well, he's the product of some kind of necromancy from the planet Zeist. The way it goes: he's killed in Highlander one, and then, many years in the future, Connor MacLeod yells his name, and yelling his name causes him to be, like, reborn in Scotland, and then he comes to visit Connor MacLeod. So I don't know what you call that.

Yeah, I'm going to call it a ghost.

Okay. He appears very fleshy, but...

Maybe a fleshy ghost.

Okay, this next message is about Weird House Cinema. It comes from Andy, and he says: Hey guys, I really dig the relatively new Weird House Cinema segment. I noticed you haven't done any animated features. I may be wrong on that point; I haven't had the chance to listen to them all. So I had a couple of suggestions, two of my personal favorites from when I was a kid: 1977's Wizards and 1983's Rock & Rule. Both are fun and definitely strange post-apocalyptic animated features. Wizards is a Ralph Bakshi film with some great humor. Rock & Rule contains wonderful music by Lou Reed, Debbie Harry, and Cheap Trick, as well as a demon made of animated cow brains. Look forward to seeing your thoughts on these fantastically weird animated features. Andy.

Well, Andy, I've read about both of these movies but seen neither one.

Same. Yeah, I'm not familiar with Rock & Rule at all, but I'm familiar with Wizards just because it's often held up as a
favorite for people who are into, like, seventies weird animated features. And, you know, I like other work by Ralph Bakshi for sure. So I don't know, maybe that one's in our future. I know we have at least a couple of animated titles that we've been kind of knocking around, and I think we may have animated content discussed on the show in the near future.

So this one is from Landon, and Landon says: Hey guys, my older brother had a TurboGrafx-16.

Rob, I think this came up in Gunhed, right? Because we were talking about the idea of Gunhed, as a video game, having ports for multiple systems, and one of them was a game on the TurboGrafx-16, which I don't recall playing. But when I was a kid, I had this weird game console, and we talked about the game Bonk's Adventure, which is about an aggressive caveman baby that headbutts dinosaurs to death.

Yeah, I had to look it up; it is some sort of caveman baby. I also recall that at some point, I think, you have to rescue the princess of the Moon or something. Like, you go to the Moon, and the bad guy is from the Moon, and the bad guy has been transforming dinosaurs into evil versions of themselves, and if you headbutt the dinosaurs enough, they revert to their sort of sweet, nerdy former selves.

Anyway, Landon goes on about other TurboGrafx-16 games. Says Bonk's was a great game, but Landon also says Keith Courage was good too. Now, Keith Courage: I think the game was called Keith Courage in Alpha Zones, and this was a really bizarre game that I also had. It was a 2D platform side-scroller, but it had two very different types of levels, and they would alternate.
You'd go through one, and then 574 00:32:33,640 --> 00:32:36,840 Speaker 1: the other. One was this cute animated overworld where you'd 575 00:32:36,880 --> 00:32:39,479 Speaker 1: walk around and, and go into shops and stuff, and 576 00:32:39,480 --> 00:32:41,640 Speaker 1: it was animated in a way almost, kind of 577 00:32:41,640 --> 00:32:44,920 Speaker 1: like EarthBound or something, except two D. Uh, just 578 00:32:44,960 --> 00:32:47,560 Speaker 1: like very, like, cute and sunny and bright, and then 579 00:32:47,600 --> 00:32:50,920 Speaker 1: every other level was just this demonic nightmare in these 580 00:32:50,960 --> 00:32:55,800 Speaker 1: caves with, like, satanic robots attacking you. Very hot 581 00:32:55,840 --> 00:32:59,840 Speaker 1: and cold showers, to use the Grand Guignol phrase. But 582 00:33:00,000 --> 00:33:02,320 Speaker 1: then anyway, Landon goes on. There was also a really 583 00:33:02,320 --> 00:33:05,320 Speaker 1: cool racing slash adventure game too. You had to wander 584 00:33:05,360 --> 00:33:08,280 Speaker 1: a world and look for people to race. I wish 585 00:33:08,320 --> 00:33:10,160 Speaker 1: I remembered what it was called. I don't know what 586 00:33:10,200 --> 00:33:13,120 Speaker 1: that one was, Landon, but Landon says, I've been really 587 00:33:13,240 --> 00:33:16,680 Speaker 1: enjoying the Weird House Cinema episodes. I've seen several that 588 00:33:16,720 --> 00:33:19,880 Speaker 1: have been covered. Robot Jox was a childhood favorite for 589 00:33:19,920 --> 00:33:22,800 Speaker 1: me and my brothers. Crash and burn became a catchphrase 590 00:33:22,880 --> 00:33:25,480 Speaker 1: for us. What do you think about doing an episode 591 00:33:25,480 --> 00:33:31,720 Speaker 1: on the movie Universal Soldier? Thanks for the great shows, Landon. Um, well, 592 00:33:31,760 --> 00:33:33,600 Speaker 1: I don't know, Universal Soldier. It's been a long time 593 00:33:33,600 --> 00:33:35,640 Speaker 1: since I've seen it. But I don't know, I guess 594 00:33:35,640 --> 00:33:38,680 Speaker 1: anything's possible. We, we often have this discussion, like we, 595 00:33:38,680 --> 00:33:41,920 Speaker 1: we don't really have a, a set in stone criteria 596 00:33:41,960 --> 00:33:44,240 Speaker 1: for Weird House Cinema. It's, it's kind of a 597 00:33:44,960 --> 00:33:47,160 Speaker 1: does it feel right, does it feel like it fits? 598 00:33:48,240 --> 00:33:50,240 Speaker 1: And, and then we kind of go with it. So 599 00:33:50,640 --> 00:33:52,560 Speaker 1: I don't know, I'm not sure. I haven't looked at 600 00:33:52,880 --> 00:33:57,360 Speaker 1: Universal Soldier. Roland Emmerich, isn't it? I think it is. Yeah, 601 00:33:57,400 --> 00:34:00,000 Speaker 1: I believe it was one of his early films. 602 00:34:00,000 --> 00:34:03,880 Speaker 1: Could be wrong about that. Speaking of Roland Emmerich, Rachel 603 00:34:03,920 --> 00:34:07,160 Speaker 1: and I just recently decided to revisit a movie 604 00:34:07,160 --> 00:34:09,120 Speaker 1: I hadn't seen in a long time. We watched the 605 00:34:09,239 --> 00:34:14,879 Speaker 1: nineteen ninety eight Godzilla, directed by Roland Emmerich, and what a travesty. 606 00:34:15,080 --> 00:34:18,720 Speaker 1: Just, that movie is just... no offense to anybody 607 00:34:18,719 --> 00:34:22,319 Speaker 1: who worked on it.
But yeah, just, I don't know, 608 00:34:22,480 --> 00:34:24,839 Speaker 1: as someone who's, who's come to really be more discerning 609 00:34:24,880 --> 00:34:28,279 Speaker 1: about kaiju type films as, as time goes on, that 610 00:34:28,320 --> 00:34:30,560 Speaker 1: one is just the bottom of the barrel. This is 611 00:34:30,600 --> 00:34:34,719 Speaker 1: the Godzilla in Name Only picture. I don't know what 612 00:34:34,760 --> 00:34:37,080 Speaker 1: you mean by that. I mean it's the one with 613 00:34:37,120 --> 00:34:41,080 Speaker 1: Matthew Broderick and Jean Reno and those people. Yeah, yeah, 614 00:34:41,120 --> 00:34:43,080 Speaker 1: I think it's sometimes called, I've seen it referred to 615 00:34:43,120 --> 00:34:47,200 Speaker 1: as GINO, as in Godzilla in Name Only. Well yeah, well, 616 00:34:47,239 --> 00:34:50,120 Speaker 1: like most of the movie is not actually Godzilla. It's 617 00:34:50,440 --> 00:34:52,680 Speaker 1: Matthew Broderick and a bunch of people running around in 618 00:34:52,719 --> 00:34:56,960 Speaker 1: a building running from velociraptors. I think those are the baby Godzillas. 619 00:34:57,000 --> 00:34:58,920 Speaker 1: But so it came out a few years after Jurassic 620 00:34:59,000 --> 00:35:02,959 Speaker 1: Park, and so there are raptor sized baby Godzillas doing 621 00:35:03,080 --> 00:35:05,600 Speaker 1: most of the action in the film. Well, you know 622 00:35:05,920 --> 00:35:08,799 Speaker 1: Roland Emmerich, he, he had some, he did some fun pictures, though. 623 00:35:08,800 --> 00:35:11,400 Speaker 1: We have to remember he did Stargate. Oh, Stargate was 624 00:35:11,480 --> 00:35:14,479 Speaker 1: kind of fun. You know, Stargate was fun. I haven't 625 00:35:14,480 --> 00:35:17,200 Speaker 1: seen it in a long time, but yes, I remember it 626 00:35:17,280 --> 00:35:20,319 Speaker 1: had a gloriously befuddled James Spader in it. It was 627 00:35:20,400 --> 00:35:23,960 Speaker 1: James Spader in Hugh Grant mode. Yeah. Yeah, and there 628 00:35:24,040 --> 00:35:27,759 Speaker 1: was some fun teleportation high jinks. Uh. Other than that, 629 00:35:27,800 --> 00:35:36,160 Speaker 1: I'm a little, little foggy on what happened. Okay, all right, 630 00:35:36,200 --> 00:35:37,880 Speaker 1: here's another bit of listener mail. This one comes to us 631 00:35:37,920 --> 00:35:40,680 Speaker 1: from Cheryl. Cheryl writes, I've come across a movie that 632 00:35:40,760 --> 00:35:43,120 Speaker 1: might be suitable for this feature. It is a late 633 00:35:43,200 --> 00:35:45,919 Speaker 1: sixties sort of movie, when studios were trying to catch 634 00:35:45,960 --> 00:35:48,279 Speaker 1: up with youth culture. It goes about as well as 635 00:35:48,280 --> 00:35:50,680 Speaker 1: you'd expect. I'd be interested in your take on it. 636 00:35:50,960 --> 00:35:54,080 Speaker 1: The Magic Christian stars Peter Sellers and Ringo Starr, with 637 00:35:54,120 --> 00:35:58,040 Speaker 1: an amazing cast of credited and uncredited actors, including John 638 00:35:58,120 --> 00:36:02,400 Speaker 1: Cleese and Wilfrid Hyde-White. Yul Brynner sings a torch 639 00:36:02,480 --> 00:36:05,960 Speaker 1: song in drag, very well too. Thanks for what you 640 00:36:06,000 --> 00:36:09,560 Speaker 1: do. Best, Cheryl. Oh, I've seen The Magic Christian. This 641 00:36:09,600 --> 00:36:11,799 Speaker 1: movie is nuts.
Also, it's been a while since 642 00:36:11,800 --> 00:36:15,160 Speaker 1: I've seen it, but it, uh, it doesn't really have, 643 00:36:15,360 --> 00:36:18,360 Speaker 1: as far as I recall, much of an overarching narrative. 644 00:36:18,520 --> 00:36:20,680 Speaker 1: It's not like a plot driven movie. It's more just 645 00:36:20,800 --> 00:36:26,440 Speaker 1: kind of a series of bizarre vignettes strung together, of, 646 00:36:26,440 --> 00:36:29,600 Speaker 1: of people. Like, the main thing I remember about it 647 00:36:29,640 --> 00:36:31,400 Speaker 1: is it has, like, somebody who's got a lot of 648 00:36:31,480 --> 00:36:36,440 Speaker 1: money tricking people into, like, like, doing pranks on people, 649 00:36:36,520 --> 00:36:41,399 Speaker 1: essentially getting people into bizarre scenarios under the impression, under 650 00:36:41,400 --> 00:36:44,919 Speaker 1: the idea that people will do anything for money. Yeah, 651 00:36:44,960 --> 00:36:47,120 Speaker 1: I've, I've never seen it, but yeah, it does have 652 00:36:47,160 --> 00:36:49,880 Speaker 1: a lot of interesting people attached to it, and I 653 00:36:49,880 --> 00:36:52,360 Speaker 1: don't know, it seems to be part of a, a 654 00:36:52,440 --> 00:36:55,040 Speaker 1: genre that I have very little exposure to, sort of 655 00:36:55,080 --> 00:37:01,800 Speaker 1: a, like, a late sixties British satire film. Um, 656 00:37:01,840 --> 00:37:05,839 Speaker 1: like, I was looking at Joseph McGrath's, uh, work here, 657 00:37:06,080 --> 00:37:09,480 Speaker 1: Scottish film director, and it's a lot of stuff that 658 00:37:09,520 --> 00:37:12,400 Speaker 1: I've never heard of. But also stuff like the nineteen 659 00:37:12,520 --> 00:37:18,480 Speaker 1: sixty seven Casino Royale adaptation, which had Peter Sellers in it 660 00:37:18,680 --> 00:37:22,400 Speaker 1: and Ursula Andress, David Niven, etcetera. Yeah, I think it 661 00:37:22,480 --> 00:37:24,680 Speaker 1: was based on something that was written by, by the 662 00:37:24,719 --> 00:37:29,560 Speaker 1: writer Terry Southern, and uh, actually, in fact, I remember, I'm 663 00:37:29,560 --> 00:37:33,480 Speaker 1: pretty sure The Magic Christian was a favorite of former 664 00:37:33,560 --> 00:37:37,799 Speaker 1: show host Christian Sager. Really? Okay, yeah, maybe that's, maybe 665 00:37:37,840 --> 00:37:39,600 Speaker 1: that's why it sort of rings a bell, like maybe I 666 00:37:39,600 --> 00:37:47,080 Speaker 1: remember him talking about it. Yeah, all right, what do 667 00:37:47,080 --> 00:37:48,920 Speaker 1: we have? Looks like we have one left in the 668 00:37:48,960 --> 00:37:52,760 Speaker 1: bag there. Ah, yeah, okay. This comes from Greg. Greg, 669 00:37:53,160 --> 00:37:56,880 Speaker 1: also writing about Weird House Cinema, says, The Keep is 670 00:37:56,920 --> 00:38:00,239 Speaker 1: an interesting film by Michael Mann, his second feature. 671 00:38:00,480 --> 00:38:02,480 Speaker 1: I was super excited to see this movie when it 672 00:38:02,520 --> 00:38:07,480 Speaker 1: first came out after seeing production stills in maybe Fangoria. Anyway, 673 00:38:07,560 --> 00:38:10,920 Speaker 1: the film is a beautifully shot, confusing mess with Nazi 674 00:38:11,200 --> 00:38:15,400 Speaker 1: SS troops, an immortal warrior, and pure evil, every fog 675 00:38:15,440 --> 00:38:18,279 Speaker 1: machine ever made cranked to eleven during the whole thing. 676 00:38:18,600 --> 00:38:21,320 Speaker 1: Oh, and an all star cast. Look it up.
Maybe 677 00:38:21,320 --> 00:38:23,919 Speaker 1: worth a visit for Weird House Cinema. Thanks, and love 678 00:38:24,040 --> 00:38:27,760 Speaker 1: the show. Greg. Have you seen The Keep, Rob? I haven't. 679 00:38:27,800 --> 00:38:29,840 Speaker 1: It's been on my list for a long time. I 680 00:38:29,880 --> 00:38:33,040 Speaker 1: think it's one that I've occasionally, like, powered up, 681 00:38:33,280 --> 00:38:35,479 Speaker 1: and I'm like, I think tonight's the night, I'm gonna watch 682 00:38:35,480 --> 00:38:37,799 Speaker 1: The Keep, and then it just doesn't happen for one 683 00:38:37,800 --> 00:38:40,960 Speaker 1: reason or another. I'm almost positive it's got a Tangerine 684 00:38:41,040 --> 00:38:44,840 Speaker 1: Dream score. That's what gets you in. That's probably it. 685 00:38:45,120 --> 00:38:47,520 Speaker 1: Uh, I mean, also, you know, Michael Mann 686 00:38:47,640 --> 00:38:50,560 Speaker 1: is not really known for his genre pictures. You know, 687 00:38:50,600 --> 00:38:53,840 Speaker 1: you tend to, so the idea that one of his 688 00:38:53,880 --> 00:38:57,480 Speaker 1: early films had a bunch of supernatural weirdness and, and uh, 689 00:38:57,600 --> 00:39:01,000 Speaker 1: you know, and, and Nazis and stuff, it sounds 690 00:39:01,000 --> 00:39:02,880 Speaker 1: worth checking out. I've certainly talked to people who are 691 00:39:02,880 --> 00:39:06,480 Speaker 1: big fans of this flick. Um, so, yeah, I don't know. 692 00:39:06,560 --> 00:39:09,439 Speaker 1: Maybe, maybe this is Weird House material. I did look 693 00:39:09,440 --> 00:39:12,160 Speaker 1: it up. I can confirm it is music by Tangerine 694 00:39:12,239 --> 00:39:15,439 Speaker 1: Dream, uh, and, and it fits, because the movie has 695 00:39:15,520 --> 00:39:18,480 Speaker 1: a kind of that, that Tangerine Dream sort of slow, 696 00:39:18,800 --> 00:39:21,800 Speaker 1: dreamy quality to it. There's lots of fog floating around, 697 00:39:22,280 --> 00:39:25,120 Speaker 1: and the cast is really excellent. I don't think I 698 00:39:25,120 --> 00:39:28,840 Speaker 1: could say it's a good movie. Uh, Greg is correct 699 00:39:28,960 --> 00:39:32,839 Speaker 1: that this movie is, is very confusing, and it doesn't, 700 00:39:32,880 --> 00:39:35,279 Speaker 1: I don't recall it having much of a, like, a 701 00:39:35,280 --> 00:39:39,359 Speaker 1: very propulsive narrative. But it's got great actors in it. 702 00:39:39,360 --> 00:39:42,640 Speaker 1: It's got Scott Glenn as some kind of strange, like, 703 00:39:42,800 --> 00:39:46,000 Speaker 1: chosen one type figure who is, who is drawn to 704 00:39:46,080 --> 00:39:49,960 Speaker 1: this war zone in Europe by supernatural forces. I think 705 00:39:50,000 --> 00:39:52,480 Speaker 1: at some point his eyes start glowing. And I recall 706 00:39:52,560 --> 00:39:57,000 Speaker 1: Scott Glenn has unbelievable, what do you call the muscles 707 00:39:57,040 --> 00:39:59,920 Speaker 1: on top of your shoulders, the, where, like, your clavicles 708 00:40:00,000 --> 00:40:04,640 Speaker 1: meet your neck, those things? Um, yeah, those are, 709 00:40:04,680 --> 00:40:08,520 Speaker 1: those are traps? Traps? Sure. His, his traps are off the charts. 710 00:40:08,600 --> 00:40:12,280 Speaker 1: His traps are unbelievable. Scott Glenn is great. I mean really, 711 00:40:12,440 --> 00:40:14,000 Speaker 1: this whole, I mean, you look at the people in 712 00:40:14,000 --> 00:40:17,359 Speaker 1: this film, it's like Gabriel Byrne, Ian McKellen.
Um, I think 713 00:40:17,400 --> 00:40:19,920 Speaker 1: Bruce Payne shows up, I think he's basically just a 714 00:40:20,200 --> 00:40:22,480 Speaker 1: rando in it. But, like, Bruce Payne is a great 715 00:40:22,520 --> 00:40:25,000 Speaker 1: B movie actor as well. Yeah, Ian McKellen is of 716 00:40:25,000 --> 00:40:28,400 Speaker 1: course wonderful as always. I think Jürgen Prochnow is 717 00:40:28,440 --> 00:40:31,560 Speaker 1: in it. He plays a Nazi. I think Gabriel Byrne 718 00:40:31,760 --> 00:40:35,040 Speaker 1: plays a Nazi, and the elevator pitch is that 719 00:40:35,120 --> 00:40:38,799 Speaker 1: this demon in a cave, who looks like a bodybuilder 720 00:40:38,920 --> 00:40:43,040 Speaker 1: hell spawn slash robot, comes to life, is awakened somehow, 721 00:40:43,120 --> 00:40:45,359 Speaker 1: and kills a bunch of Nazis. Yeah. I think I've 722 00:40:45,400 --> 00:40:47,440 Speaker 1: seen a picture of the, the entity in question, and 723 00:40:47,520 --> 00:40:51,960 Speaker 1: he looks like he's probably Thanos's personal trainer. Yeah, 724 00:40:52,080 --> 00:40:56,799 Speaker 1: he does lift. This, this demon does lift, bro. Well, 725 00:40:56,880 --> 00:40:59,080 Speaker 1: I mean, say no more. It's got a Tangerine Dream score, 726 00:40:59,120 --> 00:41:01,799 Speaker 1: so I'd watch it. Okay, well, maybe this goes on 727 00:41:01,840 --> 00:41:03,880 Speaker 1: the list for the future. And yeah, it is funny 728 00:41:03,880 --> 00:41:06,520 Speaker 1: seeing this come out of Michael Mann, who, I think 729 00:41:06,600 --> 00:41:09,000 Speaker 1: Michael Mann is a great filmmaker. I really enjoy a 730 00:41:09,000 --> 00:41:10,640 Speaker 1: lot of his movies, but most of the ones I 731 00:41:10,640 --> 00:41:13,920 Speaker 1: can think of are, like, crime thriller type movies, like 732 00:41:14,080 --> 00:41:16,640 Speaker 1: some of the best of that genre. But yeah, you 733 00:41:16,680 --> 00:41:20,040 Speaker 1: don't really think of him as making horror movies, though. 734 00:41:20,400 --> 00:41:23,960 Speaker 1: One movie of his that I think is, is really overlooked: 735 00:41:24,600 --> 00:41:27,520 Speaker 1: he, in the, like, mid to late two thousands, 736 00:41:27,560 --> 00:41:29,440 Speaker 1: I think, like, two thousand five or six or so, 737 00:41:30,000 --> 00:41:34,040 Speaker 1: did a Miami Vice movie. Did you ever see this? No, 738 00:41:34,200 --> 00:41:36,080 Speaker 1: I never did. I know the film in question, though. 739 00:41:36,960 --> 00:41:40,160 Speaker 1: This movie is, it is not at all what you 740 00:41:40,160 --> 00:41:43,719 Speaker 1: would expect. Uh, it's, so it's just, you know, 741 00:41:43,920 --> 00:41:47,680 Speaker 1: like, cops dealing with crime and drug smugglers and stuff 742 00:41:47,680 --> 00:41:51,759 Speaker 1: in terms of story content. But stylistically, this movie is 743 00:41:51,960 --> 00:41:55,400 Speaker 1: so dark and doomy. It is like, it's like 744 00:41:55,440 --> 00:41:58,200 Speaker 1: the cop movie at the end of the world. The 745 00:41:58,239 --> 00:42:01,560 Speaker 1: cinematography is full of all this negative space and 746 00:42:01,719 --> 00:42:07,120 Speaker 1: haunting low light. It's, it is astonishingly weird and 747 00:42:07,200 --> 00:42:10,200 Speaker 1: beautiful in terms of how it looks for a cop movie, 748 00:42:10,280 --> 00:42:12,680 Speaker 1: which is what it is. I would say, if 749 00:42:12,680 --> 00:42:14,720 Speaker 1: you're into, I don't know.
If that tickles your fancy, 750 00:42:14,760 --> 00:42:17,520 Speaker 1: you should, you should look it up. Tangerine Dream score? Oh, 751 00:42:17,600 --> 00:42:21,480 Speaker 1: I don't know. Okay, well then I can't commit. Okay, 752 00:42:22,040 --> 00:42:24,399 Speaker 1: all right, we're gonna go ahead and close it up here, 753 00:42:24,560 --> 00:42:27,160 Speaker 1: but we'll be back next week with more listener mail. 754 00:42:27,280 --> 00:42:30,800 Speaker 1: Keep it coming, um, you know. Feel free to 755 00:42:30,840 --> 00:42:34,360 Speaker 1: write in about any topics, past or present or potentially 756 00:42:34,440 --> 00:42:38,040 Speaker 1: future, Stuff to Blow Your Mind, Weird House Cinema. Uh, and 757 00:42:38,080 --> 00:42:40,560 Speaker 1: you know, never be shy about even chiming in between 758 00:42:40,640 --> 00:42:42,920 Speaker 1: part one and part two of a series, because that, 759 00:42:42,920 --> 00:42:46,040 Speaker 1: that's exactly the case with the Punishing Robots listener mail 760 00:42:46,080 --> 00:42:48,600 Speaker 1: that we read in this episode, because we're recording this 761 00:42:48,640 --> 00:42:53,279 Speaker 1: episode between the publication of part one and part two. Right, yes, 762 00:42:53,400 --> 00:42:57,080 Speaker 1: good, good. Thank you for noting that. Anyway, thanks as always 763 00:42:57,080 --> 00:43:00,759 Speaker 1: to our excellent audio producer, Seth Nicholas Johnson. If you 764 00:43:00,760 --> 00:43:02,840 Speaker 1: would like to get in touch with us with feedback 765 00:43:02,880 --> 00:43:05,480 Speaker 1: on this episode or any other, to suggest a topic 766 00:43:05,520 --> 00:43:07,480 Speaker 1: for the future, or just to say hello, you can 767 00:43:07,520 --> 00:43:10,479 Speaker 1: email us at contact at Stuff to Blow Your Mind 768 00:43:10,600 --> 00:43:20,160 Speaker 1: dot com. Stuff to Blow Your Mind is a production 769 00:43:20,200 --> 00:43:22,960 Speaker 1: of iHeartRadio. For more podcasts from iHeartRadio, 770 00:43:23,160 --> 00:43:25,840 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever 771 00:43:25,880 --> 00:43:27,240 Speaker 1: you listen to your favorite shows.