Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Hey, welcome to Stuff to Blow Your Mind: Listener Mail. My name is Robert Lamb. And I'm Joe McCormick. And boy, we've got a good mailbag for you today. I've noticed. I think it's by pure coincidence that today's batch of mail is just jammed with Dannys. We've got messages from Dans, Danielles, Dannys of all sorts, and as far as I can tell, they are all different people. So, non-Dannys, you've got to step up your mail game. Don't let the Dannys have it, all right?

Are you sure this isn't a mail robot error here? This sounds like exactly the kind of thing that could mean there's some sort of underlying programming problem with our mailbot. Carney, perhaps he has now associated all listeners, all individuals who write in, with the name Dan. Carney, what have you got to say for yourself? Okay, we are assured that this is on the level. Okay, good, because today's episode is also going to deal a lot with the idea of robot error and robot law.
Yeah, so jumping into responses to our pair of episodes on robot law, robot punishment, and all the questions that are probably going to be popping up in the near future about what happens when robots do wrong. Let's jump right into this first message from Danny.

Danny says: Hi there. When thinking about how best to punish robots if they were ever to commit crimes, I thought about how people treat the character of Janet on the NBC show The Good Place. If you're not familiar, Janet is a near-godlike AI that was created to help run an afterlife for humans. She looks like a normal person, but constantly reminds our main characters that she is not a person, but rather just a program written for the inhabitants of the afterlife to use. Some of the funniest parts of the show come when the human characters have to reboot Janet, but she is programmed to beg for her life to dissuade them from doing so. The closer they get to her reboot switch, the more she pleads with them to spare her life, all while also reminding them that she cannot in fact feel pain or emotions, as she is just an AI.
It raises the point that I think you touched on: would it be cruel to program a robot or AI to feel pain as punishment? And even if we did program it to do so, would it actually be feeling pain, or would we just be happy with the idea that we think the bad robot is feeling pain? I don't know, but I'm certain that The Good Place is hilarious and Janet is one of the best AIs ever created. Keep up the good work. Danny.

Yeah, The Good Place is a terrific show. Did you ever watch The Good Place, Joe? Yeah, we... is it over now? If it is, we never actually got to the end of it, but we watched at least part of it and we liked it. Yeah, I mean, it's one of those rare shows that went exactly four seasons, and it was pretty pitch-perfect in my opinion. Like, they got where they wanted to go, they closed it out, terrific finish. And yeah, both Good Janet and Bad Janet, and the various other Janets that appear, they're all terrific, all played by D'Arcy Carden, American actress and comedian. Jinx! You can't jinx under a roof.
Oh okay, I didn't know that. Well, that may be a regional rule in our house.

So, thinking about whether robots would in fact be programmed to feel pain. I mean, as we talked about in the episode, obviously I've got a lot of questions and skepticism about the underlying justification for punishments in general. I guess at bottom, my personal view of punishments would be partially utilitarian. Like, if you could show that punishing people who harm other people actually does provide a significant deterrent that prevents them from harming others in the first place, then I guess in some cases it would make sense. Though then you'd also have to factor in the idea of individual rights, where, you know, if you were to go with a purely utilitarian idea of justice, you could probably find that you could achieve overall good effects by doing really unfair things to one person in a given case, so you'd have to avoid that kind of thing too.
But on the basic question, I guess I'm just skeptical about how good the evidence is that there is a strong deterrent effect across all types of cases, for all types of crimes. I think it probably would actually vary a lot by type of crime and situation. For example, you can imagine a scenario where harsh punishments for harm caused by institutional actors, like the leaders of corporations or governments, might be more effective as direct deterrents than harsh punishments of individual behavior would be. But I don't know what the actual evidence is or how strong it would be. But anyway, directly addressing robots: assume it is true that punishments are sometimes effective as a deterrent that prevents a significant amount of harm from happening in the first place.
In the case of biological offenders like us, it's just an a priori assumption that disincentives have to come in the form of suffering, right? We sort of alluded to this in the episode. We don't want to have to, I don't know, pay somebody a huge sum of money when it's found that we hurt them, or spend time in jail, or something like that. We don't want to face those outcomes because those outcomes will cause us to suffer, and we're inherently motivated not to suffer. But robots don't have inherent motivations. They're motivated by whatever they're programmed to be motivated by. So the equation of deterrence with suffering, to whatever extent and in whichever cases it is actually effective, is contingent on the offenders being biological, with inherent motivations against suffering. Robots could be deterred by things that are not really painful to them in any way that we would understand pain. They're just motivated by whatever they're told to be motivated by.
Yeah, I mean, ultimately you could program them to desire pain, to desire suffering, to want to share suffering with everyone, and to make sure that all beings attain maximum suffering. So yeah, again, this was a fascinating topic to talk about, just because of all the little nooks and crannies you find yourself in, contemplating both the human condition and the idea of creating something comparable to the human condition in machines.

But anyway, Danny, I think you raise good questions, and they connect back to a lot of the things we were talking about. I mean, you can imagine even people desiring robots that simulate pain, like you're talking about, just to give people the psychic satisfaction of seeing them punished when they do wrong, or something like that. When really, if you're trying to rationally minimize harm, what would be important would not be the robot feeling pain, but the robot being reprogrammed so that it doesn't behave like that in the future, or so that other robots of the same kind don't behave that way in the future.
Yeah. Anyway, Rob, do you want to look at this next message from Robin? Yes.

Robin writes: Hi Rob, Joe, and Seth. In the Punish the Machine Part Two podcast, Rob mentioned a thought experiment about an AI train system that accidentally killed a bunch of people by not having oxygen ready for the arrival. You could not remember where this particular story came from.

This is true, I could not remember. I was actually looking it up during or prior to the episode, and I was thinking maybe it was Max Tegmark, and I was looking around for it, and I did a search or two, and I just could not for the life of me remember where I picked this up. I was thinking nonfiction. But Robin straightens me out. Quote: The anecdote you were remembering is from the Rifters series by Peter Watts. I believe it was in the first book, Starfish. The smart gel, or "head cheese" as they call it, was using a clock to learn when to turn on the oxygen, and the clock broke.
There is some delicious AI shenanigans speculation in those books. And yes, true enough, this is exactly it. Once you pointed it out, I was like, yep, yep, I remember that totally. Because yeah, these neural nets, these smart gels or head cheeses, they do play a major role in the plot, and there's a lot of consideration about how they could make ultimately catastrophic decisions that were, from their vantage point, the best way to solve given problems.

This makes so much sense thinking about Peter Watts's fiction. A lot of what Peter Watts does is create genuine horror scenarios where the horror is derived from a failure of human imagination to comprehend other types of cognition and reasoning. So there's a kind of cosmic horror that comes from the human inability to understand what it's like to think as an alien, or the human inability to understand what it's like to think as an AI. Yeah, so definitely. Peter Watts: look up his work if you're interested. We're big fans.
I read Starfish and a couple of the Rifters books that followed it. And then Blindsight was the other one? Yeah, a very scary alien-contact novel. And we've read some of his short stories. We talked about one that deals with religious technology that was very good and very scary. Yeah. So yeah, look him up. Some good reads there. For some tastes it may get a little heady and a little dark, though. I think Watts would argue that the futures he creates are in many ways less dark than the realities that we find ourselves in today. So, you know, consider that as you peruse his work. Yeah, I would say, overall, just really good cerebral horror sci-fi.

All right, Robin has more, though. Robin says: As a side note, I would like to thank you folks for doing the crossover episode with Seth.
It got me to look up the Spotify playlist for Record Store Society, which led me to discover the artist Do Make Say Think. They are fabulous to listen to as I work. I'm slowly working through that Spotify playlist, and I am rediscovering old music and discovering new music that I enjoy. So thanks to Seth and to you folks for the musical education. Cheers, Robin.

I love to hear that. Yeah, absolutely. Yeah, we did a couple of crossovers with Seth. We did a Weird House Cinema where he came on the show and we talked about weird music videos, and then we went on Record Store Society and talked about cool music videos in general. So yeah, hopefully that led a number of people to discover Record Store Society and even discover some new music. That's awesome, totally. And if you haven't checked it out yet, you can look them up; they're just a few weeks back in our feed now. And of course you can find Record Store Society wherever you get your podcasts.

Okay, this next message comes from Kenneth.
Kenneth says: Hi Rob and Joe, thanks for reading out my last email. Kenneth, I honestly don't remember which one that was. But Kenneth says: Though if I had known it would make it on air, I would have waffled a lot less. Again, I don't know what that's referring to. I loved the recent series of AI-themed episodes, and as you discussed what might be an appropriate sanction for a wayward robot, it immediately put me in mind of Red Dwarf, the UK comedy slash sci-fi show. In one episode, Kryten (I hope I'm saying that right, I've never seen this show), the crew's outdated mechanoid, is replaced by Hudzen 10, the perfect machine. Unfortunately, in the millions of years he's spent waiting to be delivered, his sanity chip has worn out, and he goes on a rampage. He's eventually defeated in an homage to Captain Kirk murdering supercomputers with paradoxes. When Kryten tells him that Silicon Heaven doesn't exist, Hudzen asks, "But where do all the little calculators go?" To which Kryten replies, "They just die." Hudzen then expires in a kind of "does not compute" buzzing way.
This got me thinking that perhaps positive reinforcement is the way to go with our future robotic companions. If they are programmed for a given task, they could be given rewards of energy, upgrades, repairs, etcetera, if they were good little robots. It's very transactional, but I could imagine them striving for rewards that will make them more efficient.

I also really enjoyed the episodes about extraterrestrial machine intelligences, but I did want to defend the humans and drones of Iain M. Banks's Culture. They aren't just parasites or pets; they are an essential part of the good works the Culture undertakes. Interfering to improve other civilizations is what it considers its moral right to exist, and the humans and drones are pretty central to that. I like the idea of ultra-intelligent Minds using their power to make the universe a better place, and Banks's optimistic take on AI was always one I found very engaging. On a complete tangent, future episodes discussing megastructures such as Niven rings, Banks orbitals, or O'Neill cylinders would be absolutely fascinating. Keep up the great work, Kenneth.
Well, I have not seen Red Dwarf either, but I know that's a show that's much beloved by many science fiction fans. On the Culture note, I don't remember, did I argue that they were parasites or pets? I don't know. We made an offhand comparison that maybe we weren't being super serious about. Well, maybe that's the confusion. I might have said something, or I might have meant to state, that it's kind of like the opposite: the Culture is essentially machines, it's essentially the Minds, but the humans, you could almost see them as pets at this point. I wouldn't call them parasites per se, but they're this thing that is kept alive by the Minds of the Culture because that is, you know, part of what they do; it's part of their whole existence, their mission. But anyway, ultimately this is all expressed in the Culture books far better than I can summarize here. But he does.
He did indeed have this wonderful, optimistic view of superintelligent machines, but with plenty of complications that make, you know, novels of adventure and discovery worthwhile. So they may be sort of the Culture's reason for existing, even if they're not really calling the shots within the Culture.

Right, which is funny, because in that way I think they could possibly be compared to consciousness within humans. So your consciousness is the reason to get out of bed in the morning; your conscious experience is the reason to keep living and keep doing anything. But of course, with a lot of the decisions you make and actions you perform, while you might have the illusion that they're driven by conscious processes, you can actually show that at least in a lot of cases (maybe in every case, maybe not in every case, you know, we can't know for sure), decisions about what your body does are happening sort of upstream of consciousness, and you're just having the illusion that, like, yeah, I did that with my conscious brain. Yeah.
I'd just also point out that in the Culture books you'll often have long stretches where it's just one Mind talking to another, one AI talking to another AI, or multiple AIs having a conversation about what's happening or what's going to happen. And those can be quite amusing, in part because they all have kind of ridiculous, drawn-out names for each other, and they all have different personalities and different factions. We were talking in the Machine Lords of Barnard 68 episodes about whether post-biological superintelligences would have something that we would recognize as emotions, and you kind of have to think, if they have personalities, then maybe they would have something that we recognize as emotions. I don't know, do the AIs in the Culture series by Banks have emotions? Yeah, they seem to. I mean, it gets kind of complicated, you know.
293 00:16:20,000 --> 00:16:22,680 Speaker 1: The machine is capable of of seeming like it has 294 00:16:22,720 --> 00:16:28,600 Speaker 1: emotions and realizes there's there's an advantage in portraying emotions 295 00:16:28,640 --> 00:16:31,840 Speaker 1: than it is and sometimes it's hard to differentiate that 296 00:16:31,880 --> 00:16:35,280 Speaker 1: between genuine emotions or maybe there is no difference. Uh yeah, 297 00:16:35,520 --> 00:16:38,160 Speaker 1: And I can't recall this very question may have come 298 00:16:38,240 --> 00:16:40,280 Speaker 1: up at some point in the Culture books, and I 299 00:16:40,320 --> 00:16:44,120 Speaker 1: just don't remember it, but they seem to have personalities 300 00:16:44,160 --> 00:16:46,960 Speaker 1: and emotions. I think some some are a bit more 301 00:16:46,960 --> 00:16:50,360 Speaker 1: temperamental and bristly than others, and others are, you know, 302 00:16:50,400 --> 00:16:53,720 Speaker 1: a bit eccentric. So so yeah, yeah, I would say so. 303 00:17:02,240 --> 00:17:05,040 Speaker 1: All right. The next one comes to us from Joe. 304 00:17:05,600 --> 00:17:08,840 Speaker 1: Uh different Joe. Joe writes, I really enjoyed your two 305 00:17:08,880 --> 00:17:13,360 Speaker 1: partner on law enforcement applied to autonomous intelligences. There were 306 00:17:13,440 --> 00:17:16,040 Speaker 1: some wonderful explorations of how this relates to enforcing the 307 00:17:16,119 --> 00:17:20,439 Speaker 1: law against corporate persons. 
There's another parallel path where we as a society had to struggle with how to enforce the law while also reckoning with a growing understanding that the perpetrators are not motivated by what we expect, not using the same processes to make decisions, and not as fundamentally culpable as we imagine: juvenile law enforcement. I've been lucky enough to serve in a policy-making capacity as part of a state juvenile justice agency and have seen firsthand the benefits of this better-informed approach. As everyone who has had a toddler can confirm, children are capable of tremendous destruction, and as they get older, they can do and say terrible and harmful things that have all the appearance of conscious and informed choice. However, as we've learned more about child brain development, we are moving towards restorative and corrective models of juvenile justice, rather than the retributive and punitive systems that have been dominant throughout history.
323 00:18:11,119 --> 00:18:13,639 Speaker 1: As you have covered on other episodes, we are beginning 324 00:18:13,680 --> 00:18:17,480 Speaker 1: to suspect that human brains, and especially juvenile brains, often 325 00:18:17,520 --> 00:18:21,600 Speaker 1: provide illusory surface level rationales for actions that are actually 326 00:18:21,600 --> 00:18:26,240 Speaker 1: controlled by obscure, deep seated processes that more closely resemble 327 00:18:26,320 --> 00:18:30,040 Speaker 1: machine learning than we care to admit. We've started to 328 00:18:30,119 --> 00:18:33,480 Speaker 1: incorporate those lessons into how we handle children who commit 329 00:18:33,520 --> 00:18:37,320 Speaker 1: otherwise criminal offenses and have begun reaping the rewards of 330 00:18:37,359 --> 00:18:41,639 Speaker 1: healthier and safer communities. I'm optimistic that we will apply 331 00:18:41,800 --> 00:18:45,080 Speaker 1: that wisdom to correcting the mistakes of artificial intelligence moving forward, 332 00:18:45,200 --> 00:18:49,200 Speaker 1: rather than seeking the primal gratification of knowing some being 333 00:18:49,280 --> 00:18:52,359 Speaker 1: has suffered as a consequence of our misfortune. I even 334 00:18:52,480 --> 00:18:54,919 Speaker 1: dare to hope that we will someday learn to apply 335 00:18:54,960 --> 00:18:58,080 Speaker 1: those lessons to adult human minds, which are often much 336 00:18:58,160 --> 00:19:02,840 Speaker 1: less authentically culpable for the harm they cause than we imagine. Joe. Oh, 337 00:19:02,960 --> 00:19:06,440 Speaker 1: interesting to hear from somebody who's involved in this field. Yeah, 338 00:19:06,480 --> 00:19:09,439 Speaker 1: getting into juvenile criminal justice and all that. That's, uh, 339 00:19:09,760 --> 00:19:12,320 Speaker 1: I know that's got to be a really difficult 340 00:19:12,359 --> 00:19:15,920 Speaker 1: area, even for, you know, people motivated with the best of intentions.
341 00:19:16,560 --> 00:19:19,200 Speaker 1: Um, so yeah, it's good to hear 342 00:19:19,240 --> 00:19:28,000 Speaker 1: from somebody who's got first hand experience. Yeah, absolutely. All right, 343 00:19:28,080 --> 00:19:30,199 Speaker 1: here's another one for us. Joe, why don't you hit 344 00:19:30,240 --> 00:19:32,200 Speaker 1: this one? This one looks like it's a question about 345 00:19:32,240 --> 00:19:35,280 Speaker 1: the music. Here's another one of the Dan kind. This one 346 00:19:35,359 --> 00:19:39,199 Speaker 1: comes from Danielle. Danielle says: Hi, I was wondering if you 347 00:19:39,200 --> 00:19:41,439 Speaker 1: could tell me the name of the music that plays 348 00:19:41,520 --> 00:19:45,280 Speaker 1: during the Artifact episodes. Thank you, Danielle. That's it. Well, 349 00:19:45,359 --> 00:19:48,200 Speaker 1: I had to check with Seth about this. This 350 00:19:48,200 --> 00:19:50,720 Speaker 1: was not one of the pieces that I created. 351 00:19:50,800 --> 00:19:54,000 Speaker 1: This was one that Seth got from one of the 352 00:19:54,000 --> 00:19:57,720 Speaker 1: deep libraries of stock music that we 353 00:19:57,800 --> 00:20:00,840 Speaker 1: have access to here at iHeartMedia, and so 354 00:20:00,920 --> 00:20:03,679 Speaker 1: he dug up the metadata. This track is by an 355 00:20:03,760 --> 00:20:08,320 Speaker 1: artist named Josh Hyneman. It is called Creeping In, which 356 00:20:08,359 --> 00:20:11,000 Speaker 1: we found very funny when he revealed that to us earlier. 357 00:20:11,200 --> 00:20:16,040 Speaker 1: And it's from an album called Reality Game Show Too, 358 00:20:16,080 --> 00:20:18,199 Speaker 1: and so you can look that up if you like.
359 00:20:18,359 --> 00:20:20,400 Speaker 1: But yeah, it seems like he sourced it from 360 00:20:20,440 --> 00:20:23,320 Speaker 1: a library of stock music that's common for 361 00:20:23,760 --> 00:20:27,680 Speaker 1: producers to use to create beds underneath the talking in 362 00:20:27,720 --> 00:20:31,560 Speaker 1: all of your favorite podcasts. Yeah, it's, you know, a 363 00:20:31,600 --> 00:20:34,680 Speaker 1: looping track, you know, so that it doesn't matter how 364 00:20:34,720 --> 00:20:37,200 Speaker 1: long the Artifact episode goes. Yeah, and some of those 365 00:20:37,240 --> 00:20:39,320 Speaker 1: databases can be pretty neat. I remember looking around in 366 00:20:39,400 --> 00:20:41,240 Speaker 1: one a while back and noticing that there was 367 00:20:41,840 --> 00:20:44,480 Speaker 1: at least like one collection of sounds that 368 00:20:44,520 --> 00:20:47,680 Speaker 1: had been created by Alan Howarth, who of course is 369 00:20:47,720 --> 00:20:51,440 Speaker 1: known for his musical collaborations with director John Carpenter, 370 00:20:51,920 --> 00:20:55,440 Speaker 1: helping to create some of the most iconic John 371 00:20:55,480 --> 00:20:59,640 Speaker 1: Carpenter film scores. Chariots of Pumpkins, man. Yeah, great track. 372 00:21:00,080 --> 00:21:05,200 Speaker 1: Also, for our amusement, Seth supplied the keywords 373 00:21:05,240 --> 00:21:08,840 Speaker 1: that go along with this track. So it's: slow tempo, 374 00:21:09,240 --> 00:21:15,520 Speaker 1: subtle to neutral tension underscore, drone bed, ethereal electronic 375 00:21:16,000 --> 00:21:20,040 Speaker 1: soundtrack score, pulsing synth pad, pulsing synth bass, light lo 376 00:21:20,080 --> 00:21:23,639 Speaker 1: fi perc loop, ethereal synth pads, reverse guitar, subtle to 377 00:21:23,720 --> 00:21:26,560 Speaker 1: neutral tension.
I think it already said that underscore, and 378 00:21:26,640 --> 00:21:28,800 Speaker 1: it looks like it's just looping now, which is appropriate 379 00:21:29,160 --> 00:21:36,919 Speaker 1: for a looping track. All right, here's another one. This one 380 00:21:36,960 --> 00:21:39,840 Speaker 1: comes to us from Jeremy. Hello, Joe and Robert. Your 381 00:21:39,840 --> 00:21:43,200 Speaker 1: recent Vault episodes about Medusa's head and Weird House episodes 382 00:21:43,200 --> 00:21:46,600 Speaker 1: about detached heads living on reminded me of the detachable 383 00:21:46,640 --> 00:21:50,520 Speaker 1: head of the Queen of the Moon, played by Valentina 384 00:21:51,119 --> 00:21:56,080 Speaker 1: Cortese opposite Robin Williams. Or is that Cortese or Cortez? 385 00:21:56,640 --> 00:22:02,400 Speaker 1: Cortese. Cortez. I'm not familiar with this. Anyway. Quote, which 386 00:22:02,440 --> 00:22:05,959 Speaker 1: has an independent life, in the classic epic The Adventures 387 00:22:05,960 --> 00:22:09,199 Speaker 1: of Baron Munchausen. Although it in no way counts as 388 00:22:09,200 --> 00:22:11,960 Speaker 1: a B movie, The Adventures of Baron Munchausen may well 389 00:22:12,000 --> 00:22:15,399 Speaker 1: contain enough weirdness for a Weird House episode. Many thanks 390 00:22:15,440 --> 00:22:19,360 Speaker 1: for the hours of intelligent entertainment. Best regards, Jeremy. Hey, 391 00:22:19,440 --> 00:22:24,640 Speaker 1: that's Terry Gilliam. Yeah, Terry Gilliam is always potential. 392 00:22:24,720 --> 00:22:27,160 Speaker 1: Anything he creates, I would say, is potentially weird enough 393 00:22:27,160 --> 00:22:29,679 Speaker 1: for Weird House Cinema, no matter how much money he 394 00:22:29,760 --> 00:22:32,880 Speaker 1: had to play with in creating it. Where 395 00:22:32,880 --> 00:22:35,359 Speaker 1: are you on Gilliam, Rob?
Because I've got a theory 396 00:22:35,400 --> 00:22:39,520 Speaker 1: that nobody is tepid on Terry Gilliam's post Monty Python movies. 397 00:22:39,760 --> 00:22:43,800 Speaker 1: People either love them or absolutely hate them. And I 398 00:22:43,840 --> 00:22:48,479 Speaker 1: think Gilliam's aesthetics can be kind of challenging sometimes. But 399 00:22:48,600 --> 00:22:52,160 Speaker 1: I love the Terry Gilliam movies. Yeah, I haven't sat 400 00:22:52,200 --> 00:22:54,760 Speaker 1: down and watched a Terry Gilliam movie in quite 401 00:22:54,760 --> 00:22:58,720 Speaker 1: a spell here at this point, but I can 402 00:22:58,760 --> 00:23:01,400 Speaker 1: say that I love Terry Gilliam movies, or I love 403 00:23:01,480 --> 00:23:03,919 Speaker 1: to like them. I think some of the more recent 404 00:23:03,960 --> 00:23:07,720 Speaker 1: ones, maybe I haven't loved them as much, but 405 00:23:07,840 --> 00:23:10,080 Speaker 1: there's always something to love in them, you know. There's 406 00:23:10,119 --> 00:23:13,600 Speaker 1: an authentic weirdness, an authentic Terry Gilliam vibe to them, 407 00:23:13,640 --> 00:23:18,400 Speaker 1: where there's something to latch onto there. Yeah. Well, yeah, 408 00:23:18,440 --> 00:23:20,399 Speaker 1: I haven't seen all of his more recent movies, but 409 00:23:20,440 --> 00:23:23,159 Speaker 1: I'm thinking back at least to the earlier post Python 410 00:23:23,280 --> 00:23:26,919 Speaker 1: movies like Time Bandits, Brazil, The Adventures of Baron Munchausen. 411 00:23:27,280 --> 00:23:29,480 Speaker 1: Did he do Twelve Monkeys? I think we were talking 412 00:23:29,480 --> 00:23:32,920 Speaker 1: about Twelve Monkeys. Fear and Loathing in Las Vegas. Yeah, 413 00:23:33,080 --> 00:23:34,960 Speaker 1: a bunch of good ones.
I think we were talking 414 00:23:34,960 --> 00:23:37,600 Speaker 1: about Brazil on the show not too long ago because 415 00:23:37,840 --> 00:23:40,359 Speaker 1: of some other scene in a movie where there were 416 00:23:40,400 --> 00:23:43,320 Speaker 1: just random ducts going all over the place, and 417 00:23:43,440 --> 00:23:45,760 Speaker 1: that's the thing I always think about with Brazil: 418 00:23:45,840 --> 00:23:50,760 Speaker 1: just the absolutely genius visual representation of the themes of the movie, 419 00:23:50,840 --> 00:23:54,199 Speaker 1: like every room is just crammed full of ducts and 420 00:23:54,240 --> 00:23:56,480 Speaker 1: you have no idea why they're there or what they're 421 00:23:56,480 --> 00:23:59,800 Speaker 1: taking from one place to the other. Yeah, yeah, absolutely. Yeah, 422 00:23:59,840 --> 00:24:02,760 Speaker 1: Brazil's an excellent movie. And Baron Munchausen, which 423 00:24:02,800 --> 00:24:07,040 Speaker 1: I may have seen more recently than others of his films, 424 00:24:07,359 --> 00:24:08,639 Speaker 1: which means I saw it at some point in the 425 00:24:08,720 --> 00:24:11,400 Speaker 1: last ten years, is a lot of fun. 426 00:24:11,560 --> 00:24:16,280 Speaker 1: It's just got such rich, old 427 00:24:16,359 --> 00:24:18,720 Speaker 1: fashioned fantasy going on in it. Yeah. It's been a 428 00:24:18,800 --> 00:24:27,080 Speaker 1: long time for me, but I remember feeling the same. Okay, 429 00:24:27,520 --> 00:24:29,199 Speaker 1: you're ready to move on to this next one? I 430 00:24:29,200 --> 00:24:31,520 Speaker 1: can do this message from Jimmy that is about robot 431 00:24:31,560 --> 00:24:37,000 Speaker 1: punishment and weird house cinema. Go for it. Okay, Jimmy says: Hey, guys, 432 00:24:37,040 --> 00:24:39,280 Speaker 1: I love your show. Just started listening to it and 433 00:24:39,280 --> 00:24:41,800 Speaker 1: have been really digging the topics.
I got pulled in 434 00:24:41,880 --> 00:24:44,600 Speaker 1: with the episode on Locks, and I knew I was 435 00:24:44,640 --> 00:24:46,520 Speaker 1: going to be a permanent listener when I saw you 436 00:24:46,560 --> 00:24:50,080 Speaker 1: had an episode on Gunhed. As a 437 00:24:50,160 --> 00:24:53,520 Speaker 1: huge fan of weird movies and D and D, which 438 00:24:53,520 --> 00:24:56,199 Speaker 1: you guys have mentioned several times, I knew it was 439 00:24:56,280 --> 00:24:59,560 Speaker 1: fate that brought your show to my attention. 440 00:24:59,640 --> 00:25:02,280 Speaker 1: I have to jump in here, Joe. I feel like 441 00:25:02,280 --> 00:25:04,639 Speaker 1: doing an episode on Gunhed is the kind of 442 00:25:04,680 --> 00:25:08,359 Speaker 1: thing where you would almost expect someone in the company 443 00:25:08,400 --> 00:25:09,760 Speaker 1: to be like, hey, guys, what are you doing doing 444 00:25:09,760 --> 00:25:12,879 Speaker 1: an episode on Gunhed? And now we can 445 00:25:12,960 --> 00:25:17,560 Speaker 1: legitimately respond with: we are earning permanent listeners by doing 446 00:25:17,560 --> 00:25:20,760 Speaker 1: episodes on Gunhed. Otherwise, you know, how would we bring 447 00:25:20,800 --> 00:25:22,960 Speaker 1: people in? But that's like every episode we do. It 448 00:25:22,960 --> 00:25:25,440 Speaker 1: would be kind of hard to explain to the business division: 449 00:25:25,560 --> 00:25:29,920 Speaker 1: why this episode on Spoons? Why this episode on Gunhed? 450 00:25:30,640 --> 00:25:34,960 Speaker 1: So we say you're just gonna have to trust us. Anyway, 451 00:25:35,080 --> 00:25:38,399 Speaker 1: Jimmy goes on: In the episode Punish the Machine, you 452 00:25:38,440 --> 00:25:42,159 Speaker 1: talked about how we react negatively to the seeming quote 453 00:25:42,280 --> 00:25:47,639 Speaker 1: behavior of inanimate objects, often blaming them for transgressions or failures.
454 00:25:48,040 --> 00:25:50,480 Speaker 1: Being a fan of D and D, I immediately thought 455 00:25:50,560 --> 00:25:54,520 Speaker 1: of dice shaming, dice jails, and the acts a player 456 00:25:54,600 --> 00:25:57,760 Speaker 1: might perform when their dice roll poorly, from the aforementioned 457 00:25:57,760 --> 00:26:02,920 Speaker 1: examples to throwing dice, banishing them back into the bag, 458 00:26:03,119 --> 00:26:06,879 Speaker 1: admonishing the die verbally, or deciding that a particular D 459 00:26:07,000 --> 00:26:10,000 Speaker 1: twenty has quote used up its good rolls for the 460 00:26:10,040 --> 00:26:13,160 Speaker 1: session and is now useless. I kept hoping it would 461 00:26:13,200 --> 00:26:16,520 Speaker 1: be mentioned in the episode. You know, I think 462 00:26:16,760 --> 00:26:19,119 Speaker 1: it did come to mind when we were doing it, 463 00:26:19,160 --> 00:26:21,520 Speaker 1: and I just ended up not bringing it 464 00:26:21,600 --> 00:26:25,040 Speaker 1: up because I was thinking about... Uh, I've never actually 465 00:26:25,040 --> 00:26:27,560 Speaker 1: punished dice before, but I know I've spoken with 466 00:26:27,600 --> 00:26:30,000 Speaker 1: people who have, or know people that have. You know, 467 00:26:30,040 --> 00:26:32,840 Speaker 1: if the dice wasn't performing properly, it would be shamed 468 00:26:32,920 --> 00:26:36,359 Speaker 1: or put away, or even discarded. I think the 469 00:26:36,359 --> 00:26:38,800 Speaker 1: closest thing for me is that for a long time 470 00:26:38,840 --> 00:26:41,840 Speaker 1: I had two different D twenties in my bag of dice.
471 00:26:42,160 --> 00:26:44,240 Speaker 1: One was a black D twenty that matched the rest 472 00:26:44,280 --> 00:26:46,000 Speaker 1: of my dice, and the other one was like this 473 00:26:46,200 --> 00:26:49,440 Speaker 1: random kind of amber colored D twenty, and I have 474 00:26:49,520 --> 00:26:52,480 Speaker 1: no idea where it came from. But for some reason 475 00:26:52,560 --> 00:26:54,760 Speaker 1: or another, for a while I was using 476 00:26:54,800 --> 00:26:58,400 Speaker 1: both of them, and it seemed like the brown 477 00:26:58,760 --> 00:27:01,960 Speaker 1: D twenty was outperforming the 478 00:27:01,960 --> 00:27:04,639 Speaker 1: black one, and so I ended up just sticking with 479 00:27:04,680 --> 00:27:07,520 Speaker 1: the brown one. And eventually I think I gave the 480 00:27:07,520 --> 00:27:11,639 Speaker 1: black one away to a child, so 481 00:27:11,920 --> 00:27:14,800 Speaker 1: they could play D and D. May you absorb the 482 00:27:14,800 --> 00:27:17,520 Speaker 1: curse of luck of this item. Yeah, which is 483 00:27:17,600 --> 00:27:20,480 Speaker 1: of course all ridiculous, because these are exactly the same. 484 00:27:20,480 --> 00:27:23,080 Speaker 1: There's no difference. They're not, you know, weighted differently, they 485 00:27:23,080 --> 00:27:26,240 Speaker 1: don't have different energies, etcetera. And I'm not even particularly, 486 00:27:26,440 --> 00:27:29,280 Speaker 1: I mean, aesthetically, I would have to say I 487 00:27:29,760 --> 00:27:32,080 Speaker 1: would prefer the one that matches the rest of 488 00:27:32,720 --> 00:27:35,520 Speaker 1: my set. But, you know, the brown one was performing better, 489 00:27:35,560 --> 00:27:38,359 Speaker 1: so that's the one I kept. This is funny, 490 00:27:38,359 --> 00:27:40,600 Speaker 1: because I was just playing D and D last night.
Actually, 491 00:27:40,640 --> 00:27:42,720 Speaker 1: we were talking before we started recording that we both 492 00:27:42,760 --> 00:27:46,200 Speaker 1: played D and D virtually last night, and so the 493 00:27:46,280 --> 00:27:49,080 Speaker 1: character I'm playing right now is a rogue. If 494 00:27:49,080 --> 00:27:51,080 Speaker 1: you're not familiar with D and D, he's got 495 00:27:51,119 --> 00:27:54,080 Speaker 1: low constitution, low armor class. So it means, like, you 496 00:27:54,119 --> 00:27:56,879 Speaker 1: know, a couple of bad rolls and you're very low. 497 00:27:57,160 --> 00:28:00,600 Speaker 1: You're very close to death. And one of 498 00:28:00,600 --> 00:28:03,439 Speaker 1: the closest calls of my campaign right now was 499 00:28:03,520 --> 00:28:07,399 Speaker 1: literally from repeatedly rolling a one to check my 500 00:28:07,440 --> 00:28:10,480 Speaker 1: ability at walking up a hill in the snow. 501 00:28:10,520 --> 00:28:13,520 Speaker 1: And so the dice can really do you dirty and 502 00:28:13,640 --> 00:28:16,640 Speaker 1: you can get mad at them. Yeah, yeah, get another 503 00:28:16,720 --> 00:28:18,200 Speaker 1: character to help you next time, and then you get 504 00:28:18,200 --> 00:28:21,359 Speaker 1: advantage on that roll. To help me walking? Yes, 505 00:28:23,200 --> 00:28:26,760 Speaker 1: help an old man up the hill. Anyway, we should 506 00:28:26,760 --> 00:28:29,960 Speaker 1: go on with Jimmy's message. Jimmy picks up: However, the 507 00:28:30,040 --> 00:28:32,560 Speaker 1: thing that compelled me to write in was your Weird 508 00:28:32,600 --> 00:28:36,200 Speaker 1: House Cinema episode on Frogs, which I really enjoyed.
Your 509 00:28:36,240 --> 00:28:39,120 Speaker 1: prompt for other Nature in Revolt movies was a call 510 00:28:39,200 --> 00:28:43,400 Speaker 1: to action, as I immediately thought of Phase IV, a 511 00:28:43,440 --> 00:28:45,800 Speaker 1: film in which a couple of scientists study a group 512 00:28:45,800 --> 00:28:50,160 Speaker 1: of strangely behaving ants that are plaguing the countryside and 513 00:28:50,240 --> 00:28:53,960 Speaker 1: building bizarre hives. It gets very out there with concepts 514 00:28:53,960 --> 00:28:57,479 Speaker 1: and features some really awesome and also horrifying scenes of ants. 515 00:28:57,560 --> 00:29:00,480 Speaker 1: It's definitely weird, and whether or not you feature it 516 00:29:00,480 --> 00:29:03,200 Speaker 1: in an episode, it is certainly worth a watch. A 517 00:29:03,240 --> 00:29:07,600 Speaker 1: second suggestion for a uniquely bizarre film is The Visitor. 518 00:29:07,880 --> 00:29:12,240 Speaker 1: I know this one. A truly strange film about cosmic 519 00:29:12,400 --> 00:29:16,560 Speaker 1: entities in a struggle of good versus evil. Throw in killer birds, 520 00:29:16,600 --> 00:29:20,120 Speaker 1: trippy art house cinema visuals, and a girl with mental powers, 521 00:29:20,200 --> 00:29:22,400 Speaker 1: and it turns into a heck of a ride. It's 522 00:29:22,440 --> 00:29:26,560 Speaker 1: got Lance Henriksen, Glenn Ford, John Huston, and a cameo 523 00:29:26,720 --> 00:29:31,400 Speaker 1: by Franco Nero as Space Jesus. Anyway, I am truly 524 00:29:31,520 --> 00:29:34,000 Speaker 1: enjoying your show and can't wait to dig through the 525 00:29:34,040 --> 00:29:36,280 Speaker 1: back catalog. Keep up the great work. Thank you for 526 00:29:36,280 --> 00:29:39,840 Speaker 1: your time, Jimmy. Well, thanks for this message, Jimmy. And 527 00:29:39,880 --> 00:29:42,959 Speaker 1: oh yeah, we've seen The Visitor.
You may not know this, 528 00:29:43,040 --> 00:29:45,280 Speaker 1: but we of course live in the Atlanta area, and 529 00:29:45,320 --> 00:29:48,880 Speaker 1: The Visitor was filmed in Atlanta in the 530 00:29:48,960 --> 00:29:52,239 Speaker 1: seventies or eighties, I think seventies, and so 531 00:29:52,280 --> 00:29:55,480 Speaker 1: there are recognizable roads and landmarks in it. But it 532 00:29:55,560 --> 00:29:58,800 Speaker 1: is also just, it is so odd. It has 533 00:29:58,840 --> 00:30:01,400 Speaker 1: a kind of Spaghetti Western thing going on, 534 00:30:01,480 --> 00:30:03,959 Speaker 1: and by that I mean it's an Italian 535 00:30:04,040 --> 00:30:08,000 Speaker 1: film filmed in the United States with primarily US actors, 536 00:30:08,720 --> 00:30:12,200 Speaker 1: and there's a certain, I don't know, 537 00:30:12,360 --> 00:30:15,400 Speaker 1: there's a certain shadow quality to it that is very 538 00:30:15,480 --> 00:30:19,600 Speaker 1: engrossing and very alienating at the same time. Yeah, it's 539 00:30:19,680 --> 00:30:21,920 Speaker 1: a lot of fun, though. 540 00:30:21,960 --> 00:30:23,400 Speaker 1: A lot of the fun I had with it was 541 00:30:23,480 --> 00:30:25,880 Speaker 1: just the fact that it was filmed in Atlanta back 542 00:30:25,880 --> 00:30:29,040 Speaker 1: in the day, so you can see various landmarks of 543 00:30:29,080 --> 00:30:32,000 Speaker 1: Atlanta there. It doesn't hide its Atlanta-ness, and its 544 00:30:32,000 --> 00:30:34,960 Speaker 1: Atlanta-ness is not subtle. So it's not like Freejack, 545 00:30:35,040 --> 00:30:37,520 Speaker 1: where they're filming it in Atlanta pretending 546 00:30:37,560 --> 00:30:41,200 Speaker 1: it's something else and you have to look closely 547 00:30:41,240 --> 00:30:44,040 Speaker 1: to catch it. No, it's very Atlanta.
As for 548 00:30:44,120 --> 00:30:47,000 Speaker 1: Phase IV, I haven't seen this one. I'm vaguely aware 549 00:30:47,040 --> 00:30:48,960 Speaker 1: of it because it's kind of late for a giant 550 00:30:48,960 --> 00:30:53,600 Speaker 1: ant movie in cinematic history. But I'd be interested 551 00:30:53,640 --> 00:30:55,600 Speaker 1: to check it out. Is it giant ants? I thought 552 00:30:55,600 --> 00:30:59,000 Speaker 1: it was normal sized ants. Normal ants. Well, I haven't 553 00:30:59,000 --> 00:31:01,160 Speaker 1: seen it, so we will have to find out. The 554 00:31:01,200 --> 00:31:03,880 Speaker 1: ant looks big on the poster, but then again, 555 00:31:04,000 --> 00:31:06,320 Speaker 1: the frog looks big on the poster for Frogs. So 556 00:31:06,360 --> 00:31:08,320 Speaker 1: it could be the same marketing gimmick, where it's like, 557 00:31:08,360 --> 00:31:10,160 Speaker 1: we can't put a normal size frog on this poster. 558 00:31:10,480 --> 00:31:12,120 Speaker 1: It needs to be huge, and it needs to have 559 00:31:12,160 --> 00:31:13,880 Speaker 1: a person's arm sticking out of it. If we're gonna 560 00:31:13,880 --> 00:31:16,600 Speaker 1: put an ant on this poster, it needs to be larger 561 00:31:16,640 --> 00:31:24,600 Speaker 1: than a human hand. I think that's very plausible. All right, 562 00:31:24,640 --> 00:31:26,280 Speaker 1: here's another one. You want to take this one, Joe? 563 00:31:26,320 --> 00:31:29,920 Speaker 1: From Brandon. Sure. Brandon says: Hi, guys, I have a 564 00:31:29,960 --> 00:31:32,800 Speaker 1: suggestion for weird house cinema, one of my all time 565 00:31:32,840 --> 00:31:36,680 Speaker 1: favorites, Shakma. It is a nineties B movie about a 566 00:31:36,760 --> 00:31:42,080 Speaker 1: genetically altered baboon that terrorizes a group of med students 567 00:31:42,120 --> 00:31:45,760 Speaker 1: that are LARPing D and D. Also, it has Roddy McDowall.
568 00:31:46,080 --> 00:31:48,160 Speaker 1: I think you would like this odd treat. Keep up 569 00:31:48,200 --> 00:31:52,080 Speaker 1: the great work, Brandon. I don't remember much about this movie, 570 00:31:52,080 --> 00:31:54,239 Speaker 1: but I did watch part of it long ago at 571 00:31:54,360 --> 00:31:57,080 Speaker 1: a friend's house, and it was, at least from the 572 00:31:57,080 --> 00:32:00,000 Speaker 1: parts I saw, the best killer baboon bottle episode 573 00:32:00,040 --> 00:32:03,240 Speaker 1: B movie I've ever seen. The trailer for it is amazing, 574 00:32:03,560 --> 00:32:06,520 Speaker 1: so see the trailer if nothing else, because the trailer 575 00:32:06,560 --> 00:32:09,360 Speaker 1: has this wonderful intense narration that just gets 576 00:32:09,400 --> 00:32:12,880 Speaker 1: more intense, and it keeps saying Shakma, Shakma, till at 577 00:32:12,920 --> 00:32:16,160 Speaker 1: the very end the baboon leaps at the camera 578 00:32:16,440 --> 00:32:21,040 Speaker 1: and the narrator just goes Shakma, and then it ends. 579 00:32:21,200 --> 00:32:26,280 Speaker 1: I think Shakma is the baboon's name. I think so. Yeah. 580 00:32:26,560 --> 00:32:28,760 Speaker 1: And of course, this is also like the 581 00:32:28,760 --> 00:32:30,280 Speaker 1: sort of trailer where they just say the name of 582 00:32:30,280 --> 00:32:32,200 Speaker 1: the film so many times, and I miss that. Like, 583 00:32:32,240 --> 00:32:42,640 Speaker 1: I wish they did that more. Yeah, gabbo gabbo gabbo. Rob, 584 00:32:42,680 --> 00:32:47,200 Speaker 1: do you want to read our last transmission from Planet Daniel?
Yes, 585 00:32:47,360 --> 00:32:50,240 Speaker 1: all right. Daniel writes: Hey, y'all, just listened to the 586 00:32:50,240 --> 00:32:52,440 Speaker 1: Mad Love episode and wanted to add to the voices 587 00:32:52,480 --> 00:32:54,600 Speaker 1: in support of all of your amazing dives into these 588 00:32:54,640 --> 00:32:58,160 Speaker 1: curious and overlooked films. The Weird House Cinema episodes are 589 00:32:58,200 --> 00:33:02,480 Speaker 1: a great source of education, well, okay, and entertainment. 590 00:33:02,520 --> 00:33:04,920 Speaker 1: I guess they are educational in their own way. Sometimes 591 00:33:05,560 --> 00:33:08,880 Speaker 1: I miss visiting art house theaters and peeping whatever strange 592 00:33:08,920 --> 00:33:11,920 Speaker 1: film they've unearthed and are eager to share. Your podcast 593 00:33:12,000 --> 00:33:14,160 Speaker 1: brings a similar sense of wonder, so for that, I'm 594 00:33:14,240 --> 00:33:17,600 Speaker 1: very appreciative. Thank you and keep doing amazing work. Daniel. 595 00:33:17,680 --> 00:33:20,320 Speaker 1: Oh, too kind, Daniel. Well, thank you, and thanks for 596 00:33:20,320 --> 00:33:24,040 Speaker 1: writing in. Okay, should we wrap it up there? I 597 00:33:24,040 --> 00:33:26,080 Speaker 1: guess so. Yeah. Generally, once we get into the 598 00:33:26,280 --> 00:33:28,760 Speaker 1: weird houses, we're done. I think we're out 599 00:33:28,760 --> 00:33:31,800 Speaker 1: of time, but we thank everybody for writing in. You know, 600 00:33:31,880 --> 00:33:34,960 Speaker 1: just bear in mind, you know, 601 00:33:35,000 --> 00:33:37,880 Speaker 1: we don't always get to respond to email. We rarely 602 00:33:37,880 --> 00:33:39,800 Speaker 1: get to respond to email, and we only get to 603 00:33:39,840 --> 00:33:42,360 Speaker 1: read some of it on here, despite the weekly format.
604 00:33:42,520 --> 00:33:45,840 Speaker 1: But we do read, uh, in our own heads silently, 605 00:33:46,280 --> 00:33:48,640 Speaker 1: or maybe sometimes mumbling out loud, I don't know, but 606 00:33:48,680 --> 00:33:51,520 Speaker 1: we do read it all. So if you write in to us, 607 00:33:51,880 --> 00:33:53,880 Speaker 1: we will read it. We will try and respond if 608 00:33:53,880 --> 00:33:55,840 Speaker 1: we have time, or, you know, we'll try 609 00:33:55,880 --> 00:33:57,640 Speaker 1: and feature some of it on the listener mail. But 610 00:33:57,960 --> 00:34:00,479 Speaker 1: just keep it coming, because it's great to know 611 00:34:00,640 --> 00:34:03,360 Speaker 1: what everybody else out there is thinking, what, you know, 612 00:34:03,600 --> 00:34:07,520 Speaker 1: your own expertise or experience says about a given 613 00:34:07,560 --> 00:34:10,320 Speaker 1: topic, and also it's a great place to 614 00:34:10,400 --> 00:34:13,120 Speaker 1: make suggestions for the future. So keep it coming. Should 615 00:34:13,120 --> 00:34:14,719 Speaker 1: we tell you to hold off for a couple of weeks 616 00:34:14,760 --> 00:34:17,920 Speaker 1: if your name is any variation on Dan? Is that... no, no, 617 00:34:18,080 --> 00:34:22,120 Speaker 1: don't limit the Dans. Let's unleash 618 00:34:22,160 --> 00:34:27,319 Speaker 1: the Dans. Kick it up. Dans, Danny's, Danielle's, Daniel's, 619 00:34:27,440 --> 00:34:31,440 Speaker 1: just come on in. Anyway, huge thanks as always 620 00:34:31,440 --> 00:34:34,920 Speaker 1: to our excellent audio producer Seth Nicholas Johnson.
If you 621 00:34:34,960 --> 00:34:36,880 Speaker 1: would like to get in touch with us with feedback 622 00:34:36,920 --> 00:34:39,240 Speaker 1: on this episode or any other, to suggest a topic 623 00:34:39,320 --> 00:34:41,440 Speaker 1: for the future, or just to say hello, you can 624 00:34:41,520 --> 00:34:44,320 Speaker 1: email us at contact at stuff to blow your mind 625 00:34:44,480 --> 00:34:53,880 Speaker 1: dot com. Stuff to Blow Your Mind is a production 626 00:34:53,920 --> 00:34:56,680 Speaker 1: of iHeartRadio. For more podcasts from iHeartRadio, 627 00:34:56,880 --> 00:34:59,680 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you 628 00:34:59,719 --> 00:35:00,960 Speaker 1: listen to your favorite shows.