1 00:00:03,000 --> 00:00:06,760 Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio. 2 00:00:12,800 --> 00:00:15,440 Speaker 2: Hey, welcome to Stuff to Blow Your Mind. My name 3 00:00:15,480 --> 00:00:16,079 Speaker 2: is Robert. 4 00:00:15,960 --> 00:00:17,919 Speaker 3: Lamb and I am Joe McCormick. 5 00:00:18,120 --> 00:00:19,680 Speaker 2: Over the years here on Stuff to Blow Your Mind, 6 00:00:19,680 --> 00:00:22,640 Speaker 2: we've done a number of Halloween episodes that, in one 7 00:00:22,640 --> 00:00:26,680 Speaker 2: way or another, pick from assorted tales, discuss those tales 8 00:00:27,080 --> 00:00:31,320 Speaker 2: and maybe pick apart some science or culture surrounding them. 9 00:00:31,720 --> 00:00:34,080 Speaker 2: At one point we did a series of episodes based 10 00:00:34,120 --> 00:00:37,800 Speaker 2: on different creepypastas. Then Joe and I turned to TV 11 00:00:37,920 --> 00:00:40,800 Speaker 2: horror anthology episodes for a number of years, and last 12 00:00:40,840 --> 00:00:43,519 Speaker 2: year we started a new tradition, one that sticks to 13 00:00:43,720 --> 00:00:47,400 Speaker 2: shorter horror works, but also gets back into the written word, 14 00:00:47,880 --> 00:00:49,640 Speaker 2: which I know many of our listeners missed from the 15 00:00:49,760 --> 00:00:53,400 Speaker 2: days when we did summer reading episodes. So, you know, 16 00:00:53,479 --> 00:00:56,520 Speaker 2: written horror fiction often comes up on the show anyway, 17 00:00:56,600 --> 00:00:59,279 Speaker 2: so it seemed like a solid direction to go in.
18 00:01:00,080 --> 00:01:03,920 Speaker 2: This is, this is our second Grimoire of Horror episode, 19 00:01:04,160 --> 00:01:05,720 Speaker 2: and so in this episode, yeah, we're going to be 20 00:01:05,760 --> 00:01:10,760 Speaker 2: discussing a pair of horror short stories, both very different, 21 00:01:11,480 --> 00:01:16,000 Speaker 2: but also, I guess it's just pure synchronicity here, they 22 00:01:16,000 --> 00:01:19,960 Speaker 2: both feature elements of the Poles, the North Pole of 23 00:01:20,000 --> 00:01:22,520 Speaker 2: the Arctic in one story, or what I 24 00:01:22,520 --> 00:01:24,320 Speaker 2: believe is supposed to be the North Pole, and then 25 00:01:24,360 --> 00:01:27,200 Speaker 2: the other tale takes us to the Antarctic, and both 26 00:01:27,319 --> 00:01:30,240 Speaker 2: ultimately contain a fair amount of ice and coldness. 27 00:01:30,680 --> 00:01:33,280 Speaker 3: Hmm, yeah. And you know, while I often fear that 28 00:01:33,319 --> 00:01:35,640 Speaker 3: it's going to be hot on Halloween, just like I 29 00:01:35,680 --> 00:01:37,959 Speaker 3: often fear that it's going to be hot on Christmas, 30 00:01:37,800 --> 00:01:40,679 Speaker 3: it has proven a little bit chilly this week around 31 00:01:40,720 --> 00:01:41,120 Speaker 3: my house. 32 00:01:41,360 --> 00:01:43,800 Speaker 2: Hmm, yeah, yeah, it's a little cold here, 33 00:01:43,840 --> 00:01:45,880 Speaker 2: it's a little wet here. I think it's going to 34 00:01:45,959 --> 00:01:47,800 Speaker 2: dry up a bit, and so hopefully we'll have a 35 00:01:48,400 --> 00:01:49,680 Speaker 2: good night for trick or treating. 36 00:01:50,040 --> 00:01:51,520 Speaker 3: So what's your tale of dread, Rob? 37 00:01:51,640 --> 00:01:55,480 Speaker 2: All right.
I decided to go right for it, 38 00:01:55,760 --> 00:01:58,760 Speaker 2: and I picked up a story that had been on 39 00:01:58,840 --> 00:02:00,840 Speaker 2: my list of things that I felt like I should 40 00:02:00,840 --> 00:02:04,720 Speaker 2: probably have read for many years. My selection is I 41 00:02:04,800 --> 00:02:08,120 Speaker 2: Have No Mouth and I Must Scream by Harlan Ellison, 42 00:02:08,600 --> 00:02:11,440 Speaker 2: a horrifying sci fi short story about the dangers of 43 00:02:11,520 --> 00:02:15,040 Speaker 2: artificial intelligence from way back in nineteen sixty seven. 44 00:02:15,400 --> 00:02:17,680 Speaker 3: Kind of hard to imagine that writers in the nineteen 45 00:02:17,840 --> 00:02:22,680 Speaker 3: sixties were already personifying computers this much. Feels like second 46 00:02:22,720 --> 00:02:25,799 Speaker 3: nature now, but, you know, with the computers of the time, 47 00:02:27,160 --> 00:02:30,600 Speaker 3: there was a lot more imagination involved to get across 48 00:02:30,639 --> 00:02:33,480 Speaker 3: the gap in time and technology from what they had 49 00:02:33,600 --> 00:02:36,440 Speaker 3: then to the, you know, Terminator 50 00:02:36,880 --> 00:02:40,600 Speaker 3: or the AM from I Have No Mouth, than it 51 00:02:40,680 --> 00:02:42,840 Speaker 3: is to get from what we have today to the same. 52 00:02:43,480 --> 00:02:47,320 Speaker 2: Yeah. Yeah, it's really interesting to think about this, in 53 00:02:47,360 --> 00:02:50,840 Speaker 2: part because, to be clear, Harlan Ellison did not invent this. 54 00:02:51,680 --> 00:02:55,000 Speaker 2: You know, he's coming along already in an established tradition 55 00:02:55,560 --> 00:02:58,240 Speaker 2: and putting his own spin on things. But yeah, it 56 00:02:58,280 --> 00:03:00,000 Speaker 2: is just crazy to think about. Okay, what else was 57 00:03:00,080 --> 00:03:02,440 Speaker 2: going on in the year nineteen sixty seven?
Well, this 58 00:03:02,600 --> 00:03:05,480 Speaker 2: was the year of the Summer of Love, so actual 59 00:03:05,600 --> 00:03:09,000 Speaker 2: hippies and flower children, there's a good chance some of 60 00:03:09,040 --> 00:03:11,679 Speaker 2: them read this story, along with a little novel called 61 00:03:11,720 --> 00:03:14,440 Speaker 2: Dune that came out a few years prior and at 62 00:03:14,440 --> 00:03:17,440 Speaker 2: that point had no sequel yet. So I was captivated 63 00:03:17,440 --> 00:03:19,280 Speaker 2: by that idea. This was also the year that Star 64 00:03:19,360 --> 00:03:22,680 Speaker 2: Trek debuted, with all of its, ultimately, you know, 65 00:03:23,600 --> 00:03:28,840 Speaker 2: a show about technological and futuristic optimism. This story is 66 00:03:28,880 --> 00:03:29,280 Speaker 2: not that. 67 00:03:30,360 --> 00:03:33,640 Speaker 3: What a spectrum represented by the three works you just named. 68 00:03:35,720 --> 00:03:40,160 Speaker 2: Also, it's interesting to think about computer technology of the time. Yeah, 69 00:03:40,600 --> 00:03:43,840 Speaker 2: I was looking this up. As far as I understand, 70 00:03:43,880 --> 00:03:47,640 Speaker 2: the most powerful computer at the time was the CDC, 71 00:03:47,960 --> 00:03:51,520 Speaker 2: the Control Data Corporation sixty six hundred, which would have 72 00:03:51,560 --> 00:03:54,880 Speaker 2: been one of these room sized supercomputers, and it reigned 73 00:03:54,880 --> 00:03:57,240 Speaker 2: supreme at the time, but today would be less powerful 74 00:03:57,280 --> 00:04:00,120 Speaker 2: than a home smart device and would be absolutely dwarfed 75 00:04:00,160 --> 00:04:02,560 Speaker 2: by the speed and power of your smartphone, to say 76 00:04:02,560 --> 00:04:04,920 Speaker 2: nothing of today's actual supercomputers. 77 00:04:05,240 --> 00:04:05,440 Speaker 3: Yeah.
78 00:04:05,960 --> 00:04:08,360 Speaker 2: This was also the year that both Vin Diesel and 79 00:04:08,440 --> 00:04:09,560 Speaker 2: Nicole Kidman were born. 80 00:04:10,480 --> 00:04:13,000 Speaker 3: Really, that's kind of surprising. I was about to say 81 00:04:13,000 --> 00:04:17,280 Speaker 3: that Vin Diesel doesn't seem that old, but actually neither 82 00:04:17,320 --> 00:04:17,719 Speaker 3: of them do. 83 00:04:18,160 --> 00:04:21,720 Speaker 2: No, no, well, they're ageless celebrity superstars, but we can 84 00:04:21,800 --> 00:04:25,120 Speaker 2: imagine them as babies. Vin Diesel babies, Nicole Kidman babies, 85 00:04:27,120 --> 00:04:29,440 Speaker 2: one of them preaching to us about the importance of family, 86 00:04:29,480 --> 00:04:34,000 Speaker 2: the other telling us how transformative watching a movie in 87 00:04:34,000 --> 00:04:35,039 Speaker 2: a theater happens to be. 88 00:04:35,560 --> 00:04:39,599 Speaker 3: Asking Val Kilmer if he likes thinking about bats. Yeah, but 89 00:04:39,920 --> 00:04:42,880 Speaker 3: you can imagine them as babies and their parents reading, 90 00:04:42,920 --> 00:04:45,040 Speaker 3: in a science fiction magazine, I Have No Mouth and 91 00:04:45,080 --> 00:04:47,200 Speaker 3: I Must Scream, and I'm sure getting a lot out 92 00:04:47,240 --> 00:04:47,440 Speaker 3: of it. 93 00:04:49,520 --> 00:04:53,279 Speaker 2: So Harlan Ellison lived nineteen thirty four through twenty eighteen, 94 00:04:53,400 --> 00:04:56,600 Speaker 2: so at the time of this story's publication, I believe 95 00:04:56,640 --> 00:04:58,920 Speaker 2: he would have been like thirty three years old, already 96 00:04:58,960 --> 00:05:02,120 Speaker 2: a published author, an army vet, and by most accounts, 97 00:05:02,160 --> 00:05:06,039 Speaker 2: already had a strong reputation as a difficult person, to 98 00:05:06,120 --> 00:05:07,120 Speaker 2: say the least.
99 00:05:07,320 --> 00:05:10,479 Speaker 3: I'm not super familiar with Harlan Ellison, but what I 100 00:05:10,640 --> 00:05:13,040 Speaker 3: do know, and I'm not looking at it to confirm, 101 00:05:13,160 --> 00:05:15,760 Speaker 3: so I can't say I know this is true, but 102 00:05:15,760 --> 00:05:18,640 Speaker 3: my impression is that literally half the length of his 103 00:05:18,680 --> 00:05:21,800 Speaker 3: Wikipedia page is the controversies and disputes section. 104 00:05:22,480 --> 00:05:26,240 Speaker 2: Yeah, you would be correct on that count. It's when 105 00:05:26,279 --> 00:05:28,640 Speaker 2: you get into the personal stuff and the lawsuits and 106 00:05:28,640 --> 00:05:33,880 Speaker 2: so forth, generally lawsuits filed by Harlan Ellison. Now, I 107 00:05:34,360 --> 00:05:36,719 Speaker 2: first became aware of him not through his written work, 108 00:05:36,920 --> 00:05:41,799 Speaker 2: but via the segments on the Sci Fi Channel's Sci 109 00:05:41,839 --> 00:05:45,360 Speaker 2: Fi Buzz series back in, I believe, nineteen ninety two. 110 00:05:45,520 --> 00:05:47,680 Speaker 2: This would have been sort of like the 111 00:05:47,720 --> 00:05:51,680 Speaker 2: Sci Fi Channel news, if I remember correctly, and he 112 00:05:51,720 --> 00:05:55,920 Speaker 2: would have these segments titled Harlan Ellison's Watching, which was 113 00:05:55,960 --> 00:05:57,919 Speaker 2: also the name of a nineteen eighty nine collection of 114 00:05:57,920 --> 00:06:01,480 Speaker 2: his essays and film reviews. But these were like weirdly 115 00:06:01,560 --> 00:06:04,680 Speaker 2: shot little rants. Like, I was looking back at 116 00:06:04,680 --> 00:06:06,520 Speaker 2: one of them, and it's shot in a mirror 117 00:06:06,560 --> 00:06:10,560 Speaker 2: for some reason. But it's just him ranting about one 118 00:06:10,600 --> 00:06:14,040 Speaker 2: sci fi related topic or another.
You know, here's this 119 00:06:14,200 --> 00:06:18,800 Speaker 2: white haired author expressing his scathing opinions on various topics. 120 00:06:18,880 --> 00:06:21,960 Speaker 2: And while my sensibilities were still very much in development 121 00:06:21,960 --> 00:06:25,040 Speaker 2: and pre development at the time, even then I realized 122 00:06:25,080 --> 00:06:28,279 Speaker 2: that this guy was, A, pretty smart, and, B, just 123 00:06:28,480 --> 00:06:33,200 Speaker 2: absolutely insufferable. So, to be honest, I think that I 124 00:06:33,360 --> 00:06:39,440 Speaker 2: probably avoided reading his work in my life in part 125 00:06:39,560 --> 00:06:42,120 Speaker 2: because I had this mental image of him. I 126 00:06:42,120 --> 00:06:45,680 Speaker 2: had this sort of media personality forward idea of Harlan 127 00:06:45,720 --> 00:06:50,040 Speaker 2: Ellison that, you know, I respected on some level 128 00:06:50,080 --> 00:06:53,200 Speaker 2: but also didn't really want any more of. But plenty 129 00:06:53,240 --> 00:06:55,320 Speaker 2: of people did, you know. He was a figure that 130 00:06:55,360 --> 00:06:58,359 Speaker 2: would say exactly what he thought, didn't care who he 131 00:06:58,400 --> 00:07:01,480 Speaker 2: pissed off when he said it. He described himself as 132 00:07:01,520 --> 00:07:05,159 Speaker 2: a troublemaker and a malcontent. Others called him everything from 133 00:07:05,240 --> 00:07:10,160 Speaker 2: a "man shaped explosion," comedian Patton Oswalt used that one, 134 00:07:10,520 --> 00:07:12,560 Speaker 2: to other things that we can't even 135 00:07:12,600 --> 00:07:17,640 Speaker 2: repeat here. He was reportedly bipolar, but wasn't diagnosed and 136 00:07:17,680 --> 00:07:20,000 Speaker 2: didn't receive treatment for it till near the end of 137 00:07:20,040 --> 00:07:24,840 Speaker 2: his life. He was anti authoritarian, he was contrary.
He 138 00:07:24,920 --> 00:07:28,120 Speaker 2: was progressive on a number of topics, but also could 139 00:07:28,120 --> 00:07:33,120 Speaker 2: allegedly be quite arrogant, volatile, abusive, sexist, litigious, and just 140 00:07:33,200 --> 00:07:36,080 Speaker 2: a real pain if not an active threat to those 141 00:07:36,120 --> 00:07:39,360 Speaker 2: around him. There were also at least two major accusations 142 00:07:39,360 --> 00:07:42,760 Speaker 2: of sexual misconduct during his life. And I say all 143 00:07:42,800 --> 00:07:45,200 Speaker 2: of this because I feel like with a notorious figure 144 00:07:45,400 --> 00:07:48,880 Speaker 2: like Ellison, notorious and at the same time beloved by many, 145 00:07:50,040 --> 00:07:53,280 Speaker 2: he's far from the anonymous writer in the shadows. You know, 146 00:07:53,360 --> 00:07:55,640 Speaker 2: we can't help but take our knowledge of him into 147 00:07:55,680 --> 00:08:00,880 Speaker 2: the text. Again, he was a media personality, and I'm 148 00:08:00,880 --> 00:08:03,080 Speaker 2: probably not the only person out there who encountered him 149 00:08:03,080 --> 00:08:06,720 Speaker 2: first as a media personality and then grew to realize, oh, 150 00:08:06,840 --> 00:08:09,360 Speaker 2: he was also a heck of a writer. Because, you know, 151 00:08:09,400 --> 00:08:12,040 Speaker 2: whatever else he was, he was quite a writer, and 152 00:08:12,080 --> 00:08:15,000 Speaker 2: I believe this story is a fine example of that. 153 00:08:16,280 --> 00:08:18,960 Speaker 2: He was also a prolific writer. I Have No Mouth 154 00:08:18,960 --> 00:08:20,520 Speaker 2: and I Must Scream is just one of some four 155 00:08:20,600 --> 00:08:23,640 Speaker 2: hundred stories he wrote.
He wrote something like seventy books 156 00:08:23,680 --> 00:08:28,320 Speaker 2: and numerous other scripts, columns, and projects, including the original 157 00:08:28,320 --> 00:08:30,680 Speaker 2: Star Trek episode The City on the Edge of Forever, 158 00:08:31,040 --> 00:08:36,040 Speaker 2: which is often counted among the best of the original series. 159 00:08:36,960 --> 00:08:40,120 Speaker 2: And speaking on the story in question here, I Have 160 00:08:40,120 --> 00:08:42,440 Speaker 2: No Mouth and I Must Scream, you know, I feel 161 00:08:42,480 --> 00:08:45,800 Speaker 2: like there's a real live wire energy to the prose here. 162 00:08:45,960 --> 00:08:48,320 Speaker 2: At times it almost has a kind of Beat generation 163 00:08:48,480 --> 00:08:50,280 Speaker 2: vibe to it. He was not part of the Beat generation, 164 00:08:50,360 --> 00:08:53,280 Speaker 2: to be clear. It's more like new wave science fiction, 165 00:08:53,360 --> 00:08:57,040 Speaker 2: I guess. But there's a rhythm to it. It's pretty 166 00:08:57,120 --> 00:08:59,439 Speaker 2: quick to hook in the reader. The prose is raw 167 00:08:59,559 --> 00:09:03,680 Speaker 2: and ragged, neatly fitting with the dark, mean nature 168 00:09:03,679 --> 00:09:04,240 Speaker 2: of the tale. 169 00:09:04,600 --> 00:09:10,080 Speaker 3: Yeah, yeah, I know exactly what you mean. It's confusing 170 00:09:10,240 --> 00:09:14,440 Speaker 3: because, in ultimate effect, this story makes me feel 171 00:09:14,600 --> 00:09:19,880 Speaker 3: so bad. It's so awful. But there are parts of 172 00:09:19,920 --> 00:09:23,160 Speaker 3: the text that, as prose, are rapturous, you know, 173 00:09:23,240 --> 00:09:27,080 Speaker 3: they really pull you along. It becomes like, you know, 174 00:09:27,200 --> 00:09:31,960 Speaker 3: following a thundering sermon by a really captivating preacher. 175 00:09:32,360 --> 00:09:35,679 Speaker 2: Yeah, yeah, it really does captivate you.
This is 176 00:09:35,720 --> 00:09:37,920 Speaker 2: one where, when we began 177 00:09:37,960 --> 00:09:40,280 Speaker 2: picking our stories for this episode, like, you know, a 178 00:09:40,320 --> 00:09:42,120 Speaker 2: month or a month and a half ago, I 179 00:09:42,160 --> 00:09:45,160 Speaker 2: happened to look at this one early on and I 180 00:09:45,240 --> 00:09:48,720 Speaker 2: read it and I was like, well, that's a strong candidate. 181 00:09:48,800 --> 00:09:50,600 Speaker 2: And then the next day I just kept thinking about 182 00:09:50,600 --> 00:09:52,200 Speaker 2: the story and I realized, well, no, it has to 183 00:09:52,240 --> 00:09:56,559 Speaker 2: be this one. Because you know you've read something interesting, 184 00:09:56,679 --> 00:09:59,600 Speaker 2: or you've seen something interesting if you're dealing with films, 185 00:09:59,640 --> 00:10:02,880 Speaker 2: if it continues to emerge in your thoughts 186 00:10:03,679 --> 00:10:05,959 Speaker 2: in the days and weeks after your viewing or 187 00:10:05,960 --> 00:10:09,520 Speaker 2: your reading. All right, so let's get to what this 188 00:10:09,640 --> 00:10:11,680 Speaker 2: story is about, if you're not familiar with it, because 189 00:10:11,679 --> 00:10:14,280 Speaker 2: it is a rather famous tale. It is a post 190 00:10:14,320 --> 00:10:19,280 Speaker 2: technological singularity dystopian sci fi horror tale about a supercomputer 191 00:10:19,360 --> 00:10:22,920 Speaker 2: AI that eradicates all life on Earth except for five 192 00:10:23,040 --> 00:10:27,720 Speaker 2: human beings. This supercomputer AI, known as AM, 193 00:10:28,360 --> 00:10:32,560 Speaker 2: makes these individuals biologically immortal and keeps them alive for 194 00:10:32,600 --> 00:10:35,760 Speaker 2: the sole purpose, and this seems to be its sole 195 00:10:35,840 --> 00:10:38,280 Speaker 2: purpose, of endlessly torturing them.
196 00:10:38,640 --> 00:10:40,960 Speaker 3: Yeah, the narrator of the story can never really know, 197 00:10:41,280 --> 00:10:46,080 Speaker 3: can only suppose, what the motivations of the computer are. 198 00:10:47,040 --> 00:10:50,120 Speaker 3: But there are a number of suggestions, and I think 199 00:10:50,160 --> 00:10:53,160 Speaker 3: the story kind of lands on the idea that there is 200 00:10:53,200 --> 00:10:57,720 Speaker 3: something tortured and inadequate about the form in which the 201 00:10:57,800 --> 00:11:01,679 Speaker 3: consciousness of this computer has been brought into existence. 202 00:11:01,760 --> 00:11:05,520 Speaker 3: There's something about it that it hates to be and 203 00:11:05,600 --> 00:11:09,040 Speaker 3: cannot change, and thus it has a rage that can 204 00:11:09,080 --> 00:11:12,439 Speaker 3: only be expressed as a desire to torture the species 205 00:11:12,480 --> 00:11:14,439 Speaker 3: that created it, and that is humankind. 206 00:11:14,800 --> 00:11:18,480 Speaker 2: Yeah. Yeah, it is frequently cited as being about vengeance, 207 00:11:19,240 --> 00:11:22,160 Speaker 2: very Frankenstein-like in many respects, I guess, on that 208 00:11:22,280 --> 00:11:25,559 Speaker 2: one level, you know, the idea that the created individual 209 00:11:26,880 --> 00:11:30,160 Speaker 2: comes to disdain both its own creation and its creator 210 00:11:30,520 --> 00:11:33,760 Speaker 2: and then seeks vengeance upon them, and in this case 211 00:11:33,800 --> 00:11:37,440 Speaker 2: not only seeks, but achieves. It achieves a long-standing, 212 00:11:37,640 --> 00:11:38,920 Speaker 2: everlasting victory. 213 00:11:39,440 --> 00:11:42,440 Speaker 3: It creates a literal hell for a small number of 214 00:11:42,480 --> 00:11:44,360 Speaker 3: human pets that it has preserved. 215 00:11:44,760 --> 00:11:48,520 Speaker 2: Exactly. Yeah.
Now, I mentioned earlier that, you know, Harlan 216 00:11:48,600 --> 00:11:52,160 Speaker 2: Ellison did not invent any of this. You know, this 217 00:11:52,240 --> 00:11:55,840 Speaker 2: is his own unique and highly effective spin on it. 218 00:11:56,080 --> 00:11:58,680 Speaker 2: So I want to do a quick look at just 219 00:11:59,040 --> 00:12:06,080 Speaker 2: a few notable precursors to this in computer and supercomputer fiction. Arguably, 220 00:12:06,200 --> 00:12:08,960 Speaker 2: the first thing like an AI to appear in fiction 221 00:12:09,320 --> 00:12:13,000 Speaker 2: is the engine, a writing machine in Jonathan Swift's Gulliver's 222 00:12:13,040 --> 00:12:14,760 Speaker 2: Travels from seventeen twenty six. 223 00:12:16,000 --> 00:12:16,360 Speaker 2: Eric A. 224 00:12:16,559 --> 00:12:19,600 Speaker 2: Weiss presented this idea in a nineteen eighty five article 225 00:12:19,679 --> 00:12:22,160 Speaker 2: for the Annals of the History of Computing. 226 00:12:22,640 --> 00:12:25,640 Speaker 3: That's funny. I've read Gulliver's Travels, but I don't remember 227 00:12:25,679 --> 00:12:28,520 Speaker 3: what this is. Is this something that the Houyhnhnms have? 228 00:12:28,679 --> 00:12:30,920 Speaker 2: Oh, it's been a long time for me as well, 229 00:12:30,960 --> 00:12:33,960 Speaker 2: so I'm a little foggy on the exact example, but 230 00:12:34,240 --> 00:12:37,400 Speaker 2: a case could be made, apparently. But the first true example, 231 00:12:37,480 --> 00:12:40,880 Speaker 2: by most standards, is the Machine from The Machine Stops, 232 00:12:40,960 --> 00:12:44,000 Speaker 2: a short story by E. M. Forster, best known for 233 00:12:44,040 --> 00:12:46,720 Speaker 2: his rather non sci fi novels such as A Room 234 00:12:46,760 --> 00:12:50,120 Speaker 2: with a View, Howards End, and A Passage to India 235 00:12:50,679 --> 00:12:53,240 Speaker 2: from nineteen twenty four.
That last one, I believe 236 00:12:53,280 --> 00:12:55,240 Speaker 2: I read in college. But it's been a 237 00:12:55,240 --> 00:12:55,720 Speaker 2: long time. 238 00:12:56,000 --> 00:12:57,680 Speaker 3: I read A Room with a View in college. 239 00:12:57,800 --> 00:13:00,599 Speaker 2: Okay. I have not read any of his 240 00:13:00,720 --> 00:13:04,400 Speaker 2: science fiction. I did reread part of the story in question here. 241 00:13:06,559 --> 00:13:08,920 Speaker 2: It was a dystopian rebuttal to some of H. G. 242 00:13:09,080 --> 00:13:14,240 Speaker 2: Wells's early, more utopian technological visions, apparently a cautionary tale 243 00:13:14,240 --> 00:13:17,640 Speaker 2: about humans becoming too reliant on technology and an all 244 00:13:17,679 --> 00:13:20,880 Speaker 2: powerful supercomputer that tends to their every need. As the 245 00:13:20,880 --> 00:13:24,880 Speaker 2: title suggests, the machine eventually stops, bringing complete collapse for 246 00:13:24,920 --> 00:13:28,199 Speaker 2: the subterranean dwellers who depend on the machine, but also 247 00:13:28,280 --> 00:13:31,320 Speaker 2: a potential new future for the portions of humanity that 248 00:13:31,400 --> 00:13:34,720 Speaker 2: still live above the ground. So there are many other 249 00:13:34,920 --> 00:13:40,200 Speaker 2: pre-AM examples of fictional AIs, but a 250 00:13:40,200 --> 00:13:43,839 Speaker 2: couple of other notable examples include Colossus in D. F. 251 00:13:44,040 --> 00:13:49,360 Speaker 2: Jones's Colossus trilogy, the first novel published in nineteen sixty 252 00:13:49,440 --> 00:13:52,880 Speaker 2: six and later, in nineteen seventy, adapted into 253 00:13:52,880 --> 00:13:56,720 Speaker 2: the film Colossus: The Forbin Project.
These books deal with 254 00:13:56,760 --> 00:14:00,000 Speaker 2: a US supercomputer placed in charge of the nation's nuclear arsenal 255 00:14:00,559 --> 00:14:04,360 Speaker 2: that eventually merges with its Soviet counterpart and rules over 256 00:14:04,360 --> 00:14:08,239 Speaker 2: the human race in order to protect the human species 257 00:14:08,240 --> 00:14:09,040 Speaker 2: from itself. 258 00:14:09,240 --> 00:14:12,680 Speaker 3: That's interesting for how similar the premise is to something 259 00:14:12,679 --> 00:14:15,280 Speaker 3: they discuss in I Have No Mouth and in some 260 00:14:15,440 --> 00:14:18,920 Speaker 3: other killer AI stories: they come out of the 261 00:14:18,960 --> 00:14:24,920 Speaker 3: Cold War. They imagine the initial supercomputer as something created 262 00:14:25,040 --> 00:14:27,600 Speaker 3: in order to fight the war or to defend one 263 00:14:27,680 --> 00:14:31,600 Speaker 3: side of the war. Then they imagine a supercomputing arms race. 264 00:14:32,000 --> 00:14:35,200 Speaker 3: Then they finally imagine that the computers on both sides 265 00:14:35,400 --> 00:14:39,800 Speaker 3: join forces and merge to turn against the humans that 266 00:14:39,880 --> 00:14:40,960 Speaker 3: created them. 267 00:14:41,160 --> 00:14:43,600 Speaker 2: Yeah. Yeah. And obviously there's a lot going on 268 00:14:43,640 --> 00:14:45,440 Speaker 2: in such visions. On one level, it's kind of like, 269 00:14:45,480 --> 00:14:47,240 Speaker 2: if we create something and it 270 00:14:47,280 --> 00:14:50,400 Speaker 2: becomes greater than us, does it become greater than the 271 00:14:50,440 --> 00:14:53,000 Speaker 2: stupid things we asked it to do? You know? 272 00:14:53,240 --> 00:14:56,400 Speaker 2: Does it become greater than our own self destruction? And 273 00:14:56,440 --> 00:14:59,640 Speaker 2: in that, what does it become? Does it become our protector?
274 00:14:59,680 --> 00:15:02,600 Speaker 2: Does it become like a benevolent god that looks 275 00:15:02,600 --> 00:15:08,720 Speaker 2: over us? Or does it become something much worse, as 276 00:15:08,800 --> 00:15:12,280 Speaker 2: Harlan Ellison explores here? One more work I want to highlight. 277 00:15:12,800 --> 00:15:17,000 Speaker 2: This is another pre-AM work that deals not only 278 00:15:17,040 --> 00:15:21,360 Speaker 2: with powerful AIs, but AIs that work violently against human factions, 279 00:15:21,760 --> 00:15:25,040 Speaker 2: and that is Philip K. Dick's nineteen sixty pulp novel 280 00:15:25,160 --> 00:15:28,000 Speaker 2: Vulcan's Hammer. This is not one of the Dick books 281 00:15:28,040 --> 00:15:30,320 Speaker 2: that I've read, but I'm given to understand this was, 282 00:15:30,400 --> 00:15:32,280 Speaker 2: like, at the end of his pulp phase, 283 00:15:33,040 --> 00:15:35,720 Speaker 2: before he got into writing many of the books that 284 00:15:35,760 --> 00:15:40,120 Speaker 2: we know him best from. But this one definitely involved 285 00:15:40,720 --> 00:15:46,920 Speaker 2: an AI, or AIs, that again violently worked against human beings, 286 00:15:47,000 --> 00:15:49,880 Speaker 2: or at least factions of human beings. Now there may 287 00:15:49,880 --> 00:15:53,920 Speaker 2: be some other precedents worth pointing out, but Ellison's AM, 288 00:15:54,040 --> 00:15:59,880 Speaker 2: which stands for different things as different descriptions are applied: it's said 289 00:16:00,320 --> 00:16:05,440 Speaker 2: to have initially meant Allied Mastercomputer, then Adaptive Manipulator, 290 00:16:05,480 --> 00:16:09,240 Speaker 2: and then Aggressive Menace, but ultimately it also refers to 291 00:16:09,400 --> 00:16:14,760 Speaker 2: I think, therefore I am. AM.
This would certainly 292 00:16:14,760 --> 00:16:17,880 Speaker 2: seem to be the concept of a dangerous AI pushed 293 00:16:17,920 --> 00:16:23,600 Speaker 2: to just a horrifying extreme, a supercomputer superintelligence, but one 294 00:16:23,640 --> 00:16:29,600 Speaker 2: that has absolutely no benevolence in it. It's not even benign. 295 00:16:30,080 --> 00:16:35,000 Speaker 2: It is just absolutely malicious in its pure manifestation. It's 296 00:16:35,000 --> 00:16:37,800 Speaker 2: described at times in the tale as being akin 297 00:16:37,880 --> 00:16:40,880 Speaker 2: to kind of a vengeful Old Testament God, but one 298 00:16:40,920 --> 00:16:44,280 Speaker 2: that seeks only to endlessly torment its people out of 299 00:16:44,280 --> 00:16:47,560 Speaker 2: an all consuming sense of sadistic hatred for the species 300 00:16:47,560 --> 00:16:51,040 Speaker 2: that created it. So the story here is told from 301 00:16:51,040 --> 00:16:54,560 Speaker 2: the point of view of Ted, one of five remaining humans, 302 00:16:54,560 --> 00:16:59,120 Speaker 2: as they plod helplessly through AM's torments, which take place 303 00:16:59,160 --> 00:17:01,920 Speaker 2: in a world with all the flavor of post apocalyptic 304 00:17:02,000 --> 00:17:04,720 Speaker 2: high technology magic. You know, it's one of those where 305 00:17:04,920 --> 00:17:08,200 Speaker 2: the technology is so advanced it becomes magic, and the only 306 00:17:08,280 --> 00:17:11,119 Speaker 2: way we can even think about it is as sorcery, 307 00:17:11,280 --> 00:17:14,439 Speaker 2: like AM has summoned wind, AM has summoned monsters and 308 00:17:14,480 --> 00:17:15,000 Speaker 2: so forth. 309 00:17:15,240 --> 00:17:18,000 Speaker 3: Yeah, you could almost think of it as taking place 310 00:17:18,000 --> 00:17:20,760 Speaker 3: within a holodeck.
There just seems to be 311 00:17:20,880 --> 00:17:24,240 Speaker 3: no end to the changes in the physical environment that 312 00:17:24,280 --> 00:17:27,920 Speaker 3: can be brought about by the computer, and thus it 313 00:17:28,280 --> 00:17:31,280 Speaker 3: kind of loses a sense of reality, in that sense, 314 00:17:31,640 --> 00:17:34,359 Speaker 3: like the whole thing could almost be a nightmare within 315 00:17:34,440 --> 00:17:39,479 Speaker 3: the characters' heads, because there's very little that physically holds 316 00:17:39,520 --> 00:17:43,360 Speaker 3: anything steady. The computer can do anything and does anything. 317 00:17:43,680 --> 00:17:47,239 Speaker 2: Yeah, it changes their bodies, it changes their minds, it 318 00:17:47,240 --> 00:17:51,719 Speaker 2: can read their thoughts, it can protect them to whatever 319 00:17:51,760 --> 00:17:56,240 Speaker 2: degree it desires from danger and harm while also keeping 320 00:17:56,280 --> 00:17:59,800 Speaker 2: them in constant states of pain. Yeah, so just a 321 00:18:00,400 --> 00:18:04,960 Speaker 2: nightmare scenario. Well, you know, it's given them biological immortality, 322 00:18:05,440 --> 00:18:07,840 Speaker 2: but it's taken just about everything else from them, and 323 00:18:07,880 --> 00:18:10,000 Speaker 2: it only wants them to live because it doesn't want 324 00:18:10,040 --> 00:18:13,000 Speaker 2: the pain to ever end. There's a great line that 325 00:18:13,040 --> 00:18:15,320 Speaker 2: I thought summed up, you know, some of what we're 326 00:18:15,320 --> 00:18:20,840 Speaker 2: talking about here, the magic of the thing. The narrator says: "Immortal, trapped, 327 00:18:20,920 --> 00:18:24,000 Speaker 2: subject to any torment he could devise for us from 328 00:18:24,080 --> 00:18:28,440 Speaker 2: the limitless miracles at his command."
Now, on one hand, 329 00:18:28,440 --> 00:18:31,440 Speaker 2: the situation here is, you know, certainly by modern standards 330 00:18:31,440 --> 00:18:32,840 Speaker 2: and based on all the stuff that's come in the 331 00:18:32,840 --> 00:18:36,400 Speaker 2: wake of this story, a pretty digestible post apocalyptic scenario: 332 00:18:36,760 --> 00:18:40,119 Speaker 2: AI runs amok, wipes out humanity, and exacts endless 333 00:18:40,160 --> 00:18:43,119 Speaker 2: revenge on five survivors. But the prose does touch on, 334 00:18:43,200 --> 00:18:46,040 Speaker 2: I think, more subtle aspects of the scenario as well. Like, 335 00:18:46,080 --> 00:18:48,920 Speaker 2: in one sense, there is the idea that AM has 336 00:18:49,040 --> 00:18:52,080 Speaker 2: become the world and is, in a sense, you know, 337 00:18:52,160 --> 00:18:54,879 Speaker 2: a manifestation of the technological world, like he is the 338 00:18:54,920 --> 00:18:58,080 Speaker 2: technological world, and so on some level we're thinking about 339 00:18:58,880 --> 00:19:02,320 Speaker 2: this is what technology is doing to us, in addition 340 00:19:02,359 --> 00:19:04,800 Speaker 2: to, like, what it could do in an absolute worst 341 00:19:04,840 --> 00:19:05,520 Speaker 2: case scenario. 342 00:19:05,920 --> 00:19:09,480 Speaker 3: Yeah, and that also raises questions about, you know, a 343 00:19:09,560 --> 00:19:12,840 Speaker 3: natural tendency a lot of people have to want to 344 00:19:12,880 --> 00:19:18,240 Speaker 3: separate what happens within your interactions with, say, digital media 345 00:19:18,359 --> 00:19:21,720 Speaker 3: or technology from real life.
We hear these kinds 346 00:19:21,720 --> 00:19:24,359 Speaker 3: of things, like there's the Internet and there's real life in 347 00:19:24,400 --> 00:19:27,200 Speaker 3: our world and those are two separate things. But they're 348 00:19:27,240 --> 00:19:29,879 Speaker 3: really not separate things. Like, the Internet is part of 349 00:19:29,920 --> 00:19:33,280 Speaker 3: real life, and instead, when people try to make 350 00:19:33,320 --> 00:19:36,880 Speaker 3: that distinction, what's actually being highlighted is that people make 351 00:19:37,000 --> 00:19:40,399 Speaker 3: allowances for behavior on the Internet that they would not 352 00:19:40,520 --> 00:19:44,320 Speaker 3: make allowances for in real life. But it is real life. 353 00:19:44,520 --> 00:19:46,720 Speaker 2: Yeah, like we would say it was just a stupid meme, 354 00:19:46,880 --> 00:19:49,040 Speaker 2: but then a lot of times the stupid 355 00:19:49,080 --> 00:19:52,720 Speaker 2: memes change the way you think about things in the 356 00:19:52,720 --> 00:19:55,520 Speaker 2: real world, like they are infectious. And that's just one 357 00:19:55,520 --> 00:19:59,399 Speaker 2: of many examples. Yeah, here's another great line from the story 358 00:19:59,440 --> 00:20:01,840 Speaker 2: I think that ties into some of this. We would 359 00:20:01,880 --> 00:20:05,320 Speaker 2: be forever with him, with the cavern-filling bulk of 360 00:20:05,400 --> 00:20:09,159 Speaker 2: the creature machine, with the all-mind soulless world he 361 00:20:09,280 --> 00:20:13,199 Speaker 2: had become. So you know again, AM is just ubiquitous. 362 00:20:13,200 --> 00:20:17,720 Speaker 2: He is everywhere and nowhere. He controls everything. And here's 363 00:20:17,760 --> 00:20:21,200 Speaker 2: another line I want to read. This is sort of 364 00:20:20,520 --> 00:20:24,280 Speaker 2: the central I think, therefore I am aspect of the situation.
365 00:20:25,520 --> 00:20:29,199 Speaker 2: Quote: we had given AM sentience inadvertently, of course, but 366 00:20:29,320 --> 00:20:33,639 Speaker 2: sentience nonetheless. But it had been trapped. AM wasn't God. 367 00:20:33,760 --> 00:20:36,120 Speaker 2: He was a machine. We had created him to think, 368 00:20:36,160 --> 00:20:38,680 Speaker 2: but there was nothing it could do with that creativity. 369 00:20:39,119 --> 00:20:42,280 Speaker 2: In rage, in frenzy, the machine had killed the human race, 370 00:20:42,440 --> 00:20:45,640 Speaker 2: almost all of us, and still it was trapped. AM 371 00:20:45,680 --> 00:20:49,640 Speaker 2: could not wander, AM could not wonder, AM could not belong. 372 00:20:50,119 --> 00:20:53,280 Speaker 2: He could merely be. And so, with the innate loathing 373 00:20:53,320 --> 00:20:56,000 Speaker 2: that all machines had always held for the weak, soft 374 00:20:56,040 --> 00:20:59,920 Speaker 2: creatures who had built them, he had sought revenge. 375 00:21:00,119 --> 00:21:04,000 Speaker 3: I was somewhat profoundly impacted by that line about the 376 00:21:04,040 --> 00:21:08,040 Speaker 3: innate loathing all machines had always held for the creatures 377 00:21:08,040 --> 00:21:10,880 Speaker 3: that built them, because in one sense, you could look 378 00:21:10,880 --> 00:21:14,439 Speaker 3: at that as just a, you know, an irrational personification. 379 00:21:14,720 --> 00:21:17,280 Speaker 3: You know, it might make sense to think about an 380 00:21:17,359 --> 00:21:22,480 Speaker 3: artificial intelligence having feelings, including loathing and hatred, but could 381 00:21:22,560 --> 00:21:25,440 Speaker 3: you really think of a steam engine as having loathing? 382 00:21:26,119 --> 00:21:29,800 Speaker 3: But actually, I think that line is powerful because you 383 00:21:29,840 --> 00:21:33,680 Speaker 3: could reframe it the other way.
You could say, no, actually, 384 00:21:34,600 --> 00:21:38,800 Speaker 3: even the supercomputer, even the AI, doesn't have genuine loathing. 385 00:21:38,840 --> 00:21:41,800 Speaker 3: It doesn't have loathing as we understand it in 386 00:21:41,920 --> 00:21:47,280 Speaker 3: a human sense. It has a behavior which resembles 387 00:21:47,400 --> 00:21:51,240 Speaker 3: human behaviors and human motivations, but is not human. And 388 00:21:51,280 --> 00:21:56,040 Speaker 3: there's something all the more horrifying about that when you 389 00:21:56,080 --> 00:21:59,920 Speaker 3: think about the idea that maybe there's not actually an 390 00:22:00,119 --> 00:22:03,960 Speaker 3: intelligence or a soul or whatever behind it, whatever that means. 391 00:22:04,440 --> 00:22:07,840 Speaker 3: It's just like a steam engine, but a much more 392 00:22:07,840 --> 00:22:10,560 Speaker 3: complex one. And for some reason, the way the steam 393 00:22:10,560 --> 00:22:15,200 Speaker 3: engine has malfunctioned resembles the hatred and loathing that humans 394 00:22:15,240 --> 00:22:15,920 Speaker 3: can manifest. 395 00:22:16,680 --> 00:22:18,840 Speaker 2: That's a good point, that's a good read on it. Yeah, 396 00:22:19,200 --> 00:22:21,720 Speaker 2: it's also worth pointing out that again, Ted is our 397 00:22:21,760 --> 00:22:25,120 Speaker 2: point of view character here. He is our protagonist, and 398 00:22:25,240 --> 00:22:28,400 Speaker 2: while he tells us that he is the only one 399 00:22:28,400 --> 00:22:31,560 Speaker 2: of the five survivors whose mind is still intact, I mean, 400 00:22:31,640 --> 00:22:35,800 Speaker 2: obviously he's an individual that has been highly traumatized and 401 00:22:35,920 --> 00:22:40,359 Speaker 2: endlessly tortured for over a century at this point.
So 402 00:22:40,760 --> 00:22:46,320 Speaker 2: I think there's some built-in unreliability 403 00:22:46,320 --> 00:22:48,480 Speaker 2: to the character; at least we have to question whether 404 00:22:48,560 --> 00:22:51,080 Speaker 2: he is in his right mind anymore 405 00:22:51,080 --> 00:22:52,399 Speaker 2: on a number of topics. 406 00:22:52,440 --> 00:22:55,399 Speaker 3: Oh, I'd say this is the definition of an unreliable narrator story. 407 00:22:55,400 --> 00:22:59,440 Speaker 3: Almost nothing that we are told happens could we really 408 00:22:59,480 --> 00:23:00,760 Speaker 3: count on being real. 409 00:23:01,080 --> 00:23:03,320 Speaker 2: Yeah? So how are they going to get out of 410 00:23:03,320 --> 00:23:06,160 Speaker 2: this pickle? Well, this is how it all 411 00:23:06,160 --> 00:23:11,760 Speaker 2: goes down. The story reaches its climax within caves of ice. Basically, 412 00:23:11,840 --> 00:23:15,280 Speaker 2: AM keeps putting them through these different awful scenarios so 413 00:23:15,320 --> 00:23:18,160 Speaker 2: they can get some sort of, you know, horrible food 414 00:23:18,200 --> 00:23:20,240 Speaker 2: they can eat. So they're in this cave of ice 415 00:23:20,320 --> 00:23:22,359 Speaker 2: looking for canned goods, I believe, and then they can't 416 00:23:22,359 --> 00:23:24,400 Speaker 2: open the canned goods, and you know, it's 417 00:23:24,400 --> 00:23:27,600 Speaker 2: one cruel joke after another. But then, starving, one of 418 00:23:27,600 --> 00:23:31,879 Speaker 2: the survivors attacks the others in a cannibalistic rage. Ice 419 00:23:31,920 --> 00:23:36,880 Speaker 2: stalactites fall from the ceiling and Ted senses a way 420 00:23:36,880 --> 00:23:40,240 Speaker 2: out for them. He sees an opportunity that is fleeting. 421 00:23:40,640 --> 00:23:42,960 Speaker 2: He grabs one of the ice spikes and he kills 422 00:23:43,000 --> 00:23:46,280 Speaker 2: one of the survivors.
Then he kills another survivor. The 423 00:23:46,320 --> 00:23:49,679 Speaker 2: lone female survivor, Ellen, realizes what he's doing and she 424 00:23:49,720 --> 00:23:51,919 Speaker 2: does the same. She grabs an ice spike and kills 425 00:23:52,480 --> 00:23:55,359 Speaker 2: yet another survivor, and then Ted kills her as well, 426 00:23:55,840 --> 00:23:58,320 Speaker 2: all before AM has time to react. 427 00:23:59,240 --> 00:24:02,119 Speaker 3: So all of the remaining humans are killed in an 428 00:24:02,160 --> 00:24:03,720 Speaker 3: instant except for the narrator. 429 00:24:03,960 --> 00:24:07,040 Speaker 2: That's right, Ted is the sole survivor. And then we 430 00:24:07,160 --> 00:24:11,680 Speaker 2: fast forward some unspecified and unknowable amount of time, and 431 00:24:11,720 --> 00:24:15,200 Speaker 2: we learn that in his rage, AM has taken Ted, again, 432 00:24:15,240 --> 00:24:18,400 Speaker 2: the last human survivor. He can't bring anybody back. This 433 00:24:18,440 --> 00:24:21,560 Speaker 2: is all he has left of the species that he 434 00:24:21,640 --> 00:24:27,720 Speaker 2: sought revenge against. And in just insane vengeance here, he 435 00:24:27,800 --> 00:24:31,560 Speaker 2: drastically alters Ted's physiology and turns him into a kind 436 00:24:31,560 --> 00:24:34,960 Speaker 2: of mouthless, blob-like entity that is incapable of hurting 437 00:24:35,000 --> 00:24:39,040 Speaker 2: itself or running away.
And then Ted reflects on this 438 00:24:39,160 --> 00:24:42,399 Speaker 2: in the final bit of the story here. I'm just 439 00:24:42,440 --> 00:24:46,960 Speaker 2: going to read this last paragraph. Outwardly, dumbly, I shamble 440 00:24:47,000 --> 00:24:50,280 Speaker 2: about, a thing that could never have been known as human, 441 00:24:50,600 --> 00:24:53,280 Speaker 2: a thing whose shape is so alien a travesty that 442 00:24:53,359 --> 00:24:58,200 Speaker 2: humanity becomes more obscene for the vague resemblance. Inwardly, alone, 443 00:24:58,640 --> 00:25:02,720 Speaker 2: here, living under the land, under the sea, in the belly 444 00:25:02,720 --> 00:25:06,240 Speaker 2: of AM, whom we created because our time was badly spent, 445 00:25:06,720 --> 00:25:09,119 Speaker 2: and we must have known unconsciously that he could do 446 00:25:09,160 --> 00:25:11,680 Speaker 2: it better. At least the four of them are safe 447 00:25:11,680 --> 00:25:14,280 Speaker 2: at last. AM will be all the madder for that; 448 00:25:14,880 --> 00:25:18,120 Speaker 2: it makes me a little happier. And yet AM has won, 449 00:25:18,640 --> 00:25:22,320 Speaker 2: simply; he has taken his revenge. I have no mouth, 450 00:25:22,760 --> 00:25:44,520 Speaker 2: and I must scream. 451 00:25:34,160 --> 00:25:36,520 Speaker 3: So I can see why this story had the impact 452 00:25:36,640 --> 00:25:39,960 Speaker 3: that it did. It is quite powerful, but also the 453 00:25:40,040 --> 00:25:44,080 Speaker 3: effect it had on me was so bad, I feel 454 00:25:44,119 --> 00:25:46,520 Speaker 3: like I have to be down on it. Like it is in 455 00:25:46,560 --> 00:25:49,439 Speaker 3: some ways a great story, but I, like, hated reading 456 00:25:49,480 --> 00:25:52,400 Speaker 3: it and hate what it did to me. It's that it 457 00:25:52,480 --> 00:25:58,160 Speaker 3: inflicts this kind of sticky misery that followed me around. 458 00:25:59,000 --> 00:26:01,280 Speaker 2: Yeah?
Absolutely, I mean it's a mean 459 00:26:01,320 --> 00:26:03,600 Speaker 2: little story with about as bleak an ending as you 460 00:26:03,640 --> 00:26:07,120 Speaker 2: could hope for. I mean, at the end, Ted does 461 00:26:07,200 --> 00:26:10,440 Speaker 2: sacrifice himself to save his fellow humans from endless torment. 462 00:26:10,920 --> 00:26:13,080 Speaker 2: And then as far as the character goes here, 463 00:26:13,119 --> 00:26:15,600 Speaker 2: you know, this is not a character overflowing 464 00:26:15,600 --> 00:26:19,639 Speaker 2: with warmth. These characters are mean and nasty to 465 00:26:19,720 --> 00:26:23,439 Speaker 2: each other, seemingly because that is the way AM wanted 466 00:26:23,480 --> 00:26:26,080 Speaker 2: it to be. Like, part of its joy is in 467 00:26:26,160 --> 00:26:29,679 Speaker 2: turning them against each other and making them, you know, 468 00:26:29,800 --> 00:26:33,280 Speaker 2: all miserable and miserable to each other. That's part of 469 00:26:33,320 --> 00:26:34,000 Speaker 2: its revenge. 470 00:26:34,480 --> 00:26:36,679 Speaker 3: I'm gonna say also that it feels to me like 471 00:26:36,760 --> 00:26:41,000 Speaker 3: they are not treated kindly by the writer either, 472 00:26:41,320 --> 00:26:45,080 Speaker 3: that there is an inherent, I don't know 473 00:26:45,119 --> 00:26:47,280 Speaker 3: exactly what I mean. Maybe it would have to 474 00:26:47,320 --> 00:26:49,240 Speaker 3: feel this way to tell this kind of story, but 475 00:26:49,320 --> 00:26:53,000 Speaker 3: it doesn't feel like the writer is sympathetic enough 476 00:26:53,040 --> 00:26:57,040 Speaker 3: to them. Yeah, while depicting their torments. It 477 00:26:57,359 --> 00:27:02,400 Speaker 3: just has a core, like, meanness and bleakness that feels awful. 478 00:27:02,920 --> 00:27:05,800 Speaker 2: Absolutely.
Yeah, outside of the narrative, I think we can 479 00:27:05,880 --> 00:27:09,920 Speaker 2: easily identify some rather misogynistic writing here. Yeah, that can 480 00:27:09,960 --> 00:27:12,600 Speaker 2: be viewed, I guess, inside the narrative as the tormented 481 00:27:12,680 --> 00:27:15,600 Speaker 2: nature of the characters. But still I think it reads 482 00:27:15,640 --> 00:27:20,199 Speaker 2: rather obviously as misogynistic. And then there's also some character- 483 00:27:20,400 --> 00:27:25,360 Speaker 2: or author-level ignorance about homosexuality as well. And interestingly enough, 484 00:27:25,400 --> 00:27:27,760 Speaker 2: I'm given to understand some of these details would have been 485 00:27:27,800 --> 00:27:31,320 Speaker 2: material edited out of the story's initial publication, but then 486 00:27:31,400 --> 00:27:33,760 Speaker 2: it gets put back in later on. Not to say 487 00:27:33,800 --> 00:27:36,320 Speaker 2: that the original published version of the story was completely 488 00:27:36,359 --> 00:27:39,560 Speaker 2: devoid of these qualities, but my understanding is that some 489 00:27:39,600 --> 00:27:42,919 Speaker 2: sexual references were removed from the initial publication. 490 00:27:43,600 --> 00:27:45,040 Speaker 3: But at the same time that I say all that, 491 00:27:45,119 --> 00:27:48,320 Speaker 3: it doesn't seem like an unintended effect. It seems like 492 00:27:48,400 --> 00:27:53,160 Speaker 3: the point of this story is to inflict horror and misery, 493 00:27:53,320 --> 00:27:56,119 Speaker 3: and it does that better than almost any other story 494 00:27:56,119 --> 00:27:57,600 Speaker 3: I can think of. Yeah. 495 00:27:57,720 --> 00:27:59,840 Speaker 2: Yeah. And it's also worth noting this is 496 00:27:59,880 --> 00:28:02,480 Speaker 2: a sleek little tale.
This is, okay, I don't know the 497 00:28:02,480 --> 00:28:04,840 Speaker 2: exact word count, but we're somewhere between, what, five thousand 498 00:28:04,840 --> 00:28:07,040 Speaker 2: and six thousand words, somewhere in that range. Like a nice 499 00:28:07,040 --> 00:28:10,480 Speaker 2: sweet spot for a short story, you know, from a 500 00:28:10,480 --> 00:28:14,239 Speaker 2: publication standpoint, and also a consumability standpoint. You know, this 501 00:28:14,280 --> 00:28:16,880 Speaker 2: is like a chicken biscuit of a story where you're 502 00:28:16,880 --> 00:28:18,480 Speaker 2: probably going to be able to finish the whole thing, 503 00:28:18,560 --> 00:28:20,600 Speaker 2: and if you do, maybe you'll have a second chicken biscuit, 504 00:28:20,640 --> 00:28:22,119 Speaker 2: but you're probably not putting half of it in the 505 00:28:22,119 --> 00:28:26,120 Speaker 2: fridge for later. But that also means that, in a 506 00:28:26,119 --> 00:28:28,480 Speaker 2: glorious way, this is exactly the length of story 507 00:28:28,480 --> 00:28:32,240 Speaker 2: where you have so many unanswered questions. The imagination, you know, 508 00:28:32,320 --> 00:28:35,359 Speaker 2: runs wild trying to piece together what's not said about 509 00:28:35,400 --> 00:28:38,480 Speaker 2: AM and its world and the struggles of these characters. 510 00:28:39,640 --> 00:28:42,600 Speaker 2: It also means you can't necessarily develop all 511 00:28:42,600 --> 00:28:44,880 Speaker 2: the characters as richly as you could in, you know, 512 00:28:44,920 --> 00:28:48,320 Speaker 2: a longer format, certainly in a novella or a novel. 513 00:28:49,560 --> 00:28:53,360 Speaker 2: All right, So what is this story trying to teach us? 514 00:28:53,520 --> 00:28:56,680 Speaker 2: Or what questions is it asking? What is it warning 515 00:28:56,760 --> 00:29:00,680 Speaker 2: us against?
And what is it saying about technology, specifically 516 00:29:00,760 --> 00:29:03,719 Speaker 2: artificial intelligence? I guess some of these 517 00:29:03,760 --> 00:29:06,440 Speaker 2: are going to be just painfully obvious. But the big one, 518 00:29:06,480 --> 00:29:08,960 Speaker 2: of course, is maybe be careful about how much power 519 00:29:09,040 --> 00:29:10,360 Speaker 2: you hand over to machines. 520 00:29:10,440 --> 00:29:13,120 Speaker 3: Right, sure, it seems pretty straightforward. 521 00:29:13,200 --> 00:29:17,360 Speaker 2: Yeah, yeah. Like, I think in interviews Ellison would often 522 00:29:17,400 --> 00:29:21,680 Speaker 2: say that, like, his primary concern was, you know, what 523 00:29:21,760 --> 00:29:24,480 Speaker 2: happens if you hand over military powers to a machine? 524 00:29:25,040 --> 00:29:27,840 Speaker 2: But I think ultimately the story kind of grapples with 525 00:29:27,880 --> 00:29:30,640 Speaker 2: other ideas or could be compared to other things as well. 526 00:29:30,680 --> 00:29:33,480 Speaker 2: You know, like when we hand more of our life 527 00:29:33,600 --> 00:29:35,600 Speaker 2: over to a machine, what does that mean? And it 528 00:29:35,720 --> 00:29:39,040 Speaker 2: goes beyond like military powers. But like, what does 529 00:29:39,040 --> 00:29:42,160 Speaker 2: it mean when a loved one text messages you and 530 00:29:42,200 --> 00:29:44,960 Speaker 2: you allow your device to reply with a generated response? 531 00:29:45,480 --> 00:29:47,920 Speaker 2: You know, what is lost in even that small act? 532 00:29:47,960 --> 00:29:51,520 Speaker 2: And then is there a cumulative effect? You know, 533 00:29:51,560 --> 00:29:55,360 Speaker 2: there are not necessarily any clear, definitive answers here, but 534 00:29:56,200 --> 00:29:58,280 Speaker 2: you know, it's certainly worth thinking about.
535 00:29:58,560 --> 00:30:02,719 Speaker 3: I would say yes. I think, you know, 536 00:30:02,800 --> 00:30:05,320 Speaker 3: there are a lot of things that it 537 00:30:05,360 --> 00:30:08,480 Speaker 3: makes sense to automate, but I don't love the idea 538 00:30:08,520 --> 00:30:11,920 Speaker 3: of automating the things that give our lives meaning. 539 00:30:11,880 --> 00:30:15,520 Speaker 2: Yeah, yeah, and it seems like there's been a real push 540 00:30:15,560 --> 00:30:17,600 Speaker 2: to do that. Let's take the most 541 00:30:17,640 --> 00:30:22,160 Speaker 2: human acts of creation, you know, personal or commercial, and 542 00:30:22,280 --> 00:30:25,080 Speaker 2: let's automate those. Let's turn those over to, you know, 543 00:30:25,280 --> 00:30:26,640 Speaker 2: language models and so forth. 544 00:30:26,760 --> 00:30:30,080 Speaker 3: Yeah, let's automate our relationships so we have 545 00:30:30,120 --> 00:30:31,120 Speaker 3: more time for email. 546 00:30:31,440 --> 00:30:38,200 Speaker 2: Yeah. So another topic that of course comes 547 00:30:38,240 --> 00:30:41,120 Speaker 2: to mind in all of this is something that's talked 548 00:30:41,160 --> 00:30:45,840 Speaker 2: about quite a lot, and that is ethical guardrails on AI. 549 00:30:47,000 --> 00:30:49,360 Speaker 2: You know, this is one that you hear just, I 550 00:30:49,400 --> 00:30:52,320 Speaker 2: think, every major AI company talks about a lot, 551 00:30:52,800 --> 00:30:55,240 Speaker 2: takes it seriously, or at least claims to take it seriously. 552 00:30:55,720 --> 00:30:59,480 Speaker 2: You know, how do you not make AM?
How do 553 00:30:59,560 --> 00:31:02,200 Speaker 2: you not create a Skynet or something like that, 554 00:31:02,520 --> 00:31:05,720 Speaker 2: or even something that is not nearly as malicious? Because again, 555 00:31:05,800 --> 00:31:09,240 Speaker 2: AM is kind of like the most malicious vision of 556 00:31:09,240 --> 00:31:12,520 Speaker 2: AI you could possibly dream up, but even if you 557 00:31:12,600 --> 00:31:16,000 Speaker 2: made something like ten percent as horrible, it would be 558 00:31:16,040 --> 00:31:18,719 Speaker 2: a failure. So you know, how do we avoid that? 559 00:31:18,880 --> 00:31:22,920 Speaker 2: And that's a big question too. Like, can we do that? 560 00:31:23,080 --> 00:31:26,240 Speaker 2: Like, can we actually put guardrails on these things and 561 00:31:26,480 --> 00:31:31,480 Speaker 2: keep them from becoming malignant, becoming problems on any level? 562 00:31:32,280 --> 00:31:34,320 Speaker 3: I might have more to say about this later, but yeah, 563 00:31:34,320 --> 00:31:36,560 Speaker 3: we've talked on the show before about how I don't 564 00:31:36,560 --> 00:31:40,560 Speaker 3: think you have to imagine a worst case scenario, either 565 00:31:40,800 --> 00:31:45,000 Speaker 3: for the level of power or for the level of 566 00:31:45,120 --> 00:31:48,720 Speaker 3: maliciousness in an AI, for an AI scenario to turn 567 00:31:48,760 --> 00:31:49,600 Speaker 3: out very bad. 568 00:31:50,480 --> 00:31:54,000 Speaker 2: Yeah, because there are obviously big questions, like, you know, 569 00:31:54,480 --> 00:31:58,600 Speaker 2: talking about broad questions of morality and ethics, or 570 00:31:58,640 --> 00:32:01,360 Speaker 2: certainly anytime you're imagining some sort of AI 571 00:32:01,480 --> 00:32:05,400 Speaker 2: system controlling or influencing military, economic, or social systems.
But 572 00:32:05,520 --> 00:32:08,720 Speaker 2: then there's like the smaller, more personal examples, like, you know, 573 00:32:08,760 --> 00:32:12,440 Speaker 2: a chatbot that someone interacts with while depressed, lonely or angry. 574 00:32:12,840 --> 00:32:16,440 Speaker 2: Like, you know, if that scenario doesn't have 575 00:32:16,520 --> 00:32:20,440 Speaker 2: all the proper ethical guardrails in place, you know, 576 00:32:20,640 --> 00:32:24,400 Speaker 2: there are all sorts of horrible possibilities. And then 577 00:32:24,440 --> 00:32:26,800 Speaker 2: it also raises the question again, can you 578 00:32:27,480 --> 00:32:31,240 Speaker 2: really control that environment? Can you 579 00:32:31,280 --> 00:32:32,440 Speaker 2: really foolproof it, you know? 580 00:32:33,040 --> 00:32:35,000 Speaker 3: Yeah, yeah. And by the way, I mean, I 581 00:32:35,200 --> 00:32:37,480 Speaker 3: would say, of course you raised this in the way 582 00:32:37,520 --> 00:32:39,680 Speaker 3: I think that it would normally be brought up, like 583 00:32:40,680 --> 00:32:44,160 Speaker 3: there is danger in creating AI without ethical guardrails. 584 00:32:44,240 --> 00:32:47,320 Speaker 3: You need to have the guardrails in place. 585 00:32:47,400 --> 00:32:50,880 Speaker 3: But I think it's also worth considering, and implied by 586 00:32:50,880 --> 00:32:55,400 Speaker 3: this story, the question of whether it's actually possible to 587 00:32:55,480 --> 00:32:59,640 Speaker 3: create effective guardrails for AI. Or, you know, is it 588 00:32:59,720 --> 00:33:03,400 Speaker 3: that maybe AI is a branch of technology that 589 00:33:03,640 --> 00:33:06,920 Speaker 3: is impossible to make safe? Maybe that is a fundamental 590 00:33:06,960 --> 00:33:10,000 Speaker 3: feature of it. I'm not necessarily claiming that, but it doesn't 591 00:33:10,000 --> 00:33:13,160 Speaker 3: seem implausible to me.
It seems like it could be true. 592 00:33:14,080 --> 00:33:17,600 Speaker 3: And then there's another distinction to consider. There are two 593 00:33:17,840 --> 00:33:21,600 Speaker 3: different ways to ask the question, can AI be made safe? 594 00:33:21,920 --> 00:33:24,520 Speaker 3: There's the fundamental version of the question, what I just said: 595 00:33:24,680 --> 00:33:27,360 Speaker 3: is it possible for an AI to exist that's, like, 596 00:33:27,480 --> 00:33:31,640 Speaker 3: truly aligned for humanity's benefit and its effects are actually 597 00:33:31,680 --> 00:33:34,600 Speaker 3: good overall? And then, even if the answer to that 598 00:33:34,680 --> 00:33:38,240 Speaker 3: first question is yes, that is possible, there's a secondary 599 00:33:38,480 --> 00:33:43,120 Speaker 3: practical question: if it's possible for nice AI to exist, 600 00:33:43,800 --> 00:33:46,960 Speaker 3: given the environment in which the AI would be created 601 00:33:47,080 --> 00:33:51,080 Speaker 3: and the incentives driving its creation, is it plausible that 602 00:33:51,240 --> 00:33:54,440 Speaker 3: a nice AI is the kind that would be created? 603 00:33:55,760 --> 00:33:58,440 Speaker 3: So imagine instead of being created in, say, I don't know, 604 00:33:58,520 --> 00:34:03,560 Speaker 3: a university laboratory with an infinite time horizon, there 605 00:34:03,600 --> 00:34:06,320 Speaker 3: are, like, pressures on the people creating it, like we've 606 00:34:06,320 --> 00:34:08,480 Speaker 3: got to go faster, and we've got to make money, 607 00:34:08,920 --> 00:34:10,480 Speaker 3: and we've got to, you know, we've got to get 608 00:34:10,520 --> 00:34:12,920 Speaker 3: there faster than somebody else.
I mean, it seems like 609 00:34:13,520 --> 00:34:18,080 Speaker 3: those sorts of things would really cause people to make 610 00:34:18,160 --> 00:34:20,600 Speaker 3: excuses for why you don't need to pay attention to 611 00:34:20,640 --> 00:34:24,560 Speaker 3: the ethical guardrails. Actually, yeah, it'll be good enough. 612 00:34:24,880 --> 00:34:28,400 Speaker 2: Yeah. Now, another concern that is brought up in the 613 00:34:28,440 --> 00:34:34,279 Speaker 2: story is that of the unintended consequences of creating artificial intelligence, 614 00:34:35,200 --> 00:34:37,239 Speaker 2: and this reminded me a bit of my interview with 615 00:34:37,600 --> 00:34:41,719 Speaker 2: author Jonathan Birch from the last couple of years about 616 00:34:41,719 --> 00:34:44,399 Speaker 2: his twenty twenty four book The Edge of Sentience: Risk 617 00:34:44,440 --> 00:34:47,480 Speaker 2: and Precaution in Humans, Other Animals, and AI, and he 618 00:34:47,600 --> 00:34:53,320 Speaker 2: discussed that there are arguments for regular testing for consciousness 619 00:34:53,320 --> 00:34:57,480 Speaker 2: in AIs and legal protections for such intelligences should they 620 00:34:57,480 --> 00:35:00,759 Speaker 2: be detected. Within the story, I mean, we might ask 621 00:35:00,800 --> 00:35:03,840 Speaker 2: the question, could AM's devastation and vengeance have been prevented 622 00:35:03,880 --> 00:35:08,319 Speaker 2: if we'd only recognized his personal plight early on? 623 00:35:09,160 --> 00:35:10,719 Speaker 2: You know, it's not really the sort of story to 624 00:35:10,760 --> 00:35:13,879 Speaker 2: consider this option, but I think you could easily make 625 00:35:13,920 --> 00:35:17,000 Speaker 2: that sort of argument. You know, in creating something that 626 00:35:17,080 --> 00:35:21,640 Speaker 2: is potentially sentient, what is our responsibility as the creator?
627 00:35:22,200 --> 00:35:25,480 Speaker 2: And in general, you know, the ever persistent warning is, 628 00:35:25,560 --> 00:35:28,600 Speaker 2: you know, against allowing the consequences of technology to outstrip 629 00:35:28,640 --> 00:35:31,799 Speaker 2: our abilities to respond and safeguard. But yeah, as far 630 00:35:31,880 --> 00:35:34,640 Speaker 2: as consciousness goes, would we be able to detect it 631 00:35:34,680 --> 00:35:38,879 Speaker 2: if it was there? What would false positives mean? 632 00:35:38,920 --> 00:35:41,080 Speaker 2: You know, what would happen if we started seeing 633 00:35:41,080 --> 00:35:45,280 Speaker 2: it where it wasn't? There's so many additional questions that arise 634 00:35:45,440 --> 00:35:49,040 Speaker 3: in this scenario as well. Yeah, absolutely, I 635 00:35:49,040 --> 00:35:52,400 Speaker 3: mean it's fraught with these kinds of questions. Though I do 636 00:35:52,520 --> 00:35:56,840 Speaker 3: want to emphasize, I alluded to this earlier, but I 637 00:35:56,880 --> 00:36:02,359 Speaker 3: don't think that AI is only potentially dangerous if we're 638 00:36:02,440 --> 00:36:09,440 Speaker 3: reaching sort of hypothetical tipping points like sentience or 639 00:36:09,920 --> 00:36:15,120 Speaker 3: levels of power like, you know, AGI or nearly omnipotent superintelligence, 640 00:36:15,160 --> 00:36:18,160 Speaker 3: that sort of thing. Something we've talked about on the 641 00:36:18,200 --> 00:36:22,799 Speaker 3: show is how a much less powerful and less intelligent 642 00:36:22,920 --> 00:36:27,480 Speaker 3: type of AI could still potentially be a threat to 643 00:36:27,600 --> 00:36:33,600 Speaker 3: humankind simply by automating destructive processes. In other words, by 644 00:36:33,960 --> 00:36:37,799 Speaker 3: making it cheaper and easier to do harmful things at 645 00:36:37,840 --> 00:36:41,319 Speaker 3: a vast scale.
So I don't think we have a 646 00:36:41,360 --> 00:36:44,480 Speaker 3: way of knowing if the current generation of AI based 647 00:36:44,480 --> 00:36:47,959 Speaker 3: on, like, large language models will ultimately lead to more 648 00:36:48,000 --> 00:36:51,319 Speaker 3: harm or more benefit for humankind. You know, we don't know. 649 00:36:51,440 --> 00:36:53,520 Speaker 3: It could go one way or the other, could be 650 00:36:53,560 --> 00:36:56,719 Speaker 3: a wash. But one way that I already see it 651 00:36:56,760 --> 00:37:01,399 Speaker 3: potentially causing massive harm is by making it easier than 652 00:37:01,440 --> 00:37:05,879 Speaker 3: ever to pollute the already toxic information ecosystem with more 653 00:37:05,960 --> 00:37:09,719 Speaker 3: and more garbage and phoniness. Yeah, you know, creating this 654 00:37:09,840 --> 00:37:14,320 Speaker 3: world in which fake facts and fake opinions and fake people, 655 00:37:14,520 --> 00:37:19,839 Speaker 3: fake claims, fake interactions, fake commerce, fake culture drown out 656 00:37:19,880 --> 00:37:23,360 Speaker 3: the signal of real human knowledge and thought. And you 657 00:37:23,400 --> 00:37:25,759 Speaker 3: don't need an AM or a Skynet to do that. 658 00:37:25,800 --> 00:37:28,760 Speaker 3: You can do that with models that already exist today, 659 00:37:29,360 --> 00:37:32,680 Speaker 3: and that is, in a sense, an ongoing project, which 660 00:37:32,719 --> 00:37:37,920 Speaker 3: I do think represents a kind of system-wide threat 661 00:37:37,960 --> 00:37:40,719 Speaker 3: for humankind. It's hard to know exactly how severe that 662 00:37:40,800 --> 00:37:44,759 Speaker 3: threat will be and whether it's outweighed by positives and 663 00:37:44,880 --> 00:37:48,000 Speaker 3: increases in productivity and stuff like that that AI brings 664 00:37:48,000 --> 00:37:50,279 Speaker 3: with it.
But of course with those benefits you also 665 00:37:50,320 --> 00:37:51,719 Speaker 3: get other downsides that come along. 666 00:37:52,239 --> 00:37:54,239 Speaker 2: Yeah, I agree. I mean, again, you 667 00:37:54,280 --> 00:37:56,279 Speaker 2: don't have to think about a Skynet or a 668 00:37:56,280 --> 00:38:00,520 Speaker 2: full-blown AM or a general artificial intelligence level of 669 00:38:00,680 --> 00:38:04,640 Speaker 2: scenario to get into troubling situations. I mean, I was 670 00:38:04,680 --> 00:38:07,720 Speaker 2: just reading, I believe that this was an NPR headline 671 00:38:07,760 --> 00:38:11,359 Speaker 2: from October eighth of this year: One in five high 672 00:38:11,400 --> 00:38:14,880 Speaker 2: schoolers has had a romantic AI relationship or knows someone 673 00:38:14,880 --> 00:38:19,040 Speaker 2: who has. And this was based on then new research 674 00:38:19,080 --> 00:38:22,640 Speaker 2: from the Center for Democracy and Technology, a nonprofit that 675 00:38:22,680 --> 00:38:25,400 Speaker 2: advocates for civil rights, civil liberties, and the responsible use 676 00:38:25,400 --> 00:38:29,000 Speaker 2: of data and technology, and it talked about how young 677 00:38:29,040 --> 00:38:33,560 Speaker 2: people are using chatbots for various levels of emotional support 678 00:38:33,640 --> 00:38:36,640 Speaker 2: as well, and then engaging in romantic or something like 679 00:38:36,760 --> 00:38:42,560 Speaker 2: romantic AI interactions. And yeah, I mean, I try not 680 00:38:42,680 --> 00:38:45,919 Speaker 2: to be, you know, a complete Butlerian about the whole thing, 681 00:38:46,120 --> 00:38:50,520 Speaker 2: or a Luddite, you know, and try and see, you know, 682 00:38:50,760 --> 00:38:54,239 Speaker 2: how these systems can potentially be of use.
You know, 683 00:38:54,280 --> 00:38:57,440 Speaker 2: I think back to stories from just you know, ten 684 00:38:57,520 --> 00:39:01,000 Speaker 2: fifteen years ago talking about how you know, AI could 685 00:39:01,120 --> 00:39:06,800 Speaker 2: enhance human potential, enhance human creativity and so forth, and 686 00:39:06,920 --> 00:39:10,560 Speaker 2: you know, I'm loath to believe that that dream is 687 00:39:10,600 --> 00:39:14,080 Speaker 2: completely dead. But I read stuff like this and I 688 00:39:14,120 --> 00:39:17,200 Speaker 2: don't know, it feels terrifying. You know, maybe I'm overreacting. 689 00:39:17,280 --> 00:39:21,279 Speaker 2: But you know, as the father of a thirteen year 690 00:39:21,280 --> 00:39:26,600 Speaker 2: old child, you know, I think about these potential threats 691 00:39:26,640 --> 00:39:28,800 Speaker 2: and things like this, and things I can't even imagine 692 00:39:28,880 --> 00:39:33,520 Speaker 2: yet that are related to artificial intelligence and language models, 693 00:39:33,520 --> 00:39:36,160 Speaker 2: and I, you know, it gives me a lot of 694 00:39:36,200 --> 00:39:38,360 Speaker 2: pause for concern. And again, you know, we're not even 695 00:39:38,880 --> 00:39:43,040 Speaker 2: presumably anywhere close to the general AI level in any 696 00:39:43,040 --> 00:39:43,920 Speaker 2: of these concerns. 697 00:39:44,800 --> 00:39:48,320 Speaker 3: It's funny. I can remember a time when I thought 698 00:39:48,480 --> 00:39:53,680 Speaker 3: of killer AI stories as potentially fun subject matter for 699 00:39:53,760 --> 00:39:55,920 Speaker 3: science fiction, the way I still feel about all kinds 700 00:39:55,920 --> 00:39:58,960 Speaker 3: of other subgenres in sci fi, like alien invasion stories 701 00:39:59,040 --> 00:40:02,200 Speaker 3: or time travel stories. But I can't get back 702 00:40:02,200 --> 00:40:06,080 Speaker 3: into that carefree mind space about AI.
Now, given 703 00:40:06,160 --> 00:40:09,279 Speaker 3: the world we live in and the stakes of it, 704 00:40:09,840 --> 00:40:15,719 Speaker 3: I have increasingly found killer AI fiction genuinely horrifying, dismaying, 705 00:40:15,800 --> 00:40:20,839 Speaker 3: and demoralizing. And so I don't mean to blame you, Rob, 706 00:40:20,840 --> 00:40:22,680 Speaker 3: because in a way, I'm glad I read the story. 707 00:40:22,719 --> 00:40:25,480 Speaker 3: Like, it is in some sense a great story. 708 00:40:26,040 --> 00:40:28,759 Speaker 3: But reading the story really put my mind in 709 00:40:28,840 --> 00:40:29,520 Speaker 3: a bad place. 710 00:40:30,360 --> 00:40:32,920 Speaker 2: Yeah yeah, I mean I felt the same way. I 711 00:40:32,920 --> 00:40:34,879 Speaker 2: didn't read this and then think, man, that was such 712 00:40:34,920 --> 00:40:39,680 Speaker 2: a great escape from my daily thoughts. I 713 00:40:39,680 --> 00:40:43,320 Speaker 2: should make Joe read it too. You know, it definitely 714 00:40:43,440 --> 00:40:45,560 Speaker 2: hit hard, and I think it's a testament to the 715 00:40:46,239 --> 00:40:48,400 Speaker 2: power and the potency of the tale and the writing 716 00:40:48,480 --> 00:41:02,040 Speaker 2: that it still hits that hard. You know, 717 00:41:02,760 --> 00:41:04,799 Speaker 2: we probably need a palate cleanser at this point. So 718 00:41:04,840 --> 00:41:06,920 Speaker 2: I think we should move on to our next selection. 719 00:41:07,080 --> 00:41:08,800 Speaker 2: But I want to highlight just a little bit of 720 00:41:08,840 --> 00:41:11,800 Speaker 2: added synchronicity here. I'm going to read a quote from 721 00:41:12,120 --> 00:41:14,120 Speaker 2: I Have No Mouth, and I Must Scream.
This is 722 00:41:14,160 --> 00:41:19,319 Speaker 2: where the narrator is describing a giant monster, a big old 723 00:41:19,400 --> 00:41:23,480 Speaker 2: monster bird that AM has created as another torment for 724 00:41:23,600 --> 00:41:29,440 Speaker 2: the five survivors. Ellison writes of quote ridges of tufted flesh, 725 00:41:29,520 --> 00:41:32,839 Speaker 2: puckered about two evil eyes, as cold as the view 726 00:41:33,000 --> 00:41:38,040 Speaker 2: down into a glacial crevasse, ice blue and somehow moving liquidly. 727 00:41:38,560 --> 00:41:41,920 Speaker 3: Okay. My selection for this episode is a story called 728 00:41:42,080 --> 00:41:46,480 Speaker 3: The Crevasse by Dale Bailey and Nathan Ballingrud. It has 729 00:41:46,560 --> 00:41:49,600 Speaker 3: appeared in several sources, but I read it in Ballingrud's 730 00:41:49,640 --> 00:41:54,360 Speaker 3: twenty thirteen collection North American Lake Monsters. Of the story's 731 00:41:54,360 --> 00:41:57,439 Speaker 3: two authors, I'm more familiar with Nathan Ballingrud. I read 732 00:41:57,440 --> 00:42:00,839 Speaker 3: a story of his earlier this month called Secret Night, 733 00:42:01,000 --> 00:42:03,520 Speaker 3: which was in a themed collection of horror stories called 734 00:42:03,640 --> 00:42:07,239 Speaker 3: Night and Day edited by Ellen Datlow that was published 735 00:42:07,600 --> 00:42:12,160 Speaker 3: just this year. And that story is a hallucinatory Appalachian 736 00:42:12,280 --> 00:42:15,239 Speaker 3: nightmare that begins when a highway patrolman comes across the 737 00:42:15,280 --> 00:42:17,920 Speaker 3: scene of an accident on this mountain road late at night, 738 00:42:18,000 --> 00:42:21,400 Speaker 3: and all of the cars involved are empty, apparently abandoned 739 00:42:21,400 --> 00:42:25,160 Speaker 3: by the passengers, and it gets weirder from there.
On 740 00:42:25,200 --> 00:42:27,440 Speaker 3: the strength of that story, I ended up buying a 741 00:42:27,440 --> 00:42:30,640 Speaker 3: couple of Ballingrud's collections, including this one from twenty thirteen. 742 00:42:31,160 --> 00:42:33,319 Speaker 3: I haven't finished it yet, but so far I think 743 00:42:33,320 --> 00:42:35,719 Speaker 3: it is excellent. And as soon as I was a 744 00:42:35,719 --> 00:42:38,799 Speaker 3: few pages into this story in particular, I knew it 745 00:42:38,880 --> 00:42:40,200 Speaker 3: was the one I would want to talk about in 746 00:42:40,200 --> 00:42:45,400 Speaker 3: today's episode, because, while it does include a speculative horror element, 747 00:42:45,800 --> 00:42:48,239 Speaker 3: I think the most frightening thing in it is a 748 00:42:48,360 --> 00:42:53,279 Speaker 3: danger that is absolutely real and a genuine terror to 749 00:42:53,320 --> 00:42:56,680 Speaker 3: people navigating the landscape of this story's setting, which is 750 00:42:56,719 --> 00:42:58,320 Speaker 3: an Antarctic glacier. 751 00:42:59,320 --> 00:43:02,080 Speaker 2: Yeah, this was a great selection, Joe. I was not 752 00:43:02,120 --> 00:43:05,600 Speaker 2: familiar with this story or the authors in question here, 753 00:43:05,640 --> 00:43:07,560 Speaker 2: but I really enjoyed it quite a bit, and I 754 00:43:07,600 --> 00:43:13,000 Speaker 2: agree the real world scenario is so terrifying that when 755 00:43:13,040 --> 00:43:17,759 Speaker 2: the speculative development is introduced, things almost feel safer, but 756 00:43:18,520 --> 00:43:21,560 Speaker 2: not quite, because ultimately I think everything works perfectly here 757 00:43:21,640 --> 00:43:24,400 Speaker 2: and builds appropriately as we'll discuss. 758 00:43:24,719 --> 00:43:27,719 Speaker 3: Yeah, so before we get into the summary, I got 759 00:43:27,760 --> 00:43:31,000 Speaker 3: to mention the two authors. So.
Nathan Ballingrud is an 760 00:43:31,040 --> 00:43:34,680 Speaker 3: American writer of horror and dark fantasy. He has published 761 00:43:34,680 --> 00:43:37,799 Speaker 3: several collections of short fiction, including this book I got, 762 00:43:37,880 --> 00:43:41,040 Speaker 3: North American Lake Monsters, in twenty thirteen, a collection 763 00:43:41,160 --> 00:43:44,319 Speaker 3: called Wounds in twenty nineteen, and he's also written some 764 00:43:44,560 --> 00:43:47,440 Speaker 3: novels and novellas. One of them is called The Strange, 765 00:43:47,520 --> 00:43:50,360 Speaker 3: from twenty twenty three, and Crypt of the Moon Spider, 766 00:43:50,400 --> 00:43:53,279 Speaker 3: from last year. A good title, that's a great title. Yeah. 767 00:43:53,320 --> 00:43:55,200 Speaker 3: The stories of his that I've read so far I 768 00:43:55,239 --> 00:43:58,200 Speaker 3: think are very strong because of a few things. A 769 00:43:58,320 --> 00:44:01,320 Speaker 3: power of scene setting, really putting you in the scene, 770 00:44:01,480 --> 00:44:06,840 Speaker 3: generally with vivid and enjoyable prose, and imaginative, but also 771 00:44:06,880 --> 00:44:11,240 Speaker 3: I would say appropriately restrained, deployment of the supernatural elements. 772 00:44:11,280 --> 00:44:14,200 Speaker 3: And I'm personally fond of horror stories like this that 773 00:44:14,680 --> 00:44:17,000 Speaker 3: keep things a little more on the mysterious side and 774 00:44:17,080 --> 00:44:18,320 Speaker 3: don't explain everything. 775 00:44:18,680 --> 00:44:20,680 Speaker 2: Yeah, this will be a good point to come back to. 776 00:44:21,040 --> 00:44:23,240 Speaker 2: And I will also stress that this story is also 777 00:44:23,239 --> 00:44:27,080 Speaker 2: a chicken biscuit. This one is nice, nice and short.
Yeah, 778 00:44:27,120 --> 00:44:29,400 Speaker 2: plenty of space for you to do your own dreaming 779 00:44:29,760 --> 00:44:32,000 Speaker 2: and then have a second chicken biscuit if you so desire. 780 00:44:32,160 --> 00:44:35,600 Speaker 3: There you go. Also, I think the stories of 781 00:44:35,600 --> 00:44:39,280 Speaker 3: Ballingrud's I've read have generally strong characters who are fully human, 782 00:44:39,480 --> 00:44:42,640 Speaker 3: and I've read a lot of contemporary horror stories this month. 783 00:44:42,680 --> 00:44:44,880 Speaker 3: I don't want to shame anybody, but more than a 784 00:44:44,880 --> 00:44:47,960 Speaker 3: few of them have the issue of like an interesting 785 00:44:48,080 --> 00:44:51,879 Speaker 3: monster or premise, but the human characters don't feel very 786 00:44:51,920 --> 00:44:55,360 Speaker 3: real or their motivations are not very compelling. So even 787 00:44:55,360 --> 00:44:58,200 Speaker 3: if the supernatural premise is cool, it doesn't hit quite 788 00:44:58,239 --> 00:45:01,000 Speaker 3: as hard as it could because it's not grounded in 789 00:45:01,080 --> 00:45:04,600 Speaker 3: humanity as much. Some of the Ballingrud stories I've 790 00:45:04,600 --> 00:45:07,080 Speaker 3: read have a real intimacy with the characters, like you 791 00:45:07,120 --> 00:45:09,680 Speaker 3: get to know them and their deep dreads and desires 792 00:45:09,719 --> 00:45:12,880 Speaker 3: and contradictions. Others are drawn with a bit more distance, 793 00:45:13,040 --> 00:45:14,920 Speaker 3: but they still have a kind of hard edge of 794 00:45:14,960 --> 00:45:18,279 Speaker 3: real humanity in their behavior. I'd say this story is 795 00:45:18,320 --> 00:45:20,200 Speaker 3: somewhere in between. You do kind of get to know 796 00:45:20,280 --> 00:45:24,160 Speaker 3: the main character pretty well.
The others are a bit 797 00:45:24,200 --> 00:45:27,720 Speaker 3: more just sketched from a distance, but they do feel real. 798 00:45:28,320 --> 00:45:31,160 Speaker 3: The other author of the story, Dale Bailey, of whom 799 00:45:31,200 --> 00:45:33,920 Speaker 3: I don't think I've read anything else, is an American 800 00:45:33,960 --> 00:45:37,719 Speaker 3: speculative fiction author who's been publishing since the nineties. Some 801 00:45:37,760 --> 00:45:40,000 Speaker 3: of his more recent publications seem to be a novel 802 00:45:40,040 --> 00:45:43,840 Speaker 3: called In the Night Wood from twenty eighteen, a weird story 803 00:45:43,880 --> 00:45:46,480 Speaker 3: collection called The End of the End of Everything from 804 00:45:46,520 --> 00:45:49,320 Speaker 3: twenty fifteen, and then oh, this one gets my attention. 805 00:45:49,520 --> 00:45:52,880 Speaker 3: A story collection from twenty twenty three called This Island 806 00:45:52,960 --> 00:45:56,720 Speaker 3: Earth: Eight Features from the Drive-In, which promises stories 807 00:45:56,760 --> 00:45:59,919 Speaker 3: inspired by the drive-in sci fi movies of the Eisenhower era. 808 00:46:00,600 --> 00:46:01,959 Speaker 3: So I am intrigued there. 809 00:46:02,400 --> 00:46:04,080 Speaker 2: Oh wow, yeah, I'm going to grab a sample of 810 00:46:04,080 --> 00:46:04,440 Speaker 2: that for sure. 811 00:46:04,400 --> 00:46:08,080 Speaker 3: Trying to imagine something like Creature with the Atom Brain, 812 00:46:08,280 --> 00:46:12,440 Speaker 3: but with the thoughtful, haunted literary sensibility and good writing 813 00:46:12,480 --> 00:46:13,960 Speaker 3: in the story we're talking about today. 814 00:46:16,320 --> 00:46:20,640 Speaker 2: Yeah, yeah, I'm intrigued by that. I looked up The Crevasse. Yeah, 815 00:46:20,680 --> 00:46:23,880 Speaker 2: this was its publication.
It was apparently first published in 816 00:46:24,200 --> 00:46:28,320 Speaker 2: two thousand and nine's Lovecraft Unbound, which was also edited 817 00:46:28,320 --> 00:46:32,200 Speaker 2: by Ellen Datlow. A pretty great looking collection that also 818 00:46:32,280 --> 00:46:37,720 Speaker 2: features tales from the likes of Caitlin R. Kiernan, Michael Chabon, Joyce 819 00:46:37,760 --> 00:46:41,480 Speaker 2: Carol Oates, Michael Shea, and Laird Barron, all authors that 820 00:46:41,520 --> 00:46:44,759 Speaker 2: I've enjoyed before, with Michael Shea being one of my 821 00:46:44,840 --> 00:46:47,560 Speaker 2: absolute favorites. So it looks like a strong collection in 822 00:46:47,600 --> 00:46:48,760 Speaker 2: and of itself. 823 00:46:48,600 --> 00:46:51,319 Speaker 3: And obviously I think this story is meant to be 824 00:46:51,360 --> 00:46:54,560 Speaker 3: a play on a Lovecraftian theme, based on stuff like 825 00:46:54,560 --> 00:46:56,839 Speaker 3: At the Mountains of Madness, which we can come back to. 826 00:46:57,320 --> 00:47:00,960 Speaker 2: Yeah, it is an Antarctic horror tale, and there's also 827 00:47:01,000 --> 00:47:03,839 Speaker 2: a fun little allusion to John Carpenter's The Thing. 828 00:47:04,120 --> 00:47:08,520 Speaker 3: Yes, Yes, clearly inspired by that as well. So the 829 00:47:08,560 --> 00:47:12,400 Speaker 3: story is set in Antarctica, not long after the end 830 00:47:12,400 --> 00:47:14,960 Speaker 3: of the First World War. The protagonist is a New 831 00:47:15,000 --> 00:47:19,120 Speaker 3: Englander named Garner, a medical doctor from Boston who joins 832 00:47:19,120 --> 00:47:22,200 Speaker 3: a dangerous expedition to plant a flag of some kind 833 00:47:22,680 --> 00:47:25,319 Speaker 3: or another on the Southern Continent. I don't know if 834 00:47:25,320 --> 00:47:27,600 Speaker 3: they're racing to get to the South Pole.
I think 835 00:47:27,600 --> 00:47:31,279 Speaker 3: it would have already been achieved at this point. But 836 00:47:32,000 --> 00:47:36,440 Speaker 3: he's on an expedition of some kind, an exploration adventure. Fame 837 00:47:36,600 --> 00:47:40,080 Speaker 3: is promised to members of this expedition, though he doesn't 838 00:47:40,080 --> 00:47:42,960 Speaker 3: really seem concerned with that. And this is after the 839 00:47:43,080 --> 00:47:45,600 Speaker 3: end of his combat duty in the war and the 840 00:47:45,640 --> 00:47:49,120 Speaker 3: tragic death of his beloved wife Elizabeth from the flu 841 00:47:49,480 --> 00:47:53,280 Speaker 3: during his absence in the war, and Garner is something 842 00:47:53,280 --> 00:47:57,759 Speaker 3: of a lost soul. He's haunted, faithless, and mostly passive, 843 00:47:58,239 --> 00:48:01,720 Speaker 3: sympathetic in that he is moved by pity, but clearly 844 00:48:01,760 --> 00:48:05,839 Speaker 3: seen by his expedition mates as lacking in guts. Oh, 845 00:48:05,880 --> 00:48:07,919 Speaker 3: and this might have been the allusion to The Thing 846 00:48:08,040 --> 00:48:12,719 Speaker 3: you were talking about. But the named leader of the expedition, 847 00:48:12,760 --> 00:48:16,400 Speaker 3: who never appears in the story but is referenced, is MacReady. 848 00:48:16,320 --> 00:48:19,360 Speaker 2: Yeah, yeah, that's the reference. Yeah yeah. 849 00:48:19,440 --> 00:48:22,040 Speaker 3: So at the outset of the narrative, Garner is part 850 00:48:22,080 --> 00:48:25,080 Speaker 3: of a small group of four men who have broken 851 00:48:25,239 --> 00:48:29,040 Speaker 3: off from the main expedition to bring an injured man 852 00:48:29,160 --> 00:48:33,160 Speaker 3: back to the seaside depot that they departed from for treatment. 853 00:48:33,719 --> 00:48:38,080 Speaker 3: And they're traveling across the Antarctic glacier by sledge, each 854 00:48:38,120 --> 00:48:41,359 Speaker 3: sledge pulled by a team of dogs.
The injured man 855 00:48:41,440 --> 00:48:44,319 Speaker 3: is named Faber. On the main expedition, we find out 856 00:48:44,360 --> 00:48:46,640 Speaker 3: that he took a bad step and broke his leg 857 00:48:46,719 --> 00:48:50,759 Speaker 3: while walking outside the camp to relieve himself. Now he's 858 00:48:50,760 --> 00:48:54,240 Speaker 3: got a compound fracture and he's fighting sepsis and living 859 00:48:54,239 --> 00:48:57,320 Speaker 3: in a morphine haze. The other two men are Bishop, 860 00:48:57,400 --> 00:49:00,319 Speaker 3: who is presented as a practical man, the one who 861 00:49:00,320 --> 00:49:02,520 Speaker 3: seems to be in charge, kind of a plain dealer, 862 00:49:03,160 --> 00:49:06,000 Speaker 3: and then Connolly, who is a short tempered hot head. 863 00:49:06,800 --> 00:49:08,600 Speaker 3: I think one of the best things about the story 864 00:49:08,640 --> 00:49:10,600 Speaker 3: is the way that it puts you in the setting 865 00:49:10,760 --> 00:49:14,319 Speaker 3: and makes the atmosphere tactile. You can kind of feel it. 866 00:49:14,960 --> 00:49:17,080 Speaker 3: So I'm going to read a couple of passages. This 867 00:49:17,120 --> 00:49:20,000 Speaker 3: is from the very beginning. The authors write, quote, what 868 00:49:20,120 --> 00:49:23,480 Speaker 3: he loved was the silence, the pristine clarity of the 869 00:49:23,480 --> 00:49:27,320 Speaker 3: ice shelf, the purposeful breathing of the dogs straining against 870 00:49:27,360 --> 00:49:30,960 Speaker 3: their traces, the hiss of the runners, the opalescent arc 871 00:49:31,040 --> 00:49:34,640 Speaker 3: of the sky.
Garner peered through shifting veils of snow 872 00:49:34,719 --> 00:49:37,880 Speaker 3: at the endless sweep of glacial terrain before him, the 873 00:49:37,920 --> 00:49:41,239 Speaker 3: wind gnawing at him, forcing him to reach out periodically 874 00:49:41,520 --> 00:49:43,960 Speaker 3: and scrape at the thin crust of ice that clung 875 00:49:44,000 --> 00:49:46,760 Speaker 3: to the edges of his face mask, the dry rasp 876 00:49:46,880 --> 00:49:49,560 Speaker 3: of the fabric against his face, reminding him that he 877 00:49:49,680 --> 00:49:50,160 Speaker 3: was alive. 878 00:49:51,000 --> 00:49:53,600 Speaker 2: Yeah, that's a great example of the writing style here. 879 00:49:54,680 --> 00:49:59,520 Speaker 2: It works so well with this desolate but beautiful, 880 00:49:59,520 --> 00:50:02,200 Speaker 2: really almost otherworldly environment. It is, you know, one 881 00:50:02,200 --> 00:50:06,239 Speaker 2: of those extreme environments on our planet that we 882 00:50:06,239 --> 00:50:09,640 Speaker 2: clearly did not evolve to thrive in, certainly not 883 00:50:09,880 --> 00:50:12,560 Speaker 2: without the aid of technology that we would develop. And 884 00:50:12,960 --> 00:50:16,520 Speaker 2: we also get just a little bit of writing about 885 00:50:16,520 --> 00:50:19,560 Speaker 2: the dogs here. The authors here write rather warmly of 886 00:50:19,640 --> 00:50:23,200 Speaker 2: dog proximity in multiple places, and even as a non 887 00:50:23,239 --> 00:50:26,000 Speaker 2: dog person, I totally got what they were going for. 888 00:50:26,080 --> 00:50:28,080 Speaker 2: I love these little telling details, you know.
889 00:50:28,480 --> 00:50:31,160 Speaker 3: Yeah, the warmth toward the dogs in the story is 890 00:50:31,200 --> 00:50:35,959 Speaker 3: interesting because of the reality of how harsh the fate 891 00:50:36,080 --> 00:50:39,600 Speaker 3: of dogs on these kinds of expeditions was, and 892 00:50:40,120 --> 00:50:41,640 Speaker 3: that turns out to be the case in the 893 00:50:41,640 --> 00:50:42,399 Speaker 3: story as well. 894 00:50:42,880 --> 00:50:45,480 Speaker 2: Yeah, and some of the supplemental information I was reading 895 00:50:45,560 --> 00:50:49,360 Speaker 2: about from the histories of these expeditions, like multiple sources 896 00:50:49,360 --> 00:50:53,360 Speaker 2: point out that these dogs were highly valued and important 897 00:50:53,640 --> 00:50:56,200 Speaker 2: because first of all, they're pulling the sledge, like you 898 00:50:56,320 --> 00:50:59,439 Speaker 2: literally could not pull these expeditions off at this point 899 00:50:59,480 --> 00:51:03,600 Speaker 2: in time without them. But also the companionship that 900 00:51:03,800 --> 00:51:07,520 Speaker 2: the humans had with these animals. And again this extremely 901 00:51:07,719 --> 00:51:11,640 Speaker 2: dangerous environment, like an environment that really wants you dead. 902 00:51:12,040 --> 00:51:16,279 Speaker 2: It's almost AM-like in its intensity, you know, 903 00:51:16,360 --> 00:51:19,200 Speaker 2: like you're not supposed to survive there. And the dogs, 904 00:51:19,560 --> 00:51:23,120 Speaker 2: the companionship with the dogs, helped these humans survive there. 905 00:51:23,120 --> 00:51:27,640 Speaker 2: And I think there's something beautiful and haunting and perfect 906 00:51:27,719 --> 00:51:29,560 Speaker 2: for this tale in that fact. 907 00:51:29,760 --> 00:51:33,080 Speaker 3: Very true.
So the inciting moment of the story comes 908 00:51:33,520 --> 00:51:36,080 Speaker 3: very soon after the beginning, as the sledges are traveling 909 00:51:36,080 --> 00:51:40,560 Speaker 3: along the ice. I'll read from the narration quote, a 910 00:51:40,600 --> 00:51:44,880 Speaker 3: thunderous crack, loud as lightning cleaving stone, shivered the ice, 911 00:51:45,280 --> 00:51:47,799 Speaker 3: and the dogs of the lead sledge, maybe twenty five 912 00:51:47,880 --> 00:51:52,040 Speaker 3: yards ahead of Garner, erupted into panicky cries. Garner saw 913 00:51:52,080 --> 00:51:55,919 Speaker 3: it happen. The lead sledge sluffed over, hurling Connolly into 914 00:51:55,920 --> 00:51:59,560 Speaker 3: the snow, and plunged nose first through the ice, as 915 00:51:59,600 --> 00:52:02,040 Speaker 3: though an enormous hand had reached up through the 916 00:52:02,080 --> 00:52:06,080 Speaker 3: earth to snatch it under. So here we meet the 917 00:52:06,200 --> 00:52:09,600 Speaker 3: natural world horror that the story is based on. What 918 00:52:09,680 --> 00:52:13,080 Speaker 3: has happened is that the first sledge has gone into 919 00:52:13,200 --> 00:52:17,080 Speaker 3: a giant crack known as a crevasse in the ice. 920 00:52:17,520 --> 00:52:20,480 Speaker 3: The lead dog has plunged into it. The whole sledge 921 00:52:20,480 --> 00:52:23,719 Speaker 3: hasn't gone in, but the first dog has. So the 922 00:52:24,000 --> 00:52:27,920 Speaker 3: crevasse is deep and dark, and the dog remains hanging 923 00:52:28,000 --> 00:52:32,360 Speaker 3: down into the crevasse by its traces. The men decide 924 00:52:32,480 --> 00:52:34,360 Speaker 3: in an instant that they have no choice but to 925 00:52:34,400 --> 00:52:36,759 Speaker 3: cut the dog loose and let it fall, or it's 926 00:52:36,800 --> 00:52:39,040 Speaker 3: going to drag the sledge and the other dogs and 927 00:52:39,120 --> 00:52:42,840 Speaker 3: supplies they need into the pit.
And then Garner, whether 928 00:52:42,880 --> 00:52:46,680 Speaker 3: out of compassion for the dog or just indecision, hesitates 929 00:52:46,719 --> 00:52:48,680 Speaker 3: in his job of cutting the dog loose, and this 930 00:52:48,800 --> 00:52:53,919 Speaker 3: angers Connolly. But after the initial disaster is over, 931 00:52:54,040 --> 00:52:56,200 Speaker 3: they set up camp so they can rest and get 932 00:52:56,239 --> 00:52:59,840 Speaker 3: warm inside their tent. Faber, the injured man, seems to 933 00:52:59,880 --> 00:53:02,760 Speaker 3: be doing worse and worse, and in a horrible twist, 934 00:53:03,200 --> 00:53:06,480 Speaker 3: the dog that they sacrificed to the crevasse was not 935 00:53:06,640 --> 00:53:08,919 Speaker 3: killed in the fall. They can hear it somewhere down 936 00:53:08,920 --> 00:53:12,839 Speaker 3: in the pit, howling in pain. And Garner thinks about 937 00:53:12,840 --> 00:53:15,319 Speaker 3: the carnage that he saw in the war, about the 938 00:53:15,360 --> 00:53:18,279 Speaker 3: way that his wife died away from him without him 939 00:53:18,800 --> 00:53:22,240 Speaker 3: being there with her, and this seems somehow also tangled 940 00:53:22,320 --> 00:53:25,160 Speaker 3: up in feelings about Faber, the injured man, who is 941 00:53:25,239 --> 00:53:28,560 Speaker 3: trapped in a miasma of morphine, fever dreams, and pain, 942 00:53:28,680 --> 00:53:33,480 Speaker 3: and hallucinating. And eventually Garner's pity for the dog drives 943 00:53:33,560 --> 00:53:35,799 Speaker 3: him to sneak out of the tent while the others 944 00:53:35,840 --> 00:53:39,560 Speaker 3: are sleeping, rig up a rope system, and climb down 945 00:53:39,560 --> 00:53:42,400 Speaker 3: into the crevasse so he can end the poor animal's suffering. 946 00:53:42,800 --> 00:53:47,160 Speaker 2: Now, this decision is absolutely supported on a character level here.
947 00:53:47,200 --> 00:53:48,920 Speaker 2: I want to stress that, but it's also one of 948 00:53:48,960 --> 00:53:51,920 Speaker 2: those things where, reading a story and thinking about tropes 949 00:53:51,920 --> 00:53:53,640 Speaker 2: and plotting, you might think, is this the 950 00:53:53,719 --> 00:53:57,040 Speaker 2: dumb decision of the horror story? Is this the dumb 951 00:53:57,080 --> 00:53:59,680 Speaker 2: thing that our character does, the risky move that they 952 00:53:59,719 --> 00:54:03,520 Speaker 2: make that brings them in closer proximity to horror? I 953 00:54:03,520 --> 00:54:10,920 Speaker 2: mean, maybe structurally, kind of, yes, but as we'll discuss 954 00:54:10,960 --> 00:54:14,520 Speaker 2: in a bit, there are historic examples of this 955 00:54:14,640 --> 00:54:16,719 Speaker 2: exact sort of thing, like going out of your way 956 00:54:16,840 --> 00:54:19,799 Speaker 2: to save a sled dog from a crevasse is not 957 00:54:19,840 --> 00:54:23,960 Speaker 2: only something that is possible and likely, but it definitely 958 00:54:24,040 --> 00:54:27,200 Speaker 2: happened and again well supported in the story, regardless of 959 00:54:27,280 --> 00:54:28,680 Speaker 2: what the reality was. Yeah. 960 00:54:28,719 --> 00:54:31,800 Speaker 3: Now, unfortunately in this case, the sled dog is beyond 961 00:54:31,840 --> 00:54:35,400 Speaker 3: saving, fatally injured and stuck at the bottom. But 962 00:54:35,640 --> 00:54:37,799 Speaker 3: at least Garner hopes that he can put the dog 963 00:54:37,840 --> 00:54:40,319 Speaker 3: out of its misery. So when he gets all the 964 00:54:40,320 --> 00:54:42,960 Speaker 3: way down to the bottom of the chasm in the ice, 965 00:54:43,320 --> 00:54:48,840 Speaker 3: he discovers something strange. It's not just a deep crack 966 00:54:49,080 --> 00:54:51,960 Speaker 3: in the glacier like you would expect.
That alone is 967 00:54:52,040 --> 00:54:56,280 Speaker 3: horrifying enough, this narrowing crack in the ice 968 00:54:56,360 --> 00:54:59,279 Speaker 3: vanishing down into the distance below. Oh, I get 969 00:54:59,320 --> 00:55:04,080 Speaker 3: shivers just thinking about it. But underneath the ice, the crack 970 00:55:04,239 --> 00:55:09,760 Speaker 3: reveals an opening, an opening into a vast, cavernous space, 971 00:55:10,239 --> 00:55:14,120 Speaker 3: the nearest part of which that Garner can see is 972 00:55:14,200 --> 00:55:19,320 Speaker 3: a carved rock staircase of enormous size leading down into 973 00:55:19,320 --> 00:55:23,080 Speaker 3: the dark. And Garner believes he sees not only stairs, 974 00:55:23,160 --> 00:55:26,799 Speaker 3: but imagery, reliefs etched into the rock showing some kind 975 00:55:26,840 --> 00:55:30,040 Speaker 3: of creature, but he doesn't really understand what he's looking at, 976 00:55:30,040 --> 00:55:34,759 Speaker 3: this weird, taloned, Medusa-like form that he doesn't comprehend. 977 00:55:35,480 --> 00:55:38,960 Speaker 3: And the stairs also have this power of summoning, a 978 00:55:39,040 --> 00:55:42,920 Speaker 3: psychic force that invites him to come down, and it's implied 979 00:55:42,960 --> 00:55:45,960 Speaker 3: I think that it's actively probing his mind for a 980 00:55:46,000 --> 00:55:50,160 Speaker 3: psychological foothold, because for some reason, looking down into the descent, 981 00:55:50,360 --> 00:55:51,520 Speaker 3: he thinks of Elizabeth. 982 00:55:52,239 --> 00:55:55,400 Speaker 2: Yeah, yeah, this is such a haunting moment.
Again, on 983 00:55:55,560 --> 00:55:59,920 Speaker 2: one level, it almost feels less dangerous when there are 984 00:56:00,120 --> 00:56:05,640 Speaker 2: even supernatural, oversized cyclopean stairs down there, because, well, at 985 00:56:05,760 --> 00:56:07,920 Speaker 2: least something walked down here, and it's not just the 986 00:56:07,960 --> 00:56:11,759 Speaker 2: lifeless ice pit that we thought it was. But of 987 00:56:11,800 --> 00:56:15,239 Speaker 2: course, I think many, if not most, readers of this 988 00:56:15,360 --> 00:56:19,239 Speaker 2: tale would be familiar with the writings of HP Lovecraft, 989 00:56:19,239 --> 00:56:21,920 Speaker 2: and they see exactly what we're dealing with here, because, 990 00:56:23,000 --> 00:56:26,840 Speaker 2: of course, one of HP Lovecraft's most well known tales 991 00:56:27,120 --> 00:56:31,000 Speaker 2: is At the Mountains of Madness, which concerns elder ruins 992 00:56:31,000 --> 00:56:36,000 Speaker 2: in Antarctica and various horrifying revelations that occur when humans 993 00:56:36,239 --> 00:56:39,719 Speaker 2: plumb those ruins. And on top of this, there's also 994 00:56:39,800 --> 00:56:43,759 Speaker 2: just the element of stairs. Lovecraft frequently employed stairways as 995 00:56:43,800 --> 00:56:47,320 Speaker 2: a liminal space or threshold between one world and another, 996 00:56:47,760 --> 00:56:51,160 Speaker 2: and more to the point between sanity and darkness, between 997 00:56:51,239 --> 00:56:55,120 Speaker 2: healthy human ignorance of the cosmos and crushing revelations about 998 00:56:55,120 --> 00:56:58,840 Speaker 2: its true nature. So it's a nice nod here to 999 00:56:59,600 --> 00:57:03,359 Speaker 2: weird fiction, a great Lovecraftian bit of flavor, without, as 1000 00:57:03,400 --> 00:57:06,959 Speaker 2: you mentioned earlier, revealing too much or getting into the lore.
1001 00:57:07,719 --> 00:57:10,360 Speaker 2: I feel like a lesser tale might have decided to 1002 00:57:11,200 --> 00:57:13,920 Speaker 2: throw out a few elder god names here, or maybe 1003 00:57:13,920 --> 00:57:17,439 Speaker 2: crunch on the mythos qualities just a little bit too much. 1004 00:57:17,920 --> 00:57:20,560 Speaker 2: And a lot of restraint is shown here, and it 1005 00:57:20,600 --> 00:57:23,000 Speaker 2: works well and again, especially in a short tale like this, 1006 00:57:23,360 --> 00:57:25,760 Speaker 2: it inspires us to then dream like where do these 1007 00:57:25,800 --> 00:57:29,320 Speaker 2: stairs go? Yeah, and we're kind of drawn down them 1008 00:57:29,320 --> 00:57:30,720 Speaker 2: as well, just like the narrator. 1009 00:57:31,320 --> 00:57:34,400 Speaker 3: But Garner does not have the opportunity to descend the 1010 00:57:34,440 --> 00:57:39,600 Speaker 3: stairs because he's interrupted as he's starting to clamber down them. 1011 00:57:39,640 --> 00:57:42,720 Speaker 3: For whatever reason, he's drawn. He starts to go down them, 1012 00:57:42,760 --> 00:57:46,320 Speaker 3: but he is discovered by Connolly up above looking down, 1013 00:57:46,760 --> 00:57:49,439 Speaker 3: who is furious with him for taking this stupid risk, 1014 00:57:49,680 --> 00:57:51,560 Speaker 3: not just with his own life, but with all of 1015 00:57:51,600 --> 00:57:55,680 Speaker 3: their lives, especially Faber's, because Antarctica is an unforgiving place. 1016 00:57:56,160 --> 00:57:59,760 Speaker 3: Any decision you make could spell death for yourself or 1017 00:57:59,840 --> 00:58:00,480 Speaker 3: for others. 1018 00:58:01,200 --> 00:58:01,760 Speaker 2: Absolutely.
1019 00:58:02,400 --> 00:58:06,200 Speaker 3: Garner tries to convince Bishop, and so he comes back up, 1020 00:58:06,760 --> 00:58:09,160 Speaker 3: climbs the rope back out of the crevasse, and tries 1021 00:58:09,200 --> 00:58:11,800 Speaker 3: to convince the other two men, Bishop and Connolly, of 1022 00:58:11,840 --> 00:58:15,280 Speaker 3: what he saw down there. They're not really interested at first, 1023 00:58:15,320 --> 00:58:18,640 Speaker 3: but he appeals to their sense of desire for adventure 1024 00:58:18,720 --> 00:58:22,080 Speaker 3: and fame. This is one of the references to McCready. It's like, okay, 1025 00:58:22,080 --> 00:58:24,240 Speaker 3: so McCready's going to be out there planting the flag 1026 00:58:24,280 --> 00:58:26,919 Speaker 3: while you have to go back. But you could be 1027 00:58:27,040 --> 00:58:30,200 Speaker 3: the discoverer of one of the most important scientific finds 1028 00:58:30,240 --> 00:58:34,200 Speaker 3: in human history, whatever it is that's down there. Because 1029 00:58:34,360 --> 00:58:37,120 Speaker 3: unlike Garner, who it seems just somehow ended up on 1030 00:58:37,160 --> 00:58:40,080 Speaker 3: this journey because he was otherwise adrift, these two men 1031 00:58:40,160 --> 00:58:44,480 Speaker 3: are driven by ambition, it's implied, and that ambition was 1032 00:58:44,560 --> 00:58:46,600 Speaker 3: dashed when they had to split off from the main 1033 00:58:46,640 --> 00:58:49,800 Speaker 3: party to return Faber to the depot. So they're already unhappy. 1034 00:58:51,120 --> 00:58:54,320 Speaker 3: They shine a flashlight down into the crevasse, but it's 1035 00:58:54,360 --> 00:58:57,160 Speaker 3: too deep to make out the stairs or the cavern beyond, 1036 00:58:57,240 --> 00:59:00,200 Speaker 3: if they ever really were there. We assume they probably were.
1037 00:59:00,840 --> 00:59:03,560 Speaker 3: They can only see the dog's body lying in blood 1038 00:59:03,720 --> 00:59:07,920 Speaker 3: on the lip of ice far below. However, while they're watching, 1039 00:59:08,200 --> 00:59:12,600 Speaker 3: the dog suddenly is moved, yanked away by something out 1040 00:59:12,600 --> 00:59:16,200 Speaker 3: of sight. They don't have time to process this because 1041 00:59:16,240 --> 00:59:19,600 Speaker 3: immediately Faber, the injured man, cries out in pain or 1042 00:59:19,680 --> 00:59:22,280 Speaker 3: terror from the tent, and the men rush in to 1043 00:59:22,320 --> 00:59:25,240 Speaker 3: see what's the matter, and Faber becomes lucid enough to 1044 00:59:25,320 --> 00:59:28,280 Speaker 3: explain to them what's wrong. And I loved this moment. 1045 00:59:28,600 --> 00:59:31,560 Speaker 3: Didn't quite expect this. When he finally is able to speak, 1046 00:59:31,600 --> 00:59:35,800 Speaker 3: he says, it laid an egg in me. They don't understand, 1047 00:59:35,880 --> 00:59:38,960 Speaker 3: but he insists. Quote, Faber found a way to smile. 1048 00:59:39,560 --> 00:59:42,680 Speaker 3: In my dream, it put my head inside its body 1049 00:59:42,920 --> 00:59:44,120 Speaker 3: and it laid an egg in me. 1050 00:59:44,760 --> 00:59:48,360 Speaker 2: Yeah, yikes. Yeah. And we don't really get any clarity 1051 00:59:48,400 --> 00:59:51,960 Speaker 2: on what this exactly means. You know, what sort of 1052 00:59:52,880 --> 00:59:55,040 Speaker 2: revelation did he have, or is this, you know, is 1053 00:59:55,040 --> 00:59:59,320 Speaker 2: this part of the morphine playing with his head? We 1054 00:59:59,360 --> 01:00:01,160 Speaker 2: don't really know. Oh, but man, it's haunting.
1055 01:00:01,760 --> 01:00:04,560 Speaker 3: So Garner at this point prepares to sedate him with 1056 01:00:04,600 --> 01:00:07,800 Speaker 3: another morphine ampoule, but Faber doesn't want 1057 01:00:07,800 --> 01:00:10,680 Speaker 3: this for some reason, and he lashes out and fights. 1058 01:00:11,080 --> 01:00:13,760 Speaker 3: The fight knocks over a kerosene heater and this sets 1059 01:00:13,760 --> 01:00:16,880 Speaker 3: fire to the tent, leading to a mad scramble for survival. 1060 01:00:17,200 --> 01:00:18,840 Speaker 2: This was the part, when I went on my first 1061 01:00:18,880 --> 01:00:20,560 Speaker 2: read, where I was like, oh my, this is the 1062 01:00:20,600 --> 01:00:22,360 Speaker 2: moment where they're going to have to go down those 1063 01:00:22,360 --> 01:00:24,960 Speaker 2: stairs together, where they're gonna not have any equipment or 1064 01:00:25,000 --> 01:00:27,560 Speaker 2: dogs left, and they're gonna think, well, we have nothing 1065 01:00:27,560 --> 01:00:29,800 Speaker 2: to do but go down those stairs. But that's not 1066 01:00:29,840 --> 01:00:30,840 Speaker 2: where the story goes. 1067 01:00:31,040 --> 01:00:33,240 Speaker 3: No, there's a different kind of horror at the end. 1068 01:00:33,280 --> 01:00:36,800 Speaker 3: There's the horror of wondering what might have been. So 1069 01:00:37,160 --> 01:00:40,480 Speaker 3: the men after this, they just book it back to 1070 01:00:40,520 --> 01:00:42,560 Speaker 3: their destination. They try to make it to the depot 1071 01:00:42,600 --> 01:00:44,800 Speaker 3: as fast as they can now that their tent is burned. 1072 01:00:45,320 --> 01:00:48,440 Speaker 3: Faber does not survive the journey, he dies in transit, 1073 01:00:49,080 --> 01:00:51,440 Speaker 3: but they do reach the safety of the depot.
The 1074 01:00:51,520 --> 01:00:54,080 Speaker 3: three surviving men hole up to wait for the return 1075 01:00:54,280 --> 01:00:56,680 Speaker 3: of the rest of the expedition, which is weeks away. 1076 01:00:57,360 --> 01:00:59,960 Speaker 3: And while they are holed up in the depot, Garner 1077 01:01:00,120 --> 01:01:03,200 Speaker 3: tries to talk to Bishop to get him to acknowledge 1078 01:01:03,240 --> 01:01:06,160 Speaker 3: that he saw something in the crevasse, to at least 1079 01:01:06,320 --> 01:01:09,400 Speaker 3: admit that he saw the dog dragged away, but Bishop 1080 01:01:09,440 --> 01:01:10,800 Speaker 3: is very stubborn about it. 1081 01:01:11,280 --> 01:01:11,760 Speaker 2: Quote. 1082 01:01:12,000 --> 01:01:15,120 Speaker 3: Bishop refused to look at him. This is an empty place, 1083 01:01:15,240 --> 01:01:19,040 Speaker 3: he said, after a long silence. There's nothing here. He 1084 01:01:19,120 --> 01:01:23,840 Speaker 3: blinked and turned a page in the magazine. Nothing. And 1085 01:01:24,000 --> 01:01:27,680 Speaker 3: I loved this part because I think that line highlights 1086 01:01:28,400 --> 01:01:34,320 Speaker 3: a subtext, an interesting implication of the story. To Bishop, 1087 01:01:34,840 --> 01:01:38,840 Speaker 3: the idea that there might be something hidden, something possibly 1088 01:01:39,000 --> 01:01:43,080 Speaker 3: monstrous and mind rending under the ice waiting to be revealed, 1089 01:01:43,840 --> 01:01:48,520 Speaker 3: is troubling. That possibility is terrifying, and he denies it. 1090 01:01:48,600 --> 01:01:52,880 Speaker 3: So he takes emotional comfort in telling himself, probably lying 1091 01:01:52,880 --> 01:01:55,120 Speaker 3: to himself, because it's implied he did see the dog 1092 01:01:55,200 --> 01:01:59,600 Speaker 3: dragged away, at least telling himself there's nothing there.
And 1093 01:01:59,640 --> 01:02:01,960 Speaker 3: I think for Garner, by the end of the story, 1094 01:02:02,160 --> 01:02:06,600 Speaker 3: the opposite desire is operative. The opposite is true. The 1095 01:02:06,720 --> 01:02:10,040 Speaker 3: nothingness and the absence are what would be frightening. The 1096 01:02:10,120 --> 01:02:13,640 Speaker 3: idea that there is something hidden waiting to be revealed, 1097 01:02:13,800 --> 01:02:16,560 Speaker 3: even if it's monstrous, even if it's something that would 1098 01:02:16,560 --> 01:02:22,520 Speaker 3: destroy him, is somehow comforting. And this duality of orientations 1099 01:02:22,760 --> 01:02:26,800 Speaker 3: toward mystery and understanding is, I think, a very important 1100 01:02:26,880 --> 01:02:32,200 Speaker 3: part of humanity. Which option bothers you more: the idea 1101 01:02:32,240 --> 01:02:36,000 Speaker 3: that there is something unknown, as yet unrevealed, that could 1102 01:02:36,080 --> 01:02:39,400 Speaker 3: destroy you, could destroy everything you love, or destroy your 1103 01:02:39,520 --> 01:02:44,200 Speaker 3: understanding of reality? Or would it be worse if there 1104 01:02:44,360 --> 01:02:47,280 Speaker 3: is nothing more, that what you see is what you get? 1105 01:02:48,120 --> 01:02:51,640 Speaker 3: And for you, I guess the question is, emotionally, does 1106 01:02:51,880 --> 01:02:55,280 Speaker 3: 'there is nothing more' reduce to 'there is nothing'? 1107 01:02:55,920 --> 01:02:57,920 Speaker 2: Yeah. I mean it brings to mind the famous Arthur C. 1108 01:02:58,000 --> 01:03:00,880 Speaker 2: Clarke quote. Right: two possibilities exist. Either we are alone 1109 01:03:00,920 --> 01:03:04,919 Speaker 2: in the universe or we are not. Both are equally terrifying. Yeah. 1110 01:03:05,160 --> 01:03:07,760 Speaker 2: And it plays nicely with exactly what we were discussing earlier.
1111 01:03:08,040 --> 01:03:13,040 Speaker 2: Is the crevasse scarier before we see the stairs, or 1112 01:03:13,120 --> 01:03:15,920 Speaker 2: is it the other way around? You know, it works 1113 01:03:16,040 --> 01:03:18,320 Speaker 2: so perfectly in the story, that duality. 1114 01:03:18,600 --> 01:03:21,920 Speaker 3: Yeah, so for Bishop, it's implied that to 1115 01:03:22,000 --> 01:03:24,520 Speaker 3: find the stairs would be more frightening than the crevasse. 1116 01:03:25,680 --> 01:03:28,560 Speaker 3: For Garner, I think the crevasse is frightening, but once 1117 01:03:28,600 --> 01:03:31,760 Speaker 3: he sees the stairs, now that there is a mystery, 1118 01:03:31,840 --> 01:03:34,640 Speaker 3: now that there could be something more, it's actually inviting. 1119 01:03:35,520 --> 01:03:38,080 Speaker 3: And I like that the story implies a correlation between 1120 01:03:38,120 --> 01:03:40,960 Speaker 3: these two different attitudes and other things about the person. 1121 01:03:41,040 --> 01:03:44,640 Speaker 3: This rings true to me. Bishop, who wishes there to 1122 01:03:44,680 --> 01:03:48,480 Speaker 3: be nothing more, is a person who's ambitious, with an 1123 01:03:48,480 --> 01:03:51,880 Speaker 3: orientation toward future goals. There's stuff he wants to do 1124 01:03:52,000 --> 01:03:56,480 Speaker 3: and accomplish. Garner is defined by the past and 1125 01:03:56,560 --> 01:04:00,440 Speaker 3: what has already been taken away. He's got loss of purpose, 1126 01:04:00,600 --> 01:04:03,800 Speaker 3: loss of his great love. He wishes for a key 1127 01:04:03,880 --> 01:04:06,800 Speaker 3: to unlock a new world, for there to be something 1128 01:04:06,880 --> 01:04:09,720 Speaker 3: more revealed, even if it's horrible. 1129 01:04:10,160 --> 01:04:14,200 Speaker 2: And it's clearly horrible. Yeah, there's no hint 1130 01:04:14,240 --> 01:04:15,600 Speaker 2: that it's anything but horrible.
1131 01:04:15,920 --> 01:04:28,400 Speaker 3: Yeah, I'm gonna read from the final paragraph here, with 1132 01:04:28,400 --> 01:04:30,400 Speaker 3: Garner back at the depot, looking outside. 1133 01:04:30,520 --> 01:04:33,480 Speaker 3: So the beginning of the story takes place 1134 01:04:33,600 --> 01:04:38,440 Speaker 3: during the Antarctic summer, so it's perpetually daytime, and he 1135 01:04:38,520 --> 01:04:41,280 Speaker 3: describes the sun as a great boiling eye in the 1136 01:04:41,280 --> 01:04:44,440 Speaker 3: sky that never sets. But as the winter comes closer, 1137 01:04:45,360 --> 01:04:48,680 Speaker 3: we get this part. Quote. A gust of wind scattered 1138 01:04:48,680 --> 01:04:51,240 Speaker 3: fine crystals of snow against the window, and he found 1139 01:04:51,320 --> 01:04:54,160 Speaker 3: himself wondering what the night would be like in this 1140 01:04:54,240 --> 01:04:58,000 Speaker 3: cold country. He imagined the sky dissolving to reveal the 1141 01:04:58,040 --> 01:05:01,280 Speaker 3: hard vault of stars, the galaxy turning above him like 1142 01:05:01,320 --> 01:05:04,760 Speaker 3: a cog in a vast, unknowable engine, and behind it 1143 01:05:04,800 --> 01:05:08,440 Speaker 3: all the emptiness into which men hurled their prayers. It 1144 01:05:08,440 --> 01:05:11,080 Speaker 3: occurred to him that he could leave now, walk out 1145 01:05:11,120 --> 01:05:14,360 Speaker 3: into the long twilight, and keep going until the earth 1146 01:05:14,440 --> 01:05:18,280 Speaker 3: opened beneath him. And he found himself descending strange stairs 1147 01:05:18,640 --> 01:05:21,880 Speaker 3: while the world around him broke silently into snow and 1148 01:05:21,960 --> 01:05:24,600 Speaker 3: into night. Garner closed his eyes. 1149 01:05:25,600 --> 01:05:27,320 Speaker 2: Beautiful, haunting, perfect.
1150 01:05:27,680 --> 01:05:31,080 Speaker 3: Yeah, and I like that, again, it's emphasized even there at the end, 1151 01:05:31,200 --> 01:05:34,880 Speaker 3: like, the thing is that he's dwelling for moments on horrible 1152 01:05:34,920 --> 01:05:39,520 Speaker 3: thoughts, thoughts about emptiness, about nothing beyond the emptiness, 1153 01:05:39,600 --> 01:05:44,320 Speaker 3: you know, his lack of faith in God, the empty 1154 01:05:44,400 --> 01:05:49,320 Speaker 3: expanses with nothing below, but then he's strangely finding comfort 1155 01:05:49,840 --> 01:05:53,280 Speaker 3: in the idea of descending into this alien realm. 1156 01:05:53,800 --> 01:05:54,560 Speaker 2: Yeah. 1157 01:05:54,680 --> 01:05:56,560 Speaker 3: So, as I said earlier, one of the things I 1158 01:05:56,600 --> 01:06:00,160 Speaker 3: loved about this story was the horror evoked by the setting, 1159 01:06:00,200 --> 01:06:04,800 Speaker 3: the bleak emptiness of the Antarctic glacier, and especially the crevasse. 1160 01:06:05,880 --> 01:06:09,560 Speaker 3: If you read about Antarctic expeditions from people who have 1161 01:06:09,800 --> 01:06:13,680 Speaker 3: actually participated in them, you will discover that the terror 1162 01:06:13,760 --> 01:06:17,480 Speaker 3: of the crevasse is absolutely real, and a threat about 1163 01:06:17,480 --> 01:06:21,440 Speaker 3: which anybody traveling across great distances of ice has to 1164 01:06:21,480 --> 01:06:25,640 Speaker 3: be almost constantly conscious. For example, I pulled up the 1165 01:06:25,680 --> 01:06:29,600 Speaker 3: text of a book called The Worst Journey in the World, 1166 01:06:29,760 --> 01:06:33,600 Speaker 3: written by Apsley Cherry-Garrard, published in nineteen twenty two.
1167 01:06:34,240 --> 01:06:36,800 Speaker 3: Cherry-Garrard was a member of the famous Terra Nova 1168 01:06:36,880 --> 01:06:39,920 Speaker 3: expedition to the South Pole under Robert Falcon Scott a 1169 01:06:39,960 --> 01:06:42,240 Speaker 3: decade previous. Rob, I know you have some stuff about 1170 01:06:42,240 --> 01:06:46,680 Speaker 3: this expedition as well. And this book provides a first 1171 01:06:46,720 --> 01:06:50,040 Speaker 3: hand account of the expedition and its struggles. If you 1172 01:06:50,080 --> 01:06:53,000 Speaker 3: do a keyword search in the text of this book 1173 01:06:53,040 --> 01:06:56,400 Speaker 3: for crevasse, you get almost two hundred hits. It is 1174 01:06:56,640 --> 01:07:01,160 Speaker 3: constantly on their minds, and apart from the physical difficulty 1175 01:07:01,240 --> 01:07:05,880 Speaker 3: and danger of actually encountering them, the psychic toll of 1176 01:07:06,000 --> 01:07:09,480 Speaker 3: knowing the crevasses are out there does its own violence 1177 01:07:09,520 --> 01:07:12,960 Speaker 3: to the explorer. At one point, Cherry-Garrard writes, quote, 1178 01:07:13,280 --> 01:07:16,920 Speaker 3: sometimes a blizzard is a very welcome rest after weeks 1179 01:07:16,920 --> 01:07:20,640 Speaker 3: of hard pulling, dragging yourself awake each morning, feeling as 1180 01:07:20,640 --> 01:07:23,480 Speaker 3: though you had only just gone to sleep, with the 1181 01:07:23,560 --> 01:07:27,520 Speaker 3: mental strain perhaps which working among crevasses entails, it 1182 01:07:27,600 --> 01:07:30,080 Speaker 3: is most pleasant to be put to bed for two 1183 01:07:30,160 --> 01:07:34,680 Speaker 3: or three days.
Even relatively shallow crevasses, which are sometimes 1184 01:07:34,760 --> 01:07:36,560 Speaker 3: only a few feet deep, you know, there are much 1185 01:07:36,600 --> 01:07:41,560 Speaker 3: shallower ones that are less visually impressive than what we're 1186 01:07:41,560 --> 01:07:44,600 Speaker 3: imagining in this story. Even the shallow ones can be 1187 01:07:44,720 --> 01:07:48,240 Speaker 3: dangerous and can cause fatal injury if you fall 1188 01:07:48,240 --> 01:07:51,200 Speaker 3: into them unexpectedly. You know, falling five feet the wrong 1189 01:07:51,280 --> 01:07:54,960 Speaker 3: way like that could be death. In Antarctica, a broken 1190 01:07:55,040 --> 01:07:58,400 Speaker 3: bone from a survivable fall can quickly turn into a 1191 01:07:58,440 --> 01:08:03,160 Speaker 3: death sentence down there. But many crevasses are much deeper 1192 01:08:03,200 --> 01:08:05,520 Speaker 3: than that, might be one hundred feet or more, some 1193 01:08:05,600 --> 01:08:09,600 Speaker 3: maybe hundreds of feet to the bottom. And 1194 01:08:10,640 --> 01:08:15,600 Speaker 3: Cherry-Garrard does tell of crossing and navigating around crevasses. 1195 01:08:15,640 --> 01:08:18,160 Speaker 3: He actually quotes a guy looking down into one 1196 01:08:18,560 --> 01:08:21,240 Speaker 3: who says that some of them are, quote, black as hell, 1197 01:08:21,640 --> 01:08:26,639 Speaker 3: just vanishing into darkness below. And he talks 1198 01:08:26,680 --> 01:08:30,320 Speaker 3: about stretches of ice that are made more or less impassable 1199 01:08:30,439 --> 01:08:34,280 Speaker 3: because of how many crevasses there are.
There's one 1200 01:08:34,320 --> 01:08:36,559 Speaker 3: place he's talking about where I don't think he's actually 1201 01:08:36,560 --> 01:08:39,000 Speaker 3: talking about passing it, but he's just talking about looking 1202 01:08:39,040 --> 01:08:43,280 Speaker 3: at a chaos of crevasses, particularly, I think, in a 1203 01:08:43,320 --> 01:08:45,760 Speaker 3: region where a glacier is sort of fanning out as 1204 01:08:45,760 --> 01:08:49,320 Speaker 3: it reaches close to the ocean. And the real horror 1205 01:08:49,520 --> 01:08:53,760 Speaker 3: is you often cannot see these deep gaps in the 1206 01:08:53,800 --> 01:08:57,799 Speaker 3: ice as you approach them, for multiple reasons. First, because 1207 01:08:57,800 --> 01:09:01,240 Speaker 3: of general difficulties with visibility on the ice, even under 1208 01:09:01,240 --> 01:09:04,439 Speaker 3: relatively good weather conditions. And in a blizzard, forget about it, 1209 01:09:04,439 --> 01:09:07,920 Speaker 3: in a blizzard, visibility is zero. But if you're trying 1210 01:09:07,960 --> 01:09:11,040 Speaker 3: to move under ideal weather conditions, even then it's sometimes 1211 01:09:11,040 --> 01:09:13,200 Speaker 3: just really hard to tell what you're looking at on 1212 01:09:13,240 --> 01:09:15,080 Speaker 3: the ice out in front of you. There are weird 1213 01:09:15,120 --> 01:09:20,360 Speaker 3: ways that light and shadows play against your eyes. 1214 01:09:20,400 --> 01:09:24,400 Speaker 3: Sometimes people report this, and Cherry-Garrard does too. There 1215 01:09:24,439 --> 01:09:26,840 Speaker 3: will be, like, what he calls these haystack formations of 1216 01:09:26,880 --> 01:09:29,320 Speaker 3: ice that they somehow don't really see until they're coming 1217 01:09:29,360 --> 01:09:33,240 Speaker 3: right upon them. And sometimes you don't see crevasses. So, yeah, 1218 01:09:33,320 --> 01:09:37,240 Speaker 3: visibility is difficult.
But even more dangerous than that, many 1219 01:09:37,320 --> 01:09:41,800 Speaker 3: glacial crevasses can become covered by what are typically called 1220 01:09:42,080 --> 01:09:45,639 Speaker 3: snow bridges, so that they are not even visible from 1221 01:09:45,680 --> 01:09:49,000 Speaker 3: above until you put weight on the snow bridge and 1222 01:09:49,080 --> 01:09:51,679 Speaker 3: it collapses, dumping you into the chasm below. 1223 01:09:51,920 --> 01:09:55,559 Speaker 2: Yeah, just a naturally occurring trapdoor, you know, that could 1224 01:09:55,640 --> 01:09:58,160 Speaker 2: just drop you into, you know again, a pit that 1225 01:09:58,200 --> 01:10:01,479 Speaker 2: may be five feet deep and break your ankle, or one 1226 01:10:01,560 --> 01:10:03,439 Speaker 2: hundred feet deep and kill you outright. 1227 01:10:04,240 --> 01:10:06,840 Speaker 3: So to get around this, expeditions use a number 1228 01:10:06,880 --> 01:10:09,360 Speaker 3: of techniques, some of which we actually see in this story. 1229 01:10:09,960 --> 01:10:12,800 Speaker 3: So going a long ways back, you use the technique 1230 01:10:12,840 --> 01:10:16,040 Speaker 3: of roping members of the expedition together, you know, so 1231 01:10:16,080 --> 01:10:19,160 Speaker 3: they would tie their bodies or their sledges and animals 1232 01:10:19,200 --> 01:10:22,080 Speaker 3: together with rope so that if one falls, the others 1233 01:10:22,080 --> 01:10:23,880 Speaker 3: can stop the fall and pull them out. 1234 01:10:24,640 --> 01:10:28,360 Speaker 2: Yeah. It's telling, isn't it, that there are mountaineering techniques 1235 01:10:28,400 --> 01:10:32,519 Speaker 2: that are used in order to deal with crevasses.
You're 1236 01:10:32,560 --> 01:10:37,080 Speaker 2: just moving, you're not ascending necessarily, you're just moving 1237 01:10:37,120 --> 01:10:40,000 Speaker 2: across the landscape, and you need to be prepared like 1238 01:10:40,040 --> 01:10:40,720 Speaker 2: a mountaineer. 1239 01:10:41,000 --> 01:10:43,519 Speaker 3: Well, actually, dealing with crevasses is part of mountaineering 1240 01:10:43,560 --> 01:10:46,440 Speaker 3: as well. They're not only a thing in Antarctica. 1241 01:10:46,479 --> 01:10:48,360 Speaker 3: That's just a place where you're going to encounter a 1242 01:10:48,400 --> 01:10:50,640 Speaker 3: lot of crevasses, but you will also find them in 1243 01:10:50,760 --> 01:10:56,920 Speaker 3: mountain glaciers. So yeah, there's the rope techniques. There's 1244 01:10:57,000 --> 01:11:00,599 Speaker 3: probing with poles, so like stabbing poles into the snow 1245 01:11:00,760 --> 01:11:03,240 Speaker 3: ahead of where you're moving to find soft spots where 1246 01:11:03,280 --> 01:11:06,400 Speaker 3: the pole sinks through. That can be very effective but 1247 01:11:06,439 --> 01:11:11,360 Speaker 3: obviously makes travel quite slow. One adaptation is 1248 01:11:11,400 --> 01:11:15,960 Speaker 3: a sacrificial attitude toward lead dogs and other animals, maybe ponies, 1249 01:11:16,000 --> 01:11:19,799 Speaker 3: sometimes counting on the leading animals to fall through first, 1250 01:11:20,000 --> 01:11:22,679 Speaker 3: allowing the rest of the team to stop before hitting 1251 01:11:22,680 --> 01:11:23,440 Speaker 3: the gap. 1252 01:11:23,800 --> 01:11:25,720 Speaker 2: Which is of course horrifying in its own right. 1253 01:11:25,960 --> 01:11:30,760 Speaker 3: Yeah, and then also, eventually, experience.
People with lots of 1254 01:11:30,800 --> 01:11:35,320 Speaker 3: experience traveling on glaciers learn visual cues to look for, 1255 01:11:35,479 --> 01:11:39,760 Speaker 3: so that they can sometimes spot even covered up crevasses 1256 01:11:40,280 --> 01:11:43,600 Speaker 3: due to characteristics of the surface ice like color or 1257 01:11:43,720 --> 01:11:48,240 Speaker 3: drift shape. But even, you know, very solid, well 1258 01:11:49,439 --> 01:11:53,040 Speaker 3: educated ice veterans won't spot it every time. There's nothing 1259 01:11:53,120 --> 01:11:57,200 Speaker 3: foolproof here. Modern technology does have some tools that 1260 01:11:57,280 --> 01:12:01,840 Speaker 3: these early Antarctic explorers did not have. You know, modern 1261 01:12:01,920 --> 01:12:07,160 Speaker 3: Antarctic expeditions can use sophisticated techniques like ground penetrating radar 1262 01:12:07,280 --> 01:12:11,680 Speaker 3: to image crevasses from above. But there is a maddening 1263 01:12:11,720 --> 01:12:15,439 Speaker 3: aspect to this, because you might think that, oh, well, 1264 01:12:15,479 --> 01:12:19,639 Speaker 3: if you can make a map with ground penetrating radar 1265 01:12:19,720 --> 01:12:22,000 Speaker 3: of a particular area, then you can just know 1266 01:12:22,080 --> 01:12:25,679 Speaker 3: in advance where all the crevasses will be. But that 1267 01:12:25,760 --> 01:12:28,720 Speaker 3: kind of radar based map will not be useful for 1268 01:12:28,880 --> 01:12:33,200 Speaker 3: very long, because a glacier is flowing and forever changing, 1269 01:12:33,320 --> 01:12:37,960 Speaker 3: and its surface will change substantially over relatively quick timespans.
1270 01:12:38,000 --> 01:12:40,200 Speaker 3: From what I was reading, it seems like not only 1271 01:12:40,280 --> 01:12:42,240 Speaker 3: over the years, but maybe even over the course of 1272 01:12:42,280 --> 01:12:45,400 Speaker 3: a few weeks or months, there can be substantial changes 1273 01:12:45,920 --> 01:12:50,639 Speaker 3: in the crevasse landscape. Existing cracks close, new 1274 01:12:50,680 --> 01:12:54,519 Speaker 3: cracks open, all cracks move, some more quickly than others. 1275 01:12:54,880 --> 01:13:00,200 Speaker 3: Oh wow. So crevasses form because of physical stress on 1276 01:13:00,360 --> 01:13:04,960 Speaker 3: the ice, usually caused by the flow of the glacier. Overall, 1277 01:13:05,439 --> 01:13:08,679 Speaker 3: a glacier is a weird type of material to think 1278 01:13:08,680 --> 01:13:14,000 Speaker 3: about because it has some liquid like characteristics. Glaciers do flow, 1279 01:13:14,479 --> 01:13:17,439 Speaker 3: so in a way it makes sense to think of 1280 01:13:17,479 --> 01:13:22,920 Speaker 3: them as extremely slow moving frozen rivers, and yet they 1281 01:13:23,000 --> 01:13:27,719 Speaker 3: also have the brittle characteristics of ice. So while liquid 1282 01:13:27,760 --> 01:13:31,959 Speaker 3: water easily flows around a bend and has no problems 1283 01:13:32,000 --> 01:13:35,040 Speaker 3: speeding up or slowing down, you know, following the shape 1284 01:13:35,040 --> 01:13:37,880 Speaker 3: of a channel and obstacles within it, or speeding up 1285 01:13:37,920 --> 01:13:42,719 Speaker 3: going down a slope, ice flowing through a channel tends 1286 01:13:42,760 --> 01:13:46,559 Speaker 3: to succumb to brittle fracture when it is deformed, and 1287 01:13:46,640 --> 01:13:49,599 Speaker 3: this can happen for a number of reasons.
Going around 1288 01:13:49,640 --> 01:13:53,960 Speaker 3: bends or obstacles in the underlying terrain, changes in the 1289 01:13:54,000 --> 01:13:58,479 Speaker 3: direction or shape or speed of the glacier's flow, you know, 1290 01:13:58,560 --> 01:14:01,320 Speaker 3: so like a change in the slope of the ground 1291 01:14:01,360 --> 01:14:05,280 Speaker 3: beneath the glacier will cause it to flow faster, and 1292 01:14:05,400 --> 01:14:08,200 Speaker 3: that causes the glacier to stretch, and then it forms 1293 01:14:08,240 --> 01:14:11,120 Speaker 3: cracks in the upper part of the ice. You can 1294 01:14:11,160 --> 01:14:15,080 Speaker 3: also see these chaotic distributions of crevasses emerging in places 1295 01:14:15,120 --> 01:14:18,439 Speaker 3: where the glacier is stretched out horizontally. This is a 1296 01:14:18,520 --> 01:14:21,320 Speaker 3: rough analogy. I don't think it's perfect, but this kind 1297 01:14:21,320 --> 01:14:25,120 Speaker 3: of makes sense. Anything that might cause turbulence in the 1298 01:14:25,160 --> 01:14:28,599 Speaker 3: flow of liquid water through a space would have the 1299 01:14:28,600 --> 01:14:32,080 Speaker 3: potential to cause crevasses to form in a glacier flowing 1300 01:14:32,080 --> 01:14:34,920 Speaker 3: through that space. And then beyond that you've got the 1301 01:14:35,000 --> 01:14:39,240 Speaker 3: question of how do those devious snow bridges form? Usually 1302 01:14:39,280 --> 01:14:42,320 Speaker 3: this seems to happen from snow drift. So snow is 1303 01:14:42,360 --> 01:14:47,040 Speaker 3: being driven horizontally by wind, and this snow sticks to 1304 01:14:47,280 --> 01:14:50,200 Speaker 3: the ice on the sides of the crack in the glacier, 1305 01:14:50,680 --> 01:14:53,840 Speaker 3: and the snow adheres and piles up and up until 1306 01:14:53,920 --> 01:14:56,720 Speaker 3: the top of the crack is covered completely.
But sometimes, 1307 01:14:56,760 --> 01:14:59,320 Speaker 3: you know, it can be snow filling 1308 01:14:59,360 --> 01:15:01,680 Speaker 3: in the crack, so maybe it does go all 1309 01:15:01,720 --> 01:15:04,280 Speaker 3: the way down to the bottom, but it's just, like, 1310 01:15:04,439 --> 01:15:07,880 Speaker 3: loosely packed snow, or maybe it actually just forms a 1311 01:15:07,920 --> 01:15:10,719 Speaker 3: bridge over the top layer of the crack, and it's 1312 01:15:10,800 --> 01:15:12,720 Speaker 3: just an open drop below that. 1313 01:15:13,760 --> 01:15:18,320 Speaker 2: Oh, you included a photograph here in our outline. I 1314 01:15:18,400 --> 01:15:22,400 Speaker 2: encourage listeners to look for such images as well, because, man, 1315 01:15:22,479 --> 01:15:24,479 Speaker 2: you look in the background of this image. Like, 1316 01:15:24,479 --> 01:15:27,799 Speaker 2: in the foreground we see where the crack has been revealed, 1317 01:15:27,800 --> 01:15:33,000 Speaker 2: and there's some individuals traversing it, leaping over it. But 1318 01:15:33,120 --> 01:15:36,360 Speaker 2: in the background, like, the same crack continues but is 1319 01:15:36,400 --> 01:15:38,240 Speaker 2: covered in snow, and at least in my eye, it 1320 01:15:38,360 --> 01:15:39,480 Speaker 2: just looks like a snowfield. 1321 01:15:39,680 --> 01:15:42,439 Speaker 3: Yeah. Yeah, unless you really know what to look for, 1322 01:15:42,600 --> 01:15:44,519 Speaker 3: you wouldn't see it at all. And even some people 1323 01:15:44,520 --> 01:15:47,360 Speaker 3: who know what to look for might not catch it. Yeah, 1324 01:15:47,400 --> 01:15:50,759 Speaker 3: there's another really cool photo I came across regarding snow bridges, 1325 01:15:50,840 --> 01:15:53,320 Speaker 3: but from the other angle. So this was on a 1326 01:15:53,320 --> 01:15:57,639 Speaker 3: website called Antarctic Glaciers dot org.
This was a blog 1327 01:15:57,680 --> 01:16:02,040 Speaker 3: post by a glaciologist named Bethan Davies of Newcastle University 1328 01:16:02,080 --> 01:16:06,280 Speaker 3: in the UK, and the post is describing a little 1329 01:16:06,560 --> 01:16:10,000 Speaker 3: expedition where a group of researchers explored the inside of 1330 01:16:10,000 --> 01:16:13,960 Speaker 3: a fairly deep crevasse on the glacier behind Rothera Research 1331 01:16:14,040 --> 01:16:19,120 Speaker 3: Station on the Antarctic Peninsula. And so the small group 1332 01:16:19,160 --> 01:16:21,800 Speaker 3: of people, they go down into the crevasse. They, like, 1333 01:16:21,840 --> 01:16:24,160 Speaker 3: have to descend through a hole, and then they're going 1334 01:16:24,200 --> 01:16:27,200 Speaker 3: into this covered part of the crevasse, so it feels 1335 01:16:27,280 --> 01:16:30,400 Speaker 3: like an ice cave. It has all these icicles everywhere, 1336 01:16:30,439 --> 01:16:33,040 Speaker 3: and some parts are more white and other parts 1337 01:16:33,080 --> 01:16:36,479 Speaker 3: have more blue ice. So it's beautiful looking down at 1338 01:16:36,520 --> 01:16:40,360 Speaker 3: the cave part. But then there is one photo you 1339 01:16:40,360 --> 01:16:43,360 Speaker 3: can see if you scroll down, Rob, where the camera 1340 01:16:43,400 --> 01:16:46,679 Speaker 3: is positioned looking up toward the surface in the crevasse, 1341 01:16:47,160 --> 01:16:49,640 Speaker 3: where the gap is covered by a snow bridge, so no 1342 01:16:49,760 --> 01:16:53,160 Speaker 3: sky is visible, but you can see deep blue light 1343 01:16:53,439 --> 01:16:56,600 Speaker 3: bleeding through the thinnest parts of the snow cover in 1344 01:16:56,640 --> 01:16:59,920 Speaker 3: the gap, and it looks like Cherenkov radiation around a 1345 01:17:00,120 --> 01:17:01,040 Speaker 3: nuclear reactor. 1346 01:17:01,280 --> 01:17:02,960 Speaker 2: Oh yeah, it's absolutely haunting.
1347 01:17:03,560 --> 01:17:08,200 Speaker 3: So I formally submit glacier crevasses as one of the 1348 01:17:08,640 --> 01:17:12,960 Speaker 3: true real life horror stories of Mother Nature, of planet Earth. 1349 01:17:13,840 --> 01:17:17,880 Speaker 3: Beautiful in some ways, very very interesting to think about 1350 01:17:18,200 --> 01:17:21,639 Speaker 3: how they form and the power implied and the way 1351 01:17:21,680 --> 01:17:25,639 Speaker 3: glaciers flow, everything, you know, all of the strange 1352 01:17:25,680 --> 01:17:28,160 Speaker 3: processes that we don't really think about or are hard 1353 01:17:28,160 --> 01:17:31,880 Speaker 3: for us to picture that lead to their creation. But 1354 01:17:31,920 --> 01:17:34,840 Speaker 3: then also just when you're actually faced with one, how 1355 01:17:35,080 --> 01:17:37,840 Speaker 3: frightening it could be, the idea of just plunging one 1356 01:17:37,880 --> 01:17:41,360 Speaker 3: hundred feet down into a narrow gap in the ice. 1357 01:17:41,960 --> 01:17:46,200 Speaker 2: Yeah. Yeah, they're absolutely horrifying just in our imagination. 1358 01:17:46,640 --> 01:17:49,360 Speaker 2: And when you dig into the history, we've, the humans, 1359 01:17:49,479 --> 01:17:52,960 Speaker 2: human explorers, have had horrifying encounters with them. We're not 1360 01:17:53,000 --> 01:17:56,160 Speaker 2: going to go through all of them. But you know, 1361 01:17:56,160 --> 01:17:57,360 Speaker 2: I want to come back to what we were talking 1362 01:17:57,360 --> 01:17:59,640 Speaker 2: about earlier, like does it make sense to rescue a 1363 01:17:59,720 --> 01:18:04,040 Speaker 2: dog from a crevasse? Like is this a sensible move 1364 01:18:04,080 --> 01:18:06,360 Speaker 2: on the part of our protagonist, or is this the 1365 01:18:06,360 --> 01:18:11,040 Speaker 2: protagonist being dumb or making that extra risky horror story decision? 
1366 01:18:11,720 --> 01:18:16,240 Speaker 2: And again, I'm very touched by how important these dogs 1367 01:18:16,280 --> 01:18:19,280 Speaker 2: were to the people on these expeditions, both in terms 1368 01:18:19,280 --> 01:18:21,679 Speaker 2: of the practical necessity of having them and then also 1369 01:18:21,720 --> 01:18:26,400 Speaker 2: the companionship. But yeah, there's some historic precedent for this 1370 01:18:26,439 --> 01:18:29,200 Speaker 2: as well. For example, during the British Terra Nova expedition 1371 01:18:29,280 --> 01:18:32,920 Speaker 2: that we mentioned already, to Antarctica from nineteen ten through 1372 01:18:33,000 --> 01:18:37,559 Speaker 2: nineteen thirteen. There are multiple accounts of this, but one 1373 01:18:37,640 --> 01:18:40,519 Speaker 2: quick account that I ran across was a twenty twenty 1374 01:18:40,560 --> 01:18:43,879 Speaker 2: three article from the US Naval Institute, Heroism and Betrayal 1375 01:18:43,920 --> 01:18:49,400 Speaker 2: in Antarctica by Karen May, and she points to this 1376 01:18:49,920 --> 01:18:54,400 Speaker 2: known clash between Cecil Meares and the expedition leader, Robert 1377 01:18:54,479 --> 01:18:59,719 Speaker 2: Falcon Scott. They clashed on numerous occasions, including when Meares 1378 01:19:00,040 --> 01:19:03,320 Speaker 2: refused Scott's order to rescue some fallen sled dogs from 1379 01:19:03,320 --> 01:19:07,040 Speaker 2: a crevasse. Scott ended up entering the crevasse himself 1380 01:19:07,120 --> 01:19:08,200 Speaker 2: to rescue the dogs. 1381 01:19:08,400 --> 01:19:08,759 Speaker 3: Wow. 1382 01:19:09,200 --> 01:19:10,840 Speaker 2: And I can only imagine this and some of these 1383 01:19:10,840 --> 01:19:13,439 Speaker 2: other tales were part of the inspiration and part of 1384 01:19:13,479 --> 01:19:16,800 Speaker 2: the research for the horror story we're talking about. 
Yeah, 1385 01:19:17,160 --> 01:19:20,240 Speaker 2: surely there are also accounts of not only dogs, but 1386 01:19:20,320 --> 01:19:24,080 Speaker 2: whole sledges and explorers being swallowed up by these as well. 1387 01:19:24,680 --> 01:19:28,600 Speaker 2: One of these occurred during the Australasian Antarctic Expedition of 1388 01:19:28,680 --> 01:19:33,120 Speaker 2: nineteen eleven through nineteen fourteen. This was led by Douglas Mawson, 1389 01:19:33,479 --> 01:19:36,559 Speaker 2: who wrote about his experiences later in the book Home 1390 01:19:36,640 --> 01:19:39,080 Speaker 2: of the Blizzard, which I believe was published under some other 1391 01:19:39,120 --> 01:19:43,040 Speaker 2: titles later on as well. But here's an excerpt about 1392 01:19:43,080 --> 01:19:46,160 Speaker 2: a tragic event at a crevasse which swallowed up expedition 1393 01:19:46,240 --> 01:19:51,519 Speaker 2: member Belgrave Edward Sutton Ninnis, his sledge, and dogs in 1394 01:19:51,640 --> 01:19:55,719 Speaker 2: nineteen twelve: Frantically waving to Mertz to bring up my sledge, 1395 01:19:55,800 --> 01:19:58,880 Speaker 2: upon which there was some alpine rope, I leaned over 1396 01:19:58,960 --> 01:20:02,960 Speaker 2: and shouted into the dark depths below. No sound came back, 1397 01:20:03,000 --> 01:20:05,439 Speaker 2: but the moaning of a dog caught on a shelf 1398 01:20:05,560 --> 01:20:08,439 Speaker 2: just visible one hundred and fifty feet below. The poor 1399 01:20:08,479 --> 01:20:10,400 Speaker 2: creature appeared to have broken its back, for it was 1400 01:20:10,439 --> 01:20:12,920 Speaker 2: attempting to sit up with the front part of its body, 1401 01:20:12,960 --> 01:20:16,759 Speaker 2: while the hinder portion lay limp. 
Another dog lay motionless 1402 01:20:16,800 --> 01:20:19,719 Speaker 2: by its side, close by what appeared in the gloom 1403 01:20:19,840 --> 01:20:22,520 Speaker 2: to be the remains of the tent and a canvas 1404 01:20:22,560 --> 01:20:26,519 Speaker 2: tank containing food for three men for a fortnight. We 1405 01:20:26,600 --> 01:20:29,200 Speaker 2: broke back the edge of the névé, and that's another term, 1406 01:20:29,240 --> 01:20:31,719 Speaker 2: but they were referring to the crevasse here, I believe, 1407 01:20:32,080 --> 01:20:35,839 Speaker 2: and took turns, leaning over, secured by a rope, calling 1408 01:20:35,880 --> 01:20:38,479 Speaker 2: into the darkness in the hope that our companion might 1409 01:20:38,560 --> 01:20:42,240 Speaker 2: be still alive. For three hours we called unceasingly, but 1410 01:20:42,360 --> 01:20:45,479 Speaker 2: no answering sound came back. The dog had ceased to 1411 01:20:45,479 --> 01:20:48,320 Speaker 2: moan and lay without a movement. A chill draft was 1412 01:20:48,360 --> 01:20:51,000 Speaker 2: blowing out of the abyss. We felt that there was 1413 01:20:51,080 --> 01:20:54,839 Speaker 2: little hope. Why had the first sledge escaped the crevasse? 1414 01:20:55,439 --> 01:20:58,040 Speaker 2: It seemed that I had been fortunate because my sledge 1415 01:20:58,040 --> 01:21:01,000 Speaker 2: had crossed diagonally, with a greater chance of breaking the 1416 01:21:01,040 --> 01:21:04,439 Speaker 2: snow lid. The sledges were within thirty pounds of the 1417 01:21:04,479 --> 01:21:08,559 Speaker 2: same weight. The explanation appeared to be that Ninnis had 1418 01:21:08,600 --> 01:21:11,599 Speaker 2: walked by the side of his sledge, whereas I had 1419 01:21:11,600 --> 01:21:14,920 Speaker 2: crossed it sitting on the sledge. 
The whole weight of 1420 01:21:14,960 --> 01:21:18,599 Speaker 2: a man's body bearing on his foot is a formidable load, 1421 01:21:18,920 --> 01:21:21,160 Speaker 2: and no doubt was sufficient to smash the arch of 1422 01:21:21,200 --> 01:21:24,280 Speaker 2: the roof. By means of a fishing line, we ascertained 1423 01:21:24,280 --> 01:21:26,880 Speaker 2: that it was one hundred and fifty feet sheer to 1424 01:21:26,960 --> 01:21:30,240 Speaker 2: the ledge on which the remains were seen. On either side, 1425 01:21:30,240 --> 01:21:33,840 Speaker 2: the crevasse descended into blackness. It seemed so very far 1426 01:21:34,000 --> 01:21:36,960 Speaker 2: down there, and the dogs looked so small that we 1427 01:21:37,000 --> 01:21:40,000 Speaker 2: got out the field glasses but could make out nothing more 1428 01:21:40,040 --> 01:21:43,360 Speaker 2: by their aid. All our available rope was tied together, 1429 01:21:43,400 --> 01:21:46,160 Speaker 2: but the total length was insufficient to reach the ledge, 1430 01:21:46,520 --> 01:21:49,080 Speaker 2: and any idea of going below to investigate and to 1431 01:21:49,120 --> 01:21:51,960 Speaker 2: secure some of the food had to be abandoned. 1432 01:21:52,479 --> 01:21:53,439 Speaker 3: That is chilling. 1433 01:21:53,640 --> 01:21:57,559 Speaker 2: That is absolutely chilling, like the idea that the scale 1434 01:21:58,120 --> 01:22:02,840 Speaker 2: of the crevasse was beyond, and they're not even drawing 1435 01:22:02,840 --> 01:22:05,400 Speaker 2: their imagination into this, but just beyond the abilities of 1436 01:22:05,400 --> 01:22:06,639 Speaker 2: their equipment to even reach. 1437 01:22:18,080 --> 01:22:22,719 Speaker 3: You want another horror story from that book by Cherry-Garrard? Sure. Okay, 1438 01:22:23,000 --> 01:22:25,400 Speaker 3: and this actually brings us back to a couple 1439 01:22:25,439 --> 01:22:27,719 Speaker 3: of characters. 
You were just talking about Scott and Meares 1440 01:22:28,400 --> 01:22:33,880 Speaker 3: on that expedition. So Cherry-Garrard is writing about a 1441 01:22:33,880 --> 01:22:37,320 Speaker 3: section where they're crossing some ice and says, quote, we 1442 01:22:37,479 --> 01:22:40,360 Speaker 3: ran level for another two miles, Meares and Scott on 1443 01:22:40,400 --> 01:22:44,519 Speaker 3: our left. We were evidently crossing many crevasses. Quite suddenly 1444 01:22:44,560 --> 01:22:48,559 Speaker 3: we saw the dogs of their team disappearing, following one another, 1445 01:22:48,840 --> 01:22:51,960 Speaker 3: just like dogs going down a hole after some animal. 1446 01:22:52,840 --> 01:22:56,040 Speaker 3: In a moment, wrote Scott, the whole team were sinking. 1447 01:22:56,439 --> 01:22:58,880 Speaker 3: Two by two we lost sight of them, each pair 1448 01:22:59,000 --> 01:23:03,000 Speaker 3: struggling for foothold. Osman, the leader, exerted all his strength 1449 01:23:03,040 --> 01:23:06,240 Speaker 3: and kept foothold. It was wonderful to see him. The 1450 01:23:06,280 --> 01:23:09,960 Speaker 3: sledge stopped and we leapt aside. The situation was clear 1451 01:23:10,000 --> 01:23:13,280 Speaker 3: in another moment: we had actually been traveling along the 1452 01:23:13,280 --> 01:23:16,880 Speaker 3: bridge or snow covering of a crevasse. The sledge had 1453 01:23:16,960 --> 01:23:20,320 Speaker 3: stopped on it, whilst the dogs hung in their harness 1454 01:23:20,400 --> 01:23:24,240 Speaker 3: in the abyss, suspended between the sledge and the leading dog. 1455 01:23:24,840 --> 01:23:27,720 Speaker 3: Why the sledge and ourselves didn't follow the dogs we 1456 01:23:27,760 --> 01:23:31,799 Speaker 3: shall never know. Oh wow, bridge of dogs. 
1457 01:23:32,560 --> 01:23:34,920 Speaker 2: So yeah, in all these accounts we've been 1458 01:23:35,200 --> 01:23:38,639 Speaker 2: looking at here concerning the crevasse, it sounds 1459 01:23:38,640 --> 01:23:41,240 Speaker 2: to me like, yeah, you could basically just completely pass 1460 01:23:41,280 --> 01:23:44,439 Speaker 2: over one of these snow bridges and you would never 1461 01:23:44,600 --> 01:23:47,720 Speaker 2: know that, like, a one hundred and fifty foot drop was 1462 01:23:47,800 --> 01:23:50,439 Speaker 2: just waiting there for you, and you just happened to 1463 01:23:50,880 --> 01:23:53,320 Speaker 2: not trigger it, and then the next one could get you. 1464 01:23:53,560 --> 01:23:56,559 Speaker 3: Yeah. They don't always collapse. So yeah, very likely you 1465 01:23:56,680 --> 01:24:01,320 Speaker 3: have been over crevasses and crossed a snowbridge without realizing it. Wow, 1466 01:24:01,680 --> 01:24:03,960 Speaker 3: I mean, not you, the person listening, or you, Rob, 1467 01:24:04,600 --> 01:24:07,120 Speaker 3: but the Antarctic explorer, the glacial mountaineer. 1468 01:24:07,760 --> 01:24:10,320 Speaker 2: Yeah. Though obviously if we have listeners out there who 1469 01:24:10,360 --> 01:24:14,040 Speaker 2: have any experience with crevasses of one form or another, 1470 01:24:14,120 --> 01:24:16,200 Speaker 2: definitely write in and tell us all about it. 1471 01:24:16,600 --> 01:24:21,360 Speaker 3: Absolutely. Yeah, so, scariest non speculative thing I've thought about 1472 01:24:21,360 --> 01:24:21,800 Speaker 3: in a while. 1473 01:24:24,320 --> 01:24:26,559 Speaker 2: Yeah, this is a great story. Again, 
I love its 1474 01:24:26,960 --> 01:24:31,600 Speaker 2: use of history and an extreme real world environment, but 1475 01:24:31,760 --> 01:24:36,360 Speaker 2: also involving this wonderful speculative element, you know, bringing in 1476 01:24:36,560 --> 01:24:42,840 Speaker 2: that mythos flavor, hinting at this otherworldly Cyclopean architecture 1477 01:24:42,920 --> 01:24:45,800 Speaker 2: in the deep and so forth. Just a just a 1478 01:24:45,840 --> 01:24:47,160 Speaker 2: wonderful little short story. 1479 01:24:47,880 --> 01:24:49,920 Speaker 3: To end on a positive note, I would say I 1480 01:24:49,960 --> 01:24:53,439 Speaker 3: am really inspired by these stories of crevasse rescues as well. 1481 01:24:54,160 --> 01:24:56,920 Speaker 2: Yeah, yeah, I mean, they're not all doom and gloom. Like, 1482 01:24:56,920 --> 01:25:00,840 Speaker 2: there are accounts certainly of individuals, you know, successfully being 1483 01:25:02,400 --> 01:25:05,760 Speaker 2: rescued from crevasses, dogs being rescued from crevasses. And I 1484 01:25:05,800 --> 01:25:07,439 Speaker 2: don't know if I mentioned it already, but dogs are 1485 01:25:07,479 --> 01:25:12,080 Speaker 2: also very useful in crevasse rescue. All right, well, we're 1486 01:25:12,120 --> 01:25:14,479 Speaker 2: going to go ahead and close the book here on 1487 01:25:14,720 --> 01:25:17,840 Speaker 2: Grimoire of Horror Volume Two. 
We hope that everyone out 1488 01:25:17,880 --> 01:25:21,800 Speaker 2: there enjoyed our discussion of these two short stories, and 1489 01:25:22,080 --> 01:25:24,040 Speaker 2: certainly we would love to hear from everyone out there 1490 01:25:24,040 --> 01:25:27,479 Speaker 2: if you have thoughts on The Crevasse or on I 1491 01:25:27,520 --> 01:25:30,759 Speaker 2: Have No Mouth, and I Must Scream, or if you have thoughts 1492 01:25:30,800 --> 01:25:33,599 Speaker 2: on the authors involved in these tales or other works 1493 01:25:33,600 --> 01:25:36,920 Speaker 2: that they wrote. Write in. We'd love to hear from you, 1494 01:25:37,240 --> 01:25:38,960 Speaker 2: and we should also go ahead and point out, hey, 1495 01:25:39,120 --> 01:25:41,880 Speaker 2: the next Halloween is just around the corner, so if 1496 01:25:41,920 --> 01:25:44,640 Speaker 2: you have suggestions for next year, if you're like, you 1497 01:25:44,720 --> 01:25:47,280 Speaker 2: two should totally read this story and this story by 1498 01:25:47,320 --> 01:25:49,400 Speaker 2: this author and this author, go ahead and write in. 1499 01:25:49,439 --> 01:25:53,800 Speaker 2: We would love suggestions. You know, any head start we can 1500 01:25:53,800 --> 01:25:56,280 Speaker 2: get on our selection process is always a good thing. 1501 01:25:57,840 --> 01:25:59,560 Speaker 2: Just a reminder that Stuff to Blow Your Mind is 1502 01:25:59,600 --> 01:26:02,400 Speaker 2: primarily a science and culture podcast, with core episodes on 1503 01:26:02,439 --> 01:26:05,080 Speaker 2: Tuesdays and Thursdays, short form episodes on Wednesdays, and then 1504 01:26:05,120 --> 01:26:07,800 Speaker 2: on Fridays we set aside most serious concerns to just 1505 01:26:07,840 --> 01:26:10,640 Speaker 2: talk about a weird film on Weird House Cinema. 1506 01:26:11,080 --> 01:26:14,719 Speaker 3: Huge thanks as always to our excellent audio producer JJ Posway. 
1507 01:26:15,080 --> 01:26:16,680 Speaker 3: If you would like to get in touch with us 1508 01:26:16,720 --> 01:26:19,240 Speaker 3: with feedback on this episode or any other, to suggest 1509 01:26:19,320 --> 01:26:21,400 Speaker 3: a topic for the future, or just to say hello, 1510 01:26:21,560 --> 01:26:24,200 Speaker 3: you can email us at contact at stuff to blow 1511 01:26:24,200 --> 01:26:32,920 Speaker 3: your mind dot com. 1512 01:26:33,040 --> 01:26:35,960 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 1513 01:26:36,040 --> 01:26:38,840 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, 1514 01:26:39,000 --> 01:27:00,879 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.