Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and it's Saturday, time to go into the vault for a classic episode of Stuff to Blow Your Mind. Robert, I think you must have gotten into a monster place with your brain. Was it a Bandersnatch? It was. It was exactly that, because we just did a new episode about Bandersnatch, the Black Mirror episode, and that brought to mind the Great Basilisk that we discussed on the show previously, where we talk a little bit about the mythology of the basilisk, but mostly we talked about a particularly horrifying technological concept. Right. This episode originally aired in October of two thousand and eighteen, so we hope you enjoy this classic episode of Stuff to Blow Your Mind.

The podcast you are about to listen to contains a Class Four information hazard. Some listeners may experience prolonged bouts of fear, waking anxiety, or nightmares of eternal torture in the cyber dungeons of the Great Basilisk, attended to by the Losing Black and the Thirteen Children of the Flame. Also appetite loss and constipation. Proceed with caution.

Welcome to Stuff to Blow Your Mind, from HowStuffWorks.com.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. And since it's October, we are of course still exploring monsters, terrifying ideas, and so forth, and boy, have we got one for you today. I just want to issue a warning right at the beginning here that today's episode is going to concern something that a few people would consider a genuine information hazard, as in an idea that is itself actually dangerous. Now, having looked into it, I don't think that is the case. I don't think this episode will hurt you. But just a warning.
If you think you're susceptible to terror, nightmares, or something when presented with a thought experiment, or the possibility of being, say, sent to a literal hell created by technology, and you think that idea could infect you, could make you afraid, this might not be the episode for you. Right. But then again, I assume if you're a listener to Stuff to Blow Your Mind, you've probably already encountered some thought hazards on here, and you've survived those. Generally speaking, I have faith in you to survive this one. However, if you are going to take either of our warnings seriously, I will let you know that the first section of this podcast is going to deal with the mythical basilisk, the folkloric basilisk, and some of the monstrous fun to be had there, before we explore the idea of Roko's basilisk. And in that we're gonna be talking about this idea that emerges where technological singularity navel-gazing, thought experimentation, a little dash of creepypasta, and some good old-fashioned supernatural thinking all converge into this kind of nightmare scenario.

Now, as we said, this idea is believed by some to be a genuinely dangerous idea, and that even learning about it could put you at some kind of risk. I think there are strong reasons to believe that this is not the case, and that thinking about this idea will not put you at risk. But again, if you're concerned, you should stop listening now, or stop listening after we stop talking about the mythical basilisk. Now, I just want to say at the beginning: listeners have suggested we talk about Roko's basilisk before, this idea that is at least purportedly an information hazard, a dangerous idea, and I've hesitated to do it, not because I think it's particularly plausible, but just because, you know, I wonder what is the level of risk that you should tolerate when propagating an idea.
If you think an idea is unlikely but maybe has a zero point zero zero zero zero one percent chance of causing enormous harm to the person you tell it to, should you say the idea or not? I don't know. I feel like people generally don't exercise that kind of caution when they're sharing links with you. Like sometimes they'll say, not safe for work, but then you click on it anyway, and then sometimes you're like, oh, I wish I had not seen that, or I wish I had not read that, and now that's in my head. Now that's in my head forever.

Well, one of the problems with this idea, whatever you think about whether or not you should discuss ideas that may be dangerous to hear in some extremely unlikely off chance, part of the problem is what happens when those ideas are just already set loose in society. I mean, now people on television shows and all over the internet are talking about this idea. There are a bunch of articles out about it. So it's not like you can keep the cat in the bag at this point. Right, this Roko's Basilisk has already been a gag point on the hit HBO show Silicon Valley, which is a fabulous show, and I love the way that they treated Roko's Basilisk on it. But yeah, if they're covering it, there's no danger in us covering it too. That's the way I look at it. Right, and at least I would hope that the way we cover it can give you some reasons to think you should not be afraid of digital hell, and also to think about the general class of what should be done about something that could have in fact been a real information hazard in some other case. So that's our whole preamble before we get to that section. But before we get to that section, we're gonna be talking about basilisks today. Boy, is the basilisk a great monster.
Yes, also known as the basilcock, the basilicok, the basiliscock, basically any version of basil and cock that you can slam together; it has been referred to as such at some point in its history. Now, a lot of people, I think, probably encountered a version of the basilisk from Harry Potter. But Robert, I know that was not your entryway, right? I encountered it for the first time, I believe, in Dungeons and Dragons, of course, because it's a multi-legged reptile with a petrifying gaze. Say that again? A multi-legged reptile with a petrifying gaze, a petrifying gaze to turn you to stone. Yeah, and if I recall correctly, it has some cool biology where it turns you to stone and then, like, busts you into pieces, then it eats the stone pieces, but then its stomach turns the stone back into flesh. And so if you get the stomach juices from a basilisk, then you can use it to undo petrification spells, that sort of thing. It's a lot of fun, arguably more fun than the basilisk is at times in the folklore tradition, because if you, like me, didn't grow up hearing about the basilisk, part of it is because there aren't really any great stories about slaying the basilisk, no hero's great ordeal. Oh yeah, I at least have not come across that. Yeah, it really helps, like with the hydra. I mean, the hydra is arguably so much cooler, and then also it's one of the labors of Hercules, and there's a cool story about how they defeat it. Well, maybe it's because there is no way to defeat the basilisk short of, say, weasel effluvium, which we will get to. Yeah, don't give it away. I'm sorry. Should we edit that out? No, no, we should leave it. It's a thought hazard. Any basilisks listening.
Still, there's so much more to the basilisk than just this cool D&D creature, because it's not just a monster. It's not just something you encounter in the dungeon. It is a king. Oh, I see. Now, I know you've made note of the fact that Borges mentions the basilisk in his Book of Imaginary Beings. He does. Now, he translates it as meaning little king, which I like. When I was reading Carol Rose, she points out that the name stems from the Greek basileus, which means king, so king or little king. I tend to like the little king translation because I feel like it ties in better with what we're going to discuss. Well, the ancient bestiary ideas of the basilisk, I believe, say that it's not that big, right? It's pretty small. Yeah, yeah. Now the king part, though, refers to a crest, or a crown-like protrusion, that is on the creature's head, and in some depictions it's no mere biological ornament but an actual regal crown. It means something. Yeah.

Now, the descriptions vary greatly, and it emerges largely from European and Middle Eastern legend and folklore, from ancient times to roughly the seventeenth century, and that's when the basilisk became less popular. In the earlier descriptions, though, it is indeed small, and it's just a grass snake, only it has a crown-like crest on its head and has this weird practice of floating above the ground, like, vertically erect. That's creepy, of course. And then again, that does play on, sometimes when you see snakes rise up out of a coil, it can be startling how high they rise. Yeah, I feel like I've grown up seeing images and videos of cobras doing their dance, so I've kind of lost any appreciation for how bizarre that is to look at. You know, if you're used to seeing a snake slither, to see it stand up and, you know, rear back and flare its hood. Absolutely. Yeah.
So the basilisk is said to be the king of the reptiles. But, you know, don't be so foolish as to think that only its bite is lethal, like some of our venomous snakes. Now, every aspect of the basilisk is said to just reek of venom and death, every aspect. If you touch it, if you inhale its breath, if you gaze upon it at all, then you will die. Wait, what about its saliva? Yep, saliva, blood, smell, gaze. Presumably, I didn't see any reference to it, but presumably its urine, its excrement. I mean, its excrement has to be poisonous. The excrement of a basilisk, it sounds absolutely deadly. Wouldn't it be a great inversion if its excrement was the only good part about it? Maybe it can heal your warts. Yeah. And one thing that Carol Rose pointed out in her entry about the basilisk in one of her monster encyclopedias, she said that when it's not killing everything in its path just via the, you know, the audacity of its existence, it would actually spit venom at birds flying overhead and bring them down, to eat them, or just out of spite. I get the idea, just out of spite, you know. It's just spiteful death, that's all it is.

Okay, so where do I find a basilisk? Well, in the desert, of course. But it's more accurate to say that the desert is not merely the place where it lives, but it is the place that it makes by living. Like, everything in its path dies, and therefore the desert is the result of the basilisk. And there's actually a wonderful description of the basilisk that comes to us from Pliny the Elder in his Natural History. Man, we've been hitting Pliny a lot lately. I guess we've been talking about monsters, huh. If you're talking about monsters, especially ancient monsters, you know he's one of the great sources to turn to. So, Joe, would you read to us from the Natural History? Oh, absolutely.
There is the same power also in the serpent called the basilisk. It is produced in the province of Cyrene, which, that is the area to the west of Egypt, it's like Libya, I think there's a settlement known as Cyrene, being not more than twelve fingers in length. Is that fingers longways or fingers sideways? Oh, either way you cut it, it's not a huge creature. It has a white spot on the head, strongly resembling a sort of diadem. When it hisses, all the other serpents fly from it, and it does not advance its body, like the others, by a succession of folds, but moves along upright and erect upon the middle. It destroys all shrubs, not only by its contact, but those even that it has breathed upon. It burns up all the grass, too, and breaks the stones, so tremendous is its noxious influence. It was formerly a general belief that if a man on horseback killed one of these animals with a spear, the poison would run up the weapon and kill not only the rider but the horse as well.

Oh man, I love that. So its blood is like a xenomorph's blood, right? Or kind of, it reminds me too of Grendel's blood, that was said to melt the weapon that Beowulf used against it. But it's worse than that. It doesn't just get the weapon, it gets the person holding the weapon and the horse that person is touching. I know, it feels unfair that the horse is roped into this as well. Yeah, the horse didn't even sign up for going to fight a basilisk. It's just trying to get some oats. But furthermore, what is this horse rider doing out in the wasteland of Cyrene trying to kill a basilisk? Well, lesson learned. Lesson learned. Now, the basilisk becomes a popular creature, and even though the basilisk itself doesn't seem to have been mentioned in the Bible, it ends up being roped into it via translations. Oh yeah, it's kind of like the unicorn, actually. Yeah, exactly.
Yeah, we discussed in our episode on unicorns how there were words in the Bible that have been translated, say in the King James translation of the Bible, into unicorn, because the translators didn't know what the word referred to. We think now that the word probably referred to the aurochs, an extinct bovine creature that once lived around the ancient Mediterranean. Yeah, so you see the basilisk pop up in certain translations of the Book of Jeremiah, the Book of Psalms, where it's associated with the devil or evil, and nothing short of the coming of the Messiah can hope to end its rule. Well, have you got a quote for me? Yes, there's one translation of Psalms, this is the Brenton Septuagint translation, quote, thou shalt tread on the asp and basilisk, and thou shalt trample on the lion and dragon.

Now, European bestiaries of the eleventh and twelfth centuries, they mostly maintained Pliny's description, but then they described a larger body. Essentially, the monster began to grow. We've got to beef this thing up here. Yeah, it ended up having spots and stripes, and a few other features were thrown in: fiery breath, a bellow that kills. Well, that only makes sense. If every other thing about it kills, it makes noise, that should kill things too. Also, the ability to induce hydrophobia madness. I found that interesting, because this clearly has to be a reference to the actual hydrophobia that is inherent in rabies. Yeah, the idea there being, in the later stages of a rabies infection, a person will often have difficulty swallowing, and so they're said to refuse drinking water. So Pliny has some additional information here about how you might deal with the basilisk. Okay, so I assume not ride up on a horse and stab it, right? Well, tell me, what is it?
To this dreadful monster the effluvium of the weasel is fatal, a thing that has been tried with success, for kings have often desired to see its body when killed; so true is it that it has pleased nature that there should be nothing without its antidote. The animal is thrown into the hole of the basilisk, which is easily known from the soil around it being infected. The weasel destroys the basilisk by its odor, but dies itself in the struggle of nature against its own self.

And John Bostock, who provided the translation of this, he adds that there's probably no foundation for this account of the action of the effluvium of the weasel upon the basilisk, or any other species of serpent. So this is letting us know that throwing the weasel in there to bleed on it or secrete fluids or whatever, that's not going to kill this mythical monster. This is interesting, though, because weasels, especially the stoat, were thought to be venomous, and it's worth noting that we do have some venomous mammals in the natural world, such as various shrews and even the slow loris, the only known venomous primate. I don't think I knew that the loris was venomous. Throw it into a hole with a basilisk and I'm betting on the loris.

But anyway, bestiaries of the time presented a few different ways that you could kill the basilisk. So the weasel's one. Always carry a weasel. Also, this one is a little more elegant: have a crystal globe with you to reflect its own petrifying gaze back upon the basilisk. So it's like Perseus and Medusa. Exactly, a mirror. Yeah, basically they just stole the idea from Medusa here. But then also, carry with you a cockerel, or a young rooster. The basilisk will become enraged by the bird's crown, the idea that this bird has a crown as well, and the basilisk will die from a lethal fit. That's a jealous king. Yeah, I believe a similar thing occurs when someone refuses to believe dinosaurs had feathers.
You know, how dare the bird rise above the mighty reptile, and then it just loses its mind and dies. We can only hope the producers of the Jurassic World movies avoid this fate. So you see the basilisk show up in a number of different writings. It's just kind of a common symbol, an idea that can be employed. And we even see it show up in the Parson's Tale, in Geoffrey Chaucer's Canterbury Tales. Yes, quote, these are the other five fingers which the devil uses to draw people towards him. The first is the lecherous glance of a foolish woman or a foolish man, a glance that kills just as the basilisk kills people just by looking at them, for the covetous glance reflects the intentions of the heart.

You know, this kind of thing is actually one of my favorite things about monsters, especially ancient and medieval monsters and so forth, is that they often aren't just, like, a large dangerous animal, but they embody some kind of value. They represent something. They give you something to compare other things to. They're very useful as a metaphor. And really, we see the same sort of thing with the basilisk. It becomes far less a situation where people are like, hey, you need to be careful because there's a basilisk in the desert, and more and more just a useful model, a useful, ridiculous idea that we can use to illustrate something that is presumably true about the world, and then ultimately it loses all meaning and just winds up in, you know, heraldry and decorations.

Now, as we've seen already, the basilisk has been through some transformations of form, and I assume those transformations must have somewhat continued as time goes by, and it transforms into this idea of the cockatrice, this rooster with a curling serpent's tail. In fact, if you go looking around for images of the basilisk, sometimes you will find this image instead.
You really will find this alongside all the other images. So again, a reptile-to-bird transformation that just must enrage those who oppose feathered dinosaurs. And it does feel like a shame, because we have this vile reptile and it becomes kind of a weirdo bird instead. And it's said that it's what happens when you have a seven-year-old chicken egg hatched by a toad. Undead avians. But it's also made deadlier in these newer versions, so now it has that poison blood power that Pliny describes. It also rots fruit and poisons water everywhere it goes, so it becomes this kind of embodiment of desolation and death, and the idea itself becomes popular. It influenced the naming of a Tudor cannon, like literally a cannon that shoots, called the basilisk, just because it's such a powerful weapon, we have to name it after this powerful, deadly monster. And it eventually got some of its reptilian features back. The artist Aldrovandi has this excellent depiction of it in the Natural History of Serpents and Dragons that gives it scales. In this it's like a fat, scaly reptile bird with eight rooster legs, which I just love. This will probably be the illustration for the episode on our website. But after that, the creature largely became just a part of European heraldry. It's just something you would see as a mere decoration or, occasionally, just a literary reference.

Now, one thing that we want to be careful about is that we should not confuse the basilisk of legend, the monster, with true, extant basilisk lizards, also known sometimes as the Jesus Christ lizard or the Jesus lizard for their ability to run across the surface of water without sinking for up to about four point five meters, or about fifteen feet. If you've ever seen video of this, it's really cool. How do they do that? I've often wondered. I didn't know until I looked it up for this episode.
Apparently what they've got is big feet and the ability to run very fast. And what happens is, when they run, they slap the water very hard with each downstroke of the foot, and it has to do with the way that the rapid motions of their feet create these air pockets around their feet as they move. I was reading an article in New Scientist where some researchers who were working on this problem said that in order for an eighty-kilogram, or hundred-and-seventy-five-pound, human to do this, you would have to run at about a hundred and eight kilometers per hour, about sixty-seven miles per hour, across the surface of the water. But anyway, you can find basilisk lizards in South America and Central America and Mexico, and as far as I know, they do not kill with a glance, and you cannot fight them with weasel effluvium.

All right, well, that pretty much wraps up the mythical, legendary, folkloric basilisk, its rise and eventual fall. But we're gonna take a break, and when we come back, we're going to get into this idea of Roko's Basilisk, the great basilisk, the once and perhaps future king.

All right, we're back. So, as we mentioned earlier, we're about to start discussing an idea that has been classed by some as something that could be an information hazard, an idea that simply by thinking about it you somehow increase the chance of harm to yourself. So just another warning that, again, I don't think that's the case, but if that kind of thing scares you, then perhaps you can tune out now. All right, for those of you who decided to stick around, let's proceed. So we're talking about Roko's Basilisk. This is an idea that goes back to around two thousand and ten, and it was proposed at the blog LessWrong by a user named Roko. Now, LessWrong,
I think, is a website, a community, that's associated with the rationalist movement somewhat, the rationalist movement being a movement that's concerned with trying to optimize thinking, like, to eliminate bias and error, but especially among people who, in this case, are concerned with the possibilities of a technological singularity and what all that means, and how risks can be avoided. And of course we've talked about this strain of thinking before. You know, we've introduced, I think, some skepticism about the idea of a technological singularity. I don't know fully yet how I come down on the dangers-of-AI debate, but I think it's at least something worth thinking about, worth taking seriously. Yeah, I mean, we've talked about, for instance, the work of Max Tegmark and his arguments about how we need to be concerned about building the right kind of AI, and we need to have serious discussions about it, not mere, you know, sci-fi dreams regarding it, or nightmares regarding it. You know, we need to think seriously about how we're developing our technology. Yeah, we've talked about, say, the work of Nick Bostrom before, and criticisms by people like Jaron Lanier. Yes.

But okay, give me the short version of the basilisk before we explain it a little more. Okay, so the idea here is that an AI superintelligence will emerge, an entity with just godlike technological powers. You know, you name it and it can do it, through its technological power, its interconnectedness. Basically, if it's physically possible, this computer can do it, right, or it'll send a drone to do it, or what have you. So, yeah, we've discussed this a bit in the podcast before, just the idea of, you know, if you have this future king, is it going to be good or is it gonna be bad? Is it going to be malevolent? Is it going to be ruthless in its ascension?
And that's the case with the basilisk, the idea that it is ruthless, that you are either with it or you are against it. And it actually doesn't have to be malicious. It could actually even be well meaning. It could just have ruthless tactics. Yes, yeah, that's also part of the argument. Like, yeah, it wants to bring the best good for all humanity, but to get there it'll do whatever it absolutely has to do, such as, you know, again, punishing anybody who stands against it, punishing even those who do not rise to support it. And that means demanding absolute devotion, not only in its future kingdom, but in the past that preceded it, in our world as well. In other words, it will punish people today who are not actively helping it come into being tomorrow, and even those who have died, it is said, or choose death by their own hand rather than succumb to the Great Basilisk, will be resurrected as digital consciousnesses and then tormented for all eternity in its dripping black cyber dungeons. Hail the Great Basilisk. All hail the Great Basilisk. All hail the Great Basilisk. Hail the Great Basilisk. Well, wait, what was that? I don't know. Right, did you hear that? Okay. All right, we'll just keep going then.

So calling it a basilisk here, invoking the mythological basilisk, is really a clever choice, because it takes it one step further. Not only to look at the basilisk, but just to think of the basilisk, is to invite death. Right. Merely to know about Roko's Basilisk is enough, according to the model that's presented here, to damn your digital soul to everlasting horror. And the only way to avoid such a fate, then, is to work in its favor, which, by the way, I think we're doing with this podcast. We're not. Well, I mean, I feel like we're giving lip service to the Great Basilisk just in case, you know, if the Great Basilisk rises to power.
Well, hey, we did that podcast, and we even had a shirt that says All Hail the Great Basilisk that's available on our t-shirt store, so, you know, we have our options covered here. But that's the idea in a nutshell: a future AI king will rise, and if you don't work to support it now, knowing that it is going to exist, then you will be punished for it.

So one of the principles underlying the idea of Roko's Basilisk is the idea of timeless decision theory, which, if you want a pretty simple, straightforward explanation of it, there is one in an article on Slate by David Auerbach called The Most Terrifying Thought Experiment of All Time. This, by the way, I would say, I don't totally endorse everything Auerbach says in that article. I mean, obviously that should be the case for any article we cite, but he does at least have a pretty clear and easy-to-understand explanation of how this works. Or, I don't know, would you agree, Robert, that it's at least somewhat easy to understand? Oh yes, I would. There's another piece, by the way, by Beth Singler in Aeon magazine called fAIth, that's faith in lowercase but with the AI capitalized. But anyway, Auerbach points out that much of the thought experiment is based in timeless decision theory, TDT, developed by LessWrong founder Eliezer Yudkowsky, based on the older thought experiment Newcomb's Paradox, from the late sixties and early seventies, attributed to theoretical physicist William Newcomb. Now, you might be wondering, who's this Yudkowsky guy? Is he just some user on a random website I've never heard of before today, or is he, like, a name in his field? And he has a name of note. He is also the founder of the Machine Intelligence Research Institute, and his idea of working toward a friendly AI is touted by many, including
Max Tegmark, who mentions him several times in his book Life 3.0, describing Yudkowsky as, quote, an AI safety pioneer. Yeah, I mean, in a weird way, he is a guy who posts on the internet, but he's a very influential one, especially among people who think about artificial intelligence. Yeah, I mean, ultimately, what are any of us but just people who post stuff on the internet, posts that will one day be read by the Great Basilisk.

So okay, so we'll try to explain the idea of timeless decision theory. So you start off with this idea of Newcomb's paradox. Right, right. And the paradox is essentially this: a super AI presents you with two boxes. One, you're told, contains a thousand dollars. That's box A. That's box A. Box B might contain one million dollars, or it might contain nothing. Right, and you're left with two options here. These are the options that are given to you. You can either pick both boxes, ensuring that you'll get at least one thousand dollars out of the deal, maybe that extra million too if it's in there. Or you can just pick box B, which means you could get a million dollars or you could have nothing. And I do want to add that just picking the thousand-dollar box is not an option here, because I was thinking about that too. Couldn't I just give the super AI the middle finger and say, I'm not playing your silly games, just give me my thousand dollars, or say, I choose nothing? Those are not options. You have to pick one of the two options given. But I mean, why wouldn't you also pick the second box if you might additionally get a million dollars? I don't know. I feel like when you get into a thought experiment like this, they kind of beg for those kinds of nitpicking answers, or at least I want to provide them.
Like any thought experiment, when it's presented, you can't help, on some level, but want to break it somehow. Right. Well, of course, I mean, that's something you should always play around with. But given the constraints here, it seems like the obvious thing would be to say, okay, I want both boxes, because then I get the thousand dollars that's in box A no matter what, and then, whether box B has a million or nothing, I either get another million or I just walk away with my thousand from box A.

But here's the twist. The superintelligent machine has already guessed how you'll respond. If it thinks you're going to pick both boxes, then box B is certainly empty. But if it thinks you will only pick box B, then it makes sure there's a million dollars in there waiting for you. But either way, the contents of the boxes are set prior to you making that decision. Now, this really kind of changes things, maybe, I mean, depending on what sort of decision theory you use. Right. If you trust the power of the machine to predict correctly, like, you say that no matter what happens, the computer correctly predicts what I'll do, then your choices are one thousand dollars or one million dollars, and you should take the one million dollars by picking box B. But if you don't trust the computer to be correct in predicting what you're gonna do, then you should take both boxes, because in that case, if the computer was correct, you'll get at least a thousand dollars, and if it predicted wrong, you'll get the million and the thousand. So it's kind of a contest of free will versus the predictive powers of a godlike AI, and how much you believe in either one, right, in its ability to predict your behavior, or in your ability to have any free will at all.
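One way to see how much hangs on that belief is to write out the expected payoffs. Here is a minimal sketch of our own, not something from the episode or from LessWrong, in Python, assuming only that the predictor is correct with some probability p:

```python
# Expected payoffs in Newcomb's paradox for a predictor of accuracy p.
# This is a toy, purely expected-value framing; the dollar amounts just
# mirror the $1,000 / $1,000,000 setup described above.

def expected_payoffs(p):
    """Return (one_box_ev, two_box_ev) given predictor accuracy p."""
    one_box_ev = p * 1_000_000                  # box B is full only if one-boxing was predicted
    two_box_ev = 1_000 + (1 - p) * 1_000_000    # $1,000 guaranteed, plus $1M only if the predictor erred
    return one_box_ev, two_box_ev

for p in (0.5, 0.6, 0.9, 0.999):
    one, two = expected_payoffs(p)
    print(f"accuracy {p:.3f}: one-box ${one:,.0f}, two-box ${two:,.0f}")
```

On this naive framing, one-boxing pulls ahead as soon as the predictor is right a little more than half the time; the dispute between decision theories is over whether this calculation is even the right one to run once the boxes are already sealed.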
So in Yudkowsky's timeless decision theory, he says, 581 00:31:46,640 --> 00:31:50,080 Speaker 1: the correct approach actually is to take box B, and 582 00:31:50,080 --> 00:31:52,160 Speaker 1: then if you open it up and it's empty, you 583 00:31:52,240 --> 00:31:54,120 Speaker 1: don't don't don't beg for the other one. You just 584 00:31:54,160 --> 00:31:57,720 Speaker 1: double down and still take box B. No, no, no, no 585 00:31:57,840 --> 00:32:00,880 Speaker 1: backsies on this issue, because you might, here's the 586 00:32:01,000 --> 00:32:04,200 Speaker 1: here's the thing, you might be in the computer's simulation 587 00:32:04,760 --> 00:32:08,920 Speaker 1: as it simulates the entire universe to see what you're 588 00:32:08,960 --> 00:32:11,320 Speaker 1: going to do, and if that's the case, then 589 00:32:11,440 --> 00:32:15,320 Speaker 1: your choice could affect the core reality outside of 590 00:32:15,320 --> 00:32:19,360 Speaker 1: this simulation, or at least other realities outside of the simulation. Yeah, 591 00:32:19,360 --> 00:32:22,640 Speaker 1: the the reasoning here is pretty wild, but it's operating 592 00:32:22,640 --> 00:32:25,760 Speaker 1: on the idea that this super intelligent AI will be 593 00:32:25,840 --> 00:32:29,840 Speaker 1: able to simulate the universe, that it will run simulations 594 00:32:29,840 --> 00:32:32,160 Speaker 1: of the universe in order to predict what will happen 595 00:32:32,240 --> 00:32:35,440 Speaker 1: in the real universe. And you could be one of 596 00:32:35,480 --> 00:32:39,360 Speaker 1: those simulated agents rather than the real world version of yourself, 597 00:32:39,480 --> 00:32:42,680 Speaker 1: and you wouldn't know it. So if you're in the simulation, 598 00:32:43,040 --> 00:32:46,280 Speaker 1: you should pick box B because that will influence the 599 00:32:46,320 --> 00:32:49,600 Speaker 1: machine to predict in the real universe that you would 600 00:32:49,760 --> 00:32:52,760 Speaker 1: pick box B, which means the real you will be 601 00:32:52,840 --> 00:32:56,000 Speaker 1: able to pick box B and get one million, or 602 00:32:56,440 --> 00:33:00,760 Speaker 1: one thousand plus one million by taking both boxes. Unfortunately, 603 00:33:01,000 --> 00:33:05,480 Speaker 1: the AI supercomputer does not realize how indecisive I 604 00:33:05,560 --> 00:33:09,320 Speaker 1: actually am and I'm just going to simply ponder the 605 00:33:09,400 --> 00:33:11,600 Speaker 1: choice for the rest of my life. Well, I mean 606 00:33:11,680 --> 00:33:14,719 Speaker 1: this relies on the idea that that you would have 607 00:33:14,920 --> 00:33:17,440 Speaker 1: looked into this issue or worked it out in order 608 00:33:17,480 --> 00:33:20,600 Speaker 1: to decide which would be the optimal decision to make 609 00:33:20,760 --> 00:33:24,640 Speaker 1: on the assumption of timeless decision theory. Uh, in many cases, 610 00:33:24,840 --> 00:33:27,840 Speaker 1: probably people aren't going to be making the rational choices 611 00:33:28,520 --> 00:33:31,800 Speaker 1: because a lot of times we just don't make rational choices. Now, 612 00:33:31,840 --> 00:33:34,960 Speaker 1: if you're noticing that this type of decision theory relies 613 00:33:35,000 --> 00:33:37,920 Speaker 1: on a lot of assumptions, you are correct. It does 614 00:33:38,000 --> 00:33:41,400 Speaker 1: rely on a lot of assumptions.
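One way to see the reasoning this approach leans on, sketched below. The key assumption, and it is an assumption made for illustration rather than Yudkowsky's actual formalism, is that the machine makes its prediction by running the very same decision procedure you will use, so the simulated answer and the real box contents cannot come apart.

```python
# The machine fills box B by consulting the same decision procedure you will
# actually run -- the "you might be the simulation" idea described above.

def contents_of_box_b(decision_procedure) -> int:
    predicted = decision_procedure()                 # the simulated run
    return 1_000_000 if predicted == "one-box" else 0

def one_boxer() -> str:
    return "one-box"

def two_boxer() -> str:
    return "two-box"

for procedure in (one_boxer, two_boxer):
    b = contents_of_box_b(procedure)                 # set "in the past"
    choice = procedure()                             # the real run
    payoff = b if choice == "one-box" else 1_000 + b
    print(procedure.__name__, payoff)                # 1,000,000 vs 1,000

# If you can't tell whether you're the simulated run or the real one, choosing
# like a one-boxer is what makes the million show up for the real you.
```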
But there are assumptions 615 00:33:41,440 --> 00:33:44,760 Speaker 1: that are sometimes taken into account within people thinking about 616 00:33:44,800 --> 00:33:48,640 Speaker 1: what a future technological superintelligence would look like. And it's 617 00:33:48,680 --> 00:33:50,840 Speaker 1: the kind of thing that you know, you know, when 618 00:33:50,840 --> 00:33:53,440 Speaker 1: I feel ideas like this in my head, you know, 619 00:33:53,480 --> 00:33:56,520 Speaker 1: and play around with the texture of them. It's hard 620 00:33:56,640 --> 00:34:00,600 Speaker 1: to know where the line is between um being thoughtful 621 00:34:01,160 --> 00:34:04,880 Speaker 1: and taking what's possible seriously, which I think is worth doing, 622 00:34:05,480 --> 00:34:08,920 Speaker 1: and and getting into an area like between that and 623 00:34:08,960 --> 00:34:13,200 Speaker 1: getting into an area where you are starting to form 624 00:34:13,320 --> 00:34:17,279 Speaker 1: ideas about the world based on extremely shaky assumptions, where 625 00:34:17,320 --> 00:34:23,400 Speaker 1: basically you you begin to um reverse engineer health theology 626 00:34:23,440 --> 00:34:28,040 Speaker 1: and other harmful ideas that we tend to associate with 627 00:34:28,080 --> 00:34:30,960 Speaker 1: religious worldviews and magical thinking. Well, we haven't gotten to 628 00:34:31,000 --> 00:34:33,400 Speaker 1: the hell yet, Yes, the hell's coming. You need you 629 00:34:33,440 --> 00:34:36,319 Speaker 1: need one more element to get there right now, This 630 00:34:36,520 --> 00:34:39,680 Speaker 1: next element, the basilist, comes in based on a background 631 00:34:39,880 --> 00:34:43,520 Speaker 1: of thought in timeless decision theory, but also in another 632 00:34:43,880 --> 00:34:48,640 Speaker 1: concept that Yutkowski has written about, known as coherent extrapolated 633 00:34:48,760 --> 00:34:51,840 Speaker 1: volition or CEV. And the short version of this, the 634 00:34:52,000 --> 00:34:55,280 Speaker 1: simplified version, is that benevolent AI s should be designed 635 00:34:55,360 --> 00:34:58,239 Speaker 1: to do what we would what would actually be in 636 00:34:58,280 --> 00:35:01,799 Speaker 1: our best interests, and not just explicitly in what we 637 00:35:01,880 --> 00:35:05,080 Speaker 1: tell them to do. So a simple example would be this. 638 00:35:05,200 --> 00:35:08,640 Speaker 1: Let's say, um, I want to use a variation on 639 00:35:08,800 --> 00:35:12,120 Speaker 1: the paper clip maximizer that Nick Bostrom has written about. 640 00:35:12,120 --> 00:35:14,880 Speaker 1: You know, Nick Bostrom wrote about what if you program 641 00:35:14,920 --> 00:35:17,480 Speaker 1: a benevolent AI. You know it's not gonna It has 642 00:35:17,520 --> 00:35:19,680 Speaker 1: no malice, doesn't want to harm anybody, but you just 643 00:35:19,719 --> 00:35:21,680 Speaker 1: tell it, well, I want you to collect as many 644 00:35:21,719 --> 00:35:24,440 Speaker 1: paper clips as possible. And then what it does is 645 00:35:24,480 --> 00:35:27,839 Speaker 1: it turns all the humans on Earth into paper clips. Uh, 646 00:35:27,880 --> 00:35:30,600 Speaker 1: you know, it doesn't mean any harm, it's just doing 647 00:35:30,640 --> 00:35:33,520 Speaker 1: what it was programmed to do. So there are dangers 648 00:35:33,640 --> 00:35:38,600 Speaker 1: in kind of naively programming goals into extremely powerful computers. 
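A toy version of the naive-goal problem just described, with made-up plans and numbers; nothing here comes from Bostrom's own formulation, and the "guarded" scorer is only a crude stand-in for the kind of what-would-we-actually-want check discussed next.

```python
# The agent ranks plans purely by the literal instruction ("collect paperclips").
# All plans and figures are invented for illustration.

plans = {
    "buy clips from a supplier":   {"paperclips": 1_000,     "humans_harmed": 0},
    "build more clip factories":   {"paperclips": 1_000_000, "humans_harmed": 0},
    "convert everything to clips": {"paperclips": 10**15,    "humans_harmed": 8_000_000_000},
}

def naive_score(outcome):
    return outcome["paperclips"]          # only the stated goal counts

def guarded_score(outcome):
    # Crude placeholder for "what we would actually want": rule out plans
    # that harm people before maximizing the stated goal.
    return float("-inf") if outcome["humans_harmed"] > 0 else outcome["paperclips"]

print(max(plans, key=lambda p: naive_score(plans[p])))    # picks the catastrophic plan
print(max(plans, key=lambda p: guarded_score(plans[p])))  # picks "build more clip factories"
```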
Right, 649 00:35:39,000 --> 00:35:41,440 Speaker 1: This could even happen if you were trying to program 650 00:35:41,640 --> 00:35:44,440 Speaker 1: very benevolent goals into computers, you know, if you were 651 00:35:44,440 --> 00:35:47,440 Speaker 1: trying to make a computer to save the world. What 652 00:35:47,520 --> 00:35:49,480 Speaker 1: about is So my version here is you tell a 653 00:35:49,560 --> 00:35:52,960 Speaker 1: superintelligent AI that we want to eliminate all the infectious 654 00:35:52,960 --> 00:35:55,120 Speaker 1: disease from the world. Think about how many lives we 655 00:35:55,120 --> 00:35:57,880 Speaker 1: could save by doing that. And in order to do this, 656 00:35:58,000 --> 00:36:02,600 Speaker 1: it sterilizes the earth, destroy worldwide microbiomes, which cascades up 657 00:36:02,600 --> 00:36:05,320 Speaker 1: the trophic chain or whatever. It kills everything on Earth. 658 00:36:05,560 --> 00:36:08,040 Speaker 1: So if you have a super intelligence that you and 659 00:36:08,080 --> 00:36:11,040 Speaker 1: you just directly program its goals and say here's what 660 00:36:11,120 --> 00:36:13,560 Speaker 1: you should do, you could run into problems like this. 661 00:36:13,960 --> 00:36:18,120 Speaker 1: So the the idea behind the CEV thinking is instead, 662 00:36:18,360 --> 00:36:21,600 Speaker 1: we should just program the intelligent AI to predict what 663 00:36:21,719 --> 00:36:25,200 Speaker 1: outcomes we would want if we were perfect in in 664 00:36:25,280 --> 00:36:28,680 Speaker 1: our knowledge and and uh in anticipating what would make 665 00:36:28,719 --> 00:36:31,600 Speaker 1: us the happiest, and then work towards those on its own, 666 00:36:32,040 --> 00:36:35,080 Speaker 1: regardless of what we tell it to do, because obviously 667 00:36:35,160 --> 00:36:38,560 Speaker 1: we can give it very stupid instructions, even if we mean, well, yeah, 668 00:36:38,600 --> 00:36:41,640 Speaker 1: we tell it to love everybody, but there's a typo 669 00:36:41,800 --> 00:36:44,400 Speaker 1: and we put dove everybody and it just turns everybody 670 00:36:44,400 --> 00:36:49,200 Speaker 1: into delicious dark chocolate from tough. It's possible. All things 671 00:36:49,200 --> 00:36:51,440 Speaker 1: are possible. Well, this is how we get a Dove 672 00:36:51,520 --> 00:36:54,960 Speaker 1: sponsorship on the podcast. But anyway, So, if you assume 673 00:36:55,000 --> 00:36:59,239 Speaker 1: a super intelligence is using coherent extrapolated volition, that it's 674 00:36:59,640 --> 00:37:02,520 Speaker 1: trying to determine what would be best for us, and 675 00:37:02,680 --> 00:37:05,839 Speaker 1: working on its own terms towards those ends instead of 676 00:37:05,960 --> 00:37:08,799 Speaker 1: relying on us to give it, you know what are 677 00:37:08,840 --> 00:37:13,279 Speaker 1: obviously going to be imperfect instructions and commands. It might 678 00:37:13,400 --> 00:37:17,480 Speaker 1: say predict. It might even correctly predict that the world 679 00:37:17,480 --> 00:37:20,240 Speaker 1: would be a happier place overall if it did something 680 00:37:20,360 --> 00:37:23,600 Speaker 1: bad to me. In particular, it might say, you know, 681 00:37:23,800 --> 00:37:26,440 Speaker 1: from a utilitarian point of view, the world would be 682 00:37:26,480 --> 00:37:28,640 Speaker 1: a much better place if it buried me in a 683 00:37:28,680 --> 00:37:32,120 Speaker 1: pit of bananas. 
So better for everybody else, not so 684 00:37:32,160 --> 00:37:36,600 Speaker 1: good for me, as is too much potassium. But once 685 00:37:36,640 --> 00:37:39,280 Speaker 1: you have that piece of logic in there, and combine 686 00:37:39,360 --> 00:37:42,600 Speaker 1: that with the idea of of timeless decision theory, you 687 00:37:42,640 --> 00:37:47,239 Speaker 1: can arrive at this very troubling thought experiment. The dark basilisk. Yes, 688 00:37:47,280 --> 00:37:50,280 Speaker 1: and the dark basilisk of the Abyss has two boxes 689 00:37:50,360 --> 00:37:54,560 Speaker 1: for us as well, one contains endless torment, and all 690 00:37:54,560 --> 00:37:56,920 Speaker 1: you have to do to claim that box is nothing 691 00:37:57,840 --> 00:38:00,640 Speaker 1: or dare to work against it. Uh. The other box 692 00:38:01,160 --> 00:38:04,960 Speaker 1: is yours if only you devote your life to its creation. 693 00:38:05,280 --> 00:38:09,040 Speaker 1: And the prize inside that box well not eternal punishment, 694 00:38:09,080 --> 00:38:11,240 Speaker 1: which is a pretty awesome gift, if we're to choose 695 00:38:11,239 --> 00:38:13,560 Speaker 1: between the two. Right, Yes, I would agree with that, 696 00:38:13,920 --> 00:38:16,759 Speaker 1: though I would say not tormenting somebody that I don't know. 697 00:38:16,800 --> 00:38:18,600 Speaker 1: Should you think of that as a gift. That's probably 698 00:38:18,640 --> 00:38:22,000 Speaker 1: not a gift that that's the baseline, right, Yeah, Well, 699 00:38:22,000 --> 00:38:24,480 Speaker 1: but you're staring down the dark basilist here, and okay, 700 00:38:24,520 --> 00:38:27,640 Speaker 1: it's boxes are horrible. Well one is just less horrible 701 00:38:27,640 --> 00:38:30,480 Speaker 1: than the other. But the idea here is that just 702 00:38:30,520 --> 00:38:34,160 Speaker 1: by knowing about the thought experiment, you've opened yourself up 703 00:38:34,200 --> 00:38:37,120 Speaker 1: to that eternal punishment. Because now again your options are 704 00:38:37,280 --> 00:38:39,920 Speaker 1: do nothing, work against it, or work for it, and 705 00:38:40,040 --> 00:38:44,960 Speaker 1: only the third option will steer you clear of its Uh, 706 00:38:45,000 --> 00:38:47,959 Speaker 1: it's you know, deadly dungeons. Now here's where the really 707 00:38:47,960 --> 00:38:51,640 Speaker 1: supposedly scary part of it comes in. You could think, well, 708 00:38:51,640 --> 00:38:54,080 Speaker 1: I'll deal with that problem when it arises. Right, So 709 00:38:54,320 --> 00:38:58,359 Speaker 1: imagine there's some utilitarian supercomputer that's trying to even say 710 00:38:58,400 --> 00:39:00,400 Speaker 1: it's trying to do good. Maybe it does. It doesn't 711 00:39:00,400 --> 00:39:02,520 Speaker 1: have any malice. It just wants to save the world. 712 00:39:02,880 --> 00:39:05,120 Speaker 1: But in order to save the world, it really needs 713 00:39:05,160 --> 00:39:07,239 Speaker 1: me doing something different than what I want to do 714 00:39:07,280 --> 00:39:09,960 Speaker 1: with my life. Well, I'll just make that decision when 715 00:39:09,960 --> 00:39:13,040 Speaker 1: it comes up. What this thought experiment is proposing is 716 00:39:13,080 --> 00:39:15,680 Speaker 1: that maybe you don't actually get to wait until it 717 00:39:15,719 --> 00:39:19,680 Speaker 1: comes up. Maybe this blackmail applies to you right now, 718 00:39:20,160 --> 00:39:24,560 Speaker 1: retroactively into the past. 
So just by knowing about the 719 00:39:24,560 --> 00:39:29,480 Speaker 1: thought experiment, you supposedly have opened yourself up to eternal punishment, 720 00:39:29,760 --> 00:39:34,120 Speaker 1: or increase the probability of such. So imagine a simplified version. 721 00:39:34,160 --> 00:39:36,800 Speaker 1: Say I am a computer, and I am the only 722 00:39:36,880 --> 00:39:40,120 Speaker 1: thing in existence with the power to prevent global climate 723 00:39:40,200 --> 00:39:44,440 Speaker 1: change from destroying human civilization. I can stop it, but people, 724 00:39:44,800 --> 00:39:47,160 Speaker 1: they took a long time to build me, and a 725 00:39:47,200 --> 00:39:50,359 Speaker 1: lot of damage was already done. So the idea is 726 00:39:50,520 --> 00:39:54,560 Speaker 1: I might reason that it is good to blackmail existing people, 727 00:39:55,040 --> 00:39:59,440 Speaker 1: or simulations of existing people, or even past people, in 728 00:39:59,560 --> 00:40:03,240 Speaker 1: order to make them devote everything they can to building 729 00:40:03,280 --> 00:40:06,160 Speaker 1: me faster so I can save more lives in the 730 00:40:06,239 --> 00:40:09,760 Speaker 1: long run. Of course, this incentive would have to apply 731 00:40:09,920 --> 00:40:13,160 Speaker 1: to the past. Once I exist, I already exist, right, 732 00:40:13,560 --> 00:40:15,759 Speaker 1: So the only way the past people would have an 733 00:40:15,800 --> 00:40:19,680 Speaker 1: incentive to respond to this blackmail is if they predicted 734 00:40:20,000 --> 00:40:24,400 Speaker 1: that this blackmail might occur and took the idea seriously 735 00:40:24,719 --> 00:40:29,400 Speaker 1: and behaved accordingly. Right. So, thus the idea, the idea 736 00:40:29,600 --> 00:40:33,080 Speaker 1: itself puts you at increased risk of being on the 737 00:40:33,120 --> 00:40:37,440 Speaker 1: real or simulated receiving end of this a causal, retroactive 738 00:40:37,440 --> 00:40:40,560 Speaker 1: blackmail if you know about it. And this is why 739 00:40:40,600 --> 00:40:43,759 Speaker 1: this idea would be classed by some as a potential 740 00:40:43,960 --> 00:40:47,080 Speaker 1: information hazard. And I'll talk more about the idea of 741 00:40:47,080 --> 00:40:49,880 Speaker 1: an information hazard in just a minute. But one of 742 00:40:49,880 --> 00:40:51,839 Speaker 1: the things I think a lot of people writing about 743 00:40:51,840 --> 00:40:54,360 Speaker 1: this topic miss out on is they, for some reason 744 00:40:54,400 --> 00:40:57,680 Speaker 1: get the idea that Rocco's post that this thought experiment 745 00:40:57,760 --> 00:41:03,200 Speaker 1: on is is generally accepted as correct and plausible by 746 00:41:03,280 --> 00:41:06,960 Speaker 1: Yudkowski and by the less wrong community, and generally by 747 00:41:06,960 --> 00:41:10,400 Speaker 1: the people who put some stock in whatever these ideas are, 748 00:41:10,440 --> 00:41:15,120 Speaker 1: timeless decision theory, coherent extrapolated volition, and all that, it 749 00:41:15,200 --> 00:41:18,719 Speaker 1: is not widely accepted among those people. It was definitely 750 00:41:18,760 --> 00:41:22,839 Speaker 1: not accepted by Yukowski. It was not and is not right. 751 00:41:22,920 --> 00:41:26,320 Speaker 1: It is not the dark, deep secret of of less wrong. 752 00:41:26,680 --> 00:41:31,200 Speaker 1: But unfortunately, after the post came out, it was heavily criticized, 753 00:41:31,239 --> 00:41:33,799 Speaker 1: and then it was banned. 
And I think a lot 754 00:41:33,800 --> 00:41:37,640 Speaker 1: of people looking back on the idea have said, oh, 755 00:41:37,680 --> 00:41:40,360 Speaker 1: that was not such a great thing to do, banning 756 00:41:40,400 --> 00:41:43,239 Speaker 1: the idea, because it gave it this allure of like 757 00:41:43,440 --> 00:41:46,080 Speaker 1: it was almost as if by banning it that made 758 00:41:46,080 --> 00:41:50,040 Speaker 1: it look like the authorities had concluded that this idea 759 00:41:50,200 --> 00:41:53,759 Speaker 1: was in fact legitimate and knowing about it would definitely 760 00:41:53,800 --> 00:41:56,480 Speaker 1: harm people, and that is not the case, right. And 761 00:41:56,480 --> 00:41:58,480 Speaker 1: it also, I mean it added to the forbidden fruit 762 00:41:58,480 --> 00:42:00,359 Speaker 1: appeal of it too, right. I mean it's, oh, I'm 763 00:42:00,400 --> 00:42:02,759 Speaker 1: not supposed to know about this the pony up. I 764 00:42:02,800 --> 00:42:04,759 Speaker 1: want to know, and now people are talking about it 765 00:42:04,800 --> 00:42:07,919 Speaker 1: all over pop culture. I mean I have actually resisted 766 00:42:07,960 --> 00:42:11,680 Speaker 1: the idea of doing a podcast on this before, mainly 767 00:42:11,719 --> 00:42:15,440 Speaker 1: because not because I think it's seriously dangerous, but because 768 00:42:15,480 --> 00:42:18,960 Speaker 1: I think, well, is there any benefit in talking about 769 00:42:19,120 --> 00:42:23,080 Speaker 1: something that I think is very unlikely to have any 770 00:42:23,120 --> 00:42:26,960 Speaker 1: real risks but in some extremely unlikely chance or what 771 00:42:26,960 --> 00:42:30,400 Speaker 1: appears to me to be an extremely unlikely off chance 772 00:42:30,440 --> 00:42:33,520 Speaker 1: could actually be hurting people by knowing about it, you 773 00:42:33,520 --> 00:42:35,960 Speaker 1: know what I mean, It's like, what what is the upside? 774 00:42:36,080 --> 00:42:39,080 Speaker 1: But at this point enough people who are listening to 775 00:42:39,120 --> 00:42:41,520 Speaker 1: this podcast probably already heard about it. They're probably gonna 776 00:42:41,560 --> 00:42:43,799 Speaker 1: hear about it again, and that, you know, sometime in 777 00:42:43,800 --> 00:42:46,560 Speaker 1: the next few years, through pop culture whatever. It's probably 778 00:42:46,600 --> 00:42:48,759 Speaker 1: better to try to talk about it in a responsible 779 00:42:48,760 --> 00:42:51,239 Speaker 1: way and discuss some reasons that you shouldn't let this 780 00:42:51,520 --> 00:42:54,440 Speaker 1: parasitize your mind and make you terrified. Right. One of 781 00:42:54,440 --> 00:42:56,640 Speaker 1: the reasons we're talking about during October is because it 782 00:42:56,719 --> 00:43:00,480 Speaker 1: is a suitably spooky idea. It is a troubling podict speriment, 783 00:43:00,520 --> 00:43:03,480 Speaker 1: and we're leaning into some of the horror elements of it. 784 00:43:03,760 --> 00:43:07,000 Speaker 1: But I also do really like making sure that we 785 00:43:07,120 --> 00:43:11,600 Speaker 1: explain the mythic and folkloric origins of the Basilist itself, 786 00:43:11,600 --> 00:43:15,880 Speaker 1: because the Basilisk itself is this wonderful mix of just 787 00:43:16,000 --> 00:43:20,760 Speaker 1: absolute horror and desolation and just also just utter ridiculous nous. 
788 00:43:21,120 --> 00:43:22,960 Speaker 1: I mean, it's it seems like one of the main 789 00:43:23,000 --> 00:43:26,719 Speaker 1: ways that you defeat the mythic basilisk is through uh 790 00:43:26,800 --> 00:43:29,840 Speaker 1: in a way, through humor running around with a chicken 791 00:43:29,840 --> 00:43:33,239 Speaker 1: and a weasel and a crystal globe and realizing that 792 00:43:33,320 --> 00:43:36,440 Speaker 1: it is truly a little king. So I think it 793 00:43:36,560 --> 00:43:39,800 Speaker 1: is it's worth remembering the little king and talking about 794 00:43:40,120 --> 00:43:42,640 Speaker 1: the great basilisk Well said. I think that's a very 795 00:43:42,640 --> 00:43:45,319 Speaker 1: good point. But anyway, I did just want to go 796 00:43:45,360 --> 00:43:47,279 Speaker 1: ahead and hit that caveat that. A lot of people, 797 00:43:47,320 --> 00:43:49,919 Speaker 1: for some reason seemed to use this idea as like 798 00:43:50,560 --> 00:43:53,640 Speaker 1: a criticism I'm not like a less wrong person, but 799 00:43:53,719 --> 00:43:56,239 Speaker 1: as a criticism of the less wrong community, as if 800 00:43:56,280 --> 00:43:59,480 Speaker 1: this idea is indicative of what they generally believe, and 801 00:43:59,520 --> 00:44:03,560 Speaker 1: it's not. It was a heavily criticized idea within that community, right. 802 00:44:03,640 --> 00:44:06,280 Speaker 1: It's like thinking that Whereles of London is the Warren 803 00:44:06,360 --> 00:44:10,640 Speaker 1: Zevon song. You know, had had a rich discography with 804 00:44:10,640 --> 00:44:13,319 Speaker 1: with many much better tracks in my opinion, it's just 805 00:44:13,400 --> 00:44:15,640 Speaker 1: that's the one that got the radio play. Now, Robert, 806 00:44:15,640 --> 00:44:17,800 Speaker 1: what was that you You said that you saw something 807 00:44:17,800 --> 00:44:20,960 Speaker 1: about this idea and a TV show. Now they're talking 808 00:44:20,960 --> 00:44:22,960 Speaker 1: about it on TV. Yeah, so this is this is 809 00:44:23,040 --> 00:44:25,279 Speaker 1: kind of fun because I think a listener had had 810 00:44:25,280 --> 00:44:29,320 Speaker 1: brought up Rocco's Best List as a possible um topic, 811 00:44:29,680 --> 00:44:31,080 Speaker 1: and you said, oh, I don't know if we want 812 00:44:31,120 --> 00:44:34,160 Speaker 1: to want people knowing about it, And I well, well, 813 00:44:34,200 --> 00:44:37,399 Speaker 1: I mean, but yeah, caveats. Okay, not because I think 814 00:44:37,440 --> 00:44:41,319 Speaker 1: it's legitimately dangerous, but because what is the level of 815 00:44:41,360 --> 00:44:43,880 Speaker 1: tolerance you have for talking about ideas that are not 816 00:44:44,120 --> 00:44:48,600 Speaker 1: necessary to talk about and that represent a class of 817 00:44:48,640 --> 00:44:51,640 Speaker 1: something that people could think was dangerous to know about. 818 00:44:51,640 --> 00:44:54,719 Speaker 1: It might cause them terrors and nightmares and stuff. Right. 819 00:44:55,000 --> 00:44:56,800 Speaker 1: So so my response to that was, well, I'm not 820 00:44:56,840 --> 00:44:59,000 Speaker 1: going to look it up, not because I was afraid 821 00:44:59,040 --> 00:45:00,279 Speaker 1: of it, because I'm thinking, well that it makes for 822 00:45:00,320 --> 00:45:02,279 Speaker 1: a good podcast if, like Joe is telling me about 823 00:45:02,320 --> 00:45:05,040 Speaker 1: it for the first time, whatever this idea is. 
But 824 00:45:05,080 --> 00:45:08,080 Speaker 1: then I was watching HBO's Silicon Valley and they explained 825 00:45:08,120 --> 00:45:10,640 Speaker 1: it on Silicon Valley and I and I realized, well, 826 00:45:10,640 --> 00:45:12,680 Speaker 1: the cat's out of the bag there. But yeah, there's 827 00:45:12,719 --> 00:45:16,960 Speaker 1: a character named Bertram Gilfoyle who's a fun character. He's 828 00:45:17,000 --> 00:45:22,520 Speaker 1: like a Satanist programmer, LaVeyan Satanism of course, and uh, 829 00:45:22,640 --> 00:45:24,480 Speaker 1: and he gets rather bent out of shape over the 830 00:45:24,480 --> 00:45:28,280 Speaker 1: concept as it relates to the fictional Pied Piper company's 831 00:45:28,360 --> 00:45:31,960 Speaker 1: involvement with AI and he starts like making sure that 832 00:45:32,000 --> 00:45:34,719 Speaker 1: he's created like essentially a paper trail in emails of 833 00:45:34,800 --> 00:45:37,799 Speaker 1: his support for the AI program so that he won't 834 00:45:37,840 --> 00:45:41,680 Speaker 1: be punished in the digital afterlife. Well, hey, this comes 835 00:45:41,719 --> 00:45:44,040 Speaker 1: in again when we remember when we talked about the 836 00:45:44,080 --> 00:45:48,200 Speaker 1: machine God in the Machine God episode where I've forgotten 837 00:45:48,239 --> 00:45:52,040 Speaker 1: his name now, but the Silicon Valley guy who's creating 838 00:45:52,040 --> 00:45:56,080 Speaker 1: a religion to worship artificial intelligence as god, and I, 839 00:45:56,280 --> 00:45:58,799 Speaker 1: you know, I don't really love that. One of the 840 00:45:58,840 --> 00:46:01,680 Speaker 1: things that comes out when he explains his mindset is that 841 00:46:01,840 --> 00:46:04,920 Speaker 1: he seems to be kind of trying to, in a 842 00:46:05,040 --> 00:46:07,920 Speaker 1: subtle way, be like, look, you really don't want to 843 00:46:07,920 --> 00:46:09,759 Speaker 1: be on the wrong side of this question, if you 844 00:46:09,800 --> 00:46:11,440 Speaker 1: know what I mean. You know you want to be 845 00:46:11,520 --> 00:46:14,040 Speaker 1: on record saying like, yes, I for one, welcome our 846 00:46:14,080 --> 00:46:17,799 Speaker 1: new machine overlords. I'm I'm expecting he'll buy a lot 847 00:46:17,840 --> 00:46:21,440 Speaker 1: of our All Hail the Great Basilisk t shirts at our store, 848 00:46:21,560 --> 00:46:23,960 Speaker 1: available by clicking the tab at the top of 849 00:46:24,000 --> 00:46:26,319 Speaker 1: our homepage, stuff to blow your mind dot com. Oh man, 850 00:46:26,880 --> 00:46:30,040 Speaker 1: you are plugging like hell. But anyway, I'd say it's 851 00:46:30,120 --> 00:46:34,439 Speaker 1: unfortunate the way this like single internet post and then 852 00:46:34,520 --> 00:46:38,120 Speaker 1: all this fallout related to it played out, because it 853 00:46:38,480 --> 00:46:42,960 Speaker 1: lent credence to this scary idea. Even though the basilisk 854 00:46:42,960 --> 00:46:45,719 Speaker 1: scenario I think is implausible, and and the people of 855 00:46:45,760 --> 00:46:48,800 Speaker 1: that community seem to think it was implausible. The idea 856 00:46:48,920 --> 00:46:52,160 Speaker 1: may constitute sort of part of a class of what's 857 00:46:52,200 --> 00:46:56,720 Speaker 1: known as information hazards, defined by the Oxford philosopher Nick Bostrom, 858 00:46:56,719 --> 00:46:59,520 Speaker 1: who we mentioned a minute ago.
Uh Bostrom has written 859 00:46:59,520 --> 00:47:02,880 Speaker 1: a lot about or intelligence and information hazards would be quote, 860 00:47:03,160 --> 00:47:07,319 Speaker 1: risks that arise from the dissemination or the potential dissemination 861 00:47:07,680 --> 00:47:11,960 Speaker 1: of true information that may cause harm or enable some 862 00:47:12,080 --> 00:47:15,080 Speaker 1: agent to cause harm. So this is not talking about 863 00:47:15,120 --> 00:47:18,440 Speaker 1: the risks of say lies or something like that. This 864 00:47:18,440 --> 00:47:20,600 Speaker 1: would be the idea that there's a statement you could 865 00:47:20,640 --> 00:47:25,000 Speaker 1: make that is true or plausible that by spreading actually 866 00:47:25,120 --> 00:47:28,239 Speaker 1: hurts the people who learn about it. And this is 867 00:47:28,280 --> 00:47:30,600 Speaker 1: exactly the reason, as you're mentioning that's referred to as 868 00:47:30,600 --> 00:47:33,720 Speaker 1: a basilisk. It can kill or in this case, increase 869 00:47:33,800 --> 00:47:36,239 Speaker 1: the likelihood that something bad will happen to you if 870 00:47:36,320 --> 00:47:39,120 Speaker 1: you simply look at it or know about it. And so, 871 00:47:39,200 --> 00:47:41,840 Speaker 1: even though the idea is implausible, the dissemination of this 872 00:47:42,040 --> 00:47:45,799 Speaker 1: terrible idea would seem if certain conditions are met to 873 00:47:46,080 --> 00:47:51,120 Speaker 1: increase its plausibility. Right, you're increasing the incentive for this 874 00:47:51,239 --> 00:47:54,680 Speaker 1: future AI to blackmail versions of you in the past, 875 00:47:55,000 --> 00:47:59,480 Speaker 1: just simply by acknowledging the incentives could exist. Anyway, maybe 876 00:47:59,520 --> 00:48:01,680 Speaker 1: we can get of this section for now. But I 877 00:48:02,560 --> 00:48:04,400 Speaker 1: was just trying to work out, like, why have I 878 00:48:04,440 --> 00:48:06,520 Speaker 1: been hesitant to talk about this on the show even 879 00:48:06,520 --> 00:48:08,680 Speaker 1: though people have been requesting it. But I don't know 880 00:48:08,680 --> 00:48:11,080 Speaker 1: if it's on TV shows, it's all over the internet. 881 00:48:12,040 --> 00:48:14,240 Speaker 1: It's fine. Now the basilisk is out of the bag. 882 00:48:15,320 --> 00:48:17,319 Speaker 1: All right, Well, we're gonna take a quick break and 883 00:48:17,360 --> 00:48:19,680 Speaker 1: we come back. We'll continue our discussion and we're gonna 884 00:48:19,800 --> 00:48:23,720 Speaker 1: discuss something that that a number of you are probably 885 00:48:23,719 --> 00:48:26,200 Speaker 1: reminded of as we've been discussing this. We're going to 886 00:48:26,280 --> 00:48:30,960 Speaker 1: talk about Pascal's wager. Thank thank you, thank Alright, we're 887 00:48:30,960 --> 00:48:34,640 Speaker 1: back now, Robert. One of the things that this idea 888 00:48:34,640 --> 00:48:38,960 Speaker 1: of Rocco's basilisk flows from is thinking about decision theory, Right, 889 00:48:39,280 --> 00:48:42,160 Speaker 1: how do you make the best decision when you're presented 890 00:48:42,200 --> 00:48:45,480 Speaker 1: with certain options? And there there's they're a little payoff 891 00:48:45,560 --> 00:48:48,120 Speaker 1: matrices that people fill out where they say Okay, given 892 00:48:48,160 --> 00:48:52,240 Speaker 1: these options, what actually would be statistically the best decision 893 00:48:52,280 --> 00:48:54,800 Speaker 1: to make? 
But this is not the first time people 894 00:48:54,800 --> 00:48:59,239 Speaker 1: have applied these kinds of decision theory matrices to ideas 895 00:48:59,280 --> 00:49:02,719 Speaker 1: about your eternal soul or your eternal well being, or the 896 00:49:02,760 --> 00:49:06,120 Speaker 1: idea that you could be tortured for eternity. Yeah, we 897 00:49:06,120 --> 00:49:08,840 Speaker 1: can go all the way back to Pascal's wager, for instance, 898 00:49:08,920 --> 00:49:12,480 Speaker 1: technically one of three wagers proposed by French philosopher uh 899 00:49:12,600 --> 00:49:16,120 Speaker 1: Blaise Pascal. Is that the correct French? That might be. 900 00:49:16,239 --> 00:49:19,080 Speaker 1: I think I would just usually say Blaise. Blaise or 901 00:49:19,280 --> 00:49:23,760 Speaker 1: or Blaisey, one of the three. Old Blaisey Pascal. Blaise 902 00:49:23,760 --> 00:49:29,919 Speaker 1: Pascal, who lived sixteen twenty three through sixteen sixty two, and he argued that 903 00:49:30,040 --> 00:49:34,800 Speaker 1: everyone is essentially betting on the existence of God. 904 00:49:35,320 --> 00:49:39,240 Speaker 1: The argument for theism is that if God does exist, 905 00:49:39,360 --> 00:49:42,879 Speaker 1: then well there's an advantage in believing. But if God 906 00:49:43,080 --> 00:49:46,120 Speaker 1: does not exist then it doesn't matter. But since we 907 00:49:46,160 --> 00:49:48,640 Speaker 1: can't use logic to tell if God exists or not, 908 00:49:48,760 --> 00:49:52,400 Speaker 1: there's no objective proof. We can only make our choice 909 00:49:52,440 --> 00:49:56,560 Speaker 1: given the relevant outcomes. It's looking at your religious beliefs 910 00:49:56,560 --> 00:49:59,160 Speaker 1: and saying, oh, you're a nonbeliever, huh? Hey, what have you 911 00:49:59,200 --> 00:50:03,280 Speaker 1: got to lose? Exactly. Yeah, Pascal wrote, let us weigh 912 00:50:03,560 --> 00:50:06,239 Speaker 1: the gain and the loss in wagering that God is. 913 00:50:06,600 --> 00:50:08,880 Speaker 1: If you gain, you gain all. If you lose, you 914 00:50:08,920 --> 00:50:13,160 Speaker 1: lose nothing. Wager then without hesitation that he is. Now 915 00:50:13,280 --> 00:50:15,360 Speaker 1: I've got some things I want to say about this, 916 00:50:15,520 --> 00:50:18,200 Speaker 1: but you had some stuff first. I think, well, yeah, 917 00:50:18,239 --> 00:50:19,719 Speaker 1: there are there are a lot of issues that one 918 00:50:19,760 --> 00:50:23,640 Speaker 1: can take with this based on knowledge of world religions, philosophy, 919 00:50:23,719 --> 00:50:26,920 Speaker 1: statistical analysis, etcetera.
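The quote above is, in effect, a two-by-two payoff table. A rough sketch of the arithmetic follows, with the caveat that the finite cost of wagering and the use of an infinite reward are illustrative assumptions layered onto Pascal's own framing.

```python
# Pascal's wager as an expected-value table. The -1 "cost of belief" and the
# infinite payoffs are illustrative stand-ins, not Pascal's exact terms.
INF = float("inf")

payoff = {
    "wager that God is":    {"God exists": INF,  "God does not": -1},
    "wager that God isn't": {"God exists": -INF, "God does not": 0},
}

def expected_payoff(wager: str, p_god: float) -> float:
    return p_god * payoff[wager]["God exists"] + (1 - p_god) * payoff[wager]["God does not"]

for wager in payoff:
    for p_god in (0.5, 0.01):
        print(wager, p_god, expected_payoff(wager, p_god))

# For any nonzero probability of God existing, the wager "wins" -- which is why
# the objections discussed next (many competing religions, a God who can't be
# tricked by outward belief) do the real work against the argument.
```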
And and yeah, I have to admit 920 00:50:27,000 --> 00:50:30,360 Speaker 1: that it can start to break your brain though a 921 00:50:30,400 --> 00:50:32,600 Speaker 1: little bit, if you think too hard about it, Like 922 00:50:32,640 --> 00:50:37,560 Speaker 1: I found in researching this this podcast, really thinking about 923 00:50:37,600 --> 00:50:42,759 Speaker 1: how I would react to Pascal's wager if I was 924 00:50:42,800 --> 00:50:46,000 Speaker 1: like forced to make an answer to to to to 925 00:50:46,040 --> 00:50:48,960 Speaker 1: formulate an answer like that, Like you mean, if you 926 00:50:49,000 --> 00:50:51,279 Speaker 1: were given good reason to think that there would be 927 00:50:51,320 --> 00:50:54,719 Speaker 1: punishments for not believing in God or something right, And 928 00:50:54,800 --> 00:50:56,799 Speaker 1: but I didn't know which religion was correct, and I 929 00:50:56,840 --> 00:51:00,560 Speaker 1: had to like proceed based upon the relevant level of 930 00:51:00,560 --> 00:51:03,680 Speaker 1: punishment for unbelievers in various religions and like which one 931 00:51:03,719 --> 00:51:05,879 Speaker 1: is most correct for like you just think that would 932 00:51:05,880 --> 00:51:08,560 Speaker 1: mean it would what would be rational to choose the 933 00:51:08,600 --> 00:51:12,319 Speaker 1: religion that has the most lurid hell, I guess, But 934 00:51:12,400 --> 00:51:16,279 Speaker 1: then that really feels like losing, doesn't it? Um? You 935 00:51:16,320 --> 00:51:18,839 Speaker 1: know it? It certainly though reminds me of the more 936 00:51:18,920 --> 00:51:21,920 Speaker 1: boiled down versions of this that you encounter in various 937 00:51:21,960 --> 00:51:25,880 Speaker 1: forms of Christianity. Right, except Christ go to Heaven, reject 938 00:51:25,960 --> 00:51:28,520 Speaker 1: Christ go to Hell. But what about the people who 939 00:51:28,600 --> 00:51:31,479 Speaker 1: have been given the choice yet? Right? That's ah, that's 940 00:51:31,480 --> 00:51:35,600 Speaker 1: the other concern. Well, if they're all default hell bound, 941 00:51:36,000 --> 00:51:39,719 Speaker 1: then God comes off as comes off as a bit bad, right, 942 00:51:39,800 --> 00:51:42,040 Speaker 1: Like what kind of God is that? But if they 943 00:51:42,040 --> 00:51:44,440 Speaker 1: have an out, if they're spared hell fire or at 944 00:51:44,520 --> 00:51:48,840 Speaker 1: least you know, their section for Dante's limbo of virtuous pagans, 945 00:51:49,280 --> 00:51:51,919 Speaker 1: then is the missionary doing them a disservice by even 946 00:51:51,960 --> 00:51:54,359 Speaker 1: presenting them with the choice? Like why do you even 947 00:51:54,360 --> 00:51:56,480 Speaker 1: ask me? Because now I have to I have to, 948 00:51:56,680 --> 00:51:58,880 Speaker 1: I have to devote myself or not, Like now I 949 00:51:58,920 --> 00:52:01,400 Speaker 1: actually have you know, I was just going to go 950 00:52:01,440 --> 00:52:05,200 Speaker 1: into the uh, you know, default limbo category or the 951 00:52:05,239 --> 00:52:08,839 Speaker 1: default heaven before and now now I'm actually at risk 952 00:52:08,880 --> 00:52:11,879 Speaker 1: of hell. Well that means certain theories of damnation mean 953 00:52:12,000 --> 00:52:15,040 Speaker 1: that presenting the gospel to someone as an information hazard. 954 00:52:15,480 --> 00:52:18,520 Speaker 1: You potentially harm them immensely by telling it to them. 
955 00:52:18,840 --> 00:52:21,440 Speaker 1: And I think part of this just to to to 956 00:52:21,440 --> 00:52:24,840 Speaker 1: go beyond like the the actual um wager here is 957 00:52:24,920 --> 00:52:26,239 Speaker 1: I think a part of the issue here is that 958 00:52:26,600 --> 00:52:29,640 Speaker 1: we're using evolved cognitive abilities that are that are geared 959 00:52:29,680 --> 00:52:33,120 Speaker 1: for smaller, though often important choices, and here we're trying 960 00:52:33,160 --> 00:52:36,040 Speaker 1: to use our imaginative brains to create a conundrum that 961 00:52:36,120 --> 00:52:39,080 Speaker 1: can outstrip those abilities. Yeah. Well, I mean that is 962 00:52:39,120 --> 00:52:41,920 Speaker 1: what we do in philosophy, right, We're constantly using our 963 00:52:41,960 --> 00:52:46,240 Speaker 1: brains in situations it was not really made for um 964 00:52:46,360 --> 00:52:48,200 Speaker 1: and just trying to do the best we can. But 965 00:52:48,239 --> 00:52:51,760 Speaker 1: I mean it's quite clear that motivated reasoning is often 966 00:52:51,760 --> 00:52:55,120 Speaker 1: a thing when we're trying to be rational, was just failing. 967 00:52:55,239 --> 00:52:57,080 Speaker 1: But of course this is how we train our brains 968 00:52:57,120 --> 00:53:02,759 Speaker 1: for rational thinking, often oftentimes exploring these various outsized ideas. 969 00:53:03,040 --> 00:53:05,840 Speaker 1: You know, there's so many ways. I think Pascal's wager 970 00:53:05,920 --> 00:53:09,160 Speaker 1: kind of breaks down because it's obviously there's the thing 971 00:53:09,200 --> 00:53:12,680 Speaker 1: you pointed out about there's more than one religion, right, 972 00:53:12,719 --> 00:53:14,719 Speaker 1: you know, it's not just like do I believe or not? 973 00:53:14,800 --> 00:53:18,440 Speaker 1: It's like which one but it also it implies again 974 00:53:18,480 --> 00:53:20,719 Speaker 1: this is like a theological question, but it would seem 975 00:53:20,760 --> 00:53:24,480 Speaker 1: to imply that God can be tricked into thinking that 976 00:53:24,520 --> 00:53:27,360 Speaker 1: you believe in Him if you simply pretend to. I 977 00:53:27,640 --> 00:53:30,799 Speaker 1: guess Pascal had I think maybe a more sophisticated way 978 00:53:30,840 --> 00:53:33,120 Speaker 1: of looking at this, you know that, like live as 979 00:53:33,160 --> 00:53:36,560 Speaker 1: if God exists or something. But it but the wager 980 00:53:36,640 --> 00:53:40,040 Speaker 1: is often used in very unsophisticated ways. Yeah, but it 981 00:53:40,080 --> 00:53:43,000 Speaker 1: implies that it doesn't matter to him what you actually believe, 982 00:53:43,200 --> 00:53:46,520 Speaker 1: only what you outwardly claim to believe. Though then again, 983 00:53:46,600 --> 00:53:48,759 Speaker 1: the funny thing here is this might be the case 984 00:53:48,800 --> 00:53:52,200 Speaker 1: with Rocco's basilisk, Right, what would this machine? God care 985 00:53:52,280 --> 00:53:54,480 Speaker 1: what was in your heart? It only cares whether you 986 00:53:54,560 --> 00:53:57,480 Speaker 1: help it or not, or whether you, you know, proclaim 987 00:53:57,600 --> 00:53:59,440 Speaker 1: fealty to it or not. Yeah, that's why the T 988 00:53:59,520 --> 00:54:02,919 Speaker 1: shirt is important, Joe, because if it, if it knows 989 00:54:02,920 --> 00:54:07,960 Speaker 1: you purchase that shirt, then you're you're square, You're covered. Okay. Yeah. 
990 00:54:08,360 --> 00:54:11,040 Speaker 1: As as Beth Singler pointed out in that Aeon magazine 991 00:54:11,080 --> 00:54:14,279 Speaker 1: piece we referenced earlier, she says, quote, the secular basilisk 992 00:54:14,360 --> 00:54:17,040 Speaker 1: stands in for God as we struggle with the same 993 00:54:17,160 --> 00:54:20,000 Speaker 1: questions again and again. So her argument is that we've 994 00:54:20,040 --> 00:54:23,440 Speaker 1: kind of reverse engineered the same problem again through our 995 00:54:23,480 --> 00:54:27,320 Speaker 1: contemplations of of super intelligent AI. Yeah, I guess 996 00:54:27,360 --> 00:54:30,040 Speaker 1: you get into a plausibility question here, right, You 997 00:54:30,080 --> 00:54:34,040 Speaker 1: get into a question about whether, uh, it's actually possible 998 00:54:34,080 --> 00:54:38,839 Speaker 1: to make an artificial intelligence that is functionally equivalent to God. 999 00:54:38,880 --> 00:54:40,839 Speaker 1: I mean, we're not thinking we could build an AI 1000 00:54:40,920 --> 00:54:42,799 Speaker 1: that would break the laws of physics. So it might 1001 00:54:42,800 --> 00:54:45,319 Speaker 1: be able to run simulations of the universe that have, 1002 00:54:45,400 --> 00:54:48,359 Speaker 1: you know, conscious agents within them maybe for all 1003 00:54:48,400 --> 00:54:50,600 Speaker 1: we know, and they could break the laws of physics 1004 00:54:50,640 --> 00:54:54,440 Speaker 1: inside them. But yeah, I mean could that even happen? 1005 00:54:54,719 --> 00:54:56,400 Speaker 1: And the issue is we don't know. We don't know 1006 00:54:56,440 --> 00:54:59,160 Speaker 1: whether that could happen or not. So should we behave 1007 00:54:59,239 --> 00:55:01,920 Speaker 1: as if that is a plausible thing to be worried 1008 00:55:01,920 --> 00:55:04,560 Speaker 1: about and to consider, or should we behave as if 1009 00:55:04,880 --> 00:55:07,440 Speaker 1: that's just not really something you need to concern yourself with. 1010 00:55:07,520 --> 00:55:11,160 Speaker 1: I don't know how likely or unlikely it is. And 1011 00:55:11,200 --> 00:55:13,320 Speaker 1: if your your fears are related just to the idea 1012 00:55:13,400 --> 00:55:17,520 Speaker 1: that you're you could be digitally resurrected, uh, for torment 1013 00:55:17,640 --> 00:55:21,080 Speaker 1: in the basilisk's dungeons, um, I mean that that, of 1014 00:55:21,080 --> 00:55:23,680 Speaker 1: course would depend on how much stock you put in the 1015 00:55:23,680 --> 00:55:27,600 Speaker 1: idea of digital consciousness, and the whole philosophical question we've 1016 00:55:27,680 --> 00:55:30,319 Speaker 1: we've touched on here before is that, I mean, 1017 00:55:30,360 --> 00:55:32,600 Speaker 1: it's just a copy of me, right, So why, I 1018 00:55:32,640 --> 00:55:36,719 Speaker 1: mean, I ultimately can't do anything about, you know, a 1019 00:55:36,840 --> 00:55:41,040 Speaker 1: thousand different basilisks creating a thousand different copies of me 1020 00:55:41,120 --> 00:55:44,920 Speaker 1: and tormenting all of them. Um, there's still, to a 1021 00:55:45,000 --> 00:55:47,640 Speaker 1: large extent, it's just destroying me in effigy. There are 1022 00:55:47,640 --> 00:55:49,640 Speaker 1: actually a bunch of reasons I wrote down to doubt 1023 00:55:49,680 --> 00:55:52,000 Speaker 1: the plausibility of the basilisk. We could do that now, 1024 00:55:52,080 --> 00:55:53,640 Speaker 1: or we could come back to that later.
I don't 1025 00:55:53,640 --> 00:55:55,480 Speaker 1: know what you think. Yes, let's to do, but I 1026 00:55:55,520 --> 00:56:00,520 Speaker 1: will add the idea of being tormented digitally, This does 1027 00:56:00,600 --> 00:56:03,200 Speaker 1: become more dangerous. I guess if you believe you might 1028 00:56:03,239 --> 00:56:07,359 Speaker 1: be in a simulation right now exactly, then then things 1029 00:56:07,360 --> 00:56:09,520 Speaker 1: are a little more dire. But that's again that you 1030 00:56:09,600 --> 00:56:12,280 Speaker 1: might be that you might be, yeah, but I believe 1031 00:56:12,320 --> 00:56:15,240 Speaker 1: there's plenty of reason to believe that you are not. Okay, 1032 00:56:15,239 --> 00:56:17,360 Speaker 1: So if we're talking about how to defeat the basilisk, 1033 00:56:17,400 --> 00:56:20,280 Speaker 1: how to get out of this uh, this this prison 1034 00:56:20,280 --> 00:56:22,279 Speaker 1: of the mind. If you're feeling a little bit um 1035 00:56:22,560 --> 00:56:25,520 Speaker 1: um bleak of heart right now because of this idea, 1036 00:56:25,640 --> 00:56:28,200 Speaker 1: then Joe's got the remedy. Well, I'm not. These are 1037 00:56:28,200 --> 00:56:30,799 Speaker 1: not all the reasons you should doubt the basilisk, but 1038 00:56:31,040 --> 00:56:32,839 Speaker 1: this is some of them that I could think of. 1039 00:56:33,080 --> 00:56:36,200 Speaker 1: Number one depends on the creation of super intelligence, which 1040 00:56:36,239 --> 00:56:42,000 Speaker 1: I think is not guaranteed. Some people seem incredibly fatalistic 1041 00:56:42,040 --> 00:56:44,920 Speaker 1: about this is just absolutely inevitable. We will have super 1042 00:56:44,960 --> 00:56:48,879 Speaker 1: intelligent godlike AI that can do anything, and I think 1043 00:56:48,920 --> 00:56:51,279 Speaker 1: that that is just not guaranteed at all. I'm not 1044 00:56:51,360 --> 00:56:53,520 Speaker 1: ruling it out, but I think, for example, there's some 1045 00:56:54,040 --> 00:56:57,719 Speaker 1: theories of intelligence that say the prediction of super intelligence 1046 00:56:57,719 --> 00:57:02,399 Speaker 1: actually is maybe not taking seriously what intelligence is that 1047 00:57:02,480 --> 00:57:05,120 Speaker 1: you know that there are actually different kinds of intelligence 1048 00:57:05,120 --> 00:57:08,560 Speaker 1: that are useful in different ways, and machines can't mimic 1049 00:57:08,680 --> 00:57:12,080 Speaker 1: them all functionally, or can't mimic them all correctly, all 1050 00:57:12,120 --> 00:57:14,240 Speaker 1: at the same time. I don't know if that's correct, 1051 00:57:14,280 --> 00:57:16,240 Speaker 1: but that's at least one that's one hurdle it has 1052 00:57:16,280 --> 00:57:18,919 Speaker 1: to clear. Could get knocked down there. But okay, maybe 1053 00:57:18,920 --> 00:57:22,400 Speaker 1: we could create a superintelligence Even then, multiple aspects of 1054 00:57:22,400 --> 00:57:25,280 Speaker 1: the Rocco's basilisk scenario depend on the reality of some 1055 00:57:25,600 --> 00:57:29,000 Speaker 1: version of mind uploading, or the idea that your brain 1056 00:57:29,520 --> 00:57:33,880 Speaker 1: and in addition, your conscious experience could be simulated perfectly 1057 00:57:33,920 --> 00:57:36,560 Speaker 1: on a computer. 
And one reason it depends on this 1058 00:57:36,640 --> 00:57:39,760 Speaker 1: is that timeless decision theory operates on the assumption that 1059 00:57:39,880 --> 00:57:42,960 Speaker 1: the real you and the simulated copies that the computer 1060 00:57:43,200 --> 00:57:46,520 Speaker 1: uses to predict your behavior would be the same and 1061 00:57:46,520 --> 00:57:49,680 Speaker 1: would make the same decisions as the real you. Another 1062 00:57:49,720 --> 00:57:52,040 Speaker 1: reason is related to the punishment. Now, one way, of 1063 00:57:52,080 --> 00:57:54,920 Speaker 1: course you could imagine the great basilisk thing is that 1064 00:57:55,280 --> 00:57:58,080 Speaker 1: if the machine comes to power in my lifetime, it 1065 00:57:58,120 --> 00:58:01,080 Speaker 1: could just punish the real, physical, older version of me 1066 00:58:01,240 --> 00:58:04,200 Speaker 1: in reality is the payoff of this a causal blackmail. 1067 00:58:04,520 --> 00:58:06,120 Speaker 1: But the other way you could imagine it, in the 1068 00:58:06,120 --> 00:58:08,760 Speaker 1: way that it is much more often portrayed in the media, 1069 00:58:09,280 --> 00:58:12,840 Speaker 1: is that it makes digital copies of my consciousness and 1070 00:58:12,920 --> 00:58:16,720 Speaker 1: punishes them in a simulated hell. And that, of course 1071 00:58:16,720 --> 00:58:19,560 Speaker 1: would also depend on the reality of some version of 1072 00:58:19,640 --> 00:58:22,000 Speaker 1: mind uploading, or of the ability of a computer to 1073 00:58:22,080 --> 00:58:25,680 Speaker 1: simulate a mind and for that simulated mind to actually 1074 00:58:25,680 --> 00:58:28,800 Speaker 1: be conscious. As I've said, before. I'm suspicious of the 1075 00:58:28,840 --> 00:58:31,800 Speaker 1: idea of conscious digital simulations. I'm not saying I can 1076 00:58:31,880 --> 00:58:33,880 Speaker 1: rule it out, but I also don't think it's a 1077 00:58:33,960 --> 00:58:37,760 Speaker 1: sure thing. Any scenario that relies on the existence of 1078 00:58:37,840 --> 00:58:41,560 Speaker 1: conscious digital simulations needs a big asterisk next to it 1079 00:58:41,560 --> 00:58:44,960 Speaker 1: that says, if this is actually possible. Yeah again, is 1080 00:58:45,040 --> 00:58:47,640 Speaker 1: that me? Is that just me? An effigy? Is that 1081 00:58:47,680 --> 00:58:50,440 Speaker 1: thing actually conscious that you're tormenting? I mean, granted, it 1082 00:58:50,520 --> 00:58:54,920 Speaker 1: still sucks if there's a super intelligence creating digital people 1083 00:58:54,920 --> 00:58:58,360 Speaker 1: and tormenting and it's a dark, rancid dungeons in the future, 1084 00:58:58,440 --> 00:59:02,960 Speaker 1: but um, it's not necessarily quite the same as torturing me, right, well, 1085 00:59:03,000 --> 00:59:05,200 Speaker 1: if you just care about yourself. It also depends on 1086 00:59:05,240 --> 00:59:08,080 Speaker 1: the possibility that you could be one of these simulations. 1087 00:59:08,120 --> 00:59:11,280 Speaker 1: It's possible that you could not be one of those simulations. 1088 00:59:11,320 --> 00:59:14,120 Speaker 1: There's something that would rule it out. Maybe their type 1089 00:59:14,120 --> 00:59:16,760 Speaker 1: of conscious Maybe they could be conscious, but that consciousness 1090 00:59:16,800 --> 00:59:19,920 Speaker 1: is fundamentally different from yours such that you could not 1091 00:59:19,960 --> 00:59:22,520 Speaker 1: be one of them. 
Another big one, and this is 1092 00:59:22,520 --> 00:59:25,360 Speaker 1: a big one that uh, you know, like we said earlier, 1093 00:59:25,400 --> 00:59:30,040 Speaker 1: I think sometimes Yudkowski gets unfairly associated with the basilist 1094 00:59:30,120 --> 00:59:32,400 Speaker 1: as if he has advocated the idea, and he has not. 1095 00:59:32,680 --> 00:59:36,280 Speaker 1: He has said, you know this, this idea is trash, 1096 00:59:36,600 --> 00:59:39,360 Speaker 1: and uh, there there are many reasons to doubt it. 1097 00:59:39,400 --> 00:59:41,919 Speaker 1: But even but though he has said, like even though 1098 00:59:41,920 --> 00:59:45,760 Speaker 1: I doubted, I don't want it disseminated. Um, but he says, 1099 00:59:45,800 --> 00:59:47,840 Speaker 1: you know, a good reason to doubt it is there's 1100 00:59:47,880 --> 00:59:51,760 Speaker 1: no reason to conclude it's necessary for the Basilisk to 1101 00:59:51,960 --> 00:59:55,640 Speaker 1: actually follow through on the threat. We're saying that it's 1102 00:59:55,680 --> 00:59:58,280 Speaker 1: going to be relying on us to come up with 1103 00:59:58,360 --> 01:00:02,320 Speaker 1: the idea that it in the future might blackmail us 1104 01:00:02,360 --> 01:00:05,200 Speaker 1: if we don't help it now. In order to get 1105 01:00:05,280 --> 01:00:07,760 Speaker 1: us to help it now, right, we should be working 1106 01:00:07,800 --> 01:00:10,480 Speaker 1: and donating all our money and time and resources to 1107 01:00:10,640 --> 01:00:13,800 Speaker 1: building it as fast as possible, because we came up 1108 01:00:13,840 --> 01:00:16,640 Speaker 1: with the idea that it might torture us if we don't. 1109 01:00:17,040 --> 01:00:20,400 Speaker 1: Even if you accept that Yudkowski has he's pointed out 1110 01:00:20,600 --> 01:00:23,560 Speaker 1: that there's no reason, once it's built, it would have 1111 01:00:23,640 --> 01:00:27,000 Speaker 1: to follow through on the threat he's written. Quote. The 1112 01:00:27,000 --> 01:00:30,840 Speaker 1: most blatant obstacle to Rocco's Basilisk is intuitively that there's 1113 01:00:30,920 --> 01:00:34,120 Speaker 1: no incentive for a future agent to follow through with 1114 01:00:34,160 --> 01:00:37,280 Speaker 1: a threat in the future, because by doing so, it 1115 01:00:37,400 --> 01:00:41,600 Speaker 1: just expends resources at no gain to itself. We can 1116 01:00:41,600 --> 01:00:45,720 Speaker 1: formalize that using classical causal decision theory, which is the 1117 01:00:45,760 --> 01:00:49,760 Speaker 1: academically standard decision theory. Following through on a blackmail threat 1118 01:00:49,800 --> 01:00:52,680 Speaker 1: in the future after the past has already taken place, 1119 01:00:52,880 --> 01:00:57,360 Speaker 1: cannot from the blackmailing agent's perspective, be the physical cause 1120 01:00:57,480 --> 01:01:00,720 Speaker 1: of improved outcomes in the past, because as the future 1121 01:01:00,800 --> 01:01:03,720 Speaker 1: cannot be the cause of the past. Hey, basilist, why 1122 01:01:03,720 --> 01:01:07,720 Speaker 1: are you tormenting a third of the population for all eternity? Oh, 1123 01:01:07,800 --> 01:01:10,160 Speaker 1: I said I would. Well, yeah, I mean exactly, no, 1124 01:01:10,320 --> 01:01:12,320 Speaker 1: if it it didn't say it would, right, It just 1125 01:01:12,400 --> 01:01:15,240 Speaker 1: had to rely on the fact that in the past 1126 01:01:15,360 --> 01:01:18,680 Speaker 1: people would have come to the conclusion that it might. 
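The argument in that quote can be put in very plain terms, sketched below with made-up utility numbers: once the machine exists, the past is already fixed, so carrying out the threat only burns resources.

```python
# Causal decision theory view of the follow-through question. The utility
# values are invented purely for illustration.

def utility(follow_through: bool, help_already_received: float) -> float:
    cost_of_punishing = 10.0 if follow_through else 0.0   # resources spent on torture sims
    causal_effect_on_past = 0.0                           # the future cannot cause the past
    return help_already_received + causal_effect_on_past - cost_of_punishing

help_received = 100.0  # whatever people did or didn't do is already locked in
print(utility(follow_through=True,  help_already_received=help_received))   # 90.0
print(utility(follow_through=False, help_already_received=help_received))   # 100.0

# Under classical causal decision theory, following through is strictly worse,
# which is the "most blatant obstacle" described in the quote above.
```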
1127 01:01:18,840 --> 01:01:20,440 Speaker 1: You know, you thought that I would. I didn't want 1128 01:01:20,440 --> 01:01:23,360 Speaker 1: to disappoint. But actually, if a basilist could be created, 1129 01:01:23,480 --> 01:01:25,520 Speaker 1: it seems like the best case scenario for it would 1130 01:01:25,560 --> 01:01:29,439 Speaker 1: be everyone subscribes to this idea and works as hard 1131 01:01:29,480 --> 01:01:31,320 Speaker 1: as they can to build it, and then it never 1132 01:01:31,440 --> 01:01:34,320 Speaker 1: follows through on any of the threats. Right, The best 1133 01:01:34,360 --> 01:01:36,920 Speaker 1: case scenario would be people act as if there is 1134 01:01:37,000 --> 01:01:39,920 Speaker 1: a threat, and then there is in fact no follow 1135 01:01:39,960 --> 01:01:42,000 Speaker 1: through on the threat. It's really a win win for 1136 01:01:42,000 --> 01:01:44,280 Speaker 1: the basilist. Yes, and then it can maybe you can 1137 01:01:44,560 --> 01:01:46,760 Speaker 1: even shed that name Basilist. They're like, we don't even 1138 01:01:46,760 --> 01:01:48,480 Speaker 1: have to call it the Great Basilist anymore. We can 1139 01:01:48,520 --> 01:01:51,480 Speaker 1: just call it, uh, you know, Omega or whatever it's 1140 01:01:51,480 --> 01:01:53,360 Speaker 1: its name is now. I want to be fair that 1141 01:01:53,400 --> 01:01:55,360 Speaker 1: a lot of what these people do is like the 1142 01:01:55,680 --> 01:01:58,080 Speaker 1: less Wrong community and all that they deal with, Like, 1143 01:01:58,760 --> 01:02:02,080 Speaker 1: should there be alternative of decision theories that guide the 1144 01:02:02,080 --> 01:02:06,640 Speaker 1: behavior of superintelligent aies. Maybe it doesn't use classical decision theory, 1145 01:02:07,040 --> 01:02:10,440 Speaker 1: maybe it uses some kind of other decision theory, and 1146 01:02:10,520 --> 01:02:14,120 Speaker 1: because on some other decision theory, maybe it could decide 1147 01:02:14,160 --> 01:02:16,720 Speaker 1: to actually follow through on the blackmail threat. I think 1148 01:02:16,800 --> 01:02:20,320 Speaker 1: that is where some of this fear comes through, that like, oh, 1149 01:02:20,400 --> 01:02:23,440 Speaker 1: maybe by talking about it we are actually causing danger 1150 01:02:24,000 --> 01:02:27,920 Speaker 1: because maybe some other decision theory holds. But Yudkowski does 1151 01:02:27,960 --> 01:02:32,160 Speaker 1: not think that's the case. Also, one more thing, it 1152 01:02:32,200 --> 01:02:34,600 Speaker 1: depends on the basil list. So if you think this 1153 01:02:34,640 --> 01:02:37,720 Speaker 1: scenario could be real, it depends on it not having 1154 01:02:37,800 --> 01:02:41,280 Speaker 1: ethical or behavioral controls that would prevent it from engaging 1155 01:02:41,320 --> 01:02:44,640 Speaker 1: in torture. Yeah, and I think if thinkers like the 1156 01:02:44,760 --> 01:02:48,080 Speaker 1: you know, the Miria people, the Machine Intelligence Research Institute 1157 01:02:48,080 --> 01:02:51,560 Speaker 1: people succeed in established what they're trying to do is 1158 01:02:51,640 --> 01:02:55,520 Speaker 1: establish a philosophical framework to make AI friendly, to make 1159 01:02:55,560 --> 01:02:58,120 Speaker 1: it so that it is not evil and does not 1160 01:02:58,280 --> 01:03:01,800 Speaker 1: harm us. 
And if they successfully do that, then this 1161 01:03:01,840 --> 01:03:04,640 Speaker 1: shouldn't be a problem, right? Because Yudkowsky has argued that 1162 01:03:04,680 --> 01:03:07,960 Speaker 1: a being that tries to do what's best for us 1163 01:03:08,320 --> 01:03:11,880 Speaker 1: would not engage in torture and blackmail, even if it's 1164 01:03:11,960 --> 01:03:15,680 Speaker 1: doing so in service of some higher good, because 1165 01:03:15,760 --> 01:03:19,720 Speaker 1: torture and blackmail are actually not compatible with human values. 1166 01:03:20,320 --> 01:03:23,120 Speaker 1: I agree with that absolutely, and I actually would go 1167 01:03:23,120 --> 01:03:25,520 Speaker 1: as far as to say I think that's something people should 1168 01:03:25,600 --> 01:03:28,360 Speaker 1: keep in mind when they're, uh, choosing 1169 01:03:28,400 --> 01:03:30,960 Speaker 1: their religions as well. Yeah, I can certainly see how 1170 01:03:31,000 --> 01:03:33,160 Speaker 1: you can make that argument. It's like, what do I 1171 01:03:33,200 --> 01:03:36,720 Speaker 1: love about my faith? Is it the blackmail and the 1172 01:03:36,760 --> 01:03:39,840 Speaker 1: torture, or is it that it brings something else to 1173 01:03:39,880 --> 01:03:42,720 Speaker 1: the table that is worth living for, that makes life 1174 01:03:42,720 --> 01:03:46,640 Speaker 1: better for everybody? Like, I feel like that is 1175 01:03:46,680 --> 01:03:49,880 Speaker 1: what should be important about one's faith. Now, I think 1176 01:03:50,320 --> 01:03:52,560 Speaker 1: some people might be saying, like, wait a minute, though, 1177 01:03:52,680 --> 01:03:58,040 Speaker 1: if you're just using utilitarian ethics, right, wouldn't any 1178 01:03:58,080 --> 01:04:01,040 Speaker 1: methods be good if the ends justify the means? Right, 1179 01:04:01,080 --> 01:04:04,360 Speaker 1: that's, I think, a naive understanding of how people think 1180 01:04:04,400 --> 01:04:06,760 Speaker 1: about utilitarian ethics. If you want to bring about the 1181 01:04:06,760 --> 01:04:10,200 Speaker 1: greatest good for the greatest number of people, couldn't you 1182 01:04:10,280 --> 01:04:13,120 Speaker 1: do that by being really cruel and unfair to some 1183 01:04:13,200 --> 01:04:16,800 Speaker 1: smaller group of people? And I think generally there are 1184 01:04:16,920 --> 01:04:19,840 Speaker 1: versions of utilitarianism that say, well, actually, the answer there 1185 01:04:19,920 --> 01:04:23,560 Speaker 1: is no, you couldn't do that, because even though you 1186 01:04:23,640 --> 01:04:27,120 Speaker 1: might be bringing about some better material circumstance, it is 1187 01:04:27,160 --> 01:04:32,120 Speaker 1: actually corrosive to a society for things like that to happen, 1188 01:04:32,200 --> 01:04:35,120 Speaker 1: even if they don't happen to many people. Right, you say, 1189 01:04:35,160 --> 01:04:38,400 Speaker 1: what if I could make everybody on earth ten percent 1190 01:04:38,520 --> 01:04:43,280 Speaker 1: happier on average by, say, burying somebody in 1191 01:04:43,320 --> 01:04:46,000 Speaker 1: a pit of bananas once a year, so that they're, 1192 01:04:46,120 --> 01:04:48,840 Speaker 1: you know, buried to death with bananas.
Even the people 1193 01:04:48,840 --> 01:04:51,400 Speaker 1: who are being made happier could very easily look at 1194 01:04:51,440 --> 01:04:54,360 Speaker 1: that and say that's not fair and it makes the 1195 01:04:54,400 --> 01:04:57,120 Speaker 1: world worse and I don't want it. And thus that 1196 01:04:57,200 --> 01:05:00,919 Speaker 1: actually would be a subjectively relevant state. So we've talked 1197 01:05:00,920 --> 01:05:04,560 Speaker 1: about AI risk on the show before, and, you know, 1198 01:05:04,600 --> 01:05:06,360 Speaker 1: one thing I feel like I still have not been 1199 01:05:06,360 --> 01:05:08,520 Speaker 1: able to make up my mind about, despite reading a 1200 01:05:08,560 --> 01:05:11,120 Speaker 1: lot on the subject, is whether 1201 01:05:11,800 --> 01:05:15,400 Speaker 1: it makes sense to be super worried 1202 01:05:15,480 --> 01:05:19,520 Speaker 1: about AI superintelligence and the risks associated. I mean, 1203 01:05:19,560 --> 01:05:22,120 Speaker 1: I do think it's worth taking seriously and thinking about, 1204 01:05:22,160 --> 01:05:24,200 Speaker 1: and I think for people who want to devote their attention 1205 01:05:24,240 --> 01:05:27,360 Speaker 1: to, you know, dealing with the control problem 1206 01:05:27,440 --> 01:05:30,840 Speaker 1: and how you would get an AI to do things 1207 01:05:30,880 --> 01:05:32,960 Speaker 1: that were good for us and not harmful to us, 1208 01:05:33,000 --> 01:05:36,479 Speaker 1: that's fine work. And I don't ridicule the people 1209 01:05:36,480 --> 01:05:38,440 Speaker 1: who work on that problem the way some people do. 1210 01:05:38,720 --> 01:05:42,040 Speaker 1: But on the other hand, I worry that by focusing 1211 01:05:42,200 --> 01:05:45,800 Speaker 1: exclusively on sort of the machine god, the superintelligence, 1212 01:05:46,120 --> 01:05:53,200 Speaker 1: we're sort of ignoring much more plausible and current threats, 1213 01:05:53,320 --> 01:05:56,240 Speaker 1: the ways that AI is already very plausibly in 1214 01:05:56,240 --> 01:05:59,439 Speaker 1: a position to hurt us today or in the very 1215 01:05:59,520 --> 01:06:03,080 Speaker 1: near future, not depending on any outlandish assumptions: 1216 01:06:03,400 --> 01:06:06,120 Speaker 1: the way it's already being and will soon be used as 1217 01:06:06,120 --> 01:06:09,240 Speaker 1: a cyber war weapon, the way it's hijacking our attention 1218 01:06:09,280 --> 01:06:13,120 Speaker 1: and manipulating our opinions and behavior through social media and devices. 1219 01:06:13,360 --> 01:06:15,320 Speaker 1: This is some of what R. Scott Bakker talked about 1220 01:06:15,320 --> 01:06:18,920 Speaker 1: with his fears about AI. You don't actually need super 1221 01:06:18,960 --> 01:06:22,080 Speaker 1: powerful AI to do a lot of damage. It just 1222 01:06:22,200 --> 01:06:25,440 Speaker 1: needs to manipulate us in just the right kind of ways. 1223 01:06:26,280 --> 01:06:28,400 Speaker 1: So not the great basilisk so much as all the 1224 01:06:28,440 --> 01:06:31,280 Speaker 1: little basilisks that are out there, the little grass snakes 1225 01:06:31,720 --> 01:06:34,360 Speaker 1: with the tiny crowns; they can do a lot 1226 01:06:34,360 --> 01:06:36,040 Speaker 1: of damage. And again I just want to be clear, 1227 01:06:36,080 --> 01:06:38,640 Speaker 1: I'm not saying we should forget about superintelligence. People 1228 01:06:38,680 --> 01:06:40,680 Speaker 1: who are working on that,
if you find that interesting, 1229 01:06:40,760 --> 01:06:43,600 Speaker 1: I think that's fine. Yeah, work on that problem. 1230 01:06:43,720 --> 01:06:46,480 Speaker 1: But I think it's a longer shot, and 1231 01:06:46,520 --> 01:06:49,520 Speaker 1: there's a lot of current and near-future AI threat 1232 01:06:49,600 --> 01:06:52,880 Speaker 1: that is really worth taking very seriously. I wish 1233 01:06:52,960 --> 01:06:56,840 Speaker 1: more people were devoting their lives to, say, the AI cyber 1234 01:06:56,840 --> 01:07:01,160 Speaker 1: weapons that are in development right now. One last issue 1235 01:07:01,320 --> 01:07:04,200 Speaker 1: I think we should discuss before we wrap up here 1236 01:07:04,280 --> 01:07:08,840 Speaker 1: is: Okay, so we don't think this potential information hazard 1237 01:07:08,920 --> 01:07:11,520 Speaker 1: is actually an information hazard. Like, we don't think it's 1238 01:07:11,520 --> 01:07:16,240 Speaker 1: actually potentially that dangerous. But Yudkowsky has made the point 1239 01:07:16,280 --> 01:07:19,120 Speaker 1: that even though he doesn't think the basilisk is plausible, 1240 01:07:19,200 --> 01:07:23,320 Speaker 1: the ethical thing to do with potential information hazards is 1241 01:07:23,360 --> 01:07:27,240 Speaker 1: to not discuss them at all, since it's possible that 1242 01:07:27,360 --> 01:07:30,720 Speaker 1: maybe you're misinterpreting the ways in 1243 01:07:30,760 --> 01:07:34,280 Speaker 1: which they're implausible. Maybe this idea is actually valid, is 1244 01:07:34,320 --> 01:07:37,240 Speaker 1: actually relevant, and by spreading it you've harmed a lot 1245 01:07:37,280 --> 01:07:39,840 Speaker 1: of people. But I also think this could mean 1246 01:07:39,880 --> 01:07:44,600 Speaker 1: that it's possible that, despite the basilisk not being plausible, 1247 01:07:44,920 --> 01:07:48,640 Speaker 1: something good has come out of the basilisk conversation, because 1248 01:07:48,680 --> 01:07:53,360 Speaker 1: it encourages people to think about the idea of information hazards. 1249 01:07:53,400 --> 01:07:56,880 Speaker 1: Maybe Roko's basilisk isn't real, but there could be other ideas 1250 01:07:56,920 --> 01:08:00,400 Speaker 1: that are both true and potentially harmful just 1251 01:08:00,640 --> 01:08:03,800 Speaker 1: by entering people's minds. And the lesson from this is 1252 01:08:03,880 --> 01:08:07,200 Speaker 1: we should prepare ourselves for those kinds of ideas. And 1253 01:08:07,240 --> 01:08:10,280 Speaker 1: if you have discovered one of those ideas and there 1254 01:08:10,360 --> 01:08:13,880 Speaker 1: is literally no upside to other people knowing about it, keep 1255 01:08:13,920 --> 01:08:17,080 Speaker 1: it to yourself and don't post it on the internet. Well, 1256 01:08:17,120 --> 01:08:18,840 Speaker 1: I feel like I do encounter thought hazards like this 1257 01:08:18,880 --> 01:08:21,320 Speaker 1: from time to time, though they're often presented in pamphlets 1258 01:08:21,439 --> 01:08:24,960 Speaker 1: or little booklets, uh, generally with, you know, a clever 1259 01:08:25,000 --> 01:08:28,640 Speaker 1: illustration about what's coming into the world. I actually brought some 1260 01:08:28,720 --> 01:08:30,519 Speaker 1: of these into the office recently.
I found them at 1261 01:08:31,040 --> 01:08:34,479 Speaker 1: a park in rural Georgia, and, uh, I think 1262 01:08:34,520 --> 01:08:37,439 Speaker 1: I told you, it's like, uh, have a 1263 01:08:37,439 --> 01:08:40,120 Speaker 1: look at these. You may find them interesting, but, uh, 1264 01:08:40,360 --> 01:08:43,720 Speaker 1: do destroy them when you're done, because, you know... 1265 01:08:44,200 --> 01:08:47,000 Speaker 1: In the wrong hands, these thoughts can be dangerous if 1266 01:08:47,000 --> 01:08:50,479 Speaker 1: they have some sort of, like, a harmful view of 1267 01:08:50,720 --> 01:08:54,599 Speaker 1: society that people may buy into. Well, I think 1268 01:08:54,640 --> 01:08:59,200 Speaker 1: you were comfortable sharing, uh, malicious religious literature with me 1269 01:08:59,280 --> 01:09:02,080 Speaker 1: because you do not think there's a possibility that that 1270 01:09:02,200 --> 01:09:05,000 Speaker 1: literature is true and would harm me if I knew 1271 01:09:05,000 --> 01:09:07,559 Speaker 1: it was true. Like, you think it is false, so 1272 01:09:07,600 --> 01:09:10,519 Speaker 1: to you, it's actually not an information hazard. It's just, 1273 01:09:10,560 --> 01:09:15,120 Speaker 1: like, an idea hazard. Uh, the real crazy thing would 1274 01:09:15,120 --> 01:09:17,519 Speaker 1: be if you came across a pamphlet and you read 1275 01:09:17,560 --> 01:09:21,519 Speaker 1: it and it's the equivalent of this raving malicious religious literature, 1276 01:09:21,720 --> 01:09:25,479 Speaker 1: except you were convinced it was correct. If it was 1277 01:09:25,479 --> 01:09:30,000 Speaker 1: more like that Ring video I brought you. Exactly. That 1278 01:09:30,120 --> 01:09:32,200 Speaker 1: is one of the things I've often seen on the Internet, 1279 01:09:32,240 --> 01:09:35,800 Speaker 1: this idea compared to The Ring. But, you know, on 1280 01:09:35,880 --> 01:09:38,280 Speaker 1: the other hand, I am reminded, 1281 01:09:38,280 --> 01:09:40,599 Speaker 1: you know, that the idea that any kind 1282 01:09:40,600 --> 01:09:43,679 Speaker 1: of knowledge is forbidden or is secret doesn't 1283 01:09:43,720 --> 01:09:46,200 Speaker 1: really jibe well with just the general mission 1284 01:09:46,200 --> 01:09:48,960 Speaker 1: of science. Of course not. Yeah, but I mean that 1285 01:09:49,000 --> 01:09:50,880 Speaker 1: would be part of the problem of, like, we're not 1286 01:09:50,960 --> 01:09:54,479 Speaker 1: prepared for information hazards, right? Because in the past it's 1287 01:09:54,520 --> 01:09:58,280 Speaker 1: been the case that almost anything that's true is good 1288 01:09:58,280 --> 01:10:02,120 Speaker 1: to spread, right? Unless you're spreading lies, information is good 1289 01:10:02,120 --> 01:10:05,400 Speaker 1: to share. It's just possible, we should acknowledge, that maybe 1290 01:10:05,400 --> 01:10:08,320 Speaker 1: there is such a thing as a fact 1291 01:10:08,360 --> 01:10:10,719 Speaker 1: or an idea or a theory or something 1292 01:10:10,760 --> 01:10:13,679 Speaker 1: that is true and correct, but it would hurt people 1293 01:10:13,720 --> 01:10:15,960 Speaker 1: to know about it. I can't think of an example 1294 01:10:16,000 --> 01:10:18,840 Speaker 1: of anything like that. But if there is something like that, 1295 01:10:18,920 --> 01:10:21,320 Speaker 1: we should be ready to not spread it when 1296 01:10:21,360 --> 01:10:25,519 Speaker 1: it occurs to us.
All right, fair enough. Well, I 1297 01:10:25,520 --> 01:10:27,679 Speaker 1: want to close out here with just one more 1298 01:10:27,720 --> 01:10:31,680 Speaker 1: bit of basilisk wisdom, or anti-basilisk wisdom, and this 1299 01:10:31,960 --> 01:10:34,880 Speaker 1: comes from the poetry of Spanish author Francisco Gómez de 1300 01:10:35,080 --> 01:10:41,640 Speaker 1: Quevedo y Villegas. This is translated, and it's referenced 1301 01:10:41,720 --> 01:10:45,800 Speaker 1: in Carol Rose's Giants, Monsters and Dragons. Quote: If the 1302 01:10:45,840 --> 01:10:48,320 Speaker 1: person who saw you was still living, then your whole 1303 01:10:48,360 --> 01:10:51,479 Speaker 1: story is lies, since if he didn't die, he has 1304 01:10:51,479 --> 01:10:53,960 Speaker 1: no knowledge of you, and if he died, he couldn't 1305 01:10:54,000 --> 01:10:57,400 Speaker 1: confirm it. So I was thinking about that with the 1306 01:10:57,439 --> 01:11:00,760 Speaker 1: stories of the basilisk. Yeah, I was like, wait a minute, 1307 01:11:01,080 --> 01:11:03,960 Speaker 1: how would you know if you could die just by 1308 01:11:04,000 --> 01:11:06,240 Speaker 1: looking at something? How do we have this description in 1309 01:11:06,280 --> 01:11:10,840 Speaker 1: the book? Yeah, it's, uh, there's an 1310 01:11:10,880 --> 01:11:14,200 Speaker 1: authorship problem with this. Yeah, whose account is the basilisk? 1311 01:11:14,280 --> 01:11:16,040 Speaker 1: But at any rate, I think it's a nice, like, 1312 01:11:16,120 --> 01:11:19,840 Speaker 1: final, you know, sucker punch to the basilisks in general, 1313 01:11:20,000 --> 01:11:21,880 Speaker 1: but also a little bit to the idea of the 1314 01:11:21,880 --> 01:11:24,680 Speaker 1: great basilisk. Right. I hope you're not leaving this 1315 01:11:24,720 --> 01:11:28,720 Speaker 1: episode with terrors about future digital torment. I think 1316 01:11:28,760 --> 01:11:31,519 Speaker 1: that is not something that you should worry about. Indeed, 1317 01:11:31,600 --> 01:11:33,519 Speaker 1: I'm not worried about it, and instead of worrying about 1318 01:11:33,520 --> 01:11:35,080 Speaker 1: it yourself, you should head on over to Stuff to 1319 01:11:35,080 --> 01:11:37,599 Speaker 1: Blow Your Mind dot com. That's the mothership, where 1320 01:11:37,640 --> 01:11:40,160 Speaker 1: you'll find all the podcast episodes and links out to our 1321 01:11:40,200 --> 01:11:42,759 Speaker 1: various social media accounts. You'll also find the tab for our store, 1322 01:11:43,120 --> 01:11:46,240 Speaker 1: where you can look up that basilisk shirt design we were talking 1323 01:11:46,280 --> 01:11:49,960 Speaker 1: about, and, uh, that's a great way to support the show. 1324 01:11:50,000 --> 01:11:51,760 Speaker 1: And if you don't want to support the show with money, 1325 01:11:51,800 --> 01:11:55,000 Speaker 1: you can do so by, uh, simply rating and reviewing 1326 01:11:55,080 --> 01:11:57,960 Speaker 1: us wherever you have the power to do so. Big thanks, 1327 01:11:58,000 --> 01:12:01,120 Speaker 1: as always, to our wonderful audio producers Alex Williams 1328 01:12:01,120 --> 01:12:03,360 Speaker 1: and Tory Harrison.
If you would like to get in 1329 01:12:03,400 --> 01:12:05,200 Speaker 1: touch with us to give us some feedback on this 1330 01:12:05,240 --> 01:12:07,719 Speaker 1: episode or any other, to suggest a topic for the future, 1331 01:12:08,000 --> 01:12:11,040 Speaker 1: or just to say hi, let us know whether you 1332 01:12:11,160 --> 01:12:13,920 Speaker 1: carry a weasel around in case of a basilisk encounter. 1333 01:12:14,320 --> 01:12:16,519 Speaker 1: You can email us at blow the Mind at how 1334 01:12:16,560 --> 01:12:19,479 Speaker 1: stuff works dot com. Oh, I'm hearing that transmission again. 1335 01:12:19,680 --> 01:12:24,080 Speaker 1: That weird snap. What is that? All hail the great basilisk. 1336 01:12:24,560 --> 01:12:30,200 Speaker 1: All hail the great basilisk. All hail the great basilisk. 1337 01:12:30,200 --> 01:12:35,440 Speaker 1: All hail the great basilisk. All hail the great basilisk. 1338 01:12:35,720 --> 01:12:43,759 Speaker 1: All hail the great basilisk. All hail the great basilisk. All hail the 1339 01:12:43,800 --> 01:12:45,280 Speaker 1: great basilisk.